# The latest news and insights from D3X Systems and the broader industry

### Introduction to Principal Component Analysis with Morpheus

##### Author: Xavier Witdouck

Principal Component Analysis (PCA) is a statistical technique used in data analysis and for building predictive models. The technique involves transforming a dataset into a new basis whereby the transformed data is uncorrelated. The transformed basis, which can be represented by an orthogonal matrix, defines the Principal Components of the original dataset. These basis vectors are usually ordered so that the first principal component is the one that accounts for the largest variance in the data, and the last component accounts for the least variance. This article introduces PCA theory and illustrates an example using the D3X Morpheus library.
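The transformation described above can be sketched in a few lines of NumPy (not the Morpheus API, which the full article covers): diagonalize the sample covariance matrix, order the orthogonal eigenvectors by descending eigenvalue, and project the centered data onto that basis. The synthetic data and all variable names here are illustrative.

```python
import numpy as np

# Synthetic 3-column dataset with correlated columns (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 0.5]])

Xc = X - X.mean(axis=0)                 # center the data
cov = np.cov(Xc, rowvar=False)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # orthogonal eigenvectors of a symmetric matrix

order = np.argsort(eigvals)[::-1]       # sort by explained variance, descending
components = eigvecs[:, order]          # principal components: the new orthogonal basis
scores = Xc @ components                # data expressed in the new basis

# The covariance of the transformed data is diagonal: the components are
# uncorrelated, with variances in descending order.
print(np.round(np.cov(scores, rowvar=False), 6))
```

Because `components` is an orthogonal matrix, the projection is just a rotation of the data; the off-diagonal covariances of `scores` vanish up to floating-point error.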

#### Signal Investment Quality, Through Transparency

###### Opinion

It is becoming increasingly difficult for both boutique and emerging institutional investment managers to differentiate themselves enough to raise meaningful assets. Moreover, the need for a three-year track record is a significant hurdle to overcome for a new business, and even with that in place there is no guarantee of success. It is ironic that the first disclaimer you are likely to come across in the investment business says that past performance is no indication of future results, yet so much emphasis is placed on track record! A good track record is a potentially useful signal, but it by no means tells the whole story.

#### Introduction to Generalized Least Squares Regression with Morpheus

###### Blog Post

One of the Gauss-Markov assumptions that must be met for OLS to be the Best Linear Unbiased Estimator (BLUE) is that of spherical errors. More specifically, the assumption states that the error or disturbance term in the model must exhibit uniform variance (i.e., homoscedasticity) and no autocorrelation between errors. In the real world, this assumption will not always hold: errors may exhibit heteroscedasticity, autocorrelation, or both. This article introduces the Generalized Least Squares (GLS) model, which is an appropriate estimator in these cases.
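The idea can be sketched in NumPy (again, not the Morpheus API): when the error covariance Ω is known, the GLS estimator is β̂ = (XᵀΩ⁻¹X)⁻¹XᵀΩ⁻¹y, which reduces to OLS when Ω is a scaled identity. The simulated heteroscedastic data below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])    # design matrix: intercept + one regressor

# Heteroscedastic errors: the variance grows with x, violating the
# spherical-errors assumption behind OLS being BLUE.
sigma2 = 0.5 + 0.5 * x
y = 2.0 + 3.0 * x + rng.normal(scale=np.sqrt(sigma2))

# OLS: beta = (X'X)^-1 X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# GLS with known error covariance Omega = diag(sigma2):
# beta = (X' Omega^-1 X)^-1 X' Omega^-1 y
W = np.diag(1.0 / sigma2)               # Omega^-1 weights observations by precision
beta_gls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

print("OLS coefficients:", beta_ols)
print("GLS coefficients:", beta_gls)
```

Both estimators remain unbiased here, but GLS down-weights the noisier observations and so achieves a lower-variance estimate; the full article develops this with the Morpheus library.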