Algebraic geometry and topology traditionally focused on fairly pure mathematical considerations. With the rise of high-dimensional machine learning, these fields are increasingly being pulled into interesting computational applications such as manifold learning. Algebraic statistics and information geometry offer potential to help bridge these fields with modern statistics, especially time series and random matrices.

Early evidence suggests potential for significant intellectual cross-fertilization with finance, both mathematical and computational. Geometrically, these fields offer richer modeling and analysis of latent geometric structure than is available from classic linear-algebraic decomposition (e.g. PCA, one of the main workhorses of modern $\mathbb{P}$ finance); for example, cumulant component analysis. Topologically, they offer more effective qualitative analysis of data sampled from manifolds or singular algebraic varieties; for example, persistent homology (see CompTop).
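To make the contrast concrete, here is a minimal sketch of the linear-algebraic baseline: PCA as eigendecomposition of the sample covariance matrix. This is an illustrative implementation in numpy, not any particular library's API; the function name `pca` and the toy data are assumptions for the example.

```python
import numpy as np

def pca(X, k):
    """PCA via eigendecomposition of the sample covariance matrix.

    X: (n_samples, n_features) data matrix; k: number of components.
    Returns (components, variances) sorted by decreasing variance.
    """
    Xc = X - X.mean(axis=0)            # center each feature
    cov = np.cov(Xc, rowvar=False)     # (d, d) sample covariance
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k] # take the top-k by variance
    return vecs[:, order], vals[order]

# toy data: a correlated 2-D Gaussian cloud
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])
W, var = pca(X, 2)
```

The point of the geometric/topological program above is precisely what this sketch cannot see: second moments capture only an ellipsoidal (linear) summary of the data, whereas cumulant-based and topological methods target higher-order and qualitative structure.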

As evidenced by Twitter followers, numerous Quantivity readers are familiar with these fields. Thus, perhaps the best way to explore is to seek insight from readers.

Readers: please use comments or Twitter to suggest applied literature from these fields; ideally, though not required, literature of potential relevance to finance modeling. All types of literature are welcome, from intro texts to survey articles to preprint working papers on specific applications.

These suggestions will be synthesized into one or more subsequent posts, along with appropriate additions to People of Quant Research.

November 1, 2011 10:14 am

You mentioned cumulant component analysis in your post and I was wondering if you had any experience with using it? It seems very much a natural extension of the covariance decomposition that is performed in PCA. However, I would likely prefer to map our features into a kernel space and perform more easily understood linear transformations on them.
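The commenter's suggestion (map features into a kernel space, then apply familiar linear transformations there) is essentially kernel PCA. A minimal numpy sketch, assuming an RBF kernel and a hypothetical `kernel_pca` helper; this is an illustration of the standard construction (double-center the Gram matrix, then eigendecompose), not code from the post:

```python
import numpy as np

def kernel_pca(X, k, gamma=1.0):
    """Kernel PCA with an RBF kernel.

    Linear PCA is performed implicitly in the kernel-induced feature
    space by eigendecomposing the double-centered Gram matrix.
    Returns the (n_samples, k) projections of the training points.
    """
    sq = np.sum(X**2, axis=1)
    # RBF Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # center in feature space
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]          # top-k components
    # scale eigenvectors so columns are projections onto principal axes
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
Z = kernel_pca(X, 2)
```

Note the trade-off the comment hints at: the transformations stay linear and well understood in the feature space, but interpretability in the original coordinates is largely lost, which is one motivation for the explicit cumulant decompositions mentioned in the post.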