Latent Semantic Analysis - Implementation

Implementation

The SVD is typically computed using large-matrix methods (for example, Lanczos methods), but it can also be computed incrementally and with greatly reduced resources via a neural-network-like approach, which does not require the large, full-rank matrix to be held in memory. A fast, incremental, low-memory, large-matrix SVD algorithm has also been developed (Brand, 2003), and MATLAB and Python implementations of these fast algorithms are available. Unlike Gorrell and Webb's (2005) stochastic approximation, Brand's (2003) algorithm provides an exact solution.
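As a minimal sketch of the first approach, the following Python snippet computes a truncated SVD of a sparse term-document matrix with SciPy's svds routine, which is backed by an ARPACK/Lanczos-style iterative solver. The toy matrix and the choice of k = 2 latent dimensions are illustrative assumptions, not taken from the text above.

import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import svds

# Toy term-document count matrix: rows = terms, columns = documents.
A = csc_matrix(np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
    [1, 0, 2, 1],
], dtype=float))

k = 2                            # number of latent dimensions to keep
U, s, Vt = svds(A, k=k)          # iterative (Lanczos-style) truncated SVD

# svds returns singular values in ascending order; reorder them descending.
order = np.argsort(s)[::-1]
U, s, Vt = U[:, order], s[order], Vt[order, :]

# Low-rank document representations in the latent semantic space.
doc_vectors = np.diag(s) @ Vt    # shape: (k, n_documents)
print(doc_vectors.round(3))

The incremental route can be illustrated with an exact column-append update in the spirit of Brand-style incremental SVD: given the SVD of the matrix seen so far, a new document column is folded in by decomposing a small (k+1) x (k+1) core matrix, without ever forming the full matrix. The helper name append_column and the simplifying assumptions (no truncation, residual norm rho > 0) are illustrative, not part of the original text.

import numpy as np

def append_column(U, s, Vt, c):
    """Update the SVD U @ diag(s) @ Vt of A when a new column c is appended,
    without reconstructing the full matrix A."""
    k = s.shape[0]
    p = U.T @ c                  # component of c inside the current subspace
    r = c - U @ p                # residual orthogonal to the current subspace
    rho = np.linalg.norm(r)
    j = r / rho                  # assumes rho > 0 (c adds a new direction)

    # Small (k+1) x (k+1) core matrix whose SVD gives the updated factors.
    K = np.zeros((k + 1, k + 1))
    K[:k, :k] = np.diag(s)
    K[:k, k] = p
    K[k, k] = rho
    Uk, sk, Vkt = np.linalg.svd(K)

    # Rotate the enlarged factors by the core SVD.
    U_new = np.hstack([U, j[:, None]]) @ Uk
    Vt_new = Vkt @ np.block([
        [Vt, np.zeros((k, 1))],
        [np.zeros((1, Vt.shape[1])), np.ones((1, 1))],
    ])
    return U_new, sk, Vt_new

# Usage: start from the SVD of a small matrix, then append a document column.
A = np.array([[2., 0.], [1., 1.], [0., 2.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
c = np.array([1., 0., 1.])
U2, s2, Vt2 = append_column(U, s, Vt, c)
print(np.allclose(U2 @ np.diag(s2) @ Vt2, np.hstack([A, c[:, None]])))  # True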