Learning Update: Latent Semantic Analysis
An area of data science that I enjoy is natural language processing (NLP). There’s a technique in NLP that I’ve been learning about called latent semantic analysis (LSA). Long story short, LSA is a way to uncover the topics that run across a large collection of documents (like a bunch of news articles or tweets) and the words that best describe each topic. How does it work? Linear algebra!
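If you're curious what that looks like in practice, here's a minimal sketch of the idea using scikit-learn (the toy corpus, the two-topic setting, and the parameter choices are just mine for illustration, not anything from a particular textbook): TF-IDF turns the documents into a term-document matrix, truncated SVD factors that matrix, and the resulting components are the "topics", with the largest weights telling you which words describe each one.

```python
# Minimal LSA sketch with scikit-learn. The corpus and parameters
# below are illustrative placeholders, not real data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Toy "documents" -- in practice these would be news articles, tweets, etc.
docs = [
    "the stock market rallied as tech shares rose",
    "investors sold shares after the market dipped",
    "the team won the championship game last night",
    "fans celebrated the game-winning goal",
]

# Step 1: build a TF-IDF term-document matrix.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Step 2: truncated SVD on that matrix -- this is the "latent" part of LSA.
# Each component is a topic: a weighted combination of words.
svd = TruncatedSVD(n_components=2, random_state=0)
doc_topics = svd.fit_transform(X)  # documents expressed in topic space

# Step 3: look at the words that weigh most heavily on each topic.
terms = vectorizer.get_feature_names_out()
for i, component in enumerate(svd.components_):
    top = component.argsort()[::-1][:4]
    print(f"Topic {i}: {[terms[j] for j in top]}")
```

Each row of `doc_topics` is a document's coordinates in the reduced topic space, so documents that use similar vocabulary end up near each other there even if they don't share exact words.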
I’m definitely not a linear algebra master, so I didn’t feel solid on the math behind LSA. But I found an amazing lecture series on singular value decomposition by Steve Brunton that really clicked with my learning style. Why was this exciting for me? It connected with several things I’d been learning, like the discussion of LSA in this textbook by Kamath, Liu, and Whitaker, and this marvelous series of videos on linear algebra by Grant Sanderson. It was a real lightbulb moment. 💡
I find that making new connections is a really important part of the learning process. It means that I’m moving beyond a basic understanding of a topic to more advanced applications.
I’m curious: What sort of connections have you made in your learning recently? Anything exciting or unexpected?