Probability Theory: From Events to Numbers
The third installment in our Foundations series, where we connect sets and sigma-algebras to the actual rules of probability.
Complete collection of research publications, projects, and reading notes from Second Street Labs.
Notes on Manuel Castells’s framework for understanding cities in the Information Age—functions, meanings, and forms—and why it matters for urban networks.
Continuing our Foundations series, we explore sigma-algebras—the mathematical scaffolding that makes probability rigorous and connects abstract theory to real-world randomness in our models.
As part of our Foundations series on the mathematical underpinnings of machine learning, this installment covers set theory and its elegant applications to everything from LLMs to recommender systems.
Notes on the seminal Sekhari et al. paper. We break down the state of machine unlearning and what makes their method particularly attractive.
This paper presents a comprehensive framework for optimization on Riemannian manifolds, with applications to machine learning. We develop novel convergence guarantees and demonstrate strong performance on constrained optimization problems arising in neural networks and dimensionality reduction.
We introduce the Memory Pair framework: the first algorithm to learn and unlearn *continuously* on streaming data. It achieves logarithmic regret with deletions, constant memory via online L-BFGS, and $(\varepsilon,\delta)$-certified guarantees—enabling GDPR compliance without expensive retrains.