Fundamentals & History
From OLS and SVMs to information theory — the mathematical and statistical groundwork that every modern model still rests on.
Open the basics
A continuing project of NoteNextra. Bridging the gap from foundational machine learning to state-of-the-art research, one carefully ordered article at a time.
Our goal is to bridge the gap for undergraduates between introductory machine-learning courses and the current state of the art.
Project Veronica re-organizes scattered course notes — primarily from the WashU CSE/Math sequence — into a single, topologically ordered curriculum. Earlier articles never assume material that hasn't been introduced yet, so a curious undergraduate can read top-to-bottom and arrive at the frontier.
Fundamentals & History ──► Deep Neural Networks ──► Computer Vision
        │                          │
        ├──────────────────────────┴──► Large Language Models
        │
        └────► The Transformer Era (2017 → present)

Each section's sidebar is a topological sort of its prerequisites; the Transformer Era track is sorted chronologically, year by year.
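A topological sort like the one behind each sidebar can be sketched with Python's standard-library `graphlib`. The prerequisite map below is illustrative only (section names from the roadmap above, not the actual Veronica table of contents):

```python
from graphlib import TopologicalSorter

# Hypothetical prerequisite map: each entry lists the sections it depends on.
prereqs = {
    "Deep Neural Networks": {"Fundamentals & History"},
    "Computer Vision": {"Deep Neural Networks"},
    "Large Language Models": {"Fundamentals & History", "Deep Neural Networks"},
    "The Transformer Era": {"Fundamentals & History"},
}

# static_order() yields a linear order in which every section appears
# only after all of its prerequisites.
order = list(TopologicalSorter(prereqs).static_order())
print(order)
```

Any valid ordering places Fundamentals & History first, which is exactly the "read top-to-bottom" guarantee the curriculum makes.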
Source material is imported and adapted from Trance-0/NoteNextra under its original license. Each imported page links back to the upstream lecture.