
Project Veronica
Undergraduate ML → SOTA

A continuing project of NoteNextra. Bridging the gap from foundational machine learning to state-of-the-art research, one carefully ordered article at a time.

Why Veronica?

Project Veronica is a continuing project of NoteNextra. Its goal is to bridge the gap between undergraduate machine-learning coursework and state-of-the-art research.

Project Veronica re-organizes scattered course notes — primarily from the WashU CSE/Math sequence — into a single, topologically ordered curriculum. Earlier articles never assume material that hasn't been introduced yet, so a curious undergraduate can read top-to-bottom and arrive at the frontier.

How the curriculum is ordered

Fundamentals & History  ──► Deep Neural Networks ──► Computer Vision
                                  │                       │
                                  ├──────────► Large Language Models
                                  │                       │
                                  └────► The Transformer Era (2017 → present)

Each section's sidebar is a topological sort of its prerequisites; the Transformer Era track is sorted chronologically, year by year.
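To make the ordering concrete, here is a minimal sketch of how a sidebar can be derived from a prerequisite graph using Python's standard-library graphlib; the article names and dependencies below are hypothetical, not the project's actual table of contents.

    from graphlib import TopologicalSorter

    # Each article maps to the articles it assumes (its prerequisites).
    prerequisites = {
        "Linear Regression": [],
        "Backpropagation": ["Linear Regression"],
        "Convolutional Networks": ["Backpropagation"],
        "Attention": ["Backpropagation"],
        "The Transformer (2017)": ["Attention"],
    }

    # static_order() yields every article only after all of its
    # prerequisites, i.e. a valid top-to-bottom reading order.
    reading_order = list(TopologicalSorter(prerequisites).static_order())
    print(reading_order)

Any order printed by this sketch satisfies the same invariant the curriculum promises: no article appears before the material it depends on.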

Acknowledgments

Source material is imported and adapted from Trance-0/NoteNextra under its original license. Each imported page links back to the upstream lecture.

Released under the MIT License. Content imported and adapted from NoteNextra.