Back to Basics

ELI5 maths refreshers. Short, visual explainers for the building blocks that keep showing up in machine learning: logarithms, exponents, summation notation, and more.

8 parts

Part 1

Logarithms — What Are They?

7 min read

Part 1 of the Back to Basics series. Logarithms from first principles — what they are, why they exist, and why they show up everywhere in machine learning.

Part 2

Exponents, Powers, and the Number e

10 min read

Part 2 of the Back to Basics series. What exponents actually are, why anything to the power of zero is 1, and how the mysterious number e powers everything from …

Part 3

Summation and Product Notation (Σ and Π)

7 min read

Part 3 of the Back to Basics series. Breaking down sigma and pi notation from first principles — every piece explained, concrete examples, and the Python …

Part 4

Functions and Graphs

12 min read

Part 4 of the Back to Basics series. What a function actually is, how to read a graph, and the key functions that power machine learning — all visualised with …

Part 5

Fractions, Ratios, and Percentages

8 min read

Part 5 of the Back to Basics series. Fractions, ratios, and percentages — the language every ML metric is built on, from accuracy to F1 score.

Part 6

Inequalities and Absolute Values

9 min read

Part 6 of the Back to Basics series. Greater than, less than, absolute value as distance, and why these simple ideas power thresholds, error metrics, and …

Part 7

Coordinate Systems and Plotting

12 min read

Part 7 of the Back to Basics series. From the number line to n-dimensional feature space — how coordinate systems work and why every dataset is a cloud of …

Part 8

Sets and Set Notation

11 min read

Part 8 of the Back to Basics series. Sets and set notation from first principles — the foundation of probability theory, data splitting, and half the notation …