Not all AI is GenAI. Clusterflick is a K-Nearest Neighbors physics board game: place labelled training data, flick photo tokens onto the board, and classification happens by proximity.
Sister game to FuzzNet Labs: where FuzzNet opens the GenAI black box, Clusterflick shows you machine learning outside GenAI, transparent and distance-based, with no black box at all.
Machine learning is a whole family - GenAI is just one branch
K-NN classifies by similarity to known examples - no neural net required
The training set isn't compressed into weights - every sample stays visible
Knowing which kind of ML fits the job is what real AI literacy looks like
Most AI literacy training stops at GenAI. But "machine learning" covers a whole family of algorithms - K-Nearest Neighbors, decision trees, clustering, regression, support vector machines - most of which look nothing like a transformer. Clusterflick puts you inside one of the simplest and most transparent: K-NN. Place the labelled training data. Flick a new sample onto the board. Where it lands - and which neighbours it lands closest to - decides what it is.
When people say "AI" today they almost always mean GenAI - large neural networks like ChatGPT or image generators. But machine learning is a whole family of algorithms, and most of them aren't black boxes at all. K-Nearest Neighbors, decision trees, clustering, linear regression - these are all "AI" too, and many of them produce predictions you can actually inspect and explain. Clusterflick lets you experience one of those algorithms directly. Once you see how K-NN works, GenAI starts to look like one tool in a much larger toolbox.
K-NN doesn't "learn" anything in the neural-network sense. There's no training run, no gradient descent, no weights. To classify a new input, K-NN just measures the distance to every labelled example it knows about, looks at the k closest ones, and takes a vote. That's it. In Clusterflick, that "distance" is literal - physical distance on the board between your flicked token and the sample squares around it. Grasping this distance-based logic takes you most of the way to an intuition for classical ML in general.
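The whole algorithm fits in a few lines. Here is a minimal sketch of the measure-sort-vote logic described above, using made-up 2-D board coordinates and labels purely for illustration:

```python
import math
from collections import Counter

# Hypothetical labelled training samples: (x, y) board position -> label.
training = [
    ((1.0, 1.0), "cat"),
    ((1.5, 2.0), "cat"),
    ((5.0, 5.0), "dog"),
    ((6.0, 5.5), "dog"),
    ((5.5, 6.5), "dog"),
]

def classify(point, samples, k=3):
    """Classify `point` by majority vote among its k nearest samples."""
    # Measure the distance to every labelled example...
    by_distance = sorted(samples, key=lambda s: math.dist(point, s[0]))
    # ...then let the k closest ones vote.
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

print(classify((5.2, 5.8), training))  # lands among the dogs -> "dog"
print(classify((1.2, 1.4), training))  # lands among the cats -> "cat"
```

Note there is no training step at all: `classify` just compares the new point against the raw samples each time it is called.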
In a neural net, the training data gets compressed into millions of weights and effectively disappears - you can't point at any one number and say "this is what the model learned about cats." K-NN is the opposite extreme: the training data is the model. Every labelled sample stays visible, and predictions are made by direct comparison against them. That makes the consequences of bad data immediately obvious - mislabel one sample and you can watch its bad influence spread to everything that lands near it. It's a powerful, tactile lesson about why data quality matters more than algorithmic cleverness in most real ML projects.
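This "the data is the model" property can be demonstrated directly. In the sketch below (hypothetical points and labels, same distance-and-vote classifier as K-NN always uses), mislabeling a single sample flips the prediction for anything that lands near it when k=1, while a larger k can vote the bad label back down:

```python
import math
from collections import Counter

def classify(point, samples, k=3):
    """Vote among the k nearest labelled samples."""
    nearest = sorted(samples, key=lambda s: math.dist(point, s[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# The "model" is literally this list - nothing is compressed away.
clean = [((1, 1), "cat"), ((2, 1), "cat"), ((2, 2), "cat"),
         ((8, 8), "dog"), ((9, 8), "dog")]

# Mislabel one sample sitting in cat territory.
poisoned = clean[:2] + [((2, 2), "dog")] + clean[3:]

query = (2.2, 1.8)                         # a new token near the bad sample
print(classify(query, clean,    k=1))      # "cat"
print(classify(query, poisoned, k=1))      # "dog" - one bad label flips it
print(classify(query, poisoned, k=3))      # "cat" - more neighbours outvote it
```

Because every sample stays inspectable, you can point at exactly which data point caused the wrong answer, something that is essentially impossible once data has been dissolved into neural-network weights.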
Real AI fluency isn't being able to use ChatGPT. It's knowing which kind of model fits which kind of problem - and pushing back when someone reaches for GenAI on a problem that a 30-line classical ML algorithm would solve faster, cheaper, and more reliably. After Clusterflick, "AI" stops meaning "the magic chatbot" and starts meaning a family of techniques with very different tradeoffs. That's the difference between being talked at by AI vendors and asking the right questions.
Use Clusterflick to teach the variety of machine learning that isn't GenAI - classical, transparent, distance-based methods. Then use FuzzNet Labs to teach the neural-network side: layered architectures, training cycles, and why GenAI is the way it is. Together they show participants that "AI" is bigger than the chatbot they used yesterday.
Open with Clusterflick: Establish that ML is a family of algorithms - and that not all of them are mysterious. Participants build intuition for distance-based classification and the importance of training data placement.
Bridge to FuzzNet Labs: Now draw the contrast. What if your data has thousands of dimensions? What if "distance" stops being meaningful? That's where neural networks come in.
Close with reflection: Which kind of model would you reach for on the problem your team is actually facing? Most of the time, the answer isn't GenAI.
Clusterflick's tutorial is a self-paced overlay you can have participants run before or during the session. The Learn How to Play button walks through K-NN concepts and gameplay together, so you don't need to lecture before play begins.
Share the room code to invite players, or play solo vs a bot.
Have a room code? Enter it below to join.
This will cancel the game for all players. Are you sure?