KNN Visualizer
Explore the K-Nearest Neighbors algorithm with interactive animations and gamification features.
🧠 How KNN Works
K-Nearest Neighbors classifies a point by finding its K closest training points and taking a majority vote among their labels. Watch the animated connections to see this process in action! (A minimal code sketch follows the tips below.)
- Click anywhere to test classification
- Higher confidence = more certain prediction
- Try different K values for comparison
- Test boundary regions for insights
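To make the voting step concrete, here is a minimal sketch in TypeScript. The `Point` type, `classify` function, and sample data are hypothetical names introduced for illustration; this is not the visualizer's actual source.

```typescript
// Minimal KNN sketch: classify a point by majority vote among its k nearest
// training points. Illustrative only; Point, Coords, and classify are
// hypothetical names, not the visualizer's actual code.
interface Point { x: number; y: number; label: string; }
type Coords = { x: number; y: number };

function euclidean(a: Coords, b: Coords): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function classify(training: Point[], test: Coords, k: number): string {
  // Sort training points by distance to the test point; keep the k nearest.
  // (Assumes at least one training point.)
  const neighbors = [...training]
    .sort((a, b) => euclidean(a, test) - euclidean(b, test))
    .slice(0, k);

  // Count one vote per neighbor, grouped by class label.
  const votes = new Map<string, number>();
  for (const n of neighbors) votes.set(n.label, (votes.get(n.label) ?? 0) + 1);

  // The label with the most votes wins.
  let bestLabel = neighbors[0].label;
  let bestCount = 0;
  for (const [label, count] of votes) {
    if (count > bestCount) { bestLabel = label; bestCount = count; }
  }
  return bestLabel;
}

// Example: two A points near the origin, two B points far away.
const training: Point[] = [
  { x: 1, y: 1, label: "A" }, { x: 2, y: 1, label: "A" },
  { x: 8, y: 8, label: "B" }, { x: 9, y: 7, label: "B" },
];
console.log(classify(training, { x: 2, y: 2 }, 3)); // -> "A" (2 of 3 nearest are A)
```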
Controls
Dataset: Two clearly separated classes
Click on the canvas to add test points.
Legend
Class A
Class B
Class C
Class D
Class E
Selected Neighbor
Distance Line
🎯 Performance Dashboard
Total Score: 0
Test Points: 0
Avg Confidence: 0%
Achievements
- 🔒 First Step: Place your first test point
- 🔒 Explorer: Place 10 test points
- 🔒 Confident: Achieve 90% average confidence
- 🔒 Pattern Master: Test all datasets
💡 Tips for Higher Scores:
- Test boundary regions between classes
- Try different K values for the same point
- Explore areas where classes overlap
Understanding the Visualization
- Training points are shown as colored dots, representing different classes.
- Test points appear when you click on the canvas. They are classified based on their K nearest neighbors.
- Dashed lines connect test points to their nearest neighbors.
- The K value determines how many neighbors influence the classification. A higher K is often more robust to noise but may smooth over fine-grained patterns near class boundaries.
- Distance metrics affect how the algorithm measures similarity (see the sketches after this list):
  - Euclidean: Straight-line distance ("as the crow flies")
  - Manhattan: Sum of absolute coordinate differences ("city block" distance)
- Weighted KNN gives closer neighbors more influence in the classification, which can improve accuracy when distance is a strong indicator of similarity.
- Confidence shows how certain the algorithm is about the classification, displayed as a percentage and as a partial ring around each test point.
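The two metrics in the list map directly to two small functions. The sketch below reuses the `Coords` type and `euclidean` function from the earlier example; the `Metric` type and `manhattan` name are introduced here for illustration, not taken from the visualizer.

```typescript
// Interchangeable distance metrics (reusing Coords and euclidean from the
// sketch above). Each takes two points and returns a non-negative distance.
type Metric = (a: Coords, b: Coords) => number;

// Manhattan ("city block") distance: sum of absolute coordinate differences.
const manhattan: Metric = (a, b) => Math.abs(a.x - b.x) + Math.abs(a.y - b.y);

// The earlier euclidean function already fits the Metric signature.
// Worked comparison for (0, 0) vs (3, 4):
//   euclidean -> Math.hypot(3, 4) = 5   (straight line)
//   manhattan -> 3 + 4            = 7   (3 blocks across, 4 blocks up)
```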
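Weighted voting and confidence fit naturally on top of the same pieces. The sketch below weights each vote by inverse distance and reports the winner's share of the total weight as a confidence value; this is one plausible way to produce the percentage shown in the ring, and the visualizer's exact formula may differ.

```typescript
// Weighted KNN with a pluggable distance metric and a confidence score.
// Illustrative sketch; classifyWeighted is a hypothetical name. Reuses
// Point, Coords, euclidean, Metric, and manhattan from the sketches above.
function classifyWeighted(
  training: Point[],
  test: Coords,
  k: number,
  metric: Metric = euclidean,
): { label: string; confidence: number } {
  // Nearest k neighbors under the chosen metric.
  const neighbors = training
    .map(p => ({ p, d: metric(p, test) }))
    .sort((a, b) => a.d - b.d)
    .slice(0, k);

  // Inverse-distance weights: closer neighbors get a larger say. The small
  // epsilon avoids division by zero when a test point lands exactly on a
  // training point.
  const weightByLabel = new Map<string, number>();
  let totalWeight = 0;
  for (const { p, d } of neighbors) {
    const w = 1 / (d + 1e-9);
    weightByLabel.set(p.label, (weightByLabel.get(p.label) ?? 0) + w);
    totalWeight += w;
  }

  // Winner is the label with the largest accumulated weight; its share of
  // the total weight doubles as a confidence score in [0, 1].
  let label = neighbors[0].p.label;
  let best = 0;
  for (const [l, w] of weightByLabel) {
    if (w > best) { label = l; best = w; }
  }
  return { label, confidence: best / totalWeight };
}
```

On the toy data from the first sketch, `classifyWeighted(training, { x: 2, y: 2 }, 3, manhattan)` returns "A" with a confidence near 0.95: the two nearby A points contribute far more weight than the one distant B neighbor.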