KNN Visualizer

Explore the K-Nearest Neighbors algorithm with this interactive visualization. Click on the canvas to place test points and see real-time classification.

How KNN Works

K-Nearest Neighbors (KNN) classifies new data points based on their similarity to existing data points. It finds the K closest neighbors and assigns the most common class among them. Adjust the K value to see how it affects classification results.
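For concreteness, here is a minimal sketch of that procedure in TypeScript. The `Point` type, field names, and `classify` function are illustrative assumptions, not the visualizer's actual code:

```ts
type Point = { x: number; y: number; label: string };

// Euclidean (straight-line) distance from a training point to a query position.
function euclidean(a: Point, b: { x: number; y: number }): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Classify a query position by majority vote among its K nearest training points.
function classify(train: Point[], query: { x: number; y: number }, k: number): string {
  // Sort a copy of the training set by distance to the query and keep the K closest.
  const neighbors = [...train]
    .sort((a, b) => euclidean(a, query) - euclidean(b, query))
    .slice(0, k);

  // Tally one vote per neighbor for its class label.
  const votes = new Map<string, number>();
  for (const n of neighbors) {
    votes.set(n.label, (votes.get(n.label) ?? 0) + 1);
  }

  // Return the label with the most votes.
  return [...votes.entries()].sort((a, b) => b[1] - a[1])[0][0];
}
```

Clicking the canvas then amounts to something like `classify(trainingPoints, { x, y }, k)` with the current K value.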

Controls

K value — number of neighbors used for classification (adjustable control)
Dataset — two clearly separated classes

Click on the canvas to add test points.

Legend

Class A
Class B
Class C
Class D
Class E
Selected Neighbor
Distance Line

Understanding the Visualization

  • Training points are shown as colored dots, representing different classes.
  • Test points appear when you click on the canvas. They are classified based on their K nearest neighbors.
  • Dashed lines connect test points to their nearest neighbors.
  • The K value determines how many neighbors influence the classification. A higher K averages over more neighbors and is more robust to noise, but can smooth over small, genuine patterns; a lower K tracks local structure closely but is more easily swayed by outliers.
  • Distance metrics affect how the algorithm measures similarity (both are sketched in code after this list):
    • Euclidean: Straight-line distance ("as the crow flies")
    • Manhattan: Sum of absolute differences ("city block" distance)
  • Weighted KNN gives closer neighbors more influence in the classification, which can improve accuracy when distance is a strong indicator of similarity (see the weighted-vote sketch after this list).
  • Confidence shows how certain the algorithm is about the classification, displayed as a percentage and as a partial ring around each test point (one possible computation is sketched after this list).
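The two distance metrics from the list above, side by side. The `XY` coordinate type is an illustrative assumption:

```ts
type XY = { x: number; y: number };

// Euclidean: straight-line distance ("as the crow flies").
const euclidean = (a: XY, b: XY): number => Math.hypot(a.x - b.x, a.y - b.y);

// Manhattan: sum of absolute differences along each axis ("city block" distance).
const manhattan = (a: XY, b: XY): number =>
  Math.abs(a.x - b.x) + Math.abs(a.y - b.y);
```

Swapping `manhattan` in for `euclidean` in the earlier `classify` sketch changes which points count as "nearest", which is exactly the effect the metric control demonstrates.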
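Weighted KNN is commonly implemented with inverse-distance weights; the visualizer doesn't state its exact formula, so treat this as a representative sketch:

```ts
// Weighted vote: each neighbor contributes 1 / distance rather than a flat 1,
// so closer neighbors carry more influence. The epsilon term (an assumption)
// guards against division by zero when a neighbor sits exactly on the test point.
function weightedVote(
  neighbors: { label: string; dist: number }[],
  epsilon = 1e-9
): string {
  const votes = new Map<string, number>();
  for (const n of neighbors) {
    votes.set(n.label, (votes.get(n.label) ?? 0) + 1 / (n.dist + epsilon));
  }
  return [...votes.entries()].sort((a, b) => b[1] - a[1])[0][0];
}
```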
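One plausible way the confidence percentage could be derived from the vote tally (an assumption; the visualizer's exact formula isn't shown):

```ts
// Confidence as the winning class's share of the total (possibly weighted) vote.
// 1.0 means every neighbor agreed; values near 1 / numberOfClasses mean a toss-up.
function confidence(votes: Map<string, number>): number {
  const total = [...votes.values()].reduce((sum, v) => sum + v, 0);
  const winner = Math.max(...votes.values());
  return winner / total; // e.g. 0.8 → displayed as 80%, and 80% of the ring
}
```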