This project features a custom YOLOv8 model that recognizes letters of the English alphabet from hand gestures. The model leverages YOLOv8's robust detection capabilities to locate hands and accurately classify the gestures corresponding to individual letters, making it well suited to sign language interpretation applications.
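
Below is a minimal inference sketch using the Ultralytics Python API, assuming the trained gesture weights are exported as a `.pt` file; the filenames `best.pt` and `hand_sign.jpg` are placeholders, not files shipped with this repository.

```python
from ultralytics import YOLO

# Load the custom-trained gesture recognition weights (placeholder path).
model = YOLO("best.pt")

# Run detection on a single image; each detected box carries a class label,
# where each class corresponds to one letter of the alphabet.
results = model.predict(source="hand_sign.jpg", conf=0.5)

for result in results:
    for box in result.boxes:
        letter = model.names[int(box.cls)]   # predicted letter class
        confidence = float(box.conf)         # detection confidence
        print(f"Detected letter: {letter} ({confidence:.2f})")
```

The same `predict` call accepts a video file or webcam index (e.g. `source=0`) for real-time recognition.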