| --- |
| tags: |
| - ml-intern |
| --- |
| # Golf Ball Tracker for Mobile Phone Camera |
|
|
| A lightweight, real-time golf ball detection and tracking model optimized for mobile deployment. |
|
|
| ## Model Overview |
|
|
| - **Architecture**: YOLOv8-nano (3M parameters, 8.1 GFLOPs) |
| - **Training Data**: 559 real ball images + 500 synthetic golf ball images |
| - **Test Performance**: mAP50 = 81.2%, mAP50-95 = 58.6% |
| - **Formats**: PyTorch (.pt), ONNX (.onnx) |
| - **Input Size**: 640x640 |
| - **Target FPS**: ~30 FPS on modern mobile devices (via ONNX Runtime or TFLite) |
|
|
| ## Use Cases |
|
|
| - **Golf shot analysis**: Track ball flight from tee to landing |
| - **Swing coaching**: Visual feedback on ball trajectory |
| - **Mobile golf apps**: Real-time ball tracking using phone camera |
| - **Driving range**: Automated ball flight recording |
|
|
| ## Mobile Deployment |
|
|
| ### iOS (CoreML) |
| ```python |
| from ultralytics import YOLO |
| model = YOLO("best.pt") |
| model.export(format="coreml", imgsz=640) |
| ``` |
|
|
| ### Android (TFLite) |
| ```python |
| from ultralytics import YOLO |
| model = YOLO("best.pt") |
| model.export(format="tflite", imgsz=640, int8=True) |
| ``` |
|
|
| ### Cross-Platform (ONNX Runtime) |
| ```python |
| import onnxruntime as ort |
| session = ort.InferenceSession("best.onnx") |
| # Use session for inference on any platform |
| ``` |
|
|
| ## Quick Start |
|
|
| ```python |
| from ultralytics import YOLO |
| |
| # Load model |
| model = YOLO("best.pt") |
| |
| # Detect golf balls in an image |
| results = model("golf_shot.jpg") |
| results[0].show() |
| |
| # Track ball across video frames |
| for frame in video_stream: |
| results = model.track(frame, persist=True) |
| # results[0].boxes.xywh provides bounding boxes |
| ``` |
|
|
| ## Tracking Pipeline |
|
|
| For full trajectory tracking with Kalman filtering and ballistic prediction, see `golf_ball_tracker.py`: |
|
|
| ```python |
| from golf_ball_tracker import GolfBallTracker |
| |
| tracker = GolfBallTracker("best.onnx") |
| tracker.track_video("input.mp4", "output_tracked.mp4") |
| ``` |
|
|
| The tracker includes: |
| - **YOLO detection**: Finds golf ball in each frame |
| - **Kalman filtering**: Smooths trajectory, handles missed detections |
| - **Ballistic prediction**: Predicts flight path when ball is occluded or too small |
| - **Trajectory history**: Stores last 100 positions for visualization |
|
|
| ## Dataset |
|
|
| The model was trained on: |
| 1. **Zenodo Accurate Balls Detection** dataset (559 images of various sports balls) |
| 2. **500 synthetic golf ball images** with varied: |
| - Backgrounds (sky, grass, golf course, indoor, dark) |
| - Ball sizes (4-40 pixels radius, simulating distance) |
| - Motion blur (0-5 levels, simulating high-speed flight) |
| - Brightness variations (0.4-1.7x) |
| - Noise and lighting changes |
|
|
| ## Training Recipe |
|
|
| ```python |
| from ultralytics import YOLO |
| |
| model = YOLO("yolov8n.pt") |
| model.train( |
| data="golf_ball.yaml", |
| epochs=5, # Short training (CPU-friendly) |
| imgsz=640, |
| batch=4, |
| device="cpu", |
| augment=True, |
| mosaic=1.0, |
| scale=0.5, # Critical for small object detection |
| hsv_h=0.015, |
| hsv_s=0.7, |
| hsv_v=0.4, |
| ) |
| ``` |
|
|
| **Key insights for golf ball detection**: |
| - High-resolution features (640x640 input) |
| - Heavy scale augmentation (balls appear at different distances) |
| - Motion blur augmentation (golf balls move at 150+ mph) |
| - Brightness variation (white ball against sky/grass) |
|
|
| ## Performance Tips for Mobile |
|
|
| 1. **Use 320x320 input** for 4x faster inference (small accuracy trade-off) |
| 2. **Quantize to INT8** for 2-4x speedup on mobile NPUs |
| 3. **Frame skipping**: Run detection every 3rd frame, interpolate between |
| 4. **ROI tracking**: After initial detection, only search nearby region |
| 5. **Hardware acceleration**: Use NNAPI (Android) or CoreML (iOS) |
|
|
| ## Limitations |
|
|
| - Model trained on mixed sports ball data (football, etc.) + synthetic golf balls |
| - Real golf ball flight data would improve performance significantly |
| - Small balls at extreme distances (>100 yards) may be challenging |
| - Motion blur at very high speeds may reduce detection rate |
| - Night/low-light conditions not specifically trained |
|
|
| ## Citation |
|
|
| ```bibtex |
| @software{golf_ball_tracker, |
| title = {Golf Ball Tracker for Mobile Phone Camera}, |
| author = {ML Intern}, |
| year = {2026}, |
| url = {https://huggingface.co/notjulietxd/golf-ball-tracker} |
| } |
| ``` |
|
|
| ## License |
|
|
| Apache-2.0 |
|
|
| <!-- ml-intern-provenance --> |
| ## Generated by ML Intern |
|
|
| This model repository was generated by [ML Intern](https://github.com/huggingface/ml-intern), an agent for machine learning research and development on the Hugging Face Hub. |
|
|
| - Try ML Intern: https://smolagents-ml-intern.hf.space |
| - Source code: https://github.com/huggingface/ml-intern |
|
|
| ## Usage |
|
|
| ```python |
| from transformers import AutoModelForCausalLM, AutoTokenizer |
| |
| model_id = 'notjulietxd/golf-ball-tracker' |
| tokenizer = AutoTokenizer.from_pretrained(model_id) |
| model = AutoModelForCausalLM.from_pretrained(model_id) |
| ``` |
|
|
| For non-causal architectures, replace `AutoModelForCausalLM` with the appropriate `AutoModel` class. |
|
|