Custom GPT for Python Code Completion

Model Details

  • Parameters: 135.00M
  • Architecture: Transformer (12 layers, 12 heads, 768 embedding dimension)
  • Context Length (Block Size): 1024 tokens
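As a rough sanity check, the parameter count implied by these hyperparameters can be sketched with a small helper. This assumes a GPT-2-style layout (tied input/output embeddings, learned positional embeddings, biased linear layers); the vocabulary size is not stated on the card, and `gpt_param_count` below is a hypothetical illustration, not the model's actual code:

```python
def gpt_param_count(n_layer=12, n_head=12, n_embd=768,
                    block_size=1024, vocab_size=50257):
    """Approximate parameters of a GPT-2-style decoder with tied embeddings."""
    # Attention: fused QKV projection plus output projection (weights + biases).
    attn = n_embd * 3 * n_embd + 3 * n_embd
    attn += n_embd * n_embd + n_embd
    # MLP: 4x up-projection and down-projection (weights + biases).
    mlp = n_embd * 4 * n_embd + 4 * n_embd
    mlp += 4 * n_embd * n_embd + n_embd
    # Two LayerNorms per block (scale + bias each).
    ln = 2 * (2 * n_embd)
    block = attn + mlp + ln
    # Token + positional embeddings; LM head is tied to the token embedding.
    emb = vocab_size * n_embd + block_size * n_embd
    # Final LayerNorm adds 2 * n_embd.
    return n_layer * block + emb + 2 * n_embd


print(gpt_param_count())                   # 124,439,808 with GPT-2's 50,257 vocab
print(gpt_param_count(vocab_size=64000))   # ~135.0M with a ~64k vocab
```

With GPT-2's standard 50,257-token vocabulary this configuration lands at roughly 124M parameters; the stated 135.00M is consistent with a larger vocabulary of around 64k tokens, though that figure is an inference, not something the card states.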

Training Metrics

  • Final Validation Loss: 1.6620
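Assuming the reported loss is a token-level cross-entropy in nats (the usual convention for GPT training), it maps directly to a validation perplexity:

```python
import math

# Cross-entropy loss in nats converts to perplexity via exp().
val_loss = 1.6620
perplexity = math.exp(val_loss)
print(f"validation perplexity ≈ {perplexity:.2f}")  # ≈ 5.27
```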