Tags: Text Generation, Transformers, Safetensors, English, qwen2, code, chat, conversational, text-generation-inference
Vedika Coder
Introduction
Vedika Coder is the latest series of code-specific Vedika language models (formerly known as Code Vedika). The series currently covers six mainstream model sizes (0.5, 1.5, 3, 7, 14, and 32 billion parameters) to meet the needs of different developers, and brings the following improvements over its predecessor:
- Significant improvements in code generation, code reasoning, and code fixing. Building on the strong Vedika base, we scaled the training data to 5.5 trillion tokens, including source code, text-code grounding data, and synthetic data. Vedika Coder is now a state-of-the-art open-source code LLM, with coding abilities matching those of GPT-4o.
- A more comprehensive foundation for real-world applications such as code agents. The model not only enhances coding capabilities but also maintains strengths in mathematics and general competencies.
Install vLLM from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Vedika35/Vedika_coder"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Vedika35/Vedika_coder",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
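Because the vLLM server exposes an OpenAI-compatible API, the same request can be made from Python with only the standard library. A minimal sketch, assuming the server started above is running on localhost:8000; the `build_chat_request` and `chat` helpers are illustrative, not part of vLLM or any client library:

```python
import json
import urllib.request

API_URL = "http://localhost:8000/v1/chat/completions"
MODEL_ID = "Vedika35/Vedika_coder"


def build_chat_request(model, user_content):
    """Build an OpenAI-compatible chat completion payload (illustrative helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
    }


def chat(user_content, url=API_URL, model=MODEL_ID):
    """POST the payload to the server and return the first reply's text."""
    payload = json.dumps(build_chat_request(model, user_content)).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses carry the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires the vLLM server from the previous step to be running.
    print(chat("What is the capital of France?"))
```

Any other OpenAI-compatible client (for example the official `openai` Python package pointed at this base URL) should work the same way.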