Yi-Coder
Basic Information
- Company/Brand: 01.AI
- Country/Region: China
- GitHub: https://github.com/01-ai/Yi-Coder
- Hugging Face: https://huggingface.co/01-ai/Yi-Coder-9B
- Type: Open-source code large language model
- Release Date: September 2024
- License: Apache 2.0
Product Description
Yi-Coder is an open-source code language model series launched by 01.AI (founded by Kai-Fu Lee), achieving industry-leading coding performance with fewer than 10 billion parameters. It excels in code generation, code editing, debugging, and even mathematical reasoning, supports 52 major programming languages, and offers a maximum context length of 128K tokens.
Yi-Coder-9B is built on Yi-9B and trained on an additional 2.4 trillion high-quality tokens, sourced from GitHub's repository-level code corpus and code-related data filtered from CommonCrawl.
Core Features/Characteristics
- Code generation and completion
- Code editing and refactoring
- Debugging assistance
- Mathematical reasoning
- Long context understanding (128K tokens)
- Support for 52 programming languages
- High performance with small parameter size
Model Variants
| Model | Parameters | Description |
|---|---|---|
| Yi-Coder-9B | 9B | Base model (large version) |
| Yi-Coder-9B-Chat | 9B | Optimized for dialogue and code tasks |
| Yi-Coder-1.5B | 1.5B | Lightweight base model |
| Yi-Coder-1.5B-Chat | 1.5B | Lightweight dialogue version |
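As a rough illustration of choosing between the variants above, here is a minimal sketch. The memory figures are back-of-the-envelope assumptions (fp16 weights at roughly 2 bytes per parameter, ignoring activations and KV cache), not official requirements:

```python
# Rule of thumb: fp16 weights take ~2 bytes per parameter, so the 9B model
# needs ~18 GB and the 1.5B model ~3 GB just for its weights.
# Both the estimates and this helper are illustrative, not official guidance.

VARIANTS = {
    "Yi-Coder-9B-Chat": 9.0,    # parameter count in billions
    "Yi-Coder-1.5B-Chat": 1.5,
}

def pick_variant(vram_gb: float, bytes_per_param: float = 2.0) -> str:
    """Return the largest chat variant whose fp16 weights fit in vram_gb."""
    for name, params_b in sorted(VARIANTS.items(), key=lambda kv: -kv[1]):
        if params_b * bytes_per_param <= vram_gb:
            return name
    raise ValueError("No variant fits; consider quantized weights.")

print(pick_variant(24))  # a 24 GB GPU fits the 9B chat model
print(pick_variant(8))   # an 8 GB GPU fits only the 1.5B chat model
```

In practice, quantized builds (e.g. 4-bit) shrink these footprints considerably, which is how the 9B model can run on consumer hardware.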
Performance
- Yi-Coder-9B-Chat achieves a 23% pass rate on LiveCodeBench
- The only model with fewer than 10B parameters to exceed a 20% pass rate
- Outperforms DeepSeekCoder-33B-Ins (22.3%)
- Outperforms CodeGeex4-9B-all (17.8%)
- Outperforms CodeLlama-34B-Ins (13.3%)
Business Model
Completely open source and free under the Apache 2.0 license. The models are available through platforms such as Hugging Face and Ollama.
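As a sketch of how a chat prompt is assembled for the chat variants, assuming the ChatML-style template that Yi chat models use (when loading via Hugging Face transformers, `tokenizer.apply_chat_template` applies the official template automatically, so hand-rolling the string below is purely illustrative):

```python
# Illustrative ChatML-style prompt builder for Yi-Coder chat models.
# The exact template is an assumption here; with transformers, prefer
# tokenizer.apply_chat_template, which uses the template shipped with the model.

def build_chatml_prompt(messages: list[dict]) -> str:
    """Render a list of {role, content} messages as a ChatML-style prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "user", "content": "Write a Python function that reverses a string."},
])
print(prompt)
```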
Target Users
- Developers in need of small, high-performance code models
- Local deployment and edge computing scenarios
- Resource-constrained environments
- Developers of code assistance tools
Competitive Advantages
- Extremely high parameter efficiency (9B surpasses 33B models)
- Apache 2.0 license (business-friendly)
- 128K ultra-long context window
- Availability of a 1.5B ultra-lightweight version
- Coverage of 52 programming languages
- Brand influence of 01.AI in the AI field
Market Performance
- Covered by international media outlets like VentureBeat
- Set a performance benchmark in the small-parameter code model field
- Active usage in the open-source community
- The 1.5B version offers unique advantages for deployment on edge devices
Relationship with OpenClaw
Yi-Coder's small parameter size and high performance make it well suited for running OpenClaw in resource-constrained environments; the 1.5B version can even run on mobile devices, complementing OpenClaw's mobile support. As a Chinese AI company, 01.AI also complements OpenClaw's reach among Chinese users.
External References
Learn more from these authoritative sources: