CodeGeeX is a large-scale multilingual code generation model with 13 billion parameters, trained on 850B tokens spanning more than 20 programming languages. Developed with MindSpore and later made PyTorch-compatible, it handles multilingual code generation, cross-lingual code translation, code completion, summarization, and explanation. On HumanEval-X, a multilingual program synthesis benchmark introduced alongside the model, CodeGeeX outperforms comparable open models such as InCoder and CodeGen.

CodeGeeX also powers free IDE plugins for VS Code and JetBrains, offering code completion, translation, debugging, and annotation. The model runs on both Ascend 910 and NVIDIA GPUs, with quantization and FasterTransformer acceleration available for faster inference.
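For steering generation toward a target language, the CodeGeeX repository describes prefixing prompts with a language tag comment. The helper below sketches that prompt construction; the exact tag format and any model-loading details are assumptions to verify against your checkpoint's documentation.

```python
def build_prompt(language: str, description: str, signature: str) -> str:
    """Assemble a completion prompt with a leading language tag.

    The "# language: X" tag follows the convention used in the CodeGeeX
    repository to steer multilingual generation; treat the exact format
    as an assumption if your checkpoint or tooling differs.
    """
    return f"# language: {language}\n# {description}\n{signature}\n"


prompt = build_prompt(
    "Python",
    "compute the factorial of n",
    "def factorial(n):",
)
print(prompt)
```

The resulting string is what you would feed to the model's tokenizer; the model then continues from the open function signature.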
## Features
- 13B parameter transformer model trained on 850B tokens
- Supports 20+ programming languages (Python, C++, Java, Go, JavaScript, etc.)
- Multilingual code generation and cross-lingual code translation
- HumanEval-X benchmark for functional correctness evaluation
- Open source with support for Ascend 910 and NVIDIA GPUs
- IDE integration: free VS Code and JetBrains plugins
- Optimized for deployment with quantization and FasterTransformer acceleration
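HumanEval-X scores functional correctness with the unbiased pass@k estimator (introduced with the original HumanEval benchmark): generate `n` samples per problem, count the `c` that pass all unit tests, and estimate the probability that at least one of `k` drawn samples is correct. A minimal sketch:

```python
import math


def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator.

    n: total samples generated for a problem
    c: samples that pass all unit tests
    k: evaluation budget
    Computes 1 - C(n-c, k) / C(n, k) without large binomials.
    """
    if n - c < k:
        # Fewer than k incorrect samples: some correct one is always drawn.
        return 1.0
    return 1.0 - math.prod((n - c - i) / (n - i) for i in range(k))


# With 200 samples and 30 passing, pass@1 reduces to c/n:
print(round(pass_at_k(200, 30, 1), 3))  # 0.15
```

The per-problem estimates are averaged over the benchmark's problems to produce the reported pass@1/pass@10/pass@100 figures.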