
Yi-Coder-9B-Chat by 01-ai

Model creator: 01-ai
Original model: Yi-Coder-9B-Chat
AWQ quantization: done by stelterlab in INT4 GEMM with AutoAWQ by casper-hansen (https://github.com/casper-hansen/AutoAWQ/)
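
For reference, below is a minimal sketch of how an INT4 GEMM AWQ quantization like this one can be produced with AutoAWQ. The quant_config values are common AutoAWQ defaults and are assumptions, not the exact settings used for this repository.

```python
# Sketch: producing an INT4 GEMM AWQ quant of the original chat model with AutoAWQ.
# quant_config values are typical defaults (assumed), not the settings used for this repo.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

base_model = "01-ai/Yi-Coder-9B-Chat"
quant_path = "Yi-Coder-9B-Chat-AWQ"
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

model = AutoAWQForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Calibrate and quantize the weights, then write the quantized checkpoint to disk.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```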

Model Summary:

Yi-Coder-9B-Chat is a coding model from 01-ai that supports 52 programming languages and a maximum context length of 128K tokens, making it well suited for working with large codebases.
The model is tuned for chat rather than code completion, so pose programming questions conversationally (see the usage sketch below).
It is the first model under 10B parameters to exceed a 20% pass rate on LiveCodeBench.
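
A minimal usage sketch with transformers, assuming autoawq and accelerate are installed and a CUDA GPU is available; the prompt and generation settings are illustrative only.

```python
# Sketch: chatting with the AWQ quant through transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stelterlab/Yi-Coder-9B-Chat-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."},
]
# Build the chat-formatted prompt and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```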

Technical Details

Trained on an extensive set of languages:

  • java
  • markdown
  • python
  • php
  • javascript
  • c++
  • c#
  • c
  • typescript
  • html
  • go
  • java_server_pages
  • dart
  • objective-c
  • kotlin
  • tex
  • swift
  • ruby
  • sql
  • rust
  • css
  • yaml
  • matlab
  • lua
  • json
  • shell
  • visual_basic
  • scala
  • rmarkdown
  • pascal
  • fortran
  • haskell
  • assembly
  • perl
  • julia
  • cmake
  • groovy
  • ocaml
  • powershell
  • elixir
  • clojure
  • makefile
  • coffeescript
  • erlang
  • lisp
  • toml
  • batchfile
  • cobol
  • dockerfile
  • r
  • prolog
  • verilog

With its 128K context length, it achieves a 23% pass rate on LiveCodeBench, surpassing even some state-of-the-art 15B-33B models.
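
To take advantage of the long context window when feeding in large amounts of code, a dedicated serving stack such as vLLM can be used. The sketch below is illustrative: max_model_len, the sampling settings, and the prompt are assumed values, not recommended or benchmarked ones.

```python
# Sketch: serving the AWQ quant with vLLM for longer-context prompts.
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_id = "stelterlab/Yi-Coder-9B-Chat-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# max_model_len is an assumption; raise it toward 128K if GPU memory allows.
llm = LLM(model=model_id, quantization="awq", max_model_len=32768)

prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Explain what this function does:\n\ndef f(xs): return sorted(set(xs))"}],
    tokenize=False,
    add_generation_prompt=True,
)
out = llm.generate([prompt], SamplingParams(temperature=0.2, max_tokens=512))
print(out[0].outputs[0].text)
```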

For more information, see the original model card: Yi-Coder-9B-Chat.

Model tree for stelterlab/Yi-Coder-9B-Chat-AWQ

Base model: 01-ai/Yi-Coder-9B (this repository is a quantized model in its tree)