DeepSeek Coder 33B Instruct
accounts/fireworks/models/deepseek-coder-33b-instruct
DeepSeek Coder comprises a series of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in English and Chinese. deepseek-coder-33b-instruct is a 33B-parameter model initialized from deepseek-coder-33b-base and fine-tuned on 2B tokens of instruction data.
On-demand deployments let you run DeepSeek Coder 33B Instruct on dedicated GPUs using Fireworks' high-performance serving stack, with high reliability and no rate limits.
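As a minimal sketch of how this model can be queried, the example below builds an OpenAI-style chat-completions payload for the model path above and posts it to Fireworks' inference endpoint. It assumes a `FIREWORKS_API_KEY` environment variable; the endpoint URL and payload shape follow the standard OpenAI chat format.

```python
import json
import os
import urllib.request

# Fireworks' OpenAI-compatible chat completions endpoint (assumed here).
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
MODEL = "accounts/fireworks/models/deepseek-coder-33b-instruct"


def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat completion payload for the model."""
    return {
        "model": MODEL,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }


def complete(prompt: str) -> str:
    """POST the payload; requires FIREWORKS_API_KEY in the environment."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        API_URL,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The completion text sits in the first choice's message content.
    return body["choices"][0]["message"]["content"]
```

The same request body works against a dedicated on-demand deployment; only the model identifier (or endpoint) changes to point at your deployment.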