DeepSeek V3 is a Mixture-of-Experts (MoE) LLM with 671B total parameters, of which only about 37B are active per token, so its inference cost is closer to that of a much smaller dense model. It is reportedly close to Claude Sonnet 3.5 on benchmarks, with particular strengths in reasoning.
I would be very interested in seeing this model added to Assistant, especially as it may be significantly cheaper per token. Since it is an open-weight model (like Llama), self-hosting is also an option. Depending on real-world performance, it could make for a better default code assistant; early reports on Reddit are promising.
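One practical point in favor of adding it: DeepSeek advertises an OpenAI-compatible chat completions API, so an existing OpenAI-style integration should mostly just need a different base URL and model name. Below is a minimal sketch of the request payload such an integration would send; the endpoint URL and `deepseek-chat` model id reflect DeepSeek's public docs at the time of writing, but treat them as assumptions to verify.

```python
import json

# Assumed OpenAI-compatible endpoint for DeepSeek's hosted API (verify
# against current DeepSeek docs before relying on it):
DEEPSEEK_ENDPOINT = "https://api.deepseek.com/chat/completions"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload for DeepSeek V3."""
    return {
        "model": "deepseek-chat",  # assumed model id for DeepSeek V3
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic-ish output for a code assistant
    }

payload = build_chat_request("Explain MoE routing in one sentence.")
# This payload would be POSTed as JSON to DEEPSEEK_ENDPOINT with an
# Authorization: Bearer <api-key> header, exactly as with OpenAI's API.
print(json.dumps(payload, indent=2))
```

Because the request/response shape matches OpenAI's, swapping the model into an assistant backend is largely a configuration change rather than a new integration.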