8 AI models in one - Mixtral 8x7B

MustacheAI
Mixtral 8x7B has a context length of 32K tokens, and outperforms GPT 3.5 and Llama 2 70B in benchmarks.
llama.cpp already supports Mixtral-8x7B, and it runs quite fast on CPU alone.
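
A minimal sketch of what running it locally could look like, assuming the llama-cpp-python bindings and a quantized GGUF file (the filename and parameter values below are illustrative, not from the video):

    # Assumed setup: pip install llama-cpp-python and a local Mixtral GGUF file.
    from llama_cpp import Llama

    llm = Llama(
        model_path="mixtral-8x7b-instruct-v0.1.Q4_K_M.gguf",  # hypothetical local file
        n_ctx=32768,   # Mixtral's 32K context window
        n_threads=8,   # CPU threads; adjust to your machine
    )

    # Mistral-style instruct prompt; prints the generated continuation.
    out = llm("[INST] Summarize mixture-of-experts in one sentence. [/INST]", max_tokens=128)
    print(out["choices"][0]["text"])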

More about Mixtral - https://mistral.ai/news/mixtral-of-ex...
Hugging Face (official Instruct ver.) - https://huggingface.co/mistralai/Mixt...
More about OpenRouter - Free AI models on OpenRouter - SillyT...

X (Twitter) - MustacheAI

Music - Meat Grinder (Slaughterhouse) - Katana ZERO
Published 8 months ago, on 1402/09/20.
3,362 views