Community
Questions & Answers
Feedback
What Twitter is saying
The OpenAI of Europe has arrived? @MistralAI, in addition to releasing a new model, has received a major funding boost. Open-source generative AI startup Mistral AI raises $415M in funding

— AI Supremacy (@AISupremacyNews) December 11, 2023

Mistral's new model is Mixtral-8x7B. pic.twitter.com/jyWNYOZCaE

Mistral AI, the Paris-based artificial intelligence startup, raises €385 million that will go toward advancing its open-source software and to create a European rival to US tech giants https://t.co/d5VrS73ufW

— Bloomberg (@business) December 11, 2023

Running AI locally on my machine using LM Studio, running Dolphin 2.1 Mistral 7B - GGUF pic.twitter.com/UMLUxcLH5R

— Scott O'Nanski (@NanskiO) December 4, 2023

Mistral 8x7B is now available in LangSmith Playground. It uses the implementation by the Fireworks AI team, who reverse-engineered the architecture from the parameter names. This isn't an official implementation, as the model code hasn't been released. pic.twitter.com/XrExDMSzYq
— Shubham Saboo (@Saboo_Shubham_) December 10, 2023

The just-released mixture-of-experts model MistralAI 8x7B is now available to try online! MistralAI 8x7B is an MoE model that combines eight 7B models, just like ChatGPT4. https://t.co/ygvbITSOUg pic.twitter.com/7AtzhUwMPc

— nash_su - e/acc (@nash_su) December 10, 2023

Have you heard of Mixture of Experts (MoE) models? 🤔 With the release of @MistralAI Mixtral 8x7B, MoEs are gaining attention; it is also rumored @OpenAI GPT-4 is an MoE 👀 But what exactly are MoEs, and how do they work? We created an in-depth blog. https://t.co/a4DL1A8zCe

— Philipp Schmid (@_philschmid) December 11, 2023
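Several of the tweets above mention Mixture of Experts (MoE) routing, the idea behind Mixtral 8x7B: a small gating network scores a set of expert networks per token, and only the top-scoring experts run. The sketch below is a minimal, illustrative NumPy version, not Mixtral's actual code; the toy dimensions, the single linear layer standing in for each expert FFN, and the names `router` and `moe_forward` are all assumptions for illustration. Mixtral routes each token to 2 of 8 experts, which is the shape mirrored here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only; the real model's dimensions are far larger.
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is reduced here to one linear map standing in for an FFN block.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.1  # gating network

def moe_forward(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router                    # one routing score per expert
    top = np.argsort(logits)[-top_k:]      # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    # Only the selected experts do any work; the other 6 are skipped entirely.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)  # (16,)
```

The point the tweets allude to: all 8 experts' parameters are held in memory, but each token pays the compute cost of only 2 of them, which is how an "8x7B" model can run far cheaper than a dense model of the same total parameter count.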