Running Mixtral 8x7B Mixture-of-Experts (MoE) on Google Colab’s free tier
If you are interested in running your own AI models locally on your home network or hardware, you might be interested to know that it is possible to run Mixtral 8x7B on Google Colab's free tier. Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights. Licensed under Apache 2.0, Mixtral outperforms Llama 2 …
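The article does not spell out the exact loading recipe, but a common way to squeeze a model of this size into a free Colab GPU (typically a 16 GB T4) is 4-bit quantization combined with automatic CPU offloading via Hugging Face transformers. The sketch below assumes the `mistralai/Mixtral-8x7B-Instruct-v0.1` checkpoint and the bitsandbytes NF4 quantization settings; even with these, a free-tier GPU may still need further expert offloading to fit comfortably.

```python
# Minimal sketch: load Mixtral 8x7B in 4-bit and let accelerate split it
# across GPU VRAM and system RAM. Checkpoint name and quantization settings
# are assumptions, not the article's exact setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed checkpoint

# NF4 4-bit quantization shrinks the expert weights roughly 4x versus fp16.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",       # place layers on GPU first, spill the rest to CPU RAM
    low_cpu_mem_usage=True,
)

prompt = "Explain what a sparse mixture-of-experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because only a couple of experts are active per token, a sparse MoE model like Mixtral can run with most expert weights parked in slower memory and paged in on demand, which is what makes free-tier hardware plausible at all.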