Blog

From the Alexa Challenge to Malted AI

Malted AI's founders leveraged small language models (SLMs) and distillation to win the Amazon Alexa Challenge. These learnings form the cornerstone of Malted AI's unique approach to solving domain-specific tasks in secure enterprise environments. Learn how the Alexa Challenge inspired what Malted AI is today.

Large language models are not always the answer: the rise of small language models

This blog explores the key differences between small language models (SLMs) and large language models (LLMs), focusing on how they're built, their trade-offs in efficiency and resource consumption, the situations where one is more appropriate than the other, and what happens when models are combined.

Teaching small models to think big: the secrets of knowledge distillation

This blog post explores how knowledge distillation, combined with synthetic data, enables the development of small, efficient AI models that retain the capabilities of larger ones. This approach addresses data scarcity, reduces resource requirements, and delivers practical, secure solutions for enterprise applications.
