
Meta Unveils Llama 4 AI Series Featuring New Expert-Based Architecture


The Llama 4 series is the first to use a “mixture of experts (MoE) architecture,” where only a few parts of the neural network, the “experts,” are used to respond to an input.

Image: Meta

On April 5, Meta unveiled its new AI model series, Llama 4, which includes Llama 4 Maverick and Llama 4 Scout, tailored for conversation and for processing large files, respectively, along with an unreleased “teacher” model called Llama 4 Behemoth.

Llama 4 is Meta’s first model series to adopt a “mixture of experts (MoE) architecture.” This approach activates only select parts of the neural network, referred to as the “experts,” to handle specific subtasks. Each task is broken down into subtasks, and each subtask is routed to the most appropriate expert, improving resource efficiency.
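The routing idea behind MoE can be illustrated with a minimal sketch. This is not Meta's implementation; the gating weights, expert functions, and top-k choice below are all made-up toy values, shown only to convey how a gate scores experts and activates just a few of them per input.

```python
# Illustrative mixture-of-experts routing sketch (NOT Meta's code).
# A gating function scores every expert for an input; only the top-k
# experts run, so most of the network stays idle for any given token.
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route(token_features, gate_weights, experts, k=2):
    """Score all experts, activate the top-k, and mix their outputs."""
    # Gate score per expert: dot product of its weight row with the input.
    scores = [sum(w * x for w, x in zip(row, token_features))
              for row in gate_weights]
    probs = softmax(scores)
    # Keep only the k highest-scoring experts.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)  # renormalise over active experts
    return sum(probs[i] / norm * experts[i](token_features) for i in top)

# Toy setup: four "experts", each a simple function of the input features.
experts = [lambda x, s=s: s * sum(x) for s in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [[0.1, 0.2], [0.3, 0.1], [0.0, 0.5], [0.2, 0.2]]
out = route([1.0, 2.0], gate_weights, experts, k=2)
```

With k=2, only two of the four experts compute anything for this input; the other two are skipped entirely, which is the efficiency win the article describes.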

What are the specifics about Llama 4 Maverick and Scout?

Llama 4 Maverick features 128 experts and 17 billion active parameters, which represent ...


Copyright of this story solely belongs to techrepublic.com.