Meta Launches Muse Spark to Supercharge Meta AI Performance

Meta has unveiled Muse Spark, the first model in a new series of large language models developed by Meta Superintelligence Labs, marking what the company describes as a major step toward “personal superintelligence.” The launch introduces a redesigned AI architecture, upgraded Meta AI experiences, and expanded multimodal capabilities that will roll out across Meta’s apps and devices in the coming weeks.

According to Meta, Muse Spark is the result of a rapid nine‑month development cycle in which the company rebuilt its AI stack from the ground up. The model is the first in the new Muse series, which Meta describes as a structured, scientific approach to scaling in which each generation is validated before the next expands in size and complexity. Although intentionally compact and optimized for speed, Muse Spark is designed to handle advanced reasoning tasks across science, mathematics, and health.

The model now powers the Meta AI assistant in both the Meta AI app and on meta.ai, enabling more sophisticated reasoning, planning, and multimodal understanding. Meta confirmed that the next generation of Muse models is already in development.

Alongside the model launch, Meta introduced a redesigned Meta AI experience with a new interface and expanded functionality. Users can now switch between Instant and Thinking modes depending on the complexity of their task. The assistant can also deploy multiple sub‑agents in parallel, allowing it to break down and solve multi‑part queries more efficiently.

Meta offered an example of planning a family trip to Florida: one sub‑agent can draft an itinerary, another can compare Orlando with the Florida Keys, and a third can identify kid‑friendly activities. The system then synthesizes the results into a single, more comprehensive answer.
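The fan‑out‑and‑synthesize pattern Meta describes can be sketched in a few lines. This is a conceptual illustration only; the function names and the synthesis step are hypothetical stand‑ins, not Meta's actual API:

```python
# Conceptual sketch of the parallel sub-agent pattern described above.
# run_sub_agent and answer_query are illustrative assumptions, not Meta's API.
from concurrent.futures import ThreadPoolExecutor

def run_sub_agent(task: str) -> str:
    # Stand-in for a real sub-agent call (e.g., a model invocation).
    return f"result for: {task}"

def answer_query(sub_tasks: list[str]) -> str:
    # Fan out: each sub-task runs in its own worker, in parallel.
    with ThreadPoolExecutor(max_workers=len(sub_tasks)) as pool:
        results = list(pool.map(run_sub_agent, sub_tasks))
    # Synthesize: merge the partial results into a single answer.
    return "\n".join(results)

print(answer_query([
    "draft a Florida itinerary",
    "compare Orlando with the Florida Keys",
    "find kid-friendly activities",
]))
```

The key design point is that the three sub‑tasks are independent, so they can run concurrently and only the final synthesis step waits on all of them.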

A major focus of Muse Spark is enhanced multimodal perception. Meta emphasized that real‑world questions often extend beyond text, and the new model is designed to interpret images, objects, and scenes with greater accuracy.

Users can take a photo of an airport snack shelf and ask Meta AI to identify the highest‑protein options, or scan a product and request comparisons with alternatives. Meta says this shift represents a move from AI systems that rely solely on user descriptions to assistants that can “look at the world with you.”

These capabilities will become even more significant as Muse Spark is integrated into Meta’s AI glasses, enabling the assistant to interpret surroundings in real time.

Meta highlighted health as one of the most common categories of user queries. With Muse Spark, Meta AI can now provide more detailed responses to general health questions, including those involving images, charts, or visual data. The company worked with physicians to refine the model’s ability to deliver helpful, responsible information while maintaining safety boundaries. Meta stressed that the assistant is not a medical professional but can help users better understand common health topics.

Muse Spark also introduces stronger visual coding capabilities. Users can ask Meta AI to generate custom websites, dashboards, or mini‑games directly from a prompt. Meta showcased examples such as a party‑planning dashboard, a retro arcade game, and a whimsical flight simulator, all of which can be shared with friends. The company sees this as part of a broader push to make coding and digital creation more accessible.

Meta AI now includes features designed to help users make decisions about fashion, home styling, and shopping. A new shopping mode, initially launching in the United States, draws on creator content and brand storytelling across Meta’s platforms to surface personalized recommendations.

When users search for places or trending topics, Meta AI can provide contextual information drawn from public posts and community insights. For example, tapping on a location may reveal posts from locals, while asking about a trending topic may surface a curated snapshot of what people are discussing.

The upgraded Meta AI experience, featuring Instant and Thinking modes, is now live in the Meta AI app and on meta.ai, rolling out first in the United States with additional countries to follow. Meta plans to extend Muse Spark's capabilities across Instagram, Facebook, Messenger, WhatsApp, and its AI glasses, where enhanced perception features will play a central role.

Meta is also opening access to the underlying technology, offering Muse Spark in a private API preview for select partners and expressing interest in open‑sourcing future versions.

The company says this launch marks the start of a shift toward more contextual, visually rich AI interactions. Upcoming updates will integrate Reels, photos, and posts directly into responses, with creator attribution. Meta also highlighted strengthened safety and privacy measures as it positions Muse Spark as the foundation of its long‑term vision for deeply personalized AI.