Meta’s latest hosted model, Muse Spark, is now live in a limited form—and the more interesting story may be less about raw model quality and more about what Meta has quietly bundled around it inside the meta.ai chat UI.
Simon Willison dug into Muse Spark on launch day, noting that it’s Meta’s first model release since Llama 4 roughly a year ago. This time, though, it’s hosted (not open weights) and the API is currently described as a private preview. Muse Spark is already usable via meta.ai (with a Facebook or Instagram login), where it shows up in two modes: “Instant” and “Thinking”. Meta also promises a future “Contemplating” mode aimed at longer reasoning.
## A model release… and an agent harness peeking through
What makes Willison’s write-up worth bookmarking is the hands-on look at the tools exposed through Meta’s chat harness—including the rare moment where a shipping assistant actually reveals its tool schema without a fight. After prompting for exact tool names and parameters, Willison received descriptions for 16 tools, spanning several categories that developers will immediately recognize as the building blocks of agentic workflows.
A few highlights that stand out:
- Web browsing primitives via `browser.search`, `browser.open`, and `browser.find`
- Meta first-party search with `meta_1p.content_search`, including filters like `author_ids` and interaction-based parameters
- Image generation through `media.image_gen` (with modes like "artistic" and "realistic")
- A Python sandbox via `container.python_execution`, essentially Code Interpreter, complete with a persistent `/mnt/data/`
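To make the shape of such a harness concrete, here is a minimal sketch of how a chat UI might declare and validate calls against a tool manifest like the one Willison extracted. Only the tool names, the `author_ids` filter, and the image-gen modes come from the post; every other field and the validation logic are illustrative assumptions, not Meta's actual schema:

```python
# Hypothetical tool manifest for an agent harness. The tool names,
# `author_ids`, and the image-gen modes are from Willison's write-up;
# the parameter lists and structure are invented for illustration.
TOOLS = {
    "browser.search": {"params": ["query"]},
    "browser.open": {"params": ["url"]},
    "browser.find": {"params": ["pattern"]},
    "meta_1p.content_search": {"params": ["query", "author_ids"]},
    "media.image_gen": {
        "params": ["prompt", "mode"],
        "modes": ["artistic", "realistic"],
    },
    "container.python_execution": {"params": ["code"]},
}

def validate_call(name, args):
    """Reject calls to unknown tools or with unrecognized parameters."""
    spec = TOOLS.get(name)
    if spec is None:
        raise ValueError(f"unknown tool: {name}")
    unknown = set(args) - set(spec["params"])
    if unknown:
        raise ValueError(f"unknown params for {name}: {sorted(unknown)}")
    return True
```

A harness along these lines would run `validate_call` before dispatching each model-emitted tool call, which is one plausible reason a model can enumerate its tools so cleanly: the schema has to exist somewhere in the prompt or runtime.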
## Visual grounding that goes beyond “describe the image”
Willison also spends time with `container.visual_grounding`, a tool that returns results in `point`, `bbox`, or `count` formats. The demos get delightfully concrete: generating an image, analyzing it with Python tooling, then using visual grounding to localize objects—and even to count fine-grained features like whiskers.
It’s the kind of end-to-end “generate → analyze → annotate” loop that hints at what Meta’s UI can already do, even before general API access lands.
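As a rough sketch of what consuming those three result formats might look like on the client side: the format names (`point`, `bbox`, `count`) come from the post, but the response shapes and field names below are assumptions for illustration, not Meta's documented API:

```python
# Hypothetical consumer of `container.visual_grounding` results.
# Only the three format names are from the post; the dict layout
# (labels, point pairs, corner-coordinate boxes) is assumed.
def summarize_grounding(result):
    fmt = result["format"]
    if fmt == "count":
        return f"{result['count']} instance(s) of {result['label']}"
    if fmt == "point":
        points = result["points"]  # assumed: list of (x, y) pairs
        return f"{len(points)} point(s) for {result['label']}"
    if fmt == "bbox":
        boxes = result["boxes"]  # assumed: list of (x0, y0, x1, y1)
        areas = [(x1 - x0) * (y1 - y0) for x0, y0, x1, y1 in boxes]
        return f"{len(boxes)} box(es), largest area {max(areas)}"
    raise ValueError(f"unknown format: {fmt}")
```

In a "generate → analyze → annotate" loop, a `bbox` result in some shape like this is what you would feed back into drawing code to overlay boxes on the generated image, and a `count` result is what powers the whisker-counting demo.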
For the full tool list, prompts, and example outputs (including the pelican-counting finale), the original post is here: Meta’s new model is Muse Spark, and meta.ai chat has some interesting tools.