Claude Code adds Monitor tool for async log streaming

Anthropic has rolled out a new Monitor tool for Claude Code, letting agents spawn background processes and stream stdout into the chat without blocking. It reduces token-wasting polling loops and adds debounced, event-driven logging for workflows like kubectl and CloudWatch.


TL;DR

  • Monitor tool: Spawns background processes and streams stdout line-by-line into the conversation
  • Non-blocking execution: Output arrives asynchronously without blocking the main thread
  • Replaces polling loops: Avoids “are we done yet?” checks that burn tokens and add noise
  • Debounced streaming: Reduces flooding from high-volume logs while still delivering each stdout line
  • Example workflow: kubectl logs -f | grep .., observe errors, then make a PR to fix streamed crashes
  • Availability: Live now; update via $ claude update; included in the .98 release; works with external log sources such as CloudWatch and with the Claude Agent SDK

Anthropic’s Claude Code picked up a small-but-meaningful primitive this week: a new Monitor tool that lets Claude spawn a background process and stream each stdout line directly into the conversation without blocking the main thread.

The announcement came from Claude Code’s Alistair, who framed it as a practical fix for a familiar pain point in AI-assisted coding workflows: long-running commands that previously required polling inside the agent loop, burning tokens and adding noise. With Monitor, the agent can keep working while output arrives asynchronously.

Streaming stdout, not “are we done yet?” loops

The key behavior is straightforward: Claude starts a background process, then streams stdout line-by-line into the conversation. That means workflows like tailing logs become first-class, rather than being approximated through repeated checks.
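Anthropic hasn't published the tool's internals, but the basic mechanics are easy to sketch. The Python helper below (`spawn_and_stream` is a hypothetical name, not part of Claude Code) shows one common way to spawn a process and collect its stdout line-by-line without blocking the caller:

```python
import queue
import subprocess
import threading

def spawn_and_stream(cmd):
    """Spawn a background process and push each stdout line onto a queue,
    so the caller can keep working and drain output when convenient."""
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )
    lines = queue.Queue()

    def pump():
        for line in proc.stdout:   # blocks only this reader thread
            lines.put(line.rstrip("\n"))
        lines.put(None)            # sentinel: process has exited

    threading.Thread(target=pump, daemon=True).start()
    return proc, lines
```

The caller's main loop stays free: it can check `lines.get_nowait()` between other work, which is exactly the non-blocking shape the Monitor tool gives the agent loop.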

Alistair’s example centers on a log-watching flow:

  • Use the monitor tool with something like kubectl logs -f | grep ..
  • Listen for errors as they happen
  • Make a PR to fix crashes that show up in the stream

In short: react to events, rather than spending cycles asking whether anything changed.
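Outside Claude Code, that event-driven step can be approximated with a plain filtered stream. In this sketch, `watch_for_errors` is a hypothetical handler and `lines` stands in for the streamed stdout of something like kubectl logs -f:

```python
import re

def watch_for_errors(lines, pattern=r"ERROR|Exception|panic", on_match=print):
    """React to matching log lines as they arrive, instead of polling.
    `lines` is any iterable of log lines (e.g. a streamed stdout)."""
    rx = re.compile(pattern)
    hits = []
    for line in lines:
        if rx.search(line):
            hits.append(line)
            on_match(line)  # e.g. hand the crash to the agent to fix
    return hits
```

The `on_match` callback is where the "make a PR to fix it" reaction would hang off the stream.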

Debouncing and “wake-ups”

One immediate question from the replies was what “streams into the conversation” actually means in practice—whether Claude reacts to every new line and how the tool behaves between turns. Alistair clarified that it’s each line from stdout, and that Monitor does debounce.

Without debouncing, high-volume log streams could quickly flood the conversation even if they’re token-efficient compared to polling.
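Alistair didn't detail the debounce behavior, but a typical line debouncer batches output and flushes after a quiet window or when a batch grows too large. A minimal sketch, with assumed values for the quiet window and batch cap:

```python
import time

class Debouncer:
    """Collect incoming lines and flush them in batches once the stream
    has been quiet for `quiet_s` seconds, or a batch hits `max_batch`."""

    def __init__(self, flush, quiet_s=0.5, max_batch=100):
        self.flush, self.quiet_s, self.max_batch = flush, quiet_s, max_batch
        self.batch, self.last = [], time.monotonic()

    def add(self, line):
        self.batch.append(line)
        self.last = time.monotonic()
        if len(self.batch) >= self.max_batch:
            self._emit()

    def poll(self):
        """Call periodically: flush if the stream has gone quiet."""
        if self.batch and time.monotonic() - self.last >= self.quiet_s:
            self._emit()

    def _emit(self):
        self.flush(self.batch)
        self.batch = []
```

Every line is still delivered; the batching just keeps a chatty log stream from turning into one conversation event per line.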

Availability: out now, update required

When asked whether the tool was already available, Alistair said it was live, with a note that an update might be required:

  • $ claude update

In reply to a version question, he also confirmed that it's included in “.98”.

Beyond kubectl: external log sources and the Agent SDK

While Kubernetes logs were the most discussed use case, Alistair also answered “Yes” to whether Monitor could continuously stream logs from an external source such as CloudWatch.

For agent builders, there’s another notable confirmation: asked if Monitor can be used in the Claude agent SDK, Alistair replied yes—suggesting this isn’t limited to interactive Claude Code sessions.

Why developers care: a more event-driven Claude Code

The replies quickly converged on the bigger implication: event-driven behavior. Several developers called out that polling build output or tailing logs in loops can consume a surprising amount of context and time, and that streaming output into the conversation reduces that overhead while keeping the agent closer to real-time signals.

It’s a modest tool on paper—run a process, stream stdout—but it nudges Claude Code toward a workflow where agents can wait on and respond to long-running tasks without repeatedly “checking in.”

Original source: https://x.com/alistaiir/status/2042345049980362819?s=20
