GitHub activity is exploding: 275M commits a week, Actions surging

GitHub’s Kyle Daigle says the platform hit 1B commits in 2025 and is now running at 275M commits per week. GitHub Actions minutes are also spiking, raising new questions about capacity, reliability, and how much is driven by AI automation.

TL;DR

  • GitHub activity: 1B commits in 2025; currently 275M commits/week
  • Commit trajectory: A linear projection implies 14B commits in 2026, though growth is unlikely to stay linear
  • GitHub Actions usage: 500M minutes/week (2023); 1B (2025); 2.1B so far this week
  • Operational focus: Adding CPU capacity, scaling services, strengthening core features to sustain throughput and reliability
  • AI concerns: Questions on agent-driven activity, reviewability, and AI-generated churn impacting code quality
  • Reliability issues raised: Reports of Actions instability, rate-limit/job termination complaints, and leadership perception debates

GitHub’s own Kyle Daigle shared a snapshot of just how fast the platform’s activity is climbing—and the numbers are large enough to shift the conversation from “developer productivity” to plain old capacity planning.

Daigle said GitHub saw 1 billion commits in 2025, and is now running at 275 million commits per week. That pace would imply 14 billion commits in 2026 if growth stayed linear, though Daigle noted it likely won’t.
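The 14 billion figure is consistent with a simple back-of-envelope annualization of the stated run rate; a quick sketch of that arithmetic (using only the numbers from Daigle's post):

```python
# Annualize the stated run rate: 275M commits/week over 52 weeks.
weekly_commits = 275_000_000
annualized = weekly_commits * 52

print(f"{annualized / 1e9:.1f}B commits/year")  # ~14.3B, i.e. the ~14B linear projection
```

As the article notes, this assumes the weekly rate holds flat for a full year, which Daigle himself flagged as unlikely.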

On the CI side, Daigle also pointed to GitHub Actions growth: 500M minutes/week in 2023, 1B minutes/week in 2025, and 2.1B minutes so far this week. The emphasis “so far” matters; it frames the current week as a notable jump even relative to recent baselines.

Scaling GitHub is becoming the product story

Daigle’s post framed the response as operational as much as product-focused: adding CPU capacity, scaling services, and strengthening GitHub’s core features. It’s the kind of status update that reads less like a victory lap and more like a reminder that the “boring” parts of software platforms (throughput, reliability, and headroom) define the day-to-day experience once usage curves steepen.

That framing also resonated in replies. Some responses treated the Actions figure as the more important headline, suggesting that runtime growth translates directly into infrastructure pressure.

The AI question: volume vs. value

Several replies immediately questioned what’s inside those commit counts:

  • Requests for a graph and analysis, including how much activity comes from coding agents
  • Skepticism that a stream of 275M commits/week can be meaningfully reviewed, with one commenter likening git to a write-ahead log
  • More pointed critiques calling some of the growth AI-generated “slop”, with concerns about code quality and churn

Other replies, meanwhile, leaned toward gallows humor and admiration for the scale, including jokes about individual responsibility for the spike and quips about Daigle’s own return to “coding + AI.”

Reliability, rate limits, and leadership perception

The thread also surfaced friction that tends to accompany rapid scale-ups:

  • Direct claims that recent Actions instability has been brutal, alongside offers to share ground-level feedback
  • A complaint about rate limiting behavior—specifically, allowing agent jobs to run for extended periods and then terminating them
  • A broader perception issue: one reply argued GitHub can feel like “a ship without a captain,” to which Daigle responded that the company has committed leadership and support

Through it all, the common theme is that at these volumes, capacity and uptime stop being background concerns and become core features.
