Scaling Laws Continue to Hold for AI Progress

by Benjamin Mann on July 20, 2025

Ben Mann, co-founder of Anthropic, explains how AI model intelligence continues to follow scaling laws, with progress accelerating rather than plateauing as many believe.

The Persistence of Scaling Laws in AI Development

  • Scaling laws in AI have held true across many orders of magnitude of compute, a consistency that is surprising even when compared with fundamental laws of physics (a minimal sketch follows this list)
  • Progress is actually accelerating, not slowing down as some claim:
    • Model releases used to happen once a year, now occur every 1-3 months
    • Because releases now arrive so frequently, each individual jump feels smaller, which creates the perception that progress is slowing when it isn't
    • "This narrative comes out like every six months or so and it's never been true"

Three Key Factors Driving AI Intelligence Scaling

  • Compute: The primary bottleneck is physical infrastructure

    • "If we had 10 times as many chips and had the data centers to power them... it would be a real significant speed boost"
    • Data center capacity and power availability directly limit progress
  • Algorithms: Architectural improvements compound with scale

    • Before transformers, we had LSTMs with lower scaling exponents
    • Transformers have higher scaling exponents, meaning they extract more intelligence per unit of compute (illustrated in the sketch after this list)
    • The transition from normal pre-training to reinforcement learning was necessary to continue scaling
  • Efficiency: Optimization creates multiplicative gains

    • "We've seen in the industry like a 10x decrease in cost for a given amount of intelligence"
    • Through combined algorithmic, data, and efficiency improvements
    • If this continues: "in three years we'll have a thousand times smarter models for the same price" (the arithmetic is checked below)
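
To illustrate why the scaling exponent matters, the sketch below compares two hypothetical power-law fits: one with a lower exponent (standing in for pre-transformer architectures such as LSTMs) and one with a higher exponent (standing in for transformers). The exponents and constants are invented for illustration, not measured values.

```python
# Two hypothetical power-law fits that agree at a small reference compute
# budget but have different scaling exponents. All numbers are illustrative.

C0 = 1e15        # reference compute (FLOPs) where both fits give the same loss
L0 = 2.0         # hypothetical loss at the reference compute

def loss(compute: float, alpha: float) -> float:
    """Power-law loss normalized to the reference point: L(C) = L0 * (C/C0)**(-alpha)."""
    return L0 * (compute / C0) ** (-alpha)

LOW_ALPHA = 0.04   # stand-in for an LSTM-like architecture
HIGH_ALPHA = 0.07  # stand-in for a transformer-like architecture

for exponent in (15, 18, 21, 24):          # 1e15 ... 1e24 FLOPs
    c = 10.0 ** exponent
    print(
        f"compute = 1e{exponent}: "
        f"low-exponent loss = {loss(c, LOW_ALPHA):.3f}, "
        f"high-exponent loss = {loss(c, HIGH_ALPHA):.3f}"
    )
```

The two fits start at the same point, but the higher-exponent curve pulls steadily ahead as compute grows, which is the sense in which a better architecture's advantage compounds with scale.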
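
The "thousand times smarter for the same price" figure is simply the quoted roughly 10x annual cost reduction compounded over three years. A quick check of that arithmetic, assuming the 10x-per-year rate holds:

```python
# Compounding the quoted ~10x per-year cost reduction for a fixed capability.
# Equivalently: for a fixed price, capability per dollar grows ~10x per year.
ANNUAL_GAIN = 10  # quoted industry-wide improvement factor per year
YEARS = 3

total_gain = ANNUAL_GAIN ** YEARS
print(f"After {YEARS} years: {total_gain}x more capability per dollar")  # 1000x
```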

Why Benchmarks Get Saturated Quickly

  • For some tasks, models are already saturating the level of intelligence the task requires
  • "When you release a new benchmark within like six to twelve months it immediately gets saturated"
  • The real constraint becomes designing harder benchmarks that can still reveal the ongoing improvements in intelligence

The Exponential Nature of Progress

  • "People are really bad at modeling exponential progress"
  • On an exponential curve, progress looks flat at first, then suddenly hits the knee of the curve (a toy illustration follows this list)
  • "It looks flat and almost zero at the beginning... and then it goes vertical"
  • We're currently experiencing this rapid acceleration phase
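
A toy illustration of why exponentials fool people, assuming nothing more than simple doubling: the early values look negligible relative to where the curve ends up, and then it appears to "go vertical."

```python
# Simple doubling process: y = 2**t.
# Early steps look flat relative to the final value; later steps dominate.
FINAL_STEP = 20

for t in range(0, FINAL_STEP + 1, 4):
    value = 2 ** t
    share_of_final = value / 2 ** FINAL_STEP
    print(f"t = {t:2d}: value = {value:>8d} ({share_of_final:6.2%} of the final value)")
```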