
AI Is Dot‑Com 2.0 — Do You Have Proof?

  • Oriental Tech ESC
  • Nov 15
  • 5 min read

In the late 1990s, the Dot-com Bubble was fueled by vanity metrics. Startups celebrated “eyeballs” and “page views” as if they were profits, while business models remained unproven. Companies rushed to IPO with little more than a domain name, and investors mistook traffic for traction. When the bubble burst, it became clear that most firms lacked sustainable revenue — hype had far outpaced reality.


Fast forward to 2025, and some analysts are drawing parallels between AI and the Dot-com era. They point to soaring valuations and rapid infrastructure investment as signs of another bubble. But the comparison misses a crucial distinction: AI usage is tangible and measurable.


Every query, model run, and inferencing workload represents real computational work that delivers value to businesses and consumers. Unlike the Dot-com era’s obsession with eyeballs and page views — metrics that rarely translated into revenue — AI adoption is already embedded in workflows, products, and services worldwide.



Token Usage Growth (2023–2025)


The adoption curve makes the difference unmistakable. From 2023 to 2024, monthly token usage jumped nearly 50-fold, rising from a baseline of 9.7 trillion to an average of 480 trillion. By October 2025, OpenAI alone was processing around 259 trillion tokens per month, and global volumes were larger still: Google reported 980 trillion tokens in July 2025, with Microsoft handling comparable amounts. These aren’t vanity metrics; they are auditable computations. Each token reflects real inferencing work delivered to end users, whether in enterprise workflows, consumer applications, or regional deployments. China’s trajectory underscores the scale: from 100 billion daily tokens in early 2024 to 30 trillion daily by mid-2025, equating to roughly 900 trillion per month. Unlike the Dot-com era’s hollow “eyeball” counts, these figures tie directly to genuine demand across industries.


Year/Month         | Monthly Token Usage (approx., trillions) | Growth vs. Prior Period
2023 (Annual Avg.) | 9.7                                      | Baseline adoption
2024 (Annual Avg.) | 480                                      | ~50x YoY increase
2025 (Oct)         | ~259 (OpenAI); hundreds globally         | Sustained exponential growth, accelerating in early 2025
2025 (China, June) | ~900 (regional monthly usage)            | Surge from 100B daily in early 2024
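The multiples quoted above are easy to sanity-check. A minimal back-of-envelope sketch, using only the approximate figures from the table and text (trillions of tokens):

```python
# Back-of-envelope check of the growth multiples quoted above.
# Figures are the article's approximate values, in trillions of tokens per month.

usage_2023 = 9.7      # 2023 annual average (baseline)
usage_2024 = 480.0    # 2024 annual average

yoy_multiple = usage_2024 / usage_2023
print(f"2023 -> 2024 growth: ~{yoy_multiple:.0f}x")   # the "nearly 50-fold" jump

# China: ~30 trillion tokens per day by mid-2025, scaled to a 30-day month
china_daily = 30.0
china_monthly = china_daily * 30
print(f"China monthly estimate: ~{china_monthly:.0f} trillion tokens")
```

Running the numbers gives roughly 49x year-over-year and about 900 trillion tokens per month for China, matching the figures cited in the text.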


Token Usage Cannot Be Lied About


Tokens represent actual text or data processed, making them a direct proxy for AI activity. Unlike vanity metrics, token counts are extraordinarily hard to inflate: even if an AI company enlisted 100,000 employees to run prompts nonstop, artificial generation could not approach the hundreds of trillions of tokens processed each month. Real-world adoption puts the scale in context: large enterprises such as Salesforce and Shopify each process on the order of 1 trillion tokens via OpenAI, a small fraction of the aggregate. Most importantly, tokens are consumed when AI delivers value: answering queries, analyzing data, generating content, or powering workflows in finance, healthcare, and logistics.




Addressing Criticisms: Are AI Data Centers Overbuilt and Usage Inflated?


Some analysts argue that current AI data centers are already sufficient to support global usage, with no need for further expansion. They accuse companies like OpenAI, Oracle, CoreWeave, Microsoft, Google, Meta/Facebook, Nvidia, and AMD of inflating demand to justify new builds. This view overlooks key realities:


  • Exponential Demand Growth: Token usage is surging, not static. Google’s processing doubled between May and July 2025, while Microsoft’s grew seven‑fold year‑over‑year. Inference — real‑time AI use — now outpaces training, requiring more efficient and scalable infrastructure. Capital expenditure projections show record levels ahead, with no slowdown in sight.


  • Capacity vs. Future Needs: Today’s centers handle current loads, but models are becoming more complex (e.g., multimodal with video and audio), and adoption is broadening to billions of users. By 2030, AI could consume as much power as a mid‑sized country, driven by genuine demand rather than hype. Claims of overcapacity ignore bottlenecks such as energy constraints and supply chains, which are pushing builds forward.


  • No Evidence of Inflation: Metrics are transparent. Token volumes are reported by executives and backed by revenue — OpenAI’s $13B annualized run rate in 2025 is one example. Enterprise adoption stands at 78%, with ROI of $3.70 per dollar invested. If usage were inflated, revenues would falter, not rise.




Circular Investments or Real Supply Chain Integration?


  • Critics also point to the web of cross‑investments in AI as evidence of a bubble. Nvidia invested in CoreWeave, while both OpenAI and Microsoft lease GPU capacity from CoreWeave. Microsoft owns a major stake in OpenAI, and Nvidia has also invested directly in OpenAI. AMD has granted shares to OpenAI, which in turn committed to using AMD’s AI chips. Each announcement — whether a partnership, investment, or chip purchase — can send the related company’s stock up $10–20 in a morning trading session. To skeptics, this looks like circular investing, reminiscent of Enron’s self‑referential deals.


  • But the comparison doesn’t hold. Enron’s value was built on opaque accounting, while AI demand is anchored in transparent, auditable metrics: trillions of tokens processed monthly, enterprise adoption rates above 78%, and revenues that scale directly with usage. These cross‑investments reflect supply chain interdependence, not financial engineering. Chipmakers, model developers, and cloud providers are building a supply chain together, and the capital flows between them are part of scaling a real industry. While surveys show 54% of investors perceive AI stocks as bubbly, fundamentals — including productivity gains of 26–55% across industries — demonstrate that adoption is delivering measurable business outcomes.




Why This Is Not the Dot-com Bubble


  • Usage vs. Capacity: Dot-com firms reported “capacity” (servers, bandwidth). AI measures “usage” (tokens processed) — real demand, not potential.


  • Value Creation vs. Vanity: In the Internet Bubble, “eyeballs” came with little monetization. In AI, tokens power enterprise workflows, yielding measurable productivity gains across sectors.


  • Global Adoption Curve: AI is scaling across industries and geographies, with 78% of organizations using it in 2024 (up from 55% in 2023). Structural shifts are evident in finance, creative industries, and enterprise systems.


  • Efficiency Gains, Not Empty Growth: As prompting improves, fewer tokens per task are needed. Yet aggregate usage continues to rise — showing deeper adoption, not a peak.




Narrative Angle: Tokens as AI's Heartbeat


Token usage is the heartbeat of AI adoption:


  • Month‑over‑month (MoM) growth shows the pulse.

  • Quarter‑over‑quarter (QoQ) reveals the rhythm.

  • Year‑over‑year (YoY) surges demonstrate long‑term health.


A future slowdown in raw token growth won’t signal decline — it will mean maturity, as users optimize AI efficiently. But the leap from 9.7 trillion in 2023 to 480 trillion in 2024 proves AI is a structural shift in computation and innovation.
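The heartbeat framing above — MoM pulse, QoQ rhythm, YoY health — boils down to simple growth-rate calculations at different lags. A minimal sketch over a hypothetical monthly series (the numbers below are illustrative, not the article’s data):

```python
# Illustrative MoM/QoQ/YoY growth rates over a hypothetical monthly token series.
# The series below is made up for demonstration; it is not the article's data.

def growth(series, lag):
    """Percent growth of each point vs. the point `lag` steps earlier."""
    return [
        (series[i] / series[i - lag] - 1) * 100
        for i in range(lag, len(series))
    ]

# 13 months of hypothetical usage, in trillions of tokens
monthly = [10, 12, 15, 19, 24, 30, 37, 45, 55, 66, 79, 94, 110]

mom = growth(monthly, 1)    # month-over-month: the pulse
qoq = growth(monthly, 3)    # quarter-over-quarter: the rhythm
yoy = growth(monthly, 12)   # year-over-year: long-term health

print(f"Latest MoM: {mom[-1]:.1f}%")
print(f"Latest QoQ: {qoq[-1]:.1f}%")
print(f"Latest YoY: {yoy[-1]:.0f}%")
```

On this toy series, MoM growth gradually cools even as YoY growth stays enormous — exactly the "maturity, not decline" pattern described above.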




Closing Thought


In the Internet Bubble, companies could pretend to grow. In AI, tokens can’t be lied about — they are the most transparent metric of adoption. The numbers show exponential, unfakeable growth.


AI isn’t the next Dot-com bubble. It’s the next industrial revolution — and tokens are the electricity units proving it. While risks like economic downturns exist, the data supports sustained demand over hype.





My job? Connecting companies with the talent that turns vision into reality.

_________________________________________________________


Contact us and let us know your company’s AI staffing requirements. Together, we can improve how we recruit for AI roles in a way that benefits everyone involved.



