How do you depreciate a GPU?

A couple of weeks before Michael Burry did, I was sounding the alarm in my newsletter about how tech companies depreciate their GPUs.

But why is everyone so focused all of a sudden on the boring accounting practice of depreciation?

Before I answer that question, let's dig into what depreciation is, using two important examples. The first is a company that builds a railroad bridge. The bridge will last for decades, and every single year it will provide the same value to the end consumer: the ability to carry freight and passengers. While the bridge will need some minor ongoing maintenance, its value in year 30 is just about the same as its value in year 1. Because of the large cost of building the bridge, accounting standards allow the expense to be depreciated over a long period, such as 50 years for long-lived infrastructure. So if the bridge cost $100M, it would show up as $2M of expense every year for 50 years.
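The straight-line arithmetic above is simple enough to sketch in a few lines, using the $100M bridge and 50-year life from the example:

```python
def straight_line_depreciation(cost: float, useful_life_years: int) -> list[float]:
    """Straight-line depreciation: the same expense recognized every year."""
    annual_expense = cost / useful_life_years
    return [annual_expense] * useful_life_years

# The bridge example: $100M depreciated over 50 years.
bridge_schedule = straight_line_depreciation(100_000_000, 50)
print(bridge_schedule[0])   # 2000000.0 ($2M of expense in year 1)
print(bridge_schedule[-1])  # 2000000.0 (and the same $2M in year 50)
```

Over the full 50 years the expenses sum back to the original $100M cost, which is the whole point of the schedule.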

The second example is an automobile or another piece of equipment whose utility declines rapidly in the first few years. A car that needs $2K in repairs every six months and has barely working AC and a cracked windshield is far less valuable and desirable than a brand-new car. Here, accounting standards suggest a 5-year straight-line depreciation schedule, on the theory that the car in year 5 provides roughly the same business utility (i.e., the ability to drive) as the car in year 1.

When we talk about GPUs, there are a couple of really interesting differences between them and our two examples:

  1. Utility: Many will say "Google is still running TPUs from 2018 at 100% utilization," and thus GPUs and TPUs continue to add the same value over time, like the bridge. But this argument is completely wrong: a TPU from 2018 provides less than 1% of the computing efficiency of a current TPU, and the monetary value it generates has dropped precipitously as computing efficiency has increased.

  2. Degradation: When GPUs are run at 100% utilization for years, they start to degrade, and various estimates put their useful life at 3 to 6 years at most. Unlike a car, which depreciates over 5 years but can last 20+ years, a GPU past that point is effectively a car on cinder blocks in your front yard.

So what are our tech companies doing today with GPU depreciation? Well, prior to the AI boom, their accounting firms required them to depreciate GPUs over 3 years on a straight-line basis. But as these companies began investing up to 100%+ of their free cash flow in AI buildouts, they realized that a short depreciation schedule would destroy their earnings, sending them to zero within three years.

So the big tech companies simply convinced their accounting firms to change the standards: Amazon, Meta, Google, and Microsoft, for example, moved GPU depreciation from 3 years to 4-6 years, with most of them at 6 years. Overnight, they dramatically improved their earnings by cutting their annual AI depreciation expense roughly in half. This wouldn't be a big deal if these companies were making a small AI investment, but when some of them are spending 100%+ of their free cash flow on it, the impact is very significant.
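To see why stretching the schedule halves the annual expense, here is a quick sketch; the $60B of GPU capex below is a hypothetical figure for illustration, not from any company's filings:

```python
def annual_straight_line_expense(capex: float, years: int) -> float:
    """Annual depreciation expense under a straight-line schedule."""
    return capex / years

capex = 60e9  # hypothetical $60B of GPU capex
old_expense = annual_straight_line_expense(capex, 3)  # old 3-year schedule
new_expense = annual_straight_line_expense(capex, 6)  # new 6-year schedule
print(old_expense)  # 20000000000.0, i.e. $20B/year hits earnings
print(new_expense)  # 10000000000.0, i.e. $10B/year: half the reported expense
```

The cash going out the door is identical in both cases; only the timing of when it shows up as an expense changes, which is exactly why reported earnings improve overnight.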

But this raises the question: what should GPU depreciation look like? Because GPU efficiency is doubling at least once a year, the value-creation potential of a GPU is dropping at least 50% per year, which means that after three years a GPU is worth about 87.5% less than it was on day 1. A 3-year schedule is therefore reasonable, but an even more accurate one would recognize 50% of the expense in year 1, 25% in year 2, and 12.5% in year 3.
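A minimal sketch of the schedule this paragraph proposes, assuming a GPU's value halves each year (a 50% declining-balance schedule on a cost normalized to 1.0):

```python
def declining_balance_depreciation(cost: float, rate: float = 0.5, years: int = 3) -> list[float]:
    """Each year's expense is `rate` of the remaining book value."""
    remaining = cost
    schedule = []
    for _ in range(years):
        expense = remaining * rate
        schedule.append(expense)
        remaining -= expense
    return schedule

schedule = declining_balance_depreciation(1.0)
print(schedule)             # [0.5, 0.25, 0.125]
print(1.0 - sum(schedule))  # 0.125 of book value remains: ~87.5% less than day 1
```

Front-loading the expense this way matches the economics the article describes: most of a GPU's value is consumed in its first year, while its efficiency edge over newer chips is largest.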

To put this in perspective, Microsoft, which has roughly $100B in earnings and spent $40B on AI in 2024 and $80B in 2025, would actually be unprofitable in 2026-2027 if its GPUs and AI infrastructure were properly depreciated.

As Michael Burry so clearly pointed out, these companies are grossly overstating their profitability by stretching their depreciation schedules, and this is especially true given the pace at which computing efficiency is increasing. It means the historically high PE ratios we are seeing right now are actually even higher (beyond dot-com heights), especially once you normalize for the 2017 tax cuts.

The question then becomes, is the expense worth it?
