Meta extends server lifespan amid memory chip shortage - WSJ

Meta Prolongs Server Lifespan to Counter AI-Fueled Memory Crunch




Meta Platforms recently pushed the expected useful life of its servers to five and a half years, a decision set to deliver about $2.9 billion in savings for 2025 alone.[1] The company had previously lengthened server depreciation periods twice in 2022, lifting the timeline from four years to five. This latest adjustment reflects broader pressures in the tech sector, where surging demand for memory chips from artificial intelligence workloads has created persistent supply constraints.

Hyperscalers such as Meta, Microsoft, Google, and Amazon have prioritized high-margin AI server components, exacerbating shortages that manufacturers like Samsung, SK Hynix, and Micron cannot quickly resolve.[2] Prices for key memory types have spiked, forcing companies to rethink hardware strategies.

A Strategic Shift in Depreciation

Meta’s extension builds on lessons from earlier cycles. In the second quarter of 2022, executives added six months to server lifespans amid rising infrastructure costs. Two quarters later, another six months followed, stabilizing the period at five years until the recent update.[1] Analysts note that well-maintained servers can operate far beyond typical refresh cycles, though diminishing energy efficiency often prompts replacements.

This approach directly lowers annual depreciation expenses, freeing capital for AI investments. For Meta, the timing aligned with hefty capital expenditures on data centers. The savings underscore how accounting adjustments can materially impact financials without halting core operations.
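The mechanics are straightforward straight-line accounting: the remaining book value of the hardware is spread over a longer remaining life, so each year's expense shrinks. The sketch below illustrates the effect with hypothetical fleet numbers (the $30B book value and remaining-life figures are illustrative assumptions, not Meta's actual disclosures).

```python
def annual_depreciation(net_book_value: float, remaining_years: float) -> float:
    """Straight-line depreciation: remaining cost spread evenly
    over the remaining useful life."""
    return net_book_value / remaining_years

# Hypothetical fleet: $30B of undepreciated server cost with 3 years
# left under the old 5-year schedule; the extension adds half a year.
net_book_value = 30e9
old_expense = annual_depreciation(net_book_value, 3.0)  # old schedule
new_expense = annual_depreciation(net_book_value, 3.5)  # extended schedule

print(f"Old annual expense: ${old_expense / 1e9:.2f}B")
print(f"New annual expense: ${new_expense / 1e9:.2f}B")
print(f"Annual savings:     ${(old_expense - new_expense) / 1e9:.2f}B")
```

Note that such a change is applied prospectively: past depreciation is not restated, so the savings show up as lower expense in current and future periods.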

AI Demand Ignites Memory Shortage

Data centers now claim a dominant share of global memory production, with projections showing them consuming up to 70 percent of output in 2026.[3] High-bandwidth memory for AI training has drawn the bulk of supply, leaving consumer electronics and mid-range servers underserved. Spot prices for certain DRAM variants jumped 75 percent in early 2026.[4]

Major producers face delays in expanding capacity, as new wafer fabs require years to build. Industry leaders forecast the imbalance persisting through 2027 or longer, with wafer supply trailing demand by over 20 percent.[5] Meta, like peers, contends with elevated memory costs rippling into products such as VR headsets.

Peers Follow Suit with Hardware Tweaks

Meta’s move mirrors actions across Big Tech. Microsoft stretched server and network equipment lifespans from four to six years in 2022, yielding $3.7 billion in 2023 efficiencies.[1] Amazon made a similar shift to six years early in 2024.

| Company   | Previous Lifespan | New Lifespan | Reported Savings |
|-----------|-------------------|--------------|------------------|
| Meta      | 4-5 years         | 5.5 years    | $2.9B (2025)[1]  |
| Microsoft | 4 years           | 6 years      | $3.7B (2023)[1]  |
| Amazon    | N/A               | 6 years      | N/A              |

These changes have drawn scrutiny, with investors like Michael Burry questioning whether they mask true AI spending pressures by understating depreciation.[6] Still, operational gains from software optimizations support longer hardware use.

Navigating Supply Chains and Future Costs

Tech firms now explore alternatives like reusing RAM modules or opting for lower-memory configurations to stretch budgets.[2] Cloud reliance and vendor negotiations offer further buffers. For data center operators, routine maintenance – such as disk swaps – proves key to viability.

Yet challenges mount. Meta reported higher data center expenses in early 2026 results, tied to server, memory, and storage price hikes.[7] As AI models grow hungrier for compute, balancing innovation with fiscal discipline will define resilience.

The memory squeeze tests the entire ecosystem, from hyperscalers to consumers facing pricier devices. Meta’s server extension buys time, but sustained supply growth remains essential for unchecked AI expansion. Whether this proves a temporary fix or a new norm hinges on how quickly production catches demand.

About the author
Lucas Hayes
