There's clearly easy/irrational money distorting the markets here. Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay. But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.

Eventually the music will stop when the easy money runs out and we'll see how much people are truly willing to pay for AI.

Regardless of where demand comes from, it takes time to spin up a hard drive factory, and prices would have to rise enough that, as a producer, you would feel confident that a new hard drive factory will actually pay off. Conversely, if you feel the boom is irrational and temporary, as a producer you'd be quite wary of investing money in a new factory if there was a risk it would be producing into a glut in a few years.
I'll add that the GPU, CPU, storage, and RAM industries crashed in 2022 after a Covid-induced boom.[0]

Everything was cheap. Samsung sold SSDs at a loss that year.

TSMC and other suppliers did not invest as much in capex in 2022 and 2023 because of the crash.

Parts of the shortage today can be blamed on those years. Of course, ChatGPT also launched in late 2022, and the rest is history.

[0] www.trendforce.com/presscenter/news/20221123-11467.html

You act like this wasn't just the same as it has always been.

It's always been cycles: cheap production, then human-created demand or catastrophes that reduce supply and push prices back up again.

I bet the same thing happens when the AI bubble pops.

"but this time is different, it's not a bubble, there's real value there"

Economists use the term “bubble” to describe an asset price that has risen above the level justified by economic fundamentals, as measured by the discounted stream of expected future cash flows that will accrue to the owner of the asset.

I think there's little argument that this is happening; the question is more about to what extent it is a bubble.

The entire global software industry is worth less than $1 trillion. Or in other words, smaller than the current valuation of just OpenAI + Anthropic.

Planned capital investment this year by the Magnificent 7 alone is $600B. More than 2/3 of the total global software industry. In one year. Good luck buying any computer hardware this year, there will be a shortage of everything, including electricity.

It's a bubble. But when does the music stop?

> The entire global software industry is worth less than $1 trillion

Are you saying "worth" as a shorthand for something like annual profit? If you sort the 2025 data by earnings, you get pretty large numbers quickly: https://en.wikipedia.org/wiki/List_of_largest_technology_com...

That's not how you should measure "worth". In that world, you'd have a P/E ratio of 1. Comparing to a bond, it would be like expecting to get paid the face amount in a single year. Many people are quite happy with 5-10% interest as a risky benchmark, so 10-20 P/E isn't wild. That puts the market cap for tech itself at 10-20T as a reasonable baseline.
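To make that arithmetic concrete, here's a back-of-the-envelope sketch in Python (the ~$1T annual earnings figure is an illustrative assumption, not a sourced number):

  # Market cap implied by annual earnings at a given P/E multiple.
  def implied_market_cap(annual_earnings: float, pe_ratio: float) -> float:
      # Market cap = earnings * P/E (P/E is the inverse of the earnings yield).
      return annual_earnings * pe_ratio

  earnings = 1e12  # assumption: ~$1T of annual industry earnings
  for pe in (10, 20):
      cap = implied_market_cap(earnings, pe)
      print(f"P/E {pe}: implied market cap ${cap / 1e12:.0f}T")
  # P/E 10 -> $10T, P/E 20 -> $20T: the 10-20T baseline above.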

  The entire global software industry is worth less than $1 trillion. Or in other words, smaller than the current valuation of just OpenAI + Anthropic.
Apple, Microsoft, and Google are each worth 3-4x the global software industry, just for some context.

Is Microsoft worth 3x more than OpenAI and Anthropic combined? Personally, I don't think so. I think the value generated by OpenAI and Anthropic will surpass Microsoft's.

Going off what I could find easily from Google + ChatGPT [1]:

But arguing about the details is kind of missing the point. Microsoft's value is also inflated by the AI bubble and can't be used as a point of reference.

[1] https://www.grandviewresearch.com/industry-analysis/software...

I don't think MS's value has been inflated by AI; if anything, its value has decreased from its AI investments. MS has mostly been on its same growth path for the last decade.
I think you haven't been paying attention to the market over the last couple of years, then. Despite getting hammered recently, it is still up 100% from 2023 lows.
Am I crazy? MSFT was up about 100% in the three years before 2023 as well, and again in the three years before that.

AI has had very little to do with MSFT growth. Pretty sure the 2023 lows were a response to the massive AI spending, and the recovery mostly due to continued Azure services growth.

How long did it take for housing to stop skyrocketing?

That's my guess.

Aka: take a seat, it will be a while

9rx · 12 hours ago
Two years. Basically the period when people were stuck at home during COVID restrictions and were willing to spend extra money to make that experience more comfortable. Prices fell precipitously after restrictions were lifted and people had desires outside of the home again.
The problem is "markets can stay irrational longer than you can stay solvent". It doesn't matter when the bubble pops if the governments (especially the US') bail those companies out.

The damage is already being done: whether you are a 401k/IRA holder with a position in an S&P 500 way too overweighted by the Mag7 & co and their circular dealings; or you need to buy computer parts way over their market value because some companies are over-leveraging to outcompete you for that hardware (or electricity); or, at a smaller scale, you face rising software costs because everything is "AI-powered" now, and of course you wouldn't want only "deterministic" software that just works and doesn't have a slop machine integrated.

At this point I agree. A bubble starts when everyone stops calling it a bubble.
spwa4 · 14 hours ago
If it's a bubble that big, it's

1) the only reason any part of the economy is growing at all

2) the only reason US banks aren't bankrupt due to the commercial real estate debacle they got themselves into

In other words, if this is a bubble, if this pops, we're back in the 2008 situation. Where banks will go bankrupt one after the other like dominoes (in the sense that this amount is large enough that large banks will fail their financial obligations). And you can argue as much as you want based on "real" valuation metrics but none of your investments, not even cash dollars or even gold, will come out of that one intact.

Fortunately, there's the counterargument: you know what else is higher than ever? The revenue produced by the software industry. To the point that at the moment you can say, as crazy as it sounds: if revenue of the big software firms keeps growing the way it IS currently growing, this is not enough investment.

In case you're wondering what exactly "not enough investment" means, think of it like this: you're selling shoes. If you invest too little in new shoes (or whatever resources you need to sell shoes), then you will have to tell customers coming in "sorry, all out of shoes, take your money elsewhere". Currently there's not enough investment. If this growth rate keeps up for 1.5 years, Amazon will have to close the store to anyone who wants more machines; in fact, Amazon, Google and Microsoft are turning away large customers right now. That's where the "spend more now" madness is coming from. Is it unjustified?

Well, it appears not.

I think you’re wrong.

Time will tell.

If I remember correctly, during a previous GPU shortage (crypto?), Nvidia (and/or TSMC?) basically knew the music would stop and didn't want to be caught with its pants down after making the significant investments necessary to increase production.

Not to mention that without enough competition, you can just raise prices, which, uh (gestures at Nvidia GPU price trends...)

Similar thing happened with mask manufacturers during COVID.

They didn't spin up additional mask production b/c they knew the pandemic would eventually pass. They learned this lesson from SARS.

Not maxing out production during spikes (or seasonality) in demand is a key tenet of being a "rational economic actor".

Too bad the bicycle industry didn't learn this. They acted like COVID was the new normal, and it resulted in many companies disappearing when they learned the hard way that pandemic-era demand for bikes is neither sustainable nor normal.
I believe the TSMC CEO said that in a recent interview. They're aware that their now-biggest customer, Nvidia, has a less broad product portfolio than Apple, and the high volumes it buys probably won't last. It's too much of a risk to plan more fabs based on that.
They are indeed planning for more fabs, in order to meet volumes.

Last week: “TSMC's board approves $45 billion spending package on new fabs”

https://www.tomshardware.com/tech-industry/semiconductors/ts...

Silicon Valley is arguing that TSMC isn't investing enough. They should be investing hundreds of billions to build fabs, like how big tech is investing in the AI buildout.

$45 billion for new fabs is peanuts compared to Amazon's $200b and Google's $180b investment in 2026.

Can't really blame TSMC though. It takes years for fabs to go from plan to first wafer. By the time new fabs go online, demand might not be there. Who knows?

According to Elon during his recent Dwarkesh podcast appearance[1], TSMC is limited by resource constraints (fab components, contractors, etc). His claim is that TSMC is building as fast as they can and they are unable to meet industry demand.

Seems legit to me. Nonetheless, I think it's a solvable problem.

1. https://www.youtube.com/watch?v=BYXbuik3dgA

If this is actually true, I think you can find a more reliable source than Elon Musk.

I'm not saying you should never listen to a word he says. His actions shape the world after all, so it's important to understand how his words precede his behavior. But I'm baffled why anyone would take Elon at his word, or even slightly hedge their perception of reality based on Elon's claims of fact.

I was leaving an HN comment, not writing an essay. I'm not fond of Elon's personality, but I listened to the context of the conversation and believe him.

Did you listen to the conversation? There was a great amount of detail. Which parts of the conversation seemed unbelievable to you?

Regardless, it's also been reported in the press over the past quarter, and TSMC's previously largest customer, Apple, has notably had to make fab adjustments and form new partnerships with Intel.

https://stratechery.com/2026/tsmc-risk/

https://www.eetimes.com/tsmc-will-struggle-to-meet-ai-demand...

And even the TSMC CEO himself has acknowledged it on multiple news sources. Here's just one:

"Demand is 3 times higher than what TSMC can produce"

https://wccftech.com/tsmcs-ceo-admits-chip-production-is-ins...

Hopefully, the CEO of the company in question is good enough for you?

> Did you listen to the conversation?

No, and I'm sorry for derailing your point. Thank you for the additional links. I skimmed them all but didn't see anyone corroborate the claim that TSMC is limited by its upstream component suppliers, rather than its own factory underinvestment in prior years. Am I misunderstanding, and those two things are the same?

Ah, that "lays off 50,000 workers because of overhiring" oracle-of-farsight big tech?

Little easier than "laying off" a billion-dollar fab, isn't it?

Is that actual spending already out the door, or pledges? Big difference between money spent and money planned.
rwmj · 16 hours ago
"Silicon Valley" doesn't get to make the decision unless they are willing to send some of those hundreds of billions to TSMC up front. (TSMC isn't going to want future promises of business either since those are worth very little.)
I don't disagree. I wrote the top comment here basically saying the same thing: https://news.ycombinator.com/item?id=46764223

If big tech prepays for the entire fab, I think TSMC would do it.

baq · 15 hours ago
If what Elon recently said is true (if - but he might not be... inaccurate... on this particular thing), they already have: they bought the forward production capacity of those new fabs, and it still isn't enough.
I believe that. TSMC would have to start another fab or two.

PS. I'm pretty sure Intel is also at max capacity. They cancelled a bunch of fabs a few years ago when they were in a downward spiral.

toss1 · 15 hours ago
And if the Big Tech companies think it is so important to get all those compute and/or memory chips sooner and in larger supply, it should be no problem at all for those Big Tech companies to pay the costs and then have priority access to all (or their portion) of the output for future years.

OTOH, if they are insisting on not investing their funds or stock, and it is simply pressure on TSMC to take on the risk, TSMC should be very wary of taking on risk for those players (unless TSMC sees another advantage of producing into a likely glut or supply canyon shortly after the new fabs come online).

rasz · 12 hours ago
> Amazon's $200b and Google's $180b investment

Last time I checked, you can't build chip fabs with cloud credits.

kldg · 3 hours ago
it all takes years. it takes years for permitting to open up the power plant to run the chips. at the scale the Big 3/4 (google, amazon, microsoft, and meta-ish) are going, we don't actually have the capacity to BUILD the capacity, despite a forecast of just 1% national electricity consumption growth this year, partly because we were expecting electricity demand to slow down and for an orderly shutdown of our fossil fuel plants. we couldn't even fill >100GW of gas/coal turbine orders over the next 5 years if we had to, and we might have to, because some of our grids (notably PJM's) are forecast to be under their safety margin of over-production in the following years.

meanwhile, regional grid operators are faced with Big Tech driving tens of % of total power into private contracts where there's only one customer; they are making the decisions normally reserved for nation-states, right? reopening Three Mile Island sounded like a pipe dream a few years ago. I hear They have something like 50 more experimental, small-scale NPPs they want to fire up across the country in the next few years, too (but despite sounding like a big boon for energy, they're ~meaningless short-term in the face of how much demand we're looking at). -so this power (uh, literally) gets wrested away from the grid authorities and from what was largely the domain of government, to now be managed by techbros and a select few partners who will be reliant on their money; I'm sure that will work out fine.

anyway, part of the reason it does make some sense in the US for the government to push for more coal/LNG turbines, is because they're already there and we need them now; the permitting to un-mothball, prevent mothballing, or expand facilities, is far less arduous than what a company'd have to go through for a new facility (tho again, we don't have capacity to build all the turbines we require inside 5 years anyway). I'm not saying it's a good idea to start sending up more GHGs, but it's maybe better than pricing out electricity for residences and "real" industry. hey, who knows? maybe they'll simply build natural gas pipelines that don't leak this time.

-oh, and then there's the problem with these new datacenters disrupting the traditional power demand curve, because they don't really do as much peak draw anymore; their peak draw is approaching base load, as LLM batching (when a company has a bunch of stuff they want processed and can wait a day for it to run in "off-hours") is sold, and if unsold, that time can be used as training time; so the modern datacenter is a 24/7/365 organ; the heart, powering our society, Moltbook. the importance of this is it makes solar less financially attractive, because now we need to be able to bank more energy since more demand's shifting to overnight. we might also want to consider just getting the moon really, really hot? then we can get a truly substantial haul of lunar light for our panels. you know, we decided against nuking hurricanes again recently; maybe we could build some new ones and nuke the moon, a lot.

Somewhat ironically, though, the AI boom means Nvidia would've easily made its money back on that investment, and would probably have owned the GPGPU space even more thoroughly.

But as it is, it's not like they made any bad decisions either.

You're talking about how higher prices can motivate higher supply. The parent commenter was talking about how higher prices shift the current point on the demand curve to the right. If hard drives sold for $1 billion per gigabyte, we wouldn't see even AI companies buying as many as they are, and current production would go idle. Even assuming supply is locally inelastic (as it is given no time or space to scale, or given a lack of confidence that scaling is wise), you should be able to find a price point that avoids supply shortages by manipulating demand.

Thus far, we've not found that point.
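To illustrate, here's a toy sketch of finding a market-clearing price against short-run fixed supply; the demand curve, elasticity, and every number in it are invented for illustration:

  # Toy model: fixed short-run supply, downward-sloping demand.
  SUPPLY = 1_000_000  # drives per quarter, fixed in the short run (made up)

  def demand(price: float) -> float:
      # Hypothetical demand curve: quantity demanded falls as price rises.
      return 5_000_000 * (100 / price) ** 1.5  # elasticity of -1.5, made up

  # Bisect for the price where quantity demanded equals the fixed supply.
  lo, hi = 1.0, 1_000_000.0
  for _ in range(100):
      mid = (lo + hi) / 2
      if demand(mid) > SUPPLY:
          lo = mid  # still a shortage: price must rise further
      else:
          hi = mid
  print(f"Clearing price: ~${(lo + hi) / 2:.2f} per drive")

At some finite price, quantity demanded matches what the existing lines can produce; the observation above is just that the market hasn't hit that price yet.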

> it takes time to spin up a hard drive factory

Very good.

The problem with this expectation of usual market behavior is that demand from AI will still be unsatisfied even after buying out the current providers' whole supply, so any new manufacturer entering the market will also prioritize high-paying AI companies above consumers.
Are these factories already running 24/7, such that labor can't be added to make more without adding capital infrastructure?

And if they were running 24/7, maybe setting up another factory or line will avoid some of the 24/7 scheduling.

As far as I know, the lithography machines are indeed running 24/7 (barring servicing).

https://www.asml.com/en/products/customer-support

No, it's not an easy fix. Manufacturers don't have a good pulse on long-term demand. The capex to spin up a new manufacturing plant is significant, especially with the recency of Covid, where some folks did get caught with their pants down and overinvested during the huge demand boom.

I don't quite follow narratives like yours about nation-states and investors. There is certainly an industrial bubble going on, and lots of startups are getting massive amounts of capital, but there is a strong signal that a good part of this demand is here to stay.

This will be one of those scenarios where some companies will look brilliant and others foolish.

Smart manufacturers will sell 'hard drive futures'. I.e. "Give us $100/drive now for 100k drives for delivery in March 2028".

These contracts are then transferrable. The manufacturer can start work on a factory knowing they'll get paid to produce the drives.

If the AI boom comes to an end, the manufacturer is still going to get paid for their factory, and if the AI company wants to recoup costs they could try to sell those contracts back to the manufacturer for pennies on the dollar, who might then decide (if it is more profitable) to halt work on the factory - and either way they make money.
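A rough sketch of that payoff logic, with invented numbers (contract price, size, count, and the buyback rate are all assumptions):

  # Manufacturer's position under prepaid futures: boom vs. bust scenarios.
  PRICE_PER_DRIVE = 100.0
  CONTRACT_SIZE = 100_000   # drives per contract (assumption)
  CONTRACTS_SOLD = 500      # contracts sold up front (assumption)

  prepaid = PRICE_PER_DRIVE * CONTRACT_SIZE * CONTRACTS_SOLD

  # Boom: deliver the drives and keep the prepaid revenue.
  # Bust: buyers sell contracts back for pennies on the dollar, and the
  # manufacturer can halt factory work if that is more profitable.
  buyback_rate = 0.05  # "pennies on the dollar" (assumption)
  refund = prepaid * buyback_rate

  print(f"Prepaid revenue:       ${prepaid / 1e9:.2f}B")
  print(f"Bust-case refund paid: ${refund / 1e9:.2f}B")
  print(f"Kept even in a bust:   ${(prepaid - refund) / 1e9:.2f}B")

Either way the manufacturer comes out ahead; the open question is whether enough buyers show up for the contracts in the first place.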

That only works out if there are enough investors willing to pay for those futures. If the new factory can make a billion drives but they only have 2 of those futures contracts sold (that is 200k drives), they don't build the factory. Remember too, if they sell those contracts they are on the hook to deliver - if it is just investors, they will accept the street value of 100k drives in 2028, but some of the buyers might demand physical goods.

Every year a few farmers realize they are contracted to deliver more grain than they have in their bins and so have to buy some grain from someone else (often at a loss) just to deliver it. This isn't a common problem but it happens (most often the farmer is using their insurance payout to buy the grain - snip a very large essay on the complexities of this)

> If the new factory can make a billion drives but they only have 2 of those futures contracts sold (that is 200k drives) they don't build the factory.

But the AI companies are flush with cash and trying to buy everything, right? Why wouldn't they buy up as many futures contracts as the fab company needs to justify more fabs?

> Every year a few farmers realize they are contracted to deliver more grain than they have in their bins and so have to buy some grain from someone else (often at a loss) just to deliver it.

This is most commonly because they sold a futures contract for X bushels expecting to grow 2X but 75% of the crop failed and they only have 0.5X.

Semiconductor fab yields aren't as susceptible to how much it will rain next year and the companies generally have a pretty good idea of what their yields are for a given process node.

That is the question - will those AI companies buy the contracts?

Edit: actually it is worse - who else isn't buying contracts? If they build new capacity on contracts and AI collapses, the existing users will take up the contracts but the old capacity goes unused.

If they build the new fabs and AI collapses then they still got all the AI companies' money because they prepaid. The current market price of chips is then going to crash, but that's what happens when AI collapses regardless. Might as well sell them five years worth of chips rather than two years worth of chips before the cash cow dries up.

Meanwhile, the fab companies want to think about what happens if AI collapses, but the AI companies don't. What do they care if they get screwed on a contract the day after they go bankrupt regardless? So offer them a contract where they get screwed if they go bankrupt, e.g. prohibit them from using any of the hardware for anything but AI for five years. Then the hardware is locked into AI stuff regardless of whether AI dries up and you can still go sell the rest of the chips that aren't to PC OEMs etc.

Can you provide some solid examples of companies doing this in an industry with high capex? Yes futures exist but largely in commodity businesses. Because what you described sounds more like pre-purchase agreements which already exist. To have a futures market you would need investors and a product that is more of a commodity and not something highly engineered.

You are also forgetting that the payback period on a plant is not a single year; it spans many years, and most likely no buyer wants to arrange purchasing that far out.

I don't see how what you described is grounded in reality, even for "smart manufacturers".

There are futures markets for DRAM. Somewhat secretive (hard to find reliable price quotes) but they exist.
> Eventually the music will stop when the easy money runs out and we'll see how much people are truly willing to pay for AI.

Cheap hard drives and ram, yay! Perhaps GPUs too!

You wish. More likely all that data center capacity will be used to sell something just as nefarious, like VDI for the masses. You won't need RAM, disk and GPUs when you can rent those from OpenVDI.
It's hard to increase long-run production capacity for what seems to be clearly a short-term spike in datacenter buildout. Even if AI itself is not much of a bubble, at some point spending on new AI facilities has to subside.
This is what a business cycle looks like.

Seeing the first mover succeed, every Tom, Dick and Harry wants to emulate them. It distorts prices because people will pay a premium for everything. Then there is surplus supply and no takers. People are caught with their pants down and things go for cheap.

This repeats ad nauseam, whether it was building out ISPs in the early 2000s or the abundance of streaming services where every media company wanted one. Just because the corporate overlords don't want to look foolish for not following a trend.

AI is going to be what fiber was to the dotcom bubble. Someone spent a lot of money on a lot of infrastructure, some of which is going to be incredibly useful, but it will be sold for much less than it cost to build. Hardware just depreciates much, much faster than fiber networks.
I'm not saying that data center buildouts can't overshoot demand, but AI compute is different from fiber buildout. The more compute you have, the smarter the AI. You can use the compute to let the AI think longer (maybe hours/days/weeks) on a solution. You can run multiple AI agents simultaneously and have them work together or check each other's work. You can train and inference better models with more compute.

So there is always use for more compute to solve problems.

Fiber installations can overshoot relatively easily. No matter how much fiber you have installed, that 4k movie isn't going to change. The 3 hours of watch time for consumers isn't going to change.

Did you pay attention in computer science classes? There are problems you can't simply brute-force. You can throw all the computing power you want at them, but they won't terminate before the heat-death of the universe. An LLM can only output a convolution of its data set. That's its plateau. It can't solve problems, it can only output an existing solution. Compute power can make it faster to narrow down to that existing solution, but it can't make the LLM smarter.
Maybe LLMs can solve novel problems, maybe not. We don't know for sure. It's trending like they can.

There are still plenty of problems that more tokens would allow to be solved, and solved faster and better. There is absolutely no way we've already met AI compute demand for the problems that LLMs can solve today.

voxl · 14 hours ago
There is zero evidence that LLMs can do anything novel without a human in the loop. At most an LLM is a hammer. Not exactly useless by any stretch of the imagination, but yes, you need a human to swing it.
Every solution generated by an AI for a novel problem was ultimately rescinded. There is no trend, there is only hope.
LLMs are considered Turing complete.
Only if you instantiate it once.

If you use it like an agent and stick it in a loop and run it until it achieves a specific outcome, it's not.

Not really. You can leverage randomness (and LLMs absolutely do) to generate bespoke solutions and then use known methods to verify them. I'm not saying LLMs are great at this, they are gimped by their inability to "save" what they learn, but we know that any kind of "new idea" is a function of random and deterministic processes mixed together in varying amounts.

Everything is either random, deterministic, or some shade of the two. Human brain "magic" included.
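A minimal sketch of that random-propose-plus-deterministic-verify loop, using factoring as a stand-in problem (nothing LLM-specific here; it just shows the shape of the idea):

  import random

  def propose(n: int) -> int:
      return random.randrange(2, n)  # random candidate: the "creative" step

  def verify(n: int, candidate: int) -> bool:
      return n % candidate == 0      # deterministic check: the "known method"

  def solve(n: int, budget: int = 10_000) -> int | None:
      for _ in range(budget):
          c = propose(n)
          if verify(n, c):
              return c               # a verified answer, found by random search
      return None                    # budget exhausted, nothing verified

  print(solve(91))  # prints 7 or 13; which one you get is random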

You can't really use more compute because power is already the bottleneck. Datacenter buildouts are now being measured in GW, which tells you everything you need to know. Newer hardware will be a lot more power-efficient, but also highly scarce for that reason.
Energy is also being scaled up. But compute and fiber buildouts are fundamentally different, in my opinion.
baq · 18 hours ago
Current shortages are exactly the result of fabs not wanting to commit extra capex due to overbuild risk, while inference demand seems to be growing 10x YoY; you've famously got 8-year-old TPUs at Google running at 100% load.
ido · 18 hours ago

    Hardware just depreciates much, much faster than fiber
The manufacturing capacity expanded to meet the demand for new hardware doesn't (as much).
But if the demand drops for six months, the manufacturers are going to scale back production.

If it drops for a year, they're likely to start shedding capacity, one way or another.

This is not an equivalent situation. The vast, vast majority of what's being produced for this bubble is going to be waste once it pops.

I guess you could at least mine the boards from defunct AI companies for memory chips? Recent videos from Gamers Nexus showed it is apparently not that hard to transfer memory chips from board to board.

Then you can leach precious metals from the PCB itself.

Yes, but if they weren't overproduced and then either run into the ground or left waiting for demand that would never come, then wouldn't those memory chips and precious metals all still be more available than otherwise...?

I mean, sure, yes, we can try to recycle waste products—but it's still less wasteful not to produce them unnecessarily in the first place.

Agreed, but basic things like sanity seem to have left this hype cycle long ago.

Thankfully the real world and physics have the final word, so we can at least plan what to do once the whole thing runs into the wall of reality and make the best of the wreckage.

This goes beyond profits. It will be important for national security.

  There's clearly easy/irrational money distorting the markets here.
No, I think it is real demand.

AI will cause shortages in everything from GPUs to CPUs, RAM, storage, networking, fiber, etc because of real demand. The physical world can't keep up with AI progress. Hence, shortages.

AI simply increases computer use by orders of magnitude. Now you can suddenly use Seedance 2.0 to make, for $5, CGI that would have cost tens of millions five years ago.[0] Everyone is going to need more disk space to store all those video files. Someone in their basement can make a full-length movie limited only by imagination. The output quality keeps getting better, faster.

AI agents also drastically increase storage demands. Imagine financial companies using AI agents to search, scrape, organize data on stocks that they wouldn't have been able to do prior. Suddenly, disk storage and CPUs are in high demand for tasks like these.

I think the demand for computer hardware and networking gear is real and is only the beginning.

As someone who is into AI, hardware, and investing, I've been investing in physical businesses based on the above hypothesis. The only durable moats will be compute, energy, and data.

[0] https://seed.bytedance.com/en/seedance2_0

pjc50 · 17 hours ago
> The only durable moats will be compute, energy, and data

"Compute" is capital investment; normal and comprehensible, but on a huge scale.

"Data" is .. stolen? That feels like a problem which has been dodged but will not remain solved forever, as everyone goes shields-up against the scrapers.

"Energy" was a serious global problem before AI. All economic growth is traded off against future global temperature increases to some extent, but this is even more acute in this electricity-intensive industry. How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?

> All economic growth is traded off against future global temperature increases to some extent, but this is even more acute in this electricity-intensive industry. How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?

The premise here is that if we use more electricity then we burn more carbon. And the media hates AI, so if anybody restarts any coal-fired power plant to run a data center anywhere, that's the story. But then there's this:

https://electrek.co/2026/01/28/eia-99-of-new-us-capacity-in-...

Nobody actually wants coal because solar is cheaper.

And data centers are a pretty good combination for this because the biggest problem with solar and wind is what to do during multi-day periods of low generation, but data centers have backup generators and would be willing to turn them on whenever the cost of grid power is higher than the cost of operating them. Running some gas turbines for a week every two years in exchange for stabilizing the grid and being able to run on renewable power for the other 103 weeks is a pretty good outcome for everybody, not least because that amount of grid stabilization would exceed their consumption, i.e. allow more renewables to be added to the grid than they're using. If they can shed 1GW of load when a 2GW (long-term average) solar farm is generating at 50% of typical capacity for a week, you can add that 2GW of solar to the grid and remove 1GW of fossil fuels even while the data center is increasing consumption by 1GW.
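Just checking the arithmetic in that scenario (all the figures come from the comment itself and are hypothetical):

  solar_avg_gw = 2.0       # long-term average output of the solar farm
  bad_week_fraction = 0.5  # generating at 50% of typical output for a week
  dc_sheddable_gw = 1.0    # load the data center can shift to backup generators

  shortfall_gw = solar_avg_gw * (1 - bad_week_fraction)  # 1 GW missing
  print(f"Bad-week shortfall: {shortfall_gw} GW")
  print(f"Covered by load shedding: {dc_sheddable_gw >= shortfall_gw}")
  # Net effect: +2 GW of average solar, -1 GW of fossil capacity, even though
  # the data center itself adds 1 GW of average demand.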

> How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?

Billionaire. And they are definitely willing to make the trade.

one.. fully autonomous, self improving, replicating, general intelligence.
The question isn’t if the demand is real or not (supplies are low, so demand must exist). The question is if the demand curve has permanently shifted, or is this a short-term issue. No one builds new capacity in response to short term changes, because you’ll have difficulty recouping the capital expense.

If AI will permanently cause an increase in hard drives over the current growth curve, then WD, et al will build new capacity, increasing supply (and reducing costs). But this really isn’t something that is known at this point.

My post argues that the demand has permanently shifted.

By the way, plenty of people on HN and Reddit ask if the demand is real or not. They all think there's some collusion among the companies to keep the AI bubble going. They don't believe AI is that useful today.

Usefulness and overvaluation are not mutually exclusive. AI is useful, but it is not a fraction as useful as these companies' spending rates would have one believe.

If it is, then the world is going to lose pretty much all white collar jobs. That's not really the bright future they're selling either.

> My post argues that the demand has permanently shifted

The time horizon for this is murky at best. This is something you think, but can’t know. But, you’re putting money behind it, so if you’re right, you’ll make a good profit!

But for the larger companies (like WD), over building capacity can be a big problem. They can’t plan factory expansion based on what might be a short term bubble. That’s how companies go out of business. There is plenty to suggest that you’re right, that AI will cause permanently increased demand for computing/storage resources. Because it is useful and does consume and produce a lot of new data and media.

But I’m still skeptical.

The massive increase in spending can't be sustainable. We can't continue to feed the AI beast at this rate and still have other devices. Silicon wafer fabs can't be built on demand and take time. SSD/HD factories take time. I think we are seeing an expansion to see who the big players will be in the next 3-5 years. Once that order has been established, then I think we will fall back to more sustainable rates of demand. This isn't collusion, it's just market dynamics at play in a common market. Sadly, we are all part of the same pool, and so everything is expensive for all of us. At some point, though, the AI money will dry up or get more expensive. Then I think we'll see a reversion back to "normal" demand, maybe slightly elevated, but not the crazy jump we've seen for the past two years.

Us being in the same pool as AI is one of the potential risks pointed out by AI safety experts.

To use an analogy, imagine you're a small fluffy mammal that lives in fertile soils in open plains. Suddenly a bunch of humans show up with plows and till you and your environment under to grow crops.

Maybe the humans suddenly won't need crops any longer and you'll get your territory back. But if that doesn't happen and a paradigm change occurred you're in trouble.

AI can be useful today, while also being insanely overvalued, and a bubble.
There will be a bubble. It's inevitable.

The most important question is are we in 1994 or 2000 of the bubble for investors and suppliers like Samsung, WD, SK Hynix, TSMC.

What about 10 years from now? 15 years? Will AI provide more value in 2040 than in 2026? The internet ultimately provided far more value than even peak dotcom bubble thought.

> The internet ultimately provided far more value than even peak dotcom bubble thought.

Yeah, but not to the early investors. The early investors lost their shirts. The internet provided a lot of value after the bubble popped and everyone lost money.

azan_ · 17 hours ago
I wonder if I'm alone in being optimistic about this. I believe that the gigantic inflow of money into hardware will lead to a large increase in production capabilities, accelerated progress, and perhaps even new, better architectures.
I actually agree: a spike in prices due to bumping against capacity limits is way better than a downturn in the market. But this is only really true if AI hyperscalers are incented to space out their big buildouts over time (while raising their prices enough to ration current demand) so that suppliers can have some guarantee that their expanded capacity will be used.
> AI will cause shortages in everything from GPUs to CPUs, RAM, storage, networking, fiber, etc because of real demand.

Real demand, sure, I agree, but maybe not retail or business demand; at the moment the "demand" is entirely VC demand.

It's a really distorted market which is to be expected in any bubble/hype phase. The current retail/business demand doesn't appear to exist at the price point these investments require - even at the low low cost of "free, gratis and for nothing", not enough consumers and businesses are signing up.

The ones really going all-in on AI are the slop-producers. I dunno if slop is enough to pay back the investment into AI - I mean, even the slop producers are going to realise that paying $200/m to produce something in 1/10th of the time is a race to the bottom because someone else on the same plan is going to do the same, but cheaper.

> The physical world can't keep up with AI progress. Hence, shortages.

I think the word "progress" is inaccurate there - the physical world supplies product at the level of demand maintained by VC money.

It's not "cannot keep up with progress", it's "cannot keep up with demand from VCs".

> The only durable moats will be compute, energy, and data.

That'll be a first :-) Physical commodities have never been moats on their own before.

This fact never ceases to amaze me. It's so cool how relentlessly AI is pushing the horizons of our current hardware!

Maybe now we will start to see "optical" CPUs become a thing. Or 3D disk storage, or other groundbreaking technology.

Optical interconnect in the rack is a thing already. It's just a matter of time until it moves to single-PCB scale. And most persistent memories (competing with DDR memory for speed, and with far lower wearout than NAND) are of the "3D storage" type.
anthk · 17 hours ago
AI's output is not reproducible. It's a disaster.
If we want reproducible output we already have conventional software. Stop using a hammer on screws.
We do in fact want reproducible output. Which means LLMs are not fit for actual work.
anthk · 13 hours ago
Good luck in the near future. Science be damned, modulo the users of reproducible software and distros.
0-_-0 · 16 hours ago
So it's like humans then
This is wrong for all LLMs which have a temperature setting.

And even if they were guaranteed to be non-deterministic, there is still lots of value in many aspects of content generation.
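For what it's worth, a minimal sketch of what the temperature knob does at the sampling step (toy logits, not any particular model's API): as temperature approaches zero, sampling collapses to a deterministic argmax; above zero it's a weighted random draw.

  import math, random

  def sample(logits: list[float], temperature: float) -> int:
      # Near-zero temperature: greedy argmax, fully deterministic.
      if temperature <= 1e-6:
          return max(range(len(logits)), key=lambda i: logits[i])
      # Otherwise: softmax over temperature-scaled logits, then a random draw.
      scaled = [l / temperature for l in logits]
      m = max(scaled)  # subtract the max for numerical stability
      exps = [math.exp(s - m) for s in scaled]
      total = sum(exps)
      weights = [e / total for e in exps]
      return random.choices(range(len(logits)), weights=weights)[0]

  logits = [2.0, 1.0, 0.5]                        # toy values
  print([sample(logits, 0.0) for _ in range(5)])  # always [0, 0, 0, 0, 0]
  print([sample(logits, 1.0) for _ in range(5)])  # varies run to run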

xnx · 13 hours ago
> Normally this wouldn't be a problem: prices would go up, supply would eventually increase

Sounds right

> there's no price that is too high for these supplies.

Are you saying even higher prices won't increase supply? I don't understand.

A comment above yours clearly explains why:

https://news.ycombinator.com/item?id=47034480

xpe · 17 hours ago
> Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay.

This sounds like economic dogma based on pointing at some future equilibrium.

I like the saying that goes something like "life is what happens while you are waiting for the future". In the same way, it seems to me that equilibrium is increasingly less common for many of us.

Markets are dynamic systems, and there are sub-fields of economics that recognize this. The message doesn't always get out unfortunately.

> But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.

This feels like more dogma: find a convenient scapegoat: governments.

Time to wake up to what history has shown us! Markets naturally reflect boom and bust cycles, irrationality of people, and various other market failures. None of these are news to competent economists, by the way. Be careful from whence you get your economic "analysis".

Yes, this is why housing prices have dropped dramatically. The market stepped up and filled the demand needed, and now everyone can afford a place to live

.....

The housing market is a textbook example of the opposite of a free market. In most markets, anything that does not "improve the character of the neighbourhood" is impossible to build by design.
Most chunks of the computing market should be thought of as a textbook example of a 'free' market that operates with collusion, with the few well-monied "competitors" ensuring they don't put each other out of business.

Let's say you wanted to jump into the hard drive market. It's going to take you a few years to get there and many billions of dollars. By the time you're close to producing units, the existing players will suddenly drop prices to the point where you cannot turn a profit, for as long as they need to. Aka, your competitors can collude longer than you can remain solvent. And yes, in two decades you will win the court case against them. And other than a fine, nothing will happen, because there are so few manufacturers that they are too big to fail.

The better way of putting it is that collusion is ultimately limited by the ease of entry into and exit from the market. The existing players don't need to "suddenly" drop the prices when a new entrant appears, because they've been doing that already.

It's only when supply bumps into short-term capacity constraints and price must rise enough to fully ration demand that this assumption fails. But then even if a new entrant appears that's no reason to drop prices unsustainably, because why would you? They're not taking any share of the market away from you, they're serving new demand.

xpe · 15 hours ago
I can't tell if the comment above is sarcastic or serious: it could go either way.
This is 100% serious as it has not gone this way in the US. Housing prices keep going up in general and NIMBYism has stopped a massive amount of density growth.
Earlier, gamers got punished by crypto, and now they are being punished by AI.
pjc50 · 17 hours ago
"Punished" implies a moral valence to the whole thing which isn't there. It's not like the AI companies were aware of gamers and set out to do this. You simply got run over, like everyone else in front of the trillion dollar bulldozer.
tmtvl · 12 hours ago
"Don't make the mistake of anthropomorphizing Sam Altman. The lawnmower doesn't hate you"?
So what?

Why gamers must be the most important group?

Gamers are important because they are consistent customers. Crypto buying of GPUs is done (anyone still in this area is buying ASICs). Meanwhile gamers are still buying GPUs - they do sometimes hold off when the economy doesn't allow it, but you can trust that gamers will continue to buy GPUs to play their games, and thus they are a safe investment. It is rational to sell GPUs to a gamer for much less than to someone in crypto, because the gamer will be back (even if the gamer "grows up", there are more replacing them). Thus gamers are an important group while crypto is not.

The above was their prediction during the crypto boom, and it turned out correct. I'm not sure how AI will turn out, but it isn't unreasonable to predict that AI will also move to dedicated chips (or die?) in a few years, thus making gamers more important, because gamers will be buying GPUs when this fad is over. Though of course, if AI turns out to be a constant demand for more/better GPUs long term, they are more important.

Gamers are not the only important GPU market. CAD comes to mind as another group that is a consistent demand for GPUs over the years. I know there are others, they are all important.

the "value" of nvidia to the "AI" companies is their tsmc fab contract

they don't need CUDA, they don't need the 10 years of weird game support, even the networking tech

they need none of nvidia's technology moats

exactly the same as with crypto, where they just needed to make an ASIC to pump out sha256 as quickly as possible

which is really, really easy if you have a fab contract

at which point their use of nvidia dropped to zero

I think they're just a proxy/alias for 'state-of-the-art personal computing'.
Gud · 17 hours ago
I'd rather the average Joe have a good entertainment system than our corporate overlords have a good surveillance system.
The growth curve of technology has always pointed at the world becoming tiny and non-private.
Gud · 15 hours ago
Disagree.

Mass surveillance by corporations can be outlawed. Just because something is possible, doesn’t mean it must be necessarily so.

I travel a lot for work to different nations. The cultural differences are stark.

In the UK for example, they love their CCTVs. In Switzerland, they’re only allowed where they are deemed necessary.

I mean back in the cold war we started losing privacy to foreign governments. A parade of overhead satellites is capturing everything you do all the time.

As much as we expound on the rule of law, might makes right if the population isn't vigilant. Simply put, technology gives capability. In 1900 we didn't have the capability to monitor everything that everybody did all the time and keep those records for their entire life. Now we have technology that can do just that.

This has nothing to do with the law. Zip, zilch, nada. Switzerland is one dark day away from having all their behaviors recorded by businesses/governments.

At the end of the day legality is a theoretical construct, and technological capability is reality.

Gud · 11 hours ago
There are hardly satellites that capture everything I do.
GPUs before crypto had a lot less VRAM. Crypto investment funded a lot of stupid experiments, some of which did stick to the wall. I don't think gamers had their lives completely ruined by crypto in the end.
Crypto didn't need VRAM, did it? It was just about hash rate, no?

Besides, a 1080 had 8GB, a 5080 has 16GB. Double in 10 years isn't groundbreaking. The industry put VRAM into industrial chips. It didn't make it to consumer hardware.

What games have had to deal with instead is inference-based upscaling solutions, i.e. using AI to sharpen a lower-res image in real time. It seems to be the only trick being worked on at the moment.

I can't think of anything useful crypto did.

bko · 17 hours ago
Higher prices encourage more supply. Typically when you see an acute shortage, it's quickly followed by a glut as supply starts coming online in an overcorrection.
These factories take years and massive amounts of money to build. That, and there are so few manufacturers now that they are far more likely to collude.
Loved the reference. Probably from Margin Call[0]

0. https://youtu.be/fij_ixfjiZE

mcny · 18 hours ago
I like to imagine the reference in the movie Margin Call is that of a merry-go-round or a game of musical chairs. Like we are all on a ride, none of us is the operator, and all we can do is guess when the music will stop (and the ride ends).

The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations is, I'd guess, that there is more value here than what we have tapped so far, right?

The revenues that Nvidia has reported are based on what we hope we will achieve in the future, so I guess the whole thing is speculation?

TBF, all financial markets are speculation these days; the only thing that changes is the percentage of a share's price that reflects its actual value.

> The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reasons for the high valuation is, in my guess, that there is more value here than what we have tapped so far, right?

I think the value now comes from how we make a product of it, for example OpenClaw. Whether we like it or not, AI is really expensive to train, not only in money but also in resources, and the gains have been diminishing with each "generation". Let's not forget we heard promises that have not been fulfilled, for example AGI, or "AI could potentially cure cancer, with enough power".

And if you've been watching DeepMind, AI has been making advances in the medical sciences at a pretty damned fast rate. So "not fulfilled" is a pretty weak statement. The pipeline in medicine is very long.

And that's not even talking about the head-spinning rate at which robotics is advancing. The hardware we use for LLMs is also being used in robot simulation for hardware training, giving results in hours that used to take weeks or months.

I think AI companies are involving these other industries so that when the money runs out, they will claim the whole thing is too big to fail.
By buying flash and thus shifting demand to HDD? How does that work?
The article doesn't mention flash or HDD. It seems that all storage by WD is already sold.

My point is that directly or indirectly all hardware companies depend on memory and storage. If AI companies fall this could have repercussions to the whole industry.

The quote was from the screenwriter; he never said it.
Better stock up on used laptops. I'm going to buy another one this year. Those used ones usually don't last very long.

What if in the near future it is simply too expensive to own "personal" computers? What if you can no longer buy used computers from official channels but have to find local shops or sharpen up on soldering skills and find parts from dumps? The big techs will conveniently "rent out" cloud computer for us to use, in exchange of all of your data.

"Don't you all have cellphones?"

Bingo. A number of corporate interests don't want to let you own your personal computers for different reasons. Google/Apple wants you to get locked down devices, and cloud/AI providers want you to use their services from a weak client.
Time for folks to familiarize themselves with Linux distros designed to run on older hardware. My 2009 laptop runs great, with the exception of the browser. Oh and the fact that 32-bit software is harder and harder to find.
A full-sized laptop from 2009 will probably boot a 64-bit distro, even Pentium M was no longer in use by then.
Yes, I'm grateful I run Linux. You can get quite a bit done with 4GB RAM and a 6th generation (or even earlier) CPU. All 64-bit. I don't think such ancient hardware will be affected by AI demand to the same degree (though I think we'll still see some prices rise if people stop buying new stuff).

The worry is that at some point the older hardware will stop working.

I think about this too. There are several headwinds. Rent-seeking and collapse of economies of scale in the consumer sector for sure, but also I feel like we've basically peaked in hardware's ability to meet routine needs.

Once the phone makers realize that they can sell phones and docking stations to businesses because 90% of knowledge work seems to happen in a web browser through one SaaS or other I think personal computers will be cooked.

fwip · 14 hours ago
I don't think phones are really any cheaper than the mini PCs that businesses can already buy. Which makes sense, because a phone has to include a battery, touch screen, and is under tighter space constraints.
I have heard that you can get used laptops. But they do not come with memory or an SSD anymore, as even used components are now valuable enough to be removed and sold.
In a lot of cases, owners remove the storage not because it has any value but rather they don't want to risk making a mistake letting a device go that still has data on it.

Also pulling and shredding hard drives is cheaper than paying someone to run DBAN or equivalent (which can take many hours to complete on a large drive), and there's no easy way to securely erase an SSD if it wasn't encrypted from the beginning.

Many laptops from the last few years have soldered memory. Your previous laptop's SSD can also be reused, since those don't die as quickly as the laptop does.
Or worse, they have memory and SSD soldered on board, and are broken, so you have to learn soldering skills too.
Damn really? One of my go to moves when helping small political campaigns is to buy like a 2015 MBP and turn it into a locally hosted server to run all their stuff on the cheap.
Older MacBooks with socketed storage may actually be exempted because they use a proprietary connector instead of standard m.2.
I have 3 old employer laptops and my personal gaming laptop, which I use for work now. I'm happy about this now ;)

I probably will only need to return newest laptop if I leave the company.

Non-state-of-the-art lithography is pretty much commoditized (DDR3 & DDR4), so we will always have compute, albeit slower.
I no longer feel obligated to apologize for holding on to older devices for a long time. I have several desktops and laptops that are all still usable.
Why laptops specifically (and not also desktops)?
Eh, if this demand is really sustainable they will eventually start producing in adequate volume
It's still absolutely fascinating to me that basically the whole modern tech industry and the economic growth from it rests on the shoulders of a single company that has all of their important factories on a single island that's under constant threat of invasion. On top of that they themselves are reliant on a single company that's able to produce the machine required to print the wafers.

I don't know if TSMC has anything to do with hard drive production, but the reliance on very few players is also a problem in that industry.

mapt · 17 hours ago
Investors love a monopoly, and establishing this required more than a trillion dollars of investment sustained over a couple decades.
xpe · 17 hours ago
> Investors love a monopoly...

Indeed, investors left to their own devices act in this way. Underlying such a single point of failure is an implied but immense hope for, and thus pressure toward, stability. I wonder what the prediction markets are saying about current levels of geopolitical stability in Taiwan?

> Indeed, investors left to their own devices act in this way.

Interesting. Capitalism is often touted to be more decentralized than socialism, but this is an example of how it can centralize.

Socialism is always talked about in terms of how it works out in practice; capitalism is talked about in terms of how it works out in theory.
mapt · 10 hours ago
We don't even get that far in the US. It has been largely verboten in "nonpartisan" life, either socially or by dint of an active purge in e.g. academia and Hollywood, to discuss anything beneficial coming out of Soviet, Chinese, or Cuban administrations.

A partisan Republican will reliably interrupt you to shout nonsense, as if admitting a single positive outcome is trying to deceive them. As if a cost-benefit analysis can just be cut in half. As if these were not just authoritarian/totalitarian, but completely lacking domestic support.

This outcome was achieved with a great deal of money and propaganda over more than a century.

> rests on the shoulders of a single company that has all of their important factories on a single island

Isn't this just taking the oft-proposed explore vs exploit dichotomy to the logical conclusion of the "exploit" side?

Every single arbitrarily-finely-divided thing "should" be handled by the single (group|process) that has the greatest relative advantage at that one thing.

And you end up with the total variety/detailedness of everything matching what the substrate of the economy (ie, people with specialized training or education) has capacity to support. So at the limit there is at most one person who knows how to do any one specific thing.

(And the global economic system becomes infinitely fragile, but eh who's counting.)

There are three pillars for the bleeding edge, aren't there? TSMC, ASML, and the Spruce Pine quartz mine.
It's only this way because the American ruling class would rather ship jobs overseas to increase their wealth than competently establish an industrial sector that would pay good wages to average people.

Turns out letting a bunch of MBAs plan your economy is extremely foolish.

Hey now they went to school for at least 1.5 years and not all of that was at a 9th grade reading level! Some of it was 10th grade
> under constant threat of invasion

And isn't it also in a seismically active region, prone to earthquakes and/or tsunamis?

Prone to earthquakes, yes. Tsunamis, no.
> It's still absolutely fascinating to me that basically the whole modern tech industry and the economic growth from it rests on the shoulders of a single company

Stop getting your news from news.

> that has all of their important factories on a single island that's under constant threat of invasion.

Threat of invasion? Who would dare invade Taiwan when it's protected by China?

> I don't know if TSMC has anything to do with hard drive production

Then why bother commenting here?

> but the reliance on very few players is also a problem in that industry.

Ah, you have a political agenda.

Are these the picks and shovels?

Is the profitability of these electronics manufacturers more likely than that of the companies buying up all their future inventory?

If AI continues on this trajectory, sure, the gains likely go to the picks and shovels.

If AI has a bubble burst, you could see a lot of used hardware flood the market and then companies like WD could have a hard time selling against their previous inventory.

The problem is more likely that companies like WD don't know whether this is a bubble or not. Currently they can milk the market by raising prices and relying on their existing production facilities, maybe expanding a little. If there's going to be a crash, then it's better to have raised prices, even if just temporarily, than to be left standing with excess production capacity.

If it's long term, it would be better to be the front runner on additional capacity, but that's assuming continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.

Given how hard AI is on I/O, even if other hardware eventually goes second hand, I don't see hard drives doing so. Most of the hardware we'd get might be worn beyond redemption even at a price of free.
Actually used hard disks have gotten pretty popular in the past few years, with sellers like ServerPartDeals building up a reputation by selling drives that are properly tested and recertified. As long as you have redundancy and backups by all accounts they hold up pretty well.
I don't think the HDDs are being used for any intensive loads. They have too much latency for most of that. It's probably just archival storage for their scraped content and generated slop.
For "cold" archival storage you would want to use tape, which is far cheaper per TB at scale.
I don't mean that type of archive, but rather "just in case" data like "last month's scrape of this website" after we scraped it 5 more times this month or higher resolution versions of book scans. You might want to still be able to dump it out quickly if you need it. Money is no object for these companies and the cost of HDDs is more than low enough for the flexibility they provide.
If demand for hard drives is this high then it sounds like there wouldn't be near enough tape around either.
This is why I am buying a couple of LTO 6 tapes. Thus far I've been able to buy 4 for approx 120 EUR, 2,5 TB each. They have been around 30 EUR each the past years, and still are approx such price (leaning towards 35 EUR though). I bought a second hand drive for about 500 EUR, and a HBA for it.
Tapes are great for true cold storage (will easily last many decades!) but they will wear out significantly with more intense use: you only get a couple hundred passes total over their full data capacity, either read or write. In practice, you still need plenty of big hard disks to act as nearline storage for practical use, and the tape only rarely does storage and retrieval in bulk. This is also why you see mechanical tape libraries with tens or hundreds of tapes for a single read/write unit: you don't really need more than that.
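To put rough numbers on the economics in the two comments above (prices and capacities from the parent comment; the ~200-pass wear budget is an order-of-magnitude assumption, not a spec), a quick Python sketch:

    # Back-of-envelope math for the LTO-6 setup described above.
    TAPE_PRICE_EUR = 30        # per cartridge (parent comment)
    TAPE_CAPACITY_TB = 2.5     # native, uncompressed
    DRIVE_PRICE_EUR = 500      # second-hand drive (parent comment)
    NUM_TAPES = 4
    FULL_PASSES = 200          # assumed wear budget, order of magnitude only

    media_cost_per_tb = TAPE_PRICE_EUR / TAPE_CAPACITY_TB
    total_cost = NUM_TAPES * TAPE_PRICE_EUR + DRIVE_PRICE_EUR
    effective_cost_per_tb = total_cost / (NUM_TAPES * TAPE_CAPACITY_TB)
    lifetime_tb_per_tape = FULL_PASSES * TAPE_CAPACITY_TB

    print(f"media only:  {media_cost_per_tb:.0f} EUR/TB")      # ~12 EUR/TB
    print(f"incl. drive: {effective_cost_per_tb:.0f} EUR/TB")  # ~62 EUR/TB
    print(f"rough lifetime throughput per tape: {lifetime_tb_per_tape:.0f} TB")

The drive dominates at small scale: tape only amortises once you own dozens of cartridges, which is why it wins "at scale" but rarely for a handful of terabytes.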
Yes, I will use them as cold storage, nothing else. Right now, I have the following scheme:

1x a live server (Proxmox, NAS, firewall, and various other capabilities). 2x RAID1 enterprise NVMe, with NAS storage on RAID1 HDDs. 12th gen Intel, so relatively power friendly apart from the enterprise HDDs. 10 gbit local, remote 1 gbit fiber.

1x a live backup server in same city. Syncs every night. I should disable it otherwise, but I don't as of now, since it also gets a livestream of my doorbell camera (I don't use cloud for it). Has 1 gbit fiber, and RAID-1. Runs OS off RAID-1 (cannot add NVMe, older Synology, well maybe USB3 would work, but I'd rather not).

1x a backup server in same location as my home server which auto starts and syncs every week. This is my main old server, a Xeon so not very power friendly. Also RAID-1 NVMe. 10 gbit local.

1x a remote cloud (Norway), to have another copy of the most important data. Doesn't contain everything. Costs me only 50 EUR/year.

So that is a lot of copies of the same data, and quite frankly I not need this many. For the HDDs I want to get rid of RAID-1 and use either RAID-0 or JBOD, doubling the available data (I'm at the max as it is) while still having great data redundancy. And I will want to store my tapes off-site, although it wouldn't have the latest and greatest backups. I still have to look up how to do FDE with LTFS though, but I'll figure it out.

It also seems a good moment to sell some old hardware, given the current prices, but I am not sure if I will. Just something to ponder on later. You'd think I'd like to sell off the Xeon with its 30W CPU, but I quite like the machine (HP Microserver 10 Gen10 Plus). I'd rather sell the Synology which is still a decent machine, but I use voodoo to run recent software on it, and ZFS (with Homebrew / Nixpkgs). Tho neither is useful for ML.

What are companies needing all of these hard drives for? I understand their need for memory and boot drives. But storing text training data and text conversations isn't that space-intensive. There are a few companies doing video models, so I can see how that takes a tremendous amount of space. Is it just that?
Hearing about their scraping practices, it might be that they are storing the same data over and over and over again. And then yes, audio and video is likely something they are planning for or already gathering.

And if they produce a lot of video, they might keep copies around.

All the latest general-purpose models are multimodal (except DeepSeek, I think). Transfer learning lets them keep improving results even after they've exhausted all the text on the internet.
Speaking from personal experience: we treat cloud storage like an infinitely deep bucket. At-rest data efficiency is not really a consideration because compute costs are so absurd. Why worry about a $2M/year storage bill when your compute bill is $500M? It’s not worth the engineering time to optimize.
Storing training data: for example, Anthropic bought millions of second hand books and scanned them:

https://www.washingtonpost.com/technology/2026/01/27/anthrop...

All of Anna's Archive can fit on 40 drives.
Not if you "scan" them by recording 4K video of someone flipping page after page, you know, to teach multimodal models.
Facts. Anything less than 4K/120fps simply won't cut it in '26. Anthropic ain't just flipping pages, they're flipping the world.
I think the somewhat hallucinatory canned response is that they distribute data across drives for massive throughput. Though I don't know if that even technically makes sense...
I am surprised by that too. I thought everyone moved to SSDs or NVMe?

I was toying with getting a 2TB HDD for a BSD system I have; I guess not now :)

Everyone did move to SSDs or NVMe, and if you're right, that includes the manufacturers. HDDs still have advantages over SSDs for specific needs, like more reliable long-term unpowered storage. It's also possible that the high price of SSDs has made HDDs an option again.
Really, if you're writing large contiguous files, hard drives aren't that bad. If you can have the system write one file per drive at a time, you'll avoid a lot of fragmentation.
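A minimal sketch of that idea, assuming hypothetical mount points and plain whole-file copies (not a real archiver):

    import itertools
    from pathlib import Path

    # Round-robin whole files across mount points so each disk sees one
    # long sequential write instead of interleaved fragments.
    DRIVES = [Path("/mnt/disk1"), Path("/mnt/disk2"), Path("/mnt/disk3")]

    def spread_files(files, drives=DRIVES, chunk=16 * 1024 * 1024):
        """Copy each source file whole onto the next drive in rotation."""
        targets = itertools.cycle(drives)
        for src in map(Path, files):
            dst = next(targets) / src.name
            with src.open("rb") as fin, dst.open("wb") as fout:
                # Large chunks keep each drive's write stream sequential.
                while block := fin.read(chunk):
                    fout.write(block)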
I was recently involved in a large server purchase for work, where we wanted 72 hard drives of 24TB each for a server. They were available last year, but last month the largest we could get were 20TB drives.
My machine was built up from parts in 2014.

6-7 years ago when GPU prices went up, I hoped nothing would break. Last year when RAM prices went up I did the same. Now with drive prices going up, it's the same thing.

It's interesting because I've always built mid-tier machines over the years, and that used to be in the neighborhood of ~$700. Now the same thing is almost double that, but the performance is nowhere near twice as good for general computer usage.

This is the consequence of "I don't want to write this function myself, I'll get the plagiarism machine to do it for me"
And what's wrong with not wanting to write functions yourself? It is a perfectly reasonable thing, and in some cases (ex: crypto), rolling your own is strongly discouraged. That's the reason why libraries exist, you don't want to implement your own associative array every time your work needs it do you?

As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use; you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property laws, but it is narrower than plagiarism. For instance, writing an open-source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.

Copyright laundering is a problem though, and AI is very resource intensive for a result of dubious quality sometimes. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.

If I use a package for crypto stuff, it will generally be listed as part of the project, in an include or similar, so you can see who actually wrote the code. If you get an LLM to create it, it will write some "new original code" for you, with no ability to tell you any of the names of the people whose code went into that, and who did not give their consent for it to be mangled into the algorithm.

If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine can give proper attribution and context, it's not a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.

Why is "plagiarism" "bad"?

Modern society seems to assume any work by a person is due to that person alone, and credits that person only. But we know that is not the case. Any work by an author is the culmination of a series of contributions, perhaps not to the work directly, but often to the author, giving them the proper background and environment to do the work. The author is simply one that built upon the aggregate knowledge in the world and added a small bit of their own ideas.

I think it is bad taste to pass another's work as your own, and I believe people should be economically compensated for creating art and generating ideas, but I do not believe people are entitled to claim any "ownership" of ideas. IMHO, it is grossly egoistic.

Sure, you can't claim ownership of ideas, but if you verbatim repeat other people's content as if it is your own, and are unable to attribute it to its original creator, is that not a bit shitty? That's what LLMs are doing
If a human learns to code by reading other people's code, and then writes their own new code, should they have to attribute all the code they ever read?

Plagiarism is a concept from academia because in academia you rise through the ranks by publishing papers and getting citations. Using someone else's work but not citing them breaks that system.

The real world doesn't work like that: your value to the world is how much you improve it. It would not help the world if everyone were forced to account for all the shoulders they have stood on like academics do. Rather, it's sufficient to merely attribute your most substantial influences and leave it at that.

I honestly think it's not that simple.

The ones who spend billions on integrating public cloud LLM services are not the ones writing that function. They are managers who, based on data pulled out of thin air, say "your goal for this year is to increase productivity by X% with AI, while staffing goes slightly down".

I have to watch AI-generated avatars on the most boring topics imaginable, because the only "documentation" and link to an actual answer is in the form of a fake person talking. And this is encouraged!

Then the only measure of success is either AI services adoption (team count), or sales data.

That is the real tragedy and the real scale - big companies pushing (external!) AI services without even proof that it justifies the cost alone. Smooth talking around any other metric (or the lack of it).

In my experience LLMs mimic human thought, so they don't "copy" but they do write from "experience" -- and they know more than any single developer can.

So I'm getting tired of the argument that LLMs are "plagiarism machines" -- yes, they can be coaxed into repeating training material verbatim, but no, they don't do that unless you try.

Opus 4.6's C compiler? I've not looked at it, but I would bet it does not resemble GCC -- maybe some corners, but overall it must be new, and if the prompting was specific enough as to architecture and design then it might not resemble GCC or any other C compiler much at all.

Not only do LLMs mimic human thinking, but also they mimic human faults. Obviously one way in which they mimic human faults is that there are mistakes in the LLMs' training materials, so they will evince some imperfections, and even contradictions (since there will be contradictions in their training materials). Another way is that their context windows are limited, just like ours. I liken their hallucinations to crappy code written by a tired human at 3AM after a 20 hour day.

If they are so human-like, we really cannot ascribe their output to plagiarism except when prompted so as to plagiarize.

Yeah, this is slowing down growth and profits. The AI hype is sucking everything dry, from HVAC services to hardware.
These really are some of the toughest years for people trying to buy a computer. We only recently emerged from the cryptocurrency-driven crunch, and now AI has arrived—and this wave is far more severe than crypto. In that light, Apple’s Macs are actually quite good value for money.
This is all basically a textbook example of irrational market decisions. There’s clearly a bubble and not enough money coming in to pay for the AI bonanza.

It’s like building materials being in short supply when there are obviously more houses than buyers. That’s just masked at the moment because of all the capital being pumped in to cover for the lack of actual revenue. The structural mismatch is gigantic, and the markets are getting increasingly impatient waiting for the revenue to materialize.

Mark this post… in a few years folks will be coming up with creative uses for the cheap storage and GPUs flooding the market after people pick up the pieces of imploded AI companies.

(For the record, I’m a huge fan of AI, but that doesn’t mean I don’t also think a giant business and financial bubble is about to implode).

> in a few years folks will be coming up with creative ideas for cheap storage and GPUs flooding the market

COVID was six years ago. In that time, GPU prices haven't gone down (and really have only increased). Count me skeptical that there will be a flood of cheap components.

I feel like the most recent time you could reasonably get an nvidia *80 GPU at the store for a normal amount of money was almost a decade ago.
But that's a tautology. The Nvidia *80 GPU's MSRP has been unreasonable for that long (1080 launched at $600 May 2016, which IMO was already excessive).

But there was a window as recently as fall (3-5 months ago) where you could get most PC parts at MSRP. Granted it was a pretty short window, before the last dying whispers of crypto and COVID induced scarcity were overtaken by the surge of the AI bubble.

The 7800GTX was $600 too.

And yeah, fall is when I got my RTX 5080. It was still $1000 and I had no idea how lucky I was when I pulled the trigger. Still felt like I was trawling discord bots though to get a founders edition.

That was largely due to the crypto mining bubble popping just as the AI bubble was getting started
Is there an industrial bubble? Probably.

> It’s building materials being in short supply when there’s obviously more houses than buyers.

That I think is a hard one to prove and is where folks are figuring it out. There is obvious continued demand and certainly a portion of it is from other startups spending money. I don’t think it’s obvious though where we are at.

More reckless and irresponsible than irrational.
I am interested in hearing how to get hard drives to last longer. Should you keep them locked away in the closet? Spin them up occasionally but not too much? Keep them always-on? I understand the less reading and writing, the better.

How does external compare to internal, if at all? Is 3.5" going to last longer than something smaller?

Spinning HDDs will eventually be at risk of failing for purely mechanical reasons, so beyond handling them with care you can't really do all that much. Keeping them always-on may be a viable strategy for drives that are already mostly on, otherwise, just spin them up once in a while, but don't expect this to lower risk significantly. An old drive should simply have its contents transferred to new media, and then be treated as something ephemeral that may fail at any time without warning.
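A minimal sketch of that transfer step, with example paths; hashing both sides catches silent read errors on the aging drive before you retire it:

    import hashlib
    import shutil
    from pathlib import Path

    def sha256(path, chunk=1 << 20):
        """Hash a file in 1 MB chunks so large files don't exhaust RAM."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def migrate(src_root, dst_root):
        """Copy a tree to new media and verify every file byte-for-byte."""
        src_root, dst_root = Path(src_root), Path(dst_root)
        for src in src_root.rglob("*"):
            if not src.is_file():
                continue
            dst = dst_root / src.relative_to(src_root)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            assert sha256(src) == sha256(dst), f"mismatch: {src}"

    # migrate("/mnt/old_disk", "/mnt/new_disk")  # example paths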
I console myself with knowledge of the economics maxim that every supply shortage is usually, eventually, followed by a supply glut.

One can only hope that that's the principle at work here, anyway. It could also be a critically damped system for all I know. Unfortunately I studied control systems too...

If storage and memory manufacturers don't respond by increasing supply, there might not be a glut, just postponed demand that slowly gets fulfilled over a longer period. That is, if we were in a steady state.

On the other hand, if there is bigger economic turmoil, the postponed demand might never materialise because there is no purchasing power...
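For anyone who wants to play with the control-theory framing in this sub-thread, here is a toy second-order step response; the parameters are made up and this is an analogy, not a market model:

    import math

    # Unit-step response of a damped second-order system. Read the output
    # as "capacity/price deviation after a demand shock": zeta < 1
    # overshoots (a future glut), zeta == 1 is critically damped.
    def response(zeta, omega=1.0, t_end=20.0, steps=200):
        out = []
        for i in range(steps):
            t = t_end * i / steps
            if zeta < 1:
                wd = omega * math.sqrt(1 - zeta**2)
                y = 1 - math.exp(-zeta * omega * t) * (
                    math.cos(wd * t)
                    + zeta / math.sqrt(1 - zeta**2) * math.sin(wd * t)
                )
            else:  # zeta == 1, critically damped
                y = 1 - math.exp(-omega * t) * (1 + omega * t)
            out.append(y)
        return out

    print(f"underdamped peak:       {max(response(0.3)):.2f}")  # overshoot = glut
    print(f"critically damped peak: {max(response(1.0)):.2f}")  # no overshoot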

I was thinking that until my NAS gave me an error on one of my hard drives; now I'm in the market for a replacement while I still have redundancy.
People with a control theory background are welcome in economics; the field is more diverse than some would recognize. Certain professions and subfields are more open than others. There are plenty of economists who care about things like resilience and dampening shocks.

I would love it if more non-traditional economists got involved in the public sphere, by which I mean writing about economic trends, public policy, regulation, rate adjustment, etc.

As an engineer with a passing control theory background and a breadth of general knowledge, I'd love to explore this space more and find a way to apply my knowledge and share the results. Are there any particular problems you think well-suited to this treatment?
If you have a policy area you like you might start there. From my lens, here are some interesting ways to look at political economy from a broader point of view: economic disruption from AI (could be from energy prices, labor substitution, and lots more), climate modeling and its impacts on economies, conservation and ecosystem stability, and economic growth under different levels of inequality. I would add this to the mix even though it isn't a typical economic area: geopolitical destabilization from autonomous weapons, both physical and cyber.
Those are definitely all areas of interest for me as well. Thanks for the pointers. Do you write anywhere?
There's always πfs. https://github.com/philipl/pifs
I'm not against subsidies, but the concentration is a problem. This money could have spurred grassroots participation in these emerging industries, but instead they chose the most heinous of monocultures, leaving billions of people out of the loop.
The only way to combat this is the same way to combat toilet paper shortages.

The market says the more you buy, the better pricing you get. Once you start capturing a large share of a product, the price should instead go up, not down, and exponentially.

For example, a person who owns 10 houses is restricting the ability of others to own a single home. Increasing the cost of excessive product ownership would reduce the amount of product people hoard and allow others to gain access to it.

The laws of supply and demand are not optional.

If you try to use government to force reality to conform to your idea of how things should work you're just going to get 1,000 companies buying 10,000 hard disks each rather than 10 companies buying 1,000,000 each. And if you try to outlaw that somehow then the market will just route around your new scheme in another way, creating even more unintended consequences in the process.

If you must meddle, you're much better off working with market forces rather than trying to fight against them.

>The only way to combat this is the same way to combat toilet paper shortages.

I buy one hard disk; an AI company buys 40% of global supply. Me not buying that one hard disk is not going to change anything.

>According to Western Digital, thanks to a surge in demand from its enterprise customers, the consumer market now accounts for just 5 percent of the company's revenue.

This only works if you exclude datacenters Georg who lives in a server rack and buys 10,000 hard drives a day.

Unfortunately, there are several such outlier entities which collectively control enough resources to price literally everyone else out.

It's interesting to see claims here that the spending is irrational, but actually, even if AI improvements slow down, it's more rational for companies to overspend and underutilize the machines than to underspend and get disrupted.

On the other hand, lots of people here are even more uncomfortable with the other possibility, which is quite real: AI software algorithms may scale better than the capacity of the companies that make the hardware. Personally I think hardware is the harder of the two to scale, and this is just the beginning.

Great. I’ve just returned a WD drive to Amazon after it arrived crushed in a torn-open paper bag.

The replacement arrived also in a paper bag and went straight back, this time for a refund.

I guess I should have kept that one and hoped for the best.

Good alternatives? I’ve only recently been enlightened on how profoundly sh__ty SSDs are for long-term storage, and I have a whole lot of images my parents took traveling in the last few years of their lives.

I'm sure Amazon isn't the only shop that delivers to your area
The premise of this news is that prices are going to climb and availability is going to drop.

And I’m not keen on having anyone ship me one of these anymore.

Walmart sells what appears to be an older version of the drive and I might have to cross my fingers and just get one of those.

> And I’m not keen on having anyone ship me one of these anymore.

Isn't that what you're doing ordering off Amazon, with their commingled inventory?

Besides, there's a spectrum of sellers between "Amazon" and "anybody"; you can even, perhaps, purchase directly from the manufacturer.

I meant that after the Amazon experience, I don't want to buy a HDD online. Would much prefer to get it locally in person.
Best Buy with local pickup; they price-match Amazon. B&H Photo is another option.
It started with RAM; now with hard drives and SSDs. This is not looking good. But at least you can buy used ones for a pretty good price, for now.
This is getting ridiculous. Never before has an unwanted product been thrust so forcefully and artificially into the market that it disrupts the supply line for real products with actual demand.
HDDs, RAM, and chips have so many health metrics and methods that you really shouldn't be afraid to buy them used. The only special requirement is a RasPi test rig. That and a 30 day return window.
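As a sketch of that kind of vetting, here is an example that shells out to smartmontools (`smartctl -A`); the watchlist uses common ATA attribute names, which vary by vendor, so treat this as a starting point rather than a definitive check:

    import subprocess

    # Attributes that most often betray a dying or heavily used drive.
    WATCHLIST = {"Reallocated_Sector_Ct", "Current_Pending_Sector",
                 "Offline_Uncorrectable", "Power_On_Hours"}

    def smart_attributes(device):
        """Return {attribute_name: raw_value} for watched SMART attributes."""
        out = subprocess.run(["smartctl", "-A", device],
                             capture_output=True, text=True).stdout
        attrs = {}
        for line in out.splitlines():
            fields = line.split()
            # smartctl -A rows have 10 columns; the raw value is the last.
            if len(fields) >= 10 and fields[1] in WATCHLIST:
                attrs[fields[1]] = fields[9]
        return attrs

    print(smart_attributes("/dev/sda"))  # run as root on the test rig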
I feel like the goal is to avoid buying something broken in the first place, not just to be able to tell if you've bought something that turns out to be broken
Presumably they're also looking to increase production capacity as fast as possible - within the year?

I'd have thought HDDs aren't at the top of the list for AI requirements; are other component manufacturers struggling even more to meet demand?

Why would they?

If we weren’t talking about AI, was there another high demand sector / customer for spinning platters?

And their margins get fat now that supply is relatively constant but AI demand has saturated their current production numbers.

I listed some hard drives on Friday on eBay.. most of them refurbished... within 5 minutes got a message from a person who wanted them all... shipped them an hour later
I built a new server this time last year. My board does 6-channel RAM, so I bought 6x32GB ECC DDR5 at $160 a stick at the time. Just for grins I looked up the same product number at the same supplier I originally bought from: $1300 apiece. One of the VMs running on that server is TrueNAS, with four 20TB WD Red Pros. God help me if I have to replace a drive.
Best Buy is actively selling 2x8GB sticks of DDR4-3200 for $80 a stick. I was floored. Ten bucks a gig, $160 for the pack.

We're fucking doomed.

Ten bucks a gig is lower than what some DDR5 memory is selling at.
Perhaps there is an incentive to go back to OS that can operate with 640KB RAM ... /s
Is this for NVMe only, or spinning drives too? I use both, but I actually have use cases for HDDs and hope those are less affected.
It’s affecting both. HDD maybe slightly less/slower, but you’re paying significantly more than six months ago in any case.
All I know is I saw most of my go-to refurbished enterprise HDDs go for 2-3x this past Black Friday compared to a year prior.
This particular news is about spinning drives; for the other types we already had news about upcoming shortages earlier on.
I bought 6x refurbished Ultrastars for ~$100/ea on Black Friday 2024. They were over $200/ea in 2025. Samsung T7 (and Shield) SSDs have gone 2x-3x. Can’t get a 1TB one for less than like $180 right now. It’s ridiculous.
Bought a few 2TB T7 shield disks last year before the boom. Thank fuck I did it then.
Take good care of them!
I am happy I bought 5x10TB drives two months ago, anticipating this exact scenario.
It's not only storage; the cheapest 32 GB of RAM I can find is around 200 euros.
That's actually a bargain, average market price (though highly volatile) is more than double that.
They're probably looking at DDR4.
sighs at her local backup drive that just gave up the ghost

thanks, AI-boosting motherfuckers, thanks a lot

Rotating or SSD?
I pray someone steps into the market and takes all their consumer/SME business from them. I know that's not that simple, or at all probable, but it would be a great time to take market share, and great for us little guys.
Any point in setting my laptop to minimal power usage by default and drives to sleep aggressively, in order to try to extend their life? Or is sleep an extra power cycle, and is it better not to sleep my M.2 drives so they aren't being powered up/down?
The main thing that shortens the lifetime of solid state drives is sustained writes over time. You should disable all system logging options and all "spotlight databases for fast search", avoid swap use and not let the drives "sleep" too aggressively since this will force the system to persist some volume of writes from RAM to disk that might have turned out to be unnecessary down the road.
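To see why write volume matters, some back-of-envelope wear-budget arithmetic (both figures below are assumptions; substitute your drive's rated TBW and your observed host writes):

    RATED_TBW = 600        # assumed rating, e.g. a typical 1 TB TLC drive
    DAILY_WRITES_GB = 40   # assumed; logging + swap + indexing adds up

    years = RATED_TBW * 1000 / DAILY_WRITES_GB / 365
    print(f"~{years:.0f} years to the rated limit at {DAILY_WRITES_GB} GB/day")
    # Halving background writes roughly doubles that horizon, which is
    # the point of disabling logging/indexing and avoiding swap.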
The reason this is a problem is that whatever value AI may have (personally I’m as long as one can get), companies believe that right now it’s a government-sponsored financial bubble.

So they’re unwilling to spend on increasing capacity because they don’t expect this demand to last.

Good luck to everyone. Hope you made some reserves.

Yes, AI is nice, but I also like to be able to buy some RAM and drives…

The future is thin clients for everyone, requiring a minimal amount of RAM and storage because all they are is a glorified ChatGPT interface.
I'm running multiple services such as Forgejo, Audiobookshelf, Castopod and they all need no more than roughly 100 MB RAM.

There is one exception though. Open WebUI with a whopping 960 MB. It's literally a ChatGPT interface. I'm only using external API providers. No local models running.

Meanwhile my website that runs via my own Wordpress-like software written in Rust [1] requires only a few MB of RAM, so it's possible.

[1]: https://github.com/rikhuijzer/fx
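If you want to sanity-check numbers like that on your own box, here is a small sketch using the third-party psutil library (the service names below are just examples from this comment):

    import psutil  # pip install psutil

    def rss_mb(name_fragment):
        """Sum resident memory across processes whose name matches."""
        total = 0
        for p in psutil.process_iter(["name", "memory_info"]):
            if name_fragment.lower() in (p.info["name"] or "").lower():
                total += p.info["memory_info"].rss
        return total / 1024**2

    for svc in ["forgejo", "open-webui"]:  # example service names
        print(f"{svc}: {rss_mb(svc):.0f} MB resident")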

Is this an inevitable future? The amount of people ready to cede their computational resources, thinking, digital sovereignty, to centralised platforms, all in the name of progress, is truly shocking to me, especially in the current political moment.

The main reason I do not prioritise AI usage in my own life is to retain my skills and mental acuity. All of the forms of computing and opportunities that I value do not require AI to achieve. I can understand why people feel differently from me, though, because AI and AI-adjacent things are where all of the money is right now.

No, clearly not. First, future is never inevitable, and second, I was being ironic. Though the fact is that thin vs rich clients is one of the major fashion cycles in the IT industry.
My apologies. It's so hard to tease apart irony from genuine mouth frothing on HN these days.

C'mon, bubble, burst...

No worries! I definitely sympathise. Irony is truly dead these days.
You know what the sad part is? I don't think software developers or LLMs know how, or want, to make low-resource-consumption software that runs on a thin client anymore. It will be some browser-based thing that grows to whatever memory is available on the system.
Even if the AI bubble bursts, having successfully cornered the compute market they can just go rent seeking instead by renting out cloud workstations, given that they've made the hardware to build a workstation yourself unaffordable.
It won't last. If the demand is sustained then new factories will open up and drive the price down.

More likely a couple of big financing wobbles lead to a fire sale.

It isn't practical for HDD supply to stay wedged, because in 5 years the disks start failing.

Do they really think they will get some money from the AI Ponzi scheme?

Well, at least they might still have a product to sell once the AI bubble pops, unlike NVIDIA, which does seem to have kinda forgotten to design new consumer GPUs after getting high on AI money.

They haven't forgotten, they've expressly decided to soft-pivot away from consumer GPUs. RTX 60x0 series is apparently coming in 2018… (oops, 2028. No time travel involved. Probably). If the bubble has burst by then.
> RTX 60x0 series is apparently coming in 2018

That's either a typo, or NVidia has achieved some previously unheard of levels of innovation.

They're hedging on LLMs inventing time travel any day now.
> "apparently coming in 2018… maybe. If the bubble has burst by then."

Spoiler from the future: it hasn't. Get your investments in while you have time.

Take a look at prices of SSDs and RAM too.
The current spikes tempt me to sell off my home lab: a Mac mini to sell to the open claw bros, a 5TB HDD, an Intel NUC, some SSDs, and a 5-year-old Dell laptop. I can always buy back after the crash.
You could buy a house and a private jet with these if you held just one more year.
Does that only include SSDs, or does it include HDDs as well?
It includes all forms of storage except for USB devices, GPUs and high-end CPUs. The latter you can still get, but you're going to have some severe sticker shock.
Maybe shucking USB HDDs is the short-term answer.
Is that still possible? Aren't they native USB with no adapter?
Those drives are SATA inside the case.
That depends on the brand. The lower priced brands, yes, those can be SATA, the more vertically integrated companies also make custom PCBs that just have USB-C without any SATA interface exposed internally.
I've shucked WD MyBook drives, just a plain SATA inside. I guess that it's cheaper to have a stock drive and a cheap SATA-USB adaptor in a shell than do custom electronics. I've not heard of any that are otherwise, but I've only done a few. I suppose it's possible that they could solder them in or have custom electronics but I would have thought that rare. It's frequently discussed on Reddit too, so there's plenty of folk doing this.
Do you mean the big ones or the SSD ones?
The big ones with a separate power brick; I've not looked inside the smaller USB-powered ones. My interest was in the >8TB desktop drives. I'd imagine they're the same deal, but it's hard to say from the outside. I did have a part of one at one point, a USB-to-SATA circuit board that was useful for ad hoc connecting of 2.5" drives, but I can't recall if that came from a prebuilt or an old BYOD enclosure.

I have an old small WD one that's kinda faulty (plugged it in, then put it down heavily; it's not been right since). I should pop it open just to see what's inside, but it's older than the USB-C models so things could easily have changed. In any case, I don't think AI is eating the stock of slow laptop HDDs, so I'm not sure there's any need to buy these just for shucking.

Ok. So I have a bunch of them here, different sizes, both SSD and spinning rust. The big ones are all consumer-grade drives with a little adapter board like you describe. The small ones are a mix: some have a single custom board with a USB connector, others are adapter-board based. The telltale is the outer length: if the case is a little longer than a standard drive, you have a very good chance of an adapter board; if it's the same size or smaller than the standard format, almost all of them are custom boards. The really nice ones have NVMe guts that you can immediately repurpose.
It's probably feasible to make a "mass storage USB in, SATA protocol out" smart adapter board.
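One cheap pre-shuck probe along those lines: if the bridge answers SAT passthrough queries via smartmontools, there is very likely a standard SATA drive behind it. The device path is an example, and this assumes smartctl is installed and run as root:

    import subprocess

    def looks_like_sata_bridge(device):
        """True if the USB bridge passes SAT commands to a SATA drive."""
        out = subprocess.run(["smartctl", "-d", "sat", "-i", device],
                             capture_output=True, text=True).stdout
        return "ATA Version" in out or "SATA Version" in out

    print(looks_like_sata_bridge("/dev/sdb"))  # example device path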
I see, but if you plan on shucking, you obviously get ones you know can be.
I read it as both, but UK suppliers have stock of various SATA HDDs available in large and small sizes. It's hard to say if prices will rocket or availability decline, or both. I don't normally advocate panic-buying, but if it's needed now is the time. I have one NAS spare on hand, I don't want or need a drawer full of them, but it'll be a royal pain if I do and can't get parts.
Lower performance/capacity consumer drives might be comparatively safe because there's Chinese end-to-end production capacity for those. Of course the price can still increase, but probably not that much.
Admittedly I've not been tracking price, only availability, and only in the size I might need.
I might have been more concerned if it were a different drive manufacturer. Some users won't forgive them for the WD Red debacle, where they lied about what was in the drive.
Not only does AI want to take my job, it also wants to make my hobbies unaffordable.

I love modern world so much /s