There is no such thing.
But now, are they really going to undermine this partnership for that? Their GPUs probably aren't going to become a cash cow anytime soon, but this thing probably will. The mindset among American business leaders of the past two decades has been to prioritize short-term profits above all else.
I feel bad for gamers - I’ve been considering buying a B580 - but honestly the consumer welfare of that market is a complete sidenote.
...and yet Nvidia is not gambling with the odds. Intel could have challenged Nvidia on performance-per-dollar or performance-per-watt, even if they failed to match performance in absolute terms (see AMD's Zen 1 vs. Intel).
Any down-the-road repercussions be damned, from their perspective.
I'm sorry, that's just not correct. Intel is literally just getting started in the GPU market, and their last several releases have been nearly exactly what people are asking for. Saying "they've lost" when the newest cards have been on the market for less than a month is ridiculous.
If they are even mediocre at marketing, the Arc Pro B50 has a chance to be an absolute game changer for devs who don't have a large budget:
https://www.servethehome.com/intel-arc-pro-b50-review-a-16gb...
I have absolutely no doubt Nvidia sees that list of "coming features" and will do everything they can to kill that roadmap.
To that point, they've been "just getting started" in practically every chip market other than x86/x64 CPUs for over 20 years now, and have failed miserably every time.
If you think Nvidia is doing this because they're afraid of losing market share, you're way off base.
> 224 GB/s
> 128-bit
The monkey's paw curls... I love GPU differentiation, but this is one of those areas where Nvidia is justified in shipping less VRAM. With less VRAM, you can afford more memory controllers and push higher bandwidth out of the same memory!
For instance, both the B50 and the RTX 2060 use GDDR6 memory. But the 2060 has a 192-bit memory bus, and enjoys ~336 GB/s bandwidth because of it.
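For the curious, the arithmetic is just bus width times effective data rate. A quick sketch (assuming both cards run 14 GT/s GDDR6, the speed grade that makes both quoted numbers line up):

```python
# Peak GDDR6 bandwidth = (bus width in bytes) x (effective data rate in GT/s).
# The 14 GT/s figure is an assumption: a common GDDR6 speed grade that
# reproduces both quoted numbers exactly.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gt_s

print(bandwidth_gb_s(128, 14.0))  # Arc Pro B50, 128-bit bus -> 224.0 GB/s
print(bandwidth_gb_s(192, 14.0))  # RTX 2060, 192-bit bus    -> 336.0 GB/s
```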
Preempting a (potential) future competitor from entering a market is also an antitrust issue.
Besides, who would actually use them if they don’t support CUDA?
Everyone designs better GPUs than Intel - even Apple's ARM GPUs were outpacing Intel's for a decade before the M series.
Though fair and free markets are not at all what the current regime in the US believes in; instead it will be consolidation, leading to waste and little innovation or progress.
But that's exactly what they started doing with Battlemage? It's competitive in its price range and was showing generational strides.
> Besides, who would actually use them if they don’t support CUDA?
ML is starting to trend away from CUDA towards Vulkan, even on Nvidia hardware, for practical reasons (e.g. performance overhead).
https://www.intc.com/news-events/press-releases/detail/1750/...
What’s old is new again: back in 2017, Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped: https://www.tomshardware.com/news/intel-discontinue-kaby-lak...
IMHO we will soon see more small/quiet PCs without a slot for a graphics card, relying on integrated graphics. nVidia has no place in that future. But now, by dropping $5B on Intel they can get into some of these SoCs and not become irrelevant.
The nice thing for Intel is that they might be able to claim graphics superiority in SoC land since they are currently lagging in CPU.
This was all for naught as AMD purchased ATi, shutting out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at this point in time. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform that could operate over an anemic PCIe x1 link. These really were the beginnings of strategic moves to lock Nvidia out of their own market.
Fortunately, Nvidia won a lawsuit against Intel that required them to have PCIe x16 connectivity on their main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space such that the APU takeoff never happened. If Intel had actually developed their integrated GPUs or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.
To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy and cooling solutions are getting crazy.
But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both those spaces, so I'm not super surprised at the desire for this pairing. But I did think Nvidia had given up these ambitions and was planning to build an adjacent ARM-based platform as a potential escape hatch.
This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].
> Intel started releasing platforms with very little PCIe connectivity,
This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, 865GV, etc., which all lacked AGP. [2]
[0] - Aladdin V with its AGP instabilities, MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e. two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings; 3 rows usually were 'ok-ish' and 2 were stable.)
[1] - SIS 730 and 735 were great chipsets for the money and TBH the closest to the AMD760 for stability.
[2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a Geforce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.
[3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon', would specifically call it an AMD Radon... and now here we are. Oddly prescient.
I realize the AGX is more of a low-power solution, and it's possible that Nvidia is still technically limited when building SoCs, but this is just speculation.
Does anybody know the actual ground-truth reasoning for why Nvidia is buying into Intel, despite the fact that Nvidia can make their own SoCs?
Make no mistake - there is no reason to do this besides shortening the hardware lifespan with Box86. But it is possible, most certainly.
/me picturing Khaby Lame gesturing his hands at an obvious workaround.
Doesn't feel the same because the 1997 investment was arranged by Apple co-founder Steve Jobs. He had a long personal relationship with Bill Gates so could just call him to drop the outstanding lawsuits and get a commitment for future Office versions on the Mac. Basically, Steve Jobs at relatively young age of 42 was back at Apple in "founder mode" and made bold moves that the prior CEO Gil Amelio couldn't do.
Intel doesn't have the same type of leadership. Their new CEO is a career finance/investor instead of a "new products new innovation" type of leader. This $5 billion investment feels more like the result of back-channel discussions with the US government where they "politely" ask NVIDIA to help out Intel in exchange for less restrictions selling chips to China.
Stinks of Mussolini-style Corporatism to me.
If you fiddle and concentrate only on the top performers, the bottom falls out. Most of the US economy is still in small companies.
Let's assume the Trump admin pressured Nvidia to invest in Intel.
The CHIPS Act (passed by Democrats under Biden) gave Intel up to $7.8 billion of YOUR money (taxes) in the form of direct grants.
Was it more of "Mussolini-style corporatism" to you or not?
It isn't the "method of communication". It's legislation vs. coercion (in the speculative scenario from the parent comment).
https://www.ft.com/content/12adf92d-3e34-428a-8d61-c91695119...
- a bigger R&D budget for their main competitor in the GPU market
- since Nvidia doesn't have their own CPUs, they risk becoming more dependent on their main competitor for total system performance.
This is why they built the Grace CPU - noting that they're using Arm's Neoverse V2 cores rather than their own design.
Had Apple failed, Microsoft would probably have been found to have a clear monopolistic position. And Microsoft was already in hot water due to Internet Explorer, IIRC.
Apple's demise would've nailed the case.
> Nvidia will also have Intel build custom x86 data center CPUs for its AI products for hyperscale and enterprise customers.
Hell has frozen over at Intel. Actually listening to people that want to buy your stuff, whatever next? Presumably someone over there doesn't want the AI wave to turn into a repeat of their famous success with mobile.
In the event Intel ever does get US-based fabrication semi-competitive again (and the national security motivation for doing so is intense), nVidia will likely have to be a major customer, so this does make sense. I remain doubtful that Intel can pull it off; it may have to come from someone else.
They turned down Acorn about the 286, which led to Acorn creating the Arm, they have turned down various console makers, they turned down Apple on the iPhone, and so on. In all cases they thought the opportunities were beneath them.
Intel has always been too much about what they want to sell you, not what you need. That worked for them when the two aligned over backwards compat.
Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
It leads to mistakes like you mention, where a new market segment or new entrant is not a sure thing. And then it leads to mistakes like Larrabee and Optane where they talk themselves into overconfidence (“obviously this is a great product, we wouldn’t be doing it if it wasn’t guaranteed to make $1B in the first year”).
It is very hard to grow a business with zero risk appetite. You can’t take risky high return bets, and you can’t acknowledge the real risk in “safe” bets.
2010-2011 was also the time that AMD were starting to moan a bit about DX11 and the higher-level APIs not being sufficient to get the most out of GPUs, which led to Mantle/Vulkan/DX12 a few years down the road. Intel did a bit regarding massively parallel software rendering, with the flexibility to run on anything x86 and implement features as you liked. AMD had its efforts toward 'Fusion' (CPU+GPU, after recently acquiring ATi) and HSA, which I seem to recall was about dispatching different types of computation to the processor(s) in the system best suited for it. However, I got the impression a lot of development effort was more interested in progressing on what they already had instead of starting in a new direction, and game studios want to ship a finished and stable/predictable product, which is where support from Intel would have helped.
But certainly Intel wasn’t willing to wait for the market. Didn’t make $1 billion instantly; killed.
This relates to the Intel problem because they see the world the way you just described, and completely failed to grasp the importance of SoC development where you are suddenly free to consider the world without the preexisting buses and peripherals of the PC universe and to imagine something better. CPU cores are a means to an end, and represent an ever shrinking part of modern systems.
The problem is, console manufacturers know precisely how many units they anticipate selling, and it's usually a lot. The PlayStation 5 is at 80 million units so far.
And at that scale, the console manufacturers want to squeeze every vendor as hard as they can... and Intel didn't see the need to engage in a bidding war with AMD that would have given them a sizable revenue but very little profit margin compared to selling Xeon CPUs to hyperscalers where Intel has much more leverage to command higher prices and thus higher margins.
> they turned down Apple on the iPhone
Intel just was (and frankly, still is) unable to compete with ARM on the power envelope; that's why you never saw x86 take off on Android either, despite quite a few attempts at it.
Apple only chose to go with Intel for its MacBook line because PowerPC was practically dead and offered no way to extract more performance, and they dropped Intel as soon as their own CPUs were competitive. To get Intel CPUs to the same level of power efficiency that M-series CPUs have would require a full rework of the entire CPU infrastructure and external stack; that would require money that even Intel at its best frankly did not have. And getting x86 power-efficient enough for a phone? Just forget it.
> Clearly the threat of an Arm or RISC-V finding itself fused to a GPU running AI inference workloads has woken someone up, at last.
Actually, that is surprising to me as well. NVIDIA's Tegra should easily be powerful enough to run the OS for training or inference workloads. If I were to guess, NVIDIA wants to avoid getting caught too hard on the "selling AI shovels" train.
They did push hard on their UMPC x86 SoCs (Poulsbo and derivatives) to Sony, Nokia, etc. These were never competitive on heat or battery life.
80 million in 5 years is a nothing burger as far as volume.
This is very likely the new culture that LBT is bringing in. This can only be good.
I agree that Intel would be better served by spinning off its fab division; a potential buyer could be the US government, for military and national-security-relevant projects.
They just need to separate business units.
Maybe this changed with the AI race but there are plenty of people buying older chips by the millions for all sorts of products.
If Nvidia did try to exert any pressure to scrap Arc, that would be both a huge financial and geopolitical scandal. It's in the best interest of the US to not only support Intel's local manufacturing, but also its GPU tech.
It's pretty clear AMD and Nvidia are gatekeeping memory so they can iterate over time and protect their datacenter cards.
Intel had a prime opportunity to blow this up.
But also, does this amount of ownership even give them the ability to kill anything on Intel's roadmap without broad shareholder consensus (not that that's even how roadmaps are handled anyway)?
https://www.intc.com/news-events/press-releases/detail/1748/...
In a top-down oligarchy, their best interests are served by focusing on the desires of the great leader, in contrast to a competitive bottom-up market economy, where they would focus on the desires of customers and shareholders.
$5 billion is just a start, but this is a gift for nVidia to eventually acquire Intel.
Intel has never been so cheap relative to the kinds of IP assets that Nvidia values and probably will not be ever again if this and other investments keep it afloat.
Trump's FTC would not block it.
You write with proper case-sensitivity for their titles, which suggests some historic knowledge of the two. They have been very close partners on CPU+GPU for decades. This investment is not fundamentally changing that.
The current CEO is more like a CFO: cutting costs and eliminating waste. There are two exits from that: sell off, as you say, or re-investment in the products with the most likely future profit. This could be a signal that the latter is the plan and that the competitive aspects of the nVidia-Intel partnership will be sidelined for a while.
They will be dominating AMD now on both fronts if things go smoothly for them.
Intel was well on its way to be a considerable threat to NVIDIA with their Arc line of GPUs, which are getting better and cheaper with each generation. Perhaps not in the enterprise and AI markets yet, but certainly on the consumer side.
This news muddies this approach, and I see it as a misstep for both Intel and for consumers. Intel is only helping NVIDIA, which puts them further away from unseating them than they were before.
Competition is always a net positive for consumers, while mergers are always a net negative. This news will only benefit shareholders of both companies, and Intel shareholders only in the short-term. In the long-term, it's making NVIDIA more powerful.
Intel's foundry costs are probably competitive with nvidia too - nvidia has too much opportunity cost if nothing else.
While it doesn't quite compete at performance and power consumption, it does at price/performance and overall value. It is a $250 card, compared to the $300 of the 4060 at launch. You can still get it at that price, if there's stock, while the 4060 hovers around $400 now. It's also a 12GB card vs the 8GB of the 4060.
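Back-of-the-envelope, in dollars per gigabyte of VRAM (using the prices above; street prices obviously fluctuate):

```python
# $/GB of VRAM at the prices quoted above. Street prices move around,
# so treat these as illustrative snapshots, not fixed figures.
cards = {
    "Arc B580 (12 GB @ $250)":       250 / 12,
    "RTX 4060 (8 GB @ $300 launch)": 300 / 8,
    "RTX 4060 (8 GB @ ~$400 now)":   400 / 8,
}
for name, usd_per_gb in cards.items():
    print(f"{name}: ${usd_per_gb:.2f}/GB")
# ~$20.83/GB vs $37.50/GB vs $50.00/GB
```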
So, sure, this is not competitive in the high-end segment, but it's remarkable what they've accomplished in just a few years, compared to the decades of head start that AMD and NVIDIA have on them. It's definitely not far-fetched to assume that the gap will only continue to close.
Besides, Intel is not only competing at GPUs, but APUs, and CPUs. Their APU products are more performant and efficient than AMD's (e.g. 140V vs 890M).
I think this partnership will damage nvidia. It might damage intel, but given they're circling the drain already, it's hard to make matters worse.
It's probably bad for consumers in every dimension.
Or to take the opposite, if nvidia rolled over intel and fired essentially everyone in the management chain and started trying to run the fabs themselves, good chance they'd turn the ship around and become even more powerful than they already are.
How was Intel "circling the drain"?
They have a very competitive offering of CPUs, APUs, and GPUs, and the upcoming Panther Lake and Nova Lake architectures are very promising. Their products compete with AMD, NVIDIA, and ARM SoCs from the likes of Apple.
Intel may have been in a rut years ago, but they've recovered incredibly well.
This is why I'm puzzled by this decision, and as a consumer, I would rather use a fully Intel system than some bastardized version that also involves NVIDIA. We've seen how well that works with Optimus.
Also, their network cards no longer work properly, which is deeply aggravating, as that used to be something I could rely on. I just bought some Realtek ones to work around the Intel ones falling over.
We must live in different universes, then.
Intel's 140V competes with and often outperforms AMD's 890M, at around half the power consumption.[1]
Intel's B580 competes with AMD's RX 7600 and NVIDIA's RTX 4060, at a fraction of the price of the 4060.[2]
They're not doing so well with desktop and laptop CPUs, although their Lunar Lake and Arrow Lake CPUs are still decent performers within their segments. The upcoming Panther Lake architecture is promising to improve this.
If these are not the signs of competitive products, and of a company far from "circling the drain", then I don't know what is.
FWIW, I'm not familiar with the health of their business, and what it takes to produce these products. But from a consumer's standpoint, Intel hasn't been this strong since... the early 00s?
[1]: https://www.notebookcheck.net/Radeon-890M-vs-Arc-140V_12524_...
[2]: https://www.notebookcheck.net/Intel-Arc-B580-Benchmarks-and-...
Some would say that's circling the drain.
Intel isn't at that point, but the company's trajectory isn't looking good. I'd happily sacrifice Arc to keep a duopoly in CPUs.
Consumers still have AMD as an alternative for very decent and price attractive GPUs (and CPUs).
AMD has always closely followed NVIDIA in crippling their cheap GPUs for any other applications.
After many years of continuously decreasing FP64 performance in "consumer" GPUs, only Intel's Battlemage GPUs offer FP64 performance comparable with what could easily be obtained 10 years ago, but no longer today.
Therefore, if the Intel GPUs disappear, then the choices in GPUs will certainly become much more restricted than today. AMD has almost never attempted to compete with NVIDIA in features, but whenever NVIDIA dropped some feature, so did AMD.
Erm, a rather important point to bury that far down the story. The first question on anyone's lips will be: is this $5bn to build new chip technology, or $5bn for employees to spend on yachts?
> Intel stock experienced dilution because the U.S. government converted CHIPS Act grants into an equity stake, acquiring a significant ownership percentage at a discounted price, which increased the total number of outstanding shares and reduced existing shareholders' ownership percentage, according to The Motley Fool and Investing.com. This led to roughly 11% dilution for existing shareholders
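The mechanics behind that number, roughly (the share counts below are rough assumptions for illustration, not Intel's exact figures; the quoted ~11% presumably uses precise counts):

```python
# Dilution from issuing new shares: existing holders' stake shrinks by
# new / (old + new). Both share counts are assumptions for illustration.
old_shares = 4.4e9   # approx. INTC shares outstanding pre-deal (assumed)
new_shares = 0.5e9   # approx. shares issued for the government stake (assumed)
dilution = new_shares / (old_shares + new_shares)
print(f"existing holders diluted by ~{dilution:.0%}")  # ~10% with these inputs
```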
To get money from the outside, you either have to take on debt or you have to give someone a share in the business. In this case, the board of directors concluded the latter is better. I don't understand why you think it is gross.
Intel is up 30% pre market on this news so I think the existing shareholders will be fine.
Looks like using GPU IP to take over other brands' product lines is now officially an nVidia strategy.
I guess the obvious worry here is whether Intel will continue development of their own dGPUs, which have a lovely open driver stack.
So long as the AI craze is hanging in there it feels like having that expertise and IP is going to have high potential upside.
Would be foolish to throw that away now that they're finally getting closer to "a product someone may want to buy" with things like B50 and B60.
They wanted to launch DGX Spark in early summer and it's nowhere to be seen, while Strix Halo is shipping in 30+ SKUs from all major manufacturers.
That action may cease to exist soon, especially after Vance is POTUS and the courts stacked with Peter Thiel loyalists that back his vision of anti-competition. Bet on it.
[0]: <https://www.fudzilla.com/6882-nvidia-continues-comic-campaig...>
Not all deals are made in cash; they can borrow money against their market cap.
But yeah, it's probably easier to just use cash on hand.
However, I do imagine Intel GPUs, which were never great to start with, might be doomed long-term.
Also, another possibility would be: there goes oneAPI, which I doubt many people would care about, given how many rebrands SYCL has already gone through.
I mean that also applies to Intel and Nvidia. Intel does make GPUs but their market impact is basically zero.
AMD's actual commitment to open innovation over the past ~20 years has been game changing in a lot of segments. It is the aspect of AMD that makes it so much more appealing than intel from a hacker/consumer perspective.
This time around Nvidia should HODL the stock
I wonder what this means for the Arc line of GPUs?
Why/how is INTC up around 30% pre-market from $24.90 (to $32), when Nvidia is buying the stock at $23.28? Who is selling the stock?
I suppose the Intel board decided this? Why did they sell under the current market price? Didn't the Intel board have fiduciary duty to get as good a price from Nvidia as possible? If Nvidia buying stock moves it up so much, it seems like a bad deal to sell the stock for so little.
The reason the stock surged up past $30 is the general market's reaction to the news, and subsequent buying pressure, not the stock transaction itself. It seems likely that once the exuberance cools down, the SP will pull back, where to I can't say. Somewhere between $25 and $30 would be my bet, but this is not financial advice, I'm just spitballing here.
I don't like the idea of using Intel, given their lack of disclosure for Spectre/Meltdown and some of their practices (towards AMD).
Seems to be an easy bet, if for no other reason than to make the US Government (Trump) happy. Trump gets to tout his +30% return on investment.
Ofc I would kind of hope/expect antitrust to object, given that Intel makes both GPUs and CPUs, and Nvidia has dipped its toes into CPU production as well.
Intel still has to go through a lot of reorg (i.e. massive cuts) to get to a happy place, and this is what their succession of CEOs have been procrastinating over.
One wonders just how bad things must have been internally for that to be the state of one of their core IPs in this day and age...
USA, where the federal government is picking winners and losers by making risky stock bets with public money.
This is needlessly divisive and devoid of any factual basis. No gulags will exist and you know it.
What about "Alligator Alcatraz", that has been called "concentration camp" [1] (so comparable with a gulag), or where the Korean detainees from the raid on the Hyundai/LG plant ended up, alleging utterly horrible conditions [2]? And there's bound to be more places like the latter, that was most likely just the tip of the iceberg and we only know about the conditions there because the South Korean government raised a huge stink and got the workers out of there.
Okay, Alcatraz 2.0 did get suspended in August to my knowledge, but that's only temporary. It's bound to get its legal issues cleaned up and then be re-opened - or the case makes its way to the Supreme Court with the same result to be expected.
[1] https://newrepublic.com/article/197508/alligator-alcatraz-tr...
ICE doesn't reliably make any distinction, not since they hired thugs off of the streets and issued arrest quotas. Doesn't matter if the arrested have to be released later on.
Also, since this Intel deal makes no sense for NVIDIA, a good observer would notice that lately he seems to spend more time on Air Force One than with NVIDIA teams. The leak of any evidence showing this was an investment ordered by the White House will make his company a hostage of future demands from the current corrupt administration. The timing is already incredibly suspicious.
We will know for sure he has become a hostage if the next NVIDIA investment is in World Liberty Financial.
"Anatomy of Two Giant Deals: The U.A.E. Got Chips. The Trump Team Got Crypto Riches." - https://www.nytimes.com/2025/09/15/us/politics/trump-uae-chi...
As a customer, they get better access to Intel Foundry and can offload some capacity from TSMC.
As I understand it the government's shares are non-voting.
Intel has a market cap just 2.5% of NVDA's, so you could give away just 2.5% of your stock to buy the entirety of Intel. It's bonkers.
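Sanity-checking that ratio with ballpark market caps around the announcement (both figures are rough assumptions, not quotes):

```python
# Ballpark market caps at the time of the announcement (assumed values).
nvda_mcap = 4.3e12   # NVDA: ~$4.3T
intc_mcap = 1.1e11   # INTC: ~$110B pre-jump
print(f"INTC is ~{intc_mcap / nvda_mcap:.1%} of NVDA's market cap")  # ~2.6%
```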
There have been a lot of mergers where that has not happened.
It looks like a good deal either way and in any amount. But of course I am no expert.
This all ignores the near complete lack of product out of their advanced processes as well.
:-)
They basically baked a massive investment profit into the deal. When you factor in the stock jump since this announcement, Nvidia has already made billions.
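For a rough sense of scale (the $23.28 deal price is from the announcement; the ~$32 mark is the pre-market price mentioned upthread):

```python
# Nvidia's paper gain: $5B of stock bought at $23.28/share, marked to ~$32.
# Illustrative only; prices will have moved since.
investment = 5e9
buy_price  = 23.28
mkt_price  = 32.00
shares = investment / buy_price                 # ~214.8M shares
paper_gain = shares * (mkt_price - buy_price)
print(f"~${paper_gain / 1e9:.2f}B paper gain")  # ~$1.87B at $32/share
```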