> Then we describe a data-driven method for learning J from a dataset of full-orbit α-particle trajectories. We apply this method to the α-particle dynamics shown in Fig. 1 and find the learned non-perturbative guiding center model significantly outperforms the standard guiding center expansion. Our proposed method for learning J applies on a per-magnetic field basis; changing B requires re-training.
Is this interpolation at its heart? A variable transformation then a data fit?
Anyone know which functionals of these orbits are important? I don't know the space. I am wondering why the orbits with such nuance should be materially important when accessed via lower-order models.
> Are these sorts of instabilities harder to control in a tokamak as compared to a stellarator, or did you just bring those up as examples of magnetic confinement?
I was just shooting from the hip in the earlier comment, alluding to your question and bringing them up as examples of two different approaches to addressing the instability issues that make confinement so complex. I think it's just a terribly fun thing to think about because of its complexity. Stellarators are attempting to solve the issues passively through design. Tokamaks, on the other hand, use active control. There are trade-offs to both, and neither has reached break-even output yet.
I’m personally largely bored with them and think linear is the way to go, even though the laser based inertial confinement reactor at Lawrence is the first to reach breakeven output… experimentally at least.
> First we deduce formally-exact non-perturbative guiding center equations of motion assuming a hidden symmetry with associated conserved quantity J. We refer to J as the non-perturbative adiabatic invariant.
Simply: this is not just some kind of unsupervised ML black-box magic. There is a formal mathematical solution to something, but it has a certain gap, namely precisely what quantity is conserved and how to calculate it.
> Then we describe a data-driven method for learning J from a dataset of full-orbit α-particle trajectories. [...] Our proposed method for learning J applies on a per-magnetic field basis; changing B requires re-training. This makes it well-suited to stellarator design assessment tasks, such as α-loss fraction uncertainty quantification.
With the formal simplification of the dynamics in hand, the researchers believe that a trained model can then give a useful approximation of the invariant, which allows the formal model, with its unknown parameters now filled in, to be used to model the dynamics.
In a crude way, I think I have a napkin-level sketch of what they're doing here. Suppose we are modeling a projectile, and we know nothing of kinematics. They have determined that the projectile has a parabolic trajectory (the formal part) and then they are using data analysis to find the g coefficient that represents gravitational acceleration (the data-driven part). Obviously, you would never need machine learning in such a very simple case as I have described, but I think it approximates the main idea.
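To make that analogy concrete, here's a minimal sketch (entirely hypothetical data, not the paper's actual method): assume the functional form of the trajectory is known, and fit only the unknown coefficient g from noisy samples.

    import numpy as np

    # Hypothetical illustration: fit the "unknown" coefficient g from noisy
    # samples of a projectile's height, y(t) = y0 + v0*t - 0.5*g*t^2.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0, 50)
    y_true = 10.0 + 5.0 * t - 0.5 * 9.81 * t**2
    y_obs = y_true + rng.normal(scale=0.05, size=t.size)

    # Least-squares fit against the known functional form (the "formal" part);
    # only the coefficients are learned from data (the "data-driven" part).
    A = np.column_stack([np.ones_like(t), t, -0.5 * t**2])
    coeffs, *_ = np.linalg.lstsq(A, y_obs, rcond=None)
    y0_fit, v0_fit, g_fit = coeffs
    print(f"fitted g = {g_fit:.2f} m/s^2")

In the paper the role of g is played by the learned invariant J, and the fit is done with a neural model rather than a linear solve, but the division of labor is the same.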
Finding simplified, easy-to-solve models, using them to estimate solutions, and then applying adjustments to get closer to the real solution is a baseline technique. That's the core of the perturbative approach in physics: https://en.wikipedia.org/wiki/Perturbation_theory#:~:text=Pe...
However, now it's possible to train AI models to learn much more complex approximations that allow them to run much faster and more accurately. A prime example is DeepMind's AlphaFold, IMHO.
I haven't read up on the research too much, but I'd place firm bets that AI models will be critical in controlling any viable fusion technology.
for _ in 0..<1000000000000 { do_something_complicated() }
Or can/do LLMs operate outside of a CPU? Thanks
They may not be able to match an MIT Ph.D. at analyzing experimental feedback, but they can probably match a lot of research assistants.
It’s like having a billion RAs, all running experiments, and triaging the results. I understand that is how they have made such good progress on medicines, with AI.
> “I have not failed. I've just found 10,000 ways that won't work.”
-Attributed to Thomas Edison
The machine-level instructions being executed are just matrix multiplications. Billions of them. The complexity of LLM behavior is emergent from that.
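If it helps, here's a tiny sketch (toy sizes, nothing model-specific) of the kind of arithmetic that dominates:

    import numpy as np

    # One tiny "layer" of the arithmetic an LLM runs: multiply activations by
    # a weight matrix, add a bias, apply a nonlinearity. Real models repeat
    # this pattern across billions of parameters and many layers and tokens.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(1, 8))     # toy activation vector
    W = rng.normal(size=(8, 8))     # toy weight matrix
    b = rng.normal(size=(1, 8))
    y = np.maximum(x @ W + b, 0.0)  # matmul + bias + ReLU
    print(y.shape)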
The engineering challenges are so massive that even if they can be solved, which is far from certain, at what cost? With a dense high-energy plasma, you're dealing with a turbulent fluid where any imperfection in your magnetic confinement will likely damage the container.
People get caught up on cheap or free fuel and the fact that stars do this. The fuel cost is irrelevant if the capital cost of a plant is billions and billions of dollars. That has to be amortized over the life of the plant. Producing 1GW of power for $100 billion (made up numbers) is not commercially viable.
And stars solve the confinement problem with gravity and by being really, really large.
Neutron loss remains one of the biggest problems. Not only does this damage the container (ie "neutron embrittlement") but it's a significant energy loss for the system and so-called aneutronic fusion tends to rely on rare fuels like Helium-3.
And all of this to heat water to create steam and turn a turbine.
I see solar as the future. No moving parts. The only form of direct power generation. Cheap and getting cheaper and there are solutions to no power generation at night (eg batteries, long-distance power transmission).
We're not at that point yet with natural gas because a combined cycle turbine is more efficient than a steam turbine.
Yeah, next 50 years you might not see coal/nat gas being replaced by fusion. But you will see fusion displacing chunks of what those powerplants will be powering
The other guy was correct while you are the one who posted the fallacy. If using heat from nuclear sources to drive aluminum production were feasible, people would already be doing it using heat from HTGR reactors rather than waiting for nuclear fusion reactors to be made. The reason it is not feasible is because the heat is an output, not an input. The actual input is electricity, which is what drives the reaction. The 940–980°C temperatures reached during the reaction are from the electricity being converted into heat from resistive losses.
It should be noted that production nuclear fusion reactors would be even more radioactive than nuclear fission reactors in terms of total nuclear waste production by weight. The only reason people think otherwise is that the hypothetical use of helium-3 fuel would avoid it, but getting enough helium-3 fuel to power even a test reactor is effectively an impossibility. There are many things that are hypothetically attainable if all people in the world decide to do it. The permanent elimination of war, crime and poverty are such things. Obtaining helium-3 in the quantity needed for a single reactor is not.
However, the goal of powering the Hall–Héroult process from a nuclear fusion reactor is doable. Just use solar panels. Then it will be powered by the giant fusion reactor we have in the sky. You would want to add batteries to handle energy needs when the sun is not shining or do a grid tie connection and let the grid operator handle the battery needs.
Finally, industrial processes that actually need heat at high temperatures (up to around 950°C if my searches are accurate) as input could be served by HTGR reactors. If they are not already using them, then future fusion reactors will be useless for them, since there is no future in sight where a man made fusion reactor is a cheaper energy source than a man made fission reactor. Honestly, I suspect using solar panels to harness energy from the giant fusion reactor in the sky is a more cost effective solution than the use of any man-made reactor.
Aluminum reduction is electrochemical, not thermochemical. Yes, the pots are hot, but they are kept hot by resistive dissipation as the alumina is electrolysed.
(There is some chemical energy contributed from oxidation of the carbon electrode.)
There is no chance that early fusion plants will be small enough to justify building them in the same building as a factory. They will start large.
> For example, aluminum requires ~14-17MWh to produce 1 ton
The Hall–Héroult process runs at 950 °C, well below the melting point of copper. It is close to twice the temperature of steam entering the turbines. It is not something that can be piped around casually: as a gas it will always be at very high pressure, because lowering the pressure cools it down. Molten salt or similar is required to transport that much heat as a liquid. Every pipe glows orange. Any industrial process will effectively be a part of the power plant because of how difficult it is to transport that heat away.
Also NB that the Hall–Héroult process is for creating aluminum from ore, and recycling aluminum is the primary way we make aluminum.
Industrial parks centered around power plants might become a thing in the future, being looked at as essential infrastructure investment.
Heat transport could be seen as an entire sub-industry unto itself, adding efficiency and cost savings for conglomerates that choose to partner with companies that invest in and build power plants.
For a typical consumer in Sweden, most electricity is bought when the price is at its highest point, around 4 times what it costs during the cheapest months. Electricity consumption is at its lowest when prices are at their lowest, which is the same period when solar peaks in production. Conversely, consumption is at its highest when solar production is at its lowest.
That “3X” figure assumes a high‐insolation region (CF ~25 %). In Central Europe, where solar CF is only ~12 %, you’d need about 5x the PV capacity to equal a 1 GW coal plant’s annual generation. How does scaling up to 5 GW of PV change the cost comparison vs a coal plant?
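Rough arithmetic behind those multipliers (the exact answer depends on what capacity factor you assume for the coal plant; the numbers below are my own illustrative choices):

    # Illustrative only: how much PV nameplate capacity matches a coal plant's
    # annual output, as a function of the assumed capacity factors (CF).
    def pv_needed_gw(coal_gw, coal_cf, solar_cf):
        annual_twh = coal_gw * coal_cf * 8760 / 1000   # coal plant's yearly output
        return annual_twh * 1000 / (solar_cf * 8760)   # PV nameplate to match it

    for coal_cf in (0.6, 0.85):
        for region, solar_cf in (("high insolation", 0.25), ("Central Europe", 0.12)):
            print(f"coal CF {coal_cf:.0%}, {region}: "
                  f"{pv_needed_gw(1.0, coal_cf, solar_cf):.1f} GW of PV")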
https://www.energy.gov/energysaver/solar-water-heaters
I recall hearing that they are 80% efficient while photovoltaics tend to be around 20% efficient.
As a peer post noted (without backing it up, but it seems reasonable):
> Only 20% of our energy needs are supplied by electricity.
It is a fair viewpoint to talk about energy instead of only electricity. For example, current EVs are built using coal (for the steel and cement in the infrastructure), and parts and final products are moved between continents with oil (ships). Same for solar panels and their underlying steel structures. Same for the roads those EVs use, etc. There are technical solutions for those, but they haven't proven economically competitive yet. So I'll happily take that 80% efficiency when we need relatively low-grade heat: domestic and commercial AC and water heating. Those are by far the most energy-intensive uses in the residential sector when there isn't an electric vehicle, and they are needed most at peak times (mornings and evenings in winter). We'd better take that +60%.
The most economical solution for reducing our carbon emissions by 95% is doing these two steps in parallel:
1. Use electricity instead of fossil fuels
2. Generate electricity in a carbon-free manner
Yes, there are some use cases this doesn't work well for yet: steel and ocean transport are two you listed. But it does cover the 4 biggest sources of carbon emissions: ground transport, heating, electricity generation, and agriculture. The big 4 are 95% of our carbon emissions.
The photovoltaic panels have the added bonus that the energy can be used for other purposes (e.g. transport, HVAC, computers, cooking, laundry, A/V equipment) should my hot water needs be low compared to what the system is designed to produce. However, from a pure efficiency standpoint, it is unclear to me which approach is better. They seem to be a rough tie, with losses for both approaches making the real world worse than ideal conditions. I am not sure if one is better than the other in the actual real world and if anyone who knows the answer is kind enough to share it, I would find the answer enlightening.
That said, in my home, I use net metered photovoltaic panels with a Rheem heat pump for domestic hot water. This was not done because I considered it to be a better solution, but because it was the only solution available to me from local installers.
Solar thermal heating used to make more sense, but the cost of photovoltaics has come down so much, along with relatively cheap heat pump systems, that nobody is doing the former anymore, it seems.
I just got a large solar system installed and next up is a heat pump water heater, as that's the second largest user of power next to the HVAC. Plus it will cool and dehumidify my garage a bit, where the solar inverter and batteries are located, converting some of the waste heat from the inverter into hot water at the same time.
Batteries beyond the scale of a handheld device only started getting massively manufactured and invested in fairly recently as well. Once it's obvious that it's possible to build megabatteries that can power towns, everyone will want in on the market and prices will go down.
Or inverters? (Also not included in your calc I think?)
Except that it needs to be around 30GW plant to compete with a 1GW coal. And it needs storage for several days of energy.
Fusion would use a conventional turbine with boiling water. Is this a better source of mechanical inertia than hydropower or fission?
Is there a better way to solve the problem of frequency instability?
Why is this fact downvoted? This article mentions "synthetic inertia;" what are its drawbacks?
https://www.bloomberg.com/news/articles/2025-05-09/spain-bla...
Obviously, this configuration of solar and battery banks will work more optimally when they are closer to the equator.
Will different types of power grids be required for areas further away, or is it practical to ship power long distances to far Northern/Southern areas?
Mechanical inertia in generators also tends to do well in these situations.
PV panel supply was just not nearly large enough, and if you look at overall PV capacity as a percentage of their grid capacity, it’s pretty obvious it was never going to be enough to stabilize any serious issues.
Could you point to the outage conclusion report?
You're making the obvious mistake here of equating 1 GW solar with 1 GW of any other source with a 95-99% baseload capacity. To achieve the equivalent result, you'll need to have at least >2 GW actual solar power to equally compare the two.
Granted, in most developed places, solar still beats coal, but this is why in many developing economies with ample coal resources, it makes more sense economically to go with the coal plants.
Take any other resource, say hydel or geothermal - solar and wind quickly go down in economic efficiency terms compared to these, in most cases almost doubling or tripling in costs.
Which is why I compared 1GW of coal power to 3GW of solar power.
We can live with huge land areas converted to power generation, but more space efficient alternatives will be a big improvement.
Since you want to produce power all day, you would take about 20% of the peak insolation (roughly 1 kW per square meter) to account for tilt variations and day/night cycles, and another 20% to factor in cell efficiency.
So with adequate storage, one square meter of solar can generate an average of 40W of continuous electrical power, 24h per day. Let's round that down to 25W to take into consideration outages and maintenance, bad weather, spacing between panels for personnel access etc.
And there you have it 1GW/25W is about 40 square km with quite generous safety factors, an order of magnitude less than your AI figures. This is still a lot of land if you replace farmland with it, but still totally negligible compared with the millions of square km of hot desert the world has available for this use.
For example, scaling this 400x, to cover the entire US electrical consumption, is still "only" 16,000 sq. km, or 3% of the area of the Great Basin Desert in the US, which is one of the smaller deserts of the world compared with the Sahara, Gobi, Kalahari, Australia, Arabia, etc. Of course, there is little economic sense in building such a mega-solar farm and paying the cost of energy transport. In practice, we are seeing distributed production taking the cheapest available land nearby.
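The arithmetic above, spelled out (assuming ~1 kW/m² of peak insolation, as implied by the 40 W figure):

    # Rough land-area arithmetic from the comment above (assumed numbers).
    peak_insolation = 1000                          # W/m^2, assumed peak irradiance
    avg_w_per_m2 = peak_insolation * 0.20 * 0.20    # day/night + tilt, then cell efficiency -> ~40 W/m^2
    net_w_per_m2 = 25                               # rounded down for outages, weather, access spacing

    area_km2_per_gw = 1e9 / net_w_per_m2 / 1e6
    print(f"~{avg_w_per_m2:.0f} W/m^2 average, ~{net_w_per_m2} W/m^2 after derating")
    print(f"land per continuous GW: ~{area_km2_per_gw:.0f} km^2")                       # ~40 km^2
    print(f"scaled 400x as above: ~{400 * area_km2_per_gw:.0f} km^2")                    # ~16,000 km^2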
Conclusion, land isn't really a constraint in the US.
Just pointing out that there are real downsides to this energy source, like all the others.
Now is not the time to stop developing energy sources.
PV provides massively more value per acre than agriculture does. If PV were seriously constrained by land costs, agriculture would be impossible.
But society is perfectly fine with having land producing $500/acre/year of hay, instead of $25,000/acre/year of PV output.
Anyway, the area issue seems not too bad. In the US as least, we have places like the Dakotas which we could turn like 70% of into a solar farm and nobody would really notice.
Another reason is that ̶t̶r̶a̶n̶s̶m̶i̶s̶s̶i̶o̶n̶ distribution costs are half of your energy bill... so even if you could theoretically get fusion energy generation for "free" (which is impossible) you've still only cut your power bill in half.
Edit: I meant to say distribution costs not transmission. Looking at last months bill I paid $66.60 to deliver $51.76 of energy (about 56% of my total bill was delivery). The raw distribution charge was $49.32 or 42% of the bill. I'm not alone in these numbers, but your mileage may vary.
Say a house uses 10,000 kWh per year at $0.10/kWh, so a $1,000/year electricity bill. Now say you get a solar system that produces 5,000 kWh per year, concentrated in the summer months (when your power bill tends to be higher anyway). You may even export some of that power back to the grid. Have you cut your power bill in half? No. It's probably down ~20-25%.
Why? Because regardless of how much power you use (within limits) you still need a connection to the power grid and that needs to be maintained. You'll often even see this on the electricity bill: fixed charges like "access charge" per month.
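Roughly, with a made-up split between the marginal energy rate and the fixed/delivery portion (the split is my assumption, chosen to be consistent with the numbers above):

    # Illustrative bill math (made-up numbers).
    annual_kwh = 10_000
    marginal_rate = 0.05          # $/kWh actually avoided by self-generation (assumed)
    fixed_and_delivery = 500      # $/year connection, delivery, access charges (assumed)

    bill_before = annual_kwh * marginal_rate + fixed_and_delivery           # ~$1000
    solar_offset_kwh = 5_000
    bill_after = (annual_kwh - solar_offset_kwh) * marginal_rate + fixed_and_delivery

    print(f"before ${bill_before:.0f}, after ${bill_after:.0f}, "
          f"saved {1 - bill_after / bill_before:.0%}")   # ~25%, not 50%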
We benefit from being on a connected grid. Your own power generation might be insufficient or need maintenance. It's inefficient if everyone is storing their own power. So it's unclear what the future of the power grid is. Should there be large grids, small grids, or no grid?
Renewables and something like iron-salt battery containers would be pretty efficient overall. Easy to roll out, very safe.
We'll still need some sort of base load somewhere and backup to restart everything obviously. But the big giant power plants (with the huge capital costs, delays and NIMBY headaches) might become less necessary.
This depends on where you live!
One wonders if this is why Lockheed-Martin dropped their effort:
https://www.lockheedmartin.com/en-us/products/compact-fusion...
(that page is still up, but news reporting indicates it has been dropped)
So if you build loads of wind & solar & battery all over - either (1) you've got to build so much battery capacity, all over, that you'll never need the grid, or (2) you've still got to build the grid to get you through occasional "calm & dark" periods.
Either way, you're looking at vastly higher capital expenses.
- moderately overbuild solar
- batteries for short term storage
- natural gas for seasonal storage
Wait, what?
Wikipedia[0] seems to disagree:
> Long-distance transmission (hundreds of kilometers) is cheap and efficient, with costs of US$0.005–0.02 per kWh, compared to annual averaged large producer costs of US$0.01–0.025 per kWh
Do you maybe mean that half the electrical energy dissipates between the production plant and the consumer? But that figure seems quite large compared to what I can find online, and this would not be a problem with "free fusion".
Care to explain?
[0]: https://en.wikipedia.org/wiki/Electric_power_transmission
My point is that the infrastructure related to the delivery of energy to a physical location is a non trivial part of an energy bill, and that this part doesn't go away magically because "fusion".
Distributing tiny fractions of all that energy to each of millions of individual residences, then maintaining all the short/complex/low-capacity wiring needed to do that - that part ain't the least bit efficient.
First, actually getting fusion to positive energy ROI. That's step zero and we're not even close.
Second, scaling the production of fusion in a safe and economical way. Given the utter economic failure of fission nuclear power (there has never been a profitable one), my priors are that the fusion advocates are vastly underestimating, if not willfully ignoring, this part.
Finally, even if we do get to "too cheap to meter" energy, what then? Limitless electricity is not the same thing as limitless stored energy. Only 20% of our energy needs are supplied by electricity. To wit, the crucial industrial processes required to build the nuclear power plant in the first place can only be accomplished with combustible carbon. A power plant cannot generate the energy to build another power plant. Please let that sink in.
We're already seeing countries with photovoltaic and wind hitting $0/kWh on sunny, windy days - the grid is nearly saturated for daytime load. There isn't enough demand! This makes the economic feasibility of fusion even less attractive. No one is going to make money from it.
I would expect that there have been multiple nuclear power plants that provide a net positive return, especially in countries like France where 70% of their energy is nuclear.
However a reasonable argument can be made the public benefited from externalities like lower pollution and subsidized electricity prices even if it was a money pit and much of the benefit was exported to other countries via cheap off peak prices while France was forced to import at peak rates.
So while regulations may be overkill, they're not arbitrary; only hydro is really comparable, but hydro also stores water and reduces flood risks most years. Fusion still has real risks, but there's no concern around $500+ billion cleanup efforts.
How exactly would you get meaningful widespread tritium contamination of groundwater? IE not just the trace amounts you see from existing nuclear reactors.
Groundwater doesn’t flow quickly from a point source and tritium has a fairly short half-life. 60 years later you might be looking at a larger though still small area, but 97% of the stuff will have decayed and what remains is now diluted and doesn’t bioaccumulate.
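The decay arithmetic behind that figure (tritium half-life is about 12.3 years):

    # Fraction of tritium remaining after t years, half-life ~12.3 years.
    half_life = 12.3
    t = 60
    remaining = 0.5 ** (t / half_life)
    print(f"after {t} years: {remaining:.1%} remaining, "
          f"{1 - remaining:.0%} decayed")   # ~3% remaining, ~97% decayed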
It’s not going to concentrate around some site after entering the atmosphere the way heavier than air particulate pollution would.
Nearly pure tritium is extremely valuable so we aren’t going to be dealing with some long term leak. You hypothetically might have a large tank with say 1 month of T2 fuel but that would be really expensive directly and waste quite a bit of fuel through nuclear decay over time. Having that much fuel across multiple different systems is more plausible but then requires a wide range of different failures. But let’s assume such an improbable tank catastrophically fails, outside of containment, and then completely burns so the tritium will eventually fall back to earth.
It then has to rain over land, though even then storms don’t release all the moisture in the air, that water must be absorbed into the soil rather than running off or evaporating, where it’s further mixed with groundwater as it slowly seeps deep enough to be collected in some well. Thus even if conditions are perfect you’d have trouble reaching above the legal limit for drinking water.
I mean maybe if you intentionally selected the perfect moment with the perfect weather pattern and the perfect local geography and geology perhaps you’d be over the legal limits for a few wells for a little while until it rapidly decays.
Lurking over all this is the issue that loss of property value doesn't require anyone to actually prove tangible harm. The mere fact that property values were affected is enough for a tort.
When people talk about how safe fusion is they aren't kidding; even breathing in a significant amount of T2 isn't particularly dangerous radiologically, as the density is really low and you will quickly exhale it. Huge quantities would be a larger suffocation risk, but then you're talking multi-million-dollar accidents simply from lost fuel.
Additionally, the industry as a whole is shielded from the liability that would otherwise have bankrupted it multiple times. Notably, the clean up from Fukushima will likely take over 100 years, requires tech not yet invented and will likely cost as much as a trillion dollars [3]. In the US, there is a self-insurance fund paid into by the industry, which would've been exhausted 10-20 times over from a Fukushima level disaster. Plus, Congress severely limits liability from nuclear accidents, both on a per-plant and total basis ie the Price-Anderson Act [4].
Next, it seems like it's the taxpayer who is paying to process and store spent nuclear waste, a problem that will persist for centuries.
Even with all this, the levelized cost of energy ("LCOE") of fission power is incredibly expensive and seemingly going up [5].
Some want to reduce costs by using more off-the-shelf tech and replicating it for scale, most notably with small modular reactors ("SMRs") but this actually makes no sense because larger fission reactors are simply more efficient.
[1]: https://theecologist.org/2016/jan/04/after-60-years-nuclear-...
[2]: https://www.ucs.org/resources/nuclear-power-still-not-viable...
[3]: https://cleantechnica.com/2019/04/16/fukushimas-final-costs-...
[4]: https://www.yuccamountain.org/price_anderson.htm
[5]: https://en.wikipedia.org/wiki/Cost_of_electricity_by_source
Most reactors are old and in need of repair, most of these earlier than planned afaik.
There is also the bigger issue that some reactors are shut down in the summer because cooling water would leave the reactor so hot that it would be a danger to the animals living in the river.
I mean sure, waste disposal is a serious issue that deserves serious consideration. But fission waste contaminates a discrete area. Fossil fuels at scale cause climate change that contaminates the entire freaking planet. It's a travesty we haven't had a nuclearized grid for 20-30 years at this point.
So the real number is closer to 40%. If we switch ground transport to EV's and heating to heat pumps we can get up to ~75%.
I’m also skeptical, but I think the emphasis of my skepticism is on “commercially viable” as opposed to an available energy source. That is, I think fusion development will (and should) proceed anyway.
There’s a good argument that nuclear fission is not really commercially viable in its current form. Yet it provides quite a lot of commercially available electricity. And it also powers aircraft carriers and submarines. And similar technology produces plutonium for weapons. In other words, I don’t think fission’s continued availability as a power source is a strictly commercial decision.
I think there’s a quite a lot of technology that is not directly commercially viable, like high energy physics, or the space program. But they remain popular and funded. And they throw off a lot of commercial side benefits.
The growth of solar for domestic consumer power will certainly continue and that is a good thing. But I bet we’ll have fusion too in the long run. There’s no lack of ideas for interesting things to do with extreme amounts of heat and power. For example I’m hopeful that humanity eventually figures out space propulsion powered by fusion.
This is true of Tokamak type designs based around continuous confinement, but perhaps less so with something like Helion's design, which is based on magnetically firing plasma blobs at each other and achieving fusion through inertial confinement (cf. NIF laser-based fusion), with repeated/pulsed operation rather than continuous confinement.
No doubt the containment vessel will still suffer damage, but I guess it's a matter of degree - is it still economically viable to operate or not, which I guess needs to be verified experimentally by scaling up and operating for a sufficiently long period of time. Presumably they at least believe the approach is viable or they'd not be pursuing it (and have an agreement in place with Microsoft to power one of their data centers with one of the early units).
Have you seen the videos of Helion's reactor - hardly a basement project. Sam Altman (OpenAI) also has personally invested hundreds of millions of dollars into Helion, presumably after some due diligence!
"Helion Raises $500 Million, Targets 2024 for Demonstrating Net Electricity from Fusion" https://www.helionenergy.com/articles/helion-raises-500m/
And also an r/fusion post documenting prior claims:
> “The Helion Fusion Engine will enable profitable fusion energy in 2019,” - NBF 7/18/2014.
> “If our physics holds, we hope to reach that goal (net energy gain) in the next three years,” - D. Kirtley, CEO of Helion in the Wall Street Journal 2014.
> “Helion will demonstrate net energy gain within 24 months, and 50-MWe pilot plant by 2019,” - NBF 8/18/2015.
> “Helion will attain net energy output within a couple of years and commercial power in 6 years,” - Science News 1/27/2016.
> “Helion plans to reach breakeven energy generation in less than three years, nearly ten times faster than ITER,” - NBF 10/1/2018.
> Their newest claim on their website is: "We expect that Polaris will be able to demonstrate the production of a small amount of net electricity by 2024."
https://www.reddit.com/r/fusion/comments/133ttne/can_we_talk...
I'm sure all this came up in any due diligence as well. They are on Series E after all.
A company with more than a decade of missed milestones is not the type that gets this many rounds of investment.
A lot of people really want fusion to happen, and happen sooner. I think that leads to people taking far higher risks with the capital. This sort of investment is always risky, but donating to a grander cause of technology advancement can be a reason for the investment, in addition to expected future value of the investment.
The IM video you posted, btw, is not to be taken seriously. It appears to be based solely on the Real Engineering video, not on Helion itself.
https://www.reddit.com/r/fusion/comments/10g95m9/the_problem...
I see it similarly to the difference between a car with a combustion engine and an electric one. Combustion engines are fully developed. We're reaching the maximum possible performance and utilisation. It's a dead end. However, with electric cars, for example, new battery development is far from over, e.g. sodium batteries.
And just off the top of my head, in fusion, the discovery of better electromagnets, as happened a while back, can quadruple energy output. It's not a dead end, and writing it off would be short-sighted.
And before someone chimes in and says Nuclear doesn't make sense - it made sense at plenty of times and in different places.
It doesn't make sense in Western countries that are hell bent on making it as expensive as possible, strictly to ensure it doesn't get built, so we stick on fossil fuels as long as possible.
For example, people will dismiss arguments saying FTL is likely impossible because people once said that about going to the Moon. To be fair, there was some logic to the anti-Moon argument based on physics. The big change came with multi-stage rockets that solved the weight and thrust problems. And even then it's close [1].
There are good, physical reasons why FTL is highly likely impossible. You know, based on physics.
Likewise, the challenges to commercial fusion are also based on physics. Fusion reactions produce neutrons. Neutrons can't be magnetically contained. Neutrons destroy the container and, more importantly, lose energy from the system.
But saying "people once said the Earth was flat" or "people once said we couldn't get to the Moon" and so on are just meaningless platitudes. [1]: https://www.realclearscience.com/blog/2017/07/06/if_earth_wa...
It can be for deep space propulsion. The Orion project [1] demonstrated that you can power a spaceship so that it has both huge thrust and huge specific impulse with hydrogen bombs. The main issue with this project is the proliferation concerns. However, if you replace the bombs with pellets that are imploded by lasers, like the NIF experiment did [2], then you could get to the point where you can drive a rocket with non-weaponizable fusion explosions.
[1] https://en.wikipedia.org/wiki/Project_Orion_(nuclear_propuls...
[2] https://en.wikipedia.org/wiki/Fusion_ignition#2021_and_2022_...
It feels like we forget quickly that we already have nuclear power around. And yet fission constantly suffers due to economics issues, stemming from needing to build massive plants full of steel and concrete, each facility being effectively bespoke, requiring fancy equipment and electronics, needing constant monitoring, having to plan to decommission an irradiated plant and dispose of radioactive waste, and... needing to mine and refine uranium.
Fusion would (maybe, unless we need helium-3) solve that last one, and only sorta solve the radioactive waste one. Everything else remains, perhaps even gets worse.
Still good to see people working on it, maybe it'd be useful for far-future spaceships or areas where solar/wind aren't feasible. But I don't see how it wins economically at large.
That is not what is being fused in the Sun.
From, https://energyeducation.ca/encyclopedia/Nuclear_fusion_in_th...
The overall process of proton-proton fusion within the Sun can be broken down into several simple steps. A visual representation of this process is shown in Figure 1. The steps are:[4]
1. Two protons within the Sun fuse. Most of the time the pair breaks apart again, but sometimes one of the protons transforms into a neutron via the weak nuclear force. Along with the transformation into a neutron, a positron and neutrino are formed. The resulting proton-neutron pair that sometimes forms is known as deuterium.
2. A third proton collides with the formed deuterium. This collision results in the formation of a helium-3 nucleus and a gamma ray. These gamma rays work their way out from the core of the Sun and are released as sunlight.
3. Two helium-3 nuclei collide, creating a helium-4 nucleus plus two extra protons that escape as two hydrogen nuclei. Technically, a beryllium-6 nucleus forms first but is unstable and thus disintegrates into the helium-4 nucleus.

The final helium-4 atom has less mass than the original 4 protons that came together (see E = mc²). Because of this, their combination results in an excess of energy being released in the form of heat and light that exits the Sun, given by the mass-energy equivalence. To exit the Sun, this energy must travel through many layers to the photosphere before it can actually emerge into space as sunlight. Since this proton-proton chain happens frequently - 9.2 × 10^37 times per second - there is a significant release of energy.[3] Of all of the mass that undergoes this fusion process, only about 0.7% of it is turned into energy. Although this seems like a small amount of mass, this is equal to 4.26 million metric tonnes of matter being converted to energy per second.[3] Using the mass-energy equivalence, we find that this 4.26 million metric tonnes of matter is equal to about 3.8 × 10^26 joules of energy released per second!
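A quick check of that last figure, using the quoted mass-loss rate and a rounded speed of light:

    # Mass-energy check for the quoted solar output.
    c = 3.0e8                    # m/s, speed of light (rounded)
    mass_per_second = 4.26e9     # kg (4.26 million metric tonnes)
    energy_per_second = mass_per_second * c**2
    print(f"{energy_per_second:.2e} J/s")   # ~3.8e26 J per second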
- Lawson criterion is derived with the assumption of equilibrium plasma, which doesn't hold true in any real tokamak/stellarator
- at required temperatures most of the energy would be in photons of thermal radiation, that don't get confined by magnetic field, so when plasma relaxes from high energy beams to thermal equilibrium, it loses all the pumped energy through radiation
- with high energy beams a tokamak is essentially a particle accelerator, where electrons get in the way of collision
But so long as there is a boatload of prestige and funding to be harnessed via fusion research, it'll be a Really Big Thing.
Centuries ago, an ambitious and clever alchemist could harness a fair quantity of those things via transmutation research. Vs. these days, we have repeatedly demonstrated the ability to transmute lead into gold. But somehow, there's no big talk about, or prestige in, or funding for scaling that process up to commercial viability.
But another more nefarious factor is the nexus of fusion energy research and nuclear weapons research [1]. To build and maintain a stockpile of nuclear weapons (specifically thermonuclear weapons) you need appropriately trained nuclear energy physicists.
[1]: https://thebulletin.org/premium/2024-11/the-entanglement-of-...
* The superconducting metals required for confinement randomly stop superconducting.
* The fuels produce absurd amounts of radiation, and the Helium-3 solution for that might as well be fairy dust, since even if we converted the entire global economy to helium-3 production, we would still be short by orders of magnitude of what hypothetical fusion reactors would need to handle our energy demands. Strip mining the moon for it is supposedly a way to get it, but defacing the surface of the moon for minuscule amounts of Helium-3 per acre is unlikely to ever be profitable.
* The amount of radioactive materials produced from the experiments are many times those produced in fission reactors.
This is just off the top of my head. Until recently, I would have included the inability to produce more energy than we put into it on this list, but LLNL's breakthrough a few years ago seems to have solved that. I suspect that someone with time to look into the practical issues involved in building a fusion reactor would find other issues (such as the design not being practical to use in a production power plant and thus further research being needed to make one that is). I wonder if the only reason countries fund nuclear fusion research is to keep nuclear scientists from finding employment in the production of nuclear weapons.
Yeah, that doesn't happen
> The amount of radioactive materials produced from the experiments are many times those produced in fission reactors.
And neither does this
https://lss.fnal.gov/archive/2023/slides/fermilab-slides-23-...
As for the amount of radioactive material, the experimental reactors are several times the size of fission reactors. It is obvious that they irradiate far more material.
There is a very well-researched YouTube video that goes over these things:
As for the waste, you are still going to have to pay per unit to clean it up, just like you would with waste from a nuclear fission reactor. You have far more of it since the volume being irradiated is far higher. Although it is by volume rather than by weight, the decommissioned MH-1A PWR power plant produced 89 m^3 of solid radioactive waste and 363 m^3 of liquid waste:
https://world-nuclear.org/information-library/nuclear-fuel-c...
The YouTube video stated that the cleanup effort for the Joint European Torus was projected to produce 3,000 m^3 of waste, and the ITER reactor will be 10 times its size. Neither of them produced or will produce useful energy, yet they produced / will produce orders of magnitude more waste when it is time to decommission them than a decommissioned fission reactor did.
It also does not matter if the half lives are lower. It is still not going to be safe to be around that stuff long after both of us are dead and buried. The costs are effectively the same, since they will both be disposed in the same way.
https://www.youtube.com/watch?v=FrUWoywZRt8
I do not recall where I heard about the helium-3 situation, but I recall hearing some figures, and the prospect of having enough to run fusion reactors was not good. Doing a search suggests that I had been only slightly misled about the scarcity of helium-3. It is still extremely rare, but the US reportedly has used up to 70,000 liters of it per year:
https://en.wikipedia.org/wiki/Helium-3#Human_production
The density of Helium-3 at STP is presumably 0.1338 g/L, based on Helium-4's 0.1784 g/L. That suggests that the total annual US industrial demand is ~9 kg. This is definitely not as bad as I had thought, but it is still fairly dire.
As per wikipedia, a single 1GW nuclear fusion reactor that is 100% efficient at converting helium-3 into electricity would need 52.5 kilograms per year. 6.7 metric tons per year would be needed to power the entire US.
https://en.wikipedia.org/wiki/Helium-3
That assumes 100% efficiency, and if the conversion efficiency is anything like a nuclear fission plant, we would be lucky to see 3% efficiency, but we could be optimistic and assume something higher. Either way, it does not change the conclusion.
With an annual supply of ~9 kg, practical helium-3 fueled nuclear fusion reactors are not happening. Maybe the supply would be higher if you include other countries, and maybe we could get it somewhat higher if we try, but the reality is that helium-3 is an extremely rare isotope and the idea of using it as a practical fuel for a nuclear fusion reactor is a pipe dream unless the supply problem is solved and people figure out how to actually build a reactor that generates power using it, without encountering any of the other known problems that make this unlikely.
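Putting the numbers quoted above in one place (treat them all as rough figures):

    # Rough helium-3 supply vs. demand, using the figures quoted above.
    he3_density_g_per_l = 0.1338
    us_annual_supply_l = 70_000
    us_annual_supply_kg = us_annual_supply_l * he3_density_g_per_l / 1000   # ~9.4 kg

    per_gw_reactor_kg = 52.5     # kg/yr for a 1 GW reactor at 100% conversion
    us_demand_kg = 6_700         # ~6.7 metric tons/yr to power the entire US

    print(f"annual US He-3 supply: ~{us_annual_supply_kg:.1f} kg")
    print(f"one 1 GW reactor: ~{per_gw_reactor_kg} kg/yr "
          f"(~{per_gw_reactor_kg / us_annual_supply_kg:.1f}x the supply)")
    print(f"US-wide demand: ~{us_demand_kg} kg/yr "
          f"(~{us_demand_kg / us_annual_supply_kg:.0f}x the supply)")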
The difficulty in scaling supply is why people are discussing wild ideas like mining the moon, or even mining Jupiter. The supply situation is so constrained that the US government is reportedly buying 3 liters of it from a company that promises it will strip-mine the surface of the moon for it, with delivery by April 2029:
https://www.morningstar.com/news/pr-newswire/20250507sf81778...
I fully expect them to fail to deliver.
That said, the fairy dust remark was probably inaccurate, but the idea that our supply is short by orders of magnitude is correct according to mathematics.
Depleted uranium is one example but that has terrible implications due to radioactive pollution that would result, disposal costs and risks, etc.
Surprised there's not more research into meta-materials and alloys that are neutron-resistant, neutron-slowing, or neutron-absorbing.
The problem(s) of scale are not only those of scaling up, but also scaling down.
One of the best and most unsung benefits of solar is that it is profoundly easy and intuitive to build a very small (ie, vehicle- or house-sized) grid.
In an increasingly decentralized and stateless world, it makes sense to look for these qualities in an energy source.
There are interesting small fusion reactors that skip the steam step. They compress plasma magnetically, and when the fusion happens, the expanding plasma in turn expands the magnetic field, and the energy is harvested directly from the field. No steam and turbines.
Here is the video where I learned about it: https://www.youtube.com/watch?v=_bDXXWQxK38
Maybe any physicists in this thread could share insight on how feasible this is?
Your main point stands of course: this is a moonshot project, and solar works TODAY!
Kinda. The main catalyst of stellar fusion is quantum tunneling. Temperature and gravity together are not enough to overcome the Coulomb barrier.
So what is the difference between those two places? Temperature and pressure. In the Sun those arise from gravity. On the Earth, we need to create them mechanically.
Waste in the form of long-lived nuclear fission products is fundamentally an unsolvable issue. Transmutation has been proposed but isn't really practicable, shooting it into the sun isn't really an option either, so the only choice is to confine it for geological timescales somehow.
Both options are really much better, in my opinion, than pumping more carbon dioxide into our biosphere.
Your apparently stable salt mines start leaking. Locals don't like having toxic stuff buried below them. Other countries dislike that you dump nuclear waste in the middle of the Atlantic. Digging deep becomes too expensive.
Fusion is only better insofar as the public don't yet understand how radioactive the reactor will become, but counting on that ignorance is a bad long term strategy.
This is a major fallacy that makes people think DT fusion is more promising than it actually is.
Engineering problems are perfectly capable of killing a technology. After all, fission after 1942 was "just an engineering problem". And DT fusion faces very serious engineering problems.
I include cost issues as engineering problems, as engineering cannot be divorced from economic considerations. Engineering involves cost optimization.
You also have the associated economic problems; the up-front cost of a launch loop would be so huge that you could never convince anybody to build it instead of using rockets. Fusion has the same problem; even if you can design a fusion power plant that produces net power, it needs to produce net power by a massive margin to have any chance of being economically competitive with fission let alone solar.
Activated D-T reactor parts only have to be buried for a few decades and then they're fine.
Long-term fission waste is almost entirely transuranics, formed by heavy atoms absorbing neutrons without fissioning. If we were to use fast reactors or thorium molten salt reactors, then that problem would go away, and we'd be left with just fission products, which are only troublesome for a couple centuries.
Never mind what's required to deal with the fuel & waste products.
What does it mean? Beta radiation can cause structural damage? Is it really a problem?
1. High energy particles destroy the container. Alpha particles, which are just helium nuclei, are quite small and can fit in between metal atoms. Neutrons too. High energy electrons too; and
2. It's an energy loss for the system to lose particles this way.
Magnetic confinement works for alpha and beta particles because they're electrically charged. Neutrons are a far bigger problem, such that you have fun phrases like "neutron embrittlement".
https://www.jp-petit.org/NUCLEAIRE/ITER/ITER_fusion_non_cont...
Unfortunately, sentences like this are going to be way less common soon.
Why that last bit? Generic tangents supplant narrower/specific topics with broader/generic ones that people tend to already have opinions about, which they are eager to repeat. Because of this, generic tangents—especially on divisive/indignant topics—end up having two bad effects: (1) they take over the conversation, and (2) they are repetitive.
It's similar to how weeds take over a garden. We want a garden of unusual, interesting plants, not the most common ones that take over everywhere if allowed to.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
I have at least one friend who runs a biomedical research lab.
From conversations, here is what it going on:
- incoming students and researchers have been retracting their applications because of fear of ending up in detention for having something the regime doesn't like on their phone or on social media, or having their photo snapped at a protest about something the regime doesn't like, or their research being on a subject the regime doesn't like...or even something as stupid as the letters "trans" appearing as part of a word like "transgenic." (That's actually happened.)
- the schools have had to retract offers for others because there's no money to pay their stipends or for their lab/office space
- meeting with their administrations to discuss how long their schools can float salaries for lab staff. Admin assistants, scientific support staff like lab and animal technicians, and so on.
- planning phases of the euthanization of their organism / animal models
- planning phasing of the liquidation of lab equipment (in a market being flooded with such equipment)
My friends are talking about not being able to bear making their techs or researchers mass-euthanize research animal populations (typically rodents) and doing it themselves, in tears. Many of them justify the normal 'sacrifice' of research animals because their deaths help us advance science - but in this case, it's just because some transactional dickhead can't directly draw a crayon line between their research and GDP. But it's also because it's a visceral representation of scientific progress being destroyed. All to "own the libs" (but really to give billionaires tax cuts.)
One said they are trying to figure out what to do now that their career, which they have spent two decades of 60+ hour weeks on, is basically over - what little positions are left will see hundreds if not thousands of applicants. Salaries will plunge both out of necessity and a saturated labor supply.
The damage that has been done in less than 6 months to scientific research is immeasurable and the consequences will be generational.
If you don't believe me, go through your list of friends, coworkers, family, etc and see who works in research and see what they're posting on social media or talk with them.
Got any friends who work in companies that make scientific equipment, reagents, etc? They might not have a job already, or soon.
Kids get into science in part because their parents or a family member is in science. Or they see a cool show on PBS about science. All that's going away. We're going to see a precipitous drop in the number of people pursuing scientific educations and careers.
Billionaires are about to find out that it doesn't matter how much money you have if your kid has cancer and there's nobody to treat them, no drugs being researched or manufactured, no diagnostic equipment (that was in part funded by research project grants), and so on.
Nothing to lose any more? Then go and protest, hard. It's too late to undo the damage already caused, but a huge part of why Trump was able to rise to power was because there was by far not enough protest against him.
Distribution is somewhat like this...
Say there are 10,000 people affected by this
5,000 probably have skills to pivot to something else, don't give a shit about future billionaire's kid's problem. People wouldn't want to be scientists if they can't also have a decent career.
2,000 people have the means to survive but can't afford to fight the thugs on the street.
2,000 people are desperate but otherwise marginalized by the current administration (immigrant, Mexican, Black, Muslim,... whatever) and don't want to sacrifice their extended families too.
1,000 people are desperate, have the courage to fight (probably white).
If the future of curing the billionaire's kid relies on 1,000 people sacrificing their life... oh well....
France's "yellow vests" or Germany's "Pegida" might disagree with you on that one. Both were pretty darn effective.
> They are good for one thing and one thing only: meeting other people who are just as angry as you about something
Also, Pegida was Nazis protesting that there wasn't enough Nazism happening, so I don't know why you bring them up as an example of a successful protest.
There are a lot of reasons to be skeptical of this claim. For one thing, it's not clear that trump voters respect protestors in the first place. For another thing, we're an extremely geographically distributed population, and most of our cities already swing strongly blue. This means protesting is generally a high-effort, low-return activity.
Whatever will provide friction I do not know, but I don't think protests are going to play a major role outside of maybe providing a narrative about how angry people are. But it's important to note that a significant number of people vote for Trump because he makes certain people angry.... If the right people "protest" in a ridiculous enough manner, you're going to likely strengthen the resolve of his base. Granted, I suspect this isn't much of an issue with science funding, but it's something to keep in mind.
My attitude is: if this country doesn't want science research, let it, follow the research overseas, and let your absence speak for itself.
They do respect one thing, just like their master does: strength. Show up in force, in overwhelming numbers, and all these "don't tread on me" people suddenly find out that, whoops, they aren't the top dogs any more. It used to be the case that you got beaten up or worse for showing up in KKK outfits, these days you got pseudo-edgy kids on social media with them.
I would ask your data source, because Wikipedia has 2024 stats indicating Harvard’s endowment is ~$4.5 billion greater than the UT system’s ($52B vs $47.5B).
I’d also point out that the UT system has almost 9 times the student body size as Harvard (250k+ vs 30k) spread among 14 campuses.
The UT system has a very large endowment, (which appears to be a little smaller than Harvard's), but UT Austin is much smaller (but still very large for a public university.)
I'd also ask why you included the University of Florida in that list, since it appears their endowment is pretty small (at least compared to the other schools in that list.)
https://en.m.wikipedia.org/wiki/List_of_colleges_and_univers...
Doesn't seem to be true? The LLM response claims 47.5 billion but I have no idea where it got that number from after looking through its sources.
edit: Oh, and if you're talking about the Permanent University Fund that's split between the UT + A&M systems. And the ChatGPT response is way off here as well.
And as the others have noted, even if what you said was true it has very little to do with what you're replying to.
The beginning is nearly always federal R&D funding. Much of it won't work, sure, and that's fine. It's not wasted, because when it works, it creates such a massive everlasting surplus and opportunity machine that it overcomes all past failures by orders of magnitude. Such as, computers, and all they enabled over the last 100-ish years.
The myth of the lone inventor in the garage should have been updated even in the pre-WW2 era.
Alas when the govt follows the exact same model, taking high risk, high reward bets, then it's seen as "wasteful spending". Despite the staggering value of the wins, it becomes better to "spend nothing" than waste a penny on research that goes nowhere.
The levels of cognitive dissonance, not to mention hypocrisy, are truly incredible.
And the charge is being led by someone who literally made his wealth from this model.
ARPA-E is high risk.
ARPA-H is high risk.
Much of NSF is high risk, like NSF Engines, NSF Future Manufacturing, NSF Convergence Accelerators.
DoD SBIR/STTR is high risk. (Confirm it for yourself and look at this month's topics in https://www.dodsbirsttr.mil/topics-app/.)
AFWERX is high risk.
SpaceWERX is high risk.
DIU is high risk.
NASA SBIR is high risk.
NASA NIAC is ultra-high risk.
DoD Office of Strategic Capital is high risk, the kind of risk no investors would fund.
Investors scrutinize pitch decks and then do hard company due diligence which frequently falls through. And conversations die-off with no obligation to provide feedback, unlike in government. And investors will not fund true R&D. They fund scale.
So no, your statement does not hold.
I read that as at least 0.5 years per year of progress against the estimate :)
The paper introduces a new, data-driven method for simulating particle motion in fusion devices that is much more accurate than traditional models, especially for fast particles, and could significantly improve fusion reactor design.
This work is related to actual genuine nuclear fusion, the kind that occurs at energy scales sufficient to overcome that Coulomb barrier. At those energy scales it becomes very hard to manage the plasma in which fusion occurs. This is a claimed advance in plasma management.
What happens is that thermal energies get high enough that the nuclei get close enough to have a significant rate of tunneling through the barrier. It's a quantum mechanical effect.
There is a nonzero rate of tunneling through the barrier even at room temperature -- just extremely low, far lower than putative cold fusion claims.
Worth noting that (while obviously not what is normally meant by "cold fusion") muon-catalyzed fusion is possible and is cold, so the above statement can't be quite right.
There is however Lattice Confinement Fusion [1] which claims to overcome the Coulomb barrier through some kind of "screening" from the electron cloud in the lattice. That seems more like it would work at everyday scales, though I don't understand it nearly enough to offer any opinion on viability.
[1] https://www1.grc.nasa.gov/space/science/lattice-confinement-...
But they're relevant to the particular argument you were making, which involved only the nuclei and not what was orbiting them. Unless I'm misunderstanding?
> There is however Lattice Confinement Fusion [1] which claims to overcome the Coulomb barrier through some kind of "screening" from the electron cloud in the lattice. That seems more like it would work on at everyday scales, though I don't understand it nearly enough to offer any opinion on viability.
Lattice confinement fusion is generally considered to be crankery, as I understand it. It's what Pons and Fleischmann were doing. There are various people who have continued work on it but my understanding is that basically none of it is credible.
The LHC uses ~86 megawatts, about the same power as a 747's engine at full throttle. It's about the same as a small natural gas powered turbine. GE builds gas turbines that produce 800+ MW.
The LHC is just a controlled environment to study the kind of particle collisions that are happening all over the earth every day. We live next to a giant fusion reaction, and freak particles come in from outer space all the time. We have detected many particles with millions of times more energy than the particles in the LHC- the Oh-My-God particle had 20 million times more energy.
> Can someone tell me what the likelihood of a humongous explosion from nuclear fusion could be?
Fission self-sustains. Each reaction produces 3 neutrons that can start another reaction. It explodes because the neutrons grow like 3, 9, 27 etc.
Fusion does not. You have a number of atoms, and 2 of those atoms have to find each other to fuse. One reaction does not make any other reactions more likely. Unlike fossil fuels or fission reactions, the fuel cannot be lit. It can only burn when carefully confined. You can only build up enough flame to break the containment vessel, at which point it goes out. Since the inside of the vessel is basically a vacuum, it will implode instead of exploding.
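A toy sketch of the difference (idealized; real fission releases roughly 2-3 neutrons per event, and real devices have losses):

    # Idealized neutron growth in an uncontrolled fission chain vs. fusion,
    # where one reaction does not trigger another.
    neutrons = 1
    for generation in range(1, 11):
        neutrons *= 3   # each fission assumed to free 3 usable neutrons
        print(f"fission generation {generation}: {neutrons} neutrons")

    # Fusion has no equivalent loop: the reaction rate depends only on keeping
    # the fuel hot and confined, so a breach quenches it rather than amplifying it.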
Maybe say which country specifically you mean when you say "dual citizenship" ;)