I measured the electron's vector coupling to the Z boson at SLAC in the late 1990s, and the answer from that measurement is: we don't know yet - and that's the point.
Thirty years later, the discrepancy between my experiment and LEP's hasn't been resolved.
It might be nothing. It might be the first whisper of dark matter or a new force. And the only way to find out is to build the next machine. That's not 'dead', that's science being hard.
My measurement is a thread that's been dangling for decades, waiting to be pulled.
But it is not just about making money: the entire field of radiation therapy for cancer exists and continues to improve because people figured out how to control particle beams with extreme precision, and ever more economically, in order to study particle physics. Heck, commercial MRIs exist and continue to improve because physicists want cheaper, stronger magnets so they can build more powerful colliders. What if in the future you could do advanced screening quickly and without hassle at your GP's office, instead of having to wait for an appointment (and possibly pay a lot of money) at a specialist imaging center? And if they find something, they could immediately nuke it without cutting you open? We're talking about the ultimate possibility of Star Trek-level medbays here.
Let the physicists build the damn thing however they want and future society will be better off for sure. God knows what else they will figure out along the way, but it will definitely be better for the world than sinking another trillion dollars into wars in the Middle East.
No. These two cases are absurdly different, and you're even completely misunderstanding (or misrepresenting) the meaning of the "tens of billions of dollars" figure.
Microchips were an incremental improvement where the individual increments yielded utility far greater than the investment.
For particle physics, the problem is that the costs have exploded with the size of the facilities needed to reach higher energies (the "tens of billions of dollars" is for one of them), but the results in scientific knowledge (let alone technological advances) have NOT. The early accelerators cost millions or tens of millions and revolutionized our understanding of the universe. The latest ones cost billions and have confirmed a few things we already thought to be true.
> Let the physicists build the damn thing and future society will be better off for sure.
Absolutely not.
Yes, but we had hopes that it would lead to more. And had it led to more, something only known to be false in hindsight, who knows where that would have ended us up? What if it had upended the standard model instead of reinforcing it?
> Absolutely not.
What are we supposed to do then? As humans, I mean. No one knows why we're here, what the universe really is like. We have some pretty good models that we know are wrong and we don't know what wonders the theoretical implications of any successor models might bring. That said, do we really need to motivate fundamental research into the nature of reality with a promise of technology?
I'm not arguing for mindlessly building bigger accelerators, and I don't think anyone is - there has to be a solid line of reasoning to warrant the effort. And we might find that there are smarter ways of getting there for less effort - great! But if there isn't, discrediting the avenue of particle accelerators because of its high upfront cost and its historical results would be a mistake. We can afford it, and we don't know the future.
Sure, but it didn't. Which is knowledge that really should factor into the decision to build the next, bigger one.
> What are we supposed to do then? As humans, I mean.
Invest the money and effort elsewhere, for now. There are many other fields of scientific exploration that are very likely to yield greater returns (in knowledge and utility) for less. You could fund a hundred smaller but still substantial initiatives instead of one big accelerator, and be virtually guaranteed to have an exciting breakthrough in a few of them.
And who knows, maybe a breakthrough in material science or high-voltage electrophysics will substantially reduce the costs for a bigger particle accelerator?
It was always factored in, and of course it would be in any next iteration.
> Invest the money and effort elsewhere, for now. There are many other fields of scientific exploration that are very likely to yield greater returns (in knowledge and utility) for less. You could fund a hundred smaller but still substantial initiatives instead of one big accelerator, and be virtually guaranteed to have an exciting breakthrough in a few of them.
I agree with this to a large extent. I'm just not against particle accelerators as an avenue for scientific advancement, and in the best of worlds we could do both.
But you are, and they are. Just from the comments here it's clear that even suggesting we not spend untold billions on maybe pushing theoretical physics a little forward is met with scorn. The value proposition, whether in knowledge or in technology, just isn't well argued anymore beyond hand-waving.
Engineers not being able to fathom that, by building these huge-ass, complicated machines to answer questions about the fundamentals of nature, other problems get solved or new things get invented that improve and change our lives will never not be funny to me.
I'd not be so sure about that. Doing this research will probably allow us to answer the "it works but we don't know exactly why" cases in things we use every day (e.g. Li-ion batteries). Plus, while the machines are getting bigger, the understood tech is getting smaller, as far as the laws of physics allow.
If we are going to insist on "Absolutely not" path, we should start with proof-of-work crypto farms and AI datacenters which consume county or state equivalents of electricity and water resources for low quality slop.
> If we are going to insist on "Absolutely not" path, we should start with proof-of-work crypto farms and AI datacenters which consume county or state equivalents of electricity and water resources for low quality slop.
Who exactly is the "we" that is able to make this decision? The allocation of research budgets is completely unrelated to the funding of AI datacenters or crypto farms. There is no organization on this planet that controls both.
And if you're gonna propose that the whole of human efforts should somehow be organized differently so that these things can be prioritized against each other properly, then I'm afraid that is a much, MUCH harder problem than any fundamental physics.
That's the problem with cutting-edge research... you don't even know if you will ever need it, or if a trillion-dollar industry is waiting for just a number to be born.
Because the costs aren't just numbers. They represent hundreds or thousands of person-years of effort. You're proposing that a large number of people should spend their entire lives supporting this (either directly as scientists, or indirectly through funding it) - and maybe end up with nothing to show for it.
And there's the opportunity costs. You could fund hundreds of smaller, yet still substantial scientific efforts in many different fields for the cost of just one particle accelerator of the size we think is sufficient to yield some new observations.
There are talks of a muon collider; there is also a spallation source being built in Sweden(?), and an electron 'Higgs factory' (and while the LHC was built for the Higgs boson, it is not a great source of it - it was built as a generic tool that could produce and see the Higgs).
Everybody knows we are not there yet, nor how the final knowledge set will look, or whether it is even possible to cover it (i.e. are quarks the base layer, or can we go deeper, much deeper, all the way to Planck scales? dynamics of singularities, etc.).
Back then, we thought our theory was more or less complete while having experimental data which disproved it (Michelson-Morley experiment, Mercury perihelion, I am sure there are others).
Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.
Consider e.g. neutrino masses. We have plenty of experimental data indicating that neutrinos oscillate and therefore have mass. This poses a problem for the standard model (because there are problems unless the mass comes from the Higgs mechanism, but in the standard model neutrinos can't participate in the Higgs mechanism due to always being left-handed). But whenever we do experiments to attempt to verify one of the ways of fixing this problem -- are there separate right-handed neutrinos we didn't know about, or maybe instead the right-handed neutrinos were just antineutrinos all along? -- we turn up nothing.
This again? It's only true if you insist on sticking with the original form of Weinberg's "model of leptons" from 1967 [1], which was written when massless neutrinos were consistent with available experimental data. Adding quark-style (i.e. Dirac) neutrino mass terms to the Standard Model is a trivial exercise. If doing so offends some prejudice of yours that right-handed neutrinos cannot exist because they have no electric or weak charge (in which case you must really hate photons too, not to mention gravity), you can resort to a Majorana mass term [2] instead.
That question (are neutrinos Dirac or Majorana?) is not a "contradiction", it's an uncertainty caused by how difficult it is to experimentally rule out either option. It is most certainly not "a problem for the standard model".
[1] https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.19.1264
[2] https://en.wikipedia.org/wiki/Majorana_equation#Mass_term
This is getting tiresome...
It really fits well with the OP's comments. Nothing really contradicts the theory, but there's no deeper theory beyond it. Another comment mentioned the "nightmare" scenario of dark matter having only gravitational interactions with other matter. That would be very unsatisfying for physicists, but it wouldn't be something that really disproves any given theory.
But this might be easier to read: https://www.space.com/astronomy/black-holes/did-astronomers-...
The following seem likely to me: (1) Consciousness exists, and is not an illusion that doesn't need explaining (a la Daniel Dennett), nor does it drop out of some magical part of physical theory we've somehow overlooked until now; (2) Mind-matter interactions do not exist, that is, purely physical phenomena can be perfectly explained by appeals to purely physical theories.
Such are the stances of "naturalistic dualist" thinkers like David Chalmers. But if this is the case, it implies that the physics of matter and the physics of consciousness are orthogonal to each other. Much like it would be a nightmare to stipulate that dark matter is a purely gravitational interaction and that's that, it would be a nightmare to stipulate that consciousness and qualia arise noninteractionally from certain physical processes just because. And if there is at least one materially noninteracting orthogonal component to our universe, what if there are more that we can't even perceive?
Imagine trying to figure out what is happening on someone's computer screen with only physical access to their hardware minus the screen, and an MRI scanner. And that's a system we built! We've come exceedingly far with brains and minds considering the tools we have to peer inside.
Our brain needs to sense our "inner talk" so we can let it guide our decision-making and actions. If we couldn't remember sentences, we couldn't remember "facts" and would be much worse for that. And talking with our "inner voice" and hearing it, isn't that what most people would call consciousness?
I agree with the poster (and Daniel Dennett and others) that there isn't anything that needs explaining. It's just a question-framing problem, much like the measurement problem in quantum mechanics.
If it’s completely impossible to even imagine what the answer to a question is, as is the case here, it’s probably the wrong question to pose. Is there any answer you’d be satisfied by?
To me the hard problem is more or less akin to looking for the true boundaries of a cloud: a seemingly valid quest, but one that can’t really be answered in a satisfactory sense, because it’s not the right one to pose to make sense of clouds.
I would be very satisfied to have an answer, or even just convincing heuristic arguments, for the following:
(1) What systems experience consciousness? For example, is a computer as conscious as a rock, as conscious as a human, or somewhere in between? (2) What are the fundamental symmetries and invariants of consciousness? Does it impact consciousness whether a system is flipped in spacetime, skewed in spacetime, isomorphically recast in different physical media, etc.? (3) What aspects of a system's organization give rise to different qualia? What does the possible parameter space (or set of possible dynamical traces, or what have you) of qualia look like? (4) Is a consciousness a distinct entity, like some phase transition with a sharp boundary, or is there no fundamentally rigorous sense in which we can distinguish each and every consciousness in the universe? (5) What explains the nature of phenomena like blindsight or split brain patients, where seemingly high-level recognition, coordination, and/or intent occurs in the absence of any conscious awareness? Generally, what behavior-affecting processes in our brains do and do not affect our conscious experience?
And so on. I imagine you'll take issue with all of these questions, perhaps saying that "consciousness" isn't well defined, or that an "explanation" can only refer to functional descriptions of physical matter, but I figured I would at least answer your question honestly.
And when your fingers type that you experience qualia, are they bullshitting because your fingers have never actually received any signals from your consciousness in any direct or indirect way?
There exists a huge number of fundamental quantities that should be calculated from the parameters of the "standard model", but we cannot compute them, we can only measure them experimentally.
For instance, the masses and magnetic moments of the proton, of the neutron and of all other hadrons, the masses and magnetic moments of the nuclei, the energy spectra of nuclei, of atoms, of ions, of molecules, and so on.
The "standard model" can compute only things of negligible practical importance, like the statistical properties of the particle collisions that are performed at LHC.
It cannot compute anything of value for practical engineering. All semiconductor devices, lasers and any other devices where quantum physics matters are not designed using any consistent theory of quantum physics, but they are designed using models based on a great number of empirical parameters determined by measurement, for which quantum physics is only an inspiration for how the model should look like and not a base from which the model can be derived rigorously.
"in the specific regime covering the particles and forces that make up human beings and their environments, we have good reason to think that all of the ingredients and their dynamics are understood to extremely high precision"[0]
Sean Carroll's own favorite topics (emergent gravity, and the many worlds interpretation) are also things that we don't have any clue about.
Yes there is stuff we can calculate to very high precision. Being able to calculate it, and understanding it, are not necessarily the same thing.
You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.
But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.
For a historical analogy, classical physics was and is sufficient for most practical purposes, and we didn't need relativity or quantum mechanics until we had instruments that could manipulate them, or that at least experienced them. While I guess that there were still macroscopic quantum phenomena, perhaps they could have just been treated as empirical material properties without a systematic universal theory accounting for them, when instruments would not have been precise enough to explore and exploit predictions of a systematic theory.
Incompleteness is inherent to our understanding as the universe is too vast and endless for us to ever capture a holistic model of all the variables.
Gödel says something specific about human axiomatic systems, akin to a special relativity, but it generalizes to physical reality too. A written system is made physical by writing it out, and it is never complete. This demonstrates that our grasp of physical systems themselves is always incomplete.
An environment living in Conway’s Game of Life could be quite capable of hypothesizing that it is implemented in Conway’s Game of Life.
Systems can hypothesize about themselves but they cannot determine why the rules they can learn exist in the first place. Prior states are no longer observable so there is always incomplete history.
Conway's Game of Life can't explain its own origins, just itself, because the origins are no longer observable after they occur.
What are the origins of our universe? We can only guess without the specificity of direct observation. Understanding is incomplete with only simulation and theory.
So the comment is right. We would expect to be able to define what is now but not completely know what came before.
The point being it's not at all clear what we might be missing without these impractical little mysteries that so far are very distant from every day life.
We are finding local maxima (induction), but the establishment cannot handle deduction.
Everything is an overly complex bandaid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh, that's great, the sun is actually at the center of the solar system. Copernicus was only slightly wrong in thinking the planets move in circles; we just needed to use ellipses!'
But with particles.
Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.
We're cutting research instead of funding it right now and sending our brightest researchers to Europe and China...
For just about anything else, Newton has us covered.
Nothing major.
It's also kind of interesting how causality allegedly has a speed limit and it's rather slow all things considered.
Anyway, in 150 years we have absolutely come a long way. We'll figure that out eventually, but as always, figuring it out might lead to even bigger questions and mysteries...
The charge of the electron is -1 and of the proton +1. It has been experimentally measured out to 12 digits or so to be the same magnitude, just opposite in sign. However, there are no theories for why this is -- they are simply measured and that is it.
It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There almost certainly is a lower level mechanism which explains why they are exactly the same but opposite.
Consistent quantum field theories involving chiral fermions (such as the Standard Model) are relatively rare: the charges have to satisfy a set of polynomial relationships with the inspiring name "gauge anomaly cancellation conditions". If these conditions aren't satisfied, the mathematical model will fail pretty spectacularly. It won't be unitary, can't couple consistently to gravity, won't allow high and low energy behavior to decouple,..
For the Standard Model, the anomaly cancellation conditions imply that the sum of electric charges within a generation must vanish, which they do:
3 colors of quark * (up charge 2/3 + down charge -1/3) + electron charge -1 + neutrino charge 0 = 0.
So, there's something quite special about the charge assignments in the Standard Model. They're nowhere near as arbitrary as they could be a priori.
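As a quick sanity check of that arithmetic, here's a minimal sketch in Python; nothing fancy, just the first-generation charge assignments quoted above added up with exact fractions (the dictionary names are mine, purely for illustration):

```python
from fractions import Fraction

# Electric charges of the first-generation fermions in the Standard Model.
charges = {
    "up quark": Fraction(2, 3),
    "down quark": Fraction(-1, 3),
    "electron": Fraction(-1),
    "neutrino": Fraction(0),
}

# Quarks come in 3 colors, leptons in 1; the condition quoted above is that
# the color-weighted sum of electric charges in one generation vanishes.
multiplicity = {"up quark": 3, "down quark": 3, "electron": 1, "neutrino": 1}

total = sum(n * charges[p] for p, n in multiplicity.items())
print(total)  # 0 -- the generation's charges cancel exactly
```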
Historically, this has been taken as a hint that the standard model should come from a simpler "grand unified" model. Particle accelerators and cosmology have turned up at best circumstantial evidence for these so far. To me, it's one of the great mysteries.
> you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
When the probability of coincidence is epsilon, then, no. Right now they are the same to 12 digits, but that undersells it, because that is just the trailing digits. There is nothing which says the leading digits must be the same, eg, one could be 10^30 times bigger than the other. Are you still going to just shrug and say "coincidence?"
That there are 26 fundamental constants and this one just happens to be exactly the same as another is untenable.
Imagine an object made of only red marbles as the 'base state'. Now you somehow manage to remove one red marble: you're at -1. You add a red marble and you're at +1. It doesn't require any other marbles. Then you go and measure the charge of a marble and you end up at some 12-digit number. The one state will show that 12-digit number as negative, the other as positive.
Assigning charge as being the property of a proton or an electron rather than one of their equivalent constituent components is probably a mistake.
Consider: in every known case where we have found a deeper layer of explanation for a "coincidence" in physics, the explanation involved some symmetry or conservation law that constrained the values to a small discrete set. The quark model took seemingly arbitrary coincidences and revealed them as consequences of a restrictive structure. auntienomen's point about anomaly cancellation is also exactly this kind of thing. The smallness of the set in question isn't forced, but it is plausible.
But I actually think we're agreeing more than you realize. You're saying "this can't be a coincidence, there must be a deeper reason." I'm saying the deeper reason might bottom out at "the consistent discrete structures are sparse and this is one of them," which is a real explanation, but it might not have the form of yet another dynamical layer underneath.
It's simple to say "Ah well, it's sparse," but that doesn't mean anything and doesn't explain anything.
Symmetries are equivalent to conserved quantities. They exist because something else is invariant with respect to some transformation, and vice versa. We didn't discover arbitrary constraints; we found a conserved quantity and the implied symmetry.
"There are integers", "the numbers should be small" all of these are nothing like what works normally. They aren't symmetries. At most they're from some anthropic argument about collections of universes being more or less likely, which is its own rabbit hole that most people stay away from.
No. It’s almost certainly not a coïncidence that these charges are symmetric like that (in stable particles that like to hang out together).
Which makes every constant fair game. Currently, we don’t have a good process for explaining multiple universes beyond divine preference. Hence the notion that a random number settled on mirror whole sums.
Nïce
When an electron-positron pair is formed from the vacuum, we get all sorts of interesting geometry which I struggle to grasp or picture clearly. I understand that the fact that these are fermions with spin 1/2 can similarly be explained as localized defects in a field of particles with integer spin (possibly a feature of the exact same "defect" as the charge itself, in the photonic field, which is what defines an electron as an electron).
EDIT:
> However, there are no theories why this is -- they are simply measured and that is it.
My take is that there _are_ accepted hypotheses for this, but solving the equations (of e.g. the standard model, in full 3D space) to a precision suitable to compare to experimental data is currently entirely impractical (at least for some things like absolute masses - though I think there are predictions of ratios etc that work out between theory and measurement - sorry not a specialist in high-energy physics, had more exposure to low-energy quantum topological defects).
eddies in the space-time continuum?
If the question is, why is quantum mechanics the correct theory? Well, I guess that's how our universe works...
And does it even apply here? If the charge on the electron differed from the charge on the proton at just the 12th decimal place, would that actually prevent complex life from forming. Citation needed for that one.
I agree with OP. The unexplained symmetry points to a deeper level.
I was born into this world at a certain point in time. I look around, and I see an environment compatible with me: air, water, food, gravity, time, space. How deep does this go? Why am I not an ant or a bacterium?
Some lean on the multiverse and the anthropic principle to explain it, but that is far less parsimonious.
Crackpots have found thousands of formulas that try to explain the ratio of the proton mass to the electron mass, but there is no expectation of a simple relationship between those masses, since the proton mass is the sum of all sorts of terms.
(I use quotes because those are emergent concepts)
Same as "hacker community" deciding that AI is worth FOMO'ing about
If you tally up the forces, the difference is a residual attraction that can model gravity. It was rejected on various experimental and theoretical grounds, but it goes to show that if things don't cancel out exactly then the result can still leave a universe that would appear normal to us.
In other words: There can be multiple "layers" of linked states, but that doesn't necessarily mean the lower layers "create" the higher layers, or vice versa.
Now, the ratios between these charges appear to be fundamental. But the presence of fractions is arbitrary.
Actually, I doubt it. Because of their color charge, quarks can never be found in an unbound state but instead in various kinds of hadrons. The ways that quarks combine cause all hadrons to end up with an integer charge, with the ⅔ and -⅓ charges on various quarks merely being ways to make them come out to resulting integer charges.
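To make that bookkeeping concrete, here's a minimal sketch of the fraction arithmetic; the quark contents are the textbook ones for the proton, neutron, and pi+, and the helper name is just made up for illustration:

```python
from fractions import Fraction

# Up- and down-quark electric charges; a leading '~' marks an antiquark,
# which carries the opposite charge.
QUARK_CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def hadron_charge(content):
    total = Fraction(0)
    for token in content.split():
        sign = -1 if token.startswith("~") else 1
        total += sign * QUARK_CHARGE[token.lstrip("~")]
    return total

# Baryons (three quarks) and mesons (quark + antiquark) land on integers.
print(hadron_charge("u u d"))  # proton:  2/3 + 2/3 - 1/3 = 1
print(hadron_charge("u d d"))  # neutron: 2/3 - 1/3 - 1/3 = 0
print(hadron_charge("u ~d"))   # pi+:     2/3 + 1/3 = 1
```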
For example, pair production is:
photon + photon = electron + (-)electron
You can take that diagram, rotate it in spacetime, and you have the direct equivalent, which is electrons changing paths by exchanging a photon: electron + photon = electron - photon
There are similar formulas for beta decay, which is: neutron = proton + electron + (-)neutrino
You can also "rotate" this diagram, or any other Feyman diagram. This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.The precise why of this algebra is the big question! People are chipping away at it, and there's been slow but steady progress.
One of the "best" approaches I've seen is "The Harari-Shupe preon model and nonrelativistic quantum phase space"[1] by Piotr Zenczykowski which makes the claim that just like how Schrodinger "solved" the quantum wave equation in 3D space by using complex numbers, it's possible to solve a slightly extended version of the same equation in 6D phase space, yielding matrices that have properties that match the Harari-Shupe preon model. The preon model claims that fundamental particles are further subdivided into preons, the "charges" of which neatly add up to the observed zoo of particle charges, and a simple additive algebra over these charges match Feyman diagrams. The preon model has issues with particle masses and binding energies, but Piotr's work neatly sidesteps that issue by claiming that the preons aren't "particles" as such, but just mathematical properties of these matrices.
I put "best" in quotes above because there isn't anything remotely like a widely accepted theory for this yet, just a few clever people throwing ideas at the wall to see what sticks.
But again, this is just observation, and it is consistent with the charges we measure (again, just observation). It doesn't explain why these rules must behave as they do.
> This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.
This is exactly what I am suggesting in my original comment: this "coincidence" is not a coincidence but falls out from some deeper, shared mechanism.
Sure, but that's fundamental to observing the universe from the inside. We can't ever be sure of anything other than our observations because we can't step outside our universe to look at its source code.
> It doesn't explain why these rules must behave as they do.
Not yet! Once we have a theory of everything (TOE), or just a better model of fundamental particles, we may have a satisfactory explanation.
For example, if the theory ends up being something vaguely like Wolfram's "Ruliad", then we may be able to point at some aspect of very trivial mathematical rules and say: "the electron and proton charges pop out of that naturally, it's the only way it can be, nothing else makes sense".
We can of course never be totally certain, but that type of answer may be both good enough and the best we can do.
Yes, that's part of the plan. I mean, not to all the physicists, just to those whose work doesn't bring in results anymore, and it hasn't for 30 to 40 years now. At some point they (said physicists) have to stop their work and ask themselves what it is that they're doing, because judging by their results it doesn't seem like they're doing much, while consuming a lot of resources (which could have been better spent elsewhere).
Thousands of people, across all engineering branches, worked for a few decades on bringing the LHC up before the Higgs came to be.
This stuff is hard, and there is no roadmap on how to get there.
Physics advances have been generally driven by observation, obtained through better and better instrumentation. We might be entering a long period of technology development, waiting for the moment our measurements can access (either through greater energy or precision) some new physics.
The best known example is the pre- and post-Copernican conceptions of our relationship to the sun. But long before and ever since: if you show me physics with its wheels slipping in mud I'll show you a culture not yet ready for a new frame.
We are so very attached to the notions of a unique and continuous identity observed by a physically real consciousness observing an unambiguous arrow of time.
Causality. That's what you give up next.
Copernicus was proposing circular orbits with the sun at the center instead of the earth. The Copernican model required more epicycles for accurate predictions than the considerably well-proven Ptolemaic model did, with the earth at the centre.
It wasn't until Kepler came along and proposed elliptical orbits that a heliocentric solar system was obviously a genuine advance on the model, both simpler and more accurate.
There was no taboo being preserved by rejecting Copernicus's model. The thinkers of the day rightfully saw a conceptual shift with no apparent advantage and several additional costs.
LLMs were a breakthrough I didn't expect and it's likely the last one we'll see in our lifetime.
Either way this is also opinion based.
There hasn't been a revolutionary change in technology in the last 20 years. I don't consider smartphones to be revolutionary. I consider going to the moon revolutionary, and catching a rocket sort of revolutionary.
Actually, I take that back: I predict Mars as a possible breakthrough along with LLMs, but we got lucky with Musk.
Oh fuck off. Opinions exist in reality, do they not? Pessimism implies opinions that are biased towards negativity. My opinions are biased towards reality, aka what I observe, not what is negative.
There, clear?
>But the actual reality is that scientific discovery is proceeding at least as fast as it ever has.
No it's not. Space is the best example of this. Computing speed is another example. We are hitting physical barriers to technology and discovery in every practical dimension.
>There's important work happening in solid state physics and materials science. JWST is overturning old theories and spawning new ones in cosmology. There's every reality-based reason to believe there will be plenty of big changes in science in the next 20 years or so.
Astronomy? Give me a break. We're trying to infer what's millions of light years away; of course we're going to be stumbling and getting shit wrong all the time. A step function in astronomy is actually going to the stars. I guarantee even your great-great-great-grandchildren won't go to one.
However, from your later comments, it sounds as though you feel the only operating definition of a "breakthrough" is a change inducing a rapid rise in labor extraction / conventional productivity. I could not disagree more strongly with this opinion, as I find this definition utterly defies intuition. It rejects many, if not most, changes in scientific understanding that do not directly induce a discontinuity in labor extraction. But admittedly, if one restricts the definition of a breakthrough in this way, then, well, you're probably about right. (Though I don't see what Mars has to do with labor extraction.)
Catching a rocket is very impressive, but it's just a lower-cost method of reaching Earth orbit. It does unlock megaconstellations tho.
AI is the step-function change. The irony is that it became so pervasive and intertwined with slop that people like you forget that what it does now (write all code) was unheard of just a couple of years ago. AI surpassed the hype; now it's popular to talk shit about it.
For decades, progress mostly shifted physical constraints or communication bandwidth. Faster chips, better networks, cheaper storage. Those move slopes, not discontinuities. Humans still had to think, reason, design, write, debug. The bottleneck stayed human cognition.
LLMs changed that. Not marginally. Qualitatively.
The input to the function used to be “a human with training.” The output was plans, code, explanations, synthesis. Now the same class of output can be produced on demand, at scale, by a machine, with latency measured in seconds and cost approaching zero. That is a step change in effective cognitive throughput.
This is why “video calling another continent” feels incremental. It reduces friction in moving information between humans. AI reduces or removes the human from parts of the loop entirely.
You can argue about ceilings, reliability, or long term limits. Fine. But the step already happened. Tasks that were categorically human two years ago are now automatable enough to be economically and practically useful.
That is the function. And it jumped.
I will commit the first sin by declaring, without fear of contradiction, that the cat actually IS either alive or dead. It is not in a superposition of states. What is unknown is our knowledge of the state, and what collapses is that uncertainty.
If you shift this to the particle, not the cat, what changes? If very much changes, my first comment about the unsuitability of the metaphor is upheld; if very little changes, my comment has been disproven.
It would be clear I am neither a physicist nor a logician.
From the wikipedia page: “This thought experiment was devised by physicist Erwin Schrödinger in 1935 in a discussion with Albert Einstein to illustrate what Schrödinger saw as the problems of Niels Bohr and Werner Heisenberg's philosophical views on quantum mechanics.”
[0] Well, and the hidden variables are non-local, which is a whole 'nother can of highly non-intuitive worms.
However I still find it crazy that when you slow down the laser and one photon at a time goes through either slit you still get the bands. Which begs the question, what exactly is it constructively or destructively interfering with?
Still seems like there's much to be learned about the quantum world, gravity, and things like dark energy vs MOND.
(This is what I was told when exploring my belief that it's always been fringes in streams of photons, not something emerging over repeated applications of single photons; I was wrong.)
The difficult part is single photon _detectors_, they're the key technology to explore the single-photon version of Young's experiment (which originally showed that light has wave-like properties).
If I make the equivalent of a double slit experiment in a swimming pool, then generate a vortex that propagates towards my plywood slits or whatever, it's not really surprising that the extended volume of the vortex interacts with both slots even though it looks like a singular "particle."
Nuclear physics (ie, low/medium energy physics) covers diverse topics, many with real world application - yet travels with a lot of the same particles (ie, quarks, gluons). Because it is so diverse, it is not dead/dying in the way HEP is today.
"The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the t¯t production cross section"
What is more interesting currently are things like anomaly detection using ML/NN, foundational models, etc.
Fun fact: I got to read the thesis of one my uncles who was a young professor back in the 90's. Right when they were discovering bosons. They were already modelling them as tensors back then. And probably multilinear transformations.
Now that I am grown I can understand a little more; I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD
You can find tensors even in some niche stuff in macroeconomics.
I wish those people would focus on practical, real-world physics, so we could all enjoy new innovations.
Ever used GPS?
A CD player?
A laser?
Semiconductors?
The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.
In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.
Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.
CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would triple the circumference of the LHC. The plan involves first colliding electrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC later in the century. Formal approval and funding for this project are not expected before 2028.
Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.
China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.
On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.
The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.
Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO just came online, and could only really detect enormous events (like black hole mergers).
And while we don't always understand what things do as we scale up or in the aggregate, that doesn't require new physics to explain.
And, I think, most people would place that kind of stuff under "solid state physics" anyway.
Scaling up particle colliders has arguably hit diminishing returns.
>Cari Cesarotti, a postdoctoral fellow in the theory group at CERN, is skeptical about that future. She notices chatbots’ mistakes, and how they’ve become too much of a crutch for physics students. “AI is making people worse at physics,” she said.
- the universe as a neural network (yes, yes, moving the universe-model paradigm from the old clockwork, to machine, to computer, to neural network)
I found it interesting and speculative but also fascinating
See video here:
https://youtu.be/73IdQGgfxas?si=PKyTP8ElWNr87prG
AI summary of the video:
This video discusses Professor Vitaly Vanchurin's theory that the universe is literally a neural network, where learning dynamics are the fundamental physics (0:24). This concept goes beyond simply using neural networks to model physical phenomena; instead, it posits that the universe's own learning process gives rise to physical laws (0:46).
Key takeaways from the discussion include:
• The Universe as a Neural Network (0:00-0:57): Vanchurin emphasizes that he is proposing this as a promising model for describing the universe, rather than a definitive statement of its ontological nature (2:48). The core idea is that the learning dynamics, which are typically used to optimize functions in machine learning, are the fundamental physics of the cosmos (6:20).
• Deriving Fundamental Field Equations (21:17-22:01): The theory suggests that well-known physics equations, such as Einstein's field equations and the Dirac and Klein-Gordon equations, emerge from the learning process of this neural-network universe.
• Fermions and Particle Emergence (28:47-32:15): The conversation delves into how particles like fermions could emerge within this framework, with the idea that network configurations useful for learning survive, similar to natural selection.
• Emergent Quantum Mechanics (44:53-49:31): The video explores how quantum behaviors, including the Schrödinger equation, could emerge from the two distinct dynamics within the system: activation and learning. This requires the system to have access to a "bath" or "reservoir" of neurons.
• Natural Selection at the Subatomic Scale (1:05:10-1:07:34): Vanchurin suggests that natural selection operates on subatomic particles, where configurations that are more useful for minimizing the loss function (i.e., for efficient learning) survive and those that are not are removed.
• Consciousness and Observers (1:15:40-1:24:09): The theory integrates the concept of observers into physics, proposing a three-way unification of quantum mechanics, general relativity, and observers. Consciousness is viewed as a measure of learning efficiency within a subsystem (1:30:38).
The excellent Arvin Ash has a very accessible video about it: https://www.youtube.com/watch?v=paQLJKtiAEE
Maybe you aren’t going to be satisfied with the sort of complicated mathematics which appears to be correct (or, on the right track).
If you have complaints about the aesthetics of how the universe works, take it up with God.
Personally, I think there is a lot of beauty to be found in it.
I’ll admit that there are a few parts that go against my tastes (I don’t like needing to resort to distributions instead of proper functions), but that’s probably just intellectual laziness on my part.