“For all the talk of artificial intelligence, the most efficient computer on Earth remains the human brain. It can perform the same number of operations per second as the world’s supercomputers, but only requires power equivalent to a fridge lightbulb.”

This is kind of like saying my cat can perform the same number of operations as a supercomputer, because he is full of neurons which do information processing. Sure, I guess. But the number of floating point operations I can do per second is a whole lot less than 1.

I mean yeah, but floating point ops aren’t a great measure of intelligence?
I think that in a physical sense you can make a throughput comparison that is arguably unfair to GPUs: treat one floating-point op as roughly equivalent to one neuron spiking or not within the minimum window of time in which that choice is significant, then multiply by the number of neurons (rough numbers sketched below).
I'm not sure there's a single metric by which you can argue a single human brain has more compute than 1e6 H100s.
Idk if there’s any meaningful metric by which we can compare brain compute with digital compute.
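A rough version of the counting scheme proposed above, for what it's worth; every constant here is an assumed round number, not a measurement:

    # Back-of-envelope sketch: count one "operation" per neuron per minimum
    # significant spike-timing window, then compare against GPU FLOPS.
    # Every constant below is an assumption.
    NEURONS = 86e9        # commonly cited neuron count for a human brain
    WINDOW_S = 1e-3       # assume ~1 ms as the smallest meaningful spike-timing window

    brain_events_per_s = NEURONS / WINDOW_S     # ~8.6e13 "operations"/s

    H100_FLOPS = 1e15     # assumed order of magnitude for dense FP16 throughput
    cluster_flops = 1e6 * H100_FLOPS            # 1e6 H100s -> ~1e21 FLOPS

    print(f"brain event rate: {brain_events_per_s:.1e} /s")
    print(f"1e6 H100s       : {cluster_flops:.1e} FLOPS")
    print(f"ratio           : {cluster_flops / brain_events_per_s:.1e}x")

On those (very crude) assumptions the cluster still comes out roughly seven orders of magnitude ahead, which is why the choice of what counts as an "operation" dominates the whole argument.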

Because… we can do training and inference at the same time, while maintaining balance, operating machinery, or doing any number of other tasks that are computationally expensive for digital computers. Maybe that implies we train and infer more efficiently, but it may just as well imply that our brains are more computationally powerful.

Maybe our conscious mind has less compute than H100s, but our entire brain maybe not.

It’s an apples to oranges comparison any which way.

I don't understand how differences in algorithms (training/inference) affect this argument. I just want to argue that in aggregate the human brain just cannot do nearly as many operations of any kind as a super cluster of GPUs.

I guess the only way to know for sure is to faithfully simulate a human brain inside one of these clusters in real time...

> as many operations of any kind

What is a brain operation? How do you measure that?

My point is that you can’t make your argument, because you’re comparing two subjects that have nothing in common besides the vague colloquial notion of “computing.”

What your brain computes when it forms a response to my comment, and how it computes it, is wildly different from what digital hardware does.

They’re apples and oranges. No way to compare them on even ground, so no way to make the argument you’re trying to make.

We don’t have a shared metric with which to measure digital computation speed and organic brain computation speed, so your argument doesn’t make sense from the outset.

I agree with most of this, but I think if one system can rather accurately simulate another, I would say it is computationally stronger.
> I'm not sure there's a single metric

The computing power of the cellular machinery in ~3 lb of human tissue, which includes the neurons, is many orders of magnitude larger than that of 1e6 or even 1e12 H100s.

Quite literally, your finger has more compute power than 1e6 H100s; we just use that computational power to stay alive rather than to “think.”

Do you think a single neuron has more compute than an H100? I'm not sure even energy-based theoretical bounds on computational power allow this, even in principle.
Yes.

It’s allowed because the energy limits in Landauer's principle apply only to irreversible state changes.

When an antibody doesn’t match a surface protein, there’s computation with zero energy loss because there’s no change in state. It amounts to a “free” if statement as long as the result is false, and it’s almost always false. In a larger context it’s really inefficient, though, since it’s a random process that’s only effective because of how many antibodies flood the body and how quickly they bounce around and can perform this computation.

Cells heavily leverage this kind of computation to the point where it’s not generally thought of as computation until something happens.
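For reference, the bound being invoked is kT·ln 2 per irreversibly erased bit. Here's the arithmetic at body temperature, with rough assumed H100 figures for contrast:

    import math

    # Landauer's principle: irreversibly erasing one bit dissipates at least kT*ln(2).
    # Reversible steps (no state change, like the antibody non-match above) have no such floor.
    k_B = 1.380649e-23            # Boltzmann constant, J/K
    T = 310.0                     # body temperature, K
    landauer_j_per_bit = k_B * T * math.log(2)   # ~3e-21 J per erased bit

    # Assumed round numbers for an H100: ~700 W board power at ~1e15 FLOPS.
    h100_j_per_op = 700.0 / 1e15                 # ~7e-13 J per floating-point op

    print(f"Landauer limit at 310 K: {landauer_j_per_bit:.2e} J/bit")
    print(f"assumed H100 energy/op : {h100_j_per_op:.2e} J/op")
    print(f"gap                    : {h100_j_per_op / landauer_j_per_bit:.1e}x")

So current hardware sits roughly eight orders of magnitude above the thermodynamic floor; the “free” non-match case ducks under it entirely because nothing is erased.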

Thanks for the explanation, that was helpful. However, I am not sure whether microscopic states of the neuron are propagated as signals to nearby cells, or only some sort of macroscopic state (like the abstractions of thermodynamics), and I suspect there are bounds on the rate of change and the number of these macroscopic states.

I guess at some point we should only count computations above some abstraction level; otherwise we also undercount the computational power of a GPU.

Are we broadly in agreement, even if we disagree about the relative computational power of the two?

What do you even mean? You are comparing apples and oranges. Can an H100 churn through deterministic instructions to get to a fixed result in a computer's combinatorial envelope fast? No doubt.

However an abacus does not know how or what to add to achieve a goal.

Can an H100 or a bunch of H100s achieve better biological control than the human brain in real time as a self directed agent?

No way.

Apples and oranges.

No, I mean a neuron has electrical input signals and outputs. Do you think I cannot simulate a single neuron to fidelity on an H100? Theoretically it's possible that I can't, if the neuron is leveraging some physics to do optimization, but I think in practice it's possible.
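For what it's worth, the crude end of that spectrum is trivial; a leaky integrate-and-fire point neuron (a toy model, nowhere near full fidelity, with made-up parameters) is a few lines:

    # Toy leaky integrate-and-fire neuron: the membrane voltage decays toward rest,
    # integrates an input current, and emits a spike when it crosses a threshold.
    # All parameters are illustrative, not fitted to any real neuron.
    tau_m, v_rest, v_thresh, v_reset = 0.020, -0.065, -0.050, -0.065   # s, V, V, V
    r_m, dt = 1e7, 1e-4                                                # ohms, s

    v = v_rest
    spikes = []
    for step in range(10_000):                 # 1 s of simulated time
        i_in = 2e-9                            # constant 2 nA input current
        v += (-(v - v_rest) + r_m * i_in) * dt / tau_m
        if v >= v_thresh:                      # threshold crossed: record spike, reset
            spikes.append(step * dt)
            v = v_reset

    print(f"{len(spikes)} spikes in 1 s of simulated time")

Whether anything at this level of abstraction captures what the real cell is doing is exactly the open question.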

Fwiw I think 10e6 H100s can act as a competent self-directed agent with the right algorithms, yes.

I think 1e6 H100s could run something that’s intelligent enough to hold a meaningful conversation, as in AGI, assuming we figured out the appropriate algorithms and training methods.

But your description of neurons is off. They output chemical signals, which is why painkillers can dull the signal, etc. Many drugs bind to receptors on neurons and thereby block signals, which wouldn’t work if these were electrical connections.

They do respond to electrical stimulation, and it’s often said they use electrical signals to carry information along the body of the neuron, but the actual signal only moves at up to ~120 m/s, a long way from how fast electrical signals in a wire move. https://en.wikipedia.org/wiki/Nerve_conduction_velocity

As to simulating a neuron, the brain’s got 86 billion neurons, and the peripheral nervous system is doing useful work here too. A full-fidelity simulation of even one neuron is completely off the table simply because it’s an analog system where timing matters to arbitrary precision, but what counts as good enough is really just a question of context. Good enough to test new drugs for depression? No, though good enough to drive a car is a reasonable goal.
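To put that ~120 m/s figure next to electrical signalling, a quick latency comparison; the distance and the wire speed are assumed round numbers:

    # Rough latency comparison: fast myelinated nerve fibre vs an electrical
    # signal in a conductor. All speeds and distances are assumed round numbers.
    nerve_speed = 120.0      # m/s, upper end of human nerve conduction velocity
    wire_speed = 2e8         # m/s, roughly 2/3 the speed of light, typical in a conductor
    distance = 1.0           # m, roughly spinal cord to fingertip

    nerve_latency = distance / nerve_speed     # ~8.3 ms
    wire_latency = distance / wire_speed       # ~5 ns

    print(f"nerve: {nerve_latency * 1e3:.1f} ms over {distance} m")
    print(f"wire : {wire_latency * 1e9:.1f} ns over {distance} m")
    print(f"ratio: {nerve_latency / wire_latency:.1e}x slower")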

Sure, if you're willing to ignore eight or nine orders of magnitude of efficiency.
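A rough version of that arithmetic, with both wattages as assumed round numbers:

    import math

    # Power comparison behind the "orders of magnitude of efficiency" remark.
    # Both wattages are assumed round numbers.
    brain_watts = 20.0                   # oft-quoted estimate for the human brain
    h100_watts = 700.0                   # nominal board power for one H100
    cluster_watts = 1e6 * h100_watts     # 1e6 H100s

    ratio = cluster_watts / brain_watts  # ~3.5e7
    print(f"~{ratio:.1e}x the power, i.e. ~{math.log10(ratio):.1f} orders of magnitude")

That's the raw power gap; per useful "operation" it comes out larger or smaller depending entirely on what you decide a brain operation is.
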
It just shows how little people understand outside their area of expertise that so many think what the brain does is in any way like what a computer does.

The brain is a self conscious biological control system. Incidentally it can do some computing.

A computer is a general purpose digital combinatorics machine that generates a constant output for constant input.

The brain is as if you wrote your software in microcode on a real-valued analog computer. The brain does not use an instruction set. Instead it uses a complex spaghetti of somewhat self-modifying microcode originating in a self-modifying (over longer time scales) genetic blueprint.

Most importantly for human and animal intelligence, the brain is motivated in certain directions by emotions. Without motivation, intelligence does not exist. One could say intelligence is the application of real control toward a goal.

So you rule out the possibility that I rewire all neural inputs and outputs into a computer instead of a human brain, and that it then behaves as competently as a biological human?
Computers are fast, not smart.
(Am I the only one who gags at the word compute being used as a noun? I can’t be the only one.)
It happens often enough in linguistics to have its own name, a deverbal noun. Have a drink, go for a walk, and I guess now "has more compute". As far as modern English language development goes, it seems better than "skibidi toilet".
It’s been pretty standard jargon in the industry for as long as I’ve been around.
I’ve been in the industry since 1982. Silicon Valley during the 90’s. Taught and consulted in 26 countries, including for HP and Intel. And the first time I heard compute used as a noun was when ChatGPT came out.

Calling a program “a code,” as in “I wrote a code for that,” is common usage at Los Alamos and Sandia labs, which I discovered when I taught there. So I have encountered some linguistic oddities in my career. Maybe “compute” has been thriving in the AI world and I haven’t been there to notice?

That's a key learning from this discussion.
Your brain is continuously performing an astonishing number of calculations; you just don't realize it.
The brain does not perform "calculations" in the way a digital computer does; in fact, the computer doesn't actually calculate anything either. Calculating is an act of intention, and computers do not have intention. Rather, using a fixed and stored instruction set, humans can use the combinatorial networks of computer microcode to generate a certain deterministic result. The flow is one path: human intention -> software -> combinatorial machinery -> result. One input, one output.

In contrast the brain takes multiple inputs and transforms them into controlled action in the real world in the service of goals. This action manifests as multiple complex outputs. The single neuron is a complicated analog computer. The brain is a functional network of 100 billion neurons or whatever the number is.

This is also true of my cat
True for anything with a brain, even a mosquito although a bit less astonishing.
The human mind needs help in the form of augmentation of its capabilities and capacity, but I am not convinced invasive implants are the way, for two reasons: 1) you’ll need brain surgery every time you decide to change hardware for any reason, which will suck if you’re immunocompromised, which can also happen naturally as you age; 2) invasive hardware will be hard to remove at will.

I think there are different ways to augment human cognition without implants, and these less invasive methods need to be considered first. One simple and tractable method, which may be easy to implement but hard to perfect, is to create a mechanism for offloading tasks as a learned “thought process” to any number of autonomous agents called on demand. If the system relies on vocalization, then it’s no different from voice assistants, whereas if it relies on multi-step prompting, then it’s not reducing the cognitive load of the person using it. There are various nuances even for this relatively simple concept, but I think it’s one of the more doable research quests.
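A very rough sketch of what that might look like as an interface; every name here is hypothetical, there is no existing library behind it:

    # Hypothetical sketch of a cognitive-offload dispatcher: a "thought process"
    # is a named, reusable routine that can be handed to an autonomous agent on
    # demand. Nothing here refers to a real library; it only illustrates the idea.
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class ThoughtProcess:
        name: str
        steps: list[str]                      # the learned routine, as plain-language steps

    @dataclass
    class Agent:
        run: Callable[[ThoughtProcess, str], str]

    @dataclass
    class OffloadDispatcher:
        agents: list[Agent] = field(default_factory=list)
        library: dict[str, ThoughtProcess] = field(default_factory=dict)

        def learn(self, process: ThoughtProcess) -> None:
            self.library[process.name] = process   # store the routine for later reuse

        def offload(self, name: str, context: str) -> str:
            # Hand the stored routine plus current context to an available agent.
            process = self.library[name]
            return self.agents[0].run(process, context)

    # Usage: teach the dispatcher a routine once, then invoke it without re-prompting.
    dispatcher = OffloadDispatcher(agents=[Agent(run=lambda p, ctx: f"ran {p.name} on {ctx}")])
    dispatcher.learn(ThoughtProcess("triage-email", ["scan inbox", "flag urgent", "summarise"]))
    print(dispatcher.offload("triage-email", "today's messages"))

The design question is exactly where the "learning" of the routine happens, and how invocation stays below the cognitive cost of just doing the task yourself.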

Subtly designed subvocal sensors, paired with an earpiece providing LLM output, could potentially be pretty powerful as a wearable. Not as invasive as glasses-type implementations: no camera or screen needed at all, really.

https://en.m.wikipedia.org/wiki/Subvocal_recognition

Maybe, but a wireless SVR would offer more flexibility and more normal use. A “thought process” could also be composed from a language of neurophysiology, like blood flow and hand/foot movements.
The average human mind has the capacity to hold at best 7 facts together in working memory before reaching cognitive load limits. What’s amazing is that we’ve achieved as much as we have despite current limitations.
Or the average human mind is fully equipped as is — seems like that has been proven over the millennia
Is there anything noteworthy in this article? It seems to be pure conventional wisdom. It could have been written by ChatGPT.

Brain function is important to the economy? Not using your brain might make it less functional? No way, man. I'm spinning in my chair.

> It could have been written by ChatGPT.

Wouldn't that be the ultimate irony?

There's also the concern that a bout of mild COVID costs about 3 IQ points.[1] The damage may be cumulative, but nobody seems to be following up on that, or tracking re-infection data, any more. What will this mean over 20 years?

[1] https://www.scientificamerican.com/article/covid-19-leaves-i...

Good opportunity to remember how fantastically amazing the human brain is: 86 billion neurons, each connected to thousands of other neurons, resulting in trillions of connections. Our most advanced research efforts struggle to understand even a tiny scanned cube of it.

The most complex structure we know of and we've all got one.
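The arithmetic behind "trillions of connections", using commonly quoted rough estimates:

    # Rough synapse-count arithmetic; both figures are commonly quoted estimates,
    # and real values vary widely by brain region and by study.
    neurons = 86e9
    synapses_per_neuron = 7_000

    total_synapses = neurons * synapses_per_neuron
    print(f"~{total_synapses:.1e} connections")   # ~6e14, i.e. hundreds of trillions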

Yes, but the main reason doesn't appear in the article.
Screen time.
It goes deeper than that. If you always have someone or something to bail you out, why do you need to develop independence?
Au contraire, if you haven't anyone or anything to bail you out, how can you justify the risk of innovating?

Man is not an island.

What risk? To whom would you justify it?
Welcome to HN, Mr. Thoreau
...whose mom showed up at Walden Pond every now and again to help with laundry and such.
Specifically when the screen contents are controlled by people who want you as addicted as possible and don't care if it's ruining your life.
[flagged]
Monogamous societies were the most successful societies in human history, yet you claim that they should be inherently dysfunctional, because of dysgenics.
Taboos on incest and cousin marriage are what actually made the difference there, along with harsh winters
Not so forbidden a topic that you can't cite some sources, I hope.
I'm not getting paid to do your homework for you