One thing I lament is the decline of long-term, unfettered research across the industry. I’ve witnessed more companies switching to research management models where management exerts more control over the research directions of their employees, where research directions can abruptly change due to management decisions, and where there is an increased focus on profitability. I feel this short-term approach will cost society in the long term, since current funding models promote evolutionary work rather than riskier, potentially revolutionary work.
As someone who wanted to become a researcher out of curiosity and exploration, I feel alienated in this world where industry researchers are harangued about “delivering value,” and where academic researchers are pressured to raise grant money and to publish. I quit and switched to a full teaching career at a community college. I enjoy teaching, and while I miss the day-to-day lifestyle of research, I still plan to do research during my summer and winter breaks out of curiosity and not for career advancement.
It would be great if there were more opportunities for researchers to pursue their interests. Sadly, though, barring a cultural change, the only avenues I see for curiosity-driven researchers are becoming independently wealthy, living like a monk, or finding a job with ample free time. I’m fortunate to have the latter situation where I have 16 weeks per year that I could devote outside my job.
I came to the same conclusion. This is the path I'm following (trying to set up a company and lean FIRE). It's sad in a way because those efforts and years could have been directed to research but we have to adapt.
That’s what a “scholar” is and Universities provided the perfect environment for that to thrive, which is no longer the case.
In post-WW2 America, though, there was increased funding from the state, so large research universities, institutes, and national labs could be created. In the era when all of that was working at full speed, "big scientific breakthroughs" came at such a pace that it became hard to see what was big and what wasn't.
Then I read this: http://mmcthrow-musings.blogspot.com/2020/10/interesting-opi...
I think the alt-economy that you describe may turn up soon. At least the one that I'm imagining, which doesn't involve registering for Substack.
Don't low interest rates promote long-term thinking, perhaps to an absurd degree (e.g. the "it's okay that we hemorrhage money price dumping for 10 years as long as we develop a monopoly" playbook)? Bigger interest rate = bigger discount for present value of a future reward.
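A quick back-of-the-envelope illustration of that discounting (a sketch with made-up numbers, not anything from the thread):

    # Present value of a payoff received `years` out, discounted at `rate`:
    #   PV = FV / (1 + r)^n  -- a higher rate shrinks far-future rewards more.
    def present_value(future_value: float, rate: float, years: int) -> float:
        return future_value / (1 + rate) ** years

    for rate in (0.01, 0.05, 0.10):
        print(f"r = {rate:.0%}: ${present_value(100_000_000, rate, 10):,.0f}")
    # r = 1%: ~$90.5M    r = 5%: ~$61.4M    r = 10%: ~$38.6M

At 1%, a $100M payoff ten years out is worth nearly face value today; at 10%, barely a third. So cheap money does make ten-year bets look better on paper.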
Bell labs came about when AT&T was the monopoly telephone provider in the US.
PARC happened when Xerox had a very lucrative monopoly on copy machines.
On a smaller scale, there is The Institute for Advanced Study where curiosity-driven research is encouraged, and there is the MacArthur Fellowship where fellows are granted $150,000 annual stipends for five years for them to pursue their visions with no strings attached. Other than these, though, I’m unaware of any other institutions or grants that truly promote curiosity-driven research.
I’ve resigned myself to the situation and have thus switched careers to teaching, where at least I have 4 months of the year “off the clock” instead of the standard 3-4 weeks of PTO most companies give in America.
$10 billion in yearly losses for something that by all accounts isn't close to magically becoming profitable. It honestly just seems like something he thinks is cool and therefore dumps money into.
To be fair, though, Facebook (I mean, Meta) is a publicly traded company, and if the shareholders get tired of not seeing any ROI from Meta's VR initiatives, that could compel Zuck to stop. Even Zuck isn't free from business pressures if the funding is coming from Meta and not out of his personal funds.
Back to Bell Labs and Xerox PARC: my understanding of how they worked is that while management did set the overall direction, researchers were given very wide latitude when pursuing this direction, with little to no pressure to deliver immediate results or to show that their research would lead to profits. Indeed, at one point AT&T was forbidden by the federal government from entering businesses outside of its phone business, and in the case of Xerox PARC, Robert Taylor was able to negotiate a deal with Xerox executives under which they wouldn't meddle in the affairs of PARC for the first five years. (Once those five years ended, the meddling began, culminating in Bob Taylor's famous exit in 1983.)
Since he has 57% of the votes, he can tell everyone to pound sand.
If that becomes the case, Meta gets 30% of the revenues associated with it.
If it does not, I'm pretty sure they can now make good smartphones and even have a dedicated OS. I'm pretty sure they can find a way to make money with it.
A Meta Quest 3S in itself is an insane experience for €330, and its current main disadvantages for gaming are the lack of players and the catalogue size. Even using it as a main monitor with a Bluetooth keyboard is "possible". I would have found that improbable a few years ago, even as an enthusiast; now I could totally imagine a headset replacing my screen in a few years, with a few more improvements.
But since the days of Bell Labs, haven't we greatly improved our ability to connect a research concept to the idea of doing something useful, somewhere?
And once you have that, you can be connected to grants or some pre-VC funding, which might suffice, given that the tools we have for conceptual development of preliminary ideas (simulation, for example) are far better than what they had at Bell?
As big of a fan as I am of Xerox PARC and Bell Labs, I don't want to come across as saying that the Bell Labs and Xerox PARC models of research are the only ways to do research. Indeed, Bell Labs couldn't convert many of its research ideas into products due to the agreement AT&T made with the federal government not to expand into other businesses, and Xerox PARC infamously failed to monetize many of its inventions; many of its researchers left Xerox for companies that saw the business potential in their work, such as Apple, Adobe, and Microsoft, to name a few.
However, the problem with our current system of grants and VC funding is that it is not a good fit for riskier avenues of research where the impact cannot be immediately seen, or where the impact will take many years to develop. I am reminded of Alan Kay's comments (https://worrydream.com/2017-12-30-alan/) on how NSF grants require an explanation of how the researchers plan to solve the problem, which precludes exploratory research where one doesn't yet know how to attack the problem. Now, once again, this question from the NSF is not inappropriate; there are different phases of research, and coming up with an "attack plan" that is reasonable and is backed by a command of the prior art and a track record of solving other problems is part of research. All PhD programs have some sort of thesis proposal that requires answering the same question the NSF asks in its proposals. With that said, there is still the early phase of research where researchers are formulating the question and trying to figure out how they'd go about solving the problem. That early phase is part of research, too.
I think the funding situation for research depends on the type of research being done. For more applied research that has more obvious impact, especially business impact, then I believe there are plenty of opportunities out there that are more appropriate than old-school industrial research labs. However, for more speculative work where impacts are harder to see or where they are not immediate, the funding situation is much more difficult today compared to in the past where industrial research labs were less driven by the bottom line, and when academics had fewer "publish-or-perish" pressures.
In short, Bell Labs-style institutions not only require consistent sources of funding that only monopolies can commit to, but they also require stakeholders to believe that funding such institutions is beneficial. We don't have those stakeholders today, though.
[1] US weighs Google break-up in landmark antitrust case:
A classification which includes government funding, note.
On a tangent, but I think it's related: the curiosity, exploration, and research by kids may be stalling too.
Just a thought.
We need this. Like, really, we need someone to have created the Xerox PARC of the 21st century, somewhere around 20 years ago.
I honestly thought Google would be that, but apparently it's easier to fund R&D on "selling copying machines" than on "selling ads". Maybe "selling ads" earns _too much_ money? I don't know.
I know, I know, DeepMind and OpenAI and xAI are supposed to fix climate change any time soon, and cure cancer while they invent cold fusion, etc., etc. ... and it's only because I'm a pessimistic myopist that I can only see them writing fake essays and generating spam. Bad me.
Still. Assuming I'm really grumpy and want to talk about people doing research that affects the physical world in a positive way: who's doing that on the scale of PARC or Bell?
The government also has always kept academia afloat. It is a privilege afforded to professors to believe they do not work for the state, but they do.
Great government and academic jobs forced companies to create these labs, where it was better to hire great people and "lose" some hours to them doing whatever they wanted (which was still often profitable enough) than to have zero great people. Can you imagine Claude Shannon putting up with the stuff software engineers deal with today?
The other main change is that how to run big companies has been figured out well enough that “zero great people” is no longer a real survival issue for companies. In the 1970s you needed a research level of talent but most companies today don’t.
One of the things that has changed since the 1990s is the ending of the Cold War. The federal government still has national laboratories, DARPA, NASA, the NSF, etc. However, the general culture has changed. It’s not that technology isn’t revered; far from it. It’s just that “stopping Hitler,” “beating the Soviets,” and grand visions for society have been replaced with visions of creating lucrative businesses. I don’t hear about the Oppenheimers and von Neumanns of today’s world, but I hear plenty about Elon Musk and Sam Altman, not to disrespect what they have done (especially with the adoption of EVs and generative AI, respectively), but the latter names are successful businessmen, while the former names are successful scientists.
I don’t know what government labs are like, but I know that academia these days has high publication and fundraising pressures that inhibit curiosity-driven research, and I also know that industry these days is beholden to short-term results and to pleasing shareholders, sometimes at the expense of the long term and of society at large.
Sadder still is the underlying situation behind this: the fact that there's nothing of even remotely comparable significance happening in the public sphere for such minds to devote themselves to, as those men did. Even though the current civilizational risk is, if anything, significantly greater than in their time.
Until then, it doesn't have the property of being significant.
These innovations in LEDs, battery technology, and low-power, high-performance microchips with features measured in numbers of atoms are extraordinary, and seemingly taken for granted.
Then we also have medicines that can even bend one's desire to overeat or drink alcohol, not to mention better vaccines, cancer therapies, and so on and so forth.
Vaccines are in a golden age, except political assholes are stoking ignorance and rejection.
In fact the two are advancing together. Bespoke vaccines for your cancer are in trials now.
I'll be sure to call you when I meet the first real person cured of cancer by immunotherapy. So far I have only met people who died of cancer. But if you tell me we're in a "golden age", I suppose it will be over soon.
Before the anti-vax lunatics moved in and captured the narrative in social media, President Trump’s extensive funding and regulatory acceleration of the COVID vaccines saved many thousands of lives. The design of the vaccine was done in days and available in weeks. That’s a golden age.
I guess it's a case where the future is there, but we badly need to get it "more evenly distributed".
Any kind of societal grand vision we had has been falling apart since about 1991. Slowly at first (all the talk about what to do with the "peace dividend" we were going to get after the fall of the Soviet Union), then faster with the advent of the internet, and faster still when social media came on the scene. We no longer have any kind of cohesive vision for what the future should look like, and I don't see one emerging any time soon. We can't even agree on what's true anymore.
> I don’t know what government labs are like
Many of these are going to be in danger in the next administration especially if the DOGE guys get their way.
We’ve seen this before with Thomas Edison.
Universities are tripping over themselves to create commercialization departments, and every other faculty member in departments that can make money (like CS) has a private company on the side. Weird that when these things hit, though, the money never comes back to the schools.
Universities put a lot of pressure on faculty to win grants, and take 60-70% of the proceeds for "overhead", which is supposed to fund less sellable research and provide job security but is, in practice, wasted.
You have to be a fundraiser and a seller if you want to make tenure, but if people are forced to basically put up with private sector expectations, can you fault them when they decide to give themselves private sector pay?
You can bet this spending is going to be among the first things slashed by DOGE-like efforts ("Scientists? They're just liberal elites wasting our hard-earned money researching vaccines that will change your dog's gender in order to feed it to communist immigrants.")
I suppose I could be cheered up by the irony, but, not today.
I'm pretty sure Google Brain was exactly what you are looking for. People like to think of DeepMind, but honestly, Brain pretty much had the Bell Labs/PARC strategy: they hired a bunch of brilliant people and told them to just "research whatever it is you think is cool". And think of all the AI innovations that came out of Brain and were given to the world for free: Transformers, Vision Transformers, diffusion models, BERT (I'd consider that the first public LLM), Adam, and a gazillion other cool things I can't think of right now... Essentially, all of the current AI/LLM craze started at Brain.
Right now the world needs GWh batteries made of salt, cheap fusion from trash, telepathy, a cure for cancer, and a vaccine for the common cold, but in the meantime, advertisers can generate photos for their ads, which is _good_, I guess?
Humanity’s natural state is abject poverty and strife. Look at any wealth graph of human history and note how people are destitute right up until the Industrial Revolution, and then the graph explodes upward.
In a way we (well, especially the West) are already living in utopia. You’re completely right that we can still vastly improve, but look back at the progress we already made!
"Hey, xchatgclaudma, please conjure up time and energy out of thin air ?"
I mean brain-machine interfaces have been improving for quite a while.
Telepathy might even already exist.
Silicon Valley hippies have been replaced by folks focussed on monetisation and growth.
It’s not great for the West, but those problems are being tackled. We just don’t get to read about it because "China bad" and because of the fear of what capital flight might do to arguably inflated US stock prices.
https://www.energy-storage.news/byd-launches-sodium-ion-grid...
I don't know why it seems bold. So many signs.
[0] This was in the early '90s, when $1000 went a long way.
I couldn't disagree more, but perhaps the time I was there (late 90s) was different.
There are companies that push many various technologies.
The Samsung conglomerate does everything; Intel does hard things (semiconductor research, manufacturing) and soft things (computer science/software).
Maybe we're at the point where you need to specialize in one industry, so achieving the variety of things they did at Bell is harder?
When we look back in 20 years, things like the transformer architecture, AlphaFold (which just won a Nobel prize), and Waymo are going to have improved the world in a positive way as much as anything Bell Labs did, and certainly more than PARC.
Google has put quite a bit of resources into quantum computing research. It's not just for selling ads, though I have no doubt that it will be used for that among other things. But right now there's still no guarantee it's going to pay off at all.
I really want governments around the world to take back governance so that it's solely for the benefit of the people. None of this greedy, corrupt lobbyist stuff.
So they could create Unix, but they weren't allowed to profit off of it. So they just gave it away, because why not.
And it was all done, apparently, at least in the beginning, because they hired smart people and they let them do what they wanted.
Monopoly may have helped them pay for such R&D, but vertical integration is what made it possible for so much R&D to be relevant to the business.
> RCA Laboratories/the Sarnoff Research Center is surely one of the most important of the American corporate labs with similarities to Bell Labs. (It features prominently in Bob Johnstone's We Were Burning https://www.hachettebookgroup.com/titles/bob-johnstone/we-we... : it has a big role in the history of the Japanese semiconductor industry, in large part because of its roles in the development of the transistor and the LCD and its thirst for patent-licensing money.)
>> In Dealers of Lightning, Michael Hiltzik argues that by the 1990s PARC was no longer engaged in such unrestricted research decoupled from product development.
> According to Hiltzik and most other sources, the PARC Computer Science Lab's salad days were over as early as 1983, when Bob Taylor was forced to leave, while the work of the other PARC labs focussed on physics and materials science wasn't as notable in the period up to then.
Seriously: if this kind of thing interests you at all, go and read We Were Burning.
Kind of a strange statement. Fairchild took the "traitorous eight" from Shockley Semiconductor, which was founded by William Shockley, who famously co-invented the transistor at Bell Labs (and who named the "traitorous eight" as such.)
So while Fairchild "didn’t operate anything like a basic research lab", its co-invention of the IC was not unrelated to having a large amount of DNA from Bell Labs.
However, it should be seen as a starting point! Alternative hypothetical pasts and futures abound. One issue is that the stuff from the past always looks more legendary seen through the lens of nostalgia; it's much harder to look at the stuff around you and to go through the effort of really imagining the thing existing.
So that's my hypothesis - there isn't a smaller volume of interesting stuff going on, but viewing it with hope and curiosity might be a tad harder now, when everyone is so "worldly" (i.e., jaded and pessimistic).
Proof:
https://worrydream.com/ (Bret Victor)
and the other people doing dynamicland and realtalk, both discussed straightforwardly here:
https://dynamicland.org/2024/FAQ/
https://solidproject.org/about -- solid, tim berners-lee and co, also.
https://malleable.systems/catalog/ -- a great many of the projects here are in the same spirit, to me, as well!
https://spritely.institute/ -- spritely, too
https://duskos.org/ -- duskOS, from Virgil Dupras
https://100r.co/site/uxn.html -- 100 rabbits, uxn, vibrating with new ideas and aesthetics
https://qutech.nl/ -- quantum research institute in the netherlands, they recently established a network link for the first time I believe
etc etc. These are off the top of my head, and I'm fairly new to the whole space!
AT&T provided for most of its history, the best quality telephone service in the world, at a comparable price to anyone else, anywhere.
There were structural issues with the AT&T monopoly however, for example cross subsidization - the true cost of services was often hidden because they would use optional services (like toll calling) to subsidize basic access, and business lines would cross subsidize residential service.
The level that AT&T fought foreign connections (aka, bring your own phone), probably hastened their demise, in the end, the very technologies that AT&T introduced would turn long distance from a high margin, to low margin business - the brass at AT&T had to know that, but they still pinned the future of their manufacturing business on that - a manufacturing business that had never had to work in a competitive environment, yet was now expected to - because of this and other factors divestiture was doomed to failure.
I'm a believer in utilities being a natural monopoly, but AT&T was an example of effective regulatory capture, it did not, and does not have to be this way, however it was.
When the decisions were made about divesiture, that bit was non obvious.
I can think of: AT&T, DuPont, Kodak, Xerox PARC, Westinghouse, IBM, GE, the original Edison labs (best as I can tell acquired by Western Union), Microsoft, Rockefeller University, Google Research.
Of notable industries and sectors, there's little I can think of in automobiles, shipping, aircraft and aviation (though much is conducted through NASA and the military), railroads, steel (or other metals/mining), petroleum, textiles, or finance. There's also the Manhattan Project and the energy labs (which conduct both general energy research and, of course, much weapons development).
(I've asked similar questions before, see e.g., <https://news.ycombinator.com/item?id=41004023>.)
I'd like to poke at this question in a number of areas: what developments did occur, what limitations existed, where private-sector or public/government/academic research was more successful, and what conditions led to both the rise and fall of such institutions.
Various advanced computer facilities: UCSD, the University of Illinois Urbana-Champaign, and probably others at Carnegie Mellon, Georgia Tech, and elsewhere.
It has been fun spending a few years building my vision ( https://www.adama-platform.com/ ), but it's hard to communicate it all. I've come to the conclusion that the real magic that makes research work is having a place where people can collaborate and work together in a shared culture.
At this point, I don't want to hire people into my endeavor, and I can't seem to find a co-founder to do commercialization right.
One thing to consider is that Bell Labs didn't innovate for altruistic reasons like furthering the human race or scientific understanding. They innovated to further AT&T's monopoly and to increase shareholder value. This doesn't seem that different than what Meta, Google, NVIDIA, etc. are doing. Maybe in 10-20-30 years we will view the research that modern tech companies are doing through the same lens.
Although, I do admit that the freedom with which these scientists and engineers were able to conduct research is something special. Maybe that's the real difference here.
Rather than p(r)aying for the smartest people who have ever been born, design a corporation that can have the average high school dropout work in R&D and you will print money, innovation and goodwill.
[1] The Art of Doing Science and Engineering:
https://press.stripe.com/the-art-of-doing-science-and-engine...
1947 was a magical year. That announcement had profound implications. They effectively invented something that would replace the huge existing base of vacuum-tube components with miniaturized transistors. This miniaturization phase significantly influenced von Neumann's recommendation for the ballistic missile program. Many of the discrete-component systems manufactured during this time remained in service into the 1980s.
This is a photo of a D-17B guidance computer that was deployed on the Minuteman missile in 1962, 15 years after the invention of the transistor; it was typical of military printed circuitry at the time for general-purpose computers, disk/drum storage drives, and printers.
https://upload.wikimedia.org/wikipedia/commons/3/38/Autoneti...
"The D-17B weighed approximately 62 pounds (28 kg), contained 1,521 transistors, 6,282 diodes, 1,116 capacitors, and 5094 resistors. These components were mounted on double copper-clad, engraved, gold-plated, glass fiber laminate circuit boards. There were 75 of these circuit boards and each one was coated with a flexible polyurethane compound for moisture and vibration protection."
Share the why, not (just) the what.
These organizations employed too many people of relatively mediocre ability relative to output, leading to waste and eventual disbandment. Today's private-sector companies in FAMNG+ are making bigger breakthroughs in AI, apps, self-driving cars, etc. with fewer people relative to population and more profits. This is due to more selective hiring and performance metrics. Yeah, those people from the '60s were smart, but today's STEM whiz kids are probably lapping them.
> Unless you’re talking about advanced PR and market manipulation techniques to capture and retain ad revenue
Those very much _are_ the goals at those enterprises.
I have hopes of a resurgence of operations research and linear optimisation as goods in themselves: we could be plotting more nuanced courses in dark waters of competing pressures. Decision-support systems across many fields would remove subjective, politicised pressures.
Linear programming, and even integer linear programming, are pretty well solved, practically speaking.
This stuff, while old, is not routine for decision makers. They don't seem to grok how to formulate the questions and the choices.
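To make that concrete, here's a minimal sketch of what "formulating the question" looks like as a linear program, using SciPy's linprog (all the numbers are hypothetical):

    # Toy production-planning LP: maximize 3*x1 + 5*x2 (profit)
    # subject to  x1 <= 4,  2*x2 <= 12,  3*x1 + 2*x2 <= 18,  x1, x2 >= 0
    from scipy.optimize import linprog

    c = [-3, -5]                      # linprog minimizes, so negate the profit vector
    A_ub = [[1, 0], [0, 2], [3, 2]]   # resource usage per unit of each product
    b_ub = [4, 12, 18]                # resource availability
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None), (0, None)], method="highs")
    print(res.x, -res.fun)            # optimal plan (2, 6) with profit 36

The solver is the easy part; the skill decision makers lack is translating "which products do we make, given limited machine hours?" into that c, A_ub, b_ub form in the first place.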
Those weren’t really the topics people were interested in at the time (depending on your definition of AI).
The shoulders of giants, as they say.