Wake Up, You've Been Asleep for 50 Years

Nintil suggests: “Maybe there was a monocausal event in 1970.”

Many graphs to follow, but first:

And of course, WTF Happened in 1971.

Constitutional Amendments [1]

Energy Use

GDP Doubling Time

Peer Review

Donald Braben, author of Scientific Freedom: “academic research before about 1970 was essentially unmanaged”

Leaded Gas

Oil Shock

Inflation

Male Income

R&D’s Share of the Federal Budget

Senate Filibusters

See Also:
Scott Alexander: Wage Stagnation: Much More Than You Wanted To Know
Scott Alexander: 1960: The Year The Singularity Was Cancelled
Noah Smith: How the 1970s Changed the U.S. Economy

Footnotes
[1] I’m cheating a bit here. The 27th amendment was proposed in 1789, but ratified in 1992. So we have ratified something since 1971, we just haven’t ratified anything written since then. Having said that, it’s a really boring amendment.

Interpretation
Okay, the obvious objection here is that it’s easy to cherry-pick examples for any 5-year period. But is it actually? Maybe you could do WWII, but that wouldn’t be weird, that would just be WWII. This is notable because the events seem largely unrelated.

But you might be right, and it’s possible this is all just noise. I’m genuinely unsure.

A compelling explanation would be that a lot of this is related. Energy use slowed because GDP per capita slowed. Inflation skyrocketed because we got off the gold standard. We stopped passing constitutional amendments because of the filibusters.

While compiling this post, I came across several mentions of Mike Mansfield. He’s responsible for changing the filibuster bylaws (1970), prohibiting military funding of research without a direct military application (1969), and limiting DARPA’s scope (1973). DARPA, of course, was responsible for both ARPANET and the Mother of All Demos. For more on why this matters, see Steve Blank’s Secret History of Silicon Valley, summarizing the influence of military funding on seemingly unrelated innovations.

So sure, there are some specific causes we can point to.

But just imagine being around in these years. In 1962 JFK announces we’re going to the Moon, and a mere 7 years later we’re there. That very same year, the internet comes out. 3 years later we have video games, and a year after that, cell phones.

Of course, the average American wouldn’t really have cared. The internet existed in some lab, not in your home. Maybe it’s the same situation today. We have CRISPR, but it hasn’t yet had a big impact on our lives. Google announced quantum supremacy, but I won’t have a quantum computer for a long time. OpenAI’s models can synthesize shockingly good music, but nothing I would actually listen to. It’s possible that 50 years from now we’ll look back at 2018–2023 as an incredible, almost miraculously coincidental period of innovation.

Maybe.


[Edit: 2021/02/19] From Tyler Cowen:

The break point in America is exactly 1973 and we don’t know why this is the case. It’s often argued that 1973 is the breakpoint because the price of oil goes up a good deal because of OPEC and the embargo, that might be true. But since that time, the price of oil in real terms has fallen a great deal and productivity has not bounded back. But at least in the short term, that seems to be the relevant shock.

No Revival for the Industrial Research Lab

Bell Labs is dead. It remains dead. And we have killed it.

In fact, even the concept seems to be dead. The top result in Google for Industrial Research Lab is a retrospective history. Pluralize the query as Industrial Research Labs and the top result is a 1946 index of the many labs that used to exist. Try Corporate Research Lab and you’ll get the cheery article The Death Of Corporate Research Labs.

In this last piece, Rosenthal determines the cause of death:

Lack of anti-trust enforcement, pervasive short-termism, driven by Wall Street’s focus on quarterly results, and management’s focus on manipulating the stock price to maximize the value of their options

Looking backwards, this is a tragedy. How shall we comfort ourselves, the murderers of all murderers?

But looking forward, it’s an immense opportunity! So long as we remedy the underlying causes, the great industrial lab may rise again. We may already be on the right track. Antitrust pressure is rising, and privately held companies not subject to Wall Street’s demands are booming.

What about short-termism? Via Tyler Cowen, Warren (2014) finds “no clear evidence of flawed short-term oriented management practices”.

This is good news! The industrial research lab has fallen, but with the causes gone, it will rise again.

As Ben Southwood concludes in his piece for Works in Progress: “we will see the return of various large in-house labs.” [1]


I am not so optimistic. Regardless of the antitrust situation, industrial research labs will not return.

Southwood’s and Rosenthal’s findings are both derived from Ashish Arora, Sharon Belenzon, et al.'s The Changing Structure of American Innovation, worth reading in full. As its authors conclude:

It seems unlikely that corporate research will rediscover its glory days… For some time, quick wins from low-hanging fruit (such as optimizing auction or advertising formats) may cover up the problem, but the fundamental challenge of managing long-run research inside a for-profit corporation remains a formidable one… incumbent firms continue to rely on outside inventions to fuel their growth. In the longer run, therefore, university research will remain the principal source of new ideas for such inventions. [emphasis mine]

In other words: why bother innovating when you can let someone else do it for you? Firms will still engage in the development half of R&D, but this will take the form of translating existing findings into products, rather than breakthrough fundamental research. Arora & Belenzon again:

In summary, the new innovation ecosystem exhibits a deepening division of labor between universities that specialize in basic research, small start-ups converting promising new findings into inventions, and larger, more established firms specializing in product development and commercialization (Arora and Gambardella, 1994). Indeed, in a survey of over 6,000 manufacturing- and service-sector firms in the U.S., Arora et al. (2016) find that 49 percent of the innovating firms between 2007 and 2009 reported that their most important new product originated from an external source.

…As a result, federal research dollars for the university sector grew from an estimated level of $420 million (1982 dollars) in 1935-1936 to more than $2 billion (1982 dollars) in 1960 and $8.5 billion in 1985. Between 1960 and 1985, the share of university research of GNP grew almost twofold from 0.13 to 0.25… Corporate labs historically operated in an environment where university research and start-up inventions were scarce.
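Spelled out, the multiple implied by those quoted figures (a back-of-the-envelope check, taking the real 1982-dollar amounts at face value):

$$\frac{\$8.5\text{ billion (1985)}}{\$2\text{ billion (1960)}} = 4.25$$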

That is the 4.25x increase from 1960 to 1985 that Arora & Belenzon cite. What about the years since? Courtesy of the NSF:

Federal funding has continued to increase rapidly, up another 3.5x since 1985. For its part, institutional funding (meaning internal funding in the form of endowments, gifts and so on) has grown over 6x. Faced with this bounty, it’s no wonder firms lost their appetite for footing the bill themselves.
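Chaining the two federal multiples together gives a rough sense of scale (this assumes both figures are stated in comparable real terms, which I haven’t verified):

$$4.25 \times 3.5 \approx 15$$

That is, federal funding for university research has grown roughly fifteenfold in real terms since 1960.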


To some extent, this is great news! The industrial research lab may be dead, but that doesn’t mean innovation is over; it’s just coming from universities instead. More funding means more science can get done, and who cares whether it happens in industry or academia?

Unfortunately, the research we see today is of a different nature. In a section titled “Inventions originating from large corporate labs are different”, Arora & Belenzon enumerate the kinds of innovations we’ve lost in the shift towards university labs:

  • Corporate labs work on general purpose technologies
  • Corporate labs solve practical problems
  • Corporate labs are multi-disciplinary and have more resources

Again, the paper is worth consulting for full details, but it suffices to say that different mechanisms for attracting, nurturing, and managing talent will result in different types of outputs. None of this is to say that the corporate labs were on balance superior to today’s university labs, merely that we are missing out on some innovations (and likely getting others in return).

More sensationally, it’s worth worrying that there’s been a profound exodus of top talent from research into industry. As Patrick Collison describes:

One thing that I think is underemphasized is that [corporate research labs] competed on the basis of compensation. They just paid more than other potential sources of employment. PARC’s strategy for aggregating the best computer scientists in the world was to pay them more than they would be earning in academia. And in the 70s, you couldn’t go to Google and earn millions of dollars a year, Silicon Valley hadn’t really left the launchpad.

To a significant extent, the same thing applies to Bell Labs. They were quite explicit that their strategy was to compensate really well and offer more favorable employment than academia… It could be the case that because there are so many high-return loci for super talented people to go and deploy their talents, you could never quite aggregate talent to the same extent.

…it’s hard to think of any major successes from these kinds of labs over the last 10 or 15 years. [lightly paraphrased for clarity]

As the lamentation goes, “The best minds of my generation are thinking about how to make people click ads.” It’s hard to prove that there has been a brain drain, but if so, it would indicate a strict loss in quality, as opposed to the more stylistic shifts Arora & Belenzon describe.


Bell Labs is dead, and it’s not coming back. Since corporate labs are uniquely suited to some kinds of valuable research, this loss is troubling.

Where does that leave us?

One option is to attempt a massive overhaul of the entire system. If corporate labs have dwindled in the face of the government-funded university system, we ought to redirect some portion of those tax dollars to industry instead and seek a better balance. To some extent, this already happens. DARPA awards funding to small businesses through its Small Business Innovation Research grants, as does the NIH.

Even here, corporate labs won’t have the same incentives they had in the glory days from 1940-1970. So long as the university system thrives, firms will pursue growth through innovations discovered externally.

In that light, I have a different proposal: instead of reviving the corporate research lab out of nostalgia, we should consider the more specific goals these labs accomplished, and then target them directly. If we are eager for more multi-disciplinary research, or research more closely tied to practical problems, we ought to build institutions to pursue those particular aims.

Even more broadly, we can think of institutions as just one of many possible mechanisms to allocate human capital. Nadia’s Helium Grants are also a mechanism for talent allocation, as is venture capital, and as is Substack.

The scientific world is not merely a collection of modules that produce research in hermetic isolation. Rather, it is better understood as an interconnected ecosystem. Increased profitability in software may cause cost disease elsewhere. An exciting new topic in one field could cause a genius exodus in another. Without this understanding of how it all interacts, attempts to recreate a single piece of the 1960s without the supporting context are doomed from the start.

We should seek to understand conditions as they exist today, think deeply about the particular aims we wish to satisfy, then design improved mechanisms within a contemporary environment.


Thanks to Ashish Arora and Nintil for their comments.


See also
Nintil – Fund People not Projects
Alexey – Reviving Patronage and Revolutionary Industrial Research
Nadia Eghbal – Seed stage philanthropy


Footnotes
[1] Ben Southwood read the same paper as Rosenthal and came to the opposite conclusion: the decline was the result of too much antitrust enforcement. The two only appear correlated because success driven by corporate research leads to antitrust. His full quote is “Perhaps antitrust bodies will be restrained, and we will see the return of various large in-house labs.”, but my summary of the conclusion as optimistic still stands.


Frequently Asked Questions

What about Google Quantum?
I’m horrifically unqualified to make this judgement, but this does stand out as an important achievement.

Jack Hidary of Google’s quantum computing initiatives once said: “We literally created a spreadsheet of the experts in this space. We only came up with 800 names globally.” So maybe this is the exception that proves the rule, and Google was able to succeed precisely because there is not a thriving university system for quantum computing.

It’s also worth understanding the achievement as a result of Google’s partnership with NASA, though I don’t know the details of each party’s contributions.

What about AI?
Again, I’m not really confident here, but as I understand it, breakthroughs in AI consist largely of scaling up existing techniques, or inventing new techniques to enable greater scale.

If that’s a fair summary, the apparent dominance of firms in AI research would seem to be a product of their outsized resources, namely compute and data.

Still, why invest in research instead of applying the findings from academia? My guess is that the field is just moving too quickly, and being even a year or two ahead makes a huge difference.

Does this present a promising blueprint for other fields? Maybe. There are other fields that could see dramatic progress powered largely by advances in compute. Or maybe other fields that are amenable to AI-driven progress sooner than we expect. Though even in these cases, I would not expect Google to take over unless the results translate easily into profits. Instead, we’ll likely continue to see Google partner with universities, while staying focused on their own core competence.

What about OpenAI?
OpenAI is a capped-profit LP governed by a non-profit, managed like a startup.

But okay, incorporation aside, why does it exist? It could be that AI Safety really is the primary concern, and OpenAI was founded by a small group of eccentric billionaires motivated by a contrarian research hypothesis. Or maybe that was once the ostensible excuse, and now it’s just a regular startup that bootstrapped talent agglomeration through hype.

I’m not sure, and I very much hope to read the history of OpenAI once someone (or something) writes it.

What about the D in R&D?
Although the dedicated in-house lab is dead, corporate R&D spending is not.

From Nicholas Bloom in his Conversation with Tyler:

The share of R&D in the US and Europe… funded by the government has been declining over time. In fact, in the US, when you go back to the '60s, roughly two-thirds of it is funded by the government and one-third by private firms. Now it’s the reverse.

According to the NSF, it might be more like 75% private firms:

(I assume “private firms” just means non-governmental, as opposed to “firms not listed on public markets”.)

So yes, corporate R&D spending is very high, but remember that it’s a broad category.

When I think about Bell Labs, I think about Claude Shannon’s Information Theory, a leap in basic research that powered the information age, although it wasn’t invented for any narrow purpose.

In contrast, the D in R&D stands for “development”, and development alone won’t yield this kind of fundamental breakthrough. Take a look at Google’s 10-K. The costs are broken down as:

  • Cost of revenues
  • Research and development
  • Sales and marketing
  • General and administrative
  • European Commission fines [for antitrust violations]

As Quartz describes, “Much of those costs [of revenues] were from the fees Google pays companies like Apple to be the default search engine on iPhones and other devices, which are called traffic acquisition costs.”

So the R&D line refers to Google’s actual published research and the cost of cutting-edge projects like their Quantum Computing initiative, but it’s also just the cost of hiring engineers to work on ads. So yes, corporate R&D spending is high by some measure, but a lot of it is development, and a lot of that development has nothing to do with what we think of as research.

Exactly how much is research and how much is development? I’m not sure, and in some cases, it’s not even clear where you would draw the line.

Progress Studies: A Discipline is a Set of Institutional Norms

In a world with Progress Studies, academic departments and degree programs would not necessarily have to be reorganized. That’s probably going to be costly and time-consuming. Instead, a new focus on progress would be more comparable to a school of thought that would prompt a decentralized shift in priorities among academics, philanthropists, and funding agencies. Over time, we’d like to see communities, journals, and conferences devoted to these questions.

Patrick Collison and Tyler Cowen, We Need a New Science of Progress

Contrary to the article’s title, what we have now is not a “Science of Progress”. It is at best a “Subculture of Progress”, but really, more like a subculture of demanding progress.

How do we bridge the gap?

When William James defined psychology as “the science of mental life”, he did not imagine the institution of academic psychology as it exists today. Psychology is much more than James’s broad charter. It is also an established standard of rigour (e.g. p < 0.05, placebo-controlled RCTs), a canonical body of established knowledge founded on that standard, and living practitioners tasked with upholding truth and banishing heresy.

The science of psychology relies also on living institutions. It is a set of journals, grant-making organizations, academic departments and conferences, all with their associated level of prestige.

And then there is folk knowledge: Taboos born from historical failures never to be questioned. Social threads that mediate relationships between practitioners. The particular cultures and subcultures that span those threads. Foundational assumptions held sacred. [1]

As Tyler once described it: “You need barely scratch the surface in our prevailing ideologies to find central questions almost completely unaddressed.”

Unless Progress Studies gains acceptance from the existing institutions, it must strive to build new ones in its name. Otherwise, it risks never ascending to become a genuine “Science of Progress”.

Be careful, however, not to cross into institutional role-play. It would be too easy to replicate the trappings of “real sciences” without any of the associated benefits. In a quest for legitimacy, we must be careful to ask who we’re seeking it from, lest all power stem from the same corrupted source.

Cargo Cult Science, Serious Social Science

To avoid cargo cult science, you have to first understand the purpose of the thing you’re trying to replicate. It is not enough to build something that looks like an airplane from the outside if you haven’t understood the engine itself.

As Aaron Swartz writes in Serious Social Science:

The first thing that comes is the numbers. Real science papers are filled with tables and graphs and regressions on piles of data, so the social scientists decide to do all that.

…This is not to say that there is anything intrinsically wrong with using math or jargon or making grand claims. But to adopt these habits reflexively is to put the means before the ends. Scientists do not use math because it is complicated but because, for what they are doing, it is effective.

No one has yet attempted to artificially imbue Progress Studies with mathematical complexity, but there have been other attempts to be more like a real science. Remember Jasmine Wang’s attempt to compile a canon of knowledge in the early days of Progress Studies? In the year since, that canon has gone largely untouched by today’s practitioners. [2] Though it still serves as an interesting reference, very little of Wang’s canon is actually widely cited. [3]

I’m prone to my own prescriptive behavior. This whole series is an exercise in trying to explain what Progress Studies ought to be and how it should function. But I’ll admit, these sorts of top-down efforts are unlikely to have much impact.

Instead, the shortest path to becoming a legitimate science is to:

  1. Publish good research
  2. Make the case that it could not exist under an existing field
  3. Label it Progress Studies

This is deceptively simple. In reality, the quality of research can only be judged within a particular institutional framework. We know work is good in other fields because it’s influential, highly cited, and revered by the associated scientific community. Despite our fondness for the term “independent researcher”, no such thing has ever been possible. [6]

Minimum Viable Norms

If we don’t need prestigious conferences or journals, what is required?

At a minimum, norms must:

  1. Enable substantive discourse
  2. Which in turn progresses the field
  3. Resulting in a coherent standard of quality
  4. Allowing us to publish good research and label it Progress Studies

Where are we currently in this process?

Without established standards of rigour: authors can go back and forth criticizing each other’s work without making any progress.

Without foundational assumptions: it is too easy to dismiss an entire body of research on grounds it is not even attempting to assert or contest.

Without technical jargon: complex concepts must be rehashed each time, or worse, deployed with different definitions to suit the context. [7]

We already see a bit of this happening. My exchange [8] with Noah Smith is admirable in some sense (at least we are replying to each other), but regrettable in another. I did not really engage with his arguments, but merely attempted to criticize the cultural moment he chose to partake in. In response, he dodged my meta-level objection to optimism as an inconsistent interpretation of the data, and instead chose to double down on his object-level claims.

In other cases, I’ve seen outsiders alienated by the entire concept of Progress Studies as relying on the naive assumption that “progress” is a good worth pursuing. Rather than being seen as challenging complacency, we’re accused of perpetuating the status quo. Though Tyler Cowen’s Stubborn Attachments attempts to serve as this foundational definition, it is a frequently misunderstood book, still lacking in proper exegesis.

Finally, our jargon is simply not well established, nor are its operationalizations. Just as Effective Altruism settled on the DALY, Progress Studies has attempted to coalesce around various measures of productivity and growth. Since Cowen dodged the question in Stubborn Attachments, opting instead for the nebulous Wealth Plus, we’ve turned to specific metrics like GDP and Total Factor Productivity. Unfortunately, both are poorly understood, and don’t proxy well for the kind of progress we actually care about. [9]

We may discover further along that more is required, but these three are a good place to start. In the coming months, it will be up to us to propose, experiment with, and coalesce around better norms.

As the Swartz piece concludes:

[It’s] unlikely that the existing disciplines can be reformed. Instead what is needed is a culture of serious social science built outside the existing systems of academia… there is certainly much more to do, including building structures to do the work in.

Science advances one funeral at a time, but scientific institutions merely decay. While tenured professors eventually die, the institutions do not. Without natural senescence, an immortal being can be arbitrarily dysfunctional. [10] [11]

As absurd as it sounds, it is easier to construct a new field from scratch than to reform the existing ones. Without the entrenched interests and sacred institutions, we might actually stand a chance.


Footnotes

[1] See also David Chapman.

[2] For that matter, it’s not clear to me that there are actually Progress Studies researchers. Mostly, it seems to be a side project for people whose real work is in building non-profits or working at think tanks. [4] [5]

[3] Perhaps you once skimmed Vannevar Bush’s Science: The Endless Frontier (or at least read Nintil’s article about it), but I don’t know anyone who claims to have actually understood Heidegger’s The Question Concerning Technology.

[4] Jason Crawford is at least full-time, though his grant from EA Funds describes it as “Telling the story of human progress to the world, and promote [sic] progress as a moral imperative” which sounds more like propaganda than research. That’s not a bad thing, we do need science educators! But first there must be a science.

[5] I occasionally get emails from people wowed by my blog’s prominence despite not having been around very long: “One does not just show up on the internet and write/think this well out of nowhere.” My answer is that very few other people are even trying! Scott Alexander is among the most popular authors, and has been working a demanding full-time job this entire time. As has Nintil, until he quit very recently. Leopold Aschenbrenner is not employed, but only because he’s a full time student.

[6] A more accurate title is perhaps “extra-institutional researcher”, which I first heard here, but that’s a mouthful.

[7] The term “jargon” evokes esoteric slang. What I really mean is technical language, consistently defined and operationalized.

[8] Noah Smith wrote Techno-optimism for the 2020s, I responded with Isolated Demands for Rigour in New Optimism, which he has since replied to in a series of posts (1, 2). This gets messy after a while, but so long as everyone is linking back to the previous posts, it’s not too hard for a reader to follow along.

[9] Certainly, they do not proxy well for the progress I think we ought to care about.

[10] I believe some version of this is attributed to Peter Thiel, but can’t find the source.

[11] Harvard is the oldest institution of higher education in the US, and still ranked #1. Oxford is the “oldest university in the English-speaking world”, and depending on who you ask, shares that #1 ranking.


Appendix: Abolish Peer Review

Rather than attempt to identify a set of “minimum viable norms”, should we just assume all the trappings of academia are necessary? It’s not an ideal system, but that doesn’t mean you can pick and choose which parts you want and hope the whole thing still holds together.

That’s a good argument, and I agree we will have to do the work of explaining why some norms are not worth keeping.

In machine learning, arXiv has already eroded the importance of journals and conferences. It is still very important to get accepted to NeurIPS, but that very acceptance is contingent on having your pre-print widely cited.

In Progress Studies, many of the conventional trappings are being replaced as well, for better or for worse.

Instead of citations, we have retweets, and instead of journals, anyone can publish on their own blog. There is even a grant system! Though Emergent Ventures is central to Progress Studies, Jason Crawford is funded by a variety of sources, including “Open Philanthropy, the Long-Term Future Fund, and Jaan Tallinn”.

And instead of pre-publication peer-review, we have post-publication rebuttals.

Can post-hoc review even be called a legitimate standard of knowledge production?

Remember that while peer review has been around in some form since the 18th century, the term itself only took off around 1967:

As Scientific American writes:

Science and The Journal of the American Medical Association did not use outside reviewers until after 1940 (Spier, 2002). The Lancet did not implement peer-review until 1976 (Benos et al., 2006). After the war and into the fifties and sixties, the specialization of articles increased and so did the competition for journal space.

I wasn’t sure about this claim, but as a sanity check, Wikipedia confirms:

The present-day peer-review system evolved from this 18th-century process,[11] began to involve external reviewers in the mid-19th-century,[12] and did not become commonplace until the mid-20th-century.

So it is not quite accurate to say there was no peer-review system before 1970, but it is worth understanding that our modern system is a relatively recent invention.

And yet, so much of the legendary science we now hail as transformative pre-dates 1970. Nintil’s Peer Rejection in Science summarizes breakthrough discoveries once considered crankery. How many of these would never see the light of day in today’s system? Or from Alexey Guzey’s Peer Review is a Disaster:

Peer reviewers in your field are your competitors, who have not themselves solved the problem you claim to be able to solve. They have both personal and professional interest (especially so if funding is limited) in giving low scores to grant applications of competing teams and to recommend rejection of their journal submissions. Further, since they’re experts in the grant application topic, while rejecting your paper or grant application, they can lift your research ideas and then pursue them themselves. This happens more frequently than you would expect.

This is not a niche view held merely among outsiders. Richard Smith, former editor of the British Medical Journal, once published the widely cited article Peer review: a flawed process at the heart of science and journals, where he writes:

Famously, it is compared with democracy: a system full of problems but the least worst we have.

…You can steal ideas and present them as your own, or produce an unjustly harsh review to block or at least slow down the publication of the ideas of a competitor. These have all happened.

Of course, any method will have false negatives and false positives. I’m not claiming Progress Studies’ current process of post-hoc review is obviously better, merely that it is bad in a different way, and thus has the opportunity to produce knowledge that would not otherwise be possible.

We have to try something new, and while meta-science tries to come up with an improved mechanism, we might as well get started experimenting.