The long tail of scientific research

Look around you right now. Think about the things you see. Think about the things you’ve used in the last 24 hours. Most of you will have seen a phone, a computer or something like that. Who does one have to thank for them? Apple, Microsoft, Intel? Of course, but what about Francis Bacon, Faraday and Einstein?

There is a brilliant quote from Norman Augustine, the former chairman and CEO of Lockheed Martin, who quips: “It’s been only half jokingly said that today a third of the GDP is attributable to quantum mechanics”[1]. I’ll return to quantum mechanics later, but for now there is a central point I want to emphasise. We are benefiting from the fruits of great minds who began work, decades or even centuries ago, on fields that often showed almost no real-world application but are now the main drivers pushing the modern world forward.

I feel this may be an under-appreciated fact, and if so it risks us taking for granted the progress and technology we now have, and potentially reducing funding for basic research, since the time horizons for its use are so long and its rewards can be so opaque.

Foundations of the modern world

Ada Palmer has a brilliant post where she looks at historical views on what progress is and whether we can affect it. One of my favourite sections is when she talks about Francis Bacon basically coming up with the modern view of progress: that of humans being able, by deliberate action and the pooling of collective knowledge, to enact changes that would raise the collective standard of living. Importantly, Bacon focused on the idea that this rise in the standard of living would mean the lives of each generation would be better than the last. The bit I really enjoyed was her description of what happened over the next century or two. In her words:

It really took two hundred years for Bacon’s academy to develop anything useful. There was a lot of dissecting animals, and exploding metal spheres, and refracting light, and describing gravity, and it was very, very exciting, and a lot of it was correct, but – as the eloquent James Hankins put it – it was actually the nineteenth century that finally paid Francis Bacon’s I.O.U.

200 years! And science kept pushing forward through some mixture of personal ambition, curiosity and a belief that eventually it could help all of society for every generation to come. Not only this, but many of the inventions we take for granted as indispensable had to be pushed for against many who saw little use in them. It is said that when asked by the Chancellor of the Exchequer as to the uses of electricity, Faraday responded, ‘Why, Sir, there is every probability that you will soon be able to tax it’.

Do we really need research though?

It is worth taking a detour to discuss the relative importance of science in the progression of society. After all, one would be justified in thinking that many inventions were originally the result of tinkering and intelligent iteration rather than a keen grasp of the laws of physics. Orville and Wilbur Wright didn’t know how to write out the equations of aerodynamics and yet managed flight, and most other innovations have similar stories.

While I would admit that this is all true, it seems that tinkering and iterating, without the use of science, can only take you so far. We can achieve flight without knowing about thermodynamics or aerodynamics, but now our knowledge of these concepts is used in designing every engine and every plane. We can make a computer, but to fit transistors onto an integrated circuit at the scales we do now takes knowledge of quantum mechanics. So yes, most of what we have was first conceived before we fully grasped the underlying forces, but without eventually using that underlying scientific knowledge we could never have progressed to the level we are currently at.

Back to research being really useful

One area where the long tail of research can best be seen is quantum mechanics (QM). Few fields have had a greater effect on modern life while being less understood or appreciated by the general population. What to me is almost as amazing as the impact QM is having is that at the time the field was being founded, almost 100 years ago, not only did the pioneers think there was little practical use for QM, they questioned whether humans were even capable of understanding the forces at play, or whether we simply weren’t intelligent enough. Remember that quote from the introduction: “It’s been only half jokingly said that today a third of the GDP is attributable to quantum mechanics”. It’s quite the statement, but if you consider what technologies now rely on QM principles it starts to become less surprising.

What exactly has QM provided for us? Well, pretty much any communications technology, computing technology, lasers, advances in materials science; you get the picture. Again, knowledge of QM was not needed to start any of the areas I just mentioned, but to progress from the level of technology we had decades ago to today’s levels would have been impossible without bringing it in at some point.

A more direct example of scientific research that we are yet to benefit from directly, but which is already almost half a century old in its origins, is quantum computing. Most of you will know of Moore’s law, the descriptive (and somewhat self-fulfilling) rule about how many transistors we can fit on an integrated circuit. It has held up longer than Moore himself ever dreamed, but it is likely that in the not too distant future it will have to slow down (even more than it may have already). This has nothing to do with human ingenuity but with physical limitations, things like the size of an atom or the speed of light, things that aren’t very easy to change.

Why is this a problem? Well much of our economy, as countries get richer, is either online or moving online and this requires growing computing power. If computing power can’t grow we may well face serious issues with scaling the online ecosystem.

So what to do? Well quantum computing, although still in the very early stages, would dramatically increase our computational abilities for certain problems. To illustrate the difference in power: a register of 10 qubits can exist in a superposition of all 2^10 = 1024 possible bit patterns at once, so writing down its full state classically takes 1024 amplitudes, and each extra qubit doubles that number. (This doesn’t mean 10 qubits can store 1024 retrievable bits, but it is why certain quantum algorithms can explore an exponentially large space that classical machines cannot.) So without this research that was started 50 years ago there seems to be little hope of maintaining our rate of growth in computing power over the long term.
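To see how quickly that classical description blows up, here is a minimal sketch (the function name is my own, not from any quantum computing library) comparing the n bits a classical register holds with the 2^n amplitudes needed to describe n qubits:

```python
# A classical n-bit register holds n bits; describing the general state of
# n qubits classically takes 2**n complex amplitudes (one per basis state).
def amplitudes_needed(n_qubits: int) -> int:
    """Dimension of the state space of n qubits."""
    return 2 ** n_qubits

for n in (10, 20, 30):
    print(f"{n} qubits -> {amplitudes_needed(n)} amplitudes")
# 10 qubits -> 1024 amplitudes
# 20 qubits -> 1048576 amplitudes
# 30 qubits -> 1073741824 amplitudes
```

The doubling with every added qubit is the exponential gap: by 30 qubits a classical simulation is already tracking over a billion amplitudes.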

Hopefully by this point I’ve given some idea of the enormous benefits and enormous time horizons basic research can have when applied to the lives of everyday people.

What are the issues with research having such a long tail?

There is a useful distinction I came across relatively recently to describe different kinds of work: that of value creation and value capture. A lottery winner has a lot of value capture but not much value creation. A teacher may well have lots of value creation but likely not much value capture. The research that has led to many of the advances discussed is firmly in the value-creation camp. I am not trying to suggest that one of the main drivers is or should be value capture, but for lots of individuals and institutions it will likely factor in somewhere.

For this reason the drive behind research almost has to be decoupled from monetary rewards, or have some agent with a very large amount of patience. Much research is going to take vast amounts of time to be useful, so much so that the original pioneers may well be dead by the time it is useful to industry. Richard Hamming also has a good observation that it is rarely the pioneers of a field who understand it best; it is almost always the followers, who get to enter the field with a clearer picture. Even if one were funding or conducting basic research with the goal of financial payoffs on a huge scale, the way the research ends up being incorporated will often be different from what one could imagine and hence plan for.

Another point of friction is that research may really look like it has no obvious application, now or ever. Number theory was thought to have no real-world uses and is now the cornerstone of cryptography. Early experiments with electricity involved poking frogs’ legs with metal and noticing that they twitched; at the time this was seen as a quirky curiosity rather than a potentially revolutionary discovery. Quantum mechanics aimed to discover more about the smallest parts of reality, not to help pack transistors onto a circuit.

There is one further idea, of a slightly different note. Tyler Cowen in his book Stubborn Attachments makes the case that human rights and economic growth are the only two things we should be aiming to maintain; everything else is secondary. Most of the argument comes from the idea that we massively over-discount, i.e. undervalue, the lives of future generations. Economic growth has provided the highest compounded returns to quality of life of any phenomenon ever, and so if we care about the utility of future generations this is what we should pursue, subject to the boundary constraint of human rights. Yes, there are clearly things wrong with society, like inequality, which economic growth may not help, but it is tricky to argue things were better in feudal England. Therefore if we value the welfare of future generations at any rate other than 0, the best thing we can do for their welfare is grow the economy.

I firmly agree with Tyler insofar as future generations seem undervalued, and as such we have some moral imperative to do things that will benefit them. Can you imagine what the world would be like if Bacon had not pushed his ideas on using science as a force to improve the lot of future generations? As such, this economic argument for scientific research has a moral underpinning as well. This, to me, provides weight against arguments of the variety ‘sure, quantum computers are cool, but are they really a priority given the other issues around us?’. To me the answer is yes, because these discoveries are what propel us forward as a species and make sure each generation’s quality of life rises above that of the previous one.

Basic research can often be esoteric, but its effects are anything but. Few understand quantum mechanics, yet few would deny the benefits of the devices it has led to. Research has a long tail, often longer than generations or even lifetimes, but it provides the building blocks of the modern world and as such it should be respected and nurtured. Reductions in basic research, or a lack of appreciation for it, risk slowing progress towards the blocks that future generations will build on, chancing not just economic costs but, if we value the lives of those who come after us, very real moral costs as well.

Appendix

Current state

This is not meant to be an attack on current levels of spending per se: the UK government spent 1.7% of GDP on R&D in 2018, the USA 2.7%.[2] These represent record expenditures from a historical standpoint. My aim is primarily to draw focus onto the extraordinary advances we are the beneficiaries of that once showed little promise of industrial application, and so to change the opinion people may have on the use of basic research in today’s world.

Cost increases

There is one very important caveat to this idea that technological progress can improve the lives we lead. What if things are getting more expensive faster than they are getting better? To me at least this seemed like a paradox at first: surely with technology getting better we would see things becoming not just better but cheaper as well. Keynes, in his great essay Economic Possibilities for our Grandchildren, published in 1930, wrote:

“If capital increases, say, 2 per cent per annum, the capital equipment of the world will have increased by a half in twenty years, and seven and a half times in a hundred years. Think of this in terms of material things – houses, transport, and the like… Thus for the first time since his creation man will be faced with his real, his permanent problem – how to use his freedom from pressing economic cares, how to occupy the leisure, which science and compound interest will have won for him, to live wisely and agreeably and well.”

What’s so startling about Keynes’ predictions is that on one hand, that of how the wealth of the nation actually grew, he is almost completely spot on, but on the other hand, that of the impacts this will have on the lives of people, he was incredibly far off. There are a number of reasons this may be, things like concentrations of wealth or cost disease, which I hope to look at more in the future. The main point I want to raise from this is that unless these other issues can be tackled we will not and do not experience the full benefits of the progress we have worked so hard to achieve.
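Keynes’ growth numbers are just compound interest, and they do roughly check out. A quick sketch (the function name is my own) of capital growing at 2 per cent per annum:

```python
# Compound growth: a quantity growing at `rate` per year for `years` years.
def growth_factor(rate: float, years: int) -> float:
    return (1 + rate) ** years

print(round(growth_factor(0.02, 20), 2))   # ~1.49: "increased by a half in twenty years"
print(round(growth_factor(0.02, 100), 2))  # ~7.24: close to "seven and a half times in a hundred years"
```

The hundred-year figure comes out nearer seven and a quarter than seven and a half, but for a back-of-the-envelope estimate made in 1930 it is remarkably close.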

Bell labs[3]

It would be misleading, I feel, to talk so much about the link between basic research and industry without mentioning perhaps the greatest force for the transformation of one into the other. Bell labs was the research division of AT&T, which for decades had a natural, government-protected monopoly on the communications infrastructure of the USA. They were also one of the most prolific creators of industrially applied innovations in history.

What came out of Bell labs? Transistors, mobile communications, error-correcting codes, information theory, and more. It is worth quickly mentioning that the last two were largely the independent work of two geniuses, Richard Hamming and Claude Shannon, who were partly able to do this research because of the environment Bell labs fostered and the support it gave to basic science.

Why was Bell labs able to do so much? A few reasons in my view:

  • Their monopoly gave them huge resources to spend on innovation, and I don’t just mean money here but time. John Pierce, one of the pioneers of mobile communications, later credited Bell labs’ willingness to support an idea for years without it giving them any tangible benefit.
  • As a monopolist, any innovation they created could be applied system-wide, therefore having the largest possible effect.
  • There was no free-rider problem, so they could rest assured that they would be the sole beneficiary of their research.
  • They had an incentive to innovate, as otherwise they might have their monopoly broken up.

Bell labs basically had the ideal setup and leadership with which to push for basic research, and they, and all of us, benefitted tremendously from it. Their position, however, seems much less common nowadays. For better or worse, natural monopolies like theirs rarely exist any more, and as such many of the advantages listed above become far more difficult to achieve. Still, I felt it was worth noting how well private enterprise can act with a view to the long tail of research when its incentives are aligned in the right way.

[1] https://www.fnal.gov/pub/today/Augustine2.html

[2] http://uis.unesco.org/apps/visualisations/research-and-development-spending/

[3] Much of the info about Bell labs comes from Jon Gertner’s fantastic book The Idea Factory
