They show that R&D is effective and that R&D spending is up, and conclude, essentially by process of elimination, that obsolescence must be the reason this is not reflected in productivity. However, there is an alternative: for reasons unrelated to R&D, productivity is actually being driven down, and ever-increasing R&D output is necessary just to maintain current levels of productivity.
In particular, they are looking at US manufacturing. While this is a diverse industry, it's clear that many subfields show quite a bit of saturation. When everyone has a car, and cars last longer and longer, the need for new cars goes down. Once your industrial capacity can satisfy demand, further R&D only reduces the cost of satisfying that demand rather than increasing output, and in some cases improvements to product quality may even further reduce demand. In the US, light vehicle sales peaked in 2000, and while the numbers dipped during various market downturns, they keep coming back to roughly the same asymptote. The numbers are even more striking if you break it down further: the annual demand for personal vehicles has fallen by a factor of 4 since 1965, and by a factor of 3 since 2000, the difference being made up by increased commercial vehicle demand. Looking at other industries like steel paints a similar picture.
This would only decrease productivity if there were no other demands for people to meet; that is, if there were a simultaneous saturation across everything a company could do with its current resources.
What happens if the prices of cars are depressed by increasing efficiency, competition, and product longevity, and the people displaced through efficiency are taking lower-paid jobs?
If you measure output as GDP, wouldn't GDP have gone down even if actual production of goods and services has gone up?
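A toy illustration of the effect I mean (the numbers are made up, and real GDP statistics use deflators that try to correct for exactly this):

```python
# Hypothetical car market: unit output rises while prices fall.
year1_units, year1_price = 100, 30_000   # cars sold, price per car ($)
year2_units, year2_price = 120, 22_000   # more cars, cheaper cars

nominal_y1 = year1_units * year1_price   # 3,000,000
nominal_y2 = year2_units * year2_price   # 2,640,000

# Nominal output falls...
print(nominal_y2 < nominal_y1)           # True

# ...even though physical output (units of goods) rose 20%.
print(year2_units / year1_units)         # 1.2
```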
Not sure how they measure productivity here.
If the OECD is to be believed, the elephant in the room is that China negated the value of western R&D by about $500B per year since 2010.
That’s why being a fast follower is so valuable: you get everyone else to waste money on the wrong ways to build things. It also causes R&D to show much worse returns.
> That’s why being a fast follower is so valuable: you get everyone else to waste money on the wrong ways to build things. It also causes R&D to show much worse returns.
Frankly I think this is the big cultural change in tech generally, isn't it? People come up with ideas and everyone goes "oh I want to do that", so they replicate it off the back of the work everyone else has done. Uncritical FOMO.
But actually it's better to be an even slower follower, generally, as a survival strategy. Apple and Nintendo show this. It's better to go "I see what they are all doing, and I see where they think consumers are, but I think they are wrong, and I think it's not just a question of them going about it wrong, but that all of this evidence suggests consumers actually wish they were getting this other thing, and that is what we should build".
This works for Apple & Nintendo because they built their brand image while being fast-followers – and even pioneers decades ago (the Apple ][ was among the trailblazers, just like the GameBoy or the (S)NES).
I would bet that a company trying to replicate that strategy without the previously established brand image would not go very far.
Wouldn't those two be the opposite of fast followers? Indeed, they trailblazed once upon a time. But they maintained that after being surpassed by focusing on being slow followers. Analyzing the market, polishing what worked to perfection, and making it super intuitive.
Or perhaps the time scale of "fast follower" is distorted in my head, compared to the scale of business.
I think because they're quite secretive it seems like they're slow followers, but they're fairly fast in starting the project; they keep it under wraps and take their time to get it right.
Younger readers might not know that before the iPad came out, Michael Arrington tried to make a tablet before tablets were a thing. The problem back then was that touch screens were expensive, and scaling up from a smartphone to a tablet posed a lot of engineering problems; it didn't happen overnight. Arrington started building TechCrunch’s "CrunchPad" in public, and people thought he might steal a march on tablets. It went a bit wrong with a falling-out with a manufacturer, and the manufacturer released the JooJoo.
But obviously Apple had been working on the iPad the entire time, kept their mouths shut until they had perfected it and crushed the JooJoo a couple of months after the JooJoo's release date. The JooJoo was more expensive than expected, almost the same price as the iPad, but had performance issues, poor software, no app store and a short battery life.
You might argue that Apple has lost that 'skill' now; the Apple Vision Pro, for example, didn't nail it.
Apple's acquisition of FingerWorks was probably important here.
https://en.wikipedia.org/wiki/FingerWorks
I'll give you the Gameboy but the NES and SNES were very much "see what the others did, now do it better".
The Switch 2 to an extent too (it's basically a much-improved Switch 1).
Apple and Nintendo have some of the strongest brand loyalty this side of macrobrewed beer and sports teams. They can afford to go against the current because they have extremely deep barrels of fanatics who will buy just about anything they make, regardless of whether it's cutting edge or not.
Nintendo in particular also has a different culture than the modern explosive shareholder mentality. They have a huge war chest and could likely operate at a loss for over a decade, even if their stock crashed to zero tomorrow.
In other words: they have cultural skin in the game. A bad quarter or even year won't have them seeking out private equity funding.
Didn't they almost go broke after the Wii U? I vaguely remember that the Switch was do-or-die for them.
Checking that I understand this right: the paper’s contention is that more R&D, whilst still producing productivity gains, is invalidating gains made from other R&D? E.g. everyone developing their own LLM. It’s an interesting idea, but why would that be happening?
I think the problem might be how you measure productivity.
If you measure effort and output, then surely productivity has gone up; but if they measure it another way, say capital to return, it might not have?
I do not understand, and have never understood, why absolute R&D investment is compared to the relative growth rate in this literature. It would make sense to compare R&D spending as a percentage of GDP to relative growth, or to compare absolute R&D spending to absolute per-year growth, but I remain mystified by the comparison of absolute R&D spending with relative growth.
The comparison makes sense in endogenous growth theory, where knowledge production has a Cobb-Douglas form: doubling researchers should double the absolute number of ideas produced, which translates to a constant growth rate under standard assumptions about how ideas affect production.
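Concretely, the Romer-style knowledge production function behind this is dA/dt = delta * L * A, where A is the stock of ideas and L the number of researchers (parameter values below are purely illustrative, not from the paper):

```python
# Romer-style idea production: dA/dt = delta * L * A.
# The growth rate (dA/dt)/A = delta * L depends on the *absolute*
# number of researchers L, not on L relative to anything -- which is
# why this literature compares absolute R&D input to relative growth.

delta = 0.001   # research productivity (assumed)
L = 50          # number of researchers (assumed)

def growth_rate(delta, L):
    # g = (dA/dt) / A = delta * L, independent of the current stock A
    return delta * L

print(growth_rate(delta, L))        # growth rate with L researchers
print(growth_rate(delta, 2 * L))    # doubling L doubles the growth rate
```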
Is this another Moore's law, where we try to suggest that trends will never hit some ceiling and start to taper off in growth?
I am writing a book about this topic. I started my career in 1978 at Bell Labs and worked in 3 different startups after that. After 45 years in R&D, I have recently retired. So many times, the inspiration for new inventions we worked on came from unexpected sources: the arts, culture, music, history, and many others. And I said "we" on purpose, because rarely did a new invention come from one person; it was almost always from collaboration on a team. My conclusion is that invention so often comes from a team of well-rounded people with knowledge in many areas and the ability to work together. I wonder if the decline in R&D productivity comes from a decline in these attributes?
In research on creativity in the arts and sciences, a supportive community is seen as so important that some researchers deny the validity of the idea of the creative genius working in isolation.
Interested in your project. Can you point to any similar books and how you are expanding on them?
> some researchers deny the validity of the idea of the creative genius working in isolation.
Is it that they deny the entire possibility of a creative genius working in isolation, or deny that a creative genius working in isolation, without a supporting community to spread the good word, will still see his work make it out into the world?
I think it's more that creative genius requires both the time invested to attain mastery, and time to push the boundaries on paths that may or may not work out.
Ramanujan would have still been Ramanujan had he not worked with Littlewood and Hardy (though the world might not have witnessed both his genius and his contributions), but by all accounts he invested an enormous amount of time and effort in mathematics, to the point that his family urged him to do other things. Einstein worked a job that was so trivial for him that he spent most of his time thinking about other things. Newton invented calculus while his classes were halted because everyone was isolating from the plague. Bukowski famously quipped that his choices were to earn a wage, or to write and starve, and he'd chosen to starve.
In the same way that you probably don't get garage startups in a society where no one has a garage, you probably don't get many creative geniuses without good family structures and some level of slack in the system.
Einstein, Ramanujan, and Newton were boosted by existing networks of review and promotion. A lot of core engineering math was invented by aristocrats and government functionaries around the French Academy. Germany developed its own equivalent scene somewhat later.
All of these followed the model of a relatively small number of smart people bouncing ideas off each other, reviewing them, building on them, and promoting the good ones.
The difference between that and modern R&D is that modern R&D tries to be industrial rather than academic. Academia is trapped in a bullshit job make-work cycle, where quantity gets more rewards than quality and creativity. There isn't room for mavericks like Einstein. Even if they're out there having great ideas, there's no way for them to be discovered and promoted.
Industry focuses more on fill-in developments than game changer mathematical insights, which are the real drivers of scientific progress.
So there's a lot of R&D-like activity in CS, and occasionally something interesting falls out, like LLMs. But fundamental physics has stagnated.
One of the biggest reasons is that the smartest people don't work in research. They work in finance, developing gambling algorithms.
As a lifetime experimenter myself, I'm going to play the cards I'm dealt and I sure like slack in the system, but mainly to make up for my other weaknesses :)
One person's idea can be good enough to be the most revolutionary thing in a field, but it still may not be as well thought-out as if more than one worked on it together from the beginning.
One person's physical efforts can almost always be dwarfed by a team of some kind, and that might be the only way for an idea to become reality, but it's not going to help if there's not a proper team to join or resources to build staff from scratch.
Since most teams do not contain an absolute genius, at least they can come up with products because they have a team. Excellent products sometimes, but not often genius-level.
In some fields people really do think brains are the most important thing, but genius is too rare, and everybody knows it.
So if they want to get to market any time soon, they have to settle for what they have to work with until such a rare genius comes along.
Which may be never, so there's no time to wait; but by the time some miracle-working wizard shows up, it's too late: the team has no drop-in task for them to perform, and hasn't naturally been formed with the structure needed to leverage anybody's wizardry. So never mind; they couldn't recognize it anyway.
This would seem to be a direct corollary of the red queen hypothesis, applied in the context of corporations instead of species. That is, in a competitive environment you have to keep spending R&D dollars just to stay in the same relative market position, because everyone else around you is spending as well. However, the paper talks about productivity of the individual firm and aggregate productivity (presumably across the whole economy). Therefore I think the red queen may not be the whole story: firms should still be getting more efficient (more productive) even if they can't capture that value due to competition, and the production possibilities frontier should be growing because we need less capital to accomplish the same tasks, leaving more for other things. However, it seems that this is not the case? So what the paper seems to mean by "increased rates of obsolescence" is that there is so much churn within organizations that they can't actually get something implemented in a way that allows them to capitalize on the potential increased productivity? That sounds like a complexity wall, but I feel like I'm missing something.
> there is so much churn within organizations that they can't actually get something implemented in a way that actually allows them to capitalize on the potential increased productivity?
The churn is in market attention. While you are setting up to capitalize on potential increases in productivity, the competition has already come out with something better and the customer has moved on.
They are specifically looking at R&D in Manufacturing. I think there you can make the case (as they do) that one new innovation can erase the productivity benefits of a prior innovation.
If anyone needs ideas, I have about 10 per day.
Got anywhere you're willing to write them down? I'm like your reverse twin - I never have ideas I think are commercially viable. But I think I'm a decent coder (30+ years of gainful employment in the Linux space).
There used to be the halfbakery. I guess there still is. https://www.halfbakery.com/
There's a blast from the past!
R&D is about the only thing you have if you want to make something out of nothing.
Most other forms of value-added activity need to start with something of value to begin with.
The closer that "nothing" can be brought to zero, the greater the leverage by comparison, until it wipes the floor with everything else.
It wasn't so bad until all the MBAs came along; they have nothing like the equations needed to figure this out when the data is not yet numbers. Regular non-degreed business operators used to be much more advanced in mathematical intuition regardless.
The researchers stayed just as talented and got better technology; the founding giants of leadership lasted as long as they could but were replaced by midgets, and here we are.
Their dataset only runs up to 2018?