It is a long read, but an interesting one. The basic premise is that the economic growth that has fueled what we consider to be the "American dream" was an anomaly, caused by a huge boost in productivity from the back-to-back industrial revolutions in Europe and America, and that no such productivity boost is on the horizon. He argues, in essence, that our boom is over for good, and our current anemic recovery is actually our new normal.
I am no economist, so I can't really delve into his methods and judge his data. But I am inclined to think he undersells the potential productivity gains from our current information revolution. He seems to be arguing that because we haven't had huge gains from the advent of computers, the information age can't deliver such gains.
I disagree with both parts of that statement. I think we have had a lot of gains from the information revolution- it is just that a lot of them are focused more on leisure than productivity. Think about the huge difference in how you do things like find a restaurant. Mr. Snarky and I were up in Orange County for a little break last weekend, and he found us two excellent breakfasts just by using Google Maps and the online ratings, all accessed from his phone. Ten years ago, we would have either asked at the hotel (and been directed to their own restaurant), had breakfast at a chain, or just driven around and taken our chances. (Or maybe I would have pulled out my HandSpring and used the Vindigo app I loved back then.)
So it isn't that information technology isn't capable of transforming the way we do things. Maybe we just haven't aimed it at the right problems yet.
I'm not sure how to change that. I, after all, work in a corner of information technology that is far removed from things like finding good breakfast spots in a new city. I work to use IT to make scientists more efficient and productive, and I think my team does a pretty good job at this. But there is still a lot of room for improvement. Why don't we improve faster? Well, we don't have the budget to do as many projects as we've identified, and there are probably quite a few other projects out there that we haven't even identified. There are almost certainly scientists at my company who are working through problems with no inkling that computers could help solve those problems. And there are definitely scientists putting up with a manual process just because we haven't had time to automate it yet.
And that is at a company full of scientists, with a higher than average amount of money.
Imagine how many other corners of the work world are even worse off, while we expend vast sums of money and programmer effort optimizing our shopping experiences and making it easier to find restaurants.
Don't get me wrong. I am happy with my better shopping experiences and easily found good breakfasts. And I think a lot of those innovations have done genuine good in the world- think, for instance, about how many small entrepreneurs flourish in the ecosystems Etsy and Amazon have created.
I just think that we could also apply some of that same technology to less flashy areas.
Maybe we need more people to get a handle on what computers can do. It seems like there is too much of a disconnect between the people who understand the problems and the people who understand what is possible with computers. I've been thinking about this a lot recently, after reading Noah Veltman's post about his imposter syndrome, and how the world is not divided into Coders and Muggles.
I don't think everyone needs to be a professional level coder, or even that everyone needs to know how to code at any level... but maybe we'd be better off if a lot more people were exposed to coding, and learned to understand what sorts of problems computers are good at solving, and what is involved in solving them.
For instance, I heard a story on NPR a few weeks ago about the discovery that a lot of medical devices, like pacemakers and insulin pumps, are hackable. (I couldn't find the NPR story, but here is a Forbes story on the same topic.) This was a completely avoidable problem. All that was required was for the people involved in developing and regulating these devices to recognize the risk. They didn't have to understand how to protect against the risk, but it sure would have been better if they had understood computers well enough to know to hire someone who did understand how to protect against that risk.
I also think we need to work for more tech diversity. I've blogged on this before- I firmly believe that if we had a more diverse group of techies, we would see technology applied to a more diverse set of problems. No matter what white men on the internet tell you, there is no reason to think that white and Asian men are uniquely suited for understanding STEM. There just isn't. In all the capabilities that matter for this, the difference within each sex is bigger than the difference between the sexes, and that is even when we measure after years of conditioning that amplifies any differences. (Lise Eliot does a good job of summarizing the available research in her book Pink Brain, Blue Brain, which I discuss a bit here and here.) Roughly the same is true of the various races, although I don't have that data at hand. As long as STEM fields remain so completely dominated by white men, I am confident in saying that we are wasting a lot of talent in these areas.
And no, I'm not worried about a glut of STEM trained folks in the job market. For one thing, I think that is an acceptable cost to pay for having not just a fairer society but also more diversity in the problems we tackle. Also, I just argued that maybe we should expand that job market. But more importantly, I really, truly, think we need to stop equating studying STEM fields with having to have a career in STEM.
I've linked before to Shawn Lent's post arguing for the importance of having artists in a wide variety of fields. I agree with her argument, and think it also applies to scientists and techies. The skills and ways of approaching problems that you learn by studying these fields would bring a lot to other areas of endeavor.
Yes, I know- people are frustrated to spend so many years training for one career only to discover they need to go find another career. But how much of that is due to the expectations we set going in and the lack of respect we accord to people who have left science and pursued other options? Would we wring our hands so much about "wasting" the time to get a PhD if instead of seeing it as a track that leads to only one destination, we saw it as a chance to get paid (albeit not a lot) to work on some really interesting things and pick up some skills that could be used to steer a person to a lot of different destinations?
Of course, I am fundamentally an optimist, and that colors my opinions. However, I don't ridicule or dismiss the pessimists. As I mentioned back when I wrote about reading the Rational Optimist, I think pessimists do society a great service by pointing out things on which we need to focus problem solving resources.
I am also fascinated by the irrefutable fact that societies sometimes collapse. One of my favorite books is The Dream of Scipio, by Iain Pears, which explores what happens when a society collapses, and how individuals find their way through the chaos and upheaval.
So I do not discount the possibility that Robert Gordon is right, and that we will never return to the growth levels to which we have become accustomed, and that if this is true, our society faces some grave problems that could in fact bring it down around our ears.
We're not to this stage yet.
However, even if we really are at the end of productivity growth from innovation, we could probably still save our society. We have a lot of wealth now. Truly, we do. It is just incredibly unequally distributed. If we can no longer rely on a rising tide to raise all ships, maybe we need a blossoming of innovation in politics and in our thinking about how we organize our society. Maybe we need a period of increased productivity in ideas.
And maybe this is actually something that the information revolution is already helping to bring about- think of all the people (like me! But also a host of really smart academics and industry types) whose ideas you would never have read even 10 years ago. Sure, there is a downside to the fact that anyone can publish their thoughts on the internet (e.g., The Birthers) but maybe there is a big upside yet to be realized, too. I don't think we are there yet. It is too hard to find the useful ideas in amongst all of the crazies, and it is too hard for the people who produce the ideas and take the time to write about them to get paid for their efforts. We cannot hope to have a big boost in the generation and discussion of ideas if we expect people to produce and explain ideas for free. But- here's my fundamental optimism again!- I think we could solve those problems, and I think we could come up with the ideas we need to either innovate our way to growth or innovate our way to a method of handling the lack of growth.
Maybe I'm wrong. But I think we should try.
This turned into a long, rambly post. Bonus points for anyone who stuck with it to the end! And I'd love to read your comments on any of the topics I touched on.
Great post! There's a lot to think about here. I like to think of myself as a classically-trained scientist who's just becoming code-literate enough to really understand what kinds of problems computers are good at solving and to start thinking about how to use them to solve questions that interest me... and it's a really fun (and challenging) process!
I also really agree with your thoughts about PhDs and career expectations. There's so much opportunity that comes out of getting a PhD in STEM ... and yet we spend so much time whining about how few tenure-track faculty jobs there are.
One note - I think you're missing a big "NO" in the phrase "there is reason to think that white and Asian men are uniquely suited for understanding STEM"! :)
Oops! Thanks for catching the typo. It is fixed now.
I'm hugely sympathetic to the individual scientists who find themselves unable to get a job in their first choice career path- but I'm a lot less sympathetic to the scientific culture we've built up that makes those people feel like failures if they go do something different.
Hm, I am an economist and, um, despite having had a ton of classes on productivity and growth (in history, macro, and labor)... I've never heard of Robert Gordon.
My professors have either tended to be more bullish on productivity growth (although some of my current colleagues are famous for their belief that the US hegemony is over), or they've argued for more education for high skilled jobs. It is true that we're unlikely to see the kind of productivity growth we saw after WWII (or that the Asian tigers saw a few decades back, or China is seeing now), but only because we're already developed. Catch-up growth is faster. Also current monetary policy is to smooth out business cycles so we don't have huge bubbles and bursts, not that that's necessarily been working as well as some would like.
According to the article he has a named chair at Northwestern.
Yeah, I googled him. And I've studied many professors at Northwestern, but this one doesn't seem to have made the core curriculum.
I'm on the STEM board in town and a female engineer, so much of what you say resonates with me. In fact, I just organized a science street festival in town that had thousands of people. Everyone thought it was cool. Even the goth, punk, and hip hop kids got into it. Every age group and socioeconomic demographic was enthralled. My mantra is to show people science is fun and exciting and all around you so that the next gen realizes it's the funnest profession out there.
On a side note, I do think technology is used extensively in the tech world (maybe not in life sciences but certainly in design). The world of predictive engineering has transformed product development, and not always in good ways.
Back in the days of graph paper and the #2 pencil, engineers built huge fudge factors into their product designs. That's why your grandma's fridge from 1967 is still going strong but your brand-new stainless one will last on average about 7 years. In fact, my neighbor and I both bought washing machines at the same time and they broke within 2 months of each other. Things break a lot quicker because they are optimized for cost and designed to the bare minimum the market will tolerate. The one thing I hate about the technology revolution is that it enables engineers to design in obsolescence down to the hour.
Engineers want to make good products, not crappy ones, but managers make decisions based on optimizing profit margins at all costs, and too much engineering time is spent on cost-out vs. innovation. I don't think adoption in my field is the problem. I think companies get greedy and don't spend enough money on innovation and R&D. Certainly some industries need innovative, game-changing ways to take cost out to succeed (like LED lighting), but for more mature products, whoever spends the time to innovate will eventually leapfrog your company. Consumers are partly to blame too. If they buy solely on cost, then the trend will continue in this direction indefinitely.
It has been my experience that engineers are more likely to embrace IT solutions than scientists, but it isn't really that the scientists I work with aren't willing to use IT solutions. It is that they don't always recognize when one will help. And then even if they do, I've got a backlog of projects that is 20-30 projects deep. Even a wealthy company like mine does not have the budget to do as much as we could. If you go look at smaller, scrappier start-ups, they often spend almost nothing on IT- there just isn't budget for it.
Haha, too true, except that's not my grandma, it's my mom, and the fridge was probably bought in 1969 as that was the year my parents moved into the house my mother (and the fridge) still live in. I know it's about my age, as I would apparently hold its door to balance as a not-yet-walking baby and then swing my arm to make a point (as I babbled on about something, a lifelong trend it seems), at which the door would swing open and I would crash down.
Whereas our new fridge, now 6 years old, has required two service calls...
I'm fascinated by the topic of growth and productivity. Last summer I read The 4% Solution, which was a very wonky book put out by the George W. Bush Institute (I know, I know) that discussed various things that might boost US GDP growth into the 4% annual range. Educational improvements figure in, of course, as well as new sources of energy (mostly the whole shale gas thing), better immigration policy that rewards education and skills and entrepreneurship, etc. I'd add that ways of making it easier for marginal workers to participate in the labor force (senior citizens, moms who want to work part-time from home, etc.) could also help growth.
A very, very liberal family member of mine once had as a quote on her Christmas cards: "A good idea can come from anywhere, even a Republican." I like that sentiment. So if the George W. Bush foundation puts out a publication with good ideas, that's great. I am not sure I think shale gas is worth the risk, though! I'm still on the fence about that one and wish we'd focus more on less environmentally risky energy sources, even if they are more financially risky.
I like your idea of making it easier for people who don't want full time work to participate in the labor force.
Regarding 'wasting' time getting a PhD only to then leave academia - I've done it once and was able to leverage my research skills along with the many soft skills I accumulated during my extra-curriculars in grad school. I jumped to a very technology-rich company, and the first jump was hard, but now I'm considering a second jump that is even less related to my PhD. Since there were so many arguments around 'oh no, but all those years spent on a PhD' the first time, the second time it's a little less daunting.
The idea that a PhD could be wasted has always puzzled me- although as I say above, I'm very sympathetic to the deep frustration young scientists feel right now trying to navigate this job market.