Glorifying superstars glorifies excessive risk

Apr 26 JDN 2458964

Suppose you were offered the choice of the following two gambles; which one would you take?

Gamble A: 99.9% chance of $0; 0.1% chance of $100 million

Gamble B: 10% chance of $50,000; 80% chance of $100,000; 10% chance of $1 million

I think it’s pretty clear that you should choose gamble B.

If you were risk-neutral, the expected payoffs would be $100,000 for gamble A and $185,000 for gamble B. So clearly gamble B is the better deal.

But you’re probably risk-averse. If you have logarithmic utility, with your current wealth of $10,000 as the baseline, the difference is even larger:

0.001*ln(10001) ≈ 0.009

0.1*ln(6) + 0.8*ln(11) + 0.1*ln(101) ≈ 2.56
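(If you want to check the arithmetic yourself, here is a quick Python sketch; the $10,000 baseline wealth and the logarithmic utility function are the same assumptions used above.)

```python
from math import log

# Gamble A: 99.9% chance of $0; 0.1% chance of $100 million
# Gamble B: 10% chance of $50k; 80% chance of $100k; 10% chance of $1 million
gamble_a = [(0.999, 0), (0.001, 100_000_000)]
gamble_b = [(0.10, 50_000), (0.80, 100_000), (0.10, 1_000_000)]

WEALTH = 10_000  # baseline (current) wealth assumed in the text

def expected_value(gamble):
    return sum(p * x for p, x in gamble)

def expected_log_utility(gamble, wealth=WEALTH):
    # Log utility of final wealth, normalized so staying at current wealth gives 0
    return sum(p * log((wealth + x) / wealth) for p, x in gamble)

print(expected_value(gamble_a))        # expected value of A: $100,000
print(expected_value(gamble_b))        # expected value of B: $185,000
print(expected_log_utility(gamble_a))  # about 0.009
print(expected_log_utility(gamble_b))  # about 2.56
```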

Yet suppose this is a gamble that a lot of people get to take. And furthermore suppose that what you read about in the news every day is always the people who are the very richest. Then you will read, over and over again, about people who took gamble A and got lucky enough to get the $100 million. You’d probably start to wonder if maybe you should be taking gamble A instead.

This is more or less the world we live in. A handful of billionaires own staggering amounts of wealth, and we are constantly hearing about them. Even aside from the fact that most of them inherited a large portion of it and all of them had plenty of advantages that most of us will never have, it’s still not clear that they were actually smart about taking the paths they did—it could simply be that they got spectacularly lucky.

Or perhaps there’s an even clearer example: Professional athletes. The vast majority of athletes make basically no money at sports. Even most paid athletes are in minor leagues and make only a modest living.

There’s certainly nothing wrong with being an amateur who plays sports for fun. But if you were to invest a large proportion of your time training in sports in the hopes of becoming a professional athlete, you would most likely find yourself gravely disappointed, as your chances of actually getting into the major leagues and becoming a multi-millionaire are exceedingly small. Yet you can probably name at least a few major league athletes who are multi-millionaires—perhaps dozens, if you’re a serious fan—and I doubt you can name anywhere near as many minor league players or players who never made it into paid leagues in the first place.

When we spend all of our time focused on the superstars, what we are effectively assessing is the maximum possible income available on a given career track. And it’s true; the maximum for professional athletes and especially entrepreneurs is extremely high. But the maximum isn’t what you should care about; you should really be concerned about the average or even the median.

And it turns out that the same professions that offer staggeringly high incomes at the very top also tend to be professions with extremely high risk attached. The average income for an athlete is very small; the median is almost certainly zero. Entrepreneurs do better; their average and median income aren’t too much worse than most jobs. But this moderate average comes with a great deal of risk; yes, you could become a billionaire—but far more likely, you could become bankrupt.
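A toy example makes the mean-versus-median gap concrete. The numbers below are invented purely for illustration, not real athlete salary data:

```python
import statistics

# Hypothetical incomes for 1,000 aspiring athletes (invented for illustration):
# 990 earn nothing from sports, 9 earn a modest minor-league salary,
# and 1 superstar earns $10 million.
incomes = [0] * 990 + [40_000] * 9 + [10_000_000]

print(max(incomes))                # the headline superstar number: $10,000,000
print(statistics.mean(incomes))    # $10,360: pulled up by a single outlier
print(statistics.median(incomes))  # $0: what the typical aspiring athlete earns
```

One lucky winner is enough to make the average look respectable while the median stays at zero.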

This is a deeply perverse result: The careers that our culture most glorifies, the ones that we inspire people to dream about, are precisely those that are the most likely to result in financial ruin.

Realizing this changes your perspective on a lot of things. For instance, there is a common lament that teachers aren’t paid the way professional athletes are. I for one am extremely grateful that this is the case. If teachers were paid like athletes, yes, 0.1% would be millionaires, but only 4.9% would make a decent living, and the remaining 95% would be utterly broke. Indeed, this is precisely what might happen if MOOCs really take off, and a handful of superstar teachers are able to produce all the content while the vast majority of teaching mostly amounts to showing someone else’s slideshows. Teachers are much better off in a world where they almost all make a decent living even though none of them ever get spectacularly rich. (Are many teachers still underpaid? Sure. How do I know this? Because there are teacher shortages. A chronic shortage of something is a surefire sign that its price is too low.) And clearly the idea that we could make all teachers millionaires is just ludicrous: Do you want to pay $1 million a year for your child’s education?

Is there a way that we could change this perverse pattern? Could we somehow make it feel more inspiring to choose a career that isn’t so risky? Well, I doubt we’ll ever get children to dream of being accountants or middle managers. But there are a wide range of careers that are fulfilling and meaningful while still making a decent living—like, well, teaching. Even working in creative arts can be like this: While very few authors are millionaires, the median income for an author is quite respectable. (On the other hand there’s some survivor bias here: We don’t count you as an author if you can’t get published at all.) Software engineers are generally quite satisfied with their jobs, and they manage to get quite high incomes with low risk. I think the real answer here is to spend less time glorifying obscene hoards of wealth and more time celebrating lives that are rich and meaningful.

I don’t know if Jeff Bezos is truly happy. But I do know that you and I are more likely to be happy if instead of trying to emulate him, we focus on making our own lives meaningful.

Do I want to stay in academia?

Apr 5 JDN 2458945

This is a very personal post. You’re not going to learn any new content today; but this is what I needed to write about right now.

I am now nearly finished with my dissertation. It only requires three papers (which, quite honestly, have very little to do with one another). I just got my second paper signed off on, and my third is far enough along that I can probably finish it in a couple of months.

I feel like I ought to be more excited than I am. Mostly what I feel right now is dread.

Yes, some of that dread is the ongoing pandemic—though I am pleased to report that the global number of cases of COVID-19 has substantially undershot the estimates I made last week, suggesting that at least most places are getting the virus under control. The number of cases and the number of deaths have both about doubled in the past week, which is a lot better than doubling every two days as it was at the start of the pandemic. And that’s all I want to say about COVID-19 today, because I’m sure you’re as tired of the wall-to-wall coverage of it as I am.
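(For the curious, converting a doubling time into a daily growth rate is a one-liner; the two doubling times below are the rough figures just mentioned.)

```python
# Convert a doubling time (in days) into the implied daily growth rate.
def daily_growth_rate(doubling_days: float) -> float:
    return 2 ** (1 / doubling_days) - 1

print(f"{daily_growth_rate(2):.1%}")  # doubling every 2 days: about 41.4% per day
print(f"{daily_growth_rate(7):.1%}")  # doubling every 7 days: about 10.4% per day
```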

But most of the dread is about my own life, mainly my career path. More and more I’m finding that the world of academic research just isn’t working for me. The actual research part I like, and I’m good at it; but then it comes time to publish, and the journal system is so fundamentally broken, so agonizingly capricious, and has such ludicrous power over the careers of young academics that I’m really not sure I want to stay in this line of work. I honestly think I’d prefer they just flip a coin when you graduate and you get a tenure-track job if you get heads. Or maybe journals could roll a 20-sided die for each paper submitted and publish the papers that get 19 or 20. At least then the powers that be couldn’t convince themselves that their totally arbitrary and fundamentally unjust selection process was actually based on deep wisdom and selecting the most qualified individuals.

In any case I’m fairly sure at this point that I won’t have any publications in peer-reviewed journals by the time I graduate. It’s possible I still could—I actually still have decent odds with two co-authored papers, at least—but I certainly do not expect to. My chances of getting into a top journal at this point are basically negligible.

If I weren’t trying to get into academia, that fact would be basically irrelevant. I think most private businesses and government agencies are fairly well aware of the deep defects in the academic publishing system, and really don’t put a whole lot of weight on its conclusions. But in academia, publication is everything. Specifically, publication in top journals.

For this reason, I am now seriously considering leaving academia once I graduate. The more contact I have with the academic publishing system the more miserable I feel. The idea of spending another six or seven years desperately trying to get published in order to satisfy a tenure committee sounds about as appealing right now as having my fingernails pulled out one by one.

This would mean giving up on a lifelong dream. It would mean wondering why I even bothered with the PhD, when the first MA—let alone the second—would probably have been enough for most government or industry careers. And it means trying to fit myself into a new mold that I may find I hate just as much for different reasons: A steady 9-to-5 work schedule is a lot harder to sustain when waking up before 10 AM consistently gives you migraines. (In theory, there are ways to get special accommodations for that sort of thing; in practice, I’m sure most employers would drag their feet as much as possible, because in our culture a phase-delayed circadian rhythm is tantamount to being lazy and therefore worthless.)

Or perhaps I should aim for a lecturer position, perhaps at a smaller college, that isn’t so obsessed with research publication. This would still dull my dream, but would not require abandoning it entirely.

I was asked a few months ago what my dream job is, and I realized: It is almost what I actually have. It is so tantalizingly close to what I am actually headed for that it is painful. The reality is a twisted mirror of the dream.

I want to teach. I want to do research. I want to write. And I get to do those things, yes. But I want to do them without the layers of bureaucracy, without the tiers of arbitrary social status called ‘prestige’, without the hyper-competitive and capricious system of journal publication. Honestly I want to do them without grading or dealing with publishers at all—though I can at least understand why some mechanisms for evaluating student progress and disseminating research are useful, even if our current systems for doing so are fundamentally defective.

It feels as though I have been running a marathon, but was only given a vague notion of the route beforehand. There were a series of flags to follow: This way to the bachelor’s, this way to the master’s, that way to advance to candidacy. Then when I come to the last set of flags, the finish line now visible at the horizon, I see that there is an obstacle course placed in my way, with obstacles I was never warned about, much less trained for. A whole new set of skills, maybe even a whole different personality, is necessary to surpass these new obstacles, and I feel utterly unprepared.

It is as if the last mile of my marathon must be done on horseback, and I’ve never learned to ride a horse—no one ever told me I would need to ride a horse. (Or maybe they did and I didn’t listen?) And now every time I try to mount one, I fall off immediately, and the injuries I sustain seem to be worse every time. The bruises I thought would heal only get worse. The horses I must ride are research journals, and the injuries when I fall are psychological—but no less real, all too real. With each attempt I keep hoping that my fear will fade, but instead it only intensifies.

It’s the same pain, the same fear, that pulled me away from fiction writing. I want to go back, I hope to go back—but I am not strong enough now, and cannot be sure I ever will be. I was told that working in a creative profession meant working hard and producing good output; it turns out it doesn’t mean that at all. A successful career in a creative field actually means satisfying the arbitrary desires of a handful of inscrutable gatekeepers. It means rolling the dice over, and over, and over again, each time a little more painful than the last. And it turns out that this just isn’t something I’m good at. It’s not what I’m cut out for. And maybe it never will be.

An incompetent narcissist would surely fare better than I, willing to re-submit whatever refuse they produce a thousand times because they are certain they deserve to succeed. For, deep down, I never feel that I deserve it. Others tell me I do, and I try to believe them; but the only validation that feels like it will be enough is the kind that comes directly from those gatekeepers, the kind that I can never get. And truth be told, maybe if I do finally get that, it still won’t be enough. Maybe nothing ever will be.

If I knew that it would get easier one day, that the pain would, if not go away, at least retreat to a dull roar I could push aside, then maybe I could stay on this path. But this cannot be the rest of my life. If this is really what it means to have an academic career, maybe I don’t want one after all.

Or maybe it’s not academia that’s broken. Maybe it’s just me.

Reflections on Past and Future

Jan 19 JDN 2458868

This post goes live on my birthday. Unfortunately, I won’t be able to celebrate much, as I’ll be in the process of moving. We moved just a few months ago, and now we’re moving again, because this apartment turned out to be full of mold that keeps triggering my migraines. Our request for a new apartment was granted, but the university housing system gives very little time to deal with such things: They told us on Tuesday that we needed to commit by Wednesday, and then they set our move-in date for that Saturday.

Still, a birthday seems like a good time to reflect on how my life is going, and where I want it to go next. As for how old I am? This is probably the penultimate power of two I’ll reach.

The biggest change in my life over the previous year was my engagement. Our wedding will be this October. (We have the venue locked in; invitations are currently in the works.) This was by no means unanticipated; really, folks had been wondering when we’d finally get around to it. Yet it still feels strange, a leap headlong into adulthood for someone of a generation that has been saddled with a perpetual adolescence. The articles on “Millennials” talking about us like we’re teenagers still continue, despite the fact that there are now Millennials with college-aged children. Thanks to immigration and mortality, we now outnumber Boomers. Based on how each group voted in 2016, this bodes well for the 2020 election. (Then again, a lot of young people stay home on Election Day.)

I don’t doubt that graduate school has contributed to this feeling of adolescence: If we count each additional year of schooling as a grade, I would now be in the 22nd grade. Yet from others my age, even those who didn’t go to grad school, I’ve heard similar experiences about getting married, buying homes, or—especially—having children of their own: Society doesn’t treat us like adults, so we feel strange acting like adults. 30 is the new 23.

Perhaps as life expectancy continues to increase and educational attainment climbs ever higher, future generations will continue to experience this feeling ever longer, until we’re like elves in a Tolkienesque fantasy setting, living to 1000 but not considered proper adults until we hit 100. I wonder if people will still get labeled by generation when there are 40 generations living simultaneously, or if we’ll find some other category system to stereotype by.

Another major event in my life this year was the loss of our cat Vincent. He was quite old by feline standards, and had been sick for a long time; so his demise was not entirely unexpected. Still, it’s never easy to lose a loved one, even if they are covered in fur and small enough to fit under an airplane seat.

Most of the rest of my life has remained largely unchanged: Still in grad school, still living in the same city, still anxious about my uncertain career prospects. Trump is still President, and still somehow managing to outdo his own high standards of unreasonableness. I do feel some sense of progress now, some glimpses of the light at the end of the tunnel. I can vaguely envision finishing my dissertation some time this year, and I’m hoping that in a couple years I’ll have settled into a job that actually pays well enough to start paying down my student loans, and we’ll have a good President (or at least Biden).

I’ve reached the point where people ask me what I am going to do next with my life. I want to give an answer, but the problem is, this is almost entirely out of my control. I’ll go wherever I end up getting job offers. Based on the experience of past cohorts, most people seem to apply to about 200 positions, interview for about 20, and get offers from about 2. So asking me where I’ll work in five years is like asking me what number I’m going to roll on a 100-sided die. I could probably tell you what order I would prioritize offers in, more or less; but even that would depend a great deal on the details. There are difficult tradeoffs to be made: Take a private sector offer with higher pay, or stay in academia for more autonomy and security? Accept a postdoc or adjunct position at a prestigious university, or go for an assistant professorship at a lower-ranked college?

I guess I can say that I do still plan to stay in academia, though I’m less certain of that than I once was; I will definitely cast a wider net. I suppose the job market isn’t like that for most people? I imagine most people at least know what city they’ll be living in. (I’m not even positive what country—opportunities for behavioral economics actually seem to be generally better in Europe and Australia than they are in the US.)

But perhaps most people simply aren’t as cognizant of how random and contingent their own career paths truly were. The average number of job changes per career is 12. You may want to think that you chose where you ended up, but for the most part you landed where the wind blew you. This can seem tragic in a way, but it is also a call for compassion: “There but for the grace of God go I.”

Really, all I can do now is hang on and try to enjoy the ride.

Why is it so hard to get a job?

JDN 2457411

The United States is slowly dragging itself out of the Second Depression.

Unemployment fell from almost 10% to about 5%.

Core inflation has been kept between 0% and 2% most of the time.

Overall inflation has been within a reasonable range:

[Figure: US inflation rate]

Real GDP has returned to its normal growth trend, though with a permanent loss of output relative to what would have happened without the Great Recession.

[Figure: US real GDP growth]

Consumption spending is also back on trend, tracking GDP quite precisely.

The Federal Reserve even raised the federal funds interest rate above the zero lower bound, signaling a return to normal monetary policy. (As I argued previously, I’m pretty sure that was their main goal actually.)

Employment remains well below the pre-recession peak, but is now beginning to trend upward once more.

The only thing that hasn’t recovered is labor force participation, which continues to decline. This is how we can have unemployment go back to normal while employment remains depressed; people leave the labor force by retiring, going back to school, or simply giving up looking for work. By the formal definition, someone is only unemployed if they are actively seeking work. No, this is not new, and it is certainly not Obama rigging the numbers. This is how we have measured unemployment for decades.

Actually, it’s kind of the opposite: Since the Clinton administration we’ve also kept track of “broad unemployment”, which includes people who’ve given up looking for work or people who have some work but are trying to find more. But we can’t directly compare it to anything that happened before 1994, because the BLS didn’t keep track of it before then. All we can do is estimate based on what we did measure. Based on such estimation, it is likely that broad unemployment in the Great Depression may have gotten as high as 50%. (I’ve found that one of the best-fitting models is actually one of the simplest; assume that broad unemployment is 1.8 times narrow unemployment. This fits much better than you might think.)
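That rule of thumb is simple enough to state in two lines of Python. The 25% narrow-unemployment figure for 1933 is the standard historical estimate, and the multiplier is the one proposed above:

```python
# Rule of thumb from above: broad unemployment ≈ 1.8 × narrow unemployment.
BROAD_MULTIPLIER = 1.8

def estimate_broad_unemployment(narrow_rate: float) -> float:
    return BROAD_MULTIPLIER * narrow_rate

print(estimate_broad_unemployment(0.05))  # narrow 5% today -> broad ≈ 9%
print(estimate_broad_unemployment(0.25))  # narrow 25% in 1933 -> broad ≈ 45%,
                                          # close to that ~50% Depression estimate
```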

So, yes, we muddle our way through, and the economy eventually heals itself. We could have brought the economy back much sooner if we had better fiscal policy, but at least our monetary policy was good enough that we were spared the worst.

But I think most of us—especially in my generation—recognize that it is still really hard to get a job. Overall GDP is back to normal, and even unemployment looks all right; but why are so many people still out of work?

I have a hypothesis about this: I think a major part of why it is so hard to recover from recessions is that our system of hiring is terrible.

Contrary to popular belief, layoffs do not actually substantially increase during recessions. Quits are substantially reduced, because people are afraid to leave current jobs when they aren’t sure of getting new ones. As a result, rates of job separation actually go down in a recession. Job separation does predict recessions, but not in the way most people think. One of the things that made the Great Recession different from other recessions is that most layoffs were permanent, instead of temporary—but we’re still not sure exactly why.

Here, let me show you some graphs from the BLS.

This graph shows job openings from 2005 to 2015:

[Figure: Job openings, 2005–2015]

This graph shows hires from 2005 to 2015:

[Figure: Hires, 2005–2015]

Both of those show the pattern you’d expect, with openings and hires plummeting in the Great Recession.

But check out this graph, of job separations from 2005 to 2015:

[Figure: Job separations, 2005–2015]

Same pattern!

Unemployment in the Second Depression wasn’t caused by a lot of people losing jobs. It was caused by a lot of people not getting jobs—either after losing previous ones, or after graduating from school. There weren’t enough openings, and even when there were openings there weren’t enough hires.

Part of the problem is obviously just the business cycle itself. Spending drops because of a financial crisis, then businesses stop hiring people because they don’t project enough sales to justify it; then spending drops even further because people don’t have jobs, and we get caught in a vicious cycle.

But we are now recovering from the cyclical downturn; spending and GDP are back to their normal trend. Yet the jobs never came back. Something is wrong with our hiring system.

So what’s wrong with our hiring system? Probably a lot of things, but here’s one that’s been particularly bothering me for a long time.

As any job search advisor will tell you, networking is essential for career success.

There are so many different places you can hear this advice, it honestly gets tiring.

But stop and think for a moment about what that means. One of the most important determinants of what job you will get is… what people you know?

It’s not what you are best at doing, as it would be if the economy were optimally efficient.

It’s not even what you have credentials for, as we might expect as a second-best solution.

It’s not even how much money you already have, though that certainly is a major factor as well.

It’s what people you know.

Now, I realize, this is not entirely beyond your control. If you actively participate in your community, attend conferences in your field, and so on, you can establish new contacts and expand your network. A major part of the benefit of going to a good college is actually the people you meet there.

But a good portion of your social network is more or less beyond your control, and above all, says almost nothing about your actual qualifications for any particular job.

There are certain jobs, such as marketing, that actually directly relate to your ability to establish rapport and build weak relationships rapidly. These are a tiny minority. (Actually, most of them are the sort of job that I’m not even sure needs to exist.)

For the vast majority of jobs, your social skills are a tiny, almost irrelevant part of the actual skill set needed to do the job well. This is true of jobs from writing science fiction to teaching calculus, from diagnosing cancer to flying airliners, from cleaning up garbage to designing spacecraft. Social skills are rarely harmful, and even often provide some benefit, but if you need a quantum physicist, you should choose the recluse who can write down the Dirac equation by heart over the well-connected community leader who doesn’t know what an integral is.

At the very least, it strains credibility to suggest that social skills are so important for every job in the world that they should be one of the defining factors in who gets hired. And make no mistake: Networking is as beneficial for landing a job at a local bowling alley as it is for becoming Chair of the Federal Reserve. Indeed, for many entry-level positions networking is literally all that matters, while advanced positions at least exclude candidates who don’t have certain necessary credentials, and then make the decision based upon who knows whom.

Yet, if networking is so inefficient, why do we keep using it?

I can think of a couple reasons.

The first reason is that this is how we’ve always done it. Indeed, networking strongly pre-dates capitalism or even money; in ancient tribal societies there were certainly jobs to assign people to: who will gather berries, who will build the huts, who will lead the hunt. But there were no colleges, no certifications, no resumes—there was only your position in the social structure of the tribe. I think most people simply automatically default to a networking-based system without even thinking about it; it’s just the instinctual System 1 heuristic.

One of the few things I really liked about Debt: The First 5000 Years was the discussion of how similar the behavior of modern CEOs is to that of ancient tribal chieftains, for reasons that make absolutely no sense in terms of neoclassical economic efficiency—but perfect sense in light of human evolution. I wish Graeber had spent more time on that, instead of on the many long digressions about international debt policy that he clearly does not understand.

But there is a second reason as well, a better reason, a reason that we can’t simply give up on networking entirely.

The problem is that many important skills are very difficult to measure.

College degrees do a decent job of assessing our raw IQ, our willingness to persevere on difficult tasks, and our knowledge of the basic facts of a discipline (as well as a fantastic job of assessing our ability to pass standardized tests!). But when you think about the skills that really make a good physicist, a good economist, a good anthropologist, a good lawyer, or a good doctor—they really aren’t captured by any of the quantitative metrics that a college degree provides. Your capacity for creative problem-solving, your willingness to treat others with respect and dignity; these things don’t appear in a GPA.

This is especially true in research: The degree tells how good you are at doing the parts of the discipline that have already been done—but what we really want to know is how good you’ll be at doing the parts that haven’t been done yet.

Nor are skills precisely aligned with the content of a resume; the best predictor of doing something well may in fact be whether you have done so in the past—but how can you get experience if you can’t get a job without experience?

These so-called “soft skills” are difficult to measure—but not impossible. Basically the only reliable measurement mechanisms we have require knowing and working with someone for a long span of time. You can’t read them off a resume, you can’t see them in an interview (interviews are actually a horribly biased hiring mechanism, particularly biased against women). In effect, the only way to really know if someone will be good at a job is to work with them at that job for a while.

There’s a fundamental information problem here I’ve never quite been able to resolve. It pops up in a few other contexts as well: How do you know whether a novel is worth reading without reading the novel? How do you know whether a film is worth watching without watching the film? When the information about the quality of something can only be determined by paying the cost of purchasing it, there is basically no way of assessing the quality of things before we purchase them.

Networking is an attempt to get around this problem. To decide whether to read a novel, ask someone who has read it. To decide whether to watch a film, ask someone who has watched it. To decide whether to hire someone, ask someone who has worked with them.

The problem is that this is such a weak measure that it’s not much better than no measure at all. I often wonder what would happen if businesses were required to hire people based entirely on resumes, with no interviews, no recommendation letters, and any personal contacts treated as conflicts of interest rather than useful networking opportunities—a world where the only thing we use to decide whether to hire someone is their documented qualifications. Could it herald a golden age of new economic efficiency and job fulfillment? Or would it result in widespread incompetence and catastrophic collapse? I honestly cannot say.