An unusual recession, a rapid recovery

Jul 11 JDN 2459407

It seems like an egregious understatement to say that the last couple of years have been unusual. The COVID-19 pandemic was historic, comparable in threat—though not in outcome—to the 1918 influenza pandemic.

At this point it looks like we may not be able to fully eradicate COVID. And there are still many places around the world where variants of the virus continue to spread. I personally am a bit worried about the recent surge in the UK; it might add some obstacles (as if I needed any more) to my move to Edinburgh. Yet even in hard-hit places like India and Brazil things are starting to get better. Overall, it seems like the worst is over.

This pandemic disrupted our society in so many ways, great and small, and we are still figuring out what the long-term consequences will be.

But as an economist, one of the things I found most unusual is that this recession fit Real Business Cycle theory.

Real Business Cycle theory (henceforth RBC) posits that recessions are caused by negative technology shocks which result in a sudden drop in labor supply, reducing employment and output. This is generally combined with sophisticated mathematical modeling (DSGE or GTFO), and it typically leads to the conclusion that the recession is optimal and we should do nothing to correct it (which was after all the original motivation of the entire theory—they didn’t like the interventionist policy conclusions of Keynesian models). Alternatively it could suggest that, if we can, we should try to intervene to produce a positive technology shock (but nobody’s really sure how to do that).

For a typical recession, this is utter nonsense. It is obvious to anyone who cares to look that major recessions like the Great Depression and the Great Recession were caused by a lack of labor demand, not supply. There is no apparent technology shock to cause either recession. Instead, they seem to be precipitated by a financial crisis, which causes a crisis of liquidity that leads to a downward spiral: layoffs reduce spending, which causes more layoffs. Millions of people lose their jobs and become desperate to find new ones, with hundreds of people applying to each opening. RBC predicts a shortage of labor where there is instead a glut. RBC predicts that wages should go up in recessions—but they almost always go down.

But for the COVID-19 recession, RBC actually had some truth to it. We had something very much like a negative technology shock—namely the pandemic. COVID-19 greatly increased the cost of working and the cost of shopping. This led to a reduction in labor demand as usual, but also a reduction in labor supply for once. And while we did go through a phase in which hundreds of people applied to each new opening, we then followed it up with a labor shortage and rising wages. A fall in labor supply should create inflation, and we now have the highest inflation we’ve had in decades—but there’s good reason to think it’s just a transitory spike that will soon settle back to normal.

The recovery from this recession was also much more rapid: Once vaccines started rolling out, the economy began to recover almost immediately. We recovered most of the employment losses in just the first six months, and we’re on track to recover completely in half the time it took after the Great Recession.

This makes it the exception that proves the rule: Now that you’ve seen a recession that actually resembles RBC, you can see just how radically different it was from a typical recession.

Moreover, even in this weird recession the usual policy conclusions from RBC are off-base. It would have been disastrous to withhold the economic relief payments—which I’m happy to say even most Republicans realized. The one thing that RBC got right as far as policy is that a positive technology shock was our salvation—vaccines.

Indeed, while the cause of this recession was very strange and not what Keynesian models were designed to handle, our government largely followed Keynesian policy advice—and it worked. We ran massive government deficits—over $3 trillion in 2020—and the result was rapid recovery in consumer spending and then employment. I honestly wouldn’t have thought our government had the political will to run a deficit like that, even when the economic models told them they should; but I’m very glad to be wrong. We ran the huge deficit just as the models said we should—and it worked. I wonder how the 2010s might have gone differently had we done the same after 2008.

Perhaps we’ve learned from some of our mistakes.

What if everyone owned their own home?

Mar 14 JDN 2459288

In last week’s post I suggested that if we are to use the term “gentrification”, it should specifically apply to the practice of buying homes for the purpose of renting them out.

But don’t people need to be able to rent homes? Surely we couldn’t have a system where everyone always owned their own home?

Or could we?

The usual argument for why renting is necessary is that people don’t want to commit to living in one spot for 15 or 30 years, the length of a mortgage. And this is quite reasonable; very few careers today offer the kind of stability that lets you commit in advance to 15 or more years of working in the same place. (Tenured professors are one of the few exceptions, and I dare say this has given academic economists some severe blind spots regarding the costs and risks involved in changing jobs.)

But how much does renting really help with this? One does not rent a home for a few days or even a few weeks at a time. If you are staying somewhere for an interval that short, you generally room with a friend or pay for a hotel. (Or get an AirBNB, which is sort of intermediate between the two.)

One only rents housing for months at a time—in fact, most leases are 12-month leases. But since the average time to sell a house is 60-90 days, in what sense is renting actually less of a commitment than buying? It feels like less of a commitment to most people—but I’m not sure it really is less of a commitment.

There is a certainty that comes with renting—you know that once your lease is up you’re free to leave, whereas selling your house will on average take two or three months, but could very well be faster or slower than that.

Another potential advantage of renting is that you have a landlord who is responsible for maintaining the property. But this advantage is greatly overstated: First of all, if they don’t do it (and many surely don’t), you actually have very little recourse in practice. Moreover, if you own your own home, you don’t actually have to do all the work yourself; you could pay carpenters and plumbers and electricians to do it for you—which is all that most landlords were going to do anyway.

All of the “additional costs” of owning over renting such as maintenance and property taxes are going to be factored into your rent in the first place. This is a good argument for recognizing that a $1000 mortgage payment is not equivalent to a $1000 rent payment—the rent payment is all-inclusive in a way the mortgage is not. But it isn’t a good argument for renting over buying in general.

Being foreclosed on a mortgage is a terrible experience—but surely no worse than being evicted from a rental. If anything, foreclosure is probably not as bad, because you can essentially only be foreclosed for nonpayment, since the bank only owns the loan; landlords can and do evict people for all sorts of reasons, because they own the home. In particular, you can’t be foreclosed for annoying your neighbors or damaging the property. If you own your home, you can cut a hole in a wall any time you like. (Not saying you should necessarily—just that you can, and nobody can take your home away for doing so.)

I think the primary reason that people rent instead of buying is the cost of a down payment. For some reason, we have decided as a society that you should be expected to pay 10%-20% of the cost of a home up front, or else you never deserve to earn any equity in your home whatsoever. This is one of many ways that being rich makes it easier to get richer—but it is probably the most important one holding back most of the middle class of the First World.

And make no mistake, that’s what this is: It’s a social norm. There is no deep economic reason why a down payment needs to be anything in particular—or even why down payments in general are necessary.

There is some evidence that higher down payments are associated with less risk of default, but it’s not as strong as many people seem to think. The big HUD study on the subject found that one percentage point of down payment reduces default risk by about as much as 5 points of credit rating: So you should prefer to offer a mortgage to someone with an 800 rating and no down payment than someone with a 650 rating and a 20% down payment.
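If it helps to see that comparison worked out, here is a minimal sketch of the rule of thumb, assuming only the equivalence described above (one percentage point of down payment is worth roughly five points of credit score in default-risk terms). The function and figures are purely illustrative, not taken from the study itself.

```python
# Illustrative only: the "1 percentage point of down payment ~ 5 points
# of credit score" equivalence described above, applied to the two
# example borrowers in the text.

def effective_score(credit_score: float, down_payment_pct: float) -> float:
    """Fold the down payment into a single risk-equivalent credit score."""
    return credit_score + 5 * down_payment_pct

borrower_a = effective_score(800, 0)    # 800 rating, no down payment -> 800
borrower_b = effective_score(650, 20)   # 650 rating, 20% down        -> 750

print(borrower_a, borrower_b)  # 800.0 750.0: borrower A is the better risk
```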

Also, it’s not as if mortgage lenders are unprotected from default (unlike, say, credit card lenders). Above all, they can foreclose on the house. So why is it so important to reduce the risk of default in the first place? Why do you need extra collateral in the form of a down payment, when you’ve already got an entire house of collateral?

It may be that this is actually a good opportunity for financial innovation, a phrase that should in general strike terror in one’s heart. Most of the time “financial innovation” means “clever ways of disguising fraud”. Previous attempts at “innovating” mortgages have resulted in such monstrosities as “interest-only mortgages” (a literal oxymoron, since by definition a mortgage must have a termination date—a date at which the debt “dies”), “balloon payments”, and “adjustable rate mortgages”—all of which increase risk of default while as far as I can tell accomplishing absolutely nothing. “Subprime” lending created many excuses for irresponsible or outright predatory lending—and then, above all, securitization of mortgages allowed banks to offload the risk they had taken on to third parties who typically had no idea what they were getting.

Volcker was too generous when he said that the last great financial innovation was the ATM; no, that was an innovation in electronics (and we’ve had plenty of those). The last great financial innovation I can think of is the joint-stock corporation in the 1550s. But I think a new type of mortgage contract that minimizes default risk without requiring large up-front payments might actually qualify as a useful form of financial innovation.

It would also be useful to have mortgages that make it easier to move, perhaps by putting payments on hold while the home is up for sale. That way people wouldn’t have to make two mortgage payments at once as they move from one place to another, and the bank would still see that money eventually—paid for by the new buyer and their mortgage.

Indeed, ideally I’d like to eliminate foreclosure as well, so that no one has to be kicked out of their homes. How might we do that?

Well, as a pandemic response measure, we should have simply instituted a freeze on all evictions and foreclosures for the duration of the pandemic. Some states did, in fact—but many didn’t, and the federal moratoria on evictions were limited. This is the kind of emergency power that government should have, to protect people from a disaster. So far it appears that the number of evictions was effectively reduced from tens of millions to tens of thousands by these measures—but evicting anyone during a pandemic is a human rights violation.

But as a long-term policy, simply banning evictions wouldn’t work. No one would want to lend out mortgages, knowing that they had no recourse if the debtor stopped paying. Even buyers with good credit might get excluded from the market, since once they actually received the house they’d have very little incentive to actually make their payments on time.

But if there are no down payments and no foreclosures, that means mortgage lenders have no collateral. How are they supposed to avoid defaults?

One option would be wage garnishment. If you have the money and are simply refusing to pay it, the courts could require your employer to send the money directly to your creditors. If you have other assets, those could be garnished as well.

And what if you don’t have the money, perhaps because you’re unemployed? Well, then, this isn’t really a problem of incentives at all. It isn’t that you’re choosing not to pay, it’s that you can’t pay. Taking away such people’s homes would protect banks financially, but at a grave human cost.

One option would be to simply say that the banks should have to bear the risk: That’s part of what their huge profits are supposed to be compensating them for, the willingness to take on risks others won’t. The main downside here is the fact that it would probably make it more difficult to get a mortgage and raise the interest rates that you would need to pay once you do.

Another option would be some sort of government program to make up the difference, by offering grants or guaranteed loans to homeowners who can’t afford to pay their mortgages. Since most such instances are likely to be temporary, the government wouldn’t be on the hook forever—just long enough for people to get back on their feet. Here the downside would be the same as any government spending: higher taxes or larger budget deficits. But honestly it probably wouldn’t take all that much; while the total value of all mortgages is very large, only a small portion are in default at any given time. Typically only about 2-4% of all mortgages in the US are in default. Even 4% of the $10 trillion total value of all US mortgages is about $400 billion, which sounds like a lot—but the government wouldn’t owe that full amount, just whatever portion is actually late. I couldn’t easily find figures on that, but I’d be surprised if it’s more than 10% of the total value of these mortgages that would need to be paid by the government. $40 billion is about 1% of the annual federal budget.
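Spelled out, that back-of-envelope calculation looks something like this. The mortgage total and default rate are the rough figures quoted above; the roughly $4 trillion annual federal budget and the 10% share of defaulted value actually past due are my own stated assumptions, not data.

```python
# Back-of-envelope estimate using the rough figures quoted in the text.
total_mortgage_value = 10e12   # ~$10 trillion of US mortgages
default_rate = 0.04            # ~2-4% in default; take the high end
share_past_due = 0.10          # assumed: <=10% of defaulted value actually owed now
federal_budget = 4e12          # assumed: ~$4 trillion annual federal budget

defaulted_value = total_mortgage_value * default_rate   # ~$400 billion
government_cost = defaulted_value * share_past_due      # ~$40 billion

print(f"Defaulted value:  ${defaulted_value / 1e9:,.0f} billion")
print(f"Government cost:  ${government_cost / 1e9:,.0f} billion")
print(f"Share of budget:  {government_cost / federal_budget:.1%}")   # ~1%
```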

Reforms to our healthcare system would also help tremendously, as medical expenses are a leading cause of foreclosure in the United States (and literally nowhere else—every other country with the medical technology to make medicine this expensive also has a healthcare system that shares the burden). Here there is virtually no downside: Our healthcare system is ludicrously expensive without producing outcomes any better than the much cheaper single-payer systems in Canada, the UK, and France.

All of this sounds difficult and complicated, I suppose. Some may think that it’s not worth it. But I believe that there is a very strong moral argument for universal homeownership and ending eviction: Your home is your own, and no one else’s. No one has a right to take your home away from you.

This is also fundamentally capitalist: It is the private ownership of capital by its users, the acquisition of wealth through ownership of assets. The system of landlords and renters honestly doesn’t seem so much capitalist as it does feudal: We even call them “lords”, for goodness’ sake!

As an added bonus, if everyone owned their own homes, then perhaps we wouldn’t have to worry about “gentrification”, since rising property values would always benefit residents.

Love in a time of quarantine

Feb 14 JDN 2459260

This is our first Valentine’s Day of quarantine—and hopefully our last. With Biden now already taking action and the vaccine rollout proceeding more or less on schedule, there is good reason to think that this pandemic will be behind us by the end of this year.

Yet for now we remain isolated from one another, attempting to substitute superficial digital interactions for the authentic comforts of real face-to-face contact. And anyone who is single, or forced to live away from their loved ones, during quarantine is surely having an especially hard time right now.

I have been quite fortunate in this regard: My fiancé and I have lived together for several years, and during this long period of isolation we’ve at least had each other—if basically no one else.

But even I have felt a strong difference, considerably stronger than I expected it would be: Despite many of my interactions already being conducted via the Internet, needing to do so with all interactions feels deeply constraining. Nearly all of my work can be done remotely—but not quite all, and even what can be done remotely doesn’t always work as well remotely. I am moderately introverted, and I still feel substantially deprived; I can only imagine how awful it must be for the strongly extraverted.

As awkward as face-to-face interactions can be, and as much as I hate making phone calls, somehow Zoom video calls are even worse than either. Being unable to visit someone’s house for dinner and games, or go out to dinner and actually sit inside a restaurant, leaves a surprisingly large emotional void. Nothing in particular feels radically different, but the sum of so many small differences adds up to a rather large one. I think I felt it the most when we were forced to cancel our usual travel back to Michigan over the holiday season.

Make no mistake: Social interaction is not simply something humans enjoy, or are good at. Social interaction is a human need. We need social interaction in much the same way that we need food or sleep. The United Nations considers solitary confinement for more than two weeks to be torture. Long periods in solitary confinement are strongly correlated with suicide—so in that sense, isolation can kill you. Think about the incredibly poor quality of social interactions that goes on in most prisons: Endless conflict, abuse, racism, frequent violence—and then consider that the one thing that inmates find most frightening is to be deprived of that social contact. This is not unlike being fed nothing but stale bread and water, and then suddenly having even that taken away from you.

Even less extreme forms of social isolation—like most of us are feeling right now—have as detrimental an effect on health as smoking or alcoholism, and considerably worse than obesity. Long-term social isolation increases overall mortality risk by more than one-fourth. Robust social interaction is critical for long-term health, both physically and mentally.

This does not mean that the quarantines were a bad idea—on the contrary, we should have enforced them more aggressively, so as to contain the pandemic faster and ultimately need less time in quarantine. Timing is critical here: Successfully containing the pandemic early is much easier than trying to bring it back under control once it has already spread. When the pandemic began, lockdown might have been able to stop the spread. At this point, vaccines are really our only hope of containment.

But it does mean that if you feel terrible lately, there is a very good reason for this, and you are not alone. Due to forces much larger than any of us can control, forces that even the world’s most powerful governments are struggling to contain, you are currently being deprived of a basic human need.

And especially if you are on your own this Valentine’s Day, remember that there are people who love you, even if they can’t be there with you right now.

2020 is almost over

Dec 27 JDN 2459211

I don’t think there are many people who would say that 2020 was their favorite year. Even if everything else had gone right, the 1.7 million deaths from the COVID pandemic would already make this a very bad year.

As if that weren’t bad enough, shutdowns in response to the pandemic, resulting unemployment, and inadequate fiscal policy responses have in a single year thrown nearly 150 million people back into extreme poverty. Unemployment in the US this year spiked to nearly 15%, its highest level since World War 2. Things haven’t been this bad for the US economy since the Great Depression.

And this Christmas season certainly felt quite different, with most of us unable to safely travel and forced to interact with our families only via video calls. New Year’s this year won’t feel like a celebration of a successful year so much as relief that we finally made it through.

Many of us have lost loved ones. Fortunately none of my immediate friends and family have died of COVID, but I can now count half a dozen acquaintances, friends-of-friends or distant relatives who are no longer with us. And I’ve been relatively lucky overall; both I and my partner work in jobs that are easy to do remotely, so our lives haven’t had to change all that much.

Yet 2020 is nearly over, and already there are signs that things really will get better in 2021. There are many good reasons for hope.


Joe Biden won the election by a substantial margin in both the popular vote and the Electoral College.

There are now multiple vaccines for COVID that have been successfully fast-tracked, and they are proving to be remarkably effective. Current forecasts suggest that we’ll have most of the US population vaccinated by the end of next summer.

Maybe the success of this vaccine will finally convince some of the folks who have been doubting the safety and effectiveness of vaccines in general. (Or maybe not; it’s too soon to tell.)

Perhaps the greatest reason to be hopeful about the future is the fact that 2020 is a sharp deviation from the long-term trend toward a better world. That 150 million people thrown back into extreme poverty needs to be compared against the over 1 billion people who have been lifted out of extreme poverty in just the last 30 years.

Those 1.7 million deaths need to be compared against the fact that global life expectancy has increased from 45 to 73 since 1950. The world population is 7.8 billion people. The global death rate has fallen from over 20 deaths per 1000 people per year to only 7.6 deaths per 1000 people per year. Multiplied over 7.8 billion people, that’s nearly 100 million lives saved every single year by advances in medicine and overall economic development. Indeed, if we were to sustain our current death rate indefinitely, our life expectancy would rise to over 130. There are various reasons to think that probably won’t happen, mostly related to age demographics, but in fact there are medical breakthroughs we might make that would make it possible. Even according to current forecasts, world life expectancy is expected to exceed 80 years by the end of the 21st century.
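Here is the arithmetic behind those figures, using only the round numbers quoted above. It is a sketch of the back-of-envelope reasoning, not a demographic projection; the reciprocal-of-death-rate life expectancy in particular is the hypothetical "if sustained indefinitely" figure, which real age structure would change.

```python
# Rough arithmetic using the round numbers quoted in the text.
population = 7.8e9            # world population
old_death_rate = 20 / 1000    # deaths per person per year, mid-20th century
new_death_rate = 7.6 / 1000   # deaths per person per year, today

lives_saved_per_year = (old_death_rate - new_death_rate) * population
print(f"Lives saved per year: {lives_saved_per_year / 1e6:.0f} million")  # ~97 million

# If today's death rate were held fixed forever, the implied life
# expectancy is simply the reciprocal of the death rate:
print(f"Implied life expectancy: {1 / new_death_rate:.0f} years")  # ~132 years
```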

There have also been some significant environmental milestones this year: Global carbon emissions fell an astonishing 7% in 2020, though much of that was from reduced economic activity in response to the pandemic. (If we could sustain that, we’d cut global emissions in half each decade!) But many other milestones were the product of hard work, not silver linings of a global disaster: Whales returned to the Hudson river, Sweden officially terminated their last coal power plant, and the Great Barrier Reef is showing signs of recovery.

Yes, it’s been a bad year for most of us—most of the world, in fact. But there are many reasons to think that next year will be much better.

This is not just about selfishness

Aug 2 JDN 2459064

The Millennial term is “Karen”: someone (paradigmatically a middle-aged White woman) who is so privileged, so self-centered, and has such an extreme sense of entitlement that they are willing to make others suffer in order to avoid the slightest inconvenience.

I recently saw a tweet (which for some reason has been impossible to find; I think I must have misremembered its precise wording, because putting that in quotes in Google yields nothing) saying that Americans are not simply selfish, we are so selfish that we would gladly let others die to avoid mildly inconveniencing ourselves. Searching Twitter for “Americans are selfish” certainly yields plenty of results.

And it is tempting to agree with this, when it seems that re-opening the economy and so many people refusing to wear masks have given us far worse outcomes from COVID-19 than most other countries.

But this can’t be the whole story. Perhaps Americans are a bit more self-centered than other cultures, because of our history of libertarian individualism. But if we were truly so selfish we’d gladly let others die to avoid inconvenience, whence the fact that we donate more to charity than any other country in the world? I don’t simply mean total amount or per-capita dollars (though both of those are also true); I mean as a fraction of GDP Americans give more to charity than any other country, and by a wide margin.

How then do we explain that so many Americans are not wearing masks?

Well, first of all, most of us are wearing masks. The narrative about people not wearing masks has been exaggerated; the majority of Americans, including the majority of Republicans, agree that wearing masks is a matter of public health rather than personal choice. There are some people who refuse to wear masks, and each one adds a little bit more risk to us all; but it’s really not the case that Americans in general are refusing to wear masks.

But I think the most important failings here come from the top down. The Trump administration has handled the pandemic in an astonishingly poor way. First, they denied that it was even a serious problem. Then, they implemented only a half-hearted response. Then, they turned masks into a culture war. Then, they resisted the economic relief package and prevented it from being as large as it needed to be. At every step of the way, they have been at best utterly incompetent and at worst guilty of depraved indifference murder.

From denying it was a problem, to responding too slowly, to disparaging mask use, to pushing to re-open the economy too soon, at every step of the way our government has made things worse. Above all, a better economic relief package—like what most other First World countries have done—would have done a great deal to reduce the harm of lockdowns, and would have made re-opening the economy far less popular.

Republican-led states have followed the President’s lead, refusing to implement even basic common-sense protections. But even Democrat-led states have suffered greatly as well. New York and California have some of the most cases, though this is surely in part because they are huge states with highly urbanized populations that get a lot of visitors and trade from other places. The trajectory of infections looks worst in Louisiana and Missouri, surely among the most conservative of states; but it also looks quite bad in New Jersey and Hawaii, which are among the most liberal.

I think what this shows us is that America lacks coordination. Despite having United in our name and E pluribus unum as our motto (“In God We Trust” was a Cold War change to spite the Soviets), what we lack most of all is unity. Viruses do not respect borders or jurisdictions. More than perhaps any other issue aside from climate change, fighting a pandemic requires a unified, coordinated response—and that is precisely what we did not have.

In some ways the pluralism of the United States can be a great strength; but this year, it was very much a weakness. And as the many crises around us continue, I fear we grow only more divided.

We still don’t know the fatality rate of COVID-19

May 10 JDN 2458978

You’d think after being in this pandemic for several weeks we would now have a clear idea of the fatality rate of the virus. Unfortunately, this is not the case.

The problem is that what we can track really doesn’t tell us what we need to know.

What we can track is how many people have tested positive versus how many people have died. As of this writing, 247,000 people have died and 3,504,000 have tested positive, a ratio of about 7%. If this were the true fatality rate, it would be horrifying: A death rate of 7% is clearly in excess of even the 1918 influenza pandemic.

Fortunately, this is almost certainly an overestimate. But it’s actually possible for it to be an underestimate, and here’s why: A lot of those people who currently have the virus could still die.

We really shouldn’t be dividing (total deaths)/(total confirmed infections). We should be dividing (total deaths)/(total deaths + total recoveries). If people haven’t recovered yet, it’s too soon to say whether they will live.

On that basis, this begins to look more like an ancient plague: The number of recoveries is only about four times the number of deaths, which would be a staggering fatality rate of 20%.

But as I said, it’s far more likely that this is an overestimate, because we don’t actually know how many people have been infected. We only know how many people have been infected and gotten tested. A large proportion have never been tested; many of these were simply asymptomatic.

We know this because of the few cases we have of rigorous testing of a whole population, such as the passengers on this cruise liner bound for Antarctica. On that cruise liner, 6 were hospitalized, but 128 tested positive for the virus. This means that the number of asymptomatic infections was roughly twenty times the number of symptomatic infections.

There have been several studies attempting to determine what proportion of infections are asymptomatic, because this knowledge is so vital. Unfortunately the results are wildly inconsistent. They seem to range from 5% asymptomatic and 95% symptomatic to 95% asymptomatic and 5% symptomatic. The figure I find most plausible is about 80%: This means that the number of asymptomatic infected is about four times that of the number of symptomatic infected.

This means that the true calculation we should be doing actually looks like this: (total deaths)/(total deaths + total recoveries + total asymptomatic).

The number of deaths seems to be about one fourth the number of recoveries. But when you add in the asymptomatic cases, roughly four times as many as the symptomatic ones, things don’t look quite so bad. This yields an overall fatality rate of about 4%. This is still very high, and absolutely comparable to the 1918 influenza pandemic.
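Putting the three estimates side by side, here is a small sketch using the approximate figures quoted in this post: the death and confirmed-case counts as of this writing, recoveries of roughly four times deaths, and the assumption that about 80% of infections are asymptomatic. These are illustrative round numbers, not precise data.

```python
# The three fatality-rate estimates discussed above, using the rough
# figures quoted in the text.
deaths = 247_000
confirmed = 3_504_000
recoveries = 4 * deaths        # recoveries roughly four times deaths
asymptomatic_multiplier = 4    # assumed: ~80% of infections asymptomatic

# Naive estimate: deaths / confirmed cases
print(f"Deaths / confirmed:            {deaths / confirmed:.1%}")       # ~7%

# Resolved-case estimate: deaths / (deaths + recoveries)
resolved = deaths + recoveries
print(f"Deaths / resolved cases:       {deaths / resolved:.1%}")        # ~20%

# Adjusting for asymptomatic infections: symptomatic resolved cases
# plus roughly four times as many asymptomatic ones.
total_infected = resolved * (1 + asymptomatic_multiplier)
print(f"Deaths / estimated infections: {deaths / total_infected:.1%}")  # ~4%
```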

But the truth is, we just don’t know. South Korea’s fatality rate was only 0.7%, which would be a really bad flu season but nothing catastrophic. (A typical flu has a fatality rate of about 0.1%.) On the (deaths)/(deaths + recoveries) basis, it looks almost as bad as the Black Death.

With so much uncertainty, there’s really only one option: Prepare for the worst-case scenario. Assume that the real death rate is massive, and implement lockdown measures until you can confirm that it isn’t.

Motivation under trauma

May 3 JDN 2458971

Whenever I ask someone how they are doing lately, I get the same answer: “Pretty good, under the circumstances.” There seems to be a general sense—at least among the sort of people I interact with regularly—that our own lives are still proceeding more or less normally, as we watch in horror the crises surrounding us. Nothing in particular is going wrong for us specifically. Everything is fine, except for the things that are wrong for everyone everywhere.

One thing that seems to be particularly difficult for a lot of us is the sense that we suddenly have so much time on our hands, but can’t find the motivation to actually use this time productively. So many hours of our lives were wasted on commuting or going to meetings or attending various events we didn’t really care much about but didn’t want to feel like we had missed out on. But now that we have these hours back, we can’t find the strength to use them well.

This is because we are now, as an entire society, experiencing a form of trauma. One of the most common long-term effects of post-traumatic stress disorder is a loss of motivation. Faced with suffering we have no power to control, we are made helpless by this traumatic experience; and this makes us learn to feel helpless in other domains.

There is a classic experiment about learned helplessness; like many old classic experiments, its ethics are a bit questionable. Though unlike many such experiments (glares at Zimbardo), its experimental rigor was ironclad. Dogs were divided into three groups. Group 1 was just a control, where the dogs were tied up for a while and then let go. Dogs in groups 2 and 3 were placed into a crate with a floor that could shock them. Dogs in group 2 had a lever they could press to make the shocks stop. Dogs in group 3 did not. (They actually gave the group 2 dogs control over the group 3 dogs to make the shock times exactly equal; but the dogs had no way to know that, so as far as they knew the shocks ended at random.)

Later, dogs from groups 2 and 3 were put into another crate, where they no longer had a lever to press, but they could jump over a barrier to a different part of the crate where the shocks wouldn’t happen. The dogs from group 2, who had previously had some control over their own pain, were able to quickly learn to do this. The dogs from group 3, who had previously felt pain apparently at random, had a very hard time learning this, if they could ever learn it at all. They’d just lie there and suffer the shocks, unable to bring themselves to even try to leap the barrier.

The group 3 dogs just knew there was nothing they could do. During their previous experience of the trauma, all their actions were futile, and so in this new trauma they were certain that their actions would remain futile. When nothing you do matters, the only sensible thing to do is nothing; and so they did. They had learned to be helpless.

I think for me, chronic migraines were my first crate. For years of my life there was basically nothing I could do to prevent myself from getting migraines—honestly the thing that would have helped most would have been to stop getting up for high school that started at 7:40 AM every morning. Eventually I found a good neurologist and got various treatments, as well as learned about various triggers and found ways to avoid most of them. (Let me know if you ever figure out a way to avoid stress.) My migraines are now far less frequent than they were when I was a teenager, though they are still far more frequent than I would prefer.

Yet, I think I still have not fully unlearned the helplessness that migraines taught me. Every time I get another migraine despite all the medications I’ve taken and all the triggers I’ve religiously avoided, this suffering beyond my control acts as another reminder of the ultimate caprice of the universe. There are so many things in our lives that we cannot control that it can be easy to lose sight of what we can.

This pandemic is a trauma that the whole world is now going through. And perhaps that unity of experience will ultimately save us—it will make us see the world and each other a little differently than we did before.

There are a few things you can do to reduce your own risk of getting or spreading the COVID-19 infection, like washing your hands regularly, avoiding social contact, and wearing masks when you go outside. And of course you should do these things. But the truth really is that there is very little any one of us can do to stop this global pandemic. We can watch the numbers tick up almost in real-time—as of this writing, 1 million cases and over 50,000 deaths in the US, 3 million cases and over 200,000 deaths worldwide—but there is very little we can do to change those numbers.

Sometimes we really are helpless. The challenge we face is not to let this genuine helplessness bleed over and make us feel helpless about other aspects of our lives. We are currently sitting in a crate with no lever, where the shocks will begin and end beyond our control. But the day will come when we are delivered to a new crate, and given the chance to leap over a barrier; we must find the strength to take that leap.

For now, I think we can forgive ourselves for getting less done than we might have hoped. We’re still not really out of that first crate.

Is this another Great Depression?

Apr 12 JDN 2458952

In the week from March 15 to March 21, over 3.3 million Americans filed for unemployment. In the following week, this staggering record was broken, when over 6.6 million filed for unemployment. This is an utterly unprecedented number of unemployment filings in a single week; while the data is not as reliable further back, we think this didn’t even happen in the Great Depression.

The Dow Jones Industrial Average is down over 26% in the past quarter. The S&P 500 is down over 23% over the same period. The only comparable stock market crashes are Black Monday and the 1929 market crash.

Does this mean we are on track for another Great Depression? Fortunately, it does not.

This is all happening very fast, because of the rapid shutdowns of businesses during the pandemic. So when we look at short time horizons, things look very scary. But currently unemployment is still only 4.4%, and it is forecasted to rise to about 10% or 11%. This will certainly be a recession—indeed comparable to the Great Recession in 2009—but it will still pale in comparison to the Great Depression, when unemployment hit nearly 25%.

Also, we have a good reason for all this unemployment: We’re making people stay home to stop the spread of the virus. And it seems to be working: California and Washington took some of the most drastic measures, and have shown the fastest reductions in the spread of the virus.

This isn’t a normal recession. We are causing this unemployment on purpose. Paul Krugman makes the analogy to a medically-induced coma: We are shutting some functions down intentionally in order to make it easier to heal.

There is a significant chance, however, that this recession will end up being worse than it’s supposed to be, if our policymakers fail to provide adequate and timely relief to those who become unemployed.

As Donald Marron of the Urban Institute explained quite succinctly in a Twitter thread, there are three types of economic losses we need to consider here: Losses necessary to protect health, losses caused by insufficient demand, and losses caused by lost productive capacity. The first kind of loss is what we are doing on purpose; the other two are losses we should be trying to avoid. Insufficient demand is fairly easy to fix: Hand out cash. But sustaining productive capacity can be trickier.

Given the track record of the Trump administration so far, I am not optimistic. First Trump denied the virus was even a threat. Then he blamed China (which, even if partly true, doesn’t solve anything). Then his response was delayed and inadequate. And now the relief money is taking weeks to get to people—while clearly being less than many people need.

When Trump was first elected, I had several scenarios in my head of what might happen. The best-case scenario was that he’d turn out to be a typical Republican, or be kept on a tight leash by other Republicans. Obviously that didn’t happen. The worst-case scenario was a nuclear war with China; we are all very fortunate that this didn’t happen either. But this is honestly much worse than my median-case scenario, which was that Trump would be like another Reagan or another Nixon. Somehow he turned out to be another Reagan, another Nixon, another Harding, and another Hoover all rolled into one. He somehow combines the worst aspects of every President we’ve ever had, and while facing a historic global crisis his primary concern is his TV ratings.

I can’t tell you how long this is going to last. I can’t tell you just how bad it’s going to get. But I am confident of a few things:

It’ll be worse than it had to be, but not as bad as it could have been. Trump will continue making everything worse, but other, better leaders will make things better. Above all, we’ll make it through this, together.

Do I want to stay in academia?

Apr 5 JDN 2458945

This is a very personal post. You’re not going to learn any new content today; but this is what I needed to write about right now.

I am now nearly finished with my dissertation. It only requires three papers (which, quite honestly, have very little to do with one another). I just got my second paper signed off on, and my third is far enough along that I can probably finish it in a couple of months.

I feel like I ought to be more excited than I am. Mostly what I feel right now is dread.

Yes, some of that dread is the ongoing pandemic—though I am pleased to report that the global number of cases of COVID-19 has substantially undershot the estimates I made last week, suggesting that at least most places are getting the virus under control. The number of cases and number of deaths has about doubled in the past week, which is a lot better than doubling every two days as it was at the start of the pandemic. And that’s all I want to say about COVID-19 today, because I’m sure you’re as tired of the wall-to-wall coverage of it as I am.

But most of the dread is about my own life, mainly my career path. More and more I’m finding that the world of academic research just isn’t working for me. The actual research part I like, and I’m good at it; but then it comes time to publish, and the journal system is so fundamentally broken, so agonizingly capricious, and has such ludicrous power over the careers of young academics that I’m really not sure I want to stay in this line of work. I honestly think I’d prefer they just flip a coin when you graduate and you get a tenure-track job if you get heads. Or maybe journals could roll a 20-sided die for each paper submitted and publish the papers that get 19 or 20. At least then the powers that be couldn’t convince themselves that their totally arbitrary and fundamentally unjust selection process was actually based on deep wisdom and selecting the most qualified individuals.

In any case I’m fairly sure at this point that I won’t have any publications in peer-reviewed journals by the time I graduate. It’s possible I still could—I actually still have decent odds with two co-authored papers, at least—but I certainly do not expect to. My chances of getting into a top journal at this point are basically negligible.

If I weren’t trying to get into academia, that fact would be basically irrelevant. I think most private businesses and government agencies are fairly well aware of the deep defects in the academic publishing system, and really don’t put a whole lot of weight on its conclusions. But in academia, publication is everything. Specifically, publication in top journals.

For this reason, I am now seriously considering leaving academia once I graduate. The more contact I have with the academic publishing system the more miserable I feel. The idea of spending another six or seven years desperately trying to get published in order to satisfy a tenure committee sounds about as appealing right now as having my fingernails pulled out one by one.

This would mean giving up on a lifelong dream. It would mean wondering why I even bothered with the PhD, when the first MA—let alone the second—would probably have been enough for most government or industry careers. And it means trying to fit myself into a new mold that I may find I hate just as much for different reasons: A steady 9-to-5 work schedule is a lot harder to sustain when waking up before 10 AM consistently gives you migraines. (In theory, there are ways to get special accommodations for that sort of thing; in practice, I’m sure most employers would drag their feet as much as possible, because in our culture a phase-delayed circadian rhythm is tantamount to being lazy and therefore worthless.)

Or perhaps I should aim for a lecturer position, perhaps at a smaller college, that isn’t so obsessed with research publication. This would still dull my dream, but would not require abandoning it entirely.

I was asked a few months ago what my dream job is, and I realized: It is almost what I actually have. It is so tantalizingly close to what I am actually headed for that it is painful. The reality is a twisted mirror of the dream.

I want to teach. I want to do research. I want to write. And I get to do those things, yes. But I want to do them without the layers of bureaucracy, without the tiers of arbitrary social status called ‘prestige’, without the hyper-competitive and capricious system of journal publication. Honestly I want to do them without grading or dealing with publishers at all—though I can at least understand why some mechanisms for evaluating student progress and disseminating research are useful, even if our current systems for doing so are fundamentally defective.

It feels as though I have been running a marathon, but was only given a vague notion of the route beforehand. There were a series of flags to follow: This way to the bachelor’s, this way to the master’s, that way to advance to candidacy. Then when I come to the last set of flags, the finish line now visible at the horizon, I see that there is an obstacle course placed in my way, with obstacles I was never warned about, much less trained for. A whole new set of skills, maybe even a whole different personality, is necessary to surpass these new obstacles, and I feel utterly unprepared.

It is as if the last mile of my marathon must be done on horseback, and I’ve never learned to ride a horse—no one ever told me I would need to ride a horse. (Or maybe they did and I didn’t listen?) And now every time I try to mount one, I fall off immediately; and the injuries I sustain seem to be worse every time. The bruises I thought would heal only get worse. The horses I must ride are research journals, and the injuries when I fall are psychological—but no less real, all too real. With each attempt I keep hoping that my fear will fade, but instead it only intensifies.

It’s the same pain, the same fear, that pulled me away from fiction writing. I want to go back, I hope to go back—but I am not strong enough now, and cannot be sure I ever will be. I was told that working in a creative profession meant working hard and producing good output; it turns out it doesn’t mean that at all. A successful career in a creative field actually means satisfying the arbitrary desires of a handful of inscrutable gatekeepers. It means rolling the dice over, and over, and over again, each time a little more painful than the last. And it turns out that this just isn’t something I’m good at. It’s not what I’m cut out for. And maybe it never will be.

An incompetent narcissist would surely fare better than I, willing to re-submit whatever refuse they produce a thousand times because they are certain they deserve to succeed. For, deep down, I never feel that I deserve it. Others tell me I do, and I try to believe them; but the only validation that feels like it will be enough is the kind that comes directly from those gatekeepers, the kind that I can never get. And truth be told, maybe if I do finally get that, it still won’t be enough. Maybe nothing ever will be.

If I knew that it would get easier one day, that the pain would, if not go away, at least retreat to a dull roar I could push aside, then maybe I could stay on this path. But this cannot be the rest of my life. If this is really what it means to have an academic career, maybe I don’t want one after all.

Or maybe it’s not academia that’s broken. Maybe it’s just me.

Fear not to “overreact”

Mar 29 JDN 2458938

It could be given as a story problem in an algebra class, if you didn’t mind terrifying your students:

A virus spreads exponentially, so that the population infected doubles every two days. Currently 10,000 people are infected. How long will it be until 300,000 are infected? Until 10,000,000 are infected? Until 600,000,000 are infected?

The answers:

300,000/10,000 is about 32 = 2^5, so it will take 5 doublings, or 10 days.

10,000,000/10,000 is about 1024=2^10, so it will take 10 doublings, or 20 days.

600,000,000/10,000 is about 64*1024=2^6*2^10, so it will take 16 doublings, or 32 days.

This is the approximate rate at which COVID-19 spreads if uncontrolled.
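For the skeptical reader, here is the same arithmetic as a tiny script, assuming (as in the story problem) a starting point of 10,000 infections and a doubling time of two days:

```python
# Days to reach each threshold if infections double every two days,
# starting from 10,000 (the story-problem assumptions above).
import math

def days_to_reach(target, current=10_000, doubling_time_days=2):
    doublings = math.log2(target / current)
    return doublings * doubling_time_days

for target in (300_000, 10_000_000, 600_000_000):
    print(f"{target:>11,} infected in about {days_to_reach(target):.0f} days")
# ~10, ~20, and ~32 days respectively
```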

Fortunately it is not completely uncontrolled; there were about 10,000 confirmed infections on January 30, and there are now about 300,000 as of March 22. This is about 50 days, so the daily growth rate has averaged about 7%. On the other hand, this is probably a substantial underestimate, because testing remains very poor, particularly here in the US.
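The 7% figure comes from treating that roughly 30-fold increase as compound daily growth over about 50 days; here is a minimal check using the case counts and dates quoted above.

```python
# Implied average daily growth rate of confirmed cases:
# ~10,000 on January 30 to ~300,000 on March 22 (about 52 days).
start_cases, end_cases, days = 10_000, 300_000, 52

daily_growth = (end_cases / start_cases) ** (1 / days) - 1
print(f"Average daily growth: {daily_growth:.1%}")  # ~6.8%, i.e. about 7% per day
```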

Yet the truth is, we don’t know how bad COVID-19 is going to get. Some estimates suggest it may be nearly as bad as the 1918 flu pandemic; others say it may not be much worse than H1N1. Perhaps all this social distancing and quarantine is an overreaction? Perhaps the damage from closing all the schools and restaurants will actually be worse than the damage from the virus itself?

Yes, it’s possible we are overreacting. But we really shouldn’t be too worried about this possibility.

This is because the costs here are highly asymmetric. Overreaction has a moderate, fairly predictable cost. Underreaction could be utterly catastrophic. If we overreact, we waste a quarter or two of productivity, and then everything returns to normal. If we underreact, millions of people die.

This is what it means to err on the side of caution: If we are not 90% sure that we are overreacting, then we should be doing more. We should be fed up with the quarantine procedures and nearly certain that they are not all necessary. That means we are doing the right thing.

Indeed, the really terrifying thing is that we may already have underreacted. These graphs of what will happen under various scenarios really don’t look good:

[Graph: projected outcomes under various scenarios]

But there may still be a chance to react adequately. The advice for most of us seems almost too simple: Stay home. Wash your hands.