The mental health crisis in academia

Apr 30 JDN 2460065

Why are so many academics anxious and depressed?

Depression and anxiety are much more prevalent among both students and faculty than they are in the general population. Unsurprisingly, women seem to have it a bit worse than men, and trans people have it worst of all.

Is this the result of systemic failings of the academic system? Before deciding that, one thing we should consider is that very smart people do seem to have a higher risk of depression.

There is a complex relationship between genes linked to depression and genes linked to intelligence, and some evidence that people of especially high IQ are more prone to depression; nearly 27% of Mensa members report mood disorders, compared to 10% of the general population.

(Incidentally, the stereotype of the weird, sickly nerd has a kernel of truth: the correlations between intelligence and autism, ADHD, allergies, and autoimmune disorders are absolutely real—and not at all well understood. It may be a general pattern of neural hyper-activation, not unlike what I posit in my stochastic overload model. The stereotypical nerd wears glasses, and, yes, indeed, myopia is also correlated with intelligence—and this seems to be mostly driven by genetics.)

Most of these figures are at least a few years old. If anything, things are only worse now, as COVID triggered a surge in depression for just about everyone, academics included. It remains to be seen how much of this large increase will abate as things gradually return to normal, and how much will continue to have long-term effects—this may depend in part on how well we manage to genuinely restore a normal way of life and how well we can deal with long COVID.

If we assume that academics are a similar population to Mensa members (admittedly a strong assumption), then this could potentially explain why 26% of academic faculty are depressed—but not why nearly 40% of junior faculty are. At the very least, we junior faculty are about 50% more likely to be depressed than would be explained by our intelligence alone. And grad students have it even worse: Nearly 40% of graduate students report anxiety or depression, and nearly 50% of PhD students meet the criteria for depression. That pattern suggests a dual effect of being both high in intelligence and low in status—it’s those of us who have very little power or job security in academia who are the most depressed.

This suggests that, yes, there really is something wrong with academia. It may not be entirely the fault of the system—perhaps even a well-designed academic system would result in more depression than the general population because we are genetically predisposed. But it really does seem like there is a substantial environmental contribution that academic institutions bear some responsibility for.

I think the most obvious explanation is constant evaluation: From the time we are students at least up until we (maybe, hopefully, someday) get tenure, academics are constantly being evaluated on our performance. We know that this sort of evaluation contributes to anxiety and depression.

Don’t other jobs evaluate performance? Sure. But not constantly the way that academia does. This is especially obvious as a student, where everything you do is graded; but it largely continues once you are faculty as well.

For most jobs, you are concerned about doing well enough to keep your job or maybe get a raise. But academia has this continuous forward pressure: if you are a grad student or junior faculty, you can’t possibly keep your job; you must either move upward to the next stage or drop out. And academia has become so hyper-competitive that if you want to continue moving upward—and someday get that tenure—you must publish in top-ranked journals, which have utterly opaque criteria and ever-declining acceptance rates. And since there are so few jobs available compared to the number of applicants, good enough is never good enough; you must be exceptional, or you will fail. Two-thirds of PhD graduates seek a career in academia—but only 30% are actually in one three years later. (And honestly, three years is pretty short; there are plenty of cracks left to fall through between that and a genuinely stable tenured faculty position.)

Moreover, our skills are so hyper-specialized that it’s very hard to imagine finding work anywhere else. This grants academic institutions tremendous monopsony power over us, letting them get away with lower pay and worse working conditions. Even with an economics PhD—relatively transferable, all things considered—I find myself wondering who would actually want to hire me outside this ivory tower, and my feeble attempts at actually seeking out such employment have thus far met with no success.

I also find academia painfully isolating. I’m not an especially extraverted person; I tend to score somewhere near the middle range of extraversion (sometimes called an “ambivert”). But I still find myself craving more meaningful contact with my colleagues. We all seem to work in complete isolation from one another, even when sharing the same office (which is awkward for other reasons). There are very few consistent gatherings or good common spaces. And whenever faculty do try to arrange some sort of purely social event, it always seems to involve drinking at a pub and nobody is interested in providing any serious emotional or professional support.

Some of this may be particular to this university, or to the UK; or perhaps it has more to do with being at a certain stage of my career. In any case I didn’t feel nearly so isolated in graduate school; I had other students in my cohort and adjacent cohorts who were going through the same things. But I’ve been here two years now and so far have been unable to establish any similarly supportive relationships with colleagues.

There may be some opportunities I’m not taking advantage of: I’ve skipped a lot of research seminars, and I stopped going to those pub gatherings. But it wasn’t that I didn’t try them at all; it was that I tried them a few times and quickly found that they were not filling that need. At seminars, people only talked about the particular research project being presented. At the pub, people talked about almost nothing of serious significance—and certainly nothing requiring emotional vulnerability. The closest I think I got to this kind of support from colleagues was a series of lunch meetings designed to improve instruction in “tutorials” (what here in the UK we call discussion sections); there, at least, we could commiserate about feeling overworked and dealing with administrative bureaucracy.

There seem to be deep, structural problems with how academia is run. This whole process of universities outsourcing their hiring decisions to the capricious whims of high-ranked journals basically decides the entire course of our careers. And once you get to the point I have, now so disheartened with the process of publishing research that I can’t even engage with it, it’s not at all clear how it’s even possible to recover. I see no way forward, no one to turn to. No one seems to care how well I teach, if I’m not publishing research.

And I’m clearly not the only one who feels this way.

There should be a glut of nurses.

Jan 15 JDN 2459960

It will not be news to most of you that there is a worldwide shortage of healthcare staff, especially nurses and emergency medical technicians (EMTs). I would like you to stop and think about the utterly terrible policy failure this represents. Maybe if enough people do, we can figure out a way to fix it.

It goes without saying—yet bears repeating—that people die when you don’t have enough nurses and EMTs. Indeed, surely a large proportion of the 2.6 million (!) deaths each year from medical errors are attributable to this. It is likely that at least one million lives per year could be saved by fixing this problem worldwide. In the US alone, over 250,000 deaths per year are caused by medical errors; so we’re looking at something like 100,000 lives we could save each year by removing staffing shortages.

Precisely because these jobs have such high stakes, the mere fact that we would ever see the word “shortage” beside “nurse” or “EMT” was already clear evidence of dramatic policy failure.

This is not like other jobs. A shortage of accountants or baristas or even teachers, while a bad thing, is something that market forces can be expected to correct in time, and it wouldn’t be unreasonable to simply let them do so—meaning, let wages rise on their own until the market is restored to equilibrium. A “shortage” of stockbrokers or corporate lawyers would in fact be a boon to our civilization. But a shortage of nurses or EMTs or firefighters (yes, there are those too!) is a disaster.

Partly this is due to the COVID pandemic, which has been longer and more severe than any but the most pessimistic analysts predicted. But there were shortages of nurses before COVID. There should not have been. There should have been a massive glut.

Even if there hadn’t been a shortage of healthcare staff before the pandemic, the fact that there wasn’t a glut was already a problem.

This is what a properly-functioning healthcare policy would look like: Most nurses are bored most of the time. They are widely regarded as overpaid. People go into nursing because it’s a comfortable, easy career with very high pay and usually not very much work. Hospitals spend most of their time with half their beds empty and half of their ambulances parked while the drivers and EMTs sit around drinking coffee and watching football games.

Why? Because healthcare, especially emergency care, involves risk, and the stakes couldn’t be higher. If the number of severely sick people doubles—as in, say, a pandemic—a hospital that usually runs at 98% capacity won’t be able to deal with them. But a hospital that usually runs at 50% capacity will.

COVID exposed to the world what a careful analysis would already have shown: There was not nearly enough redundancy in our healthcare system. We had been optimizing for a narrow-minded, short-sighted notion of “efficiency” over what we really needed, which was resiliency and robustness.

I’d like to compare this to two other types of jobs.

The first is stockbrokers. Set aside for a moment the fact that most of what they do is worthless if not actively detrimental to human society. Suppose that their most adamant boosters are correct and what they do is actually really important and beneficial.

Their experience is almost like what I just said nurses ought to be. They are widely regarded (correctly) as very overpaid. There is never any shortage of them; there are people lining up to be hired. People go into the work not because they care about it or even because they are particularly good at it, but because they know it’s an easy way to make a lot of money.

The one thing that seems to be different from my image may not be as different as it seems. Stockbrokers work long hours, but nobody can really explain why. Frankly, most of what they do can be—and has been—successfully automated. Since there simply isn’t that much work for them to do, my guess is that most of the time they spend “working” 60-80 hour weeks is not actually spent working, but sitting around pretending to work. Since most financial forecasters are outperformed by a simple diversified portfolio, the most profitable action for most stock analysts to take most of the time would be nothing.

It may also be that stockbrokers work hard at sales—trying to convince people to buy and sell for bad reasons in order to earn sales commissions. This would at least explain why they work so many hours, though it would make it even harder to believe that what they do benefits society. So if we imagine our “ideal” stockbroker who makes the world a better place, I think they mostly just use a simple algorithm and maybe adjust it every month or two. They make better returns than their peers, but spend 38 hours a week goofing off.

There is a massive glut of stockbrokers. This is what it looks like when a civilization is really optimized to be good at something.

The second is soldiers. Say what you will about them, no one can dispute that their job has stakes of life and death. A lot of people seem to think that the world would be better off without them, but that’s at best only true if everyone got rid of them; if you don’t have soldiers but other countries do, you’re going to be in big trouble. (“We’ll beat our swords into liverwurst / Down by the East Riverside; / But no one wants to be the first!”) So unless and until we can solve that mother of all coordination problems, we need to have soldiers around.

What is life like for a soldier? Well, they don’t seem overpaid; if anything, underpaid. (Maybe some of the officers are overpaid, but clearly not most of the enlisted personnel. Part of the problem there is that “pay grade” is nearly synonymous with “rank”—it’s a primate hierarchy, not a rational wage structure. Then again, so are most industries; the military just makes it more explicit.) But there do seem to be enough of them. Military officials may lament “shortages” of soldiers, but they never actually seem to want for troops to deploy when they really need them. And if a major war really did start that required all available manpower, the draft could be reinstated and then suddenly they’d have it—the authority to coerce compliance is precisely how you can avoid having a shortage while keeping your workers underpaid. (Russia’s soldier shortage is genuine—something about being utterly outclassed by your enemy’s technological superiority in an obviously pointless imperialistic war seems to hurt your recruiting numbers.)

What is life like for a typical soldier? The answer may surprise you. The overwhelming answer in surveys and interviews (which also fits with the experiences I’ve heard about from friends and family in the military) is that life as a soldier is boring: “All you do is wake up in the morning and push rubbish around camp.” “Bosnia was scary for about 3 months. After that it was boring. That is pretty much day to day life in the military. You are bored.”

This isn’t new, nor even an artifact of not being in any major wars: Union soldiers in the US Civil War had the same complaint. Even in World War I, a typical soldier spent only half the time on the front, and when on the front only saw combat 1/5 of the time. War is boring.

In other words, there is a massive glut of soldiers. Most of them don’t even know what to do with themselves most of the time.

This makes perfect sense. Why? Because an army needs to be resilient. And to be resilient, you must be redundant. If you only had exactly enough soldiers to deploy in a typical engagement, you’d never have enough for a really severe engagement. If on average you had enough, that means you’d spend half the time with too few. And the costs of having too few soldiers are utterly catastrophic.

This is probably an evolutionary outcome, in fact; civilizations may have tried to have “leaner” militaries that didn’t have so much redundancy, and those civilizations were conquered by other civilizations that were more profligate. (This is not to say that we couldn’t afford to cut military spending at all; it’s one thing to have the largest military in the world—I support that, actually—but quite another to have more than the next 10 combined.)

What’s the policy solution here? It’s actually pretty simple.

Pay nurses and EMTs more. A lot more. Whatever it takes to get to the point where we not only have enough, but have so many people lining up to join we don’t even know what to do with them all. If private healthcare firms won’t do it, force them to—or, all the more reason to nationalize healthcare. The stakes are far too high to leave things as they are.

Would this be expensive? Sure.

Removing the shortage of EMTs wouldn’t even be that expensive. There are only about 260,000 EMTs in the US, and they get paid the appallingly low median salary of $36,000. That means we’re currently spending only about $9 billion per year on EMTs. We could double their salaries and double their numbers for only an extra $27 billion—about 0.1% of US GDP.

Nurses would cost more. There are about 5 million nurses in the US, with an average salary of about $78,000, so we’re currently spending about $390 billion a year on nurses. We probably can’t afford to double both salary and staffing. But maybe we could increase both by 20%, costing about an extra $170 billion per year.

Altogether that would cost about $200 billion per year. To save one hundred thousand lives.

That’s $2 million per life saved, or about $40,000 per QALY. The usual estimate for the value of a statistical life is about $10 million, and the usual threshold for a cost-effective medical intervention is $50,000-$100,000 per QALY; so we’re well under both. This isn’t as efficient as buying malaria nets in Africa, but it’s more efficient than plenty of other things we’re spending on. And this isn’t even counting additional benefits of better care that go beyond lives saved.
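
If you want to check that arithmetic, here is a minimal back-of-envelope sketch using the figures quoted above. The only number not in the text is the assumption of roughly 50 QALYs per life saved, which is how you get from $2 million per life to about $40,000 per QALY.

```python
# Back-of-envelope check of the staffing figures above.
emt_count, emt_salary = 260_000, 36_000
current_emt_spend = emt_count * emt_salary            # ~$9.4 billion per year
extra_emt = current_emt_spend * (2 * 2 - 1)           # double pay AND double staff: ~$28 billion extra

nurse_count, nurse_salary = 5_000_000, 78_000
current_nurse_spend = nurse_count * nurse_salary      # ~$390 billion per year
extra_nurse = current_nurse_spend * (1.2 * 1.2 - 1)   # +20% pay and +20% staff: ~$172 billion extra

total_extra = extra_emt + extra_nurse                 # ~$200 billion per year
lives_saved = 100_000
cost_per_life = total_extra / lives_saved             # ~$2 million per life saved
cost_per_qaly = cost_per_life / 50                    # assumption: ~50 QALYs per life saved

print(f"Extra spending:      ${total_extra / 1e9:,.0f} billion per year")
print(f"Cost per life saved: ${cost_per_life / 1e6:,.1f} million")
print(f"Cost per QALY:       ${cost_per_qaly:,.0f}")
```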

In fact if we nationalized US healthcare we could get more than these amounts in savings from not wasting our money on profits for insurance and drug companies—simply making the US healthcare system as cost-effective as Canada’s would save $6,000 per American per year, or a whopping $1.9 trillion. At that point we could double the number of nurses and their salaries and still be spending less.

No, it’s not because nurses and doctors are paid much less in Canada than the US. That’s true in some countries, but not Canada. The median salary for nurses in Canada is about $95,500 CAD, which is $71,000 US at current exchange rates. Doctors in Canada can make anywhere from $80,000 to $400,000 CAD, which is $60,000 to $300,000 US. Nor are healthcare outcomes in Canada worse than the US; if anything, they’re better, as Canadians live an average of four years longer than Americans. No, the radical difference in cost—a factor of 2 to 1—between Canada and the US comes from privatization. Privatization is supposed to make things more efficient and lower costs, but it has absolutely not done that in US healthcare.

And if our choice is between spending more money and letting hundreds of thousands or millions of people die every year, that’s no choice at all.

Working from home is the new normal—sort of

Aug 28 JDN 2459820

Among people with jobs that can be done remotely, a large majority did in fact switch to doing their jobs remotely: By the end of 2020, over 70% of Americans with jobs that could be done remotely were working from home—and most of them said they didn’t want to go back.

This is actually what a lot of employers expected to happen—just not quite like this. In 2014, a third of employers predicted that the majority of their workforce would be working remotely by 2020; given the timeframe there, it required a major shock to make that happen so fast, and yet a major shock was what we had.

Working from home has carried its own challenges, but overall productivity seems to be higher working remotely (that meeting really could have been an email!). This may explain why output per work hour rose rapidly in 2020 and then fell in 2022.

The COVID pandemic now isn’t so much over as becoming permanent; COVID is now being treated as an endemic infection like influenza that we don’t expect to be able to eradicate in the foreseeable future.

And likewise, remote work seems to be here to stay—sort of.

First of all, we don’t seem to be giving up office work entirely. As of the first quarter of 2022, almost as many firms have partially remote work as have fully remote work, and this seems to be trending upward. A lot of firms seem to be transitioning into a “hybrid” model where employees show up to work two or three days a week. This seems to be preferred by large majorities of both workers and firms.

There is a significant downside of this: It means that the hope that remote working might finally ease the upward pressure on housing prices in major cities is largely a false one. If we were transitioning to a fully remote system, then people could live wherever they want (or can afford) and there would be no reason to move to overpriced city centers. But if you have to show up to work even one day a week, that means you need to live close enough to the office to manage that commute.

Likewise, if workers never came to the office, you could sell the office building and convert it into more housing. But if they show up even once in a while, you need a physical place for them to go. Some firms may shrink their office space (indeed, many have—and unlike this New York Times journalist, I have a really hard time feeling bad for landlords of office buildings); but they aren’t giving it up entirely. It’s possible that firms could start trading off—you get the building on Mondays, we get it on Tuesdays—but so far this seems to be rare, and it does raise a lot of legitimate logistical and security concerns. So our global problem of office buildings that are empty, wasted space most of the time is going to get worse, not better. Manhattan will still empty out every night; it just won’t fill up as much during the day. This is honestly a major drain on our entire civilization—building and maintaining all those structures that are only used for at most a third of the day, five days out of seven, and soon even less—and we really should stop ignoring it. No wonder our real estate is so expensive, when half of it is only used 20% of the time!

Moreover, not everyone gets to work remotely. Your job must be something that can be done remotely—something that involves dealing with information, not physical objects. That includes a wide and ever-growing range of jobs, from artists and authors to engineers and software developers—but it doesn’t include everyone. It basically means what we call “white-collar” work.

Indeed, it is largely limited to the upper-middle class. The rich never really worked anyway, though sometimes they pretend to, convincing themselves that managing a stock portfolio (that would actually grow faster if they let it sit) constitutes “work”. And the working class? By and large, they didn’t get the chance to work remotely. While 73% of workers with salaries above $200,000 worked remotely in 2020, only 12% of workers with salaries under $25,000 did, and there is a smooth trend across the board: the more money you make, the more likely you were to be able to work remotely.

This will only intensify the divide between white-collar and blue-collar workers. They already think we don’t do “real work”; now we don’t even go to work. And while blue-collar workers are constantly complaining about contempt from white-collar elites, I think the shoe is really on the other foot. I have met very few white-collar workers who express contempt for blue-collar workers—and I have met very few blue-collar workers who don’t express anger and resentment toward white-collar workers. I keep hearing blue-collar people say that we think that they are worthless and incompetent, when they are literally the only ones ever saying that. I can’t stop saying things that I never said in the first place.

The rich and powerful may look down on them, but they look down on everyone. (Maybe they look down on blue-collar workers more? I’m not even sure about that.) I think politicians sometimes express contempt for blue-collar workers, but I don’t think this reflects what most white-collar workers feel.

And the highly-educated may express some vague sense of pity or disappointment in people who didn’t get college degrees, and sometimes even anger (especially when they do things like vote for Donald Trump), but the really vitriolic hatred is clearly in the opposite direction (indeed, I have no better explanation for how otherwise-sane people could vote for Donald Trump). And I certainly wouldn’t say that everyone needs a college degree (though I became tempted to, when so many people without college degrees voted for Donald Trump).

This really isn’t us treating them with contempt: This is them having a really severe inferiority complex. And as information technology (that white-collar work created) gives us—but not them—the privilege of staying home, that is only going to get worse.

It’s not their fault: Our culture of meritocracy puts a little bit of inferiority complex in all of us. It tells us that success and failure are our own doing, and so billionaires deserve to have everything and the poor deserve to have nothing. And blue-collar workers have absolutely internalized these attitudes: Most of them believe that poor people choose to stay on welfare forever rather than get jobs (when welfare has time limits and work requirements, so this is simply not an option—and you would know this from the Wikipedia page on TANF).

I think that what they experience as “contempt by white-collar elites” is really the pain of living in an illusory meritocracy. They were told—and they came to believe—that working hard would bring success, and they have worked very hard, and watched other people be much more successful. They assume that the rich and powerful are white-collar workers, when really they are non-workers; they are people the world was handed to on a silver platter. (What, you think George W. Bush earned his admission to Yale?)

And thus, we can shout until we are blue in the face that plumbers, bricklayers and welders are the backbone of civilization—and they are, and I absolutely mean that; our civilization would, in an almost literal sense, collapse without them—but it won’t make any difference. They’ll still feel the pain of living in a society that gave them very little and tells them that people get what they deserve.

I don’t know what to say to such people, though. When your political attitudes are based on beliefs that are objectively false, that you could know are objectively false if you simply bothered to look them up… what exactly am I supposed to say to you? How can we have a useful political conversation when half the country doesn’t even believe in fact-checking?

Honestly I wish someone had explained to them that even the most ideal meritocratic capitalism wouldn’t reward hard work. Work is a cost, not a benefit, and the whole point of technological advancement is to allow us to accomplish more with less work. The ideal capitalism would reward talent—you would succeed by accomplishing things, regardless of how much effort you put into them. People would be rich mainly because they are brilliant, not because they are hard-working. The closest thing we have to ideal capitalism right now is probably professional sports. And no amount of effort could ever possibly make me into Steph Curry.

If that isn’t the world we want to live in, so be it; let’s do something else. I did nothing to earn either my high IQ or my chronic migraines, so it really does feel unfair that the former increases my income while the latter decreases it. But the labor theory of value has always been wrong; taking more sweat or more hours to do the same thing is worse, not better. The dignity of labor consists in its accomplishment, not its effort. Sisyphus is not happy, because his work is pointless.

Honestly at this point I think our best bet is just to replace all blue-collar work with automation, thus rendering it all moot. And then maybe we can all work remotely, just pushing code patches to the robots that do everything. (And no doubt this will prove my “contempt”: I want to replace you! No, I want to replace the grueling work that you have been forced to do to make a living. I want you—the human being—to be able to do something more fun with your life, even if that’s just watching television and hanging out with friends.)

Welp, I have COVID.

May 1 JDN 2459701

Tuesday night I had a fever. Wednesday morning, I tested positive.

Given how the pandemic has been going, I suppose it was more or less inevitable that this day would come. From almost the beginning, the refrain was “flatten the curve”, not “wait for a cure”. It was expected that almost all of us would get the virus eventually, and just a question of how long we could draw that out. In my case, apparently two years. For that whole time I had been scrupulous about precautions, but I did not sustain all of them all of the time, and indeed as Scotland loosened restrictions I think I became too complacent.

The good news is that I am young and reasonably healthy (migraines notwithstanding), and I had three doses of the Moderna vaccine. As a result my symptoms are relatively mild; I feel like I have a bad cold or perhaps a mild flu. Aside from the fever, which I’ve been able to keep down with NSAIDs, pretty much all my symptoms are in my sinuses. So far, I haven’t even lost my sense of taste.

It hasn’t even really interfered with my work, because my migraines were already doing a bang-up job of that. (My accent remains consistently “American broadcast standard”, but as you can see, I am gradually picking up some Britishisms, such as “bang-up job” and “sorted” with no “out”, as well as learning to put the “u” in “labour” and “behaviour”. I doubt I’ll ever start saying “aye” and “nae” though.) I am in fact even less productive than I was without COVID, but the marginal difference is relatively small. The main activity it has kept me from doing is moving and unpacking boxes (now that our shipment from California has finally arrived).

So, all things considered, if I was going to get infected with a pandemic and potentially life-threatening virus, it could have been a lot worse.

Keynesian economics: It works, bitches

Jan 23 JDN 2459613

(I couldn’t resist; for the uninitiated, my slightly off-color title is referencing this XKCD comic.)

When faced with a bad recession, Keynesian economics prescribes the following response: Expand the money supply. Cut interest rates. Increase government spending, but decrease taxes. The bigger the recession, the more we should do all these things—especially increasing spending, because interest rates will often get pushed to zero, creating what’s called a liquidity trap.

Take a look at these two FRED graphs, both since the 1950s.
The first is interest rates (specifically the Fed funds effective rate):

The second is the US federal deficit as a proportion of GDP:

Interest rates were pushed to zero right after the 2008 recession, and didn’t start coming back up until 2016. Then as soon as we hit the COVID recession, they were dropped back to zero.

The deficit looks even more remarkable. At the 2009 trough of the recession, the deficit was large, nearly 10% of GDP; but then it was quickly reduced back to normal, to between 2% and 4% of GDP. And that initial surge is as much explained by GDP and tax receipts falling as by spending increasing.

Yet in 2020 we saw something quite different: The deficit became huge. Literally off the chart, nearly 15% of GDP. A staggering $2.8 trillion. We’ve not had a deficit that large as a proportion of GDP since WW2. We’ve never had a deficit that large in real (inflation-adjusted) dollars.

Deficit hawks came out of the woodwork to complain about this, and for once I was worried they might actually be right. Their most credible complaint was that it would trigger inflation, and they weren’t wrong about that: Inflation became a serious concern for the first time in decades.

But these recessions were very large, and when you actually run the numbers, this deficit was the correct magnitude for what Keynesian models tell us to do. I wouldn’t have thought our government had the will and courage to actually do it, but I am very glad to have been wrong about that, for one very simple reason:

It worked.

In 2009, we didn’t actually fix the recession. We blunted it; we stopped it from getting worse. But we never really restored GDP; we just let it return to its normal growth rate after it had plummeted, until it eventually caught back up to where we had been.

2021 went completely differently. With a much larger deficit, we fixed this recession. We didn’t just stop the fall; we reversed it. We aren’t just back to normal growth rates—we are back to the same level of GDP, as if the recession had never happened.

This contrast is quite obvious from the graph of US GDP:

In 2008 and 2009, GDP slumps downward, and then just… resumes its previous trend. It’s like we didn’t do anything to fix the recession, and just allowed the overall strong growth of our economy to carry us through.

The pattern in 2020 is completely different. GDP plummets downward—much further, much faster than in the Great Recession. But then it immediately surges back upward. By the end of 2021, it was above its pre-recession level, and looks to be back on its growth trend. With a recession this deep, if we’d just waited like we did last time, it would have taken four or five years to reach this point—we actually did it in less than one.

I wrote earlier about how this is a weird recession, one that actually seems to fit Real Business Cycle theory. Well, it was weird in another way as well: We fixed it. We actually had the courage to do what Keynes told us to do in 1936, and it worked exactly as it was supposed to.

Indeed, to go from unemployment of almost 15% in April of 2020 to under 4% in December of 2021 is fast enough that I feel like I’m getting whiplash. We have never seen unemployment drop that fast. Krugman is fond of comparing this to “morning in America”, but that’s really an understatement. Pitch black one moment, shining bright the next: this isn’t a sunrise, it’s pulling open a blackout curtain.

And all of this while the pandemic is still going on! The omicron variant has brought case numbers to their highest levels ever, though fortunately death rates so far are still below last year’s peak.

I’m not sure I have the words to express what a staggering achievement of economic policy it is to so rapidly and totally repair the economic damage caused by a pandemic while that pandemic is still happening. It’s the equivalent of repairing an airplane that is not only still in flight, but still taking anti-aircraft fire.

Why, it seems that Keynes fellow may have been onto something, eh?

Reversals in progress against poverty

Jan 16 JDN 2459606

I don’t need to tell you that the COVID pandemic has been very bad for the world. Yet perhaps the worst outcome of the pandemic is one that most people don’t recognize: It has reversed years of progress against global poverty.

Estimates of the number of people who will be thrown into extreme poverty as a result of the pandemic are consistently around 100 million, though some forecasts have predicted this will rise to 150 million, or, in the most pessimistic scenarios, even as high as 500 million.

Pre-COVID projections showed the global poverty rate falling steadily from 8.4% in 2019 to 6.3% by 2030. But COVID resulted in the first upward surge in global poverty in decades, and updated models now suggest that the global poverty rate in 2030 will be as high as 7.0%. That difference is 0.7% of a forecasted population of 8.5 billion—so that’s a difference of 59 million people.
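
For anyone who wants to check that figure, the arithmetic is just the gap in projected poverty rates times the projected population:

```python
# Gap between post-COVID (7.0%) and pre-COVID (6.3%) projected 2030 poverty rates,
# applied to a projected 2030 population of 8.5 billion.
print(8_500_000_000 * (0.070 - 0.063))  # ~59.5 million people
```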

This is a terrible reversal of fortune, and a global tragedy. Tens or perhaps even hundreds of millions of people will suffer the pain of poverty because of this global pandemic and the numerous missteps by many of the world’s governments—not least the United States—in response to it.

Yet it’s important to keep in mind that this is a short-term reversal in a long-term trend toward reduced poverty. Yes, the most optimistic predictions are turning out to be wrong—but the general pattern of dramatic reductions in global poverty over the late 20th and early 21st century are still holding up.

That post-COVID estimate of a global poverty rate of 7.0% needs to be compared against the fact that as recently as 1980 the global poverty rate at the same income threshold (adjusted for inflation and purchasing power, of course) was a whopping 44%.

This pattern makes me feel deeply ambivalent about the effects of globalization on inequality. While it now seems clear that globalization has exacerbated inequality within First World countries—and triggered a terrible backlash of right-wing populism as a result—it also seems clear that globalization was a major reason for the dramatic reductions in global poverty in the past few decades.

I think the best answer I’ve been able to come up with is that globalization is overall a good thing, and we must continue it—but we also need to be much more mindful of its costs, and we must make policy that mitigates those costs. Expanded trade has winners and losers, and we should be taxing the winners to compensate the losers. To make good economic policy, it simply isn’t enough to increase aggregate GDP; you actually have to make life better for everyone (or at least as many people as you can).

Unfortunately, knowing what policies to make is only half the battle. We must actually implement those policies, which means winning elections, which means restoring the public’s faith in the authority of economic experts.

Some of the people voting for Donald Trump were just what Hillary Clinton correctly (if tone-deafly) referred to as “deplorables”: racists, misogynists, xenophobes. But I think that many others weren’t voting for Trump but against Clinton; they weren’t embracing far-right populism but rather rejecting center-left technocratic globalization. They were tired of being told what to do by experts who didn’t seem to care about them or their interests.

And the thing is, they were right about that. Not about voting for Trump—that’s unforgivable—but about the fact that expert elites had been ignoring their interests and needed a wake-up call. There were a hundred better ways of making that wake-up call that didn’t involve putting a narcissistic, incompetent maniac in charge of the world’s largest economy, military and nuclear arsenal, and millions of people should be ashamed of themselves for not taking those better options. Yet the fact remains: The wake-up call was necessary, and we should be responding to it.

We expert elites (I think I can officially carry that card, now that I have a PhD and a faculty position at a leading research university) need to do a much better job of two things: First, articulating the case for our policy recommendations in a way that ordinary people can understand, so that they feel justified and not simply rammed down people’s throats; and second, recognizing the costs and downsides of these policies and taking action to mitigate them whenever possible.

For instance: Yes, we need to destroy all the coal jobs. They are killing workers and the planet. Coal companies need to be transitioned to new industries or else shut down. This is not optional. It must be done. But we also need to explain to those coal miners why it’s necessary to move on from coal to solar and nuclear, and we need to be implementing various policies to help those workers move on to better, safer jobs that pay as well and don’t involve filling their lungs with soot and the atmosphere with carbon dioxide. We need to articulate, emphasize—and loudly repeat—that this isn’t about hurting coal miners to help everyone else, but about helping everyone, coal miners included, and that if anyone gets hurt it will only be a handful of psychopathic billionaires who already have more money than any human being could possibly need or deserve.

Another example: We cannot stop trading with India and China. Hundreds of millions of innocent people would suddenly be thrown out of work and into poverty if we did. We need the products they make for us, and they need the money we pay for those products. But we must also acknowledge that trading with poor countries does put downward pressure on wages back home, and take action to help First World workers who are now forced to compete with global labor markets. Maybe this takes the form of better unemployment benefits, or job-matching programs, or government-sponsored job training. But we cannot simply shrug and let people lose their jobs and their homes because the factories they worked in were moved to China.

Reasons for optimism in 2022

Jan 2 JDN 2459582

When this post goes live, we will have begun the year 2022.

That still sounds futuristic, somehow. We’ve been in the 21st century long enough that most of my students were born in it and nearly all of them are old enough to drink (to be fair, it’s the UK, so “old enough to drink” only means 18). Yet “the year 2022” still seems like it belongs in science fiction, and not on our wall calendars.

2020 and 2021 were quite bad years. Death rates and poverty rates surged around the world. Almost all of that was directly or indirectly due to COVID.

Yet there are two things we should keep in perspective.

First, those death rates and poverty rates surged to what we used to consider normal 50 years ago. These are not uniquely bad times; indeed, they are still better than most of human history.

Second, there are many reasons to think that 2022—or perhaps a bit later than that, 2025 or 2030—will be better.

The Omicron variant is highly contagious, but so far does not appear to be as deadly as previous variants. COVID seems to be evolving to be more like influenza: Catching it will be virtually inevitable, but dying from it will be very rare.

Things are also looking quite good on the climate change front: Renewable energy production is growing at breathtaking speed and is now cheaper than almost every other form of energy. It’s awful that we panicked and locked down nuclear energy for the last 50 years, but at this point we may no longer need it: Solar and wind are just that good now.

Battery technology is also rapidly improving, giving us denser, cheaper, more stable batteries that may soon allow us to solve the intermittency problem: the wind may not always blow and the sun may not always shine, but if you have big enough batteries you don’t need them to. (You can get a really good feel for how much difference good batteries make in energy production by playing Factorio, or, more whimsically, Mewnbase.)

If we do go back to nuclear energy, it may not be fission anymore, but fusion. Now that we have nearly reached that vital milestone of break-even, investment in fusion technology has rapidly increased.


Fusion has basically all of the benefits of fission with none of the drawbacks. Unlike renewables, it can produce enormous amounts of energy in a way that can be easily scaled and controlled independently of weather conditions. Unlike fission, it requires no exotic nuclear fuels (deuterium can be readily attained from water), and produces no long-lived radioactive waste. (Indeed, development is ongoing of methods that could use fusion products to reduce the waste from fission reactors, making the effective rate of nuclear waste production for fusion negative.) Like both renewables and fission, it produces no carbon emissions other than those required to build the facility (mainly due to concrete).

Of course, technology is only half the problem: we still need substantial policy changes to get carbon emissions down. We’ve already dragged our feet for decades too long, and we will pay the price for that. But anyone saying that climate change is an inevitable catastrophe hasn’t been paying attention to recent developments in solar panels.

Technological development in general seems to be speeding up lately, after having stalled quite a bit in the early 2000s. Moore’s Law may be leveling off, but the technological frontier may simply be moving away from digital computing power and onto other things, such as biotechnology.

Star Trek told us that we’d have prototype warp drives by the 2060s but we wouldn’t have bionic implants to cure blindness until the 2300s. They seem to have gotten it backwards: We may never have warp drive, but we’ve got those bionic implants today.

Neural interfaces are allowing paralyzed people to move, speak, and now even write.

After decades of failed promises, gene therapy is finally becoming useful in treating real human diseases. CRISPR changes everything.

We are also entering a new era of space travel, thanks largely to SpaceX and their remarkable reusable rockets. The payload cost to LEO is a standard measure of the cost of space travel, which describes the cost of carrying a certain mass of cargo up to low Earth orbit. By this measure, costs have declined from nearly $20,000 per kg to only $1,500 per kg since the 1960s. Elon Musk claims that he can reduce the cost to as low as $10 per kg. I’m skeptical, to say the least—but even dropping it to $500 or $200 would be a dramatic improvement and open up many new options for space exploration and even colonization.

To put this in perspective, the cost of carrying a human being to the International Space Station (about 100 kg to LEO) has fallen from $2 million to $150,000. A further decrease to $200 per kg would lower that to $20,000, opening the possibility of space tourism; $20,000 might be something even upper-middle-class people could do as a once-in-a-lifetime vacation. If Musk is really right that he can drop it all the way to $10 per kg, the cost to carry a person to the ISS would be only $1000—something middle-class people could do regularly. (“Should we do Paris for our anniversary this year, or the ISS?”) Indeed, a cost that low would open the possibility of space-based shipping—for when you absolutely must have the product delivered from China to California in the next 2 hours.
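
Here is that same arithmetic as a tiny sketch, so you can plug in your own guess about where launch prices eventually land; the 100 kg passenger mass is the round-number assumption used above.

```python
# Cost of lifting one passenger (~100 kg) to low Earth orbit at various launch prices per kg.
passenger_kg = 100
for price_per_kg in (20_000, 1_500, 200, 10):
    print(f"${price_per_kg:>6,}/kg  ->  ${passenger_kg * price_per_kg:>9,} per person")
```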

Another way to put this in perspective is to convert these prices per mass in terms of those of commodities, such as precious metals. $20,000 per kg is nearly the price of solid platinum. $500 per kg is about the price of sterling silver. $10 per kg is roughly the price of copper.

The reasons for optimism are not purely technological. There has also been significant social progress just in the last few years, with major milestones on LGBT rights being made around the world in 2020 and 2021. Same-sex marriage is now legally recognized over nearly the entire Western Hemisphere.

None of that changes the fact that we are still in a global pandemic which seems to be increasingly out of control. I can’t tell you whether 2022 will be better than 2021, or just more of the same—or perhaps even worse.

But while these times are hard, overall the world is still making progress.

A very Omicron Christmas

Dec 26 JDN 2459575

Remember back in spring of 2020 when we thought that this pandemic would quickly get under control and life would go back to normal? How naive we were.

The newest Omicron strain seems to be the most infectious yet—even people who are fully vaccinated are catching it. The good news is that it also seems to be less deadly than most of the earlier strains. COVID is evolving to spread itself better, but not be as harmful to us—much as influenza and cold viruses evolved. While weekly cases are near an all-time peak, weekly deaths are well below the worst they had been.

Indeed, at this point, it’s looking like COVID will more or less be with us forever. In the most likely scenario, the virus will continue to evolve to be more infectious but less lethal, and then we will end up with another influenza on our hands: A virus that can’t be eradicated, gets huge numbers of people sick, but only kills a relatively small number. At some point we will decide that the risk of getting sick is low enough that it isn’t worth forcing people to work remotely or maybe even wear masks. And we’ll relax various restrictions and get back to normal with this new virus a regular part of our lives.


Merry Christmas?

But it’s not all bad news. The vaccination campaign has been staggeringly successful—now the total number of vaccine doses exceeds the world population, so the average human being has been vaccinated for COVID at least once.

And while 5.3 million deaths due to the virus over the last two years sounds terrible, it should be compared against the roughly 115 million deaths from all causes that would have occurred during that same interval anyway at pre-pandemic rates, and the fact that worldwide death rates have been rapidly declining. Had COVID not happened, 2021 would be like 2019, which had nearly the lowest death rate on record, at 7,579 deaths per million people per year. As it is, we’re looking at something more like 10,000 deaths per million people per year (1%), or roughly what we considered normal way back in the long-ago times of… the 1980s. To get even as bad as things were in the 1950s, we would have to double our current death rate.

Indeed, there’s something quite remarkable about the death rate we had in 2019, before the pandemic hit: 7,579 per million is only 0.76%. A being with a constant annual death rate of 0.76% would have a life expectancy of over 130 years. This very low death rate is partly due to demographics: The current world population is unusually young and healthy because the world recently went through huge surges in population growth. Due to demographic changes the UN forecasts that our death rate will start to climb again as fertility falls and the average age increases; but they are still predicting it will stabilize at about 11,200 per million per year, which would be a life expectancy of 90. And that estimate could well be too pessimistic, if medical technology continues advancing at anything like its current rate.
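
The life-expectancy conversions here are just the reciprocal of the death rate, which is what you get if you idealize the death rate as constant over a lifetime (an exponential-survival assumption, not a real demographic projection). A quick sketch:

```python
# Life expectancy implied by a constant annual death rate (exponential-survival idealization).
def implied_life_expectancy(deaths_per_million_per_year: float) -> float:
    return 1_000_000 / deaths_per_million_per_year

print(implied_life_expectancy(7_579))   # ~132 years, from the 2019 world death rate
print(implied_life_expectancy(11_200))  # ~89 years, from the UN's projected stabilized rate
```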

We call it Christmas, but it’s really a syncretized amalgamation of holidays: Yule, Saturnalia, various Solstice celebrations. (Indeed, there’s no particular reason to think Jesus was even born in December.) Most Northern-hemisphere civilizations have some sort of Solstice holiday, and we’ve greedily co-opted traditions from most of them. The common theme really seems to be this:

Now it is dark, but band together and have hope, for the light shall return.

Diurnal beings in northerly latitudes instinctively fear the winter, when it becomes dark and cold and life becomes more hazardous—but we have learned to overcome this fear together, and we remind ourselves that light and warmth will return by ritual celebrations.

The last two years have made those celebrations particularly difficult, as we have needed to isolate ourselves in order to keep ourselves and others safe. Humans are fundamentally social at a level most people—even most scientists—do not seem to grasp: We need contact with other human beings as deeply and vitally as we need food or sleep.

The Internet has allowed us to get some level of social contact while isolated, which has been a tremendous boon; but I think many of us underestimated how much we would miss real face-to-face contact. I think much of the vague sense of malaise we’ve all been feeling even when we aren’t sick and even when we’ve largely adapted our daily routine to working remotely comes from this: We just aren’t getting the chance to see people in person nearly as often as we want—as often as we hadn’t even realized we needed.

So, if you do travel to visit family this holiday season, I understand your need to do so. But be careful. Get vaccinated—three times, if you can. Don’t have any contact with others who are at high risk if you do have any reason to think you’re infected.

Let’s hope next Christmas is better.

Unending nightmares

Sep 19 JDN 2459477

We are living in a time of unending nightmares.

As I write this, we have just passed the 20th anniversary of 9/11. Yet only in the past month were US troops finally withdrawn from Afghanistan—and that withdrawal was immediately followed by a total collapse of the Afghan government and a reinstatement of the Taliban. The United States had been at war for nearly 20 years, spending trillions of dollars and causing thousands of deaths—and seems to have accomplished precisely nothing.

(Some left-wing circles have been saying that the Taliban offered surrender all the way back in 2001; this is not accurate. Alternet even refers to it as an “unconditional surrender,” which is utter nonsense. No one in their right mind—not even the most die-hard imperialist—would ever refuse an unconditional surrender, and the US most certainly did nothing of the sort.)

The Taliban did offer a peace deal in 2001, which would have involved giving the US control of Kandahar and turning Osama bin Laden over to a neutral country (not to the US or any US ally). It would also have granted amnesty to a number of high-level Taliban leaders, which was a major sticking point for the US. In hindsight, should the US have taken the deal? Obviously. But I don’t think that was nearly so clear at the time—nor would it have been particularly palatable to most of the American public to leave Osama bin Laden under house arrest in some neutral country (which they never specified by the way; somewhere without US extradition, presumably?) and grant amnesty to the top leaders of the Taliban.

Thus, even after the 20-year nightmare of the war that refused to end, we are still back to the nightmare we were in before—Afghanistan ruled by fanatics who will oppress millions.

Yet somehow this isn’t even the worst unending nightmare, for after a year and a half we are still in the throes of a global pandemic which has now caused over 4.6 million deaths. We are still wearing masks wherever we go—at least, those of us who are complying with the rules. We have gotten vaccinated already, but likely will need booster shots—at least, those of us who believe in vaccines.

The most disturbing part of it all is how many people still aren’t willing to follow the most basic demands of public health agencies.

In case you thought this was just an American phenomenon: Just a few days ago I looked out the window of my apartment to see a protest in front of the Scottish Parliament complaining about vaccine and mask mandates, with signs declaring it all a hoax. (Yes, my current temporary apartment overlooks the Scottish Parliament.)

Some of those signs displayed a perplexing innumeracy. One sign claimed that the vaccines must be stopped because they had killed 1,400 people in the UK. This is not actually true; while there have been 1,400 people in the UK who died after receiving a vaccine, 48 million people in the UK have gotten the vaccine, and many of them were old and/or sick, so, purely by statistics, we’d expect some of them to die shortly afterward. Less than 100 of these deaths are in any way attributable to the vaccine. But suppose for a moment that we took the figure at face value, and assumed, quite implausibly, that everyone who died shortly after getting the vaccine was in fact killed by the vaccine. This 1,400 figure needs to be compared against the 156,000 UK deaths attributable to COVID itself. Since 7 million people in the UK have tested positive for the virus, this is a fatality rate of over 2%. Even if we suppose that literally everyone in the UK who hasn’t been vaccinated in fact had the virus, that would still only be 20 million (the UK population of 68 million – the 48 million vaccinated) people, so the death rate for COVID itself would still be at least 0.8%—a staggeringly high fatality rate for a pandemic airborne virus. Meanwhile, even on this ridiculous overestimate of the deaths caused by the vaccine, the fatality rate for vaccination would be at most 0.003%. Thus, even by the anti-vaxers’ own claims, the vaccine is nearly 300 times safer than catching the virus. If we use the official estimates of a 1.9% COVID fatality rate and 100 deaths caused by the vaccines, the vaccines are in fact over 9000 times safer.
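
Because that paragraph stacks up a lot of percentages, here is the whole comparison laid out as a short sketch, using exactly the figures quoted above:

```python
# UK figures quoted above (as of the time of writing).
uk_population        = 68_000_000
vaccinated           = 48_000_000
deaths_after_vaccine = 1_400      # the anti-vax claim: every death shortly after vaccination
deaths_attributable  = 100        # rough official estimate of deaths actually caused by the vaccines
covid_deaths         = 156_000
covid_positive       = 7_000_000

cfr_confirmed     = covid_deaths / covid_positive                # ~2.2% among confirmed cases
cfr_lower_bound   = covid_deaths / (uk_population - vaccinated)  # ~0.8%, even if every unvaccinated Briton was infected
vax_risk_claimed  = deaths_after_vaccine / vaccinated            # ~0.003%
vax_risk_official = deaths_attributable / vaccinated             # ~0.0002%

print(cfr_lower_bound / vax_risk_claimed)   # ~270: the virus is nearly 300x deadlier, even on the anti-vax numbers
print(0.019 / vax_risk_official)            # ~9,100: over 9,000x safer, using the official estimates
```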

Yet it does seem to be worse in the United States, as while 22% of Americans described themselves as opposed to vaccination in general, only about 2% of Britons said the same.

But this did not translate to such a large difference in actual vaccination: While 70% of people in the UK have received the vaccine, 64% of people in the US have. Both of these figures are tantalizingly close to, yet clearly below, the at least 84% necessary to achieve herd immunity. (Actually some early estimates thought 60-70% might be enough—but epidemiologists no longer believe this, and some think that even 90% wouldn’t be enough.)
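
For reference, those thresholds come from the standard relation in the simplest epidemic models: the fraction of the population that needs to be immune is 1 − 1/R₀, where R₀ is the basic reproduction number. The R₀ values in the sketch below are illustrative round numbers, chosen only to show how the threshold moves with R₀:

```python
# Herd-immunity threshold under the simplest (homogeneous-mixing) epidemic models: 1 - 1/R0.
def herd_immunity_threshold(r0: float) -> float:
    return 1 - 1 / r0

for r0 in (2.5, 3.0, 6.0, 10.0):  # illustrative values only
    print(f"R0 = {r0:>4}: threshold = {herd_immunity_threshold(r0):.0%}")
```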

Indeed, the predominant tone I get from trying to keep up on the current news in epidemiology is fatalism: It’s too late, we’ve already failed to contain the virus, we won’t reach herd immunity, we won’t ever eradicate it. At this point they now all seem to think that COVID is going to become the new influenza, always with us, a major cause of death that somehow recedes into the background and seems normal to us—but COVID, unlike influenza, may stick around all year long. The one glimmer of hope is that influenza itself was severely hampered by the anti-pandemic procedures, and influenza cases and deaths are indeed down in both the US and UK (though not zero, nor as drastically reduced as many have reported).

The contrast between terrorism and pandemics is a sobering one, as pandemics kill far more people, yet somehow don’t provoke anywhere near as committed a response.

9/11 was a massive outlier in terrorism, at 3,000 deaths on a single day; otherwise the average annual death toll from terrorism is about 20,000 worldwide, mostly committed by Islamist groups. Yet the threat is not actually to Americans in particular; annual deaths due to terrorism in the US number fewer than 100—and most of these are by right-wing domestic terrorists, not international Islamists.

Meanwhile, in an ordinary year, influenza would kill 50,000 Americans and somewhere between 300,000 and 700,000 people worldwide. COVID in the past year and a half has killed over 650,000 Americans and 4.6 million people worldwide—annualize that and it comes to over 400,000 per year in the US and about 3 million per year worldwide.

Yet in response to terrorism we as a country were prepared to spend $2.3 trillion, lose nearly 4,000 US and allied troops, and kill nearly 50,000 civilians—not even counting the over 60,000 enemy soldiers killed. It’s not even clear that this accomplished anything as far as reducing terrorism—by some estimates it actually made it worse.

Were we prepared to respond so aggressively to pandemics? Certainly not to influenza; we somehow treat all those deaths as normal or inevitable. In response to COVID we did spend a great deal of money, even more than the wars in fact—a total of nearly $6 trillion. This was a very pleasant surprise to me (it’s the first time in my lifetime I’ve witnessed a serious, not watered-down Keynesian fiscal stimulus in the United States). And we imposed lockdowns—but these were all too quickly removed, despite the pleading of public health officials. It seems that our governments tried to impose an aggressive response, but then too many of the citizens pushed back against it, unwilling to give up their “freedom” (read: convenience) in the name of public safety.

For the wars, all most of us had to do was pay some taxes and sit back and watch; but for the pandemic we were actually expected to stay home, wear masks, and get shots? Forget it.

Politics was clearly a very big factor here: In the US, the COVID death rate map and the 2020 election map look nearly identical. By and large, people who voted for Biden have been wearing masks and getting vaccinated, while people who voted for Trump have not.

But pandemic response is precisely the sort of thing you can’t do halfway. If one area is containing a virus and another isn’t, the virus will still remain uncontained. (As some have remarked, it’s rather like having a “peeing section” of a swimming pool. Much worse, actually, as urine contains relatively few bacteria—but not zero—and is quickly diluted by the huge quantities of water in a swimming pool.)

Indeed, that seems to be what has happened, and why we can’t seem to return to normal life despite months of isolation. Since enough people are refusing to make any effort to contain the virus, the virus remains uncontained, and the only way to protect ourselves from it is to continue keeping restrictions in place indefinitely.

Had we simply kept the original lockdowns in place awhile longer and then made sure everyone got the vaccine—preferably by paying them for doing it, rather than punishing them for not—we might have been able to actually contain the virus and then bring things back to normal.

But as it is, this is what I think is going to happen: At some point, we’re just going to give up. We’ll see that the virus isn’t getting any more contained than it ever was, and we’ll be so tired of living in isolation that we’ll finally just give up on doing it anymore and take our chances. Some of us will continue to get our annual vaccines, but some won’t. Some of us will continue to wear masks, but most won’t. The virus will become a part of our lives, just as influenza did, and we’ll convince ourselves that millions of deaths is no big deal.

And then the nightmare will truly never end.

A new chapter in my life, hopefully

Jan 17 JDN 2459232

My birthday is coming up soon, and each year around this time I try to step back and reflect on how the previous year has gone and what I can expect from the next one.

Needless to say, 2020 was not a great year for me. The pandemic and its consequences made this quite a bad year for almost everyone. Months of isolation and fear have made us all stressed and miserable, and even with the vaccines coming out the end is still all too far away. Honestly I think I was luckier than most: My work could be almost entirely done remotely, and my income is a fixed stipend, so financially I faced no hardship at all. But isolation still wreaks its toll.

Most of my energy this past year has been spent on the job market. I applied to over 70 different job postings, and from those I received 6 interviews, all but one of which I’ve already finished. If they like how I did in those interviews, I’ll be invited to the next phase, which in normal times would be a flyout where candidates visit the campus; due to COVID it’s all being done remotely now. And then, finally, I may actually get some job offers. Statistically I think I will probably get some kind of offer at this point, but I can’t be sure—and that uncertainty is quite nerve-wracking. I may get a job and move somewhere new, or I may not and have to stay here for another year and try again. Both outcomes are still quite probable, and I really can’t plan on either one.

If I do actually get a job, this will open a new chapter in my life—and perhaps I will finally be able to settle down with a permanent career, buy a house, start a family. One downside of graduate school I hadn’t really anticipated is how it delays adulthood: You don’t really feel like you are a proper adult, because you are still in the role of a student for several additional years. I am all too ready to be done with being a student. I feel as though I’ve spent all my life preparing to do things instead of actually doing them, and I am now so very tired of preparing.

I don’t even know for sure what I want to do—I feel disillusioned with academia, I haven’t been able to snare any opportunities in government or nonprofits, and I need more financial security than I could get if I leapt headlong into full-time writing. But I am quite certain that I want to actually do something, and no longer simply be trained and prepared (and continually evaluated on that training and preparation).

I’m even reluctant to do a postdoc, because that also likely means packing up and moving again in a few years (though I would prefer it to remaining here another year).

I have to keep reminding myself that all of this is temporary: The pandemic will eventually be quelled by vaccines, and quarantine procedures will end, and life for most of us will return to normal. Even if I don’t get a job I like this year, I probably will next year; and then I can finally tie off my education with a bow and move on. Even if the first job isn’t permanent, eventually one will be, and at last I’ll be able to settle into a stable adult life.

Much of this has already dragged on longer than I thought it would. Not the job market, which has gone more or less as expected. (More accurately, my level of optimism has jumped up and down like a roller coaster, and on average what I thought would happen has been something like what actually happened so far.) But the pandemic certainly has; the early attempts at lockdown were ineffective, the virus kept spreading worse and worse, and now there are more COVID cases in the US than ever before. Southern California in particular has been hit especially hard, and hospitals here are now overwhelmed just as we feared they might be.

Even the removal of Trump has been far more arduous than I expected. First there was the slow counting of ballots because so many people had (wisely) voted absentee. Then there were the frivolous challenges to the counts—and yes, I mean frivolous in a legal sense, as 61 out of 62 lawsuits were thrown out immediately, and the one that made it through concerned only a minor technical issue.

And then there was an event so extreme I can barely fathom that it actually happened: An armed mob stormed the Capitol building, forced Congress to evacuate, and made it inside with minimal resistance from the police. The stark difference between how the police reacted to this attempted insurrection and how they have responded to Black Lives Matter protests underscores the message of Black Lives Matter better than the protests themselves ever could.

In one sense it feels like so much has happened: We have borne witness to historic events in real time. But in another sense it feels like so little has happened: Staying home all the time under lockdown has meant that the days are always much the same, and each day blends into the next. I feel somehow unmoored from time, at once marveling that a year has passed already, and marveling that so much happened in only a year.

I should soon hear back from these job interviews and have a better idea what the next chapter of my life will be. But I know for sure that I’ll be relieved once this one is over.