I’m old enough to be President now.

Jan 22 JDN 2459967

When this post goes live, I will have passed my 35th birthday. This is old enough to be President of the United States, at least by law. (In practice, no POTUS has been younger than 42.)

Not that I will ever be President. I have neither the wealth nor the charisma to run any kind of national political campaign. I might be able to get elected to some kind of local office at some point, like a school board or a city water authority. But I’ve been eligible to run for such offices for quite a while now, and haven’t done so; nor do I feel particularly inclined at the moment.

No, the reason this birthday feels so significant is the milestone it represents. By this age, most people have spouses, children, careers. I have a spouse. I don’t have kids. I sort of have a career.

I have a job, certainly. I work for relatively decent pay. Not excellent, not what I was hoping for with a PhD in economics, but enough to live on (anywhere but an overpriced coastal metropolis). But I can’t really call that job a career, because I find large portions of it unbearable and I have absolutely no job security. In fact, I have the exact opposite: My job came with an explicit termination date from the start. (Do the people who come up with these short-term postdoc positions understand how that feels? It doesn’t seem like they do.)

I missed the window to apply for academic jobs that start next year. If I were happy here, this would be fine; I still have another year left on my contract. But I’m not happy here, and that is a grievous understatement. Working here is clearly the most important situational factor contributing to my ongoing depression. So I really ought to be applying to every alternative opportunity I can find—but I can’t find the will to try it, or the self-confidence to believe that my attempts could succeed if I did.

Then again, I’m not sure I should be applying to academic positions at all. If I did apply to academic positions, they’d probably be teaching-focused ones, since that’s the one part of my job I’m actually any good at. I’ve more or less written off applying to major research institutions; I don’t think I would get hired anyway, and even if I did, the pressure to publish is so unbearable that I think I’d be just as miserable there as I am here.

On the other hand, I can’t be sure that I would be so miserable even at another research institution; maybe with better mentoring and better administration I could be happy and successful in academic research after all.

The truth is, I really don’t know how much of my misery is due to academia in general, versus the British academic system, versus Edinburgh as an institution, versus starting work during the pandemic, versus the experience of being untenured faculty, versus simply my own particular situation. I don’t know if working at another school would be dramatically better, a little better, or just the same. (If it were somehow worse—which frankly seems hard to arrange—I would literally just quit immediately.)

I guess if the University of Michigan offered me an assistant professor job right now, I would take it. But I’m confident enough that they wouldn’t offer it to me that I can’t see the point in applying. (Besides, I missed the application windows this year.) And I’m not even sure that I would be happy there, despite the fact that just a few years ago I would have called it a dream job.

That’s really what I feel most acutely about turning 35: The shattering of dreams.

I thought I had some idea of how my life would go. I thought I knew what I wanted. I thought I knew what would make me happy.

The weirdest part is that it isn’t even that different from how I’d imagined it. If you’d asked me 10 or even 20 years ago what my career would be like at 35, I probably would have correctly predicted that I would have a PhD and be working at a major research university. 10 years ago I would have correctly expected it to be a PhD in economics; 20 years ago, I probably would have guessed physics. In both cases I probably would have thought I’d be tenured by now, or at least on the tenure track. But a postdoc or adjunct position (this is sort of both?) wouldn’t have been utterly shocking, just vaguely disappointing.

The biggest error by my past self was thinking that I’d be happy and successful in this career, instead of barely, desperately hanging on. I thought I’d have published multiple successful papers by now, and be excited to work on a new one. I imagined I’d also have published a book or two. (The fact that I self-published a nonfiction book at 16 but haven’t published any nonfiction ever since would be particularly baffling to my 15-year-old self, and is particularly depressing to me now.) I imagined myself becoming gradually recognized as an authority in my field, not languishing in obscurity; I imagined myself feeling successful and satisfied, not hopeless and depressed.

It’s like the dark Mirror Universe version of my dream job. It’s so close to what I thought I wanted, but it’s also all wrong. I finally get to touch my dreams, and they shatter in my hands.

When you are young, birthdays are a sincere cause for celebration; you look forward to the new opportunities the future will bring you. I seem to be now at the age where it no longer feels that way.

Charity shouldn’t end at home

It so happens that this week’s post will go live on Christmas Day. I always try to do some kind of holiday-themed post around this time, because not only Christmas but a dozen other holidays from various religions fall near the winter solstice, which seems to be a very popular time for holidays and has been since antiquity: The Romans were celebrating Saturnalia 2000 years ago. Most of our ‘Christmas’ traditions are actually derived from Yuletide.

These holidays certainly mean many different things to different people, but charity and generosity are themes that are very common across a lot of them. Gift-giving has been part of the season since at least Saturnalia and remains as vital as ever today. Most of those gifts are given to our friends and loved ones, but a substantial fraction of people also give to strangers in the form of charitable donations: November and December have the highest rates of donation to charity in the US and the UK, with about 35-40% of people donating during this season. (Of course this is complicated by the fact that December 31 is often the day with the most donations, probably from people trying to finish out their tax year with a larger deduction.)

My goal today is to make you one of those donors. There is a common saying, often attributed to the Bible but not actually present in it: “Charity begins at home”.

Perhaps this is so. There’s certainly something questionable about the Effective Altruism strategy of “earning to give” if it involves abusing and exploiting the people around you in order to make more money that you then donate to worthy causes. Certainly we should be kind and compassionate to those around us, and it makes sense for us to prioritize those close to us over strangers we have never met. But while charity may begin at home, it must not end at home.

There are so many global problems that could benefit from additional donations. While global poverty has been rapidly declining in the early 21st century, this is largely because of the efforts of donors and nonprofit organizations. Official Development Assistance has been roughly constant since the 1970s at 0.3% of GNI among First World countries—well below international targets set decades ago. Total development aid is around $160 billion per year, while private donations from the United States alone are over $480 billion. Moreover, 9% of the world’s population still lives in extreme poverty, and this rate has actually slightly increased in the last few years due to COVID.

There are plenty of other worthy causes you could give to aside from poverty eradication, from issues that have been with us since the dawn of human civilization (Humane Society International for domestic animal welfare, the World Wildlife Fund for wildlife conservation) to exotic fat-tail sci-fi risks that are only emerging in our own lifetimes (the Machine Intelligence Research Institute for AI safety, the International Federation of Biosafety Associations for biosecurity, the Union of Concerned Scientists for climate change and nuclear safety). You could fight poverty directly through organizations like UNICEF or GiveDirectly, fight neglected diseases through the Schistosomiasis Control Initiative or the Against Malaria Foundation, or entrust an organization like GiveWell to optimize your donations for you, sending them where they think they are needed most. You could give to political causes supporting civil liberties (the American Civil Liberties Union) or protecting the rights of people of color (the National Association for the Advancement of Colored People) or LGBT people (the Human Rights Campaign).

I could spend a lot of time and effort trying to figure out the optimal way to divide up your donations and give them to causes such as these—and then convincing you that it’s really the right one. (And there is even a time and place for that, because seemingly-small differences can matter a lot in this.) But instead I think I’m just going to ask you to pick something. Give something to an international charity with a good track record.

I think we worry far too much about what is the best way to give—especially people in the Effective Altruism community, of which I’m sort of a marginal member—when the biggest thing the world really needs right now is just more people giving more. It’s true, there are lots of worthless or even counter-productive charities out there: Please, please do not give to the Salvation Army. (And think twice before donating to your own church; if you want to support your own community, okay, go ahead. But if you want to make the world better, there are much better places to put your money.)

But above all, give something. Or if you already give, give more. Most people don’t give at all, and most people who give don’t give enough.

Is the cure for inflation worse than the disease?

Nov 13 JDN 2459897

A lot of people seem really upset about inflation. I’ve previously discussed why this is a bit weird; inflation really just isn’t that bad. In fact, I am increasingly concerned that the usual methods for fixing inflation are considerably worse than inflation itself.

To be clear, I’m not talking about hyperinflation: If you are getting triple-digit inflation or more, you are clearly printing too much money and you need to stop. And there are places in the world where this happens.

But what about just regular, ordinary inflation, even when it’s fairly high? Prices rising at 8% or 9% or even 11% per year? What catastrophe befalls our society when this happens?

Okay, sure, if we could snap our fingers and make prices all stable without cost, that would be worth doing. But we can’t. All of our mechanisms for reducing inflation come with costs—and often very high costs.

The chief mechanism by which inflation is currently controlled is open-market operations by central banks such as the Federal Reserve, the Bank of England, and the European Central Bank. These central banks try to reduce inflation by selling bonds, which lowers the price of bonds and reduces capital available to banks, and thereby increases interest rates. This also effectively removes money from the economy, as banks are using that money to buy bonds instead of lending it out. (It is chiefly in this odd indirect sense that the central bank manages the “money supply”.)
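To make that mechanism concrete, here is a minimal sketch of the inverse relationship between bond prices and interest rates, using a one-year zero-coupon bond; the specific prices are my own illustrative assumptions, not actual market figures:

```python
# Toy example: the yield implied by buying a one-year zero-coupon bond.
# Real open-market operations involve many maturities and instruments;
# the prices below are illustrative only.

def implied_yield(face_value: float, price: float) -> float:
    """One-year interest rate implied by buying the bond at `price`."""
    return face_value / price - 1.0

FACE = 100.0
print(f"price 98 -> yield {implied_yield(FACE, 98.0):.2%}")  # ~2.04%
print(f"price 95 -> yield {implied_yield(FACE, 95.0):.2%}")  # ~5.26%

# When the central bank sells bonds, the added supply pushes prices down,
# so the implied yield (the interest rate) mechanically rises.
```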

But how does this actually reduce inflation? It’s remarkably indirect. The higher interest rates deter people from buying houses and companies from hiring workers; that reduces economic growth—or even triggers a recession—which is then supposed to bring down prices. There’s actually a lot we still don’t know about how this works or how long it should be expected to take. What we do know is that the pain hits quickly and the benefits arise only months or even years later.

As Krugman has rightfully pointed out, the worst pain of the 1970s was not the double-digit inflation; it was the recessions that Paul Volcker’s economic policy triggered in response to that inflation. The inflation wasn’t exactly a good thing; but for most people, the cure was much worse than the disease.

Most laypeople seem to think that prices somehow go up without wages going up, but that simply isn’t how it works. Prices and wages rise at close to the same rate in most countries most of the time. In fact, inflation is often driven chiefly by rising wages rather than the other way around. There are often lags between when the inflation hits and when people see their wages rise; but these lags can actually be in either direction—inflation first or wages first—and for moderate amounts of inflation they are clearly less harmful than the high rates of unemployment that we would get if we fought inflation more aggressively with monetary policy.

Economists are also notoriously vague about exactly how they expect the central bank to reduce inflation. They use complex jargon or broad euphemisms. But when they do actually come out and say they want to reduce wages, it tends to outrage people. Well, that’s one of three main ways that interest rates actually reduce inflation: They reduce wages, they cause unemployment, or they stop people from buying houses. That’s pretty much all that central banks can do.

There may be other ways to reduce inflation, like windfall profits taxes, antitrust action, or even price controls. The first two are basically no-brainers; we should always be taxing windfall profits (if the profits really are due to a windfall outside the corporation’s control, taxing them creates no incentive distortion), and we should absolutely be increasing antitrust action (why did we reduce it in the first place?). Price controls are riskier—they really do create shortages—but then again, is that really worse than lower wages or unemployment? Because the usual strategy involves lower wages and unemployment.

It’s a little ironic: The people who are usually all about laissez-faire are the ones who panic about inflation and want the government to take drastic action; meanwhile, I’m usually in favor of government intervention, but when it comes to moderate inflation, I think maybe we should just let it be.

Now is the time for CTCR

Nov 6 JDN 2459890

We live in a terrifying time. As Ukraine gains ground in its war with Russia, thanks in part to the deployment of high-tech weapons from NATO, Vladimir Putin has begun to make thinly-veiled threats of deploying his nuclear arsenal in response. No one can be sure how serious he is about this. Most analysts believe that he was referring to the possible use of small-scale tactical nuclear weapons, not a full-scale apocalyptic assault. Many think he’s just bluffing and wouldn’t resort to any nukes at all. Putin has bluffed in the past, and could be doing so again. Honestly, “this is not a bluff” is exactly the sort of thing you say when you’re bluffing—people who aren’t bluffing have better ways of showing it. (It’s like whenever Trump would say “Trust me”, and you’d know immediately that this was an especially good time not to. Of course, any time is a good time not to trust Trump.)

(By the way, financial news is a really weird thing: I actually found this article discussing how a nuclear strike would be disastrous for the economy. Dude, if there’s a nuclear strike, we’ve got much bigger things to worry about than the economy. It reminds me of this XKCD.)

But if Russia did launch nuclear weapons, and NATO responded with its own, it could trigger a nuclear war that would kill millions in a matter of hours. So we need to be prepared, and think very carefully about the best way to respond.

The current debate seems to be over whether to use economic sanctions, conventional military retaliation, or our own nuclear weapons. Well, we already have economic sanctions, and they aren’t making Russia back down. (Though they probably are hurting its war effort, so I’m all for keeping them in place.) And if we were to use our own nuclear weapons, that would only further undermine the global taboo against nuclear weapons and could quite possibly trigger that catastrophic nuclear war. Right now, NATO seems to be going for a bluff of our own: We’ll threaten an overwhelming nuclear response, but then we obviously won’t actually carry it out because that would be murder-suicide on a global scale.

That leaves conventional military retaliation. What sort of retaliation? Several years ago I came up with a very specific method of conventional retaliation I call credible targeted conventional response (CTCR, which you can pronounce “cut-core”). I believe that now would be an excellent time to carry it out.

The basic principle of CTCR is really quite simple: Don’t try to threaten entire nations. A nation is an abstract entity. Threaten people. Decisions are made by people. The response to Vladimir Putin launching nuclear weapons shouldn’t be to kill millions of innocent people in Russia who probably mean even less to Putin than they do to us. It should be to kill Vladimir Putin.

How exactly to carry this out is a matter for military strategists to decide. There are a variety of weapons at our disposal, ranging from the prosaic (covert agents) to the exotic (precision strikes from high-altitude stealth drones). Indeed, I think we should leave it purposefully vague, so that Putin can’t try to defend himself against some particular mode of attack. The whole gamut of conventional military responses should be considered on the table, from a single missile strike to a full-scale invasion.

But the basic goal is quite simple: Launching a nuclear weapon is one of the worst possible war crimes, and it must be met with an absolute commitment to bring the perpetrator to justice. We should be willing to accept some collateral damage, even a lot of collateral damage; carpet-bombing a city shouldn’t be considered out of the question. (If that sounds extreme, consider that we’ve done it before for much weaker reasons.) The only thing that we should absolutely refuse to do is deploy nuclear weapons ourselves.

The great advantage of this strategy—even aside from being obviously more humane than nuclear retaliation—is that it is more credible. It sounds more like something we’d actually be willing to do. And in fact we likely could even get help from insiders in Russia, because there are surely many people in the Russian government who aren’t so loyal to Putin that they’d want him to get away with mass murder. It might not just be an assassination; it might end up turning into a coup. (Also something we’ve done for far weaker reasons.)


This is how we preserve the taboo on nuclear weapons: We refuse to use them, but otherwise stop at nothing to kill anyone who does use them.

I therefore call upon the world to make this threat:

Launch a nuclear weapon, Vladimir Putin, and we will kill you. Not your armies, not your generals—you. It could be a Tomahawk missile at the Kremlin. It could be a car bomb in your limousine, or a Stinger missile at Aircraft One. It could be a sniper at one of your speeches. Or perhaps we’ll poison your drink with polonium, like you do to your enemies. You won’t know when or where. You will live the rest of your short and miserable life in terror. There will be nowhere for you to hide. We will stop at nothing. We will deploy every available resource around the world, and it will be our top priority. And you will die.

That’s how you threaten a psychopath. And it’s what we must do in order to keep the world safe from nuclear war.

The United Kingdom in transition

Oct 30 JDN 2459883

When I first decided to move to Edinburgh, I certainly did not expect it to be such a historic time. The pandemic was already in full swing, but I thought that would be the extent of it. But this year I was living in the UK when its leadership changed in two historic ways:

First, there was the death of Queen Elizabeth II, and the accession of King Charles III.

Second, there was the resignation of Boris Johnson, the appointment of Elizabeth Truss, and then, so rapidly I feel like I have whiplash, the resignation of Elizabeth Truss.

In other words, I have seen the end of the longest-reigning monarch and the rise and fall of the shortest-reigning prime minister in the history of the United Kingdom. The three-hundred-year history of the United Kingdom.

The prior probability of such a 300-year-historic event happening during my own 3-year term in the UK is approximately 1% (about 3 years out of 300). Yet, here we are. A new king, one of a handful of genuine First World monarchs to be crowned in the 21st century. The others are in the Netherlands, Belgium, Spain, Monaco, Andorra, and Luxembourg; none of these have even a third the population of the UK, and if we include every Commonwealth Realm (believe it or not, “realm” is in fact still the official term), Charles III is now king of a supranational union with a population of over 150 million people—half the size of the United States. (Yes, he’s your king too, Canada!) Note that Charles III is not king of the entire Commonwealth of Nations, which includes now-independent nations such as India, Pakistan, and South Africa; that successor to the British Empire contains 54 nations and has a population of over 2 billion.

I still can’t quite wrap my mind around this idea of having a king. It feels even more ancient and anachronistic than the 400-year-old university I work at. Of course I knew that we had a queen before, and that she was old and would presumably die at some point and probably be replaced; but that wasn’t really salient information to me until she actually did die and then there was a ten-mile-long queue to see her body and now next spring they will be swearing in this new guy as the monarch of the fourteen realms. It now feels like I’m living in one of those gritty satirical fractured fairy tales. Maybe it’s an urban fantasy setting; it feels a lot like Shrek, to be honest.

Yet other than feeling surreal, none of this has affected my life all that much. I haven’t even really felt the effects of inflation: Groceries and restaurant meals seem a bit more expensive than they were when we arrived, but it’s well within what our budget can absorb; we don’t have a car here, so we don’t care about petrol prices; and we haven’t even been paying more than usual in natural gas because of the subsidy programs. Actually it’s probably been good for our household finances that the pound is so weak and the dollar is so strong. I have been much more directly affected by the university union strikes: being temporary contract junior faculty (read: expendable), I am ineligible to strike and hence had to cross a picket line at one point.

Perhaps this is what history has always felt like for most people: The kings and queens come and go, but life doesn’t really change. But I honestly felt more directly affected by Trump living in the US than I did by Truss living in the UK.

This may be in part because Elizabeth Truss was a very unusual politician; she combined crazy far-right economic policy with generally fairly progressive liberal social policy. A right-wing libertarian, one might say. (As Krugman notes, such people are astonishingly rare in the electorate.) Her socially-liberal stance meant that she wasn’t trying to implement horrific hateful policies against racial minorities or LGBT people the way that Trump was, and for once her horrible economic policies were recognized immediately as such and quickly rescinded. Unlike Trump, Truss did not get the chance to appoint any supreme court justices who could go on to repeal abortion rights.

Then again, Truss couldn’t have appointed any judges if she’d wanted to. The UK Supreme Court is really complicated, and I honestly don’t understand how it works; but from what I do understand, the Prime Minister appoints the Lord Chancellor, the Lord Chancellor forms a commission to appoint the President of the Supreme Court, and the President of the Supreme Court forms a commission to appoint new Supreme Court judges. But I think the monarch is considered the ultimate authority and can veto any appointment along the way. (Or something. Sometimes I get the impression that no one truly understands the UK system, and they just sort of go with doing things as they’ve always been done.) This convoluted arrangement seems to grant the court considerably more political independence than its American counterpart; also, unlike the US Supreme Court, the UK Supreme Court is not allowed to explicitly overturn primary legislation. (Fun fact: The Lord Chancellor is also the Keeper of the Great Seal of the Realm, because Great Britain hasn’t quite figured out that the 13th century ended yet.)

It’s sad and ironic that it was precisely by not being bigoted and racist that Truss ensured she would not have sufficient public support for her absurd economic policies. There’s a large segment of the population of both the US and UK—aptly, if ill-advisedly, referred to by Clinton as “deplorables”—who will accept any terrible policy as long as it hurts the right people. But Truss failed to appeal to that crucial demographic, and so could find no one to support her. Hence, her approval rating fell to a dismal 10%, and she was outlasted by a head of lettuce.

At the time of writing, the new prime minister has not yet been announced, but the smart money is on Rishi Sunak. (I mean that quite literally; he’s leading in prediction markets.) He’s also socially liberal but fiscally conservative, but unlike Truss he seems to have at least some vague understanding of how economics works. Sunak is also popular in a way Truss never was (though that popularity has been declining recently). So I think we can expect to get new policies which are in the same general direction as what Truss wanted—lower taxes on the rich, more privatization, less spent on social services—but at least Sunak is likely to do so in a way that makes the math(s?) actually add up.

All of this is unfortunate, but largely par for the course for the last few decades. It compares quite favorably to the situation in the US, where somehow a large chunk of Americans either don’t believe that an insurrection attempt occurred, are fine with it, or blame the other side, and as the guardrails of democracy continue breaking, somehow gasoline prices appear to be one of the most important issues in the midterm election.

You know what? Living through history sucks. I don’t want to live in “interesting times” anymore.

The era of the eurodollar is upon us

Oct 16 JDN 2459869

I happen to be one of those weirdos who liked the game Cyberpunk 2077. It was hardly flawless, and had many unforced errors (like letting you choose your gender, but not making voice type independent from pronouns? That has to be, like, three lines of code to make your game significantly more inclusive). But overall I thought it did a good job of representing a compelling cyberpunk world that is dystopian but not totally hopeless, and had rich, compelling characters, along with reasonably good gameplay. The high level of character customization sets a new standard (aforementioned errors notwithstanding), and I for one appreciate how they pushed the envelope for sexuality in a AAA game.

It’s still not explicit—though I’m sure there are mods for that—but at least you can in fact get naked, and people talk about sex in a realistic way. It’s still weird to me that showing a bare breast or a penis is seen as ‘adult’ in the same way as showing someone’s head blown off (Remind me: Which of the three will nearly everyone have seen from the time they were a baby? Which will at least 50% of children see from birth, guaranteed, and virtually 100% of adults sooner or later? Which can you see on Venus de Milo and David?), but it’s at least some progress in our society toward a healthier relationship with sex.

A few things about the game’s world still struck me as odd, though. Chiefly it has to be the weird alternate history where apparently we have experimental AI and mind-uploading in the 2020s, but… those things are still experimental in the 2070s? So our technological progress was through the roof for the early 2000s, and then just completely plateaued? They should have had Johnny Silverhand’s story take place in something like 2050, not 2023. (You could leave essentially everything else unchanged! V could still have grown up hearing tales of Silverhand’s legendary exploits, because 2050 was 27 years ago in 2077; canonically, V is 28 years old when the game begins. Honestly it makes more sense in other ways: Rogue looks like she’s in her 60s, not her 80s.)

Another weird thing is the currency they use: They call it the “eurodollar”, and the symbol is, as you might expect, €$. When the game first came out, that seemed especially ridiculous, since euros were clearly worth more than dollars and had been for most of the two currencies’ shared history.

Well, they aren’t anymore. In fact, euros and dollars are now trading almost exactly at parity, and have been for weeks. CD Projekt Red was right: In the 2020s, the era of the eurodollar is upon us after all.

Of course, we’re unlikely to actually merge the two currencies any time soon. (Can you imagine how Republicans would react if such a thing were proposed?) But the weird thing is that we could! It’s almost as if the two currencies are interchangeable—for the first time in history.

It isn’t so much that the euro is weak; it’s that the dollar is strong. When I first moved to the UK, the pound was trading at about $1.40. It is now trading at $1.10! If it continues dropping as it has, it could even reach parity as well! We might have, for the first time in history, the dollar, the pound, and the euro functioning as one currency. Get the Canadian dollar too (currently much too weak), and we’ll have the Atlantic Union dollar I use in some of my science fiction (I imagine the AU as an expansion of NATO into an economic union that gradually becomes its own government).

Then again, the pound is especially weak right now because it plunged after the new prime minister announced an utterly idiotic economic plan. (Conservatives refusing to do basic math and promising that tax cuts would fix everything? Why, it felt like being home again! In all the worst ways.)

This is largely a bad thing. A strong dollar means that the US trade deficit will increase, and also that other countries will have trouble buying our exports. Conversely, with their stronger dollars, Americans will buy more imports from other countries. The combination of these two effects will make inflation worse in other countries (though it could reduce it in the US).

It’s not so bad for me personally, as my husband’s income is largely in dollars while our expenses are in pounds. (My income is in pounds and thus unaffected.) So a strong dollar and a weak pound means our real household income is about £4,000 more than it would otherwise have been—which is not a small difference!
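As a rough sketch of that arithmetic (the $20,000 of dollar-denominated income is a hypothetical round number chosen to roughly reproduce the £4,000 figure, not our actual finances):

```python
# Sterling value of the same dollar income at two exchange rates.
dollar_income = 20_000.0  # hypothetical, for illustration
old_rate = 1.40  # dollars per pound when we moved to the UK
new_rate = 1.10  # dollars per pound at the time of writing

gain = dollar_income / new_rate - dollar_income / old_rate
print(f"Extra sterling income: £{gain:,.0f}")  # ~£3,896 per year
```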

In general, the level of currency exchange rates isn’t very important. It’s changes in exchange rates that matter. The changes in relative prices will shift around a lot of economic activity, causing friction both in the US and in its (many) trading partners. Eventually all those changes should result in the exchange rates converging to a new, stable equilibrium; but that can take a long time, and exchange rates can fluctuate remarkably fast. In the meantime, such large shifts in exchange rates are going to cause even more chaos in a world already shaken by the COVID pandemic and the war in Ukraine.

On (gay) marriage

Oct 9 JDN 2459862

This post goes live on my first wedding anniversary. Thus, as you read this, I will have been married for one full year.

Honestly, being married hasn’t felt that different to me. This is likely because we’d been dating since 2012 and lived together for several years before actually getting married. It has made some official paperwork more convenient, and I’ve reached the point where I feel naked without my wedding band; but for the most part our lives have not really changed.

And perhaps this is as it should be. Perhaps the best way to really know that you should get married is to already feel as though you are married, and just finally get around to making it official. Perhaps people for whom getting married is a momentous change in their lives (as opposed to simply a formal announcement followed by a celebration) are people who really shouldn’t be getting married just yet.

A lot of things in my life—my health, my career—have not gone very well in this past year. But my marriage has been only a source of stability and happiness. I wouldn’t say we never have conflict, but quite honestly I was expecting a lot more challenges and conflicts from the way I’d heard other people talk about marriage in the past. All of my friends who have kids seem to be going through a lot of struggles as a result of that (which is one of several reasons we keep procrastinating on looking into adoption), but marriage itself does not appear to be any more difficult than friendship—in fact, maybe easier.

I have found myself oddly struck by how un-important it has been that my marriage is to a same-sex partner. I keep expecting people to care—to seem uncomfortable, to be resistant, or simply to be surprised—and it so rarely happens.

I think this is probably generational: We Millennials grew up at the precise point in history when the First World suddenly decided, all at once, that gay marriage was okay.

Seriously, look at this graph. I made it by combining this article (using data from the General Social Survey) with this article from Pew:

Until around 1990—when I was 2 years old—support for same-sex marriage was stable and extremely low: About 10% of Americans supported it (presumably most of them LGBT!), and over 70% opposed it. Then, quite suddenly, attitudes began changing, and by 2019, over 60% of Americans supported it and only 31% opposed it.

That is, within a generation, we went from a country where almost no one supported gay marriage to a country where same-sex marriage is so popular that any major candidate who opposed it would almost certainly lose a general election. (They might be able to survive a Republican primary, as Republican support for same-sex marriage is only about 44%—about where it was among Democrats in the early 2000s.)

This is a staggering rate of social change. If development economics is the study of what happened in South Korea from 1950-2000, I think political science should be the study of what happened to attitudes on same-sex marriage in the US from 1990-2020.

And of course it isn’t just the US. Similar patterns can be found across Western Europe, with astonishingly rapid shifts from near-universal opposition to near-universal support within a generation.

I don’t think I have been able to fully emotionally internalize this shift. I grew up in a world where homophobia was mainstream, where only the most radical left-wing candidates were serious about supporting equal rights and representation for LGBT people. And suddenly I find myself in a world where we are actually accepted and respected as equals, and I keep waiting for the other shoe to drop. Aren’t you the same people who told me as a teenager that I was a sexual deviant who deserved to burn in Hell? But now you’re attending my wedding? And offering me joint life insurance policies? My own extended family members treat me differently now than they did when I was a teenager, and I don’t quite know how to trust that the new way is the true way and not some kind of facade that could rapidly disappear.

I think this sort of generational trauma may never fully heal, in which case it will be the generation after us—the Zoomers, I believe we’re calling them now—who will actually live in this new world we created, while the rest of us forever struggle to accept that things are not as we remember them. Once bitten, we remain forever twice shy, lest attitudes regress as suddenly as they advanced.

Then again, it seems that Zoomers may be turning against the institution of marriage in general. As the meme says: “Boomers: No gay marriage. Millennials: Yes gay marriage. Gen Z: Yes gay, no marriage.” Maybe that’s for the best; maybe the future of humanity is for personal relationships to be considered no business of the government at all. But for now at least, equal marriage is clearly much better than unequal marriage, and the First World seems to have figured that out blazing fast.

And of course the rest of the world still hasn’t caught up. While trends are generally in a positive direction, there are large swaths of the world where even very basic rights for LGBT people are opposed by most of the population. As usual, #ScandinaviaIsBetter, with over 90% support for LGBT rights; and, as usual, Sub-Saharan Africa is awful, with support in Kenya, Uganda and Nigeria not even hitting 20%.

Europe is paying the price for relying on Russian natural gas

Sep 18 JDN 2459841

For far too long, Europe has relied upon importing cheap natural gas from Russia to supply a large proportion of its energy needs. Now that the war in Ukraine has led to mutual sanctions, they are paying the price—literally, as the price of natural gas has absolutely ballooned. Dutch natural gas futures have soared from about €15 per megawatt-hour in 2020 to over €200 today.

Natural gas prices are rising worldwide, but not nearly as much: Henry Hub natural gas prices (a standard metric for natural gas prices in the US) have risen from under $2 per million BTU in 2020 to nearly $9 today. This substantial divide in prices can only be sustained because transporting natural gas is expensive and requires substantial infrastructure. (1 megawatt-hour is about 3.4 million BTU, and the euro is trading at parity with the dollar (!), so effectively US prices rose from €7 per MWh to €31 per MWh—as opposed to €200.)
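Spelling out that conversion (3.412 million BTU per MWh is the standard factor; euro-dollar parity is as described above):

```python
# Convert Henry Hub prices ($ per million BTU) into euros per megawatt-hour.
MMBTU_PER_MWH = 3.412  # 1 MWh is about 3.4 million BTU

def usd_per_mmbtu_to_eur_per_mwh(usd: float, eur_per_usd: float = 1.0) -> float:
    return usd * MMBTU_PER_MWH * eur_per_usd

print(f"2020:  ~€{usd_per_mmbtu_to_eur_per_mwh(2.0):.0f}/MWh")  # ~€7/MWh
print(f"Today: ~€{usd_per_mmbtu_to_eur_per_mwh(9.0):.0f}/MWh")  # ~€31/MWh
# Compare with Dutch TTF futures at over €200/MWh.
```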

As a result, a lot of people in Europe are suddenly finding their utility bills unaffordable. (I’m fortunate that my flat is relatively well-insulated and my income is reasonably high, so I’m not among them; the higher prices will be annoying, but not beyond my means.) What should we do about this?

There are some economists who would say we should do nothing at all: Laissez-faire. Markets are efficient, right? So just let people freeze! Fortunately, Europe is not governed by such people nearly as much as the US is.

But while most economists would agree that we should do something, it’s much harder to get them to agree on exactly what.

Rising prices of natural gas are sort of a good thing, from an environmental perspective; they’ll provide an incentive to reduce carbon emissions. So it’s tempting to say that we should just let the prices rise and then compensate by raising taxes and paying transfers to poor families. But that probably isn’t politically viable; all three parts—letting prices rise, raising taxes, and increasing transfers—are going to make enemies, and we really must have all three for such a plan to work.

The current approach seems to be based on price controls: Don’t let the prices rise so much. The UK has such a policy in place: Natural gas prices for consumers are capped by regulations. The cap has been increased in response to the crisis (itself an unpopular, but clearly necessary, move), but even so 31 gas companies have already gone under across the UK since the start of 2021. It really seems to be the case that for many gas companies, especially the smaller ones with weaker economies of scale, it’s simply not possible to continue providing natural gas to homes with input prices so high and output prices capped so low.

Or, we could let prices rise that high for producers, but subsidize consumers so that they don’t feel it; several European countries are already doing this. That at least won’t result in gas companies failing, but it will cost a lot of government funds. Greece in particular is spending over 3% of its GDP on it! (For comparison, the US military budget is about 4% of GDP.) I think this might actually be the best option, though all that spending will mean more government debt or higher taxes.

European governments have also been building up strategic reserves of natural gas, which may help us get through the winter—but it also makes the current price increases even worse.

We could also ration energy use, as we’ve often done during wartime. (Is this wartime? Kind of? Not really? It certainly is starting to feel like Cold War II.) Indeed, the President of the European Commission basically said that this should happen. That, at least, would reap some of the environmental benefits of reduced natural gas consumption. Rationing also feels fair to most people in a way that simply letting market prices rise does not; there is a sense of shared sacrifice. What worries me, however, is that the rations won’t be well-designed enough to account for energy usage that isn’t in a family’s immediate control. If you’re renting a flat that is poorly insulated, you can’t immediately fix that. You can try to pressure the landlord into buying better insulation, but in the meantime you’re the one paying the energy bills—or getting cold when the natural gas ration isn’t enough.

Actually I strongly suspect that most household usage of natural gas is of this kind; people don’t generally heat their homes more than necessary just because gas is cheap. Maybe they can set the thermostat a degree or two lower when gas is expensive, or maybe they use the gas oven less often and the microwave more; but the vast majority of their gas consumption is a function of the climate they live in and the insulation of their home, not their day-to-day choices. So if we’re trying to incentivize more efficient energy usage, that’s a question of long-term investment in construction and retrofitting, not something that sudden price spikes will really help with.

In the long run, what we really need to do is wean ourselves off of natural gas. Currently natural gas provides 33% of energy and nearly 40% of heating in Europe. (US figures are comparable.) Switching to electric heat pumps and powering them with solar and wind power isn’t something we can do overnight—but it is something we surely must do.

I think ultimately what is going to happen is all of the above: Different countries will adopt different policy mixes, all of them will involve difficult compromises, none of them will be particularly well-designed, and we’ll all sort of muddle through as best we can.

The War on Terror has been a total failure.

Sep 11 JDN 2459834

Since today happens to be September 11, I thought I’d spend this week’s post reflecting on the last 21 years (!) of the War on Terror.

At this point, I can safely say that the War on Terror has been a complete, total, utter failure. It has cost over $8 trillion and nearly a million lives, and not only didn’t reduce terrorism, it actually appears to have substantially increased it.

Take a look at this graph from Our World in Data:

Up until the 1980s, terrorism worldwide was a slow smolder, rarely killing more than a few hundred people each year. Obviously it’s terrible if you or one of your loved ones happen to be among those few hundred, but in terms of its overall chance of killing you or your children, terrorism used to be less dangerous than kiddie pools.

Then terrorism began to rise, until it was killing several thousand people a year. I was surprised to learn that most of these were not in the Middle East, but in fact spread all over the world, with the highest concentrations actually being in South Asia and Sub-Saharan Africa.

Notably, almost none of these deaths were in First World countries, and as a result most First World governments largely ignored them. Terrorism was something that happened “over there”, to other people.

Then of course came September 11, 2001, when nearly 3,000 Americans were killed in a single day. And suddenly the First World took notice, and decided to respond with overwhelming force.

We have been at war basically ever since. All this war has accomplished… approximately nothing.

The deadliest year of terrorism in the 21st century was not 2001; it was 2014, after the US had invaded both Afghanistan and Iraq, and in fact withdrawn from Iraq (but not yet Afghanistan). This was largely the result of the rise of Daesh (which is what you should call them by the way), which seems to be the most fanatical and violent Islamist terrorist organization the world has seen in decades if not centuries.

Even First World terrorism is no better today than it was in the 1990s—though also no worse. It’s back to a slow smolder, and once again First World societies can feel that terrorism is something that happens to someone else. But terrorism in the Middle East is the worst it has been in decades.

Would Daesh not have appeared if the US had never invaded Afghanistan and Iraq? It’s difficult to say. Maybe their rise was inevitable. Or maybe having a strong, relatively secular government in the region under Saddam Hussein would have prevented them from becoming so powerful. We can at least say this: Since the US withdrew from Afghanistan and the Taliban retook control, the Taliban and Daesh have been fighting each other quite heavily. Presumably that would have been happening all along if the US had not intervened to suppress the Taliban.

Don’t get me wrong: The Taliban were, and are, a terrible regime, and Saddam Hussein was a terrible dictator. But Daesh is clearly worse than either, and sometimes in geopolitics you have to accept the lesser evil.

If we’d actually had a way to take over Afghanistan and Iraq and rebuild them as secular liberal democracies as the US government intended, that would have been a good thing, and might even have been worth all that blood and treasure. But that project utterly failed, and we should have expected it to fail, as never in history has anyone successfully imposed liberal democracy by outside force like that.

When democracy spreads, it usually does so slowly, through the cultural influence of trade and media. Sometimes it springs up in violent revolution—as we hoped it would in the Arab Spring but were sadly disappointed. But there are really no clear examples of a democratic country invading an undemocratic country and rapidly turning it democratic.

British colonialism was spread by the sword (and especially the machine gun), and did sometimes ultimately lead to democratic outcomes, as in the US, Australia, and Canada, and more recently in India, South Africa, and Botswana. But that process was never fast, never smooth, and rarely without bloodshed—and only succeeded when the local population was willing to fight for it. Britain didn’t simply take over countries and convert them to liberal democracies in a generation. No one has ever done that, and trying to was always wishful thinking.

I don’t know, maybe in the very long run, we’ll look back on all this as the first, bloody step toward something better for the Middle East. Maybe the generation of women who got a taste of freedom and education in Afghanistan under US occupation will decide to rise up and refuse to relinquish those rights under the new Taliban. Daesh will surely die sooner or later; fanaticism can rarely sustain organizations in the long term.

But it’s been 21 years now, and things look no better than they did at the start. Maybe it’s time to cut our losses?

Working from home is the new normal—sort of

Aug 28 JDN 2459820

Among people with jobs that can be done remotely, a large majority did in fact switch to doing their jobs remotely: By the end of 2020, over 70% of Americans with jobs that could be done remotely were working from home—and most of them said they didn’t want to go back.

This is actually what a lot of employers expected to happen—just not quite like this. In 2014, a third of employers predicted that the majority of their workforce would be working remotely by 2020; given that timeframe, it took a major shock to make it happen so fast, and a major shock is exactly what we had.

Working from home has carried its own challenges, but overall productivity seems to be higher working remotely (that meeting really could have been an email!). This may actually explain why output per work hour rose rapidly in 2020 and fell in 2022.

The COVID pandemic now isn’t so much over as becoming permanent; COVID is now being treated as an endemic infection like influenza that we don’t expect to be able to eradicate in the foreseeable future.

And likewise, remote work seems to be here to stay—sort of.

First of all, we don’t seem to be giving up office work entirely. As of the first quarter of 2022, almost as many firms have partially remote work as have fully remote work, and this seems to be trending upward. A lot of firms seem to be transitioning into a “hybrid” model where employees show up to work two or three days a week. This seems to be preferred by large majorities of both workers and firms.

There is a significant downside of this: It means that the hope that remote working might finally ease the upward pressure on housing prices in major cities is largely a false one. If we were transitioning to a fully remote system, then people could live wherever they want (or can afford) and there would be no reason to move to overpriced city centers. But if you have to show up to work even one day a week, that means you need to live close enough to the office to manage that commute.

Likewise, if workers never came to the office, you could sell the office building and convert it into more housing. But if they show up even once in a while, you need a physical place for them to go. Some firms may shrink their office space (indeed, many have—and unlike this New York Times journalist, I have a really hard time feeling bad for landlords of office buildings); but they aren’t giving it up entirely. It’s possible that firms could start trading off—you get the building on Mondays, we get it on Tuesdays—but so far this seems to be rare, and it does raise a lot of legitimate logistical and security concerns.

So our global problem of office buildings that are empty, wasted space most of the time is going to get worse, not better. Manhattan will still empty out every night; it just won’t fill up as much during the day. This is honestly a major drain on our entire civilization—building and maintaining all those structures that are only used at most 1/3 of 5/7 of the time, and soon, less—and we really should stop ignoring it. No wonder our real estate is so expensive, when half of it is only used 20% of the time!
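To put numbers on that, here is the utilization arithmetic as a quick sketch (the 8-hour workday is my own illustrative assumption; the 5/7 and the two-to-three hybrid days come from above):

```python
# Fraction of the week an office building is actually occupied.
HOURS_PER_DAY = 8  # assumed workday length: 8/24 = 1/3 of each day

def utilization(days_per_week: int) -> float:
    return (HOURS_PER_DAY / 24) * (days_per_week / 7)

print(f"Five-day schedule: {utilization(5):.1%}")  # ~23.8%, i.e. roughly 20%
for days in (2, 3):
    print(f"Hybrid, {days} days/week: {utilization(days):.1%}")  # ~9.5%, ~14.3%
```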

Moreover, not everyone gets to work remotely. Your job must be something that can be done remotely—something that involves dealing with information, not physical objects. That includes a wide and ever-growing range of jobs, from artists and authors to engineers and software developers—but it doesn’t include everyone. It basically means what we call “white-collar” work.

Indeed, it is largely limited to the upper-middle class. The rich never really worked anyway, though sometimes they pretend to, convincing themselves that managing a stock portfolio (that would actually grow faster if they let it sit) constitutes “work”. And the working class? By and large, they didn’t get the chance to work remotely. While 73% of workers with salaries above $200,000 worked remotely in 2020, only 12% of workers with salaries under $25,000 did; across the board, the more money you make, the more likely you were to be able to work remotely.

This will only intensify the divide between white-collar and blue-collar workers. They already think we don’t do “real work”; now we don’t even go to work. And while blue-collar workers are constantly complaining about contempt from white-collar elites, I think the shoe is really on the other foot. I have met very few white-collar workers who express contempt for blue-collar workers—and I have met very few blue-collar workers who don’t express anger and resentment toward white-collar workers. I keep hearing blue-collar people say that we think that they are worthless and incompetent, when they are literally the only ones ever saying that. I can’t very well stop saying something I never said in the first place.

The rich and powerful may look down on them, but they look down on everyone. (Maybe they look down on blue-collar workers more? I’m not even sure about that.) I think politicians sometimes express contempt for blue-collar workers, but I don’t think this reflects what most white-collar workers feel.

And the highly-educated may express some vague sense of pity or disappointment in people who didn’t get college degrees, and sometimes even anger (especially when they do things like vote for Donald Trump), but the really vitriolic hatred is clearly in the opposite direction (indeed, I have no better explanation for how otherwise-sane people could vote for Donald Trump). And I certainly wouldn’t say that everyone needs a college degree (though I became tempted to, when so many people without college degrees voted for Donald Trump).

This really isn’t us treating them with contempt: This is them having a really severe inferiority complex. And as information technology (that white-collar work created) gives us—but not them—the privilege of staying home, that is only going to get worse.

It’s not their fault: Our culture of meritocracy puts a little bit of inferiority complex in all of us. It tells us that success and failure are our own doing, and so billionaires deserve to have everything and the poor deserve to have nothing. And blue-collar workers have absolutely internalized these attitudes: Most of them believe that poor people choose to stay on welfare forever rather than get jobs (when welfare has time limits and work requirements, so this is simply not an option—and you would know this from the Wikipedia page on TANF).

I think that what they experience as “contempt by white-collar elites” is really the pain of living in an illusory meritocracy. They were told—and they came to believe—that working hard would bring success, and they have worked very hard, and watched other people be much more successful. They assume that the rich and powerful are white-collar workers, when really they are non-workers; they are people the world was handed to on a silver platter. (What, you think George W. Bush earned his admission to Yale?)

And thus, we can shout until we are blue in the face that plumbers, bricklayers and welders are the backbone of civilization—and they are, and I absolutely mean that; our civilization would, in an almost literal sense, collapse without them—but it won’t make any difference. They’ll still feel the pain of living in a society that gave them very little and tells them that people get what they deserve.

I don’t know what to say to such people, though. When your political attitudes are based on beliefs that are objectively false, that you could know are objectively false if you simply bothered to look them up… what exactly am I supposed to say to you? How can we have a useful political conversation when half the country doesn’t even believe in fact-checking?

Honestly I wish someone had explained to them that even the most ideal meritocratic capitalism wouldn’t reward hard work. Work is a cost, not a benefit, and the whole point of technological advancement is to allow us to accomplish more with less work. The ideal capitalism would reward talent—you would succeed by accomplishing things, regardless of how much effort you put into them. People would be rich mainly because they are brilliant, not because they are hard-working. The closest thing we have to ideal capitalism right now is probably professional sports. And no amount of effort could ever possibly make me into Steph Curry.

If that isn’t the world we want to live in, so be it; let’s do something else. I did nothing to earn either my high IQ or my chronic migraines, so it really does feel unfair that the former increases my income while the latter decreases it. But the labor theory of value has always been wrong; taking more sweat or more hours to do the same thing is worse, not better. The dignity of labor consists in its accomplishment, not its effort. Sisyphus is not happy, because his work is pointless.

Honestly at this point I think our best bet is just to replace all blue-collar work with automation, thus rendering it all moot. And then maybe we can all work remotely, just pushing code patches to the robots that do everything. (And no doubt this will prove my “contempt”: I want to replace you! No, I want to replace the grueling work that you have been forced to do to make a living. I want you—the human being—to be able to do something more fun with your life, even if that’s just watching television and hanging out with friends.)