Will effective altruism survive Sam Bankman-Fried’s FTX crypto crash?


Former crypto billionaire Sam Bankman-Fried wasn’t just one of the major figures associated with “effective altruism,” or EA, the philanthropic movement that holds charitable giving to rigorous external measures of effectiveness and impact. He also exerted significant influence on the movement through his once-prodigious capacity to fund it.

Bankman-Fried’s rapid fall from grace — with the recent collapse of his crypto exchange FTX, which filed for bankruptcy on Friday — casts doubt on the future of the movement, given his advocacy for it and close relationships with its leading intellectuals and leaders. It also raises questions about whether Bankman-Fried’s EA-influenced beliefs about risk and reward played a role in the decisions that led to FTX’s stunning fall.

The Future Fund, the arm of Bankman-Fried’s charitable apparatus most closely associated with the related movement known as “longtermism” — the idea that the welfare of future people should weigh heavily on the present, and thus that one of the highest-impact actions available today is to minimize risks of extinction or near-extinction — committed “over $160 million” to a wide range of individuals and organizations aligned with that view. The fund’s five external staffers resigned last week.

A lack of cash is the immediate problem. Some projects will go unfunded. Bankman-Fried did not set up an endowment and instead funded projects as they came, telling the New York Times, “It’s more of a pay-as-we-go thing, and the reason for that, frankly, is I’m not liquid enough for it to make sense to do an endowment right now.”

Some organizations, like the nonprofit journalism outlet ProPublica, said they won’t be receiving the full allotment of their Bankman-Fried funds for reporting on public health threats; Josh Morrison, the founder of 1Day Sooner, an organization that advocates for human vaccine challenge trials, told Grid that FTX made up about 12 percent of the group’s funding and was its fourth-largest funder.

But there’s another issue beyond making up funding shortfalls, one the EA community is beginning to confront in a wrenching, often quite public way. (You can read the forum posts yourself.) What responsibility do the EA community and its ideas, especially around longtermism, bear in motivating Bankman-Fried and FTX’s high-risk, destructive and perhaps illegal financial maneuvering? Did the movement’s own lofty goals shift into a messianism that blinded its members to the risks in their midst?

“Hardly anyone associated with Future Fund saw the existential risk to … Future Fund, even though they were as close to it as one could possibly be,” Tyler Cowen, the EA-adjacent George Mason University economist, wrote on his blog. “I am thus skeptical about their ability to predict existential risk more generally, and for systems that are far more complex and also far more distant.”

The greatest risk to the movement turned out to be Bankman-Fried, a man who employed the movement’s leading philosopher and public face, Will MacAskill. MacAskill is credited with convincing Bankman-Fried to do his part for effective altruism by getting rich and donating what he could. Before FTX’s collapse, the 30-year-old Bankman-Fried was estimated to be worth $16 billion.

“I want to make it utterly clear: if those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community,” MacAskill tweeted last week. “Sam and FTX had a lot of goodwill — and some of that goodwill was the result of association with ideas I have spent my career promoting. If that goodwill laundered fraud, I am ashamed.”

Gambling with the future

MacAskill’s close connection to Bankman-Fried has brought attention to his expansive and overlapping roles in the EA community: he is both a public figurehead, following the publication of his book “What We Owe the Future,” and a leader within a number of EA organizations, including those funded by Bankman-Fried.

“I think people follow individual leaders, so there was some logic in building his brand as a way of creating a public face for effective altruism for people to connect with emotionally and follow,” said Morrison, referring to MacAskill. “For reasons mostly out of his control, he is less able to do that than he was pre-FTX, though I think his association with cryptocurrency was an understandable but unnecessary risk.”

In the wake of FTX’s collapse, much attention has been paid to its sister trading firm, Alameda Research. FTX lent billions of dollars in customer assets to Alameda, run by Bankman-Fried associate Caroline Ellison; when that news emerged, FTX was brought down by a modern-day bank run. While the details of Alameda’s trading and FTX’s relationship with the firm remain to be excavated by law enforcement, auditors, lawyers and reporters, Bankman-Fried had been relatively open about how he dealt with questions of risk and reward, at least in a theoretical way.

“If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing,” Bankman-Fried told Robert Wiblin, a prominent effective altruist who is the director of research for 80,000 Hours, a nonprofit co-founded by MacAskill that tries to guide young people interested in maximizing human well-being. “If you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns?”

Bankman-Fried argued that philanthropists and those earning money to give it away should be more tolerant of risk: “Your strategy is very different if you’re optimizing for making at least a million dollars, versus if you’re optimizing for just the linear amount that you make.” He said he founded FTX because “there’s well, and then there’s better than well — there’s no reason to stop at just doing well.”

Even the very richest have what’s known as “declining marginal utility of wealth”: The first million dollars does more for their well-being than the next million. The first hundred million is better than the next (there are only so many yachts and so many hours to spend on them). But if you’re earning with the goal of giving your money away to solve big problems that bear on the well-being of millions of living people — and billions or trillions of people not yet born (something EAs think a lot about) — maybe things are different.
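To make that concrete, here is a minimal sketch in Python, using logarithmic utility as a standard economist’s stand-in for declining marginal utility (the function and dollar figures are illustrative, not drawn from anything Bankman-Fried or his associates published):

```python
import math

def log_utility(wealth):
    # A common stand-in for declining marginal utility: u(w) = ln(w).
    return math.log(wealth)

MILLION = 1_000_000

# Utility gained from the first million (starting from $1)...
first = log_utility(MILLION) - log_utility(1)
# ...versus one more million on top of a hundred million.
later = log_utility(101 * MILLION) - log_utility(100 * MILLION)

print(f"first million adds {first:.2f} units of utility")   # ~13.82
print(f"101st million adds {later:.5f} units of utility")   # ~0.00995
```

Under log utility, the 101st million is worth a vanishing fraction of the first; under the linear, impact-maximizing view Bankman-Fried describes, every million counts the same.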

“More good is more good. It’s not like you did some good, so good doesn’t matter anymore. But how about money? Are you able to donate so much that money doesn’t matter anymore? And the answer is, I don’t exactly know,” Bankman-Fried said to Wiblin.

He continued: “The expected value of how much impact you have, I think, is going to be a function sort of weighted towards upside tail cases. That’s what I think my prior would be. And if your impact is weighted towards upside tail cases, then what’s that probability distribution of impact probably look like? I think the odds are, it has decent weight on zero. Maybe majority weight.”

In other words, the expected benefits from Alameda and FTX amassing capital and making money were so high that it was OK — maybe even ethically mandated — to accept the risk of losing everything.
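A toy calculation makes that logic explicit; the probabilities and dollar figures below are invented for illustration and appear nowhere in Bankman-Fried’s remarks:

```python
import random

SURE_THING = 5e9             # a guaranteed $5 billion of "impact"
P_WIN, JACKPOT = 0.1, 100e9  # 10% chance of $100 billion, otherwise nothing

# Expected value of the gamble: 0.9 * 0 + 0.1 * 100e9 = $10 billion,
# double the sure thing -- so a pure expected-value maximizer gambles.
ev = (1 - P_WIN) * 0 + P_WIN * JACKPOT
print(f"sure thing:               ${SURE_THING:,.0f}")
print(f"expected value of gamble: ${ev:,.0f}")

# Yet the distribution puts "majority weight on zero," as he says:
trials = [JACKPOT if random.random() < P_WIN else 0.0 for _ in range(100_000)]
busts = sum(t == 0 for t in trials) / len(trials)
print(f"share of outcomes at zero: {busts:.1%}")            # ~90%
print(f"average outcome:          ${sum(trials) / len(trials):,.0f}")
```

On this arithmetic, the gamble “wins” even though nine times out of 10 it delivers nothing at all.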

“If you see yourself as fighting against this risk [to] humanity and you see humanity as [lasting] the next trillion years, it’s easy to have a god complex and lose humility as a virtue,” Morrison said.

Double or nothing?

In a now-deleted Tumblr blog, Ellison was even more extreme: “If you abstract away the financial details there’s also a question of like, what your utility function is. Is it infinitely good to do double-or-nothing coin flips forever? Well, sort of, because your upside is unbounded and your downside is bounded at your entire net worth. But most people don’t do this, because their utility is more like a function of their log wealth or something and they really don’t want to lose all of their money. (Of course those people are lame and not EAs; this blog endorses double-or-nothing coin flips and high leverage.)”

If the universe of worthwhile projects for others is greater than the amount of money you can make on a bet, Ellison seems to argue, it makes sense to continually go double or nothing. The most you can lose is your net worth, while the most you can gain is the well-being of millions. Of course, in the end, Bankman-Fried and Ellison lost far more than their own net worth; at FTX, there is a gaping hole of several billion dollars where client funds used to be.
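A back-of-the-envelope sketch shows why the two utility functions Ellison mentions point in opposite directions; the 60 percent winning chance below is an assumption, chosen only to make the bet better than fair:

```python
P_WIN = 0.6   # assumed edge: 60% chance to double, 40% chance to lose everything
FLIPS = 20

# Linear utility: each flip multiplies expected wealth by 2 * 0.6 = 1.2,
# so the expected-value maximizer keeps flipping...
expected_growth = (2 * P_WIN) ** FLIPS
print(f"expected wealth after {FLIPS} flips: {expected_growth:,.1f}x")  # ~38.3x

# ...even though the odds of still being solvent collapse toward zero.
survival = P_WIN ** FLIPS
print(f"chance of never having gone bust: {survival:.4%}")              # ~0.0037%

# Log utility rejects even the first all-in flip: expected log-wealth
# includes 0.4 * log(0), i.e. negative infinity, so no finite upside
# can compensate for the chance of total ruin.
```

This is the textbook Kelly-criterion intuition: a bettor maximizing log wealth never stakes everything on a single flip, while a pure expected-value maximizer does so every time, and almost surely ends at zero.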

“There’s an attempt at maximizing whatever quantity they’re interested in. That is one of the risks inherent in that philosophy: you’re hoping to max out on something. Sometimes, according to that calculation, you’re justified in taking high risk,” Carla Zoe Cremer, an Oxford scholar and EA critic, told Grid.

“The kind of questions they’re asking are about the long-term future, which we have no information for. You can tweak on those variables to the extent you can make any argument for whatever,” Cremer said.

Cremer has written a number of papers and EA Forum posts criticizing the longtermist framework around existential risk, has called for governance reforms within the EA community, and specifically warned of the risk of becoming too dependent on a few charismatic — and rich — individuals. “EA needs to diversify funding sources by breaking up big funding bodies and by reducing each orgs’ reliance on EA funding and tech billionaire funding,” she wrote late last year.

Morrison agreed, telling Grid, “I have believed for a while that EA should be more decentralized. … There was a closeness and reliance on crypto…”


