đź“Ť

Why Manifold may be a bad use of EA resources

What could be more EA than self-criticism?

Prediction markets aren’t that useful

  • Any individual market provides very few bits of data: just a single probability between 0% and 100% (see the rough calculation after this list)
  • Decision-relevance remains a major sticking point
    • Conditional prediction markets attract less trading, which makes it hard to assess how accurate they are
  • The “market” structure favors short-term over long-term forecasts: capital locked in a long-dated market earns no return until resolution, so traders gravitate toward questions that resolve soon
  • Anecdotally: Austin went through a “prediction markets on everything” phase, but in the long run, creating markets usually seems to require more work than it’s worth
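
As a rough back-of-the-envelope sketch of the “few bits” point above (our illustration, not anything Manifold publishes): even at whole-percentage-point resolution, a single market price can carry at most about 6.7 bits of information, and real markets convey less once trading noise is accounted for.

```python
import math

# Upper bound on the information in one market price:
# a probability quantized to whole percentage points can take
# at most 101 distinct values (0%, 1%, ..., 100%), so it conveys
# at most log2(101) bits -- before accounting for noise.
max_bits = math.log2(101)
print(f"At most {max_bits:.2f} bits per market")  # -> At most 6.66 bits per market
```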

Prediction markets are major attention drains

  • Many EA celebrities and AI safety researchers admit to spending a lot of time on the site, or outright call it “addictive”
    • Scott Alexander, Peter Wildeford, Katja Grace
  • Anecdotally: Austin often finds Manifold’s own dopamine loop fairly disruptive; objectively, it should be blocked just like the EA Forum and Twitter

Startup incentives make Manifold chase growth

  • The primary metric Manifold tracks is DAU (daily active users), not something like “value added” or “market accuracy”
  • Revenue is mostly anti-correlated with usefulness: peak purchases coincide with gambling-like behavior

Manifold funges against more epistemically virtuous sites

Compare a representative question from each site:

  • Metaculus: “Will at least 1 million people die from nuclear weapons before the end of 2025?”
  • Manifold: “Will Eliezer Yudkowsky write a tweet containing the word "rationalussy" by the end of 2023?”

Metaculus, Sage, and QURI all push harder in the direction of truthseeking

See also