What this is

Metaforecast is a search engine for probabilities from various prediction markets and forecasting platforms. Try searching "Trump", "China" or "Semiconductors".

This repository includes the source code for both the website and the library that fetches the forecasts and keeps them up to date. We also aim to provide tooling to integrate Metaforecast with other services.

How to run

1. Download this repository

$ git clone https://github.com/quantified-uncertainty/metaforecast
$ cd metaforecast
$ pnpm install

2. Set up a database and environment variables

You'll need a PostgreSQL instance, either local (see https://www.postgresql.org/download/) or in the cloud (for example, you can spin one up on https://www.digitalocean.com/products/managed-databases-postgresql or https://supabase.com/).
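
If you go the local route, creating an empty database can be as simple as this (the database name metaforecast is just an example):

$ createdb metaforecast
$ psql metaforecast -c 'SELECT 1;'   # check that the connection works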

The environment can be set up with a .env file. You'll need to configure at least DIGITALOCEAN_POSTGRES.

See ./docs/configuration.md for details.
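
For example, a minimal .env might look like the following (the connection string is a placeholder; ./docs/configuration.md lists the other variables you can set):

# .env (placeholder values; point this at your own database)
DIGITALOCEAN_POSTGRES=postgresql://user:password@localhost:5432/metaforecast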

3. Actually run

After installing and building (pnpm run build) the application, pnpm run cli starts a local CLI which presents the user with choices. If you would like to skip that step, use the option name instead, e.g., pnpm run cli wildeford.

pnpm run next-dev starts a Next.js dev server with the website on http://localhost:3000.

So overall this would look like:

$ git clone https://github.com/quantified-uncertainty/metaforecast
$ cd metaforecast
$ pnpm install
$ pnpm run build
$ pnpm run cli
$ pnpm run next-dev

4. Example: download the metaforecasts database

$ git clone https://github.com/quantified-uncertainty/metaforecast
$ cd metaforecast
$ pnpm install
$ node src/backend/manual/manualDownload.js

Integrations

Metaforecast has been integrated into:

  • Twitter, using our @metaforecast bot
  • Global Guessing, which integrates our dashboards
  • Fletcher, a popular Discord bot. You can invoke metaforecast with !metaforecast search-term
  • Elicit, which uses GPT-3 to deliver vastly superior semantic search (as opposed to fuzzy word matching). If you have access to the Elicit IDE, you can use the action "Search Metaforecast database". This integration is not being updated regularly.

You can use our GraphQL API to build your own integration.
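
For instance, you can query the endpoint over plain HTTP. The sketch below is illustrative only: the endpoint path and the exact query and field names are assumptions, so check the schema in the GraphQL playground before relying on them.

$ curl -s https://metaforecast.org/api/graphql \
    -H 'Content-Type: application/json' \
    -d '{"query": "{ questions(first: 3) { edges { node { id title url } } } }"}'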

We are also open to integrating our Elasticsearch instance with other trusted services (in addition to Fletcher).

In general, if you want to integrate metaforecast into your service, we want to hear from you.

Code layout

  • frontend code is in src/pages/, src/web/ and in a few other places which are required by Next.js (e.g. root-level configs in postcss.config.js and tailwind.config.js)
  • various backend code is in src/backend/
  • fetching libraries for the various platforms are in src/backend/platforms/
  • rudimentary documentation is in docs/

What are "stars" and how are they computed

Star ratings (e.g., ★★★☆☆) are an indicator of the quality of an aggregate forecast for a question. These ratings currently try to reflect my own best judgment and that of the forecasting experts I've asked, based on our collective experience forecasting on these platforms. Stars therefore have a strong subjective component, which could be formalized and refined in the future. You can see how many stars a forecast gets by looking at the calculateStars() function in each platform's file under src/backend/platforms/.

With regard to quality, I am most uncertain about Smarkets, Hypermind, Ladbrokes and WilliamHill, as I haven't used them as much. Also note that, whatever their other redeeming features, prediction markets rarely go above 95% or below 5%.

Tech stack

Overall, the services we use are:

  • Elasticsearch for search
  • Vercel for website deployment
  • Heroku for background jobs, e.g. fetching new forecasts
  • Postgres on DigitalOcean for the database

Various notes

  • This repository is released under the MIT license. See LICENSE.md
  • Commits follow conventional commits
  • For Elicit and Metaculus, this library currently filters out questions with <10 predictions.
  • The database is updated once a day, at 3:00 AM UTC, with the command ts-node -T src/backend/flow/doEverythingForScheduler.ts. The frontpage is updated after that, at 6:00 AM UTC, with the command ts-node -T src/backend/index.ts frontpage. Either of these operations may briefly take the website down. See the cron sketch below for how that schedule could be reproduced.
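
If you want to reproduce that schedule on your own machine rather than on Heroku, a crontab along these lines should work (times are UTC; the repository path is a placeholder, and ts-node plus the variables from .env need to be available to cron):

0 3 * * * cd /path/to/metaforecast && ts-node -T src/backend/flow/doEverythingForScheduler.ts
0 6 * * * cd /path/to/metaforecast && ts-node -T src/backend/index.ts frontpage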

To do

  • Update Metaculus and Manifold Markets fetchers
  • Add markets from Insight Prediction.
  • Update broken fetchers:
    • For Good Judgment
    • Kalshi: requires a US person to create an account to access their v2 API.
  • Use https://news.manifold.markets/p/above-the-fold-midterms-special to update stars calculation for Manifold.
  • Add a few more snippets to the /contrib folder, e.g. fetching individual questions, questions with histories, and questions added within the last 24h (good first issue)
  • Refactor code so that users can capture and push the question history chart to imgur (good first issue)
  • Upgrade to React 18. This will require dealing with the workaround we used for this issue
  • Add database of resolutions
  • Allow users to embed predictions in the EA Forum/LessWrong (in progress)
  • Find a long-term maintainer for this project
  • Allow users to record their own predictions
  • Release snapshots (I think @niplav is working on this)
  • ...
