
Superforecasters share three strategies for making accurate predictions


THE PREDICTION TRADE

Open-Minded Forecasting in a Deeply Polarized World


By Warren Hatch

Americans are more polarized than ever, and their split along two ideological extremes complicates a forecaster’s job. Polarization stresses feelings over facts, confounding the separation of signal from noise that’s essential to forecasting accuracy. Also, the forecaster’s own biases and preferences can be harder to recognize, and set aside, when society at large is polarized and the outcomes are personally consequential.

When Good Judgment Inc, a forecasting company, asked its professional Superforecasters to predict the outcome of the 2020 U.S. election cycle, these challenges were front and center. Many Superforecasters live in the United States and feel deeply about political issues in the country. Some of them worried this could cloud their forecasting judgment. Here’s what they did, and what you can do to improve the accuracy of your own predictions in a polarized world.

U.S. election 2020

The Superforecasters predicted in March 2020 that the Democrats would win the White House and never looked back. As early as June, they began predicting both the House and Senate would go to the Democrats. They accurately called:

• the long-delayed concession

• the record voter turnout, and

• the Democrats’ presidential fundraising edge as of Sept. 30.

But getting it right is only half of the picture. Good Judgment strives to be right for the right reasons. To calibrate their thinking, Superforecasters use three simple strategies that consistently result in more accurate predictions.

Consider alternatives

While the Superforecasters as a group assigned high odds for a Democratic sweep, individual Superforecasters predicted a variety of outcomes. A diversity of views is essential for good forecasting, but on issues you hold dear, considering other views is easier said than done. Over the week before the election, Good Judgment asked the Superforecasters as a group to imagine they could time-travel to a future in which the Republicans retained both the White House and the Senate. Regardless of their individual forecasts, they were then asked to explain why a “blue wave” election failed to occur in such a future.

This is called a pre-mortem, or “what if” exercise. Thinking through alternative scenarios ahead of the actual outcome accomplishes several goals. It forces the forecaster to consider other perspectives and to rethink the reasoning and evidence supporting their forecasts. It also tests the forecaster’s level of confidence (over-confidence being far more common than under-confidence) and helps forecasters avoid hindsight bias when evaluating the forecasts later.

Because Superforecasters already weigh multiple alternatives in making forecasts, this pre-mortem produced little change in the overall forecasts. Even after several days of internal debate on the “what if” scenarios, their aggregate probabilities barely moved.
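Good Judgment’s exact aggregation method isn’t detailed here, but a minimal sketch with hypothetical numbers shows how a robust aggregate, such as the median of individual probabilities, can hold steady even when individual forecasters revise sharply:

```python
from statistics import median

# Hypothetical individual probabilities for a "blue wave" outcome
# (illustrative numbers only, not actual Good Judgment data).
before_premortem = [0.85, 0.88, 0.90, 0.92, 0.95]
after_premortem = [0.70, 0.88, 0.90, 0.92, 0.95]  # one forecaster revises down

# A median is robust to a few large individual moves, one reason an
# aggregate forecast can barely budge after a "what if" exercise.
print(median(before_premortem))  # 0.9
print(median(after_premortem))   # 0.9
```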

But the exercise was useful. It showed that the Superforecasters’ predictions were well-calibrated. It also produced multiple scenarios with detailed commentary, some of which proved clear-eyed in light of the actual events following the election.

Kjirste Morrell, one of Good Judgment’s leading Superforecasters, was among the participants in the exercise. She says she didn’t make large changes to her forecasts but underscores the value of the discussion.

“In retrospect, I should have placed more credence on the possibility of violence after the election, which was mentioned during the pre-mortem exercise,” she said.

Keep it civil

A wise crowd encompasses diverse views. Studies based on the Good Judgment Project found that being an “actively open-minded thinker” correlates with being an accurate forecaster. That’s no mystery. Exposure to views with which we disagree can inform our understanding of the world. But Superforecasters don’t simply agree with everything. They know how to “disagree without being disagreeable.” All forecasters can master this trait, as witnessed on our public forecasting platform, Good Judgment Open, at gjopen.com.

Throughout the 2020 election cycle, moderators observed very few comments that fell outside the bounds of reasonable civil discourse. This relative civility on GJ Open may surprise those accustomed to the rough-and-tumble of the Twitterverse. But it comes as no shock to Good Judgment’s co-founder Barb Mellers, whose research suggests that forecasting tournaments can reduce political polarization. As the election cycle intensified and the public debate grew more heated and personal elsewhere on social media, GJ Open continued to emphasize facts and reasoned argument. It showed that forecasters can learn to remain focused on what matters to the accuracy of their predictions and block out the noise of inflammatory rhetoric.

Keep score

Keeping score is essential to good forecasting, says Good Judgment’s co-founder Philip E. Tetlock. Superforecasters are not the only professionals who recognize this. Weather forecasters, bridge players and internal auditors all know that tracking prediction outcomes and getting timely feedback are strategies that improve forecasting performance. Superforecasters use quantifiable probabilities to express their forecasts and Brier scores to measure accuracy. Keeping score enables forecasters and companies to learn from past mistakes and calibrate their forecasts in the future.
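The Brier score itself is simple arithmetic: the squared gap between the probability assigned and the 0-or-1 outcome. Here is a minimal sketch in Python of the basic two-outcome form, with hypothetical numbers (Good Judgment scores multi-outcome questions with a variant that ranges from 0 to 2, but the idea is the same):

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome.

    0.0 is perfect; this two-outcome form tops out at 1.0.
    Lower scores over many questions mean better calibration.
    """
    return (forecast - outcome) ** 2

# Once the outcome is known (1 = it happened), a confident, correct
# 87% forecast scores far better than a fence-sitting 50%:
print(brier_score(0.87, 1))  # ~0.0169
print(brier_score(0.50, 1))  # 0.25
```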

No single forecast is truly right or wrong unless it is expressed in terms of absolute certainty (0% or 100%). If the probability of President Donald Trump being re-elected were 13% (Good Judgment’s forecast as of Nov. 1), he would be expected to win roughly 13 of every 100 times we could re-run history. That’s why forecasting accuracy is best judged over large numbers of questions.
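A toy simulation makes the point (illustrative code, not Good Judgment’s methodology): a single resolved question says little about whether a 13% forecast was good, but across many re-runs, or many similarly scored questions, the observed frequency should converge on the stated probability.

```python
import random

random.seed(0)  # reproducible toy example

def observed_frequency(prob: float, reruns: int = 100_000) -> float:
    """Fraction of simulated 're-runs of history' in which an event
    with true probability `prob` actually occurs."""
    return sum(random.random() < prob for _ in range(reruns)) / reruns

# One election tells us almost nothing; 100,000 re-runs converge on 13%.
print(observed_frequency(0.13))  # ~0.13
```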

The Superforecasters’ accuracy has been scrutinized over hundreds of questions, and a forecasting method that can beat them consistently has yet to be found. The Superforecasters know what they know—and what they don’t know. They know how to think through alternative scenarios, and they know the importance of keeping score. When it comes to calculating the odds for even highly polarized topics, their process shows how best practices deliver the best accuracy.

Warren Hatch, a former Wall Street investor, is CEO of Good Judgment Inc, a commercial enterprise that provides forecasts and forecasting training based on the expertise and research of the Good Judgment Project. @wfrhatch

FAR-OUT FORECASTS

When will SpaceX’s satellite internet service, Starlink, begin offering commercial service in North America?

123 forecasters on gjopen.com, 581 forecasts*

Before March 31, 2021: 0%
Between April 1 and June 30, 2021: 41%
Not before July 1, 2021: 59%

*As of Feb. 25, 2021

Will SpaceX and/or Virgin Galactic complete a successful space tourist flight before Jan. 1, 2022?

95 forecasters on gjopen.com, 116 forecasts*

Yes, only SpaceX: 31%
Yes, only Virgin Galactic: 5%
Yes, both: 3%
No: 61%

*As of Feb. 25, 2021

To forecast these and many other questions, visit gjopen.com
