I see a forecast in your future
A few months ago I wrote about the best single metric for measuring marketing. That metric:

It’s the forecast, compared to actuals. If the forecast is accurate to ±3%, you’ve got great marketing. If ±10%, you’ve got good marketing.
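A back-of-the-envelope way to apply that yardstick, as a minimal Python sketch. The thresholds are the ones above; the revenue figures are made up for illustration:

```python
# Grade a forecast against actuals using the (illustrative) thresholds above:
# absolute error <= 3% = great, <= 10% = good, otherwise needs work.
def grade_forecast(forecast: float, actual: float) -> str:
    error_pct = abs(forecast - actual) / actual * 100  # absolute percent error
    if error_pct <= 3:
        return f"great ({error_pct:.1f}% off)"
    if error_pct <= 10:
        return f"good ({error_pct:.1f}% off)"
    return f"needs work ({error_pct:.1f}% off)"

# Made-up quarterly revenue numbers, in $M:
print(grade_forecast(forecast=49.0, actual=50.0))  # great (2.0% off)
print(grade_forecast(forecast=45.0, actual=50.0))  # good (10.0% off)
print(grade_forecast(forecast=40.0, actual=50.0))  # needs work (20.0% off)
```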

So I was happy to see a book on the scientific study of forecasting. Short review: a great book for those curious about the future.

Superforecasting
America spends over $50B annually on our various “intelligence” services. There’s no exact number because the budgets are classified.

The Intelligence Community includes the CIA, NSA, DIA and another dozen or so alphabet agencies. When agency heads aren’t busy lying to Congress, they’re working hard to plant backdoors in the Internet – another forecastable fiasco.

They also aren’t very good forecasters. They missed 9/11 – well, George W. Bush blew off their warning, too – but the worst was their “slam dunk” assessment of WMDs in Iraq. A trillion dollars and many thousands of casualties later, oopsie!

The Good Judgment Project
To their credit, the Intelligence Advanced Research Projects Activity (IARPA) decided to see what could be done to improve IC accuracy. That’s where the authors of Superforecasting – Penn prof Philip E. Tetlock and writer Dan Gardner – come in.

IARPA set up a prediction tournament and invited five scientific teams to make frequent and measurable predictions, along with a control group and a team from the IC that had all the intelligence that $50B can buy. The contest was supposed to run for five years, but the other teams were dropped after two because Tetlock’s team blew them all away, including the IC team.
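“Measurable” here means probabilistic forecasts graded with Brier scores, the metric the book spends a lot of time on. A minimal sketch of how that grading works, with made-up probabilities:

```python
# Brier score: sum of squared errors between the forecast probabilities and
# what actually happened. In the two-sided form the book describes, 0 is a
# perfect forecast and 2 is a confidently wrong one.
def brier_score(forecast_probs: list[float], outcome_index: int) -> float:
    """Score a probabilistic forecast over mutually exclusive outcomes."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

# "Will event X happen this quarter?" -- forecaster says 70% yes, 30% no.
# If it happens (outcome_index=0): (0.7-1)^2 + (0.3-0)^2 = 0.18
print(brier_score([0.7, 0.3], outcome_index=0))  # 0.18
# A fence-sitting 50/50 forecast scores 0.5 no matter what happens.
print(brier_score([0.5, 0.5], outcome_index=0))  # 0.5
```

Averaged over hundreds of questions, scores like these are what separated the superforecasters from everyone else.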

The top two percent of Tetlock’s Good Judgment Project team excelled and, for the most part, continued to offer excellent forecasts for the remainder of the contest. The superforecasters weren’t geniuses, just bright, curious, flexible and careful to make fine distinctions. Yes, making fine distinctions makes you more accurate – among other best practices.

The StorageMojo take
In short, anyone who is a product or marketing manager at a tech company should be able to dramatically improve their forecasting skills by taking the book’s lessons to heart. The book also describes the flaws common to bad forecasts: use it to amp up your BS detector.

The book is well-written – thanks Dan! – and is a gentle but thorough introduction to forecasting pitfalls, both psychological and statistical. There’s a helpful appendix, Ten Commandments for Aspiring Superforecasters, but this is one case where the book is meaty enough to reward a careful read.

Highly recommended!

Courteous comments welcome, of course.