Wading through a world of uncertainty
Computer game improves decision-making
A prototype game helps improve decision-making skills by training users to understand their uncertainties.
World of Uncertainty presents a series of multiple-choice questions; the aim isn't to answer them correctly, but to assess a participant's level of uncertainty.
An interactive slider lets participants indicate their confidence in each answer, and points are awarded according to how accurately they estimated their level of certainty. Feedback helps users recognise, and correct for, over- or under-confidence when faced with similar decisions.
Dr David Newman of Queen's University Belfast, who led the project, said: “Whether the choices facing us are simple or complex, a greater awareness of uncertainty and of our own biases can improve the quality of our decision-making. We believe there’s real potential for people to acquire that awareness through computer games.”
The researchers think the software could be adapted for use by commercial games developers and turned into an e-learning training tool, or incorporated into existing video-games that have a strategic element.
World of Uncertainty is a four-year project that received over £269,000 in funding from the EPSRC - the main UK government agency for funding research and training in engineering and physical sciences. It invests £850m a year into a broad range of subjects.
COMMENTS
Programming fail
loggedin.aspx - "You won't see this because this page basically redirects to the relevant page based on role type"
O RLY?
This does not fill me with confidence regarding the qualities of the "game".
The point is to calibrate your uncertainty
Once you have done more than 4 quizzes, you get a calibration curve showing where you are over- or under-confident. The point is to see if it is possible to develop your skills in estimating your subjective Bayesian probability. The scoring formula used is Brier's proper scoring rule - but modified so that you cannot get negative scores.
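The two ingredients named above — a Brier score rescaled to be non-negative, and a calibration curve built from binned confidence estimates — can be sketched as follows. This is a minimal illustration, not the project's actual code: the 0–100 point scale, the 10%-wide bins, and the function names are all assumptions.

```python
def brier_score(confidence, correct, max_points=100):
    """Shifted/scaled Brier rule (illustrative scale, not the game's):
    max_points for a fully confident correct answer, 0 for a fully
    confident wrong one - never negative."""
    outcome = 1.0 if correct else 0.0
    return max_points * (1.0 - (confidence - outcome) ** 2)

def calibration_curve(answers, n_bins=10):
    """answers: list of (confidence, correct) pairs, confidence in [0, 1].
    Returns (bin_midpoint, observed_accuracy) for each non-empty bin;
    accuracy below the midpoint suggests over-confidence in that band."""
    bins = [[] for _ in range(n_bins)]
    for confidence, correct in answers:
        idx = min(int(confidence * n_bins), n_bins - 1)
        bins[idx].append(1.0 if correct else 0.0)
    return [((i + 0.5) / n_bins, sum(b) / len(b))
            for i, b in enumerate(bins) if b]

print(round(brier_score(0.9, True), 1))   # 99.0 - confident and right
print(round(brier_score(0.9, False), 1))  # 19.0 - confident and wrong
```

Because the Brier rule is *proper*, the best strategy is to report your honest probability: hedging when sure, or bluffing certainty when unsure, both lower your expected score - which is what makes the game a calibration exercise rather than a quiz.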
The point was not to make a commercial game, but to develop a research tool to help us find out if people can improve their skills in estimating odds. It was tested on over 500 members of iPoints. It turns out that if you are in a hurry, you don't improve at anything - in fact you get worse. If you take more than 20 or 30 seconds per question (i.e. you stop and think), you get better over time.
As for the coding, it was done by a games developer attached to Brunel University - nothing to do with Queen's University Belfast (where a Ph.D. student has been researching the learning effects in the Management School, not CS). In any case, a commercial game costs $10,000 to build and test. There is a limit to what one person can do when contracted part-time over a year. The big difference from commercial games is that the uncertainty is explicit, not hidden in the code that drives a monster. Imagine someone developing an uncertainty engine, analogous to physics engines, that could then be used in commercial games.
buggy? unpolished?
sounds about right for software coming out of QUB CS School.
anon 'cos I studied there.
So just to be completely clear on this
They are taking the number of correct entries and scaling it against the certainty level you entered?
So they have this data harvesting site, and an extremely basic statistical model behind it. No adaptive routines? No advanced mathematical formulae?
Seriously?
Finished
Apparently this project was set to be completed in June 2010, so what you see is the polished completed product.
http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/E018092/1
The mind boggles.
