posted on February 01, 2013 16:10
Thanks to Crossover I learned a lot this week about the sophisticated models used to help policy makers reach their decisions. The development and use of these models were the topic of the project's second workshop, held in Washington and organised by the Millennium Institute.
When I hear about computer models I tend to think of weather forecasting, budgetary forecasting or engineering tests. Policy modelling uses those same techniques to examine the likely impact of decisions on things like water supply, energy consumption, unemployment, health and so on. The Millennium Institute has a very sophisticated and impressive modelling platform called Threshold 21, or T21. It has been used by governments in Tanzania, Ghana, China, the US and more. One model we saw for China considered the likely impact of choosing a high-tech or low-tech strategy. If I recall correctly, the high-tech one predicted a higher future GDP but unemployment at nearly 20%. 1 in 5 Chinese unemployed? Not a good plan. Likewise, other options would lead (in various countries) to hugely improbable energy requirements, importation of water and so on. This kind of modelling helps present the bigger picture, bringing in more factors than any one person or specialised group of people can keep in their heads.
A recurring theme was climate change. Not modelling the climate change itself but the likely impact of various policy choices. A study on energy-intensive industries was interesting. Not all the factors in such a domain will be within the control of the industry or the government – the international price of aluminium, for example. In that case, discussion between government and industry around the model led to consensus on a course of action that was positive in terms of climate change mitigation and that did not adversely affect the industry.
We heard that greenhouse gas emissions could be substantially reduced by a switch from road to rail freight in the US (hasn't that been said for years?) and how, thanks to modelling, young people are not going to repeat the mistakes of their parents and grandparents. Good luck with making all the right decisions from now on then!
The overall themes from the event that I noted were:
- That models need to take account of very non-linear things, namely people. A model of a physical system can predict with certainty that if you knock a ball the same way twice it will react the same way both times. People aren't like that, and human behaviour itself needs to be part of the model.
- That policy models work when many different stakeholders are engaged. For example, it's no good just talking to farmers about water management, you need to talk to the mining industry as well. That's an over-simplified example – the point made repeatedly was that we all see the world from our own perspective and models must take account of multiple viewpoints if they're to be effective.
- Messages from the models need to be conveyed in easy-to-understand stories, sometimes even childlike stories.
- Open data is providing modellers with significant new sources of data.
- It's crucial to know and understand the assumptions that lie behind some data sets. Different assumptions prevent interoperability between different data sets about the same thing.
- The digital divide means that models that only take account of online opinion will miss significant proportions of the population – perhaps the very proportion the policy is addressing.
- Models show predictions for different scenarios and allow policy makers to take the decisions. They don't all lead to a single 'right answer', indeed, they often show that all options are bad in some way.
- Models rely on the choices made by the modeller about where to draw the boundary between what is inside and outside the model. That choice inevitably affects the results.
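That first theme – the non-linearity of human behaviour – is worth a small illustration. Below is a toy threshold-adoption model (a hypothetical sketch of my own, not anything presented at the workshop): each simulated person adopts a behaviour once the fraction of adopters exceeds their personal threshold. Removing a single person can tip the outcome from total adoption to almost none – exactly the kind of sensitivity a physical model of balls and forces never shows.

```python
# Toy agent-based model: each agent adopts a behaviour once the
# fraction of adopters in the population exceeds that agent's
# personal threshold. Illustrates how non-linear collective human
# behaviour can be: one agent can change the whole outcome.

def simulate(thresholds):
    """Iterate to a fixed point; return the final number of adopters."""
    adopted = [t <= 0.0 for t in thresholds]  # zero-threshold agents start
    n = len(thresholds)
    while True:
        frac = sum(adopted) / n
        new = [a or (t <= frac) for a, t in zip(adopted, thresholds)]
        if new == adopted:          # no change: equilibrium reached
            return sum(new)
        adopted = new

# Thresholds spread uniformly 0.00, 0.01, ..., 0.99: each new adopter
# tips the next agent over, so the cascade runs through all 100.
cascade = simulate([i / 100 for i in range(100)])

# Remove just the agent with threshold 0.01 and the cascade stalls
# at a single adopter: one "person" flips the entire result.
stalled = simulate([0.0] + [i / 100 for i in range(2, 101)])

print(cascade, stalled)
```

The parameters here are invented, but the qualitative point is the one made at the workshop: models of people need to capture this kind of tipping behaviour, not just smooth cause and effect.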
During the two days, either in plenary or in one-to-one conversations, I saw three very noteworthy initiatives. One of the Crossover project's animators, Scott Fortmann-Roe, is behind Insight Maker. This is a free online tool that allows you to create your own models with powerful simulation algorithms for System Dynamics and Agent Based Modelling. iMODELER is an alternative application that emphasises its analytical and visualisation capabilities. Both of these run directly in a Web browser. A third system, GLEAMviz, runs outside the browser and so needs to be downloaded separately, but offers a different set of capabilities. GLEAMviz is specifically designed to model epidemiology and uses a variety of data sources within its model, notably mobility data (how many people fly how often from where to where, travel-to-work times and so on).
What appeals to me about these tools is that they're based on combinations of Web technologies and open data and offer prime examples of what the Web can do today that wouldn't have been possible 5 years ago. Now all we need to do is to make sure that decision makers recognise the enormous power of these tools so that we can look forward to more evidence-based policy making.
Mayumi Sakoh, Andrea Bassi and Jed Shilling of the Millennium Institute are to be congratulated on putting on a very interesting event despite the weather on Monday morning causing problems. Thanks too to the New America Foundation for hosting the event and, of course, to the speakers and fellow Crossover partners for all the contributions.