Introduction and definition

As policy models grow in size and complexity, analysing and visualising the resulting large volumes of data becomes an increasingly difficult task. Traditionally, data analysis and visualisation were performed as post-processing steps after a simulation had completed. As simulations grew in size, this task became ever harder, often requiring significant computation, high-performance machines, high-capacity storage, and high-bandwidth networks.
Computational steering is an emerging technology that addresses this problem by “closing the loop”: it provides a mechanism for integrating modelling, simulation, data analysis and visualisation. This integration allows a researcher to control simulations interactively and perform data analysis while avoiding many of the pitfalls of the traditional batch/post-processing cycle.
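The “closed loop” can be illustrated with a minimal sketch. All names here (`run_steered_simulation`, the `"rate"` parameter, the trivial state update) are hypothetical illustrations, not part of any particular steering framework: the point is only that steering commands are consumed between timesteps, so a user can change a parameter while the run is in progress rather than restarting a batch job.

```python
import queue

def run_steered_simulation(steps, controls, initial_rate=1.0):
    """Toy simulation whose 'rate' parameter can be steered mid-run
    via a thread-safe control queue (hypothetical interface)."""
    state = 0.0
    rate = initial_rate
    history = []
    for step in range(steps):
        # Between timesteps, apply any pending steering commands.
        try:
            while True:
                name, value = controls.get_nowait()
                if name == "rate":
                    rate = value
        except queue.Empty:
            pass
        state += rate          # trivial stand-in for a real model update
        history.append((step, rate, state))
    return history

controls = queue.Queue()
controls.put(("rate", 0.5))    # a steering command issued by the user
hist = run_steered_simulation(4, controls)
```

In a real environment the queue would be fed by a visualisation front-end, and the state update would be the actual numerical model; the control flow, however, is the essence of the technique.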
This research challenge concerns the integration of visualisation techniques within an integrated simulation environment. Such integration plays a crucial role in making the policy modelling process both more comprehensive and more comprehensible. The real aim of interactive simulation is, on the one hand, to allow model developers to manage complex models easily and to integrate them with data (e.g. real-time data or qualitative data) and, on the other hand, to allow other stakeholders not only to understand the simulation results better, but also to understand the model and, ultimately, to become involved in the modelling process.

Interactive simulation can dramatically increase the efficiency and effectiveness of the modelling and simulation process, allowing the inclusion and automation of phases (e.g. output and feedback analysis) that until now have not been managed in a structured way.
Why it matters in governance

Interactive simulation is a specific form of simulation. With respect to policy assessment in governance, this challenge may:
- Accelerate the simulation process: policy makers can analyse simulation results, possibly run new scenarios, and make decisions as quickly as possible and at minimum cost.
- Collaborative environment: the larger the number of stakeholders involved in the policy modelling and simulation process, the greater the need for an interactive simulation environment that allows non-experts to use the model and understand the results, while also enabling experts to grasp new requirements and the consequent modifications easily.
- Citizen engagement: interactive simulation tools help to engage citizens in the policy-making process and to present the results to them in a simple way.
- Data integration: interactive simulation tools support the management of large volumes and many different types of data and information, for both input and output/feedback analysis.
Current Practice and Inspiring cases

In current practice, data analysis and visualisation, although critical to the process, are often performed as a post-processing step after batch jobs have run. As a result, errors that invalidate the results of an entire simulation may be discovered only during post-processing. Moreover, decoupling simulation from analysis and visualisation can present serious scientific obstacles to the researcher in interpreting the answers to “what if” questions. Given these limitations of the batch/post-processing cycle, it is advisable to break the cycle and integrate simulation and visualisation more tightly.

Implementing an interactive simulation and visualisation environment requires the successful integration of many aspects of scientific computing, including performance analysis, geometric modelling, numerical analysis, and scientific visualisation. These requirements need to be coordinated effectively within an efficient computing environment. Recently, several tools and environments for computational steering have been developed. They range from tools that modify the performance characteristics of running applications, either automatically or through user interaction, to tools that modify the underlying computational application itself, thereby allowing steering of the computational process.
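The cost of the batch cycle can be made concrete with a small, purely illustrative sketch (the function names, the doubling update, and the validity bound are all invented for this example). In batch style, a run that has already diverged still executes to completion and the error surfaces only in post-processing; with in-run monitoring, the same divergence is detected at the step where it occurs and the run can be stopped or corrected.

```python
def batch_run(steps, update):
    """Batch style: every step runs; errors surface only when the
    final state is inspected in post-processing."""
    state = 1.0
    for _ in range(steps):
        state = update(state)
    return state

def monitored_run(steps, update, valid):
    """Steered style: the state is validated after every step, so a
    diverging run is caught immediately instead of wasting the budget."""
    state = 1.0
    for step in range(steps):
        state = update(state)
        if not valid(state):
            return step, state   # abort early; steering could correct here
    return steps, state

unstable = lambda s: s * 2.0          # a model update that blows up
finite = lambda s: abs(s) < 1e6       # a hypothetical validity check

steps_done, state = monitored_run(100, unstable, finite)
```

Here `monitored_run` stops after a small fraction of the 100 budgeted steps, whereas `batch_run` would have burned the whole budget before the problem became visible.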
Available Tools

A review of the available tools is ongoing.
Key challenges and gaps

However, the development of immersive tools is still driven by model developers' needs, so a gap remains between the requirements of policy makers and those of developers. In a collaborative modelling environment, interaction is fundamental to speeding up the process and making ICT tools user-friendly for all the stakeholders involved in developing a policy model.
Current research

In current research, interactive visualisation typically combines two main approaches: providing efficient algorithms for presenting the data and providing efficient access to the data. The first is the more obvious advance, albeit a challenging one. Even though computers continually get faster, data sizes are growing at an even faster rate, so for many problem domains the total time from data to picture is not decreasing. Alternative algorithms, such as ray tracing (Nakayama, 2002) and view-dependent algorithms (Lessig, 2009), can restore a degree of interactivity for very large datasets; each has its trade-offs and suits different scenarios. The second advance is less obvious but very powerful. By integrating visualisation tools with simulation codes, a scientist can achieve a new degree of interactivity through direct visualisation and even manipulation of the data. The scientist need not wait for the computation to finish before interacting with the data, but can interact with a running simulation. While conceptually simple, this approach poses numerous technical challenges.
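One of those technical challenges is letting a visualisation front-end read, and even modify, the state of a computation that is still running. A minimal sketch of the idea, using only standard threading primitives (the class name, the `gain` parameter, and the trivial update rule are all hypothetical):

```python
import threading

class LiveSimulation:
    """Sketch: a simulation running in a background thread that exposes
    its latest state for visualisation, and accepts steering, mid-run."""
    def __init__(self):
        self._lock = threading.Lock()
        self.step = 0
        self.value = 0.0
        self.gain = 1.0                  # steerable parameter

    def run(self, steps):
        for _ in range(steps):
            with self._lock:             # keep state consistent for readers
                self.value += self.gain  # stand-in for a real model step
                self.step += 1

    def snapshot(self):
        # Called by a visualisation front-end at any time during the run.
        with self._lock:
            return self.step, self.value

    def steer(self, gain):
        # Called by the user while the simulation is still executing.
        with self._lock:
            self.gain = gain

sim = LiveSimulation()
worker = threading.Thread(target=sim.run, args=(1000,))
worker.start()
sim.steer(2.0)           # adjust a parameter while the simulation runs
worker.join()
step, value = sim.snapshot()
```

Real steering systems face the same concurrency problem at much larger scale (distributed memory, huge state), but the lock-protected snapshot/steer pattern captures the core difficulty: the data must remain consistent while two parties touch it at once.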
Future research

Looking ahead, interactive simulation plays a crucial role in a collaborative modelling environment. The trade-off between enlarging models and incorporating several kinds of data, on the one hand, and keeping the model understandable and modifiable by many people, on the other, should be analysed in depth. To this end, some fundamental issues must be addressed:
- Systems should be modular and easy to extend within existing codebases.
- Users of the systems should be able to add new capabilities easily without being experts in systems programming.
- Input / output systems should be easily integrated.
- Steering systems should be adaptable to hardware ranging from the largest of supercomputing systems to low-end workstations and PCs.
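The first two requirements above, modularity and extension by non-experts, can be sketched with a simple plugin registry (the class, decorator, and module names are invented for illustration; they do not describe any existing steering system). A user adds a new analysis capability by writing an ordinary function and registering it, with no knowledge of the system's internals:

```python
class SteeringSystem:
    """Sketch of the modularity requirement: capabilities are plain
    functions registered by name, so extending the system needs no
    systems-programming expertise."""
    def __init__(self):
        self._modules = {}

    def register(self, name):
        def decorator(fn):
            self._modules[name] = fn   # store the capability by name
            return fn
        return decorator

    def run_module(self, name, data):
        return self._modules[name](data)

system = SteeringSystem()

@system.register("mean")
def mean_of_outputs(data):
    # A user-contributed output-analysis module: just a function.
    return sum(data) / len(data)

result = system.run_module("mean", [2.0, 4.0, 6.0])  # → 4.0
```

The same registry could dispatch input/output adapters or visualisation back-ends, which is one way the third requirement (easy I/O integration) could be met within the same design.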