1. Systems of Atomized Models

Introduction and definition

This research challenge seeks ways to model a system from already existing models, or to compose more comprehensive models from smaller building blocks, sometimes also called “atoms”, which may either be reused as-is or generated/built from scratch. The most important issue is therefore the definition or identification of the most apt modelling standards, procedures and methodologies, whether by adopting existing ones or by defining new ones. In addition, this sub-challenge calls for establishing the formal mechanisms by which models can be integrated, either to build bigger models or simply to exchange data and valuable information between models. Finally, the issues of model interoperability and of the availability of interoperable modelling environments should be tackled.

Why it matters in governance

Using existing objects/models that are able to describe systems, sub-systems and the interactions among them allows anyone to build their own insight into a specific problem or solution. In governance, this opportunity gives us the chance to:
  1. Release public data, linking them and producing visual representations able to reveal unanticipated insights.
  2. Use social computing to promote engagement and citizens’ inclusion in policy decisions, and exploit the power of ICT to mine and understand the opinions they express.
  3. Analyse policies and produce models that can be visualised and run to produce simulations able to show the effects and impacts from different perspectives such as political, economic, social, technological, environmental and legal facets.

Current Practice and Inspiring Cases

In systems analysis, it is common to deal with the complexity of an entire system by considering it to consist of interrelated sub-systems. This leads naturally to considering models as consisting of sub-models. Such a (conceptual) model can be implemented as a computer model consisting of a number of connected component models (or modules). Component-oriented designs represent a natural choice for building scalable, robust, large-scale applications and for maximizing ease of maintenance in a variety of domains.
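
To make the component idea concrete, the following minimal sketch in Python (all class and variable names are invented for illustration; no specific framework is implied) shows two component models coupled only through named inputs and outputs:

```python
# Minimal sketch of component-based model composition (illustrative only).
from abc import ABC, abstractmethod


class Component(ABC):
    """A self-contained sub-model that exchanges named values with others."""

    @abstractmethod
    def step(self, t: int, inputs: dict) -> dict:
        """Advance the component by one time step and return its outputs."""


class Rainfall(Component):
    """Toy weather driver: rains every third step."""

    def step(self, t, inputs):
        return {"rain_mm": 2.0 if t % 3 == 0 else 0.0}


class SoilWater(Component):
    """Toy soil store: gains rain, loses a fixed amount each step."""

    def __init__(self, storage_mm=50.0):
        self.storage_mm = storage_mm

    def step(self, t, inputs):
        self.storage_mm = max(self.storage_mm + inputs["rain_mm"] - 1.0, 0.0)
        return {"soil_water_mm": self.storage_mm}


class CoupledModel:
    """Runs components in order; each reads the pooled outputs of the others."""

    def __init__(self, components):
        self.components = components

    def run(self, steps):
        values = {}
        for t in range(steps):
            for component in self.components:
                values.update(component.step(t, values))
            print(t, values)


CoupledModel([Rainfall(), SoilWater()]).run(5)
```

Because coupling happens only through named values, a component of known and guaranteed quality can be replaced by an alternative implementation without touching the rest of the model.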

An implementation based on component models has at least two major advantages:
  1. First, new models can be constructed by connecting existing component models of known and guaranteed quality together with new component models. This has the potential to increase the speed of development.
  2. Secondly, the forecasting capabilities of two different component models can be compared, as opposed to comparing whole simulation systems, which would otherwise be the only option.
Further, common and frequently used functionalities, such as numerical integration services, visualisation and statistical ex-post analysis tools, can be implemented as generic tools, developed once and for all, and easily shared by model developers.

However, the current practice in composing and re-using models is still not sufficiently widespread. With respect to model reuse, this is mainly because very few repositories actually exist [2]. Moreover, the publicly available models are not “open” to modification or re-use. Some modelling environments (or modelling suites) provide examples and small libraries of ready-to-use models, but in most cases they are not completely open, nor is any explanation provided on how to reproduce them (their structure, parameters, etc.).

As an inspiring case, see the SEAMLESS project, which was funded by the EU Framework Programme 6 (Global Change and Ecosystems), ran from 2005 until March 2009, and developed a computerized framework for the integrated assessment of agricultural systems and the environment [3]. During the project, a modular approach was chosen to develop a system named the Agricultural Production and Externalities Simulator (APES), illustrated in Figure 4. APES is a modular simulation system targeted at estimating the biophysical behaviour of agricultural production systems in response to the interaction of weather, soils and different options of agro-technical management. Although a specific, limited set of components was available in the first release, the system was built to incorporate, at a later time, other modules needed to simulate processes not included in the first version. The processes are simulated in APES with deterministic approaches, mostly based on mechanistic representations of biophysical processes. APES was used to compare alternative agricultural and environmental policy options, facilitating the process of assessing key indicators that characterize interactions between agricultural systems, natural and human resources, and society. The developed framework, named SEAMLESS-IF in its final stage, also enabled the linkage of quantitative models, pan-European databases and qualitative procedures to simulate the impact on society of biophysical, economic and behavioural changes. SEAMLESS-IF now facilitates ex-ante assessments at the full range of scales, from the global to the field level, to support policy and decision making for sustainable development. It can be used to investigate the effects of agricultural and environmental policies while accounting for technical innovations; further, the interactions of such policies with other major trends, such as climate change and the increasing use of land for bio-fuel crops, can be studied efficiently in the near future.

[Figure 4: Agricultural Production and Externalities Simulator (APES) – Awaiting Image]

Analyses with SEAMLESS-IF can be done at multiple scales and with varying time horizons, whilst focusing on the most important issues emerging at each scale. This is possible because the framework is based on research innovations in linking models across scales, allowing consistent “micro-macro” analysis, as well as in linking models across disciplines, allowing “economic-biophysical” analysis. The linked models range from a bio-physical field model to a farm model and to an agricultural sector model for the EU; in other words, they ensure a consistent analysis of the effects EC policies may have on agricultural markets, farming systems and the environment. In addition, the effectiveness of a policy in its institutional context is assessed by applying qualitative procedures. The interlinked pan-European database provides the relevant data needed at the different scales.

Another inspiring example is Insight Maker (http://insightmaker.com). Insight Maker allows users to build simulation models (“Insights”) at all scales: from the smallest cell, to the social effects of product adoption, to global climate change. Once built, these models can be shared with others. They are called “Insights” because they typically reveal one or more fascinating points about the system under study. All simulations built with Insight Maker can be shared via the web, which means people can change the variables and see the results for themselves.

Vensim Molecules [4] is a software tool for constructing system dynamics models from “molecules” of system dynamics structure. Molecules are made of primitive stock-and-flow or auxiliary elements and are, in turn, the building blocks of complete models: elements of substructure serving a particular purpose. Molecules provide a framework for presenting important and commonly used elements of model structure, making it faster and easier to develop system dynamics models.
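
By way of illustration, the sketch below renders the molecule idea in plain Python (the class is invented and is not Vensim’s actual notation): a first-order smoothing structure, a classic system dynamics molecule, is defined once and instantiated twice:

```python
# A reusable "molecule": a stock that adjusts toward an input over time.
class SmoothMolecule:
    """First-order exponential smoothing, built from one stock and one flow."""

    def __init__(self, initial, adjustment_time):
        self.stock = initial
        self.adjustment_time = adjustment_time

    def step(self, target, dt=1.0):
        # The molecule's single flow: (goal - stock) / adjustment time.
        self.stock += (target - self.stock) / self.adjustment_time * dt
        return self.stock


# The same piece of structure reused twice within one model.
perceived_demand = SmoothMolecule(initial=100.0, adjustment_time=4.0)
perceived_price = SmoothMolecule(initial=10.0, adjustment_time=8.0)
for t in range(12):
    demand = perceived_demand.step(target=150.0)
    price = perceived_price.step(target=12.0)
print(round(demand, 1), round(price, 1))
```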

AnyLogic [5] is a multi-method simulation modelling tool capable of integrating and combining the following modelling approaches: system dynamics, discrete event simulation and agent-based modelling. AnyLogic’s simulation language is composed of stock-and-flow diagrams (used for system dynamics modelling), statecharts (which define the agents’ behaviour in agent-based modelling), action charts (used to define algorithms) and process flowcharts (the basic constructs for defining processes in discrete event modelling).
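
The toy sketch below, written in plain Python rather than in AnyLogic’s actual language, illustrates the multi-method idea: agents governed by a minimal statechart collectively drive a system-dynamics-style stock:

```python
# Multi-method toy model: agent statecharts feeding an aggregate stock.
import random

random.seed(1)


class Consumer:
    """Agent with a two-state statechart: 'potential' -> 'adopter'."""

    def __init__(self):
        self.state = "potential"

    def step(self, adoption_pressure):
        if self.state == "potential" and random.random() < adoption_pressure:
            self.state = "adopter"  # statechart transition


agents = [Consumer() for _ in range(1000)]
revenue = 0.0  # system-dynamics-style stock

for t in range(10):
    adopters = sum(a.state == "adopter" for a in agents)
    pressure = 0.01 + 0.2 * adopters / len(agents)  # word-of-mouth feedback
    for agent in agents:
        agent.step(pressure)
    revenue += adopters * 1.0  # flow into the stock: 1 unit per adopter
    print(t, adopters, round(revenue, 1))
```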

Available Tools

A comprehensive review of the available tools is ongoing.

Key challenges and gaps

With regard to implementation architecture and the use of modelling frameworks, there are two major problems:
  1. the framework design and implementation must carefully balance flexibility against usability, to avoid incurring either a performance penalty or too steep a learning curve for users, and
  2. developing components for a specific framework constrains their use to that framework.
The most immediate option for overcoming these problems is to develop inherently reusable (i.e. non-framework-specific) components, which can be used in a specific modelling framework by encapsulating them in dedicated classes called “wrappers”; such classes act as bridges between the framework and the component interface. The disadvantage of this solution is the creation of another “layer” in the implementation, on top of the machinery already implemented in the framework. The appropriateness of this solution, in terms of both ease of implementation and overall performance, must be evaluated case by case.
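
A minimal sketch of this wrapper approach follows (the framework interface and all names are hypothetical): an inherently reusable component, free of any framework dependency, is adapted to a framework’s expected interface by a thin bridging class:

```python
# Sketch of the "wrapper" pattern for reusing non-framework-specific components.
class CropGrowth:
    """Inherently reusable component: plain code, no framework dependency."""

    def grow(self, biomass: float, radiation: float) -> float:
        return biomass + 0.5 * radiation  # toy growth rule


class FrameworkComponent:
    """Interface that the (hypothetical) modelling framework expects."""

    def update(self, state: dict) -> dict:
        raise NotImplementedError


class CropGrowthWrapper(FrameworkComponent):
    """Bridges the framework's dict-based interface to the component's API.

    This is the extra implementation layer whose cost, in ease of
    implementation and performance, must be evaluated case by case.
    """

    def __init__(self, component: CropGrowth):
        self.component = component

    def update(self, state: dict) -> dict:
        state["biomass"] = self.component.grow(state["biomass"],
                                               state["radiation"])
        return state


wrapped = CropGrowthWrapper(CropGrowth())
print(wrapped.update({"biomass": 1.0, "radiation": 3.0}))
```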

Regardless of the choice between framework-specific and inherently reusable components, there is a more basic trade-off which must be carefully evaluated first: that between the framework as a flexible modelling environment for building complex models (model linking) and the framework as an efficient engine for the simulation and calibration of model components (model execution). Modern software technologies allow building flexible, coherent and elegant constructs, but this comes at a performance cost. Without going into specific references to Object-Oriented Programming (OOP), it is important to point out that object-oriented constructs, which enhance the flexibility, modularity and reuse of software, require virtual method calls, dynamic dispatch, and so on. These operations are resource-intensive and can, in some cases, heavily affect code performance; this becomes evident in applications where such calls are made thousands of times at every simulation step.
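
The trade-off can be made tangible with a rough timing sketch (the absolute figures depend on machine and language; the point is only that per-call dispatch overhead accumulates when repeated thousands of times per step):

```python
# Rough illustration: per-object method calls vs. one batched array update.
import time

import numpy as np


class Stock:
    def __init__(self, level):
        self.level = level

    def integrate(self, inflow, dt):
        self.level += inflow * dt


stocks = [Stock(0.0) for _ in range(100_000)]
levels = np.zeros(100_000)

t0 = time.perf_counter()
for stock in stocks:  # one dynamically dispatched call per component
    stock.integrate(1.0, 0.1)
t1 = time.perf_counter()
levels += 1.0 * 0.1  # the same update as a single batched operation
t2 = time.perf_counter()

print(f"per-object calls: {t1 - t0:.4f}s, batched update: {t2 - t1:.4f}s")
```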

Moreover, the Model Composition horizon is even more clouded, as the potential advantages of composing bigger models from smaller ones have been demonstrated only recently. This is essentially due to the problem of interoperability and integration of different vendors’ (thus proprietary) model formats, and to the lack of standards for performing composition tasks. Another problem stems from the fact that many models are still too dependent on their implementation methodology. Furthermore, model integration is at present almost non-existent: very few modelling environments/suites provide import/export functionalities, and a standard language for model interoperability is not currently available. Most of the current practice for data communication or information transfer relies on third-party solutions: interoperability is in most cases achieved by transferring data via electronic spreadsheets or, only in rare cases, by using Database Management Systems (DBMS) or Enterprise Resource Planning (ERP) systems.
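
The sketch below illustrates this current practice (the file name and figures are invented): two mutually unaware models exchange results through a spreadsheet-like CSV file used as a neutral carrier:

```python
# Data hand-off between two models via a neutral, spreadsheet-like file.
import csv

# Model A writes its simulation output...
with open("model_a_output.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["year", "gdp_growth"])
    for year, growth in [(2020, 1.2), (2021, 2.4), (2022, 1.8)]:
        writer.writerow([year, growth])

# ...and Model B, possibly from a different vendor, reads it back in.
with open("model_a_output.csv", newline="") as f:
    series = {int(row["year"]): float(row["gdp_growth"])
              for row in csv.DictReader(f)}
print(series)
```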

Current research

Current research, like previous research, has not yet tackled (with the exception of just a few cases) the problem of integrating different models. At present, owing to the plethora of different modelling/simulation environments and suites, as well as to differences between scientific fields, many competing file formats exist. It is possible that vendors perceive modelling practice as a very small market niche (users stem mainly from academia and, to a much smaller extent, from private companies using Decision Support Systems, while the Public Administration share is negligible) and are therefore reluctant to introduce interoperability features.

Also, current research, as well as previous research, has only recently begun to explore the following issues:
  1. Open-source modelling and simulation environments (open environments are rising in importance in the research community, albeit in most cases they only provide the possibility to implement and simulate a model according to the single modelling methodology they refer to).
  2. Communication of data among models developed in different proprietary (or open) environments through third-party solutions (e.g. interoperability is in most cases only achieved by transferring data by means of electronic spreadsheets or, only in rare cases, by using a DBMS or an organisation’s ERP system).
  3. Open visualisation of results stemming from model simulation (e.g. online visualisation of simulation results in a browser, by interfacing directly with the simulation engines in a few cases or, as is more often the case, by connecting to a third-party means of exchange, as described in the previous bullet point).

Future research

Future research should therefore focus on:

  1. Definition of standard procedures for model composition/decomposition, e.g. how to deductively pass from a macro-description of a model to the fine-grained definition of its building blocks or molecules (top-down approach), or how to inductively conceive the progressive composition of bigger models by aggregating new parts as they are needed (bottom-up approach) or by expanding already existing objects.
  2. Proposal of a minimal set of archetypal structures, building blocks or molecules that might be used at the appropriate level of decomposition of the model (e.g. systemic archetypes, in the Systems Thinking / System Dynamics tradition, might be useful for describing the overall behaviour of the main variables of the system to be modelled at a macro-to-middle level). The procedures to implement, validate and redistribute any further improvement of these “minimal” objects should also be investigated.
  3. Definition of open modelling standards as the basis for interoperability, that is, defining common file formats and templates (e.g. by means of XML) which would allow the models described in those files to be opened, accessed and integrated in every (compliant) model-design and simulation environment (a minimal sketch of such a format is given after this list).
  4. Interoperability, also understood in terms of Service-Oriented Architectures (SOA): certain stand-alone, always-operative models might expose “services” making available their endogenous data, bits of information, or some particular function or structural part, while other models may request those services when needed. This creates the need to define model repositories, lists of operative models and the functionalities they expose, which ultimately entails the definition of a SOA among interoperable models (see the second sketch after this list).
  5. Definition and implementation of model repositories (and of procedures for adding new objects to them), even if they are restricted to hosting models developed according to a specific methodology (agent-based, system dynamics, event-oriented, stochastic, etc.).
  6. Definition and implementation of the new relationships that are created when two models are integrated. All potentially important relationships resulting from a model integration/composition should be identified and, where appropriate, included in the resulting integrated model.
  7. Input/output definition and re-definition: the integration of modelling techniques is a pertinent issue within the scope of this challenge. Multi-modelling tools should, in future, be available not only to experts but also to lay users. Moreover, at present, only a few of the available modelling/simulation suites offer the possibility to build a model using a different modelling methodology.
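
As a purely illustrative sketch of item 3, the fragment below shows what a minimal XML-based model interchange format could look like; every element and attribute name is invented here, since no such standard currently exists:

```python
# Hypothetical XML interchange format for a simple stock-and-flow model.
import xml.etree.ElementTree as ET

MODEL_XML = """
<model name="population" method="system-dynamics">
  <stock name="population" initial="1000"/>
  <flow name="births" target="population" equation="population * birth_rate"/>
  <parameter name="birth_rate" value="0.02"/>
</model>
"""

# Any compliant model-design or simulation environment could parse this
# description and rebuild the model's structure from it.
root = ET.fromstring(MODEL_XML)
print("model:", root.get("name"), "| method:", root.get("method"))
for element in root:
    print(element.tag, dict(element.attrib))
```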
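As a second illustrative sketch, for item 4, the fragment below mimics a SOA among models with a simple in-process service registry; real deployments would expose networked endpoints (e.g. REST services), and all names here are invented:

```python
# In-process stand-in for a SOA among models: a registry of exposed services.
service_registry = {}  # hypothetical repository: service name -> callable


def expose(name):
    """Register a model function as a discoverable service."""
    def decorator(fn):
        service_registry[name] = fn
        return fn
    return decorator


# A weather model exposes one of its endogenous series as a service...
@expose("weather/temperature")
def temperature(day: int) -> float:
    return 15.0 + 10.0 * ((day % 365) / 365)  # toy seasonal profile


# ...and a crop model, developed independently, requests it when needed.
def crop_stress(day: int) -> float:
    temp = service_registry["weather/temperature"](day)
    return max(0.0, temp - 20.0)  # stress above a 20-degree threshold


print(crop_stress(300))
```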