Call for Papers: Environmental Micro-simulation

This call for papers for a special issue of Ecological Complexity addresses some of the issues I’ve been discussing recently, and hopes to present examples of multi-model approaches to assessing environmental simulation models. If I’d seen this earlier I might have tried to put something together. As it is I’ll just have to keep an eye open for the issue when it comes out sometime next year.

Call for Papers

Ecological Complexity is pleased to announce a special issue on: Environmental micro-simulation: From data approximation to theory assessment

Spatial micro-simulation has recently become a mainstream element in environmental studies. Essentially, different models, representing the same phenomena, are being extensively published and the “next step” sought is hypothesis testing, regarding the factors that determine system dynamics. However, the problem arises that assessment of environmental theories using spatial micro-simulation lacks a leading paradigm. While the Occam’s razor of positivism, which works perfectly in physics and chemistry, demands datasets covering the entire space of model parameters, the experimental abilities of environmentalists are limited and the data collected in the field represent only a small part of the always multi-dimensional parameter space. Consequently, any given model can be considered as merely approximating the few data sets available for verification and its theoretical validity is thus brought into question.

To overcome this limitation, we propose to generate a virtual world that will allow hypothesis testing based on environmental theory. That is, we propose to implement micro-simulation models using high-resolution GIS databases and use them as a surrogate for reality, instead of the limited empirical database. GIS enables a realistic-looking virtual world to be generated that, unlike the real one, provides the parameters characteristic of every trajectory. The almost unlimited data that can be generated from such a virtual world can then be used to assess our ability to extract rules and dependencies, estimate parameters and, finally, make applicable forecasts.
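
To make the virtual-world idea concrete, here is a minimal sketch (my own illustration, not part of the call): a toy cellular spread model with a known parameter acts as the ‘world’, and we compare the parameter recovered from exhaustive virtual-world observation with the estimate from a small, field-campaign-sized sample. The model, the parameter names, and the 2% sampling fraction are all assumptions made for illustration.

```python
# A toy "virtual world": a cellular spread process built with a known
# parameter. We then compare the parameter recovered from exhaustive
# virtual-world data against the estimate from a sparse "field" sample.
import numpy as np

rng = np.random.default_rng(42)
P_TRUE = 0.3              # the parameter the virtual world is built with
SIZE, STEPS = 100, 20

world = np.zeros((SIZE, SIZE), dtype=bool)
world[SIZE // 2, SIZE // 2] = True          # seed the process at the centre

outcomes = []   # one 0/1 colonisation record per exposed cell per step
for _ in range(STEPS):
    w = world.astype(int)
    # count occupied 4-neighbours for every cell (toroidal wrap via roll)
    nbrs = (np.roll(w, 1, 0) + np.roll(w, -1, 0) +
            np.roll(w, 1, 1) + np.roll(w, -1, 1))
    exposed = (~world) & (nbrs > 0)
    colonised = exposed & (rng.random(world.shape) < P_TRUE)
    outcomes.extend(colonised[exposed].astype(int))
    world |= colonised

outcomes = np.array(outcomes)
full = outcomes.mean()                      # every event in the world observed
field = rng.choice(outcomes, size=max(1, len(outcomes) // 50)).mean()  # ~2%
print(f"true={P_TRUE}  virtual-world estimate={full:.3f}  "
      f"sparse 'field' estimate={field:.3f}")
```

In a real study the ‘world’ would of course be a high-resolution, GIS-driven simulation rather than a toy grid, but the logic of testing our rule-extraction and estimation abilities against a known generating process is the same.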

This special issue will focus on investigating models as representations of environmental theory with the help of a combination of real data and artificial worlds. We invite innovative research papers that employ different high-resolution models for generating virtual worlds and compare them to each other, with the aim of developing a better understanding of environmental theory. Examples include studies of a model’s robustness, comparative studies of dynamic models, and investigations of the limitations of data-fitting methods and of a model’s sensitivity to changes in spatial and temporal resolution.

Scope
All sorts of micro-simulation, including cellular automata, agent-based systems, fuzzy systems, ANNs and genetic algorithms, are welcome. The environmental systems of interest include, but are not limited to:

  • Complex ecosystems
  • Landscape ecology
  • Terrain analysis and landscape evolution
  • Agriculture and pastoralism
  • Human-environment interaction
  • Land-use and land-cover changes
  • Urban dynamics

Submission instructions
Abstracts of 2 pages in length should be submitted to the Guest Editors by July 14, 2007. Authors of the abstracts considered most relevant will continue to the review process and will be required to upload the full manuscript to the Ecological Complexity website by November 1, 2007.

Guest Editors
Tal Svoray
Ben-Gurion University of the Negev,
tsvoray@bgu.ac.il

Itzhak Benenson
Tel Aviv University,
bennya@post.tau.ac.il

Alternative Model Assessment Criteria

Given the discussion in the previous posts regarding the nature of socio-ecological systems, equifinality and relativism in environmental modelling, how should we go about assessing the worth and performance of our simulation models of human-environment systems?

Simulation models are tangible manifestations of a modeller’s ‘mental model’ of the structure of the system being examined. Socio-Ecological Simulation Models (SESMs) may be thought of as logical and factual arguments made by a modeller, based on their mental model. If the model assumptions hold, these arguments should provide a cogent and persuasive indication of how system states may change under different scenarios of environmental, economic and social conditions. However, the resulting simulation model, based upon a logical and factually coherent mental model, is unlikely to be validated on these two criteria (logic and fact) alone.

First, the problems of equifinality suggest that there are multiple logical model structures that could be implemented for any particular system. Second, accurate mimetic reproduction of an empirical system state by a model may be the most persuasive form of factual proof of a model in many eyes, but the dangers of affirming the consequent make it impossible to prove that temporal predictions from models of open systems are truly accurate. Simulation models may be based on facts about empirical systems, but their results cannot be taken as facts about the modelled empirical system.

Thus, criteria other than the logical and the factual will be useful for evaluating or validating a SESM. Third and fourth criteria, at least for environmental simulation models that consider the interaction of social and ecological systems, become available by specifically considering the user(s) of a model and its output. These criteria are closely linked.

My third proposed criterion is the establishment of user trust in the model. Trust is used here in the sense of ‘confidence in the model’. If a person using a model or its results does not trust the model it will likely not be deemed fit for its intended purpose. If confidence is lacking in the model or its results, confidence will consequently be lacking in any knowledge derived, decision made, or policy recommended based upon the model. Thus, the use of trust as a criterion for validation is a form of ‘social validation’, ensuring that user(s) agree the model is a legitimate representation of the system.

The fourth criterion by which a model might achieve legitimacy and receive a favourable evaluation (i.e. be validated) is the provision of some form of utility to the user. This utility will be termed ‘practical adequacy’. If a model is not trusted then it will not be practically adequate for its purpose. However, regardless of trust, if the model is not able to address the problems or questions set by the user then the model is equally practically inadequate.

The addition of these two criteria, centred on the model user rather than the model itself, suggests a shift away from falsification and deduction as model validation techniques, toward more reflexive approaches. The shift in emphasis is away from establishing the truth and mimetic accuracy of a model and toward ensuring trust and practical adequacy. By considering trust and practical adequacy, validation becomes an exercise in model evaluation and reclaims its more appropriate meaning of ‘establishing a model’s legitimacy’.

From his observation of experimental physicists and work on the ‘experimenter’s regress’, Collins has arrived at the view that there is no distinction between epistemological criteria and social forces to resolve a scientific dispute. The position outlined previously seems to imply a similar situation for models of open, middle-numbered systems, where modellers are required to resort to social criteria to justify their models due to the inability to do so convincingly on epistemological grounds. This is not necessarily an idea that many natural scientists will sit comfortably with. However, the shift away from truth and mimetic accuracy should not necessarily be something modellers would object to.

First, all modellers know that their models are not true, exact replications of reality. A model is an approximation of reality – there is no need to create a model system if experimentation on the existing empirical system is possible. Furthermore, accepting that the results of a model are not ‘true’ (i.e. in the sense that they are perfect predictions of the future) in no way requires that the model be built on incorrect logic or facts. As Hesse notes in criticism of Collins, whilst the resolution of scientific disputes might result from a social decision that is not forced by the facts, “it does not follow that social decision has nothing to do with objective fact”.

Second, regardless of truth and mimetic accuracy, modellers have several options to build trust and ensure practical adequacy scientifically. Ensuring models are logically coherent and not factually invalid (i.e. criteria one and two) will already have gone some way toward making a scientific case. Furthermore, the traditions of scientific methodological and theoretical simplicity and elegance can be observed, and the important unifying potential across theories and between disciplines that modelling offers can be emphasised. Thus, regardless of the failures of epistemological methods for justifying them, socio-ecological and other environmental simulation models must be built upon solid logical and factual foundations;

“The postmodern world may be a nightmare for … normal science (Kuhn 1962), but science still deserves to be privileged, because it is still the best game in town. … [Scientists] need to continue to be meticulous and quantitative. But more than this, we need scientific models that can inform policy and action at the larger scales that matter. Simple questions with one right answer cannot deliver on that front. The myth of science approaching singular truth is no longer tenable, if science is to be useful in the coming age.”
(Allen et al. p.484)

Post-normal science highlights the importance of finding alternative ways for science to engage with both the problems faced in the contemporary world and the people living in that world. As they have been defined here, SESMs will inherently address questions that will be of concern to more than just scientists, including problems of the ‘risk society’. From a modelling perspective, a post-normal science approach highlights the need to build trust in the eyes of non-scientists such that understanding is fostered.

Further, it emphasises the need for SESMs to be practically adequate such that good decisions can be made promptly. It also implies that the manner in which a ‘normal’ scientist will go about assessing the trustworthiness or practical adequacy of a model (such as via the methods described above) will differ markedly from that of a non-scientist. For example, the scientific user of a model will often, though not always, also be the person who developed and constructed it. In such a case the model will have been constructed to be practically adequate for addressing their particular scientific problems and questions.

When the model is to be used by other parties the issue of ensuring practical adequacy will not be so straightforward, particularly when the user is a non-scientist. In such situations, the modeller needs to ask the question ‘practically adequate for what?’ The inhabitants of the study areas investigated will have a vested interest in the processes being examined and will themselves have questions that could be addressed by the model. In all probability many of these questions will be ones that the modeller themselves has not considered or, if they have, may not have considered relevant. Further, the questions asked by local stakeholders may be non-scientific – or at least may be questions that environmental scientists are not used to attempting to answer.

Technical approaches (such as spatial error matrices derived from pixel-by-pixel model assessment), and improvements to them, will remain useful and necessary in the future. Here, however, I have emphasised how alternative methods of model validation (assessment) might make use of the additional information and knowledge available from those actors driving change in a socio-ecological system. In other words, there is information within the system of study that is not utilised for model assessment when simply comparing observed and predicted system states. This information is present in the form of local stakeholders’ knowledge and experience.
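
For concreteness, a spatial error (confusion) matrix of the kind mentioned above can be computed in a few lines. The sketch below uses random stand-ins for observed and predicted land-cover rasters; the class labels and the 20% perturbation rate are invented for illustration.

```python
# Pixel-by-pixel assessment of a categorical prediction: build a spatial
# error (confusion) matrix, overall agreement, and Cohen's kappa.
import numpy as np

rng = np.random.default_rng(0)
N_CLASSES = 3                           # e.g. forest / agriculture / urban
observed = rng.integers(0, N_CLASSES, size=(50, 50))
predicted = observed.copy()
flip = rng.random(observed.shape) < 0.2          # randomly perturb ~20% of pixels
predicted[flip] = rng.integers(0, N_CLASSES, size=flip.sum())

# error_matrix[i, j] = number of pixels observed as class i, predicted as j
error_matrix = np.zeros((N_CLASSES, N_CLASSES), dtype=int)
np.add.at(error_matrix, (observed.ravel(), predicted.ravel()), 1)

agreement = np.trace(error_matrix) / error_matrix.sum()

# Cohen's kappa corrects the agreement for chance
row = error_matrix.sum(1) / error_matrix.sum()
col = error_matrix.sum(0) / error_matrix.sum()
expected = (row * col).sum()
kappa = (agreement - expected) / (1 - expected)

print(error_matrix)
print(f"overall agreement: {agreement:.3f}   kappa: {kappa:.3f}")
```

Note that such a matrix measures only the correspondence of observed and predicted states; it says nothing about the stakeholder knowledge discussed above, which is exactly the point.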

Relativism in Environmental Simulation Modelling

Partly as a result of the epistemological problems described in my previous few posts, Keith Beven has put forward a modelling philosophy that accepts uncertainty and a more relativist perspective. This approach demands greater emphasis on pluralism, the use of multiple hypotheses, and probabilistic approaches when formulating and parameterising models. When pressed to comment further on his meaning of relativism, Beven highlights the problems of rigidly objective measures of model performance and of ‘observer dependence’ throughout the modelling process;

“Claims of objectivity will often prove to be an illusion under detailed analysis and for general applications of environmental models to real problems and places. Environmental modelling is, therefore, necessarily relativist.”

Beven suggests the sources of relativistic operator dependencies include;

  1. Operator dependence in setting up one or more conceptual model(s) of the system, including subjective choices about system structure and how it is closed for modelling purposes; the processes and boundary conditions it is necessary to include and the ways in which they are represented.
  2. Operator dependence in the choice of feasible values or prior distributions (where possible) for ‘free’ parameters in the process representations, noting that these should be ‘effective’ values that allow for any implicit scale, nonlinearity and heterogeneity effects.
  3. Operator dependence in the characterization of the input data used to drive the model predictions and the uncertainties of the input data in relation to the available measurements and associated scale and heterogeneity effects.
  4. Operator dependence in deciding how a model should be evaluated, including how predicted variables relate to measurements, characterization of measurement error, the choice of one or more performance measures, and the choice of an evaluation period.
  5. Operator dependence in the choice of scenarios for predictions into the future.
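
Several of these dependencies are made explicit in Beven’s own GLUE (Generalised Likelihood Uncertainty Estimation) methodology, where the parameter priors (item 2) and the performance measure and behavioural threshold (item 4) are acknowledged operator choices. Below is a minimal sketch in that spirit; the toy decay model, the uniform priors, and the 0.7 threshold are my own illustrative assumptions, not Beven’s specification.

```python
# GLUE-flavoured uncertainty analysis: sample parameters from explicit
# priors, score each run with an explicit performance measure and
# behavioural threshold, and keep a weighted ensemble of models rather
# than a single "best" one.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 50)

def model(k, a, t):
    """Toy first-order decay 'process representation'."""
    return a * np.exp(-k * t)

# Synthetic observations from a parameter set unknown to the analysis
obs = model(0.35, 2.0, t) + rng.normal(0, 0.05, t.size)

# 1. Sample freely from the chosen priors (uniform here, by choice)
k_s = rng.uniform(0.05, 1.0, 5000)
a_s = rng.uniform(0.5, 4.0, 5000)

# 2. Score every parameter set with a chosen measure (Nash-Sutcliffe here)
sims = model(k_s[:, None], a_s[:, None], t)          # 5000 runs x 50 times
nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()

# 3. Retain 'behavioural' runs above a chosen threshold; weight by score
behavioural = nse > 0.7
w = nse[behavioural] / nse[behavioural].sum()
print(f"{behavioural.sum()} behavioural parameter sets")
print("weighted k estimate:", (w * k_s[behavioural]).sum())
```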

The operator dependencies have been highlighted in the past, but have re-emerged in the thoughts of geographers (Demeritt, Brown, O’Sullivan, Lane et al.), environmental scientists (Oxley and Lemon), social scientists (Agar) and philosophers of science (Collins, Winsberg).

Notably, although with reference to experimental physics rather than environmental simulation modelling, Collins identified the problem of the ‘experimenter’s regress’. The regress runs as follows: a successful experiment is one in which the apparatus functions properly, but in novel experiments the proper functioning of the apparatus can only be confirmed by the success or failure of the experiment itself. So in situations at the boundaries of established knowledge and theory, not only are hypotheses contested, but so too are the standards and methods by which those hypotheses are confirmed or refuted. As a result, Collins suggests experimentation becomes a ‘skilful practice’ and that experimenters accept results based not on epistemological or methodological grounds, but on a variety of social (e.g. group consensus) and expert (e.g. perceived utility) factors.

This stance is echoed in many respects by Winsberg’s ‘epistemology of simulation’, which suggests simulation is a ‘motley’ practice and has numerous ingredients of which theoretical knowledge is only one. The approximations, idealisations and transformations used by simulation models to confront analytically intractable problems (often in the face of sparse data), need to be justified internally (within the model construction process) on the basis of existing theory, available data, empirical generalisations, and the modeller’s experience of the system and other attempts made to model it.

Similarly, Brown suggests that in the natural sciences uncertainty is rarely viewed as arising from the interaction of social and physical worlds (though Beven’s environmental modelling philosophy, outlined above, does view it this way), and that modellers of physical environmental processes might learn from the social sciences, where the process of gaining knowledge is understood to be important for assessing uncertainty.

However, whilst an extreme rationalist perspective prevents validation and useful analysis of the utility of a model, its output, and the resulting knowledge (because of problems like affirming the consequent), so too does an extreme relativist stance in which model and model builder are understood to be inseparable. Rather, as Kleindorfer et al. suggest, modellers need to develop the means to increase the credibility of the model such that “meaningful dialogue on a model’s warrantability” can be conducted. How and why this might be achieved will be discussed in future posts.

Daniel Botkin’s Renegade Blog

Daniel Botkin, eminent ecologist and author of Discordant Harmonies, has recently started a blog called Reflections of a renegade naturalist. Two recent posts caught my eye.

The days of Smokey Bear, an enduring American icon of wildland management and its efforts to communicate with the public, are apparently numbered. Whilst his message about taking precautions against starting wildfires remains necessary, the underlying ethos of forest (and environmental) management has changed. Once, ecologists’ theoretical foundation was the ‘balance of nature’ and the presence of equilibrium and stability within ecosystems. But over the past three decades this perception has shifted dramatically, and now ‘change is natural’ would be a more apt motto. Ecosystems are dynamic. Disturbance, such as wildfire, is now seen as an inherent and necessary component of many landscapes to ensure ecosystem health. This shift in thinking is evident on the Smokey website, with sections discussing the use of prescribed fire, fire’s role in ecosystem function, and the potential pitfalls of excluding fire entirely. George Perry has written an excellent review of these shifts in ecological understanding.


So what about Smokey Bear? His message about taking precautions in wilderness areas still remains, of course. But with this new ecological ethos in mind, Botkin was asked for suggestions for a new management mascot. He came up with Morph the Moose. I haven’t seen anything about Morph previously, and a quick Google search currently only throws up 7 hits, so we’ll have to watch out for Morph wandering around with his new message soon.

The second post that caught my eye relates to the evaluation of the forest growth model JABOWA that Botkin developed. JABOWA is an individual-based model that considers the establishment, growth and senescence of individual trees. In 1991 JABOWA was used to forecast how potential global warming would influence the Kirtland’s warbler, an endangered species that nests only in Michigan. Botkin and his colleagues forecast that by 2015 the Jack pine habitat of the warbler would decline significantly, with detrimental consequences for the warbler. On his blog he suggests that matching this prediction with contemporary observations will be an ideal test to validate the predictions of the JABOWA model. Given my previous discussion about ‘affirming the consequent’ (i.e. deeming a model a true representation of reality if its predictions match observed reality, and false if it does not) it’s good to see Botkin does not suggest a valid prediction indicates the validity of the model itself. We’re advised to stay tuned for the results. Given the subject matter and quality of the articles on the new renegade blog I certainly will.
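
For readers unfamiliar with gap models, the basic structure of an individual-based forest model can be caricatured in a few lines. The sketch below is emphatically not JABOWA’s actual equations – just an illustrative establish-grow-die loop of the kind described above, with all parameter values invented.

```python
# A caricature of an individual-based forest "gap" model: each year new
# trees establish, every tree grows towards a maximum diameter, and each
# tree faces a small chance of death. Not JABOWA's equations.
import random

random.seed(7)
MAX_DBH = 100.0      # illustrative maximum diameter at breast height (cm)
GROWTH = 0.1         # illustrative intrinsic growth rate
P_DEATH = 0.02       # illustrative annual mortality probability
P_ESTABLISH = 0.5    # illustrative chance a sapling establishes each year

trees = []           # each tree is represented by its dbh alone
for year in range(200):
    if random.random() < P_ESTABLISH:           # establishment
        trees.append(1.0)                       # a new 1 cm sapling
    # growth: logistic-style slowdown as trees approach MAX_DBH
    trees = [dbh + GROWTH * dbh * (1 - dbh / MAX_DBH) for dbh in trees]
    # senescence/mortality: each individual dies with probability P_DEATH
    trees = [dbh for dbh in trees if random.random() > P_DEATH]

if trees:
    print(f"{len(trees)} trees after 200 years; "
          f"mean dbh {sum(trees) / len(trees):.1f} cm")
```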

Affirming the Consequent

A third epistemological problem for knowing whether a given (simulation) model structure is appropriate, after Equifinality and Interactive Kinds, concerns the comparison of model results with real-world empirical data. Comparison of models’ predictions with empirical events has frequently been used in an attempt to show that the model structure is an accurate representation of the system being modelled (i.e. to demonstrate it is ‘true’). Such an idea arises from the hypothetico-deductive scientific method of isolating a system and then devising experiments to logically prove a hypothesis via deduction. As I’ve discussed, such an approach may be useful in closed laboratory-type situations and systems, but less so in open systems.

The issue here is that predictions about real-world environmental systems are temporal predictions about events occurring at explicit points in time or geographical space, not logical predictions that are independent of space and time and that allow the generation of science’s ‘universal laws’. These temporal predictions have often been treated with the same respect given to the logical predictions of the hypothetico-deductive method. However, as Naomi Oreskes points out, it is unclear whether the comparison of a temporal prediction produced by a simulation model with empirical events is a test of the input data, the model structure, or the established facts upon which the structure is based. Furthermore, if the model is refuted (i.e. its temporal predictions are found to be incorrect), the complexity of many environmental simulation models would make it hard to pinpoint which part of the model was at fault.

In the case of spatial models, the achievement of partially spatially accurate prediction does little to establish where or why the model went wrong. And even if the model is able to predict observed events, this is still no guarantee that the model will be able to predict into the future, given that the stationarity assumption cannot be guaranteed to hold. This assumption is that the processes being modelled are constant through time and space within the scope of the model. Regardless, Oreskes et al. (1994) have argued that temporal prediction is not possible by numerical simulation models of open, middle-numbered systems because of theoretical, empirical, and parametric uncertainties within the model structure. As a consequence, Oreskes et al. (1994) warn that numerical simulation modellers must beware of committing the fallacy of ‘affirming the consequent’ by deeming a model invalid (i.e. false) if it does not reproduce the observed real-world data, or valid (i.e. true) if it does.

let’s go nuts!


Let’s go Lansing Lugnuts that is. Last night I went to my first Minor League Baseball game. I’ve been to a couple of Major League games before, but on a nice summer’s evening it was about time to find out more about what goes on in the lower echelons of a game that has always intrigued me. When I was about 8 my uncle brought me back a Red Sox baseball and pennant from a business trip. Maybe that got it started. One of my favourite writers, Stephen Jay Gould, was a huge baseball fan and used the apparent extinction of the .400 batting average as an adroit metaphor in one of his books to discount the idea of evolutionary progress with humans at the pinnacle. And of course there are the parallels with cricket.

The lower levels of professional sport rarely get heard above the din and clamour for the biggest and best teams. The FA Premiership is now the richest football league in the world and is followed avidly by fans around the globe. Its transition from a league with a reputation for violence and hooliganism to one of the most marketable sporting brands in the world has come via a change in attitude and facilities. I have a vivid memory from one of my first trips to a Bristol City game in the late 1980s (again, I must have been about 8 – I hasten to add City are not, unfortunately, in the Premiership). I needed to use a bathroom so Dad took me to the ‘Gents’, where I was confronted simply by a 10 foot wall painted black with a gutter of urine running along the bottom. The smell was ‘colourful’, as was the language around me. It was intense to say the least. How this experience has affected my later personal development I can only guess – Mum certainly didn’t approve of me going along. But the violent and abusive behaviour that once embodied watching the game is no longer tolerated and the terraces have been replaced by more manageable and comfortable rows of covered seating (and more hygienic toilets).

Apparently a similar change has occurred in the minor leagues of baseball. In the game programme was a piece about the rise in popularity of Minor League games. Attendances in every season since 2000 have placed in the top 10 of all seasons since the leagues began, and in 2006 the current record of 41.7 million fans was set. That’s more than the NBA, and more than the NFL and NHL combined, each year. Fifth Third Field in Dayton, Ohio has sold out every game since it opened in 2000. This continuing growth has come since the 1990s, via a change in attitude toward the game similar to the one that has transformed football in the UK. The programme article even described a lady confronted by a toilet experience much like my childhood one – it’s certainly not like that now. The emphasis has shifted toward entertainment, and whilst the minor league game hasn’t changed, the crowds have. In family-friendly America this means kids. And lots of ’em.

So whilst the high-pitched screaming wasn’t so good for my ears, the $9 seat in the third row along the first base line was good for my wallet and got me close to those 90 mph pitches. I have got to say though, even to my uneducated eye, the quality of play wasn’t quite up there with, say, the SF Giants. The Lugnuts gave up 4 runs in the first inning and it wasn’t looking good. But then South Bend gave up 5 in the second and from there on we cruised to victory (8-5). Highlights from ‘the game’ for me included a Lugnuts batter snapping his bat over his knee (golfer style) after he struck out with the bases loaded, and the genius sack race ‘run’ by some ‘hefty’ women from the crowd between the 8th and 9th innings. I was less impressed that they wouldn’t refill my plastic beer glass when I bought a second, and that I HAD to have a new one. Grrr…


Regardless of the quality of play it was a good night. And seemingly the growth of Minor League Baseball is good for the cities in which the teams are located. Oldsmobile Park is leading the much-needed regeneration of the waterfront area of downtown Lansing. After the game, the fireworks reflected in the windows of the old Ottawa Power Station (above) that has lain empty for over a decade. Regeneration is needed in Michigan as much as anywhere in the States, where the decline of the American auto industry has hit hard. With manufacturing in sharp decline, the state and the city need to turn to alternative industries for income and regeneration. The dollars spent in the stadium are now helping to boost the local economy, and give this part of town something to build around for the future. So, let’s go nuts!

Interactive vs. Indifferent Kinds

Models that consider human activity are particularly difficult to ‘close’ because of their consideration of ‘interactive’ kinds. Ian Hacking highlights the distinction between the classification of ‘interactive’ and ‘indifferent’ kinds. Different kinds of people are ‘interactive kinds’ because people are aware of, and can respond to, how they are being classified. Hacking contrasts the interactive kinds that are often studied in the social sciences with the indifferent kinds of the natural sciences. Indifferent kinds – such as trees, rocks, or fish – are not aware that they are being classified by an observer. This indifference to classification means their behaviour does not change because of it [but see my point at the end of this post].

The representation of interactive kinds potentially results in a ‘looping effect’ that has implications for model closure and validation – socio-ecological simulation models have the potential to feed back into, and therefore transform, the systems they represent via the conscious awareness of local stakeholders using the model or its results (or participating in the modelling process). If this transformation occurs it is likely that the model will be a less accurate representation of the empirical system than previously. Such a situation implies that a simulation model of a socio-ecological system may never truly represent that system (if it is used by those it represents). Therefore, in the case where a model is to be used by those being represented (for decision-making, for example), I’d suggest that an iterative modelling process would be most appropriate to ensure continued utility.
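
The looping effect, and the case for iterative re-calibration, can be illustrated with a toy simulation: a model is fitted to some behaviour, its published forecast shifts that behaviour, and the fit degrades unless the model is refitted to the new observations. Everything below is an invented caricature of the feedback, not a model of any real system.

```python
# Toy illustration of Hacking's 'looping effect': the published forecast
# of an interactive kind feeds back into the behaviour being modelled.
# All numbers are invented for illustration.
import random

random.seed(3)

def run(recalibrate):
    behaviour, estimate = 10.0, 10.0     # model starts well calibrated
    errors = []
    for year in range(10):
        forecast = estimate              # the model assumes no change
        # interactive kind: actors adjust away from the forecast outcome
        behaviour -= 0.4 * forecast + random.gauss(0, 0.1)
        errors.append(abs(forecast - behaviour))
        if recalibrate:                  # the iterative modelling process
            estimate = behaviour         # refit to the new observations
    return sum(errors) / len(errors)

print("mean error, static model:   ", round(run(False), 2))
print("mean error, iterative refit:", round(run(True), 2))
```

The static model’s error grows year on year, while the iteratively refitted model tracks the moving target – which is the intuition behind the suggestion above.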

[If anyone has any thoughts on how Hacking’s kinds relate to the whole Schrödinger’s Cat problem I’m all ears – interactive or indifferent?]

notes from sri lanka


Erin (AKA travelorphan) has been offline for a while, but on her return from the field she’s made several posts to her blog detailing some of her recent work and the events in Sri Lanka.

Many people are still trying to rebuild their lives following the devastation of the 2004 tsunami, and Erin has had the opportunity to assist local evacuation and disaster management using activities such as community-led vulnerability mapping. However, much of this recovery goes on in the midst of an ongoing conflict, which is endangering those offering aid and diverting resources away from civilian and toward military uses.

Check out some of her notes and pictures. Stirring stuff.

Initial Michigan UP Ecological Economic Modelling Webpage


We now have a very basic webpage online that (very) briefly outlines the Michigan UP Ecological-Economic Modeling project. This is just so that we have an online presence for now – in time we will develop this into a much more comprehensive document detailing the model, its construction and use. Hopefully, at some point in the future we’ll also mount a version of the model online. I’ll keep you posted on the online development of the project.