Agent-Based Modelling for Interdisciplinary Geographical Enquiry

Bruce Rhoads argued that:

“The time has come for geography to fulfil its potential by adopting a position of intellectual leadership in the realm of interconnections between human and biophysical systems.”

Many areas of scientific endeavour are currently attempting to do the same, and interdisciplinarity has become a big buzzword. Modelling has become a common tool for this interdisciplinary study (for example, ecological-economic models), with several different approaches available. Increases in computing power and the arrival of object-oriented programming have led to the rise of agent-based modelling (also termed individual-based or discrete-element modelling).

In their latest paper in Geoforum, Bithell et al. propose this form of modelling, with its “rich diversity of approaches”, as an opportune way to explore the interactions of social and environmental processes in Geography. The authors illustrate the potential of this form of modelling by providing outlines of individual-based models from hydrology, geomorphology, ecology and land-use change (the latter of which I have tried to turn my hand to). The advantages of agent-based modelling, the authors suggest, include the ability to represent

  1. agents as embedded within their environment,
  2. agents as able to perceive both their internal state and the state of their environment,
  3. agents that may interact with one another in a non-homogeneous manner,
  4. agents that can take action to change both their relationships with other agents and their environment, and
  5. agents that can retain a ‘memory’ of a history of past events.
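For concreteness, these five properties can be sketched in a few lines of object-oriented code – here plain Python, with all class and attribute names invented purely for illustration (this is not any particular model from the paper):

```python
class Environment:
    """Toy environment: a row of cells holding a resource level."""

    def __init__(self, size):
        self.cells = [1.0] * size

    def conditions_at(self, i):
        return self.cells[i]

    def deplete(self, i, amount):
        self.cells[i] = max(0.0, self.cells[i] - amount)


class Agent:
    """Toy agent illustrating the five properties listed above."""

    def __init__(self, env, location):
        self.env = env                     # 1. embedded within an environment
        self.location = location
        self.state = {"resources": 1.0}
        self.memory = []                   # 5. retains a history of past events

    def perceive(self):
        # 2. perceives both internal state and the local environment
        return self.state["resources"], self.env.conditions_at(self.location)

    def interact(self, other):
        # 3. non-homogeneous interaction: outcome depends on both agents' states
        transfer = 0.1 * min(self.state["resources"], other.state["resources"])
        self.state["resources"] -= transfer
        other.state["resources"] += transfer
        self.memory.append(("gave", transfer))

    def act(self):
        # 4. acts to change the environment, and hence what other agents perceive
        _, local = self.perceive()
        if local > 0.5:
            self.env.deplete(self.location, 0.1)
            self.state["resources"] += 0.1
            self.memory.append(("harvested", 0.1))
```

A model run is then just a loop over a population of such agents, calling perceive/act/interact each time step.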

However, the development of these representations can be a challenging task, as I found during my PhD modelling exploits, and requires a ‘diversity of resources’. When representing human agents these resources include past population censuses, surveys and interviews of contemporary populations, and theoretical understanding of social, cultural and economic behaviour from the literature. In my modelling of a contemporary population I used interviews and theoretical understanding from the literature and found that, whilst more resource intensive, actually going to speak with those being represented in the model was far more useful (and actually revealed the deficiencies of accepted theories).

In their discussion, Bithell et al. consider the problems of representing social structures within an individual-based model, suggesting that:

“simulation of social structure may be a case of equipping model agents with the right set of tools to allow perception of, and interaction with, dynamic structures both social and environmental at scales much larger than individual agents”.

Thus, the suggestion is that individual-based models of this type may need some form of hierarchical representation.

Importantly, I think, the authors also briefly highlight the reflexive nature of agent-based models of human populations. This reflexivity occurs if the model is embedded within the society it represents, thus potentially modifying the structure of the system it represents. This situation has parallels with Hacking’s ‘looping effect’, which I’ll write about more another time. Bithell et al. suggest that this reflexive nature may, in the end, limit the questions that such models can hope to meaningfully address. However, this does not prevent them from concluding:

“The complex intertwined networks of physical, ecological and social systems that govern human attachment to, and exploitation of, particular places (including, perhaps, the Earth itself) may seem an intractable problem to study, but these methods have the potential to throw some light on the obscurity; and, indeed, to permit geographers to renew their exploration of space–time geographies.”

Memories of a British Coastal Landscape


Before my impending departure to the States I’ve been out and about visiting a few places that I won’t see for a while. This week, I took my Grandmother back to the town where she grew up on the English south coast – Lyme Regis in Dorset. I’d never been and she hadn’t been back for a while so it was a trip down both new and old memory lanes.


And what steep lanes. Apparently they used to drag cargo up Cobb Road from ships docked in ‘the Cobb’. They realised it was rather hard work up these steep slopes and stopped a fair while ago. But there were other war-time stories about the inclines: run-away trucks with failed brakes, careening down narrow lanes toward the sea-front, their landings cushioned not by a sandy beach but by the solid walls of the old coal merchants (it seems it’s still happening these days too). Line upon line of American soldiers snaking up and down Broad Street outside the old Regent Cinema (then the new thing in town). Apparently it remains quintessentially British today – tea and biscuits from china cups and saucers before taking your seats (aside from the fact it shows the latest Hollywood blockbusters, of course).


The vertiginous topography has not only caused rapid runaway of trucks, but also the rapid (and creeping) runaway of the soil. Efforts to manage and reduce land slippage are being undertaken in parallel with a £17 million coastal defence and harbour improvement scheme. Whilst understanding that it is necessary if they want to save their sea-front industry (which has changed from sea-trading and fishing to sea-swimming and tourism), locals aren’t happy about the large new shingle banks that provide the needed protection. Sand has accumulated in the harbour over recent years and has now been joined by a nice sandy beach imported from France.


Alongside visiting the sea-side we had tea and cake at some old friends’ house – all in all a good day stocking up on memories of the British coastal landscape before I jet off across the pond.

Useless Arithmetic?

Can we predict the future? Orrin Pilkey and Linda Pilkey-Jarvis say we can’t. They blame the complexity of the real world alongside a political preference to rely on the predictive results of models. I’m largely in agreement with them on many of their points but their popular science book doesn’t do an adequate job of explaining why.

The book is introduced with an example of the failure of mathematical models to predict the collapse of the Grand Banks cod fisheries. The second chapter tries to lay the basis of their argument, providing an outline of the underlying philosophy and approaches of environmental modelling. This is then followed by a series of case studies of the difficulties of using models and modelling in the real world: the Yucca Mountain nuclear waste depository, climate change and sea-level rise, beach erosion, open-cast pit mining, and invasive plant species. Their conclusion is entitled ‘A Promise Unfulfilled’ – those promises having been made by engineers attempting to apply methods developed in simple, closed systems to complex, open systems.

Unfortunately the authors don’t describe this conclusion in such terms. The main problems here are the authors’ rather vague distinction between quantitative and qualitative models and their inadequate examination of ‘complexity’. In the authors’ own words:

“The distinction between quantitative and qualitative models is a critical one. The principle message in this volume is that quantitative models predicting the outcome of natural processes on the surface of the earth don’t work. On the other hand, qualitative models, when applied correctly, can be valuable tools for understanding these processes.” p.24

This sounds fine, but it’s hard to discern, from their descriptions, exactly what the difference between quantitative and qualitative models is. In their words again,

Quantitative Models:

  • “are predictive models that answer the questions ‘where’, ‘when’, ‘how much’” p.24
  • “if the answer [a model provides] is a single number the model is quantitative” p.25

Qualitative Models:

  • “predict directions and magnitudes” p.24
  • do not provide a single number but consider relative measures, e.g. “the temperature will continue to increase over the next century” p.24

So they both predict, just one produces absolute values and the other relative values. Essentially what the authors are saying is that both types of model predict and both produce some form of quantitative output – just one tries to be more precise than the other. That’s a pretty subtle difference.

Further on they try to clarify the definition of a qualitative model by appealing to concepts:

“a conceptual model is a qualitative one in which the description or prediction can be expressed as written or spoken word or by technical drawings or even cartoons. The model provides an explanation for how something works – the rules behind some process” p.27.

But all environmental models considering process (i.e. that are not empirical/statistical) are conceptual, regardless of whether they produce absolute or relative answers! Whether the model is Arrhenius’ back-of-the-envelope model of how the greenhouse effect works, or a General Circulation Model (GCM) running on a Cray supercomputer and considering multiple variables, both are built on conceptual foundations. We could write down the structure of the GCM; it would just take a long time. So again, their distinction between quantitative and qualitative models doesn’t really make things much clearer.
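Indeed, the back-of-the-envelope end of this spectrum fits in a few lines. The sketch below uses the commonly quoted logarithmic approximation for CO2 radiative forcing (ΔF ≈ 5.35 ln(C/C0) W/m²) with an assumed, purely illustrative climate sensitivity parameter – a conceptual model written as code, not the actual calculation from any particular study:

```python
import math

def warming_from_co2(c_new, c_old=280.0, sensitivity=0.8):
    """Back-of-the-envelope equilibrium warming (K) from a change in CO2.

    Uses the commonly quoted logarithmic forcing approximation
    dF = 5.35 * ln(C/C0) W/m^2, multiplied by an assumed climate
    sensitivity parameter (K per W/m^2). Both numbers are rough,
    illustrative values, not the output of any particular model.
    """
    forcing = 5.35 * math.log(c_new / c_old)   # radiative forcing, W/m^2
    return sensitivity * forcing
```

A doubling of CO2 gives 5.35 ln 2 ≈ 3.7 W/m² of forcing, or roughly 3 K of warming with this sensitivity value – conceptually the same greenhouse model a GCM elaborates with vastly more variables.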

With this sandy foundation the authors suggest that the problem is that the real world is just too complex for quantitative models to be able to predict anything. So what is this ‘complexity’? According to Pilkey and Pilkey-Jarvis:

“Interactions among the numerous components of a complex system occur in unpredictable and unexpected sequences.” p.32

So, models can’t predict complex systems because they’re unpredictable. Hmm… a tautology, no? The next sentence:

“In a complex natural process, the various parameters that run it may kick in at various times, intensities, and directions, or they may operate for various time spans”.

Okay, now we’re getting somewhere – a complex system is one that has many components and in which the system processes might change in time. But that’s it, that’s our lot. That’s what complexity is. That’s why environmental scientists can’t predict the future using quantitative models – because there are too many components or parameters, which may change at any time, to keep track of such that we could calculate an absolute numerical result. A relative result maybe, but not an absolute value. I don’t think this analysis quite lives up to its billing as a sub-title. Sure, the case studies are good, informative and interesting, but I think this conceptual foundation is pretty loose.

I think the authors would have been better off making more use of Naomi Oreskes’ work (which they themselves cite) by talking about the difference between logical and temporal prediction, and the associated difference between ‘open’ and ‘closed’ systems. Briefly, closed systems are those in which the intrinsic and extrinsic conditions remain constant – the structure of the system, the processes operating within it, and the context within which the system sits do not change. Thus the system – and predictions about it – are outside history and geography. Think gas particles bouncing around in a sealed box. If we know the volume of the box and the pressure of the gas, assuming nothing else changes we can predict the temperature.

Contrast this with an ‘open’ system in which the intrinsic and extrinsic conditions are open to change. Here, the structure of the system and the processes operating the system might change as a result of the influence of processes or events outside the system of study. In turn, where the system is situated in time and space becomes important (i.e. these are geohistorical systems), and prediction becomes temporal in nature. All environmental systems are open. Think the global atmosphere. What do we need to know in order to predict the temperature in the future in this particular atmosphere? Many processes and events influencing this particular system (the atmosphere) are clearly not constant and are open to change.
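The closed-system case really is that simple – assuming the ideal gas law holds, the prediction is a one-line calculation:

```python
def gas_temperature(pressure_pa, volume_m3, moles):
    """Temperature (K) of an ideal gas in a sealed box.

    In this 'closed' system the ideal gas law PV = nRT fixes the
    answer: nothing outside the box can intervene, so the prediction
    is logical rather than temporal.
    """
    R = 8.314  # gas constant, J/(mol K)
    return pressure_pa * volume_m3 / (moles * R)
```

No such function can be written for the open atmosphere: its ‘parameters’ are themselves open to change from outside the system of study.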

As such, I am in general agreement with Pilkey and Pilkey-Jarvis’ message, but I don’t think they do the sub-title of their book justice. They show plenty of cases where quantitative predictive models of environmental and earth systems haven’t worked, and highlight many of the political reasons why this approach has been taken, but they don’t quite get to the guts of why environmental models will never be able to accurately make predictions about specific places at specific times in the future. The book Prediction: Science, Decision Making, and the Future of Nature provides a much more comprehensive consideration of these issues and, if you can get your hands on it, is much better.

I guess that’s the point though, isn’t it – this is a popular science book that is widely available. So I shouldn’t moan too much, as I think it’s important that non-modellers be aware of the deficiencies of environmental models and modelling, and how they are used to make decisions about, and manage, environmental systems. These include:

  • the inherent unpredictability of ‘open’ systems (regardless of their complexity)
  • the over-emphasis of environmental models’ predictive capabilities and expectations (as a result of positivist philosophies of science that have been successful in ‘closed’ and controlled conditions)
  • the politics of modelling and management
  • the need to publish (or at least make available) model source code and conceptual structure
  • an emphasis on models to understand rather than predict environmental systems
  • any conclusions based on experimentation with the model are conclusions about the structure of the model, not the structure of nature

I’ve come to these conclusions over the last couple of years during the development of a socio-ecological model, in which I’ve been confronted by differing modelling philosophies. As such, I think the adoption of something more akin to ‘Post-Normal’ Science, and greater involvement of the local publics in the environments under study, is required for better management. Understanding the interactions of social, economic and ecological systems poses a challenge, but one to which I am sure environmental modelling can contribute. However, given the open nature of these systems, this modelling will be more useful in the ‘qualitative’ sense, as Pilkey and Pilkey-Jarvis suggest.

Orrin H. Pilkey and Linda Pilkey-Jarvis (2007)
Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future
Columbia University Press
ISBN: 978-0-231-13212-1

Buy at Amazon.com

[June 3rd 2007: I just noticed Roger Pielke reviewed Useless Arithmetic for Nature the same day as this original post. Read the review here.]

Ecological and economic models for biodiversity conservation

As a follow-up to yesterday’s post, the latest volume of Ecological Economics has a paper by Drechsler et al. entitled, ‘Differences and similarities between ecological and economic models for biodiversity conservation’. They compare 60 ecological and economic models and suggest:

“Since models are a common tool of research in economics and ecology, it is often implicitly assumed that they can easily be combined. By making the differences between economic and ecological models explicit we hope to have helped to avoid miscommunication that may arise if economists and ecologists talk about “models” and believe they mean the same but in fact think of something different. The question that arises from the analysis of this paper is, of course: What are the reasons for the differences between economic and ecological models?”

The authors suggest five possible routes into the examination of this question:

  1. Different disciplinary traditions
  2. Differences in the systems analysed
  3. Differences in the perception of the system analysed
  4. Varying personal preferences of researchers
  5. Models serve different purposes

Drechsler et al. conclude:

“The general lesson from this is that economists who start thinking about developing ecological–economic models have to be prepared that they might be involved in complex modelling not typical and possibly less respected in economics. On the other hand, ecologists starting collaborations with modellers from economics have to be aware that in economics analytical tractability is much higher valued and simple models are more dominant than in ecology.”

Integrating Ecology and Economics

With my viva voce just over two weeks away I really should be concentrating all my efforts on ensuring that I’m adequately prepared for the oral defence of my PhD thesis. I’m doing OK, but I’m a little distracted by my impending move to the Center for Systems Integration and Sustainability at Michigan State University. There I’ll be working on a project that will take a systems approach to develop an integrated ecological-economic model for the management of a forest landscape in Michigan’s Upper Peninsula.

I touched on some of the difficulties of integrated ecological-economic modelling in my thesis:

The difficulties of integrating ecological and economic theory into a model or framework for study have been outlined by Svedin and Bockstael et al. These authors highlight some common points regarding time and space scales. First, the spatial boundaries on systems’ analysis may not coincide, as economists place their boundaries according to the extent of the market, whilst ecologists typically use physical features. Second, the temporal extents of study may differ vastly, as economists do not believe they can predict too far into the future, but ecologists are often more ambitious. Potentially the biggest stumbling block for integrating economic and ecological approaches, however, is the difference in the disciplines’ fundamental approach and philosophy. For example, economists disregard things that they cannot value financially, whilst ecologists believe that a theoretical framework must take account of the most important aspects of a problem (regardless of financial value – Bockstael et al.). As ecosystem processes are very difficult (if not impossible) to value in financial terms, these two standpoints are hard to reconcile. These differences in approach, and the difference in the systems of study, result in different “units of measurement, populations of interest, handling of risk and uncertainty and paradigms of analysis” when modelling (Bockstael et al. p.146). Svedin discusses the potential of using energy or information as fundamental units that might be used in common by the two disciplines. However, Bockstael et al. point out that reducing systems to the lowest possible common denominator has often simply resulted in larger black-box models, compromising individual model modules’ integrity. Svedin possibly realised this when he concluded that integration should be context-dependent for the study at hand, and that the underlying philosophies of different disciplines must be remembered when attempting integration.

One method that has been developed to address these issues is economic valuation of ecosystem services. A recent example of this sort of exercise was undertaken for the trees of New York City. Designed for use in urban areas, the USFS Stratum model uses a tree growth model coupled with data on the regional climate, building construction and energy use patterns, fuel mix for energy production, and air pollutant concentrations to estimate environmental benefits and costs as well as effects on property values. Alongside the economic value of the trees (the annual monetary value of the benefits provided and costs accrued), Stratum estimates the resource’s structure (species composition, extent and diversity), function (the environmental & aesthetic benefits trees afford the community), and resource management needs (evaluations of diversity, canopy cover, and pruning needs). According to Stratum the nearly 600,000 trees lining the streets of New York City are worth $122 million – and this doesn’t include the 4.5 million trees in parks and on private land.

As the outputs of Stratum suggest, there are both monetary and non-monetary forms of ecosystem valuation, both with pros and cons. One notable form of monetary ecosystem valuation is non-market valuation. Non-market valuation attempts to estimate the value of goods and services that do not have observable market values. In the forthcoming project at CSIS we hope to use non-market valuation as a complementary approach to more traditional market valuation analysis to better examine economic trade-offs between various ecosystem services and ensure the development of sustainable management plans. In developing the model in this way we will be exploring ways to overcome the fundamental differences between economic and ecological theory.

Reference
Svedin, U. (1985) Economic and ecological theory: differences and similarities In: Hall, D. O., Myers, N. and Margaris, N. S. Economics of ecosystems management:31-39 Dordrecht: Dr W. Junk Publishers

‘What I Want’ versus ‘What Is Best’

When ‘what is best’ doesn’t align with ‘what I want’, making the right decision is hard. We need to find ways of making these options align as closely as possible.

Jared Diamond’s point in Collapse is that the fate of contemporary society is in our own hands. I read and wrote about the introductory chapter a while ago. Eventually I did read the whole book, though as Michael Kavanagh points out:

“You could read the introduction and the last few chapters and get the point. But then you’d miss out on what Jared Diamond does best: tell stories.”

Kavanagh is right; as I’ve talked about here before, storytelling is an important way of understanding the world. William Cronon has suggested narratives of global change that offer hope are needed for us to tackle the (potential) problems that contemporary society faces. Most of Diamond’s stories about the fate of previous societies don’t offer much hope, however – most collapsed, and the only modern example of positive action on the environment is Iceland. Diamond identifies five contributing factors to societal collapse:

“… climate change, hostile neighbours, trade partners (that is, alternative sources of essential goods), environmental problems, and, finally, a society’s response to its environmental problems. The first four may or may not prove significant in each society’s demise, Diamond claims, but the fifth always does. The salient point, of course, is that a society’s response to environmental problems is completely within its control, which is not always true of the other factors. In other words, as his subtitle puts it, a society can “choose to fail.””

Diamond emphasises the need for individual action – for a bottom-up approach to make sure that we choose not to fail. Kavanagh suggests the implication is that

“in a world where public companies are legally required to maximize their profits, the burden is on citizens to make it unprofitable to ruin the environment — for an individual, a company, or a society as a whole.”

Others suggest more dramatic action is needed however. Richard Smith suggests that this ‘market meliorist strategy’ won’t be enough. Smith contrasts the bottom-up decision-making of the New Guinea villages that Diamond uses as a potential model for contemporary decision-making with that of contemporary capitalist society. Whereas the New Guinea villages’ decision-making process takes into account everyone’s input:

“…we do not live in such a ‘bottom-up’ democratic society. We live in a capitalist society in which ownership and control of the economy is largely in the hands of private corporations who do not answer to society. In this system, democracy is limited to the political sphere. …under capitalism, economic power is effectively monopolized by corporate boards whose day-to-day requirements for reproduction compel their officers to systematically make ‘wrong’ decisions, to favour the particular interests of shareholders against the general interests of society.”

Smith’s solution? As the global issues contemporary society faces are so interconnected and international, international governance by a “global citizenry” is required. Critics of this approach are likely to be many, but whether it will be enough for individual consumers to “make it unprofitable to ruin the environment”, or whether we develop a “global citizenry”, the ultimate question here seems to be: are we prepared to change our lifestyles to ensure the survival of our contemporary (global) society?

With the “End of Tradition” in western societies (i.e. life is no longer lived as fate in these societies), maybe the future of society really is in our hands, as Diamond suggests. On the other hand, as Beck points out, because contemporary problems are due to dispersed causes (e.g. individuals driving their cars to work every day), responsibility is rather easily evaded and some form of global decision-making would be useful. To me the latter seems unlikely – those with power are unlikely to give it up easily. The ‘global’ institutions we currently have are frequently undermined by the actions of individual states and leaders. The power to change society and lifestyles (in the west at least) now lies with individuals. But with power comes a responsibility which, on the whole, we individuals are currently shirking.

The changes my generation and the next will need to make will have to go further than simply throwing our glass, paper and plastic in different boxes. There are small ways in which we can save ourselves money whilst helping the environment, and they all add up. But sea changes in lifestyle are likely to be required. Governments will not make people do that, and have no right to in a democracy. They can cajole via taxation (if they do it right) but they can’t force people to change their lifestyles. People must make those changes themselves because they want to make it profitable to sustain contemporary society. The problem is that it’s very difficult to do what’s best when it doesn’t align with what you want. It can hurt. Finding ways of making the two align will become increasingly important. Often the two will not align and it will be necessary to take individual responsibility by accepting there will be a degree of pain. But once this responsibility has been accepted, the next step can be taken – working to minimise the pain whilst ensuring people get as close to what they want as possible.

Inevitably, I think modelling may have something to offer here. Just as Diamond uses evidence of historical environmental, technological and social change to discuss and tell stories about past problems we might use models to discuss and tell stories about potential problems we might face in the future. Simulation models, if appropriately constructed, offer us a tool to reconstruct and examine uncertain landscape change due to environmental, technological and social change in the future. Further, simulation models offer the opportunity to examine alternative futures, to investigate traps that might lie in wait. Just as we should learn from past histories of landscape change (as Diamond suggests), we should be able to use simulation models to construct future histories of change in our contemporary landscapes.

These alternative ‘model futures’ are unlikely to be realised exactly as the model says (that’s the nature of modelling complex open systems), and may not contain the details some people might like, but if they are useful for getting people around a table discussing the most sustainable ways of managing their consumption of natural resources then they can’t be a bad thing. Modelling offers insight into states of potential future environmental systems given different scenarios of human activity. At the very least, models will provide a common focus for debate on, and offer a muse to inspire reflection about, how to align ‘what I want’ with ‘what is best’.

EGU 2007 Poster

I’m not attending the European Geosciences Union General Assembly this year as I have done the past couple of years. However, I do have a poster there (today, thanks to Bruce Malamud for posting it) on some work I have been doing with Raul Romero-Calcerrada at Universidad Rey Juan Carlos in Madrid, Spain. We have been using various spatial statistical modelling techniques to examine the spatial patterns and causes (both socioeconomic and biophysical) of wildfire ignition probabilities in central Spain. The poster abstract is presented below and we’re working on a couple of papers related to this right now.

Spatial analysis of patterns and causes of fire ignition probabilities using Logistic Regression and Weights-of-Evidence based GIS modelling
R. Romero-Calcerrada, J.D.A. Millington
In countries where more than 95% of wildfires are caused by direct or indirect human activity, such as those of the Iberian Peninsula, ignition risk estimation must consider anthropic influences. However, the importance of human factors has been given scant regard compared to biophysical factors (topography, vegetation and meteorology) in quantitative analyses of risk. This disregard for the primary cause of wildfires in the Iberian Peninsula is owed to the difficulties of evaluating, modelling and spatially representing the human component of both fire ignition and spread. We use logistic regression and weights-of-evidence based GIS modelling to examine the relative influence of biophysical and socioeconomic variables on the spatial distribution of wildfire ignition risk for a six-year time series of 508 fires in the south west of the Autonomous Community of Madrid, Spain. We find that socioeconomic variables are more important than biophysical ones for understanding spatial wildfire ignition risk, and that models using socioeconomic data have greater accuracy than those using biophysical data alone. Our findings suggest the importance of socioeconomic variables for the explanation and prediction of the spatial distribution of wildfire ignition risk in the study area. Socioeconomic variables need to be included in models of wildfire ignition risk in the Mediterranean and will likely be very important in wildfire prevention and planning in this region.
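For readers unfamiliar with weights-of-evidence, the core calculation for a single binary evidence layer can be sketched as follows – a toy illustration with invented counts, not our actual analysis:

```python
import math

def weights_of_evidence(fires_with, fires_without, nofires_with, nofires_without):
    """Positive and negative weights for one binary evidence layer.

    Counts are map cells cross-tabulated by fire occurrence (D) and
    evidence presence (B): W+ = ln[P(B|D) / P(B|not D)], and W-
    likewise for absence of the evidence. Illustrative only - a real
    GIS analysis must also handle zero counts, multiple layers and
    tests of conditional independence.
    """
    fires = fires_with + fires_without
    nofires = nofires_with + nofires_without
    w_plus = math.log((fires_with / fires) / (nofires_with / nofires))
    w_minus = math.log((fires_without / fires) / (nofires_without / nofires))
    return w_plus, w_minus
```

The contrast C = W+ − W− then summarises how strongly the layer discriminates fire from non-fire locations.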

Problems in Modelling Nature

I haven’t posted much over the last week or so – things have been super busy trying to complete my PhD thesis. I hope to be submitting the thesis in the next few weeks so there’s not likely to be much blogging going on until that’s done (and I’ve had a little rest). So until I get back to something resembling a ‘normal’ routine I’ll leave you with this…

One of my advisors pointed out this book review in the New York Times to me. From the article it seems that in Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future, Orrin Pilkey and Linda Pilkey-Jarvis suggest environmental models aren’t up to the job that the modellers building and using them say they are:


Dr. Pilkey and his daughter Linda Pilkey-Jarvis, a geologist in the Washington State Department of Geology, have expanded this view into an overall attack on the use of computer programs to model nature. Nature is too complex, they say, and depends on too many processes that are poorly understood or little monitored — whether the process is the feedback effects of cloud cover on global warming or the movement of grains of sand on a beach.

Their book, “Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future,” originated in a seminar Dr. Pilkey organized at Duke to look into the performance of mathematical models used in coastal geology. Among other things, participants concluded that beach modelers applied too many fixed values to phenomena that actually change quite a lot. For example, “assumed average wave height,” a variable crucial for many models, assumes that all waves hit the beach in the same way, that they are all the same height and that their patterns will not change over time. But, the authors say, that’s not the way things work.

Also, modelers’ formulas may include coefficients (the authors call them “fudge factors”) to ensure that they come out right. And the modelers may not check to see whether projects performed as predicted.

Along the way, Dr. Pilkey and Ms. Pilkey-Jarvis describe and explain a host of modeling terms, including quantitative and qualitative models (models that seek to answer precise questions with more or less precise numbers, as against models that seek to discern environmental trends).

They also discuss concepts like model sensitivity — the analysis of parameters included in a model to see which ones, if changed, are most likely to change model results.

But, the authors say it is important to remember that model sensitivity assesses the parameter’s importance in the model, not necessarily in nature. If a model itself is “a poor representation of reality,” they write, “determining the sensitivity of an individual parameter in the model is a meaningless pursuit.”
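To make the idea of model sensitivity concrete, here is a toy one-at-a-time sensitivity analysis in Python. The `beach_width` model and all its coefficients are entirely made up – which is rather the authors' point: the sensitivities computed below are properties of this little model, not of any real beach:

```python
def beach_width(wave_height, sand_supply, sea_level_rise):
    """Toy stand-in for an environmental model (purely illustrative)."""
    return 100 + 5 * sand_supply - 20 * wave_height - 30 * sea_level_rise

baseline = {"wave_height": 1.0, "sand_supply": 2.0, "sea_level_rise": 0.1}

def one_at_a_time_sensitivity(model, params, perturbation=0.1):
    """Increase each parameter by 10% in turn, holding the others at their
    baseline values, and record the relative change in model output."""
    base = model(**params)
    sensitivities = {}
    for name, value in params.items():
        changed = dict(params, **{name: value * (1 + perturbation)})
        sensitivities[name] = (model(**changed) - base) / base
    return sensitivities

print(one_at_a_time_sensitivity(beach_width, baseline))
```

The output ranks the parameters by how strongly the model output responds to them – useful for deciding where measurement effort matters most *within the model*, but, as the quote above stresses, meaningless as a statement about nature if the model itself misrepresents reality.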

Given the problems with models, should we abandon them altogether? Perhaps, the authors say. Their favored alternative seems to be adaptive management, in which policymakers may start with a model of how a given ecosystem works, but make constant observations in the field, altering their policies as conditions change. But that approach has drawbacks, among them requirements for assiduous monitoring, flexible planning and a willingness to change courses in midstream. For practical and political reasons, all are hard to achieve.

Besides, they acknowledge, people seem to have such a powerful desire to defend policies with formulas (or “fig leaves,” as the authors call them), that managers keep applying them, long after their utility has been called into question.

So the authors offer some suggestions for using models better. We could, for example, pay more attention to nature, monitoring our streams, beaches, forests or fields to accumulate information on how living things and their environments interact. That kind of data is crucial for models. Modeling should be transparent. That is, any interested person should be able to see and understand how the model works — what factors it weighs heaviest, what coefficients it includes, what phenomena it leaves out, and so on. Also, modelers should say explicitly what assumptions they make.

Some of these suggestions sound sensible and similar to what I’ve been thinking about in my thesis. However, to suggest abandoning environmental modelling altogether – claiming that it is of no value whatsoever – seems a little excessive, and I’m going to reserve my judgment for now.

I’m being sent a review copy so when I get my life back I’ll take a look at it and post some more informed criticism.

Post-Normal Science (& Simulation Modelling)

Last week I didn’t quite manage to complete the JustScience week challenge to blog on a science topic, and only on science, every day that week. I managed five days but then the weekend got in the way. On those five days I wrote about the application of scientific methods to examine landscape processes – specifically wildfire regimes and land use/cover change (LUCC). Another of my ‘scientific’ interests is the relationship between science and policy- and decision-making, so what I was planning to write on Saturday might not have fitted the JustScience bill anyway. I’ll post it now instead; a brief review of some of the ways commentators have suggested science may need to adapt in the 21st century to ensure it remains relevant to ‘real world problems’.

Ulrich Beck has suggested that we now live in the ‘risk society‘. Beck’s view of the risks contemporary societies face – such as changing climates, atmospheric pollution and exposure to radioactive substances – shares common themes with the work of others examining contemporary society and its relationship with science, technology and the environment (Giddens, for example).

In the risk society, many threats are difficult to identify in everyday life, requiring complicated, expensive (usually scientific) equipment to measure and identify them. These threats, requiring methods and tools from science and technology to investigate them, have frequently been initiated by previous scientific and technological endeavours. Their consequences are no longer simply further academic and scientific problems for study, but consequences that matter socially, politically, culturally and environmentally. Furthermore, these consequences may be imminent, potentially necessitating action before the often lengthy traditional scientific method (hypothesis testing, academic peer review, etc.) has produced a consensus on the state of knowledge about them.

Beck goes on to suggest a distinction between two divergent sciences: the science of data and the science of experience. The former is older, specialised, laboratory-based science that uses the language of mathematics to explore the world. The latter will identify consequences and threats, publicly testing its objectives and standards to examine the doubts the former ignores. Traditional science, Beck suggests, is at the root of current environmental problems and will simply propagate risk further rather than reducing it.

Taking a similar perspective, Funtowicz and Ravetz have presented ‘post-normal’ science as a new type of science to replace the reductionist, analytic worldview of ‘normal’ science with a “systemic, synthetic and humanistic” approach. The term ‘post-normal’ deliberately echoes Thomas Kuhn’s formulation of ‘normal’ science functioning between paradigm shifts, to emphasise the need for a shift in scientific thinking and practices that takes it outside of the standard objective, value-free perspective. The methodology of post-normal science then, emphasises uncertainties in knowledge, quality of method, and complexities in ethics. Post-normal science, according to Funtowicz and Ravetz, embraces the uncertainties inherent in issues of risk and the environment, makes values explicit rather than presupposing them, and generates knowledge and understanding through an interactive dialogue rather than formalised deduction. You can read more about Post-Normal science itself at NUSAP.net, and the Post-Normal Times blog will keep you up-to-date on recent events and issues at the interface between science and policy-making.

Recently I’ve been thinking about the utility of environmental simulation models (particularly those that explicitly consider human activity) for examining the sorts of problems present in the ‘risk society’, and to which post-normal science has been promoted as being able to contribute. I’ll write in more detail at a later date, but briefly, many of the theoretical facets post-normal science suggests seem relevant to the issues facing environmental (and landscape) simulation models. Particularly, the epistemological problems of model validation recently discussed in the academic literature (e.g. by Naomi Oreskes et al. and Keith Beven – something I have touched on briefly in the past, but must post about in more detail soon) have highlighted the importance of considering the subjective aspects of the model construction process.

As a result I have come to think that model ‘validation’ might be better achieved by taking an evaluative, qualitative approach rather than a confirmatory one. This shift would essentially mean asking “is this model good enough?” rather than “is this model true?” Ethical questions about who should be asked, and who is qualified to judge, whether a model is to be deemed trustworthy and fit for purpose to examine real-world problems (and not those confined to a laboratory) also become important when these criteria are used. These model validation issues thus resonate with a post-normal science perspective on examining the environmental issues contemporary societies currently face.

I’ll write more on both the epistemological problems of confirmatory model validation for environmental and landscape simulation models and potential ways we might go about assessing the trustworthiness and practical adequacy of these models for addressing the problems of the ‘risk society‘ soon.


Landscape Simulation Modelling

This is my fifth contribution to JustScience week.

Over the last few days I’ve discussed some techniques and case studies of statistical modelling of landscape processes. On Monday and Tuesday I looked at the power-law frequency-area characteristics of wildfire regimes in the US; on Wednesday and Thursday I looked at regression modelling for predicting and explaining land use/cover change (LUCC). The main alternative to these empirical modelling methods is simulation modelling.

When a problem is not analytically tractable (i.e. equations cannot be written down to represent the processes), simulation models may be used to represent a system by making certain approximations and idealisations. When attempting to mimic a real-world system (for example a forest ecosystem), simulation modelling has become the method of choice for many researchers. One reason is that simulation modelling can be used even when data are sparse. Simulation modelling also overcomes many of the problems associated with the large time and space scales involved in landscape studies. Frequently, study areas are so large (upwards of 10 square kilometres – see photo below of my PhD study area) that empirical experimentation in the field is virtually impossible because of logistical, political and financial constraints. Experimenting with simulation models allows experiments and scenarios to be run and tested that would not be possible in real environments and landscapes.

Spatially-explicit simulation models of LUCC have been used since the 1970s and have dramatically increased in use recently with the growth in computing power available. These advances mean that simulation modelling is now one of the most powerful tools for environmental scientists investigating the interaction(s) between the environment, ecosystems and human activity. A spatially explicit model is one in which the behaviour of a single model unit of spatial representation (often a pixel or grid cell) cannot be predicted without reference to its relative location in the landscape and to neighbouring units. Current spatially-explicit simulation modelling techniques allow the spatial and temporal examination of the interaction of numerous variables, sensitivity analyses of specific variables, and projection of multiple different potential future landscapes. In turn, this allows managers and researchers to evaluate proposed alternative monitoring and management schemes, identify key drivers of change, and potentially improve understanding of the interaction(s) between variables and processes (both spatially and temporally).
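A minimal sketch of what ‘spatially explicit’ means in practice – the next state of each grid cell cannot be computed without reference to its neighbours. The rule here (vegetation spreading into adjacent empty cells) is invented purely for illustration:

```python
import numpy as np

def spread_step(grid):
    """One step of a toy spatially-explicit model: an empty cell (0) becomes
    vegetated (1) if at least one of its four orthogonal neighbours is
    vegetated. The key point is that a cell's next state depends on its
    relative location and neighbouring units, not on the cell alone."""
    padded = np.pad(grid, 1)
    neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:])
    return np.where((grid == 0) & (neighbours > 0), 1, grid)

grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = 1                      # single vegetated cell in the centre
print(spread_step(grid).sum())      # prints 5: centre plus its 4 neighbours
```

Contrast this with a non-spatial model, where each cell could be updated from its own attributes alone and the cells could be processed in any order, or even as one aggregate total.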

Early spatially-explicit simulation models of LUCC typically considered only ecological factors. Because of the recognition that landscapes are the historical outcome of multiple complex interactions between social and natural processes, more recent spatially-explicit LUCC modelling exercises have begun to integrate both ecological and socio-economic processes to examine these interactions.

A prime example of a landscape simulation model is LANDIS. LANDIS is a spatially explicit model of forest landscape dynamics and processes, representing vegetation at the species-cohort level. The model requires life-history attributes for each vegetation species modelled (e.g. age of sexual maturity, shade tolerance and effective seed-dispersal distance), along with various other environmental data (e.g. climatic, topographical and lithographic data) to classify ‘land types’ within the landscape. Previous uses of LANDIS have examined the interactions between vegetation dynamics and disturbance regimes, the effects of climate change on landscape disturbance regimes, and the impacts of forest management practices such as timber harvesting.
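To give a flavour of these inputs, here is a hypothetical species-attribute table sketched in Python. The field names, species and values are illustrative only – this is not the actual LANDIS input format:

```python
# Hypothetical life-history attribute table in the style of LANDIS species
# inputs (all names and numbers invented for illustration).
species_attributes = {
    "quercus_ilex": {
        "age_of_maturity": 20,            # years before seed production
        "longevity": 300,                 # maximum age in years
        "shade_tolerance": 4,             # 1 (intolerant) to 5 (tolerant)
        "effective_seed_dispersal": 30,   # metres
    },
    "pinus_halepensis": {
        "age_of_maturity": 10,
        "longevity": 150,
        "shade_tolerance": 1,
        "effective_seed_dispersal": 100,
    },
}

# Land types classified from environmental data might then map each
# (species, land type) pair to an establishment probability:
establishment = {
    ("quercus_ilex", "south_facing_shallow_soil"): 0.2,
    ("pinus_halepensis", "south_facing_shallow_soil"): 0.6,
}
```

The combination of per-species life histories and per-land-type establishment probabilities is what lets this style of model grow different successional trajectories in different parts of the landscape.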

Recently, LANDIS-II was released with a new website and a paper published in Ecological Modelling;


LANDIS-II advances forest landscape simulation modeling in many respects. Most significantly, LANDIS-II, 1) preserves the functionality of all previous LANDIS versions, 2) has flexible time steps for every process, 3) uses an advanced architecture that significantly increases collaborative potential, and 4) optionally allows for the incorporation of ecosystem processes and states (eg live biomass accumulation) at broad spatial scales.

During my PhD I’ve been developing a spatially-explicit, socio-ecological landscape simulation model. Taking a combined agent-based/cellular automata approach, it directly considers:

  1. human land management decision-making in a low-intensity Mediterranean agricultural landscape [agent-based model]
  2. landscape vegetation dynamics, including seed dispersal and disturbance (human or wildfire) [cellular automata model]
  3. the interaction between 1 and 2
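
A highly simplified sketch of how such an agent-based/cellular-automata coupling might loop over time. The agent rules, thresholds and vegetation states here are invented for illustration and are not those of my actual thesis model:

```python
import random

class LandManagerAgent:
    """Hypothetical land-manager agent: each year it decides whether to keep
    farming its cell or abandon it, based on a simple profitability rule."""
    def __init__(self, cell, profit_threshold=0.5):
        self.cell = cell
        self.profit_threshold = profit_threshold

    def decide(self, profitability):
        return "farm" if profitability >= self.profit_threshold else "abandon"

def vegetation_step(state):
    """Toy cellular-automata rule: abandoned land regrows one successional
    stage per time step, up to mature vegetation (stage 3)."""
    return min(state + 1, 3)

# Schematic coupling of the two submodels over five annual time steps:
# the agent's decision (1) drives the vegetation dynamics (2), which is
# the interaction (3) in the list above.
random.seed(1)
cell_state = 0                       # 0 = farmed, 1-3 = regrowth stages
agent = LandManagerAgent(cell=(0, 0))
for year in range(5):
    profitability = random.random()  # stand-in for an economic driver
    if agent.decide(profitability) == "abandon":
        cell_state = vegetation_step(cell_state)  # CA acts on abandoned land
    else:
        cell_state = 0               # farming resets vegetation succession
```

In the real model the feedback also runs the other way – vegetation state and disturbance (e.g. wildfire) alter the value of a cell to its manager – but even this one-way sketch shows the basic alternation between decision-making and landscape dynamics at each time step.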

Read more about it here. I’m nearly finished now, so I’ll be posting results from the model in the near future. Finally, some other useful spatial simulation modelling links:

Wisconsin Ecosystem Lab – at the University of Wisconsin

Center for Systems Integration and Sustainability – at Michigan State University

Landscape Ecology and Modelling Laboratory – at Arizona State University

Great Basin Landscape Ecology Lab – at the University of Nevada, Reno

Baltimore Ecosystem Study – at the Institute of Ecosystems Studies

The Macaulay Institute – Scottish land research centre