Creating a Genuine Science of Sustainability

Previously, I wrote about Orrin Pilkey and Linda Pilkey-Jarvis’ book, Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future. In a recent issue of the journal Futures, Jerome Ravetz reviews their book alongside David Waltner-Toews’ The Chickens Fight Back: Pandemic Panics and Deadly Diseases That Jump From Animals to Humans. Ravetz himself points out that the subject matter and approaches of the books are rather different, but suggests that “Read together, they provide insights about what needs to be done for the creation of a genuine science of sustainability”.

Ravetz (along with Silvio Funtowicz) has developed the idea of ‘post-normal’ science – a new approach to replace the reductionist, analytic worldview of ‘normal’ science. Post-normal science is a “systemic, synthetic and humanistic” approach, useful in cases where “facts are uncertain, values in dispute, stakes high and decisions urgent”. I used some of these ideas to experiment with alternative model assessment criteria for the socio-ecological simulation model I developed during my PhD studies. Ravetz’s perspective on modelling, and on science in general, shines through quite clearly in his review:

“On the philosophical side, the corruption of computer models can be understood as the consequence of a false metaphysics. Following on from the prophetic teachings of Galileo and Descartes, we have been taught to believe that Science is the sole and certain path to truth. And this Science is mathematical, using quantitative data and abstract reasonings. Such a science is not merely necessary for achieving genuine knowledge (an arguable position) but is also sufficient. We are all victims of the fantasy that once we have numerical data and mathematical argument (or computer programs), truth will inevitably follow. The evil consequences of this philosophy are quite familiar in neo-classical economics where partly true banalities about markets are dressed up in the language of the differential calculus to produce justifications for every sort of expropriation of the weak and vulnerable. ‘What you can’t count, doesn’t count’ sums it all up neatly. In the present case, the rule of models extends over nearly all the policy-relevant sciences, including those ostensibly devoted to the protection of the health of people and the environment.

We badly need an effective critical philosophy of mathematical science. … Now science has replaced religion as the foundation of our established order, and in it mathematical science reigns supreme. Systematic philosophical criticism is hard to find. (The late Imre Lakatos did pioneering work in the criticism of the dogmatism of ‘modern’ abstract mathematics but did not focus on the obscurities at the foundations of mathematical thinking.) Up to now, mathematical freethinking is mainly confined to the craftsmen, with their jokes of the ‘Murphy’s Law’ sort, best expressed in the acronym GIGO (Garbage In, Garbage Out). And where criticism is absent, corruption of all sorts, both deliberate and unaware, is bound to follow. Pseudo-mathematical reasonings about the unthinkable helped to bring us to the brink of nuclear annihilation a half-century ago. The GIGO sciences of computer models may well distract us now from a sane approach to coping with the many environmental problems we now face. The Pilkeys have done us a great service in providing cogent examples of the situation, and indicating some practical ways forward.”

Thus, Ravetz finds rather more value in Useless Arithmetic than I did. Equally, though, he notes that the Pilkeys offer only a few, rather vague, solutions, and he turns instead to Waltner-Toews’ book for inspiration for the future:

“Pilkey’s analysis of the corruptions of misconceived reductionist science shows us the depth of the problem. Waltner-Toews’ narrative about ourselves in our natural context (not always benign!) indicates the way to a solution.”

Using the outbreak of avian flu as an example of how to tackle complex environmental problems in the ‘risk society’ in which we now live, Waltner-Toews:

“… makes it very plain that we will never ‘conquer’ disease. Considering just a single sort of disease, the ‘zoonoses’ (deriving from animals), he becomes a raconteur of bio-social-cultural medicine …

What everyone learned, or should have learned, from the avian flu episode is that disease is a very complex entity. Judging from TV adverts for antiseptics, we still believe that the natural state of things is to be germ-free, and all we need to do is to find the germs and kill them. In certain limiting cases, this is a useful approximation to the truth, as in the case of infections of hospitals. But even there complexity intrudes … “

Complexity which demands an alternative perspective that moves beyond the next stage of ‘normal’ science to a post-normal science (to play on Kuhn’s vocabulary of paradigm shifts):

“That old simple ‘kill the germ’ theory may now be derided by medical authorities as something for the uneducated public and their media. But the practice of environmental medicine has not caught up with these new insights.

The complexity of zoonoses reflects the character of our interaction with all those myriads of other species. … the creatures putting us at risk are not always large enough to be fenced off and kept at a safe distance. … We can do all sorts of things to control our interactions with them, but one thing is impossible: to stamp them out, or even to kill the bad ones and keep the good ones.

Waltner-Toews is quite clear about the message, and about the sort of science that will be required, not merely for coexisting with zoonoses but also for sustainable living in general. Playing the philological game, he reminds us that the ancient Indo-European word for earth, dgghem, gave us, along with ‘humus’, all of ‘human’, ‘humane’ and ‘humble’. As he says, community by community, there is a new global vision emerging whose beauty and complexity and mystery we can now explore thanks to all our scientific tools.”

This global vision is a post-normal vision. It applies to far more than just avian flu – from coastal erosion and the disposal of toxic or radioactive waste (as the Pilkeys discuss, for example) to climate change. This post-normal vision focuses on uncertainty, value loading, and a plurality of legitimate perspectives, and it demands an “extended peer community” to evaluate the knowledge generated and the decisions proposed.

“In all fairness, it would not be easy to devise a conventional science-based curriculum in which Waltner-Toews’ insights could be effectively conveyed. For his vision of zoonoses is one of complexity, intimacy and contingency. To grasp it, one needs to have imagination, breadth of vision and humility, not qualities fostered in standard academic training. …”

This post-normal science won’t be easy, and it won’t be learned or fostered entirely within the esoteric confines of an ivory tower. Science, with its logical rigour, is important. It is still the best game in town. But the knowledge produced by ‘normal’ science is provisional, and its march toward truth seems Sisyphean when confronted with the immediacy of complex contemporary environmental problems. To contribute to the production of a sustainable future, a genuine science of sustainability would do well to adopt a more post-normal stance toward its subject.

What’s your model?

In their feature Formulae for the 21st Century, Edge ask ‘What is your formula? Your equation?’ Scientists, Philosophers, Artists and Writers have replied. Some gave their favourite, or what they thought to be the most important, formulas from their fields.

But many gave their models of the world. I think that’s why I like these so much – they’re models, simplifications, abstractions, essences of an aspect of life or thought. They range from Happiness (Danny Kahneman, Jonathan Haidt) and Creativity (Geoffrey Miller, Richard Foreman), through Cognition (Steven Pinker, Ernst Pöppel), Economics (Matt Ridley), Society (Doug Rushkoff, John Horgan), Science (Richard Dawkins, Neil Shubin), Life (Alison Gopnik, Tor Nørretranders) and the Universe (Michael Shermer, Dimitar D. Sasselov), all the way (full circle, maybe) to Metaphysics (Paul Bloom).

My favourites are the most simple – model parsimony, Occam’s Razor and all that.

This got me thinking about why I like quotes so much too – because they’re models. Take the essence of an idea and express it as elegantly as possible. That’s what scientists and mathematicians do, but equally it’s what writers and artists do. Take it far enough and, being a bit of a critical realist, I would say that all human perception is a model. But these elegant models are more useful than our sensory apparatus alone (which, along with our subconscious, does plenty of filtering already) – they observe whilst simultaneously interpreting and synthesizing.

So what’s my model? I’m not sure – it would have to involve change. My personal models are continually changing, vacillating. Sometimes I believe time has an arrow, sometimes it doesn’t. Sometimes the world is equations and energy, sometimes it’s story and sentiment. Sometimes life is light, sometimes life is heavy. Even when my model is relatively stable it’s usually paradoxical (or should that be hypocritical?) and ironic. I’ll try to pare it down to its most parsimonious state and then find some words and symbols to express it elegantly. Then I’ll post it here. I can’t guarantee that will be any time soon, mind you…

In the meantime, what’s your model?

The Wilderness Ideal

One evening, whilst sitting on a deck overlooking a tranquil lake in the wilds of the UP’s northern hardwood forests, I began reading William Cronon’s contributions to the volume he edited, Uncommon Ground. The book has been around for more than a decade, but it is only recently that I came across a copy in a secondhand book store. It seems apt that I considered what it had to say about the ‘social construction’ of nature in the type of setting that has long intrigued me. Maybe the kind of landscape that confronted me is another of the reasons I am doing what I am doing right now. I have had pictures of these large wilderness landscapes on the walls of my mind, and elsewhere, for a while.

Cronon examines “the trouble with wilderness” with reference to the Edenic ideal that underlay it from the beginning. Wordsworth and Thoreau stood in bewildered, almost lost, awe of the sublime landscapes they travelled through, but by the time John Muir came to the Sierra Nevada the landscape had become an ecstasy. Whilst Adam and Eve may have been driven from the garden out into the wilderness, the myth was now ‘the mountain as cathedral’, and sacred wilderness was a place to worship God’s natural world. Furthermore, as the American frontier diminished with time and technology,

“wilderness came to embody the national frontier myth, standing for the wild freedom of America’s past and seeming to represent a highly attractive natural alternative to the ugly artificiality of modern civilization. … Ever since the nineteenth century, celebrating wilderness has been an activity mainly for well-to-do city folks. Country people generally know far too much about working the land to regard unworked land as their ideal.” (p.78)

Cronon suggests that there is a paradox at the heart of the Wilderness ideal – the conception that true nature must be wild and that humans must set aside areas of the world for it to remain pristine. As Cronon puts it, this paradox is that “The place where we are is the place where nature is not”. Taking this logic to its extreme results in the need for humans to kill themselves in order to preserve the natural world:

“The absurdity of this proposition flows from the underlying dualism it expresses. … The tautology gives us no way out: if wild nature is the only thing worth saving, and if our mere presence destroys it, then the sole solution to our own unnaturalness, the only way to protect sacred wilderness from profane humanity, would seem to be suicide. It is not a proposition that seems likely to produce very positive or practical results.” (p.83)

I’ll say. But Cronon is not saying that protected wilderness areas are themselves undesirable things – of course not. His point is about the idea of Wilderness. In response, he suggests that rather than thinking of nature as ‘out there’, we need to learn how to bring the wonder we feel when in the wilderness closer to home. We need to abandon the idea of the tree in the garden as artificial and the tree in the wilderness as natural. If we see both trees as natural, as wild, then we will be able to see nature and wildness everywhere: in the fields of the countryside, between the cracks in the city pavement, and even in our own cells.

“If wildness can stop being (just) out there and start being (also) in here, if it can start being as humane as it is natural, then perhaps we can get on with the unending task of struggling to live rightly in the world – not just in the garden, not just in the wilderness, but in the home that encompasses both” (p.90)

Sitting on that deck looking out over the lake, it was clear that landscapes such as the one I was in aren’t the idealised, pristine wilderness they may be portrayed as in books, photographs and travel brochures. Just as in studying its nature I have come to better understand the uncertainties of the scientific method that is supposed to deliver facts and truth, so I think I have come to better understand the place of human needs within these ‘wild’ landscapes. As naive as it is to think that science might offer absolute truth (it can’t, but it is still the best game in town for understanding the world around us), thinking humans are separate from nature seems equally foolish.

In the introduction to a book on natural resource economics (which has mysteriously vanished from my bookshelf), the author describes a similar realisation. As a young man he wanted to study the environment so that he might save it from the destructive hands of humans. But in time he came to realise this was unrealistic, and that it would be better to study the means by which humans use the ‘natural world’ to harvest and produce the resources we need to live. Economics is concerned with the means by which we allocate, and create value from, resources. Just as it is important to understand how ‘nature’ works, it is also important to understand how a world in which humans are a natural component works, and how it can continue to function indefinitely.

Landscape Ecology and Ecological Economics have grown out of this understanding. Whilst theories and models about the natural world independent of humans remain necessary, increasingly important are theories and models that consider the interaction between the social, economic and biophysical components of the natural world. These tools might help us get on with the task of living sustainably in the place which humans should naturally call home.


Summary – Validating and Interpreting Socio-Ecological Simulation Models

So, finally, the summary of my set of posts about the validation and interpretation of Socio-Ecological Simulation Models (SESMs) that arose out of some of the thinking I did for my PhD thesis.

The nature of open systems requires SESMs to specify and place boundaries on the system so that it may be analysed effectively. Recent debate in the geographical and environmental modelling communities has highlighted the importance of observer dependencies when identifying the appropriate model ‘closure’. Furthermore, because an ‘open’ system can be ‘closed’ for study in multiple ways whilst still adequately representing system behaviour, the problem of ‘affirming the consequent’ arises when attempting to model these systems.

Because of these issues I suggested that a more reflexive approach, emphasising trust via practical adequacy over the establishment of true model structure via mimetic accuracy, will put SESMs in a better position to provide understanding for non-modellers and contribute more readily to the decisions and debates regarding contemporary problems facing many real world environmental systems.

This is not to say that issues regarding mimetic accuracy and model structure should be totally ignored – these model validation criteria will still have a role to play. However, emphasising trust via practical adequacy over truth via mimetic accuracy ensures the model validation question is ‘how good is this model for my purposes?’ and not ‘is this model true?’. Engagement with local stakeholders throughout the modelling process, contributing to model development and application, should ensure practical adequacy and, in parallel, trust. As a result of this participatory model evaluation exercise, confidence in the model should be built, hopefully to the level where it can be deemed ‘validated’ (i.e. fit for purpose).
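
As an aside, here is a minimal, purely illustrative sketch of what asking ‘how good is this model for my purposes?’ might look like in code. The criteria and the tolerance value are hypothetical placeholders – in practice they would be negotiated with stakeholders – and this is not code from my thesis.

```python
# A minimal, hypothetical sketch of 'fit for purpose' model assessment:
# rather than asking whether model output matches observations exactly,
# each purpose-specific criterion (agreed with stakeholders) is checked
# in turn. Criterion names and the tolerance are illustrative only.

def practically_adequate(predicted, observed, tolerance=0.25):
    """Check purpose-oriented criteria instead of a single accuracy score."""
    checks = {
        # Does the model reproduce the direction of change?
        "direction": (predicted[-1] - predicted[0]) * (observed[-1] - observed[0]) > 0,
        # Is the relative error of the final state within an agreed tolerance?
        "magnitude": abs(predicted[-1] - observed[-1]) <= tolerance * abs(observed[-1]),
    }
    return all(checks.values()), checks

# Example: simulated vs. observed burned area (arbitrary units)
ok, detail = practically_adequate([10, 14, 19], [10, 15, 22])
print(ok, detail)
```

The point is simply that ‘validation’ becomes a set of purpose-specific checks rather than a single measure of mimetic accuracy.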

Critical Realism for Environmental Modelling?

As I’ve discussed before, Critical Realism has been suggested as a useful framework for understanding the nature of reality (ontology) for scientists studying both the environmental and the social sciences. The recognition of the ‘open’ and middle-numbered nature of real-world systems has led to a growing acceptance of realist (and relativist – more on that in a few posts’ time) perspectives toward the modelling of these systems in the environmental and geographical sciences.

To recap, the critical realist ontology states that reality exists independently of our knowledge, and that it is structured into three levels: real natural generating mechanisms; actual events generated by those mechanisms; and empirical observations of actual events. Whilst mechanisms are time- and space-invariant (i.e. are universal), actual events are not, because they are realisations of the real generating mechanisms acting in particular conditions and contingent circumstances. This view seems to fit well with the previous discussion on the nature of ‘open’ systems – identical mechanisms will not necessarily produce identical events at different locations in space and time in the real world.

Richards initiated debate on the possibility of adopting a critical realist perspective toward research in the environmental sciences by criticising the emphasis on rationalist (hypothetico-deductive) methods. The hypothetico-deductive method states that claims to knowledge (i.e. theories or hypotheses) should be subjected to tests that are able to falsify those claims. Once a theory has been produced (based on empirical observations), a consequence of that theory is deduced (i.e. a prediction is made) and an experiment constructed to examine whether the predicted consequences are observed. Replicating experiments gives credence to the theory, and knowledge based upon it (i.e. laws and facts) is held as provisional until evidence is found to disprove the theory.
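
Sketched schematically (and leaving aside all the philosophical caveats that follow), the hypothetico-deductive cycle described above might be caricatured in a few lines of code. The function names are placeholders of my own, not from Richards or anyone else:

```python
# A schematic sketch of the hypothetico-deductive cycle described above.
# 'deduce_prediction' and 'run_experiment' are stand-ins for whatever
# domain-specific machinery would be used in practice.

def hypothetico_deductive(theory, deduce_prediction, run_experiment, n_replicates=10):
    """Treat a theory as provisionally corroborated until a test falsifies it."""
    for _ in range(n_replicates):
        prediction = deduce_prediction(theory)   # deduce a consequence of the theory
        outcome = run_experiment(prediction)     # attempt to observe that consequence
        if outcome != prediction:                # a single failed test falsifies
            return "falsified"
    return "provisionally corroborated"          # never 'proven', only not yet refuted

# Trivial usage with stand-in callables
theory = "all swans are white"
print(hypothetico_deductive(theory,
                            deduce_prediction=lambda t: "white",
                            run_experiment=lambda p: "white"))
```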

However, critical realism does not value regularity and replication as highly as rationalism. The separation of real mechanisms from empirical observations, via actual events, means that “What causes something to happen has nothing to do with the number of times we have observed it happening”. Thus, in the search for the laws of nature, a rationalist approach leaves open the possibility of the creation of laws as artefacts of the experimental (or model) ‘closure’ of the inherently open system it seeks to represent (more on model ‘closure’ next time).

The separation of the three levels of reality means that whilst reality exists objectively and independently, we can never observe the real generating mechanisms directly – only empirical traces of the actual events they produce. This separation causes a problem: how can science progress toward understanding the true nature of reality if the real world is unobservable? How do critical realists assess whether they have reached the real underlying mechanisms of a system and can stop studying it?

Whilst critical realism offers reasons for why the nature of reality makes the modelling of ‘open’ systems tricky for scientists, it doesn’t seem to provide a useful method by which to overcome the remaining epistemological problem of knowing whether a given (simulation) model structure is appropriate. In the next few posts I’ll examine some of these epistemological issues (equifinality, looping effects, and affirming the consequent) before switching to examine some potential responses.

Validating Models of Open Systems

A simulation model is an internally logically-consistent theory of how a system functions. Simulation models are currently recognised by environmental scientists as powerful tools, but the ways in which these tools should be used, the questions they should be used to examine, and the ways in which they can be ‘validated’ are still much debated. Whether a model aims to represent an ‘open’ or a ‘closed’ system has implications for the process of validation.

Issues of validation and model assessment are largely absent from discussions of abstract models that purport to represent the fundamental underlying processes of ‘real world’ phenomena such as wildfire, social preferences and human intelligence. These ‘metaphor models’ do not require empirical validation in the sense that environmental and earth systems modellers use it, as the very formulation of the system of study ensures it is ‘closed’. That is, the system the model examines is logically self-contained, neither influenced by nor interacting with outside statements or phenomena. The modellers do not claim to know much about the real-world system their model is purported to represent, and do not claim their model is the best representation of it. Rather, the modelled system is related to the empirical phenomena via ‘rich analogy’, and investigators aim to elucidate the essential system properties that emerge from the simplest model structure and starting conditions.

In contrast to these virtual, logically closed systems, empirically observed systems in the real world are ‘open’. That is, they are in a state of disequilibrium with flows of mass and energy both into and out of them. Examples in environmental systems are flows of water and sediment into and out of watersheds and flows of energy into (via photosynthesis) and out of (via respiration and movement) ecological systems. Real world systems containing humans and human activity are open not only in terms of conservation of energy and mass, but also in terms of information, meaning and value. Political, economic, social, cultural and scientific flows of information across the boundaries of the system cause changes in the meanings, values and states of the processes, patterns and entities of each of the above social structures and knowledge systems. Thus, system behaviour is open to modification by events and phenomena outside the system of study.

Alongside being ‘open’, these systems are also ‘middle-numbered’. Middle-numbered systems differ from small-numbered systems (controlled situations with few interacting components, e.g. two billiard balls colliding) that can be described and studied well using Cartesian methods, and from large-numbered systems (many, many interacting components, e.g. air molecules in a room) that can be described and studied using techniques from statistical physics. Rather, middle-numbered systems have many components, and the nature of the interactions between them is not homogeneous and is often dictated or influenced by the condition of other variables, themselves changing (and potentially distant) in time and space. Such a situation might be termed complex (though many perspectives on complexity exist). Systems at the landscape scale in the real world are complex and middle-numbered. They exist in a unique time and place. In these systems history and location are important, and their study is necessarily a ‘historical science’ that recognises the difficulty of analysing unique events scientifically through formal, laboratory-type testing and the hypothetico-deductive method. Most real-world systems possess these properties, and coupled human-environment systems are a prime example.
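
A toy illustration of my own (not taken from any of the sources cited here) of why middle-numbered systems resist both Cartesian decomposition and statistical averaging: the ‘same’ process acts differently depending on local conditions and on external events, so the mean behaviour hides the dynamics that matter. All numbers below are arbitrary.

```python
# Toy illustration of a 'middle-numbered' system: the interaction rule is not
# fixed but depends on other, changing state variables (local moisture) and on
# external, open-system events (drought). All values are arbitrary.

import random

n_patches = 50
moisture = [random.random() for _ in range(n_patches)]   # contingent local conditions
vegetation = [1.0] * n_patches

for t in range(100):
    drought = random.random() < 0.1                      # an external (open-system) event
    for i in range(n_patches):
        # the 'same' process acts differently depending on local and global state
        growth = 0.05 * moisture[i] if not drought else -0.1 * (1 - moisture[i])
        vegetation[i] = max(0.0, vegetation[i] + growth)

print(sum(vegetation) / n_patches)   # the mean alone hides the heterogeneous dynamics
```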

Traditionally, laboratory science has attempted to isolate real-world systems such that they become closed and amenable to the hypothetico-deductive method. The hypothetico-deductive method is based upon the logical prediction of phenomena independent of time and place, and is therefore useful for generating knowledge about logically, energetically and materially ‘closed’ systems. However, many real-world environmental systems cannot be taken into the laboratory and must instead be studied in situ; their ‘open’ nature means the hypothetico-deductive method is often problematic to implement when using simulation models to generate knowledge about them. Any conclusions drawn using the hypothetico-deductive method for open systems via a simulation model will implicitly be about the model rather than the open system it represents. ‘Validation’ has also frequently been used, incorrectly, as a synonym for demonstrating that a model is a truly accurate representation of the real world. By contrast, validation in this series of blog posts refers to the process by which a model constructed to represent a real-world system has been shown to represent that system well enough to serve the model’s intended purpose. That is, validation is taken to mean the establishment of model legitimacy – usually of arguments and methods.

In the next few posts I’ll examine the rise of (critical) realist philosophies in the environmental sciences and environmental modelling and will explore the philosophy underlying these problems of model validation in more detail.

Validating and Interpreting Socio-Ecological Simulation Models

Over the next nine posts I’ll discuss the validation, evaluation and interpretation of environmental simulation models. Much of this discussion is taken from chapter seven of my PhD thesis, arising out of my efforts to model the impacts of agricultural land use change on wildfire regimes in Spain. Specifically, the discussion and argument focus on simulation models that represent socio-ecological systems. Socio-Ecological Simulation Models (SESMs), as I will refer to them, are those that explicitly represent the feedbacks between the activities and decisions of individual actors and their social, economic and ecological environments.

To represent such real-world behaviour, models of this type are usually spatially explicit and agent-based (e.g. Evans et al., Moss et al., Evans and Kelley, An et al., Matthews and Selman) – the model I developed is an example of a SESM. One motivating question for the discussion that follows is, considering the nature of the systems and issues they are used to examine, how we should go about model evaluation or ‘validation’. That is, how do we identify the level of confidence that can be placed in the knowledge produced by the use of a SESM? A second question is, given the nature of SESMs, what approaches and tools are available, and should be used, to ensure models of this type provide the most useful knowledge to address contemporary environmental problems?
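
For readers unfamiliar with agent-based models, the sketch below shows the kind of feedback loop a SESM represents – agents’ decisions alter the landscape, and the altered landscape feeds back into subsequent decisions. To be clear, this is not the model I developed for my thesis; the rules and numbers are deliberately simplistic illustrations.

```python
# A deliberately minimal sketch of the decision-environment feedback a SESM
# represents. All rules and numbers are illustrative assumptions, not the
# model described in the thesis.

import random

class Farmer:
    def __init__(self):
        self.farming = True

    def decide(self, fire_risk, profit):
        # agents respond to both their economic and ecological environment
        self.farming = profit > 0.3 and fire_risk < 0.5

patches = [Farmer() for _ in range(100)]
fuel_load = 0.1

for year in range(50):
    abandoned = sum(1 for f in patches if not f.farming) / len(patches)
    fuel_load = min(1.0, fuel_load + 0.05 * abandoned)    # abandonment builds fuel
    fire_risk = fuel_load * random.random()               # contingent ignitions
    profit = random.uniform(0.0, 1.0)                     # fluctuating markets
    for f in patches:
        # each agent perceives a slightly different economic signal,
        # and decisions feed back on next year's landscape
        f.decide(fire_risk, profit + random.gauss(0, 0.1))

print(f"final fuel load: {fuel_load:.2f}")
```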

The discussion that follows adopts a (pragmatic) realist perspective (in the tradition of Richards and Sayer) and recognises the importance of the open, historically and geographically contingent nature of socio-ecological systems. The difficulties of attempting to use general rules and theories (i.e. a model) to investigate and understand a unique place in time are addressed. As is increasingly acknowledged in environmental simulation modelling (e.g. Sarewitz et al.), socio-ecological simulation modelling is a process in itself in which human decisions come to the fore – both because human decision-making is being modelled and, importantly, because modellers’ decisions during model construction are a vital component of the process.

If these models are intended to inform policy-makers and stakeholders about the potential impacts of human activity, the uncertainty inherent in them needs to be managed to ensure their effective use. Fostering trust and understanding via a model that is practically adequate for its purpose may aid traditional scientific forms of model validation and evaluation. The list below gives the titles of the posts that will follow over the next couple of weeks (and will become links when each post is online).

The Nature of Open Systems
Realist Philosophy in the Environmental Sciences
Equifinality
Interactive vs. Indifferent Kinds
Affirming the Consequent
Relativism in Modelling
Alternative Model Assessment Criteria
Stakeholder Participation and Expertise
Summary

Useless Arithmetic?

Can we predict the future? Orrin Pilkey and Linda Pilkey-Jarvis say we can’t. They blame the complexity of the real world, alongside a political preference to rely on the predictive results of models. I largely agree with many of their points, but their popular science book doesn’t do an adequate job of explaining why.

The book opens with an example of the failure of mathematical models to predict the collapse of the Grand Banks cod fisheries. The second chapter tries to lay the basis of their argument, providing an outline of the underlying philosophy and approaches of environmental modelling. This is then followed by case studies of the difficulties of using models and modelling in the real world: the Yucca Mountain nuclear waste repository, climate change and sea-level rise, beach erosion, open-pit mining, and invasive plant species. The conclusion is entitled ‘A Promise Unfulfilled’ – those promises having been made by engineers attempting to apply methods developed for simple, closed systems to complex, open ones.

Unfortunately, the authors don’t describe this conclusion in such terms. The main problems here are the authors’ rather vague distinction between quantitative and qualitative models and their inadequate examination of ‘complexity’. In the authors’ own words:

“The distinction between quantitative and qualitative models is a critical one. The principle message in this volume is that quantitative models predicting the outcome of natural processes on the surface of the earth don’t work. On the other hand, qualitative models, when applied correctly, can be valuable tools for understanding these processes.” p.24

This sounds fine, but it’s hard to discern, from their descriptions, exactly what the difference between quantitative and qualitative models is. In their words again,

Quantitative Models:

  • “are predictive models that answer the questions ‘where’, ‘when’, ‘how much'” p.24
  • “if the answer [a model provides] is a single number the model is quantitative” p.25

Qualitative Models:

  • “predict directions and magnitudes” p.24
  • do not provide a single number but consider relative measures, e.g “the temperature will continue to increase over the next century” p.24

So both types predict; one just produces absolute values and the other relative values. Essentially, the authors are saying that both types of model predict and both produce some form of quantitative output – one just tries to be more precise than the other. That’s a pretty subtle difference.
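
To make my reading of their distinction concrete, here is an illustrative pair of functions (my own construction, not the Pilkeys’) – one answers ‘how much’ with a single number, the other offers only a direction and rough magnitude. The retreat rate is a made-up placeholder:

```python
# An illustrative pair of functions (my construction, not the Pilkeys'):
# the 'quantitative' model answers 'how much' with a single number, while
# the 'qualitative' model gives only a direction and rough magnitude.
# The retreat rate is a made-up placeholder value.

def quantitative_prediction(shoreline_position_m, years, retreat_rate_m_per_yr=1.3):
    """'Where, when, how much': returns a single number."""
    return shoreline_position_m - retreat_rate_m_per_yr * years

def qualitative_prediction(sea_level_rising):
    """Direction and relative magnitude only."""
    if sea_level_rising:
        return "the shoreline will continue to retreat, probably by metres per decade"
    return "little systematic retreat expected"

print(quantitative_prediction(250.0, 20))   # 224.0 (an absolute value)
print(qualitative_prediction(True))         # a relative, directional statement
```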

Further on, they try to clarify the definition of a qualitative model by appealing to concepts:

“a conceptual model is a qualitative one in which the description or prediction can be expressed as written or spoken word or by technical drawings or even cartoons. The model provides an explanation for how something works – the rules behind some process” p.27.

But all environmental models that consider process (i.e. that are not purely empirical/statistical) are conceptual, regardless of whether they produce absolute or relative answers! Whether the model is Arrhenius’ back-of-the-envelope model of how the greenhouse effect works, or a General Circulation Model (GCM) running on a Cray supercomputer and considering multiple variables, both are built on conceptual foundations. We could write down the structure of the GCM; it would just take a long time. So again, their distinction between quantitative and qualitative models doesn’t really make things much clearer.
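
To labour the point: the ‘back of the envelope’ version of the greenhouse effect can itself be written down in a couple of lines using the widely quoted logarithmic forcing approximation. The climate sensitivity value used here is an assumed, illustrative number (the real value is famously uncertain), but the conceptual structure is the same kind of thing a GCM encodes in vastly more detail:

```python
# A back-of-the-envelope greenhouse calculation using the widely quoted
# logarithmic forcing approximation. The climate_sensitivity value (K per W/m^2)
# is an assumed, illustrative number; it is the uncertain part of the sum.

import math

def warming_from_co2(c_new_ppm, c_ref_ppm=280.0, climate_sensitivity=0.8):
    forcing = 5.35 * math.log(c_new_ppm / c_ref_ppm)   # radiative forcing, W/m^2
    return climate_sensitivity * forcing               # equilibrium warming, K

print(round(warming_from_co2(560.0), 2))   # roughly 3 K for a doubling of CO2
```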

With this sandy foundation, the authors suggest that the problem is that the real world is just too complex for quantitative models to be able to predict anything. So what is this ‘complexity’? According to Pilkey and Pilkey-Jarvis:

“Interactions among the numerous components of a complex system occur in unpredictable and unexpected sequences.” p.32

So, models can’t predict complex systems because they’re unpredictable. Hmm… a tautology, no? The next sentence:

“In a complex natural process, the various parameters that run it may kick in at various times, intensities, and directions, or they may operate for various time spans”.

Okay, now we’re getting somewhere – a complex system is one that has many components and in which the system processes might change over time. But that’s it, that’s our lot. That’s what complexity is. That’s why environmental scientists can’t predict the future using quantitative models – because there are too many components or parameters, any of which may change at any time, for us to keep track of them all and calculate an absolute numerical result. A relative result maybe, but not an absolute value. I don’t think this analysis quite lives up to its billing in the book’s sub-title. Sure, the case studies are good, informative and interesting, but I think the conceptual foundation is pretty loose.

I think the authors would have been better off making more use of Naomi Oreskes’ work (which they themselves cite) by talking about the difference between logical and temporal prediction, and the associated difference between ‘open’ and ‘closed’ systems. Briefly, closed systems are those in which the intrinsic and extrinsic conditions remain constant – the structure of the system, the processes operating within it, and the context within which the system sits do not change. Thus the system – and predictions about it – are outside history and geography. Think of gas particles bouncing around in a sealed box: if we know the volume of the box, the pressure of the gas and how much gas is in there then, assuming nothing else changes, we can predict the temperature.
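
The sealed-box prediction is just the ideal gas law, T = PV/(nR): given the pressure, the volume and the amount of gas (and the assumption that nothing else changes), the temperature follows without any reference to history or place. A quick worked example:

```python
# The closed-system prediction is just the ideal gas law, T = PV / (nR):
# with pressure, volume and amount of gas fixed, and nothing else allowed
# to change, the temperature follows with no reference to history or place.

R = 8.314  # gas constant, J/(mol*K)

def box_temperature(pressure_pa, volume_m3, moles):
    return pressure_pa * volume_m3 / (moles * R)

print(round(box_temperature(101_325, 0.024, 1.0), 1))   # ~292.5 K for this example
```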

Contrast this with an ‘open’ system, in which the intrinsic and extrinsic conditions are open to change. Here, the structure of the system and the processes operating within it might change as a result of the influence of processes or events outside the system of study. In turn, where the system is situated in time and space becomes important (i.e. these are geohistorical systems), and prediction becomes temporal in nature. All environmental systems are open. Think of the global atmosphere: what do we need to know in order to predict its temperature at some point in the future? Many of the processes and events influencing this particular system are clearly not constant and are open to change.

As such, I am in general agreement with Pilkey and Pilkey-Jarvis’ message, but I don’t think they do the sub-title of their book justice. They show plenty of cases where quantitative predictive models of environmental and earth systems haven’t worked, and highlight many of the political reasons why this approach has been taken, but they don’t quite get to the guts of why environmental models will never be able to make accurate predictions about specific places at specific times in the future. The book Prediction: Science, Decision Making, and the Future of Nature provides a much more comprehensive consideration of these issues and, if you can get your hands on it, is much better.

I guess that’s the point though, isn’t it – this is a popular science book that is widely available. So I shouldn’t moan too much, as I think it’s important that non-modellers are aware of the deficiencies of environmental models and modelling and of how they are used to make decisions about, and manage, environmental systems. These include:

  • the inherent unpredictability of ‘open’ systems (regardless of their complexity)
  • the over-emphasis on environmental models’ predictive capabilities and expectations (a result of positivist philosophies of science that have been successful in ‘closed’ and controlled conditions)
  • the politics of modelling and management
  • the need to publish (or at least make available) model source code and conceptual structure
  • an emphasis on models to understand rather than predict environmental systems
  • any conclusions based on experimentation with a model are conclusions about the structure of the model, not the structure of nature

I’ve come to these conclusions over the last couple of years during the development of a socio-ecological model, during which I’ve been confronted by differing modelling philosophies. As such, I think the adoption of something more akin to ‘Post-Normal’ Science, and greater involvement of the local publics in the environments under study, is required for better management. Understanding the interactions of social, economic and ecological systems poses challenges, but it is a challenge to which I am sure environmental modelling can contribute. However, given the open nature of these systems, this modelling will be more useful in the ‘qualitative’ sense that Pilkey and Pilkey-Jarvis suggest.

Orrin H. Pilkey and Linda Pilkey-Jarvis (2007)
Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future
Columbia University Press
ISBN: 978-0-231-13212-1


[June 3rd 2007: I just noticed Roger Pielke reviewed Useless Arithmetic for Nature the same day as this original post. Read the review here.]

Post-Normal Science (& Simulation Modelling)

Last week I didn’t quite manage to complete the JustScience week challenge to blog on a science topic, and only on science, every day of the week. I managed five days, but then the weekend got in the way. On those five days I wrote about the application of scientific methods to examine landscape processes – specifically wildfire regimes and land use/cover change (LUCC). Another of my ‘scientific’ interests is the relationship between science and policy- and decision-making, so what I was planning to write on Saturday might not have fitted the JustScience bill anyway. I’ll post it now instead: a brief review of some of the ways commentators have suggested science may need to adapt in the 21st century to ensure it remains relevant to ‘real world’ problems.

Ulrich Beck has suggested that we now live in the ‘risk society’. Beck’s view of the risks contemporary societies face – such as changing climates, atmospheric pollution and exposure to radioactive substances – shares common themes with the work of others examining contemporary society and its relationships with science, technology and the environment (Giddens, for example).

In the risk society, many threats are difficult to identify in everyday life, requiring complicated, expensive (usually scientific) equipment to measure and identify them. These threats, which require methods and tools from science and technology to investigate, have frequently been initiated by previous scientific and technological endeavours. Their consequences are no longer simply further academic and scientific problems for study, but consequences that matter socially, politically, culturally and environmentally. Furthermore, these consequences may be imminent, potentially necessitating action before the often lengthy traditional scientific method (hypothesis testing, academic peer review, etc.) has produced a consensus on the state of knowledge about them.

Beck goes on to suggest a distinction between two divergent sciences: the science of data and the science of experience. The former is the older, specialised, laboratory-based science that uses the language of mathematics to explore the world. The latter identifies consequences and threats, publicly testing its objectives and standards to examine the doubts the former ignores. Traditional science, Beck suggests, is at the root of current environmental problems and will simply propagate risk further rather than reducing it.

Taking a similar perspective, Funtowicz and Ravetz have presented ‘post-normal’ science as a new type of science to replace the reductionist, analytic worldview of ‘normal’ science with a “systemic, synthetic and humanistic” approach. The term ‘post-normal’ deliberately echoes Thomas Kuhn’s formulation of ‘normal’ science functioning between paradigm shifts, to emphasise the need for a shift in scientific thinking and practices that takes it outside of the standard objective, value-free perspective. The methodology of post-normal science then, emphasises uncertainties in knowledge, quality of method, and complexities in ethics. Post-normal science, according to Funtowicz and Ravetz, embraces the uncertainties inherent in issues of risk and the environment, makes values explicit rather than presupposing them, and generates knowledge and understanding through an interactive dialogue rather than formalised deduction. You can read more about Post-Normal science itself at NUSAP.net, and the Post-Normal Times blog will keep you up-to-date on recent events and issues at the interface between science and policy-making.

Recently I’ve been thinking about the utility of environmental simulation models (particularly those that explicitly consider human activity) for examining the sorts of problems present in the ‘risk society’ and to which post-normal science has been promoted as being able to contribute. I’ll write in more detail at a later date, but briefly, many of the theoretical facets of post-normal science seem relevant to the issues facing environmental (and landscape) simulation models. In particular, the epistemological problems of model validation recently discussed in the academic literature (e.g. by Naomi Oreskes et al. and Keith Beven – which I have touched on briefly in the past, but must post about in more detail soon) have highlighted the importance of considering the subjective aspects of the model construction process.

As a result, I have come to think that model ‘validation’ might be better achieved by taking an evaluative, qualitative approach rather than a confirmatory one. This shift would essentially mean asking “is this model good enough?” rather than “is this model true?”. Ethical questions about who should be asked, and who is qualified to judge, whether a model is trustworthy and fit for purpose for examining real-world problems (and not those confined to a laboratory) also become important when these criteria are used. These model validation issues thus resonate with a post-normal science perspective toward examining the environmental issues contemporary societies currently face.

I’ll write more on both the epistemological problems of confirmatory model validation for environmental and landscape simulation models and potential ways we might go about assessing the trustworthiness and practical adequacy of these models for addressing the problems of the ‘risk society‘ soon.
