Giddens’ Risk and Responsibility

A good story is one that grips you; it’s hard to guess what is coming next, but once it has been told the outcome seems inevitable. The same could be said for theories about how the world is — sometimes you read something that just makes sense. You knew that’s how the world was before you read about it, but couldn’t put it into words quite so eloquently. That’s how I felt when I was reading Giddens’ ‘Risk and Responsibility’.

The story goes that we have reached the End of Nature and the End of Tradition, and we are no longer in a time of External Risk but now live in a time of Manufactured Risk. Essentially, Giddens’ piece is a discussion of how the threats to contemporary society are a product of science and technology, and in this sense it is grounded in the notion of the Risk Society.

To be more specific, the advances of science and technology, and the ‘domination’ over nature they allow us, mean that our environmental worries are no longer about what nature might do to us, but what we are doing to nature. This may be true in the majority of developed societies, but there are still plenty of developing areas in the world for which this does not apply (and natural hazards still pose a major threat to some areas of the developed world). But let’s leave that point aside and remember the problems of anthropogenic climate change, pollution of the world’s waterways, deforestation of tropical rainforest, the problem of radioactive waste, and all the other protection-of-the-global-commons-type issues. In many ways, we humans have more of an influence over our environment than it has over us. This is the End of Nature.

The End of Tradition, in Giddens’ own words, “is essentially to be in a world where life is no longer lived as fate.” Previously, in industrial society, the man went out to work and the woman stayed at home with the kids. But all this has changed; we are more socially mobile and we live in a world where information (via the internet), freedom (via democracy) and opportunities (via strong economies) abound. We can do what we want to do and take control of our own lives. Again, there is a limit to this and it applies mainly to developed areas of the world, but it sounds about right, doesn’t it?

Risk as a concept only originated as humans began to think they might be able to take control of their environment. Whilst nature and tradition had not yet ended, their demise was on the horizon. Prior to this, dangers were ‘taken as given’, as ‘acts of god’ that humans could not control. Humans had little control over external risk, but they could take steps to reduce their losses in the face of frequent hazards. External Risk originated in early industrial societies with the advent of public and private insurance — we couldn’t do much about the risk (because it was external) but we could at least mitigate our losses.

And now, finally, we have Manufactured Risk, a symptom of the risk society. Manufactured risk is the very risk caused by our own human progress and development, primarily because of the fantastic recent advances of scientific knowledge and technological innovation. Although manufactured risk is caused by human activity, because it is new and we have little experience of it we cannot calculate any probabilities associated with it. And although they created it, science and technology cannot solve the problems they’ve caused — they produce uncertainty as fast as they destroy it. Besides, problem-solving is not the goal of science; science is for generating knowledge (via puzzle-solving).

Thus, whilst science and technology have reduced the problems of external risk, they have also brought the end of nature and manufactured risk with them. The threats and risks produced in our risk society are dispersed in nature and origin. Beck suggests that from this situation emerges ‘organised irresponsibility’; although the threats are anthropogenic, no individual actor can be held responsible. This also seems to resonate with the idea of The Tyranny of Small Decisions that I was describing just a few days ago. Scientific knowledge and technological innovation develop cumulatively and are used by everyone who has access to them. Who can you blame?

So this is all very gloomy isn’t it? The End of Tradition. The End of Nature. The End of the Story? What can we do about this?

Seemingly the tools we used to get us to this point won’t work to help us move on and deal with the pressing environmental problems facing contemporary society: global warming, pollution, radiation, deforestation, carcinogens… To continue the story and solve these problems it’s been suggested we need a new kind of science. Not a ‘normal’, universal, value-free, distant science, but a situated, value-laden, engaged science. It’s time science stopped sticking its head in the sand, saying “we just produce the knowledge, it’s up to society to decide what to do with it”. This ‘new’ science has been named post-normal science and will be the subject (maybe hero?) as the story continues another time. Gripping eh?


The Tyranny of Small Decisions

How did we get to where we are today?

William Odum highlighted the importance of small decisions’ effects on wider environmental issues and management — the “tyranny of small decisions” as it has been called. When the accretion of small decisions gives rise to broader-scale events and phenomena, the results that emerge are not necessarily optimal for society or the environment.

In the case of my PhD study area, an important issue is the sustainable maintenance of the relationship between fire, vegetation and human activity across the landscape, which arises from land-use decisions made by individual humans within that landscape. To ignore the potential effects of these small decisions on the wider environment could prove costly.

Equally, when studying the effects of these individual decisions, the reciprocal effects upon them of changes in the wider environment (e.g. the wildfire regime) shouldn’t be omitted. The tyranny of small decisions means that any model of landscape change needs to represent the feedbacks between individual decisions and their landscape consequences. Agent-based modelling, integrated with a cellular automaton, is one way I’m attempting to do this.
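As an aside on implementation, the feedback loop can be sketched very simply. Below is a minimal, hypothetical Python sketch — the states, rules and numbers are invented for illustration and are not those of my actual model: agents make land-use decisions based on the surrounding landscape, and a cellular automaton then updates the vegetation those decisions have altered.

```python
import random

# Illustrative land-cover states (not the actual model's classification)
EMPTY, SHRUB, FOREST, FARMLAND = 0, 1, 2, 3
SIZE = 20  # grid dimension

def step_vegetation(grid):
    """CA step: simple succession; farmed cells are left unchanged."""
    new = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            if grid[i][j] == EMPTY:
                new[i][j] = SHRUB              # colonisation of bare ground
            elif grid[i][j] == SHRUB and random.random() < 0.1:
                new[i][j] = FOREST             # slow succession to forest
    return new

class Farmer:
    """Agent that farms a cell while its surroundings suit, else abandons it."""
    def __init__(self, i, j):
        self.i, self.j = i, j

    def decide(self, grid):
        # Feedback: the decision depends on the surrounding landscape...
        forest_neighbours = sum(
            grid[(self.i + di) % SIZE][(self.j + dj) % SIZE] == FOREST
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0))
        if forest_neighbours >= 5:             # too enclosed by forest: abandon
            grid[self.i][self.j] = EMPTY
        else:                                  # ...and in turn changes the landscape
            grid[self.i][self.j] = FARMLAND

grid = [[SHRUB] * SIZE for _ in range(SIZE)]
farmers = [Farmer(random.randrange(SIZE), random.randrange(SIZE))
           for _ in range(30)]
for t in range(50):
    for f in farmers:
        f.decide(grid)                         # decisions alter land cover
    grid = step_vegetation(grid)               # land cover alters next decisions
```

Each pass through the loop closes the feedback: decisions change the land cover, and the changed land cover becomes the context for the next round of decisions.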

But these feedbacks don’t only happen in space across landscapes; they happen over time through one’s life. All the small minute-by-minute decisions that have led me to be where I am now: individual decisions made in the present, with an eye to the future, based on past experience. If making a decision at the current time depends upon one’s present situation, which in turn has arisen from past decisions, have those past decisions reduced the viable options one has open at the present time? Or have as many new doors opened as closed?

The tyranny of minute decisions. Why every minute counts. And why we’ll never know whether a decision was the right or wrong one to make until all our minutes have gone…

Reference
Odum, W.E. (1982) ‘Environmental degradation and the tyranny of small decisions’, BioScience, 32(9)


Reading is Believing?

“Don’t believe everything you read in the newspapers” they used to say. Well now it seems that the phrase is (or should be) “Don’t believe everything you read on a blog“.

As we’ve seen on this very blog, we need to be wary of the line between fact and fiction. But everybody knew that already, didn’t they? Richard Ladle suggests:

Misreporting and misrepresentation are important because they can lead to a loss of trust at a time when public support for pro-environmental policies is most crucial.

Poor reporting of environmental science may also have a disproportionate effect on children, who are increasingly turning to the internet as their preferred source of information and who are least able to judge the validity of claims or the legitimacy of one blog over another.

So how should we be responding to the challenges and opportunities presented by the blogosphere?

One way to deal with misrepresentation in blogs is to increase the weight of informed opinions in the blogosphere. An influx of scientifically informed opinion and accurate information would also help combat and correct misrepresentations in the traditional news media and draw public attention to important new research findings.

Recently there has been plenty of debate about the politicisation of environmental science. Scientists are increasingly using the media, including blogs, to promote and disseminate their work. This has left them open to criticism that they are cherry-picking their arguments and misrepresenting science. NGOs and advocacy groups have been cherry-picking their arguments for decades — but scientists shouldn’t fall into this habit; they will only devalue their credibility if they do. However, this is not to say scientists should not disseminate their work — quite the contrary. They should, if nothing else to add to the debate. If a scientific finding is to be useful in the ‘real world’ it will always be political; we have to accept that. The time has gone when scientists were able to say “I just do the science, not the politics”. Environmental science in the 21st century must accept this, and learn how to engage with the public at large to communicate its findings and their implications. (How this in turn influences how the science proceeds is another, interesting, question.)

Of course there’s uncertainty in science, and there always will be. As Rodger Bradbury suggests, science is a 3-tuple:

  1. a body of knowledge,
  2. a method for generating that knowledge, and
  3. a collection of people using those methods to increase that knowledge.

The knowledge generated by science is, and should be, constantly re-evaluated and questioned. For me this is the best way to examine the world, by constantly questioning. But the people using the methods and tools of science will always have their own agenda — even if it is simply the advancement of their own scientific career. Scientists are human beings. But now we need to continue to work to improve another facet of our scientific toolbag – the accurate communication of scientific work to the public at large. Who is better qualified to deliver the message, the scientist or the journalist? Scientists should work hard to make sure it is them.

But what about the situation right now? Richard Ladle again:
Fortunately, there are several ways in which the credibility of a website or blog can be quickly assessed:

  • Check the data – strong scientific arguments are based on information from recognised sources that is available for public scrutiny, while weak or spurious arguments are often backed up with data from secondary sources or often no data at all
  • Take note of the language – arguments couched in hyperbolic language may be masking a lack of understanding or sound information


These are sound starting points for anyone reading anything on the internet. Personally, on this blog I try to make a distinction between ‘serious’ comments and more ‘tongue-in-cheek’ comments by capitalising words in titles of the former, but not the latter. A good scientist will never deliberately mislead — but at the same time it needs to be understood that they can never be 100% sure of their findings. Scientific knowledge is provisional and always open to scrutiny and change. That’s how it differs from faith.

Naveh’s Holistic Landscape Ecology

(or “One of the reasons I’ve ended up doing what I’m doing“)

I don’t know if he was the first to come up with the term, but I first read about holistic landscape ecology in a couple of papers by Prof. Zev Naveh (in 2001, during my third-year undergrad course at King’s, ‘Landscape Ecology’, run by Dr. George Perry). Whilst reading today I came across some old notes I made from one of those papers (not terribly critical, as you can see!?). Distinguished Professors of a Certain Age are allowed licence to run riot with their accumulated wisdom. I’m not being facetious — they can write bigger ‘blue skies’, ‘call to arms’ pieces than other (more lowly) academics.

These are the two papers that really got me interested anyway (as well as my Dissertation; finally, as a 3rd-year undergrad!?). I think I thought something along the lines of, “there are problems here that we should be thinking about now and this guy is suggesting a paradigm of how we might start approaching them scientifically“. I think they’re one of the reasons I started an MSc (“I can’t stop now, I’ve only just found this stuff“), and then later continued onto this ‘ere PhD (“this is interesting – I want to keep going“).

Later I got to these questions:

  • What sort of scientific tools and methods will we need to address problems that we have in our socio-environmental systems now?
  • How do we integrate tools and methods from different scientific disciplines? (i.e. how do we really become ‘inter-disciplinary’?)
  • What sort of science will this be? Normal? Post-Normal? Something else?

It could take a while to answer these – but it doesn’t seem like we’ve got that long. We’ll have to work them out as we go along I think…


Applications of Complex Systems to Social Sciences

I’ve recently returned from the GIACS summer school in Poland: Applications of Complex Systems to Social Sciences. Whilst not a social scientist, I am interested in the incorporation of aspects of human/social behaviour into models of the physical environment and its change. I thought this summer school might be an opportunity to get a glimpse at what the future of modelling these systems might be, and how others are approaching investigation of social phenomena.

The set of lecturers was composed of a Psychologist, three Physicists (P1, P2, and P3), a Geographer, and an Economist. I’m sure plenty of ‘real social scientists’ wouldn’t be too happy with what some of these modellers are doing with their differential equations, cellular automata, agent-based models and network theory. One of the students I spoke to (a social psychologist) complained that these guys were modelling social systems but not humans; another (a computer scientist interested in robotics) suggested the models were too ‘reactive’ rather than ‘proactive’. Pertinent comments I think, and ones that made me realise that really understanding what was going on would require me to take a step back and look at the broader modelling panorama.

Some of the toughest comments from the school attendees were levelled at the Geographer’s model (or “virtual Geography”) that attempts to capture the patterns of population growth observed for European cities, using a mechanistic approach based on the representation of economic processes. The main criticism was that the large parameter space of this model (i.e. a large number of interacting parameters) makes the model very difficult to analyse, interpret and understand. Such criticisms were certainly valid and have previously been made of other models of geographic systems. However, the same criticisms could not be levelled at the physicists’ (and psychologist’s) models, simply because those models have far fewer parameters.

And this, I think, is one of the problems that the social psychologist and the computer scientist alluded to: the majority of the models arising from the techniques of physics (and mathematics) are generally interested in the properties of the system as a whole, not in individual interactions and components. One or two key state variables (a variable used to describe the state of the system) are reported and analysed. But actually, there’s nothing wrong with this approach, given the nature of their models, based as they are on very simple assumptions and largely homogeneous in the agents, actors and interactions they consider.

Such an approach didn’t sit well with the social psychologist because the agents being modelled are supposed to be representative of humans; humans are individuals that make decisions based on their individual preferences and understandings. The computer scientist didn’t want to know about broad decision-making strategies; he wants his robot to be able to make the right decision in individual, specific situations (i.e. move left and survive, not right and fall off a cliff). Understanding broad system properties of homogeneous agents and interactions is no good to these guys.

It’s also why the Geographer’s model stood out from the rest: it actually tries to recreate European urban development (or, more specifically, to “simulate the emergence of a system of cities functionally differentiated from initial configurations of settlements and resources, development parameters and interaction rules”). It’s a model that attempts to understand the system within its context. [The other model presented that addressed a specific system within its context was the Economist’s model (“virtual archaeology”) of the population dynamics of the lost Kayenta Anasazi civilisation in Arizona. This model also has a large parameter space but performed well, largely (I’d suggest) because it was driven by such good data for parameterisation (though some parameter tuning was clearly needed).]

So no, there is nothing wrong with an approach that considers homogeneous agents, actors and interactions with simple rules. It’s just that these models are more divorced from ‘reality’: they are looking at the essence of the system properties that arise from the simplest of starting conditions. What is really happening here is that systems which previously went unmodelled because of the problems of quantitatively representing systems of ‘middle numbers’ (i.e. systems with too many elements and interactions for simple modelling and analysis, yet too few for statistical mechanics to be useful) are now being broken down for analysis. The attitude is “we have to start somewhere, so let’s start at the bottom with the simplest cases and work our way up”. Such an approach has recently been suggested for the advancement of social science as a whole.

This still means our “virtual Geographies” and “virtual Landscapes” will be hampered by huge parameter spaces for now. But what if we try to integrate simple agent-based models of real systems into larger models of systems that we know to be more homogeneous (‘predictable’?) in their behaviour? This is the problem I have been wrestling with regarding my landscape model: how do I integrate a model of human decision-making with a model of vegetation dynamics and wildfire? From the brief discussion I’ve presented here (and some other thinking), I think the most appropriate approach is to treat the agent-based decision-making model as the physicists do: examine the system properties that emerge from the individual interactions of agents. In my case, I can run the model for characteristic parameter sets, examine the composition (i.e. “how much?”) and configuration (i.e. “how spatially arranged?”) of the land cover that emerges, and use this to constrain the vegetation dynamics model.
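To illustrate the kind of summary I mean, here is a hedged Python sketch (using NumPy; the random grid stands in for real model output, and the metric choices are mine for illustration rather than a fixed methodology) of measuring the composition and configuration of an emergent land-cover map:

```python
import numpy as np

def composition(grid):
    """Composition ("how much?"): proportion of cells in each cover class."""
    values, counts = np.unique(grid, return_counts=True)
    return dict(zip(values.tolist(), (counts / grid.size).tolist()))

def like_adjacency(grid):
    """Crude configuration index ("how arranged?"): the fraction of
    horizontal/vertical neighbour pairs sharing the same cover class.
    Values near 1 indicate aggregated cover; near 0, fragmented cover."""
    horiz = (grid[:, :-1] == grid[:, 1:]).sum()
    vert = (grid[:-1, :] == grid[1:, :]).sum()
    pairs = grid[:, :-1].size + grid[:-1, :].size
    return (horiz + vert) / pairs

# A random 3-class grid standing in for the land cover emerging from the ABM
rng = np.random.default_rng(0)
landcover = rng.integers(0, 3, size=(50, 50))
comp = composition(landcover)
adj = like_adjacency(landcover)
print(comp, adj)
```

Summaries like these could then be compared across characteristic parameter sets, and the resulting composition/configuration envelopes used to constrain the vegetation dynamics model.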

So, the summer school was very interesting, I got to meet many people from very different academic backgrounds (physicists, mathematicians, computer scientists, cognitive scientists, psychologists, sociologists, economists…) and discuss how they approach their problems. I think this has given me a broader understanding of the types and uses of models available for studying complex systems. Hopefully I’ll be able to use some of this understanding of different techniques in the future to good effect when studying the interaction between social and environmental systems.

The complex systems approach does offer many possibilities for the investigation of social systems. However, for the study of humans and society this sort of modelling will only go so far. We’ll still need our sociologists, ‘human’ geographers, and the like to study the qualitative aspects of these systems, their components and interactions. After all, real people don’t like being labelled or pigeon-holed.

Millington 2006 Book Chapter

I’ve just received the offprint from the book chapter I wrote with George Perry and Bruce Malamud and have posted it on my website.

Millington, J.D.A., Perry, G.L.W. and Malamud, B.D. (2006) ‘Models, data and mechanisms: quantifying wildfire regimes’, in: Cello, G. and Malamud, B.D. (eds) Fractal Analysis for Natural Hazards. Geological Society, London, Special Publications

Abstract
The quantification of wildfire regimes, especially the relationship between the frequency with which events occur and their size, is of particular interest to both ecologists and wildfire managers. Recent studies in cellular automata (CA) and the fractal nature of the frequency–area relationship they produce has led some authors to ask whether the power-law frequency–area statistics seen in the CA might also be present in empirical wildfire data. Here, we outline the history of the debate regarding the statistical wildfire frequency–area models suggested by the CA and their confrontation with empirical data. In particular, the extent to which the utility of these approaches is dependent on being placed in the context of self-organized criticality (SOC) is examined. We also consider some of the other heavy-tailed statistical distributions used to describe these data. Taking a broadly ecological perspective we suggest that this debate needs to take more interest in the mechanisms underlying the observed power-law (or other) statistics. From this perspective, future studies utilizing the techniques associated with CA and statistical physics will be better able to contribute to the understanding of ecological processes and systems.
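As a rough illustration of the frequency–area analysis the chapter discusses, the sketch below estimates a power-law exponent from a set of burned-area values using the standard maximum-likelihood estimator for a continuous power law above a threshold. The synthetic data (generated by inverse-transform sampling) merely stand in for real wildfire records; none of the numbers come from the chapter itself.

```python
import math
import random

def powerlaw_mle(sizes, x_min):
    """MLE exponent for a continuous power law p(x) ~ x^(-alpha), x >= x_min:
    alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    tail = [x for x in sizes if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Synthetic 'burned areas' drawn from a power law with a known exponent,
# via inverse-transform sampling: X = x_min * U^(-1/(alpha-1)), U ~ Uniform(0,1)
random.seed(42)
alpha_true, x_min = 2.5, 1.0
sizes = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(10000)]

alpha_hat = powerlaw_mle(sizes, x_min)
print(round(alpha_hat, 2))  # should recover a value close to alpha_true
```

In practice the interesting questions raised in the chapter begin after this step: whether a power law is the right description at all compared with other heavy-tailed distributions, and what mechanisms might generate it.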

Bill Cronon: Secular Apocalypse


I saw this photo a couple of days ago. It’s a comparison of the state of a Chilean glacier in 1928 with 2004. The glacier is retreating by 14 metres per year, attributed by scientists to a warming of the global climate. At that rate of retreat it could be gone in 25 years. Look at that panorama though — wouldn’t it be great to go and see that before it’s gone? Imagine if you were stood there confronted by this awesome sight; what would you be thinking? Greenpeace have been pretty sneaky though (as they have a right to be) in using those beautiful photos that would stick in my mind; when I arrived at that vista I might just think, “I contributed to this”.

I made a point of going to see Bill Cronon at the Thursday morning plenary “Narrative of climate change” at the RGS conference. He suggested that narratives of climate change have been used as both prediction AND (secular) prophecy. This idea of a secular prophecy comes from recent invocations of Nature as a secular proxy for God. Prophecies are often told as stories of the retribution that will be incurred if God’s laws are broken. If Nature is a proxy for God, then climate change is portrayed as retribution for humans breaking the laws of Nature.

Cronon suggests that global narratives are abstract, virtual, systemic, remote and vast; they have a diffuse sense of agency, possess no individual characters (i.e. no heroes/villains), and are repetitive (so boring). These characteristics make it difficult to emphasise and justify calls for human action to mitigate the anthropic influence on the climate. Cronon suggests these types of prophetic narrative are ‘unsustainable’ because they do not offer the possibility of individual or group action to reverse or address global climate problems, and therefore are of no use politically or socially.

Cronon went on to discuss the microcosms (micro-narratives) Elizabeth Kolbert uses in her book “Field Notes from a Catastrophe” to illustrate the impacts of global change in a localised manner. She uses individual stories picked because they are unexpected; they are non-abstract, the antithesis of the unsustainable global narratives. He concluded that we need narratives that offer hope, and not those tied to social and political models based on anarchic thought that do not address the systemic issues driving the change itself. This is the political challenge he suggests: to create narratives that not only make us think “I contributed to this” when we see evidence of glacier retreat, but that offer us hope of finding ways to reduce our future impact upon the environment.

RGS 2006: Day One

I went to the participatory techniques showcase session on the first day of the RGS annual conference. Nick Lunch (from Insight) made an interesting presentation on Participatory Video — something they call a “community empowerment tool”, but also, it seems to me, a way of eliciting local knowledge and understandings. I’d suggest that when modelling the interaction between local communities and their environment, this would be a good way for the modeller to improve their understanding of what the problem is and what the key variables and parameters to be considered in a model are.

Nick also said that Insight have found that one of the technique’s best uses is as a catalyst to ‘do things’ and initiate local action within communities. I can see why this might be the case — I’ve found this blog enables me to ‘get things done’ too. It’s given me the confidence just to start writing and prompted me to record my thought processes better (both on and off the blog) — something I haven’t been strict enough about with myself during the PhD modelling project. This is definitely a lesson learned from my PhD work and something I want to make sure I do better in the next project I tackle.

It also helps to “crystalise one’s thoughts” as one colleague put it. I often have several ideas swirling around in my head at once, and usually have a general ‘impression’ of how they relate. But it’s not until I write it down that I really understand — writing something in prose really demands the idea is properly understood. The process of writing clearly aids the process of understanding. And whilst writing in prose helps to shore up these loosely tied ideas, coding demands an even more explicit understanding. This is where I see the worth of the process of generating a simulation model in itself.

Depending upon how much a modeller wishes to publish online, a blog might be an interesting way to demonstrate the modelling process and a way to document and highlight the dead ends that a modelling project often finds itself following. Mike Bithell suggested in a presentation later on the first day of the RGS conference that the limitations of modelling often cannot be explored without going through the process of producing a model itself. From some of the issues I’ve encountered in my modelling exploits, I can understand what he means.

Landscape Influences Human Social Interaction

Scientific American: Landscape Influences Human Social Interaction

They know all about this in Spain. One of the presentations at the THEMES Summer School I was at in June was all about the current problems in the Barcelona suburbs as people decide they want nice green lawns like they see on Desperate Housewives.

Domene, E., Saurí, D. and Parés, M. (2005) ‘Urbanization and sustainable resource use: the case of garden watering in the Metropolitan Region of Barcelona’, Urban Geography, 26(6), 520–535

So maybe it’s “natural” for us to want to live in “unnatural” surroundings…

RGS Programme

The programme for this year’s RGS-IBG Annual Conference was published today. I’m presenting two papers:

  1. ‘A simulation model of vegetation dynamics and wildfire for a Mediterranean landscape’ in the PGF & QMRG session ‘Postgraduate research in quantitative geography‘, 9am, Friday 1st September
  2. ‘Modelling Feedbacks between land-use decision-making and ecological processes in a Mediterranean landscape‘ in the QMRG session ‘Social simulation & modelling complexity’, 2pm, Friday 1st September