Applications of Complex Systems to Social Sciences

I’ve recently returned from the GIACS summer school in Poland: Applications of Complex Systems to Social Sciences. Whilst not a social scientist, I am interested in the incorporation of aspects of human/social behaviour into models of the physical environment and its change. I thought this summer school might be an opportunity to get a glimpse of what the future of modelling these systems might be, and of how others are approaching the investigation of social phenomena.

The set of lecturers was composed of a Psychologist, three Physicists (P1, P2, and P3), a Geographer, and an Economist. I’m sure plenty of ‘real social scientists’ wouldn’t be too happy with what some of these modellers are doing with their differential equations, cellular automata, agent-based models and network theory. One of the students I spoke to (a social psychologist) complained that these guys were modelling social systems but not humans; another (a computer scientist interested in robotics) suggested the models were too ‘reactive’ rather than ‘proactive’. Pertinent comments I think, and ones that made me realise that really understanding what was going on would require me to take a step back and look at the broader modelling panorama.

Some of the toughest comments from the school attendees were levelled at the Geographer’s model (or “virtual Geography”), which attempts to capture the patterns of population growth observed for European cities using a mechanistic approach based on the representation of economic processes. The main criticism was that the large parameter space of this model (i.e. a large number of interacting parameters) makes the model very difficult to analyse, interpret and understand. Such criticisms were certainly valid and have previously been made by other modellers of geographic systems. However, the same criticisms could not be levelled at the physicists’ (and psychologist’s) models, simply because their models have far fewer parameters.

And so this, I think, is one of the problems that the social psychologist and computer scientist alluded to: the majority of the models arising from the techniques of physics (and mathematics) are generally interested in the properties of the system as a whole, and not in individual interactions and components. One or two key state variables (variables used to describe the state of the system) are reported and analysed. But actually, there’s nothing wrong with this approach given the nature of their models, based as they are on very simple assumptions and largely homogeneous in the agents, actors and interactions they consider.

Such an approach didn’t sit well with the social psychologist because the agents being modelled are supposed to be representative of humans, and humans are individuals that make decisions based on their individual preferences and understandings. The computer scientist didn’t want to know about broad decision-making strategies – he wants his robot to be able to make the right decision in individual, specific situations (i.e. move left and survive, not right and fall off a cliff). Understanding the broad system properties of homogeneous agents and interactions is no good to these guys.

It’s also why the Geographer’s model stood out from the rest – it actually tries to recreate European urban development (or more specifically, to “simulate the emergence of a system of cities functionally differentiated from initial configurations of settlements and resources, development parameters and interaction rules”). It’s a model that attempts to understand the system within its context. [The one other model presented that did address a specific system within its context was the Economist’s (“virtual archaeology”) model of the population dynamics of the lost Kayenta Anasazi civilisation in Arizona. This model also has a large parameter space but performed well, largely (I’d suggest) because it was driven by such good data for parameterisation (though some parameter tuning was clearly needed).]

So no, there is nothing wrong with an approach that considers homogeneous agents, actors and interactions with simple rules. It’s just that these models are more divorced from ‘reality’ – they are looking at the essence of the system properties that arise from the simplest of starting conditions. What is really happening here is that systems not previously modelled – because of the problems of quantitatively representing systems of ‘middle numbers’ (i.e. systems with too many elements and interactions for simple modelling and analysis, but too few for statistical mechanics to be useful) – are now being broken down for analysis. The attitude is “we have to start somewhere, so let’s start at the bottom with the simplest cases and work our way up”. Such an approach has recently been suggested for the advancement of social science as a whole.

This means our “virtual Geographies” and “virtual Landscapes” will still be hampered by huge parameter spaces for now. But what if we try to integrate simple agent-based models of real systems into larger models of systems that we know to be more homogeneous (‘predictable’?) in their behaviour? This is the problem I have been wrestling with regarding my landscape model – how do I integrate a model of human decision-making with a model of vegetation dynamics and wildfire? From the brief discussion I’ve presented here (and some other thinking) I think the most appropriate approach is to treat the agent-based decision-making model like the physicists do – examine the system properties that emerge from the individual interactions of agents. In my case, I can run the model for characteristic parameter sets, examine the composition (i.e. “how much?”) and configuration (i.e. “how is it arranged spatially?”) of the land cover that emerges, and use this to constrain the vegetation dynamics model.
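To make that last step concrete, here’s a minimal sketch of the kind of summary statistics I have in mind. The code is hypothetical (numpy, with random grids standing in for real agent-based model output), not my actual model:

```python
import numpy as np

def composition(landcover):
    # proportion of the landscape in each cover class ("how much?")
    classes, counts = np.unique(landcover, return_counts=True)
    return dict(zip(classes.tolist(), (counts / landcover.size).tolist()))

def like_adjacency(landcover):
    # fraction of horizontally/vertically adjacent cell pairs sharing a
    # class - a crude index of spatial configuration ("how clumped?")
    horiz = landcover[:, :-1] == landcover[:, 1:]
    vert = landcover[:-1, :] == landcover[1:, :]
    return (horiz.sum() + vert.sum()) / (horiz.size + vert.size)

# stand-in for agent-based model output: several runs of one
# 'characteristic' parameter set, each a 50x50 grid of cover classes
rng = np.random.default_rng(0)
runs = [rng.integers(0, 3, size=(50, 50)) for _ in range(10)]

# summary statistics that could then constrain the vegetation model
mean_comp = {c: np.mean([composition(r).get(c, 0.0) for r in runs]) for c in (0, 1, 2)}
mean_adj = np.mean([like_adjacency(r) for r in runs])
print(mean_comp, mean_adj)
```

The point is not these particular metrics – landscape ecology offers many more – but that the agent-based model is summarised by a few emergent statistics rather than coupled to the vegetation model agent-by-agent.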

So, the summer school was very interesting: I got to meet many people from very different academic backgrounds (physicists, mathematicians, computer scientists, cognitive scientists, psychologists, sociologists, economists…) and discuss how they approach their problems. I think this has given me a broader understanding of the types and uses of models available for studying complex systems. Hopefully I’ll be able to put some of this understanding of different techniques to good effect in the future when studying the interaction between social and environmental systems.

The complex systems approach does offer many possibilities for the investigation of social systems. However, for the study of humans and society this sort of modelling will only go so far. We’ll still need our sociologists, ‘human’ geographers, and the like to study the qualitative aspects of these systems, their components and interactions. After all, real people don’t like being labelled or pigeon-holed.

Millington 2006 Book Chapter

I’ve just received the offprint from the book chapter I wrote with George Perry and Bruce Malamud and have posted it on my website.

Millington, J.D.A., Perry, G.L.W. and Malamud, B.D. (2006) Models, data and mechanisms: quantifying wildfire regimes. In: Cello, G. and Malamud, B.D. (Eds.) Fractal Analysis for Natural Hazards. Geological Society, London, Special Publications.

Abstract
The quantification of wildfire regimes, especially the relationship between the frequency with which events occur and their size, is of particular interest to both ecologists and wildfire managers. Recent studies of cellular automata (CA) and the fractal nature of the frequency–area relationship they produce have led some authors to ask whether the power-law frequency–area statistics seen in the CA might also be present in empirical wildfire data. Here, we outline the history of the debate regarding the statistical wildfire frequency–area models suggested by the CA and their confrontation with empirical data. In particular, the extent to which the utility of these approaches is dependent on being placed in the context of self-organized criticality (SOC) is examined. We also consider some of the other heavy-tailed statistical distributions used to describe these data. Taking a broadly ecological perspective, we suggest that this debate needs to take more interest in the mechanisms underlying the observed power-law (or other) statistics. From this perspective, future studies utilizing the techniques associated with CA and statistical physics will be better able to contribute to the understanding of ecological processes and systems.
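For anyone unfamiliar with the statistics at issue: the (non-cumulative) frequency–area relationship in question takes the power-law form f(A) ∝ A^−β, where A is the burned area of an individual fire. As a minimal, hypothetical sketch (synthetic sizes with an invented exponent, not real fire data), the standard maximum-likelihood estimate of β works like this:

```python
import numpy as np

# synthetic fire sizes drawn from f(A) ~ A^-beta for A >= a_min
# (inverse-CDF sampling; beta and a_min are invented for illustration)
rng = np.random.default_rng(1)
a_min, beta = 1.0, 1.4
sizes = a_min * (1.0 - rng.random(5000)) ** (-1.0 / (beta - 1.0))

# standard maximum-likelihood estimator for a power-law exponent
beta_hat = 1.0 + sizes.size / np.log(sizes / a_min).sum()
print(beta_hat)  # should land close to the 'true' beta of 1.4
```

Whether a power law (rather than some other heavy-tailed distribution) actually fits empirical wildfire data, and what mechanisms would produce it, is exactly what the chapter discusses.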

Bill Cronon: Secular Apocalypse


I saw this photo a couple of days ago. It’s a comparison of the state of a Chilean glacier in 1928 and 2004. The glacier is retreating by 14 metres per year, a retreat scientists attribute to the warming of the global climate. At that rate the glacier could be gone in 25 years. Look at that panorama though – wouldn’t it be great to go and see it before it’s gone? Imagine if you were stood there confronted by this awesome sight: what would you be thinking? Greenpeace have been pretty sneaky though (as they have a right to be), using beautiful photos like these that stick in the mind; if I arrived at that vista I might just think, “I contributed to this”.

I made a point of going to see Bill Cronon at the Thursday morning plenary “Narrative of climate change” at the RGS conference. He suggested that narratives of climate change have been used as both prediction AND (secular) prophecy. This idea of a secular prophecy comes from recent invocations of Nature as a secular proxy for God. Prophecies are often told as stories of the retribution that will be incurred if God’s laws are broken. If Nature is a proxy for God, then climate change is portrayed as retribution for humans breaking the laws of Nature.

Cronon suggests that Global Narratives are abstract, virtual, systemic, remote and vast, have a diffuse sense of agency, possess no individual characters (i.e. no heroes/villains), and are repetitive (and so boring). These characteristics make it difficult to emphasise the need for, and justify calls for, human action to mitigate the anthropic influence on the climate. Cronon suggests these types of prophetic narrative are ‘unsustainable’ because they do not offer the possibility of individual or group action to reverse or address global climate problems, and are therefore of no use politically or socially.

Cronon went on to discuss the micro-cosms (micro-narratives) Elizabeth Kolbert uses in her book “Field Notes from a Catastrophe” to illustrate the impacts of global change in a localised manner. She uses individual stories chosen because they are unexpected and non-abstract – the antithesis of the unsustainable global narratives. He concluded that we need narratives that offer hope, not narratives tied to social and political models based on anarchic thought that fail to address the systemic issues driving the change itself. This, he suggests, is the political challenge – to create narratives that not only make us think “I contributed to this” when we see evidence of glacier retreat, but that offer us hope of finding ways to reduce our future impact upon the environment.

RGS 2006: Day One

I went to the participatory techniques showcase session on the first day of the RGS annual conference. Nick Lunch (from Insight) made an interesting presentation on Participatory Video – something they call a “community empowerment tool”, but also, it seems to me, a way of eliciting local knowledge and understandings. I’d suggest that when modelling the interaction between local communities and their environment, this would be a good way for the modeller to improve their understanding of what the problem is and of which key variables and parameters need to be considered in a model.

Nick also said that Insight have found that one of the technique’s best uses is as a catalyst to ‘do things’ and initiate local action within communities. I can see why this might be the case – I’ve found this blog enables me to ‘get things done’ too. It’s given me the confidence just to start writing and prompted me to record my thought processes better (both on and off the blog) – something I haven’t been strict enough about with myself during the PhD modelling project. This is definitely a lesson learned from my PhD work and something I want to make sure I do better in the next project I tackle.

It also helps to “crystallise one’s thoughts”, as one colleague put it. I often have several ideas swirling around in my head at once, and usually have only a general ‘impression’ of how they relate. But it’s not until I write them down that I really understand – writing something in prose really demands that the idea is properly understood. The process of writing clearly aids the process of understanding. And whilst writing in prose helps to shore up these loosely tied ideas, coding demands an even more explicit understanding. This is where I see the worth of the process of generating a simulation model in itself.

Depending upon how much a modeller wishes to publish online, a blog might be an interesting way to demonstrate the modelling process, and a way to document and highlight the dead ends a modelling project often finds itself following. Mike Bithell suggested, in a presentation later on the first day of the conference, that the limitations of modelling often cannot be explored without going through the process of producing a model itself. From some of the issues I’ve encountered in my own modelling exploits, I can understand what he means.

Energy for Free

Defying the laws of thermodynamics? Sceptical, very sceptical…

Some interesting discussions in their forum about potential consequences however. For example, how long before the oil companies put a stop to this nonsense and buy them out to shut them up? Or do they think this is so lacking in credibility they’ll leave it to self-combust…

tanfastic

I just saw an advert, the strapline of which was “Holiday memories can fade fast, but your tan needn’t”.

What? If your memories fade faster than your tan it sounds like you had a pretty boring holiday to me…

Norman Maclean, Young Men and Fire

Book Review

I expect I was wheeling my bike through the tourists, guzzling my choco-milk after a session in the gym. Those book stalls under Waterloo Bridge on the South Bank get me every time. This one must have jumped at me, out of the titles and authors streaming by, me the fly.

I’m sure it was because I’d read Norman Maclean before – A River Runs Through It, the story of Montana fly-fishing. But the title also intrigued me: Young Men and Fire. It wasn’t until I’d parted with my £3.50 and begun reading just recently that I discovered how intrigued I would become.

Other reviews will offer you a better description of the story and more evocative excerpts, so I’ll concentrate less on the story and more on the storytelling. The story is of the events of 5th August 1949, when 12 USFS smokejumpers died on a fire in Mann Gulch, Montana, and of Maclean’s efforts to understand those tragic events years later. The telling is part story, part history, part science.

“Historical questions the storyteller must face, although in a place of his own choosing, but his most immediate question as he faces new material is always, Will anything strange or wonderful happen here? The rights and wrongs come later and likewise the scientific know how.”

The first third of the book is the story of the tragedy. It’s only later that the detective story begins, where we start “… examining how all the little cockeyed things all fit together to explain one big cockeyed thing”. This is where Maclean begins to suggest that not only did the events of the day happen because ‘everything was just right’, but that the route to discovering what happened also depended on everything being ‘just right’. The process of discovery is often as historically contingent as the history itself.

Maclean describes his ‘ah-ha!’ moment, his ‘eureka!’, when he thought “that’s funny”. On a boat, trying to piece together in his mind the bits of the puzzle of how and why those men got caught by the fire, he sees a wave on the water ‘going the wrong way’ – or rather, going in a direction he wasn’t expecting because of the winds that come and go.

Wind is the whip of the fire, spurring it on, pointing the way. The winds on the day of the fire all came together at the right time across the unique topography of the gulch to cause a ‘blowup’, an explosion of fire throwing flames tens of metres high and accelerating fire spread to speeds faster than a man can run. Faster even than a man running for his life.

Everything that had to fit that day did fit that day – but the evidence of those conditions may still be observed in the broader patterns of the landscape that are shaped by the prevailing winds. Processes acting at different rates and extents leave their evidence at different rates and extents. Maclean saw those patterns one day by chance and thought “that’s funny”.

So just as the tragedy was dependent upon “everything fitting together”, so too is the path or route to discovery? The patterns are there but they must coincide with our observation for us to understand? Or is this just the way we tell the story of discovery, linearizing the complex web of our thought processes? How can we know what subconscious links are being made when we think “that’s funny”? Or is the most important skill knowing when “that’s funny” really is funny?

The scientist in the storytelling is Richard C. Rothermel, he of mathematical fire modelling fame. Maclean asks for Rothermel’s help in using his mathematical models to plot the race between the young men and the fire on axes of distance and time. Maclean seems reasonably confident in the results of the model – the numbers seem to fit with his qualitative understanding of the events. But he’s not totally convinced by the numbers alone. Just as fire requires the triangle of heat, fuel and oxygen, the events and his understanding of them require story, history and science:

“We are beyond where arithmetic can explain what was happening in the piece of nature that had been the head of Mann Gulch … Near the end of many tragedies it seems right that there should be moments when the story stops and looks back for something it left behind and finds it and finds it because of the things it learned, as it were, by having lived through the story.”
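(As an aside for the quantitatively curious: the distance–time ‘race’ Maclean asked Rothermel to plot is straightforward to sketch. All the numbers below are invented for illustration – they are not Rothermel’s figures.)

```python
import numpy as np

# hypothetical distance-time 'race' between the men and the fire front;
# every number here is assumed, purely for illustration
t = np.linspace(0.0, 15.0, 301)    # minutes after the blowup
fire_speed = 120.0                  # fire-front spread rate, m/min (assumed)
men_speed = 80.0                    # running speed upslope, m/min (assumed)
head_start = 300.0                  # men's lead over the fire at t=0, m (assumed)

fire_pos = fire_speed * t               # fire-front position (m)
men_pos = head_start + men_speed * t    # men's position (m)

caught = t[fire_pos >= men_pos]
print(f"fire overtakes the men after ~{caught[0]:.1f} minutes"
      if caught.size else "the men outrun the fire")
```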

Young Men and Fire is quintessentially ‘Direction not Destination’. The route to discovery is important. The modelling is as important as the model. Hindsight is a wonderful thing because of contingency and history. But hindsight is also painful; it allows us to understand the tragedies that befell the young, who could not see it until it was upon them.

_____________________________________

Maclean, N. (1992) Young Men and Fire. Chicago: University of Chicago Press. ISBN: 0226500624

Buy at Amazon