Regional partitioning for wildfire regime characterization

Fighting wildfires is a strategic operation. In fire-prone areas of the world, such as California and the Mediterranean Basin, it is important that managers allocate and position fire trucks, water bombers and human fire-fighters in locations that minimize the response time to reach new fires. Not only is this important for reducing risk to human lives and livelihoods; the financial demands of fighting a prolonged campaign against multiple fires also demand that resources be used as economically as possible.

Characterizing the wildfire regime of an area (the frequency, timing and magnitude of all fires) can be very useful for this sort of planning. If an area burns more frequently, or with greater intensity, on average, fire-fighting resources might be better placed in or near these areas. The relationship between the frequency of fires and the area they burn is one of the characteristics that is particularly interesting from this perspective.

As I’ve written about previously with colleagues, although it is well accepted that the frequency-area distribution of wildfires is ‘heavy-tailed’ (i.e. there are many, many more small fires than large fires), the exact nature of this distribution is still debated. One of the distributions that is frequently used is the power-law distribution. Along with my former advisors Bruce Malamud and George Perry, I examined how this characteristic of the wildfire regime, the power-law frequency-area distribution, varied for different regions across the continental USA (see Malamud et al. 2005). Starting with previously defined ‘ecoregions’ (areas characterized by similar vegetation, climate and topography), we mapped how the frequency-area relationship varied in space, finding a systematic change from east to west across the country.
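
For the quantitatively minded, the exponent of a heavy-tailed distribution like this is commonly estimated by maximum likelihood. Below is a minimal sketch using the standard continuous MLE; the fire sizes are synthetic and purely illustrative, and this is not necessarily the fitting procedure used in Malamud et al. (2005).

```python
import numpy as np

def fit_power_law_exponent(areas, a_min):
    """Continuous maximum-likelihood estimate of beta for a power-law
    tail f(a) ~ a**(-beta), fitted to all fires with area >= a_min."""
    a = np.asarray(areas, dtype=float)
    a = a[a >= a_min]                        # keep only fires in the tail
    n = len(a)
    beta = 1.0 + n / np.sum(np.log(a / a_min))
    stderr = (beta - 1.0) / np.sqrt(n)       # asymptotic standard error
    return beta, stderr

# Synthetic fire sizes (sq km) with exponent ~1.4 by construction
rng = np.random.default_rng(42)
areas = 0.01 * (1.0 - rng.random(10_000)) ** (-1.0 / 0.4)

beta, se = fit_power_law_exponent(areas, a_min=0.01)
print(f"estimated exponent: {beta:.2f} +/- {se:.2f}")
```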

More recently, Paolo Fiorucci and colleagues (Fiorucci et al. 2008) have taken a slightly different approach. Rather than starting with pre-defined spatial regions and calculating the frequency-area distribution of all the fires in each region, they have devised a method that splits a large area into smaller regions based on the wildfire data for the entire area. Thus, they use the data to define the spatial differentiation of regions with similar wildfire regime characteristics a posteriori rather than imposing the spatial differentiation a priori.

Fiorucci and his colleagues apply their method to a dataset of 6,201 fires (each with an area greater than 0.01 sq km) that burned between 1987 and 2004 in the Liguria region of Italy (5,400 sq km). They show that estimates of a measure of the wildfire frequency-area relationship (in this case the parameters of the power-law distribution) for a given area vary significantly depending on how regions within that area are partitioned spatially. Furthermore, they found differences in spatial patterns of the frequency-area relationship between climatic seasons.
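
Their aggregation procedure is more sophisticated than this, but the core point, that the fitted parameters depend on how the area is partitioned, can be sketched with a stand-in clustering step. Everything below (the data, the use of k-means) is invented for illustration and is not the method of Fiorucci et al.:

```python
import numpy as np
from sklearn.cluster import KMeans

def mle_exponent(areas, a_min=0.01):
    """Power-law exponent fitted by maximum likelihood (as above)."""
    a = areas[areas >= a_min]
    return 1.0 + len(a) / np.sum(np.log(a / a_min))

# Two synthetic spatial clusters of fires with different tail exponents
rng = np.random.default_rng(0)
xy = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(5, 1, (500, 2))])
areas = np.concatenate([
    0.01 * (1 - rng.random(500)) ** (-1 / 0.3),   # zone with exponent ~1.3
    0.01 * (1 - rng.random(500)) ** (-1 / 0.6),   # zone with exponent ~1.6
])

print("single region:", mle_exponent(areas))      # one pooled estimate
zones = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(xy)
for z in (0, 1):                                  # per-zone estimates differ
    print("zone", z, ":", mle_exponent(areas[zones == z]))
```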

Using both a priori (the approach of Malamud et al. 2005) and a posteriori (the approach of Fiorucci et al. 2008) spatial delineation of wildfire regime areas, whilst simultaneously considering patterns in the processes believed to be driving wildfire regimes (such as climate, vegetation and topography), will lead to better understanding of wildfire regimes. That is, future research in this area would be well advised to look at the problem of wildfire regime characterization from both perspectives – data-driven and process-driven. The approach developed by Fiorucci et al. also provides much promise for a more rigorous, data-driven approach to making decisions about the allocation and positioning of fire-fighting resources.

Citation and Abstract
Fiorucci, P., F. Gaetani, and R. Minciardi (2008) Regional partitioning for wildfire regime characterization, Journal of Geophysical Research, 113, F02013
doi:10.1029/2007JF000771

Wildfire regime characterization is an important issue for wildfire managers, especially in densely populated areas where fires threaten communities and property. The ability to partition a region by articulating differences in timing, frequency, and intensity of the phenomena among different zones allows wildfire managers to allocate and position resources in order to minimize wildfire risk. Here we investigate “wildfire regimes” in areas where the ecoregions are difficult to identify because of their variability and human impact. Several studies have asserted that wildfire frequency-area relationships follow a power law distribution. However, this power law distribution, or any heavy-tailed distribution, may represent a set of wildfires over a certain region only because of the data aggregation process. We present an aggregation procedure for the selection of homogeneous zones for wildfire characterization and test the procedure using a case study in Liguria on the northwest coast of Italy. The results show that the estimation of the power law parameters provides significantly different results depending on the way the area is partitioned into its various components. These findings also show that it is possible to discriminate between different wildfire regimes characterizing different zones. The proposed procedure has significant implications for the identification of ecoregion variability, putting it on a more mathematical basis.

US-IALE 2009: Coupling Humans and Complex Ecological Landscapes

Coupling Humans and Complex Ecological Landscapes is the theme of the 2009 annual conference of US-IALE (U.S. Regional Association, International Association for Landscape Ecology). The conference will be held in Snowbird, Utah, from April 12 to 16, 2009. Proposals for symposia and workshops are due September 15, 2008, and abstracts are due November 17, 2008.

Several types of financial support for attending and presenting at the conference are available:

(1) the “Sponsored Student Travel Awards Program” of local sponsors (USGS, Utah State University, and Utah Department of Natural Resources),

(2) US-IALE’s ‘Foreign Scholar Travel Award’ Program,

(3) the ‘NASA-MSU Professional Enhancement Awards Program’ (supported by NASA and Michigan State University), and

(4) the ‘CHANS Fellows Program’ of the new International Network of Research on Coupled Human and Natural Systems (CHANS-Net, supported by NSF, see background papers in Science and Ambio).

US-IALE conferences are particularly student-friendly, with two popular programs (Lunch with Mentors and the NASA-MSU dinner) and a new program, “We’ll Pick Up The Tab!”.

More information about the conference is available from the web site.

Upper Peninsula Adventures


I’m back from fieldwork in Michigan’s Upper Peninsula. It was quite a short, but eventful, trip to get some forest stand cruises going – lightning, flat tyres, and an incident with some angry bees (we escaped with only a couple of stings). Unfortunately I didn’t have my camera on hand to record any of these (mis)adventures. Now, on with preparing my Systems Modeling and Simulation course for the fall and coding our model to integrate with FVS.

Effective Modelling for Sustainable Forest Management

In many forest landscapes a desirable management objective is the sustainability of both economic productivity and healthy wildlife populations. Such dual-objective management requires a good understanding of the interactions between the many components and actors at several scales and across large extents. Computer simulation models have been enthusiastically developed by scientists to improve knowledge about the dynamics of forest growth and disturbance (for example by timber harvest or wildfire).

However, Papaik, Sturtevant and Messier write in their recent guest editorial for Ecology and Society that “models are constrained by persistent boundaries between scientific disciplines, and by the scale-specific processes for which they were created”. Consequently, they suggest that:

“A more integrated and flexible modeling framework is required, one that guides the selection of which processes to model, defines the scales at which they are relevant, and carefully integrates them into a cohesive whole”.


This new framework is illustrated by the papers in the Ecology and Society special feature ‘Crossing Scales and Disciplines to Achieve Forest Sustainability: A Framework for Effective Integrated Modeling’.

The papers in the special feature provide case studies that reflect two interacting themes:

  1. interdisciplinary approaches for sustainable forest landscape management, and
  2. the importance of scaling issues when integrating socioeconomic and ecological processes in the modeling of managed forest ecosystems.

These issues relate closely to the project I’m currently working on, which is developing an integrated ecological-economic model of a managed forest landscape in Michigan’s Upper Peninsula. One paper that caught my eye was by Sturtevant et al., entitled ‘A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs’.

Sturtevant et al. suggest that forest managers are generally faced with a “devil’s choice”: using generic ‘off-the-shelf’ models, where information flows primarily from researchers and planners down to local communities, or developing case-specific models designed for a specific purpose or locale and based on information from the local actors. To avoid this choice, which Sturtevant et al. believe will seldom produce a satisfactory management outcome, they outline their proposal for a hybrid ‘toolkit’ approach. Their alternative approach “builds on existing and readily adaptable modeling ‘tools’ that have been developed and applied to previous research and planning initiatives”.

Their toolkit approach is:

  1. collaborative – including stakeholders and decision-makers, and
  2. a ‘meta-modelling’ approach – the model is derived from other models and tools.

They then illustrate their toolkit approach using a case study from Labrador, Canada, highlighting the stages of establishing the issues, developing a conceptual model, implementing the meta-model, and then refining the model iteratively. They conclude:

“A toolkit approach to SFM [Sustainable Forest Management] analytical support is more about perspectives on information flow than on technical details. Certainly expertise and enabling technology are required to allow a team to apply such a framework. However, the essence of this approach is to seek balance between top-down (off the shelf, science-driven) and bottom-up (case-specific, stakeholder-driven) approaches to SFM decision support. We aim to find a pivot point, with adequate information flow from local experts and stakeholders to scientists, while at the same time avoiding “reinventing the wheel” (e.g. Fig. 1) by making full use of the cumulative experience of scientists and tools they have constructed.”

Although this ‘meta-model’ approach may save time on the technical model building side of things, many resources (time, effort and money) will be required to build and maintain relationships and confidence between scientists, managers and local stakeholders. This approach is really a modelling toolkit for management, with very little emphasis on improving scientific understanding. In this case the modelling is the means to the end of integrative/participatory management of the forest landscape.

The authors continue:

“The mixture of local experts and stakeholders who understand how the tools work, scientists who are willing and able to communicate their science to stakeholders, and integrated analytical tools that can simulate complex spatial and temporal problems will provide powerful and efficient decision support for SFM.”

Unfortunately, unless the scientists in question have the explicit remit to offer their services for management purposes, this sort of modelling approach will not be very appealing to them. In a scientific climate of ‘publish or perish’, management outcomes alone are unlikely to be enough to lure the services of scientists. In some cases I’m sure I will be wrong and scientists will happily oblige. But more generally, unless funding bodies become less concerned with tangible outputs at specific points in time, and academic scientists are judged less strictly by their publishing output, this situation may be difficult to overcome.

This situation is one reason the two sides of the “devil’s choice” are better developed, at the expense of the ‘middle-ground’ toolkit approach. ‘Off-the-shelf’ models, such as LANDIS, are appealing to scientists because they allow the investigation of more abstract and basic science questions than those asked by forest managers. The development of ‘customized’ models is appealing to scientists because they allow more detailed investigation of underlying processes and provide a framework for the collection of empirical data. No doubt the understanding gained from these approaches will eventually help forest managers – but not in the manner of direct decision-support that the toolkit modelling approach proposes.

As a case in point, the ‘customized’ Managed Forest Landscape Model for Michigan I am working on is raising questions about underlying relationships between deer and forest stand structure. I’m off into the field this week to get data collection started for just that purpose.

JASSS Paper Accepted

This week one of the papers I have been working on as a result of my PhD research has been accepted for publication in the Journal of Artificial Societies and Social Simulation (JASSS). The paper, written with Raúl Romero-Calcerrada, John Wainwright and George Perry, describes the agent-based model of agricultural land-use decision-making we constructed to represent SPA 56 in Madrid, Spain. We then present results from our use of the model to examine the importance of land tenure and land use on future land cover and the potential consequences for wildfire risk. The abstract is below, and I’ll post again here when the paper is published and online.

An Agent-Based Model of Mediterranean Agricultural Land-Use/Cover Change for examining Wildfire Risk

James D. A. Millington, Raúl Romero-Calcerrada, John Wainwright, George L.W. Perry
(Forthcoming) Journal of Artificial Societies and Social Simulation

Abstract
Humans have a long history of activity in Mediterranean Basin landscapes. Spatial heterogeneity in these landscapes hinders our understanding about the impacts of changes in human activity on ecological processes, such as wildfire. Use of spatially-explicit models that simulate processes at fine scales should aid the investigation of spatial patterns at the broader, landscape scale. Here, we present an agent-based model of agricultural land-use decision-making to examine the importance of land tenure and land use on future land cover. The model considers two ‘types’ of land-use decision-making agent with differing perspectives; ‘commercial’ agents that are perfectly economically rational, and ‘traditional’ agents that represent part-time or ‘traditional’ farmers that manage their land because of its cultural, rather than economic, value. The structure of the model is described and results are presented for various scenarios of initial landscape configuration. Land use/cover maps produced by the model are used to examine how wildfire risk changes for each scenario. Results indicate land tenure configuration influences trajectories of land use change. However, simulations for various initial land-use configurations and compositions converge to similar states when land-tenure structure is held constant. For the scenarios considered, mean wildfire risk increases relative to the observed landscape. Increases in wildfire risk are not spatially uniform however, varying according to the composition and configuration of land use types. These unexpected spatial variations in wildfire risk highlight the advantages of using a spatially-explicit ABM/LUCC.
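
For a flavour of how the two agent ‘types’ differ, here is a heavily simplified caricature in code. The land-use options, profit values and abandonment probability are invented for illustration and are not taken from the actual model:

```python
import random

LAND_USES = ["cereal", "pasture", "abandoned"]   # hypothetical options

def expected_profit(use):
    """Stand-in profit function; the real model is far richer."""
    return {"cereal": 1.0, "pasture": 0.6, "abandoned": 0.0}[use]

def choose_land_use(agent_type, current_use):
    if agent_type == "commercial":
        # perfectly economically rational: always the most profitable use
        return max(LAND_USES, key=expected_profit)
    else:  # 'traditional' part-time farmer
        # values the land culturally: keeps the current use, abandoning
        # only with a small (invented) probability, e.g. through ageing
        return current_use if random.random() > 0.05 else "abandoned"

print(choose_land_use("commercial", "pasture"))   # -> 'cereal'
print(choose_land_use("traditional", "pasture"))  # usually 'pasture'
```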

Creating a Genuine Science of Sustainability

Previously, I wrote about Orrin Pilkey and Linda Pilkey-Jarvis’ book, Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future. In a recent issue of the journal Futures, Jerome Ravetz reviews their book alongside David Waltner-Toews’ The Chickens Fight Back: Pandemic Panics and Deadly Diseases That Jump From Animals to Humans. Ravetz himself points out that the subject matter and approaches of the books are rather different, but suggests that “Read together, they provide insights about what needs to be done for the creation of a genuine science of sustainability”.

Ravetz (along with Silvio Funtowicz) has developed the idea of ‘post-normal’ science – a new approach to replace the reductionist, analytic worldview of ‘normal’ science. Post-normal science is a “systemic, synthetic and humanistic” approach, useful in cases where “facts are uncertain, values in dispute, stakes high and decisions urgent”. I used some of these ideas to experiment with some alternative model assessment criteria for the socio-ecological simulation model I developed during my PhD studies. Ravetz’s perspectives toward modelling, and science in general, shone through quite clearly in his review:

“On the philosophical side, the corruption of computer models can be understood as the consequence of a false metaphysics. Following on from the prophetic teachings of Galileo and Descartes, we have been taught to believe that Science is the sole and certain path to truth. And this Science is mathematical, using quantitative data and abstract reasonings. Such a science is not merely necessary for achieving genuine knowledge (an arguable position) but is also sufficient. We are all victims of the fantasy that once we have numerical data and mathematical argument (or computer programs), truth will inevitably follow. The evil consequences of this philosophy are quite familiar in neo-classical economics where partly true banalities about markets are dressed up in the language of the differential calculus to produce justifications for every sort of expropriation of the weak and vulnerable. ‘What you can’t count, doesn’t count’ sums it all up neatly. In the present case, the rule of models extends over nearly all the policy-relevant sciences, including those ostensibly devoted to the protection of the health of people and the environment.

We badly need an effective critical philosophy of mathematical science. … Now science has replaced religion as the foundation of our established order, and in it mathematical science reigns supreme. Systematic philosophical criticism is hard to find. (The late Imre Lakatos did pioneering work in the criticism of the dogmatism of ‘modern’ abstract mathematics but did not focus on the obscurities at the foundations of mathematical thinking.) Up to now, mathematical freethinking is mainly confined to the craftsmen, with their jokes of the ‘Murphy’s Law’ sort, best expressed in the acronym GIGO (Garbage In, Garbage Out). And where criticism is absent, corruption of all sorts, both deliberate and unaware, is bound to follow. Pseudo-mathematical reasonings about the unthinkable helped to bring us to the brink of nuclear annihilation a half-century ago. The GIGO sciences of computer models may well distract us now from a sane approach to coping with the many environmental problems we now face. The Pilkeys have done us a great service in providing cogent examples of the situation, and indicating some practical ways forward.”

Thus, Ravetz finds a little more value in the Useless Arithmetic book than I did. But equally, he highlights that the Pilkeys offer few, rather vague, solutions and instead turns to Waltner-Toews’ book for inspiration for the future:

“Pilkey’s analysis of the corruptions of misconceived reductionist science shows us the depth of the problem. Waltner-Toews’ narrative about ourselves in our natural context (not always benign!) indicates the way to a solution.”

Using the outbreak of avian flu as an example of how to tackle complex environmental problems in the ‘risk society’ in which we now live, Waltner-Toews:

“… makes it very plain that we will never ‘conquer’ disease. Considering just a single sort of disease, the ‘zoonoses’ (deriving from animals), he becomes a raconteur of bio-social-cultural medicine …

What everyone learned, or should have learned, from the avian flu episode is that disease is a very complex entity. Judging from TV adverts for antiseptics, we still believe that the natural state of things is to be germ-free, and all we need to do is to find the germs and kill them. In certain limiting cases, this is a useful approximation to the truth, as in the case of infections of hospitals. But even there complexity intrudes …”

This complexity demands an alternative perspective, one that moves beyond ‘normal’ science to a post-normal science (to play on Kuhn’s vocabulary of paradigm shifts):

“That old simple ‘kill the germ’ theory may now be derided by medical authorities as something for the uneducated public and their media. But the practice of environmental medicine has not caught up with these new insights.

The complexity of zoonoses reflects the character of our interaction with all those myriads of other species. … the creatures putting us at risk are not always large enough to be fenced off and kept at a safe distance. … We can do all sorts of things to control our interactions with them, but one thing is impossible: to stamp them out, or even to kill the bad ones and keep the good ones.

Waltner-Toews is quite clear about the message, and about the sort of science that will be required, not merely for coexisting with zoonoses but also for sustainable living in general. Playing the philological game, he reminds us that the ancient Indo-European world for earth, dgghem, gave us, along with ‘humus’, all of ‘human’, ‘humane’ and ‘humble’. As he says, community by community, there is a new global vision emerging whose beauty and complexity and mystery we can now explore thanks to all our scientific tools.”

This global vision is a post-normal vision. It applies to far more than just avian flu – from coastal erosion and the disposal of toxic or radioactive waste (as the Pilkeys discuss, for example) to climate change. This post-normal vision focuses on uncertainty, value loading, and a plurality of legitimate perspectives that demands an “extended peer community” to evaluate the knowledge generated and decisions proposed.

“In all fairness, it would not be easy to devise a conventional science-based curriculum in which Waltner-Toews’ insights could be effectively conveyed. For his vision of zoonoses is one of complexity, intimacy and contingency. To grasp it, one needs to have imagination, breadth of vision and humility, not qualities fostered in standard academic training. …”

This post-normal science won’t be easy and won’t be learned or fostered entirely within the esoteric confines of an ivory tower. Science, with its logical rigour, is important. It is still the best game in town. But the knowledge produced by ‘normal’ science is provisional, and its march toward truth seems Sisyphean when confronted with the immediacy of complex contemporary environmental problems. To contribute to the production of a sustainable future, a genuine science of sustainability would do well to adopt a more post-normal stance toward its subject.

Creating our Future

“The future is ours, not to predict, but to create.”

– Al Gore, 16th June 2008

Hear, hear. Spoken in the context of climate change, this might be interpreted as a slight against the general circulation models used by scientists. Rather, I think it should be interpreted as an indication that Gore understands that we need to move past discussions about whether we can use such models to ‘prove’ whether climate change is actually happening, and instead act to mitigate undesired change.

This does not mean computer simulations of earth systems become redundant, however – they are still useful tools for improving our knowledge about systems that are so large (spatially) as to prevent empirical experimentation. But we do need to remember that in ‘open’, middle-number systems (which the majority of global environmental systems are), proving the ‘truth’ of a model by comparing model results with empirical data is a logical fallacy. In such circumstances, a ‘post-normal’ approach to the use of computer simulation models, and to the wider issue of climate change, would be more useful. This view is gaining recognition.

Prometheus has more detailed discussion of prediction, forecasting and decision-making in relation to climate change.

Read the full transcript of Gore’s speech, or watch the section in which he addresses climate change below.

Model Types for Ecological Modelling

Sven Erik Jørgensen introduces a recent issue of Ecological Modelling that presents selected papers from the International Conference on Ecological Modelling in Yamaguchi, Japan (28 August – 1 September 2006). The paper provides an overview of the model types available for ecological modelling, briefly highlighting the shift from a dominance of bio-geo-chemical dynamic models and population dynamics models in the 1970s toward the application of a wider spectrum of models. The emergence of new model types has come as a response to questions such as:

  • How can we describe the spatial distribution, which is often crucial to understanding ecosystem reactions?
  • How do we model middle-number systems?
  • How do we model heterogeneous populations and databases (e.g. observations from many different ecosystems)?
  • How do we model ecosystems when our knowledge is mainly based on a number of rules/properties/propositions?

Jørgensen suggests there are at least 11 types of model currently available for modelling ecological systems (purely mathematical and statistical models aside):

  1. Bio-geo-chemical and bio-energetic dynamic models
  2. Static models
  3. Population dynamic models
  4. Structurally dynamic models
  5. Fuzzy models
  6. Artificial neural networks
  7. Individual-based models and cellular automata
  8. Spatial models
  9. Ecotoxicological models
  10. Stochastic models
  11. Hybrid models

Of these, my particular interest is in spatial models, individual-based models and cellular automata models (with a passing interest in population models). This is largely because of my background in geography and landscape ecology, but also because of the heterogeneity in patterns, processes and behaviour often exhibited in socio-ecological systems.

Jørgensen offers a short description of each type, before listing their advantages and disadvantages. Here are a couple with my comments in italics:

Individual-Based Models (IBMs) and Cellular Automata (CA)
First, counter to Jørgensen, I would argue that CA models should be placed with the ‘spatial models’ – for me, the ability of CA to represent space outweighs their potential to represent (limited) heterogeneity between cells. This aside, their grouping does make sense when we consider that these models can be relatively easily combined to represent individuals’ interactions across space and with a heterogeneous environment (via the CA).
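
As a toy example of that combination (entirely generic, not tied to any model Jørgensen reviews), a handful of individuals can move over, and feed back on, a heterogeneous cell-based environment in a few lines:

```python
import numpy as np

rng = np.random.default_rng(1)
grid = rng.random((20, 20))              # CA: heterogeneous resource per cell
agents = [tuple(rng.integers(0, 20, 2)) for _ in range(10)]  # IBM: positions

for step in range(100):
    grid += 0.01 * (1.0 - grid)          # CA rule: resources regrow logistically
    for i, (r, c) in enumerate(agents):
        grid[r, c] *= 0.5                # individual consumes local resource
        # individual rule: move to the richest neighbouring cell (wrapping)
        nbrs = [((r + dr) % 20, (c + dc) % 20)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
        agents[i] = max(nbrs, key=lambda p: grid[p])
```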

Advantages

  • Are able to account for individuality – agreed, especially for IBMs
  • Are able to account for adaptation within the spectrum of properties – yes
  • Software is available, although the choice is more limited than for bio-geo-chemical dynamic models – but excellent free modelling environments such as NetLogo make this type of modelling widely available
  • Spatial distribution can be covered – yes

Disadvantages

  • If many properties are considered, the models get very complex – and may require the adoption and development of new techniques to present/analyse/interpret output (e.g. POM, narratives)
  • Can be used to cover the individuality of populations, but they cannot cover mass and energy transfer based on the conservation principle – I see no reason why the principle of energy and mass conservation could not be respected by models of these types (see the sketch after this list)
  • Require many data to calibrate and validate the models – yes, this is often the case, and in some cases (again) may require new approaches and types of data to calibrate and evaluate models
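
On the conservation point, here is a minimal sketch (all quantities invented) of how an individual-based model can respect mass/energy conservation, simply by making every transfer debit one pool and credit another:

```python
import numpy as np

cells = np.full(100, 10.0)     # energy stored in each grid cell
agents = np.zeros(20)          # energy carried by each individual

def feed(agent_idx, cell_idx, amount=1.0):
    """Move energy from a cell to an agent; nothing is created or lost."""
    taken = min(amount, cells[cell_idx])
    cells[cell_idx] -= taken
    agents[agent_idx] += taken

total_before = cells.sum() + agents.sum()
for i in range(20):
    feed(i, i * 5)
assert np.isclose(cells.sum() + agents.sum(), total_before)  # conserved
```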

Spatial Models
Advantages

  • Cover spatial distribution, which is often of importance in ecology – yes, particularly in Landscape Ecology, an entire discipline that has arisen since the 1970s and ’80s
  • The results can be presented in many informative ways, for instance via GIS – though GIS is a means to organise and analyse data as well as to present it

Disadvantages

  • Usually require a huge database giving information about the spatial distribution – this can certainly give rise to the issue of ‘model but no data’ and increases the costs of performing ecological research by adding space to time. We have found that data collection for our large (~4,000 sq km) Upper Michigan study area demands considerable time and resources.
  • Calibration and validation are difficult and time-consuming – maybe more so than for non-spatial models, but probably not as much as for some individual-based models
  • A very complex model is usually needed to give a proper description of the spatial patterns – not necessarily. A model should be only as complex as the patterns and processes it seeks to examine, and the inclusion of space does not imply patterns or processes any more complex than those of a non-spatial system with fewer variables or interactions.

This isn’t a bad review of the types of ecological modelling being done. However, more incisive and useful insights could have been offered with respect to landscape ecology and those models that are now beginning to attempt to account for human activity in ecological systems. [And it definitely could have been better written.] Maybe I’ll stop criticising sometime and write one myself, eh?

Aldo Leopold Legacy Center – The ‘Greenest Building in the US’

One of the fieldtrips we took during the US-IALE conference in Madison was to Aldo Leopold’s shack and the Aldo Leopold Legacy Center. Aldo Leopold is considered by many to be the ‘father’ of wildlife management. His most significant and lasting mark is his book, A Sand County Almanac. I’ll look at the book in a later post, but here I’ll talk briefly about what we saw on our excursion from Madison.

After graduating from the Yale Forest School in 1909, Aldo Leopold spent time working in Arizona and New Mexico before moving to Madison, Wisconsin, in 1924. In 1933 he published the first wildlife management textbook and accepted a new chair in game management at the University of Wisconsin – a first for both the university and the nation.

In 1935, Leopold and his family initiated their own ecological restoration experiment on a washed-out sand farm of 120 acres along the Wisconsin River near Baraboo, Wisconsin. Planting thousands of pine trees, restoring prairies, and documenting the ensuing changes in the flora and fauna informed and inspired Leopold. Many of his writings in the initial parts of A Sand County Almanac – the history of the local region as told through the rings of an oak tree, evening shows of sky-dancing woodcock, fishing the Alder Fork, hunting ruffed grouse in smoky gold tamarack – were penned in ‘the shack’ (above) on his farm, which we stopped at on a wet, grey day after visiting the Aldo Leopold Legacy Center (below).

In sharp contrast to ‘the shack’, the Legacy Center feels solid and dry. But consistent with the Land Ethic message of the writing that was done in the old dilapidated building, the new building ‘sustains the health, wildness, and productivity of the land, locally and globally’. The Legacy Center has received Platinum Leadership in Energy and Environmental Design (LEED) Certification from the U.S. Green Building Council and is currently the ‘greenest building in the U.S.’.

The Legacy Center is an example of how we can use energy more efficiently and construct buildings with a limited impact on our environment. Through energy efficiency and renewable energy, the Legacy Center is the first carbon-neutral building certified by LEED; annual operations account for no net gain in carbon dioxide emissions.

The Legacy Center is also a net-zero energy building, using 70 percent less energy than a building built just to code and meeting all of its energy needs on site using tools like a roof-mounted solar array and a ‘thermal flux zone’ to reduce heat flow between interior rooms and the outdoors. Many of the structural columns, beams, and trusses, as well as interior panelling and finish work, are from the pine trees Leopold planted himself on his farm between 1935 and 1948.

This really is a building that embodies Leopold’s Land Ethic – both conceptually, through the principles used when designing the building, and physically, by using material from Leopold’s own ecological restoration experiment. The Legacy Center contains the offices of the Leopold Foundation, has a small shop and ‘museum’ about Leopold, and can be hired for meetings. The building itself is really the attraction – and hopefully more like it will appear elsewhere. Unless you’re passing by or really want to make a pilgrimage to gain an insight into the area where Leopold’s vision unfolded, there’s no need to go out of your way to visit. Take a virtual tour instead to save energy and carbon and make the building even greener.

A Young Scientist’s Guide To Gainful Employment

A recent article that ranked #1 on the Bulletin of the Ecological Society of America’s Top Ten was led by Anita Morzillo, a former student in Fisheries and Wildlife at MSU. The article, entitled ‘A Young Scientist’s Guide To Gainful Employment: Recent Graduates’ Experiences And Successful Strategies’, is based on a workshop supported by the NASA-MSU Professional Enhancement Award Program and has some wise words for any junior researcher starting out on their academic career. It’s written with ecologists and biologists in mind, but much of the advice is likely to apply to other fields.

The paper is organized into four areas:

  1. Self promotion. What can I do prior to and during the job hunt?
  2. Personal considerations. How will both my professional and personal lives affect which jobs I should apply for?
  3. The application process. What should I expect when applying?
  4. Keeping it all in perspective. What if my application is rejected?

Section 1 considers publications, the importance of experiences beyond research, the ‘elevator speech’, getting your name recognised, and your network of personal connections. Section 2 discusses the necessity (or otherwise) of PhD and post-doctoral experience, issues around the geographic location of jobs, balancing professional and personal life, and issues regarding the careers of ‘significant others’. Section 3 then addresses the job application process, from learning about the process before applying to phone and on-site interviews. The final section reflects on challenging situations such as competing against ‘superstar’ applicants for positions and the need for perseverance in certain circumstances.

The paper concludes:

“Since we all are responsible for taking the initiative to forge our career path, our goal was to share our perspectives on and experiences with several broad themes involved in a job search. Do not hesitate to start thinking about the job hunt early in your career as a graduate student. Each position that you consider will offer unique opportunities to build your resume or curriculum vitae, and will present personal and professional trade-offs. Take time to think about and proactively discuss both professional and personal factors, but also keep in mind that you control only so much of the process. Good luck!”