Conservation’s Smoking Gun: Who Bears the Cost of Making Us ‘Evidence-Based’?

Jensen Montambault | 9/6/2012

Today, it’s hard to think about medicine being practiced on anything but evidence (except on the TV show “House”).

But as late as 40 years ago, medicine still relied as much on tradition and myth as it did on randomized and quasi-randomized controlled trials or massive cohort studies, accessible in peer-reviewed publications and technical reports and strained through a methodological sieve. It took a 1972 monograph by the Scottish doctor Archie Cochrane to spur a widespread movement in medicine toward the scientific method. Cochrane — whose name now adorns a huge database of systematic medical reviews, a center in Oxford and an international research NGO — dedicated his life to making medicine evidence-based after seeing countless interventions during the Spanish Civil War and World War II that had no data to back them.

Today, a typical Cochrane Collaboration review might use the outcomes of 18 trials of pre- and neonatal care to recommend a best-practices package of the interventions with evidence showing they reduce infant and maternal mortality… and with detailed, summarized results free and open to the public.

Imagine a menu like this for oyster reef restoration, sustainable fisheries, conservation easements — pick your conservation flavor! Restoration managers in California declared these kinds of published synthetic reviews the single most useful and available tool for choosing an intervention — even better than web databases or calling up a friend/expert (Seavy and Howell 2010).1

Such utility was the dream of the Collaboration for Environmental Evidence (CEE) founders when they launched CEE in 2003. It’s hard to find fault with their logic, which linked directly to the widely lauded medical model (Pullin and Knight 2001). But the application of the medical review model to conservation has not quite panned out as hoped. While the Cochrane Collaboration boasts more than 5,000 completed reviews, CEE has only 44 completed reviews, with an additional three by sister organization Conservation Evidence (CE). Admittedly, the Cochrane Collaboration began in 1993, so it had a little head start. However, the low numbers for conservation naturally raise the question: Why isn’t this taking off?

The Lack of an Institutional Base and the ‘Evidence Myth Trap’

First, the political and other institutional structures that provide a solid base for action in the U.K. medical world, Cochrane’s home turf, are absent for global conservation (Segan et al. 2011). Cochrane searches global evidence, but the problems it tries to solve are relatively uniform, unlike conservation’s.

Second, the Cochrane project relies on reliable published medical trials. You can quibble about the definition of “reliable” (and the medical-evidence community more than quibbles; see the recent argument about Cochrane reviews and the evaluation of mass deworming policies in developing countries), but rates of publication of biomedical research are way higher than in conservation-related fields.2 Publication rates for conservation and applied ecology are slower even than in sister fields such as taxonomy, behavior, evolution and genetics (Kareiva et al. 2002). There is simply less evidence to draw from.

But we are also caught in an evidence myth trap. When I talked to one of the CE founders last December, he told me the best thing TNC could do was convince staff to write up results related to their pre-identified themes. This presents a very different cost/benefit structure, in my mind, than harvesting themes from already published research or organizing scientists around conservation topics of TNC interest. In addition, the published price of CEE reviews is between US$30,000 and US$300,000, which might help explain the low number of synthetic reviews available in conservation. Despite the enthusiasm of the 50 scientists and managers who attended the meta-analysis session at the last TNC all-science meeting, it seems unlikely that a single program will bear that cost for the good of the order. This challenge isn’t about a culture of valuing evidence; it’s a question of who is championing what evidence and why.

A perhaps more tractable model is that proposed by the Global Environment Facility: special funding windows for “contributing to global conservation [by] …test[ing] and evaluat[ing] the hypotheses embedded in project interventions” (Ferraro 2012). Several conservation programs/organizations could collaborate to test a common intervention in need of evidence — say, upstream conservation to improve downstream ecosystem services.

Conservation may never be as standardized and coordinated as Western medicine, but it may not have to be. More rapid turnaround, open-access and conservation-friendly journals, accompanied by funding initiatives that promote interagency cooperation and learning, might be enough to reasonably intertwine our daily work and the value of science.


By Jensen Montambault, applied conservation scientist, Central Science, The Nature Conservancy

Image credit: winterofdiscontent/Flickr

References:

Cook, C.N., M. Hockings, and R.W. Carter. 2010. Conservation in the dark? The information used to support management decisions. Frontiers in Ecology and the Environment 8:181-186.

Ferraro, P.J. 2012. Experimental Project Designs in the Global Environment Facility: Designing projects to create evidence and catalyze investments to secure global environmental benefits. A STAP Advisory Document. Global Environment Facility, Washington, D.C.

Kareiva, P., M. Marvier, S. West, and J. Hornisher. 2002. Slow-moving journals hinder conservation efforts. Nature 420:15.

Pullin, A.S., and T.M. Knight. 2001. Effectiveness in conservation practice: pointers from medicine and public health. Conservation Biology 15:50-54.

Seavy, N.E., and C.A. Howell. 2010. How can we improve information delivery to support conservation and restoration decisions? Biodiversity and Conservation 19:1261-1267.

Segan, D.B., M.C. Bottrill, P.W. Baxter, and H.P. Possingham. 2011. Using conservation evidence to guide management. Conservation Biology 25:200-202.

Notes

1. Although a different survey in Australia found that the use and availability of evidence are highly variable among managers of different types of conservation areas (Cook et al. 2010).

2. Using data from the National Research Council’s “A Data-based Assessment of Research-Doctorate Programs in the United States” (accessed 16 August 2012), the biomedical fields were Biology/Integrative Biology/Integrated Biomedical Science; Immunology and Infectious Disease; Nursing; Pharmacology, Toxicology, & Environmental Health; and Public Health. The conservation fields were Forestry and Forest Sciences, Agriculture and Resource Economics, and Ecology and Evolutionary Biology. Data were compared using Welch’s ANOVA for unequal variances (p < 0.001) in JMP 10.0.
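
The comparison described in note 2 was run in JMP 10.0 on the NRC data. For readers who want to reproduce the general approach without JMP, the snippet below is a minimal Python sketch: with only two groups (biomedical vs. conservation fields), Welch’s ANOVA reduces to Welch’s unequal-variance t-test, which SciPy provides. The publication-rate numbers here are placeholders for illustration only, not the NRC figures.

    # Minimal sketch of an unequal-variance comparison of two groups of fields.
    # The values below are hypothetical placeholders, not the NRC data.
    from scipy import stats

    # Hypothetical per-field publication rates (e.g., papers per researcher per year)
    biomedical = [2.1, 1.8, 2.4, 1.9, 2.6]    # five biomedical fields
    conservation = [0.9, 1.2, 0.7]             # three conservation-related fields

    # Welch's t-test (equal_var=False) does not assume equal variances,
    # and is equivalent to Welch's ANOVA when there are only two groups.
    t_stat, p_value = stats.ttest_ind(biomedical, conservation, equal_var=False)
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")

With more than two groupings of fields, a full Welch’s ANOVA (or another heteroscedasticity-robust one-way test) would be the closer analogue to the JMP analysis.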