John Romano, Global Fellow, International Program, New York City
Despite the return of the chilling polar vortex this week, things have begun to heat up at United Nations Headquarters in New York as a critical UN process on global Sustainable Development Goals continues to unfold.
The UN Open Working Group on Sustainable Development Goals (OWG) – the intergovernmental body tasked with putting forward recommendations for a universal set of global Sustainable Development Goals (SDGs) to replace the expiring Millennium Development Goals (MDGs) in 2015 – has just concluded the first stocktaking phase of its work, with the group convening eight consultative sessions in the past year. Last week, the group’s co-chairs – Ambassador Csaba Kőrösi of Hungary and Ambassador Macharia Kamau of Kenya – released two documents to the group: one stocktaking report that summarizes the OWG’s discussions to date, and another identifying so-called “focus areas” tagged by the co-chairs as having particular significance in the discussions thus far.
Over the past year, discussions in the Open Working Group have focused largely on specific issues, such as energy, poverty eradication, water, health, oceans, forests and many others. There has been relatively less attention paid to the implementation architecture for these goals.
The “focus areas” document provides a foundation for building consensus on the contents of a report the OWG is tasked with submitting to the UN General Assembly in September. This final report from the OWG will include recommendations for a full suite of goals that can then be worked into the final set of SDGs that will be adopted in 2015. Member States resume their deliberations in the second phase of the OWG next week, with discussions centering on the two stocktaking documents circulated by the co-chairs. While it is clear from these documents that the UN has a long way to go to find consensus on a comprehensive yet concise set of SDGs, we were encouraged by language on multi-stakeholder initiatives and partnerships, climate change, and cities.
Highlighting the critical role that a “New Global Partnership” will play in the SDGs, the focus areas document asserts that a “global partnership for development has been emphasized as key to unlocking the full potential of sustainable development initiatives.” Unfortunately, both documents reflect the lack of consensus by failing to offer a clear outline or definition of this “New Global Partnership.” We support the view of the UN Secretary-General’s High-Level Panel, which called for a “New Global Partnership” that harnesses the full potential of partnerships among governments at all levels, businesses, civil society, and a wide range of other stakeholders, moving beyond but complementing traditional approaches to action, such as Official Development Assistance (ODA) and foreign aid.
Also encouraging was the emphasis on the need for a system of regular monitoring and reporting for these initiatives and partnerships, which is insufficient as it stands. Finally, the focus areas document highlights the need for improved coordination around these multi-stakeholder initiatives and partnerships, particularly between governments and the work of the UN.
The focus areas document also affirms that “climate change poses a grave threat to sustainable development and poverty eradication.” While there is little consensus among countries on how to integrate the issue of climate change into the SDGs, the focus areas document highlights the interlinkages between climate change and nearly every other issue, underscoring how critical it is to address this cross-cutting issue in the agenda. As the Secretary-General’s High-Level Panel report reminds us, “Above all, there is one trend – climate change – which will determine whether or not we can deliver on our ambitions.”
Reflecting the extensive and stimulating discussions in the OWG on the topic of sustainable cities, the focus areas document underscores the importance of incorporating an urban element into the SDGs, saying that “sustainable cities and settlements, including settlements of indigenous communities, will be central in addressing socio-economic and environmental challenges and in building resilient societies.” We will very likely fall short of achieving our overall aspirations for these SDGs if special attention is not given to cities, where over 60 percent of the world’s population will live by 2030 – and particularly in the developing world, which is expected to represent over 80 percent of the world’s urban population by 2030.
Over the critical months ahead, NRDC will continue to advocate for a New Global Partnership that includes all stakeholders to drive the transformative change that is so desperately needed. What has been made clear is that these SDGs – and their supporting architecture – must reinvigorate commitments and political will from world leaders to fight climate change and promote sustainable development, and they must mobilize billions of people worldwide to contribute to a sustainable future.
Click here for NRDC’s proposal for a “New Architecture for a New Global Partnership” that will be critical to delivering the transformative actions, accountability and change that the SDGs hope to catalyze for a sustainable future.
Franz Matzner, Associate Director of Government Affairs, Washington, D.C.
A painstaking four-year process to design a non-controversial but meaningful energy bill took another step forward today with the introduction of the latest version of the bipartisan Energy Savings and Industrial Competitiveness Act, commonly known as the Shaheen-Portman bill. The measure takes a commonsense approach to cutting energy waste in buildings, industry, and the federal government while saving billions of taxpayer dollars, creating tens of thousands of jobs, and cutting carbon pollution.
The measure has wide support and was developed through extensive review by a broad group of stakeholders seeking to increase investments in energy efficient buildings and technologies.
According to the sponsors, the newest version is estimated to create as many as 190,000 jobs, save $16 billion annually, and avoid emissions equivalent to taking 22 million cars off the road.
Introduced by Sens. Jeanne Shaheen (D-N.H.) and Rob Portman (R-Ohio), the 2014 version has garnered increased support, which could signal that Congress will be able to move forward with closing the doors and windows on the wasteful ways we construct, fuel, and manage buildings and homes.
However, for that to happen, Republican members will have to forgo the unrelated, controversial “poison pill” amendments that derailed previous versions.
It’s time to take delivery on the bill sponsors’ hard work and check extreme amendments at the door so we can begin implementing important efficiency gains that would benefit all Americans.
Without this measure, the businesses, industries, government and homes in which we live and work will continue to contribute to our national energy and pollution problems rather than help us address and solve them. We owe it to our children to address climate change, and this bill helps pave the way to cutting electricity use and the carbon pollution that is warming our planet and harming our health.
2014 Changes of Note:
The latest version of the Shaheen-Portman bill includes a number of new provisions that support greater investment in energy efficiency, including:
- The SAVE Act, which requires federally backed residential mortgage loans to begin to account for the borrower’s expected utility expenses. Agencies such as Fannie Mae, Freddie Mac, and FHA will make adjustments in the underwriting and appraisal process to recognize houses that are more energy efficient, which leads to lower energy expenses. (In 2012, more than 100,000 new homes sold were high-performance Energy Star houses). It makes sense for the federal mortgage agencies to recognize these important market trends.
- Establishing a voluntary Tenant Star certification and recognition program to promote energy efficiency in leased commercial building space during design and occupancy. Like the Better Buildings Act of 2013 (HR 2126), which recently passed a House committee, and consistent with the results of NRDC’s High Performance Tenant Demonstration Project, it would save money for tenants and landlords by decreasing energy use. America’s commercial buildings are responsible for over $20 billion in annual energy expenses, of which tenant-leased spaces typically represent 50 percent or more of a building’s overall energy use. Tenant Star is widely endorsed by private industry and energy efficiency advocates.
- Requiring the Office of Management and Budget to collaborate with federal agencies to promote energy efficiency in data centers and other information technology. Federal data centers account for at least 10 percent of all U.S. data center energy use. This costs American taxpayers $600 million a year in electricity bills and is equivalent to the annual output of three 500-megawatt coal-fired power plants, producing 3 million tons of carbon dioxide annually.
For a more comprehensive analysis of the bill, see NRDC’s Factsheet.
photo credits: solar panels on house courtesy Serge Bertasius photography; Solar panels courtesy of franky242.
Kate Poole, Senior Attorney, San Francisco
Every day, the media is filled with new stories about California’s historic drought – cities facing the imminent threat of no drinking water; ranchers scrambling to feed herds grazing on brown stubble; migrating birds searching in vain for wetlands along the Pacific Flyway; and fish eggs drying out by the thousands on desiccated river banks. There is no question that the drought is imposing hardships on cities, farms and the environment, and that different solutions may be called for to address those hardships. But there is one obvious solution that benefits everyone and that we should all be able to agree on: nobody should be allowed to steal water that is not rightfully theirs in times of drought.
That’s why it is especially perplexing that a group of California legislators is objecting to clamping down on water thieves during the drought in this February 24, 2014 letter sent to Governor Brown. Water theft is a big problem in parts of the State. Recognizing the significant problems caused by illegal water diversions associated just with marijuana cultivation in California, the California Farm Bureau Federation recently asked its national affiliate to tackle the issue. Similarly, this story reports that state Fish and Wildlife wardens discovered multiple illegal water diversions on the Trinity River in just one sweep, featuring hundreds of feet of black PVC snaking through wooden boxes and diesel pumps, and concrete dams blocking tributaries. Here’s a photo of an illegal diversion taken by Department of Fish and Wildlife personnel in the South Fork Eel watershed, on a tributary that has steelhead trout, which need cold, clean flows to survive:
Right now, the State Water Board has limited authority to punish water thieves in times of drought. The Board can issue fines of no more than $500 per day, which is not much of a deterrent when water is scarce and its value sky-high, as in times of drought. Under the package of urgency drought legislation put forward by Governor Brown and legislative leaders, the Board would have authority to issue higher penalties for violating water rights during a drought, and could issue cease and desist orders to stop people from stealing water that belongs to another.
NRDC strongly supports the urgency drought legislation that state leaders have pulled together, including its modest improvements to State Board enforcement authority in times of drought. We should all be able to support improving our ability to prevent water theft when there is so little water to go around. At the very least, a handful of nay-sayers should not be permitted to scuttle this important package of drought relief measures that provides food and housing assistance to displaced farmworkers, emergency drinking water supplies, and substantial investment in local water self-reliance.
Luis Martinez, Senior Attorney, Energy and Transportation Program, Asheville, North Carolina
This week, independent researchers from Wake Forest University confirmed that 35 million gallons of toxic coal ash had spilled into the Dan River from the Duke Energy coal ash dump in Eden, North Carolina. That makes it the third-largest coal ash spill ever in the United States. Across our state, another 13 of these giant coal ash dumps have been leaking contamination into our water.
Earlier this year, we all heard about the spill of 10,000 gallons of MCHM (a toxic chemical used in processing coal) in West Virginia that left over 300,000 people without water. The spill paralyzed whole communities, shutting down businesses and leaving families in a desperate search for sources of potable water. A month after the spill, residents say that the odor from the chemical remains in the water, and the truth is we don’t even know how dangerous the chemical is to our health.
There are still no estimates of how long or how costly the clean-ups will be, but we can be certain they will require many years and many millions of dollars, not including the damage to the communities, their health, their air and their water. These unfortunate events remind us just how important it is to get the rules right BEFORE risky energy development moves into our backyards. Fortunately, there’s one mess we have a chance to get ahead of in North Carolina—fracking.
Just last week, we helped to release a new 30-second video, narrated by James Taylor -- a North Carolina native and NRDC Trustee -- about the rush to allow reckless fracking here. As you read this, North Carolina’s Mining and Energy Commission is pushing to open the state up to fracking with weak regulations that will leave citizens vulnerable to air pollution, water contamination and other dangers. Under these rules, property owners could be forced to tolerate fracking (literally) in their own backyards, and against their wishes. Citizens would be kept in the dark about what chemicals are being injected into their property. And without a plan for disposal, large volumes of waste water generated by fracking operations could end up in dumps next to rivers and streams or get injected underground, alongside our aquifers.
As we’ve seen fracking explode across the country, more and more alarms are going off about the dangers to communities affected by these operations. The natural gas that is stored underground in North Carolina, if there is any, will not be going anywhere – there’s no sense in rushing ahead recklessly. The state should take the time to evaluate the risks, and determine whether and how to protect North Carolinians from falling victim to them.
Allison Clements, Senior Attorney, Project for Sustainable FERC Energy Policy, New York
The Federal Energy Regulatory Commission (FERC) has once again demonstrated its commitment to removing unfair market barriers standing in the way of the grid flexibility necessary to incorporate high levels of renewable energy like wind and solar. It did so last week by issuing an order clarifying aspects of Order 784. Order 784, which FERC originally issued last year, tackled the tricky issue of whether and how to allow for competition in the sale of “ancillary services.” More about these services in a moment, but first, some background:
The Changing Energy Markets’ Landscape
The 2013 Renewable Futures Study by the National Renewable Energy Laboratory (NREL) predicts that by 2050, 80 percent of the nation’s electricity can come from wind, solar, and other renewable resources. To reliably accommodate that much clean energy, which can be variable due to unexpected weather, the study also predicts a need for significant, but feasible, increases in system flexibility. NREL writes that “system flexibility can be increased using a broad portfolio of supply- and demand-side options and will likely require technology advances, new operating procedures, evolved business models, and new market rules.”
The problem with existing energy market rules, in large part, is that they were designed around a generation portfolio consisting almost exclusively of central-station, mostly fossil-fueled resources like coal and oil. Over time, the increasing presence of wind and solar power, demand response, distributed (onsite) generation and even energy efficiency has rendered the rules that govern energy markets outdated and, in some cases, discriminatory. The rules do not contemplate the operational characteristics of these emerging clean resources. As a result, these resources cannot compete to provide grid services even though they can often provide them more effectively and/or more cheaply than traditional generators.
During former FERC chair Jon Wellinghoff’s tenure (he left his post last November), FERC issued a series of legacy orders (including Orders 719, 745, and 755) that made real progress in addressing these increasingly unjust rules in wholesale energy, capacity and ancillary service markets. Order 784 and last week’s Order on Clarification are the most recent of those orders. So, now back to ancillary services.
The Role of Ancillary Services
Ancillary services are grid “products” that transmission owners and customers must supply or procure, in addition to energy, in order to ensure the reliable delivery of energy. These services keep supply and demand balanced in all time frames (moment to moment, hour to hour, and beyond), assure start-up capability after blackouts, and ensure reserves in case of emergency. Historically, coal, oil and nuclear generation have depended on large amounts of contingency reserves – the type of ancillary services necessary in case of the unexpected outage of a large power plant. The integration of renewable energy resources depends more on ancillary services that balance supply and demand on the grid when, for example, the wind unexpectedly speeds up or slows down (see Brendan Kirby's time graph below). It turns out that energy storage and other demand-side resources can often react faster and switch direction more quickly than large oil, coal or even gas generating units, thereby providing more flexibility to the system.
In many regions of the country there are no centralized markets for ancillary services. Transmission owners that are required to supply these services either self-supply or buy them via contract from another party. The nature of the bilateral markets in these regions, together with the nature of FERC’s rules around competition, has made it difficult for suppliers of energy storage and other demand-side resources to break into the market.
Order 784’s Reforms
Order 784 intends to break down these barriers by making it easier for energy storage and other resource owners to obtain the designation of “market based rate authority” necessary to compete for sales. The rule also requires all transmission-owning utilities to consider the “speed and accuracy” of resources providing ancillary services, a change that should mean good news for fast-acting and flexible energy storage. Finally, the rule clarifies some utility accounting procedures, making it easier to deploy and procure energy storage services. Last week’s Order on Clarification elucidated certain technical aspects of the rule’s implementation but did nothing to infringe upon the rule’s intent.
The rule represents a potentially significant contribution to the facilitation of NREL’s 80 percent renewable energy scenario. (See more about how we get there here). Happily, energy storage, demand response, and other demand-side resources not only help to integrate renewables, but can help avoid extending the lives of unnecessary marginal generation or building expensive transmission upgrades. FERC continues to do its part to set the stage for the transition to a clean and reliable transmission grid.
Dylan Gasperik, Program Assistant, Communications, Santa Monica, California
This new project is attractive, inventive, and productive. It throws back to the New Deal idea of putting artists to work in service of the country. See America is a collection of posters celebrating the National Parks. Because America needs its parks, the world needs more art, and artists need to eat.
I interviewed Max Slavkin, one of the founders of The Creative Action Network, the startup design collective behind See America, so that you could hear about it straight from the source. My favorite things he says are italicized so I can comment on them below.
I am so impressed by the great diversity of styles represented by this project. Where do you find all these talented designers?
We’ve been growing our Creative Action Network (CAN)’s community of designers since our first campaign in 2008, Design For Obama. It’s a diverse community, ranging from students looking to get their name out, to retired professionals looking for a way to give back, and everyone in between. There are artists with deep connections to the places they’ve illustrated, and there are some who have never even been. It’s been fun to see that diversity reflected in their work.
I love how democratic it is! Print design-publishing for the social network generation.
I had never actually heard of the original “See America” campaign before this. Where did this idea to re-create it come from?
The idea behind See America is to recreate the New Deal arts projects of the 1930s. There are so many parallels between what was going on in the country then and now - record unemployment, increasing income inequality, corrupting relationships between Congress and corporations, and a young president elected to effect change and put America back to work. As part of FDR's larger Works Progress Administration effort to hire the unemployed to build roads, schools, post offices, and more, the government also hired artists to make posters to share important messages with the country. The project gave thousands of artists jobs, influenced our national behaviors, and the artwork lives on as iconic pieces of American culture some 80 years later.
That parallel has been drawn before, and I know that Obama admires FDR. It’s amazing to see that this particular stimulus aimed at artists is gaining traction and growing organically.
NPCA has really embraced the project and they are getting a ton of great publicity from it. How did that relationship come together?
We had wanted to do See America for a while, and reached out to a number of organizations about partnership. I actually reached out to NPCA on Twitter without any relationship, shared our proposal for the project, and the rest is history! It was a very new sort of undertaking for them: as one of the oldest and largest organizations fighting for our parks, they do a lot of good work, but they don’t traditionally run crowdsourced art projects. Early on they understood the potential to engage a new generation of activists, both in the artists themselves, and in the wide audience that we could reach online with new social technologies. I don’t think we or they expected, however, just how many posters would come in so fast, or how good they’d be.
What an amazing Twitter success story! I use Twitter a lot for scanning news headlines, and I've scored free concert tickets too, but that's a real business relationship, begun in 140 characters. Cool.
What kind of threats are National Parks facing as they near their century anniversary?
Far too many, and of two distinct types. There are ecological threats like climate change (which here in California includes drought and wildfire) that require resources and attention to grapple with. Sadder still are the man-made threats, like government shutdowns and congressional budget cuts, that strip away the very resources needed to protect and preserve our parks. Our goal with these posters is to remind people of the beauty of what we’re fighting for, and inspire new generations to get involved in preserving it for the future.
Why should the parks pay for our financial mismanagement? They were here first.
Ken Burns called the National Parks “America’s Best Idea.” What’s so great about it?
Haha. I think he had it right. There’s obviously the natural beauty of the parks themselves - I can’t think of anywhere or anything better and more wonderful than those landscapes. But more fundamental is what they represent, the American idea that our best belongs to all of us, and not to the privileged few. The idea that we’re better together is what drove America to designate those lands, and what today, for me, makes the full crowdsourced collection so much more beautiful than any one individual image.
Parks for the people! Not just for this generation but all following. Yes. Let’s expand this idea to marine sanctuaries and protected areas, and hey, why not clean water and air resources, mineral resources… You get it. Natural resources belong to everyone, and they are ours to protect.
What’s next in art and advocacy for the Creative Action Network?
Good question! We’re working now to amplify See America and get these designs in gift shops at the parks on new products like t-shirts and calendars. As for our next campaign, we’re considering a few options, and would love to hear your ideas!
Rock on Max.
Laurie Johnson, Chief Economist, Climate and Clean Air Program, Washington, DC
Today NRDC, Environmental Defense Fund, New York University’s Institute for Policy Integrity, and the Union of Concerned Scientists submitted joint comments (click here and here) commending the government for counting climate change benefits in proposed carbon pollution standards. These benefits are measured by the “social cost of carbon” (SCC).
The SCC estimates damages from more extreme weather and other climate impacts, and is the number used to estimate the economic value of protective climate measures the Environmental Protection Agency is currently required under law to implement.
A recent scientific update to the SCC triggered a full-on assault against it by the fossil fuel lobby and the politicians it backs, because the updated value increased enough to potentially make a difference (more below). Couching their attack in terms of process issues and imprecision in the estimate, industry interests whose bottom line may be affected by carbon standards demanded that the government be prohibited from using the SCC to count any benefits from slowing down climate change.
The attack on cost-benefit analysis is the latest gambit by climate deniers to prevent progress on climate change. After Congress failed to pass comprehensive climate legislation, the denier crowd proposed numerous bills to stop EPA from regulating carbon pollution. After failing at that, too, they’ve now turned to undermining the agency’s rulemaking process by preventing it from counting the benefits of reducing carbon pollution.
This assault on cost-benefit analysis goes against presidential executive orders dating back as far as the Reagan Administration, which direct agencies to quantify as many of the costs and benefits of protective standards as possible to inform rational and cost-effective regulation. Why would a seemingly routine best-practice exercise in cost-benefit analysis create such backlash?
As I’ve covered in previous blogs, the answer is simple: for the first time ever, forthcoming rules will limit carbon pollution coming from the largest source of these emissions: existing power plants. These plants account for fully 40% of CO2 pollution, representing a very large source of profits for the fossil fuel industry.
And herein lies the rub: it turns out that the updated SCC tips the scale against fossil fuels in favor of cleaner energy, even as it excludes many of the worst impacts of climate change (as documented in our comments and in a forthcoming report from the Cost of Carbon Pollution project). Once you add climate change costs (as partially measured by the SCC) to the cost of electricity generation, cleaner energy is cheaper to society than dirty energy. If the new rules sufficiently consider climate costs, the logical regulatory outcome of counting the SCC is further growth in the clean energy economy and reduced profits for the fossil fuel industry.
For all the fanfare about the Administration supposedly trying to manipulate the SCC, the truth is that opponents of carbon standards care more about their profits than our children and grandchildren’s futures. Send a note to OMB in support of counting carbon pollution damages in policy analysis.
Peter Lehner, Executive Director, New York City
Last week, at a coffee farm in Costa Rica, I stumbled on hundreds of butterflies, probably some kind of Heliconius species, all fluttering around a particular spot. At first it was hard to tell if they were coming or going. It’s like that with butterflies. But as we stood and watched, they eventually settled on almost everything around. It looked like they had come to spend the night, but I did not stay to find out. It was quite beautiful.
For many butterfly species, finding a good place to stop and rest isn’t easy. Deforestation, drought, and shifts in global temperature are all altering butterfly habitat. Monarch butterflies in particular face a very specific threat from humans: glyphosate, the weed-killer commonly sold as RoundUp. In the past decade, as the use of this potent chemical has skyrocketed, monarch populations have plummeted. This week, NRDC petitioned the EPA to urgently re-examine how and where glyphosate is used, and find ways to help protect monarch butterflies.
(Photo courtesy of Florida State Archives, via Flickr)
Over the past decade, RoundUp has become the most popular weed-killer in the country. Today’s farms use it to grow Monsanto’s genetically modified “RoundUp Ready” corn and soy, engineered to tolerate the herbicide, which the company also manufactures. Highway and utility crews use glyphosate to control plant growth along roadsides and along utility lines. If you use weed-killer to stop grass from sprouting in your driveway, it might contain glyphosate. Anyone can buy it at the store.
Glyphosate isn’t a selective weed-killer; it harms a lot of plants. One of the plants it’s wiping out is milkweed, the sole food source for monarch butterfly larvae and the only plant on which a monarch will lay its eggs.
From 1999 to 2010, roughly the decade after glyphosate use took off, milkweeds declined 60 percent in the Midwest, and monarch populations fell about 80 percent. Last winter, researchers counted an all-time low of 33.5 million monarchs at their Mexican wintering grounds. This is well below their 1997 high of 1 billion, and just 10 percent of the running average over the past 15 years.
Monarchs are in trouble. Scientists announced this year that the monarch migration—a near-miraculous event, spanning multiple generations over 3,000 miles in a single season—is in danger of disappearing. And there is broad agreement in the scientific community that glyphosate is a major part of the problem. Monarchs reproduce several times over the course of a migration. Without milkweed to sustain each new generation, the migration will fail.
Many of us are sensing the loss already. At a recent meeting, NRDC trustees from Minnesota, Vermont, New York and Texas told me they had seen almost no monarchs last year. And I know I’ve seen fewer, myself.
My wildlife expert colleagues at NRDC are recommending several steps that the EPA can take to protect monarchs. Limiting or banning the use of glyphosate and other harmful weed-killers along roadsides and utility lines, which tend to stretch along migration pathways and could provide important egg-laying habitat for monarchs, would be a relatively quick and easy first step. Milkweed is a pretty short plant and is unlikely to hinder maintenance work if it’s allowed to flourish.
The EPA could also consider requiring safety zones, free of herbicides, in and around farms, to protect monarch-friendly habitat. The agency should also assess how the use of glyphosate for cosmetic purposes is affecting monarchs.
There are other herbicides which are just as harmful, however, so the EPA needs to ensure that whatever replaces glyphosate isn’t just substituting one harm for another.
I’m not sure what particular plant attracted the butterflies at the coffee farm, but there are a number of sweet-smelling plants in the area that they may have liked. Their visit made me more appreciative of the science that goes into figuring all this out.
This butterfly, and hundreds of others just like it, found some friendly habitat at a coffee farm in Costa Rica.
In the case of monarchs, the science is clear. Monarchs need milkweed, and the widespread use of glyphosate is wiping it out. This knowledge gives the EPA an opportunity to curb a direct threat to butterflies. Immediately limiting the use of glyphosate and other herbicides and encouraging a more sustainable approach to farming can help ensure that the monarchs’ astonishing migration will be an event that every generation can witness.
Danielle Droitsch, Director, Canada Project, Washington, D.C.
U.S. Senator Barbara Boxer of California has connected the dots and is pointing to growing evidence that communities living near tar sands mining and drilling operations, pipelines, and refineries face serious health risks. An issue brief published by NRDC, Tar Sands Crude Oil: Health Effects of a Dirty and Destructive Fuel, profiles some of the latest evidence, including scientific research linking tar sands activity to rising levels of air and water pollution that are in turn linked to health problems including cancer. Tar sands development affects communities across North America through a network of mining, drilling, and upgrading operations, pipelines, and refineries. This network spans from northern Canada to refineries in California, the Gulf Coast, and the Midwest. The science is mounting, but state, provincial, and federal governments have done too little to protect public health, and this scientific evidence was not considered in the State Department’s environmental review of the Keystone XL pipeline. It shows there are considerable risks in expanding the tar sands industry.
NRDC's new issue brief reviews the latest scientific literature on this important issue.
Air pollution from tar sands operations in Alberta
Studies by the National Academy of Sciences have noted that expanding tar sands activities have increased air pollution near Fort McMurray (the epicenter of tar sands development) and just outside Edmonton, Alberta. The most recent, a 2014 study, looked at polycyclic aromatic hydrocarbons (PAHs), chemicals known to damage DNA, cause cancer, or cause developmental harm. It found that environmental impact studies drafted by the tar sands industry have systematically underestimated levels of this pollution. A 2013 study found elevated levels of hazardous air pollutants coming from upgrading facilities north of Edmonton, along with elevated rates of leukemia and other cancers in the surrounding areas.
Water pollution from tar sands operations
Researchers have confirmed the presence of elevated levels of toxic polycyclic aromatic hydrocarbons that can be traced directly to the expansion of tar sands production. Some waters in Alberta exceed Canadian standards for chemicals linked to cancer, genetic damage, birth defects, and organ damage. Scientists have also found that tar sands development is leading to increasing amounts of methylmercury in Alberta’s waterways, including an exponential increase within 30 miles of tar sands upgraders. Methylmercury is a potent neurotoxin that causes developmental and behavioral problems.
Tailings ponds, which now cover an area the size of Washington, DC, contain multiple toxic chemicals including arsenic, benzene, lead, mercury, naphthenic acid, and ammonia. As much as 2.9 million gallons of toxic tailings leak into the environment every day. A 2014 study showed that extreme concentrations of PAHs present in tailings may be evaporating into the air and then depositing into water. New federal research by Environment Canada released in February 2014 confirms that leaking tailings ponds are leaching into groundwater and then into the Athabasca River.
Rising cancer rates in First Nations communities
Scientists have confirmed increased incidences of cancer in Fort Chipewyan, Alberta. There, scientists have documented a cancer rate from 1995 to 2009 that was 30 percent higher than would typically be expected. Dr. John O’Conner, an Alberta physician, has for years called for further investigation of these cancer incidences. To date, there has been no independent study of these cancers despite repeated calls by First Nations. Dr. O’Conner was invited by Senator Boxer to speak in Washington to share his observations.
Tar sands pipeline spills
Large quantities of tar sands crude spilled from leaking pipelines into two communities: Marshall, Michigan, in 2010 and Mayflower, Arkansas, in 2013. After the spill in Michigan, 320 people suffered adverse health effects including cardiovascular, dermal, gastrointestinal, neurologic, ocular, renal, and respiratory impacts, according to the Michigan Department of Public Health. In Arkansas, air monitoring showed significantly increased levels of benzene. Raw tar sands crude is mixed with diluting agents to move the substance through pipelines. The specific contents of these diluting agents are unknown because they are proprietary, but most formulations include natural gas liquid condensate containing volatile hydrocarbons such as benzene, toluene, ethylbenzene, and xylene. So far, the federal governments of both Canada and the U.S. have failed to study or regulate the unique chemical mixture of tar sands flowing through pipelines, and have not commissioned any studies of the long-term human impacts of spills.
Tar sands refinery emissions
Chemicals in tar sands may be released as air pollutants during the refining process. Diluted tar sands contain 102 times more copper, 11 times more nickel, and 5 times more lead than conventional crude oil. Diluted bitumen from tar sands has notably higher levels of certain sulfur compounds called mercaptans, which are highly volatile and linked to central nervous system problems. Diluted bitumen also contains higher levels of naphthenic acids, which can significantly increase the corrosive properties of crude oil at the high temperatures of the refining process. Low-quality crudes like tar sands have been identified as a contributing factor in major refinery accidents like the one at the Chevron refinery in Richmond, California, which sent 15,000 residents to area hospitals and endangered the lives of 19 workers.
Petroleum Coke impacts - a byproduct from tar sands refining
The refining of tar sands creates a byproduct called petroleum coke, which contains relatively high concentrations of metals including mercury, lead, arsenic, chromium, selenium, and nickel. People are exposed to these metals when they breathe dust blown from piles of petroleum coke. This metal-laden dust can contaminate nearby homes and yards, where it can accumulate. The dust is composed of particulate matter, which the U.S. EPA recognizes as contributing to a number of negative health effects. Many of the metals in petroleum coke piles are carcinogens and are linked to other health problems.
Health Concerns Deserve More Attention
Federal, state, and provincial agencies should evaluate all of the potential impacts of tar sands crude. In Canada, governments should conduct independent investigations into the health impacts on locally affected communities, particularly Fort McMurray, Fort Chipewyan, and Edmonton, Alberta. New proposals for tar sands operations and infrastructure, including pipelines and refineries, must consider human health impacts, especially as the tar sands industry seeks to triple production. The Final Environmental Impact Statement for the Keystone XL tar sands pipeline did not adequately consider these issues. Until there is a better understanding of how these projects will cumulatively impact human health, efforts to expand the tar sands industry should stop. This means rejecting the proposed Keystone XL tar sands pipeline.
Go to StopTar.org to ask President Obama to reject the pipeline.
Claire O'Connor, Policy Analyst, Santa Monica
My grandfather, Art, is 82 years old. I’d say he’s been farming for about 80 years, give or take a few months. He’s supposedly “retired” now, but he’s still out in the field every day with my dad during planting and harvest season on our corn and soybean farm in Nebraska.
When you farm for upwards of 80 years, you see a lot. When Art was just a little boy, he watched his family’s crops wither up and die, blowing away into the Dust Bowl. Although the drought of the Great Depression is the most infamous and iconic dry period in American history, it wasn’t the last lean time for Art and the rest of my family. The droughts of the 1950s, the 1980s, and, most recently, 2012 challenged my family’s dedication to farming, and, let’s be honest, probably their sanity.
The dry Althouse farm during the 2012 drought
Like my family, California farmers are all too familiar with drought. California farmers are on the front lines when the skies seal up, and they are important partners in the path toward drought resiliency.
According to the USDA, over half of the acres in California still use gravity-based systems to irrigate their crops. In other heavily-irrigated, ag-centric states, such as Nebraska and Texas, fewer than 20% of acres use gravity irrigation. In Kansas, just 7% of acres rely on gravity systems. Why is California, the tech capital of the world, lagging so far behind?
Part of the reason is that, although on-farm irrigation technology has significantly advanced in recent years, our delivery systems that bring water to the fields haven’t kept pace. These aging systems can’t deliver the low volume water that farmers with efficient systems need. It’s not only holding farmers back, it’s causing irrigation districts to lose customers as farmers shift to groundwater, which they can pump at the pace and time they need. When NRDC and the Pacific Institute talked to irrigation districts for our report last fall, several districts mentioned the challenge of losing customers because they weren’t able to provide water in a way that worked with modern irrigation technologies.
Modernizing our water delivery systems will also allow farmers to schedule their irrigation for when crops most need water. Right now, 4,700 California farms still receive their water on a rotational basis, not based on when crops actually need it. That’s more than any other state in the nation, and it’s a huge missed opportunity. Irrigation scheduling reduced water use by about 30 percent without impacting yield in a University of Nebraska demonstration project, and a Driscoll’s berry supplier was also able to reduce water use by 30 percent right here in California by scheduling irrigation based on soil moisture readings.
By 2025, 100 percent of our state’s delivery systems should be able to work with low flow irrigation systems and irrigation scheduling. That’s an ambitious goal, and the State should work with districts to meet it, offering cost-shares and low-interest financing.
Even farmers who don’t have access to modern delivery systems have an important tool for drought resilience right beneath their boots—their soil. Soil conservation is about the oldest drought resilience trick in the book. As the USDA’s Natural Resources Conservation Service points out, healthy soil holds more water, making farms more resilient to dry weather. Each 1% increase in soil organic matter can hold an additional 20,000 gallons of water per acre in the soil profile. You might call it a “dry day fund!”
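That rule of thumb is easy to put into numbers. A minimal back-of-the-envelope sketch (the farm size and organic-matter gain below are hypothetical illustrations; only the 20,000-gallons figure comes from the NRCS):

```python
# NRCS rule of thumb cited above: each 1% increase in soil organic
# matter lets an acre of soil hold roughly 20,000 more gallons of water.
GALLONS_PER_ACRE_PER_PCT_SOM = 20_000

def extra_water_gallons(acres: float, som_increase_pct: float) -> float:
    """Additional gallons of water the soil profile can hold."""
    return acres * som_increase_pct * GALLONS_PER_ACRE_PER_PCT_SOM

# Hypothetical example: a 500-acre farm that builds up 1.5 extra
# percentage points of soil organic matter over several seasons.
print(extra_water_gallons(500, 1.5))  # 15,000,000 gallons
```

Even a modest gain in organic matter, in other words, translates into millions of gallons of stored soil moisture.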
Soil conservation is a relatively inexpensive, yet high reward, investment that we can make as a State. Immediate drought relief efforts should be directed toward soil conservation—cover cropping, residue management and conservation tillage are examples of soil building practices that the State can encourage with drought relief funds.
Farming ain’t easy. But the farmers we have here in California are tough. Despite the challenges they face, California’s farmers still manage to produce half of the fruits and vegetables we eat in this country, and they’ve doubled the revenue generated per acre-foot of water over the last four decades. Although we don’t know how long this current drought will last, we can be pretty sure it’s not the last dry period our farmers will face. Droughts will never be fun. But by utilizing the tools in the Water Conservation Act, modernizing our water delivery system, and investing in our soils, we can take steps to make the next dry period a little less painful.
Amy Mall, Senior Policy Analyst, Washington, D.C.
The Bureau of Land Management (BLM) is responsible for the enormous task of administering oil and gas leasing beneath 258 million acres of BLM land, 57 million acres of non-Federal ownership, and 385 million acres managed by other Federal agencies. For many years NRDC has been working to improve the oil and gas policies at the BLM.
The Bush administration dramatically increased oil and gas development on public wildlands from 2001-2009, helping the industry expand at the expense of the public and our lands, and trying to shortcut environmental review, environmental protection, and enforcement wherever possible. The Energy Policy Act of 2005, among other things, created seven BLM offices in the West that would expedite oil and gas permitting to help industry. Sadly, experts in other fields, such as biology, were reassigned to speed up oil and gas development at the expense of the BLM's other responsibilities to protect the land and its non-mineral values. There was no balance in the BLM's oversight of oil and gas development. The scales were heavily tipped in industry's favor.
That is why I was very disappointed to learn that there is a move in Congress to reauthorize these pilot programs. The BLM doesn't even have the proper staffing now to fully enforce its rules, yet some want to focus on new permitting that will lead to even more wells.
In Washington, the BLM instituted some leasing reforms in the early years of the Obama administration, but they haven't lived up to the promises that came with them. Since then, the BLM has proposed a very weak fracking rule, and is letting too much leasing go on without legally required environmental review. For these reasons and more, NRDC supports a moratorium on fracking on public lands.
While some of the BLM offices around the country have started to increase the environmental protections required under the law, there are still many BLM offices that appear to be operating almost as if President Bush was still in charge: full speed ahead with destructive oil and gas leasing without doing the required environmental reviews. There doesn't appear to be any unified vision about how the BLM should be carrying out its responsibilities.
This is even more frustrating when one compares it to how the BLM is managing renewable energy development on public lands. The offices tasked with permitting renewable energy projects, known as the Renewable Energy Coordinating Offices (RECOs), are starved for funds and staff. Additionally troubling is that the BLM elected to open RECOs in only four states (California, Nevada, Arizona and Wyoming), leaving Western states with bountiful clean energy resources such as Montana and Oregon chronically understaffed and ill-equipped to permit new wind, geothermal, and solar projects.
The bottom line: there is still no balance in the BLM's oversight of federal oil and gas development.
As I mentioned, the BLM is doing some things right in a few locations, for example:
- In northern New Mexico, the BLM has postponed new oil and gas leasing in an area "crisscrossed with streams that feed into the Chama River," and near the drinking water well for a local community.
- In California, the BLM has commissioned an independent, statewide scientific review of the potential oil and gas drilling impacts on the environment, to be led by the California Council on Science and Technology and to be peer-reviewed.
But the BLM is also doing way too many things wrong around the country:
- In eastern Wyoming, the BLM has for years ignored citizen complaints about wells being drilled without the proper environmental review, even though regulations require a response within ten days, and has allowed more than 700 gas wells to be drilled without responding.
- In states across the country, the BLM has been approving new oil and gas development without conducting any environmental review of the impacts of the fracking techniques being used, including horizontal and directional drilling and fracking. These states include: Wyoming, Colorado, North Dakota, Montana, New Mexico, California, Arkansas, Alabama, Louisiana, Mississippi, and Ohio.
- In Alabama, the BLM proposed to lease over 43,000 acres in National Forests without any analysis of the impacts that industrial oil and gas development could have on the forests, wildlife, or on the communities that depend on them for clean air and drinking water.
And the BLM has not done any analysis of the impacts of oil and gas leasing on the health of residents of nearby communities in the lower 48 (an analysis was done in Alaska), despite requirements under federal law.
As I mentioned above, NRDC supports a fracking moratorium on public lands to protect communities, drinking water sources, and our natural resources from the uncontrolled fracking dangers of polluted air, toxic waste, contaminated water, and more. Instead of moving ahead with dangerous fossil fuels, we should be scaling up clean, renewable energy development by boosting energy efficiency and wind and solar energy.
Before fracking continues to expand on public land, the BLM needs a comprehensive overhaul of its policies that govern federal oil and gas development to ensure a 21st century regulatory program that protects every acre of federal land as required by law in a consistent manner. Even a better BLM fracking rule would be only one component of what's needed to properly protect our public lands and all of their natural resources from the damage and destruction that comes with oil and gas development.
Rob Moore, Senior Policy Analyst, Chicago
The National Flood Insurance Program (NFIP) has long been plagued with a host of problems that have diminished its effectiveness for managing the nation’s current flood risks and the future risks of flooding and sea level rise brought on by climate change.
In 2012, Congress took an important first step toward correcting these problems when it passed the Biggert-Waters Flood Insurance Reform Act (Biggert-Waters). Among the much-needed reforms was the phase-out of flood insurance subsidies for several classes of properties, including vacation homes and properties that have repeatedly flooded and been rebuilt at taxpayers’ expense. FEMA was to phase out these subsidies, as well as grandfathered rates for properties that have been remapped into higher-risk flood zones but still allowed to retain their old premiums, and begin charging insurance prices based on the actual risk of a property being flooded.
Tomorrow, the House is expected to take up legislation to reinstate both subsidized and grandfathered rates. The House bill goes a little further than a measure passed earlier by the U.S. Senate. The House bill not only reinstates discounted rates for grandfathered properties but it also extends subsidies to properties that have recently changed hands, keeping subsidies in place for future buyers.
Rolling back the earlier Biggert-Waters reforms and reinstating insurance subsidies is bad public policy. It sets back efforts to prepare for the impacts of climate change and is an over-correction for legitimate concerns about the affordability of flood insurance for people of limited means.
For some, federal flood insurance subsidies provide a legitimate and much-needed safety net. These cases should be addressed. But the House and Senate bills go much further, reinstating generous subsidies for properties that are known to be at greater risk of flooding and properties that have recently changed ownership, even if the owner has the ability to pay for insurance.
It’s important to remember that only 21% (1.2 million) of the properties in the NFIP are subsidized. Currently, 79% of policy holders (4.3 million people) covered by the NFIP already pay risk-based prices. The recent actions by Congress benefit this small minority of property owners and make it easier and cheaper to buy properties in the riskiest, most flood-prone areas.
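Those policy counts are easy to sanity-check; the only inputs in this sketch are the 1.2 million and 4.3 million figures cited above:

```python
# NFIP policy counts cited above.
subsidized = 1_200_000   # properties with subsidized rates
risk_based = 4_300_000   # policy holders already paying risk-based prices
total = subsidized + risk_based

# Share of the program that is subsidized: about 22%, in line with
# the roughly 21% figure in the text.
print(f"{subsidized / total:.0%}")
```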
Flood insurance pricing isn’t just an issue of restoring fiscal solvency to the NFIP; it’s also about sending the right price signal that there is a cost associated with living in a risky area. Using the example in the drawing below, let’s say a buyer is looking at three properties: one right on the beach, one a little ways back, and one a little further inland. The price of flood insurance tells the buyer how much risk they are going to be living with.
Do you think the three properties above are at an equal risk of flooding? Of course not. But the moves by Congress will return us to the days of treating these three properties the same, with the owner of the property at greatest risk getting a taxpayer subsidized incentive to live in a risky, flood prone area.
That is a losing proposition, especially when you consider the increased risk of sea level rise and flooding due to climate change. A study last year commissioned by FEMA found that by the year 2100 coastal communities are expected to see, on average, a 55% increase in high-risk flood areas, mainly along the Eastern seaboard. High-risk flood areas along the nation’s rivers are also projected to increase by 45% by the year 2100, with increases as high as 100% in some riverine areas of the Northwest and along tributaries near the Great Lakes.
The latest actions by Congress represent a major step backwards for efforts to prepare for those impacts of climate change that are now too late to avoid.
Rocky Kistner, Communications Associate, Washington, DC
As the Arctic melts, oceans acidify, storms intensify and crops wither in drought-stricken areas of the globe, it’s easy to feel overwhelmed by the dire consequences of climate change. Faced with the onslaught of polluter cash that lines the pockets of politicians and smears the findings of climate scientists, who can we turn to for a push in the right direction? How will we move toward a clean energy economy—and away from the dirty mix of fossil fuels driving us off the climate cliff?
The youth of the world may be our salvation, or at least a big part of the solution. That’s what dozens of students, environmental activists and labor leaders discussed at the Youth Climate Caucus recently in Washington, part of the Good Jobs Green Jobs Conference sponsored by the BlueGreen Alliance, NRDC and a host of other labor, business and environmental organizations. As my colleague Rob Friedman explains in his blog, it’s high time that we reach out to young people to play a bigger role in solving our rapidly worsening climate crisis—a future they will inherit with even more at stake.
There is widespread recognition across the progressive movement that we need to make a more concerted effort to get young people to the table on a variety of issues related to climate policy and economic resilience. Despite the fact that our generation is the one for which climate change will have the greatest impact, we are rarely engaged in discussions concerning these very important issues. That is for a variety of reasons, including a perception that we lack the necessary knowledge or experience. It is essential that we build space and create opportunities for young people to gather and learn from each other as we build a more just and sustainable energy sector.
Watch this video about the Youth Climate Caucus meeting in Washington.
Young environmental leaders bring much-needed energy and optimism to a deadly serious and complex issue that has political leaders throughout the world tied up in knots. Jobs and environmental advocacy can go hand in hand, they say, but young people need more opportunities to learn the skills that can lead to good jobs in the growing clean energy sector.
“I think we’re really poised to jump on this movement, especially from an economic perspective,” said David Meni, with the Roosevelt Institute Campus Network and a student at George Washington University. “If we’re going to have a green economy, that’s got to be driven by technology, that should really be driven by discoveries that are made here and that should really be made by students who are educated here.”
Everyone who attended the day-long session stressed the importance of making their voices heard. Kwanesha Love, a DC high school student and a member of the Alliance for Climate Education, put it this way:
“I feel as though the youth has a voice, and the youth should be able to use their voice and understand that they can help fix the problem that we’re going through now with climate change and pollution and make the world a better place for our future generations.”
This is the kind of attitude and enthusiasm sorely needed to solve the world’s most vexing environmental problems. The truth is, clean energy jobs are growing in virtually every sector of the economy. Just check out this interactive job-tracking website from Clean Energy Works For US, sponsored by Environmental Entrepreneurs, and see for yourself (click on the map to get up-to-date job info).
Participants at the Youth Climate Caucus Photo: Rocky Kistner/NRDC
These are the kinds of jobs many young people want: rewarding work in a fast-growing industry that’s good for the economy and good for our health and environment. Young folks increasingly say that for the sake of their future, there isn’t any other choice.
Their voices deserve to be heard.
Jennifer Sass, Senior Scientist, Washington, D.C.
Carbon nanotubes are wee little microscopic-sized tubes made of sheets of carbon atoms, rolled up into a tube like a paper towel roll. One roll is a single walled carbon nanotube, and a few rolls will make a multiwalled carbon nanotube (MWCNT).
These tubes resemble asbestos fibers: rigid, long, and, once inside the body, persistent over weeks or months.
They also share asbestos’ fire-retardant properties, and asbestos’ ability to make materials stronger and more durable. In fact, carbon nanotubes are touted as being 100-times stronger and 6-times lighter than steel, making them very useful to strengthen and lighten car parts, high-end racing bicycles, military protective wear, and even airplane parts. That's the good news.
But, like asbestos, they can damage lungs, and may lead to asbestos-like diseases and cancer in animal tests. That's bad news!
And, there is more bad news. MWCNTs are being marketed by a company called Nanocyl for use as a flame retardant in textiles such as couch coverings and curtains. Their website says the company will, “Nano-Engineer Your Future”.
EPA reviewed the health and safety hazards associated with using MWCNTs as a fire retardant in textiles, in a hefty 600+ page report issued September 2013. Here are some of the concerns that EPA identified, based on dozens of available laboratory and animal studies.
EPA identified exposure concerns for multiwalled carbon nanotubes in textiles:
- Workplace inhalation risks are greatest concern
- The MWCNTs are in a particulate form when inhaled
- Consumer exposures are likely during use of the treated textiles
- MWCNTs may be released from finished products in particulate form, into dust in the home
- Biopersistent – MWCNTs could remain in lungs for months after inhalation
- Increased concern for workers and children, based on their activity patterns
EPA also identified human health concerns for multiwalled carbon nanotubes:
Over a dozen laboratory and animal studies have shown that carbon nanotubes are likely to pose some very severe, and possibly deadly, health risks to humans. Existing toxicity studies are mainly of dermal and inhalation exposures, with only a few from oral exposure routes. There are no data from humans.
EPA identified the following human health concerns for MWCNTs in textiles:
- Skin and eye irritation
- Respiratory sensitization
- Respiratory/lung inflammation
- Immune function altered
- Genotoxicity studies showed conflicting results
- Tracheal instillation studies (exposure to the test animal through a tube going down the throat and into the lungs) report that carbon nanotubes behave like asbestos, causing pre-mesothelioma, lung fibrosis, and lung inflammation; carbon nanotubes may be more toxic than asbestos.
From the pan to the fire
Ironically, we may be jumping from the pan into the fire. The carbon nanotubes are intended to replace the highly toxic flame retardant chemicals called polybrominated diphenyl ethers (PBDEs) – and, especially one type, called decaBDE - after several states banned them due to their very high toxicity.
The PBDEs can be found in the blood and breast milk of most Americans, and in Arctic wildlife that are traditional food sources for northern communities. Women with higher levels of flame retardants in their blood take longer to get pregnant and have smaller babies. Children exposed in the womb have altered brain development, resulting in delayed physical development, lower IQs and attention problems. Other studies have linked flame retardants to male infertility, male birth defects, and early puberty in girls. A study in animals has linked flame retardants to autism.
California’s Technical Bulletin 117: The Source of the Problem
In 1978, the California Department of Consumer Affairs created a flammability standard that was both harsh and ineffective. The standard – TB117 - required furniture to be fire resistant to an open flame (such as a cigarette lighter or match) and applied not only to the upholstery covering (for example of a couch cushion), but also separately to the foam interior.
The standard was crafted and fiercely defended by chemical manufacturers, despite overwhelming evidence that the chemicals didn’t save lives, and in fact made fires much more toxic. An excellent in-depth investigative journalism series in the Chicago Tribune called “Playing With Fire” provided overwhelming documentation that the chemical industry was not only manufacturing chemicals, but also phony science to defend its toxic products, mislead consumers, and fool lawmakers.
Fortunately, California recently passed a new furniture flammability standard called TB 117-2013. This new standard is a smolder test of the furniture fabric, doesn’t require toxic chemicals, and - unlike the old rule - is effective at improving fire safety.
In December 2009, once the writing was on the wall, the largest U.S. producers and importers of decaBDE announced a phase-out of the chemical by 2013. That created a market space for alternative replacement chemicals. Unfortunately, this is where we sometimes end up playing whack-a-mole with substitute chemicals that are as bad as, or worse than, the ones they are replacing. For example:
- Firemaster 550 – A toxic flame retardant that doesn’t break down in the environment (persistent), collects in people’s bodies (bioaccumulative), and was promoted by its manufacturer as a “safe substitute” for certain PBDE flame retardants that were being phased out. Firemaster 550 is now found ubiquitously in house dust and wildlife. Some of the chemical components of Firemaster 550 had been on the EPA Toxic Substances inventory for decades before showing up in the mix of this particular flame retardant. A recent study linked the chemical to obesity in laboratory animals.
- TDCPP, or “chlorinated Tris” – Removed from kids’ pajamas in the 1970s due to its high toxicity, but never banned for any other use; it is now showing up as one of the most common chemicals found in a recent survey of couch foam. From furniture foam it can migrate into house dust, where people are exposed.
- Emerald 3000 – A copolymer of polystyrene and brominated polybutadiene marketed as a flame retardant for furniture foam and textiles. It contains several toxic chemicals, including benzene and 1,3-butadiene, both of which are known to cause cancer in people. It is also brominated, like the PBDEs it is meant to replace, which means it will persist in the environment for a long time.
Toxic-Free Fire Prevention
The truth is that toxic chemicals – even fire retardant ones – do not significantly slow fires, but they do make fires much more toxic. House fires have decreased nationwide, probably because of reduced smoking rates and the introduction of fire-safe cigarettes. Other effective fire-prevention strategies include upgraded wiring in homes, naturally fire-resistant materials in furniture and construction, and functioning smoke alarms. These non-chemical strategies make sense and save lives.
Pierre Bull, Policy Analyst, Air & Energy, New York City
Thanks to fateful winter sun on the morning of February 2nd, Staten Island Chuck's "six more weeks of winter" forecast is proving to be a painful truth this year. More snow, ice and bone-chilling temperatures are in the forecast, again.
Maybe we ought to play it safe next year when Chuck goes outside. He could simply remain behind the shade of a solar panel array. A win-win for all, right? Truth is, he’ll have plenty of solar arrays to choose from with New York City’s first solar farm going up soon near his Staten Island Zoo home at the Island’s Fresh Kills Park -- one of many more solar projects to come if we extend Gov. Cuomo's successful NY-Sun Initiative.
Now is your chance to voice your support for dramatically more solar in New York by sending a letter here (from our friends at Vote Solar) to the New York State Public Service Commission. Let's tell the Commission that we want to extend the NY-Sun Initiative for a full ten years to build New York’s solar industry into a national powerhouse, bringing good jobs, less pollution and lower energy costs with it.
What are you waiting for? Just one week remains (the public comment period closes Monday, March 3rd) to tell New York energy policymakers that we want more solar in New York. Sunny thoughts to help you get through the remaining three weeks of winter, according to Chuck!
Damon Nagami, Senior Attorney, Santa Monica
This month, California lawmakers at both the state and local levels are moving forward with efforts to put in place a moratorium on hydraulic fracturing, or fracking, and other well stimulation activities. As I said in my blog post last month, an immediate fracking moratorium is needed to give the state time to thoroughly assess the health risks and environmental impacts of fracking and acidizing, and how to protect against them.
At the state level, Senators Holly Mitchell and Mark Leno introduced a bill last week, Senate Bill 1132, that would put a halt to well stimulation activities, including fracking and acidizing, until the state completes a comprehensive study on the threats and impacts from fracking and measures are in place to protect against negative impacts. NRDC strongly supports this bill and in the coming months we will be working to get it passed through the Legislature.
And tomorrow, the Los Angeles City Council’s Planning and Land Use Management (PLUM) Committee will be considering a motion by Council Members Paul Koretz and Mike Bonin that would impose a moratorium on well stimulation activities until the City Council is assured that the environmental impacts are fully mitigated and public water supplies are fully protected. We strongly support this motion as well (see our support letter here), and will be at the PLUM meeting tomorrow to urge the Committee to pass this motion and move it to the full City Council for a vote.
NRDC commends these lawmakers for giving voice to the concerns of a majority of Californians who agree that a fracking moratorium is needed now. We are fully engaged in these efforts and will provide updates as they move forward in the process.
David Doniger, Policy Director, Climate and Clean Air Program, Washington, D.C.
Here are six quick reactions to the oral arguments presented this morning in the Supreme Court’s latest case on climate change and the Clean Air Act, Utility Air Regulatory Group v. EPA. (See my post from last Friday for more background.)
- No going back on Massachusetts v. EPA. Chief Justice John Roberts made that clear early on, saying that even though he was a dissenter in 2007, the Court isn’t going to reconsider Massachusetts, or the follow-on decision in American Electric Power v. Connecticut, which establish EPA’s authority to set Clean Air Act standards for both vehicles and factories that emit carbon pollution that drives dangerous climate change. This case remains focused on the secondary question of the scope of the Act's permitting provisions.
- Climate science not on trial. The Court declined to hear challenges to the science linking carbon pollution and climate change, and none of the petitioners tried to pick a fight there. Solicitor General Donald Verrilli emphasized that greenhouse gases pose possibly the gravest threat to health and welfare of all the pollutants EPA addresses. Justice Antonin Scalia asked sarcastically whether sea level rise was occurring anywhere but in Massachusetts (a reference to the standing decision in the 2007 case), but no one seriously challenged EPA on scientific issues this time.
- A rough ride for petitioners. Justice Elena Kagan pressed industry attorney Peter Keisler hard on the inconsistencies in the petitioners’ theories. She asked him to pick one of the four different legal interpretations offered in their briefs. He seemed to respond with a fifth one. Kagan asked why EPA didn’t deserve the widest possible deference under Chevron v. NRDC to resolve conflicting statutory provisions as they apply to greenhouse gases. She asked whether EPA’s approach to the permit provisions – covering greenhouse gases but focusing on the largest emitters – wasn’t truest to the statutory language and congressional purposes.
- Tough questions for the Solicitor General. Solicitor General Verrilli came in for his share of tough questions, mainly probing whether EPA should have excluded greenhouse gases entirely, despite the statutory phrase “any air pollutant,” rather than modify the statutory threshold numbers of 100 and 250 tons per year. He forcefully responded that EPA faced multiple statutory commands – cover all pollutants, cover all sources above those tonnage levels, and issue permits within a year – and that EPA’s approach best resolves those conflicting commands. Justice Kagan echoed that line of thinking. Justice Stephen Breyer asked if this wasn’t the approach that does the least violence to the statutory terms.
- Getting to five. Kagan’s line of questioning drew support from at least three others: Justices Breyer, Ruth Bader Ginsburg, and Sonia Sotomayor. Justice Anthony Kennedy, as usual, seemed to be the key. His most significant question may have been to ask Verrilli for a case that best supports his position. The Solicitor General responded with Morton v. Ruiz (a case allowing the government leeway to set priorities among claimants when there are not enough funds to satisfy all of the beneficiaries’ claims). If Kennedy goes with Kagan, it appears EPA's position would command a majority.
- Compromise? Chief Justice Roberts was intrigued by a possible compromise, under which carbon pollution would be subject to the requirement for “best available control technology” (BACT) for sources that need permits because of their emissions of other pollutants, but not when CO2 is the only pollutant emitted in major amounts. That approach would cover power plants and other very large sources that always need PSD permits, but it would leave out a good many others. This compromise is hard to square with the statutory language – it’s one of the four interpretations Justice Kagan skeptically addressed at the outset. So it’s not clear how many votes that approach could command.
So we’ll have to wait and see for the final outcome on the scope of the Clean Air Act’s permitting provisions. But now more than ever, it’s clear that EPA’s authority to set standards for carbon pollution – the basis of President Obama’s Climate Action Plan – is firmly settled on solid ground.
Kaid Benfield, Special Counsel for Urban Solutions, Washington, DC
I and others have been tracking for some time a surging interest in walkable neighborhoods, in both reinvested downtowns and more pedestrian-friendly suburban developments. Just last month I cited University of Utah Professor Arthur C. Nelson for the propositions that, contrary to what occurred in previous generations, half of all new housing demand between now and 2040 will be for attached homes, the other half for small-lot homes. The demand for large-lot suburbia, by contrast, is diminishing.
In other words, there’s a reason why city living is becoming more expensive and suburbs less so: demand for what cities offer is up, and demand for automobile-dependent suburbs, relatively speaking, is down.
In my new book People Habitat: 25 Ways to Think About Greener, Healthier Cities, I put it this way in a chapter titled “But the Past Is Not the Future”:
“The way households are going to be evolving over the next few decades is toward more singles, empty-nesters, and city-lovers, none of whom particularly want the big yards and long commutes they may have grown up with as kids. A significant market for those things will still exist, but it will be a smaller portion of overall housing demand than it used to be. This new reality means that the communities and businesses that take account of these emerging preferences for smaller homes and lots and more walkable neighborhoods will be the ones that are most successful.”
The voice of a skeptic
Alan Mallach, a serious scholar at the National Housing Institute (NHI) and someone I respect, isn’t buying it, however. Or maybe he is, but only a little bit. Mallach believes that we may be seeing a short-lived phenomenon of latte-sipping Millennials – and hardly anyone else – moving downtown, and that even the Millennials may be unlikely to remain once they start raising kids.
On Rooflines, the NHI’s blog, Mallach cites suburbanist Joel Kotkin with approval and opines:
“As I read much of what is being written about demographic change and urban revival, I see a lot of urbanist wishful thinking, along the same lines as the scenarios some pundits paint of exurban McMansions turning into slums and squatter colonies, as their former residents flee the suburbs for the cities like the residents of Pompeii fleeing the eruption of Vesuvius. Is it possible? Yes, but the evidence is not there.”
Mallach may have a point when he sets up the easy-to-knock straw man of McMansions becoming squatter colonies. I think they will decrease in value, but “squatter colonies” goes a bit far. Nonetheless, I think he’s wrong to be so sharply dismissive of the evidence that weighs in favor of a lasting rebirth of central cities. (I’ll discuss some of that evidence below.)
He’s also wrong not to acknowledge that, while some residents will indeed prefer suburbs in future decades, that does not mean they will prefer the types of unwalkable, outer suburbs that we built in the last half of the 20th century. I believe that suburbs are here to stay but that, increasingly, they too will take a more walkable, somewhat less auto-dependent form.
The question whether the rebirth of cities will grow to include families is trickier. Here’s what Mallach has to say on the subject:
“The other question is whether millennials will stay in the cities as they move into marriage and child-rearing—as most will—and the appeal of all those bars and restaurants down the block begins to pall. If, as preference surveys show, most will ultimately look for a single-family house in which to raise their children, will they opt for a Philadelphia row house or a St. Louis Victorian, or will they move to the suburbs?
“It will be a while before we have a clear picture, but there is little evidence to point to a long-term millennial commitment to cities as a place to remain, settle down and raise families. Joel Kotkin not unreasonably chastises writers who, with little or no evidence to back them up, confidently assume that they will do so. While the jury is still out, there is no compelling evidence of anything resembling the fundamental shift in values and attitudes on the part of millennials that would lead to most of them behaving that differently from earlier generations, and—to the extent that their means permit—buying suburban houses in which to raise their children, and, as often as not, commuting to work in the city in their Priuses.”
Mallach’s cheap shot at Priuses aside, I have raised the same issue about whether we will make our reborn cities more family-friendly for those with a choice about where to live. I don’t think we have gone far enough yet to do so, but I am hopeful that we will, as those families with a preference for urban living begin to demand it.
More to the point, though, I wonder if Mallach is asking the right questions: I don’t see the fundamental future choice as between city and suburb but between more walkable, diverse and healthy places, on the one hand, and more automobile-dependent, monolithic, and unhealthy ones, on the other. As I also write in People Habitat, whether those places are within or outside city limits is of most relevance to cartographers and candidates for city office; the environment, economy and, increasingly, our social fabric don’t care. What matters in the 21st century is not so much “cities” in the traditional jurisdictional sense, but metropolitan regions and neighborhoods. Both are changing for the better, and in a lasting way, in my humble opinion.
Here’s some evidence:
The rampage of sprawling, outer metropolitan development that ate up farmland while severely disinvesting older communities hit its peak in the mid-1990s, and there is no evidence that it is coming back. My friend Payton Chung, writing in his provocative blog west north, charts the amount of land converted to development over time. His graph, based on the National Resources Inventory, shows a dramatic rise in sprawl from the mid-1980s to the mid-1990s, followed by a marked decrease every five years since:
Note that the dropoff in the amount of newly developed land began a full decade before the recession and, for that matter, even before the “giant housing bubble showered suburbs with seemingly limitless sums of capital,” as Payton puts it. This decrease in outer suburban development isn’t “urbanist wishful thinking”: it is fact. It’s also fact that central cities are growing again, after decades of decline – and, for the first time in a century, growing at a faster rate than their suburbs.
The trend is also consistent with my own observations about average housing prices, which I charted and mapped for the Washington DC area during the recession years 2006-2009, and for the three following years, 2009-2012. The maps show a wide disparity between the changes in many outer location prices, which fell precipitously, and the prices in many inner and transit-served locations, which held steady or increased in value. I’ve seen maps of other regions that, to varying degrees, show similar results: urban housing values are proving more resilient than outer suburban ones.
As for what preference surveys say about the desires of the Millennial generation, those that I have seen support Mallach’s assertion to the extent that some Millennials will seek single-family homes and suburban living. But not to the same degree as preceding generations. According to analysis by industry advisers RCLCO, 31 percent of Millennials prefer a “core city.” What is particularly significant about this finding is that it is twice the portion of the preceding generation when polled at the same age. Perhaps more to the point, two-thirds wish to live in walkable places and town centers, whether in the inner city or in suburbs. A third will pay more for walkability, and half will trade space for it.
Professor Nelson, cited above, believes that, although there will be a continuing demand for large-lot housing, that demand will constitute only 25 percent of the market by 2040. Seventy-five percent will seek either attached or small-lot housing. This makes particular sense when one considers that the number of adults of child-rearing age and the number of households actually living with children will comprise a much smaller portion of the overall market than in previous decades. Nelson estimates that 87 percent of the growth in the housing market through 2040 will comprise households without children.
This means that, even if Mallach is right in his predictive generalization that today’s loft-dwelling Millennials will become tomorrow’s auto-dependent suburbanites – and I don’t believe there is evidence that the generation will make such uniform choices – there will still be plenty of people in the market for urban and walkable suburban homes. And, yes, some of them will be empty-nesting seniors who must or prefer to reduce their driving. While many will remain in their current, automobile-dependent suburban homes until forced out by infirmity – I have sad personal experience with this – I believe that many of those who do move will choose walkable environments as their new communities.
By the way, it isn’t just current and future residents who are going to be choosing cities and retrofitted suburbs over the coming decades. After decades of fleeing downtowns for office parks and exurban campuses, corporations are moving back to the city, too, such as Motorola in Chicago (3,000 jobs coming back to the Loop) and the other large businesses I cited in December. Meanwhile, the major corporations in Dublin, Ohio – the wealthiest suburb of Columbus – have banded together with city leaders in a major, multi-year effort to make their sprawling community’s business district more walkable and hospitable to the bright young talent they need to recruit.
It took a while to convince me of this one, but the data show that we have also hit a peak in vehicle miles traveled (VMT), on both a per capita and absolute basis. I believe the following graph shows the population-adjusted total rate of driving (something akin to VMT per capita but not quite the same; see the technical explanation):
The graph shows that VMT in the US peaked in 2005 and has been dropping ever since. Note again that the beginning of the decline preceded the recession – and the drop has been continuing steadily throughout the recovery, to the point where by late 2012 population-adjusted driving had decreased to 1995 levels.
In the same article in which he discusses the decline in the rate of newly developed land, Payton Chung speculates – and I suspect he is right – that the primary reason for the decline in driving is that, as regions have stopped spatially expanding, driving distances have gotten shorter, on average, as they have added population. Other reasons include mode shifts as greater portions of the population are able to choose walking and public transit for a portion or all of their trips.
Writing last year in The Atlantic Cities, Emily Badger summarized the data:
“The handy thing about ‘peak car’ as a concept is that it can nominally be proven in many ways. You’ve got Peak Driver’s License. Peak Registered Vehicle. Peak Gas Consumption. Peak Miles Traveled. There are peaks per person, per household, per demographic. Then you've got your absolute peaks when you add up all of our vehicles and miles together, as if we were all cruising the highways at the same time.
“Earlier this summer, [University of Michigan researcher Michael] Sivak released data showing that the number of registered light-duty vehicles in America (cars, pickup trucks, SUVs, vans) had peaked per person, per licensed driver and per household in the early to mid 2000s, before the onset of the recession. Because the U.S. population continues to grow, he predicted that the absolute number of vehicles had not yet peaked. But per person and household, we seem willing now to own fewer of the things . . .
“All of the peaks on [Sivak’s] chart occur around 2004, a time that predates both the recession and the housing bust. That means, Sivak suggests, that other factors beyond the temporary state of the economy may be driving these downward trends, from the rise of telecommuting, urbanization and public transit usage to fundamental shifts in the age demographics of drivers.”
Before she stepped down as director of planning for Washington, DC, Harriet Tregoning told me that, in the previous decade, the city had added 15,000 residents with no net increase in driver’s licensing or registered vehicles. I find that astonishing evidence that something real is going on.
Again, this is not urbanist wishful thinking. These are facts.
I suppose Mallach’s answer might be that, once the Millennials start having those kids in the suburbs, we can expect driving to grow again, whether in Priuses or not. And he may well be right to a degree, but I’m betting it won’t be to a degree that takes us back to the per-household levels of 2005.
Consider that, over the last decade, miles driven by Americans aged 16 through 34 dropped 40 percent per capita compared to the same age group in the previous decade. That’s not evidence of a real change? In the same decade, bicycling trips per person in the age group went up 24 percent while walking went up 16 percent. Twenty-six percent of Americans in that age group, a growing number, do not have a driver’s license. (Sorry, but that’s a huge change: the most exciting day of my young life at age 16 was the day I got my driver’s license.)
Again, we don’t need a wholesale shift in behavior to indicate a shift in direction. And I suppose that is my biggest beef with Mallach’s argument: it presents only two options – either Millennials are choosing cities for the long haul, or they are going to revert to 1980s-style suburbia. I suspect the truth is in between, but that a larger portion of Millennials will stay in cities or choose walkable, 21st-century suburban places than did previous generations.
One can even make a case that we have reached a sort of “peak Walmart,” in which the decades-old business model of the giant retailer – paving over forests and farms at the exurban fringe to establish automobile-dependent megastores – is past its prime. The company’s fourth-quarter net income for 2014 fell 21 percent. And, although Walmart isn’t saying it in so many words, the retailer believes its future lies in a different, less sprawling and more urban direction.
Writing in last Friday’s Washington Post, Amrita Jayakumar reports:
“In its fourth-quarter earnings report, Wal-Mart said sales at its U.S. stores fell 0.4 percent and customer traffic decreased 1.7 percent. But the company’s global e-commerce sales grew to more than $10 billion in 2013, an increase of 30 percent from the year before. Sales at its small stores were also up 4 percent in the last year.
“Wal-Mart said it would pour resources into online and mobile shopping options. The retailer also announced that it would open twice as many neighborhood stores throughout the country.” (Emphasis added.)
“The big-box discounter is in need of a bricks-and-mortar makeover, analysts said. To resonate with today's shopper, Wal-Mart needs to move its stores closer to major population centers, shrink the square footage of its superstores and shutter about 100 underperforming U.S. locations, they suggest.”
In other words, Walmart needs to move away from sprawl, because that isn’t where the future market potential lies.
Not that closing and abandoning suburban Walmarts – leaving communities with 20-acre decaying eyesores that are difficult to repurpose – is such a great thing, by the way, even if it does furnish further proof that land use in America is fundamentally changing. And, as for Walmart’s “going urban,” that may not be such a great thing, either, as discount competition drives out established businesses that have made longstanding commitments to inner cities.
A more serious issue
Toward the end of his article, Mallach – whose scholarship has focused heavily on weak-market cities and neighborhoods – raises what I believe is his real concern: not that cities and the geographies of living patterns aren’t fundamentally changing, but that they are; and the changes don’t benefit lower-income populations.
That’s shifting the subject from where he started, but it’s a legitimate issue and a serious one. In some inner cities (in the Rust Belt, for example) the comeback is going to be far slower than in others; in some neighborhoods the influx of newer urbanites could have no positive effect or even negative effects on pre-existing residents. A trend toward walkable suburbs may indeed represent an indication that the Millennial generation has a different set of lifestyle preferences than its predecessors; but it is probably irrelevant to most inner-city, lower-income residents.
I do have a more favorable view than does Mallach of the prospects for older city neighborhoods to reverse years of abandonment and disinvestment to come back. I believe much more than he does that Millennials have a stronger inclination toward urban living than did their predecessors. I think the effects will be lasting.
But I share Mallach’s concern that the current brand of city recovery hasn’t come close to solving the problems that plague poor inner-city residents, including bad schools, chronic unemployment, higher crime rates, and poor health, just to name a few. Smart growth, urbanism, and changing generational values may be real, but they don’t address those social problems. Neither does much else, as far as I can tell: Very little that has been tried over the past several decades has had a pronounced, lasting impact to lift people out of poverty.
That may be tragic, but for me it doesn’t warrant a wholesale dismissal of the many good things going on in cities and metropolitan regions today. It just underscores that we haven’t figured out how to solve some deeply embedded social and economic problems.
I wish I had the answers. I don’t, and as far as I can tell no one else does, either.
- Ten things planners need to know about demographics and the future real estate market (January 30, 2014)
- Are we creating family-friendly cities? If not, shouldn't we be? (January 15, 2014)
- Today's corporations want city connections and amenities, not suburban sprawl (December 13, 2013)
- DC population and housing trends bode well for US central cities (August 12, 2013)
- Central cities now growing faster than suburbs, confirming trends for walkable lifestyles, shorter commutes (June 29, 2012)
- Remaking a suburb for the creative class (October 21, 2010)
- How the evolving housing market will help sustainable communities (April 4, 2012)
- Is 'gentrification' always bad for revitalizing neighborhoods? (October 19, 2011)
Kaid Benfield writes about community, development, and the environment on Switchboard and in other national media. Kaid’s latest book is People Habitat: 25 Ways to Think About Greener, Healthier Cities.
Becky Hayat, Legal Fellow, Santa Monica, California
“Those who don’t know history are doomed to repeat it.” – Edmund Burke
This past Sunday, a large storm pipe under an unlined coal ash impoundment at a retired North Carolina power plant broke, releasing up to 82,000 tons of coal ash, and an estimated 27 million gallons of polluted water, into the neighboring river. Unfortunately, this isn’t the first time coal ash has caused massive destruction to the environment and neighboring communities. The infamous 2008 coal ash spill at the Tennessee Valley Authority’s (TVA) Kingston power plant ranks as one of the largest environmental disasters in U.S. history, flooding downstream communities and affecting millions of residents. Both the TVA spill and the recent spill in North Carolina are the result of decades of lax oversight by state and federal regulators over the management and disposal of coal ash.
What is coal ash and why do we care? When power plants burn coal, they produce waste, often referred to as “coal ash,” and lots of it. In 2012, coal-fired power plants generated approximately 110 million tons of coal ash. Coal ash contains a number of toxic heavy metals, including mercury, lead, and arsenic, known to cause cancer and neurological and organ damage. Much of this toxic byproduct is dumped into unprotected, unlined landfills and ponds, also known as impoundments. Because these impoundments are unregulated at the federal level and in most states, the majority of ash ponds lack adequate safeguards. As a result, communities living near such impoundments are vulnerable to large-scale spills like the one that just happened in North Carolina, and also to the less obvious but equally dangerous risk of coal ash leaching into drinking water sources, especially if the landfill is unlined.
Given the serious health risks that coal ash poses, it is shocking that there are currently no federal standards on how and where coal ash may be disposed of. As of today, coal ash is considered an “exempt waste” under an amendment to the federal Resource Conservation and Recovery Act (RCRA). In 2010, the U.S. Environmental Protection Agency (EPA), for the first time, proposed to regulate coal ash. Under the proposed rule, EPA has the option of regulating coal ash as either a non-hazardous waste or a hazardous waste under RCRA. In the wake of this North Carolina disaster, EPA must act to regulate coal ash as a hazardous waste. Under this option, existing and new power plants would not be allowed to build new coal ash ponds, and existing impoundments would have to be phased out in favor of specially designed landfills.
Regulating coal ash as a hazardous waste is the only way we can ensure adequate protection for our environment and communities from the dangers of coal combustion waste. Although EPA is now under a court order to finalize its coal ash rule by December 19, 2014, the House of Representatives continues to pass amendments that would stop the agency from regulating coal ash and let the states do what they want – thus maintaining the inadequate status quo.
While a strong federal coal ash rule can certainly help prevent future contamination from coal ash ponds across the country, another recent EPA proposal may also help achieve that goal. On June 7, 2013, EPA published a proposed rule governing water pollution from coal-fired power plants in the U.S. As I blogged about here, the aim of the proposed rule is to reduce toxic metals and other pollutants discharged into surface waters by power plants from seven types of byproducts, including coal ash. Under the most stringent option proposed by the water pollution rule, EPA would establish a “zero discharge” requirement for all discharges of pollutants in coal ash. NRDC, along with Earthjustice, the Sierra Club, and the Environmental Integrity Project, is pushing EPA to select the most stringent option for the final rule because it would not only yield the greatest reduction in toxic discharges from coal ash but also eliminate the use of ash ponds and impoundments, like those now leaking through a broken pipe at the Duke Energy facility in North Carolina.
The federal government has shirked its responsibility long enough by letting coal-fired power plants off the hook for the ecological damage they cause. EPA must meet its commitments and quickly issue regulations that protect our environment and citizens from the repeated disasters and pollution caused by coal ash disposal.
Latin America Green News: a glacial flood in Chile, sustainability in Mexico, coffee and dengue in Costa Rica, COP20 in Peru
Denée Reaves, Program Assistant, International, Washington, D.C.
Latin America Green News is a selection of weekly news highlights about environmental and energy issues in Latin America.
February 2nd-8th, 2014
The southern city of Punta Arenas – gateway to the stunning Torres del Paine national park – joined the towns of Chile Chico and Pucón in banning the commercial use of plastic bags in an effort to reduce this source of aesthetic and environmental pollution. According to the ordinance, Chileans use approximately 200 plastic bags per person per year, which in Punta Arenas alone totals 26 million plastic bags annually. Businesses will have one year to phase out their use of plastic bags, and violators will be fined up to $360 per offense. (Santiago Times 2/6/2014)
On February 1, a glacial lake outburst flood (GLOF) on Patagonia’s Baker River left residents of the Colonia Sur area completely cut off from their neighbors in the town of Cochrane. The flood occurred when glacial meltwater that had been building up in Lake Cachet 2 for days suddenly rushed downstream, submerging the transportation raft that local residents use to connect to the rest of the region. (Cooperativa 2/6/2014)
Costa Rica has fallen from 5th to 54th place on Yale University’s Environmental Performance Index. The country’s lower score this year reflects the addition of new categories to the index: energy and climate, and wastewater treatment. Costa Rica scored particularly low on wastewater treatment, earning only 0.9 points out of a possible 100 and placing 125th out of 178 countries in that category. According to the index’s lead author, additional challenges include a reduction in forest cover and an increasing trend in carbon intensity. (El Financiero 02/4/2014; CostaRicaLimpia.org 2/6/2014)
Dengue and coffee rust are just two of the pests spreading in Costa Rica due to changing climate conditions. Dengue mosquitoes were historically found in warmer, humid areas at altitudes below 1,000 meters; in recent years, however, they have also appeared in the country’s central region and even in some mountainous areas. In addition to the impacts on coffee and health, there are also cases of pests affecting banana, palm, orange, and chayote crops. (El Financiero 2/4/2014)
During the C40 Cities Climate Leadership Group meeting in Johannesburg, South Africa, the Head of the Mexico City Government, Miguel Ángel Mancera, detailed the work his city has done and will do to create “Smart Cities, Livable Cities.” In 2013, Mexico City doubled its bikeways by constructing 28.5 additional kilometers of cycling infrastructure. Last year was also the city’s second cleanest in decades, an achievement recognized with the C40 Air Quality Award. Mancera also plans to install five new black carbon monitoring stations this year. La Paz, in Baja California Sur, is another city making changes to boost sustainability, including the construction of one of the largest solar plants in Latin America. The plant, with 130,000 solar panels, will substitute for some of the fossil fuels used by the city and will eventually feed power into the Federal Electricity Commission’s grid. It will also help reduce CO2 emissions by 60,000 tons. (Aztecas Noticias 2/6/2014, Suelo Solar 2/4/2014)
In addition to marshes, rivers, coral reefs, and oases, Mexico is home to 5% of the world’s mangroves, ranking 4th among the 125 countries that have this type of ecosystem. Mexico’s mangroves harbor a wide variety of life, and the Ministry of Environment and Natural Resources has accordingly made preserving and protecting these ecosystems a governmental priority. (Crónica 2/2/2014)
Gerardo Ceballos, a professor of ecology and wildlife conservation at the National Autonomous University of Mexico, is calling for Mexico to undertake an environmental reform, as the country has already spearheaded fiscal, energy, and educational reforms. The two biggest environmental issues he cites are deforestation and wildlife conservation. Mexico has been losing 400,000 hectares of forests and jungles per year, so Ceballos proposes a reforestation plan that would prevent further deforestation and create jobs in rural areas. He also highlights that Mexico is home to a large number of endemic species, such as the vaquita, that the country needs to preserve. (Hoy Tamaulipas 2/4/2014)
A United Nations technical mission visited Peru to meet with Ministry of Environment staff and discuss plans for the next round of international climate talks, scheduled to be held in Lima this December. According to Peru’s Vice Minister of Strategic Development and Natural Resources, the Lima climate summit (COP20) will be key in “transforming the issue of climate change into a factor for a sustainable economy.” He also noted that discussions about the REDD+ program to reduce emissions from forests are expected to conclude at the summit. (Andina 2/3/2014)
During the second summit of the Community of Latin American and Caribbean States (CELAC) held in Cuba, the group’s 33 member nations noted that climate change is one of the gravest problems of our time. The nations re-affirmed the need to work together internationally to meet the challenge and called for a global commitment to reduce greenhouse gas emissions. (Prensa Latina 2/5/2014)
- Bisphenol A (BPA)
- Hexavalent Chromium
- Methylene chloride (dichloromethane)
- Perchloroethylene (Tetrachloroethylene, PERC, PCE)
- Propoxur (Flea and Tick Pesticide)
- Sulfur Dioxide
- TDCP/TCEP (Chlorinated Flame Retardants)
- Tetrachlorvinphos (Flea and Tick Pesticide)
- Trichloroethylene (TCE)
- Triclosan and Triclocarban (Antibacterials)