
IN FOCUS: THE AGE OF THE MEGACITY

www.americainfra.com • Q2 2010

An earthquake virtually destroyed San Francisco in 1906. Is modern California prepared for the next Big One?

Information superhighway: Better use of data means a transport revolution, says IBM’s Sam Palmisano

Power savior: Why Duke Energy sees efficiency as key to America’s energy challenges

Stormy weather: Will local opposition blow the Cape Wind project off course?




FROM THE EDITOR


Who’s the boss? We have the power to control the world we live in, but there are limits.

Since our first distant ancestor realized that he could use one rock to reshape another, mankind’s overriding narrative has been one of gradual domination of its environment. Building better places to live, eating better food and resolving disputes with meddlesome neighbors more effectively, advanced rock technology put our prehistoric forebears firmly on course to the top. The intervening millennia have seen us move beyond stone to ever more advanced and powerful innovations. Over the last couple of centuries in particular, the momentum has been so great that we have reshaped our world beyond all recognition. We have rerouted rivers, mastered intercontinental flight and harnessed the power of the atom.

Given our achievements, it is hardly surprising that a certain sense of invulnerability has begun to permeate, an idea that we can respond to any challenge nature chooses to throw at us. Hubris, meet nemesis. Events of recent months have severely dented our over-inflated sense of control. Few would have predicted that a volcano in faraway Iceland would have the power to ground air traffic across Europe and throw the travel plans of millions into chaos. However, the cloud of ash and gas belched out by Eyjafjallajökull’s eruption did precisely that, emptying the continent’s skies and placing tremendous strain on the other forms of transportation infrastructure tasked with taking up the slack.

Similarly, the unfolding environmental catastrophe in the Gulf of Mexico is a clear example of the Earth’s ability to put us in our place. At the time of writing, all efforts to stop the vast quantities of crude oil spewing into the ocean have failed.

“The average time between earthquakes on most parts of the San Andreas Fault is somewhere between 100 and 150 years. The last earthquake on the southern San Andreas Fault was 153 years ago” Dr Lucile Jones, US Geological Survey (p32)

“In the future, smarter transportation will even apply advanced modeling to something as previously unpredictable as the flow of volcanic ash across the Atlantic Ocean” IBM CEO Sam Palmisano (p80)

This disaster of our own making has the potential for far-reaching knock-on effects. The slick represents a huge threat to Louisiana’s fragile coastal wetlands, which provide a vital first line of defense against the storms that plague the region. Katrina remains fresh in the mind. Anything that makes Louisiana more vulnerable to nature’s often capricious whims is rightly feared.

While we may never be able to protect against every eventuality, we can at least mitigate the worst effects. The San Andreas Fault makes California one of the most geologically lively places in the world. Small earthquakes occur regularly and the threat of a major tremor constantly looms. Nobody is under any illusion that a powerful quake could simply be shrugged off, but bridges and buildings have been engineered to reduce the potential impact, while contingencies are in place to minimize disruption to water and power supplies. The dangers are approached with a healthy dose of realism, which should lead to more effective solutions.

It’s nature’s world and we’re just living on it. The sooner we accept who is really in charge, the better for all of us.

Huw Thomas, Editor



CONTENTS

32 On deadly ground As California braces itself for the next Big One, Huw Thomas assesses the big infrastructure challenges for a state that is always on the move

40 On the horizon How Cape Cod could provide the United States’ first ever offshore wind farm

66 A penny saved… Vice President for Energy Efficiency at Duke Energy, Ted Schultz, talks to US Infrastructure about the smart grid’s ability to reduce power consumption

80 The road to a smarter planet IBM CEO Sam Palmisano reveals how solutions are set to get even smarter to meet the transportation needs of the 21st century




48 A bright future US Infrastructure investigates the drivers behind the rapid expansion of the solar industry

52 Letting the light shine on solar power Kevin Smith outlines the critical challenges currently facing the solar industry

54 Pushing the boundaries of climate change technology Nick Akins on the impact of technological innovation

58 State of the nation Rich Lee, US Head of KPMG’s infrastructure advisory group, looks into the biggest challenges in infrastructure delivery effectiveness

62 Number crunching With Gregory Burkart, Managing Director of Duff & Phelps

78 An intelligent solution Ian Macleod reveals how automated meter reading technologies can make a real difference to our water footprint

90 Mapping the future How the effective use of mapping and surveying tools will help the transition to a more sustainable and secure future

94 Enhancing coastal planning Geospatial technology is critical to better understanding our coastal environments, says Ed Saade

96 A new direction Rick Vincent outlines some of the latest innovations in mapping technology

Industry Insight
46 Troy Dalbey, Upsolar
64 Adrian Butcher, Opentext
112 Dan Kroll, Hach

Roundtables
72 Smart grid
86 Intelligent transport systems




100 The big conversation Chris Essid outlines the future of interoperable emergency communications

102 The added value of global resources How end-to-end solutions are helping the infrastructure industry, by Jean Lobey

104 Protecting life, property and business The importance of the latest update to a high-level US preparedness standard

106 Fighting fire with fire Dennis Smagac outlines the potential of compressed air foam systems in combating multiple-hazard fires

108 High and dry? Why America is feeling the pressure in the water industry

114 Urban legends The tale of the 21st century will be defined by the rise of the megacity

120 Big Blue sky thinking Can greater instrumentation, interconnectedness and intelligence be used to revolutionize infrastructure and make the world a better place?

In the Back
126 Infographic: Yucca Mountain
127 Books
128 Photo finish




NG Oil & Gas Summit 2010 3 - 5 November 2010 The Four Seasons Resort & Club, Austin, Texas

Find Out More Contact NG O&G (+1) 212 9208181

The NG O&G Summit is a three-day critical information gathering of the most influential and important CIOs from the oil and gas industry. The NG O&G Summit is an opportunity to debate, benchmark and learn from other industry leaders.

A Controlled, Professional and Focused Environment
It is a C-level event reserved for 100 participants that includes expert workshops, facilitated roundtables, peer-to-peer networking, and coordinated technology meetings.

A Proven Format
This inspired and professional format has been used by over 100 executives as a rewarding platform for discussion and learning.

US Infrastructure
GDS Publishing, Queen Square House, 18-21 Queen Square, Bristol, BS1 4NH
Tel: +44 117 9214000 E-mail: info@gdsinternational.com

Legal Information
The advertising and articles appearing within this publication reflect the opinions and attitudes of their respective authors and not necessarily those of the publisher or editors. We are not to be held accountable for unsolicited manuscripts, transparencies or photographs. All material within this magazine is ©2010 US Infra.

Chairman/Publisher Spencer Green
Director of Projects Adam Burns

“The structure of this event is very useful. The one on one meetings were great. We came here to test some ideas and this format worked extremely well” Bob Welch, President, Chemical, Energy & Natural Resources, CSC

Editorial Director Harlan Davis
Worldwide Sales Director Oliver Smart
Editor Huw Thomas
Managing Editor Ben Thompson
Associate Editor Rebecca Goozee
Contributors Diana Milne, Julian Rogers, Marie Shields, Nicholas Pryke, Stacey Sheppard, Jodie Humphries, Timon Singh, Ross Densley, Lucy Douglas, Ian Clover
Creative Director Andrew Hobson
Design Directors Zöe Brazil, Sarah Wilmott
Associate Design Directors Tiffany Farrant, Michael Hall, Crystal Mather, Cliff Newman, Catherine Wilson
Online Director James West
Online Editor Jana Grune
Project Director Brooke Thorpe
Sales Executives Angela Byrne, Brian MacDonald, Melissa Luongo, Faro Levy-Smith
Finance Director Jamie Cantillon
Production Director Lauren Heal
Production Coordinators Renata Okrajni, Aimee Whitehead
Director of Business Development Richard Owen
Operations Director Jason Green
Operations Manager Ben Kelly

Subscription Enquiries +44 117 9214000, www.americainfra.com General Enquiries info@gdsinternational.com (Please put the magazine name in the subject line) Letters to the Editor letters@gdspublishing.com

www.ngosummit.com






THE BRIEF

Black days

The full cost of the oil spill in the Gulf of Mexico to marine and coastal ecology is only just being realized. And while million-dollar fines and compensation claims may dent the bottom line of BP and other companies admitting responsibility for ecological disasters, it still doesn’t seem enough. At least six million gallons of crude have spewed into the Gulf, and by some estimates the spill has now surpassed the 11 million gallon 1989 Exxon Valdez oil spill off Alaska as the worst in US history.


BP announced a month after the spill that costs had grown to about $760 million, including containment efforts, drilling a relief well to stop the leak permanently, grants to Gulf states for their response costs and paying damage claims. It has said that it is too early to calculate other potential costs and liabilities. BP has, however, admitted ‘full responsibility’ for the spill, which began on 20 April after an underwater explosion on its Deepwater Horizon oil rig ruptured the riser pipe, kick-starting a chain of failures that highlighted just how susceptible oil drilling equipment can be.

For more on disaster control turn to page 32



A blow-out prevention device that is supposed to guard against such accidents was not working, and there were no other emergency devices fitted to the rig. The issue for BP and other oil producers is this: do we have the right equipment to do the job? And if we don’t, do we have good fail-safe equipment? BP neglected to install the $500,000 trigger that may or may not have operated successfully given the dramatic events that unfolded moments before the Deepwater Horizon rig sank.

But there may be deeper issues, both literally and figuratively. The fact is oil is harder to obtain now than it was 10 years ago, and as a result the oil industry is undergoing a transition from easy-to-reach oil wells to deeper, more dangerous targets as the global demand for crude increases rather than lessens. In order for BP and other oil companies to quench the world’s thirst for oil they are drilling deeper underwater, and in waters on the continental slope. This technically demanding drilling requires complex equipment and reduces the room for error. And as those watching the spread of crude throughout the Gulf of Mexico can attest, it also makes repair of crippled riser pipes and underwater wells that much more challenging.

The catastrophe in the Gulf of Mexico bears stark comparison with the Lusi mud volcano in Indonesia, which many have suggested was triggered by deep exploratory drilling in an environment with little room for error. In the Lusi case, those drilling failed to seal the well, which eventually led to a blow-out of catastrophic proportions. Both deep drilling ventures have had a catastrophic impact on the environment.

With demand increasing and supply dwindling, simple math dictates that more catastrophic spills and blow-outs are likely on the horizon. More drilling will take place in environmental hot-spots and difficult areas, where accidents like the Gulf of Mexico spill – major, hard-to-stem leaks – are likely to happen with more frequency. Safety regulations should be evolving to adjust to this new reality, resulting in better legislation for equipment like blow-out valves and shut-off triggers, and the necessary equipment in place to tackle a sprung leak. After all, in the absence of rigorous regulatory scrutiny, oil companies are tempted to take shortcuts that may not have led to disaster in the past, but could be catastrophic where the margins of safety are lower.


News in pictures

Filipino residents clash with members of a demolition crew in Quezon City, Philippines. Police units escorted a demolition crew to dismantle houses and small business establishments to pave the way for government road infrastructure construction.

Spain’s PM José Luis Rodríguez Zapatero speaks during the presentation of a government infrastructure spending plan in Madrid.

Taken in cooperation with the 2010 Prix Pictet Commission and UK-based Azafady to highlight issues around land, forests and poverty in Madagascar, this image shows how the country provides resources for its indigenous people while promoting sustainable development.




NEWS & NUMBERS

The ultimate goal

MRG estimates unnecessary maintenance spend to be more than $200 billion per year in the United States alone. This loss translates directly into lost operating income and more than two trillion dollars in lost market capitalization.

During the lifetime of a physical asset, the data that defines it is touched by, and often reinvented by, many functions, including engineering, construction, operations, maintenance, and sales and marketing. In these transitions, inaccurate and incomplete records are compounded; this prevents companies from optimizing the value created by their physical assets during their lifecycle.

MRG has coined the term physical asset intelligence (PAI) to describe the orientation of an organization optimizing value across the asset lifecycle. We define PAI as the holistic strategic approach, methods and tactics that an organization deploys to make informed business decisions regarding its physical assets (capacities, capabilities, availability, utilization, disposal, retirement). PAI encompasses all of the knowledge, technology, communications and conceptual expertise that can be applied to effectively optimize an organization’s returns on investment in property, plant and equipment.

Protecting a company’s ‘data commons’ requires data standards that detail the attributes needed to measure and manage all EAM processes. These standards form a data model that must also be adaptable in supporting strategic and operational objectives that are not currently known. Corporate fees levied on functions (e.g. engineering) that do not meet the data standards are an effective and efficient means of ensuring adoption across the lifecycle. Since the standards cut across functions, they are typically best managed by a corporate data governance organization. An organization that observes data quality standards and employs continuous improvement techniques will achieve superior process and business performance. Asset data standards provide a platform for significantly advancing EAM towards the ultimate goal of physical asset intelligence.
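To make the notion of an enforceable asset data standard concrete, here is a minimal sketch in Python of how a data governance group might encode such a standard and flag non-conforming records at a lifecycle hand-off. The attribute names are invented for illustration; MRG’s actual data model is not described in this piece.

# Hypothetical asset data standard: attributes every record must carry
# before it may pass from one lifecycle function (e.g. engineering)
# to the next (e.g. operations). Names are illustrative only.
REQUIRED_ATTRIBUTES = [
    "asset_id", "asset_class", "location",
    "install_date", "capacity", "maintenance_plan",
]

def validate_asset_record(record):
    """Return the list of missing attributes; an empty list means the record conforms."""
    return [attr for attr in REQUIRED_ATTRIBUTES
            if record.get(attr) in (None, "")]

# An incomplete record handed over from construction to operations:
record = {"asset_id": "PUMP-0042", "asset_class": "centrifugal pump",
          "location": "Unit 3", "install_date": "2010-05-01"}
print(validate_asset_record(record))  # ['capacity', 'maintenance_plan']

A data governance organization could run a check like this at every hand-off and, as the article suggests, levy internal fees on functions whose records fail it.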


Around the world in 80 days

Our guide to the last quarter’s global events – and their impact on your business.

UK slump
Construction activity fell in December for the 22nd month in a row. Although the slowdown in the sector has eased markedly since February 2009, the weakness in December continues a pattern of only limited recovery.

Stricter security
The US has introduced tougher screening measures for passengers arriving by air from 14 nations that the authorities deem to be a security risk. However, the X-ray machine produces ‘naked’ images of passengers revealing any concealed weapons or explosives, which has unsurprisingly drawn criticism.

Utilizing wind
Nine European countries, including the UK, have signed up to develop an integrated offshore grid in the North and Irish Seas in order to utilize wind power and cut down on carbon emissions, as well as become more energy secure. Interesting times for the industry.

Traffic surge
The International Air Transport Association (IATA) has said that Middle East carriers recorded a 16.5 percent rise in passenger demand in November, outpacing global growth, which stood at 2.1 percent. Will this trend continue? Only time will tell.

No vacancies
In December 2009, Australian Transport and Infrastructure Minister Anthony Albanese declared Sydney’s airport at full capacity, while also delaying a decision on the site for a second airport until 2011, pushing construction further into the future. Potential for air traffic tension.

On track
India’s Road Transport and Highways Minister Kamal Nath has announced that the country has increased its per-day construction of roads to nine kilometers, and that the target of developing 20 kilometers daily would be accomplished by April. Opportunities aplenty for construction companies in the country.





TOP 10

The world’s smartest cities

This list of the world’s smartest cities has been compiled by Forbes based not only on infrastructure and liveability, but also on economic fundamentals.

1. Singapore: The 21st-century successor to 15th-century Venice, this once-impoverished island nation now boasts an income level comparable to the wealthiest Western countries, with a per-capita GDP ahead of most of Europe and Latin America. Singapore Airport is Asia’s fifth-largest, and the city’s port ranks as the largest container entrepot in the world. Over 6000 multinational corporations, including 3600 regional headquarters, are located there, and it was recently ranked number one for ease of doing business.

2. Hong Kong: As the center of the world economy continues to shift from West to East, Hong Kong is certainly reaping the benefits. Hong Kong Shanghai Bank’s chief executive recently relocated there from London. Its per-capita GDP is ranked 15th in the world. The Heritage Foundation and The Wall Street Journal have ranked Hong Kong the freest economy in the world.

3. Curitiba, Brazil: This well-run metropolis in southern Brazil is famous for its rapid bus-based transit, used by 70 percent of its residents, and its balanced, diverse economic development strategy. The city’s program of building ‘lighthouses’ – essentially electronic libraries – for poorer residents has become a model for developing cities worldwide. Environmental site Grist recently ranked Curitiba the third greenest city in the world.

4. Monterrey, Mexico: Over the past few decades Monterrey has emerged from relative obscurity into a major industrial and engineering center. The city of 3.5 million has 57 industrial parks, specializing in everything from chemicals and cement to telecommunications and industrial machinery. Monterrey and its surrounding state, Nuevo Leon, boast a per-capita GDP roughly twice that of the rest of Mexico.

5. Amsterdam: This longstanding financial and trading capital is home to seven of the world’s top 500 companies, including Philips and ING. Relatively low corporate taxes and income taxes on foreign workers attract companies and individuals. Amsterdam’s advantages include a well-educated, multilingual population and a lack of political corruption, as well as its location – in the heart of Europe, close to a major international airport and a short train trip to Rotterdam, the continent’s dominant port.

6. Seattle, Washington: Seattle’s location close to the Pacific Ocean has nurtured trade with Asia, and its proximity to Washington state’s vast hydro-power generation assures access to affordable, stable clean electricity. The area also serves as the conduit for many of the exportable agricultural and industrial products produced both in the Pacific Northwest and in the vast, resource-rich northern Great Plains, closely linked to the region by highways and freight trains.

7. Houston, Texas: Houston’s close tie to the Caribbean, as well as its dominant global energy industry, thriving industrial base, huge Texas Medical Center complex and first-rate airport, all work to its long-term advantage. Arguably the big US city with the healthiest economy, Houston is also investing in a green future; last year it was the nation’s largest municipal purchaser of wind energy.

8. Charleston, South Carolina: Charleston has expanded its port and manufacturing base while preserving its lovely historic core. Once an industrial backwater, Charleston now seems poised to emerge as a major aerospace center, with the location of a new Boeing 787 assembly plant there, which will bring upward of 12,000 well-paying jobs to the region.

9. Huntsville, Alabama: This southern city has long had a ‘smart’ core to its economy, a legacy of its critical role in the NASA ballistic missile program. Today the area’s traditional emphasis on aerospace has been joined by bold moves into such fields as biotechnology. Kiplinger recently ranked the area’s economy number one in the nation.

10. Calgary, Alberta: With the likely rise in commodity prices over the next decade, Canada seems likely to produce several successful cities. Over the past two decades, Calgary’s share of corporate headquarters has doubled to 15 percent, the largest percentage of main offices per capita in Canada. Although the plunge in oil prices hit hard, rising demand for commodities in Asia should help revive the Albertan economy by next year.

Source: www.forbes.com



IN FOCUS

$452m eco-refit program

Despite a recent U-turn on offshore drilling, President Obama has always presented himself as a greener, more environmentally conscious president, and as such the White House has promised $452 million worth of eco-retrofits for homes in 25 communities across the country. The plan is all part of the new ‘Retrofit Ramp-Up’ program that Vice President Joe Biden announced as the first part of his five-day Earth Day tour. As well as creating greater energy efficiency in US homes, it is hoped the project will create more ‘green collar’ jobs, save millions in utility bills and help America cut emissions.

A startling fact about carbon emissions is that almost 50 percent of total emissions come from buildings – not just in how they are constructed, but in how they are run. By potentially refitting every home in the US (100 million homes) with measures such as insulation and waste water recycling, the country could save $21 billion a year in energy. As such, the government plans to invest $80 billion in clean energy. During his tour, Vice President Biden said of the scheme, “What we’re really talking about here is simple. It’s about making our homes and our office buildings more efficient and more comfortable and more affordable, replacing windows and doors.”

In order to encourage people to make their homes more environmentally friendly, by installing solar panels and even roof-bound turbines, the government is also offering homeowners up-front rebates of up to $3000 for their eco-efforts. There are of course drawbacks: there are concerns that a large-scale drive to install green features correctly would put the relatively small number of skilled workers under pressure, though this would no doubt see a large number of green jobs created. Also, in order to make the most of the $452 million investment over time, a large effort would be needed to convince entire neighborhoods to retrofit their homes in order to bring down costs; retrofitting a single home is much more expensive per house than a plan to retrofit an entire block.
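Taken together, the figures quoted above imply a modest payoff per household. The following back-of-envelope sketch in Python uses only the article’s own numbers; it is simple arithmetic, not a cost model.

homes = 100_000_000     # US homes, per the article
total_saving = 21e9     # $21 billion a year if every home were retrofitted
rebate = 3000           # maximum up-front rebate per homeowner

saving_per_home = total_saving / homes
print(saving_per_home)            # $210 per home per year
print(rebate / saving_per_home)   # ~14 years for the rebate to pay back

On these numbers, a $3000 rebate takes roughly 14 years to recoup in energy savings alone, which underlines why the article stresses neighborhood-scale retrofits to bring per-home costs down.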


The right road

Whether it’s making roads safer for the public or spurring economic growth with roadway enhancements, transportation agencies face the ultimate responsibility of maintaining our highway infrastructure system. To keep traffic flowing safely along our nation’s highways, transportation agencies require a comprehensive view of roadway network data. Agencies need to track, analyze and review accurate infrastructure data, regardless of its source. A clear picture of your network supports better decisions, saves time and money, and, most importantly, promotes safety, all the while protecting roadway infrastructure investments.

As geospatial technology plays an increasing role in managing our nation’s highway infrastructure, Intergraph designed its geospatial transportation solutions to address the issues of data integration and many other business problems facing the transportation industry. Our approach of focusing on data integration while delivering modular applications for viewing, managing and reporting roadway infrastructure data provides a cost-effective technology solution for the transportation industry. We deliver powerful and flexible applications that respond to specific business problems, such as multilevel linear referencing for enterprise data analysis, automated routing for oversized and overweight vehicles, and straight line diagrams that enable you to create and view roadway information that meets your needs.

Intergraph’s Transportation Solutions enable transportation agencies to more clearly, efficiently and cost-effectively manage and protect their roadway infrastructure. Working with more than 40 transportation departments across the US, Intergraph solutions help streamline operations, maintain compliance with federal regulations, and improve safety and driving conditions for motorists by quickly addressing potential issues through faster, more informed decision making. Intergraph has many years of experience in the transportation industry. We understand the issues and challenges you face on a daily basis to keep your highways safe and protect your infrastructure. Our comprehensive range of transportation solutions helps keep people and assets moving safely and efficiently across the world.
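Linear referencing, one of the capabilities mentioned above, locates roadway events by route and distance along it rather than by map coordinates. The short Python sketch below illustrates the general idea only; it is not Intergraph’s API, and the route data is invented for the example.

# Each route is a list of (start_measure, end_measure, segment_name) in miles.
ROUTES = {
    "SR-1": [(0.0, 12.4, "segment A"),
             (12.4, 30.1, "segment B"),
             (30.1, 55.0, "segment C")],
}

def locate(route_id, milepost):
    """Resolve a (route, measure) reference to the segment containing it."""
    for start, end, name in ROUTES[route_id]:
        if start <= milepost < end:
            return name
    raise ValueError("milepost %s is off the end of %s" % (milepost, route_id))

print(locate("SR-1", 17.2))  # a crash logged at mile 17.2 falls in segment B

Storing crashes, pavement ratings and work zones against the same route-and-measure system is what allows an agency to overlay them for the kind of enterprise data analysis described above.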






IN MY VIEW

The nuclear option

Earlier this year, plans to store America’s nuclear waste inside one of the country’s most challenging engineering feats were scrapped due to lack of federal funding. Charity Azadian argues the case for keeping Yucca Mountain on the table in the controversial debate on nuclear waste.

As the familiar saying goes, you can take the Senator (Harry Reid) out of Nevada, but you can’t take the Nevada out of the Senator. It is clear that politics is once again trumping science. Allowing politicians, and not the scientific experts, to detail a path forward for Yucca Mountain only stalls progress toward resolving the nation’s nuclear waste problem.

The Nuclear Waste Policy Act of 1982 set 31 January 1998 as the deadline for the federal government to begin disposing of used fuel. More than a decade after the deadline, the government has still not settled on a policy for how to do it. The US Department of Energy (DOE) established a Blue Ribbon Commission to explore alternatives to long-term waste storage. The government’s inability to begin proper nuclear waste management should be a reason to remove government


responsibilities, not to remove Yucca from consideration. The Administration’s Yucca Mountain policy signals once again that the government cannot be a trusted partner. Essentially, the Administration’s policies are schizophrenic. On one hand, President Obama has touted that the Federal Government will guarantee loans for nuclear power plants; days later he states that he is going to abandon plans for a nuclear waste repository in Nevada. Furthermore, in keeping with politics that sway with the wind, Energy Secretary Chu testified before Congress



that he supported funding for Yucca Mountain, but then a few days later sent a letter to the committee saying that he would like to retract parts of his testimony, and that he was essentially confused. Shocking, appalling – or is it? This type of delusional behavior and lack of integrity in our political leaders is unacceptable. And by allowing politics to trump science at Yucca Mountain, Chu’s announcement threatens to stall progress toward resolving the nation’s nuclear waste problem.

So should we listen to the science, except when it’s inconvenient to well-connected political leaders? If politicians eager to shut down Yucca Mountain are so confident that the science is on their side, why not allow the Nuclear Regulatory Commission to finish its license review? After all, it’s the NRC’s responsibility to determine the technical feasibility of Yucca – even if it has been studied countless times.

The fact of the matter is that nuclear waste is currently being housed at several different national lab sites throughout the country, waiting to be deposited and buried forever, and the clock is ticking. Specifically, the current budget provides no answers as to what the Administration proposes to do with the approximately 57,700 tons of nuclear waste at more than 100 temporary sites around the country, or with the approximately 2000 tons generated each year by nuclear power plants. The Yucca site was designed specifically to handle spent fuel rods from the nation’s 103 nuclear generators. “The concern is not that these facilities are not safe,” says Rick McLeod, Executive for the Savannah River Community


Reuse Organization. “The concern is that the facilities have not been evaluated for storing the material permanently.” There are more than 60,000 tons of spent reactor fuel housed in temporary storage at commercial reactors in 31 states. Not to mention the 7000 tons of high-level waste left over from nuclear weapons production, much of it buried in 177 underground tanks at Hanford. For now, waste from nuclear power generation and national defense programs is housed at 131 sites across the nation. There’s no telling whether that number will ever be reduced under the course President Obama wants to take.

At the very least, the DOE shouldn’t spend any money closing down Yucca Mountain while its fate remains undecided. Better yet, the Administration ought to put Yucca Mountain back on the table ahead of any decision – it’s bad policy to unilaterally scratch Yucca Mountain from consideration, and it means throwing away nearly $14 billion of the public’s money that has already been spent studying the Nevada site.

So, what should the Administration do to constructively handle the Yucca Mountain dilemma? Some possible solutions are the following: 1) allow the NRC to continue its license review; 2) transfer the permit to construct Yucca Mountain to a third party; and 3) use the Blue Ribbon Commission on America’s Nuclear Future to inform a final decision on Yucca Mountain. After all, what’s the current Administration afraid of – a little competition?

“The current budget provides no answers as to what the Administration proposes to do with the approximately 57,700 tons of nuclear waste at more than 100 temporary sites around the country”

Charity Azadian is a professor, diplomat and author who served at the US DHS and DOE in the administrations of President George W. Bush.

Yucca Mountain Located around 100 miles northwest of Las Vegas is Yucca Mountain, the proposed site for the Yucca Mountain nuclear waste repository. Over $10 billion and 22 years have been spent on developing concrete tunnels and chambers designed to keep waste safe for at least a million years. But the long-running waste disposal plan was officially terminated in the DOE’s 2010 budget request – just one year before the site was due to open – which means that the worst waste from the country’s most polluted place now has nowhere to go. Turn to page 126 for more information on the Yucca Mountain project.




PROJECT FOCUS

Watch this space

The last few months have seen the continuation and creation of a slew of infrastructure projects in the US, but not all of them are the major developments the country has hoped for. In fact, one of the biggest current projects is the clean-up of America’s biggest environmental disaster in decades.

One World Trade Center

Almost a decade in the planning, construction is over four years in. Expected to be completed in 2013, One World Trade Center (formerly known as the Freedom Tower) will be the tallest building in the United States. In March, 1 WTC reached over 200 feet above street level, with construction beginning on the first office floor. The month after, installation began on the 45-degree chamfered corners that give the building its unique angled, octagonal shape. By the end of the year the building should reach 55 stories; work is estimated to be progressing at the rate of one floor per week.


The Gulf of Mexico oil spill clean-up

More than three weeks of 5000 barrels of crude a day spilling into the Gulf of Mexico have taken their toll, both environmentally and economically. At the time of going to print, BP is pumping mud into the damaged oil well in an effort called ‘top kill’ to stop the flow of oil. With the disaster, at its peak, being described as an “Exxon Valdez every four days”, pressure is on both BP and Transocean to clean up the spill. The US government has also stated that “it will not rest” until the “communities and natural resources of the Gulf Coast are restored and made whole.” It has been estimated that over 600 species of wildlife are threatened by the spill, and the Gulf coast is home to 25 percent of America’s wetlands as well as several wildlife reserves. Over 100 ships have been mobilized to skim oil from the ocean’s surface, but it may not be enough. Initial estimates suggest that it could cost an astounding $23 billion to clean up the spill.
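A rough conversion puts the quoted flow rate in context. One oil barrel is 42 US gallons, so the Python sketch below simply multiplies out the article’s own figures; it is not an independent estimate of the spill’s size.

GALLONS_PER_BARREL = 42   # standard US oil barrel
barrels_per_day = 5000    # flow rate quoted above
days = 21                 # "over three weeks" at the time of writing

total_gallons = barrels_per_day * GALLONS_PER_BARREL * days
print(total_gallons / 1e6)  # roughly 4.4 million gallons

That this comes to well under the “at least six million gallons” cited in The Brief suggests the true flow rate exceeded the official 5000-barrel figure.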




High-speed rail

The dream of high-speed rail has seen peaks and troughs over the last few months, but currently its future appears to be in limbo. Despite the success of high-speed rail projects around the world and over $9 billion being invested in the system in the US, there is concern that, given the extent of the federal deficit, plans may be scrapped or put on hold. In January, high-speed rail supporters won a major victory when it was announced that assorted projects would see $8 billion from the Obama infrastructure stimulus bill; however, there is now a growing realization that the country perhaps cannot afford an infrastructure program that could cost between $22 million and $132 million a mile.


Cape Wind – the US’s first offshore wind farm

The Obama administration has shown its commitment to renewable energy by approving the country’s first offshore wind farm. The Cape Wind farm will be situated five miles off the Massachusetts coast and will be capable of producing enough electricity to power 400,000 houses. The project is expected to cost $800 million and will be built by Cape Wind Associates, a private developer. It should create 1000 construction jobs and produce the same amount of energy as a medium-sized coal-fired power plant. Despite its clear green credentials, the project has faced criticism from many groups. Locals have said the turbines will damage the tourist industry in the area and around Nantucket. However, supporters feel it could be a big step in America’s wind energy future, possibly securing similar wind projects proposed for the East Coast and the Great Lakes. Over the past year, US wind generation has increased by 27 percent.


Governors Island park redevelopment

Once it was merely 172 acres of decaying Coast Guard structures and the oldest US military base in the country, but after being sold for $1 in 2003 to the city and state of New York, Governors Island is to be redesigned into a massive urban park. The design, picked by the city, will see a “hybrid of landscape and architecture based around a sinuous set of new paths, watercourses, restaurants, aquaria and even complimentary wooden bicycles.” It will also feature a Marine Exploration Center (complete with coastal plant greenhouse, marine life tank and vertical reef) at the north end of the island overlooking the Statue of Liberty, while the rest of the island will be a “vertical landscape” of man-made mountains incorporating recreational, cultural and educational functions. Designed by Dutch firm West 8 in partnership with Rogers Marvel Architects, Diller Scofidio + Renfro, SMWM and Urban Design+, the development has $41.5 million committed by the city to its first phase, and construction has been penciled in to begin in 2012. The entire project is expected to cost $220 million.




OVERSEAS FOCUS: UNITED ARAB EMIRATES

The rise of the smart home

Despite the downturn, the UAE is set to become a world leader in smart home technology.

Picture the scene. You’re halfway to work and realize you’ve left the lights on, the door unlocked, the windows wide open or the air conditioning on full blast. What are your options? In times past, you’d either have to perform a sharp U-turn to remedy the situation or continue your journey and spend the rest of the day worrying about whether your home would still be there when you got back. But the advent of smart home technology could be about to provide a third, more intelligent option: mobile remote control.

Of course, smart home technology is nothing new. People now expect that if they are spending millions of dollars on a place to live, it should come with the latest in gadgets and comforts, ranging from smart systems that allow owners to control the ambient conditions of their home at the touch of a keypad, to the best in home cinema and entertainment systems. But what is revolutionizing the sector is the ability to perform such tasks remotely, via a mobile application or interface.

For example, leading multi-room entertainment and control provider Opus Technologies has just supplied the first resident on Dubai’s Palm Jumeirah development, Andy Dukes, with a major upgrade to his multi-room entertainment system. As well as incorporating new touchscreens, the latest Opus system benefits from armchair control using an iPhone or iPod touch. “Andy was the first resident to move into Palm Jumeirah Island and opted to have a multi-room audio entertainment system installed throughout his home,” explains Steve Simpson, Opus’ Regional Manager for the GCC region. “It’s fantastic that he’s also become the first to upgrade his property with our state-of-the-art touchscreens in order to enable iPhone or iPod touch control for a wonderful user experience.”
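Under the hood, the remote-control scenario reduces to a phone sending authenticated commands over the internet to a controller inside the home. The Python sketch below shows the shape of such an exchange; the endpoint, device names and token are entirely hypothetical (Opus has not published its interface here), so read it as a generic illustration of the pattern rather than any vendor’s API.

import requests  # generic HTTP client, standing in for a vendor app

CONTROLLER = "https://home-controller.example.com/api"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <owner-token>"}     # placeholder credential

def send_command(device, action, value=None):
    """Ask the home controller to apply one action to one device."""
    payload = {"device": device, "action": action, "value": value}
    resp = requests.post(CONTROLLER + "/command", json=payload,
                         headers=HEADERS, timeout=5)
    resp.raise_for_status()
    return resp.json()

# Halfway to work with everything left on? Three commands fix it remotely.
send_command("hall_lights", "off")
send_command("front_door", "lock")
send_command("air_conditioning", "set_temperature", value=78)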


Smart appliances While most home automation technology is focused on lighting, security and entertainment, smart appliances may be on their way as well. Ideas include:

Trash cans that monitor what you throw away and generate online orders for replacements

Refrigerators that create dinner recipes based on the ingredients stored inside

Washers and dryers that send text message alerts when their cycle has ended

2/6/10 09:47:37


The potential applications are significant. Opus cabling has been installed into the fabric of every one of developer Nakheel’s 1386 prestigious Signature Villas, Garden Villas and Canal Cove Townhouses on the Palm Jumeirah development, and similar solutions are becoming increasingly commonplace in new-build property developments across the Gulf. The cabling or pre-wire of such large-scale developments gives residents the option of having a multi-room control system installed at any time, without the disruption and costs normally associated with running cables and other infrastructure retrospectively. Typically, the only visible evidence of the system is the flush-mounted in-ceiling speakers, the on-wall touchscreens and a stylish remote control. Opus iPhone/iPod touch control offers users full system control from the palm of their hand in any Opus-equipped room, and replicates the intuitive interface of the touchscreen.

“I’m completely thrilled with my Opus system, but when I was informed about an upgrade package offering control by touchscreens and my iPod touch, I simply couldn’t resist,” explains Dukes. “The new touchscreens are so intuitive and an absolute pleasure to use – I particularly enjoy using my iPod touch to control the audio in my villa. The ability to select the room I’m in and then effortlessly control the system is a brilliant feature.”

Apart from the ease and time-saving benefits this gives the homeowner, it also means they can be far more precise about how much lighting or air conditioning is used throughout the day, saving energy as well. Saleem Al Marzouqi, CEO of UT Technology, believes such interfaces will revolutionize the way people in the region think about energy efficiency. “Smart homes are not only about providing technology, they also contribute to conserving the environment and reducing pollution,” he says. “Smart home applications rationalize 30 percent of energy consumption, provide around 50 percent of the space needed for technology infrastructure and give people more control over their appliances. This helps in rationalizing energy consumption by controlling usage of electricity, water and gas.”

Technology companies are betting that consumers and businesses are still willing to pay for smart home technology, even during a global recession. But while the global market for smart home technology is potentially huge, the real opportunities for smart home technology in the Middle East come not from the consumer market but from the business-to-business market. In the region, the biggest potential customers are hotels and utility companies.


Setting up a smart home The idea of a smart home might make you think of George Jetson and his futuristic abode, or maybe Bill Gates, who spent more than $100 million building his smart home. But while once a draw for the tech-savvy or the wealthy, smart homes and home automation are becoming more common. About $14 billion was spent on home networking in 2005, and analysts predict that figure will climb to more than $85 billion by 2011. Here are some examples of smart home products and their functions.

CAMERAS will track your home’s exterior, even if it’s pitch-black outside

VIDEO DOOR PHONE provides more than a doorbell – you get a picture of who’s at the door

MOTION SENSORS will send an alert when there’s motion around your house, and they can even tell the difference between pets and burglars

DOOR HANDLES can open with scanned fingerprints or a four-digit code, eliminating the need to fumble for house keys

AUDIO SYSTEMS distribute the music from your stereo to any room with connected speakers

CHANNEL MODULATORS take any video signal – from a security camera to your favorite television station – and make it viewable on every television in the house

REMOTE CONTROLS, keypads and tabletop controllers are the means of activating the smart home applications. Devices also come with built-in web servers that allow you to access information online – enabling smartphone control




IN FOCUS

New job?

The infrastructure industry is undergoing rapid change, including the job market.

[Chart: direct, indirect and induced jobs across five sectors – hydro energy, high-speed rail, water/wastewater, highways and green energy – on a scale of 0 to 37.5. Source: Iberdrola; California High Speed Rail; National Utility Contractors; Apollo Alliance; and the FHWA]

Power costs

Progress has been made in grid reinforcement since 2005, and substantial investment in generation, transmission and distribution is expected over the next two decades. Demand for electricity has grown by 25 percent since 1990. Public and government opposition and difficulty in the permitting processes are restricting much-needed modernization. Projected electric utility investment needs could be as much as $1.5 trillion by 2030.

Estimated five-year funding requirements for energy:
Total investment needs: $75 billion
Estimated spending: $45.5 billion
Projected shortfall: $29.5 billion

Green buildings use 85 percent less energy and 60 percent less potable water, and send 69 percent less waste to landfill. Buildings use 33 percent of the world’s energy.




The sidewalk divided

Posted on infrastructurist.com by Melissa Lafsky, this photo shows the latest sidewalk division at the intersection of Fifth Avenue and 22nd Street in Manhattan. A good joke by an unknown street artist, it does pose an interesting question: would sidewalks better serve large numbers of pedestrians if they were divided for purposes such as sightseeing versus commuting?

Company Index Q2 2010

Companies in this issue are indexed to the first page of the article in which each is mentioned.

3M 102, 103, OBC
American Congress on Surveying and Mapping (ACSM) 90
American Electric Power 54
American Society of Civil Engineers 108
American Water Works Association (AWWA) 108
Arizona Public Service Co 48
AT&T 4
Bureau of Land Management (BLM) 90
Cape Wind 40
Department of Energy (DOE) 61
Department of Homeland Security 100
Duff & Phelps 61
Duke Energy 66
Florida Power & Light Co 48
Fugro EarthData 94, 95
Google 90
Hach Homeland Security Technologies 13, 112, 113
IBM 80, 120
Inkmaker 99
Intelegard 106, 107
Intergraph 22, 23
Itron IFC, 70, 72
KPMG 58
Los Angeles County Metro 128
Los Angeles Dept. of Water & Power 48
Master Meter 2, 78
Meettheboss.com 98
MRG 18, 19
National Fire Protection Association 104
New Energy Finance 61
Office of Emergency Communications 100
Open Text 11, 64, 65
Optelecom-NKF 9, 86, 89
OSIsoft LLC 15, 72, 75
Ounce Labs 6, 72, 77
Pacific Gas & Electric Co 48
Public Service Co of CO-Xcel Energy 48
Public Service Electric & Gas Co 48
Redflex Traffic Systems 86, 87, IBC
Sacramento Municipal Utility District 48
Salt River Project 48
San Diego Gas & Electric Co 48
Solar Electric Power Association (SEPA) 48
Solar Reserve 52, 53
Southern California Edison 48
Sulphur Springs Valley Electric 48
The Sanborn Map Company, Inc. 96, 97
United Technologies Corporation 52
Upsolar America 46, 47
US Geological Survey 32
Water Environment Federation 108
Water Replenishment District of Southern California 32




COVER STORY

On deadly ground As California braces itself for the next Big One, Huw Thomas assesses the infrastructure challenges for a state that is always on the move.




“San Francisco, at the present time, is like the crater of a volcano, around which are camped tens of thousands of refugees. At the Presidio alone are at least twenty thousand. All the surrounding cities and towns are jammed with the homeless ones, where they are being cared for by the relief committees. The refugees were carried free by the railroads to any point they wished to go, and it is estimated that over one hundred thousand people have left the peninsula on which San Francisco stood”

These were the words of writer Jack London in the wake of the earthquake that struck the Californian city in 1906. A little after five in the morning on 18 April, the San Andreas Fault sprang into angry life, shaking residents from their beds, tearing up roads and rattling buildings to rubble. Though the quake itself lasted only around a minute, its effects would be felt for years to come. The fires that raged through San Francisco’s shattered streets after the ground had become still obliterated much of what the quake did not. A comparable level of destruction would not be wrought on an American city until Hurricane Katrina hit New Orleans nearly 100 years later.

In truth, describing the event as the San Francisco Earthquake is misleading, undercutting its true scale. Between San Juan Bautista and Cape Mendocino, 296 miles of the northern San Andreas Fault ruptured, with the vibrations being felt as far away as Oregon and Los Angeles. Though California residents have become used to dealing with their state’s violent geology in the intervening years, there has yet to be a repeat of the 1906 episode. Rather than being a cause for celebration, the long period since San Andreas’ last major movement simply means the inevitable




arrival of the next ‘Big One’ is creeping ever closer. The latest scare came on 4 April 2010, when a 7.2 magnitude quake in Baja California, Mexico shook homes in Las Vegas and Los Angeles, reminding even the most sanguine southern Californians of the Sword of Damocles hanging over their heads. Or rather, lying beneath their feet.

“The really big earthquake is absolutely inevitable,” says Dr. Lucile Jones, a seismologist with the US Geological Survey and Chief Scientist for the Multi Hazards Demonstration Project in Southern California. “The plate tectonics are not going to stop. The San Andreas Fault is going to be producing earthquakes close to magnitude 8, maybe over. The big problem is the difference between geologic and human timescales. We know the earthquake is absolutely inevitable, but there’s a significant possibility it might not show up for 100 years, and that doesn’t feel inevitable to most human beings.”

But with every year that passes, the inevitability of another major quake becomes more pressing. “The average time between earthquakes on most parts of the San Andreas Fault is somewhere between 100 and 150 years,” Jones continues. “There’s no place where we’ve been able to get recurrence intervals greater than 150 years on the San Andreas. The last earthquake on the southern San Andreas Fault was 153 years ago. On some sections of the southern San Andreas Fault it’s been as much as 300 years.”

Thursday, 10:00 am, 13 November 2008 (...the quake begins...) The San Andreas Fault suddenly awakens at Bombay Beach, northeast of the Salton Sea, and the rupture shoots northwest along the fault at two miles per second, sending seismic energy waves out in all directions. In an instant, the ground on the two sides of the fault is offset nearly 44 feet.

The description above is drawn not from real events but from a document entitled The ShakeOut Earthquake Scenario – A Story Southern Californians Are Writing. It was put together to accompany a multi-agency exercise designed to prepare for the possible after-effects of a major quake on the southern section of the San Andreas Fault (see The Great ShakeOut) and paints a troubling picture of the minutes, hours and days following a major tremor. While improving building codes and remedial work on older structures would prevent damage on the scale witnessed in the quake of 1906, the destruction would nonetheless be significant. Emergency services would have difficulty responding and many people would be trapped in collapsed buildings, injured, or both. As with the San Francisco quake,


Q&A: Arch enemy

A bridge is one of the worst places to be when a quake strikes. Mark DeSio, Deputy Director for External Affairs at Caltrans, explains how seismic techniques are employed to keep transportation flowing when the whole state is moving.

What techniques is Caltrans employing to protect transportation infrastructure from the effect of earthquakes?
Caltrans utilizes tightly spaced rebar wrapped around the column to increase confinement and make columns more ductile. Bridges are designed so that damage occurs in components that can continue to support the bridge after an earthquake. Selected bridges, particularly long span bridges crossing the San Francisco Bay such as the new San Francisco-Oakland Bay Bridge, Benicia-Martinez and Carquinez, have been designed to provide for a higher level of post-earthquake functionality. Bridges are made continuous, or enlarged bearing areas are provided, to prevent loss of vertical support.

In addition, the Caltrans Seismic Retrofit Program has retrofitted older bridges to improve their seismic performance. This is achieved by placing steel casings around columns to increase their ductility, while pipe seat extenders or cable restrainers are used to keep bridge joints tied together. Caltrans also utilizes an independent Seismic Advisory Board to advise the Department on seismic policy and technical practices. Following the Loma Prieta Earthquake we established a seismic research program to improve the performance of bridges in a major seismic event. The results of this program are used to update Caltrans seismic design standards.

What role does innovative ‘smart’ infrastructure technology have to play in protecting California’s transportation systems and their users?
Caltrans has instrumented approximately 70 bridges in locations throughout the state to measure their response in an earthquake in order to improve seismic performance. In collaboration with the US Geological Survey and the California Geological Survey, we use earthquake sensors located around the state to collect earthquake strong motion data and automatically send out highly detailed email notifications showing the level of ground shaking.

New infrastructure items can obviously be built with the latest technologies, but what of older existing structures? Can they be effectively retrofitted to bring them up to 21st century standards?
Seismic retrofit techniques such as the use of steel casings, pipe seat extenders, cable hinge restrainers and other measures have been shown to significantly improve the seismic performance of older bridges. These techniques have been validated through research performed at many universities. We’ve had some positive real-world validation too, as retrofitted bridges in the areas of strongest shaking in the Northridge Earthquake performed well.

What emergency response plans does California DOT have in place to cope in the immediate wake of a major quake?



When an earthquake of a certain magnitude hits a portion of the state, it triggers the Caltrans ShakeCast system. ShakeCast is a web-based application that automatically retrieves measured earthquake shaking data from the US Geological Survey, analyzes it against individual bridge performance characteristics, and generates inspection prioritization emails and other web-based products for emergency responders within minutes of an event. Maintenance field personnel and engineers are deployed for the initial response and conduct preliminary surveys, while bridge engineers assess whether they should deploy, or not, based on the event’s magnitude and assessments conducted in the impacted areas.

If necessary, Caltrans will deploy teams of bridge engineers to the affected area. These engineers check bridges in the area for damage that could compromise the structural integrity of the bridge. If significant damage is found, the bridge is temporarily closed while the damaged structure is evaluated. Following the evaluation, a decision is made whether to re-open the bridge. If the bridge is to remain closed, repair or shoring options are then considered.

In addition, we exercise on a continuous basis, both internally and externally, with the different hazard types that may impact the transportation system. Best management practices are developed based on after-action reports from exercises and actual events, and incorporated into our emergency operations plans. We also coordinate our response efforts with the California Emergency Management Agency, federal agencies such as the Federal Emergency Management Agency, and our local partners.

Aging transportation infrastructure is a problem for many states. How does the frequent seismic activity create additional difficulties for California in terms of allocating resources?
Caltrans has provided the resources necessary for the Seismic Retrofit Program, which is nearly complete. There is an ongoing need to provide funds for the needs of our existing bridges, including

for their seismic needs. This could include bearings and dampers that wear out, bridges that are modified requiring new seismic retrofit measures or new findings that require bridge seismic needs to be assessed. Associated Press reports suggest that the recent quake on the Mexico-California border caused as much as $100 million in damage.What are the likely impacts in terms of transportation infrastructure of an inevitably far bigger far bigger quake that is widely expected to strike in the not too distant future? As reported by the Seismic Advisory Board in the report Race to Seismic Safety in a major earthquake “standard bridges near the epicenter will be sufficiently damaged as to be out of service for a period of time, and some may require replacement.” Bridges further from the fault will have lesser damage and are expected to remain open or be returned to service soon after the event.
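For readers curious how a system like ShakeCast turns raw shaking data into a prioritized inspection list, the core logic can be sketched in a few lines of Python. This is a minimal illustration only: the bridge names, fragility thresholds and station readings below are invented, not Caltrans data, and the real system weighs far more than peak ground acceleration.

```python
from dataclasses import dataclass

@dataclass
class Bridge:
    name: str
    # Hypothetical fragility threshold: peak ground acceleration (g)
    # above which the structure is assumed to need inspection.
    pga_threshold: float

def prioritize_inspections(bridges, measured_pga):
    """Rank bridges by how far measured shaking exceeds each bridge's
    threshold, mimicking ShakeCast-style inspection prioritization."""
    flagged = []
    for bridge in bridges:
        pga = measured_pga.get(bridge.name, 0.0)
        exceedance = pga / bridge.pga_threshold
        if exceedance >= 1.0:
            flagged.append((exceedance, bridge.name))
    # Highest exceedance first: these bridges get inspected first.
    return [name for _, name in sorted(flagged, reverse=True)]

# Invented example data standing in for USGS station readings.
bridges = [Bridge("Route 14 Overcrossing", 0.35),
           Bridge("Antelope Valley Viaduct", 0.50)]
shaking = {"Route 14 Overcrossing": 0.42, "Antelope Valley Viaduct": 0.31}
print(prioritize_inspections(bridges, shaking))  # ['Route 14 Overcrossing']
```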



SAN ANDREAS: A history of violence

Fort Tejon, 9 January 1857
Rupture length: 225 miles | Magnitude: 7.9 | Maximum displacement: 30 feet | Depth: Less than 6 miles
Fact: As powerful as the quake that ravaged San Francisco in 1906; aftershocks were still being felt at Fort Tejon over a year later.

San Francisco, 18 April 1906
Rupture length: 296 miles | Magnitude: 7.9 | Maximum displacement: 28 feet | Depth: 5 miles
Fact: With an estimated 3000 deaths and as many as 300,000 left homeless, this was the worst natural disaster ever to befall California.

Loma Prieta, 17 October 1989
Rupture length: 25 miles (subterranean) | Magnitude: 6.9 | Maximum displacement: n/a | Depth: 11 miles
Fact: The quake caused $18 billion in damage to transportation infrastructure and disrupted the Major League Baseball season for 10 days.

Parkfield, 28 September 2004
Rupture length: n/a | Magnitude: 6.0 | Maximum displacement: n/a | Depth: 4.9 miles
Fact: The town of Parkfield sits directly on top of the fault. It experiences a magnitude 6 quake on average every 22 years.

[Map: the San Andreas Fault traced along the California coast, with the four quake sites and cities marked from Crescent City, Eureka and Point Delgada in the north through San Francisco, Santa Cruz, Parkfield and Fort Tejon to Los Angeles, Palm Springs, San Diego and Brawley in the south.]


EARTHQUAKE LEAD:may10

2/6/10

09:53

Page 37

fire would also be a major issue. Ruptured gas mains, snapped power cables and damaged fuel stations could all burst into flame, a danger compounded by the likely disruption to water supplies.

Along with its volatility, the San Andreas Fault is also responsible for other geographic features that make responding to the next Big One a particular challenge. Over millions of years, the violent tectonic action along the coast has pushed up mountains, cutting Southern California off from the rest of the mainland USA. Major transportation infrastructure therefore has to be routed through only a few corridors, significantly increasing the risk that it will be severed when a quake hits. This creates major knock-on effects even for communities that escape the quake's immediate attentions. "The ports of Los Angeles and Long Beach we actually don't think are going to be badly damaged by a San Andreas earthquake, because they're far enough away, but they're shipping out to the rest of the country," says Jones. "They have 45 percent of the freight, the sea traffic that comes into the United States and goes out to the rest of the country on railway lines, the closest of which crosses the San Andreas Fault in the area of this earthquake."

The restricted access routes into Southern California mean that it is not just transportation that is put at risk. Gas, electricity, water and communication lines also have to squeeze through comparatively narrow entry points, greatly increasing their vulnerability. "There are lifeline corridors that cross those mountains," Jones continues. "There are only something like five places where everything comes in. Where we have this railway line, we also have two gasoline pipelines that go from the refineries of Los Angeles out to Arizona and Nevada, as well as a natural gas pipeline that brings natural gas into Southern California. Those three pipelines cross each other basically at the San Andreas Fault, and if they rupture there and they explode into each other, there's going to be a huge fire. That's also then where Interstate 15 and the major railway line are."

In addition, there is a major electrical connection bringing in power from hydroelectric dams on the Colorado River that crosses the fault near Palm Springs. Were that to be severed, the effects would be felt further afield than just Southern California. "Once you break that connection and separate the generation from the load, the system will start to shut down to protect itself," Jones explains. "About thirty seconds into the earthquake, that rupture will break through the line there. In another fifty seconds, all of western North America will be dark." While those on the western side of the fault will likely have power restored the same day, the same cannot be said for the rest of Southern California. The only source of electricity in the region capable of cold-starting the grid is the nuclear power plant at San Onofre, which would have to go through 72 hours of safety checks before it could be fired up. Even then, power would be restored only intermittently, with rolling light-ups supplying electricity to different communities for short two-hour periods.
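The "rolling light-ups" Jones describes are, at heart, a round-robin rationing schedule: each community takes the scarce supply for a fixed window in turn. A toy sketch of that logic, using invented community names and the two-hour window quoted above:

```python
def rolling_lightups(communities, window_hours=2, start_hour=0):
    """Round-robin rationing: each community receives the scarce
    supply for one fixed window in turn, then hands it on."""
    schedule = []
    hour = start_hour
    for community in communities:
        schedule.append((community, hour % 24, (hour + window_hours) % 24))
        hour += window_hours
    return schedule

# Community names are placeholders, not part of any real plan.
for town, on, off in rolling_lightups(["Community A", "Community B", "Community C"]):
    print(f"{town}: power {on:02d}:00-{off:02d}:00")
```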

Friday 10:00 am, 14 November 2008 (...24 hours after the quake began...) Utility companies are working around the clock to restore services, yet most people in the areas of heaviest shaking lack electricity, natural gas, and water. Utility workers, like transportation crews, medical staff, and emergency responders, push themselves to do their crucial jobs despite concerns about their own families.

Water is another major issue in post-quake Southern California. Quite aside from the immediate challenge of combating fires, the long-term implications of a disrupted supply are potentially even more damaging. "What the earthquake could do is rupture pipelines and damage surface reservoirs," says Ted Johnson, Chief Hydrogeologist at the Water Replenishment District of Southern California. "It could knock out a lot of the surface water infrastructure, and there's just not a lot of spare parts lying around. If all those pipelines get damaged they just can't be replaced within a matter of days. It could take six months to manufacture them because they're not just lying in stock in a warehouse somewhere."

It is here that the Water Replenishment District comes in. On a day-to-day basis, the organization's task is simply to top up the water supplies piped into the region with much cheaper local groundwater. In normal conditions, the mix is about 60 percent surface water to 40 percent groundwater. If a major quake were to hit, that figure could shift to 100 percent groundwater for a considerable time. "Up to three years is the most we've seen," says Johnson. "Six months would be better. The groundwater basin could certainly last for six months, a year definitely, but the computer modeling shows it could handle the demand for up to three years provided there was better water conservation. People are going to really have to cut back on their water use."

Ensuring that the groundwater basins remain full enough to meet both daily needs and emergency post-quake requirements is a constant struggle in such a characteristically arid area. "We're looking for more supplies of water to refill the basins," Johnson continues. "Using recycled wastewater is one of the main things that we're doing more and more of these days. The Orange County Water District has a big project that takes treated sewage water and puts it underground. We're doing a lot of the same thing to reuse our water instead of just letting it go to waste in the ocean, and it's that water that is refilling our aquifers. It sits underground for a year or more before it's pumped out again, so it's further purified down there. We're also trying to capture more storm water."

But even if the reserves are there, getting them to where they are needed will be no easy task in an earthquake-ravaged state. "In many cases the water systems are more than 100 years old," says Lucile Jones. "The oldest ones are built out of cast iron, which is an extremely brittle material. Then they moved from that to concrete, another very brittle material." Though the last 20 or 30 years have seen pipe systems built out of more ductile iron, there remains a huge amount of critical infrastructure that is virtually guaranteed to fail in an earthquake.

The Baja California quake in April offers a smaller-scale example of what could be expected from a more severe tremor. "The California Water Emergency Network got activated with this Baja earthquake because there was a lot of damage to the water systems in Imperial County, in the southernmost part of California right at the border," Jones explains. "The damage is quite extensive: to water treatment plants, to the pipe delivery system, to sewage systems. There's also the problem of cross-contamination when you have sewage pipes and water pipes breaking." Scale the recent quake up to one that ruptures along 200 miles of the San Andreas Fault and your




problems are much bigger. Water systems would be shattered across a huge area of Southern California, and putting them back together would be a massive logistical, construction and manufacturing challenge. In many cases repair would not be cost-effective, and the only option would be to rebuild from the ground up.

13 May, 2009 (...6 months later...) Water is back in faucets, sinks, and air conditioning units across the region, but it is too late for many businesses, especially smaller businesses that lacked the resources to wait out the bad times. Businesses forced to close have a domino effect, and as the chances diminish for regaining jobs or finding new ones, more and more people are struggling to rebuild their lives.

Aside from the immediate human cost of a major quake along the San Andreas Fault, it is the long-term economic implications of infrastructure damage that are most worrying. Prior to 1906, San Francisco was the largest city on the west coast, a massive business hub and the main gateway to the Pacific and Asia. While the city would be rebuilt, the long period of reconstruction saw Los Angeles emerge as the leading Californian city, a position it holds to this day. Though property damage would likely be less severe today because of improved building techniques and codes, the fact remains that businesses unable to secure regular access to power and water in the weeks and months following a quake would have a hard time staying afloat. People and goods would be unable to move freely, all of which contributes to cascading economic consequences. The very real fear is that by the time basic services were running normally, it would already be too late for countless organizations, families and individuals.

The quake cannot be stopped and Los Angeles isn't going anywhere, so what can be done? "One of the ways we could make a big difference would be to develop an engineering approach to the power and pipe lines that could accommodate thirty feet of motion and still stay up," says Jones. "When they built the Trans-Alaska Pipeline, they crossed a really big fault called the Denali Fault. They built into it a Teflon slider system with joints that would allow the pipe to stay intact for up to twenty feet of offset. In 2002 we had a bad earthquake with eighteen feet of offset, and the system worked. The pipe did not break." However, current economic realities make it cheaper for the utilities that own the pipes and power lines to repair something broken than to build something that won't break in the first place.

Continual improvement of building codes and the creation of new techniques to make structures more resilient are also essential. "The earthquakes are so infrequent," says Ted Johnson. "They could be ten years apart. You can instruct and teach people, especially after one has happened, but you can't keep it in the forefront every day, and people tend to get lazy about their planning, so it is the responsibility of government agencies and teachers to keep the word out." Efforts like the Great ShakeOut represent perhaps the best approach, bringing together everybody living in the shadow of the San Andreas to promote scientific and rational responses to a threat that virtually defies human understanding. "We need to get people to believe it without paralyzing them to the point that they don't want to think about it," says Lucile Jones. California was shaking long before we arrived and it will keep shaking long after we're gone. We just have to find the best possible way to hang on.




"We need to get people to believe in the danger without paralyzing them to the point that they don't want to think about it" Dr Lucile Jones, US Geological Survey

The Great ShakeOut

Now an annual event taking place on the third Thursday in October, the Great California ShakeOut is the largest preparedness drill in America. Involving government agencies, businesses, schools and individual families, it aims to provide the tools needed to prepare for and react to the earthquakes that will inevitably continue to plague the region. The most recent event took place in 2009 and involved more than 6.9 million people across the state practicing evacuations and techniques such as 'drop, cover and hold on', designed to keep people as safe as possible during a quake. The event also raises awareness of the emergency provisions needed for survival in the aftermath, and of ways in which homes and businesses can be prepared to minimize damage. The drill is backed by a huge amount of research into the science of earthquakes, their social and economic impacts, and what planners can do to best ready themselves. While earthquakes are far too unpredictable ever to be fully guarded against, the Great ShakeOut aims to show that cooperation and education can at least make the best of a bad situation and prevent a disaster from becoming a catastrophe.



WIND POWER

ON THE HORIZON

With the United States' first offshore wind farm edging closer to full approval, opposition is becoming fiercer in a bid to derail it. US Infrastructure delves into the Cape Wind project to find out more.




From the ancient Chinese harnessing the wind for irrigation to Columbus filling his sails of exploration, wind has played a pivotal role in advancing civilization for as long as records exist. Yet the modern wind turbine only emerged in the 1980s, and ongoing research is only now beginning to hint at how to harness its true potential. Wind farms have spread across the world, and with them our understanding of wind power has deepened – but for that to continue, those working in the sector must keep pushing the envelope of what is possible through experience and innovation. Perhaps the most current embodiment of this sentiment is a recently approved offshore wind farm in Nantucket Sound off Cape Cod, Massachusetts. Proposed by Cape Wind Associates and developed by Energy Management Inc., the Cape Wind Project aims to become the first offshore wind energy project in US coastal waters – and given its recent approval, it looks as though that dream could become a reality in the very near future.

Made to measure

Nantucket before the Cape Wind Project

The footprint of the proposed project would see 130 horizontal-axis wind turbines spread across 24 square miles of shallow water toward the center of Nantucket Sound, on an area known as Horseshoe Shoal, sited between four and 11 miles offshore depending on the point on the shoreline. The turbines would be aligned in a grid of parallel rows, with a staggering six football fields between turbines and a further three fields between rows. To keep each turbine secure, it would be individually mounted on a single monopole foundation: a hollow steel pipe driven 80 feet into the sandy seabed. To minimize the visual impact on the landscape, the turbines will also be strategically painted, and from the shoreline they will appear to rise a mere inch and a half above the horizon.

Appearance aside, the more important statistics concern what the wind farm could deliver in terms of energy. Against an expected maximum production of 454 megawatts, Cape Wind projects an average production of 170 megawatts – almost 75 percent of the 230-megawatt average electricity demand for Cape Cod and the surrounding islands of Martha's Vineyard and Nantucket. With the manager of the electric grid, the Independent System Operator of New England, stating that additional electricity generation



sources were needed as far back as 2006, it came as little surprise when, in May of this year, Cape Wind announced a power purchase agreement with National Grid to sell half of the project's output – roughly 750 million kWh per year – at a price double that of current retail rates. However, the deal is still subject to approval by the state government, which needs to consider exactly how the electricity will be delivered to homes, schools and businesses throughout Cape Cod. Currently, Cape Wind proposes to connect into the electric grid at a substation through underground cables; the power will then follow the path of least resistance and be used by the electric consumers closest to the source, typically on the Cape and Islands.

Tom King, President of National Grid, said: "It is truly fitting that the next milestone in our nation's clean energy revolution is taking place in the Bay State and New England. We believe this project will provide long-term economic and environmental benefits here, throughout the region and across the nation. We absolutely must develop our homegrown renewable energy resources if we are to meet state and federal renewable goals, secure our energy future and seize the leadership position in the global clean energy economy."

And with 45 percent of the Cape region's electricity coming from the nearby Canal Power Plant in Sandwich, which burns bunker oil and natural gas, the Cape Wind proposal is distinctive in that it would directly offset petroleum combustion – unlike in the majority of the US, where electrical power generation from oil is rare and coal is the fuel of choice. While it is impossible to know in advance the exact combination of fuels that Cape Wind will displace, it is believed that the quantity of wind-generated electricity produced annually will be the rough equivalent of 113 million gallons of oil or 10 billion cubic feet of natural gas.

It's all about the wind

Provided everything goes to plan – regulations passed, finances secured, turbines up and good to go – you only need one thing to kick-start a wind farm: wind. Luckily for Cape Wind, its proposed site at Horseshoe Shoal is consistently windy, allowing the turbines to rotate and produce power about 86 percent of the time. For the remaining 14 percent, when wind speeds are too low, existing sources will continue to supply the necessary energy. On those wonderfully windy days, the turbines will react dynamically to their surroundings, sensing changes in wind direction and adjusting accordingly. To account for hurricanes and extreme weather, the turbines are rated to survive sustained winds of 150 mph – stronger than any hurricane the region experienced in the 20th century. The nacelle, which sits on top of the tower and connects the blades, is capable of making a full 360-degree rotation. Not bad for a 258-foot wind turbine.
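The output figures quoted above are easy to sanity-check: average output divided by nameplate capacity gives the capacity factor, and average output divided by regional demand gives the share of load served. A quick check of the article's numbers, which also recovers the roughly 750 million kWh a year sold to National Grid as half of total annual output:

```python
nameplate_mw = 454        # expected maximum production
average_mw = 170          # projected average production
regional_demand_mw = 230  # average demand, Cape Cod and islands

capacity_factor = average_mw / nameplate_mw
share_of_demand = average_mw / regional_demand_mw
annual_gwh = average_mw * 8760 / 1000  # MW x hours per year, in GWh

print(f"Capacity factor: {capacity_factor:.0%}")           # ~37%
print(f"Share of regional demand: {share_of_demand:.0%}")  # ~74%
print(f"Annual output: {annual_gwh:.0f} GWh")              # ~1,489 GWh; half is ~745 GWh
```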

Peter Larkin, Massachusetts State Representative for the Third Berkshire District, stated: "Cape Wind is at the forefront in the quest for creating new renewable sources of energy. Massachusetts and the Cape Cod region have the opportunity to take a substantial step forward in limiting the region's dependence on natural gas and the destructive environmental impacts of fossil fuel fired facilities. The rest of the country will be looking on as Cape Wind provides a viable, environmentally safe and economically enhancing solution to the problem of the ever-increasing needs of the Cape Cod region."



Despite the revelation that electricity prices from the wind farm will start at double those of standard electricity, in the long run it is predicted that these prices will stabilize and potentially even reduce the price of electricity through three mechanisms. Firstly, the project will reduce the clearing price for electricity in the New England spot market by reducing operations of the region's most expensive power plants – lowering electricity prices in New England by a much-needed $25 million per year. Secondly, it aims to reduce the implementation costs of the Renewable Portfolio Standard to Massachusetts' electricity consumers by increasing the supply of renewable energy certificates. Finally, Cape Wind will pursue long-term power contracts that lock in a fixed price for electricity for a term of 10 or more years. In turn, this would provide electricity consumers purchasing Cape Wind energy with far greater electric price stability and certainty than is typically available to them.

If there is to be any environmental impact, it would show itself in the construction phase of the project. Although there will undoubtedly be temporary impacts to the seabed during construction, Cape Wind will be using environmentally friendly construction techniques to minimize them. The monopole foundations will be filled with displaced sand, negating the need for any excavated sediments to be disposed of. The electrical cables needed to transmit the electricity generated will be buried a minimum of six feet below the seabed using a high-speed water jet that temporarily liquefies the area of the seabed where the cable is laid. This method will minimize disruption to the seabed and, provided nothing goes awry, the water will return to its original opacity within 24 hours.

The effect on marine and coastal wildlife would arguably be minimal to none. With care being taken during the construction phase to ensure that no marine mammals are present in the immediate vicinity during pile-driving activities, the wildlife should remain unaffected. Ironically, perhaps the biggest threat facing marine mammals generally is a steep decline in ocean plankton populations as a result of global warming brought on by fossil-fueled greenhouse gas emissions – a problem the wind farm could help reduce in the long term.

Moving from below the surface to above it, Cape Wind has also taken every possible precaution to ensure that the region's bird population avoids any impact. Improvements in wind turbine design – including a much slower rate of blade rotation, and monopole bases instead of lattice towers that invite perching – should help reduce bird mortality while also increasing efficiency.

Given all of the above – and taking into consideration the fact that Cape Wind would indeed be the first offshore wind farm in the US – it is in every way, shape and form a pioneering project. It is backed by the Conservation Law Foundation, which stated: "The Cape Wind Energy Project provides a chance to begin this process, providing the region's first major source of wind energy-based power production and the opportunity to obtain experience that will allow the region to rapidly build the full portfolio of energy facilities that it needs. If built, the Cape Wind Energy Project should be both a rich source of clean energy and a source of essential new information for guiding future projects." Further encouragement has come from Greenpeace USA, who were quick to assert: "From local jobs to clean energy, this project is right for America and right for the Cape. In years to come, the people of Massachusetts will be proud of this contribution to the clean energy revolution."

Cape Wind could reduce carbon dioxide emissions by about 734,000 tons

Opportunity

With new business comes the possibility of new jobs – and in the assembly, staging and ocean construction stage of the project, Cape Wind hopes to create between 600 and 1000 of them; once fully operational, the farm will support around 150 permanent jobs, with at least 55 based on Cape Cod itself. Backing up this expectation is Ernest Correia, local representative for the International Brotherhood of Electrical Workers, who confirmed: "If this develops, I can promise that workers from the region will construct this project, generating much needed jobs for construction workers from the communities of Cape Cod and south-eastern Massachusetts. The economy of the region will benefit from local workers spending their paychecks in the area. In the long term we will be generating electricity without harmful health issues from the wind farm."

"Massachusetts and the Cape Cod region have the opportunity to take a substantial step forward in limiting the region's dependence on natural gas and the destructive environmental impacts of fossil fuel fired facilities."

And with human health impacts kept to a minimum, so too are environmental ones. Wind farms are by their very nature clean sources of energy: they produce no air pollutants, and because their output displaces fossil fuel generation, they reduce the amount of pollution entering the air. In addition, it is predicted that Cape Wind would reduce carbon dioxide emissions in New England by about 734,000 tons – all of which helps the climate change effort.

Controversy

However, the words 'if' and 'should' have been bandied around for some time in relation to Cape Wind because, despite offering a clean, renewable source of energy, the project has met strong opposition. Although all 12 lawsuits filed against it have been thrown out by federal and state judges, the opposition looks set to remain committed until the very end. The most recent suit filed alleged



that Cape Wind's Final Environmental Impact Report (FEIR) did not comply with the Massachusetts Environmental Policy Act (MEPA). Ultimately, the ruling sided with Cape Wind, finding that the analysis and conclusions in the FEIR certificate were "logical, rational and not arbitrary and capricious." Even so, heavy criticism from the Alliance to Protect Nantucket Sound continues, with emphasis on the fact that Nantucket Sound is known worldwide for its wildlife and natural beauty; a large majority of those supporting the opposition movement come from the surrounding local areas. The Alliance proposes a plethora of reasons to prevent the project going ahead, many directly contradicting Cape Wind's stated intention to preserve natural environments. Among them, the Alliance outlines the high potential for Cape Wind to fall foul of federal laws including the Marine Mammal Protection Act (MMPA) and the Migratory Bird Treaty Act (MBTA), under both of which the project would be in violation if the wind farm harmed the area's natural inhabitants. Numerous state and federal agencies have also reportedly cautioned that the Cape Wind project must not move ahead without proper analysis or regulatory oversight, with the Massachusetts Division of Marine Fisheries anticipating "direct negative impacts to fisheries resources and habitats".

On top of this is the potential effect the wind farm could have on the region's fishermen. In a statement on its website, the Alliance to Protect Nantucket Sound said: "If Cape Wind's 25 square mile grid were constructed, commercial fishermen, who rely on the proposed site for more than half their catch, say they would be restricted in their access to fertile fishing waters. The Massachusetts Fishermen's Partnership, which represents 18 commercial fishing organizations, says that navigation of

mobile fishing gear between the 130 towers would be hazardous or impossible and, in short, Cape Wind would displace commercial fishing from Nantucket Sound." While some of the objections the Alliance to Protect Nantucket Sound offers up are perhaps slightly sensational, they do highlight genuine areas of contention, such as the effects on the environment and the surrounding economy. If the project is to continue through its final stages, it must be fully scrutinized and able to prove itself viable.

Another area of disagreement, raised by both the Alliance and the Cape Cod Commission, is the effect the location of Cape Wind could have on air and sea navigation. According to the Alliance to Protect Nantucket Sound, Cape Wind's proposed wind turbines would present navigational obstacles for passing ferry and shipping lanes, as well as for low-flying aircraft. In addition, public access restrictions could prohibit any anchoring or gear fishing in the project area – again impacting the local fishing trade as well as potential tourism interests.

Given the amount of time the Alliance and Clean Power Now, the main group supporting the Cape Wind project, have spent in the arena of debate, it is perhaps no surprise that the argument has spilled over into the political realm in recent years, becoming a widely discussed issue in the 2006 race for Governor of Massachusetts. The eventual winner, Democrat Deval Patrick, supported the project; his Republican opponent, former Lieutenant Governor Kerry Healey, was strictly opposed.

If these factors weren't enough, the Alliance also uses the higher price of 'clean' energy to champion its cause in the simplest terms, citing Commissioner William Doherty's statement that: "According to the testimony of the Cape Wind people, [the project] will not lower the electric bills of the Cape Cod consumers." While it must be admitted that the cost of electricity from Cape Wind would begin at double that of traditional electricity, this doesn't take into consideration the moral and carbon-friendly aspects of the project; it also fails to consider the introduction of further wind farms in years to come, alongside the reduction in market costs and the introduction of 'renewable credits'.

Dr. Eric Chivian, Director of the Center for Health and the Global Environment at Harvard Medical School and winner of the Nobel Peace Prize: "As a physician who worked on the Cape for many years in the 1970s – seeing patients and leading various environmental initiatives – and as the Director of the Center for Health and the Global Environment at Harvard Medical School, I am alarmed at how little understanding there seems to be about the human risks of global warming; how the Cape Wind project has been so irresponsibly misrepresented, and how, as a result, people on the Cape and Islands may pass up an opportunity that would help preserve both their environment and their health. I am particularly alarmed at how some politicians, celebrities, business people, and environmentalists who should know better are placing their political and parochial interests above those of the common good."

Ultimately, the battle for Nantucket Sound and the future success of Cape Wind will depend on the support the project receives and the viability it can prove. Its opponents, while largely providing valid arguments, seem to forget that the renewable movement has moved off the horizon and is floating towards our shores out of necessity. Fossil fuels are a finite resource.
Sooner or later, projects such as Cape Wind will become a necessity if Americans want to keep living in the style to which they have become accustomed.





INDUSTRY INSIGHT

Sunny times ahead

Troy Dalbey explains how the solar power industry is evolving.

In order to ensure the best price/quality ratio on the market, Upsolar designs, creates and produces high-quality modules through first-class, vertically integrated manufacturing partners. Upsolar controls each stage of the production flow and continues its research and development efforts to ensure quality. The company's performance has been impressive, with shipments of 95 MW in 2009, sales of close to $172 million, and estimates of 250 MW in 2010.

The cornerstone of Upsolar's success has been the rigorous quality-control management protocol we have developed. Combined with our in-house R&D center, industry-standard warranties, a performance guarantee backed by innovative insurance coverage and an international presence, Upsolar can guarantee its customer base an extremely reliable PV module, providing protection from power loss or supply shortages. Upsolar tests and selects the highest quality components from the top photovoltaic materials suppliers to produce dependable, top-quality PV modules. To ensure that Upsolar modules are built to last, technicians at our R&D center constantly conduct tests to verify that our modules can withstand the assault of time and weather for more than 25 years.

Upsolar strictly executes its unique online quality control protocol by rigorously checking the conformity of materials. We employ our own quality control team of highly qualified technicians who provide a permanent presence at our partners' facilities. The Upsolar online quality control protocol is under permanent audit by a third-party QC provider, Bureau Veritas, and every shipment of Upsolar product is tracked and traced via an inspection report witnessed by Bureau Veritas. Upsolar also implements an offline quality control protocol, conducting tests on components, assembled parts and final products. This process of batch-testing – randomly, and at any change of specifications, components or assembly processes – ensures that Upsolar produces the best quality modules while maintaining continuous production flow. This combination of online QC management, offline QC testing and the services of Bureau Veritas has led to an efficient module manufacturing process and an extremely reliable product.

But Upsolar's high-quality products could not be produced without first-class equipment at our production partners' manufacturing facilities, from suppliers such as Roth & Rau, Applied Materials/Baccini, Spire, ASYS and NPC. Upsolar also maintains strong relationships with best-of-breed components and materials suppliers such as Tyco, MC, Saint-Gobain, Isovolta, Bridgestone and Dow Corning. This combination of top-tier manufacturing equipment and module components enables Upsolar to produce some of the highest quality, most dependable PV modules in the world, protecting Upsolar customers from substantial output power loss during the 25-year life of their modules.

Upsolar works with production platforms that are vertically integrated. Our partners process raw polysilicon, ingots, wafers and cells, and assemble the modules Upsolar develops. This not only increases the level of quality control but reduces the number of raw material suppliers, substantially reducing the likelihood of supply shortages. By controlling the production process from the initial processing of raw polysilicon through module production, Upsolar can ensure its modules are high-quality, extremely dependable and delivered to our global customer base in a timely manner. Control of the process also enables Upsolar to increase annual production capacity as needed; global production capacity will reach 450 MW in the coming years.

In order to better serve our global customer base, Upsolar has established headquarters on three continents – in Asia, Europe and America – and also maintains local offices in Italy, Greece, Germany and the Czech Republic. To satisfy increasing global demand for Upsolar PV modules, we have also opened logistics centers and established local customer service and technical teams to provide all the necessary support to customers in their projects across the globe.

“Our partners process raw polysilicon, ingots, wafers, cells and assemble the modules Upsolar develops”

Troy Dalbey, National Sales Manager of Upsolar America, was hired in 2009 to open the company’s North American headquarters and launch the US-based subsidiary, Upsolar America, Inc. He is responsible for the company’s strategic growth in the North American PV market.





RENEWABLE POWER

A bright future

The top 10 solar utilities have expanded their solar integration by 66 percent in the last 12 months. US Infrastructure investigates the drivers behind the industry's rapid expansion.

The push for clean energy is certainly benefiting solar companies as demand increases for renewable sources of power. In fact, shifts in energy demand are understood to be a major driver for the solar market as a whole. And when the Solar Electric Power Association (SEPA), an educational and research non-profit focused on helping utilities integrate solar into their energy portfolios, released its annual Top 10 Utility Solar Rankings in May 2010, it revealed that the top 10 utilities' solar megawatts grew a massive 66 percent since 2008, despite electricity demand as a whole declining as a result of the economic downturn. Growing from 169 MW in 2008 to 279 MW in 2009, the top utilities represented 80 percent of the total solar capacity added over the year.

SEPA's third Top 10 Utility Solar Rankings report details the results of an annual survey sent to hundreds of utility contacts in the US about their annual and cumulative solar electric installations. The report takes into account both large and small solar projects – owned by customers, solar companies and the utilities themselves – that are integrated into the utilities' grids, allowing utilities to compare themselves against peer, regional and national benchmarks.



"We're very excited to see some new names on this year's list, along with the traditional solar utility leaders," says SEPA Executive Director Julia Hamm. "We congratulate all our top 10 utilities for their ongoing commitment to solar power – an energy source that's both clean and increasingly cost effective."

The report also ranked solar watts per customer, a measure of relative solar activity that normalizes for the size of a utility's customer base. Sulphur Springs Valley Electric emerged in first place with 56 watts per customer, despite being unranked in 2008, and was followed by Hawaii Electric Light Company, the City of Santa Clara/Silicon Valley Power and Kauai Island Utility Cooperative.
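The watts-per-customer ranking is simply each utility's annual installed solar wattage divided by its customer count, which lets a small co-op be compared fairly against a giant investor-owned utility. A minimal sketch – the customer counts below are illustrative assumptions, not SEPA data:

```python
def watts_per_customer(installed_watts_ac, customers):
    """SEPA-style normalization: annual installed AC watts per customer."""
    return installed_watts_ac / customers

# Invented, order-of-magnitude customer counts for illustration only.
utilities = {
    "Small rural co-op": (2_800_000, 50_000),    # 2.8 MW-ac, 50k customers
    "Large IOU":        (85_200_000, 5_200_000), # 85.2 MW-ac, 5.2M customers
}
for name, (watts, customers) in utilities.items():
    print(f"{name}: {watts_per_customer(watts, customers):.1f} W per customer")
```

On these assumed counts the small co-op scores around 56 W per customer against the large utility's 16 or so, despite installing a tiny fraction of the megawatts – exactly the effect the ranking is designed to surface.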

Solar trends

Participation in the survey has increased by a remarkable 170 percent from 2007 to 2009, rising from 53 utilities to 143, which can be partly attributed to the growing role that utilities are playing in the solar market. One of the key conclusions of the report is that utilities' solar portfolios are on the cusp of significant change. While solar electricity markets have traditionally been distributed, consumer-focused and solar-industry driven, 2009 marked a shift in market dynamics. The 2009 rankings were shaped in part by several centralized or aggregated distributed solar projects that were built or began construction, and by several utilities that became directly involved in owning new solar projects. Installations on the utility side of the meter increased 267 percent, from around 18 MW in 2008 to 65 MW in 2009, and made up 19 percent of the survey's total, up from nine percent the previous year.

With a significant amount of solar power added nationally between 2007 and 2009 – not quite doubling from around 750 MW to 1400 MW – it is becoming clear that the top 10 utilities are a focal point for the nation's solar growth. In the survey, the top 10 utilities represented 80 percent of the annual megawatt total and 89 percent of the cumulative total; in similar fashion, the sum of the top 10 utilities' annual solar megawatts grew 65 percent, from 169 MW in 2008 to 279 MW in 2009. However, while the top 10 utilities continue to make up the bulk of the national solar megawatts, this is not a static market, and utilities across the country are increasing their solar activity. The top 10 utilities' share of the overall survey's annual megawatts dropped from 88 percent in 2008 to 80 percent in 2009.

The market will surpass 1 GW of annual installations in 2010 for the first time

The report attributes the industry's huge growth to significant price declines in solar installations – over 40 percent in the last two years – and predicts that this change will continue to drive customer solar market activity and accelerate utilities' internal interest in their solar portfolios. "One thing is clear from these results," says Hamm. "Now is a great time to take another look at solar electric power. If a utility's pricing perceptions are even 12 months old, they are out of date."

Once again, Pacific Gas & Electric topped the annual solar megawatts ranking with 85 MW installed in 2009, closely followed by Southern California Edison's 74 MW – itself an unprecedented 131 percent growth over the previous year's total. While Public Service Electric & Gas and San Diego Gas & Electric were driven by distributed solar projects, both Southern California Edison and Florida Power & Light can attribute their growth, in part, to the construction of two new centralized photovoltaic (PV) plants, the two largest PV plants in the United States.



2009 UTILITY SOLAR RANKINGS

Annual Solar Megawatts (MW-ac)
#1 Pacific Gas & Electric Co. (CA) - 85.2
#2 Southern California Edison (CA) - 74.2
#3 Public Service Electric & Gas Co. (NJ) - 29.6
#4 Florida Power & Light Co. (FL) - 29.5
#5 San Diego Gas & Electric Co. (CA) - 17.6
#6 Xcel Energy (CO) - 16.3
#7 Arizona Public Service Co. (AZ) - 9.9
#8 Salt River Project (AZ) - 5.8
#9 Sacramento Municipal Utility Dist. (CA) - 4.8892
#10 Los Angeles Dept. Water & Power (CA) - 4.889

Annual Solar Watts-per-Customer (Watts-ac)
#1 Sulphur Springs Valley Electric Co-op (AZ) - 56.0
#2 Maui Electric Co. (HI) - 33.8
#3 Hawaii Electric Light Co. (HI) - 31.4
#4 City of Santa Clara/Silicon Valley Power (CA) - 22.3
#5 Kauai Island Utility Co-op (HI) - 18.8
#6 Black Hills Energy (CO) - 16.4
#7 Pacific Gas & Electric Co. (CA) - 16.2
#8 Hawaiian Electric Co. (HI) - 15.5
#9 Southern California Edison (CA) - 15.3
#10 Graham County Electric Co-op (AZ) - 14.8

Source: US Department of Energy

The top 10's share of the cumulative total dropped from 97 percent in 2008 to 89 percent in 2009. In fact, utilities outside the top 10 – all other participating utilities – saw their solar megawatts grow by 188 percent. SEPA reports that the increased utility participation in the survey reflects this change. In the 2009 survey the percentage of participants from the central region equaled the percentage from the western region, at 41 percent apiece – a marked change from 2007, when western participants outnumbered central participants by roughly nine to one. In 2008, SEPA initiated a direct utility outreach program, with three Regional Director positions proactively engaging utilities. While policy and economic changes are the greatest drivers, awareness also drives survey participation. Improved regional participation indicates that an increasing number of utilities in the central and eastern regions have been integrating solar electric power into their grids, and that their customers have been interested in distributed generation. The early adopters of solar integration in California and other western states are clearly facing growing competition from utilities in other parts of the country.
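Every growth figure in the report is a simple year-over-year percentage change. Making the arithmetic explicit (the small gap on the utility-side figure comes from the rounded megawatt inputs quoted in the text):

```python
def pct_growth(old, new):
    """Year-over-year percentage change."""
    return (new - old) / old * 100

print(f"Top 10 annual MW: {pct_growth(169, 279):.0f}%")     # ~65%
print(f"Utility-side installs: {pct_growth(18, 65):.0f}%")  # ~261% vs the reported 267%
print(f"Survey participation: {pct_growth(53, 143):.0f}%")  # ~170%
```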

FAST FACTS

US manufacturing plants accounted for just five percent of worldwide solar cell production in 2008 – down from 12 percent in 2003

Most policies that impact the solar industry in the US are created at state level in contrast to the other major solar markets of Japan and Germany



A GRAND PLAN

The government's Solar Energy Technologies Program (SETP, or Solar Program) works to develop cost-competitive solar energy systems for America. More than $170 million is spent each year on research and development into the two solar electric technologies with the greatest potential to reach cost-competitiveness by 2015: photovoltaics (PV) and concentrating solar power (CSP). The greatest R&D challenges are reducing costs, improving system performance, and finding new ways to generate and store energy captured from the sun.

The Solar Program also works to ensure new technology is accepted in the marketplace. Program staff work to remove non-technical market barriers, such as updating codes and standards that aren't applicable to new technologies, improving interconnection agreements between utilities and consumers, and analyzing the value of solar capacity credits for utilities. Such activities help consumers, businesses and utilities make more informed decisions when considering renewable energy, and they also facilitate the purchase of solar energy. The Solar Program conducts its key activities through four subprograms:

Photovoltaics

Through its primary research and development efforts, the PV subprogram's goal is for PV technology to achieve grid parity by 2015. Achieving this goal will lead to rapid and significant growth of solar electricity in the United States. To reach it, the DOE is investing in approaches across the development pipeline – from basic cell technologies to manufacturing scale-up to total system development – that demonstrate progress toward minimizing the effective life-cycle cost of solar energy. In addition, the DOE is partnering with national laboratories, start-up companies, universities and integrated industry teams.

Concentrating solar power

CSP technologies use mirrors to reflect and concentrate sunlight onto receivers that collect the solar energy and convert it to heat. This thermal energy can then be used to produce electricity via a steam turbine or a heat engine driving a generator. The DOE is ramping up its CSP research, development and deployment efforts, leveraging both industry partners and the national laboratories. The DOE's goals include increasing the use of CSP in the United States, making CSP competitive in the intermediate power market by 2015, and developing advanced technologies that will reduce system and storage costs, enabling CSP to be competitive in the baseload power market by 2020. The DOE plans to achieve these goals through cost-shared contracts with industry, advanced research at its national laboratories, and collaboration with other government agencies to remove barriers to deploying the technology.

Systems integration

Investing significantly in grid-integration technologies, the Solar Program is also coordinating with all stakeholders to accelerate the integration of renewable technologies. As solar technologies provide a larger part of the US electricity supply, it is becoming increasingly important that they be integrated seamlessly into the nation's electric power grid. This will require new ways of thinking about how the country generates and distributes electricity, and new technologies that make it simple, safe and reliable for solar electricity to feed into the grid. The systems integration subprogram focuses on understanding and breaking down the regulatory, technical and economic barriers to integrating solar electricity into the electric grid, working with utilities and the solar industry to accomplish this.

Market transformation

The Solar Program also promotes the commercialization of solar technologies by addressing non-technical issues that act as barriers to the adoption of solar energy. The DOE's market transformation effort identifies and prioritizes significant barriers beyond traditional cost issues, and develops specific activities and external partnerships to address them.




EXECUTIVE INTERVIEW

Letting the light shine on solar power

Kevin Smith outlines the critical challenges currently facing the solar power industry.

Kevin Smith is Chief Executive Officer of SolarReserve. He has held senior executive positions in a number of successful energy development and technology companies including those that develop solar and wind energy projects. In the last 20 years, he has managed the successful development, acquisition, financing and construction of more than 5,000 megawatts of electricity in the US and internationally.

Do you believe that current policies at both state and national level are effective in promoting the growth of solar power generation? What key changes would you like to see?

Kevin Smith: While there are both state and federal policies in the US that help promote renewable energy, the major difficulty remains the lack of a long-term federal energy policy for renewables. Current policies include state renewable portfolio standards (RPS), which exist in just 29 states, and the investment tax credit for solar, which expires in 2016. The majority of energy subsidies actually flow to fossil fuels and fuel-burning technologies – fossil fuel subsidies are more than double those spent on renewable energy, and include subsidies for the mining and exploration of coal, natural gas and oil.

A long-term energy policy on renewable energy would provide the necessary clarity to utilities, manufacturers and investors that the renewable energy industry is part of the permanent infrastructure plan for the US. This would further stimulate investment and job creation, in addition to helping the US address the critical issues of climate change and energy security. Two major policy options have been in discussion for some time: a nationwide federal renewable energy standard (RES) and a climate change bill. A federal RES would require utilities in all states to purchase a certain percentage of their energy from renewable sources, and the climate change bill would mandate a reduction in carbon. While it may be difficult to pass the climate change bill due to its complexities, a federal RES is critical for the US and the renewable energy industry.

A common criticism of solar technology has been that it is not efficient enough to provide a viable source of alternative energy. Is this view still valid, or are innovative new solutions making a difference?

KS: Solar energy has no fuel-burning component, so no fuel costs are incurred and no pollutants are created; the primary issue therefore becomes the overall cost of delivered energy rather than the efficiency of the technology. It's critical to evaluate the long-term cost of the electricity, and its escalation over the 25-30 year life of an energy project, not just the cost in the first year. New solar energy technologies can certainly compete with conventional fuels on a long-term basis once you account for the escalating costs of fossil fuels and the costs of the environmental damage they create. SolarReserve's energy storage technology, developed by United Technologies Corporation, increases effectiveness by eliminating the intermittency problems associated with other renewable energy technologies. This storage technology delivers electricity reliably during peak demand periods and allows us to compete with and replace conventional fuel-burning technologies.

Funding is a major issue for the solar generation business. What are the key challenges involved in obtaining funding, and what could be done to make the process more reliable and streamlined?

KS: Market-leading technologies, well-structured projects and experienced management teams will continue to attract funding. The short-term stimulus programs and the Department of Energy loan guarantee program have provided significant assistance during the recent economic downturn, and the extension of those programs would be helpful. However, only a long-term energy policy here in the US will create the certainty the finance community needs to address the future funding needs of the industry.

In the current economic climate, some are wary of devoting funds to something like solar power. What clear benefits could investing in solar bring to the nation?

KS: Unfortunately, short-sighted business and policy decisions have led us into our current energy mess – heavy dependence on fossil fuels and foreign oil, climate change issues and tremendous concerns about energy security. Solar energy will not only stimulate the economy by creating solid jobs in the US, but will also wean us from fossil fuels, reduce our dependence on foreign oil and help mitigate climate change. In 2009, China invested almost twice as much in renewable energy projects as the US ($34.6 billion versus $18.6 billion), and China is on the verge of becoming a world leader in renewable energy technology. The US is capable of taking that world-leading position, but to do so, critical action is essential now.




RENEWABLE ENERGY

Pushing the boundaries of climate change and technology

American Electric Power’s Nick Akins tells US Infrastructure how technological innovation is diversifying the company’s fuel mix.

Much of American Electric Power's (AEP) fleet is coal-fired generation, making climate change the foremost issue on the company's agenda right now. As EVP of Generation, Nick Akins spends the majority of his time focused on the development of technology and on understanding the changing role of coal and how it is used and perceived. AEP is also focused on developing a balanced portfolio, including renewable sources such as wind power, natural gas and solar, as well as nuclear, and is experimenting with other forms, such as sodium-sulfur battery storage technologies.

"We are focusing on what we need to do in the future to meet customer demand," says Akins. "As EVP of Generation, I have the fossil and nuclear generation fleets, our capital construction program that includes environmental and new generation, our barge lines and the commercial operations part of the business, as well as our fuel procurement. As one of the largest utilities in the US, it's a very significant operation. We purchase from 75 to 80 million tons of coal a year, which is the largest amount in this country. We follow the legislative activities associated with climate change, and we are involved with several activities in that regard."

Akins points out that AEP has taken an active part in developing federal legislation related to climate change, and was involved with the American Clean Energy and Security Act of 2009, also known as the Waxman-Markey Bill. Akins cites AEP's main motivation for this involvement as the opportunity to match up the technology needed to meet the emissions reduction targets such legislation would set. Understanding how credits will be allocated is vital to AEP's future plan, ensuring it has the technology to match the stipulations so its customers don't have to. "We're intimately involved with the legislative side because we are doing things that give us the credibility of talking about the technology



Trent Mesa Power Plant

“We’re intimately involved with the legislative side because we are doing things that give us the credibility of talking about the technology advancement needed to match up, in a realistic fashion, with reduction targets that Congress may come up with.

“We have a lot of coal-fired generation; our customers need those credits that are allocated to us, so we can use those allowances to pay for construction to improve our fleet from an environmental performance standpoint, relative to CO2. We also wanted to see international provisions placed on how the rest of the world will move forward. We still have some work to do on those, but we’ve been very focused on that legislation as it moves through Congress.

“We are at the forefront of development of carbon capture and storage technology. In September we installed the first fully integrated capture and storage program at one of our power plants in West Virginia, which will take a small, 20-megawatt electric slipstream, capture the flue gas, convert the CO2 and then store it approximately two miles below the surface of the ground. It is a significant step toward commercialization of capture and storage technologies,” he explains.

This is by no means AEP’s only power plant pushing the technological boundaries. The John Turk Power Plant is the first ultra-supercritical coal unit in the US – a much more efficient form of combustion coal technology. Ultra-supercritical plants operate at very high levels of efficiency, with very high steam temperatures, which support combustion at higher temperatures. The amount of coal used is significantly less and there are lower emissions as a result.

“The efficiency of today’s ultra-supercritical versus supercritical is around three percent,” he says, “which although it may seem small, in the long term, when you talk about year-in/year-out operations, it’s a pretty significant reduction in the amount of coal that you would have to use. It’s at least 10 percent better than the existing coal facilities that were built in the late 1970s and early 1980s, so it’s a huge improvement.

“The ultra-supercritical coal technology is a step up from supercritical. It will run at temperatures exceeding 1100 degrees Fahrenheit, which improves the overall efficiency of the generation itself. It’s driven by metallurgical differences: the piping associated with the boiler has to be able to support those ultra-high temperatures, and the metallurgical aspects enable you to use higher steam temperatures and improve the efficiency,” reiterates Akins.

“It is absolutely critical for renewables, particularly wind, to be transported to the load centers”
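Akins’ point that a few points of efficiency compound into a large fuel saving is easy to check with rough arithmetic. The sketch below is illustrative only: the plant size, capacity factor, coal heating value and the two efficiency figures are assumptions for the example, not AEP data.

```python
# Back-of-the-envelope coal-use arithmetic: fuel burned scales inversely
# with thermal efficiency for the same electrical output.
HOURS_PER_YEAR = 8760
COAL_MJ_PER_KG = 24.0  # assumed heating value for bituminous coal

def annual_coal_tons(net_mw, efficiency, capacity_factor=0.85):
    """Metric tons of coal per year to deliver net_mw at a given efficiency."""
    mwh_out = net_mw * HOURS_PER_YEAR * capacity_factor
    fuel_mj = mwh_out * 3600.0 / efficiency  # 1 MWh = 3600 MJ
    return fuel_mj / COAL_MJ_PER_KG / 1000.0

supercritical = annual_coal_tons(600, 0.39)  # assumed ~39% efficient unit
ultra = annual_coal_tons(600, 0.42)          # assumed ~42% efficient unit
print(f"Supercritical:       {supercritical:,.0f} t/yr")
print(f"Ultra-supercritical: {ultra:,.0f} t/yr")
print(f"Coal avoided:        {supercritical - ultra:,.0f} t/yr")
```

On these assumed numbers, a three-point efficiency gain avoids on the order of a hundred thousand tons of coal a year for a single 600 MW unit, which is the scale of saving Akins is describing.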

Climate legislation


He believes that climate legislation will undoubtedly become mandatory; the question is when. A start as early as 2010, with it becoming effective in 2015, is possible. Akins points to the integrated gasification combined cycle technology the company proposed in West Virginia and Ohio, and its failure to be approved in Virginia because carbon capture and storage had not yet been proven from a commercial standpoint. “It’s important for us to advance those technologies,” he continues. “From a portfolio management standpoint, we are heavy on coal in a carbon-neutral environment; we have to be focused on other base load forms of technology, which includes nuclear. We believe nuclear and coal are the two base load opportunities we have, but you have a lot of load that you serve from an intermediate and peaking standpoint. It’s become a priority for wind energy to be brought in, because the energy cost is less.

“Overall, the cost is more, but when you account for CO2, coal costs will go up as a result, and that means it’s important to have wind power. It’s also a priority for us to be able to uprate our nuclear station: we have plans on uprating nuclear by 400 to 500 megawatts, and that is a relatively small cost; at least a lesser cost than a new coal-fired station.



“We are also looking at natural gas facilities, and have several facilities that are coming online. Here in the US there is a lot of fracturing of the new supplies of natural gas and prices are pretty tempered, so that is likely to be one of the ways we manage that transformation. Natural gas, new coal-fired generation, nuclear and solar from a rooftop standpoint are a priority for us. We’re also looking at sodium-sulfur batteries to inject in certain parts of our system.”

Akins’ singling out of wind energy is a pointer to the pivotal role it will play in AEP’s fuel mix, as the company has begun expanding its activities and capacities in this area. The company owns some of its own wind farms, but the majority of its resources are purchased as the output of other farms. Akins explains that this is done because of the capital situation being driven by the economy.

AEP has trimmed its capital budget back from $3.75 billion to $1.8 billion, a significant drop, and now finds itself in a position where priorities must be set. Although the company is able to order a substantial amount of environmental retrofitting, it knows the areas in which to create economic efficiency. AEP continues to develop the construction of its own wind power projects, but its main focus for wind is power purchase arrangements.

Akins does not shy away from noting the effect of the economic recession on integrating more renewables into the company’s energy mix. He adds that capital projects included transmission projects; environmental retrofit projects for scrubbers to remove sulfur dioxide, nitrogen dioxide and mercury; as well as other projects where rehabilitation of the system in general has been deferred as a result of the economy. The US has lost nearly 20 percent of its industrial load, and the loss of capital projects as a result was a natural occurrence.

Down the line

Transmission is an essential component of the renewables energy mix. “As far as transmission is concerned, it is absolutely critical for renewables, particularly wind, to be transported to the load centers,” says Akins. “Typically in the US, wind power is generated in areas of the country that are very sparsely populated from a load perspective, so transmission is critical in order to move that renewable energy.

“We at AEP have the largest transmission system in the US and the highest voltage, 765 kV transmission, and we’ve proposed several transmission projects around the country to move these renewable resources to the load centers. It remains to be seen how that’s going to progress, but to further optimize and make sure that our entire generation portfolio, including renewables and base load generation, is operating in the most efficient fashion will require substantial investment in new transmission.

“There’s a lot of discussion of the effect of including renewables within the transmission system, as the actual operations of the power system change as a result. A lot of studies are being done in that regard, and we’re participating in those studies to make sure that when we do add substantial amounts of renewables we can respond from a system stability standpoint, to ensure that we can continue to allow those renewables to be injected into the system.”

Nick Akins is Executive Vice President of Generation for American Electric Power.

The US federal government’s stimulus funding is attracting almost every utility in the industry, and AEP is no exception. The company is currently evaluating the areas in which it can take advantage of the funding and is working alongside the state authorities – a large portion of the money is handed from federal to state jurisdictions for evaluation. AEP is looking for funding assistance with its carbon capture and storage projects: “We’ve asked for the next phase of our carbon capture and storage project to go to 235 megawatts, which is the first commercial scale we’re asking for stimulus funding associated with. We’re also looking at support for our gridSMART technologies for advanced metering, and for other forms of renewable projects as well,” explains Akins.

JOHN W. TURK, JR. POWER PLANT
Estimated completion date is summer 2012
The cost of the plant is approximately $1.6 billion
SWEPCO’s investment will be 73 percent, around $1.2 billion
Plant construction will create over 1000 jobs at the height of construction
The plant will bring an estimated 110 permanent jobs to the area
Annual payroll is projected to be $9 million

AEP’s efficiency aims are also pitched at a local level with its smart grid initiative, ‘gridSMART’. The system analyzes what the customer does on his side of the meter to determine when he uses electricity, building in efficiency from customer usage through to generator activity. “Through the advancement of our gridSMART technologies, we’re looking at efficiency gains that could be made all along the path from generation to the customer, to make wise decisions from a customer standpoint.”
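The behind-the-meter analysis gridSMART relies on – working out from interval readings when a customer actually uses electricity – can be sketched in a few lines. This is a minimal illustration, not AEP’s system; the data, thresholds and function names are invented.

```python
from collections import defaultdict

def hourly_profile(readings):
    """Average kWh by hour of day from (hour_of_day, kwh) interval readings."""
    totals, counts = defaultdict(float), defaultdict(int)
    for hour, kwh in readings:
        totals[hour] += kwh
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in sorted(totals)}

def peak_hours(profile, ratio=1.5):
    """Hours whose average use exceeds ratio x the all-hours mean."""
    mean = sum(profile.values()) / len(profile)
    return [h for h, kwh in profile.items() if kwh > ratio * mean]

# Two days of invented readings: (hour_of_day, kwh used in that interval)
readings = [(0, 0.4), (6, 0.9), (12, 1.1), (18, 2.8),
            (0, 0.5), (6, 1.0), (12, 1.2), (18, 3.1)]
profile = hourly_profile(readings)
print(profile)              # average load shape by hour of day
print(peak_hours(profile))  # -> [18]: an evening peak worth targeting
```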



Smith Mountain Hydro Project

Technology

AEP’s renewable portfolio

Solar: AEP’s activities in solar energy have focused primarily on education and outreach. More than 125 schools participate in AEP’s ‘Learning From Light’ and ‘Watts on Schools’ programs.

Wind: AEP owns 310.5 MW of wind generation capacity in Texas and also has agreements to purchase 742 MW from several wind power facilities in Illinois, Indiana, Oklahoma and Texas. Both of its farms, Trent Mesa Wind Farm and Desert Sky Wind Farm, sell the energy that is produced under wholesale energy supply contracts. In addition to owning and operating its own facilities, AEP is also a major purchaser of wind power from wind projects such as FPL Energy’s Southwest Mesa and the Weatherford Wind Energy Center.

Hydroelectric: AEP’s 17 hydroelectric facilities in Virginia, West Virginia, Ohio, Indiana and Michigan generate more than 800 MW of electricity. Smith Mountain Hydro Project, on the Roanoke River southeast of Roanoke, Virginia, was the nation’s first major development combining run-of-the-river hydro with pumped storage generation.

Biomass: Until the company sold the Fiddler’s Ferry and Ferrybridge power plants in 2004, AEP co-fired biomass in 4000 MW of coal-based power generation in the UK. AEP has also conducted biomass co-firing tests and analyses at several of its power plants in the US.

In order to incorporate renewables into its generation and successfully transmit that energy across its service territory, technological innovation is AEP’s key to continuing its success into the future. Akins cites the company’s coal generation as an example: currently it can generate off a single coal unit at approximately 2.5 cents per kilowatt-hour, compared to wind power at approximately 10 cents a kilowatt-hour and solar at around 25 cents a kilowatt-hour, so the advancement of these technologies is critical.

“We’ve proposed several transmission projects around the country to move these renewable resources to the load centers”

“Solar continues to make efficiency gains from a production standpoint, and the same applies to wind power, so we continue to see advances there. These technologies will have to continue to improve from a cost standpoint to make sense to our customers,” he explains. “Clearly our main advances in technology are carbon capture and storage, to make sure that that becomes viable, as well as transmission and new technologies associated with generation. We have historically supported FutureGen; we pulled out due to the funding issues associated with it, but we continue to support that technology. We were the first to innovate hydrogen technologies, which are a more advanced type of coal-fired generation. We continue to push the envelope on new generation technologies,” concludes Akins.



INFRASTRUCTURE DEVELOPMENT

Rich Lee, US Head of KPMG’s infrastructure advisory group, breaks down some of the key findings from a series of detailed investigations into the biggest challenges in infrastructure delivery effectiveness.

STATE OF THE NATION



KPMG has carried out three major surveys in the last 18 months. They’re all global surveys, but we had enough US participation that we could compare US results against global responses. The first survey was about 18 months ago, when we went out to C-level executives around the world during the financial meltdown in the fall of 2008. The message we got back from them was that, unsurprisingly, current levels of infrastructure investment were not sufficient, and that there should be more consideration of P3 public-private partnerships around the world. We also found that the availability and quality of infrastructure directly impacts business decisions on where companies locate and expand their operations. The importance of infrastructure is only going to increase in the near term, with specific areas of need being roads, power generation, and social infrastructure.

The results of that survey were published in early 2009, right around the time when there was a lot of discussion around stimulus programs. In that discussion, infrastructure was given a lot of acknowledgement as a need and as an area where you could spend stimulus money and improve infrastructure quality. It coincided well with the broader public dialogue about infrastructure, as everyone was becoming more aware of the issue.

Our next step was to go out to the market again in the spring of 2009. This time we approached the major commercial players in the infrastructure business, asking them many of the same questions as the previous survey, plus a few new ones. Given the public discussion about infrastructure and stimulus, we asked whether private providers thought that the stimulus was sufficient. We got a resounding response: despite the stimulus money, it was widely felt that the needs were still great relative to the amount of funds that had been allocated. They also confirmed that P3s would be an area that should be explored more widely. The big thing that came out of the private sector survey was that one of the biggest hurdles to making progress is government effectiveness. Another area they highlighted was a skill shortage, both globally

and in the US, in terms of people who are qualified to deliver large-scale infrastructure projects. That really spurred us on to say, “well, why don’t we complete the loop?” And that’s when we went back to public sector officials in government around the world and got their opinions too.

Surprising results When we looked at the results of these three surveys, it was a little surprising how consistent the results were across the different groups. We thought that there might be more disparity. Take public-private partnerships: the level of acknowledgement that this is something that needs to be explored was pretty consistent across the three groups. We were a little surprised that the public sector was so ready to acknowledge that the politics in the selection and development of projects is a major impediment; we felt that might be more muted relative to the private sector, which raised it as an issue. And I think that bodes well for us, because we’re financial advisors to both government and private sectors internationally.

However, in the US, at this stage of the market we’re mostly spending a lot of time with public sector clients around the country. And we’re spending a lot of time educating them on what’s being done by some of the earlier movers in the domestic market, but also on what’s been done in some of the mature markets around the world, whether it’s Canada, the UK, Australia or others.

The acknowledgement that there needs to be a better way to evaluate and develop projects is good. Respondents specifically highlighted the need for better business case development on the front end of these projects – such as the best way to allocate risk on a project, and the best way to accelerate delivery. That’s really important, because if we don’t find ways to look at this on a broad basis, we’re going to struggle to find the financing to deliver the projects that are really needed.

A real stimulus?

An interesting finding from our research was that shortage of funding was cited as a big problem even during the period when stimulus money has been flowing. In terms of the economic climate and conditions that we went through, the impacts on government lag the broader economy. Even with stimulus spending, what we’re finding now is that many governments are still struggling to find ways to close budget gaps and to fund their capital programs. While the stimulus money was accepted and appreciated, the amount pales in comparison to the need; the amount of stimulus funds set aside for infrastructure in the US was really just a small percentage of the total.

[Survey chart] Thinking specifically about your organization, do you think the private sector can help it to deliver infrastructure more effectively? Yes: 68%; No: 26%; Don’t know: 9%

But the needs are as great now as they ever have been. The goal of the stimulus package was twofold. One was to create jobs and employment near-term and the other



was to move some projects along. But a lot of the needs are long-term mega projects that need substantial amounts of money, and they aren’t being covered by this process. There was one key question about which factors would likely produce the greatest improvement in infrastructure development in your jurisdiction, and the top thing that came up in the US was the need to depoliticize: from our private sector respondents, 42 percent said that this was an issue. It was really interesting, as that survey came out a week before the President announced the money for high-speed rail down in Tampa.

In his remarks, he even commented on the need to depoliticize the process as further investment was put into this area. He sort of supported the survey that we had coming out at that very same time. That really confirms that it’s an issue that’s pretty widely accepted as a major one. Ultimately, whether a project is going to be P3 or not, there’s a need for far better transparency around the process, as well as a better evaluation of total cost and total benefits. And I think improving that process of project evaluation will improve the overall delivery, regardless of the ultimate means and methods by which the projects get completed.

Which of the following are the greatest public sector impediments to more infrastructure investment in the country where you are based? (select up to three)

Other: 4%
Politicization of infrastructure project priorities: 17%
Frequent changes in public policy: 18%
Lack of appropriate public policy: 19%
Lack of sense of urgency: 19%
Corruption or misuse of funds earmarked for infrastructure: 27%
Lack of skills/knowledge/training of officials in development and management of infrastructure: 27%
Inadequate understanding of the severity of the issue: 28%
Poor creditworthiness of public authorities: 28%
Frequent changes in legal/regulatory framework: 28%
Lack of appropriate legal/regulatory framework: 42%

In the country where you are based, which of the following sustainability factors do you believe will provide the greatest competitive advantage in the industry five years from now?

Other: 1%
Promoting alternative sources of energy (e.g. solar, wind, hydro): 5%
Smart infrastructure: 8%
Sustainable (green) construction methods: 16%
Making existing infrastructure more efficient: 18%
Branding infrastructure as a catalyst for behavioural change: 19%
Sustainability factors do not provide a competitive advantage in infrastructure: 30%



INVESTMENT

NUMBER CRUNCHING

Spending in the renewable energy sector is set to climb in 2010 – dramatically reversing last year’s 40 percent decline. Gregory Burkart, Managing Director for Duff & Phelps, explains how all the numbers add up.

Prior to the credit crunch in mid-2008, most renewable or alternative energy projects relied on federal investment tax credits and production tax credits that offset tax liabilities against investments in wind farms and solar plants. However, with most financial institutions posting steep losses in 2009, tax credit finance all but dried up. Fortunately the American Recovery and Reinvestment Act replaced part of the need for tax credits and gave the industry a huge boost last February.



The act included $6 billion to support loan guarantees for renewable energy and electric transmission technologies, which was expected to guarantee more than $60 billion in loans. The act requires the DOE Loan Guarantee Program to make loan guarantees only to projects involving renewable energy, electric transmission or biofuel technologies that start construction by September 30, 2011. The tax section of the act provided a three-year extension of the production tax credit (PTC) for most renewable energy facilities, while offering expansions on and alternatives to tax credits on renewable energy systems. The act also allowed owners of non-solar renewable energy facilities to make an irrevocable election to earn a 30 percent investment credit rather than the PTC.

As a Managing Director of Duff & Phelps, Gregory Burkart specializes in structuring and negotiating government-sponsored economic development incentive packages. Here he offers his advice to renewable energy projects that are currently looking for funding.

What’s the climate like at the moment for renewable energy projects that are seeking funding?

Gregory Burkart. We’re noticing that there’s been a pick-up in basic industries in the United States, particularly areas like paper and steel manufacturing. Consequently, as these companies come back online – they’ve been through extended periods of layoffs, or in some instances even shutdown – a number of communities are trying to incentivize the companies to come back and actually reopen mills or plants within their geographic territories. So we’re certainly seeing an uptick in manufacturing and basic activity.

The other side of this is that we’re seeing a lot of activity due to the Recovery Act, with many of its provisions set to expire at the end of this year. We are seeing a lot of companies who are just learning now about these various benefit programs and who want to apply for the benefits before the expiration. We are seeing both of these happening, and with respect to basic commodities and manufacturing, as companies bring those facilities back online they’re trying to be as efficient as possible, so we see them looking at investments in new energy-efficient technologies.

And so the stimulus bill has really had a major effect on this?

GB. Well, for example, we’ve just received a distribution project for foods and for consumer products. They’re tapping into a new market in the Midwest, and what they have decided to do is to put solar on the roof, to use geothermal to cool part of the facility, and then they’re also going to use a small wind project in order to generate their own electricity. They’re also starting to look at the potential of establishing a manufacturing facility down in the Carolinas. This company is looking at tapping into the federal program for direct loans under Section 136, because they make components for an alternative technology vehicle that’s going to be a brand new, highly efficient transmission facility. What we do is help companies understand and navigate the benefits of these federal programs.

And do renewable projects provide a good return on investment for potential investors?

GB. They do. At the federal level we have programs that provide a cash grant in lieu of taking an investment or production tax credit, and that would be for facilities like the distribution center that was generating its own electricity. It’s a 30 percent cash grant – basically 30 percent of the eligible cost, which you can receive within 60 days of placing the asset in service – and that significantly enhances the rate of return. When Congress originally passed this program they thought it would cost about $350 million; by the middle of April, the US Treasury had cut checks for about $3.1 billion. So it’s been a very popular program, as you can imagine.
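Burkart’s point about the grant enhancing returns is easy to see in miniature. The sketch below uses entirely hypothetical project numbers; only the 30 percent grant rate and the roughly 60-day payment window come from the program as described above.

```python
# How a 30% cash grant, paid shortly after an asset enters service,
# changes a project's simple payback (all project figures assumed).
capex = 10_000_000         # hypothetical installed cost
annual_net_cash = 900_000  # hypothetical revenue minus operating costs

grant = 0.30 * capex       # 30% of eligible cost, received ~60 days after start-up

print(f"Payback without grant: {capex / annual_net_cash:.1f} years")
print(f"Payback with grant:    {(capex - grant) / annual_net_cash:.1f} years")
```

Treating the grant as recovered capital, the same cash flows pay the project back roughly a third faster on these assumptions.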

Investment to reach $200 billion in 2010?

Bloomberg New Energy Finance Chief Executive Officer Michael Liebreich believes that renewable energy investment may rise as much as 23 percent in 2010 as government stimulus funds in the US and Europe are spent on wind and solar energy. Spending reached $162 billion in 2009, and Liebreich believes this will rise to somewhere between $175 billion and $200 billion this year. New Energy Finance estimates that two-thirds of the $184 billion that the US, Europe, China and other regions have earmarked for clean energy projects will be spent through 2011, with the construction of windmills, solar power and biomass plants continuing even after UN negotiators failed to reach a binding treaty to limit carbon dioxide emissions from fossil fuel plants.

China replaced the US as the biggest investor in renewable energy last year, spending $34.5 billion on low-carbon energy technologies in 2009, compared to the US’s $18.6 billion. The UK was the third-largest renewable investor in 2009, accounting for 10 percent of the G20 total, followed by Spain, Brazil, Germany and Canada.

Source: businessweek.com

And have the ways these projects are structured and financed seen any major changes or developments in the past six to 12 months?

GB. What we’re seeing is that the market is starting to respond and develop some new financing tools. For example, specifically around the 1603 cash grant program, banks have created what they call bridge loans: they’re lending specifically against the expected cash payment, so it’s a relatively short-term loan during the term of construction. As soon as the asset is placed in service they have the ability to receive a cash grant through the Treasury. These cash grants have been coming out pretty quickly, on average within 60 days of receiving the application.



You mentioned some of the smaller scale projects, but what about some of the larger projects? Is there a different approach for these?

GB. Under the 1603 cash grant program, wind farms have been a big beneficiary. They haven’t had as many projects – there have been about 110, maybe 115 projects – but they’ve garnered the lion’s share of funding. At last count these accounted for well over $2 billion, and the average project was about $44 million. Compare that to solar: the number of solar projects is about three times that – about 385 projects – but the projects are smaller in scale, and they seem to be much more rooftop-type installations. The average grant for solar is $404,000, so you can see it’s a much smaller scale than wind.

You mentioned that companies are looking to get the loans before the Recovery Act funding expires. What do you see as the prospects post-Recovery Act – what’s going to be happening then?

GB. There’s a lot of movement afoot, and right now one of the alternatives being discussed is to turn the 1603 cash grant program into a refundable tax credit program. The industry doesn’t look at that as favorably as the cash grant program. You can see just by the numbers that the cash grant has been a very popular program, and the primary reason is the delay in receiving the benefit under a tax credit – you have to wait to receive that benefit when you file your tax return, whereas the cash grants are received within 60 days of placing the asset in service, or filing your application, whichever date is later.

If projects are looking for funding, are there any key points they should be thinking about? Is there a checklist that they should be hitting?

GB. First and foremost is to make sure that you file your application on time, because the funding is tied to some very specific milestones and each program has a different timetable, so pay specific attention to those deadlines. Under the 1603 cash grant program, for example, you have to place the asset in service in 2009 or 2010, or, if you start construction in 2009 or 2010, you have to complete it by the normal credit termination date – for a wind project, that would be 2013. The deadlines are key, with some very similar tight time frames for loan guarantees that are coming up.

Secondly, the applications are much more complex than people think, so you really need to ensure that you give yourself enough time to complete the application. You want to make sure that you file a good, solid application, because you never have a second chance to make a first impression. If you file a haphazard application with the Department of Energy (DOE) the first time around, in a rush to meet the deadline, that impression is going to stay with the DOE throughout the review cycle, and you’re going to inadvertently create an image of yourself that is not favorable.

The third element is that when applicants are filing their applications, they need to pay particular attention to the financial assumptions and also their modeling, because in many instances the DOE will ask you for your financial modeling. They’ll go through it without the benefit of having you in the room, so it’s not like your typical application that you’re going to send to your banker – there’s much more due diligence and much more scrutiny, because they stress test these models. The DOE doesn’t tell you what all the stresses are before they start running your models, so you really have to pay particular attention to the financial modeling.

“We’re certainly seeing an uptick in manufacturing and basic activity”




INDUSTRY INSIGHT

The smart world of utilities

Adrian Butcher answers the critical questions behind the smart metering challenge.

If there’s one word that’s creating a ‘buzz’ in utilities throughout the developed world, it’s ‘smart’ – the next transformational step for two core components of the utilities business model: the grid and metering. Smart metering and the smart grid are both recognized as having profound implications for the industry, in terms of energy efficiency, customer service and cross-border energy management and trading, and these topics are widely and imaginatively discussed throughout the industry. What seems less discussed is what to do with the sheer volume of data generated, how to place it in the hands of all who need it, and how to manage the exponential growth in the volumes of historic data. At Open Text, we see some critical questions emerging and believe electronic archiving will provide some of the key answers – and crucially, that the history of our experience demonstrates that.

So, to those questions. How can we cost-effectively maintain records for the hugely expanded scale of meter readings? Where end customers are involved, how and in what media will we choose to present metering data? If on the web, how can we assure performance-controlled access to relevant data in a customer-service, or self-service, environment? We must also maintain such records to support the broader relationship with the customer and make them accessible in the broader customer information context. If we provide such access to customers, how can we assure security of data, and of systems? And if all this means investing in new systems, do we still have the cost burden of legacy systems, for the historic data they hold?

Let’s see how an enterprise archive might address such questions. Firstly, scale is one of the ‘specialist subjects’ of the electronic archive, with proven capability to handle truly huge data volumes. It can also do so ‘intelligently’, optimizing the balance between performance of access and storage costs according to your business policies, but without discrete day-to-day activity by your organization. Secondly, it’s possible to ‘render’ data from pure data sources in customer- and employee-friendly formats. Some utilities have already made this capability their strategy for e-billing, with reduced costs and better customer service.


With management experience in a broad range of industries and processes, including many of relevance to utilities, Adrian Butcher leads a function dedicated to helping customers explore, discover and quantify the business value of Open Text solutions in supporting core business processes, reducing cost and supporting corporate compliance obligations.

For them, the presentation of meter reading information may represent extended leverage of an existing platform, not a completely new environment. Of course, in a customer-service environment, such data needs to be accessed rapidly, yet securely. So thirdly, by placing information in the archive, access permission can be carefully controlled, and access is made to the archive, not to the metering systems that feed it – ensuring electronic security for the metering environment. Customers and customer advisors see exactly the same information, so discussions about it are easier. Fourthly, where customer meter readings are concerned, they may need to be made available in the broader customer context of bills, enquiries (through whatever channel), special package contracts, even complaints. Finally, the archive can be used to hold and make accessible data from legacy systems, delivering cost savings by allowing such systems to be decommissioned.

Smart metering represents a bold step into the future. For utilities, which must manage risks as intensively as assets, it may be comforting that utility companies across Europe have been using Open Text archiving for years – for e-billing, SAP data archiving, customer relationship management, legacy systems decommissioning, and more – supporting multi-channel customer relations in particular. For our existing customers, smart metering may present opportunities to further leverage existing data. We believe the smart new world of utilities will come not just to benefit from electronic archiving but to depend upon it, and we look forward to ever more dialogue with like-minded managers in this transforming industry. n
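The third point – customers and advisors read from the archive rather than from the metering systems that feed it – is essentially a mediation pattern. The sketch below is only an illustration of that idea; the class and method names are invented and are not Open Text’s API.

```python
# Minimal sketch: an archive facade that serves meter history from its own
# store, never from the live metering systems, and enforces per-account
# read permissions.
class ArchiveFacade:
    def __init__(self):
        self._store = {}  # (account, period) -> readings ingested from source systems
        self._acl = {}    # user -> set of accounts they may read

    def ingest(self, account, period, readings):
        """Called by the metering system; consumers never touch that system."""
        self._store[(account, period)] = list(readings)

    def grant(self, user, account):
        self._acl.setdefault(user, set()).add(account)

    def read(self, user, account, period):
        if account not in self._acl.get(user, set()):
            raise PermissionError(f"{user} may not read account {account}")
        return self._store.get((account, period), [])

archive = ArchiveFacade()
archive.ingest("A-1001", "2010-05", [12.4, 11.9, 13.2])
archive.grant("customer-1001", "A-1001")
print(archive.read("customer-1001", "A-1001", "2010-05"))
```

Because the metering systems only ever write into the store, an overloaded or compromised consumer-facing channel never touches them directly.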





ENERGY EFFICIENCY

“The economics aren’t there yet, but I think we’re seeing enough things moving that it’ll take care of itself over time.”



A penny saved…

Vice President for Energy Efficiency at Duke Energy, Ted Schultz, talks to US Infrastructure about the smart grid’s ability to reduce power consumption.

It’s not easy being green – particularly during a global downturn. Throughout the Noughties, the carbon footprint was fast becoming something of a buzz concern for industry professionals and consumers alike, until the economic downturn set in, putting the time-consuming and expensive issue of energy efficiency on the back burner for most. However, Duke Energy, the major utility provider operating in the Carolinas and Midwest, has developed an inventive strategy in efficient energy distribution, combined with a strong focus on renewable energy sources, to prove that it is still possible to be green during a recession.

Set up in 2008, Duke Energy’s Save a Watt approach to efficiency is an initiative based on the idea that saving energy should be just as important to the company as generating and distributing electricity. “The philosophy behind Save a Watt is actually pretty simple,” explains Ted Schultz, VP for Energy Efficiency at Duke Energy. “We look at energy efficiency and demand response as resources, just as we do a power plant. We are now aggressively going after those resources and being compensated for them.”

This scheme is based largely around the wide-scale use of the smart grid, the innovative new technology that is revolutionizing energy distribution. As part of the scheme, Duke’s residential and business customers are supplied with a smart meter, a device that monitors not only how much energy is being used, but where and when it is being used, and reports that information directly back to the supplier. The company has invested around $1 billion in developing the technology behind the grid in order to improve the efficiency of its power lines and electricity and gas meters, and Schultz is keen to point out the scope for improvement with the smart grid. “There are multiple benefits. Your sampling gets better the more accessible your samples are. So this means really improving the distribution system, which in turn means, for example,

voltage reduction and a dramatic decrease in estimated bills. There are huge operational benefits that come from the smart grid.”

The smart grid is by no means a fad in energy distribution. Companies are already reaping the short-term benefits of the grid’s intelligent monitoring system from the accurate information they are receiving about their consumers. Not one to rest on his laurels, however, Schultz is now looking to the future and the potential this device holds for the industry. “Typically, technologies get over-hyped in the short term and grossly underestimated in the long term. We can’t imagine today what we’re going to be able to do with the smart grid. It holds tremendous promise to really go much deeper with energy efficiency and demand response. There are still some hurdles to overcome because the technology behind the meter isn’t there yet, but we are really working hard on that at the moment.

“We also see it as a key enabler of distributed energy resources, such as rooftop solar, and we are working on technology for plug-in vehicles. Plug-in vehicles will certainly require a smart grid, and the benefits of them are huge. Based on a number of assumptions, we estimate that plug-in vehicles can achieve a 40 percent market penetration rate by 2050. It is certainly going to take a long time for this replacement to take place, but using electricity over gasoline in this way will mean around a $20-30 billion net saving for our customers.”

While the technology involved in the smart grid is likely to get increasingly advanced, Schultz believes that simplicity is the key to succeeding in the commercial marketplace. He is keen to receive the maximum amount of information from each system about how each individual customer uses their energy, while placing as little responsibility as possible on the customers themselves. Schultz feels that, when it comes to managing their own smart grids, customers don’t want to be involved. He understands that, in order for this initiative to succeed, Duke Energy must develop approaches that require


minimal involvement from the customer and can easily be tailored to customers’ daily routines. “Ultimately we are going to see some very straightforward involvement technology to help you set and forget. I think devices such as a simple ‘Home or Away’ button on your cell phone, or an iPhone application for the home, will start to emerge on the market.”

The customer appears to be a paramount concern for Schultz; he recognizes that innovative and advanced technology is secondary to a positive consumer experience in terms of successfully achieving energy efficiency. “The whole customer experience needed a lot of work,” he admits, and outlines Duke’s plans to use the smart grid to tackle these issues. Something currently in the development stages is a service to improve direct communications between the company and the customer – a device on the smart grid that customers could use to request a call back from Duke’s customer services department.

“For us energy efficiency is a resource. It’s a way for us to invest, just like investing in a power plant, and we’re earning a return on that investment”

Ted Schultz

“Think how successful a programmable thermostat would have been if there was a button that you could press and be able to speak to someone. You could tell them what you wanted and get your problem sorted without getting the manual out,” Schultz points out. He explains that system features such as these will benefit the customer directly. “Our real aim was to look at the whole picture and the potential benefits to the customer, and the technology will follow from there.”

Of course, the greatest potential benefit to the customer related to energy efficiency is the cost saving; however, Schultz stresses that though they are selling less of their product, Duke Energy’s profitability is flourishing. “For us energy efficiency is a resource. It’s a way for us to invest, just like investing in a power plant and we’re earning a return on that investment. It’s the cheapest resource we can go after, it adds value directly to the customer because there are savings directly involved with it and it is tremendous for the environment. It’s a win for everybody.”

So much so, in fact, that Duke Energy has continued to succeed throughout the global recession. Energy efficiency is a key element of the Duke Energy business strategy, so it has remained a priority issue over the last two years, where many companies in the industry have let such concerns fall by the wayside. Schultz admits it hasn’t been easy, though. “It has been harder to capture the business simply because of the economy. A lot of commercial businesses are in tough shape so they’re not willing to invest.”


Despite this, he is still positive about what is to come for the industry. He speaks excitedly about the possibilities held in store for energy efficiency in America as a whole. “The government is certainly helping to raise awareness and get things started in this area, but I think long-term solutions are going to come from private industry.” Confident that Duke Energy will, in the long term, be able to create a sustainable business, Schultz outlines how the smart grid will be a main focus point, and how escalating technologies will allow the energy grid to become a more advanced digital grid. “We are beginning to see more companies which we don’t normally see in this space come into play,” he explains, predicting that the solutions to energy efficiency will come from innovative technology companies working hand-in-hand with progressive utility companies such as Duke Energy.

Another main area of focus is renewable energy sources, and as he outlined previously, the smart grid is integral to enabling these forms of energy. Duke Energy’s performance in this area of the market is impressive, to say the least. It currently generates 735 megawatts (MW) of electricity from its own wind farms in Pennsylvania, Texas and Wyoming and is set to bring a further 251 MW online by December 2010. It is the second-largest investor-owned hydro operator in the US, owning and operating some 3200 MW of hydroelectric power in the country, and an additional 3000 MW in South America. Duke is also pioneering various progressive renewable energy programs, including one of the country’s first offshore wind projects off North Carolina’s coast, a venture with AREVA to build biomass-fueled power plants, and the use of electricity generated by combustion of methane gas from decaying garbage at landfills in both North and South Carolina.

Yet despite this already demonstrably broad use of renewable resources, the company is striving to achieve more, and has outlined aims to meet around 25 percent of its customers’ electricity demands with renewable energy by 2030. Though Duke Energy already operates a number of substantial projects such as wind turbines, Schultz explains how the company is working to combine the advanced technology of the smart grid with renewable power sources in more large-scale implementations, as well as smaller, localized sources such as individual solar panels. “In some states we are required to look at larger scale sources, and there are already solar carve-outs.” Currently in the construction stage is a solar farm in Texas, set to generate around 14 MW of power. “We have also done solar rooftop deals where we have leased customers’ rooftops,” Schultz adds. “The solar energy feeds into the grid, not into



the house or business.” This whole rooftop program is set to generate around eight megawatts of power from solar panels installed atop a network of office buildings, manufacturing plants, schools and warehouses in North Carolina. “We are being very creative in this area,” he explains. “We also have experiments going on with storage, as I think this is a critically important issue. We are doing a lot of research into vehicles and also a lot with wind on the deregulated energy side of the business.”

But there is one major issue that surrounds renewable energy resources for a commercial market. As Duke Energy operates in some low-cost states, the prices involved in installing distributable energy resources are unlikely to be an attractive prospect for the company’s customers. “If the costs of such energy sources don’t make sense in your high-cost states, then they really don’t make sense in the areas that we cover,” says Schultz. Despite this, however, he doesn’t seem worried about the future, predicting that new technologies will become available to make this type of energy more affordable for his customers. “The economics aren’t there yet,” he says, “but I think we’re seeing enough things moving that it’ll take care of itself over time.” n

The magic number

Ted Schultz outlines the principles of Duke Energy’s three-phase business plan.

The initial concern, now that regulatory measures have been established, is to focus on enabling the whole range of products in the Duke Energy portfolio and integrating them across the area the company serves. “We have been rolling out the programs,” he says, “and we have been getting reasonable penetration. But I want to get all our programs out across all the states and really do some specific targeting.”

The second phase is working with customers to make existing infrastructure or appliances more energy efficient. Duke Energy already has in place attractive money-saving incentives for its residential customers, encouraging them to use certified heat pumps or air conditioners. On the other side of the market, non-residential customers are able to bring their own solutions to Duke and, in the right instance, it will use them as standard within its products. Schultz highlights LED lighting in refrigerators as an example. “When we started we didn’t have it. A customer brought it to us, then a second customer brought it to us. Now we have it as standard.” In addition to this, the company is currently putting a lot of focus on reconditioning whole buildings. According to Schultz, this is a particularly fast-moving area of the market, and thanks to the support and encouragement of government funding, Duke Energy already has the facilities to address this issue. “It’s another tough nut to crack,” he adds. “Customers just don’t want to spend their money on it, so we’ve got to address that challenge. But there are some pretty big benefits to improving the shell of a building.”

The third phase of Duke Energy’s approach to increasing efficiency is the company’s focus on new technologies – in particular, of course, the smart grid. Schultz reiterates the company’s aim to explore and utilize all potential capabilities of the grid and how that can be used to benefit its customers, both residential and commercial. “To me, something as simple as the Home and Away application is killer,” he explains. “It’s so simple. Programmable thermostats have proved to be too complex for people to manage effectively. And even if you did have it set properly, what happens when you go out for, say, your kid’s soccer game for four hours? You could hit a button on your phone or on your security system that instantly puts you in ‘Away’ mode. We estimate that there is about a month’s worth of energy to be saved here that would otherwise just be wasted from nobody being in the house.”

Such an application is currently targeted at residential customers, but Schultz outlines plans to use the technology in a similar way to improve efficiency for the company’s commercial users. “We’ve really got to get through this agency issue of the building owner versus the tenant. It is one of the biggest issues in energy efficiency,” he says. Again, the solutions to this challenge are still at a planning stage, and rely entirely on development of the smart grid technology to extract extremely detailed data and information about the customer. Characteristically, however, Schultz remains optimistic that Duke Energy is close to reaching the solution. “We think we have some ways to solve this problem with the smart grid and actually create a service for commercial customers based on the pay-as-you-go philosophy. Rather than having tenants commit to performance contracts for 20 years, we can create a service at the value the customer requires.”

“So these are the three phases,” he summarizes. “Current programs that we have to get out, the whole building and the new technology. We have a bunch of new ideas. Ultimately, I think you will see a combination of an improved infrastructure shell and innovative new technology for the smart grid.”
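The sidebar’s claim that an ‘Away’ button could recover about a month’s worth of energy a year can be sanity-checked with rough arithmetic. Every figure below is an assumption for illustration, not a Duke Energy number.

```python
# Does a daily 'Away' setback plausibly add up to ~a month of typical use?
hvac_kw = 2.0             # assumed average HVAC draw while conditioning
away_hours_per_day = 2.0  # assumed unplanned empty-house hours per day
setback_share = 0.7       # assumed fraction of HVAC load avoided in Away mode

annual_kwh_saved = hvac_kw * away_hours_per_day * setback_share * 365
typical_month_kwh = 900   # assumed household using ~30 kWh/day

print(f"Saved per year: {annual_kwh_saved:,.0f} kWh "
      f"(~{annual_kwh_saved / typical_month_kwh:.1f} months of typical use)")
```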





ROUNDTABLE

GETTING SMARTER



With energy efficiency becoming an increasingly important consideration, US Infrastructure asks a panel of experts to discuss the challenges behind a sustainable smart grid.

As wireless communication becomes a standard feature in smart meters, what challenges exist around maintaining security levels?

Malcolm Unsworth. Wireless communications for smart metering present new challenges for utilities, especially in managing access to the meters and how the technology can be exploited. In the past, most meters were not ‘connected’ to the utility – there was no two-way communication. Access to these meters had to be accomplished locally, and only the meter being attacked could be compromised, relegating the impact to a single premise or area. Without secure technical controls and design principles in the system architecture, wireless two-way communications mean a local attack at a meter or on the communications infrastructure could be exploited to potentially affect many meters or communication devices from a single point. Because this single point of access can have far greater impact, a higher level of authentication and authorization should be required to locally access the smart meter.

Dr. J. Patrick Kennedy. Wireless has significant security issues, but the cost is so much less than wired that for most signals we should expect to have to deal with these issues. The potential for benefits is also very high, as the cost of new data is reduced. Many of the groups that specialize in wireless communication are working on this issue.
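One way to get the properties the panelists describe – readings that stay confidential over the air, and a head-end that can verify which meter sent them untampered – is authenticated encryption with a per-meter key. Below is a minimal sketch using the third-party Python cryptography package; the meter ID and message framing are invented, and a real AMI stack would also need key management and anti-replay protection, which this omits.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Per-meter secret key, provisioned at manufacture/installation (assumed).
key = AESGCM.generate_key(bit_length=128)

def meter_send(meter_id, reading, key):
    """Meter side: encrypt the reading; bind the meter ID as authenticated data."""
    nonce = os.urandom(12)  # must never repeat for a given key
    ct = AESGCM(key).encrypt(nonce, reading.encode(), meter_id.encode())
    return meter_id, nonce, ct  # what travels over the wireless link

def headend_receive(meter_id, nonce, ct, key):
    """Utility side: decrypt; raises InvalidTag if data or meter ID was altered."""
    return AESGCM(key).decrypt(nonce, ct, meter_id.encode()).decode()

msg = meter_send("MTR-0042", "2010-05-24T10:00 12.7kWh", key)
print(headend_receive(*msg, key))  # original reading; tampering raises an error
```

Because the meter ID is bound in as associated data, a reading replayed under a different meter’s identity fails authentication rather than being silently accepted.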



Malcolm Unsworth

Dr. J. Patrick Kennedy

Jack Danahy

Jack Danahy. Wireless communications come in many different forms, and security approaches are dependent upon both the technology and the implementation model selected. Cellular networks, Bluetooth connections, IP wireless, and even traditional radio networks have all been considered for roles in the smart grid and have differing capabilities, but the main concerns can be captured uniformly. In a smart grid, decisions and charges are driven by the data collected, which necessitates application of all of the traditional security characteristics. In a wireless network, where the receiving system cannot rely on a permanent, hardwired connection to ensure the identity of the sender, strong authentication of the sources must be created and managed. Similarly, because the network is formed through the air, and not over a fixed wire, confidentiality must be ensured through encryption and decryption of data at the meter and at the aggregation point.

Achieving better energy efficiency is a major priority at the moment, both for providers and their customers. What role does the smart grid have to play in this effort?

JPK. The smart grid provides a mechanism for users to become more involved in addressing their own energy efficiency and conservation, and to assist in meeting demand response requirements. The smart grid provides the means of letting these users know what is required and paying them for a rapid response. These data were the missing element to monetization of better operation and control. Like many control issues, wide-scope, high-resolution data with reduced latency, disseminated to all that need it, are important to the continuous improvement process that is used for conservation. For the utilities, metering technology will improve the meter-to-cash process. It will increase accuracy, resulting in fewer estimated bills, and reduce bill complaints through better detection of fraud. Planned for the future are enterprise functions such as better asset management, load profiling and forecasting, and demand response including load shaping. This will require a unifying layer for meter data collection, unencumbered by proprietary applications at the metering level, giving rise to the MDUS versus the older MDMS approach.

JD. The smart grid is made smart through the use of analytics and an increasingly rich set of data around behavior on the grid. From a user’s perspective, it has been shown that simply exposing consumers to their usage pattern creates an average 15 percent saving in energy use. This is largely accomplished by time shifting power consumption and selecting more efficient appliances and practices. The smart grid simplifies and unifies the provision of that type of data. Similarly, increased data and analytics drive more intelligent and predictive operations of the grid. As a result, outages, downtime and the need to tap additional power sources can be reduced due to the development of a proactive capability in managing the internals of the grid.

THE PANEL

A highly respected executive both domestically and internationally, with broad experience throughout the utility industry, Malcolm Unsworth was named Chief Executive Officer and President of Itron in March 2009. Prior to his post as CEO, he was Itron’s President and Chief Operating Officer. He was elected to its board of directors in December 2008.

Dr. J. Patrick Kennedy is CEO and founder of OSIsoft, LLC. Prior to this, Kennedy worked as an engineer for Shell Development Company and Taylor Instrument Company. Kennedy attended the University of Kansas, where he earned a Bachelor of Science and a PhD in Chemical Engineering.

Jack Danahy is a Security Executive at IBM. He holds five patents and has additional patents pending in secure distributed computing, kernel security, software vulnerability analysis, and secure systems management. He currently authors two blogs: Suitable Security at http://suitablesecurity.blogspot.com and Smart Grid Security at http://smartgridsecurity.blogspot.com/.

MU. Actually, how a utility applies smart metering and smart grid technology can play a central role in meeting environmental mandates. These technologies support a clean energy economy by aiding in carbon reduction – for example, given the number of smart meters Itron can produce annually, with conservative demand response assumptions, over one million tons of carbon emissions could be avoided. And this is before we factor in things like reduced emissions from fewer truck rolls. In fact, the Natural Resources Defense Council has gone as far as saying that smart meters are essential to saving the planet. These technologies also enable a renewable power infrastructure and encourage system efficiency and consumer involvement.



Though smart meters are currently focused on electricity supply, are there other areas to which they could be applied, such as tackling sustainability issues in water and gas utilities?

JD. There are existing smart grids already in operation for these other types of infrastructure systems. The electrical grid is the most challenging implementation, given the 'just in time' nature of power generation and delivery. A new grid, fully instrumented, must allow for the constant monitoring and management of a variety of components, by technicians fully cognizant of the much larger potential impacts of failure. As an example, in traditional IT many security failures result in denial of access to resources; in a utility, that denial would be catastrophic. As a result, lessons learned in smart grid initiatives in electricity clearly help to inform and advance other infrastructure initiatives.

MU. So many of our competitors are focused only on the electric side of the smart grid. Itron is moving beyond just electricity for smart distribution systems. For gas, we're tailoring our advanced metering solution so that utilities can use the two-way communication of our fixed network technology to gather gas recorder and corrector serial data and monitor cathodic protection, as opposed to reading the instruments manually or relying on expensive cellular technology. Accessing these readings through our gas AMI network will reduce operational costs for utilities. And for water, we're making tremendous headway in turning usage information that has typically been static into dynamic, actionable data. As these solutions become increasingly sophisticated, different players will need to work together to create integrated systems. This will require a new view of, and a new competence in, collaboration.

JPK. Users also want sustainability, which requires lower consumption of all resources. Even though water and gas are not consumed as they are produced, as electrical power is, there is a nexus between all of these and they should share the same continuous improvement process. In a presentation on sustainability, Kodak showed that intelligent metering and submetering of power, water and process allowed them to save one billion gallons of water over three years, as well as over $100 million – proof that conservation can be a major force. Their industry-standard continuous improvement process leveraged this wider scope of data, including the real-time price of power and its history from their provider. It is not possible to say which type of metering is more important; the emphasis should be on the continuous improvement process, which requires the unification of metering, utility and process data with analytics.

What are the key innovations and improvements that need to be realized before the smart grid can achieve its true potential? How far off do you think these advances are likely to be?

MU. Fostering the grid is not only one of our industry's largest opportunities; it's also one of the most complex. We need to move quickly and decisively with the best ideas and the right smart technologies. For the smart grid to mature, we need an honest discussion, stripped of the hype. We can't let a single company's interests get in the way of realizing the smart grid. We need open standards that allow for seamless interoperability between hardware, software and other systems in the grid – and that encourage future innovation. Utilities and regulators need to demand standards that allow for this integration and for evolution. Building the smart grid may take a lifetime, but aspects of it are here today and are developing rapidly. To be successful, we need an infrastructure that is designed to evolve.

JPK. The smart grid requires extreme scalability, both for event management and for the total number of real-time data streams. As the cost of metering technology and communication is lowered by the continued spread of broadband networks and advances in wireless security, the amount of resources conserved will increase. Communication to the end user via portals and smart phones using social networking will allow consumers to take action from smart devices. The industrial user has even greater potential: they have the control technology in place, the people to use it and the economic justification to adjust their processes for monetary rewards. Alcoa recently presented a project to accomplish this, reporting a four-month payback on a $700,000 project achieved by supplying energy back to the grid during peak demand periods while controlling their manufacturing processes.

JD. While some may point to technology-based improvements such as utility-scale storage as the keys, the most critical advancement will be in the leadership of the various providers in the smart grid food chain. Recognizing the unique challenges of reliability, responsiveness and security as co-equal in the new grid will force a different mindset on the part of utility providers, regulators and product vendors. Technically, there are dozens of challenges, but none of them is notably more difficult than the myriad that were tackled and tamed during the internet revolution. As with so many new advancements, the greatest challenge will be motivating the players to progress, in the face of risk, beyond the status quo.
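The Alcoa example boils down to simple payback arithmetic: event payments earned against the up-front project cost. A sketch with deliberately rough numbers – only the $700,000 cost and the roughly four-month payback come from Kennedy's account; every other input is an invented placeholder:

```python
# Back-of-envelope demand-response payback. Only the $700,000 cost and the
# roughly four-month payback are from the reported Alcoa example; every
# other figure below is an invented placeholder.
project_cost = 700_000        # reported project cost ($)
mw_shed = 10                  # assumed average load shed or back-fed per event (MW)
price_per_mwh = 250           # assumed peak-period payment ($/MWh)
event_hours = 4               # assumed duration of each event (h)
events_per_month = 18         # assumed number of called events per month

monthly_revenue = mw_shed * price_per_mwh * event_hours * events_per_month
print(f"${monthly_revenue:,.0f}/month; payback in "
      f"{project_cost / monthly_revenue:.1f} months")
# These placeholders give $180,000/month and a 3.9-month payback, in the
# same ballpark as the four-month figure reported above.
```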



EXECUTIVE INTERVIEW

An intelligent solution

Ian Macleod reveals how automated meter reading technologies can make a real difference to our water footprint.

As the population continues to rise, water becomes an increasingly valuable resource. How can new technologies help utilities and their customers to preserve this resource?

Ian Macleod. New technologies help create new understanding of how and where we use water. People can't manage what they don't measure, nor will consumption behavior change without greater clarity about usage trends. Advanced measurement, collection and management technologies bridge the gap between the perception and the reality of our water footprint. New Automated Meter Reading (AMR) technologies give utilities a powerful conservation tool for managing water preservation: rather than simply demanding less water usage, they can now definitively show where and how. Data can be made accessible in near real-time via the internet for daily water budgeting, or for zip code comparison of one's personal usage against one's peers – which brings out the competitive spirit in everyone to preserve water and cut waste. Leaks are a constant threat of water loss. Leak detection, system balancing and precise zonal measurement (DMA/DMZ) are all possible today, and many utilities are making leak alerts available in real-time via SMS text message or email for immediate action.

What data collection challenges exist in suburban and rural areas? What solutions exist to overcome these challenges?

IM. Every landscape offers a unique data collection challenge. The good news is that technologies now exist for every situation facing utilities. Densely populated areas may be more suitable for a fixed network system. A mobile drive-by system works in almost every environment and can offer much of the advanced functionality of a fixed network. UAV drone technology outfitted with meter data collection electronics is now available and is best suited to wide-open, sprawling rural areas that would otherwise demand great distances and time to capture perhaps very few reads. The key to true success is a hybrid framework allowing utilities to pick and choose simultaneous technologies for their specific requirements.

A concern for some utilities in employing innovative solutions can center around ease of use. How important is it that any new system be user-friendly and simple to deploy?

IM. This is a major concern and focus for technology developers. Utilities now enjoy unprecedented assured delivery of real consumption. Yet fighting the water loss battle is akin to military conflict – data is only as good as it is actionable. Intuitive, well-conceived and highly user-friendly software transforms raw meter data into something actionable. In what sector is water pumped versus consumed out of balance? What is our overnight non-revenue water? Where are the major water main leaks occurring? Smart software brings insightful awareness to threats to the utility bottom line, provides a context for improved customer service relations, and helps pinpoint areas for improving operational efficiency.

What impact can new technologies such as these have on utilities' bottom line? Do they provide genuine ROI?

IM. The ROI impact is evident, measurable and present across several operational areas within utilities. ROI is achieved most demonstrably through redirecting existing resources. AMR meter reading technology creates operational efficiency: personnel can cross-train and engage in aggressive meter testing and repair programs, customer service activities and conservation programs without adding people to the payroll. Water treatment costs big bucks – chemicals and electricity get expensive even if water is abundant and cheap – and the cheapest source of new water is to preserve what you already hold. ROI is quickly achieved when utilities can identify and target leaks and keep water from seeping into the abyss. Synchronized meter readings alert utilities to water loss and imbalance issues and, depending on the AMR platform used, narrow the search for leaks to specific sectors so action can be taken quickly. AMR technology packs power into the conservation punch insofar as people are empowered with actionable information to make real differences, thus leveraging money already spent on public outreach campaigns.
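Macleod's 'pumped versus consumed' and 'overnight non-revenue water' questions are, at bottom, a district water balance computed from synchronized readings. A minimal sketch of that calculation – the figures and the 15 percent action threshold are invented for illustration:

```python
# Minimal district (DMA) water balance from synchronized meter readings.
# The sample data and the 15 percent action threshold are invented.
def water_balance(pumped_m3, customer_readings_m3):
    consumed = sum(customer_readings_m3)
    non_revenue = pumped_m3 - consumed  # leakage plus unbilled consumption
    return non_revenue, non_revenue / pumped_m3

# Overnight window (say 2-4 am), when legitimate demand is lowest: a high
# minimum night flow is the classic signature of a leak in the district.
pumped_overnight = 420.0                       # m3 delivered into the district
meters_overnight = [0.4, 0.1, 0.0, 0.6, 0.2]   # tiny sample; real DMAs have thousands

nrw, ratio = water_balance(pumped_overnight, meters_overnight)
if ratio > 0.15:
    print(f"Investigate: {nrw:.0f} m3 ({ratio:.0%}) unaccounted for overnight")
```

Run per district metered area, the same balance is what lets a utility narrow a leak search to a specific sector rather than the whole network.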

Industry expert Ian Macleod, VP of Marketing for Master Meter, Inc., also serves as an advisor for key industry conferences. Master Meter is a principal North American water meter manufacturer and a leading AMR/AMI technology innovator.






TRANSPORT FOCUS

THE ROAD TO A SMARTER PLANET

By implementing intelligent transport solutions, cities around the world are beginning to see improved mobility and a more cost-effective network. IBM CEO Sam Palmisano reveals how solutions are set to get even smarter to meet the transportation needs of the 21st century.

It's fair to say that the first decade of the 21st century has been remarkably eventful. In the last few years, our eyes have been opened to global climate change, the risk of pandemic, and the environmental and geopolitical issues surrounding energy. We entered the new century with a shock to our sense of security on 9/11 and have become aware of the vulnerabilities of global supply chains for food and medicine. And today, of course, we are working our way out of a global financial crisis.

It's interesting how many of those crises are linked to transportation. Indeed, more recently, transportation had its own major crisis – the disruption of global air traffic caused by the volcanic eruption in Iceland. So what links this decade-long series of crises? These are the manifestations and consequences of being globally integrated, reminding us that we are all now economically, technically and socially connected. But we see now that being connected is not enough. Have you noticed how often in the past decade we have used or read the term 'systemic breakdown'? While connecting economies, business flows, supply chains – and transportation networks – is important and has yielded tremendous benefits, connectivity by itself does not make for reliable, resilient, well-functioning systems.

Fortunately, something else is happening at the same time. In a word, our planet is becoming smarter. And this isn't just a metaphor. Intelligence is being infused into the way the world literally works: the systems and processes that enable services to be delivered; physical goods to be developed, manufactured and sold; everything from people and freight to oil, water and electrons to move; and billions of people to work and live.

First, our world is becoming instrumented. Today there are nearly a billion transistors per human, each one costing one ten-millionth of a cent; there are four billion mobile phone subscribers; and 30 billion radio frequency identification (RFID) tags are produced globally. Because of their increasing sophistication and low cost, these sensors and devices give us, for the first time ever, real-time instrumentation of a wide range of the natural and manmade systems all over the world.

Second, our world is becoming interconnected. Very soon there will be two billion people on the internet. But that's just the beginning: systems and objects can now 'speak' to one another, too. Think about the prospect of a trillion connected and instrumented objects – cars, cameras, roadways, pipelines, even livestock and pharmaceuticals – and then think about the amount of information produced by the movement and interaction of all those things. It will be unprecedented.

Third, all things are becoming intelligent. Thanks to advanced analytics and ever more powerful supercomputers, we can turn mountains of data into insight. And that intelligence can be translated into making our systems, processes and infrastructures more efficient, more productive and responsive.

The key to smarter systems lies not in the chip, or the sensor, or the mobile device. It's not the smart meter, or the smart power line or even the software, per se. It's the data. From a smart bay in Ireland, to smart power in Malta, to smart telecommunications in India, to smart food tracking in Norway, companies and institutions are applying technology in new ways. All around the world, economic stimulus is being injected by governments, much of it aimed at smart grids, healthcare data integration – and crucially, smart transportation – to improve the systems that make our world work.

Now as we look ahead to the needs and challenges of the 21st century, it is clear that we must do more. We know what any transportation system is, on the most basic level. From ancient times to the present, any such system has been made up of three elements: vehicles (cars, ships and planes, which move goods from one place to another); pathways (roads, rail lines, shipping lanes); and terminals (stations, car parks, airports, seaports). These are the endpoints where journeys begin and end, where passengers transfer from one mode of transportation to another, and where goods are tracked, organized and assembled. That's the system.

We also know that population growth, unprecedented urbanization and continued globalization are placing great strains on all elements of that system, pushing them beyond the capacity of their serviceable life. And we know that the United States is not investing in its infrastructure at the same levels as other countries. The US spends at most 2.6 percent of GDP on infrastructure, coming 27th among 36 Organisation for Economic Cooperation and Development (OECD) nations. Compare that to China, which invests at a rate of nine percent to 12 percent of its GDP. This underinvestment creates strains on our transportation system that put our citizens and businesses at risk of deteriorating safety conditions, competitiveness and quality of life, not to mention the waste of precious resources and productivity.

However, it doesn't have to be this way. Over the past two centuries, advances in transportation, from canals, to rail, to automobiles, to aircraft, have created the modern world, determining which cities would thrive, and ushering in a new age for business, for society and for how we lead our lives. Now the time has come to return to transportation and what it has given us: new opportunity. We must reinvent transportation to meet the needs of the 21st century.

We have the tools and know-how to address the challenges. Intelligent technologies are emerging to enable transportation networks and users to communicate with each other, improving system performance, safety and convenience, making IT just as important to 21st century transportation as airplanes, asphalt and petroleum were in the last century. And under the guidance of organizations like the ITSA and the US Department of Transportation, under the leadership of Secretary Ray LaHood, progress is within our grasp.

Over the past year and a half, IBM has been working with cities and nations around the world to improve many kinds of systems and make them smarter, with particular success in transportation. In doing so, we have learned that our transportation system isn't, in fact, a system; it's a collection of related industries, operating in close proximity to one another. And, at IBM, we know something about systems. Over nearly a century of work with businesses and governments, we have designed, built and managed systems from Social Security, to modern electronic banking, to retail. In doing so, we have learned what is required for a system to be reliable and resilient. First, there must be clarity on the system's purpose or goal, a vision of its end-state. Second, its elements must actually be connected, which is another way of saying, interfaces matter. Third, we must be able to know, continually and with confidence, the status of the system and its critical components. And finally, the system must be able to adapt as conditions change, often in real-time. Viewed against these four characteristics, every well-functioning system looks strikingly similar.

Now let's look at American transportation today. There is a broad consensus, forged in many respects by the example of the ITSA and DOT, that American transportation must become traveler-centric, whether that traveler is a person or a package. The idea is simple: the traveler's time, safety and experience should be the initial design point. A system's design point matters, and what you optimize it for will determine the value it ultimately delivers. The problem is that the system has to be actually connected to truly deliver on its goal. In many areas of life, this kind of connectivity is so basic that we simply take it for granted. Consider banking: we take it for granted that we can transfer funds and make payments among institutions. Consider retail: we take it for granted that we can use the same payment and billing systems, regardless of store, website or industry. All these systems have standards and interfaces that permit information to flow. A true transportation system would need to connect the vehicles, pathways and terminals as well as the government agencies and regulators, the freight and logistics carriers, the vehicle and infrastructure manufacturers, and the travel-service providers. It would also need to connect the travelers themselves, providing a steady stream of data on their journeys, condition and location. And it would do this across all modes of transportation. Clearly, transportation in America today fails this key test of a well-functioning system.

Third, many of the components and subsystems of transportation are not instrumented, or are differently instrumented from state to state, so that it is impossible to know with confidence what their current status is. This isn't just a colossal waste of time and money; it also introduces inconsistencies in quality and multiple opportunities for error. Look at the potential impact of an emergency like the Iceland volcano. What if that happened here? What if, say, Mt. St. Helens erupted again? Or what if there were another 9/11? What would the economic, societal and innovation impact be of this lack of system knowingness?

And when it comes to the fourth characteristic of a well-functioning system – adaptability – ask yourself: is our transportation system in America today, spanning roads, parking, railroads, airports, seaports, bridges, tunnels and communications, ready for what's coming? Demand is only going to grow, especially as population growth and urbanization continue to expand. The instrumentation of things and cities, as well as the empowerment of individuals with mobile devices, will continue to increase exponentially, and we will need far more physical and digital capacity from our transportation networks.

Intelligent options

Smarter transportation is helping cities all over the world, from predicting demand and optimizing available capacity to dramatically enhancing the end-to-end traveler experience and improving operational efficiency while reducing environmental impact. Smarter transportation can also help with a nation's economic recovery: a recent study from the Information Technology & Innovation Foundation found that for every $1.25 billion invested in transportation infrastructure in the United States, 35,000 jobs are created and supported.

1. A smart card system has enabled the Singapore Land Transport Authority to develop optimal routes and schedules, reducing congestion, increasing the appeal of public transit, and cutting fare leakage by 80 percent and the cost of fare processing by two percent.

2. DHL's RFID-based system monitors the temperature of pharmaceutical shipments at various points from departure to arrival – helping its customers keep products fresh and generating a new source of revenue growth.

3. France's SNCF manages passenger and freight railways as well as city buses and trams. It operates 14,000 trains per day, including the regional high-speed TGV and segments of the Paris transit system. A predictive maintenance system using intelligent sensors is helping SNCF prevent accidents, reduce delays and cut maintenance costs by an estimated 30 percent.

4. Air Canada developed applications for smart phones that allow travelers to download electronic boarding passes, check in, get flight status and book rental cars. There was a 60 percent increase in mobile check-ins, and 93 percent of Air Canada passengers say self-service improves their travel experience. The app also saves 80 percent of the check-in cost.



Thanks to an instrumented and interconnected planet, we're capturing data in unprecedented volumes. In just three years, IP traffic is expected to total more than half a zettabyte. We're receiving these enormous streams in real-time, and they are coming in multiple forms, from text to rich media, sensors to cell-phone cameras, and we're capturing it from just about every kind of system or event imaginable: supply chains, rail vibration, weather patterns and billions of individuals using social media.

But the most important point about this is not how much data there is. The important point is what it could tell us. To capture that, you need to dive deeper, to move from 'big data' to smarter data. That's why analytics are key – the sophisticated mathematical algorithms that can detect the patterns, spot the correlations and see the context of the data – because a data point by itself is just about useless. Where once we inferred, now we can know, and where once we interpolated and extrapolated, now we can determine. That's the promise of a smarter planet. And it's coming to life in smarter transportation all over the world.

All in all, smarter transportation means advanced traffic management for air, land and sea. It is optimized around the traveler, is connected across all elements of the system, and communicates its status in real-time. It fluidly interacts with the other systems of our planet, from healthcare, to public safety, to commerce and more, because we are increasingly coming to understand that our communities, cities and our entire nation are complex systems that span both nature and human society. In the future, smarter transportation will even apply advanced modeling to something as previously unpredictable as, say, the flow of volcanic ash across the Atlantic Ocean. All of this and more is possible today, or soon will be. The progress of technology is only accelerating.

To get there, I believe those of us across the transportation ecosystem must take a leadership role. The initiatives undertaken by Secretary LaHood and his colleagues at DOT are promising, but the truth is that the rest of us do not have to wait for the government – or anyone else – indeed, we must not. Looking forward, we need help in four key areas.

First, standards. We must establish agreed-upon data standards for transportation. This is long overdue, but I am hopeful that it will soon be accomplished. As we do this, however, it is essential that those standards be open; that's the only way to interconnect processes and data sets across the whole system.

Second, smart systems by design. In anything as complex, interdependent and fluid as the transportation ecosystem, the qualities we seek cannot be 'bolted on' after the fact. We need to build in the key criteria of interconnectivity, system knowingness, analytics and security from the beginning.

Third, moving to a true transportation system will enable, and require, far more collaboration: not just the familiar idea of 'private sector-public sector cooperation'. A diverse, multi-stakeholder world requires all the parties actually working together, shoulder-to-shoulder, on a daily basis. We all have particular responsibilities – to customers, to partners, to regulators, to citizens – but in today's world, fulfilling those responsibilities requires that we also fulfill our responsibilities to the system as a whole.

That will be transformative, and will also require change. From new models of technology, to the changing form of the corporation, to the changing role of the individual in modern life, to new expectations for sustainable living, we are entering a very different world. We must come together around clear guidelines on how to operate and manage our organizations and industry, from an ethical and societal point of view. It is exciting to embrace technology to improve speed, safety, efficiency and passenger experience, but the idea of pervasive sensors and cameras sharing data with transportation providers and governments is not going to sit comfortably with everyone.

In conclusion, smarter transportation is not some grand, futuristic ideal. For one thing, the examples are real, and more are being deployed right now around the world. For another, smarter transportation is practical because it is non-ideological. While debates will continue to rage on many contentious issues that impact transportation – from energy, to security, to climate change, to the economy – no matter which viewpoints ultimately prevail, the system that results will have to be smarter, more transparent, more efficient, more accessible, more resilient, more innovative. And that's one final reason for hope: making transportation smarter is in everyone's interest. For a whole spate of reasons, the boldest action and the most pragmatic action are now one.
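Palmisano's claim that 'a data point by itself is just about useless' is easy to demonstrate. Even a crude statistical screen – the sketch below flags a traffic-sensor reading that sits more than three standard deviations from its recent history – extracts a signal that no individual reading contains. It is, of course, a toy stand-in for the far more sophisticated analytics described above.

```python
# Toy anomaly screen for a traffic-sensor stream (vehicles per minute).
# A single reading means little; deviation from recent history is the signal.
from statistics import mean, stdev

def anomalies(stream, window=12, z_threshold=3.0):
    flagged = []
    for i in range(window, len(stream)):
        recent = stream[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(stream[i] - mu) / sigma > z_threshold:
            flagged.append((i, stream[i]))
    return flagged

counts = [41, 39, 44, 40, 42, 38, 43, 41, 40, 42, 39, 41, 6, 40]
print(anomalies(counts))  # [(12, 6)]: the sudden drop, perhaps an incident
```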

[Photo caption: President Barack Obama presents the National Medal of Technology and Innovation to Sam Palmisano at the White House in Washington on October 7, 2009.]

We find ourselves today at a unique moment. The key precondition for real change now exists: people want it and they are hungry for leadership. Such a moment doesn't come around often, and it will not last forever, so ask yourself this: in hindsight, when the circumstances that cry out for change are gone, when things have returned to 'normal', don't we always wish we had been bolder? More ambitious? Gone faster, gone farther? Did anybody ever wish they had done less?

Despite the litany of challenges we face, I am confident that the US will do what leaders do: lead. I'm convinced we can build a smarter and safer transportation system in America and that in doing so, we will achieve both societal progress and economic growth for our cities, states and nation.

This text is based on a speech given at the Intelligent Transportation Society of America, 2010 Annual Meeting & Conference on May 5.


ROUNDTABLE

Driving force

Mike Girton and Karen Finley debate the issues around intelligent traffic systems.

The transportation sector is increasingly faced with social changes and greater expectations regarding the protection of the environment and resources. In what ways can intelligent transportation systems (ITS) help the sector to meet these challenges?

Karen Finley. Intelligent transportation systems, such as fixed road safety cameras, can aid in 24/7 enforcement of laws with a small environmental footprint. Officers need not expend energy, transportation time and fuel by physically being stationed at the roadside. This real-time data capture and management can prove to be both financially and environmentally responsible, and these types of safety systems enhance our growing urban landscape.

Mike Girton. One of the major benefits of modern society is being able to go where you want, whenever you want. Mobility is freedom in this day and age, and that means that ITS technology needs to make travel on the motorways as smooth and efficient as possible. There are already many different kinds of ITS across the globe, from CCTV systems and sensors to variable message signs (VMS). Some nations, such as the Netherlands, have traffic cameras capable of license plate recognition monitoring entire highway infrastructures in order to enforce speed limits. Other countries, like the United States, use cameras and sensors more often to implement everything from speed limits in school zones to fines for drivers who run red lights. While these systems do help to ensure road safety, automatic incident detection (AID) allows for more extensive systems in which a single operator can monitor hundreds of cameras simultaneously and promptly manage traffic flow and incidents as necessary. By streamlining traffic and road use through the intelligent technologies in ITS, people can get where they are going more quickly and fewer roads need to be built to cope with the rising number of automobiles. In the end, this works to protect the environment and conserve resources.

As new opportunities arise due to advances in information and communication technology, how are companies in the ITS sector capitalizing on this?

KF. From video vehicle detection to in-ground sensor technologies to intelligent vehicles, information and communications technologies are rapidly emerging on our roadways. The innovation is spawned by the increasingly economic and rapid transfer of data. As long as data communications technology accelerates, so will the advent of new and intelligent systems that communicate on the road, whether vehicle-to-vehicle or vehicle-to-infrastructure.

MG. AID systems use intelligent algorithms to continually analyze camera images for unusual occurrences and to alert operators within seconds of the type of incident and its location. From a centralized control center, the notified operator can visually verify and assess the situation and react appropriately. This may entail adjusting lane control signs, variable speed signs or dynamic message signs, as well as informing



authorities, such as the police, fire department or emergency medical services. In this way, traffic can continue more smoothly and safely, despite obstructions or mishaps. Moreover, similar technology can be used to collect data, such as counting how many trucks pass through a tunnel every day or how many cars use a certain highway each morning. With information like this, it becomes possible to reroute traffic or adjust infrastructures to improve road safety while simultaneously making highways more efficient, thus reducing the need for more roads.

Safety and efficiency are key priorities in transportation. How are ITS systems evolving to help deal with these issues?

KF. The leading cause of death for Americans younger than 34 is automobile crashes on our roads and highways. Automated safety technology such as speed safety cameras and intersection safety cameras for detecting red light running are rapidly being deployed in about half of the states in the US. These systems don't replace the judgment of a police officer. Rather, they add an extra layer of automation and technological precision at the roadside and leave the ultimate authorization of a violation to an officer, who views the evidence back at the police station. This efficiency puts in motion a multiplier effect whereby officers no longer need to use precious time and resources to remain stationary at the roadside, but can be redeployed to attend to other serious crimes in that community.

MG. Companies are working to develop integrated and efficient solutions that combine various aspects of traffic management systems. This offers a more proficient and consistent means of transmitting collected material and initiating responses to incidents. To that end, Optelecom-NKF, a global supplier of advanced video surveillance solutions, and Traficon, an expert in the development and implementation of AID algorithms, joined forces to create the Siqura TrafficServer. By embedding Traficon's field-proven incident detection algorithms into a Siqura encoder, the two companies were able to create a cutting-edge IP product that not only increases the quality and effectiveness of the AID system on the whole but also reduces the resources required. By sharing cameras for both incident detection and video monitoring, the Siqura TrafficServer calls for fewer cameras. This decreases power consumption and installation space requirements, which, in turn, ultimately results in less maintenance and a lower total cost of ownership.

Since the Siqura TrafficServer uses a dedicated DSP to implement the incident detection algorithms, all the number crunching is done locally, on the video server itself, and it therefore requires less processing power than traditional, centralized AID solutions. This dedicated DSP also cuts back the amount of data lost in transmission. As a result, the system uses less bandwidth while actually improving video quality. An additional dedicated DSP allows the Siqura TrafficServer to transmit the promising new H.264 as well as MPEG-2, MPEG-4 and MJPEG simultaneously. Consequently, the Siqura TrafficServer is able to offer a selection of streaming options for several different purposes. Using MPEG-2 for live viewing, for example, offers a high-quality picture with relatively low bandwidth, while H.264 provides an excellent image that doesn't take up much space, making it perfect for storing video material. Alternatively, MPEG-4 has a lower resolution but is optimal for streaming to web applications, and MJPEG is best used for transmitting image data to remote devices. The power to choose between these different video compression algorithms ensures an easy and effective integration of the Siqura TrafficServer with other, existing applications as well as with future additions. Moreover, the Siqura TrafficServer is making life easier for operators and systems integrators. Each individual stream can be merged with incident detection information, giving operators a complete overview of traffic data and events, and allowing them to easily access, maintain and enhance traffic conditions.

What are the main developments you envisage for the future of ITS technology?

KF. Other enforcement technologies on the infrastructure side that are gaining acceptance are automated license plate recognition and automated enforcement of bus lanes, HOV lanes and double white lane crossing. With distracted driving becoming a causal factor in accidents, due in part to cell phones, we believe technology must be used to counter technology. Some main developments we are likely to see are more vehicle-to-infrastructure communications for safety: the wireless exchange of data between artificially intelligent vehicles with embedded computing systems and highway infrastructure, intended primarily to avoid motor vehicle crashes through real-time driver warnings.

MG. Ultimately, ITS technology will move towards networks that integrate all the latest features available for traffic management, such as automatic license plate recognition and data collection and analysis applications.
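Vendors do not publish their AID algorithms, but the underlying pattern – watch each detection zone continuously and alarm within seconds when behavior departs from normal – can be shown with a deliberately simplified rule. Everything in this sketch (the per-lane speed feed, thresholds and timing) is invented for illustration and is not Traficon's algorithm:

```python
# Deliberately simplified automatic incident detection (AID) rule: alarm when
# a lane's measured speed stays near zero for several seconds. Real AID
# systems fuse many cues from video analytics; this is only a sketch.
def detect_stopped_vehicle(speeds_kmh, sample_rate_hz=2,
                           speed_floor=5.0, hold_seconds=4.0):
    """Return the sample index at which an alarm would be raised, else None."""
    needed = int(hold_seconds * sample_rate_hz)
    run = 0
    for i, v in enumerate(speeds_kmh):
        run = run + 1 if v < speed_floor else 0
        if run >= needed:
            return i
    return None

lane_speed = [92, 88, 90, 61, 23, 4, 2, 1, 0, 0, 1, 0, 0, 2]  # half-second samples
alarm_at = detect_stopped_vehicle(lane_speed)
if alarm_at is not None:
    print(f"Incident alarm after {alarm_at / 2:.1f} s of observation")  # 6.0 s
```

The hold time is the design trade-off: shorter and the operator is flooded with false alarms from stop-and-go traffic; longer and the 'within seconds' response Girton describes is lost.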

Karen Finley, President and CEO of Redflex Traffic Systems, Inc., started at Redflex in 1998, when Redflex had three US safety programs and 30 employees. She has successfully ushered the business through an enormous expansion, with revenues increasing over 15-fold. Today, Redflex has over 400 employees and 250 safety programs.

Mike Girton is Senior Solutions Engineer for Optelecom-NKF and has 20 years of experience in field product support and management. His 13 years of experience with Optelecom-NKF include technical sales support for the fiber and IP product lines. Girton provides pre- and post-sales technical support and training for Optelecom-NKF customers, distributors and representatives.




LAND MANAGEMENT

MAPPING THE FUTURE

New technologies and techniques are driving development in the mapping and surveying sector. Curtis Sumner explains how the effective use of tools will help the transition to a more sustainable and secure future.

“Surveying and mapping is set to continue playing a large role in creating and developing renewable energy infrastructure, particularly with regard to the positioning of energy sources,” explains Curtis Sumner, Executive Director of the American Congress on Surveying and Mapping (ACSM). The non-profit educational organization was founded in June 1941 to advance the sciences of surveying and mapping and related fields. “Let's take wind farms, for example. Whether it's going to be on land or out in the ocean, a position needs to be determined with regard to where it's going to be – be that on private property or, as with a lot of the land in the US, owned by the federal government. Either way, the information about the land is critical to the people who are going to fund the project.”

The position of where this activity is going to occur is vitally important from the private or public land ownership perspective. Sumner explains that regardless of who owns the land, there is a fairly standard type of survey used in commercial land transactions, called the ALTA/ACSM Land Title Survey. The survey has a specific set of criteria under which it is to be conducted that goes along with whatever the jurisdictional laws are in that particular state. The ALTA/ACSM survey is a national standard that in some cases supersedes what the state requires and in some cases is not as restrictive as the state requirements. However, it is a uniform set of standards that provides information to a lending institution, whether that site is in



Virginia or Wyoming, for example.

“As renewable energy activities become more commonplace, the positioning and the impact on development, from title rights to other interests in the land, are something that surveyors would typically be involved in identifying – from the geographical position of the property as it relates to other properties around it, but also from whatever those title issues may be,” says Sumner, who goes on to outline other opportunities. Even in the oil and gas industry, because of the location of the wells and the actual pockets of oil or gas, surveyors need to get involved in the location of those sites. “There are surveyors in the country who focus primarily on this type of work, and some of them are doing quite well these days because that's what people are looking at, whether it's existing energy sources or the newly named renewable energy resources.”

Sumner explains that the Bureau of Land Management (BLM) has been involved in looking at some of the locations, particularly potential wind farms, that will fall on public lands. The bureau is being asked to provide survey data that is equal to the ALTA/ACSM survey. “They haven't previously worried about where this is positioned necessarily in the overall scheme of things, but now, if someone is going to lease a portion of that land, then the position of it within the public lands system is relevant. It may be miles and miles away from any known markers identifying the boundaries, and so it can require quite a bit of work – and of course global positioning systems (GPS) are very helpful in that regard.”

Indeed, GPS has been a major component in the technological advancement of the surveying and mapping industry over the past 10-20 years, mainly because it affects all the other technology in the sector. “If I'm using a scanning device I'll be getting tons of points from which I'm going to draw some conclusions, whether I'm determining the structural integrity of a bridge or a piping facility or whatever. But if I want to know where that facility is located, relative to everything surrounding it, then I'd still have to get those geographic positions,” says Sumner. “GPS has an important role to play in the position of things as they are relative to whatever is around them. And that's one of the things that sometimes gets a little bit lost in the whole concept of geographical information systems (GIS) and geographic positions versus getting relative positions.”

Historically in the surveying world, when talking about land, what is really important to the landowner is how it fits relative to the property around it. While it may not be important where it lies geographically in space or in the overall world, what is essential is the way it sits with respect to whatever's beside it. One of the big challenges in this space is creating a data structure that includes the position of property ownership within the overall context of geographic positions and GIS systems. “It all comes down to protecting the integrity of the individual land ownership from a relative perspective on the ground versus its overall perspective in a broader picture,” says Sumner, who explains that while GPS is a big part of that, scanning is becoming a big tool. “When we went to instrumentation and surveying that required less physical activity and fewer people, we gained the ability to gather data by whatever method it may be – whether through scanning or robotic instruments, where in many cases you have one person on the ground with an instrument, directing it out there on the front end gathering the data.”

All of these technologies have had a major impact on the surveying community, in particular from the perspective of collecting the data that has to be analyzed. However, there are still technological gaps that need to be filled, mainly around the whole idea of positioning. “If you were going to go down to your local county courthouse, for example, most of which now have tax mapping systems that are in a GIS, they've all been converted to a coordinate system that's tied to something. If you want to get data about your property, you may be given information from that GIS system that gives geographical positioning on where that map says your property corners are.

“However, they may not match where they actually are on the ground because of the way the data was imported into the system


to begin with. And so what we're seeing is a fairly significant incidence of the positioning provided through the GIS systems to the landowners, or to whomever, being in conflict with the positioning that was created by the surveyors on the ground. The concern is that somehow we lose sight of the integrity of the documentation on the ground.”

Sumner goes on to say that if all the properties in the US were surveyed on the same infrastructure or geodetic platform, and they were all located accurately, the simplicity of pulling those coordinates off and going out and finding the right spot would be dramatically increased. Today, local surveys completed on people's properties are not tied to those systems – instead they are relative surveys regarding what is around them. Getting all this information into one central system would involve inputting 300 years of localized surveying onto the system. This is the biggest challenge in the sector, suggests Sumner, because part of what professional surveyors do is work with individuals or property owners of any stripe to help them identify where their property is and how what's going on at that property relates to them. “We see the evidence of confusion being created where people can just get data out of the local GIS and think they can take their little handheld GPS unit out and identify where their property falls, and that's simply not true.”

Will we see a standardized process implemented in the future? Sumner certainly thinks that there is a place in the market for making this happen, although he is keen to highlight that there must be a higher level of understanding of the relevance of ground positions as they relate to what is in the computer system. “People, by and large, regardless of their background, don't really understand the difference between a ground position and a computer system. If it's somewhere on a map and there's a geographic position, then people tend to think 'there we go, that's it'. And it's an educational process really, where we need to have a discussion about the validity of data that's coming from whatever the source is as it relates to individual land ownership, which is the backbone of what this country's built on.”

Looking forward, Sumner reveals that this is something he would like the ACSM to be involved in. “We've got all these wonderful tools that are just so unbelievably useful in so many different contexts, and so when you start talking about the real-world location of things and how it affects people's property rights and the use of property, sometimes that's seen as almost an impediment to progress, as opposed to an essential part of being able to move ahead.” While Sumner admits that it is not necessary to have this relative positional integrity to use data that has been imported into GIS systems from different sources, he says that when you actually implement something on the ground on that property, it becomes extremely important. “When we surveyors get involved and say 'wait a minute, we need to give consideration to another side of things', that's when we are considered an impediment to moving ahead rather than viewed as a possible solution long-term,” he adds.
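The 'computer system versus ground' distinction Sumner keeps returning to can be made concrete. A GIS publishes coordinates in some reference system; comparing them with a survey measurement means an explicit transformation, and any error introduced when parcel data was first imported survives that transformation untouched. A minimal sketch using the open-source pyproj library – the coordinates are invented, and UTM zone 13N is simply an example projection:

```python
# Transform a parcel corner from geographic coordinates (as a county GIS
# might publish them) into projected coordinates comparable with a ground
# survey. The coordinates are invented for illustration.
from math import hypot

from pyproj import Transformer

# WGS84 longitude/latitude in, UTM zone 13N meters out
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:26913", always_xy=True)

gis_corner = (-105.0004, 39.7400)     # the corner as recorded in the GIS
survey_corner = (-105.0000, 39.7400)  # the corner as found on the ground

gx, gy = to_utm.transform(*gis_corner)
sx, sy = to_utm.transform(*survey_corner)
print(f"GIS vs. ground discrepancy: {hypot(gx - sx, gy - sy):.1f} m")
# A slip of 0.0004 degrees of longitude is roughly 34 m at this latitude,
# easily enough to 'move' a boundary line across a neighbor's driveway.
```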



BOUNDARY DISPUTES

GPS can only go so far:

• During the severe 2008 drought, the state of Georgia tried to reassert its right to its northern boundary on the 35th parallel, authorized by Congress in 1796 when the state of Tennessee was created. The point on the parallel where Georgia, Tennessee and Alabama meet is on the southern bank of the Tennessee River. It turned out the 19th-century surveyor tasked with finding the western end of Georgia's northern boundary was off by just over a mile, ending Georgia just south of the river. That left Georgia unable to tap into the river and pump much-needed water south into Atlanta.

• The monument where the four corners of Arizona, Utah, New Mexico and Colorado come together is off by 1807 feet. But the surveying error has become irrelevant, since the states have adopted the accepted location as the only spot in the country where four states meet.

• In 1763, the towns of Shelburne and St. George were granted charters. However, neighboring communities, including Vermont's biggest city, Burlington, had already been laid out, so Shelburne and St. George were squeezed in, their maps overlapping. In 2007, a couple built a house on the disputed territory and had to ask for building permits in St. George, even though tax maps showed the land to be in Shelburne. Despite modern mapping techniques, the towns had to use the old maps, look at stone walls and rock piles, and survey the area again.

Source: Politics News

Education is obviously key to understanding, and Sumner says that when looking at how land is used it becomes imperative, particularly in regard to the impact on surroundings. “If somebody's going to build a road across my property,” he suggests, “then I want to know how that impacts me overall, not just in general. If you're showing me a map that's been created from satellite imagery, that's all well and good and it's very useful, but it's not going to tell me what the specific impact on my property may be in terms of what may be left once you're done and how much you're taking from me. Those have to be determined by identifying whatever the improvement's going to be, relative to what I own, not just in a broader-picture perspective.

“For example, if I look at Google Earth, I can zoom right in on my house, look at the date and try and work out whose car is sitting on my drive, but that doesn't really tell me very much in terms of whether or not the sidewalk was actually built on my property or is within the right-of-way that was established for it, or any of those types of issues.”

And with the progression of technology clearly having an impact on how the mapping and surveying sector moves forward through the 21st century, better understanding and standardization throughout the industry will also have a part to play. “We have to be careful not to let our love affair with the technology overcome our common sense as it relates to the impact at whatever level,” warns Sumner. “I wouldn't like to see technology drive our attitudes regarding how we address all the issues up and down the ladder, even though technology will of course drive the way we gather data and the way we process it. That said, we need to make sure that while we take advantage of that technology, we don't lose sight of the common sense side of things in terms of impact.”



EXECUTIVE INTERVIEW

Enhancing coastal planning

Geospatial technology is critical to better understanding our coastal environments for more efficient infrastructure planning, explains Ed Saade.

Recent events in the Gulf of Mexico have drawn coastal management issues into the spotlight. What role can geospatial technology play in managing coastal infrastructure projects?

Ed Saade. Human activity and awareness along our coasts clearly continues to accelerate. Coastal and marine waters support millions of jobs in the US, while more than 95 percent of US overseas trade by volume transits through US ports. This trend has driven the need for new infrastructure and the need for smart planning. Yet here in the US, as in most other nations, most maps of the coastal zone that are key to planning activities are outdated or inaccurate. Building an accurate and up-to-date base map of the coast allows private and public stakeholders to make informed and effective decisions related to the planning, design, construction and maintenance of infrastructure, from ports and harbors to offshore wind farms. It also provides critical information for a broad range of other applications, from navigation safety to disaster preparedness and response. That's why we support initiatives such as the Digital Coast. Recent advances in geospatial technology now allow us to develop comprehensive baseline maps over large portions of coastline in a cost-effective and timely manner.

What changes have driven the way geospatial technology is used today in coastal environments?

ES. There have been changes both from a policy and a technology standpoint. On the policy side, governments are under increasing pressure to implement strategies in the interest of long-term sustainability. In the US, for example, there are a growing number of coastal and marine spatial planning (CMSP) initiatives being implemented to study coastal environments and identify areas best suited for various types of human use. CMSP requires accurate, up-to-date geospatial information in order to be effective. On the technology side, new advances in remote sensing and data processing have made it possible to develop accurate, seamless geospatial data across the land-sea interface, including topography, natural habitats and geology. In the case of infrastructure projects, this provides valuable data for many planning activities such as modeling, site selection and engineering design, among others. At Fugro, for example, we have pioneered the use of geospatial, geotechnical and met-ocean services to support offshore wind farm siting and development activities.

What is so unique about this new coastal mapping approach?

ES. Previously, mapping the seabed and mapping the land topography were separate activities handled by separate agencies or firms. Fugro is one of the only firms in the world to own and operate the technology required to make this type of mapping possible. Our approach uses a unique combination of airborne laser-based mapping, digital imagery, and marine and terrestrial surveying technologies. Because data acquisition occurs simultaneously and must coincide with the appropriate weather and water conditions, careful planning and coordination is required. Having all the equipment, resources and technical expertise under one roof greatly facilitates this process. This approach also makes it possible not just to acquire the data, but to deliver the maps in a matter of months rather than years.

How do you see geospatial technology evolving in the future, and how will it benefit infrastructure developers and managers?

ES. Geospatial technology has witnessed tremendous growth in the last few years and delivered immense benefits to public and private organizations alike. It will continue to evolve to provide ever more relevant data that will increase efficiencies for a broad range of onshore and offshore infrastructure projects. As an example, Fugro recently developed a new airborne panoramic mapping sensor that delivers both standard and oblique imagery in easy-to-use desktop mapping software. It allows non-specialists to develop accurate three-dimensional information about topography and man-made structures. We will see more of this trend in the near future, with geospatial data and tools increasingly put into the hands of end users, allowing them more control over how the data is collected and used.

Edward J. Saade is President and Managing Director of Fugro EarthData. He has over 30 years of experience in marine- and land-based geospatial and geophysics applications. Saade previously served as President and Managing Director of Fugro Pelagos, which under his leadership became a world leader in hydrographic multi-beam and backscatter techniques.


ASK THE EXPERT

A new direction
Rick Vincent outlines some of the latest innovations in mapping technology.

The Sanborn Map Company, Inc., an industry leader in aerial mapping and LiDAR collection and processing, continues the tradition of implementing cutting-edge technology with the deployment of the Optech Lynx V200 Mobile Mapping System, a combination LiDAR and high-resolution video system designed to meet the accuracies required for today's engineering-grade applications and solutions. Leveraging Optech's 33 years of technology development expertise and the latest in LiDAR innovation, the Lynx Mobile Mapper employs a revolutionary new iFLEX LiDAR sensor head to collect survey-grade LiDAR data at more than 200,000 measurements per second with a 360º field of view while maintaining a Class 1 eye safety rating. Sanborn's new mobile mapping survey solution is designed for the collection of engineering/survey-grade LiDAR data over large areas that are impractical to collect with static LiDAR sensors but require an accuracy and resolution that exceed airborne technologies. With a system accuracy better than 5 cm and a resolution of up to 1 cm, the Lynx Mobile Mapper offers Sanborn customers unprecedented 3D detail – all from a vehicle moving at speeds up to 60 mph.
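The quoted figures invite a quick back-of-the-envelope check of what 200,000 measurements per second means at highway speed. The arithmetic below is our illustration, not an Optech or Sanborn specification; it simply converts the quoted pulse rate and maximum speed into returns collected per meter of travel.

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.

PULSE_RATE_HZ = 200_000          # measurements per second (quoted above)
SPEED_MPH = 60                   # maximum survey speed (quoted above)

speed_m_s = SPEED_MPH * 1609.344 / 3600        # 60 mph is about 26.8 m/s
points_per_meter = PULSE_RATE_HZ / speed_m_s   # roughly 7,500 per meter

print(f"{speed_m_s:.1f} m/s -> {points_per_meter:,.0f} measurements "
      f"per meter of travel")
```

Even at full highway speed the sensor lays down several thousand range measurements for every meter driven, which is why centimeter-scale features such as pavement cracks and overhead wires remain resolvable.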


Sanborn mobile mapping provides remarkable capability for the rapid 3D mapping of highways, infrastructure, and buildings using vehicle-mounted lasers and 5-megapixel digital video cameras. It is a proven solution for collecting engineering/survey-grade LiDAR data over large areas where surveys are impractical with static LiDAR sensors, but require an accuracy and resolution that exceed those of airborne technologies. Traveling at normal road speeds – day or night – Sanborn's Lynx Mobile Mapper offers a 360° field of view, with higher-precision mapping to very long ranges. Capturing every detail along the highway corridor, including road barriers, cracks in the road surface, ditches and overhead wires, surveyors can create highly accurate 3D computer models for new scheme planning, road maintenance, wide-load route assessment, and asset management applications.

Sanborn mobile mapping is an ideal survey solution for LiDAR data collection for rail applications. Installation on a 'speeder' has demonstrated the Lynx's ability to provide unprecedented detail for rail asset management. Traditional survey methods require frequent measurements on the rail base, the top of the rail, and the rail base on the opposing side – a labor-intensive, disruptive and occasionally dangerous process. By contrast, LiDAR and video data acquisition takes much less time and thereby minimizes the disruption to rail traffic. Surveyors are not put in harm's way, and measurements are more frequent and easily chosen by the operator.

Sanborn has extensive experience with the collection and processing of massive amounts of geospatial information. With a fleet of nine aircraft, six digital aerial mapping cameras, three airborne LiDAR systems, 600TB of storage and 300 CPUs for processing, and a technical staff of more than 150, including engineers, surveyors, photogrammetrists, GIS specialists and certified project managers, Sanborn can effectively manage, control, and produce the information and value-added products required from our Lynx Mobile Mapping System. The addition of the Lynx Mobile Mapping System allows Sanborn to offer a complete 'Ground2Sky' picture of any environment, and is a natural extension to the workflows and processes in place for airborne LiDAR applications.

With more than 31 years in the mapping industry, Rick Vincent has managed orthophoto mapping and LiDAR programs both in the United States and internationally. Vincent currently oversees operational management for The Sanborn Map Company's LiDAR, orthoimagery, and GIS mapping production division.




SECURITY

The big conversation
Chris Essid outlines the future of interoperable emergency communications.

Following the attacks of September 11, 2001, the National Commission on Terrorist Attacks upon the United States (9-11 Commission) underscored the importance of interoperable communications during response activities. "High-risk urban areas such as New York City and Washington, D.C., should…ensure communications connectivity between and among civilian authorities, local first responders, and the National Guard. Federal funding of such units should be given high priority by Congress." The 2005 landfall of Hurricane Katrina along the Louisiana coast made clear that interoperable communications were needed well beyond the confines of "high-risk urban areas". Consequently, to address these issues, Congress established the Office of Emergency Communications (OEC) within the US Department of Homeland Security (DHS). Congress created the OEC to be the federal focal point for improving emergency communications operability and interoperability across the nation. OEC's charge is both vital and complex. Estimates are that over 50,000 emergency response agencies exist across the nation, working in diverse disciplines and geographical locations, and under various jurisdictions.

To complicate matters further, 90 percent of the emergency communications infrastructure is owned at the state and local level, with the vast majority of emergency responders being state and local employees. This means that improving communications interoperability is impossible without the cooperation and consent of local first responders. As a result, OEC takes a 'stakeholder-driven' approach to its mission. Rather than issuing mandates, OEC supports states and localities by providing the tools, guidance, and coordination necessary for them to enhance emergency communications. The cornerstone of OEC's efforts is the National Emergency Communications Plan (NECP), issued in July 2008. The NECP is the first-ever national roadmap for improving emergency communications. The NECP provides a comprehensive strategic plan for emergency responders and government officials at all levels, and is designed to guide them in making measurable improvements in emergency communications. Developed with input from more than 150 emergency response experts and government officials, the NECP truly is a national plan. OEC created the NECP using information gleaned from the statewide communication interoperability plans of all 56 states and territories,



national-level disaster after-action reports, and feedback from stakeholders from numerous disciplines and jurisdictions. The plan outlines three goals (see Key objectives), supported by seven objectives that are further broken down into related initiatives and milestones. Together, the goals and objectives comprehensively address the primary issues affecting emergency communications operability and interoperability. OEC developed the NECP with the awareness that communication gaps can exist for a wide variety of reasons. Sometimes the issue is technology-related – agencies use different radio frequencies, or have incompatible proprietary communication systems and infrastructure. But just as often, the roadblock is a lack of coordination and peer-level working relationships, which prevents the development of standard operating procedures and cross-jurisdictional, cross-disciplinary collaboration. Recognizing this, the NECP addresses the full range of emergency communication capabilities needed by emergency responders and maps those to the SAFECOM Interoperability Continuum. NECP milestones range from the development of technology standards, to developing a catalog of federal-level technical assistance, to establishing best practices for emergency communications coordination with international partners. Over the past year, OEC has worked diligently, along with its partners at all levels of government and the private sector, to complete the NECP milestones in several areas. One key initiative is the establishment of governance structures that provide the necessary leadership, coordination and accountability for emergency communications efforts. OEC accomplished this milestone by sponsoring workshops, working groups, technical assistance, and grant funding. As a result, more states now have

a comprehensive statewide interoperability governing body that brings together officials and emergency responders across all levels of government to collaboratively enhance interoperability throughout a state or territory. OEC has also supported the establishment of statewide interoperability coordinators as an important element of effective emergency communications governance. Coordinators act as the point person and champion for emergency communications interoperability efforts in their state or territory. To support the coordinators in their work and share best practices, OEC created the Statewide Interoperability Coordinators Council (SICC) as a forum for information exchange and collaboration. To help fund state and local efforts, OEC also provided grants coordination through the Interoperable Emergency Communications Grant Program (IECGP). The IECGP provides funding to states for implementing key activities in the NECP, including governance, common operational protocols, standard operating procedures, training, and exercises. OEC also helped develop the SAFECOM grant guidance that provides the guidelines necessary to align federal grant programs with NECP initiatives. The NECP also calls for cross-border emergency response collaboration. To accomplish milestones in this area, OEC is working closely with Mexico and Canada to improve cross-border communications and interoperability through a variety of initiatives, including workshops and working groups. For example, in May 2009, OEC partnered with Public Safety Canada to co-host the inaugural US–Canada Cross Border Interoperable Communications Workshop in Niagara Falls, New York. The workshop is part of OEC's ongoing efforts to assist law enforcement, border protection, customs enforcement, and emergency management agencies along the northern border to more effectively share information and coordinate. OEC also co-chairs the Security Communications Task Group (SCTG), part of the US and Mexico High Level Consultative Commission on Telecommunications (HLCC), which works to improve cross-border communications to combat border violence and improve border security. By supporting state and local efforts and enhancing federal coordination, OEC is working hard to ensure that the NECP is a living document that can periodically be updated and revised. Moreover, in these challenging economic times, it is more important than ever that emergency communications efforts are focused, unified and coordinated. The NECP provides the framework so that, working together with a shared vision, federal, state, local and international partners can move emergency communications to the next level of interoperability.

“In these challenging economic times, it is more important than ever that emergency communications efforts are focused, unified and coordinated”

Key objectives
The National Emergency Communications Plan outlines three goals:
1. By 2010, 90 percent of all high-risk urban areas designated within the Urban Areas Security Initiative (UASI) can demonstrate response-level emergency communications within one hour for routine events involving multiple jurisdictions and agencies.
2. By 2011, 75 percent of non-UASI jurisdictions can demonstrate response-level emergency communications within one hour for routine events involving multiple jurisdictions and agencies.
3. By 2013, 75 percent of all jurisdictions can demonstrate response-level emergency communications within three hours of a significant event, as outlined in the department's national planning scenarios.

Chris Essid was appointed the first Director of the Office of Emergency Communications within the Department of Homeland Security in December 2007. In this position, Essid guides OEC policies, programs and activities promoting emergency response communications for federal, state, local, and tribal governments, including the implementation of the National Emergency Communications Plan. Essid previously served as the first Interoperability Coordinator for the Virginia Governor's Office of Commonwealth Preparedness.


EXECUTIVE INTERVIEW

The added value of global resources
Jean Lobey reveals how end-to-end solutions are helping the infrastructure industry produce better results in the security sector.

As Executive Vice President of Safety, Security, and Protection Services at 3M, Jean Lobey is responsible for a division that helps keep people safe, the environment secure and assets protected. Lobey's 3M career spans 34 years and he has a degree in Economics and Marketing from the University of Paris XIII.

What challenges do you see on the horizon for those in the infrastructure industry?
Jean Lobey. Changing economic conditions will demand that companies offer solutions for a wide range of customer needs and budgets. Additionally, the various hazards brought on by both natural and man-made events are drawing increased attention and concern. Those in the infrastructure world need to help their customers prepare for such situations, not only through product offerings, but also through planning tools and services. Legislative changes are also a continuously evolving factor, and companies must continue to monitor changes and make refinements to products and services, as well as help their customers stay informed.

What can infrastructure leaders do to better anticipate future challenges and shape the policies that affect them?
JL. We believe it is critical to take an active role in the professional organizations that serve our industries, and to communicate with other organizations that have a stake in these businesses. Currently, 3M plays a role in organizations including the American Industrial Hygiene Association (AIHA), American Society of Safety Engineers (ASSE), National Safety Council (NSC), American Road & Transportation Builders Association (ARTBA), International Code Council (ICC), ASTM International, the National Fire Protection Association (NFPA), the American Association of Motor Vehicle Administrators (AAMVA), and more. We monitor the status of the laws and policies that affect us and our customers, and we aim to stay at the forefront of changes so that we can be responsive to our customers' needs.

"Legislative changes are also a continuously evolving factor"

What is the advantage of working with 3M versus a company with a more exclusive focus?
JL. It is true that 3M's product portfolio numbers in the thousands – this is the result of extensive expertise in materials science and a focus on leveraging our core technologies across many different categories. We place an emphasis on information-sharing across divisions, which allows us to adapt our technologies for new and creative applications. For example, 3M became a pioneer in passive fire protection technology after one of its product developers recognized that an internally-developed intumescent material being used in automotive catalytic converters had the potential to save lives by helping to prevent the spread of fire, smoke and toxic gases in construction and marine applications.

What does 3M offer to enhance the value of its products?
JL. Our subject matter expertise and training services are vital components of our offerings to safety, security and protection services customers, and we work to provide end-to-end solutions, rather than just off-the-shelf products. Our training and education programs help keep workers safer, more informed, and better able to use our products to their best advantage. One example from the many training sessions we've recently conducted is the H1N1 Pandemic Response Webinar held by the occupational health & environmental safety division, which gave customers more information about the nature of the virus and the guidelines issued from governmental agencies regarding the use and care of respiratory products in occupational environments. In consulting services, our divisions are able to help customers identify opportunities for increasing efficiency and better utilizing their existing tools. We are able to provide customers with useful insights into how to more effectively use their resources, as well as how to create new systems that offer a better end-value.

3M is a trademark of 3M Company.



FIRE PROTECTION

Protecting life, property and business
The importance of the latest update to a high-level US preparedness standard. By James Shannon

Earlier this year the National Fire Protection Association (NFPA) released the 2010 edition of NFPA 1600, Disaster/Emergency Preparedness and Business Continuity Programs. NFPA 1600 has received a great deal of attention in the post-9/11 era. First published in 1995, and currently in its fifth edition, NFPA's preparedness standard is a high-level program-design document that defines the core elements of both emergency management and business continuity: protecting life, property, business operations and the environment. It is intended for use by both public and private sector entities. The document is available from NFPA as well as the American National Standards Institute (ANSI). Because of its national importance, NFPA has, since 2004, been offering the current edition of NFPA 1600 as a free

PDF download from its website (www.nfpa.org). To date over 120,000 PDFs have been downloaded worldwide. The approach taken in NFPA 1600 has been to provide high-level fundamental criteria for the development, implementation, assessment and improvement of emergency management and business continuity programs. It was recognized in the development of the early editions of NFPA 1600 that a comprehensive document addressing the unique hazards, processes and resources of a wide spectrum of industries would be extremely complex and would never be complete. As a result, the NFPA technical committee, composed of over 30 experts responsible for writing the standard, chose to write a more general document that would not be in conflict with industry best practices or regulations and yet was flexible

enough to accommodate different kinds of entities of differing sizes.

Homeland security interest
In 2003, NFPA 1600 was presented to professional staff of the US 9/11 Commission, a blue ribbon panel of experts charged with reviewing the events that led up to 9/11 and the broad subject of domestic security. Subsequently, NFPA 1600 was endorsed as the US National Preparedness Standard by the Commission in the 9/11 Commission Report, published in July 2004. This report provided the first framework for homeland security in the US. The 9/11 Commission's endorsement of NFPA 1600 was reiterated in US Public Law (PL) 108-458, the Intelligence Reform and Terrorism Prevention Act of 2004, signed into law by


President Bush in December 2004. More recently, in the August 2007 PL 110-53, Implementing Recommendations of the 9/11 Commission Act of 2007, NFPA 1600 was again named as a standard for private sector preparedness under a new US Department of Homeland Security (DHS) initiative for voluntary certification of private sector preparedness programs named "PS-PREP". NFPA 1600 remains unique among preparedness standards, as it has a strong life safety component. The 9/11 Commission said this about NFPA 1600: "We believe that compliance with the standard should define the standard of care owed by a company to its employees and the public for legal purposes. Private-sector preparedness is not a luxury; it is a cost of doing business in the post-9/11 world. It is ignored at a tremendous potential cost in lives, money and national security."

New edition
The 2010 edition of NFPA 1600 incorporates lessons learned from Hurricane Katrina as a result of workshops sponsored by DHS and ANSI that captured the experiences of response and recovery organizations and private businesses during and after the devastating Gulf Coast hurricane in 2005. The standard was reorganized and expanded in the areas of program management; risk assessment; and business impact analysis, implementation and recovery. The document is still succinct; the basic requirements are contained within six pages of text. The standard provides the fundamental criteria to develop, implement, assess and maintain an all-hazards disaster/emergency management and business continuity program for prevention, mitigation, preparedness, response, continuity and recovery. 'All hazards' includes events caused by nature, humans (both intentional and accidental) and technology. The standard is intended for use by public, private, not-for-profit and nongovernmental organizations (NGOs) on a local, regional, national and global basis.

The heart of the 2010 edition is in Chapters 4 through 8. Chapter 4 addresses program management, which has been expanded to emphasize the importance of leadership and commitment. Chapters 5 through 8 cover the program development process and are consistent with the 'plan, do, check, act' continuous improvement process. Chapter 5 covers planning, including risk assessment, business impact analysis, prevention and mitigation. Chapter 6 addresses implementation, including resource management, mutual aid/partnership agreements, crisis communications, emergency response, business continuity, recovery and incident management. Chapter 7 addresses testing and exercises. Chapter 8 covers program improvement, including periodic review and corrective action based on changing operations and lessons learned from exercises and actual incidents. The 2010 edition of NFPA 1600 has been endorsed by the Disaster Recovery Institute International (DRII), the Association of Contingency Planners (ACP), and the International Association of Emergency Managers (IAEM). In addition, it has been officially designated as a Qualified Anti-Terrorism Technology (QATT) and certified as an Approved Product for Homeland Security by DHS. It carries the DHS 'SAFETY Act Certified' seal.

"Private-sector preparedness is not a luxury; it is a cost of doing business in the post-9/11 world"

To assist practitioners, NFPA publishes a textbook entitled Implementing NFPA 1600, which addresses in more detail the preparedness requirements for various types and sizes of entities. NFPA also offers two training programs on NFPA 1600.

Government and business needs
Business continuity alone is insufficient. Governments and businesses need to manage emergencies to protect their employees and the public while also protecting and preserving their enterprises. Ultimately, disasters are about people – saving lives, preventing injuries and providing the basics of life for communities as they go through recovery. Large companies, states and large cities have implemented emergency management and business continuity/continuity of operations plans. Many of these plans are consistent with NFPA 1600. Addressing preparedness for smaller businesses is a challenge. Resources may be scarcer, and needs may not seem so obvious. A review of supply chains in the aftermath of regional disasters such as Hurricane Katrina in 2005 makes it clear that all levels of the supply chain need to be prepared. The failures of small and medium-sized businesses, which supply larger businesses, can cause widespread disruption. As a result, there is increasing scrutiny of the resiliency of the supply chain, and larger firms are frequently auditing the preparedness programs of their suppliers.

Public sector entities depend on the private sector for products and services. The public sector in turn often provides services such as police, fire protection, emergency medical services and utilities to the private sector. The continuity of the operation of these public entities is important. State and local governments are putting plans in place for support from businesses whose activities are vital to the safety and recovery of their jurisdictions in a disaster. Government entities cannot stockpile enough critical resources for large, locally devastating disasters such as floods, hurricanes or earthquakes. Business relationships need to be established in advance of disasters for critical supplies including food, water, clothing and shelter. Government entities are seeking agreements with retailers and wholesalers that have large quantities of pre-positioned and geographically distributed supplies. The principles in NFPA 1600 apply to evaluating the preparedness of these supply chains.

International use
NFPA 1600 has received wide acceptance over the years in the US, where it is used by industry, commercial property owners, insurers and state and local governments. In October 2008, the Canadian Standards Association (CSA) announced the release of the first edition of CSA Z1600, a Canadian national preparedness standard based on NFPA 1600 and licensed by NFPA for use in Canada. Besides becoming the most widely used preparedness standard in North America, NFPA 1600 has gained significant attention in Latin America and Asia. It is the national preparedness standard in Argentina and is translated and distributed by other standards bodies in Chile, China, Colombia, Ecuador, South Korea, Thailand and Trinidad and Tobago. It is also being used by insurance companies in Europe.

James M. Shannon is President of the National Fire Protection Association.


ASK THE EXPERT

Fighting fire with fire
Dennis Smagac outlines the potential of compressed air foam systems in fighting multiple hazard fires.

First responders are becoming aware of the need to widen their focus from single to multiple hazard response. A fire department first on scene could discover an intentionally set fire intended to cover up a biological attack. Local citizens may contact the fire department regarding a spill that results in a toxic plume. Traditional mindsets are changing. Fire fighters, police departments and HAZMAT teams are all learning to become more versatile. This shift will require cross-training and more reliance on equipment that is suitable for multiple hazards. There will continue to be a place for dedicated equipment side by side with multifunctional apparatus.

Compressed air foam (CAF) systems can meet this multi-functionality requirement. Pioneered by wildland firefighting departments, compressed air foam is a liquid medium created by injecting air under pressure into a foam solution stream. Studies show that CAF systems use on average 79 percent less solution to suppress a fire than water alone, and on average 64 percent less solution than air aspirated foam to suppress a fire. Studies have also shown that CAF systems are 78 percent faster than water and 66 percent faster than air aspirated foam in achieving fire suppression. CAF technology is poised to become the new tradition of fire fighting. In the emerging multiple hazard environment, CAF systems are showing their versatility and effectiveness as the preferred technology for responders.

"Studies show that CAF systems use on average 79 percent less solution to suppress a fire than water alone"

When US military forces identified a need for equipment, solutions and associated support for rapidly deployable and highly effective chemical, biological, radiological and nuclear (CBRN) decontamination systems, CAF equipment met mission requirements. Systems manufactured by Intelagard of Colorado were selected due to their unique design and operational characteristics that make them well suited for demanding environments. These systems have proven to be particularly effective for diverse decontamination missions, HAZMAT remediation operations, vapor suppression environments and critical exposure protection functions in addition to traditional fire fighting applications. CAF systems multiply on-board resources in the form of expanded foam, allowing for maximum effectiveness per size and weight, whether used for fire, decontamination or other emergency response. They also have the ability to apply a wide range of liquid and foaming solutions for a variety of needs. The systems have deployed Class A, AFFF and AR-AFFF foams for fire suppression, and HAZMAT remediation solutions and bio-remediation foams to inert and break down hydrocarbon spills and other hazardous incidents. The same systems have been tested with and deployed EasyDECON DF200, Reformulated Decon Green (RDG), hot and cold water/soapy water and chlorine-based decontamination solutions including HTH and STB.

CAF-generated foams, especially when generated through Intelagard systems deployed in decontamination operations, have the unique ability to adhere to vertical and inverted surfaces, allowing the durable foam blanket to maintain required wet contact time between the detected agent and the decontamination formulation. Used extensively by the US military, Intelagard systems are widely chosen for decontamination efficacy testing because of their ease of use and fully replicable foam expansion settings.
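To make the quoted efficiency averages concrete, the short sketch below applies them to a hypothetical incident. Only the 79 percent and 64 percent reductions come from the studies cited above; the 1,000-gallon plain-water baseline is an invented figure for illustration.

```python
# Illustrative arithmetic applying the averages quoted above.

water_only_gal = 1_000.0                 # hypothetical plain-water demand

caf_gal = water_only_gal * (1 - 0.79)    # CAF uses 79% less than water
aspirated_gal = caf_gal / (1 - 0.64)     # CAF uses 64% less than
                                         # air-aspirated foam, so back
                                         # out that volume from the CAF one
print(f"Plain water:        {water_only_gal:7.0f} gal")
print(f"Air-aspirated foam: {aspirated_gal:7.0f} gal")   # ~583 gal
print(f"CAF solution:       {caf_gal:7.0f} gal")         # ~210 gal
```

The leverage matters most for apparatus size: a unit carrying a couple of hundred gallons of solution can do the suppression work of a far larger water load, which is what 'multiplying on-board resources' refers to.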

The systems have been fully calibrated and rigorously tested to ensure that they will perform to exacting standards in extreme environments. The systems are capable of no-prime remote drafting operations. All Intelagard systems have been through extensive materials compatibility analysis to ensure that they are capable of deploying aggressive decontamination formulations without damage to the equipment. Intelagard CAF technology has been shown to be extremely flexible in adapting to meet specific performance and technological requirements. The unique capabilities of Intelagard technology make it ideal for diverse incidents. The ability to expand payloads and effectively respond to multiple hazardous situations in harsh environments is valuable to any first responder dealing with the demands inherent to these settings. The wide range of system sizes and configurations allows for a comprehensive package of response equipment to fit current and future requirements. As the traditional first responder single focus of fire, HAZMAT, law enforcement and EMS shifts to wider multi-hazard response capabilities, expect to see CAF systems play a wider and more crucial role.

Dennis Smagac is one of Intelagard's founders. He combined his expertise in compressed air foam technology with his knowledge of decontamination to create a technologically advanced product line of multi-hazard response equipment for fire suppression as well as decontamination and HAZMAT remediation.



WATER FOCUS

High and dry?
The US uses approximately 150 trillion gallons of water a year and, with this number increasing every year, many states are starting to feel the pressure. Alan Roberson, Director of Security and Regulatory Affairs at the American Water Works Association, outlines what needs to be done to improve the industry and ensure water continues to flow across the US.

Over the past 125 years the American Water Works Association (AWWA) has been working to improve the quality and supply of water in North America, advancing public health, safety and welfare by providing knowledge and information. One of the items currently on the association's agenda is the Water Infrastructure Security Enhancement Program. Started soon after the events of 9/11, the aim of the program is to provide some guidance to water and wastewater utilities on a number of items to consider, such as physical security improvements for treatment plants and storage tanks.

America uses 408 billion gallons of water a day

Alan Roberson, Director of Security and Regulatory Affairs at AWWA, explains that when the program began there was a shortage of materials in the space, particularly regarding new access control technologies and CCTV cameras. "The thought was, here's a way to help understand the different technologies and then build upon it to layer the approach depending on the criticality of the facilities," Roberson explains. "Our utilities had a regulatory requirement to conduct vulnerability assessments back in the early part of the decade and that identified a lot of the vulnerabilities, but we wanted to go ahead and address those vulnerabilities."

Through a joint effort between the AWWA, the Water Environment Federation and the American Society of Civil Engineers, together with input from CH2M, a contractor involved in design and security improvement, a set of guidelines were formulated and have so far seen moderate success. Roberson puts this down to the way the law is written, suggesting that while the law requires an evaluation, there is no requirement to then put that into physical security improvements. "It's still ongoing, but now the issue is that there is likely to be a regulatory program in the near future that will include the currently exempt water and wastewater facilities. That will be more of a program where you do an assessment, define what you're going to do to improve it and then follow it up by actually doing it – and that will be a lot more regulatory orientated."

One area where the program has focused a lot of attention is response networks. "This goes beyond terrorism, which is important, but you really need an all-hazards approach that completely incorporates the issues from natural disasters, because those happen more frequently." What Roberson has found is that with the response network in 47 states, it is possible to react to an issue in the fastest and most efficient way possible. If a hurricane hits southern Florida, for example, the utilities in northern Florida can mobilize, send their people, resources and equipment down and get the southern part that's been hit back up and running as quickly as possible. "Longer term there's probably more of a strategy to focus on recovery and resiliency, but at the same time we are doing some obvious stuff on physical security improvements," he adds.


Funding crisis?

Rising water prices
Over the past five years, municipal water rates have increased by an average of 27 percent in the United States, 45 percent in Australia, 50 percent in South Africa and 58 percent in Canada. In Tunisia, the price of irrigation water increased fourfold over a decade. Yet consumers rarely pay the actual cost of water. In fact, many governments practically (and sometimes literally) give water away.

The average American household consumes about 127,400 gallons of water during a year. Homeowners in Washington, DC, pay about $350 for that amount of water. Buying that same amount of water from a vendor in Guatemala City would cost more than $1700. The price people pay for water is largely determined by three factors: the cost of transportation from source to user, total demand, and price subsidies. Treatment to remove contaminants can also add to the cost.

A key step in moving toward more rational water management is to place a price on water that reflects its value and scarcity. Although pricing water at a reasonable cost can generate political problems in the short run, it can lead to substantial efficiencies in the longer run and eliminate drains on government budgets. Higher prices will lead households, farmers and industries to use water more efficiently. Just as the oil price shocks of the 1970s stimulated energy conservation, so too could pricing water to better reflect its real cost stimulate similar conservation efforts.

Source: 'Water prices rising worldwide', Resource Action Program
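A quick calculation makes the sidebar's point about under-pricing explicit. The snippet below uses only the figures quoted above to work out the implied unit prices:

```python
# Unit prices implied by the sidebar's own figures.

ANNUAL_USE_GAL = 127_400            # average American household, per year

for source, annual_cost in [("Washington, DC tap water", 350.0),
                            ("Guatemala City vendor", 1_700.0)]:
    per_thousand = annual_cost / ANNUAL_USE_GAL * 1_000
    print(f"{source}: ${per_thousand:.2f} per 1,000 gallons")
```

DC works out to roughly $2.75 per 1,000 gallons against about $13.34 from the vendor – close to a fivefold gap, which is the pricing distortion the sidebar argues should be corrected.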


With the Department of Homeland Security (DHS) beginning to understand the importance of response and resiliency against all types of hazard, it becomes even more apparent that preparation is key in responding to various inevitable situations – from ice storms to hurricanes and floods. However, there are still a number of problem areas. Roberson highlights financing as a major challenge in the industry; if you’re struggling to pay for infrastructure improvements then adding a further layer of security is going to push those finances to the limit. “If you let your infrastructure decay too far and you experience major line breaks then it creates an uncertainty about the reliability of your operations. And so that makes it doubly hard to get the money from your customer to do the fix if your service is not reliable to begin with,” says Roberson, who goes on to admit that he wouldn’t define this as a crisis. “It’s more of an issue that we need to address – probably all over the world actually, particularly in older cities where it costs a lot of money to do the repair work because you’re in a congested area.” Roberson goes on to explain that the biggest need is to fix the infrastructure. He suggests that rather than building new treatment plants, investment needs to be optimized in



current infrastructure. "For some reason we have yet to attract the investors or venture capitalists and although we are currently behind the times, it is starting."

And with 52,000 water companies across the US, funding is not the only challenge in the industry, particularly as the majority of these companies serve less than 1000 people each. Under-funded and under-manned, these companies often don't even have one full-time employee. Because of the way it is structured, people are keen to maintain their own independence, and so trying to consolidate those systems is challenging from a political viewpoint. "It is happening, just very slowly. The number of water systems has gone down from maybe 54,000 to 52,000 in the last decade, so obviously some states are better at doing it, but it's still very slow."

Smaller providers mean smaller budgets, and with that comes more challenges. "If you have to do more monitoring," explains Roberson, "that's one or two percent of your budget right there for a very small system. This is definitely a problem, not just on monitoring, but for the whole infrastructure side."

36 states are projecting water shortages between now and 2013

However, despite the challenges currently facing the sector, there are a couple of developments that have helped improve the industry. Roberson believes that better business processes to manage security have helped a great deal. "As an example, when handling plant visitors, you need people to prove who they are and ensure that a chemical delivery goes to the person you expected it to. And just by having their supplier fax a name and driver's license you know who's going to show up. So there have been many processes and procedures that have improved simply from low-tech solutions."

And from a hardware perspective Roberson points to improvements in access control. "We're seeing more systems in place – if you had a storage tank that previously didn't have a fence, now it has one, or a critical facility might now have CCTV or a security guard in place. Access control is probably the single most used technology at this point." According to Roberson it really is as simple as putting up a fence or monitoring an area more carefully: "Obviously you have to train people to be able to use CCTV equipment or whatever the equipment is, but at the end of the day, every little helps."

Looking at the importance of technology in the industry, Roberson highlights the ability to optimize infrastructure investment as key. He believes that tools to assess the condition of a pipe, for example, would be extremely useful. "If a pipe is designed for 50 years, but it's now 75 years old, do you need to replace it or is it still good for another 25 years? Once that's determined you can look at the most efficient way to repair or rehabilitate it."

With smart grid and smart metering technology such an important part of the utilities industry across the US, it is odd that there is not a push for this technology in the water sector. Roberson says that it is simply not as far ahead as the power industry, and despite a slow start he predicts that it will eventually take off. "We are certainly starting to see smart meters – not simply as an automatic meter reading but as a two-way communication between the utility and the customer, so that the customer can get real-time information on their water consumption," he says.

That said, penetration of smart meters into the market is still remarkably small and the US has yet to see the same push as areas in Europe, such as the UK, where there is a huge effort to provide funding for a whole smart grid and to get the technology in place as soon as possible. And with so few providers willing to offer the service, in combination with a lack of endorsement from the government or any other associations, there is currently no seemingly pressing need to implement this technology.

In order to see any improvement in this situation Roberson highlights two needs in the market, one of which is looking at how to collaborate, particularly between the bigger utilities. "Despite the high number of utilities across the US, 400 out of those 55,000 serve more than 100 million and that's about half the population in the US. Well, that 400 does have some purchasing power and some ways to make things work – and we certainly need to look at this. We also need to look at how we can reach out to the power industry and look at the potential of adapting technology for our purposes with the minimal amount of investment – this has the power to propel the industry forward quite a bit."

Looking forward, Roberson has a couple of items on his agenda. He foresees understanding climate change and its implications for both water resources and demand as key to improving the industry. "Lawn watering, for example, might change as outdoor irrigation is impacted by climate change, but how can you try and predict things that are 20 or 30 years out when there's still a lot of uncertainty about how things are going to unfold?

"Another big issue for us is energy management – how can we get our water utilities to be the most efficient in the energy they use? It's typically the biggest non-personnel expense of the water utility. If you can optimize that energy use then you are not only saving money, but you're contributing to lowering the carbon footprint of the utilities."

By continuing to look at ways to improve the industry and the current water situation facing the US, Roberson hopes to persist in the association's mission to advance public health, safety and welfare, and improve the physical infrastructure and security needs for water supply, wastewater and storm water. With a tough agenda for the next two years, it seems the AWWA has its work cut out if it wants to achieve its goals.
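The two-way smart metering Roberson describes can be pictured with a small sketch. Everything below is hypothetical – the message fields, the 15-minute interval and the budget threshold are inventions for illustration, not any utility's or vendor's actual protocol – but it shows the shape of the exchange: the meter reports consumption, and the utility can push near real-time feedback back to the customer.

```python
import json
from datetime import datetime, timezone
from typing import Optional

def interval_reading(meter_id: str, gallons: float) -> str:
    """Meter -> utility: one 15-minute consumption interval."""
    return json.dumps({
        "meter_id": meter_id,
        "read_at": datetime.now(timezone.utc).isoformat(),
        "interval_minutes": 15,
        "consumption_gal": gallons,
    })

def usage_alert(reading_json: str, daily_budget_gal: float) -> Optional[str]:
    """Utility -> customer: warn when projected daily use runs hot."""
    r = json.loads(reading_json)
    projected = r["consumption_gal"] * (24 * 60 / r["interval_minutes"])
    if projected > daily_budget_gal:
        return json.dumps({"meter_id": r["meter_id"],
                           "alert": "projected usage exceeds daily budget",
                           "projected_gal": round(projected, 1)})
    return None

# A 9.5-gallon interval projects to ~912 gallons a day, so an alert fires.
print(usage_alert(interval_reading("WM-000123", gallons=9.5),
                  daily_budget_gal=400))
```

That feedback loop – rather than the meter reading itself – is what distinguishes the two-way systems Roberson expects from simple automatic meter reading.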




INDUSTRY INSIGHT

Full alert
Dan Kroll examines vulnerabilities in drinking water supplies, highlighting recent breakthroughs that may prevent or detect a potentially fatal attack.

The vulnerability of our distribution systems to disruption and contamination by potential terrorist or malicious acts has been well documented. These potential attack scenarios have the ability to produce casualties on a massive scale. Studies conducted by personnel at Hach Homeland Security Technologies, Colorado State University and the US Army Corps of Engineers, among others, have shown that attacks on drinking water supplies could be mounted for between $0.05 and $5.00 per death, using rudimentary techniques, and could amass casualties in the thousands over a period of hours. The most likely scenario for such an attack, in which the goal is to inflict mass casualties, is to orchestrate a simple backflow contamination event. A backflow attack occurs when a pump is used to overcome the pressure gradient that is present in the distribution system's pipes. This is usually around 80 lbs/in² and can be easily achieved by using pumps available for rent at most home improvement stores. After the pressure gradient present in the system has

been overcome and a contaminant introduced, siphoning effects act to pull the contaminant into the flowing system. Once the contaminant is present in the pipes, the normal movement of water in the system acts to disseminate the contaminant throughout the network, affecting areas surrounding the introduction point. The introduction point can be anywhere in the system, such as a fire hydrant, commercial building or residence. Backflows occur by accident on a regular basis and are of great concern to the water industry. Accidental backflow events have been found to be responsible for many incidents of waterborne illness and even death in the United States. According to the USEPA, backflow events caused 57 disease outbreaks and 9734 cases of waterborne disease between 1981 and 1998. Intentional dissemination of contaminants through a backflow event is in fact a very critical vulnerability. Studies conducted by the US Air Force and CSU have shown this to be a highly effective means of contaminating a system.

These studies show that a few gallons of highly toxic material were enough, if injected at a strategic location via the proper method, to contaminate an entire system supplying a population of 100,000 people in a matter of a few hours. Using computer simulations, when a military nerve agent was used, over 20 percent of the population received a dose adequate to result in death. When a common chemical was used in place of the warfare agent the result was a casualty rate of over 10 percent. Thousands of deaths could result from this very inexpensive and low-tech mode of attack. It would cause mass casualties, be inexpensive, and actually offer the terrorists a good chance of avoiding apprehension. Unfortunately, because monitoring for contamination in the distribution system is typically limited to infrequent grab samples, the first indications of such an attack are likely to be casualties showing up at local hospitals.

These sorts of attacks can occur from any access point to the water system. Wherever water can be drawn out, material can be forced back into the system. Access points near high-flow areas and larger pipes would be favored because they would disseminate the material to a wider area more quickly; however, any access point except for those at the very end of long deadhead lines could be used to effectively access the system. It should be obvious from the large number of accidental backflows that occur, and the fact that terrorist organizations have shown an interest in attacking water, that the distribution system is a prime candidate for such an attack. Protecting against and/or detecting such an attack is difficult. Recent breakthroughs in the online detection of contaminants have made the deployment of a cost-effective early warning system capable of detecting and categorizing such events a reality.

Dan Kroll is Chief Scientist at Hach Homeland Security Technologies and Principal Investigator for the Hach Advanced Technology Group. Kroll has developed both advanced and simplified methods for a variety of crucial water quality parameters. He is also author of the book Securing Our Water Supplies: Protecting a Vulnerable Resource.
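The online early-warning approach Kroll alludes to can be sketched in miniature. The toy monitor below flags a simultaneous excursion across routine water-quality parameters against a rolling baseline; the parameters, thresholds and readings are invented for illustration, and deployed systems (Hach's included) use far richer event classification than a z-score test.

```python
import statistics

BASELINE_WINDOW = 12    # recent readings that define "normal"
Z_THRESHOLD = 4.0       # standard deviations that count as an excursion
MIN_PARAMS = 2          # require multi-parameter agreement to cut false alarms

def excursions(history, current):
    """Return the parameters whose current reading departs from baseline."""
    flagged = []
    for param, series in history.items():
        window = series[-BASELINE_WINDOW:]
        mean = statistics.fmean(window)
        sd = statistics.pstdev(window) or 1e-9   # guard a perfectly flat baseline
        if abs(current[param] - mean) / sd > Z_THRESHOLD:
            flagged.append(param)
    return flagged

history = {
    "chlorine_mg_L":   [1.02, 1.00, 0.99, 1.01, 1.03, 1.00,
                        0.98, 1.01, 1.02, 1.00, 0.99, 1.01],
    "conductivity_uS": [410, 412, 409, 411, 410, 413,
                        408, 410, 411, 409, 412, 410],
}
current = {"chlorine_mg_L": 0.35, "conductivity_uS": 505}  # sudden joint shift

hits = excursions(history, current)
if len(hits) >= MIN_PARAMS:
    print("ALERT: simultaneous excursion --", hits)
```

The design point is that no single sensor identifies a contaminant; it is the correlated departure of several ordinary parameters from baseline that turns continuous monitoring into an early-warning trigger, in contrast to the infrequent grab samples described above.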



MEGACITIES

Urban legends
Why the tale of the 21st century will be defined by the rise of the megacity.

If space travel had been possible 100 years ago, those early astronauts would have seen the light from 16 concentrations of a million or more people. Today, the crew of the space shuttle can see 450 such shining cities on the globe – the economic, governmental, cultural and technological power plants of an increasingly urban age. The pace of such development is staggering. At the turn of the last century, only 13 percent of the world's population lived in cities; two years ago, for the first time ever, more than half of us were urban metropolitans, and by 2050 that number will rise to 70 percent. We are adding the equivalent of seven New Yorks to the planet every year – putting a huge strain on the planet's resources and infrastructures in the process. And it's not just the number of cities that is on the rise; their size is increasing, too. Welcome to the age of the megacity.

Megacities are defined as urban population centers of more than 10 million inhabitants, and they are on the rise: 60 years ago there were only two, New York/Newark and Tokyo, but today there are 22 such megacities – the majority in the developing countries of Asia, Africa, and Latin America – and by 2025 there will most likely be 30 or more. As these megacities evolve, many groan under the weight of a sudden, massive and unprecedented demand for services. The basic necessities of clean water, of sanitation systems to remove megatons of garbage and human waste, of transportation systems to shuttle millions of workers – not to mention the need for electrical networks, healthcare facilities, and policing and security – are creating one of the greatest logistical challenges ever seen in human history. And the challenge is only going to intensify, with experts predicting the expansion and merging of already highly urbanized zones to form a number of 'megalopolises' – vast swathes of development such as the one made up of the Greater Boston-New York City-Philadelphia-Baltimore-Washington areas (the so-called Northeast megalopolis) with an urban population of 55 million. Indeed, the phenomenon of endless urban sprawl could be one of the most significant developments – and problems – in the way people live and economies grow in the next 50 years, according to UN-Habitat, the agency for human settlements, in its bi-annual State of World Cities report.
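Those headline figures hold together under a rough sanity check. The arithmetic below leans on approximate world-population totals that are our own assumption, not figures from the UN-Habitat report:

```python
# Rough sanity check of the urbanization figures quoted above.
# World-population totals are approximate outside assumptions.

urban_2008 = 0.50 * 6.7e9    # "more than half of us" urban, circa 2008
urban_2050 = 0.70 * 9.2e9    # 70% urban of an assumed ~9.2bn world

per_year = (urban_2050 - urban_2008) / (2050 - 2008)
print(f"New urban residents per year: {per_year / 1e6:.0f} million")
# ~74 million a year -- several New York Cities' worth of people,
# consistent with the "seven New Yorks" figure quoted above.
```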



TOP 5 MEGACITIES OVER TIME
Populations in millions, for 1950, 2010 and 2025 (projected).
1950: 1. New York-Newark 12.3 | 2. Tokyo 11.3 | 3. London 8.4 | 4. Paris 6.5 | 5. Moscow 5.4
2010: 1. Tokyo | 2. Delhi 22.2 | 3. São Paulo | 4. Mumbai 20.0 | 5. Mexico City 19.5
2025: 1. Tokyo 37.1 | 2. Delhi 28.6 | 3. Mumbai 25.8 | 4. São Paulo 21.7 | 5. Dhaka 20.9

MEGA-GROWTH
In 1950, there were 83 cities with populations exceeding one million; by 2007, this number had risen to 468. In 1950, New York City was the only urban area with a population of over 10 million. Geographers had identified 25 such areas as of October 2005, as compared with 19 megacities in 2004 and only nine in 1985. The UN forecasts that by 2015, over 600 million people will make their home in a megacity.
On the one hand, the development of such mega-regions is generally regarded as positive, asserts the report's co-author Eduardo Lopez Moreno. "They [mega-regions], rather than countries, are now driving wealth," he says. "Research shows that the world's largest 40 mega-regions cover only a tiny fraction of the habitable surface of our planet and are home to fewer than 18 percent of the world's population, but account for 66 percent of all economic activity and about 85 percent of technological and scientific innovation. The top 25 cities in the world account for more than half of the world's wealth, and the five largest cities in India and China now account for 50 percent of those countries' wealth."

Yet the growth of mega-regions and cities is also leading to unprecedented urban sprawl, new slums, unbalanced development and income inequalities as more and more people move to satellite or dormitory cities. "Cities like Los Angeles grew 45 percent in numbers between 1975 and 1990, but tripled their surface area in the same time," says Moreno, who believes that urban sprawl is the symptom of a divided, dysfunctional city. "It is not only wasteful, it adds to transport costs, increases energy consumption, requires more resources, and causes the loss of prime farmland," he explains. "The more unequal that cities become, the higher the risk that economic disparities will result in social and political tension. The likelihood of urban unrest in unequal cities is high."




What is most shocking about the report, however, is that the US emerges as one of the most unequal of all the world’s societies, with cities such as New York, Chicago and Washington showing higher levels of inequality between the haves and have-nots than places like Brazzaville in Congo-Brazzaville, Managua in Nicaragua and Davao City in the Philippines. “The marginalization and segregation of specific groups creates a city within a city,” says Moreno. “The richest one percent of households now earns more than 72 times the average income of the poorest 20 percent of the population. In the ‘other America’, poor black families are clustered in ghettoes lacking access to quality education, secure tenure, lucrative work and political power.”

Infrastructure concerns
Infrastructure has a key role to play in reducing these disparities, as a recent Siemens study into the challenges facing megacities as population growth continues to explode shows: 81 percent of stakeholders involved in city management cite the importance of the economy and employment in infrastructure decision-making. In the Siemens study, transportation emerges as the top megacity infrastructure challenge by a large margin – not least because it is seen as the one infrastructure area that stakeholders believe has the biggest impact on city competitiveness. They are also highly aware of its environmental impact (for example, air pollution) and are keen to move to greener mass transit solutions.


Building from the ground up With cities consuming 75 percent of our natural resources, is a blank-canvas approach to development the key to our urban future? These new city projects are being built from the ground up, and could provide a blueprint for future urban projects.

Masdar City, Abu Dhabi Touted as the world’s first zero-carbon city, Masdar will be car-free, powered by renewable energy with services digitally managed and providing real-time information. With a maximum distance of 200 metres to the nearest transport link and amenities, the compact network of streets will encourage walking and is complemented by a personalized rapid transport system. Shaded walkways and narrow streets will create a pedestrian friendly environment, while surrounding land will contain wind, photovoltaic farms, research fields and plantations, enabling the city to be entirely self-sustaining.

Dongtan, Shanghai Development plans for this ‘city within a city’ – currently being built on an island off the coast of Shanghai – call for it to be modest in size (500,000 residents) and scaled for the people who will live there, rather than for automobiles or architectural monoliths. It is also designed to be completely self-sufficient, providing its own food and energy. Chinese officials hope Dongtan will offer practical lessons about pollution control and sustainability that can be applied to Shanghai proper, as well as to other rapidly growing urban areas.

Songdo, South Korea Songdo is located on the waterfront of the South Korean city of Incheon and will feature numerous eco-credentials – beautiful open space and parks, green roofs, solar passive design, co-generation plants, a waste management system, mass transit and over 120 buildings built to LEED standards. It is expected to cost over $30 billion, house 75,000 residents and handle 300,000 commuters. The first phase of the city has already been completed and Central Park, the 100-acre green space modeled after New York City’s landmark park, is now finished. And in the US…

Treasure Island, San Francisco
A masterplan developed for the proposed $1.4-billion island development by architectural and engineering services company Skidmore, Owings and Merrill details up to 8000 new homes (30 percent of which would be affordable to those on lower incomes), several solar-powered skyscrapers, an organic farm, three hotels, several shops and restaurants, a wastewater treatment plant, large-scale wind turbines for energy generation and 300 acres of recreational land. The goal is to create a sustainable, compact, mixed-use residential community that is not car-dependent.



It is not surprising, therefore, to find that transport also emerges as the top priority for investment. Stakeholders acknowledge that the four other infrastructure sectors covered by the study – water, electricity, healthcare, and safety and security – are also in need of investment, but interestingly they are less likely to see a strong link between spending in these areas and improved competitiveness, despite the fact that each has an important impact on the overall attractiveness of the city for investment.

Water infrastructure is also being pegged as a major concern for city administrations in the coming years. Megacities around the world must find ways to control runoff while providing clean water for millions of inhabitants. With the World Health Organization suggesting that 1.1 billion people – or 18 percent of the world’s population – now lack access to safe drinking water, governments increasingly need the money and know-how to build massive public works.

In São Paulo, Brazil, for instance, planners are struggling to cope with a drainage system that was built when the city was a fraction of its current size. Poor maintenance has left much of it clogged, while forest and parkland have given way to haphazard housing in many areas of the world’s third-largest city. Now there are fewer green areas to soak up incessant rains.

Meanwhile, Mexico City is sucking up water from natural aquifers at twice the rate they are being replenished. The result: Mexico City is sinking, in some areas by up to 16 inches a year, threatening its entire infrastructure – including the city’s deteriorating drainage system, whose capacity has diminished by 30 percent since 1975 while the area’s population has doubled. In addition, the city, which sits at an altitude of over 7300 feet, must pump water up 3000 feet to reach residents. Last year it had to ration water after one of the worst droughts in six decades. The city’s drainage program includes plans for treatment plants to turn runoff into clean water for use by farmers.

New solutions for old problems
The infrastructure and engineering challenges presented by the emergence of these densely populated urban centers are significant, not least because such rapid growth is being played out in the largest and most complicated urban habitats human beings have ever lived in. Managing such complex systems in the future is going to take a much smarter approach than the ones we are currently using.

Faced with huge pressures on public services, cities tend to emphasize direct and immediate supply-side solutions. However, this does not always mean adding more capacity: in many cases – particularly in the highly developed megacities of the US – increasing the efficiency of existing infrastructure can be just as effective as building new roads, railways and hospitals. By contrast, demand management never emerges as a priority, and is mentioned by only a minority of the survey respondents. Demand management approaches have been advocated in a variety of areas, but even the specialists in specific infrastructure sectors do not see managing demand as the primary solution to their challenges. Yet with consumption consistently outstripping supply in many cities and infrastructure areas, there is a strong case for the wider adoption of demand management strategies on a global basis.

Many believe the answer lies in embedding more (and better) technology into the networks and systems that underpin our cities, and the effects of such an outlook are already being felt around the globe.
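The demand-management argument above comes down to simple arithmetic: because capacity is sized to the peak, shifting even a small slice of peak demand avoids build-out. The sketch below works one such example; every number in it is invented for illustration.

```python
# Illustrative arithmetic for the demand-management case: shifting a slice
# of peak demand into off-peak hours reduces the capacity a city must build.
# All figures are assumptions, not data from the Siemens study.
peak_demand_mw = 10_000       # assumed evening peak load
offpeak_demand_mw = 6_500     # assumed overnight load
shiftable_share = 0.10        # portion of peak that pricing could move

shifted = peak_demand_mw * shiftable_share
new_peak = peak_demand_mw - shifted
new_offpeak = offpeak_demand_mw + shifted   # still below the new peak

# Capacity is sized to the peak, so a 10 percent shift avoids ~1000 MW of build-out.
print(f"capacity avoided: {peak_demand_mw - max(new_peak, new_offpeak):.0f} MW")
```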

People magnets
Employment and educational opportunities are the main attraction of urban centers. But hopes for a better life are often dashed as overpopulation puts a huge strain on cities’ infrastructures and their ability to provide basic necessities – like clean water and a decent place to live. Consider:
• Overall, almost 180,000 people move into cities every day
• Of the billion people designated very poor, over 750 million live in urban areas
• 1 billion people – one-sixth of the world’s population – now live in shanty towns
• The number of slum-dwellers is estimated to grow by nearly 500 million by 2020

Transportation officials in Singapore, Brisbane and Stockholm are using state-of-the-art systems to reduce both congestion and pollution. Public safety administrators in major cities like New York and Chicago are able not only to solve crimes and respond to emergencies, but to help prevent them. A large hospital organization in Paris is implementing an integrated patient-care management solution to facilitate seamless communication across its business applications – enabling it to track every stage of a patient’s stay in the hospital. And smart water management in the Paraguay-Paraná River Basin of Brazil is helping to improve water quality for São Paulo’s 17 million residents.

Of course, when urban planners can no longer find the surface space to install vital infrastructure components, they go underground. And while few, if any, cities can rival New York in the density and complexity of its subterranean networks, 21st-century cities are looking to take the concept to a new level. Officials in Oslo, Norway, may be the next underground pioneers. In the Norwegian capital, developers have created a whole sub-urban community: confronted by the city’s hilly terrain, engineers have built all sorts of structures – such as power plants, an air-traffic-control tower and a dairy processing operation – under the surface. As a result, some of the world’s most sophisticated air-circulation systems can be found in Oslo, as well as underground lighting that’s tweaked to mimic the movement of the sun throughout the day.

Thoreau called the city “millions of people being lonesome together”, but it needn’t be; that is where infrastructure – the underlying network of nodes and interconnections that underpins every urban center – has a vital role to play.




SMART TECHNOLOGY

BIG BLUE SKY THINKING



Can greater instrumentation, interconnectedness and intelligence be used to revolutionize infrastructure and make the world a better place? IBM certainly thinks so, and is betting billions to prove its point. Ben Thompson reports.

We live in an unpredictable world. You know it when you get caught in an unforecast storm on a sunny summer’s day; you know it when the stock market tanks and your previously rock-solid investments are reduced to worthless junk; you know it when you hit unexpected gridlock on the way to that all-important business meeting. What you don’t always know are the hows, whys and wherefores – the myriad combination of variables that fell into place in order for those events to unfold.

It’s like trying to link the butterfly in China to the tornado in Texas: did the minuscule changes in atmospheric conditions caused by the flapping of the butterfly’s wings set in motion a chain of events that ultimately led to the formation of a Texan twister 7000 miles away? Without the right information, how do you know whether each of those interactions is connected? What other factors played a part in creating the right conditions for those complex systems to develop? And given better intelligence, is it possible to predict how such permutations might play out in future?

Saving the planet
It is precisely these questions that IBM hopes to answer. Approximately 30 percent of the company’s $6 billion annual R&D budget goes into long-term research, with the department churning out more than 4000 patents a year. IBM consultants then mine that output for so-called ‘repeatable assets’ – problem-solving technologies that can be applied in a variety of different settings and industries – under the aegis of IBM’s Smarter Planet initiative, a program built on the idea that if we can connect the systems that run our world, we can create less traffic, healthier food, cleaner water and safer cities, amongst other things.

Dr Mark Dean is Vice President of Global Strategy and Operations for IBM Research, responsible for setting the direction of the company’s overall research strategy across eight worldwide labs. He believes that a better understanding of the way systems work can help solve some of the challenges we face as a global society. “Most of the environments we work and live in are complex systems, a group of operating elements that together are too complex for us to visualize,” he explains. “Experience helps us to make a guess about what might happen in a particular system, but as humans we just don’t have the capacity to model all the possible outcomes in our minds. So we are looking for approaches that allow us to come up with some reasonable insight into how those systems work.”

Consider some facts. Congested roadways in the US cost $78 billion in 4.2 billion lost work hours and 2.9 billion gallons of wasted gas annually, not to mention having a significant adverse impact on air quality. Inefficient supply chains cost $40 billion annually in lost productivity, more than three percent of total sales. Our power grids are hemorrhaging energy, with enough electricity lost annually to power India, Germany and Canada for an entire year. And our healthcare system really isn’t a ‘system’ at all; it fails to link diagnoses, drug delivery, healthcare providers, insurers and patients, while costs spiral out of control, threatening both individuals and institutions.

“I think we’ve reached a breaking point in lots of different industries,” acknowledges Dean. “Things develop legacies over time, and once we get something working we tend to push that approach to its limits until it reaches a point of capacity. It’s time to break down the old established ways of doing things, in order to move to the next level. We need to add in new approaches that allow us to do things more efficiently, to develop more trust, to get more done. It’s time for a fundamental reboot of the basic operating systems that drive our economies, our markets and our societies.”

IBM believes such a reboot is possible by thinking smarter – and has the customer case studies to prove it. Stockholm, for instance, has used a congestion management system to cut gridlock by 20 percent, reduce emissions by 12 percent and increase public transportation use dramatically.


Smart food systems are using RFID technology to trace meat and poultry from the farm through the supply chain to store shelves. Smarter healthcare systems can lower the cost of therapy by as much as 90 percent. And similar intelligent systems are transforming energy grids, supply chains and water management, as well as helping confirm the authenticity of pharmaceuticals and the security of currency exchanges. “We’re using these technologies to help countries, organizations and companies blend old with new, so you don’t have to do a rip-and-replace,” says Dean. “There’s still a lot of capacity out there; it’s just that new approaches need to be brought to bear in order to utilize it. Creating a smarter planet is about the integration, analysis and monitoring of existing infrastructure, along with the addition of new analytic and management capabilities, which has made a lot of this possible. And it’s amazing what we’ve been able to create.”
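To make the traceability idea concrete, here is a minimal sketch of how RFID read events might be chained into a farm-to-shelf trail. The tag IDs, locations and event records are invented for illustration; this is our sketch of the general technique, not IBM’s implementation.

```python
from collections import defaultdict

# Each RFID read is logged as (tag_id, location, timestamp).
# Hypothetical events; a real system would stream these from readers.
scan_events = [
    ("TAG-0042", "Hargrove Farm", "2010-03-01T06:10"),
    ("TAG-0042", "Processing Plant 7", "2010-03-02T11:25"),
    ("TAG-0042", "Cold-Chain Truck 19", "2010-03-03T08:40"),
    ("TAG-0042", "Store #311 Shelf", "2010-03-04T09:05"),
]

def trace_history(events):
    """Group scans by tag and order them in time to give a farm-to-shelf trail."""
    trails = defaultdict(list)
    for tag, location, ts in events:
        trails[tag].append((ts, location))
    return {tag: [loc for _, loc in sorted(hops)] for tag, hops in trails.items()}

print(trace_history(scan_events)["TAG-0042"])
# ['Hargrove Farm', 'Processing Plant 7', 'Cold-Chain Truck 19', 'Store #311 Shelf']
```

The same ordered trail supports recalls in reverse: given a contaminated shelf, walk the history back to the farm.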

Powering up


Take the energy industry, for example. For decades, power was something the average consumer didn’t think about too much. Electricity was just there (except, of course, when it wasn’t, and then it was all you could think about until it came back on). Today, however, climate change, rising energy prices and technology advances have all helped to reshape the collective mindset of consumers, turning many from passive ratepayers into highly informed, environmentally conscious customers who want a role in how and when they use power – as well as what they pay for it.

As Vice President of IBM’s global energy and utilities business, Allan Schurr has witnessed this shifting dynamic firsthand. “All three of those things have fundamentally altered the go-forward for utilities, so there’s not really an option of business-as-usual,” he says. “There is a real groundswell of change taking place in the industry. Up until now it has operated more or less in the same manner for the last 50 years – central station generation pushing electrons through a hierarchical distribution network down to the end consumer, with very little information flow. What we’re now doing is adding substantially more intelligence and data collection systems to the energy picture, so that far more energy awareness and energy efficiency can be effected through the use of new technology and better interaction amongst the parties. It’s really exciting.”

Schurr explains how smart grids use sensors, meters, digital controls and analytic tools to automate, monitor and control the two-way flow of energy across operations – from power plant to plug. A power company can optimize grid performance, prevent outages, restore outages faster and allow consumers to manage energy usage right down to the individual networked appliance. Smart grids can also incorporate new sustainable energies such as wind and solar generation, and interact locally with distributed power sources or plug-in electric vehicles.

Intelligent networks are a win-win, says Schurr. “There are opportunities to reduce the amount of investment that utilities are required to make if they have more precise information about loading,” he explains. “There are opportunities to increase reliability based on information about where outages are occurring, how extensive they are, and whether power has been restored after an event. And most importantly, there are opportunities for efficiencies on the consumer side. With greater granularity of information, consumers can make better decisions about what they purchase in terms of appliances, how efficiently their equipment is operating and whether or not they can shift their consumption patterns from more expensive peak periods to less expensive off-peak times.”

One of IBM’s flagship projects can be found in the Mediterranean. Malta’s electricity and water systems are inexorably intertwined, with the country depending entirely on foreign oil for the production of all of its electricity and more than half of its water supply, which filters through an energy-intensive desalination process. A newly implemented smart grid, integrating both water and power systems, will be able to identify water leaks and electricity losses in the grid, allowing the utilities to more intelligently plan their investments in the network and reduce inefficiency. Over 250,000 interactive meters will monitor electricity usage in real-time, set variable rates and reward customers who consume less energy and water. “By becoming more efficient with water use, they reduce electricity consumption; and by becoming more efficient with electricity consumption, they reduce the importing of fuel that is used to power their generators,” says Schurr. “It’s a very exciting project that demonstrates how taking a very ambitious agenda and putting it into reality can be done economically.”
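The Malta rollout suggests the general shape of meter-driven pricing. The sketch below shows the kind of logic involved: price each hour’s usage against a time-of-day tariff, then credit customers who stay under a daily consumption target. The rates, peak window, target and reward are invented assumptions, not the actual Maltese scheme.

```python
# Illustrative time-of-day tariff: peak vs off-peak rates (invented numbers).
PEAK_HOURS = range(17, 21)            # 5pm to 9pm, assumed peak window
PEAK_RATE, OFFPEAK_RATE = 0.22, 0.11  # $/kWh, assumed

def daily_bill(hourly_kwh, conservation_target=9.0, reward_rate=0.05):
    """Price each hour's usage, then credit customers who stay under a daily target."""
    cost = sum(kwh * (PEAK_RATE if hour in PEAK_HOURS else OFFPEAK_RATE)
               for hour, kwh in enumerate(hourly_kwh))
    total_kwh = sum(hourly_kwh)
    if total_kwh < conservation_target:   # reward low consumption
        cost -= (conservation_target - total_kwh) * reward_rate
    return round(cost, 2)

# 24 hourly meter readings in kWh: quiet night, steady day, evening peak.
usage = [0.2] * 6 + [0.5] * 11 + [1.2] * 4 + [0.3] * 3
print(daily_bill(usage))
```

With readings like these arriving from every meter, the utility can both bill by time of day and spot customers who respond to price signals.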

CASE STUDY: The intelligent utility

A simulated thunderstorm shows how CenterPoint’s intelligent grid responds instantly to isolate outages. Image: CenterPoint Energy

When it comes to the electricity that powers homes, schools, businesses and hospitals, most people have little more than a fuzzy idea of what’s involved in getting it there. This ambiguity disappears when it comes to their expectations, however. They expect the power to be there when they need it, and if it’s not, they want the problem fixed as fast as possible – period. In the greater Houston area, it’s the responsibility of CenterPoint Energy’s electric transmission and distribution business unit to meet this expectation for its two million customers. The utility owns and maintains 3766 miles of transmission lines and 46,376 miles of distribution lines – enough to go around the world twice – and delivers over 76 million megawatt hours of electricity annually.

Like others in the electric transmission and distribution industry, the company faced the challenge of how to deliver power more efficiently and reliably in the face of growing consumer expectations, environmental concerns and increasing costs. Following the Northeast blackout in 2003 and the severe hurricane seasons in 2004 and 2005, the utility was looking for ways to ‘harden’ the grid by making it better able to resist outages and fluctuations in power quality. It was also looking to give electricity consumers the information and means to change their consumption patterns based on near real-time usage data, transparency and time-of-day pricing, allowing the consumer to be an interactive participant in the electric market.

The goal was to bring many of the defining attributes of the information superhighway, such as resiliency and intelligence, to the electrical grid. Drawing upon expertise and technology from every part of IBM, CenterPoint Energy established a roadmap for building an intelligent utility network that would provide a more granular, real-time view of conditions on the grid – vastly improving the ability to make the grid more reliable and operations more efficient. “We expect that the intelligent grid will improve electric power line grid planning, operations and maintenance, enabling us to deliver power more efficiently,” says Tom Standish, Group President of Regulated Operations at CenterPoint Energy. “We also expect the technology to contribute to fewer and shorter outages.”

The solution will address these issues through the innovative application of leading-edge technologies, such as broadband over power line (BPL), and first-of-a-kind failure detection capabilities that go beyond what was previously thought possible. The fact that BPL, which sends a broadband signal over distribution wires, leverages the utility’s existing assets is just one benefit; the bigger story is how the company’s future BPL infrastructure will provide a single conduit for a wide range of grid-related activities, with advanced meter services, the use of the meter as a sensor on the grid, and the deployment of home area network monitoring and control prime examples.

As a storm-prone city situated on the Gulf Coast – and the home to a large base of energy-hungry businesses – Houston is the ideal testing ground for one of the world’s first true intelligent utility solutions. “While we see this initiative as helping to transform us as a company, many of the results and innovations that come out of it will help to transform the energy transmission and distribution industry as a whole,” says Don Cortez, VP of Operations Technology at CenterPoint Energy.
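CenterPoint’s meter-as-sensor approach hints at how failure detection can work in practice. Below is a minimal sketch of one such heuristic: flag a feeder as the likely point of failure when most of its meters go silent at once. The feeder map, meter IDs and threshold are illustrative assumptions, not the utility’s actual algorithm.

```python
# A sketch of meter-as-sensor outage detection (our illustration, not
# CenterPoint's system): if most meters on a feeder stop reporting,
# the feeder itself is the likely point of failure.
feeder_meters = {
    "feeder-A": {"m1", "m2", "m3", "m4"},
    "feeder-B": {"m5", "m6", "m7"},
}

def suspect_feeders(reporting_meters, threshold=0.75):
    """Return feeders where the share of silent meters exceeds the threshold."""
    suspects = []
    for feeder, meters in feeder_meters.items():
        silent_share = len(meters - reporting_meters) / len(meters)
        if silent_share >= threshold:
            suspects.append(feeder)
    return suspects

# Simulated storm: all of feeder A's meters go quiet, feeder B stays up.
print(suspect_feeders(reporting_meters={"m5", "m6", "m7"}))  # ['feeder-A']
```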



An intelligent future
IBM’s Sam Palmisano believes that through pervasive instrumentation and interconnection, almost anything – any person, any object, any process or any service, for any organization, large or small – can become digitally aware, networked and intelligent. “These new capabilities could not be more timely,” he says. “Even in today’s difficult environment, businesses are willing to invest in IT solutions – if they cut costs, drive efficiency and productivity, preserve capital and create competitive advantage. And that’s exactly what smarter solutions do.”


Dealing with complexity
The Malta project also highlights another key point: many of the challenges we face today are interrelated, and tackling them intelligently requires a greater understanding of the interplay between a variety of complex systems. The synergy between the water and power sectors is one example, but there are countless others. “This whole Smarter Planet initiative is about trying to understand these different environmental areas as part of a larger, more complex ecosystem, and being able to generate enough information so that we can start to identify patterns and see where the impact points are,” says Sharon Nunes, Vice President of Big Green Innovations, an organization created to identify and launch new businesses for IBM. “Take water, for example. There’s a lot of stress on water systems around the world. With a limited supply, we need to be able to better manage it – and that means better monitoring and measurement, as well as having a holistic picture of what the demand will be and where that demand will come from.”
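The Malta example shows how savings compound across coupled systems: water saved means desalination energy avoided, which means generator fuel not imported. The back-of-the-envelope model below walks through that chain; both coefficients are assumed round numbers for illustration, not IBM or Maltese figures.

```python
# Back-of-the-envelope coupling of the water and power systems, in the
# spirit of the Malta project. All figures are illustrative assumptions.
DESAL_KWH_PER_M3 = 4.0           # assumed energy intensity of desalination
GEN_LITERS_FUEL_PER_KWH = 0.25   # assumed oil burned per kWh generated

def fuel_saved_liters(water_saved_m3):
    """Water saved -> desalination energy avoided -> generator fuel not imported."""
    kwh_avoided = water_saved_m3 * DESAL_KWH_PER_M3
    return kwh_avoided * GEN_LITERS_FUEL_PER_KWH

# Fixing leaks worth 1000 m3/day avoids ~1000 liters of imported oil per day.
print(f"{fuel_saved_liters(1000):.0f} liters/day")
```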



Clearly, there’s plenty of room for improvement. For instance, municipalities commonly lose as much as 50 percent of their water supply through leaky infrastructure, while global agriculture wastes an estimated 60 percent of the 2500 trillion liters it uses every year. One in five people still lacks access to safe drinking water, and the United Nations predicts that nearly half the world’s population will experience critical water shortages by the year 2080.

As an example of the inefficiency inherent in the system, Nunes highlights how there are nearly 53,000 water agencies in the United States alone, but very little coordination between them – despite the fact that they are all managing a shared resource. “There is no sharing of data to achieve a holistic view of the entire watershed or water ecosystem,” she says. “Yet if you want to ensure that water is distributed amongst all of the competing demands in an equitable fashion, you really need to understand what the supply is and what the demand is on that entire system.”

To address such issues, IBM is working on a number of programs to help make management of the world’s water ecosystem smarter. Together with the Beacon Institute for Rivers and Estuaries and Clarkson University, the company is creating a data platform to support instrumentation of the entire length of the 315-mile Hudson River for a real-time view of a river system that supplies both industry and individuals. In the Netherlands, the company is working with partners to build smarter levees that can monitor changing flood conditions and respond accordingly. And across the world, sensors are revolutionizing agriculture, providing detailed information on air quality, soil moisture content and temperature to calculate optimal irrigation schedules. Elsewhere, smart metering can give individuals and businesses timely insight into their own water use, raising awareness, locating inefficiencies and decreasing demand. IBM and the Dow Chemical Company, through its Dow Water Solutions business, are working together to enable unprecedented visibility into water usage – starting with desalination plants. And IBM itself is saving over $3 million a year at one North American semiconductor plant through a comprehensive water management solution.

Nunes’ team is also hoping to move beyond real-time monitoring to prediction, using advanced computing and analytics to support better-informed management decisions. The test-bed is a collaborative research initiative with the Marine Institute in Ireland that aims to turn Galway Bay into a living laboratory – instrumenting the bay to gather data on water temperature, currents, wave strength, salinity and marine life, and applying algorithms that can forecast everything from wave patterns over 24 hours to the right time to harvest mussels. “There’s a lot of data available about water, but there is not a lot of information,” she explains. “What we have found working with the Ireland team is that we’ve been able to take a lot of what you would think of as disparate information and integrate it to give us a better picture of what is happening in Galway Bay.”

Smarter traffic management
The city of the future may have fleets of smaller buses that change routes on the fly to go where they are needed most, while larger buses travel high-demand routes in peak periods. All will be integrated with a system that dynamically tracks and adjusts their movements, to meet changing user demand.

Healthy profits
Healthcare is another sector set to benefit. “Our current approach to healthcare is just not sustainable,” says Sean Hogan, IBM’s VP for Healthcare Delivery Systems. “However, the financial crisis has highlighted the burden that healthcare costs are placing on our society, and as such is prompting a very engaging debate about what to do about it. And the conclusion is that if we are going to address the issues of access, cost and healthcare quality, we have to have better information technology to support that.”

Rising costs, limited access, high error rates, lack of coverage, poor response to chronic disease and the lengthy development cycle for new medicines – Hogan explains how most of these could be improved if we could link diagnosis to drug discovery to healthcare providers to insurers to employers to patients and communities. Today, these components, processes and participants that comprise the vast healthcare system aren’t connected.


Duplication and handoffs are rampant. Deep wells of lifesaving information are inaccessible. IBM believes that a smarter healthcare system starts with better connections, better data, and faster and more detailed analysis. It means integrating data and centering it on the patient, so each person ‘owns’ his or her information and has access to a networked team of collaborative care. It means moving away from paper records, in order to reduce medical errors and improve efficiencies. And it means applying advanced analytics to vast amounts of data, to improve outcomes.

“If you had a map of all industries and plotted the sophistication of the use of information technology within those industries, healthcare would be on the lower end of the spectrum – despite the fact that it is a very technology-intensive and information-intensive sector,” says Hogan. “But IT can help make the administrative process smarter and more efficient; it can enable health information to be shared between care providers and eliminate redundant procedures; and it can better support the process of care so that physicians have the right information available to support the decisions they need to make.”

For instance, Sainte-Justine, a research hospital in Quebec, is automating the gathering, managing and updating of critical research data, which is often spread across different departments. With the help of IBM technology, the center is applying analytics to speed childhood cancer research and improve patient care while drastically lowering the cost of data acquisition and enhancing data quality. Another example is Geisinger Health Systems, which is integrating clinical, financial, operational, claims, genomic and other information into an integrated environment of medical intelligence that helps doctors deliver more personalized care. This enables them to make smarter decisions and deliver higher quality care, all because they can easily turn information into actionable knowledge.

And true to the premise of Smarter Planet, healthcare systems like these hold promise beyond their particular communities, patients and diseases. “The smart ideas from one can be replicated across an increasingly efficient, interconnected and intelligent system,” says Hogan. “This should result in lower costs, better-quality care and healthier people and communities. In other words, we’ll have a true healthcare system with the focus where it belongs – on the patient.”

Green means green
Of course, despite the undoubted societal benefits the program is likely to stimulate, Smarter Planet is not purely governed by altruism. Nunes estimates that information technology for water management could become a $20 billion market over the next few years, while in the energy sector, recent research by analyst firm The Brattle Group suggests US utilities alone will spend $1.5 trillion upgrading infrastructure by 2030. Make no mistake: the building of smarter systems will be a multitrillion-dollar opportunity for tech companies such as IBM.

It helps that there is currently federal funding available for many of these initiatives as a result of the recently passed economic stimulus package. The American Recovery and Reinvestment Act of 2009 includes $19 billion for healthcare technology, $11 billion for the implementation of smart-grid technologies and $6 billion earmarked for water infrastructure projects. Federal money has also been set aside for expanding broadband and wireless access, as well as countless other infrastructure improvement projects that could help realize the vision of implementing a more intelligent approach to the way we manage our socio-economic systems.

IBM is not alone in pursuing this emerging market. GE has ramped up its Ecomagination program; Cisco has its recently announced EnergyWise project; and similar initiatives from technology big-hitters such as Google, Hewlett-Packard and Fujitsu, amongst others, have also raised awareness as to the opportunities in this space. Ultimately, this is a good thing: not only are such companies spending at a time when the economy desperately needs it, they are also establishing the ground rules for a new way of looking at the world. The vision of a smarter planet is one where technology provides the building blocks for change; however, it will be up to people to effect those improvements and implement the vision.

It’s why IBM is spending millions on the marketing side of Smarter Planet – in order to generate a greater appreciation of both the challenges we face as a society, and the solutions that are possible using existing technologies and approaches. “Why get smarter?” asks CEO Sam Palmisano. “Firstly, because we can: the technology is both available and affordable. Secondly, because we must: the shocks we’ve seen to so many systems show that the current approaches aren’t sustainable. And finally, because we want to: IBM is starting a conversation with the world because we think now is the time to make these changes for the better.”

The final point is that this is a campaign that has the potential to inspire us all to try and make a difference. We might not have a $6 billion research budget or a big-picture view, but we can all recognize inefficiencies in the way society operates on a daily basis. If we can think of ways to become smarter ourselves, we can not only improve our quality of life and make the world a better place in which to live; we might also make a fortune in the process.

Smarter water management
In Galway Bay, Ireland, SmartBuoys collect data from a variety of different sources, which is used to inform a host of industries:
• Aquaculture – Farmers cultivating shellfish need information on salinity, temperature and water quality, especially harmful algae blooms that may threaten a crop
• Commercial fishing – Fishing boats are mostly interested in weather and water quality data, to better locate catches and ensure safety
• Harbor Master – Controlling the industrial shipping ports in the bay requires real-time information on high tides and flow rates
• Alternative energy – Energy companies are particularly interested in data on the potential energy in waves
• Tourism – The Irish Water Safety Association can close beaches or alert lifeguards to dangerous water conditions like jellyfish or rip tides
• Restaurants – Fishing boats can communicate with local restaurants to inform them of when they will dock and with what catch
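As a sketch of how a single buoy reading might serve the stakeholders listed above, each party can register a simple rule and be notified whenever its rule fires. The field names and thresholds are invented for illustration, not the Marine Institute’s actual system.

```python
# Route one buoy reading to interested stakeholders (illustrative only).
reading = {"salinity": 34.1, "wave_height_m": 2.8, "algae_index": 0.7,
           "jellyfish_sightings": 12}

ALERT_RULES = {
    "aquaculture":   lambda r: r["algae_index"] > 0.5,        # bloom risk to shellfish
    "harbor_master": lambda r: r["wave_height_m"] > 2.5,      # rough water in the port
    "lifeguards":    lambda r: r["jellyfish_sightings"] > 10, # consider closing beaches
}

def route_alerts(r):
    """Return which stakeholders should be notified about this reading."""
    return [who for who, rule in ALERT_RULES.items() if rule(r)]

print(route_alerts(reading))  # ['aquaculture', 'harbor_master', 'lifeguards']
```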




IN THE BACK. YUCCA MOUNTAIN

Yucca Mountain: the nuclear waste repository
The Yucca Mountain project, located in Nevada near the nation’s nuclear test sites, had been underway since 1987, when Congress selected the site as America’s nuclear waste repository. To ensure the waste was safely stored, over $9 billion was spent on concrete tunnels and chambers designed to keep waste safe for at least a million years.

The case for a repository
• 70,000 metric tons of nuclear waste is currently stored at 121 sites around the US – this is the amount that would potentially be stored at Yucca Mountain
• A further 2500 metric tons of highly radioactive spent fuel from the production of nuclear weapons sits in temporary storage
• Yucca Mountain is only 80 miles outside of Las Vegas, NV, and neighbors the Nevada test site, where there were 904 atomic bomb tests between 1945 and 1992

How is it paid for?
• As of 2008, $9 billion had been spent on the project; the latest Total System Life Cycle Cost estimate is $90 billion
• 73 percent is funded by consumers of nuclear-powered electricity; 27 percent is funded by the taxpayer

Radiation exposures compared with the Yucca limits (millirems per year)
Between 0 and 10,000 years:
• Yucca Mountain radiation standard: 15
• Working in the US Capitol Building: 20
• Round-trip air travel between Washington DC and Las Vegas: 4
• Eating one banana a day: 4
• Eating potatoes (at the US rate of consumption): 4

Between 10,000 and 1 million years:
• Yucca Mountain radiation standard: 350
• Background radiation in Ramsar, Iran: 13,000
• Level below which no scientific evidence of cancer risk exists: 1,000
• Background radiation of South Dakota: 963
• Background radiation of Colorado: 700

Demand for nuclear energy also demands disposal capability
• US nuclear plants generated 809 billion kWh in 2008 – 20 percent of the US’s total electricity output and 30 percent of the world’s nuclear output
• The US has 104 nuclear reactors
• 70 percent of Americans now favor nuclear energy

[Source: ‘Yucca Mountain – The Most Studied Real Estate on the Planet’, a report by the US Senate Committee on Environment and Public Works; World Nuclear Association; Wikipedia]



IN THE BACK. BOOK REVIEWS

1. Construction Materials, Methods and Techniques: Building for a Sustainable Future
By William P. Spence and Eva Kultermann
Comprehensive coverage of the most up-to-date green methods for residential and commercial building construction, along with the construction materials and properties needed to carry them out. A logical and well-structured format follows the natural sequence of a construction project.
US Infrastructure says: A thoroughly rounded, need-to-know guide that could prove critical to success in the green building sector.

2. Building for a Changing Climate: The Challenge for Construction, Planning and Energy
By Peter F. Smith
With a practically universal consensus that our climate is changing rapidly, there is extensive debate about what we can do to mitigate the damage being caused. It is becoming increasingly clear that a large part of our resources will have to be directed towards adapting to new climatic conditions. Nowhere is this more evident than in the built environment. In this book, sustainable architecture guru Peter Smith lays out his vision of how things are likely to change, and what those concerned with the planning, design and construction of the places we live and work can and must do to avert the worst impacts.
US Infrastructure says: An invaluable mine of information on the global environmental crisis.

3. Sustainable and Resilient Critical Infrastructure Systems: Simulation, Modeling, and Intelligent Engineering
By Kasthurirangan Gopalakrishnan and Srinivas Peeta
As our critical infrastructure becomes increasingly interdependent, the need to ensure that it remains resilient and sustainable, while at the same time being adaptive, has become a key focus area for the future. This book looks at recent advances in simulation, modeling, sensing, communications/information and intelligent and sustainable technologies that have resulted in the development of sophisticated methodologies and instruments to design, characterize, optimize and evaluate critical infrastructure systems, their resilience, and their condition and the factors that cause their deterioration.
US Infrastructure says: Particularly pertinent for those involved with infrastructure planning, design, financing and maintenance.

4. LEED Materials: A Resource Guide to Green Building
By Ari Meisel
It may be good to be green, but it’s still far from easy, and an architect’s knowledge of materials can make or break a building’s green rating. Though LEED’s performance-based criteria exclude individual materials and products from earning points toward certification, their specific use can. Apply a material in the wrong situation and you may not get credit for it. Fortunately, with a little insider knowledge, you can also use one material to get credit in two, three, or even more areas. LEED Materials is packed with critical information on nearly 200 materials, products and services.
US Infrastructure says: This book fits well alongside other LEED references in any architect’s library.

5. The Green House: New Directions in Sustainable Architecture
By Alanna Stang and Christopher Hawthorne
From the arid deserts of Arizona to the icy forests of Finland, the authors of this book have traveled the globe to find all that is new in the design of sustainable homes. Six different climatic zones are presented in The Green House – waterfront, forest and mountain, tropical, desert, suburban and urban; there is also a section on mobile dwellings. Each chapter features a series of homes. Projects are presented with large color images, plans, drawings, and an accompanying text that describes their green features and explains how they work with and in the environment.
US Infrastructure says: A beautiful book and one that is an indispensable reference for anyone interested in sustainable design.



IN THE BACK. PHOTO FINISH

LA infrastructure project gets federal funds
Los Angeles County Metro workers prepare K-rail barrier boards for installation along the I-405 as part of a $1 billion infrastructure project that will add a 10-mile HOV lane in the Sepulveda Pass. Federal stimulus funds amounting to $189.9 million have been earmarked for the project, which will also improve supporting infrastructure such as ramps, bridges and sound walls on the San Diego Freeway (I-405) and widen lanes from the Santa Monica Freeway (I-10) to the Ventura Freeway (US-101). The project will reduce existing and forecasted traffic congestion on the I-405 and enhance traffic operations by adding freeway capacity in an area that experiences heavy congestion. It also aims to improve both existing and future mobility and enhance safety throughout the corridor while decreasing commuter time, reducing air pollution and encouraging more people to rideshare. The I-405 Sepulveda Pass Widening Project’s features include removing and replacing the Skirball Center, Sunset Blvd. and Mulholland Dr. bridges; realigning 20 on- and off-ramps; widening 20 existing overpasses and structures; and constructing approximately 18 miles of retaining wall and sound wall.


