Submarine Telecoms Forum is published bimonthly by WFN Strategies. The publication may not be reproduced or transmitted in any form, in whole or in part, without the permission of the publishers.
Submarine Telecoms Forum is an independent commercial publication, serving as a freely accessible forum for professionals in industries connected with submarine optical fibre technologies and techniques.
Liability: while every care is taken in preparation of this publication, the publishers cannot be held responsible for the accuracy of the information herein, or any errors which may occur in advertising or editorial content, or any consequence arising from any errors or omissions.
The publisher cannot be held responsible for any views expressed by contributors, and the editor reserves the right to edit any advertising or editorial material submitted for publication.
Contributions are welcomed. Please forward to the Managing Editor:
On my long runs I have a lot of time for self-reflection.
Some thoughts are deeper than others; some are simple fragments, mantras used to cloud the action and pain of the moment. As I get older and the general discomfort increases, so, too, does the number of disjointed thoughts. And when you are built like me, you have a lot to be uncomfortable about.
So, it was yesterday, during a 21-mile butt-kicker, that I realized the awe-inspiring Exordium I had written only the day before was actually what I had planned to write for November's issue. The "insights" and their seeming appropriateness were in fact neither, and had no place in this issue.
Maybe it was my mind’s way of pushing past the dreaded October marathon date. Maybe I am just slowly starting to lose it!
Thankfully, we have evolved to a place where the issues have their own life and can come together in new, interesting ways. And what an issue we have in store. We hope you’ll agree, and have something to ponder on your next long one...
Bharti Airtel Acquires Telecom Seychelles
AAG Cable Hits Snag
Alcatel-Lucent undersea cable ship (photos)
AT&T to Increase Undersea Cable Capacity With Major Upgrade of Asia Pacific Cable Network 2
Bharti and 15 others launch EASSy in Africa
Bharti to expand undersea cable biz
Cyprus-France segment of ALEXANDROS Announced
DEG finances submarine cable to West Africa
E-Marine Signs Major Maintenance Agreement to Cover East African Submarine Network (EASSy)
EASSy cable live and prices are falling
East-West Cable Advances
eFive Telecoms selects Alcatel-Lucent to build new submarine cable network linking West Africa to South America
Ericsson Reports on Environmentally Friendly Submarine Cables
Etisalat Goes Live on Main One Submarine Cable system
Globacom Gets Operating Licence in Gambia
Global Marine Systems launches "Predator" Inspection Class ROV at HydroVision International
Globe Telecommunications boosts growth of new wave cities in Western Visayas
Google Jump-starts Liberian Online Businesses
Hibernia Atlantic Opens New Office In London U.K.
Huawei bid challenged
Internet to slow down unless more undersea cables are laid
Interoute Assures Data Security For European Enterprises Operating in Sweden
Korean Middle East Engineering wins US$1bn Saudi cable deal
Lime Planning US$600 Million Investment in Caribbean Networks
LS Cable develops umbilical cable for the first time in Korea
Main One ties West Africa closer to Internet
MARISOFT aim for pyramid heights
NTT Com to Enhance Security Services in Northern Europe through Acquisition of Secode AB
Oceanic Time Warner Cable services out statewide in Hawaii
Pacific Crossing Announces Further PC-1 Upgrade to Meet Demand Growth
Pacnet joins Pacific Fibre to build TransPacific Subsea Cable
PC World’s Most Dangerous Jobs in Technology
Pyramid Finds Undersea Cables and WiMax to Propel Africa’s Broadband Growth
Royal Navy Is "Dangerously Weak" Say Experts
SEACOM Repair Update
SMD to supply Hallin Marine with Quantum XPs and new Hanger LARS
SMD to supply Quasar ROVs to Kreuz Subsea, Singapore
Starcomms Signs Up With Main One
WACS to extend its submarine cable system from Portugal to the UK with Alcatel-Lucent's 40G ultra-fast optical technology
WFN Strategies Names Sales Manager
WFN Strategies Ranks No. 1381 on the 2010 Inc. 5000
WFN Strategies Receives 2010 Best of Sterling Award
WIND arrives in Cape Town with first load of recovered submarine cable
The Gulf Oil Spill's Effect on the Energy Market
Greg Berlocher
On April 20 a massive explosion ripped through Transocean's Deepwater Horizon semi-submersible drilling rig, under contract to BP, killing 11 and wounding 7 of the 126 workers onboard. Fires on the rig raged for two days as an armada of workboats showered the inferno with seawater in a valiant but ultimately futile effort to save the vessel. On April 22, the rig sank to the bottom, damaging the subsea infrastructure and allowing the Macondo Well to spew crude into the Gulf of Mexico for 84 days.
Vivid images of crude gushing from a damaged pipeline 5,000 feet beneath the Gulf's surface were the highlight of evening news stories for months, visually reinforcing the point that each new attempt to cap the well was ineffective. At press time, the well had been capped at the sea floor and a relief well was nearly complete, which will allow engineers to "bottom kill" the Macondo Well, sealing it permanently.
Estimates differ on the amount of crude which spilled into the Gulf, but the environmental damage isn't the only concern along the Gulf of Mexico; President Obama's executive order banning deepwater drilling in the Gulf has cost jobs and done long-term damage to the Energy Industry in the United States.
While demand for subsea fiber in the Energy market clearly isn't as large as in other markets, it is growing and is an important market segment. How will the Gulf oil spill affect the Energy Industry as a whole, and how will it affect the subsea fiber market?
Cleaning up the Spill
Oil spills are ugly no matter how they are characterized. Crude oil, which is an amalgam of many different types of hydrocarbons and chemicals, is thick, viscous and sticky. It is unsightly and tends to foul whatever it comes into contact with. Birds and wildlife are often mired in the goo. Louisiana’s coast features thousands of square miles of rich wetlands which serve as a nursery ground for many shell- and finfish. Many estuarine areas have been closed to both commercial and recreational fishing. As a result, the economies of Louisiana and bordering Gulf States have suffered.
Fueled in part by sensational media reports, the public's perception of the lingering environmental effects does not match reality. Significant quantities of oil were skimmed from the surface and collected, or burned, before making it ashore. Many of the lighter compounds in the crude simply evaporated and dissipated into the air. Although the oil spill covered 2,700 square miles at one point, chemical dispersants and bacteria, as well as relentless heat from the summer sun and agitation from waves, dispersed and broke down the oil that wasn't skimmed.
While it will take a number of years for marshes and coastal areas to fully recover, it will not take half a century as some doomsday environmentalists are prophesying.
This isn’t the first oil spill the Gulf has experienced. A look at history reveals that 56 oil tankers were torpedoed by German U-Boats in the Gulf of Mexico during World War II as they left refineries along the coasts of Texas and Louisiana. Although the tankers were much smaller than today’s giant vessels, the amount of crude that was spilled was massive.
While the environmental impact of the spill can’t and shouldn’t be minimized, cleaning up the emotional damage from the spill will be more difficult than cleaning up the oily residue.
Collateral Damage of Drilling Moratorium
In late May, President Obama issued a six-month moratorium on deepwater drilling, setting off a legal struggle pitting the White House and the Energy Industry against one another. The moratorium was overturned by a federal judge, but it was modified and reissued shortly thereafter. The effect of the drilling ban is becoming clearer as the weeks go by.
While the moratorium was supposedly about deepwater drilling permits, in fact it has affected all drilling permits, including those in shallow water. In an article published on June 26, CNBC reported that only one drilling permit had been issued since June 18, effectively creating an "unofficial moratorium" on all offshore drilling.
The production of hydrocarbons is much like an assembly line. Drilling is the discovery process, which allows geophysicists to confirm what they think is under the Earth's crust. Engineering comes next. Decisions must be made on the development of wells, or fields of wells, and platforms must be designed to support production, as well as pipelines to transport the oil and natural gas to offshore collection facilities or back to the beach. Fabrication of the offshore platforms comes next, with offshore construction being the final step before production begins.
Drilling is the first of a series of steps, and wells can't be drilled without government permits. The net result of the drilling moratorium is a significant shutdown of the entire infrastructure supporting the Energy Industry. In a recent interview, John Rynd, Hercules Offshore CEO, said that the ban is costing the company $150,000 per day in lost revenues.
It is important to note that there are many small companies which provide specialized services or products to the Energy Industry. Family-owned businesses, many of which are several generations deep in experience, are beginning to fold as the large companies they support run out of work. There are no guarantees these companies will return to business after the moratorium is lifted, which could lead to a shortage of key suppliers.
The net result to the US consumer is higher prices at the pump and higher trade deficits due to the additional dependency on foreign oil.
The Bottom Line
Energy-related business is beginning to slow down in the Gulf of Mexico. Vessel utilization and rental rates are both high, primarily because BP has contracted just about anything that will float or fly to assist in the clean-up. This has served as a temporary economic buffer, but as the offshore cleanup ends and the focus shifts entirely to the beach, many of the vessels will return to port and sit idle.
While the Gulf of Mexico represents a significant percentage of US energy production, exploration and production continues around the world, unfettered by the moratorium. Faced with the prospect of stacking their rigs, which rent for between $150,000 and $750,000 per day, drilling companies have begun moving them. Likely destinations include West Africa, the North Sea, Egypt and Brazil. Towing a rig to a different continent takes several months, and the direct costs, as well as the lost opportunity costs, of redeployment range from one to tens of millions of dollars.
NOIA's (National Ocean Industries Association) Dave Welch explained the impact of rigs leaving the Gulf of Mexico, stating: "Each deepwater rig employs on average 1,400 direct and indirect jobs. Each rig that moves out of the Gulf and overseas as a result of the moratorium and surrounding regulatory uncertainty takes those jobs with it. This has already begun to happen. One big driller, Diamond Offshore Drilling, is moving two of its five rigs, the Ocean Endeavor and Ocean Confidence, from the Gulf to other regions. The longer the moratorium and surrounding regulatory uncertainty continue, the more rigs and companies we will see leaving the Gulf of Mexico -- perhaps for good."
The net effect for consumers in the United States is higher prices at the pump, a higher trade deficit due to higher imports of foreign crude, and fewer jobs.
How does this affect the outlook for subsea fiber projects in the Energy Industry? The answer depends on geography. Clearly, offshore operations in the Gulf of Mexico are being impacted now and will be for some period of time. The best candidates for subsea fiber in the Gulf are deepwater wells near the continental shelf. With drilling permits on hold, new exploration activity has come to a halt and so have most subsea fiber projects.
This uncertainty has most companies sitting on their hands, waiting for additional news before making any concrete decisions about the future.
While the Gulf of Mexico is currently mired in problems, the outlook for subsea fiber in other basins has not been affected. Large production platforms are good candidates for subsea fiber systems. Fiber connectivity allows personnel on the beach and on the platform to work collaboratively, thereby reducing head count, improving safety and increasing the speed and quality of production decisions.
Deepwater platforms are often isolated, many miles from a coastline, and as the quest for oil extends to a continent’s shelf, floating platforms are often many miles away from previously developed communication infrastructures. This is true in the Gulf of Mexico and will often be the case in other geographic areas. As deepwater rigs move out of the Gulf of Mexico to other basins, so will the opportunities for subsea fiber.
How long the moratorium will actually last is anyone's guess. On August 3, Michael Bromwich, director of the Bureau of Ocean Energy Management, Regulation and Enforcement, stated that the drilling moratorium may end sooner than November 30, when it is set to expire. However, the ban could always be extended with the presidential stroke of a pen. Legal wrangling in the courts is unlikely to bring an expeditious end to the struggle.
Greg Berlocher is the President of Transcendent Global Networks LLC and has provided telecommunication products and services to the Energy Industry for over 30 years.
Fiber Optics Technology Blows Away Traditional Approach to Reliability in Wind Turbine Generators and Wind Farms
Mickaël Marie
Electricity generation by wind turbine generators, or WTGs, is a proven green energy technology in both land and offshore environments. However, wind farms located either onshore or offshore are often in remote, not easily accessible locations. Additionally, their above-ground height can pose unique maintenance, repair and lightning strike challenges that must be addressed to make wind power a reliable and economical source of renewable energy.
Fiber optics (FO) technology is probably best known for use in high-speed, high-bandwidth telecommunication applications. Today, however, fiber optic data and control links have also replaced copper links in wind turbines and farms, making them a critical part of wind farm operators' solutions for minimizing costly downtime and service interruption.
Fiber optics technology is the most suitable and, in some cases, the only acceptable technology in high-noise environments for electrical generator/turbine control, power conversion and wind farm wide-area communications. The characteristics and reliability benefits of FO components — receivers, transmitters, transceivers and cable — are useful in wind farms and wind turbines, as well as in overall wind farm and wind park operation.
Wind Turbine Environment
Unlike conventional electricity generating facilities that use coal or natural gas as their energy source, wind turbines operate outdoors and in regions with temperature extremes, corrosive spray (e.g., salt), dust, lightning strikes, snow and rain. In offshore wind park installations, weather may prevent maintenance and repair for extended time periods.
Offshore wind turbines are usually larger than onshore installations and also generate more power because wind speeds are generally stronger and steadier, which means the WTGs must be more powerful. Higher wind speeds, larger mechanical loads, and corrosive elements in sea-based wind parks mean even higher reliability is needed.
Because wind turbines operate in rugged environments, both from a physical and electrical perspective, high levels of noise and EMI (electromagnetic interference) are generated inside the wind turbine nacelle from motors, solenoids, power lines, inverters and generators. Lightning strikes are prevalent in wind farm installations and fiber’s inherent galvanic isolation helps improve system reliability.
The wind turbine itself is a complex assembly of mechanical components, including a tower, gearboxes, brake systems, and blades. It also consists of electronic systems and other components such as motors, frequency inverters, rectifiers, IGBTs, power converters, programmable logic controllers (PLCs), sensors, pitch and yaw systems. And, all of these elements must be interconnected to generate reliable power. Inside the turbine’s nacelle are the power generation electronics, generators, blade pitch control, controllers, motor, Ethernet/PROFINET protocol devices, frequency inverters, circuit breakers, etc.
In comparison, traditional copper data links cannot match the overall capability of a fiber optic-based system when judged on reliability and operation in rugged environments. Likewise, the double-shielded CAT5/CAT6 copper cables required make a copper solution more expensive than a fiber optics solution.
Figure 2 - Real time, noise and EMI-resistant fiber optic communication technology is used for wind turbine power generation, control and communications.
Figure 1 shows a basic fiber optic link, where photons reflected within the core of the fiber cable replace electrons moving in copper cable in the transmission path. By eliminating the conductive copper cable, very high galvanic isolation levels can be achieved.
Fiber data links connect the nacelle’s remote controllers to the turbine’s main controller at the base of the platform and then to the wind park over a redundant fiber data and control communication link (Figure 3).
Fiber optic links are well suited for this short reach environment and are also the best choice for sending data and control signals from an individual wind turbine to the wind farm central monitoring station. They are also the optimal choice within the wind farm because of fiber's high bandwidth, galvanic isolation, long transmission distances, high noise immunity and near-perfect EMI immunity.
Figure 1 - Fiber optic data communication link with inherent voltage isolation.
Power-generation electronics, such as the IGBT/IGCT inverter power switches, are controlled over noise-immune, EMI-resistant fiber optic control paths (Figure 2).
Figure 3 - Fiber optics are used in the turbine nacelle and wind park for real time control.
A wind farm must rely on constant, reliable data flow for peak performance and safety even in installations covering large areas that are subject to local weather variations. Sensors monitor blade operation and system variables, such as vibration, ice, and other outside environmental factors, which can impact power generation and system safety. The data from system sensors can also be fed into the SCADA (Supervisory Control and Data Acquisition) systems for preventative maintenance action.
Communication links must often run alongside power carrying conduits and fiber cables are immune to crosstalk from power cables. As shown in Figure 3, fiber-based communication links inside the nacelle, between wind turbines and the wind farm control station, all benefit from the use of optical fiber.
Plastic Optical Fiber
Providing isolation and reliable communications makes wind farm management and operation safer and more efficient. Many different fiber cable types, such as plastic optical fiber (POF), hard clad silica (HCS®), multimode and singlemode, can be used in WTGs, with plastic being the least expensive. Cost-effective plastic optical fiber and glass optical fiber solutions are used worldwide in existing wind farm installations. Higher reliability and easier maintenance all play a role in the economics of wind power; getting a turbine online quickly and having it running reliably without interruption are critical concerns.
Fiber communication links consist of short-length plastic optical fiber, e.g., 60 m, in individual turbines, or multimode cables coupled to discrete transmitters/receivers or transceivers. In addition to being lightweight, fiber cables are also robust and resistant to harsh environments. Another advantage of POF is its flexibility. For the handling of POF cables, 25 mm is the minimum bend radius (measured to the inside curvature). This is the radius at which the cable can be bent safely during installation without any damage or any shortening of the cable's life. Under minimum tension, the minimum long-term bend radius is 35 mm. These are necessary characteristics for vertical cabling in towers that can be over 200 meters tall and where several obstacles in different parts of the WTG must be bypassed. Fiber optic termination also offers safe and robust connections, whether with POF, HCS or MM fiber. It is very important to make sure that no cable can be pulled out of any equipment by mistake when technicians work on any component of the turbine.
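These handling limits are easy to encode in installation tooling. As a minimal sketch (the constant and function names below are ours, not from any vendor library), a pre-installation check might look like this in Python:

# Encodes the POF handling rules quoted above: 25 mm minimum bend radius
# during installation, 35 mm minimum for long-term installed bends under
# minimum tension. Names are illustrative, not a vendor API.
MIN_INSTALL_BEND_MM = 25.0   # short-term, while pulling the cable in
MIN_LONGTERM_BEND_MM = 35.0  # permanent routing under minimum tension

def pof_bend_ok(radius_mm: float, long_term: bool = True) -> bool:
    """True if a bend radius (to the inside curvature) respects the limits."""
    limit = MIN_LONGTERM_BEND_MM if long_term else MIN_INSTALL_BEND_MM
    return radius_mm >= limit

print(pof_bend_ok(30, long_term=False))  # True: acceptable during installation
print(pof_bend_ok(30, long_term=True))   # False: too tight to leave in place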
For industrial and renewable applications, optical fiber solutions with optimized connections are preferred. Indeed, good retention is required in numerous applications, as vibrations may spoil the optimal coupling of light at either the source-cable or cable-receiver interface. A commonly used solution is a versatile link transmitter (Tx)/receiver (Rx) family. With this solution, a connection is made by inserting the connector into the transmitter and receiver ports. Both the transmitter and receiver have a slot on the top of the housing that enables the use of a latching system (Figure 4). This system allows a retention force of typically 80N, or approximately 10 times more than a connector with no latching system.
Figure 4 – Crimpless connectors HFBR-4531Z (blue) and HFBR-4532Z (black) with latching system.
For industrial applications with very high operating temperatures, latching connectors are the preferred solution, since connector retention decreases with temperature. The storage and operating temperature ranges for the plastic connectors are -40°C to +85°C. Moreover, the retention force remains unchanged even after 2000 insertions of the connector into the transmitter or receiver. The force required to insert the connector varies from 30N to 51N maximum, based on the type of plastic connector used. Destruction typically occurs from 178N.
The connector-cable tensile force is approximately 50N for a solution without a ring, and from 22N to 35N using a crimping ring. A crimpless connector is less expensive not only because it avoids the purchase of tools and reduces the time spent on termination, but also because it reduces yield loss from installation errors.
The connectors, like the POF cable, can be simplex or duplex. The duplex connectors are keyed to prevent incorrect connections to a system of duplex modules. A duplex configuration is easily achieved by snapping together two non-latching simplex connectors, where the upper part of one simplex is inserted into the ferrule of the second. In this way, the POF cables are firmly joined to the connector and cannot move.
From a practical point of view, the connectors can be color coded so that technicians can more easily identify the cable to be connected to a transmitter or a receiver.
Fiber optic components can be selected to match the link length and data rates needed for each specific environment. Low-rate, short-span applications can be served by 650 nm LED-based components, where data rates up to 1 MBd, a span out to 45 meters, and cost-effective plastic fiber cable are suitable. LED-based components have a reach of 2,700 meters over multimode fiber (MM) at a 20 MBd data rate, but can also operate up to 160 MBd over 500 m. Other rates and reaches are served by components such as those shown in Table 1.
Table 1 - Fiber optic component selection for data rate and link distance.
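To make the selection logic concrete, the figures quoted above can be expressed as a simple lookup. This is an illustrative sketch only: the option list reflects just the rates and spans mentioned in the text, not the full vendor tables, and the function name is ours:

# Illustrative component selector based only on the figures quoted above.
# Entries are (max data rate in MBd, max span in m, component family).
OPTIONS = [
    (1,   45,   "650 nm LED over plastic optical fiber (POF)"),
    (160, 500,  "LED-based transceiver over multimode fiber (short span)"),
    (20,  2700, "LED-based transceiver over multimode fiber (long span)"),
]

def pick_component(rate_mbd: float, span_m: float) -> str:
    """Return the first quoted component family that covers the requirement."""
    for max_rate, max_span, family in OPTIONS:
        if rate_mbd <= max_rate and span_m <= max_span:
            return family
    return "consult the vendor tables (e.g. Table 1) for higher-end parts"

print(pick_component(0.5, 40))   # short POF link inside a turbine
print(pick_component(20, 2000))  # multimode run across the wind farm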
Ethernet Communication and WTGs
Many types of equipment in WTGs use Ethernet communications, including the control system, Ethernet-based circuit breakers and the switches used for the networking of the farm. Each control system of each individual wind turbine will be connected to a Local Area Network (LAN), operating with Ethernet, which is also connected to a remote control center for monitoring purposes. Using fiber optics solutions for communication will help the wind farm operate more reliably, especially in the harsh, offshore farm environment. A digital monitoring interface (DMI) and an industrial 10/100 Ethernet POF and HCS transceiver with DMI (Figure 5) can be used.
The DMI allows full real-time monitoring of the FO link through a two-wire serial interface. In addition to monitoring the LED drive current and photodiode current, the interface also monitors the transmitter supply voltage and temperature.
Figure 5 – Industrial 10/100 Ethernet POF and HCS transceiver with DMI.
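As an illustration of how such a DMI might be polled in software, the sketch below reads diagnostic values over a two-wire (I2C/SMBus) bus. The device address, register offsets and scale factors are hypothetical placeholders, not values from the Avago data sheet; a real implementation must use the register map the manufacturer publishes:

# Hypothetical DMI polling sketch over a two-wire (I2C/SMBus) interface.
# The address, register map and scaling below are invented for illustration.
from smbus2 import SMBus

DMI_ADDR = 0x50      # hypothetical 7-bit device address
REG_TEMP = 0x60      # hypothetical register offsets
REG_VCC = 0x62
REG_TX_BIAS = 0x64
REG_RX_POWER = 0x68

def read_word(bus: SMBus, reg: int) -> int:
    """Read a 16-bit big-endian value from two consecutive DMI registers."""
    hi = bus.read_byte_data(DMI_ADDR, reg)
    lo = bus.read_byte_data(DMI_ADDR, reg + 1)
    return (hi << 8) | lo

with SMBus(1) as bus:  # bus number depends on the host hardware
    # Scale factors are placeholders; real devices define per-field scaling.
    print("temperature:", read_word(bus, REG_TEMP) / 256.0, "degC")
    print("supply voltage:", read_word(bus, REG_VCC) * 0.1, "mV")
    print("LED drive current:", read_word(bus, REG_TX_BIAS) * 2.0, "uA")
    print("photodiode power:", read_word(bus, REG_RX_POWER) * 0.1, "uW")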
Mickaël Marie is the Product Marketing Manager for the Industrial Fiber Optics Product Division at Avago Technologies, covering EMEA and Asia Pacific. Mickaël's experience with fiber optics products began in 1999 with Hewlett Packard, where he worked in the technical response center and then as a Field Application Engineer and Business Development Manager. He has been in his current role since November 2006.
A Life Cycle Assessment of Fibre Optic Submarine Cable Systems
Craig Donovan
Submarine cables carry over 97 percent of our transcontinental voice and data traffic. Yet, research reveals that little is known about the potential environmental impacts of a fibre optic submarine cable system from a life cycle perspective. Life Cycle Assessment (LCA) methodology is used to collect specific system data and evaluate the potential environmental impacts of a system by using a holistic approach within a single consistent framework. A cradle-to-grave process is considered, which begins with the extraction of raw materials from the natural environment and ends with the recycling of materials or the return of wastes back to the environment. LCA as a method, however, should not be considered a full environmental assessment as it addresses only those environmental issues specified in the goal and scope. Furthermore, environmental impacts are described as potential impacts as they are not fixed in time and space and are related to the defined functional unit. Therefore, LCA should be considered an analytical tool to be used in conjunction with other evidence in order to support decision making.
The background research to this paper considered 10 common environmental impact categories. This paper focuses solely on climate change, or the Global Warming Potential over a 100-year time horizon (GWP100), which relates to anthropogenic emissions of greenhouse gases (GHGs). The characterization model, developed by the Intergovernmental Panel on Climate Change (IPCC), measures GHG emissions against the reference unit of kilograms of carbon dioxide equivalents (kg CO2e). Emission of methane, for example, is known to cause greater radiative forcing than carbon dioxide and is assigned a factor of 24:1.
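In equation form, this characterization step is simply a weighted sum over the emitted gases. As a minimal sketch of the model described above (written here in LaTeX notation, and using this article's 24:1 methane factor):

\mathrm{CO_2e} = \sum_i m_i \cdot \mathrm{GWP}_{100,i}

where m_i is the mass of gas i emitted and GWP_100,i is its 100-year warming factor. Emitting 1 kg of methane therefore counts as 1 x 24 = 24 kg CO2e, while 1 kg of CO2 counts as exactly 1 kg CO2e by definition.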
GOAL AND SCOPE DEFINITION
The goal of the study was to undertake an LCA of a fibre optic submarine cable system in order to assess the potential environmental impact of sending data over the cable network. To evaluate these impacts, the modelled flows within the system must be related to a quantifiable function of the system, described as the functional unit. In this case the functional unit was chosen as ten thousand gigabit-kilometres (10,000 Gb.km), which is a scalable unit and can be interpreted as, for example, 1.25 Gb of data sent over 8,000 km of submarine cable. The technological system boundary is defined as the limits of the land terminal station and includes all significant components within the terminal, the submarine cable and the submarine repeaters. The temporal boundary is based on a commercial service lifetime of 13 years and the average system capacity of 11 systems installed on or after the year 2000. The geographical boundary is based on a generic system in a global perspective.
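Because the functional unit is scalable, converting any transfer into functional units is a one-line calculation. A minimal sketch in Python (the constant and function names are ours, not the study's):

# Expresses a data transfer in the study's functional unit of
# 10,000 Gb.km (gigabit-kilometres). Illustrative only.
FUNCTIONAL_UNIT_GB_KM = 10_000.0

def functional_units(data_gb: float, distance_km: float) -> float:
    """Number of functional units consumed by sending data_gb over distance_km."""
    return (data_gb * distance_km) / FUNCTIONAL_UNIT_GB_KM

# The example from the text: 1.25 Gb sent over 8,000 km of cable
print(functional_units(1.25, 8000))  # -> 1.0 functional unit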
LIFE CYCLE INVENTORY
Detailed data of the flows within, and crossing, the system boundary are collected during the inventory stage. Using a cradle-to-grave approach, five life cycle phases were identified. Each phase was further divided into cable (including repeaters) and terminal station sub-models, as shown in Figure 1.
Figure 1: Life cycle phases of a submarine cable
Key processes are electricity generation and the production and combustion of marine fuel, and energy sub-models were developed for both. Electricity was modelled using previous LCA studies, with generation averaged for China, Japan, Europe and the US. Production of marine fuel was determined from standard LCA databases using US and EU production figures for Heavy Fuel Oil (HFO). Transportation sub-models included shipping emissions from the combustion of marine fuel at various engine loads, which were determined from previous research in the field, along with road and air transportation from standard LCA databases. Raw material extraction and manufacturing for the cable was based on the quantities of four principal cable types determined from four systems representing 40,000 km of cable, as shown in Table 1.
The raw material extraction and manufacturing for a repeater was based on the weight of a beryllium copper housing and assumptions for the internal components. For the terminal, this was based on previous LCA studies of equivalent components. Vessel operations for typical route survey and installation missions (normalized to 1000km) were based on operational reports and are given in Table 2. Fuel consumption was based on vessel specification sheets, with emissions calculated from engine load and fuel type.
Use and maintenance is divided between electricity used at the terminal and cable ship maintenance. Typical total energy consumption at the terminal, including equipment, climate control and lighting, was calculated from two stations at 191 kW. A total of 127 gigawatt-hours (GWh) of electricity is used over the 13-year lifetime, with 90 percent of this being consumed during the use & maintenance phase.

Maintenance, normalized to 1000 km of cable, was averaged from the operations of four standby vessels and is given in Table 2. An annual fault factor of 0.37 was calculated, which is consistent with other research. A total of 179 ship days per 1000 km of cable was calculated, resulting in the combustion of 1515 tons of fuel. Of this, 54% is consumed during the use & maintenance phase, with 19% consumed during the installation and end-of-life recovery phases. While it is not common to recover decommissioned cable, the end-of-life decommissioning scenario considers that the cable is recovered by cable ship and recycled for the mechanical materials, such as plastic, steel and copper. Recycling of these particular materials is highly efficient, and a "closed-loop" recycling process is modelled, which assumes that 90 percent of the virgin material input is offset by the recycled materials.
Generic system length was based on 24 transoceanic cables, giving an average of 16,200 km with 8.5 terminal stations. System capacity was based on the average of 11 systems installed from 2000 onward, giving an estimated lit capacity of 400 Gbps. Research shows that bandwidth usage is, on average, only 25% of lit capacity. Therefore, assuming bandwidth usage approximates data traffic, a total of 100 Gbps of actual data traffic is estimated.
RESULTS
The results show that a total of seven grams (7 g) of carbon dioxide equivalents (CO2e) is released for every 10,000 Gb.km. This figure can be placed into context by comparing a telepresence meeting utilizing the fibre network to a face-to-face meeting requiring air travel.
Table 1: Average ratio of cable types
Table 2: Typical mission length per 1000 km
Consider a meeting between Stockholm and New York, an approximate distance of 8000 km. A telepresence meeting requiring a bandwidth of 18 Mbps would potentially release 355 g CO2e per hour. Expanding this to a 2-day meeting of 16 hours results in a potential release of 5.7 kg of CO2e. By comparison, this same 2-day meeting in a face-to-face setting would require 16,000 km of air travel per person, resulting in a release of 1920 kg of CO2. This represents a saving of almost 2 tons of CO2 and is equivalent to driving the average passenger car approximately 12,000 km. Table 3 presents the above comparison.
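The arithmetic behind this comparison is easy to check. The sketch below reproduces it from the figures quoted above; the air-travel factor of roughly 120 g CO2 per passenger-kilometre is simply the value implied by 1920 kg over 16,000 km, and the small difference from the cited 355 g/hour reflects rounding in the study's figures:

# Back-of-envelope check of the telepresence vs. air travel comparison.
EMISSION_FACTOR_G = 7.0   # g CO2e per 10,000 Gb.km (the study's result)
ROUTE_KM = 8_000          # Stockholm to New York, approximate cable distance
BITRATE_GBPS = 0.018      # 18 Mbps telepresence stream

gb_per_hour = BITRATE_GBPS * 3600        # 64.8 Gb transferred per hour
gbkm_per_hour = gb_per_hour * ROUTE_KM   # 518,400 Gb.km per hour
telepresence_g = gbkm_per_hour / 10_000 * EMISSION_FACTOR_G
print(round(telepresence_g))             # ~363 g/hour, vs. the 355 g cited

meeting_kg = telepresence_g * 16 / 1000  # 2-day, 16-hour meeting
flight_kg = 0.120 * 16_000               # ~1920 kg CO2 per person round trip
print(round(meeting_kg, 1), round(flight_kg))  # ~5.8 kg vs. 1920 kg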
Graph 1: Climate change potential per 10,000 Gb.km
Table 3: Climate change impact comparison, showing a saving of approximately 2 tons of CO2.
* Represents 8,000 km of terminal-to-terminal submarine cable data transfer only.
It should be noted that this example is for comparison purposes and considers only the impact of sending data via the submarine cable system and not the telepresence system or the terrestrial network as a whole.
Graph 1 presents a more detailed analysis of the LCA results by life cycle phase and reveals that the use & maintenance phase clearly dominates the climate change impact at 64%.
Emissions of carbon dioxide (CO2) to air are the most significant contributor, resulting from the generation of the electricity used at the terminal and the combustion of marine fuel consumed during cable maintenance. Emissions of other GHGs are insignificant by comparison. These two energy resources, therefore, appear to be the key processes influencing the environmental performance of a submarine cable system.
The installation and end-of-life phases present an impact of 13 and 16 percent respectively and are influenced by the combustion of marine fuel during cable ship operations.
Further analysis of the use & maintenance phase shows that the emissions of CO2e are equally shared between electricity use at the terminal (47 percent) and cable ship operations during maintenance (53 percent), as shown in Graph 2.
Graph 2: Primary energy consumption vs. climate change potential per 10,000 Gb.km
However, what is more interesting is the magnitude of the impact per unit of primary energy. Graph 2 shows that the combustion of marine fuel has a far greater impact on climate change, per unit of primary energy, than electricity use, highlighting the disparity between the environmental impact of electricity and that of fossil fuel consumption.
DATA QUALITY ANALYSIS
The limitations of the study affect the final result; therefore, as recommended by the ISO 14040 series guidelines, a sensitivity analysis has been undertaken to estimate the effect of data gaps, assumptions and methodological choices. The submarine repeaters and terminal components are two sub-models affected significantly by data gaps and assumptions. However, by changing parameters within these sub-models, the sensitivity analysis shows that they have little effect on the final result. This indicates that the LCA model is relatively unaffected by the greatest uncertainties and is, therefore, robust. Results of the sensitivity analysis based on data gaps and assumptions are shown in Graph 3 (columns 1 to 5).
Graph 3: Data sensitivity analysis
Methodological choices include the use of database models for the production of electricity and heavy fuel oil (HFO) and for the combustion of HFO. The sensitivity analysis shows that methodological choices affect the final result by no more than 20 percent. The choice to use emissions factors for heavy residual oil (RO) fuels was particularly conservative; however, the analysis shows (Graph 3, column 9) that no great change in the climate change impact is observed between RO and the lighter, cleaner marine distillates (MDs). Results of the sensitivity analysis based on methodological choices are shown in Graph 3 (columns 6 to 11).
DISCUSSION
The function of the system is based on usage, or the actual used bandwidth, as opposed to the design capacity or present technological limitations at the terminal. Research shows that bandwidth usage is approximately 25 percent of current lit capacity. If this gap between usage and lit capacity were reduced, notwithstanding technical and commercial limitations, then a subsequent gain in environmental performance per data unit would be achieved. However, it should be noted that the overall environmental impact over the system lifetime remains unchanged. Similarly, increased total data traffic through a longer service life reduces the resulting impact per unit of data. From a life cycle perspective, the longer a cable remains in service, the better the environmental performance. Used capacity and service life therefore have a significant effect on the results based on the chosen functional unit.
Overall, the use & maintenance phase is the area where the greatest gains could be made, particularly in electricity use at the terminal and the emissions from the cable ships. If the aim is to reduce the overall environmental impact of cable systems further, then cable owners could direct their focus to these two areas. Perhaps electricity produced from renewable resources could be considered, along with other measures to reduce vessel fuel consumption or emissions.
Finally, when interpreting the results, it is also important to remember that an LCA model is a simplification of reality.
CONCLUSIONS
• The use & maintenance phase dominates the potential climate change impact.
• Key processes are electricity use at the terminal station and the combustion of marine fuel during the cable maintenance.
• Maintenance has the greatest impact per unit of primary energy.
• Seven grams CO2e are released per 10,000 Gb.km.
• Increasing the commercial lifetime or data traffic reduces the environmental impact per unit of data.
Craig Donovan is a researcher at the EMF Safety and Sustainability division of Ericsson Research. He has recently completed a thesis on the environmental impact of submarine fiber-optic communication cables. The study reveals that they leave only a minor carbon footprint in relation to the data capacity of a modern system.
Ultra Low Loss and Large Effective Area Fiber for Next Generation Submarine Networks
Raj Mishra, Sergey Ten & Rita Rukosueva
The transition to higher bit rate (40 Gb/s and 100 Gb/s) transmission systems in submarine networks is imminent. At the 2010 SubOptic conference in Yokohama, Japan, it was clear that the industry anticipates that 40 Gb/s submarine terminal equipment will have several advantages, such as increased capacity and reduced cost, power dissipation and floor space per transmitted bit. Major submarine transmission system vendors presented their views of the requirements for 40 and 100 Gb/s implementation, and one theme was consistently present in those presentations: 40 and 100 Gb/s will require a higher optical signal-to-noise ratio (OSNR) compared to existing 10 Gb/s systems. Several technologies have been employed to reduce the required OSNR. Forward Error Correction (FEC) is a mature technology that is moving toward its third generation and promises a 2.8 dB coding gain improvement. Coherent detection together with polarization multiplexing is widely seen as the technology of choice for 100 Gb/s systems. It reduces the required OSNR (by up to 2.7 dB) and may be employed in high end 40 Gb/s transponders, but its cost effectiveness is uncertain. Finally, the distributed Raman amplification that is used in terrestrial and unrepeatered submarine networks to improve OSNR is unlikely to be adopted in repeatered links due to the electric power requirements.
In addition to hardware improvements, another way to improve OSNR is to use optical fiber with lower attenuation and a larger effective area (Aeff). OSNR is proportional to the launched power and inversely proportional to the loss of the optical fiber between two repeaters. Larger Aeff reduces fiber nonlinearity, which enables the launch of higher optical power. Lower fiber attenuation reduces the span loss (the product of fiber length between repeaters and fiber attenuation). A simple figure of merit (FOM) can show how the combined improvement of fiber attenuation and Aeff will increase OSNR. According to this FOM and the example cited in this paper, depending on system configuration, it is conceivable to gain up to 3.5 dB in OSNR. This is comparable to the improvements gained by next generation FEC or coherent detection with polarization multiplexing. In practice all three methods (FEC, coherent detection and Ultra Low Loss large Aeff fiber) could be used to achieve 40 Gb/s and 100 Gb/s transmission over transoceanic distances.
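Although the article does not spell out the FOM it uses, a commonly quoted form compares a candidate fiber against a reference fiber over a repeater span of length L. As an illustration (in LaTeX notation; the reference-fiber values and span length below are our assumptions, chosen only to show the scale of the effect):

\mathrm{FOM}\,[\mathrm{dB}] = 10 \log_{10}\left(\frac{A_{\mathrm{eff}}}{A_{\mathrm{eff,ref}}}\right) + (\alpha_{\mathrm{ref}} - \alpha)\, L

Taking the Vascade EX2000 values given below (112 µm², 0.162 dB/km) against an assumed conventional fiber (80 µm², 0.190 dB/km) over an assumed 70 km span gives 10 log10(112/80) + (0.190 - 0.162) x 70, or roughly 1.5 dB + 2.0 dB = 3.5 dB, consistent with the OSNR gain cited above.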
Corning's submarine optical fiber development has been focused on improving fiber attenuation for quite some time. Corning introduced ultra low attenuation Vascade® EX1000 fiber in 2006. This low attenuation fiber, with typical attenuation of less than 0.170 dB/km, an Aeff of 76 square microns (µm²) and good Raman efficiency, was designed for unrepeatered submarine systems. Corning Vascade EX1000 is a silica core fiber that does not have the germanium doping found in the vast majority of conventional telecommunication fibers, which is known to increase Rayleigh scattering. Even though the design seems simple, the manufacturing process is more complex compared to conventional fibers, which is why Corning is one of only two fiber suppliers to offer this type of product.
Vascade EX1000 fiber was deployed in 2007 by Faroese Telecom in a 400 km unrepeatered link connecting the Faroe Islands to the Shetland Islands in the North Atlantic Ocean. This submarine link supported a capacity of 19 channels at 10 Gb/s on a single fiber without the use of remote optically pumped amplifiers. This ultra-long distance became possible due to the use of advanced Raman amplification over ultra low attenuation optical fiber.
Since the introduction of Vascade EX1000 fiber, Corning has continued to work on improving fiber attributes that enable higher OSNR in submarine systems. The next logical step was to increase Aeff in fibers with ultra low attenuation. In May 2010, Corning introduced Vascade® EX2000 optical fiber, an ultra low loss and large effective area silica core fiber.
Vascade EX2000 optical fiber has a typical effective area of 112 µm² and typical attenuation of 0.162 dB/km at 1550 nm. In repeatered systems, these attribute improvements allow extended system reach at 40 and 100 Gb/s. This advanced fiber could be used in the next-generation dispersion-managed fiber solution, the Vascade® R2000 fiber solution, which combines positive dispersion fiber (Vascade EX2000) and negative dispersion fiber (Vascade® S1000) in a single span. In unrepeatered submarine networks, Vascade EX2000 fiber enables the use of higher launch power, which results in longer system reach. Table 1 summarizes the key attributes of Vascade EX2000 fiber.
Table 1. Summary of key Vascade EX2000 fiber attributes
As the transition to very high data rates in submarine networks continues, submarine system vendors have clearly indicated the need for improved system OSNR. As a result, silica core optical fibers with both ultra low attenuation and large effective area are attractive, as they present opportunities for cost-effective, simplified submarine systems that can support 40 Gb/s and 100 Gb/s.
Dr. Raj Mishra is a Senior Project Engineer with Corning Optical Fiber in Wilmington, NC. His work has included modeling and simulation, design, development and characterization of optical fibers. Dr. Mishra joined Corning in 1999, after completing post-doctoral research in Physics at the Massachusetts Institute of Technology and the Max-Planck Institute, Stuttgart, Germany, and began working on submarine and terrestrial transmission fiber systems. He has B.S., M.S. and PhD degrees in Physics, holds 28 US patents and has co-authored over 15 publications.
Rita Rukosueva is currently the Submarine Products Manager for Corning Incorporated. She has been with the corporation for over 10 years. Prior to her current position, Rukosueva was the Market Development Engineering Manager, where she made numerous contributions to the development of next generation optical fiber products. Rukosueva holds an M.S. degree in Physics from Moscow State University in Russia.
Sergey Ten was born in Rostov-on-Don, Russia and graduated with honors from the physics department of Moscow State University. He earned his Ph.D. from the University of Arizona in 1996 and joined Corning in 1997 as a Senior Scientist, concentrating on the physics of light propagation in optical fibers. Sergey worked for Tyco Submarine Systems Ltd. in 2000-2001 and re-joined Corning Incorporated in 2001 as manager of the transmission test bed group, investigating high data rate transmission in optical fibers with a focus on nonlinear and linear impairments in all types of optical networks. He has authored 45 journal and conference articles and holds 11 patents in the field of optical communications. Currently, he is manager of the New Business and Technology Development Group, concentrating his efforts on the development of new fibers for telecom and non-telecom applications.
Deepwater: Bernstein Research's View
Neil McMahon, senior analyst with Bernstein Research, presented an overview of where the oil and gas industry is with deepwater at the May 26 Finding Petroleum forum in London.
"Recent exploration success in the deepwater has been dominated by national oil companies (NOCs) and smaller exploration and production companies (E&Ps), not the oil majors (integrated oil companies/IOCs)," said Neil McMahon, senior analyst with Bernstein Research, speaking at the May 26 Finding Petroleum London forum.
"For example, in 2009, there were 42 deepwater exploration discoveries from IOCs, 111 from the NOCs and 211 from the E&Ps," he said.
However, “most IOCs have been talking about returning to exploration recently,” he said. “The IOCs are picking up the pace of exploration, or at least talking about this, because they feel they are being left behind.”
“In the late 80s and mid 90s, deepwater exploration was pretty much everything IOCs were doing. They took on the risk; they didn't mind doing things seen as quite technical,” he said.
“Today they're balancing the risk of doing a bit in the deepwater and doing lots and lots of easy unconventional stuff where the geological risk is zero. I think they have lost their risk appetite,” he said. “But this could change in a few years as they are pushed into more and more exploration areas. I think we could see the next renaissance of exploration activity over the next 5 years, where IOCs decide to take on risk again.”
Looking around the world, some of the oil majors are in Ghana, some are in Brazil, and most have gone into Indonesia. There is also new deepwater activity in the Gulf of Mexico and offshore Libya. Surprisingly, the country which has seen the most deepwater discoveries over 2006 to 2009 is Australia.
"There is still no clear consensus on what deepwater actually is," he said. Previously, anything over 200m was considered deepwater. But now most people class something over 1000m as deepwater. “Many people would say 1000m is far too shallow.”
Mr McMahon believes that the growth in deepwater exploration is driven more by the availability of opportunities, rather than a high oil price, as technology became available or license blocks were opened.
For example, there were many new discoveries in the period 1996 to 2003, despite a fairly low oil price at the time.
Looking for different geology
Until 2005-2006, people were only looking for a narrow set of geology types in the deepwater, mainly submarine fans and clastic reservoirs. But then “there was a big change,” he said.
“When you take a step back, it's the guys that have looked for something quite different who have met success in the past few years,” he said. “You could argue the future is stratigraphic traps and presalt carbonate reservoirs. The next stage will get a lot more complex but it does open up a lot of new areas.”
History
The first wave of deepwater exploration was in the late 1980s, with Shell discoveries in the deepwater Gulf of Mexico, with submarine fans. This was followed by a second wave in the late 1990s, with discoveries in the Gulf of Mexico, Angola and Nigeria.
There was actually a drop in the rate of new discoveries from 2002 to 2004, as the oil price was sharply climbing. “I think that's because Integrated Oil Companies stopped exploring,” he said.
“Beyond 2002 it's difficult to name discoveries IOCs have actually made - you can pretty much count them on one hand.”
Then there has been a ‘third wave’ from 2004 to today, which has been driven by Brazil and the smaller E&P companies, he said, exploring in the Gulf of Mexico and new regions of West Africa.
There does also seem to be a trend for discoveries to get smaller and smaller, and deeper and deeper. The flowrates from new discoveries are also decreasing.
“In fields from the late 1990s to early 2000s, the flowrates ranged from 55,000 bopd at Thunderhorse to 15,000 to 20,000 bopd in places like Angola,” he said.
But for the Gulf of Mexico Lower Tertiary (where many of the newer wells are) “you’re lucky to get 10,000 bopd, or at best 15,000 bopd,” he said.
“I think this is going to be incredibly important going forward.” The new wells outside Brazil tend to be drilled in “very technically challenged geology,” he said.
“In Brazil - the flow rates are well above what we expected 2 years ago and what oil companies had expected from carbonate reservoirs,” he said.
Satisfying Wall Street
Oil industry investors on Wall Street and in the City are less and less willing to reward oil majors for making safe investments, such as in Iraq, which is dangerous politically but geologically very safe.
“The market is not rewarding them for going for the non risky stuff,” he said. “A lot of investors are sitting there saying, what is your value added for developing this on behalf of someone else - it really isn't much. There are three simple things which investors are looking for now from oil companies," he said.
The first is big growth in production, which is profitable, and which does not include Iraq (which is considered to have low profitability, due to the technical service agreements);
The second is that they “need to be exploring and have potential for significant added value from exploration acreage for the company”;
And the third is that they need to be leveraged to high oil and gas prices – or in other words, their production costs should be high – so an increase in oil and gas prices should make a big increase in overall profit margin and investor returns.
“Most of the IOCs don't tick any of those 3 boxes,” he said.
West Africa
In West Africa, the growth in deepwater discoveries has shown “fantastic progression” until the past 5 years, he said. Lately, “everything that's been discovered in Angola has been much smaller than what was discovered in the mid 90s,” he said.
Deepwater Nigeria “has seen pretty much a stalling of any decent exploration activity,” he said. “Until we see a clear hydrocarbon law in Nigeria and people feel a bit more comfortable about the situation - the traditional exploration is pretty much dead.”
"There are two very important wells in Sierre Leone and Liberia, being drilled by Anadarko and Tullow," he said. “These follow from what Anadarko would say is a very important test of a hydrocarbon system in Sierra Leone.”
“In Ghana, Hess will be drilling just south of the Jubilee field in a 100 per cent owned block. We will see if the deeper water offshore Ghana can prove up hydrocarbon plays. So expect a lot more from this area in terms of industry news in next 12-18 months.”
Brazil
Brazil has recently “just taken off,” he said. The industry is looking at a new type of deepwater carbonate reservoir, where the sediment was originally deposited in shallow water.
“It’s more of an engineering challenge than geological challenge,” he said. “It’s not understanding where the reservoirs are, but understanding where the fractures are and how to drill and complete production wells.”
There haven't been many new exploration rounds in the Santos and Campos basins recently, so there has been more appraisal drilling (trying to find out more about known reservoirs) rather than exploration drilling.
During the next 2 years, there will probably be a repricing of reserves in the Santos presalt, now that more is known about how productive they are. The Brazilian government has asked independent valuers to work out how much the reserves are worth, and will sell them to Petrobras at that price. Bernstein anticipates that the price will be “at least $7 per barrel,” he said.
This means that companies which already have reserves in the Santos basin, including BG, Galp and Repsol, will be revalued.
“You're going to see a lot of drilling activity in the Campos basin as well,” he said.
French Guiana
“The most interesting new area in South America from a deepwater exploration point of view is probably going to be French Guiana,” he said. “This is an area that hasn't seen much activity at all.”
“There's a Tullow/Shell/Total well going down in the 4th quarter of 2010. Historically this area has been targeted for structural traps - this time it will probably be more stratigraphic traps, echoing what's been going on in West Africa.”
South East Asia
“Indonesia is a place a lot of people have written off because it has come out of OPEC,” he said. “Most people that aren’t involved in the industry think that pretty much Indonesia is completely done because it has been worked on for so long. But there are 2 areas, Pasangkayu and Bone Bay around Sulawesi where exploration has never been done.”
“Marathon has got 2 wells going down here and plans to collect more data in Bone Bay, as well as in another exciting block in West Papua. Hess will be drilling in West Papua too.”
Moving further east, Exxon has had some success in the Philippines, and in the South China Sea there is a gas condensate trend. “CNOOC have a major position here, along with Husky; Anadarko are involved yet again, and BG,” he said. “It will be drilled extensively in 2010 and 2011.”
Libya
In Libya, “there have been 2 dry holes drilled by ExxonMobil in the deep water,” he said. “The only real discovery we have information on is Hess' discovery starting to step out into deepwater.”
“However, from a company point of view, Libya is less relevant than other areas of the world, given the fiscal regime that exists,” he said. “We tend to discount it from a share price movement point of view because it tends to add a limited amount to the valuation of a company because of the tax.”
Gulf of Mexico
The Gulf of Mexico went through a “great wave” in the 1980s. “A lot of exploration kicked off then,” he said. “There were great discoveries in the late 90s - Thunderhorse being the best known.”
Then “it started to drift a bit,” he said. “A lot of the Lower Tertiary discoveries were made in the last few years, then it started to take off again.”
Oil spill
The Macondo oil spill is a “spill of such proportions that it has walked all over the safety record that was in place in the Gulf of Mexico,” he said.
There have been oil spills in the past, but the Macondo oil spill is much bigger. “The total Gulf of Mexico oil spilled in 2005 was surpassed in 3 days by the spill from Macondo.”
"The Gulf of Mexico has a far from perfect safety record. According to data from the US Minerals Management Service (MMS), there have been plenty of fires and explosions offshore since 1996," he said. "We haven't heard about them because in many cases the blow out preventors worked - we didn't have a situation like we have today.”
“[The Macondo disaster] is regarded as a unique event which a lot of attention should be spent on,” he said.
“You could argue that a lot of attention should have been spent on this since 1996 - it’s not as though the industry has been squeaky clean.”
Reduced drilling?
As a result of the Macondo disaster, many people around the world are looking at deepwater drilling much more closely, which will have an impact on the amount of drilling that is allowed.
Already, the consultations which had been planned on oil drilling offshore Virginia for the weeks following the spill have been postponed. “So it looks like that's not going on,” he said. “It looks very unlikely that East Gulf of Mexico lease sale is going ahead.”
There are also areas Shell is trying to drill in the Arctic this year, where there are uncertainties, he said. “We think this whole Arctic area isn't going to see any activity until 2012 and beyond.”
"If all oil and gas deepwater production projects around the world which are planned, but have not yet been started were delayed by a year, there would be a shortage of oil of between half a million and a million barrels of oil per day, versus previous oil supply and demand forecasts," he said. “That gap could be closed by OPEC.”
But “any delays, any regulation, will just drive up the oil price.”
However, “a high oil price could then generate more exploration opportunities going forward,” he said.
Can We Get Enough?
Exploring the Alternatives of Alternative Energy
Stephen Jarvis
Modern society as we know it is living on borrowed time, and every day brings us closer to our very own end of the world. Everyone knows the source of the problem: Power. Our world runs on fossil fuels. Oil, coal and natural gas provide the energy needed to keep traffic moving, airplanes flying, electricity in the home and all of the other machines that make this the twenty-first century. Unfortunately, none of those energy sources are renewable; there is no question that they will eventually run out. Just last year, the world at large used 83.62 million bbl/day of oil, according to the CIA's World Factbook. The US alone used just over 18.5 million of that, thirty percent of which came from offshore oil.
There may be no way to truly predict how much oil is out there, but like all other fossil fuels, the day will come when the mines will dwindle and the reservoirs will dry up. There are also consequences to consider. While there is plenty of debate over the long-term effects of using fossil fuels, it’s hard to deny that their constant use has increased the amount of CO2 in the atmosphere.
None of this is news. Yet, for all the talk of the need for some alternative, a true replacement for fossil fuels has yet to appear. There are some contenders, among them a few that may help to replace gasoline like hydrogen batteries and biomass products, and some that may in time even replace the need for fossil fuels in general like solar, wind, water and geothermal technologies.
Most oil is processed into gasoline. Since the creation of the internal combustion engine, no other source of fuel has been as important. Recently, the idea of a hydrogen battery has been raised as a possible alternative to gasoline. It is currently being explored for specialized situations such as space and medical applications. Nickel-hydrogen batteries have shown real promise because of their long life, high power density and low maintenance. Unfortunately, they cost so much to produce that even when made specifically for vehicles, the cost would far exceed the value of the energy they could provide.
In theory, biomass products are the answer to our gasoline problems. Biomass refers to any renewable energy source created from something biological - most notably, ethanol. Using corn crops almost exclusively, ethanol is used to create a mixed fuel that can take much of the strain off oil. The commonly used mixture, E85, is 85% ethanol and only 15% gasoline. So where does the problem lie? It’s the same problem as the hydrogen battery: The cost is simply prohibitive. To produce the quantities needed to really make a dent in gasoline usage, the process of turning those crops into ethanol would take more energy than it would save in the long run. Perhaps the modest help biofuels are providing, in conjunction with things like hybrid cars, will lighten the load on gasoline use. It would be nice to say that everyone should simply buy an electric car, but even this won’t solve the problem.
Most of the energy used to power households, cities, and even electric cars comes from fossil fuels, primarily natural gas and coal. The issue of replacing oil as a power source for machinery pales
in comparison to the world’s electric bill: 17.93 trillion kWh as of 2007. Only a small fraction of that came from renewable resources. There is hope that within the next few decades this will change and the majority of our power will come from “green” energies. But who can say if this will truly be the case?
There are a few technologies that may fit the bill for freedom from fossil fuels. Namely, these are solar power (photovoltaics), wind turbines, hydroelectric power sources and geothermal energy. There’s been a lot of hope placed in each of these, but they all have the same problem: Land. At the moment, the energy collection process isn’t efficient enough, and each of these technologies would need to be applied on a huge scale to be useful. Photovoltaic panels and wind turbines would need huge tracts of land to provide enough energy to offset more than a small percentage of the world’s energy demand. While hydroelectric complexes are used all over, they can only be used in places like dams where the water's flow has enough strength. Like hydroelectric sources, geothermal is dependent on land. If the energy requirements don’t match a country's supply of geothermal resources, it isn’t a viable option.
How can we reconcile the fact that there simply isn’t enough of the right kind of resources, or the room, to build these renewable energy farms? One of the most popular suggestions is also possibly the most obvious one: Don’t build it on land. In a number of places all over the world, plans have been implemented to create offshore energy farms. So far, wind turbines are the technology of choice for this idea. In late April 2010, Germany completed the first of its offshore wind farms, Alpha Ventus.
Already, Germany has a number of wind turbines dotting its countryside, which provide 6 to 7 percent of its power. This venture is a bit grander in scale. The estimated output of this farm is 220 gigawatt hours of electricity, enough to power 50,000 homes yearly. And this is only the beginning. Der Spiegel magazine has already reported Germany’s approval of further waters for use as wind farms, and companies such as General Electric, Siemens, and E.ON have begun taking interest. The hope is for Germany to reduce its CO2 emissions by 30 percent in a decade, while at the same time phasing out its nuclear energy programs.
Why phase out nuclear energy? As of 2007, Germany’s nuclear reactors produced almost 141 thousand gigawatt hours of the country's energy. It’s a major source of power for the country and has for many years been considered the closest thing so far to a successful alternative to fossil fuels. However, many organizations, such as the World Wildlife Fund (WWF), argue against it. In a statement addressing the issue, the WWF says “Replacing fossil fuel fired power stations with nuclear energy simply replaces one fundamental environmental problem with another. It is clear that nuclear power remains particularly dangerous and difficult to control.” Here they refer to a number of accidents such as Chernobyl, in 1986, and Tokaimura, in 1999. And while safety has improved since then, it can’t be denied that options like wind turbines don’t produce dangerous waste products, the disposal of which has always been tricky business.
Wind turbines may be the first of many coming trends in offshore building projects. Of the 18.6 million bbl/day of oil used by America, 30 percent came from offshore sources, according to the U.S. Energy Information Administration. Of the 3,873 thousand gigawatt hours of electricity that the U.S. used in 2008, 31,917 came from oil. If roughly thirty percent of the oil used comes from offshore drilling, then about 9,575 gigawatt hours of electricity did as well. This is an impressive amount compared to the 220 gigawatt hours produced by an offshore power farm the size of Alpha Ventus. However, consider this: A conservative estimate for the cost of putting a new platform in place is about $2 billion. The cost of Alpha Ventus was roughly $282 million, including the estimated cost of upkeep over 25 years. That means the price of one platform could pay for an offshore wind farm seven times over. If those seven farms were built, they would cover roughly 16% of the electricity coming from offshore drilling. In the right amounts, and with the kind of money that is currently put into oil drilling, these wind farms could put a sizable dent in the nation's energy requirements.
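Because that comparison turns on a few multiplications, here is a minimal sketch in Python that reruns the article's own figures; nothing beyond those figures is assumed.

# Back-of-envelope check of the platform-vs-wind-farm comparison above.
# Every constant comes from the article; the arithmetic is the only addition.
US_OIL_ELECTRICITY_GWH  = 31_917         # US electricity from oil, 2008
OFFSHORE_SHARE          = 0.30           # share of US oil from offshore sources
PLATFORM_COST_USD       = 2_000_000_000  # conservative cost of one new platform
ALPHA_VENTUS_COST_USD   = 282_000_000    # incl. estimated 25-year upkeep
ALPHA_VENTUS_OUTPUT_GWH = 220            # yearly output of Alpha Ventus

offshore_oil_electricity = US_OIL_ELECTRICITY_GWH * OFFSHORE_SHARE   # ~9,575 GWh
farms_per_platform = PLATFORM_COST_USD // ALPHA_VENTUS_COST_USD      # 7 farms
wind_output_gwh = farms_per_platform * ALPHA_VENTUS_OUTPUT_GWH       # 1,540 GWh

print(f"Electricity from offshore oil: {offshore_oil_electricity:,.0f} GWh")
print(f"Wind farms for the price of one platform: {farms_per_platform}")
print(f"Their combined output: {wind_output_gwh:,} GWh "
      f"({wind_output_gwh / offshore_oil_electricity:.0%} of the offshore-oil figure)")

Run as written, the last line works out to roughly 16 percent, which is where the share quoted above comes from.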
But, is wind technology too young to replace nuclear reactors or fossil fuels? Only a few months into operation, Alpha Ventus has had significant problems as two of the twelve turbines broke down due to parts failures, with another four needing to be replaced because the same failures were imminent. The French company Areva, which manufactured the parts and is responsible for half the turbines, didn’t foresee the overheating that occurred. In many ways, Alpha Ventus can be considered a test run for long term application; the technology is still changing.
At the Yokohama Renewable Energy International Exhibition 2010, a new form of
wind turbine was introduced: The Wind Lens. The proposed system is potentially three times more efficient than current turbines, and would be quieter - a common complaint about local wind farms. While this isn’t to say that Alpha Ventus isn’t an admirable and impressive creation, the Wind Lens is just this year’s new technology. Germany’s wind farm has only just been finished and the technology has been practically reinvented. Who’s to say what next year will hold?
Still, that is perhaps the wrong question to ask. In the August 2005 issue of National Geographic, the director of the National Bioenergy Center, Michael Pacheco, was quoted saying “We’re going to need everything we can get from biomass, everything we can get from solar, everything we can get from wind. And still the question is, can we get enough?” Alpha Ventus will do what it can, and if new and better technology is designed, great. Build that too. There is a long way to go before renewable technologies can do more than replace a percentage of fossil fuels, and each year it becomes that much more important that they do. So build it all, and maybe together it will make the difference.
Stephen Jarvis is a freelance writer in the Washington D.C. area. He has published articles and done editorial work with several publications including Submarine Telecoms Forum. Also, he has been a speaker for the Popular Culture Association / American Culture Association National Conference.
The Oil and Gas Industry, Radio Spectrum, and the FCC
Jack Richards & Greg Kunkle
Energy issues have moved to the forefront of our national policy debate, and sufficient radio spectrum necessary to support wireless telecommunications services in the oil and gas industry is a critical part of the story. Unfortunately, to date, the Federal Communications Commission (“FCC”) has failed to ensure that the industry’s spectrum requirements are satisfied. Concerted efforts by industry and government will be required to make sure that these vital needs are met through a new allocation of spectrum exclusively for use by critical infrastructure industries.
Critical Communications
Sophisticated communications systems are used throughout the oil and gas industry to enhance exploration, production, transportation and distribution. Oil and gas companies install, operate and maintain countless communications facilities, including mobile radios in vehicles, portable handheld units, wireless laptops, point-to-point and point-to-multipoint microwave systems for Supervisory Control and Data Acquisition (“SCADA”), IP-based broadband networks and fiber links.
For example, data from thousands of oil and natural gas wells can be collected and transmitted wirelessly to a central automation center for monitoring and control of well pressures, temperature, and rates of flow that are essential to the coordinated and safe operation of a production facility. Wireless communications systems can be used to monitor and control thousands of miles of pipeline. Refineries, too, depend on wireless communications systems for voice, data and other critical functions. Leak detection and
other types of alarm systems also depend on adequate wireless communications infrastructure.
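As an illustration of the telemetry pattern described above, here is a toy sketch of a SCADA-style polling pass in Python. Every well name, threshold and sensor reading in it is hypothetical, and read_sensor() merely simulates data; a real system would poll thousands of points over licensed radio or microwave links using industrial protocols.

import random

WELL_IDS = ["well-001", "well-002", "well-003"]  # hypothetical well identifiers
PRESSURE_LIMIT_PSI = 3_000                       # hypothetical alarm threshold

def read_sensor(well_id: str) -> dict:
    # Stand-in for a radio or microwave SCADA read of one remote wellhead.
    return {
        "well": well_id,
        "pressure_psi": random.uniform(2_500, 3_200),
        "temperature_c": random.uniform(60, 90),
        "flow_bbl_per_day": random.uniform(400, 900),
    }

def poll_once() -> None:
    # One monitoring pass: collect each well's telemetry and flag excursions.
    for well_id in WELL_IDS:
        sample = read_sensor(well_id)
        status = "ALARM" if sample["pressure_psi"] > PRESSURE_LIMIT_PSI else "OK"
        print(f"{status:5} {well_id}: {sample['pressure_psi']:.0f} psi, "
              f"{sample['flow_bbl_per_day']:.0f} bbl/day")

poll_once()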
In addition, like many other industries, oil and gas companies are seeking to extend as far as possible the reach of communications capabilities in support of workers in the field, ensuring access to communications capabilities on offshore platforms, in land-based exploration and production operations, along pipelines and in remote and rural areas often not served by commercial providers.
To gain efficiencies, modern oil and gas companies across the board are in the midst of migrating to IP-enabled technology, much in the same way that today’s electric utilities are moving to smart grids. Among other applications, IP-based video in particular is being used increasingly for site security and to provide more effective remote diagnostic and troubleshooting capability. IP-based technology represents a major advance in the disaster response capability of oil and natural gas companies and promises to increase efficiency and promote safety and environmental protection throughout the industry.
Tangible Benefits
Communications applications in the oil and gas industry are expanding almost exponentially. Every major private-sector oil and natural gas company already has a “digital oil field” initiative in place to take full advantage of new communications technologies.
Advanced communications systems offer direct and tangible benefits in promoting energy independence and efficiency: tighter SCADA functionality, flexibility to relocate
instantly from one control center to another in disaster situations, enhanced cyber security, more efficient remote troubleshooting, and expanded use of video for security and other purposes. Analogous in many ways to utility smart grid applications, these improvements promise to lower exploration, production and transportation costs, decrease energy consumption, improve safety of industry personnel and the public, and strengthen environmental protections.
New communications technologies will advance two key aspects of President Obama’s energy platform: getting more from our existing oil fields, and promoting responsible domestic production. The digital oil field of the future will address these issues, increasing environmental protection, promoting safety of life and property, improving efficiency, and supporting disaster response efforts. Recent reports estimate that digital oil fields could add 125 billion barrels to global reserves by 2013.
The continued operation of reliable and efficient communications systems by oil and gas companies is essential to protecting lives, health and property in day-to-day operations, as well as during emergencies. These systems are integral to the production, refining and delivery of our nation’s energy resources, as well as our overall economic well being.
Pressing Need for More Spectrum
To operate successfully, these wireless communications systems are dependent on the availability of sufficient radio spectrum. Despite the obvious benefits, however, the oil and gas industry has “hit a wall” in obtaining spectrum necessary to implement its plans.
The FCC regulates and allocates the private use of radio spectrum in the public interest. To obtain a radio spectrum allocation, the oil and gas industry must engage in an administrative debate at the FCC along with many others competing for the same limited commodity, including commercial providers serving mass markets such as cellular, Personal Communications Services and Advanced Wireless licensees. Other, yet unborn consumer-driven technologies likewise will be looking to the FCC in the future for new spectrum allocations to support their own promised service offerings. Due to the finite nature of spectrum, however, not all of these requests can be satisfied.
When allocating spectrum, the FCC historically has favored consumer-friendly commercial providers serving mass markets over oil and gas companies and other industrial users providing critical infrastructure services. In fact, the amount of spectrum available for use by oil and gas companies actually has been decreasing over the years as the Commission has reallocated critical industrial radio spectrum for other “more popular” purposes.
For years, rather than allocating additional spectrum to the oil and gas industry, the FCC routinely took the opposite approach and required private radio users – such as oil and gas companies and electric utilities – to vacate spectrum as a means of accommodating the entry of new commercial providers serving mass markets. For example:
• Private microwave operators were required to vacate the 1850-1990 MHz band to accommodate new Personal Communications Services;
• Private microwave operators were required to vacate portions of the 2 GHz band to accommodate new Advanced Wireless Services;
• Private microwave operators were required to vacate the 12.2-12.7 GHz band to accommodate the introduction of Direct Broadcast Satellite Services; and
• Private mobile radio operators in the 800 MHz band were required to relocate to different frequencies in order to reduce interference to public safety systems caused by nearby commercial radio operators, and licensing at 900 MHz was frozen.
As a result, as the oil and gas industry and electric utilities move towards next generation applications to increase efficiency, effectiveness and safety, the harsh reality is that they lack the necessary spectrum tools to do so. Additional spectrum is urgently needed to advance energy independence and efficiency.
Without sufficient licensed spectrum dedicated for their exclusive use, oil and gas companies often must move to higher spectrum in the unlicensed bands (e.g., from 900 MHz to 2.4 GHz to 5.8 GHz) in a quest to find suitable channels. Under the FCC’s rules, however, these unlicensed frequencies receive few if any regulatory protections. Operation on unlicensed frequencies is “secondary”: the user may not cause interference to others and must accept interference caused by others.
In addition, the unlicensed bands themselves are becoming increasingly congested and are saturated in some areas. To put it mildly,
their ability to accommodate mass consumer applications (think WiFi at Starbucks) while providing reliable point-to-multipoint service to critical infrastructure companies is constrained in many areas.
Eventually, unless the FCC adjusts its spectrum policy, alternative spectrum options will be exhausted for critical infrastructure companies. The energy industry needs exclusive access to new licensed spectrum to support current and anticipated applications. Current estimates call for up to 30 MHz of bandwidth. While this amount of spectrum is available in point-to-point microwave allocations and in unlicensed bands, it does not exist for private licensed mobile or point-to-multipoint solutions - exactly the applications that the oil and gas industry needs most desperately for next generation applications.
Hopeful Signs
Within the last year or so, there have been signs that the FCC might finally be paying attention to the pressing spectrum requirements of the oil and gas industry. For example, at the request of the Telecommunications Subcommittee of the American Petroleum Institute (“API”), the FCC reversed itself and allocated 66.5 MHz of Broadband Radio Service (“BRS”) spectrum in the 2.5 GHz band for use on platforms in the Gulf of Mexico. This is the same band that wireless commercial providers are using to roll out “4G” WiMAX services terrestrially.
The Commission noted that “establishing BRS service areas in the Gulf could provide a means for meeting an important communications need in a critical area, as well as enhance emergency communications in the region.” Further, the Commission stated that “licensing BRS
spectrum in the Gulf will encourage service providers to explore and offer new services in the underserved Gulf region.”
Still pending before the FCC is a request by API for access to 112.5 MHz of additional 2.5 GHz spectrum for use by the oil and gas industry in the Gulf of Mexico. Currently, that spectrum is reserved for use by licensees in the Educational Broadband Service (“EBS”). Under the FCC’s rules, EBS licenses are issued only to educational institutions. Since the FCC has quite properly recognized that “there are no schools or universities in the Gulf of Mexico,” API also has urged the Commission to make this otherwise fallow spectrum available for important offshore uses by the oil and natural gas industry.
Just over two years ago, the FCC opened up a large block of spectrum in the 3650-3700 MHz band (“3.65 GHz”) throughout the United States. While technical restrictions apply, the 3.65 GHz band is available for fixed and mobile operations on a nationwide, non-exclusive and non-auctioned basis. The 3.65 GHz band represents a unique opportunity for oil and gas companies and other private wireless users to deploy point-to-multipoint broadband services.
The FCC also recently decided to allow low power devices to operate on certain portions of the television broadcast spectrum. As long as no interference is caused to TV broadcasters, the FCC’s new rules allow wireless devices to operate in the TV broadcast spectrum on a secondary basis at locations where channels are not being used for authorized broadcast services (called “TV White Spaces”). This spectrum, too, may be useful to oil and gas companies.
With respect to the broader energy industry, the FCC’s National Broadband Plan recommends that Congress consider amending the Communications Act to enable utilities to use the proposed public safety 700 MHz wireless broadband network. It also urges the FCC to engage in joint efforts with the National Telecommunications and Information Administration (NTIA) to identify new uses for federal spectrum. The Department of Energy, too, has undertaken a proceeding to review spectrum requirements for smart grid applications, and API has urged it to consider the requirements of the oil and gas industry, as well.
There are no reliable promises when it comes to predicting action by the FCC. But with these types of new spectrum opportunities, coupled with an Administration focused on facilitating energy independence and promoting broadband deployment, there is some indication that the Commission at long last is focusing at least some attention on the spectrum requirements of the oil and gas industry and other critical infrastructure companies. Hopefully, in the not-too-distant future, the FCC will recognize the special requirements of these industries and allocate an additional 30 MHz for their exclusive use.
Jack Richards and Gregory E. Kunkle are telecommunications attorneys with the Washington, D.C. law firm of Keller and Heckman LLP.
Mr. Richards and Mr. Kunkle represent petroleum and natural gas companies, electric utilities and others before the FCC and elsewhere on licensing, regulatory and transactional matters. They also serve as counsel to the Telecommunications Subcommittee of the American Petroleum Institute on telecommunications issues at the FCC.
Mr. Richards also serves as General Counsel of the Energy, Telecommunications and Electrical Association (ENTELEC). Prior to joining the firm in 1986, he served for 10 years as an Attorney at the Federal Communications Commission.
This article reflects Mr. Richards’ and Mr. Kunkle’s personal views and not necessarily those of any of their clients.
Racing Bonnie
James Case
July 22, 2010. My downward trek from Northern Virginia to Houston, Texas had been, up until that point, completely uneventful. I watched the scenery pass by monotonously - rolling hills turned into lush forest and eventually flat marshland. Song after song played on my iPod: Weezer.
Nirvana. Jimmy Buffett. With a carload of computer equipment packed tightly behind me, I practically flew southwards; I was making excellent time. It wasn’t until about 14 hours into the journey that a jarring, yet subtly put, news announcement finally caught and held my attention. Tropical Storm Bonnie was quickly moving into the Gulf of Mexico and a state of emergency had been declared in Louisiana by Governor Bobby Jindal. Authorities were contemplating evacuation.
I weighed my options and glanced at the clock. It was getting close to the early morning hours, but I could push through, hopefully miss the tropical storm and still make it to Houston in time for an afternoon meeting. On the other hand, it was already 2 am and there was a stack of coffee cups
piling up next to me in the passenger seat. I had been planning to stop when I reached the halfway point to Houston and I was approaching that spot. I was exhausted in spite of all the caffeine, and a bed was certainly sounding attractive at this point.
But there really was only one option: I had to make that meeting. I sat up a little straighter, turned the local FM station’s current Lady Gaga track (not unlike my kids screaming in the back seat) up a little louder and the A/C a little higher, and continued on my journey towards a new office and, hopefully, a new industry.
This fast but momentous journey halfway across the country was precipitated by our decision to open a new office in Houston. A few months ago, I was living in northern Virginia and serving as Director of Project Management. By day, I worked in a multitude of positions ranging from Project Manager to Cable Engineer, GIS Specialist or Database Administrator, depending on the project. But
a new role was in store for me, one that would have me driving halfway across the country racing a hurricane. Spurred on by new and rapidly moving business opportunities in Angola, we quickly expanded our presence in Houston. Soon enough, we realized that we needed a full-time office. Roughly 70% of WFN Strategies’ business is in the oil and gas field, and it was looking more and more like this number would only expand. Ultimately, we wanted to live and work near our customers.
To better understand the decision to open a new office in rough economic times, let’s consider a few facts: almost a quarter of the total US employment in the oil and gas industry is located in Harris County (Houston proper) according to the Bureau of Labor Statistics. Houston has long been, and continues to be, the center of oil and gas innovation, even as its own actual resources decline. Offshoretechnology.com reports that “Large national oil companies have continued
to spend despite the economic downturn. Most NOCs [national oil companies] have the necessary financial strength to fund their capital intensive projects even in an adverse financial and economic environment.”
Installing fiber is certainly a capital intensive project. However, as the development of new oil fields becomes increasingly challenging, a good communications backbone turns into a critical necessity. While offshore drilling is inherently risky (greatly underscored by the current situation in the Gulf of Mexico), oil companies have little choice but to explore these more demanding areas. Now, with the advent of new, tighter legislation on advanced subsea installations looming on the horizon, optical fiber becomes even more important. Fiber provides the ability for real time control and data acquisition, including sensor readout,
housekeeping, and calibration. This type of remote control capability relocates critical personnel to offices ashore where they are safer and more productive through multi-tasking which ultimately allows the oil companies to maintain a stricter rein on burgeoning drilling costs while decreasing overall risk.
Until recently, communication methods for offshore rigs were primarily limited to microwave and satellite systems. However, satellite systems are severely constrained in bandwidth capacity, and latency is certainly a problem as well; microwave links are limited to approximately 100 km between platforms, and that’s if multiple platforms exist. The new real-time monitoring requires higher bandwidth capacity and reliability than satellite can provide, and with optical fiber, distance between platforms is a nonissue. Advanced subsea optical fiber solutions allow for unlimited distances between shore stations, like those used for trans-Atlantic subsea cable systems, and support scalable bandwidth of 2.5 Gb/s, 10 Gb/s and even 40 Gb/s. It truly is the future for oil and gas communications.
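A rough latency calculation shows why satellite is the weak link here. The sketch below uses only standard physical constants; the 500 km platform-to-shore distance is an assumed example for illustration, not a figure from this article.

# Back-of-envelope latency: geostationary satellite hop vs. subsea fibre.
C_VACUUM_KM_PER_S = 299_792   # speed of light in vacuum
FIBRE_INDEX       = 1.468     # typical refractive index of silica fibre
GEO_ALTITUDE_KM   = 35_786    # altitude of geostationary orbit

def geo_one_way_ms() -> float:
    # Ground -> satellite -> ground, straight up and down (best case).
    return 2 * GEO_ALTITUDE_KM / C_VACUUM_KM_PER_S * 1_000

def fibre_one_way_ms(distance_km: float) -> float:
    # Light in fibre travels at c divided by the refractive index.
    return distance_km * FIBRE_INDEX / C_VACUUM_KM_PER_S * 1_000

print(f"GEO satellite, one way:   {geo_one_way_ms():.0f} ms")      # ~239 ms
print(f"500 km of fibre, one way: {fibre_one_way_ms(500):.1f} ms") # ~2.4 ms

Even before queuing and processing delays, the satellite hop is roughly two orders of magnitude slower than a fibre run of platform-to-shore length.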
July 23, 2010. I arrived in Houston at 11 am. Luckily for
me, my hotel was fairly empty and I was able to check in almost immediately. I dropped my bags, splashed some water on my face and decided it was too late in the day to bother sleeping. I headed straight to the new office. During my commute, a series of questions kept running through my mind on repeat: did we make the right choice? Was this financial risk going to pay off? Was I really prepared to start a brand new office in a new city by myself? By the time I reached the office building, I had become a little apprehensive. Steeling myself as I rolled box after box of gear into the elevator, I told myself that no gain ever came without risk.
I entered the fluorescently bright conference room. As I began to greet our new potential clients, I glanced at the window and noted the weather was clear. I gave a sigh of relief - while I had no idea what the future held in store, I knew at least one thing: I had beaten Bonnie to Houston.
James Case serves as Director of Project Management for WFN Strategies. He has managed or supported telecom projects in the Gulf of Mexico, Indonesia, Russia, the Caribbean, the North Sea, West Africa, Antarctica and Australia.
New Africa Coast to Europe Submarine Cable Network to Enhance Connectivity on the West Coast of Africa
Leigh Frame
The ACE system will link France to South Africa, spanning 17,000 km, and will add seven new countries to the global communication network.
Answering a primary need: connectivity
Connectivity has the power to help countries, especially those in developing regions such as Africa, to fuel their social and economic development. While a number of submarine cable projects have been launched over the past couple of years, there are still some countries that do not have direct access to international fibre optic links. This is going to change soon.
Until 2009, Africa was the least connected continent, but this situation will change dramatically by 2012, particularly for sub-Saharan countries. A massive increase in bandwidth connecting countries on the West African and East African coasts will result from the completion of several projects. By 2012 the African continent will be fully connected to the Internet and, as a result, the cost of international connectivity is also expected to decrease very rapidly.
$US 700 million investment in African telecommunications infrastructure
The latest enhancement to the connectivity around the African continent is the African Coast to Europe (ACE) submarine network, stretching from France to South Africa and connecting countries all along the west coast of Africa. The consortium, initiated and headed by Orange, is composed of 19 parties who will invest a total of around 700 million US dollars.
Additionally, a dedicated initiative from one of the parties, in collaboration with philanthropic organizations, will establish a broadband capacity endowment to provide capacity grants for educational and healthcare development programs that cannot otherwise afford it.
As well as improving the connectivity to the rest of the world, the enhanced connectivity between African countries will help to integrate the continent by facilitating trade, social, and cultural exchange between countries and communities.
Governments on the African continent recognize the importance of expanding access to the Internet, and ICT is increasingly considered as strategic national infrastructure. As access to on-line services develops, a significant increase in the demand for international bandwidth is expected, to improve the reach and stability of services. With an ultimate design capacity of 5.12 Terabit/s, ACE will provide an infrastructure well adapted to present and future needs for capacity and connectivity to support new applications such as e-education, healthcare and commerce.
Technology Innovation
The ACE consortium has also made a significant technology choice since the system will be 40G equipped from day 1.
Different factors are attracting carriers to 40Gb/s in submarine networks (see the capacity sketch after this list):
• Capacity per fibre pair: The doubling of capacity per fibre pair with 40Gb/s channels versus capacity achievable with 10Gb/s channels significantly increases the value of the investment in the submerged plant.
• Cost of capacity: The cost of a channel at 40Gb/s will tend to be lower than the cost of 4 x 10Gb/s channels when the market ramps up.
• Improved efficiency of connection to the terrestrial backbones and data centres: While 40Gb/s submarine networks will continue to support the widely deployed 10Gb/s backbones via 4 x 10Gb/s concentrators, the interconnection to terrestrial networks will increasingly move to 40Gb/s interfaces to capture improvements in cost and operational efficiency.
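The first bullet is easy to check numerically. In the minimal Python sketch below, only the 5.12 Terabit/s ultimate design capacity comes from the article; the fibre-pair and channel counts are illustrative assumptions chosen to be consistent with it, not published ACE design parameters.

# Sketch of the 40G capacity argument: 4x the line rate on half the
# channel count (wider grid spacing) gives ~2x capacity per fibre pair.
def system_capacity_gbps(fibre_pairs: int, channels_per_pair: int, rate_gbps: int) -> int:
    # Total design capacity = pairs x wavelengths x line rate.
    return fibre_pairs * channels_per_pair * rate_gbps

ten_g   = system_capacity_gbps(fibre_pairs=2, channels_per_pair=128, rate_gbps=10)
forty_g = system_capacity_gbps(fibre_pairs=2, channels_per_pair=64,  rate_gbps=40)

print(f"10G build: {ten_g / 1000:.2f} Tb/s")    # 2.56 Tb/s
print(f"40G build: {forty_g / 1000:.2f} Tb/s")  # 5.12 Tb/s, the quoted ultimate capacity
print(f"Gain per fibre pair: {forty_g / ten_g:.1f}x")  # the doubling cited above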
Partnering with Alcatel-Lucent
The ACE consortium has awarded the design and construction of the ACE network to Alcatel-Lucent Submarine Networks, relying on field-proven submerged plant (cable, repeaters and branching units). In addition, Alcatel-Lucent’s 1620 LM submarine line terminal equipment will be deployed in terminal stations. These will support 10G wavelengths as well as 40G wavelengths using next-generation coherent technology to deal with transmission impairments in a cost-effective and automated manner. As part of the solution, Alcatel-Lucent will also supply its 1678 Metro Core Connect nodes in the terminal stations along the route for access to sub-wavelength services as well as for protection and restoration functions. The terminal stations will also be equipped with power feeding units to supply the electrical requirements of the submerged plant, and an end-to-end management system.
Alcatel-Lucent will use its in-house skills and expertise to manage the complexities of the marine operations. The careful planning and implementation of these operations are critical to the success and long-term reliability of a submarine cable network.
Conclusion
The availability of submarine cable networks has a twofold benefit. Not only do they provide an answer to the basic connectivity needs in countries not yet connected to international backbones, they also stimulate the development of backbone and regional infrastructures to provide bandwidth availability in landlocked countries. The complementary nature of submarine cable communications and other network technologies is exemplified by operators leveraging their investments in submarine cable consortia to enhance terrestrial infrastructures, thereby enabling significantly improved domestic services.
Facts & Figures
• Consortium investment: around $US 700 million
• Alcatel-Lucent contract: over $US 500 million
• 19 operators compose the consortium
• 21 landing points
• System span: 17,000 km with built-in 40G capability
• Ultimate design capacity: 5.12 Terabit/second
• Expected commercial service: in the first half of 2012.
The Consortium
Initiated and headed by France Telecom-Orange, the consortium includes nineteen
parties: Baharicom Development Company, Cable Consortium of Liberia, Companhia Santomense de Telecomunicações, Côte d’Ivoire Telecom, Expresso Telecom Group, France Telecom, Gambia Telecommunications Company, International Mauritania Telecom, Office Congolais des Postes et Télécommunications, Orange Cameroun, Orange Guinée, Orange Mali, Orange Niger, PT Comunicações, Republic of Equatorial Guinea, Republic of Gabon, Sierra Leone Cable Company, Sonatel and Sotelgui.
Leigh Frame is the Chief Operating Officer of Alcatel-Lucent's submarine network activity. Leigh has spent 20 years in the submarine industry in various operational, commercial, business development and project management roles. Prior to that, he worked in defense and computers, having graduated in Economics from Leeds University.
Back Reflection: 160th Anniversary
Stewart Ash & Kaori Shikinaka
Temporary station at Dover. Vessels preparing to set sail.
Courtesy of atlantic-cable.com
As the 28th August was the 160th Anniversary of the birth of our industry, it seemed appropriate, for this issue, to write something about the historic first cable lay. Almost every history of the industry begins with a summary of the Brett brothers’ adventures, and a number of myths and legends have, through repetition, grown up and become accepted wisdom. The most apocryphal of these is the story that the cable failed because a French fisherman dragged it up on his anchor and, thinking it to be some new form of seaweed or sunken treasure (depending on the account), chopped out a section and took it ashore as proof of his find. This may well have happened at some stage, but the actual cause of failure was somewhat more prosaic. Therefore, rather than write yet another version of the story, we thought it
might be more informative to reproduce some articles that were published in the British press at the time.
THE TIMES
Saturday 31st August 1850
“On Wednesday evening (28th August), the possibility of communication between France and England by electric telegraph was practically established….. Some three thousand years ago Homer talked of ‘winged’ words; we doubt if even he imagined they would ever cleave their way through space at such a rate as this. The electric telegraph appears to us more like a miracle than any scientific discovery or mechanical achievement of our time. We scarcely taught ourselves to acquiesce in the idea that instantaneous communication
between two points on solid land was a mere matter of course than it was gravely proposed to drop the communicating line and transmit intelligence along the bottom of the ocean. The jest or scheme of yesterday has become the fact of today. Great excitement prevailed throughout Europe when the first balloon carried up an adventurer into the skies. But there was no comparison between such an achievement and the present. Even the most enthusiastic projector must have entertained certain doubts as to the practical value of their aeronautic expedition. In the case of the submarine electric telegraph, the first and obvious effect of this instantaneous communication between two most civilised and powerful nations of the world will be to unite them
so closely in a community of interest as to secure their co-operation in all designs that may promote the advancement of humanity and maintain the peace of the world.”
THE ILLUSTRATED LONDON NEWS
Saturday 7th September 1850
SUBMARINE ELECTRIC TELEGRAPH BETWEEN DOVER AND CALAIS
“In our Journal of last week we described the accomplishment of the first telegraphic despatch, clearly printed in Roman type, from Dover, and received at the temporary station at Cape Grinez, near Calais, on the evening of Wednesday week, at nine o’clock. We now give a very interesting series of Engravings illustrative of this great scientific triumph, the details of which will be found at page 186 of our last Number. Mr John W. Brett was present at the Dover Station, watching the progressive success of the operations until the final signal of its entire completion was made in a clearly-printed message at Cape Grinez.
The only conjectured difficulty on the route was at a point in mid-channel, called the Ridge, between which and another inequality the Varne, both well known and dreaded by navigators, there is a deep submarine valley, surrounded by shifting sands, the one being seventeen miles and the other twelve miles in length. Here ships encounter danger, lose their anchors, and drift; and trolling
nets of fishermen are frequently lost. The submarine telegraph line was, however, successfully submerged. On nearing Cape Grinez the soundings become very rugged and the coast dangerous; but by steady and cautious manipulation, the Goliath delivered her cargo of wire to be safely connected with the end of the tubing which had been laid at Cape Grinez, and run up the cliff to a temporary station at its summit. This was completed the same evening, and every accommodation was afforded by the officials at the lighthouse, in the use of lanterns and lamps, so that at nine o’clock the same evening (the 28th of August) a message was printed, in legible Roman letters, upon a long strip of paper, by Mr. Jacob Brett’s printing telegraph, in the station on the French coast, in the sight of numerous audiences of the French officials and others, amidst tremendous cheers of all present at the success; and three times three resounded on all sides
for the Queen of Great Britain and Louis Napoleon Bonaparte and the French nation. The line is in rapid course of completion by land from Grinez to Calais.
The originating of telegraphic communication between London and Paris, and the European continent, is due to the enterprise of an Anglo-French Company, en commandité, first established in Paris by Mr. Jacob Brett, who obtained a decree from the French Government under Louis Napoleon Bonaparte, which together with authorizations from the various departments of the English Government, confers on Mr. Brett, for a period of ten years, the exclusive rights of telegraphic communication between Dover and Calais. When the gigantic nature of the undertaking is considered, it cannot create much surprise that, at the time Messrs. Brett first proposed by letter to Sir Robert Peel to carry out this and similar
Goliath, accompanied by H.M. packet Widgeon “paying out” the electric wire
Courtesy of atlantic-cable.com
projects, by submarine and subterranean telegraphs, great doubts existed in the minds of most of our leading men of the possibility of its accomplishment. The great problem is, however, now solved.
“THE GOLIATH” STEAMER “PAYING OUT” THE ELECTRIC WIRE
In a letter from Dover, dated Wednesday the 4th inst., it is stated that the wire so successfully submerged last week has been cut asunder among the rocks at Cape Grinez, where the physical configuration
of the French coast has been found unfavourable for it as a place of holdfast or fixture. All communication between coast and coast has consequently been suspended for the present. The precise point where the break took place is 200 yards out at sea, and just where the twenty miles of electric line that had been streamed out from Dover joins to a leaden tube, designed to protect it from the surge beating against the beach, and which serves the purpose of conveying it up the front of the cliff to the telegraph station on the top. This leaden conductor, it would appear, was of
too soft a texture to resist the oscillation of the sea, and thereby became detached from the coil of gutta percha wire that was thought to have been safely encased in it. The occurrence was, of course, quickly detected by the sudden cessation of the series of communications that have been sustained since the sinking of the electric cable between here and Cape Grinez, though at first it was a perplexing point to discover at what precise spot the wire was broken or at fault. This, however, was done by hauling up the line at intervals, which disclosed the gratifying fact that since the first sinking it had remained in situ at the bottom of the sea, in consequence of the leaden weights or clamps that were strung to it at every 16th of a mile. The operation was accomplished by Messrs. Brett, Reid, Wollaston and Edwards, who attended to the management of the telegraph without intermission, and who are now, with their staff, removing the wire to a point near Calais, where from soundings, it has been ascertained that there are no rocks, and where the contour of the coast is favourable. It is thought that for the present leaden tube a tube of iron must be substituted, the present apparatus being considered too fragile to be permanently answerable. The experiment, as far as it has gone, proves the possibility of the gutta percha wire resisting the action of the salt water, of the fact of its being a perfect waterproof insulator, and that the weights on the wire are sufficient to prevent its being drifted away by the currents, and of sinking it in
The cable being taken up the cliffs at Cap Gris Nez
Courtesy of atlantic-cable.com
the sands. It is also intended to make use of the wire for commercial and newspaper purposes until the connexion of it with the telegraphs of the South Eastern and that now completed on the other side from Calais to Paris is effected. Should the one wire answer, it is intended to run 20 or 30 more, so as to have constant reserves in the event of accident in readiness. This huge reticulation of electric line will represent 400 miles of telegraph submerged in the sea; and as each will be a considerable distance apart, a total water width of six or eight miles in extent.
On Thursday afternoon, his Grace the Duke of Wellington visited the Telegraph. The wires are carried in temporarily at the
terminus of the South-Eastern Railway. In the absence of Mr. Reid, the telegraph engineer, who superintended laying the wire across the Channel, his foreman showed the noble Duke the working of the instruments, and explained to him how the wires were insulated, and the plan adopted for laying them across. At the Duke’s request, he also furnished him with a specimen of the insulated wire. His Grace seemed highly pleased, and would, no doubt, appreciate this wonderful agency that could communicate in a second of time between this country and the Continent. On Tuesday, the Duke again visited the Telegraph, with a party of ladies; but there were no persons in attendance who could explain the operations.”
THE TIMES Monday 14th October 1850
“All operations connected with this cable are now suspended until the spring. The interval will be employed in manufacturing the new wire cables and other apparatus so that the electric line may be completed by May.”
In fact, the new cable was not installed until 19th October 1851, and an expectant public had to wait until 13th November 1851 before the first London to Paris Telegraph went into commercial service. But more on that in the next issue!
Offshore Communications World 28-29 September 2010
Kuala Lumpur, Malaysia www.terrapinn.com/2010/ofc/
Submarine Networks World 2010 12-14 October 2010
Singapore www.terrapinn.com/2010/submarine/
Offshore Communications 2010 2-4 November 2010
Houston, Texas www.offshorecomms.com/
Pacific Telecommunications Council 16-19 January 2011
Honolulu, Hawaii www.ptc.org
Letter to a Friend
by Jean Devos
My friend,
Many people from our submarine cable community welcome the emergence of Huawei as a new supplier and see this step as not only logical, but inevitable. They see this as an opportunity for more competition and, consequently, lower prices. Everyone knows that the obstacles to entering this business are high, but no real doubts exist as to Huawei’s ability to emerge successfully. Nothing is impossible for the Chinese!
Many people feel that this development is very dangerous for the Japanese industry, which could likely be the first to be victimized. But strangely enough, I am hearing more concerns from the western suppliers than from the Japanese.
Is there something more we must understand here, my friend?
China is now the second largest global economy, coming after the United States and before Japan. This is big news for the Chinese; however, China did not publicize it. Nonetheless, it is a real victory for a country that suffered under cruel occupation by the Japanese, who believed firmly in Tokyo’s superiority over all of Asia. China will not soon forget the 12 million dead, the 10,0000 taken prisoner, the Nanking and Shanghai massacres, or the policy of General Okamura: “Kill everyone, burn everything, steal everything.”
The future, however, may not be just a continuation of the past.
From 1949 until 1997, Hong Kong was China’s open window to the rest of the world. Taiwan now plays a similar role as the investment bank and technology research laboratory for the whole of China.
Is it now Japan’s turn? Is Japan China’s new open window? Does China see Japan as a possible extension of its own “garden?”
Let’s not forget that the 130 million Japanese still generate a total national product almost equivalent to that of the 1.3 billion Chinese. Taiwan and Hong Kong pale in comparison. Japan and China’s economies are closing in upon each other rapidly. Japan is now exporting more goods to China than to the US, and China’s contribution to the Japanese economy continues to grow each year. Chinese investors have recently been very active in Japan. This will likely lead to a political understanding, if not a partnership, in the near future.
“Dobu Dochun” - “Same writing, same culture.” This old Sino-Japanese saying is now proving itself to be truer than ever. China and Japan’s long-term interests lie more in cooperation than in conflict. It seems to me that they have started on a new and promising path.
How will that translate into our interests?
Who knows?
Issue Themes:
January: Regional Outlook
March: Finance & Legal
May: Subsea Capacity
July: Subsea Technology
September: Offshore Energy
November: Defense & Non-traditional Cable Systems
Advertising enquiries:
SALES MANAGER
Kristian Nielsen
Tel: +1 (703) 444-0845
Email: knielsen@subtelforum.com
SALES REPRESENTATIVE, EUROPE & AFRICA
Wilhelm Sicking
Tel: 0201-779861
Email: wsicking@subtelforum.com
SALES REPRESENTATIVE, ASIA PACIFIC
Michael Yee
Tel: +65 9616 8080
Email: myee@subtelforum.com
On Issue #52
Hi Kevin
Congratulations on your new baby. I was so pleased to hear that news.
The articles/photos came out excellent. Many thanks for all your support.
The submarine cable industry is rarely in the media spotlight, but that wasn’t the case recently on the PBS program History Detectives. An episode that aired on July 5th featured the story of Art Johns, a pilot with a fascinating piece of submarine cable history tucked away in his home.
Back in 1984, the steel freighter Eldia was grounded on Nauset Beach in East Orleans, Massachusetts after an 80-mph storm. As she ran aground, Eldia’s anchor drew across an old submarine telegraph cable, dragging a huge coil of the stuff onto the beach. The wrecked ship caused quite a spectacle at the time. Over the next month and a half, more than 150,000 spectators flocked to see the ship, which rested upright in the sand. Local businesses sold post cards, t-shirts and other Eldia-themed merchandise while workers made plans to salvage the ship. Meanwhile, the telegraph cable baked in the summer heat.
Several months later, when Eldia was well on her way to the salvage yard, a young man named Art Johns was visiting Nauset Beach with his parents. His mother had arrived a few days earlier and found something unusual on the beach: a rusting
coil of telegraph cable. This was the very cable that Eldia’s anchor had dragged ashore. When Art and his father arrived on the scene, hacksaw in hand, they recognized at once that this cable was old.
Art’s father, a librarian, had some familiarity with early cables, and the family began to speculate that what they were staring at was a section of one of the first submarine telegraph cables. But which one?
The first successful transatlantic telegraph cable was laid in 1866 by the Great Eastern, but several failed cables were also laid between 1857 and 1865. Each one of these was a piece of history, and the bold leaders responsible for these endeavors (mind boggling at the time) were trying to change the world. A message from New York to London that would once have taken more than a week to deliver could now be received in a matter of seconds. Welcome to the start of the communications revolution.
Art’s father cut off a 3ft section of the cable and brought it home. Their vacation continued, and before they knew it, 1984 was gone. The years bled into one another, but the cable was always there, a reminder of a long-ago family outing and a rusty piece of history. When his father passed
away the cable came into Art’s possession, and in 2009 he finally decided to put his family’s speculation about the cable to the test. After watching an episode of History Detectives, a popular television program on PBS, Art noticed their request to “let the History Detectives solve your mystery!” After 25 years, it was time.
According to their website, History Detectives “is devoted to exploring the complexities of historical mysteries, searching out the facts, myths and conundrums that connect local folklore, family legends and interesting objects.”
Art sent in an email, and within 20 minutes he heard back from the show’s producer. After hearing the story and probing deeper into the incident with the Eldia, the History Detectives took the case.
Were they able to solve the mystery? I’m not going to spoil the answer here. To find out, you’re going to have to tune in. Click here to view the episode.
What do you think? Click on the Letter To The Editor icon and drop me a line. I’d love to hear from you.