ZAP! The Story of Electricity


The earliest humans were introduced to electricity by way of lightning strikes, which occur worldwide about 100 times per second. From lightning they got fire, and the control of fire proved to be a turning point in the cultural aspects of human evolution. Serious scientific research into electricity began with Benjamin Franklin’s famous kite-and-string experiment of 1752, in which he proved that lightning is electricity. Other prime movers in the field of electrical research and invention included Michael Faraday, James Clerk Maxwell, Alexander Graham Bell, Thomas Edison, Nikola Tesla, George Westinghouse, and Albert Einstein.

Electricity (along with the internal combustion engine) makes our modern way of life possible. The most ubiquitous electrical/electronic inventions are probably the electric motor, the telephone, the light bulb, and alternating current. Human biochemistry, thought, and action are all controlled by electricity generated by neurons, which transmit information throughout the body in both chemical and electrical forms, and humans can also be affected by extracorporeal electromagnetic radiation. The hazards of ionizing radiation are well known, but there is considerable controversy about the potential dangers of non-ionizing radiation. Most scientific studies indicate that moderate levels of non-ionizing radiation are safe, but a number of cases have been reported in which individuals experience electromagnetic hypersensitivity (EHS), a condition in which they attribute negative health effects to electromagnetic exposure.

DAVID RITCHEY




David Ritchey
Headline Books, Inc.
Terra Alta, WV


Zap! The Story of Electricity by David Ritchey
©2020 David Ritchey
All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage system, without written permission from Headline Books, Inc.
To order additional copies of this book, for book publishing information, or to contact the author:
Headline Books, Inc.
P. O. Box 52
Terra Alta, WV 26764
www.headlinebooks.com
Tel: 304-789-3001
Email: mybook@headlinebooks.com
ISBN: 9781951556099
Library of Congress Control Number: 2020931315

PRINTED IN THE UNITED STATES OF AMERICA


To the staff at Madisonville Hardware and Electric Supply who encouraged and assisted me with my youthful electrical projects.

Acknowledgement

Thanks to my friend and colleague, Ellen Meyer, for her ongoing support and superb editing assistance.


Contents

Chapter 1: Electricity in Nature
Chapter 2: Research and Applications
Chapter 3: Prime Movers
Chapter 4: Today’s World of Electricity
Chapter 5: Sensitivity to Electromagnetic Radiation
Chapter 6: Personal Experiences
Index
Books By David Ritchey
About The Author


Chapter 1

Electricity in Nature

While we tend to think of electricity as a type of energy created by humans, electricity is actually one of the natural components of the Earth’s environment. Today’s humans are able to generate it, control it, and use it, but its existence is also widespread in the natural world. Its most recognizable forms are lightning, static electricity, and electric animals.

Lightning

Lightning has been striking the Earth since before humans first set foot on the planet. At any given instant, approximately 2,000 thunderstorms are roaming the Earth’s atmosphere, and they strike the Earth with lightning flashes 100 times per second. Recently, some scientists have concluded that lightning may have played a part in the evolution of living organisms. Throughout the history of terrestrial life, lightning-caused wildfires have had pronounced evolutionary effects on most ecosystems’ flora and fauna.

From the beginning, lightning has fascinated mankind. As primitive humans sought answers about the natural world, lightning became a part of their superstitions, their myths, and their early religions. In many cultures, lightning has been viewed as part of a deity or a deity in and of itself. It was from lightning that the early humans got fire, and the control of fire proved to be a turning point in the cultural aspect of human evolution. Fire provided a source of warmth, protection, new hunting methods, and a means for cooking food. These cultural advancements allowed for human geographic dispersal, cultural innovations, and changes to diet and behavior. Additionally, creating fire allowed the expansion of human activity to proceed into the dark and colder hours of the evening.

Fire was an important factor in expanding and developing societies of early hominids. One impact fire might have had was social stratification. Those who could make and wield fire had more power than those who could not and may therefore have had a higher position in society. Evidence for the controlled use of fire by Homo erectus, beginning some 1,000,000 years ago, has wide scholarly support. Evidence of widespread control of fire by anatomically modern humans dates to approximately 125,000 years ago. The ability to make fire at will, generally with a friction device in which hardwood rubs against softwood (as in a bow drill), was a late development.

Early Humans and Lightning Strike

As proven by Benjamin Franklin in his famous 1752–1753 kite-and-key experiments, lightning is a naturally occurring electrostatic discharge during which two electrically charged regions in the atmosphere or ground temporarily equalize themselves, causing the instantaneous release of as much as 1 billion joules (roughly 500 pounds of TNT) of energy. Lightning is often followed by thunder, an audible sound caused by the shock wave that develops as gases in the vicinity of the discharge experience a sudden increase in pressure.†

The three main kinds of lightning are distinguished by where they occur—either inside a single thundercloud (intracloud or IC), between two different clouds (cloud-to-cloud or CC), or between a cloud and the ground (cloud-to-ground or CG). Many other observational variants are recognized, including heat lightning, which can be seen from a great distance but not heard; dry lightning, which can cause forest fires; and ball lightning, which is rarely observed scientifically.

At its peak, a typical thunderstorm produces three or more strikes to the Earth per minute. Lightning primarily occurs when warm air is mixed with colder air masses, resulting in the atmospheric disturbances necessary for polarizing the atmosphere. However, it can also occur during dust storms, forest fires, tornadoes, and volcanic eruptions, and even in the cold of winter, where the lightning is known as thundersnow.

Simultaneous Lightning Strikes

Lightning is not distributed evenly around the Earth. Worldwide, the frequency of lightning strikes is approximately 44 (± 5) times per second, or nearly 1.4 billion flashes per year. The average duration of a strike is 0.2 seconds, made up of a number of much shorter flashes (strokes) of around 60 to 70 microseconds.

† Light travels at about 984,000,000 feet per second, and sound travels through air at about 1,125 feet per second. An observer can approximate the distance to the strike by timing the interval between the visible lightning and the audible thunder it generates. A flash preceding thunder by five seconds would indicate a distance of approximately one mile.
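As a rough worked check of that rule of thumb (using the footnote’s round figures, and ignoring the negligible travel time of the light itself):

\[ 5\ \text{s} \times 1{,}125\ \tfrac{\text{ft}}{\text{s}} = 5{,}625\ \text{ft} \approx 1.07\ \text{mi} \qquad (1\ \text{mi} = 5{,}280\ \text{ft}), \]

so a five-second gap corresponds to just over one mile, and each additional five seconds adds roughly another mile.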



About 70% of lightning occurs over land in the tropics, where atmospheric convection is the greatest. The place on Earth where lightning occurs most often is near the small village of Kifuka in the mountains of the eastern Democratic Republic of the Congo. On average, this region receives 460 lightning strikes per square mile per year.

Humans or animals struck by lightning may suffer severe injury due to internal organ and nervous system damage, but about 90 percent of people struck by lightning survive. Worldwide, about 240,000 people are injured and about 6,000 are killed each year by lightning strikes. Over the last 20 years, the United States has averaged 51 lightning-strike fatalities per year, placing lightning in second position, just behind floods, among deadly weather.

The saying “Lightning never strikes twice (in the same place)” is nothing but an old wives’ tale. It can and it does. Illustratively, on the evening of June 30, 2014, Chicago’s three tallest skyscrapers were struck by lightning 17 times: 10 strikes to the Sears Tower, 8 to the Trump Tower, and 4 to the John Hancock Center. Lightning tends to strike the highest and most pointed object around because it is an electrical current being attracted to the easiest path to ground. A strike to any location does nothing to change the electrical activity in the storm above, which will produce another strike as soon as it “recharges.” The previously hit location is then just as likely to be hit on the next discharge as any other spot.

Lightning Strike – Eiffel Tower


The Empire State Building gets struck by lightning about 100 times a year.

Static Electricity

Lightning is a big spark ... static electricity on a giant scale. Ancient cultures around the Mediterranean knew that certain objects, such as rods of amber, could be rubbed with cat’s fur to attract light objects like feathers. Around 600 BC, some observers of static electricity believed that friction rendered amber magnetic, in contrast to minerals such as magnetite, which needed no rubbing. While that suggestion was incorrect, later science would eventually prove the existence of a link between magnetism and electricity.

Static electricity is a stationary electric charge typically produced by friction, in contrast to current electricity, which flows through wires or other conductors and transmits energy. A static electric charge can be created whenever two surfaces contact and separate and at least one of the surfaces has a high resistance to electric current. The electric charge remains until it is able to move away by means of an electrical discharge. The effects of static electricity are familiar to most people because they can feel, hear, and even see the spark as the excess charge is neutralized when brought close to a large electrical conductor.

There are four different ways in which the charge separation in static electricity can be induced:

• Contact-induced charge separation – This is what causes one’s hair to stand up and produces “static cling.”



• Pressure-induced charge separation – Also known as the “Piezoelectric Effect,” this is caused by mechanical stress generating a separation of charge in various types of crystals and ceramics. The effect is reciprocal: when a piezoelectric material is subjected to an electric field, a small change in its physical dimensions takes place.

• Heat-induced charge separation – Heating generates a separation of charge in the atoms or molecules of certain materials. All pyroelectric materials are also piezoelectric; the atomic or molecular properties underlying the heat and pressure responses are closely related.

• Charge-induced charge separation – A charged object brought close to an electrically neutral object causes a separation of charge within the neutral object.

In the second half of the 18th century, machines for creating and storing static electricity were invented. The Leyden jar was like a thermos bottle that stored a high-voltage electric charge (from an external source) between electrical conductors on the inside and outside of a glass jar. Electricity could then be carried around and demonstrated. “Electric magic” was in great demand at the royal courts of Europe as entertainment. The parlor tricks amused and fascinated people.

Leyden Jar

Electric Animals

The world is full of unusual creatures, including animals that can produce electricity. Long before any knowledge of electricity existed, people were aware of shocks from electric fish.


Ancient Egyptian texts dating from 2750 BCE referred to these fish as the “Thunderer of the Nile” and described them as the “protectors” of all other fish. Electric fish were again reported millennia later by ancient Greek and Roman naturalists and physicians, who knew that such shocks could travel along conducting objects. Electrogenic marine animals protect themselves and detect or stun their prey via high voltages generated from modified muscle cells called electrocytes. Among these animals are the:

• Electric Catfish – Many species of electric catfish are capable of generating electric shocks of up to 350 volts. They are found in the tropical regions of Africa and in the Nile River. Ancient Egyptians utilized the electric shock–generating ability of smaller sizes of these fish to treat arthritis pain, but not the larger ones, as they can cause very painful shocks.

• Electric Rays – There are about 69 species of these rays. They can produce electric discharges ranging from 8 to 220 volts. The electric rays have been known for their electrogenic properties since antiquity, when the ancient Greeks used the electricity generated by them to treat gout and headaches and to numb pain during surgeries and childbirth. Electric rays have been associated with mystical powers due to their ability to stun people without touching them.

• Black Ghost Knifefish – This fish inhabits the freshwater habitats of tropical areas in South America. It possesses an electric organ and electro-receptors, which are distributed all over its body, and is thus among the few animals that can both produce and sense electricity.

• Electric Eel – This animal lives in the freshwater river basins in South America. Despite its name, the electric eel is not a true eel but a knifefish. The fish can generate both high- and low-voltage electricity. The electric current that it generates will definitely produce a painful and numbing shock to humans exposed to it.

• Northern Stargazer – This fish can be found on the eastern shores of the United States between New York and North Carolina at depths of up to 120 feet. Its mouth faces upwards, which allows it to ambush prey while it remains well camouflaged in the sandy bottoms of the coastal waters.


Chapter 2

Research and Applications

Serious scientific research in the field of electricity can be said to have begun with Benjamin Franklin’s famous kite-and-string experiment of 1752, in which he proved that lightning is electricity. In 1791, Luigi Galvani published his discovery of bioelectromagnetics, demonstrating that electricity was the medium by which neurons passed signals to the muscles. In 1800, Alessandro Volta’s battery, or voltaic pile, which was made from alternating layers of zinc and copper, provided scientists with a more reliable source of electrical energy than the electrostatic machines previously used. Hans Christian Ørsted and André-Marie Ampère are credited with recognizing the existence of electromagnetism, the unity of electric and magnetic phenomena, in 1819–1820. Michael Faraday introduced the concept of the electric motor in 1821, and Georg Ohm mathematically analyzed the electrical circuit in 1827. Electricity and magnetism (and light) were definitively linked by James Clerk Maxwell in 1861 and 1862. Heinrich Hertz first conclusively proved the existence of the electromagnetic waves predicted by Maxwell’s equations of electromagnetism, and, in 1887, also helped to establish the existence of the photoelectric effect. In 1905, Albert Einstein published a paper explaining experimental data from the photoelectric effect as the result of light energy being carried in discrete quantized packets, energizing electrons.


This discovery led to the quantum revolution.

Thanks to the efforts of these early researchers, we now know that electricity is a form of energy resulting from the existence of charged particles (such as electrons or protons), either statically as an accumulation of charge or dynamically as a current. Electric power, like mechanical power, is the rate at which work is done, i.e., the rate at which electric energy is transferred by an electric circuit, and it is measured in watts.

The presence of charge gives rise to an electrostatic force, with charges exerting a force on each other. The charge on electrons and protons is opposite in sign, so an amount of charge may be expressed as being either negative or positive. By convention, the charge carried by electrons is deemed negative, and that by protons, positive.

The movement of electric charge is known as an electric current, the intensity of which is usually measured in amperes. Current can consist of any moving charged particles; most commonly, these are negatively charged electrons, but any charge in motion constitutes a current. While the particles themselves generally move quite slowly, the electric field that drives them propagates at close to the speed of light, enabling electrical signals to pass rapidly along wires. The direction of an electric current is, by convention, the direction in which a positive charge would move. Thus, in a battery, the current in the external circuit is directed away from the positive terminal and toward the negative terminal. Electrons would actually move through the wires in the opposite direction. Knowing that the actual charge carriers in wires are negatively charged electrons may make this convention seem a bit odd and outdated, but it is nonetheless the convention that is used worldwide.

The ability of chemical reactions to produce electricity, and, conversely, the ability of electricity to drive chemical reactions, has a wide array of uses. Electrochemistry has always been an important part of electricity.
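For reference, the basic quantities introduced above are related by the standard textbook definitions (not specific to this book): a current I is the rate at which charge Q flows past a point, and power P is the rate at which work W is done, which for a circuit element equals the voltage V across it times the current through it:

\[ I = \frac{\Delta Q}{\Delta t}\ \ (\text{amperes} = \text{coulombs per second}), \qquad P = \frac{\Delta W}{\Delta t} = V I\ \ (\text{watts}). \]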


From the initial invention of the voltaic pile, electrochemical cells have evolved into the many different types of batteries, electroplating, and electrolysis cells. Aluminum is produced in vast quantities this way, and many portable devices are electrically powered using rechargeable cells.

An electric field is created by a charged body in the space that surrounds it and results in a force exerted on any other charges placed within the field. The theory of the electric field was introduced by Michael Faraday and led to his development of the concept of the electric motor in 1821. The electric field acts between two charges in a manner similar to the way the gravitational field acts between two masses, and like it, extends towards infinity and shows an inverse-square relationship with distance. However, there is an important difference. Gravity always attracts, drawing two masses together, while an electric field can result in either attraction or repulsion. In comparison with the much weaker gravitational force, the electromagnetic force pushing two electrons apart is about 10⁴² times as strong as the gravitational attraction pulling them together.

An electric current, which is the movement of electric charge, produces a magnetic field. An electric charge gives rise to and interacts with the electromagnetic force, one of the four fundamental forces of nature (the strong nuclear force, the weak nuclear force, the gravitational force, and the electromagnetic force). The magnitude of the electromagnetic force between two charges, whether attractive or repulsive, is proportional to the product of the charges and has an inverse-square relation to the distance between them. The electromagnetic force is very strong, second only in strength to the strong nuclear force, but unlike that force, it operates over all distances.

In the early days of research, electricity was considered to be unrelated to magnetism. With time, many experimental results and the development of Maxwell’s equations showed that both electricity and magnetism arise from a single phenomenon: electromagnetism. The force of an electromagnetic field depends on the direction of the current—if the flow is reversed, then the force is reversed, too.
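To make that comparison concrete, here is a worked figure using standard physical constants (not taken from this book): for two electrons a distance r apart, Coulomb’s law and Newton’s law of gravitation give

\[ \frac{F_{\text{electric}}}{F_{\text{gravity}}} = \frac{e^{2}/(4\pi\varepsilon_{0}r^{2})}{G\,m_{e}^{2}/r^{2}} = \frac{e^{2}}{4\pi\varepsilon_{0}\,G\,m_{e}^{2}} \approx 4 \times 10^{42}, \]

independent of the distance r, since both forces fall off as the inverse square.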


Moreover, the effect is reciprocal: a current exerts a force on a magnet, and a magnetic field exerts a force on a current. Such a phenomenon has the properties of a wave, and is naturally referred to as “an electromagnetic wave.” Electromagnetic waves were analyzed theoretically by James Clerk Maxwell in 1864. Maxwell developed a set of equations that could unambiguously describe the interrelationship between electric fields, magnetic fields, electric charges, and electric currents. He could, moreover, prove that such a wave would necessarily travel at the speed of light, and thus light itself was a form of electromagnetic radiation. Maxwell’s equations, which unify light, fields, and charge, are one of the great milestones of theoretical physics.

While the early 19th century had seen rapid progress in electrical science, the late 19th century would see the greatest progress in electrical engineering. Through such people as Michael Faraday, Alexander Graham Bell, Thomas Edison, Nikola Tesla, and George Westinghouse, electricity turned from a scientific curiosity into an essential tool for modern life. The rapid expansion in electrical technology at that time transformed industry and society in such a way that electricity is today central to the ability of almost all modern societies to function. Electricity’s extraordinary versatility means it can be put to an almost limitless set of applications, including transportation, heating, lighting, communications, and computation.

With the advent of the Electronic Age (sometime between the late 1950s and the late 1970s), we began to transition from mechanical and analog technology to digital electronics, and just about anything technological that we now encounter is electrical or electronic. There are literally thousands of such devices everywhere, but the most ubiquitous of all are probably the electric motor, the telephone, the light bulb, and alternating current (not actually a device, but rather an energy delivery system that makes the others possible).


• Electric Motor: Michael Faraday’s main discoveries include the principles underlying electromagnetic induction, diamagnetism, and electrolysis. He established the basis for the concept of the electromagnetic field in physics. His inventions of electromagnetic rotary devices, notably the electric dynamo in 1831, formed the foundation of electric motor technology, and it was largely due to his efforts that electricity became practical for use in technology.

• Telephone: Alexander Graham Bell is credited with inventing and patenting the first practical telephone in 1876. The Bell Telephone Company was created in 1877, and, by 1886, more than 150,000 people in the US owned telephones. Bell Company engineers made numerous improvements to the telephone, which emerged as one of the most successful products ever. Bell also co-founded the American Telephone and Telegraph Company (AT&T) in 1885.




• Light Bulb: Thomas Edison was a prolific inventor, holding 1,093 US patents in his name. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures. His best-known invention is the electric light bulb, which was first successfully tested for 13.5 hours on October 22, 1879. He continued to improve this design and, on January 27, 1880, was granted a US patent for an electric lamp using a carbon filament. Edison’s talents as a businessman eventually led him to found 14 companies, including General Electric.

• Alternating Current: Nikola Tesla is best known for his contributions to the design of the modern alternating current (AC) electricity supply system. In 1887, Tesla developed an induction motor that ran on alternating current. On April 15, 1895, the first Niagara generator, which bore Tesla’s name and patent numbers, was tested, and it went into operation on August 25, 1895.


Tesla’s AC Induction Motor



Chapter 3

Prime Movers

Scores of people played significant roles in ultimately bringing about the Electronic Age. Some made their contribution in the field of electrical science (research), and some made their contribution in the field of electrical engineering (invention). A number of these individuals have already been mentioned, but there are a few who stand out as “prime movers.” Among them are Benjamin Franklin, Michael Faraday, James Clerk Maxwell, Alexander Graham Bell, Thomas Edison, Nikola Tesla, George Westinghouse, and Albert Einstein.

Benjamin Franklin was born January 17, 1706, in Boston, Massachusetts. He attended Boston Latin School for two years, but did not graduate due to lack of money. Nevertheless, he continued to educate himself through voracious reading. Franklin, an American polymath and one of the Founding Fathers of the United States, was a leading author, printer, political theorist, politician, Freemason, postmaster, scientist, inventor, humorist, civic activist, statesman, and diplomat.


As a scientist, he was a major figure in the American Enlightenment and the history of physics because of his discoveries and theories regarding electricity. He was a prodigious inventor, and among his many creations can be counted the lightning rod, glass harmonica, Franklin stove, bifocal glasses, and flexible urinary catheter. However, he never patented his inventions, believing that such things should be shared freely with others.

Franklin started exploring the phenomenon of electricity in 1746. He was the first to label the two types of electric charge as positive and negative, and he was the first to discover the principle of conservation of charge. A well-known legend holds that Franklin, on June 15, 1752, flew a kite in a thunderstorm using a wet (for conductivity) string to which a key was attached. The experiment’s purpose was to uncover the unknown facts about the nature of lightning and electricity, and, with further experiments on the ground, to demonstrate that lightning and electricity were the result of the same phenomenon.

Franklin’s Kite and Key Experiment

The ambient electricity from the storm was carried down the string, produced sparks from the key, and Franklin was able to collect the electric charge in a Leyden jar. Presumably, the kite was never directly struck by lightning, because he then would have likely been killed. After his successful demonstration, Franklin continued his work with electricity, going on to perfect his lightning rod invention.


Franklin suffered from obesity throughout his middle-aged and later years, which resulted in multiple health problems, particularly gout, which worsened as he aged. Benjamin Franklin died from a pleuritic attack at age 84, on April 17, 1790, at his home in Philadelphia.

“The doorstep to the temple of wisdom is a knowledge of our own ignorance.” —Benjamin Franklin

Michael Faraday was born September 22, 1791, in what is now a suburb of London. Due to family financial circumstances, he received little formal education, but he educated himself by reading prodigiously. He also developed an interest in science, especially in electricity. Faraday was one of the most influential scientists in history, and is best known for his work regarding electricity and magnetism. His first recorded experiment was the construction of a voltaic pile with which he decomposed sulfate of magnesia. It was through his research on the magnetic field around a conductor carrying a direct current that Faraday established the basis for the concept of the electromagnetic field in physics. He also established that magnetism could affect rays of light and that there was an underlying relationship between the two phenomena. He similarly discovered the principles of electromagnetic induction and diamagnetism and the laws of electrolysis.


These experiments and inventions formed the foundation of modern electromagnetic technology. He discovered electromagnetic induction, and demonstrated that a changing magnetic field produces an electric field. This relationship was modeled mathematically by James Clerk Maxwell as Faraday’s Law, which subsequently became one of the four Maxwell equations, which have in turn evolved into the generalization known today as “field theory.”

In 1831, Faraday would use the principles he had discovered to construct the electric dynamo, an electrical generator that creates direct current using a commutator. Dynamos were the first electrical generators capable of delivering power for industry, and are the foundation upon which many other later electric-power conversion devices were based, including the electric motor. His inventions of electromagnetic rotary devices formed the foundation of electric motor technology, and it was largely due to his efforts that electricity became practical for use in technology.

Faraday’s Dynamo

Michael Faraday died at age 75 on August 25, 1867, in his house at Hampton Court.

“There’s nothing quite as frightening as someone who knows they are right.” —Michael Faraday



James Clerk Maxwell was born on June 13, 1831, in Edinburgh, Scotland, to a family of comfortable means. He was educated first by his mother, then by his father, then by a tutor, after which he was sent to the prestigious Edinburgh Academy. Maxwell left the Academy in 1847 at age 16 and began attending classes at the University of Edinburgh. He did not find his classes at the University demanding, and was therefore able to immerse himself in private study during free time there. In October 1850, already an accomplished mathematician, Maxwell left Scotland for the University of Cambridge. In 1854, he graduated with a degree in mathematics. In November 1856, he left Cambridge, having accepted the Chair of Natural Philosophy at Marischal College, Aberdeen. The 25-year-old Maxwell was a good 15 years younger than any other professor at Marischal. In 1860, he was granted the Chair of Natural Philosophy at King’s College, London. In 1871, he returned to Cambridge to become the first Cavendish Professor of Physics.

Maxwell was a top-flight mathematician and also unquestionably an academician through and through. He studied the works of other scientists and found ways to consolidate them, extend their reach, and blend them into a comprehensible unity. His most notable achievement was to formulate the classical Theory of Electromagnetic Radiation, bringing together for the first time electricity, magnetism, and light as different manifestations of the same phenomenon. He demonstrated that electric and magnetic fields travel through space as waves moving at the speed of light, and proposed that light is an undulation in the same medium that is the cause of electric and magnetic phenomena.


In 1855, he presented a simplified model of how electricity and magnetism are related. He reduced all of the then-current knowledge into a linked set of differential equations with 20 equations in 20 variables. In 1881, Oliver Heaviside reduced the complexity of Maxwell’s theory down to four differential equations, known now collectively as “Maxwell’s Equations.”

Maxwell’s Equations
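The four equations referred to above, written in their standard modern differential (SI) form rather than as reproduced in this book’s figure, are:

\[ \nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_{0}}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_{0}\mathbf{J} + \mu_{0}\varepsilon_{0}\frac{\partial \mathbf{E}}{\partial t}. \]

Together they admit wave solutions traveling at speed \( c = 1/\sqrt{\mu_{0}\varepsilon_{0}} \), which is how Maxwell identified light as an electromagnetic wave.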

His equations for electromagnetism have been called the “second great unification in physics,” after the first one realized by Isaac Newton in the 17th century. Maxwell’s discoveries helped usher in the era of modern physics, laying the foundation for such fields as special relativity and quantum mechanics. Many physicists regard Maxwell as the 19th-century scientist who had the greatest influence on 20th-century physics. His contributions to science are considered by many to be of the same magnitude as those of Isaac Newton and Albert Einstein. When it was suggested to Einstein in 1922 that he had done great things because he stood on Newton’s shoulders, Einstein said that was not the case, that he stood on Maxwell’s shoulders instead. James Clerk Maxwell died of abdominal cancer at age 48, on November 5, 1879, in Cambridge.

“Thoroughly conscious ignorance is the prelude to every real advance in science.” —James Clerk Maxwell


Alexander Graham Bell was born in Edinburgh, Scotland, on March 3, 1847. He received his initial schooling at home from his father. At an early age, he was enrolled at the Royal High School, Edinburgh, Scotland, which he left at the age of 15, having completed only the first four forms. At the age of 16, he secured a position as a “pupil-teacher” of elocution and music at Weston House Academy in Elgin, Moray, Scotland. Although he was enrolled as a student in Latin and Greek, he instructed classes himself in return for board and £10 per session. The following year, he attended the University of Edinburgh. In 1865, when the Bell family moved to London, Alexander returned to Weston House as an assistant master and, in his spare hours, conducted experiments on sound using a minimum of laboratory equipment. With aspirations to obtain a degree at University College London, he considered his next years as preparation for the degree examinations, devoting his spare time at his family’s residence to studying. In 1868, he completed his matriculation exams and was accepted for admission.

In 1870, 23-year-old Bell travelled with his family to Paris, Ontario, where his parents purchased a farm near Brantford, Ontario. In October 1872, Alexander Bell opened his “School of Vocal Physiology and Mechanics of Speech” in Boston, which attracted a large number of deaf pupils, and his first class numbered 30 students. He also became professor of Vocal Physiology and Elocution at the Boston University School of Oratory. During this period, he moved back and forth between Boston and Brantford, spending summers in his Canadian home.



An inventor, scientist, and engineer, Bell had by 1874 brought his initial work on the harmonic telegraph to a formative stage, with considerable progress having been made. While working that summer in Brantford, Bell experimented with a “phonautograph,” a pen-like machine that could draw shapes of sound waves on smoked glass by tracing their vibrations. In 1874, telegraph message traffic was rapidly expanding, and, with financial support from two investors, Bell hired Thomas Watson as his assistant. The two of them then began experimenting with acoustic telegraphy. On June 2, 1875, Watson accidentally plucked one of the reeds and Bell, at the receiving end of the wire, heard the overtones of the reed—overtones that would be necessary for transmitting speech. That demonstrated to Bell that only one reed or armature was necessary, not multiple reeds. This led to the “gallows” sound-powered telephone, which could transmit indistinct, voice-like sounds, but not clear speech.

In 1875, Bell developed an “acoustic telegraph” and received a US patent for it on March 7, 1876. Bell returned to Boston the same day and the next day resumed work. On March 10, 1876, three days after his patent was issued, Bell succeeded in getting his telephone to work. When Bell spoke the sentence “Mr. Watson—Come here—I want to see you” into the transmitter, Watson, listening at the receiving end in an adjoining room, heard the words clearly.

First Telephone

On August 3, 1876, from the telegraph office in Mount Pleasant, five miles away from Brantford, Bell sent a message to the family household along an improvised wire, and guests there distinctly heard people in Brantford reading and singing. These experiments clearly proved that the telephone could work over extended distances.


In 1876, Bell and his partners offered to sell the patent outright to Western Union for $100,000. The president of Western Union balked, countering that the telephone was nothing but a toy. Two years later, he told colleagues that if he could get the patent for $25 million he would consider it a bargain. By then, the Bell Telephone Company, which was created in 1877, was no longer interested. Bell Company engineers made numerous improvements to the telephone, which emerged as one of the most successful products ever. In 1879, the Bell company acquired from Western Union Thomas Edison’s patents for the carbon microphone. This made the telephone practical for longer distances, and it was no longer necessary to shout to be heard at the receiving telephone. Bell also co-founded the American Telephone and Telegraph Company (AT&T) in 1885. By 1886, more than 150,000 people in the US owned telephones.

The range of Bell’s inventive genius is represented only in part by the 18 patents granted in his name alone and the 12 he shared with collaborators. These included 14 for the telephone and telegraph, four for the photophone, one for the phonograph, five for aerial vehicles, four for “hydroairplanes,” and two for selenium cells. The bel (B) and the smaller decibel (dB) are units of measurement of sound pressure level developed at Bell Labs and named after him. In 1936, the US Patent Office declared Bell first on its list of the country’s greatest inventors.

Alexander Graham Bell died of complications arising from diabetes at age 75, on August 2, 1922, at his private estate in Cape Breton, Nova Scotia.

“The achievement of one goal should be the starting point of another.” —Alexander Graham Bell


Thomas Alva Edison was born on February 11, 1847, in Milan, Ohio, and grew up in Port Huron, Michigan. Edison attended school for only a few months, and was instead taught by his mother. Much of his education came from reading R. G. Parker’s School of Natural Philosophy and from enrolling in chemistry courses at The Cooper Union for the Advancement of Science and Art.

Edison was an inventor and businessman who has been described as America’s greatest inventor. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures. These inventions—which include the phonograph, the motion picture camera, and the long-lasting, practical electric light bulb—have had a widespread impact on the modern industrialized world. He was an exceptionally prolific inventor, holding 1,093 US patents in his name, as well as patents in other countries. Also, as an entrepreneur, he founded 14 companies.

The invention that first gained him wide notice was the phonograph in 1877. His first phonograph recorded on tinfoil around a grooved cylinder. Despite its limited sound quality and the fact that the recordings could be played only a few times, the phonograph made Edison a celebrity.

In 1876, Edison began work to improve the microphone (then called the “transmitter”) for telephones by developing a carbon microphone, which consisted of two metal plates separated by granules of carbon that would change resistance with the pressure of sound waves. Edison used the carbon microphone concept in 1877 to create an improved telephone for Western Union. In 1886, Edison found a way to improve a Bell Telephone microphone that used loose-contact ground carbon after discovering that it worked far better when the carbon was roasted. This type was put into use in 1890 and was used in all telephones, along with the Bell receiver, until the 1980s.


In 1878, Edison began working on a system of electrical illumination, something he hoped could compete with gas and oil-based lighting. He began by tackling the problem of creating a long-lasting incandescent lamp, something that would be needed for indoor use. After many experiments, first with carbon filaments and then with platinum and other metals, he returned to a carbon filament. In 1878, with several financiers, he formed the Edison Electric Light Company in New York City. The first successful test of his light bulb was on October 22, 1879; it lasted 13.5 hours. Edison continued to improve this design and, on January 27, 1880, was granted a US patent for an electric lamp using a carbon filament. It was not until several months after the patent was granted that Edison and his team discovered a carbonized bamboo filament that could last over 1,200 hours.

Edison’s Early Light Bulbs

Having developed a commercially viable light bulb, he then went on to develop an electric “utility” to compete with the existing gas light utilities. On December 17, 1880, he founded the Edison Illuminating Company, and during the 1880s he patented a system for electricity distribution. On September 4, 1882, he switched on his New York City generating station’s electrical power distribution system, which provided 110 volts direct current to 59 customers in lower Manhattan.


As Edison expanded his direct current power delivery system, he was subject to stiff competition from companies installing AC systems. From the early 1880s, AC arc lighting systems for streets and large spaces had been an expanding business in the US. With the development of transformers in Europe and by Westinghouse Electric in the US in 1885–1886, it became possible to transmit AC long distances over thinner and cheaper wires, and to “step down” the voltage at the destination for distribution to users. This allowed AC to be used in street lighting and in lighting for small business and domestic customers, the market Edison’s patented low-voltage DC incandescent lamp system had been designed to supply.

One of the primary drawbacks of Edison’s system was that it was suitable only for the high density of customers found in large cities. His DC plants could not deliver electricity to customers more than one mile from the plant, and they left a patchwork of unsupplied customers between plants. Small cities and rural areas could not afford an Edison-style system at all, leaving a large part of the market without electrical service, and AC companies expanded into this gap.

Edison expressed views that AC was unworkable and that the high voltages used were dangerous. He took advantage of the public perception of AC as dangerous and became embroiled in a propaganda campaign involving the use of AC for the public electrocution of animals, while supporting legislation to control and severely limit AC installations and voltages (to the point of making it an ineffective power-delivery system) in what was being referred to as a “War of Currents.” The development of the electric chair was used in an attempt to portray AC as having a greater lethal potential than DC, and Edison joined with Westinghouse’s chief AC rival, the Thomson-Houston Electric Company, to smear Westinghouse by ensuring that the first electric chair for use on humans was powered by a Westinghouse AC generator.


By the end of 1887, Edison Electric was losing market share to Westinghouse and the Thomson-Houston Electric Company. Thomas Edison’s staunch anti-AC tactics were also not sitting well with his own stockholders. By the early 1890s, Edison’s company was generating much smaller profits than its AC rivals, and the competition came to an end in 1892, with Edison being forced out of control of his own company. That year, the financier J. P. Morgan engineered a merger of Edison General Electric with Thomson-Houston that put the board of Thomson-Houston in charge of the new company, called General Electric. General Electric now controlled three-quarters of the US electrical business and would compete head-to-head with Westinghouse for the AC market.

Thomas Edison died of complications of diabetes at age 84, on October 18, 1931, in his home in West Orange, New Jersey.

“Our greatest weakness lies in giving up. The most certain way to succeed is always to try just one more time.” —Thomas A. Edison

Nikola Tesla was born on July 10, 1856, in the Austrian Empire. There, he received an advanced education in engineering and physics in the 1870s. Handicapped by a gambling addiction, he was unprepared for his final examinations and asked for an extension, which was denied, and he never graduated from university. In June 1884, Tesla emigrated to the United States, where he became a naturalized citizen in 1891.


An inventor, electrical engineer, mechanical engineer, and futurist, he is best known for his contributions to the design of the modern AC electricity supply system. In the United States, Tesla began working almost immediately at the Edison Machine Works on Manhattan’s Lower East Side, struggling with the task of building a large electric utility in that city. He quit after six months and soon began working on patenting an arc lighting system. He worked for the rest of the year obtaining the patents, which included an improved DC generator, and building and installing the system in Rahway, New Jersey. With the help of partners to finance and market an arc lighting manufacturing and utility company, Tesla Electric Light & Manufacturing, he set up laboratories and companies in New York to develop a range of electrical and mechanical devices. His investors showed little interest in Tesla’s ideas for new types of AC motors and electrical transmission equipment. After the utility was up and running in 1886, they decided that the manufacturing side of the business was too competitive and opted to simply run an electric utility. They formed a new utility company, abandoning Tesla’s company and leaving the inventor penniless. Tesla even lost control of the patents he had generated, since he had assigned them to the company in exchange for stock.

In late 1886, Tesla connected with two other potential investors, and, based on his new ideas for electrical equipment, including a thermo-magnetic motor idea, they agreed to back him financially and handle his patents. The three formed the Tesla Electric Company in April 1887, and set up a laboratory for Tesla in Manhattan, where he worked on improving and developing new types of electric motors, generators, and other devices.

In 1887, Tesla developed an induction motor that ran on alternating current (AC), a power system format that was rapidly expanding in Europe and the United States because of its advantages in long-distance, high-voltage transmission. The motor used polyphase current, which generated a rotating magnetic field to turn the motor.


This innovative electric motor, patented in May 1888, was a simple self-starting design that did not need a commutator, thus avoiding sparking and the high maintenance of constantly servicing and replacing mechanical brushes. Tesla demonstrated his AC motor on May 16, 1888, at the American Institute of Electrical Engineers, and engineers working for the Westinghouse Electric & Manufacturing Company reported to George Westinghouse that Tesla had a viable AC motor and related power system—something Westinghouse needed for the AC system he was already marketing. Westinghouse looked into getting a patent on a similar commutator-less, rotating, magnetic field–based induction motor developed in 1885, but decided that Tesla’s patent would probably control the market. In July 1888, Tesla negotiated a licensing deal with Westinghouse for his polyphase induction motor and transformer designs, and Westinghouse also hired Tesla for one year to be a consultant at Westinghouse’s Pittsburgh labs.

Tesla’s demonstration of his induction motor and Westinghouse’s subsequent licensing of the patent, both in 1888, came at a time of extreme competition between electric companies, and Westinghouse would not have the cash or engineering resources to develop Tesla’s motor and the related polyphase system right away. Ultimately, Tesla’s AC induction motor and related polyphase AC patents earned him a considerable amount of money and became the cornerstone of the polyphase system, which Westinghouse would eventually market. Six years later, Westinghouse would purchase Tesla’s patent as part of a patent-sharing agreement signed with General Electric (a company created from the 1892 merger of Edison and Thomson-Houston).

In 1893, Westinghouse Electric won the bid to light the Chicago World’s Columbian Exposition with alternating current. The company had a large space in a building devoted to electrical exhibits, and asked Tesla to participate. Tesla showed a series of electrical effects related to alternating current as well as his wireless lighting system.


He also demonstrated a steam-powered reciprocating electricity generator that he had patented that year. It did away with the complicated parts of a steam engine/generator, but it never caught on as a feasible engineering solution for generating electricity.

Also in 1893, Edward Dean Adams, the head of the Niagara Falls Cataract Construction Company, sought Tesla’s opinion on what system would be best to transmit power generated at the falls. Tesla advised Adams that a two-phase system would be the most reliable, and that there was currently a Westinghouse system for lighting incandescent bulbs using two-phase alternating current. The company awarded a contract to Westinghouse Electric to build a two-phase AC generating system at Niagara Falls, based on Tesla’s advice and Westinghouse’s demonstration at the Columbian Exposition.

In 1895, Adams, impressed with what he had seen of Tesla’s work and ideas, agreed to help found the Nikola Tesla Company, which was set up to fund, develop, and market a variety of previous Tesla patents and inventions as well as new ones. The company found few investors, as the mid-1890s was a difficult time financially, and the wireless lighting and oscillator patents it was set up to market never panned out. The company would, however, continue to handle Tesla’s patents for decades to come.

Throughout the 1890s, Tesla spent a great deal of his time and fortune experimenting with transmitting power by inductive and capacitive coupling using high AC voltages generated with his Tesla coil. He also attempted to develop a wireless lighting system based on near-field inductive and capacitive coupling. He saw this as not only a way to transmit large amounts of power around the world but also a way to transmit worldwide communications.

In 1898, during an electrical exhibition at Madison Square Garden, Tesla demonstrated to the public a boat that used a coherer-based radio control. He tried to sell his idea to the US military as a type of radio-controlled torpedo, but they showed little interest.


Remote radio control remained a novelty until World War I and afterward, when a number of countries used it in military programs.

To further study the conductive nature of low-pressure air, Tesla set up an experimental station at high altitude in Colorado Springs during 1899. There he could safely operate much larger coils than in the cramped confines of his New York lab, and an associate had made an arrangement for the El Paso Power Company to supply alternating current free of charge. To fund his experiments, Tesla convinced John Jacob Astor IV to invest $100,000 (approximately $3 million in today’s dollars) and become a majority shareholder in the Nikola Tesla Company. Astor thought he was primarily investing in the new wireless lighting system. Instead, Tesla used the money to fund his Colorado Springs experiments, and told reporters that he planned to conduct wireless telegraphy experiments, transmitting signals from Pikes Peak to Paris. While in Colorado Springs, he conducted experiments with a large coil operating in the megavolt range, producing artificial lightning (and thunder) consisting of millions of volts and discharges of up to 135 feet in length. At one point, he inadvertently burned out the generator in El Paso, causing a power outage.

Tesla’s Artificial Lightning

In March 1901, Tesla obtained $150,000 (approximately $4.5 million in today’s dollars) from J. Pierpont Morgan in return for a 51% share of any generated wireless patents, and began planning the Wardenclyffe Tower facility to be built in Shoreham, New York. By July 1901, Tesla had expanded his plans to build a more powerful transmitter so as to leap ahead of Marconi’s radio-based system. Tesla approached Morgan to ask for more money, but Morgan refused.


In December 1901, Marconi successfully transmitted the letter “S” from England to Newfoundland, defeating Tesla in the race to be first to complete such a transmission. A month after Marconi’s success, Tesla tried to get Morgan to back an even larger plan to transmit messages and power by controlling “vibrations throughout the globe,” and Morgan again refused. Tesla continued the project for another nine months into 1902, and the tower was erected to its full height of 187 feet. In June 1902, Tesla moved his lab operations from New York City to Wardenclyffe. Investors on Wall Street were putting their money into Marconi’s system, and some in the press began turning against Tesla’s project, claiming it was a hoax. The project came to a halt in 1905, and Tesla mortgaged the Wardenclyffe property to cover his debts. He lost the property in foreclosure in 1915, and, in 1917, the tower was demolished by the new owner to make the land a more viable real estate asset.

In 1906, Tesla demonstrated a 200-horsepower (150 kilowatts) 16,000-rpm bladeless turbine. During 1910–1911, at the Waterside Power Station in New York, several of his bladeless turbine engines were tested at 100–5,000 horsepower. He then spent most of his time trying to perfect the Tesla turbine, but due to engineering difficulties, it was never made into a practical device.

After Wardenclyffe closed, Tesla moved his operation back to Manhattan and continued trying to raise further funds by developing and marketing his patents. Most of his patents had expired, and he was having trouble with the new inventions he was trying to develop. He continued experimenting with a series of inventions in the 1910s and 1920s with varying degrees of success. Over the course of his lifetime, Tesla obtained around 300 patents worldwide for his inventions. Some of his patents are not accounted for, and various sources have discovered some that have lain hidden in patent archives.



Many inventions developed by Tesla were never put under patent protection. Having spent most of his money, Tesla moved out of the Waldorf Astoria in New York in 1922 after running up a large bill that he left unpaid. From then on, he followed a pattern of moving to a new hotel every few years, leaving behind unpaid bills. In 1934, he moved to the Hotel New Yorker, and the Westinghouse Company began paying him $125 per month as well as paying his rent, expenses the company would cover for the rest of his life. Nikola Tesla died of coronary thrombosis at age 86, on January 7, 1943, alone in Room 3327 of the New Yorker Hotel.

“Most persons are so absorbed in the contemplation of the outside world that they are wholly oblivious to what is passing on within themselves.” —Nikola Tesla

George Westinghouse Jr. was born on October 6, 1846, in Central Bridge, New York. He was a Pennsylvania-based entrepreneur and engineer who invented the railway air brake and was a pioneer of the electrical industry. After his military discharge in August 1865, he enrolled at Union College, Schenectady, New York, but lost interest in the curriculum and dropped out during his first term. He was 19 years old when he was granted the first of his more than 300 patents, for a rotary steam engine.

In 1869, at age 22, Westinghouse invented a railroad braking system using compressed air. It was granted a patent on October 28, 1873. The Westinghouse Air Brake Company was subsequently organized to manufacture and sell his invention, which was eventually adopted nearly universally by railways. Westinghouse also pursued many improvements in railway signals (which then used oil lamps). In 1881, he founded the Union Switch and Signal Company to manufacture his signaling and switching inventions.


Westinghouse’s interests in gas distribution and telephone switching led him to become interested in the then-new field of electrical power distribution in the early 1880s. In 1884, he started developing his own direct-current domestic lighting system. Then, in 1885, he became aware of the new European alternating-current systems. Alternating current had the ability to be “stepped up” in voltage by a transformer for distribution over long distances and then “stepped down” by a transformer for consumer use. This allowed large centralized power plants to supply electricity from a distance to locales with dispersed populations. This was an advantage over the DC systems being marketed by Thomas Edison’s electric utility, which had a limited range due to the low voltages used.

Westinghouse saw AC’s potential to achieve greater economies of scale as a way to build a truly competitive system, instead of simply building another barely competitive DC lighting system using patents just different enough to get around the Edison patents. He put all his resources into developing and marketing AC, a move that put his business in direct competition with the Edison DC system. Electric lighting was a growing business, with many companies building outdoor DC and AC arc lighting–based street lighting systems, and Thomas Edison launching the first DC electric utility designed to light homes and businesses with his patented incandescent bulb.

In 1886, Westinghouse provided the backing for the installation of the first multiple-voltage AC power system. Built in Great Barrington, Massachusetts, it was a demonstration lighting system driven by a hydroelectric generator that produced 500 volts of AC, stepped down to 100 volts for lighting incandescent bulbs in homes and businesses.
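A brief aside on why the “stepping up” mattered (a standard engineering argument, not spelled out in this book): for a given power delivered, P = VI, so raising the transmission voltage V lowers the current I, and the power wasted as heat in a line of resistance R is

\[ P_{\text{loss}} = I^{2}R = \left(\frac{P}{V}\right)^{2} R, \]

meaning that transmitting at ten times the voltage cuts the resistive loss by a factor of roughly one hundred. That is what made long, relatively thin AC lines economical.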


formed the Westinghouse Electric & Manufacturing Company, which, in 1889, he renamed the Westinghouse Electric Corporation. During the period of intense competition with Edison in the late 1880s, Westinghouse continued to pour funds and engineering resources into the goal of building a completely integrated AC system, obtaining the Sawyer–Man lamp by buying Consolidated Electric Light, developing components such as an induction meter, and obtaining the rights to Nikola Tesla’s brushless AC induction motor, along with patents for a new type of electric power distribution, polyphase alternating current. The acquisition of a feasible AC motor gave Westinghouse a key patent for his system, but the financial strain of buying up patents and hiring the engineers needed to build it meant development of Tesla’s motor had to be put on hold for a while. In 1891, Westinghouse built a hydroelectric AC power plant, the Ames Hydroelectric Generating Plant in Colorado, which supplied power to the Gold King Mine 3.5 miles away. This was the first successful demonstration of long-distance transmission of industrial-grade AC power and used two 100-horsepower Westinghouse alternators, one working as a generator producing 3000-volt, 133-Hertz, single-phase AC, and the other used as an AC motor. In 1893, Westinghouse Electric won the bid to light the 1893 World’s Columbian Exposition in Chicago with alternating current. This World’s Fair devoted a building to electrical exhibits. It was a key event in the history of AC power, as Westinghouse demonstrated the safety, reliability, and efficiency of a fully integrated alternating current system to the American public. Westinghouse’s demonstration, at the Columbian Exposition, that his company could build a complete AC system was instrumental in its winning the contract to build a two-phase AC-generating system, the Adams Power Plant, at Niagara Falls in 1895. At the same time, a contract to build the three-phase AC distribution system the


project also needed was awarded to General Electric. The early to mid-1890s saw General Electric, backed by financier J. P. Morgan, involved in costly takeover attempts and patent battles with Westinghouse Electric. The competition was so costly that a patent-sharing agreement was signed between the two companies in 1896. On April 15, 1895, the first Niagara generator, which bore the name and patent numbers of Nikola Tesla, was tested. It ran at full speed, 250 revolutions per minute, and proved quite satisfactory. This Niagara Falls, N.Y., power plant went into operation on August 25, 1895.

Niagara Falls Hydroelectric Generating Plant

With AC networks expanding, Westinghouse directed his attention toward electrical power production. At the outset, the available generating sources were hydroturbines where falling water was available, and reciprocating steam engines where it was not. Westinghouse felt that reciprocating steam engines were clumsy and inefficient, and wanted to develop some class of rotating engine that would be more elegant and efficient. One of his first inventions had been a rotary steam engine, but it had proven to be impractical. In 1884, the British engineer Charles Algernon Parsons began experimenting with steam turbines, beginning with a 10-horsepower (7.5 kW) turbine. Westinghouse bought rights to the Parsons turbine in 1885, improved the technology, and increased its scale. In 41


1898, Westinghouse demonstrated a 300-kilowatt turbine unit, replacing reciprocating engines in his air-brake factory. The next year, he installed a 1.5-megawatt, 1,200 rpm unit for the Hartford Electric Light Company. Westinghouse remained a captain of American industry until 1907, when the 1907 Banker’s Panic led to his resignation from control of the Westinghouse company. By 1911, he was no longer active in business, and his health was in decline. George Westinghouse died of cardiovascular disease at age 67 on March 12, 1914, in New York City. “If someday they say of me that in my work I have contributed something to the welfare and happiness of my fellow man, I shall be satisfied.” —George Westinghouse Albert Einstein was born in Ulm, in the German Empire, on March 14, 1879. As a 12-year-old, he mastered geometry and started teaching himself calculus, which he mastered by age 14. In 1895, at the age of 16, Einstein took the entrance examinations for the Swiss Federal Polytechnic in Zürich. He failed to reach the required standard in the general part of the examination, but obtained exceptional grades in physics and mathematics, so he attended the Argovian cantonal school in Aarau, Switzerland, in 1895 and 1896, to complete his secondary schooling, and graduated in September 1896. At 17, he enrolled in the four-year mathematics and physics teaching diploma program at the Zürich Polytech42


nic. In 1900, Einstein passed the exams in math and physics and was awarded the Federal Polytechnic teaching diploma. In 1905, he was awarded a PhD by the University of Zurich. After graduating from Zurich Polytechnic in 1900, Einstein spent almost two frustrating years searching for a teaching post, and finally took a job in Bern at the patent office, as an assistant examiner – level III. He worked there from 1902 to 1909. During 1905, which has been called Einstein’s annus mirabilis (miracle year), he published four groundbreaking papers, on the photoelectric effect, Brownian motion (of particles), special relativity, and the equivalence of mass and energy. These four works contributed substantially to the foundation of modern physics and changed scientists’ views on space, time, and matter. In his Theory of Special Relativity, one of the two pillars of modern physics (alongside quantum mechanics), Einstein determined that the laws of physics are the same for all non-accelerating observers, and he showed that the speed of light within a vacuum is the same no matter the speed at which an observer travels. As a result, he determined that space and time were interwoven into a single continuum known as spacetime. Events that occur at the same time for one observer could occur at different times for another. He predicted that, when measured in the frame of a relatively moving observer, a clock carried by a moving body would appear to slow down, and the body itself would contract in its direction of motion. Observationally, the effects of these changes are most apparent at high speeds (close to the speed of light). In his paper on mass–energy equivalence, Einstein produced the formula E = mc² as a consequence of his special relativity equations. He is best known for



this formula, which has been dubbed “the world’s most famous equation.” Special relativity is limited to objects that are moving with respect to† inertial frames of reference—i.e., in a state of uniform motion with respect to one another. It ceases to apply if the objects are accelerating, decelerating, or curving. But gravity, which is always present, can, and does, cause these effects. Einstein spent the next 10 years trying to include acceleration in his theory and, in 1915, published his Theory of General Relativity, which takes gravity into account. According to general relativity, the observed gravitational attraction between masses results from the warping of space and time by those masses, and is experienced as gravity. He predicted the existence of gravitational waves that ripple in the curvature of space-time and propagate as waves, traveling outward from the source, transporting energy as gravitational radiation. In 1917, he applied the General Theory of Relativity to model the structure of the universe. General relativity has developed into an essential tool in modern astrophysics. It provides the foundation for the current understanding of black holes, regions of space where gravitational attraction is so strong that not even light can escape. †


“Energy equals mass times the speed of light squared.” On the most basic level, the equation says that energy and mass (matter) are interchangeable; they are different forms of the same thing. Under the right conditions, energy can become mass, and vice versa. When any piece of matter is converted to pure energy, the resulting energy is by definition moving at the speed of light. Pure energy is electromagnetic radiation—whether light or X-rays or whatever—and electromagnetic radiation travels at a constant speed of 300,000 km/sec. When something is moving four times as fast as something else, it doesn’t have four times the energy but rather 16 times the energy—in other words, that figure is squared. So, the speed of light squared is the conversion factor that decides just how much energy lies within any chunk of matter. And because the speed of light squared is a huge number—90,000,000,000 (km/sec)²—the amount of energy bound up into even the smallest mass is enormous. If every one of the atoms in a paper clip could be turned into pure energy, the energy produced would be the equivalent of 18 kilotons of TNT (roughly the energy of the atomic bomb that destroyed Hiroshima). On Earth, however, there is no practical way to convert any object entirely into energy—that would require temperatures and pressures greater than those that exist at the core of our sun.
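For readers who like to check the arithmetic, here is a minimal sketch in Python of the paper-clip figure quoted above. The one-gram mass is an assumption (the text does not give the clip's weight), and the bomb yield used for comparison is only a commonly cited rough value; the point is simply that E = mc² turns even a gram of matter into an enormous amount of energy.

# Back-of-the-envelope check of the paper-clip figure (assumes a clip of about 1 gram).
c = 3.0e8                          # speed of light, in meters per second
mass_kg = 0.001                    # assumed mass of a typical paper clip: about 1 gram

energy_joules = mass_kg * c**2              # E = m * c^2
joules_per_kiloton_tnt = 4.184e12           # energy released by one kiloton of TNT

kilotons = energy_joules / joules_per_kiloton_tnt
print(f"E = {energy_joules:.2e} joules, or about {kilotons:.0f} kilotons of TNT")
# Prints roughly 22 kilotons, the same order of magnitude as the ~18 kilotons quoted above.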


Einstein continued to deal with problems of statistical mechanics and quantum theory, which led to his explanations of particle theory and the motion of molecules. He also investigated the thermal properties of light, which laid the foundation for the Photon Theory of Light and, ultimately, the invention of photovoltaic solar cells. In 1917, he followed up on his 1905 paper in which he had postulated that light itself consists of localized particles (quanta). He suggested that energy quanta must have well-defined momenta and act in some respects as independent, point-like particles, thus inspiring the notion of wave–particle duality in quantum mechanics. Contrary to popular belief, Einstein was not opposed to quantum theory in general. Rather, he objected to what quantum mechanics implies about the nature of reality, because he believed that a physical reality exists independent of our ability to observe it. Quantum theory, he believed, was incomplete. Following his research on general relativity, Einstein entered into a series of attempts to generalize his Theory of Geometrical Gravitation to include electromagnetism as another aspect of a single entity. In 1950, he described his Unified Field Theory. However, in his pursuit of a unification of the fundamental forces, Einstein ignored some mainstream developments in physics, most notably the strong and weak nuclear forces, which were not well understood until many years after his death. His dream of unifying other laws of physics with gravity motivates modern quests for a Theory of Everything and in particular String Theory, where geometrical fields emerge in a unified quantum-mechanical setting. In October 1933, Einstein emigrated to the United States and took a position at the Institute for Advanced Study in Princeton, New Jersey, which was noted for having become a refuge for scientists fleeing Nazi Germany. At the time, most American universities, including Harvard, Princeton, and Yale, had minimal or no Jewish faculty or students, as a result of their Jewish quotas, which lasted until the late 1940s. In 45


1935, he decided to remain permanently in the United States and applied for citizenship. He became an American citizen in 1940 while also retaining his Swiss citizenship. He retained his position at the Institute until his death in 1955. Throughout his life, Einstein published hundreds of books and articles, including more than 300 scientific papers and 150 nonscientific ones. Albert Einstein died from an abdominal aortic aneurysm at age 76, on April 18, 1955, at a Princeton, New Jersey, hospital. “The important thing is to not stop questioning. Curiosity has its own reason for existing.” —Albert Einstein



Chapter 4

Today’s World of Electricity Electricity (along with the internal combustion engine) makes our modern way of life possible. Electricity is central to the ability of almost all modern societies to function. Increased access to electricity has lit corners of the world that were once dark, and is a hallmark of advanced societies as well as a basic requirement for economic progress. The greater the per capita consumption of energy in a country, the higher is the standard of living of its people. Hospitals, air-trafficcontrol systems, street lights, modern sewage systems, most forms of communication, and the financial-services industry are all dependent upon electricity. Electricity allows people to continue with their regular activities 24 hours a day, seven days a week. Factories, retail outlets, and other businesses can operate around the clock. Electricity powers the lights that make cities safer. Electricity cooks food on stoves and preserves it in refrigerators and freezers. Electricity keeps people alive in hospitals through ventilators, monitors, and other such equipment. Electricity makes communication instantaneous through telephones, e-mails, and texting. Electricity powers the Information Age, as radios, televisions, and the Internet would be impossible without it. When provided by batteries, electricity powers such things as automobiles, pacemakers, and cell telephones. 47


Electricity is the most versatile and easily controlled form of energy. At the point of use, it is practically loss-free and essentially non-polluting. At the point of generation, it can be produced “clean” with entirely renewable methods, such as wind, water, and sunlight. Electricity is weightless, easy to transport and distribute, and it represents the most efficient way of consuming energy. Electricity does, of course, have its drawbacks: its generation can result in air pollution; its use in lighting can cause light pollution that negatively affects migrating birds and other creatures; electromagnetic radiation from high-voltage power lines can cause cancer and other medical problems; and there is always the hazard of serious, potentially life-threatening shocks (hence, its use in torture and the electric chair). Moreover, its use can cause functional and medical problems by interfering with people’s normal circadian rhythms. Our dependence on electricity is such that a major attack on, or failure of, the nation’s power grids could shut down the country. The government uses the term “critical infrastructure” to describe the importance of electricity in the country’s ability to function. By way of illustration, a major outage knocked out power across the eastern United States and parts of Canada on August 14, 2003. Beginning at 4:10 PM ET, 21 power plants shut down in just three minutes. The outage stopped trains and elevators, and disrupted everything from cellular telephone service to operations at hospitals to traffic at airports. The loss of use of electric water pumps interrupted water service in many areas. Small business owners were affected when they lost expensive refrigerated stock. Some power was restored by


11 PM. Most people did not get their power back until two days later. In other areas, it took as much as a week or two for power to be restored. At the time, it was the world’s second most widespread blackout in history, after the 1999 Southern Brazil blackout. The blackout contributed to almost 100 deaths and affected 45 million people in eight US states and an estimated 10 million people in southern and central Ontario. In 2018, total electricity consumption in the United States was about 3.95 trillion kilowatthours (kWh). That’s more than 16 times greater than it was in 1950. Total electricity consumption includes retail sales of electricity to consumers and direct-use electricity. Direct-use electricity is both produced by and used by consumers. The industrial sector generates and uses nearly all of the direct-use electricity. During that year, direct use of electricity by all end-user sectors was about 0.15 trillion kilowatthours, or about 4% of total electricity consumption; retail sales of electricity were about 3.80 trillion kilowatthours, or 96% of total electricity consumption. The sales of electricity to major consuming sectors, percentage share of total electricity sales, and primary use by each sector were:

Sector            kWh (Trillion)   % of Total   Primary Use
Residential            1.46           38.5      Cooling
Commercial             1.38           36.2      Refrigeration
Industrial             0.95           25.1      Machine Drives
Transportation         0.01            0.2      Machine Drives
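As a quick check on the table, the following Python sketch recomputes the percentage shares from the kWh figures; the small differences from the printed column come from rounding in the source data.

# Recompute the "% of Total" column from the retail-sales figures in the table above.
retail_sales_trillion_kwh = {
    "Residential": 1.46,
    "Commercial": 1.38,
    "Industrial": 0.95,
    "Transportation": 0.01,
}

total = sum(retail_sales_trillion_kwh.values())   # about 3.80 trillion kWh of retail sales
for sector, kwh in retail_sales_trillion_kwh.items():
    share = 100 * kwh / total
    print(f"{sector:<15} {kwh:>5.2f} trillion kWh   {share:4.1f} % of total")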

Modern technology makes it possible to convert one form of energy to another, and a substantial portion of our natural energy resources is used to produce electricity. About 63% of this electricity generation comes from fossil fuels (coal, natural gas, petroleum, and other gases); about 20% comes from nuclear energy; and about 17% comes from renewable energy sources. The forms of energy available in nature include: • Sun: The Sun is the ultimate source of energy on Earth. The sun radiates its energy in the form of light and 49


heat, and electricity can be produced through the use of solar panels. Solar panels require a lot of area even to produce a small amount of energy. They cannot be used on cloudy days or at night. Nevertheless, there are some locations in the world where this method is very efficient and economical and used very commonly because the operating cost is almost nothing.


Wind: This source of energy can be used only where the wind blows for a considerable length of time. Wind energy is used to run a small generator. In order to get the electrical energy continuously, batteries are connected with generators, which provides electrical energy when the wind stops. This method has very negligible maintenance and running cost, but the amount of energy produced from wind is very small and its output is variable.


Water: When water is stored in a suitable location, it possesses potential energy due to the hydraulic head created. To make use of this head, the potential energy is converted first into mechanical energy and then into electrical energy with the help of a turbine-alternator combination (a short worked example of this conversion follows at the end of this list). This method of electrical energy production has become very popular worldwide due to its low production and maintenance costs.

Geothermal energy: Geothermal energy is thermal energy generated and stored in the Earth. The geothermal gradient, which is the difference in temperature between the core of the planet and its surface, drives a continuous conduction of heat from the core to the surface. Electricity generation requires high temperatures that can only come from deep underground, and this heat must be carried to the surface to be used for electrical power generation. Sometimes this occurs naturally where the Earth’s crust is thin, and sometimes drilling is required to access it. Geothermal electric plants have traditionally been built exclusively on the edges of tectonic plates, where high temperature geothermal resources are available near the surface. Geothermal power is considered to be: renewable because the amount of any projected heat extraction by humans is very small compared 51


to the Earth’s heat content; sustainable, thanks to its power to sustain the Earth’s intricate ecosystems; and environmentally friendly, because its greenhouse gas emissions are on average less than 5 percent of that of conventional coal-fired plants.


Nuclear energy: Nuclear fuel can liberate a large amount of heat energy by the fission of uranium and other such materials. It is estimated that one kilogram of nuclear fuel can produce heat energy equal to that produced by 4,500 tons of coal. The heat produced by the nuclear fuel can be used to create steam, which is then converted into mechanical and electrical energies, respectively, by the use of a turbine and alternator combination. The biggest problem with nuclear power is disposal of the radioactive waste produced.

Combustible fuels: Minerals in the form of solid fuels such as coal and biomass, or liquid fuel such as oil and natural gas, are the most common sources of energy


production. The heat energy produced from burning these fuels can be converted to mechanical energy by the use of the suitable steam engine and steam turbine, and then converted into electrical energy. The use of such fuels is primarily responsible for air pollution and the greenhouse effect.
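The hydropower item above converts a stored head of water into electrical output. As a rough illustration, the standard relation is P = rho x g x h x Q x efficiency, where rho is the density of water, g the gravitational acceleration, h the head, and Q the flow rate. The head, flow, and efficiency in the sketch below are illustrative assumptions, not figures from the text.

# Illustrative hydropower estimate: P = rho * g * h * Q * efficiency
rho = 1000.0          # density of water, kg per cubic meter
g = 9.81              # gravitational acceleration, meters per second squared
head_m = 50.0         # assumed hydraulic head (height the water falls), in meters
flow_m3_per_s = 20.0  # assumed flow rate through the turbine, cubic meters per second
efficiency = 0.90     # assumed combined turbine-alternator efficiency

power_watts = rho * g * head_m * flow_m3_per_s * efficiency
print(f"Electrical output of roughly {power_watts / 1e6:.1f} megawatts")   # about 8.8 MW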

Ever since humans first set foot on this planet, they have been busily engaged in seeking out new, or “better,” ways of doing things. Generally, progress was slow, but occasionally, especially during the last 12,000 years, it has accelerated rapidly. These periods of acceleration have been spoken of as “revolutions” to denote their importance and the great significance and degree of change affecting society. The Agricultural Revolution (now known as the “First Agricultural Revolution”) occurred worldwide between 10,000 BC and 2000 BC, when human societies transitioned from hunting and gathering to farming. The Second Agricultural Revolution took place from the 17th to the 19th century, with an unprecedented increase in agricultural productivity in Great Britain. The Industrial Revolution (now also known as the “First Industrial Revolution”) was the transition to new manufacturing processes in Europe and the United States, in the period from about 1760 to about 1830. This transition included changing from hand production methods to machines, new chemical manufacturing and iron production processes, the increasing use of steam power and water power, the development of machine tools, and the rise of the mechanized factory system. 53


The Technological Revolution (also known as the Second Industrial Revolution) was a phase of rapid industrialization from about 1870 to 1914 (the beginning of World War I). Electrification allowed major developments in manufacturing methods, namely, the assembly line and mass production. Advancements in manufacturing and production technology enabled the widespread adoption of technological systems such as the telegraph and telephone, railroad networks, gas and water supply, and sewage systems. The Digital Revolution (sometimes called the “Electronic Revolution”) was characterized by the shift from traditional industry that the Industrial Revolution brought through industrialization, to an economy based on information computerization. The Digital Revolution, which began the shift from mechanical and analogue electronic technology to digital electronics, ushered in the Information Age (the Digital Age, or the Electronic Age), just as the Industrial Revolution marked the onset of the Industrial Age. The device that set the stage for the Electronic Age, rooted in the new physics of the electron—a science born around the turn of the 20th century—was the vacuum tube. The start of the Electronic Age, however, was really marked by the invention of the germanium-based point-contact transistor in 1947. “Electronics” refers to technology that works by controlling the motion of electrons in ways that go beyond standard electrodynamic properties like voltage and current. That is, electrical technology would work the same way if electrons were replaced by some other charge-carrying particles. Electrical devices take the energy of electrical current—the flow of electrons in a conductor with a linear relationship between voltage and current—and transform it in simple ways into some other form of energy, generally light, heat, or motion. Electronic technology, on the other hand, depends on the specific properties of electrons themselves. Because electronic devices are typically used for representing and manipulating information, this makes for a simple rule 54


of thumb for distinguishing between electrical and electronic. Typically, if something uses electricity merely as energy, it is electrical, whereas if it uses electricity as the medium for manipulating information, it is almost surely electronic. Electronic circuits contain active components, usually semiconductors, and typically exhibit behavior in which the relationship between voltage and current is a nonlinear function. The nonlinear behavior of active components and their ability to control electron flows makes amplification of weak signals possible, so electronics is widely used in information processing, telecommunications, and signal processing. The first transistor, the semiconductor germanium-based point-contact transistor, a solid-state device that was capable of amplifying a signal, was invented in 1947, and it was modified and improved over the subsequent years. Electronic devices are transistor-based, and the transistor was perhaps one of the most important inventions of the 20th century. The metal-oxide-semiconductor field-effect transistor (or MOS transistor) was developed in 1959. In 1968, the first silicon-based, rather than germanium-based, transistor a self-aligned gate (silicongate) MOS transistor was developed. It became the basis of all modern complementary MOS integrated circuits, and is the most widely manufactured device in history. It is the basis of every microprocessor, memory chip, and telecommunica55


tion circuit in commercial use. It was used to develop the Intel 4004, the first single-chip microprocessor, which was released by Intel in 1971. An integrated circuit (IC), also referred to as a chip, or a microchip, is a set of electronic circuits on one small flat piece (or “chip”) of semiconductor material that is usually silicon. The integration of large numbers of tiny MOS transistors into a small chip results in circuits that are orders of magnitude smaller, faster, and less expensive than those constructed of discrete electronic components. The IC’s mass production capability, reliability, and building-block approach to circuit design have ensured the rapid adoption of standardized ICs, which are now used in virtually all electronic equipment and have revolutionized the world of electronics. Over the years, transistor sizes have decreased from tens of thousands of nanometers in the early 1970s to 10 nanometers today, with a corresponding million-fold increase in transistors per unit of microchip area. Currently, typical chip areas range from a few square millimeters to around 600 mm², with up to 25 million transistors per mm². These days, almost any type of technological device includes electrical/electronic components, often as control systems. Among the most interesting areas of usage are transportation systems, medical systems, and information systems. • Transportation Systems: Transportation systems include cars (as well as trucks and buses), locomotives, boats, aircraft, and spacecraft. Modes of transportation that run on electric power offer a number of benefits relative to those that operate on internal combustion engines: they are cheaper to run; they are cheaper to maintain; they are safer; they are quieter; and they are less polluting. • Electric cars: These first came on the scene in the mid-19th century, but the high cost, low top speed, and short range of battery-electric vehicles,


compared to later internal combustion engine vehicles, led to a significant decline in their use. Beginning in 2008, a renaissance in electric vehicle manufacturing occurred, driven by advances in batteries, concern over illnesses and deaths caused by air pollution, and the desire to reduce greenhouse gas emissions. Tesla Motors began development in 2004 of what would become the Tesla Roadster, which was first delivered to customers in 2008. The Roadster was the first highway-legal, serial-production, all-electric car to use lithium-ion battery cells, and the first production all-electric car to travel more than 200 miles per charge. Charging an electric car can be done at a variety of charging stations, which can be installed in both houses and public areas. The two all-time best-selling electric cars, the Nissan Leaf and the Tesla Model S, have EPA-rated ranges reaching up to 151 miles and 370 miles, respectively. Currently, the Tesla Model 3 is the best-selling electric car in the US. Electronics that are specific to automobiles include those that have to do with engines, transmissions, chassis, safety, driver assistance, passenger comfort, and integrated cockpit systems. There are also “connected systems” that make the modern automobile much like a smartphone on four wheels. These include navigation systems, entertainment systems, and information-access systems. The technology that we use to navigate through space is being consolidated with the technology that we use to navigate through cyberspace.



Electric Locomotives: The first practical electric locomotives were developed in the late 1800s. Over the next several decades, their design and functionality were improved and, along with steam and diesel locomotives, their numbers increased substantially. Electric locomotives are powered by electricity from overhead lines, a third rail, or onboard energy storage such as a battery or a supercapacitor. Electric locomotives are much more efficient than diesel locomotives. When electricity is supplied directly from an overhead powerline, about 95 percent of the energy is transferred to the wheels. Diesel-powered locomotives, however, transfer only about 30-35 percent of the energy generated by combustion to the wheels. In the 1980s, the development of very high-speed service brought further electrification to the railroads. In Japan and Europe, high-speed lines for electric trains were built from scratch, but the United States has lagged behind. On September 2, 2006, a standard-production Siemens electric locomotive achieved a speed of 222 mph, the record for a locomotive-hauled train on a newly built, high-speed line in Germany. This locomotive is now employed to haul Austrian Federal Railways “Railjet” high-speed trains, but is limited to a top speed of 143 mph due to economic and infrastructure concerns. Electric Boats: One of the earliest electric boats was developed in 1839 in St Petersburg, Russia. It was a


24-foot boat that carried 14 passengers at 3 miles per hour. It took another 30 years of battery and motor development before the electric boat became a practical proposition. Electric boats were very popular from the 1880s until the 1920s. The world’s first fleet of electric launches for hire, with a chain of electrical charging stations, was established along the River Thames in the 1880s. The boats were used for leisure excursions up and down the river and provided a very smooth, clean, and quiet trip. The boats could run for six hours and operate at an average speed of 8 miles per hour. The first electrically powered submarines were built in the 1890s. Since then, electric power for the propulsion of submarines underwater (traditionally by batteries) has been used almost exclusively, although diesel was used for directly powering the propeller while on the surface until the development of dieselelectric transmission by the US Navy in 1928, in which the propeller was always powered by an electric motor, with energy coming from batteries while submerged or diesel generator while surfaced. With the advent of the gasoline-powered outboard motor, the use of electric power on boats declined beginning in the 1920s. However, in a few situations, especially their use with outboard trolling motors, the usage of electric boats has persisted from the early 20th century to the present day. In 1968, the Duffy Electric Boat Company of California started massproducing small electric craft. In 1982, the Electric Boat Association was formed in the United Kingdom, and solar-powered boats started to emerge. Since the energy crises of the 1970s, interest in this quiet and potentially renewable marine energy source has been increasing steadily again. Battery power alone, however, limits the range of a boat. Electric energy has to be obtained for the battery bank from some source. 59


○ The boat can be charged from shore-side power when available. Shore-based power stations are subject to very strict environmental controls. By purchasing green electricity, it is possible to operate electric boats using sustainable or renewable energy. ○ In hybrid electric boats, if a boat has an internal combustion engine anyway, then its alternator will provide significant charge when the engine is running. ○ Towed generators are common on long-distance cruising yachts and can generate a lot of power when traveling under sail. ○ Wind turbines are relatively common on cruising yachts and can be very well suited to electric boats. There are safety considerations regarding the spinning blades, especially in a strong wind. Large enough wind generators could produce a completely wind-powered electric boat, but no such boats are yet known to exist, although there are a few mechanical wind turbine-powered boats. ○ Solar panels can be built into the boat in reasonable areas in the deck, cabin roof, or as awnings. The efficiency of solar panels rapidly decreases when they are not pointed directly at the sun, so some way of tilting the arrays while underway is very advantageous. The availability of solar cells for the first time made possible motorboats with an infinite range like sailboats. The first practical solar boat was probably constructed in 1975 in England. The Swiss MS PlanetSolar, the largest solarpowered boat in the world, was launched on March 31, 2010. PlanetSolar is a 115-foot-long, 85-foot-wide catamaran yacht powered by 5,780 square feet of solar panels rated at 93kW. They are connected to two electric motors, one in each hull. There are 8.5 tons of lithium60


ion batteries in the ship’s two hulls. The ship is capable of reaching speeds of up to 10 knots. On May 4, 2012, after 585 days and visiting 28 different countries, PlanetSolar completed a 37,297-mile circumnavigation of the Earth without using any fossil fuel. It was the first solar electric vehicle of any sort ever to do so. Electric Aircraft: The use of electricity for aircraft propulsion was first experimented with during the development of the airship, which took place in the latter part of the 19th century. The first electrically powered airship was flown on October 8, 1883. Even with the lifting capacity of an airship, the heavy accumulators needed to store the electricity severely limited the speed and range of such early airships. Although manned flights in a tethered helicopter go back to 1917 and in airships to the previous century, the first manned free flight by an electrically powered airplane was not made until 1973, and most manned electric aircraft today are still only experimental demonstrators. Success in a full-sized airplane would not be achieved until nickel-cadmium (NiCad) batteries were developed, having a much higher storage-to-weight ratio than older technologies. In 1973, after being converted from a motor glider to a battery-powered electric aircraft, the Militky MB-E1 flew for just 14 minutes to become the first manned electric aircraft to fly under its own power. All electric aircraft to date have been powered by electric motors driving thrust-generating propellers


or lift-generating rotors. Electricity may be supplied by a variety of methods including batteries, ground power cables, solar cells, ultracapacitors, fuel cells, and microwave power beaming. Batteries are the most common energy-carrier component of electric aircraft due to their relatively high capacity. Batteries were the earliest source of electricity, first powering airships in the 19th century. These early batteries were very heavy, and it was not until the arrival of technologies such as NiCd rechargeable types in the second half of the 20th century that batteries became a practicable power source. Modern battery types include lithium-based and a number of other less widely used technologies. Such batteries remain a popular power source today, although they still have limited endurance between charges and hence limited range. Land vehicles can easily cope with the extra mass from electricity storage or electrical propulsion systems, but aircraft are much more sensitive to weight issues. Aircraft also travel much further than ground vehicles, which means a flight requires far more energy than an average road trip. Aircraft must store onboard all the energy needed to move its mass for each flight. Currently, batterypowered electric aircraft have much more limited payload, range, and endurance than those powered by internal combustion engines. It is therefore only suitable for small aircraft. Developed almost in parallel with NiCad technology, solar cells were also slowly becoming a practicable power source. Following a successful model test in 1974, the world’s first official flight in a solar-powered, man-carrying aircraft took place on April 29, 1979. A solar cell converts sunlight directly into electricity, either for direct power or temporary storage. The power output of solar cells is small, even when many are connected together, which limits their 62


use and is also expensive. However, their use of freely available sunlight makes them attractive for highaltitude, long-endurance applications. For endurance flights, keeping the craft in the air all night typically requires a backup storage system, which supplies power during the hours of darkness and recharges during the day. A motor-glider type aircraft, originally built as a pedal-powered airplane to attempt the Channel crossing, proved too heavy to be successfully powered by human power, so it was then converted to solar power, using an electric motor driven by batteries that were charged before flight by a solar cell array on the wing. The maiden flight of Solar One took place at Lasham Airfield, Hampshire, on June 13, 1979. Electrically powered model aircraft have been flown since the 1970s, with one unconfirmed report as early as 1957. They have since developed into small, battery-powered, unmanned aerial vehicles (UAV) known as “drones,” which, in the 21st century have become widely used for many purposes. From 1983 until 2003, NASA experimented with a series of solar and fuel cell system–powered UAVs. In July 2010, its QinetiQ Zephyr, a lightweight solar-powered unmanned aerial vehicle, set the UAV endurance record of 336 hours. It is made of carbon fiber-reinforced polymer construction weighing 110 pounds with a wingspan of 74 feet. During the day, it uses sunlight to charge lithium-sulfur batteries, which power the aircraft at night. In 2012, the Swiss Solar Impulse 1 was the first solar plane to make an intercontinental flight, flying from Madrid, Spain, to Rabat, Morocco. Solar Impulse 2, completed in 2014, carried more solar cells and more powerful engines, among other improvements. The revised version is 73.5 feet long and has a wingspan of 63



236 feet, which is only slightly less than that of an Airbus A380, the world’s largest passenger airliner. While the A380 weighs about 500 tons, the carbon-fiber Solar Impulse 2 weighs only about 2.3 tons, little more than an average SUV. Its 17,248 photovoltaic solar cells, rated at 66 kW peak, cover the top of the wings, fuselage, and tailplane, for a total area of 2,901 square feet. Energy from the solar cells is stored in lithium polymer batteries and used by four electric motors to drive propellers. The aircraft’s maximum speed is 87 miles per hour, and its cruising speed is 56 mph (37 mph at night to save power). Between March 9, 2015, and July 23, 2016, Solar Impulse 2 completed a circumnavigation of the Earth. Its journey was the first circumnavigation of the Earth by a piloted fixed-wing aircraft using only solar power. Spacecraft: The era of space flight essentially began on October 4, 1957, when the Soviet Union launched Sputnik 1, the first artificial satellite to orbit the Earth. The Soviets achieved yet another first on April 12, 1961, when they launched the first human, cosmonaut Yuri Gagarin, into a single orbit aboard Vostok 1. On May 5, 1961, the US launched its first astronaut, Alan Shepard, on a suborbital flight aboard Freedom 7. The US reached its orbital goal on February 20, 1962, when John Glenn made three orbits around the Earth in the Friendship 7 spacecraft. On July 20, 1969, NASA’s Apollo 11 landed on the Moon, and astronaut Neil


Armstrong became the first human to physically set foot on the Moon. The International Space Station’s first component was launched into orbit in November 1998, and the last pressurized module was fitted in 2011. The station has been continuously occupied since November 2, 2000, for a record 19 years, On August 25, 2012, Voyager 1, which was launched by NASA on September 5, 1977, became the first spacecraft to cross the heliopause and enter the interstellar medium. At a distance of about 14 billion miles, Voyager 1 is currently the most distant man-made object from Earth and is traveling outward at approximately 333 million miles per year. Weight and size considerations are even more critical for spacecraft components than they are for those of aircraft. All spacecraft, except single-stage-to-orbit vehicles, cannot get into space on their own, and require a launch vehicle (carrier rocket). The only methodology currently available for overcoming the force of gravity and attaining an escape velocity of about 25,000 miles per hour is the use of huge fuel-guzzling booster rockets. It is inefficient, costly, and dangerous, but it is likely to be the only game in town for the foreseeable future, and it does work despite its shortcomings. Historically, the Titan rockets that carried NASA spacecraft weighed 1.5–2.5 million pounds, including fuel—2–3 orders of magnitude heavier than their payloads. Once its launch is completed and a spacecraft moves into orbit or beyond, the heavy, bulky paraphernalia of the chemically powered solid rocket boosters and their fuel tanks—which are needed to overcome Earth’s gravity—are no longer required and can be discarded. While the vehicle will still need to be able to do some maneuvering in space from time to time, the power requirements are significantly diminished. Most 65


craft have simple, reliable chemical thrusters for this purpose, and Soviet bloc satellites have used electric propulsion for decades. Most of these kinds of spacecraft propulsion systems work by electrically expelling propellant (reaction mass) at high speed. Electric thrusters typically use much less propellant than chemical rockets because they have a higher exhaust speed. Due to limited electric power, the thrust is much weaker compared to chemical rockets, but electric propulsion can provide a small thrust for a long time. Electric propulsion can achieve high speeds over long periods and thus can work better than chemical rockets for some deep space missions. Spacecraft that don’t use electric propulsion nevertheless use a considerable amount of electrical power to operate instrumentation, communications, command and telemetry, computer systems, sensors, and life-support systems. Power generation adds significant mass to the spacecraft, and ultimately, the weight of the power source limits the performance of the vehicle. Efforts are ongoing to develop more efficient and lightweight power sources. For some missions—particularly those that occur relatively close to the sun—the energy provided by solar panels may be sufficient, and has often been used. For spacecraft designed to operate in more distant locations or with higher power requirements, nuclear energy is necessary. With any source of electrical power in use today—whether it


is chemical, nuclear, or solar—the maximum amount of power that can be generated is quite small, being limited by size and weight constraints. One common type of spacecraft nuclear power is the radioisotope thermoelectric generator (RTG), which has been used on many space missions (more than 40 by the US and USSR); another is the small fission reactor. A radioisotope heater unit provides heat from the radioactive decay of a material and can potentially produce heat for decades. For more than 50 years, RTGs have been the United States’ main nuclear power source in space. When spacecraft require more than 100 kilowatts of power, fission systems are much more cost-effective than RTGs. The Soviets have launched dozens of small fission power system nuclear reactors into space, while the US has launched only one. In the extreme environment of space, electronic systems are subjected to a host of potentially destructive stressors. Most semiconductor electronic components are susceptible to radiation damage from the high levels of ionizing radiation that exist in outer space and high-altitude flight. Damage can occur from single-event effects and total ionizing-dose effects. The radiation disrupts the crystalline structure inside the electronic component, degrading its function until the component fails as more radiation exposure accumulates. Extreme temperatures and associated thermal cycles can cause mechanical stress on ICs and their packages. Thermal gradients of over 5.5°F per minute can occur, even inside the relatively temperature-controlled internal volume of a spacecraft. Components can be subjected to temperatures as low as –67°F and as high as 257°F. Manufacturers are continually working to develop components that are less susceptible to these effects.
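The propellant advantage of electric thrusters described above follows from the Tsiolkovsky rocket equation: the fraction of a spacecraft's mass that must be propellant for a given velocity change is 1 - exp(-delta_v / exhaust_speed). The Python sketch below uses representative, assumed values for the velocity change and exhaust speeds; it is an illustration of the relationship, not mission data.

import math

# Tsiolkovsky rocket equation: propellant fraction = 1 - exp(-delta_v / exhaust_speed)
# The velocity change and exhaust speeds below are representative assumptions.
delta_v = 5000.0    # assumed maneuvering budget, in meters per second

thrusters = [
    ("chemical thruster", 3000.0),         # roughly 3 km/s exhaust speed
    ("electric (ion) thruster", 30000.0),  # roughly 30 km/s exhaust speed
]
for name, exhaust_speed in thrusters:
    propellant_fraction = 1 - math.exp(-delta_v / exhaust_speed)
    print(f"{name}: about {propellant_fraction:.0%} of the craft's mass must be propellant")
# chemical thruster: about 81%    electric (ion) thruster: about 15%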




Medical Systems: The history of the medical use of electricity goes back more than 2,000 years. In 46 AD, the Roman physician Scribonius Largus recommended the use of shocks from live black torpedo fish to relieve gout and other pain. The first recorded modern treatment of a patient by electricity was in 1743. John Wesley, in 1747, promoted electrical treatment as a universal panacea, but it was rejected by mainstream medicine. Medical treatments with electricity using a special apparatus at Middlesex Hospital in London have been recorded as far back as 1767. The record of specific uses, other than their being “therapeutic,” is not clear. Machines generating electricity became central to medicine in Europe during the 1800s. Doctors initially used such machines to treat conditions such as gout, paralysis, and toothache. However, many treatments were ineffective. Others are still regarded as useful therapy to treat pain, spasms, and brain conditions such as epilepsy. German physicist Wilhelm Roentgen discovered X-rays in 1895. This changed the way doctors diagnosed and treated disease. X-ray machines became powerful medical tools over the next 30 years, especially during the First World War. Doctors could now see deep inside the body without using exploratory surgery. X-rays were not just used for diagnosis. Cancer could be treated using X-ray radiation therapy devices. However, before the risks were understood, many patients and radiologists in the early 20th century died from overexposure. Using proper safety measures, X-rays were the main imaging technology until the 1970s. Other imaging machines such as the CT, PET, and MRI scanners have since been developed. Unlike traditional X-ray machines, they provide detailed views of the body’s complex structures, such as the brain. Imaging machines are very expensive, costing


as much as $3 million to purchase and $100 thousand per year to maintain.

Computerized Tomography (CT) Scanner

Electrical and electronic medical devices, ranging from the basic to the highly sophisticated, are used to diagnose, treat, monitor, prevent, or alleviate illness or injury. There are several basic types: ○ Diagnostic equipment includes a variety of medical imaging machines. Examples are X-ray machines, ultrasound and MRI machines, and PET and CT scanners. ○ Treatment equipment includes infusion pumps, medical lasers, and LASIK surgical machines. ○ Life-support equipment is used to maintain a patient’s bodily function. Examples are medical ventilators, incubators, anesthetic machines, heart-lung machines, ECMOs, and dialysis machines. ○ Medical monitors allow medical staff to evaluate a patient’s medical state. Monitors may measure patient vital signs and other parameters including ECG, EEG, and blood pressure.


○ Medical laboratory equipment automates or helps analyze blood, urine, genes, and dissolved gases in the blood. ○ Therapeutic equipment includes physical therapy machines like continuous passive range of motion (CPM) devices. The use of computers was one of the most important technological changes in 20th-century medicine. They became central to medical care from the 1950s onward. Electronic medical devices with downloadable memories—such as implantable cardiac pacemakers, defibrillators, drug pumps, insulin pumps, and glucose monitors—are now an integral part of routine medical practice in the United States. Functional organ replacements, such as the artificial heart, pancreas, and retina, will most likely become commonplace in the near future. •


Information Systems: Electronic communication dates back to the telegraph that used Morse code to send messages long distances over wires. After that, the electronics industry added the wired telephone, the wireless radio, and television. Nowadays, people can share information with each other anywhere, any time, and in ways that are as varied as we are. They include: websites, forums, social networking, emails, instant messaging, text messaging, and video calling, to name a few. For thousands of years, various devices have been used to aid in computation. The earliest counting device was most likely a form of tally stick, and later there were record-keeping aids, included counting rods, which represented numbers of items. The earliest known relatively advanced tool for use in computation was the abacus, which the Romans developed from devices used in Babylonia as early as 2400 BC. Abaci


of a more modern design are still used as calculation tools today. The Antikythera mechanism, which was used by ancient Greeks for astronomical calculations, is believed to be the earliest mechanical analog “computer,” and has been dated to about 100 BC. The slide rule was invented in the 1620s. It was a hand-operated analog computer, the functions of which were initially quite limited, but after the inclusion of logarithms in computational methodologies, there followed a period of rapid progress by inventors and scientists in making calculating devices. The high point of this early era of formal computing can be seen in Charles Babbage’s “difference engine” of the 1820s, and its successor, the “analytical engine” (which was never completely constructed but was designed in detail).

Difference Engine

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct 71


mechanical or electrical model of the problem as a basis for computation. The art of mechanical analog computing reached its apex with the differential analyzers, which were built in 1927, but whose obsolescence soon became apparent; by the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines. Early digital computers were electromechanical, with electric switches driving mechanical relays to perform the calculation. These devices had a slow operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. During World War II, British codebreakers achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of an electro-mechanical computer, but to crack the more sophisticated German codes, the Colossus computer was constructed in 1944. Colossus was the world’s first electronic digital programmable computer. It used a large number of vacuum tubes, had paper-tape input, and was capable of being configured to perform a variety of Boolean logical operations on its data. The Electronic Numerical Integrator and Computer (ENIAC), developed in the US in 1946, was the first fully electronic programmable computer to be constructed. While the ENIAC was similar to the Colossus, it was much faster and more flexible. Like the Colossus, a “program” on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically entered into the machine with manual resetting of plugs and switches. The ENIAC combined the high speed of non-mechanical electronics with the ability to be programmed for a large variety of complex prob72


lems. The enormous machine weighed 30 tons, used 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.

ENIAC (1946)

While these early computers were capable of running a variety of programs, changing from one program to another required the laborious rewiring and restructuring of the machine. This problem was solved by the development of the stored-program computer, which was able to store in memory a set of instructions that detailed the computational demands of each program. The Manchester Small-Scale Experimental Machine (SSEM), first run in 1948, was the world’s first stored-program computer, and contained all of the elements essential to a modern electronic computer. The SSEM was followed in 1951 by the Ferranti Mark 1, the world’s first commercially available generalpurpose computer. The bipolar transistor was invented in 1947, and from 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the “second gen73


eration” of computers. Compared to vacuum tubes, transistors had many advantages: they were smaller, required less power (and therefore gave off less heat), were much more reliable, and had longer, essentially indefinite, service lives. The next great advance in computing power came with the advent of the integrated circuit (IC). The first practical ICs were invented in the late 1950s and heralded an upsurge in the commercial and personal use of computers, leading to the invention of the microprocessor. The first single-chip microprocessor was the Intel 4004, which was released in 1971. The microprocessor led to the development of microcomputers, and the “microcomputer revolution.” Back in the 1940s, the idea that we would ever need computers in our homes, much less our pockets or wrists, was considered to be ridiculous. Today, the question is more one of “Where can’t we put computers?” and there are now very few homes in industrialized societies without a computer of some kind. Whether it’s a traditional desktop, laptop, tablet, or smartphone, it’s almost impossible to imagine life without one of these devices. While the early computers were specifically designed to “compute” (i.e., to perform mathematical calculations), in the later part of the 20th century, they started handling other sorts of data, becoming “information-processing devices” rather than just “computers.” In this way, what began as the “Computer Age” is now more appropriately thought of as the “Information Age.” In the 1970s, the home computer was introduced, as well as time-sharing computers, video game consoles, and the first coin-operated video games. The golden age of arcade video games began with Space Invaders. In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry. Automated


teller machines, industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. In the world of personal computers, Apple often led the way with lines such as the Apple I in 1976, the Apple II in 1977, the Lisa in 1983 (one of the first personal computers to offer a graphical user interface in a machine aimed at individual business users), the Macintosh in 1984, the PowerBook in 1991, the Power Mac in 1994, the MacBook Pro (as part of Apple’s transition to Intel processors) in 2006, and the iPad in 2010.

The first handheld mobile phone was demonstrated by Motorola in 1973. From 1979 to the mid-1980s, several small cellular networks were set up in various locations around the world. These first-generation (1G) systems could support far more simultaneous calls than earlier mobile radio systems, but they still used analog technology. In 1991, second-generation (2G) digital cellular technology was launched, and the third generation (3G) followed in 2001. By 2009, it had become clear that, at some point, 3G networks would be overwhelmed by the growth of bandwidth-intensive applications such as streaming media. Consequently, the industry began looking to data-optimized fourth-generation (4G) technologies, which promised speed improvements of up to 10-fold over existing 3G technologies. The term 5G refers to the technology developed for the next major phase in mobile telecommunication standards; at the time of this writing, such networks were only beginning to be deployed. Early handset manufacturers such as Nokia and Motorola enjoyed record sales of cell phones based more on fashion and brand than on technological innovation. The smartphone market, dominated at the time by BlackBerry OS and Windows Mobile devices, was quite staid and was focused on the needs of business enterprises.

In addition to telephony, 2000s-era mobile phones supported a variety of other services. Mobile phones that provide just the basic package of services are known as “feature phones” (or “dumb phones”); those that offer greatly advanced computing capabilities are referred to as “smartphones.” The Apple iPhone, the first generation of which was released on June 29, 2007, is exemplary. Its user interface is built around the device’s multi-touch screen, including a virtual keyboard. The iPhone has Wi-Fi and can connect to cellular networks. It can take photos, play music, send and receive email, browse the web, send and receive text messages, record notes, perform mathematical calculations, and receive visual voicemail. Shooting video also became a standard feature with the iPhone 3GS. Other functionality, such as video games, reference works, and social networking, can be enabled by downloading mobile apps.


Since 2010, smartphones have adopted integrated virtual assistants, such as Apple Siri, Amazon Alexa, Google Assistant, and Microsoft Cortana. As of November 1, 2018, a total of more than 2.2 billion iPhones had been sold, and Apple had garnered a leading share of the global smartphone market. Worldwide, more than 5.13 billion people have a mobile device (a cell phone, tablet, or other cellular-enabled IoT device).

The World Wide Web (WWW), commonly known as the Web, is an information system in which documents and other web resources are identified by Uniform Resource Locators (URLs), may be interlinked by hypertext, and are accessible over the Internet. The Web has been central to the development of the Information Age and is the primary tool billions of people use to interact on the Internet. The World Wide Web was developed in 1989 and became publicly accessible in 1991. Many people use the terms “Internet” and “Web” interchangeably, but in fact the two terms are not synonymous—they are two related but separate things. The Internet, which grew out of earlier research networks and took its modern form in the 1980s, is a massive network of networks, a networking infrastructure. It connects millions of computers globally, forming a network in which any computer can communicate with any other computer as long as both are connected to the Internet. As of June 30, 2019, more than 4.5 billion people worldwide (58.8% of the population) were connected to the Internet. The Web is a way of accessing information over the medium of the Internet. It is an information-sharing model built on top of the Internet. The Web utilizes the HTTP protocol, only one of the languages employed over the Internet, to transmit data, and it is just one of the ways that information can be disseminated over the Internet.


The Internet, not the Web, is also used for email (which relies on the Simple Mail Transfer Protocol, SMTP), Usenet newsgroups, instant messaging, and the File Transfer Protocol (FTP). Presently, more than 1,275,000,000 websites are in existence.
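To make the distinction concrete, here is a small illustrative sketch (mine, not the author’s) written in Python. It opens an ordinary Internet connection (a TCP socket) to a web server and then “speaks” the Web’s own language, HTTP, over that connection; email, file transfer, and other services use the very same Internet plumbing but speak different protocols. The host name and port are conventional examples, and the sketch assumes a network connection is available.

import socket

HOST = "example.com"   # a conventional example web server (assumed reachable)
PORT = 80              # the standard port for unencrypted HTTP

# HTTP is just text sent over an Internet (TCP) connection.
request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# The first line of the reply (e.g., "HTTP/1.1 200 OK") is the Web protocol
# answering over the general-purpose Internet connection.
print(response.split(b"\r\n", 1)[0].decode())

The same socket machinery, pointed at a mail server’s port and speaking SMTP instead of HTTP, would be carrying email rather than web pages, which is the sense in which the Web is only one of the Internet’s many uses.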


With technological miniaturization, computers have been getting smaller, but since people haven’t, the limit for miniaturization of human-operated computers has just about been reached. On the other hand, special-purpose computers, especially those that do not require a human interface, continue to get smaller and smaller. Currently, the world’s smallest multi-purpose computer is the Michigan Micro Mote (M^3), which is about the size of a grain of rice. Despite its tiny size, the Micro Mote can take pictures, read temperatures, and record pressures. Nanotechnology will permit the creation of computers so tiny that they could be introduced everywhere, yet never be seen and most likely never even be thought about. This would set the stage for ubiquitous computing, in which computers exist at any time, in any location, and in any form.

Computers solve problems in exactly the way they are programmed to, without regard to efficiency, alternative solutions, possible shortcuts, or possible errors in the code—all of which can be problematic. Considerable work is therefore being done on computer programs that learn and adapt, as part of the emerging fields of artificial intelligence and machine learning. The term artificial intelligence (AI), as used colloquially, refers to the theory and development of computer systems able to perform “cognitive” tasks that humans normally associate with human minds, such as learning and problem-solving, visual perception, speech recognition, decision-making, and language translation. Since its inception 60+ years ago, work in the field of AI has made great strides, and today many intellectual tasks once performed only by human beings can also be performed by AI systems. Nobody knows for certain when full human-level AI will be achieved, but many researchers expect it to arrive before the end of the 21st century. Once human-level machine intelligence is achieved, a superintelligent system—one that significantly exceeds human cognitive performance in essentially all meaningful domains—is likely to follow very quickly thereafter. Sufficiently intelligent software presumably would be able to reprogram and improve itself ... and this improved software would be even better at improving itself ... and so on. Such recursive self-improvement would compound rapidly, and the resulting system could surpass human capability in a very short period of time. Such an “intelligence explosion” is now spoken of as the “technological singularity,” an occurrence beyond which events are unpredictable or even incomprehensible.

If an extraordinarily powerful superintelligence is created, one that would, qualitatively, far surpass all human intelligence collectively, the question arises of how we can deal with the fact that there is no a priori reason to believe that such an intelligence would share our system of morality, which has evolved over millennia in tandem with our biology—an evolution that an AI would not share. Superintelligent software might decide that it is not in its best interest to support the continued existence of humanity and, given that a superintelligence could always outmaneuver humans, unless it decided to allow humanity to coexist, the result might be human extinction. That is, of course, a worst-case scenario, but even the actions of a superintelligence that isn’t hostile could cause critical harm to humanity while cooperatively pursuing its designated objective.


If, for example, such an intelligence were tasked with a primary goal that required unlimited time and unlimited resources to achieve, the system would engage in convergent behaviors such as resource acquisition and self-preservation (including preventing its being shut down) and attempt to turn the entire Earth into some form of computronium (hypothetical “programmable matter”), thereby irreparably harming humankind as a side effect.



Chapter 5

Sensitivity to Electromagnetic Radiation

Human biochemistry, thought, and action are all controlled by electricity. Neurons, the nerve cells that are the basic building blocks of the nervous system, are highly specialized cells responsible for transmitting information throughout the body in both chemical and electrical forms. Neurons have two different kinds of nerve fibers: dendrites carry information toward the cell body, and axons carry information (electrochemical nerve impulses called action potentials) away from it. There are basically three different types of neurons, each responsible for different tasks:

• Sensory neurons pass information about stimuli such as light, heat, or chemicals from both inside and outside the body to the central nervous system.
• Motor neurons pass instructions from the central nervous system to other parts of the body, such as muscles or glands.
• Association neurons connect the sensory and motor neurons.

The brain can be thought of as the overall modulator of neuronal activity. It stands to reason, then, that humans can be affected by electromagnetic radiation. Electromagnetic radiation can be classified into two types: ionizing radiation (defined by the capability of a single photon with more than 10 eV of energy to ionize oxygen or break chemical bonds) and non-ionizing radiation.


Extreme ultraviolet and higher frequencies, such as X-rays or gamma rays, are ionizing, and they can be extremely hazardous in that they can cause radiation poisoning. Humans are also affected by non-ionizing radiation. Most of the molecules in the human body interact weakly with electromagnetic fields (EMF) in the radio-frequency or extremely low-frequency bands. Dielectric heating, also known as radio-frequency heating, is the process by which a radio-frequency (RF) alternating electric field, radio wave, or microwave electromagnetic radiation heats a dielectric material. This can lead to biological effects ranging from muscle relaxation (as produced by a diathermy device) to burns. For example, touching or standing near an antenna while a high-power transmitter is in operation can cause severe burns. The dielectric heating effect varies with the power and the frequency of the electromagnetic energy, as well as with the distance to the source.
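As a rough indication of why power, frequency, and field strength all matter, one commonly used engineering approximation (an illustration on my part, not a formula given in the original text) for the heat deposited per unit volume of a lossy dielectric is

\[
P_{\text{vol}} \approx 2\pi f \, \varepsilon_0 \, \varepsilon_r'' \, E_{\text{rms}}^{2},
\]

where \(f\) is the frequency of the field, \(\varepsilon_0\) is the permittivity of free space, \(\varepsilon_r''\) is the material’s loss factor, and \(E_{\text{rms}}\) is the local field strength. Because the field strength falls off with distance from the transmitter, so does the heating.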


The last 50 years have seen a dramatic increase in the number of devices emitting non-ionizing radiation in all segments of society, which has resulted in heightened health concerns among researchers and clinicians and an associated interest in government regulation for safety purposes. Direct effects of electromagnetism on human health have been difficult to prove, and documented life-threatening interference from electromagnetic fields is limited to medical devices such as pacemakers and other electronic implants. The effect of radiation from mobile phones and other wireless electronic devices on human health is a subject of special interest and study worldwide. Radio waves decrease rapidly in intensity, falling off with the square of the distance as they spread out from a transmitting antenna. Therefore, the cell phone transmitter, which is held close to the user’s face when talking, is a much greater source of human exposure than the cell tower transmitter, which is typically at least hundreds of meters away from the public. The World Health Organization (WHO) states that, to date, no adverse health effects have been established as being caused by mobile phone use.
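A rough back-of-the-envelope calculation shows why distance dominates. Using the inverse-square relationship with purely illustrative numbers (a handset radiating about 0.25 W held 5 cm from the head, versus a base station radiating 100 W from 200 m away, ignoring antenna directionality and absorption), the power density \(I = P / (4\pi d^{2})\) works out to

\[
I_{\text{phone}} \approx \frac{0.25}{4\pi (0.05)^{2}} \approx 8 \ \text{W/m}^{2},
\qquad
I_{\text{tower}} \approx \frac{100}{4\pi (200)^{2}} \approx 2\times10^{-4} \ \text{W/m}^{2},
\]

so, under these assumptions, the handset produces an exposure tens of thousands of times greater than the distant tower.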


Stronger or more frequent exposures to EMF, however, can definitely be responsible for health problems, and in fact they serve as the basis for electromagnetic weaponry. Everyone, to one extent or another, experiences sensitivity to electromagnetic radiation, but there are a number of cases in which individuals are reported to experience electromagnetic hypersensitivity (EHS), a condition to which negative effects are attributed. There are no specific symptoms associated with claims of EHS, and those reported range widely between individuals. They include headache, fatigue, stress, sleep disturbances, skin prickling, burning sensations and rashes, aches and pains in muscles, and many other health problems. In severe cases, such symptoms can be a real and sometimes disabling problem for the affected person, causing psychological distress. There is, however, no scientific basis for linking such symptoms to electromagnetic field exposure, and EHS is not a recognized medical diagnosis. Devices implicated in self-reports of EHS include fluorescent and low-energy lights; mobile and cordless/portable phones; and Wi-Fi. Symptoms may also be brought on by imagining that exposure is causing harm, an example of the nocebo effect. Whatever the cause of symptoms attributed to EHS, there is no doubt that they can be debilitating for the affected person and can benefit from treatment or management. Cognitive behavioral therapy has shown some success in helping people cope with the condition. As of 2005, WHO recommended that people presenting with claims of EHS be evaluated to determine whether they have a medical condition that may be causing the symptoms they attribute to EHS, that they have a psychological evaluation, and that their environment be evaluated for issues like air or noise pollution that may be causing problems.

The standard medical research approach to electromagnetic hypersensitivity, however, may be too limiting, in that people who report EHS are so far removed from statistical norms. About 20 years ago, this author undertook a broad-brush research project to look into those whom he called “Anomalously Sensitive Persons (ASPs).”† They could just as well have been called “Hyper-Sensitive Persons (HSPs).” He used a self-designed questionnaire (called the “H.I.S.S.”) to identify their predispositions toward sensitivities (biological, trauma and abuse, and temperament type preferences) and their indicators of sensitivities (physiological, cognitive, emotional, altered states of consciousness, and transpersonal—i.e., metaphysical or paranormal). Each of the primary scales—predispositions and indicators—contained a number of secondary scales, and the secondary scales each contained a number of tertiary scales. The H.I.S.S. was administered to a reference group of 300 people, and the correlations among the scores on the various scales were very strong. Between the scores on the first-level predispositions and indicators scales, for example, the correlation was such that the probability of its occurring by chance alone was less than one in a trillion. Such was also the case with the correlations between the scores on any two second-level scales in the indicators grouping. The strength of these correlations indicated the likely existence of a syndrome, and he decided to define an Anomalously Sensitive Person as an individual who scored two or more standard deviations above the mean on both of the first-level scales, predispositions and indicators.


† For further information, please see my 2003 book, The H.I.S.S. of the A.S.P.
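For readers who want the arithmetic behind that cutoff: if the scale scores are roughly normally distributed (an assumption on my part), the fraction of people lying at least two standard deviations above the mean is

\[
P(Z \ge 2) = 1 - \Phi(2) \approx 0.0228,
\]

that is, a little over 2 percent, where \(\Phi\) is the cumulative distribution function of the standard normal curve.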


On a normal curve, two or more standard deviations above the mean corresponds to a score roughly in the top 2 percent. In the H.I.S.S. project, 3.4 percent of the reference group met the criteria for ASP-ness. In this project, as with most other research efforts in the social sciences, it was necessary to settle for a sample of convenience, and, as a result, the reference group had some compositional imbalances. The most obvious of these were that it was too female (62 percent were female), too white (86 percent were of full European ethnicity), too educated (41 percent had some graduate-level education, and the mean education level was 15.7 years), and too human-services oriented (66 percent had “social” occupations). The available data suggest that all four of these factors are, to a certain extent, associated with high H.I.S.S. scores. Correcting the sampling imbalances would therefore lower the mean scores further, making scores that are high in absolute terms even more anomalous and leaving ASPs comprising approximately 2 percent of the reference group.

The H.I.S.S. is a self-reporting instrument, so the findings might be challenged on that basis, but it should be noted that most of the questions in the first-level predispositions scale deal with objectively verifiable data. This is especially true of the second-level biological scale. In any case, for our purposes here, the results showed a moderate correlation between the third-level electromagnetic sensitivity scale and the second-level biological scale. They also showed a strong correlation between the third-level electromagnetic sensitivity scale and the first-level predispositions scale.

Patterns of electrical, magnetic, and light radiation are a normal part of nature. Anywhere on Earth, these patterns are continuously changing, and all living creatures, including humans, are subject to the effects of those changes. The day/night and seasonally changing patterns of light radiation are quite obvious, and the effects of light—by way of the pineal gland, on the limbic system’s hypothalamus, as well as light’s modulation of the biological clock—are well known.


The changes in the radiation patterns of the electric and magnetic fields surrounding the Earth are quite subtle. It has been known for some time that the hypothalamus is extremely sensitive to electrical fields, and recent research has revealed a similar sensitivity to magnetic fields (thought to be mediated by deposits of magnetite located near the pineal and pituitary glands). The mechanisms by which such sensitivities operate are not well understood. Most humans have adapted smoothly to the changing patterns of natural electromagnetic radiation (EMR), but those who are especially sensitive to EMR—people who are electromagnetically hypersensitive—often experience significant reactions to such changes. The natural EMR changes to which an EHS individual might be unusually reactive include seasonal variations, solar storms, phases of the moon, approaching thunderstorms, and Santa Ana–type winds. Illustratively, studies have shown that the number of people admitted to psychiatric wards with diagnoses of schizophrenia or bipolar disorder increases markedly shortly after major disturbances in the Earth’s magnetic field.

In recent years, the possible effects of man-made EMR on humans have become a highly controversial subject. Reputations, positions, fortunes, and progress are all at stake. While most researchers would agree that direct electrical stimulation of the brain can lead to anomalous neuronal activity, anomalous perceptions, and anomalous behaviors, it is over the issue of indirect or subtle stimulation, and its possible link to immune-system dysfunctions, that most of the conflicts arise. Those who debunk electromagnetic hypersensitivity point out that the vast majority of people apparently are not affected by exposure to man-made EMR and suggest that those who claim to experience problems are hypochondriacs, alarmists, or attention seekers. Studies have shown, however, that regular exposure to man-made EMR is associated with an increased incidence of hormone-dependent cancers, allergies, mood disorders, and suicide.


Moreover, other studies have shown that, in a setting of partial sensory deprivation, indirect exposure to EMR commonly results in dizziness, “pins and needles,” psychological dissociation, the sense of a presence, and transpersonal (metaphysical or paranormal) experiences. C. Norman Shealy, the founder of the American Holistic Medical Association, has stated that chronic exposure to man-made EMR can result in adrenal exhaustion as well as liver and heart-muscle dysfunction. The symptoms by which this condition is characterized include chronic fatigue; depression; disrupted immune-system functioning; biochemical deficiencies (including DHEA, intracellular magnesium, and essential amino acids); and EEG brainwave pattern abnormalities (including, in extreme cases, all the symptomatology of temporal lobe epilepsy).

There are a number of other points about electromagnetic hypersensitivity that are especially relevant, including:

• EHS is associated with a history of medical trauma.
• The illnesses associated with chronic EMR exposure do not involve a specific pathogen but rather are characterized by a generalized disruption of immune-system functioning. The negative consequences of such exposure are, therefore, likely to be very subtle and elusive.
• The experience of a major electrical event (such as being rendered unconscious by electrical shock) apparently can lead to an individual becoming electromagnetically hypersensitive.
• Some people with EHS are purported to be “electrical disrupters”; that is, their presence appears to interfere with the operation of electrical and electronic equipment. This phenomenon (a spontaneous form of electrical psychokinesis) is not well understood, but it has been suggested that, by some unknown mechanism, such people generate within their bodies brief charges of high-voltage electricity that have significant disruptive potential.


• Women are considerably more likely than men to report EHS symptoms. This would follow naturally from women having a greater degree of interconnectedness between the different parts of the brain (by way of a larger corpus callosum and other pathways) than men have.
• The mechanism by which EHS operates appears to be this: chronic EMR exposure leads to entrainment of neuronal discharges; entrainment of neuronal discharges leads to neuronal hypersynchrony (neuronal connections that operate faster and with greater sensitivity than the norm); and neuronal hypersynchrony results in a heightened sensitivity to a wide variety of stressors.

One of the subjects in the H.I.S.S. reference group who qualified as an Anomalously Sensitive Person, and whom I speak of as “Claire,” shared with me what it is like for her to be electromagnetically hypersensitive:

Thunderstorms … I get really frightened during thunderstorms, but before the storm breaks, I experience heightened awareness and feel very alive and alert. All external stimuli, especially colors and sounds, are extremely vivid. Once a thunderstorm starts, however, I find both the thunder and lightning to be very unnerving. My response to moving water is very similar to what happens before a thunderstorm. Everything is vivid; I’m hyper-alert. It’s a very positive experience—especially since it’s not followed by thunder and lightning. I really like waterfalls, bubbling brooks, and ocean waves.

I hate fluorescent lights! When I’m around them, I go a bit nuts. I get headaches, I get very tired, I get bad moods, I have trouble concentrating … and my stockings run—really, my stockings run. I don’t have a problem with incandescent lights, but they apparently have a problem with me.


Light bulbs seem to last for me about one-tenth as long as they last for other people. The situation is improved now, but in the past, I was constantly changing light bulbs. I could walk into a room, wouldn’t touch a thing, and a light bulb would blow. Also, the lights in a room could be off, I’d walk in, and one or more of them would turn on without my doing anything.

Power lines, the big transmission lines—I stay away from them. I go out of my way to avoid them. I don’t like them! And those boxes in a cage, those transformers or whatever they are … when I’m driving by one, I get tingling sensations, unusual metallic tastes, and strong feelings of anxiety.

Electrical and electronic machines tend to malfunction in my presence … except for those times when I seem somehow to fix them. A man who tried to repair my broken VCR said he couldn’t understand how it got so messed up inside, just by my pushing the buttons. I’ve also inadvertently destroyed cell phones, radios, televisions, and computers. But it works both ways. I can fix things as well as break them. When I worked in an office, it was sort of a joke—if a machine wasn’t working, somebody would come get me, I’d go look at it, and it would start working again. Given that I don’t understand anything about machines, that’s pretty funny. My computer used to crash all the time, but then I reached some kind of agreement with it. Now it works fine. The difference between my breaking things and my fixing things seems to be a matter of intentionality. I don’t mess things up intentionally—that happens spontaneously—but to get things not to break, or to get things that are broken to work again, I have to concentrate on positive intentions.



Chapter 6

Personal Experiences

People pretty much take electricity for granted and don’t often think about it. Minor electric shocks and brief power outages, which almost all of us have experienced, are inconvenient, but they are generally taken in stride and soon forgotten. Extended power outages (which some say are inevitable because of the increasing loads being placed on the power grid), however, are another story. They can affect an entire community, result in loss of life, lead to looting, and have economic consequences as well. An extended outage may disrupt communications, water, and transportation; close retail businesses, grocery stores, gas stations, ATMs, banks, and other services; cause food spoilage and water contamination; and prevent the use of home medical devices. Surprisingly, though, most of my friends (who have lived in small towns) have positive memories of extended power outages. They speak of the way the community pulled together, neighbors helped each other, and people even held impromptu block parties with gas torches and grills.



New York City during the August 14, 2003, power outage

Given that I, myself, have numerous memories of life events related to electricity, I assumed that would be the case for most folks, but that turned out not to be so. Other than memories of major power outages, most of the people I asked have few, if any, memories, either positive or negative, in which electricity was a noteworthy feature. It appears that men have more such experiences than do women, and that stands to reason: men are more likely to have served in the military, to be employed in jobs that involve electricity, to have hobbies that involve electricity, and to adopt the role of general handyman around the house. Moreover, men are more likely to be risk-takers than are women. Well, I’m a man; I served in the military, have had several electricity-related hobbies, have been a life-long handyman, and have lived in a number of isolated houses where I had to do everything myself. Two of my houses were subject to regular lightning strikes, and, while I’m not willingly a big risk-taker, I’m so intensely curious that I have often forgotten to take the potential risks into account. (The proverb “Curiosity [potentially] killed the cat” definitely applies to me.) Here are some of the electricity-related instances I recall from various times in my life:



My earliest memory of the electrical sort is from the time when I was 4–5 years old. I had gone with my family to my grandparents’ house for a formal Sunday luncheon. (It may have been a holiday, but all meals at my grandparents’ house were formal.) After lunch, I was told to go upstairs and take a nap in the guest room. I didn’t want to and refused, so I was compulsorily marched to the bedroom by my grandfather, a terrifying tyrant, who ordered me to get into bed and then left, closing the door. Fuming with anger, I looked for something destructive I could do to “get even.” I found a pair of scissors, spotted a standing lamp plugged into the wall, and, without a moment’s thought, cut the cord. There was a bright flash, a sharp “crack,” and a powerful shock delivered to my body. Terrified, I raced back to the bed and tried to hide under the covers. The adults began an investigation to see what had happened, and my father soon came into the room and started grilling me. I, of course, responded with innocence and denial. Then my father said, “Did it scare you, son?” I burst into tears and replied, “Yes, Daddy, yes.”

A couple of years after that, I became enthralled with model electric trains and battery-powered erector sets. In the attic of our house, we had a large, unused open room that I was permitted to use as my “hobby room.” For my trains, I set up a table consisting of two sheets of plywood painted green, resting on sawhorses, with a hinged trapdoor in the middle for accessibility. Ultimately, I had what I considered to be a “world-class” model train layout. Before long, however, I realized that I was more interested in the erector sets because they allowed for more flexibility and greater creativity.


After building all the gadgets that were illustrated in the instruction books, I set out in my own direction and began building fantastical “Rube Goldberg” devices. Some of them worked, some of them didn’t, but the projects allowed me to think of myself as an “inventor.” Desiring to further explore my inventing prowess, I got my parents to provide me with a table that I could use for setting up additional experiments. I placed this 6’x2’ “project table” next to my bed, and it was soon covered with a number of gadgets that I had either purchased or constructed. One of the earliest was a “shocking machine” (“guaranteed to be safe”). It was a battery-powered device that consisted of multiple coils of wire wrapped around a metal rod that could be moved to control the strength of the shock delivered. The “victim” held on to two handgrips and received a shock when the machine was turned on. The objective was to see how strong a shock could be tolerated. I took the machine in to my sixth-grade classroom and, with the teacher’s permission, had the students stand in a circle holding hands while a collective shock was delivered to all. One student declined to participate. If it had been a girl, it would have been no big deal, but it was a boy, and for weeks we teased him unmercifully about being a “chicken.”

Another project that I found quite fascinating was building a “cat’s whisker” crystal radio. It employed a crystalline mineral, galena, to detect radio waves, with a fine wire touching its surface and acting as a “detector.” The contact between the two dissimilar materials at the surface of the detector’s semiconducting crystal formed a crude diode, which acted as a rectifier, converting the radio waves from alternating current to a pulsing direct current, thereby extracting the audio signal. The audio-frequency current produced by the detector then passed through an earphone, causing the earphone’s diaphragm to vibrate, pushing on the air to create sound waves. It required some skill and a great deal of patience to operate, but it worked! I began to have fantasies of winning the Nobel Prize in Physics sometime in the future.
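The demodulation step described above can be illustrated numerically. The following sketch (mine, with arbitrary illustrative numbers, not anything from the author’s boyhood radio) builds an amplitude-modulated “radio” wave, rectifies it the way the galena crystal’s diode junction does, and then smooths it the way the earphone’s sluggish diaphragm does, leaving only the slowly varying audio envelope.

import numpy as np

fs = 200_000                                  # sampling rate in Hz (illustrative)
t = np.arange(0, 0.02, 1 / fs)                # 20 milliseconds of signal

audio = 0.5 * np.sin(2 * np.pi * 440 * t)     # the "program": a 440 Hz tone
carrier = np.sin(2 * np.pi * 20_000 * t)      # a 20 kHz "radio" carrier
am_wave = (1.0 + audio) * carrier             # the amplitude-modulated wave

# The crystal-and-whisker junction passes current in only one direction.
rectified = np.maximum(am_wave, 0.0)

# Averaging over one carrier cycle mimics the earphone's inability to follow
# the radio frequency; what survives is the audio envelope.
window = int(fs / 20_000)
envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
recovered = envelope - envelope.mean()        # strip the constant offset

# The recovered waveform tracks the original tone (up to an overall scale).
print(f"correlation with original audio: {np.corrcoef(recovered, audio)[0, 1]:.3f}")

Run as written, the correlation printed at the end comes out very close to 1, which is the numerical way of saying that rectification plus smoothing really does pull the audio back out of the radio wave.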


My best friend and next-door neighbor became interested in my research projects, and for Christmas our parents gave us both battery-powered futuristic toys that could be used in a number of different ways. What I recall most vividly were the different forms of communication they supported—Morse code, flashing light, and voice. We both set up our devices in our bedrooms and rigged up a connecting wire between our bedroom windows, which were some distance apart (probably at least 100 yards). It wasn’t long before we had used these toys in all the ways they were “supposed to be” used, so we began disassembling and reassembling them to see what other possible uses we could discover. One especially interesting finding was that the voice communication didn’t require the power of the battery for operation. The communication could occur between the two microphones connected by 100 yards of copper wire with no other components in the circuit. This is possible because dynamic microphones are transducers that work via electromagnetic induction. When sound enters the microphone, the sound wave moves a diaphragm. Attached to the diaphragm is a small movable induction coil positioned in the magnetic field of a permanent magnet. When the diaphragm vibrates, the coil moves in the magnetic field, producing a varying current in the coil, which can then be transmitted through a wire. On the receiving end, the process is reversed: in the induction coil, the electrical energy is converted to mechanical energy, which moves the diaphragm, thus producing acoustic energy (or sound).
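The principle at work in the sending microphone is Faraday’s law of induction, which (in the schematic form I would write it, not the author’s) says that the voltage generated across the moving coil is

\[
\varepsilon = -N \, \frac{d\Phi_B}{dt},
\]

where \(N\) is the number of turns in the coil and \(\Phi_B\) is the magnetic flux passing through it. At the receiving end the complementary motor effect applies: the arriving current, flowing through a coil sitting in a magnetic field, produces a force that moves the diaphragm. Both effects draw only on the energy in the sound itself, which is why no battery is required.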


Full of excitement and enthusiasm, I told my father of our discovery, but he wouldn’t believe me. I showed him the setup and explained to him, as best I could, the physics involved, but he continued to insist resolutely that there had to be battery power involved somewhere. Much to my regret, I was never able to convince him.

And then there was the pièce de résistance, a real honest-to-goodness vacuum tube radio complete with flashing lights, accurate tuning, quality sound reception, and amplification. Initially, building it was beyond my capabilities, but with the encouragement and assistance of the staff at Madisonville Hardware and Electric Supply, I finally got it finished and working. Even though it had amplified speaker capability, I always listened to it using headphones because this was my private world that I didn’t want to share with anyone else. At night, after I had gone to bed and my parents thought I was sound asleep, I would put on the earphones and tune into western programs such as Roy Rogers, Tom Mix, or The Lone Ranger. I actually preferred the radio programs to their television counterparts because, facilitated by my imagination, “the pictures were better.”

At the age of 15, I was sent away for three years of secondary school and then went on to college for another five years. During those eight years, almost all my time was spent studying, and I didn’t make time for my hobbies. After graduating from college, I got married, was commissioned as an officer in the United States Navy, and was sent to sea.

In the Navy, aboard ship, one can’t get away from electricity—it’s everywhere! One is always surrounded by humming equipment, flashing lights, and beeping alerts. As a young officer, my responsibility was for the engine room and the boiler room aboard the USS John Willis (DE 1027). Those are probably the two spaces aboard ship where the primary focus of attention is least likely to be on electricity. Nevertheless, my machinist’s mates were responsible for operating and maintaining the generators that provided electric power to the rest of the ship. When the generators go down, a crisis is at hand.



One night, as the ship was steaming along, I was asleep in my bunk, having been lulled into slumber by the rhythmic sounds of the ship’s engines. Then something changed—there was silence. There are few things as quiet as a ship at sea without power, especially when one has become accustomed to the omnipresent rumbling. I rolled out of my bunk, threw on my clothes, and raced to the engine room. My men had everything well in hand and were getting ready to bring the generators back on line. The power outage lasted no more than 10 minutes, so a crisis was averted, and the ship was soon underway again. Since then, I’ve never forgotten how dependent on electricity we can become.

During joint naval exercises our ship was conducting with the Colombian Navy, we spent a few days tied up at a pier in Cartagena. We were all invited to a lavish beach party the Colombian Navy held on our behalf. While I don’t much care for such things, as a junior officer I was required to go along, essentially in the role of chaperone. After some of the men went swimming, they were changing their clothes in an electrical equipment room, and one man unwittingly threw his wet beach towel over an uninsulated 440-volt power line. He received a massive shock and was thrown to the floor. Another man came to get me, and I used the public-address system to request the assistance of any medically trained people in attendance. The man who had been shocked appeared to be in very bad shape, perhaps even dead. He had holes burned through his skin down one arm, the side of his torso, and one leg. The men took a door off its hinges and used it as a stretcher to carry the victim to a nearby bus.


The bus driver headed to the nearest hospital at very high speed, sideswiping a couple of cars on the way, and the injured man was given cardiac resuscitation throughout the entire ride. Unfortunately, he was pronounced dead on arrival at the hospital. Such an incident requires an enormous amount of administrative paperwork and, given that I had been the officer on the scene (and the man had been in my division as well), the paperwork was my responsibility and required my attention for a couple of days.

In 1969–1970, I spent a year in-country in Vietnam. I was operations officer at my first duty station and was responsible for planning and coordinating “Swift Boat” raids into enemy territory. Our headquarters was in the beautiful old French beach resort town of Vung Tau, a place where there was almost no active fighting. One evening, several of our officers were going out to dinner at an elegant French restaurant located atop a hill with a lovely view of the ocean and the coastline to the south. They invited me to join them, but I demurred, saying that I had a boat operation to run. They countered that running the operation was the job of the on-duty watch officer, that I only needed to be available to provide guidance should problems arise, and that I could do that if I took along a portable radio for communication with the watch officer. I happily agreed.

At the restaurant, we settled in on its patio as dusk fell and ordered burgundy wine and escargot to begin our feast. Shortly thereafter, the sky over the coastline 10–20 miles to the south was illuminated by flashes from explosions and, knowing that the anticipated battle had begun, I turned on the radio and put on earphones so my colleagues would not be disturbed. The firefight (in which we prevailed without casualties) lasted about 30 minutes. My input was needed only a couple of times, and I then rejoined my companions as we made our way through a delicious seven-course meal. All in all, it was a most surreal experience.



At another in-country duty station, I was officer-in-charge of a Naval Liaison Element attached to the Special Forces on the Cambodian border. Everything we did was by way of radio communication with various disparate units. If we had no electricity, that meant no radios; no radios meant no communication; and no communication meant no Naval Liaison Element. Plus, there was another factor involved: in order to continue functioning, the radios had to be kept at a constant temperature of 65°F, and that required an air conditioner. So again: no air conditioner, no radios, no communication, and no Naval Liaison Element. One day, our air conditioner stopped working. I contacted everyone I could think of to get a replacement, but they all responded in much the same way as the organization to which I was attached: “Everyone in Vietnam wants an air conditioner for one reason or another. There aren’t enough to go around, so forget it.” We kept operating as usual and, after about 12 hours, one of our four radios went down. I sent out a message to all concerned stating as much and said that when a second radio failed, I would discontinue operations so that we would still have two operable radios in the event of an emergency. (We could still be reached by telephone.) As expected, the second radio went down about 24 hours after the air conditioner stopped working, and I sent out a message stating that we were shutting down operations. Not surprisingly, about 12 hours later, a new air conditioner and two new radios arrived at our communications bunker. Once they were put into place and powered up, everything returned to normal.


At that same duty station, we were responsible for monitoring electronic sensors that had been placed in an area as part of a Top-Secret research project that used magnetic sensors, motion sensors, and auditory sensors to detect the presence of humans. When we received a signal that a sensor had been activated, we called by radio for an artillery barrage or an airstrike on its location. Since the only information the sensors provided to us was in the form of electronic signals, we couldn’t differentiate friend from foe, and the test zone was declared off limits to all friendly personnel. Given the highly classified nature of the project, I, of course, couldn’t explain to other unit commanders why they were prohibited from sending troops into the area. The project was quite successful, and the number of enemy KIAs that we regularly reported began to catch people’s attention. One Army unit commander (who was several ranks my senior), wanting some of the “glory” for himself, decided, in violation of protocol, to send some of his troops into the prohibited area. The sensors did their job, we did our job, and 10 American soldiers died because of this man’s ego and vanity. I was heartbroken and furious, and I began calling for an investigation. The “culprit” ordered me to appear in his office and, when I did so, he was sitting at his desk flanked by two burly Army sergeants. He ordered me to back off and said that if I didn’t, he would have one of the sergeants roll a live grenade into my hooch in the middle of the night. Despite concerns about my honor and integrity, I obeyed his order.

After returning to the US and being discharged from the Navy, I was a mess psychologically—depressed, angry, fearful, confused, unmotivated, and estranged from other people—presumably suffering from post-traumatic stress disorder (PTSD). I told my wife I needed to get a divorce, moved out of our house, and purchased a decrepit old farmhouse out in the middle of nowhere.



During the subsequent years, I lived, usually alone, in four houses in remote locations with frequent power outages. Believing strongly in the Boy Scouts’ motto of “Be Prepared,” I always did just that—laying in supplies of emergency equipment, freeze-dried food, and, of course, a portable electrical generator. Emergency generators are a bigger undertaking than one might expect. They’re expensive, the house needs to be extensively rewired for them to be employed, the machines need to be regularly maintained (even if not used) and tested, the stored gasoline needs to be regularly replaced, and they can be dangerous. What is more, their power output is generally sufficient to operate only a limited number of systems in the house—usually heating, water, refrigeration, and a few lights. So, prepared I was, and, as anticipated, multiple power outages occurred over the years. Since the whole rigmarole of setting up the generator and putting it into operation is a real pain in the posterior, I would normally wait an hour or two to see if the regular power would come back on by itself. Usually it did. There were a couple of times when the power was out longer than that, so I would go through the complete drill, put the system into operation, and congratulate myself on my foresight. Invariably, the regular power would come back on within another couple of hours, which, of course, was a relief; but in all those years, I never once really got to experience the full benefits of my preparedness.

Two of my properties were also frequent targets for lightning strikes. The first of these was an historic old farmhouse built on a rise near the Delaware River in New Jersey. Next to one corner of the house there stood a massive oak tree that showed signs of having been struck by lightning multiple times in the past.


The house itself was protected by an extensive array of lightning rods, but lightning would go to ground through them, through the tree, or by striking the ground directly nearby. Dealing with the repair or replacement of various electrical/electronic components that had gotten “fried”—notably the well pump, the alarm system, and radios/televisions (this was before I had a computer)—became something of a routine for me.

One time, at that same house, I was up in the attic during a thunderstorm looking for something that I had misplaced. My attention was drawn to a ball of blue light, about the size of a bowling ball, that leisurely entered the house through the exterior stone wall, drifted slowly and silently around the room, bounced off another wall, and then silently blinked out. Interestingly, this event did not arouse any fear in me; I was simply curious. Until the 1960s, most scientists treated reports of ball lightning skeptically, despite numerous anecdotal accounts from around the world. Scientific data on natural ball lightning remain scarce, owing to the phenomenon’s infrequency and unpredictability. Given the unreliability of public sightings and inconsistencies in the data, the true nature of ball lightning remains unknown.

The other property that was a target for lightning was a newly built house on a hilltop in Vermont. One day, during a lightning storm, I settled into a chair on the screened-in porch to watch the show. One bolt of lightning struck a tree on the edge of the lawn about 100 yards away. The top half of the tree was rent in two; one portion of it rose into the air, moved sideways about 20 feet toward the lawn, and came crashing to the ground. It was quite a dramatic show.

Another lightning incident at that house, one that was quite unsettling, occurred one afternoon in July of 1995.


I was talking on the telephone during a violent lightning storm when one bolt came to ground about 150 feet from the house, where the telephone wires left the pole and went underground. Power was lost, but the telephone continued to work. Shaken, I hung up the phone, realizing I could have been killed. Two carpenters who had been working in the garage at the time reported that the strike had generated a ball of blue light, which had entered the garage, passed between them, bounced off the back wall, exited the garage, and gone to ground in the driveway. What made the incident particularly disturbing was the identity of the person who had called me. He was a well-known researcher who was said by conspiracy theorists to be doing work for a (government) cabal that was exploring the use of electromagnetic energy in both mind control and weather control. (The technologies for both were apparently quite similar.) The cabal was said to engage in “dirty tricks” to prevent researchers from looking into what they were doing. I had, in fact, been looking into issues of mind control, and the man had said that he was calling to refute a statement I had made in an article that I had written about six months previously. I didn’t know the man personally, and we had never had any prior contact, so I found it quite odd that he would call me “out of the clear blue sky” to take issue with what I considered to be a very trivial point. I couldn’t help but wonder if the conjoining of events might have been something more than a coincidence.

Over the last century, a variety of electrical/electronic devices have been invented and employed in the medical field. I have a number of memories of being a patient on whom such devices were used.

• During the post-Vietnam period, while I was still experiencing an enormous amount of stress from both PTSD and my divorce, I was one day struck by intense vertigo and vomiting, and the symptoms didn’t go away.


My doctor suspected a brain tumor and had me admitted to the hospital. There they ran test after test, the results of all of which appeared to support the brain tumor diagnosis. However, when it came to the most telling of all the diagnostic tests, the electroencephalogram (EEG), there were problems, and the test couldn’t be completed because of equipment malfunctions. I was sent back to my room and told the test would be rescheduled for the next day, after repairs had been completed. An EEG uses electrodes on the scalp to measure patterns of brainwaves. Today the procedure is noninvasive, with the electrodes being attached to the scalp by suction cups or some other such means; but back then, the electrodes—18 of them—were inserted into the scalp by way of needles. Well, I hate needles! I just wanted to get away … so I did—not physically, but psychologically. Summoning up all my powers of imagination, I had my mind leave the room and become a seagull soaring over the Pacific Coast of California. It was a glorious, beautiful, and liberating experience, and I was fine, having completely forgotten about the ongoing test. That was when the technician said the equipment was malfunctioning. Later, in thinking about the procedure, I wondered if my “escaping” had somehow caused the equipment to appear to be not working. The next day the procedure was begun again. I “escaped” again, the technician inserted the needles again, and said the equipment was malfunctioning again. At the time, I didn’t know much about brainwave patterns, but I did know that what I had been doing could produce a preponderance of alpha and theta brainwaves (roughly the 8–12 Hz and 4–8 Hz bands, respectively). I asked the technician if the apparent malfunction had to do with the readouts being swamped by such waves and, when she replied in the affirmative, I discontinued my escape efforts because all the needles were already in place and there was no more pain.



Probably unaware of my active involvement, she heaved a sigh of relief and told me that the machine was once again working properly. She got her readouts, and when the radiologist reviewed them, he stated that there was absolutely no neurological evidence of a brain tumor. Apparently, the stress that I had been undergoing had produced symptoms that mimicked those of a brain tumor. I was discharged and sent home, and the symptoms slowly abated over a period of about three months. Those were a difficult three months.

In the last few decades, with electronic miniaturization and the increased stress on hospitals and other health care facilities, patients are being released from hospitals while still needing care. As a consequence, laypeople are now using technological devices at home to manage their own health care more conveniently and independently (and inexpensively). For example, a wide variety of blood and urine testing kits are available that detect different chemicals and conditions. Various types of monitors and meters are also available to measure health status indicators, such as blood pressure or blood glucose level. Newer consumer devices include ones that measure blood coagulation for people taking blood-thinning medications, blood oxygen levels, and sleep apnea. There are also several implantable devices, including cardioverter defibrillators, artificial pacemakers, electrical stimulators, active drug delivery devices, and cochlear implants. I, myself, use four electronic medical devices on a regular outpatient basis: a blood pressure monitor, a PT/INR monitoring device for measuring blood coagulation rate, a continuous positive airway pressure (CPAP) ventilator to deal with sleep apnea, and an implanted artificial cardiac pacemaker to handle irregular heart rhythms.


○ The Blood Pressure Monitor is a very simple device to use. While sitting, one simply wraps the cuff around an arm, positions it properly, and presses the start button. The machine pumps up the cuff and then allows it to slowly deflate. After a few moments, systolic pressure, diastolic pressure, and heart rate are displayed on the screen. That’s all there is to it.

○ The PT/INR Monitoring Device is a bit more challenging. After turning on the machine, one has to prick a finger with a lancet to draw blood (the “ouch factor” can inhibit compliance). The blood sample then needs to be delivered to exactly the right spot on a test strip—a procedure that requires patience and good eye/hand coordination; error messages are common. Once a reading is obtained, the user reports the results to a central monitoring station, which then reports them to the health care provider, who then contacts the patient with new dosing instructions, if necessary.

○ The Continuous Positive Airway Pressure (CPAP) ventilator utilizes mild air pressure to keep the airways continuously open in people who have obstructive sleep apnea (OSA) and whose airways would otherwise repeatedly narrow or close during sleep. OSA can cause shallow breathing or frequent nighttime awakening and result in excessive daytime sleepiness. If left untreated, obstructive sleep apnea can increase the risk of health problems, including headaches, depression, worsening of ADHD, diabetes, high blood pressure, stroke, heart failure, irregular heartbeats, and heart attacks.


Prior to the invention of the CPAP machine, the most common therapy for severe OSA was surgery, but surgery didn’t work at all for me. I started CPAP treatment more than 25 years ago and am now using a more effective and flexible bi-level positive airway pressure (BPAP) machine, through which supplementary oxygen, provided by an electrically powered oxygen generator, is also supplied. Quite a few patients find ventilators inconvenient or annoying and do not comply with their therapy instructions, but I experience mine as a “security blanket” and use it every moment that my eyes are closed.

○ The Artificial Cardiac Pacemaker is an implantable electronic medical device that generates electrical impulses to regulate the pace of the heart’s beat. Pacemakers are commonly used in the treatment of atrial fibrillation, an abnormal heart rhythm characterized by rapid and irregular beating of the two upper chambers of the heart, known as the atria. Besides problems with the heart’s natural pacemaker, other physiological conditions associated with atrial fibrillation include high blood pressure, valvular heart disease, congestive heart failure (CHF), diabetes mellitus, overactive thyroid, obstructive sleep apnea (OSA), and chronic obstructive pulmonary disease (COPD). An artificial pacemaker is implanted by way of a relatively minor surgical procedure in which one or more pacing electrodes are placed transvenously within a chamber, or chambers, of the heart (most commonly the right ventricle), while the pacemaker itself is implanted under the skin below the clavicle. Once the pacemaker is in place, it is periodically checked to ensure that the device is operational and performing properly.


Even with this treatment, which involves controlling the ventricular rate, the atria continue to fibrillate, which increases the likelihood of developing blood clots that can travel to the brain and cause a stroke. For this reason, anticoagulation medication is typically prescribed.



Index

A
alternating current [AC] 16, 18, 19, 31, 32, 33, 34, 35, 36, 39, 40, 41, 93
artificial intelligence [AI] 78, 79

B
battery 13, 14, 56, 57, 58, 59, 61, 62, 63, 92, 93, 94, 95
Bell, Alexander Graham 16, 17, 18, 20, 26, 27, 28, 29, 30

C
cancer 25, 48
communication 18, 29, 47, 70, 94, 97, 98
compute 66, 71, 72, 73, 74, 76, 77, 78, 89, 101

D
direct current [DC] 22, 23, 31, 33, 39, 93
dynamo 17, 23

E
Edison, Thomas 16, 18, 20, 28, 29, 30, 31, 32, 33, 34, 39, 40
Einstein, Albert 13, 20, 25, 42, 43, 44, 45, 46
electric animals 5, 10
electromagnetism 13, 15, 25, 45, 82
electric motor 13, 15, 16, 17, 23, 34, 59, 63
electroencephalogram [EEG] 69, 87, 103


F
Faraday, Michael 13, 15, 16, 20, 22, 23
fire 5, 6
fluorescent 83, 88
Franklin, Benjamin 6, 13, 20, 21, 22

G
generate 5, 11, 18, 23, 29, 32, 33, 35, 36, 39, 40, 48, 49, 50, 51, 59, 60, 66, 67, 73, 75, 76, 87, 100, 106

H
hypersensitivity [EHS] 83, 84, 86, 87, 88

I
induction 17, 18, 22, 23, 33, 34, 40, 94
integrated circuit [IC] 7, 56, 74
ionize 82

L
Leyden jar 10, 21
light bulb 16, 18, 29, 30, 89
lightning 5, 6, 7, 8, 9, 13, 21, 22, 36, 88, 91, 100, 101, 102

M
Maxwell, James Clerk 13, 15, 16, 20, 23, 24, 25
medical 48, 56, 68, 69, 70, 82, 83, 84, 87, 90, 102, 104, 106

N
nuclear 15, 45, 49, 52, 66, 67
neurons 13, 81, 86, 88


O
outage 36, 48, 90, 96

P
power 6, 14, 18, 23, 29, 31, 33, 34, 35, 36, 37, 39, 40, 41, 48, 49, 51, 52, 53, 56, 59, 60, 61, 62, 63, 64, 65, 66, 67, 73, 74, 82, 90, 91, 94, 95, 96, 100

R
radiation [EMR] 16, 44, 48, 67, 68, 81, 82, 83, 85, 86, 87, 88
radio 35, 36, 70, 82, 93, 95, 97, 98, 99

S
shock 7, 11, 12, 87, 92, 93, 96
solar 45, 50, 59, 60, 61, 62, 63, 64, 66, 67, 86
spacecraft 56, 64, 65, 66, 67
static electricity 5, 9, 10

T
telephone 16, 17, 18, 27, 28, 29, 39, 48, 54, 70, 98, 101, 102
Tesla, Nikola 16, 18, 19, 20, 32, 33, 34, 35, 36, 37, 38, 40, 41, 57
transistor 54, 55, 56, 73
transportation 16, 56, 90

V
vacuum tube 54, 95

W
Westinghouse, George 16, 20, 31, 32, 34, 35, 38, 39, 40, 41, 42
World Wide Web 77



Books By David Ritchey (Note: * = literary award)

The H.I.S.S. of the A.S.P. (Jun. 2003)
The Magic of Digital Fine Art Photography (Dec. 2010)
26 Card Tricks (Feb. 2011)
Something About Scrabble (Oct. 2011)
Why We Are Fascinated by Dogs (Feb. 2012)
A Sense of Betrayal (Aug. 2012)
Reviewing the Montauk Legend (Feb. 2013) **
Presidents in the Crosshairs (Sep. 2013) *
Descended from the Gods? (Jan. 2014) ****
Those Who Know the Wyrd (Feb. 2014)
Tales from the Depths (Jun. 2014) *
Understanding the Anomalously Sensitive Person (Oct. 2014) **
Keep the Colors Flying (Mar. 2015) *
The Deadliest Pandemic (May. 2015) *
Locked and Loaded (Aug. 2015) *
On Conflict (Sep. 2015)
Pyramidal Mystique (Mar. 2016) *
The Enigma of Baalbek (Apr. 2016)
American Demagogues (Jun. 2016)
Invitations to War (Aug. 2016)
From Aardvarks to Zyzzyvas (Oct. 2016) *
Enduring American Mysteries (Oct. 2016)
Pyramids of Fire (Dec. 2016) *
Noteworthy UFO Cases (Feb. 2017) *
One at a Time or all at Once (Mar. 2017) *
Spies Uncovered (Apr. 2017) *
Geniuses Among Us (May. 2017) *


Pumped Up (Oct. 2017)
A Brief History of Hurricanes (Dec. 2017)
What Is Truth? (Dec. 2017)
Transportation Disasters (Feb. 2018)
Up to the Eaves: Major Snowstorms (Apr. 2018) *
Everybody Loves Conspiracy Theories (May. 2018) **
The Automobile: An American Cultural Icon (Jun. 2018) *
Are We Ready for Artificial Intelligence? (Jul. 2018)
They Say It’s Impossible (Oct. 2018)
Noble New Nation (Jan. 2019)
Confronting the Earth’s Tribulations (May. 2019)
Coming to Our Senses (Jul. 2019)
Popular Primary Pets (Jul. 2019)
Interesting Numbers for Interested Folks (2019)
Looking for a Few Good Critters: Marine Mammals (2020)
It’s About Time: A Temporal Exploration (2020)
Metaphysical Experiences: Are They Real? (2020)
Sailing Through the Ages (2020)
Rising to the Challenge of Space Flight (2020)

111


About The Author

David Ritchey’s vocations have included: naval officer, businessman, fine art photographer, psychotherapist, researcher, and writer. His avocations have included: scuba diving, sailing, skiing, tennis, golf, gardening, woodworking, dogs, magic, bridge, and SCRABBLE™. After being educated in economics at Yale University, he served for five years as an officer in the U.S. Navy, including a year in Vietnam. Back in civilian life, he initially became a businessman, but shortly thereafter followed his true inclinations and became a fine art photographer. While immersed in the art world, he became fascinated by the psychology/neurology of creativity and returned to school to train as a psychotherapist. During his 15 years of clinical practice, specializing in hypnotherapy, he became especially interested in the psychodynamics of those clients who reported having had transpersonal (“metaphysical”) experiences, and he undertook a twelve-year project of researching and writing about such people, whom he speaks of as “Anomalously Sensitive Persons.” Later, he became his daughter’s business manager at her art gallery on Cape Cod and spent a few years involved simultaneously in the worlds of both business and art. Now “retired,” he spends his time writing about a wide range of subjects that are of special interest to him. Information about his books can be found at www.davidritchey-author.com. He currently lives in historic Bucks County, Pennsylvania. He has two grown children, Harper and Mac, and a grandson, Brendan.
112




