JOURNYS Issue 2.4


Vol 2, No 4 SPECIAL ISSUE

VISIT US AT WWW.FALCONIUM.ORG

A DECADE IN SCIENCE


Y2K 9/11 TOUMAI HGP SATURN THE SCHIAVO CASE PLUTO AN INCONVENIENT TRUTH SYNTHETIC LIFE LUCY

GRAPHIC BY JESSICA ZENG


Falconium Science Journal invites bright and inquisitive high school students to write and submit scientific articles for publication. Articles are accepted on a rolling basis and published quarterly, both online and in print, for a widespread peer audience. All submissions are accepted through www.falconium.org. Articles should fall into one of the following three categories:

ORIGINAL RESEARCH

This is a documentation of an experiment or survey you did yourself. You are encouraged to bring in relevant outside knowledge so long as you clearly state your sources. The article must be between 1,000 and 2,500 words and contain the headers Introduction, Methods, Results, and Discussion. Original research articles will be scored on the basis of originality, scientific validity, and the appeal of the research topic to a broad audience. Clarity of writing, conciseness, and accessibility to all readers will also be considered.

REVIEW

A review is a balanced, informative analysis of a current issue or event in science and technology, or in society and politics as it relates to science. A review is based on information from experts and the media, but includes the author’s insights and commentary. The article must be between 750 and 2,000 words. Reviews will be scored on the basis of depth of analysis, level of insight, journalistic style, and the appeal of the subject to a broad audience.

OP-ED

An op-ed is a persuasive article or a statement of opinion. All op-ed articles make one or more claims and support them with evidence, arguments, or quotations. The article must be between 750 and 1,500 words. Op-ed articles will be scored on how well supported, interesting, and effective they are. Please feel free to contact us with any questions or comments. If you feel compelled to donate to the Falconium organization, more information can be obtained via email.

WEBSITE: www.falconium.org EMAIL: info@falconium.org

MAILING: Torrey Pines High School, Falconium Science Journal, Attn: Brinn Belyea, 3710 Del Mar Heights Road, San Diego, CA 92130

SPONSORS:


letters to the editor

glioblastoma BY ANGELA ZOU

the lhc: first report BY LAUREN SWEET

the physics of breakdancing BY MICHAEL DANG

bulbous bulbs & other mathematical beauties BY ALBERT CHEN

the dreaded shots BY REBECCA KUAN

acetylsalicylic acid (aspirin) BY EMILY CAI

COVER STORY: science of the decade BY MURONG HE, MICHELLE KAO, LING JING, AND SUMANA MAHATA

4-d microscopy BY SIDDHARTHO BHATTACHARYA

is a dose of dirt the best medicine for allergies? BY PAUL HO

a good night’s rest BY SARAH WATANASKUL

microwaves and cancer BY SARAH HSU AND SARA SHU

neglected tropical diseases

a study of the 2007 wildfires

NOT THERE (part 1)

BY ELORA LOPES & MELODYANNE CHENG

negative effects of seawater air conditioning

BY ALICE FANG

BY ELIZABETH BRAJEVICH

BY ANGELA QIAN


Regarding the article “False Alarm? The Truth about Manmade Global Warming” (Spring 2010), clearly there is a lack of understanding (and maybe acceptance) of the scientific process as well as our current understanding of global climate systems. For instance, one of the most widely accepted theories in climate science is the greenhouse effect (related to the re-radiance of IR as dipole moments shift in greenhouse gases), yet the review suggests that this is controversial. Further, every major national scientific organization has written position statements regarding human impact on global climate systems, all stating that the evidence is overwhelming. These position statements go further in expressing the urgency of taking actions that will mitigate the negative economic and ecological impacts already projected to occur because of thermal inertia, and of preventing the further negative impacts projected should we remain complacent about the hacking down of forests and emissions from the use of fossil fuels.

As the article correctly indicates, any and all ideas that have scientific validity should be considered and tested. However, in order to gain acceptance as a scientific theory, they must withstand testing and be supported by several lines of evidence. Indeed, many of the claims representing healthy skepticism of anthropogenic climate change have either been rejected on the basis of lack of evidential support or logic, or are in the infant stages of development. The danger of the popular media (and this article, apparently un-reviewed by experts) reporting on fringe and junk science is that such reports suggest to the “lay person” that there is controversy within the scientific community, ultimately resulting in inaction and relaxed or misinformed environmental policies.

The article says it all when it cites the “non-governmental international panel on climate change” as being backed by the oil industry. Whether this statement is valid or not, the NIPCC’s most recent article is published by the Heartland Institute, with a “mission…to discover, develop, and promote free-market solutions to social and economic problems.” For the high school science student or teacher, a general rule in evaluating claims of a scientific nature and their sources is to ask who funded the research. When the underwriter represents a conflict of interest, namely, stands to lose profit, the results should be viewed as extremely suspect. Indeed, as viewed by many, there is no greater enemy of the free market than environmental policies that regulate industries’ extraction of resources and disposal of waste. Heartland may be a non-profit organization, but its mission is to support the for-profit free market.

No one hopes more than I do that we come across new evidence showing that our activities on this planet are having no negative impacts on climate, biodiversity, and the like. But to write an article with unfounded and extremely erroneous claims, with no valid primary research cited, and then publish it as a “scientific review” on a website is a disservice to the future of humanity.

Ruben Zamora Edinburg, TX

NOOR AL-ALUSI, AUTHOR

The fact is that there are competing perspectives on manmade global warming that claim to be scientifically based, and most students are already aware of that. The intention of this article was to clarify both sides of the popular debate on global warming, regardless of how evenly favored they may be in the scientific community, to allow students to understand where each side stems from. This article is simply explaining the conflicting conclusions of the IPCC and the NIPCC (and organizations with similar perspectives on global warming) and the studies those conclusions were derived from. The article was not intended to explicitly speak to which side is more scientifically legitimate, but it does discuss possible ulterior motives of the NIPCC relating to politics and economics. Despite the minor role global warming skeptics may play in the scientific community, we felt that this article was worthy of publication because of the weight this discussion carries in politics and economics and because it inherently relies on science. What separates this article from “junk science” is that it cites the studies behind each of the claims and qualifies each conclusion by noting possible political interests and by representing the opposite side.

The intention of this article was not to encourage “lay” readers to adopt one perspective or the other, but rather to encourage them to seek more knowledge on the subject. The controversy surrounding this article should inspire readers to further research the debate, investigate studies, and decide what is right. It may also inspire students to further study the field of environmental science so that they can contribute to the understanding of the current climate situation and put to rest any skepticism of theories regarding climate change. This article was written with a strong educational purpose in mind, and my only regret is not making this purpose clearer in the article itself.

LING JING, EDITOR-IN-CHIEF

Falconium publishes op-eds on current controversial issues in the interest of fostering debate and discussion among readers. The opinions expressed in these articles do not represent the views of Falconium Science Journal, its sponsors, or its editors. We apologize for the lack of clarity in specifying this article’s purpose. In the future, we will be sure to provide viewpoints from both sides of a controversy to ensure that we present a balanced picture of the issue.



GLIOBLASTOMA BY ANGELA ZOU

GRAPHIC BY ANGELA ZOU

George Gershwin. Lee Atwater. Ted Kennedy. These are just a few of the more well-known people who have succumbed to a form of brain cancer known as glioblastoma multiforme. Accounting for almost 23% of all brain cancer cases in the U.S., glioblastoma multiforme is notorious for its ability to stealthily produce the most malignant of all brain tumors known to mankind; diagnoses of this condition are extremely devastating for patients and their families because patients tend to survive only 12 to 14 months on average, even with immediate treatment.

Glioblastoma is one of the most aggressive forms of astrocytoma, a cancer that originates from star-shaped cells called astrocytes. Astrocytes are part of the glial tissue, which supports the brain and enables nerve cells to function. Glioblastomas typically contain regions of dead cells surrounded by a pseudopalisading array of anaplastic cells, or cells that have experienced a reversal of cell differentiation. Meanwhile, conditions such as neovascularity (abnormal blood vessel growth) and hemorrhage (blood loss) have a tendency to develop as well. Different tumors will often have varied appearances due to factors such as the amount of hemorrhage and the age of the tumor. Symptoms of glioblastoma vary depending on the location of the tumor and the regions of the brain involved. Common signs include headaches, vision loss, epileptic seizures, and decreased motor control. Such symptoms, however, often manifest only in the advanced stages of glioblastoma, and even then may be attributed to other causes such as stress.

Before several medical and scientific breakthroughs, glioblastoma patients were doomed to a lifespan of around two to four months; with various modern treatment methods, however, the average life expectancy of a patient diagnosed with glioblastoma has increased to around one year. Like symptoms, treatments are also based on the characteristics of the tumor, such as its size and rate of growth. In the standard treatment procedure, a patient first undergoes surgery, during which a sample of the tumor is taken for lab analysis. This is then followed by chemotherapy, which is often combined with radiotherapy. Intensity modulated radiation therapy (IMRT) is an example of the latter, in which doses of radiation are administered over time through linear accelerator machines. Generally, however, these methods are ultimately ineffective in treating glioblastoma, for glioblastoma tumors are typically very sophisticated and may consist of several cell types. In many cases they can develop resistance to treatment over time; while one cell type may respond to a specific procedure, other cell types may continue multiplying, allowing the tumor to continue growing. Since the brain is very susceptible to damage, large doses of radiation and drugs cannot be applied, greatly limiting treatment methods.

Scientists have long attempted to identify the precise cause of glioblastoma, but only recently have they done so. In the December 2009 edition of the journal Nature, researchers reported the discovery of two genes, C/EBP and Stat3, which seem to work together to cause glioblastoma. The Stat3 and C/EBP gene products are each responsible for regulating the growth of cancer stem cells, as well as other processes including metastasis (the spread of cancer to different organs), angiogenesis (the growth of new blood vessels that feed the tumor), and tissue invasion. When these genes are inhibited, they prevent the development of cancer, but when they are activated, they wreak tremendous damage upon the host. In studies of many human tumors throughout the body, Stat3 is one of the activated genes. While either gene activated on its own does little harm, both together allow expression of other cancer-causing genes. Researchers found that about 60 percent of patients diagnosed with glioblastoma displayed activation of the pair of genes and faced especially bad outcomes. In a year-long study at Columbia University’s Medical Center, all of the patients with both C/EBP and Stat3 activated died. Dr. Antonio Iavarone of Columbia University noted that these findings were “…remarkable given that it’s based on [just] the activity of two genes.” He later added, “These are…master regulators of the most aggressive phenotype of brain tumor…We have found the real driver making the tumors.”

Dr. Iavarone and other researchers hope that by identifying specific triggers of glioblastoma, they can begin exploring and experimenting with more effective treatments, namely gene therapy, in which the two activated genes causing glioblastoma would be replaced with inactivated versions. They have already met some early success. In a side study, scientists turned off the C/EBP and Stat3 genes in glioblastoma cells and then inserted these modified cancer cells into mice; amazingly, glioblastoma cancer did not grow. With such blinding advances in technology and knowledge, as well as hope, glioblastoma multiforme could very well become curable in future years.

References
Lobera, Alex. “Glioblastoma Multiforme.” eMedicine. 26 Aug. 2009. Web. 7 Jan. 2010. <http://emedicine.medscape.com/article/340870-overview>.
<http://cedars-sinai.edu/5298.html>.
Gardner, Amanda. “Two Genes Work in Tandem to Spur Deadliest Brain Cancer.” 23 Dec. 2009. Web. 6 Jan. 2010. <http://news.yahoo.com/s/hsn/20091223/hl_hsn/twogenesworkintandemtospurdeadliestbraincancer>.
“Glioblastomas.” IRSA. Web. 6 Jan. 2010. <http://www.irsa.org/glioblastoma.html>.
“STAT3 Gene Regulates Cancer Stem Cells In Brain Cancer.” ScienceDaily.



THE LHC: FIRST REPORT

BY LAUREN SWEET

GRAPHIC BY WENDY ZHANG

After a quarter-century of planning and construction, the CERN (European Organization for Nuclear Research) Large Hadron Collider has finally run its first successful test, ahead of schedule. In early December 2009, the team running the tests at the CERN collider announced the results of the September 23, 2009 experiment. The results, while not groundbreaking, establish the possibility of great discoveries from this new, powerful machine.

The Large Hadron Collider (LHC) is a particle accelerator built on the border between France and Switzerland. The largest in existence, it has a collision tunnel measuring about 17 miles in circumference. By shooting particles into each other at almost 99.99% of the speed of light, physicists study collision reactions and learn more about the fundamental subatomic particles.

For the first successful run of the LHC, scientists did a proton-proton collision. Firing groups of protons at 450 billion electron volts (GeV), they crashed the clusters into one another in all four of the LHC’s detectors with a resulting energy of 900 GeV. ALICE (A Large Ion Collider Experiment) published the results and analysis of the test in Springer’s European Physical Journal C.

This was the first proton-proton collision yet measured at 900 GeV. Previous proton-anti-proton collisions have been measured at 900 GeV, but not proton-proton ones. While the accelerator is capable of much higher energies, this modest collision was a benchmark for future experiments. The data collected by ALICE and the other detectors is consistent with that seen in proton-anti-proton collisions. David Evans, a physicist at the University of Birmingham and head of the ALICE project, says the results show “that we understand our detector, so when we go to higher-energy collisions where we don’t know what the answers should be, we can better trust our results.”

Another result of this collision was the confirmation of an anti-matter theory. ALICE recorded a total of 284 collisions, which resulted in the creation of precisely the correct ratio of matter and anti-matter as predicted from the theory. If a simple benchmark test like this can help to confirm and support the common theories of matter, then one can only wonder what more uncharted territory the LHC can map.

The success of this experiment has scientists excited about the possibility of finding the elusive Higgs boson particle, which has the potential to explain the origins of matter and mass in the universe. While the LHC only fired protons at 450 GeV, the machine is capable of colliding particles at 14 trillion electron volts (TeV). The previous record was 0.98 TeV, held by the Tevatron collider. The Standard Model, a theory of subatomic particle interactions, indicates that the Higgs boson can be detected between one and three TeV. The LHC’s capabilities are massive, and while this first test is mainly a benchmark, it serves as an inauguration for the LHC.

The collider’s ability to achieve such high energies promises the discovery of many new particles. For now, these first results represent the first steps to further discovery, and the scientists behind the LHC eagerly anticipate the next run, scheduled for February 2010. As Dr. Jürgen Schukraft of CERN, an ALICE spokesperson, states, “This important benchmark test illustrates the excellent functioning and rapid progress of the LHC accelerator, and of both the hardware and software of the ALICE experiment, in this early start-up phase. LHC and its experiments have finally entered the phase of physics exploitation.” There is no telling what the next trials may reveal.
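As a back-of-the-envelope aside (an illustration added here, not a calculation from the article), the 450 GeV injection energy quoted above can be turned into a speed using special relativity; the proton’s rest energy of roughly 0.938 GeV is the only other input.

```python
import math

# Rough relativistic speed of a 450 GeV proton (the injection energy quoted above).
# Assumes E is the proton's total energy; its rest energy is about 0.938 GeV.
E_GeV = 450.0
m_p_GeV = 0.938

gamma = E_GeV / m_p_GeV              # Lorentz factor, roughly 480
beta = math.sqrt(1 - 1 / gamma**2)   # speed as a fraction of the speed of light

print(f"gamma = {gamma:.0f}, v/c = {beta:.7f}")
# v/c comes out to about 0.999998, i.e. even closer to c than the "99.99%" quoted above.
```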
References
http://news.nationalgeographic.com/news/2009/12/091204-lhc-large-hadron-collider-higgs-boson-first-collision.html
http://www.sciencedaily.com/releases/2009/12/091215112049.htm


THE PHYSICS OF BREAKDANCING

BY MICHAEL DANG

PHOTO BY JENNIFER CHENG

Break dancing is an art form that requires immense skill and strength. A strong core is necessary for executing the intense and seemingly impossible moves. There are many different moves in the break-dancer’s repertoire, but one of the hardest is the windmill.

In the windmill, the breaker rotates from his back onto his front and onto his back again while keeping his legs locked in a V-position and rotating about his center of mass. It is an extremely difficult maneuver that requires strong abdominals, quadriceps, and shoulders. These muscles are used to counteract and manipulate the various physical principles involved in the move, including circular motion, momentum, friction, and angular motion.

To start, the break dancer must use his hands. By exerting torque he can increase rotational speed about his center of mass. The breaker must achieve a sufficient angular momentum to continue moving, because spinning adds stability, much like spinning bicycle wheels help a person maintain balance when biking. As the move progresses, friction from contact with the ground slows the dancer, so the breaker’s muscles must supply energy to make up for the frictional losses and continue the move.

Inertia partly explains the difficulty of this move. At each moment of the spin, the breaker’s legs tend to continue moving along the path of their instantaneous velocity and to drag the breaker along. However, the breaker must keep the legs in circular motion. The abdominals and quadriceps keep the legs in the V-shape and the shoulders keep the upper body in position. If the breaker does not have strong muscles, the move falls apart, because force must be supplied continuously to keep the body rotating in position.
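To get a feel for the quantities involved, here is a rough, illustrative estimate of the angular momentum the legs carry during a windmill. Treating each extended leg as a uniform rod swinging about the hip is a crude stand-in for rotation about the center of mass, and the mass, length, and spin rate below are hypothetical round numbers rather than measured values.

```python
import math

# Rough, illustrative estimate of the angular momentum of the legs in a windmill.
# All numbers below are hypothetical round figures, not measurements.
leg_mass = 10.0     # kg, assumed mass of one leg
leg_length = 0.9    # m, hip to toe
spin_rate = 1.0     # revolutions per second, a plausible windmill tempo

# Moment of inertia of one leg modeled as a uniform rod pivoting at the hip:
# I = (1/3) * m * L^2, doubled for both legs.
I_legs = 2 * (1.0 / 3.0) * leg_mass * leg_length**2   # kg*m^2

omega = 2 * math.pi * spin_rate        # angular speed, rad/s
L = I_legs * omega                     # angular momentum, kg*m^2/s
ke = 0.5 * I_legs * omega**2           # rotational kinetic energy, joules

print(f"I ≈ {I_legs:.1f} kg·m², L ≈ {L:.0f} kg·m²/s, KE ≈ {ke:.0f} J")
# With these inputs: I ≈ 5.4 kg·m², L ≈ 34 kg·m²/s, KE ≈ 107 J --
# energy the muscles must keep replenishing as friction drains it.
```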

Angular momentum also plays a large role in this move, as it keeps the body constantly spinning. The faster the breaker can push himself, the more angular momentum he will have. The shoulders are very important in maintaining that angular momentum, because the break dancer must roll the shoulders on the ground to minimize friction and conserve energy.

Many other break-dancing moves, like the flare and the head spin, deal with the same laws of physics. Circular motion, inertia, momentum, friction, and other forces must be contended with in each maneuver. The flare, much like a gymnast’s move on a pommel horse, uses torque to start rotation and keep balance. The head spin requires angular momentum and a strong core to minimize the torque from gravity and to maintain stability. The head spin also depends heavily on aerodynamics and air friction, while body position is vital to the speed of the move.

Break dancing is something that takes practice, stamina, and strength. The breaker must, in effect, manipulate and use the otherwise debilitating laws of physics to his advantage. The strength and physical prowess necessary to accomplish this feat can only be gained through practice, and no move is possible without good technique. Overall, the key to break dancing lies in high speeds and a refined technique that allows the artist to outsmart and overcome the forces of nature.

References
“How to do the Windmill.” YouTube. July 2006. Accessed 8 December 2009. <http://www.youtube.com/watch?v=aN1FSsOd44s>.
Wynick, David. “What is the Weight of a Human Leg?” Eikonworks, 2007. Accessed 8 December 2009. <http://www.askabiologist.org.uk/punbb/viewtopic.php?id=1477>.



BULBOUS BULBS & OTHER MATHEMATICAL BEAUTIES

BY ALBERT CHEN

Mathematics is intertwined with nature. The golden ratio appears in snail shells and leaves, and the Fibonacci spiral in pine cones and Romanesco broccoli. Fractals appear in blood vessels, fern leaves, and snowflakes.

Fractals

Fractals are shapes that are self-similar on all scales, meaning that they exhibit identical or nearly identical structures when viewed at different zoom levels. The Sierpinski sieve (also called the Sierpinski triangle) is an example of a fractal. It is created by beginning with one black equilateral triangle of side length L, then dividing that triangle into four congruent equilateral triangles of side length L/2. The central equilateral triangle is colored white, and the remaining three black triangles are small-scale versions of the initial triangle. Next, the same operation is repeated for the three black triangles, making each central equilateral triangle white, and the process continues ad infinitum with smaller and smaller triangles. The Sierpinski sieve illustrates self-similarity in that a black triangle formed at any stage is similar (in the mathematical sense) to the entire triangle: the parts look like the whole. An often-cited natural example of a fractal is the coastline of Britain (actually, any country will do), which looks similar at many scales of observation and whose measured length depends on the length of the ruler used. Fractals have been found to have connections to other areas of mathematics; for example, the Sierpinski sieve is related to Pascal’s triangle. Because computers are required to draw the more complex fractals such as the Mandelbrot Set, current researchers can quickly advance the field by attempting to produce more complex fractals in higher dimensions that are both beautiful and mathematically intriguing. Due to the connections between fractals and many other mathematical principles and patterns, an understanding of them is key to a better understanding of mathematics and nature.
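The subdivision just described translates directly into a few lines of code. The following is a minimal sketch (not code from the article); the function name and triangle representation are illustrative choices.

```python
def sierpinski(triangle, depth):
    """Return the black triangles left after `depth` subdivision steps.

    A triangle is a tuple of three (x, y) vertices. At each step every black
    triangle is split into four congruent halves and the central one is
    discarded (colored white), exactly as described above.
    """
    if depth == 0:
        return [triangle]
    (ax, ay), (bx, by), (cx, cy) = triangle
    # Midpoints of the three sides
    ab = ((ax + bx) / 2, (ay + by) / 2)
    bc = ((bx + cx) / 2, (by + cy) / 2)
    ca = ((cx + ax) / 2, (cy + ay) / 2)
    # Keep the three corner triangles; the middle one is dropped
    corners = [((ax, ay), ab, ca), (ab, (bx, by), bc), (ca, bc, (cx, cy))]
    result = []
    for corner in corners:
        result.extend(sierpinski(corner, depth - 1))
    return result

# After 5 steps the sieve consists of 3**5 = 243 small black triangles.
start = ((0.0, 0.0), (1.0, 0.0), (0.5, 3**0.5 / 2))
print(len(sierpinski(start, 5)))  # 243
```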

The Mandelbrot Set

The Mandelbrot Set is a fractal of particular note. It has been called “God’s fingerprint,” and was discovered in 1975 by Benoît B. Mandelbrot, who subsequently coined the term “fractal.” The beauty of fractals is that they can be described by simple equations. However, to graph these equations, mathematicians use what is known as the “complex plane,” a type of coordinate system in which complex numbers of the form z = x + iy (where i is the imaginary unit, the square root of negative one) can be graphed. In the complex plane, the real part is plotted along the usual x-axis (left to right) and the imaginary part along the usual y-axis (bottom to top). This enables mathematicians to plot certain fractal sets on the complex plane, because in the complex plane multiplication acts as a rotation while addition acts as a translation. The Mandelbrot Set is the set of all complex numbers c for which the iteration z_(n+1) = (z_n)² + c remains bounded, given z_0 = c. In other words, all points in the complex plane whose orbits, under infinite iterations of the equation, do not tend to infinity are members of the set. Plotting these members on the complex plane gives “God’s fingerprint.” This simple equation leads to intriguing properties, and the Mandelbrot Set is much more complex and aesthetically pleasing than the Sierpinski sieve, given the proper shading. The bean-shaped part of the fractal is bordered by a cardioid, and the large bulb attached to it is bordered by a circle. Calculating the area of the fractal and more complex fractal-related properties is also of interest to mathematicians.
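The boundedness test above is easy to approximate numerically with the standard escape-time approach. The sketch below is an illustration added here, not code from the article: a point is treated as a member if its orbit stays within radius 2 for a fixed number of iterations, since once |z| exceeds 2 the orbit is guaranteed to escape to infinity.

```python
def in_mandelbrot(c, max_iter=100, escape_radius=2.0):
    """Escape-time membership test for the Mandelbrot set.

    Iterates z_(n+1) = z_n**2 + c starting from z_0 = c. If |z| ever exceeds
    the escape radius, the orbit diverges and c is not in the set; if the orbit
    survives max_iter iterations we treat c as (approximately) a member.
    """
    z = c
    for _ in range(max_iter):
        if abs(z) > escape_radius:
            return False
        z = z * z + c
    return True

print(in_mandelbrot(0 + 0j))      # True: the origin never escapes
print(in_mandelbrot(-1 + 0j))     # True: the orbit cycles between -1 and 0
print(in_mandelbrot(0.25 + 0j))   # True: the cusp of the cardioid, still bounded
print(in_mandelbrot(1 + 0j))      # False: 1 -> 2 -> 5 -> 26 -> ... diverges
```

Coloring points by how quickly they escape, rather than simply inside or outside, is what produces the familiar shaded renderings.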

The 3-D Mandelbulb?

In 2006, Marco Vernagloine proposed that the Mandelbrot fractal be extended to the third dimension. The goal of this research is to produce a genuinely three-dimensional counterpart of the two-dimensional Mandelbrot Set, render beautiful pictures of it, and study its properties.



Recently, Daniel White created the Mandelbulb, the most accurate 3-D representation of the Mandelbrot Set yet. Originally, he attempted to use spherical coordinates (and creative variants), which are an extension of polar coordinates to 3-D consisting of radius, phi, and theta coordinates. The radius connects the origin to the point of interest, phi is the angle measured from the z-axis to the radius, and theta measures the angle counterclockwise from the x-axis to the projection of the radius onto the x-y plane. White attempted to rotate the Mandelbrot Set around both phi and theta, but the pictures did not show consistent fractal detail. Then, Paul Nylander had a simple idea: a higher power for the z term in White’s formula, z now being a hypercomplex number representing x, y, and z, could result in a complex shape defined at any zoom level. Thus the Mandelbulb was found, a spaceship shape with hidden wonders. Many parts of the Mandelbulb, such as the “spine,” closely resemble parts of the 2-D set. White’s preferred power is eight, but any high power will do just fine. This approach is largely successful in creating a shape with beauty and complexity: many diverse pictures can be formed from just one equation.
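For readers who want to experiment, here is a minimal escape-time sketch of a power-8 iteration in the spirit of the White/Nylander approach. It is assembled from the description above, not their actual code; in particular, the angle conventions (theta measured from the z-axis, phi in the x-y plane) follow one common formulation of the "triplex" power.

```python
import math

def mandelbulb_member(cx, cy, cz, power=8, max_iter=30, escape_radius=2.0):
    """Escape-time test for a power-8 Mandelbulb, in the spirit of White/Nylander.

    The 2-D rule z -> z**2 + c is mimicked in 3-D: a point is "raised to a power"
    by raising its radius to that power and multiplying its two spherical angles
    by the power, after which c is added back in.
    """
    x, y, z = cx, cy, cz
    for _ in range(max_iter):
        r = math.sqrt(x * x + y * y + z * z)
        if r > escape_radius:
            return False                              # orbit escaped: outside the bulb
        theta = math.acos(z / r) if r > 0 else 0.0    # angle from the z-axis
        phi = math.atan2(y, x)                        # angle in the x-y plane
        rn = r ** power
        x = rn * math.sin(power * theta) * math.cos(power * phi) + cx
        y = rn * math.sin(power * theta) * math.sin(power * phi) + cy
        z = rn * math.cos(power * theta) + cz
    return True                                       # still bounded: treat as inside

print(mandelbulb_member(0.0, 0.0, 0.0))   # True: the origin never escapes
print(mandelbulb_member(1.5, 1.5, 1.5))   # False: starts outside radius 2, escapes at once
```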

Future research

PHOTO BY MELODYANNE CHENG

Three-dimensional fractals have already been conceived. Perhaps the most basic example is the Menger sponge. Think of it as a Sierpinski triangle in three dimensions (to be more accurate, the 3-D analog of the Sierpinski carpet, which uses squares instead of triangles). Start with a cube and mentally chop it into 27 congruent cubes. Remove the one that cannot be seen from any face, and remove the center cube of each face. Repeat the process for each of the small cubes left, then repeat infinitely, chopping and removing all the while. This produces a self-similar object, a fractal in three dimensions.

It is not known what the 3-D Mandelbrot Set should look like. It is very much an art of using the 2-D fractal to create a detailed 3-D image similar to it, but with sufficient variety and distinctness to be worthy of the name “Mandelbrot.” Many methods have been proposed for creating a three-dimensional Mandelbrot Set, but so far none has produced the “true” one. The simplest method involves spinning the 2-D fractal about its axis of symmetry to create a simple 3-D shape. However, this method does not add any detail to the Mandelbrot Set but merely spins it. The “raised” method takes the colors on the 2-D fractal and assigns heights based upon the colors. Yet no complexity is added in this case either, as it is still the same set when projected onto the 2-D plane. A more complex method projects four-dimensional sets produced by quaternions, a complex 4-D coordinate system, into 3-D space, but this approach results in a fluffy mixture which neither looks similar to the Mandelbrot Set nor has any interesting detail.

Though the Mandelbulb seems at first glance to be the “true” 3-D Mandelbrot, it still needs to be refined: some parts look smeared at high zoom levels and low powers for z do not work, suggesting that the true equation has not yet been found. Much research still needs to be done on the properties of the Mandelbulb, including finding its volume and its Hausdorff dimension (a positive real number describing the dimension of a set or space, an integer for simple spaces, for example 2 for a 2-D space, but a non-integer for certain sets like the Sierpinski triangle which has a Hausdorff dimension of ln3 / ln2). It is also unknown if any “true” 3-D Mandelbrot actually exists, but the Mandelbulb is enough to keep enthusiasts occupied for the time being while they continue their quest.

References:
http://www.newscientist.com/article/dn18171-themandelbulb-first-true-3d-image-of-famous-fractal.html
http://mathworld.wolfram.com/Fractal.html
http://mathworld.wolfram.com/SierpinskiSieve.html
http://www.skytopia.com/project/fractal/mandelbulb.html
http://www.math.utah.edu/~pa/math/mandelbrot/mandelbrot.html
http://mathworld.wolfram.com/MandelbrotSet.html
http://mathworld.wolfram.com/MengerSponge.html
http://news.discovery.com/space/introducing-the-mandelbulb.html



THE DREADED SHOTS

BY REBECCA (BECKY) KUAN

The shot is an often-dreaded component of a doctor’s appointment. Dread escalates to terror for the 10% of Americans who suffer from trypanophobia, the fear of needles. And yet, needles are necessary for the maintenance of our health. Americans receive many vaccines throughout life, including varicella (chicken pox), tetanus, influenza, poliomyelitis (polio), rubella (German measles), pertussis (whooping cough), hepatitis A, and hepatitis B; booster shots (additional doses of vaccines used to “remind” the immune system of the antigens needed to counter a disease) are often administered at annual check-ups to maintain immunity. The needle most commonly used in vaccinations is the hypodermic syringe needle. Developed independently by both Pravaz and Wood in 1853, it uses a skin-piercing needle attached to a syringe. Since then, it has developed into the disposable plastic model that is most commonly used today. Microneedles and jet injectors have recently been developed as alternatives to the hypodermic syringe needle. Microneedles are small metallic squares that are lined with 400 silicon-based microscopic needles, each the width of a human hair. These small, hollow needles are so thin that application is painless. Microelectronics in the device control the timing and dosage of the medicine. Jet injectors use air pressure to force a fine stream of medication or vaccine through the skin, reaching as deep as the muscle.

Another use of needles is in IVs. IVs have a central needle surrounded by a catheter, a thin flexible tube. IVs are used to deliver a steady supply of liquids or medicine directly into the veins of the recipient. The catheter is inserted and kept in the vein after disinfection of the skin (which should be done one to two minutes before puncture). It is used to provide access to the bloodstream, usually for up to three days, after which it must be transferred to another vein to avoid infection. Usually medications like antibiotics are administered via IVs in severely ill patients who are in a poor general condition or who need drugs that are not available in an oral form. IVs are also essential in emergency medicine (such as after cardiac arrest) to administer life-saving drugs or to provide red blood cells after severe bleeding.

Possibly the oldest use of needles is in acupuncture. Acupuncture is a form of traditional Chinese medicine and one of the longest-used healing practices in the world, helping restore and maintain health through stimulation of specific points on the body. In many countries, especially China, Japan, and Korea, acupuncture is considered to be an alternative way of healing. Acupuncture is performed by penetrating the skin with thin, solid, metal needles, using electronic devices or the hands of a qualified practitioner. Acupuncture revolves around maintaining balance in the body, represented by the forces yin and yang. According to Chinese philosophy, disease is due to an imbalance blocking the vital energy, qi, which is supposedly remedied by inserting needles into specific points on the body. Acupuncture is still being studied to determine its specific effects on health conditions, such as headaches. Doctors are also trying to determine how the human brain responds to acupuncture. The full effects of acupuncture are unknown, but that does not prevent many from trying this alternative treatment to alleviate their illnesses.

Needles are necessary in medical practice, helping maintain health in both modern and traditional methods. Needles are responsible for administering vaccines, soothing pain, and saving lives. In our modern day, needles are highly developed and, given appropriate handling like disinfection or removal after the maximum 72 hours, can provide countless benefits. There is no need to fear the needle; it is a part of medical treatment that helps to improve health.

GRAPHIC BY JOYCE HAN

REFERENCES
“Acupuncture: An Introduction [NCCAM Health Information].” National Center for Complementary and Alternative Medicine [NCCAM] - nccam.nih.gov Home Page. Web. 09 Jan. 2010. <http://nccam.nih.gov/health/acupuncture/introduction.htm>.
“…popular medical terms easily defined on MedTerms.” Web. 08 Jan. 2010. <http://www.medterms.com/script/main/art.asp?articlekey=20418>.
“Fear of Needles - Overcoming Fear of Needles.” Phobias - An In-Depth Guide to Managing Phobias. Web. 09 Jan. 2010. <http://phobias.about.com/od/introductiontophobias/a/trypanophobia.htm>.
“IV Needles.” Brookside Associates Medical Education Division. Web. 09 Jan. 2010. <http://www.brooksidepress.org/Products/OperationalMedicine/DATA/operationalmed/MOLLEBag/IVNeedles.htm>.


Acetylsalicylic Acid (aka Aspirin) BY Emily Cai, edited by Michelle Sit

You’re studiously studying for your science test when you suddenly experience a straining headache. The throbbing pain doggedly persists for an hour, causing you to feel as if your head will split in two. Knowing that you can’t possibly continue studying like this, you grab a glass of water and down two aspirin, and just like that, the headache is gone. Aspirin belongs to a family of medicines called nonsteroidal anti-inflammatory drugs (NSAIDs), which includes ibuprofen. How that little aspirin pill can alleviate your headache all starts with what aspirin, scientifically known as acetylsalicylic acid, does to your cells.

Aspirin works by inhibiting synthesis of prostaglandin (PG), a hormone found in most cells and responsible for inflammation of injured tissues, platelet clotting, and pain production. PG is synthesized from arachidonic acid by two enzyme activities, namely cyclooxygenase (which exists in two isoforms, COX-1 and COX-2) and PG hydroperoxidase, which are collectively called PGH2 synthase. PGH2 is converted to PGI2, PGE2, and thromboxane A2 (TXA2) by the action of the enzymes PGI2 synthase, PGE2 synthase, and TXA2 synthase. COX-1 is found in the majority of cells, while COX-2 is only produced when there is a threat in the body, though there are trace amounts of COX-2 in the body at all times. The COX-1 isoform is more important in the synthesis of PGs that are involved in physiologic functions such as stimulating platelet aggregation, reducing gastric acid secretion, and increasing gastric mucus secretion, which protects the stomach from the highly corrosive acid produced there. The main prostanoid product of COX-1 in platelets is TXA2. In contrast, the COX-2 isoform is expressed primarily at sites of inflammation, although it is also the dominant COX in endothelial cells, whose main prostanoid product is PGI2. Both cyclooxygenases are vital to the production of prostaglandin. Aspirin binds non-specifically to both COX-1 and COX-2, making the enzymes unable to convert arachidonic acid into prostaglandin. Thus, prostaglandin production is inhibited and the headache, pain, or fever ebbs away, slowly allowing you to continue productively studying for that A on the science test.

Besides reducing pain, low doses of aspirin (~80 mg/day; the typical “baby aspirin” is 81 mg) are also commonly used to prevent heart attacks, another result of its inhibition of prostaglandin production. The protective effects of aspirin are due to the acetylation and irreversible inhibition of COX in platelets, resulting in decreased synthesis of TXA2 and causing reduced platelet aggregation and thrombus formation. Of course, aspirin can also acetylate and irreversibly inhibit COX in vascular endothelial cells, reducing synthesis of PGI2, which would tend to promote platelet aggregation. Endothelial cells, however, can synthesize more COX-2 and overcome the effects of low-dose aspirin. Since a platelet can’t synthesize new COX-1 and aspirin irreversibly inactivates the enzyme, low-dose aspirin can shut down TXA2 synthesis for the life of the platelets (8-10 days). Of note, low-dose aspirin exhibits apparent selectivity for inhibiting COX in platelets and alters the ratio of PGI2/TXA2 in the vasculature in a beneficial way. It is of interest that aspirin provides these cardiovascular benefits at much lower doses than are necessary for its analgesic, antipyretic, and anti-inflammatory effects (up to 3 g/day in divided doses). Thus, people with a high risk for heart attacks may rightfully live by the saying, “An aspirin a day keeps the heart attack at bay.”

While aspirin is effective in relieving pain and inflammation, continual doses can cause long-term side effects such as ulcers and excessive bleeding. Recall that prostaglandin is also responsible for the production of mucus in the stomach. Therefore, taking aspirin results not only in the reduction of thrombus formation, but also in a reduction in the amount of protective stomach mucus. The protective mucus consequently thins over time and allows ulcers to form. As with nearly all medicines, some aspects of health must be sacrificed for the well-being of another. In this case, stomach pain and slightly more blood loss from cuts may have to be endured in return for lowered heart attack risk and headache prevention. Despite the side effects that can occur after prolonged use of aspirin, its ability to provide a few hours of relief from a headache or fever makes aspirin the ideal medication to use. Aspirin has been used to relieve pain for over 100 years since its creation, and over 80 million of these tiny pills are taken every day by Americans alone. So the next time you decide to take an aspirin, you may think twice about the power of this tiny little pill.

REFERENCES
Hersh, E. (2000). “Over-the-counter analgesics and antipyretics: A critical assessment.” Clinical.
Palleros, Daniel R. (2000). Experimental Organic Chemistry. New York: John Wiley & Sons. p. 494.



2000: Y2K

GRAPHIC BY LING JING

The beginning of this decade, century, and millennium kicked off with an event that was horrifying when it was approaching but laughable in hindsight. Y2K, or “Year 2000,” was the name given to a predicted global software malfunction. Experts predicted that computers would cease to function properly at the turn of the millennium because of programming limitations. At the time, most computers stored only the last two digits of calendar years. For example, 1940 was ‘40’ and 1999 was ’99.’ Programmers were unsure whether computers would be “smart” enough to make the transition from “99” to “00” (see the short code sketch following the 9/11 entry below). The idea that a problem would arise due to this limitation was put forth as early as 1984. By the 1990s, our society was dependent on these machines, so scientists were extremely worried that a global computer malfunction would cause global chaos. People were told that there would be no water, electricity would shut down, the stock market would crash, and experts even advocated pooping in plastic bags should the toilets fail to flush. Some were certainly skeptical, but such bold predictions supported by so many experts certainly aroused some degree of fear in everyone. There were some technical difficulties, but if you recall the first few seconds of 2000, a wave of relief swept across every living room when the lights did not flicker out as clocks across the world turned from ’99’ to ’00.’ Considering the capacities of computers today and the fact that Y2K didn’t really amount to anything, it’s hard to imagine how we allowed such an idea to instigate so much fear.

2001: 9/11

GRAPHIC BY TIFFANY SIN

If you ask your grandparents, they could tell you exactly where they were and exactly what they were doing on December 7, 1941, the day that the Japanese attacked Pearl Harbor. When our grandchildren ask us, we will all be able to tell them exactly where we were and exactly what we were doing on September 11, 2001. Most of us were in elementary school, but that moment, regardless of how ignorant we may have been, changed history. To Americans, 9/11 is unarguably the most important date of the decade. It forever changed the way that we ride airplanes, view terrorists, and treat our freedom. Yet, though it has been over eight years since the attacks, scientists and engineers are still unsure exactly how the World Trade Center towers collapsed. Several conspiracy theories have been put forth, although they all lack strong support. The official explanation, put forth by civil engineers, is that the burning jet fuel was hot enough to weaken the steel in the buildings, causing the structures to collapse. Some, however, do not accept this explanation, arguing that the fuel could not possibly have been hot enough to melt the steel in such a small amount of time. A popular theory is that bombs were planted in the World Trade Center prior to the attacks and were detonated during the attacks, bringing the buildings down. The problem with this theory is that, if it were true, a huge number of people must have been involved, and it is hard to imagine that the secret could have been kept.
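Returning to the Y2K bug described above: here is a minimal sketch (an illustration added here, not from the article) of why two-digit years worried programmers. Simple date arithmetic that works fine through 1999 suddenly produces nonsense when the year rolls over to ‘00.’

```python
# Minimal illustration of the Y2K two-digit-year problem (not code from the article).
def years_elapsed_two_digit(start_yy, end_yy):
    """Naive 1990s-style arithmetic on years stored as two digits."""
    return end_yy - start_yy

def years_elapsed_four_digit(start_yyyy, end_yyyy):
    """The same calculation with full four-digit years."""
    return end_yyyy - start_yyyy

# An account opened in 1999 ("99") and checked in 2000 ("00"):
print(years_elapsed_two_digit(99, 0))        # -99  -> nonsense: a negative elapsed time
print(years_elapsed_four_digit(1999, 2000))  # 1    -> correct
```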



By Murong He, Michelle Kao, Ling Jing, and Sumana Mahata

2002: Toumai

1974 was an important year for archeologists and paleontologists all around the world: Lucy, an Australopithecus dating back approximately 3.2 million years, was discovered in Ethiopia. In 2002, scientists described an even older hominid fossil: Sahelanthropus tchadensis, more commonly known as “Toumai.” Archeologists dug up this roughly 7-million-year-old fossil in the Djurab Desert of Chad. Because of the fossil’s age, it is unclear whether Toumai belongs in the hominid tree. The fossil possesses both human and chimpanzee characteristics, and the scientific community is not in agreement on whether Toumai has any direct relationship with modern humans and whether it should be classified as a hominid or an ape; its discovery, however, is significant nonetheless because it provides paleontologists and archeologists with a subject necessary for closer examination of human evolution.

“The HGP and its information provided a better understanding of the biology of human beings.”

2003 - Human Genome Project (HGP) Completed

GRAPHIC BY SUMANA MAHATA

The idea of mapping out the entire human genome has been entertained since December 1984. However, the magnitude of this gargantuan project, which included determining the sequence of all the bases of DNA in the human genome on top of identifying and mapping all the genes both physically and functionally, threatened its feasibility. As there are three billion nitrogenous base pairs and tens of thousands of genes in the human genome, this project would take an understandably long time: thirteen years. The venture began in 1990, headed by James D. Watson at the U.S. National Institutes of Health, only to be completed in April of 2003 after much effort at universities and research centers in several different countries. Researchers dedicated to the project employed various DNA mapping, cloning, and sequencing techniques to complete this task: the polymerase chain reaction (PCR), yeast artificial chromosomes (YAC), bacterial artificial chromosomes (BAC), restriction fragment-length polymorphisms (RFLP), and the dideoxyribonucleotide chain termination reaction. The benefits and applications of the knowledge gained from the completion of this project were predictable from the beginning: leaps in medicine due to an increased role of genetics and a better understanding of the biology of human beings. The HGP and its information enabled better prevention, diagnosis, and treatment of certain ailments or diseases because researchers could determine how certain genes affected these sicknesses. Researchers were also able to investigate how drugs affect certain disease-related genes through the blocking and stimulation of certain genetic pathways; for example, the drug STI-571 was engineered to block the activity of the bcr-abl fusion gene, formed when parts of chromosomes 9 and 22 fuse, which produces a protein that causes chronic myelogenous leukemia (CML). The HGP revealed that there are around 20,500 human genes; it also determined their locations, structure, and organization. By exactingly cataloguing what information we are composed of, the HGP set down a foundation for research to follow.



2004 - Cassini-Huygens Mission to Saturn

GRAPHIC BY CLAIRE CHEN

Saturn has always captured humanity’s curiosity with its beautiful rings and many moons. On October 15, 1997, the Cassini-Huygens probe was launched to sate some of those curiosities: to determine the structure and behavior of the rings of Saturn, the composition of moon surfaces and the geological history of each moon, and the nature and origin of the dark material on one hemisphere of the satellite Iapetus; to measure the structure and behavior of the magnetosphere of Saturn; to study the behavior of Saturn’s atmosphere at cloud level and how Titan’s atmosphere changes over time; and to roughly map Titan. The Cassini-Huygens mission was a joint mission between NASA, ESA, and ASI; 16 European countries and the U.S. contributed to the designing, building, and flying of the probe, which finally went into orbit around Saturn, the first spacecraft to do so, on July 1, 2004, almost seven years after its initial launch. The mission had already produced various high-quality pictures of the Moon, Jupiter, and Phoebe (one of Saturn’s moons) on Cassini’s way to Saturn. On October 10, 2003, the Cassini-Huygens team announced the results of their tests of Einstein’s theory of general relativity, which were completed using radio waves transmitted from the Cassini-Huygens space probe. Because the Cassini-Huygens space probe’s measurements were more refined, they were far more accurate than those of earlier tests by the Viking and Voyager space probes. The data at the end of the experiment supported Einstein’s theory. In 2004 the probe discovered three new moons of Saturn, which were named Methone, Pallene, and Polydeuces in 2005. Other highlights were the various pictures of Titan taken on July 2, 2004 and the separation of the Huygens probe from the Cassini orbiter on December 25, 2004; Huygens reached its destination, Titan, on January 14, 2005. The primary mission lasted four years, but it has been renewed yet again, proving the boundless potential for the exploration of space.

2005 - The Death of Terri Schiavo

“One of the major issues concerning Terri’s death was whether it was ethical to remove her life support.”

On March 31, 2005, a 41-year-old woman died. Even though she was not a celebrity, well-known politician, or anyone remotely famous before hospitalization, she became a figure of national controversy. Terri Schindler Schiavo was hospitalized on February 25, 1990 for respiratory and cardiac arrest; the long period of time without oxygen caused brain damage, leading to her persistent vegetative state (PVS). In June of that year, Terri’s husband, Michael Schiavo, was given guardianship over her and her property. Tensions rose in May of 1998 when Michael Schiavo’s attorney, George Felos, filed a petition to withdraw life support, claiming that Terri Schiavo would not wish to continue living while in PVS. Terri’s parents, on the other hand, believed that Terri was still conscious, and that removing her life support was immoral and wrong. Soon, many people became heatedly involved in the issue; hundreds gathered outside Terri’s hospital, and at least 180,000 people signed a petition to the governor of Florida to protest Judge Greer’s October 15, 2003 ruling to remove her feeding tube. The Florida legislature even passed “Terri’s Law,” which allowed the governor to reinstate Terri’s nourishment with the oversight of an independent guardian, but this law was deemed unconstitutional. The date for life support removal was rescheduled to March 18, 2005, and on March 31, Terri died from severe dehydration. One of the major issues concerning Terri’s death was whether it was ethical to remove her life support. For some, like her husband Michael, allowing her to die was far better than forcing her to live on in a vegetative state, inhibited by her own body. However, many others, such as Terri’s parents, pro-life supporters, disability rights advocates, and even President Bush and the Pope, compared letting her die to murder, an unnecessary euthanasia that was completely unconscionable. Terri’s family, in response to her death, created the Terri Schindler Schiavo Foundation to prevent any similar occurrence from happening ever again. The controversy surrounding Terri’s death epitomizes the difficult ethical questions that arise from advancing science and technology.



2006 - Pluto’s Demotion

GRAPHIC BY SUMANA MAHATA

In August 2006, after years of debate, a two-year struggle to develop an official definition of “planet,” and a vote by 424 astronomers, Pluto was demoted to the status of dwarf planet. The controversial move incited both fury and approval from astronomers around the globe, and was a source of sorrow for the countless schoolchildren who had favored Pluto most out of the solar system planets. Pluto’s status had been under scrutiny since its discovery in 1930, but it was not until Michael Brown of Caltech discovered a new object nicknamed Xena (later officially named Eris) in 2005 that its planet status became seriously threatened. Like Pluto, Xena had an abundance of ice and rocky terrain, and seemed to have the potential to become the 10th planet. Yet its discovery caused astronomers to consider more seriously the true meaning of the term “planet,” especially when the Hubble Space Telescope found that Xena was larger than Pluto the following year. When evaluated against the new requirements for planet status, Pluto failed to measure up and lost its title. Although Pluto indisputably orbits the sun, it is of questionable size, only about twice as large as its moon, Charon. Also, unlike its other outer-space neighbors, Pluto’s orbital neighborhood is cluttered with small solar system bodies (bodies in the solar system that are not planets or dwarf planets). Although the new definition was established by majority vote, the results were met with outrage. Out of the roughly 10,000 professional astronomers in the world, only 424 were able to participate in the vote. Some astronomers also felt that the definition, with the subjective term “round,” is ambiguous, and that it neglected the cultural and historical significance of Pluto. Though some strong opponents of Pluto’s demotion may still seethe with anger over the dwarf planet’s fate, most have accepted the new solar system order. It appears that the eight planets are here to stay.

2007 - Al Gore and An Inconvenient Truth

Al Gore may have lost the 2000 presidential election, but he went on to win an Oscar for his documentary, An Inconvenient Truth, and, with the Intergovernmental Panel on Climate Change (IPCC), received the 2007 Nobel Peace Prize for alerting the world to global warming and its consequences. Key efforts such as the agreements made in the 1992 United Nations Framework Convention on Climate Change and the 1997 Kyoto Protocol occurred before Gore’s film, but it was not until the documentary that much of the public became aware of the predicament. The impact of his work was felt nearly overnight: suddenly, the media and daily conversations were filled with terms such as “greenhouse gases,” “carbon footprint,” and “climate change.” Talk of polar bear drownings, a surge in category 4 and 5 hurricanes, the melting of the ice caps, and environmentally hazardous fossil fuels became widespread and alarming, and it soon appeared that the entire world was aware of the impending global catastrophe and working hard toward a solution. The scientific causes of global warming also became common knowledge. Gases in the atmosphere such as carbon dioxide, nitrous oxide, methane, and water vapor absorb and re-radiate infrared radiation, trapping heat and raising the Earth’s temperature as solar energy is retained in the atmosphere. The 2000s were recognized as the hottest decade on record. These temperature changes, which are most extreme at the poles, mean rising sea levels as glaciers and polar ice caps melt, more extreme weather, increased prevalence of certain diseases in new areas, and population drops in many animal and plant species that cannot adjust to the changes. People tried to lighten their “carbon footprints” and help avert climate catastrophe, while nations imposed restrictions on carbon emissions and switched to alternative energy sources. The battle against global warming is far from over. The Earth continues to warm, and the ice continues to melt. Yet in 2009, eight countries agreed to reduce their carbon emissions by 50% by 2050, environmentally friendly products are increasing in popularity, and controversies arise over the merits and demerits of some alternative energy sources. Much of this is due to Al Gore’s efforts in 2007. For information on how alternative energy is being used to combat climate change, check out Falconium’s Summer 2009 issue.



2008 - Synthetic Life

Yeast is most appreciated as a leavener for bread, but scientists at the J. Craig Venter Institute also appreciate it for its ability to assemble large DNA strands. With the help of the versatile yeast Saccharomyces cerevisiae, the Venter Institute created the first synthetic genome, a copy of the genome of the bacterium Mycoplasma genitalium, moving a step closer to creating organisms wholesale. J. Craig Venter, the institute's president, remarked that, with the assembly of the 582,970-base-pair genome, the team had completed the second of three stages of creating synthetic organisms. With the complete bacterial genome in hand, the next step is to insert it into a cell and have it function as a natural genome would, thereby creating a genuinely synthetic organism. If a successful method of doing so is developed, genomes could be engineered to create artificial organisms designed to do and make many things. Scientists are also working to develop a genome with the minimum number of genes needed for survival; such a bare-bones organism could then lead to replacements for fossil fuels. Even more ambitious is the goal of developing a synthetic human genome, which could be used to study cloning.

The process of genome synthesis developed by the Venter Institute scientists began with resequencing of the bacterium's genome to ensure they had a correct copy. Then, they chemically synthesized segments of DNA 5,000-7,000 base pairs long based on this original genome, marking them as synthetic. Next, these sections were joined in a five-step assembly process to create longer subassemblies of 24,000 base pairs each, then 72,000 base pairs, then 144,000 base pairs (about 25% of the genome). The scientists then used homologous recombination, a natural method of cell repair, in yeast to create the final complete genome. The creation of the first synthetic genome in 2008 was a major stepping stone for the Venter Institute's breakthrough two years later: in May 2010, the scientists succeeded in creating the first synthetic cell.

References
"9/11: Science and Conspiracy | National Geographic Channel." National Geographic Channel. N.p., n.d. Web. 9 Jan. 2010. <http://channel.nationalgeographic.com/episode/9-11-science-and-conspiracy-4067>.
"Ardi's Feet." Discovery Channel. Discovery Communications, LLC, 2010. Web. <http://dsc.discovery.com/videos/ardipithecus-discovering-ardi/>.
"Bringing Ardi to Life." Discovery Channel. Discovery Communications, LLC, 2010. Web. 24 Mar. 2010. <http://dsc.discovery.com/videos/ardipithecus-discovering-ardi/>.
Britt, Robert R. "Pluto Demoted: No Longer a Planet in Highly Controversial Definition." Space.com. 24 Aug. 2006. Web. 9 Jan. 2010. <http://www.space.com/scienceastronomy/060824_planet_definition.html>.
Candiotti, Susan, Rich Phillips, Bob Franken, and Ninette Sosa. "CNN.com - Terri Schiavo Has Died - Mar 31, 2005." CNN.com. Web. 6 Jan. 2010. <http://www.cnn.com/2005/LAW/03/31/schiavo/index.html>.
"Cassini Equinox Mission." Cassini-Huygens. Web. 6 Jan. 2010. <http://saturn.jpl.nasa.gov/>.
Climate Crisis. Web. 9 Jan. 2010. <http://www.climatecrisis.net/>.
"CO2 and Heat-Trapping Gases FAQ | Union of Concerned Scientists." Union of Concerned Scientists. Web. 9 Jan. 2010. <http://www.ucsusa.org/global_warming/science_and_impacts/science/CO2-and-global-warming-faq.html>.
Collins, Francis S., Victor A. McKusick, and Karin Jegalian. "Genome.gov | Online Education Kit: Implications of the Genome Project for Medical Science." Genome.gov | National Human Genome Research Institute. Web. 6 Jan. 2010. <http://www.genome.gov/25019925>.
"Genome.gov | What Was the Human Genome Project (HGP)?" Genome.gov | National Human Genome Research Institute. Web. 7 Jan. 2010.

GRAPHIC BY PAUL HO


2009 - Ardi: The Newest Link to Our Past

For decades, the Australopithecus skeleton known as "Lucy" was the closest link to the common ancestor of chimpanzees and humans. In 2009, however, paleoanthropologists unearthed an older hominid skeleton that had attributes of both chimps and humans and provided significant clues to solving the mystery of bipedality, the hominid ability to walk on two legs. Over many years, a team of excavators in Ethiopia found bits and pieces of a hominid, which they recognized as older than Lucy due to its fragility. To determine the skeleton's age, scientists collected samples of the rock both above and below the layer in which the skeleton was found and melted them with a laser, releasing argon gas produced by the decay of potassium. By measuring the amount of argon trapped in various layers of volcanic ash, it was determined that the fossil, Ardipithecus ("Ardi"), was about 4.4 million years old - the oldest hominid skeleton ever discovered.

After examining specific bones, scientists determined that Ardi had been able to walk upright. One theory explaining bipedality is that males were considered good mates if they were capable of searching for food and carrying it back to the mother and young, which was easiest to do with the hands. Although walking on two limbs allowed early hominids to attract mates, bipedality also had unfavorable consequences. It used energy less efficiently and reduced the ability to run and climb rapidly, making it easier for predators to target Ardi. Using two legs instead of four also created challenges in recovery and mobility after injury; if a leg were injured, only one other leg would be available for use instead of three. Because bipedality was so beneficial to mating and reproductive success, however, the trait survived. Ardi's discovery brought us closer to solving the mysteries of our hominid ancestors and how they evolved into humans. In a sense, she has replaced Lucy as the "mother" of the human race.
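The argon measurement described above is the basis of potassium-argon dating, and the age follows from a standard decay equation. The sketch below is only an illustration: the decay constants are commonly cited textbook values, and the argon-to-potassium ratio is chosen so that the result lands near the article's roughly 4.4-million-year age; none of these numbers come from the article itself.

```python
import math

# Minimal sketch of the standard K-Ar age equation.
# Assumed (textbook) decay constants for potassium-40, per year:
LAMBDA_TOTAL = 5.543e-10   # total decay constant of 40K
LAMBDA_AR = 0.581e-10      # branch that produces radiogenic 40Ar

def k_ar_age(ar40_over_k40):
    """Age in years from the measured ratio of radiogenic 40Ar to remaining 40K."""
    return (1.0 / LAMBDA_TOTAL) * math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_AR) * ar40_over_k40)

# A ratio of roughly 2.56e-4 corresponds to an age of about 4.4 million years.
print(f"{k_ar_age(2.56e-4) / 1e6:.2f} million years")
```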

16


4-D

MICROSCOPY

BY SIDDHARTHO BHATTACHARYA

Science began with inspection, observing nature to gain greater understanding of the world. Where the naked eye was not enough, man invented the telescope to look up into the stars, and man invented the microscope to look down at molecules. Yet our quest for knowledge was not sated, and a recent invention brings us one step closer to seeing and understanding the world. A new method of magnifying images with electron microscopes has resulted in a revolutionary way of observing molecules and their interactions. At a very small scale, atoms and molecules exhibit erratic behavior, constantly reacting to form molecules and then immediately disintegrating, all within an infinitesimal instant of time. Dubbed "4D Microscopy," the novel technique aims to observe these interactions by controlling individual electrons and their trajectories to form still image sequences of particles at near-angstrom (10⁻¹⁰ m) scales. These image series are then played in sequence to reveal the changes and interactions of these particles over time. This technique is part of a field known as femtochemistry, which was spearheaded by Ahmed Zewail, a Caltech professor. Zewail won the Nobel Prize in 1999 for developing the method of using ultra-short laser flashes to capture stills of molecules in motion, observing fundamental chemical reactions of molecules forming and breaking apart at the timescale of a femtosecond (10⁻¹⁵ s).

All imaging techniques are based upon sending and receiving signals. The reception of the signal is what creates the image, be it on a physical medium like camera film or a digital one like a computer. Due to the infinitesimally small amount of time particles exist in a particular configuration, the incredible spatial accuracy of electron microscopy must be combined with the ultrashort time resolution of optical techniques. Electron microscopy scatters streams of electrons off an object and then creates an image from them when the sensor receives them, just as a bat echolocates with sound waves. Electrons are used because the probe's wavelength must be smaller than the distance between atoms, and electrons have significantly smaller wavelengths than photons of visible light. However, to include the time aspect, the electrons must be sent in doses that are carefully timed to arrive at the object within femtoseconds of each other. This creates a series of images which depict molecular movements within the brief time in which they occur.

On March 20, 2005, Zewail and his colleagues embarked on an experiment to demonstrate the capabilities of 4D ultrafast electron microscopy (UEM) by imaging single crystals of gold, amorphous carbon, polycrystalline aluminum, and biological cells from rat intestines. UEM was immediately shown to be incredibly adept at angstrom-level imaging due to its increased electron scattering and reduced tendency to damage the sample (in comparison to X-rays). Zewail and his team used a stream of electron packets, with about one electron per pulse, fired out of an ultrafast laser. To capture the images, a setup consisting of a laser and an array of sensors was prepared. The images were stored with Digital Micrograph software, which connected the images together to render them into a movie. Zewail and his team discovered the strange and unique ways groups of atoms move at this scale. They witnessed layers of carbon atoms locked in a sheet arrangement in graphite, and gold atoms arranged in a crystal formation. The molecules jittered and moved ever so slightly, with a few structures forming and immediately disintegrating. Even more remarkably, when a picosecond (10⁻¹² s) time frame was used, the team discovered that the graphite produced sound waves as the atoms oscillated. Using this information, they calculated the force holding the sheets together, described by a stress-strain property known as Young's modulus. In a similar experiment, the researchers increased the time scale to a millisecond (10⁻³ s) while observing the sheet of graphite under bursts of heat. The excited carbon atoms immediately began to oscillate in a random pattern, but the sheet as a whole vibrated and sent out little ripples, a phenomenon known as "drumming."

The implications of 4D microscopy are tremendous. Not only does 4D microscopy allow scientists to observe the interactions of atoms at a small scale, but it also allows them to observe the particles in nearly real time. Imaging the atomic motions could allow better understanding of the structural, morphological, and nanomechanical phenomena caused by these movements. This technology is currently being used in several fields. In biology, researchers are imaging the components of cells and how cells use their machinery in real time. They have already produced images of a stained rat cell and of a protein crystal and cell in vitreous water. Some of these structures exist for tiny fractions of a second, and now, thanks to 4D microscopy, their images and their motions can be captured and studied to understand how they work. Indeed, there are limitless possibilities for 4-D microscopy. Researchers are using it in photonics and biophysics, and more fields are adopting the technique. In the words of David Tirrell, chair of Caltech's Division of Chemistry and Chemical Engineering, "The sequences of images produced by this technique are remarkable, they not only provide unprecedented insights into molecular and materials behavior--they do so in an especially satisfying fashion by allowing direct observation of complex structural changes in real space and real time. These experiments will lead us to fundamentally new ways of thinking about molecules and materials."
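The claim that electrons have far smaller wavelengths than visible photons can be checked with the de Broglie relation, wavelength = h / momentum. The sketch below uses a non-relativistic approximation and an assumed 200 keV beam energy (a typical electron-microscope value, not a figure taken from the article).

```python
import math

# Compare the de Broglie wavelength of a fast electron with visible light.
# Non-relativistic approximation: wavelength = h / sqrt(2 * m_e * E_kinetic).
H = 6.626e-34               # Planck constant, J*s
M_E = 9.109e-31             # electron mass, kg
E_KIN = 200e3 * 1.602e-19   # assumed 200 keV beam energy, in joules

electron_wavelength = H / math.sqrt(2 * M_E * E_KIN)
visible_light = 500e-9      # ~500 nm green light

print(f"electron: {electron_wavelength * 1e12:.1f} pm")          # a few picometres
print(f"photon:   {visible_light * 1e9:.0f} nm")                 # hundreds of nanometres
print(f"ratio:    ~{visible_light / electron_wavelength:,.0f}x smaller")
```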

GRAPHIC BY CLAIRE CHEN

References
Svitil, Kathy. "Caltech 4D Microscope Revolutionizes the Way We Look at the Nano World." California Institute of Technology. 8 Nov. 2008. Web. 8 Jan. 2010. <http://media.caltech.edu/press_releases/13207>.
Zewail, Ahmed H., et al. "Four-Dimensional Ultrafast Electron Microscopy." Proceedings of the National Academy of Sciences of the United States of America, Vol. 102, No. 20. 17 May 2005.
"4-D Microscopy Films Photons." Photonics Media. 22 Dec. 2009. Web. 8 Jan. 2010. <http://www.photonics.com/Content/ReadArticle.aspx?ArticleID=40622>.


Is a [Dose of Dirt]

The Best Medicine for Allergies?

BY PAUL HO

GRAPHIC BY MICHELLE OBERMAN

The sudden onslaught of runny noses, teary eyes, rashes, or even difficulty breathing is a sign of allergies, one of the most extreme manifestations of the human immune system. Despite their intensity, allergic reactions are triggered by generally innocuous substances known as antigens or allergens. When these allergens come into contact with the body, the immune system perceives them as threats and overreacts. Unfortunately, there is no effective way to completely avoid allergies, because nearly anything can be an allergen, including household dust, plants, medications, foods, animal dander, insect venoms, viruses, and bacteria. When exposure to an allergen first takes place, lymphocytes, which are a type of white blood cell in the immune system, create antibodies—proteins that bind to specific antigens, marking them for destruction by white blood cells. If the allergen enters the body again, it will bind to these antibodies, prompting white blood cells in connective tissue, called mast cells, to release histamine. In turn, the secretion of histamine causes inflammation or contraction of smooth muscle, manifested by the symptoms of nasal congestion and itching that are commonly associated with allergies. One type of extremely dangerous and potentially deadly allergic reaction is anaphylaxis, which involves the release of histamine from tissues all over the body, resulting in asthma-like symptoms and causing the airways to contract significantly, sometimes enough to suffocate the allergic person. Consequently, it is important for individuals to be aware of their allergies and to prepare in case of extreme reactions.

Luckily, when people encounter an allergen, they can take various medications that mitigate the effects of these reactions. Mild allergies can be treated with various oral antihistamine drugs. Antihistamines inhibit histamine function by blocking receptors for histamine that are located on nerves, smooth muscle, and mast cells, thereby reducing edema (swelling), itching, redness, and other symptoms associated with allergic reactions. The most commonly used over-the-counter antihistamines include diphenhydramine, sold under the brand name Benadryl; cetirizine, found in Zyrtec; and loratadine, which is sold as Claritin. An alternative to oral medication is a cortisone nasal spray or injection, which reduces inflammation and nasal congestion. In more serious cases, epinephrine may be injected into the body, often with an EpiPen. Epinephrine, also known as adrenaline, increases heart rate, dilates air passages, and constricts blood vessels, countering the effects of an allergic reaction. It is incredibly helpful in the rapid treatment of a severe allergy, as it counteracts the reaction and helps restore the body to its natural equilibrium.

While drugs can be effective, the best way to treat allergies is to prevent them from developing. Several researchers have hypothesized that the recent increase in allergy-related problems is directly related to society's obsession with cleanliness. They argue that, due to overcautious parents and "germ-phobic" individuals, children are not being exposed to enough allergens while their immune systems are maturing, crippling their ability to respond appropriately to allergens in the future. A simple analogy is that of a young child learning how to solve puzzles as a four-year-old. By age eight, if he sees a more complex puzzle, his previous experience with the "junior" puzzle will aid him in his approach to the new one. The human immune system functions much like the brain of this child. Supporters of the hypothesis claim that the body must be introduced to allergens at an early age so that it can prepare for future exposures. Writing in New Scientist, Garry Hamilton discusses the advantages of exposing babies to a certain level of dirt, stating that it rebalances and trains the immune system, reducing the chance that it will overreact to allergens. If individuals are not exposed to dirt at an early age, their encounters with allergens as they grow older are more likely to result in allergic reactions. Evidence from a Harvard study conducted in 2000 corroborates this claim: children raised on farms, where exposure to dirt and germs is more common, experienced fewer allergies than urban children.

Allergies are truly a strange phenomenon of the human body. The idea that one's body is able to overreact and shut itself down is frightening, but the combination of effective drugs and new notions of how to combat the development of allergies may soon render the fear of allergies insignificant.

References
"Allergic Reaction Causes, Symptoms, Treatment - Allergic Reaction Treatment on eMedicineHealth." 31 Oct. 2010. Web. 31 Oct. 2010. <http://www.emedicinehealth.com/allergic_reaction/page6_em.htm#Allergic Reaction Treatment>.
"Allergies & Dirt." Information For Therapists. Web. 31 Oct. 2010. <http://www.mytherapypractice.com/bugs_drugs_allergies/AllergiesDirt.htm>.
"Asthma & Germs." Health And Energy Company. Web. 31 Oct. 2010. <http://healthandenergy.com/asthma_&_germs.htm>.
"Histamine." The Worlds of David Darling. Web. 31 Oct. 2010. <http://www.daviddarling.info/encyclopedia/H/histamine.html>.
"Why Are Allergies On The Increase." Sparks-of-light. Web. 31 Oct. 2010. <http://www.sparks-of-light.org/dirt, allegies and cleaning.htm>.



PHOTO BY MELODYANNE CHENG


A Good Night’s Rest

BY SARAH WATANASKUL

With schedules teeming with AP classes and extracurricular activities—not to mention distractions such as the Internet, television, and video games—many high school students find it difficult to set aside time for a good night's rest. Students increasingly remain awake until the wee hours of the morning, sometimes even staying up all night for the sake of completing class assignments. Unfortunately, sacrificing sleep has serious consequences. Although the amount of sleep needed declines as a person transitions from childhood to adolescence to adulthood, recent studies have shown that adolescents still need an average of 9.2 hours of sleep per night for optimal performance. However, only 15% of high school students sleep 8.5 hours or more, according to a 1998 study by Wolfson and Carskadon. Interestingly, teenagers undergo a "phase shift" during puberty that causes them to sleep later than preadolescent children, despite their need for more sleep.

The effects of sleep deprivation vary from person to person: one person may breeze through the day on 5 hours of sleep while another will struggle onward like a zombie. This variation may be partially genetic. Studies of length differences in the PERIOD3 (PER3) gene indicate that people with the long form of the gene feel the effects of sleep deprivation quickly, while those with the short form possess short-term resilience. The gene may differentially influence the effects of sleep deprivation on function in the early morning. Symptoms of sleep deprivation generally include reduced concentration, inefficiency in learning, poor memory capacity, headaches, dizziness, irritability, growth impairment, depression, high blood pressure, and microsleeps (involuntary sleep lasting 10-60 seconds). Unlike muscles, which can regenerate while a person is conscious and resting, neurons in the cerebral cortex can only recover and form new synaptic connections during sleep. Thus, reduced sleep can result in malfunctioning neurons, which cause dramatic changes in behavior.

Lack of sleep negatively affects language processing. Studies have been conducted in which magnetic resonance imaging scans compared the brain functions of sleep-deprived subjects and well-rested subjects based on results from verbal tests. In well-rested subjects, the temporal lobe of the brain (which controls language processing) was very active, whereas in the sleep-deprived subjects it was inactive. However, the parietal lobe, which controls cognition and sensory perception of the world, was activated in sleep-deprived subjects and allowed them to complete the verbal test relatively well, though their performances were poor in comparison to those of their well-rested counterparts.

Sleep deprivation also leads to an increased risk for diseases such as cancer, heart disease, diabetes, and obesity, likely due to its profound effect on many vital hormones in the body. One consequence of sleep deprivation is impeded growth, because growth hormone is secreted during deep sleep. People spend less time in the stages of deep sleep as they age, explaining why adults gradually stop growing. If adolescents are deprived of sleep, their growth is impaired while their fat-gaining process is sped up, potentially leading to obesity.

Sleep deprivation also results in reduced production of the hormone leptin, which notifies the body when it is full. An insufficient amount of leptin causes a person to desire more food: the amount of food consumed may be sufficient, but without the satiated feeling that leptin provides, the body will still feel hungry. In fact, a survey of high school students in Ohio revealed that students who slept for less than 5 hours each night were 8 times more likely to be overweight than those who slept for over 8 hours. A second study, conducted by Arlet V. Nedeltcheva et al., revealed that insufficient sleep also makes it harder to lose weight because it affects a hormone called ghrelin, which can cause increased appetite and fat retention. Finally, sleep deprivation affects melatonin release. The hormone melatonin acts as an anti-cancer agent and can prevent tumor growth. It is released at night in the dark, so exposure to light late at night can decrease melatonin levels and increase a person's risk of developing cancer. Sleep deprivation also depletes neurotransmitters—chemical messengers in the brain associated with mood regulation—leading to vulnerability to depression and abnormal levels of irritability.

Recent research has demonstrated that sleep deprivation has detrimental consequences, but there is also hope for reversing its effects. Robert Havekes and his colleagues have discovered that the enzyme phosphodiesterase 4D, known as PDE4, contributes to the effects of sleep deprivation. The hippocampus, an area of the brain that consolidates new memories, relies on a cell signaller called cAMP, which is involved in many biochemical processes throughout the body, including the regulation of hormones. In a study of sleep-deprived mice, Havekes and his colleagues found that sleep deprivation resulted in an increased amount of PDE4 activity and a decreased amount of cAMP. By inhibiting PDE4 activity, they were able to counteract the consequences of sleep deprivation. Although future research will undoubtedly generate new methods to counter the results of sleep deprivation, the best remedy for sleep deprivation is still a good night's sleep.

References
Carpenter, Siri. "Sleep Deprivation May Be Undermining Teen Health." American Psychological Association. N.p., n.d. Web. 9 Jan. 2010.
"Enzyme Behind Effects of Sleep Deprivation Discovered." Science Daily: News & Articles in Science, Health, Environment & Technology. N.p., n.d. Web. 9 Jan. 2010.
"Gene Predicts How Brain Responds To Fatigue, Human Study Shows." Science Daily: News & Articles in Science, Health, Environment & Technology. N.p., n.d. Web. 9 Jan. 2010.
Pitts, Jonathan. "Lack of Sleep Side Effects." Health Guidance - Free Health Articles. N.p., n.d. Web. 9 Jan. 2010.

19


MICROWAVES AND CANCER

BY SARAH HSU AND SARA SHU

Health advocates have worried for decades that exposure to the frequencies emitted by microwave ovens and cell phones might be harmful, and the ubiquity of such technology today--especially considering the quantum leap in cell phone usage in recent years--only makes such concerns more pressing. Here we discuss the possible link between microwave ovens, mobile phones, and cancer in light of the available published literature.

Can Microwaves Cause Cancer?

Carcinogenic objects, items known to cause cancer, have always stirred up controversy. Many chemicals and objects are easily identifiable as cancer-causing, and many currently believe that the much-debated microwave oven should rank among the recognized carcinogenic items. A microwave oven contains a generator called a magnetron, which takes electricity from the power outlet and converts it into high-powered 12-centimeter electromagnetic waves called microwaves, which oscillate roughly 2.45 billion times per second. The magnetron directs these waves through a channel called a waveguide into the cooking cavity, where the food rotates on a turntable. When the microwaves penetrate the food, polar molecules (molecules that have positively and negatively charged regions) within the food, such as water, begin to move at the same frequency as the microwaves, rotating rapidly as they adjust to the constantly oscillating waves. This molecular friction produces heat, which rapidly warms the food. However, this type of heating can also damage the molecules surrounding the rapidly rotating polar molecules, deforming or tearing apart the molecular structure of the food as well as the vitamins and enzymes in it. Since microwaves can cause this type of molecular damage in food, many believe they have the potential to cause damage to our bodies as well. There is some evidence that microwave heating can cause chemical alterations in food and that the food may develop free radicals--unstable molecules that attract electrons from their surroundings to stabilize themselves. The microwave-cancer correlation hypothesis states that ingested foods containing free radicals may attract electrons from molecules in the body and start chain reactions, creating more free radicals that ultimately alter cell DNA and prompt cancer if enough foods containing free radicals are ingested.
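The wavelength and oscillation rate quoted above are linked by the relation frequency = speed of light / wavelength, which is easy to check. The snippet below is just that arithmetic; the 12-centimeter figure comes from the text, while treating it as the exact household-oven value (about 2.45 GHz) is an assumption for the example.

```python
# Check that a ~12 cm microwave corresponds to roughly 2.45 billion cycles
# per second, using frequency = speed_of_light / wavelength.
SPEED_OF_LIGHT = 2.998e8   # m/s
wavelength = 0.122         # m; approximately 12 cm (assumed household-oven value)

frequency = SPEED_OF_LIGHT / wavelength
print(f"{frequency / 1e9:.2f} GHz")  # about 2.46 GHz, i.e. roughly 2.5 billion cycles per second
```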

However, the Food Standards Agency claims that microwaved food causes minimal damage. In fact, microwaves are not as harmful as other types of radiation. Of the two broad types of radiation, ionizing radiation (gamma rays and X-rays) carries enough energy to knock electrons out of the atoms and molecules in tissue and to alter chemical reactions in the body. Non-ionizing radiation (radio waves, light, and microwaves), on the other hand, is safer because it does not have the energy to break molecular bonds, a prerequisite for damaging tissue and causing cancer. Besides being a source of "safe" radiation, microwave ovens also produce negligible amounts of it. The FDA regulates microwave ovens: over their lifetimes, they may leak no more than 5 milliwatts of microwave radiation per square centimeter, measured approximately two inches from the oven surface. The FDA states that this restriction ensures that the amount of radiation leaked from a microwave is not enough to harm a person. Thus, the amount of radiation that is speculated to come from a microwave oven has been blown far out of proportion. Additionally, as a person moves away from a microwave oven, the intensity decreases with the square of the distance; the amount of radiation 20 inches from a microwave is approximately one one-hundredth of that at 2 inches. Unless one chooses to spend time with one's nose pressed against a microwave's door glass, the belief that one can get cancer from standing in front of the microwave is indefensible.

References
"Health Hazards of Cell Phone Use." Cell Phone Safety. 24 Dec. 2009. <http://www.cellphonesafety.org/health/>.
"How Cell-phone Radiation Works." How Stuff Works. 24 Dec. 2009. <http://electronics.howstuffworks.com/cell-phone-radiation.htm>.
Lacasse, Marc. "Should You Be Concerned About Cooking Your Food Using a Microwave Oven?" Healingdaily.com. Healing Daily, 2002. Web. 7 Jan. 2010. <http://www.healingdaily.com/ovens.htm>.
"Radiation, Microwaves, and Cancer." CancerHelp UK. Cancer Research UK, n.d. Web. 9 Jan. 2010. <http://www.cancerhelp.org.uk/cancer/questions/microwaves-and-cancer>.
Wayne, Anthony, and Lawrence Newell. "The Hidden Hazards of Microwave Cooking." Mercola.com. Dr. Joseph Mercola, 2010. Web. 7 Jan. 2010. <http://www.mercola.com///.htm>.

20


WHAT ABOUT CELL PHONES?

GRAPHICS BY JESSICA ZENG

Cell phones, which also use electromagnetic wave technology, are also commonly, and perhaps mistakenly, thought to cause brain tumors or cancers. Officially, the radiation that cell phones emit is non-ionizing and therefore theoretically safe. However, some researchers have become uneasy about the long-term effects of this supposedly harmless radiation. Stories occasionally arise accusing cell phones of causing brain cancer, driving short-term studies on the issue. While a number of studies have found no correlation between cell phones and brain tumors, most of these studies focused on people who had been using cell phones for three to five years. Longer-term cell phone use may be a different story. A few studies have shown that using a cell phone for an hour each day for ten years can increase the risk of developing a rare tumor on the side of the head, or other illnesses such as Alzheimer's and Parkinson's. Researchers believe that younger cell phone users face a higher risk of developing tumors because their nervous systems have not fully developed and their skulls are not as thick as those of adults. However, no study to date has provided conclusive evidence that cell phones can cause these illnesses. The Food and Drug Administration declares that "the available scientific evidence does not demonstrate any adverse health effects associated with the use of mobile phones." It is true that high levels of radio frequency energy can rapidly heat biological tissue and cause damage, but cell phones operate at levels well below those at which such heating effects take place, according to the U.S. General Accounting Office. Furthermore, the U.S. federal government places limits on the amount of radiation a cell phone is allowed to emit. Anyone worried about the potential hazards of cell phone radiation can reduce the risk in a variety of ways, such as by using hands-free devices and keeping cell phone talk to a minimum.

21


Neglected Tropical Diseases

BY ELORA LÓPEZ AND MELODYANNE CHENG

An East African boy runs out of the village to greet his father, who is returning home from his weeklong tree-cutting job. As the father approaches, the boy can sense that something is wrong. His dad is covered in red sores and approaches in a befuddled manner. When the family visits the local Red Cross mobile clinic, they find out that the father has contracted sleeping sickness.

The diseases that we often associate with the poverty-stricken, tropical regions of the world are AIDS, malaria, and tuberculosis. These three mainstream diseases have attracted large amounts of attention worldwide because of their widespread consequences and tendency to cause epidemics. What the majority of the developed world has not recognized for decades, however, is that while these three diseases are serious, there are diseases of more or less equal prevalence in the tropics, such as leprosy, sleeping sickness, amoebiasis, leishmaniasis, schistosomiasis, soil-transmitted helminthiasis (STH), onchocerciasis, lymphatic filariasis (LF), cholera, arboviral diseases like dengue hemorrhagic fever and Japanese encephalitis, and blinding eye diseases like trachoma, that often go unnoticed by the media. There is so little funding and awareness for these lesser-known diseases that they are referred to as the neglected tropical diseases (NTDs).

Ascariasis (roundworm), ancylostomiasis (hookworm), and trichuriasis (whipworm) are the most common forms of soil-transmitted helminthiasis (STH), with millions worldwide suffering from infections. These helminths live in the soil and are transmitted to the human intestinal tract mainly through the fecal-oral route, that is, through the mouth via fecal-contaminated food or dirty hands, and a few even through the skin, as in the case of hookworm. These diseases are prevalent in more densely populated areas of the world, and consequently have an expansive spread. Other factors responsible for the spread of these diseases are climatic conditions that prolong the survival of causative agents and vectors, and social factors such as poverty and ignorance. These worm infections deplete the nutrition consumed by their hosts by sharing the food in the intestines, disturbing the absorption of food from the intestine, causing diarrhea, or, in the case of hookworm infections, causing blood to leak from the intestines.

As a result, major manifestations of the diseases are malnutrition and gastrointestinal disorders. Children suffering from malnutrition often have stunted growth and a reduced ability to concentrate and perform well in school. Malnourished adults also have difficulty thinking quickly or performing heavy manual labor, and so these infections perpetuate the low income earned by the hosts they infect.

Among the most neglected of the NTDs is African trypanosomiasis, otherwise known as sleeping sickness. This disease is spread from host to host by infected tsetse flies in rural environments of Africa and affects an estimated 50,000 to 70,000 Africans each year. In order for an uninfected tsetse fly to become infected with trypanosomes, it must first feed upon an infected host, which can be either animal or human. After drinking the infected blood and the trypanosomes within it, the tsetse fly becomes a vector of sleeping sickness. If it goes on to drink from an uninfected host and regurgitates a little of the infected blood into the new host, the new host will be infected as well. The tsetse fly bite is very painful, and within 1 to 3 weeks a red sore will develop. The incubation period of acute trypanosomiasis ranges from 6 to 28 days, and travelers frequently become ill during their trips or shortly after returning home, while chronic trypanosomiasis may not cause symptoms until months to years after travel to an endemic area. Several weeks to months later, depending on the type of trypanosomiasis, other symptoms of sleeping sickness, such as high fever, skin rash, swelling of the face and hands, severe headaches, extreme fatigue, irritability, aching muscles and joints, itching skin, and swollen lymph nodes, will appear and worsen as time passes. Personality changes, progressive confusion, daytime sleepiness with nighttime insomnia (hence the name), and other neurological problems occur after the trypanosomes invade the central nervous system and pass the blood-brain barrier. If sleeping sickness is left untreated, the human host will eventually die, months or several years after infection, depending on the type of African trypanosomiasis.

More than 1 billion people, representing one sixth of the world population, suffer from at least one NTD. The combined fourteen NTDs are estimated to result in 57 million disability-adjusted life years (DALYs) annually. DALYs are a measure created by the World Health Organization to estimate the magnitude of the burden a disease inflicts upon a population, calculated by adding how much a disease shortens an affected person's life expectancy to the number of years a person has lived with the disease.

22


ART BY TIFFANY SIN


The only disease that contributes to more DALYs each year than the NTDs is AIDS. However, the future is beginning to appear less bleak for people affected by NTDs. International organizations are beginning to realize the serious problems that these infections create in poor tropical societies, where a huge percentage of the population cannot learn, work, or live fully because of their ailments. The production and distribution of rapid-impact packages, combinations of five different drugs that can be used to treat the seven most common NTDs (among them STH worm infections, schistosomiasis, lymphatic filariasis (LF), onchocerciasis, and trachoma), will be of utmost importance in diminishing the prevalence of these diseases. The causative agents of all of the above, excepting trachoma, are helminths, and infection can be prevented with one or two drug treatments annually. Organizations such as the Global Network are already making these packages available, claiming that the pills can be manufactured at incredibly inexpensive rates, amounting to fifty cents per person per year. The tools and medicines required to help over a billion people around the world get back to work, school, and their lives have already been developed and can easily be manufactured. At the same time, we must create an environment that hinders the spread of these diseases and take the initiative to educate people in developed nations about them. These simple and cheap treatment and prevention methods may be the key to a tremendous drop in the disease load of the tropical world.
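The DALY arithmetic described above can be written as a one-line formula: years of life lost to early death plus years lived with the disease. The sketch below is a simplified version of the WHO metric as summarized in this article (formal calculations weight years lived with a condition by its severity and may add age weighting and discounting, all omitted here); the example numbers are hypothetical.

```python
# Simplified sketch of the DALY measure described above:
#   DALY = years of life lost to premature death
#        + years lived with the disease.
# (Formal WHO calculations weight the second term by disability severity.)

def dalys(years_of_life_lost, years_lived_with_disease):
    """Burden, in disability-adjusted life years, for one affected person."""
    return years_of_life_lost + years_lived_with_disease

# Hypothetical example: an infection that shortens life by 4 years after
# 12 years of illness contributes 16 DALYs.
print(dalys(years_of_life_lost=4, years_lived_with_disease=12))
```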

23


Wildfire

AIR POLLUTANTS AND ASTHMA HOSPITAL ADMISSIONS: AN OBSERVATIONAL STUDY ON THE 2007 SOUTHERN CALIFORNIA WILDFIRES

Abstract
The 2007 Southern California wildfires induced high levels of air pollutants, putting asthmatics particularly at risk. To determine the specific effect wildfire air pollutants had on populations at risk for asthma, the concentration of pollutants and the number of asthma hospital admissions in the periods during and surrounding the wildfires were examined. This study has two parts. First, the association between ambient pollutant concentration and the fires was assessed by comparing daily pollutant concentrations during the 2007 fire period with those of corresponding periods in non-fire years. PM2.5 (particulate matter less than 2.5 micrometers in diameter) was found to be most correlated with the fire period; other pollutants had no conclusive relationship to the wildfires. In part two of the study, the extent to which asthma hospital admissions were affected by the fires was determined by comparing asthma hospital admission counts during the 2007 fire period with those of corresponding periods in non-fire years and with total (asthma and non-asthma) hospital admissions. Of the five age groups considered, group one (0-1 years old) was omitted due to insufficient data; group two (1-17 years old) experienced an increase in admissions after the fire; group three (18-34 years old) experienced no increase in admissions either during or after the fire; group four (35-64 years old) experienced its greatest increase in admissions after the fire; and group five (over 65 years) experienced its greatest increase in admissions during the fire. It was concluded that high levels of wildfire PM2.5 are most detrimental to the elderly. From these results, policy and prevention measures can be focused on sensitive age groups in the periods in which they are most vulnerable.

Introduction
Driven by 100-mile-per-hour Santa Ana winds, wildfires ravaged Southern California in late fall 2007. 522,514 acres were burned, 3,290 structures destroyed, and a total of 592,500 evacuations ordered (Schwarzenegger, 2008). In many cases, health was also compromised. Particulate matter (PM), the primary pollutant associated with wildfire smoke, consists of small solid particles or liquid droplets suspended in the air. Exposure to particulate matter may cause cardiorespiratory morbidity if the particles penetrate into the lungs and bloodstream. Numerous studies have shown the effect of particulate matter on cardiorespiratory hospital admissions during non-fire periods, in which PM fluctuations are smaller and more routine. Most have concluded that PM concentration is correlated with cardiorespiratory admissions. Other air pollutants, such as ozone, sulfur dioxide, and nitrogen dioxide, have shown more varied and less significant associations (Schwartz, 1994; Schwartz, 1995; Lipsett, 1996; Sheppard, 1998; Peters, 2001; Dominici, 2006). However, studies on wildfire pollutants, particularly non-particulate pollutants, are limited. Nonetheless, such studies are necessary because extrapolation of non-fire trends to fire conditions may not be accurate. A 2009 study by Wegesser et al. compared mouse bioassays and found that wildfire PM is more toxic than the particulate matter in ambient air during a non-fire period. Furthermore, the factors affecting hospital admissions in a wildfire period may include conditions such as hospital accessibility, evacuation, and other disruptions, in addition to pollutant level, which accounts for most non-fire morbidity. Being able to determine the effect of pollutants on hospital admissions during wildfires would aid prevention and policy implementation. Existing literature in this area has reached inconsistent conclusions, with scope confined mostly to the effect of particulate matter (PM) on the general population (Emmanuel, 2000; Sastry, 2002; Mott, 2003; Johnston, 2002; Smith, 1996; Cooper, 1994; Jalaludin, 2000; Viswanathan, 2006; Delphino, 2009). Thus, more research is needed to assess the effect of both particulate and non-particulate pollutants on the morbidity of different subpopulations. The purpose of this study is to (1) determine the effect of the fires on the concentration of air pollutants (PM2.5, PM10, nitric oxide, nitrogen dioxide, ozone, and sulfur dioxide) and (2) determine the relationship of the periods before, during, and after the wildfires to the number of asthma hospital admissions for five different age groups during the same periods. Knowing (1) and (2), an association between pollutants and asthma hospital admissions can be inferred.

Methods
In order to determine the effect of pollutants associated with the 2007 Southern California wildfires on morbidity, secondary data on pollutants and asthma hospital admissions were obtained for the seven counties affected by the wildfires: San Diego, Los Angeles, Ventura, Orange, Riverside, San Bernardino, and Santa Barbara. Graphical and numerical displays were created for three periods: four weeks before the wildfires (09/22/2007-10/19/2007), the period of the wildfires (10/20/2007-11/09/2007), and four weeks after the wildfires (11/10/2007-12/07/2007). The period before the wildfires is used as a baseline for comparison. Corresponding data from 2006 and 2008 were also acquired for use as controls. The data used were obtained from two sources. Raw data on the pollutants PM2.5, PM10, nitric oxide, nitrogen dioxide, ozone, and sulfur dioxide were acquired upon request from AQS Data Mart, the EPA's air quality database; these data were derived from air quality monitoring stations in the affected counties. Data on asthma hospital admissions, aggregated by county and age, were requested and acquired from the California Office of Statewide Health Planning and Development (OSHPD).


Total (asthma and non-asthma diagnosis) hospital admission counts were also obtained as a control, to account for changes in asthma admissions due to the overall change in hospital attendance caused by the wildfires. The five age groups provided by the OSHPD were: (1) 0-1, (2) 1-17, (3) 18-34, (4) 35-64, and (5) 65 or greater.

Results
In part 1, PM2.5 showed the highest association with the wildfires, increasing from the start of the fire and peaking around the second day of the fire (Figure 1). This is expected because PM2.5 is the primary component of smoke and can obstruct or irritate airways, leading to asthma exacerbations. PM10 was measured daily in only one county (San Bernardino), and this lack of data makes comparison with the fires inconclusive. There was no notable relationship between the non-particulate pollutants examined and the wildfire period. For example, the 2007 ozone progression throughout the wildfire period appears as if ozone concentration is negatively associated with wildfire (Figure 2); however, closer inspection and comparison with 2006 and 2008 data show that the trend is seasonal: ground-level ozone concentration decreases every fall to winter. Table A, with mean PM2.5 concentrations by county, shows this result as well. For each county, PM2.5 concentration is highest during the period of the wildfire in 2007.

Part two of this study looked at asthma hospital admissions, which vary by age. Numerical analysis for all counties shows that 13.34% more hospital admissions were seen for Age Category 5 in 2007 than in 2006 during the period of the wildfires, after accounting for total hospital admissions and seasonal variation. This means that the elderly may be more affected by PM2.5 and wildfire exposure than younger populations. An increase in admissions in the period after the wildfire was also seen (5.49%), but it was smaller than the increase during the fire. This suggests that seniors tend to seek immediate medical care. Age Category 3 (ages 18-34) seems least affected by wildfire and PM2.5, showing only slight increases in admissions in the two weeks following the wildfire. Table B shows that 2007 asthma hospital admissions for Age Category 3 increased only 4.35% from 2006; furthermore, comparison with 2008 shows that 2007 hospital admissions for asthma are a 9.31% decrease from admissions in 2008. In the numerical analysis of all counties, it was also found that Age Category 3 experienced fewer hospital admissions in the period after the wildfire as well, specifically 16.36% fewer than in 2006. This may be because young adults are less sensitive to wildfire PM2.5. There is also no increase in admissions after the wildfire period, which would be expected if morbidity were present but ignored until after the fires. Interestingly, Age Category 2 (1-17 years) experienced a significant increase in admissions in the period after the wildfire, specifically 22.83% compared to 2006. Only an increase of 3.29% was experienced during the period of the wildfire. This may have been due to the delayed effect of wildfire smoke or to reporting delays.
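A sketch of the comparison described in this section follows: percent change in asthma admissions during a given period relative to the same period in a control year, normalized by total admissions to account for overall changes in hospital attendance. This is not the study's actual code; the column names and the counts in the example are assumptions made up for illustration.

```python
import pandas as pd

# Illustrative sketch of the adjusted percent-change comparison described above.
# Columns assumed: year, period, asthma_admits, total_admits.
def adjusted_pct_change(df, period, fire_year=2007, control_year=2006):
    """Percent change in the asthma share of admissions, fire year vs. control year."""
    sel = df.set_index(["year", "period"])
    fire = sel.loc[(fire_year, period)]
    ctrl = sel.loc[(control_year, period)]
    fire_rate = fire["asthma_admits"] / fire["total_admits"]
    ctrl_rate = ctrl["asthma_admits"] / ctrl["total_admits"]
    return 100.0 * (fire_rate - ctrl_rate) / ctrl_rate

# Hypothetical counts for one age group during the fire period.
data = pd.DataFrame({
    "year":          [2006, 2007],
    "period":        ["during", "during"],
    "asthma_admits": [300, 352],
    "total_admits":  [40000, 41400],
})
print(f"{adjusted_pct_change(data, 'during'):.1f}% change vs. control year")
```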

25


Age Category 4 (35-64 years) showed slight increases during and after the wildfires in the San Diego graphical display. Table B clarifies this trend: compared to 2006, 2007 showed a 0.93% increase before the wildfire, an 8.27% increase during the wildfire, and an 11.20% increase after the wildfire. Middle-aged adults are affected by PM2.5, as demonstrated by the increases during and after the wildfires. However, unlike the seniors in Age Category 5, more admissions were seen after the wildfires than during them, suggesting a delayed effect or delayed treatment.

Discussion
To the author's knowledge, this is the first study to investigate the effect of multiple pollutants on multiple age groups during short-term wildfires. Furthermore, the controls used—specifically, corresponding non-fire time periods in 2006 and 2008—are unique and reaffirm the significance of the results. Moreover, all graphical and numerical displays were compared between years, between counties, and, in the case of asthma, with total hospital admissions to ensure the significance of the results. The resulting conclusions were unexpected. It was originally hypothesized that changes in the hospital admissions of a particular age group could be traced to a certain pollutant. However, this was not the case, since the only pollutant with a significant association to the wildfires was PM2.5. No pollutant demonstrated patterns directly relatable to those seen in the hospital admission data by age. Rather, it may be rationalized that each age group has a different level of sensitivity to PM2.5 and wildfires; this is plausible since there are biological disparities between age groups (e.g., young children have narrower and more sensitive airways).

The scope of this study was severely limited by the amount of available data. Because all counts between 1 and 4 are masked, additional subsetting variables such as demographic factors could not be included. If masking were removed, demographic data such as gender, race, and nationality could be investigated, since asthma affects different demographic groups differently. In future studies, statistical regression could also be used to further reinforce the results and the correlation between pollutants and asthma hospital admissions. Moreover, from the graphical displays, it was noted that increases in hospital admissions lagged an air pollution event by around one to three days. In future studies, a variety of different lags (e.g., two-day average, three-day average) should be tested to find the one that best matches the data.

In conclusion, policy and prevention at any time should be aimed at the most sensitive populations. During wildfires, seniors aged 65 and older should be expected in hospitals and given the most care. Afterward, more children (1-17) and middle-aged adults (35-64) should be expected in asthma-related hospital visits. In addition, more preventive care should be directed toward both children (1-17) and middle-aged adults (35-64) during the period of the wildfire, so that existing illness does not worsen during the period in which treatment is delayed.

26


The Negative Effects of Seawater Air Conditioning

BY ELIZABETH BRAJEVICH

ABSTRACT

Many projects that claim to be "green" can have harmful effects on the environment. A new technology, the Honolulu Seawater Air Conditioning (HSWAC) system, consists of a large uncovered pipeline that draws 11-degree-Celsius water from the depths of Hawaii's ocean and uses it to cool closed-circuit air conditioning systems. The water is returned five degrees warmer, with low dissolved oxygen (DO) levels and a macronutrient concentration that is too high to match its release-point surroundings. An inappropriate macronutrient concentration could cause algal blooms, which would lower the DO to levels that are fatal to anchored benthic (seafloor) organisms. The return water comes back at 14 degrees Celsius (Larson, Ingvar) into water that is 18 degrees Celsius. Coral cannot live in temperatures below 17 degrees Celsius; therefore, 62 square meters of coral will die as a result of the system's release water. If the inappropriately tethered pipeline were to roll in a storm, a minimum of 2,000 square meters of coral would be crushed, causing a traumatic habitat loss for the fishes. When the pipeline is being placed, not only will it crush corals, it will also cover surrounding corals in silt, which could cause devastating damage. The pipeline itself is a huge hazard, with the ability to sweep in unsuspecting invertebrates and fish, including sharks, manta rays, and the organisms that species like the endangered monk seal, olive ridley, and loggerhead turtle feed on. Thermal pollution is actually reduced by the system, which is 8% more efficient than electricity production. Despite the economic and some environmental benefits, a renewable energy source that will "save the environment" means nothing if it is surrounded by a destroyed one. The pipeline entrance must be covered with protective screening, and the return depth must be increased so that water qualities are consistent with the surroundings.

PHOTOS BY JENNIFER CHENG


INTRODUCTION

27


Many researchers worldwide are developing and implementing new environmentally friendly technologies. The HSWAC system appears to be the perfect renewable-energy solution to Hawaii's high energy costs and cooling bills; however, further examination reveals problems. My goal is to show that the system may be causing more harm than help. HSWAC works by pumping deep, cold water from the ocean off the coast of Hawaii. The cold-water intake pipe is 1.6 meters in diameter and reaches 520 meters deep. The cold seawater is used to chill water in a closed-circuit air conditioning unit that cools buildings (the seawater and the closed circuit never mix). The ocean water is then returned, 5 degrees Celsius warmer, to a depth of 60 meters (HSWAC) via diffuser heads. The relocation of the water and the change in its temperature create the potential to harm wildlife. DO levels will drop because oxygen becomes less soluble in warmer water; in addition, growth in algae species will create a higher demand for oxygen among living things (raising BOD levels). This combination can be lethal to marine life. The government is allowing use of the system with minor modifications, most likely because of how economically beneficial the system is. HSWAC reduces cooling costs by nearly 90%: "Conventional air conditioning systems consume four to twelve times more electricity than equivalent SWAC systems" (Honolulu Seawater Air Conditioning). While I have great respect for the new technology, the possible negative effect on the benthic environment is frightening. The negative effects on dissolved oxygen levels, oceanic currents, and endangered species remain unknown to officials, and that is what I aimed to change in the course of this study. If the HSWAC is used as proposed in Hawaii as an environmentally friendly system of cooling, then it will inadvertently cause severe environmental problems. The use of the system will harm wildlife due to temperature changes, and the gradual warming of the local area could cause algal blooms, a high BOD, and a low DO level.

METHODS

I began by doing thorough background research on all aspects of my project: dissolved oxygen levels, oceanic currents, thermal pollution, and benthic environments. Next, I downloaded and read the finalized E.I.S. (Environmental Impact Statement) for the project. My next step was to determine the finalized plans for the pipeline construction; I had to find the precise geographical location, the actual intake and outflow depths, the actual measurements and dimensions of the pipeline, and the actual way in which the HSWAC will operate. I also began to contact various outside sources, including marine biologists and the engineers of the SWAC, for more details pertaining to the measurements and safety of the system. Researching local animals, the depths they could dive to, and where they fall on the endangered-species spectrum was a crucial component of my research. Determining the effect of the single system, and of increased system use, on currents was the next step in the study. I then determined whether there would be an effect on the balance between photosynthesis (primary production) and community respiration as a result of the system. I checked periodically for modifications to the construction plan involving my areas of focus. Using math to support my findings was an important aspect of my study: it was important to calculate the effects of rolling pipelines, improper water release temperatures, and algae counts, and how all of this related to the overall impacts of the study. I then compiled the results and concluded how to modify the system to minimize these negative effects.

RESULTS

Several of my concerns with the system were proven valid. The water discharged via diffuser heads at 45 meters deep will disrupt the temperature balance because the water will be released in warm, shallow waters while being 5 degrees cooler than its surroundings. The water is being released in a biotope of scattered corals. When the water returns from the system, it will be warmer than it was at intake, yet too cold for the environment into which it is released. The water temperature is too cold for the corals to survive where the water is released, and 52 square meters of coral will die, preventing reef development there.

Also, coral destruction due to hurricanes would be significantly increased by the risk of the pipeline crushing the corals that surround it. Precautions to keep the pipeline from rolling in the case of a hurricane include gravity anchors holding the pipeline in place, but there will be only about 850 of these anchors for a pipeline that stretches approximately 8 kilometers. After comparing studies and doing background research, I determined that this amount of anchorage is insufficient. In the case of a hurricane, if the pipeline were to roll just four meters, the affected area would be 32,000 square meters, destroying 1,600 square meters of coral. Several fish species inhabit the coral, and the coral is their only form of protection from predators; the destruction of local coral would greatly decrease the local populations of these fishes, which can only be found in or near coral structures.

Another threat that construction will create for the coral is coverage in silt. When the pipes are laid at night, the silt that rises can cover and damage the coral. Silt curtains would not be effective because the water is too choppy to use them at the angle required to protect the coral. The polyps of coral are most active at night, when they come out and stretch open to find food. As the sediment settles, it can bury the corals or force them to expend a lethal amount of energy to keep their surfaces clean (ICRF). If a 3-meter coral is damaged, it can take up to 300 years to recover. If the pipeline sends sediment just one meter in each direction from where it is laid, it will bury and coat at least 800 square meters of coral. But these are not the only problems. The released water's macronutrient concentration will be higher than that of the water it is released into, potentially causing algal blooms. Blooms can block light from reaching the organisms that need it, lower DO levels, and increase BOD. Organisms can only live for so long if DO levels are insufficient; the algae will use the oxygen that is crucial to the other species living at the release location. There will be a low DO level in the 1,050-square-meter release area, harming stationary organisms like coral.

The intake end of the pipeline, at 520 meters deep, can be reached by a few different species, posing another hazard. With a diameter of 1.6 meters, the average ten-year-old could stand comfortably in one of the pipes. The pipes are uncovered for economic reasons: if they were covered by a protective screen to prevent animals from entering, expensive remotely operated underwater vehicle technologies would be needed to clear any blockage. The only screen to filter out organisms is there to protect the machinery and is located where the pump begins, directly before its entrance to the system. This screen is checked every four to six months, and fish and invertebrates are commonly found dead, unable to survive in the conditions. In a similar SWAC cooling system in Hawaii, several manta rays were found dead by employees.
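The pipeline-roll figures quoted above are straightforward arithmetic, and the short sketch below reproduces them under the stated assumptions: an approximately 8-kilometer pipeline that rolls 4 meters sideways, with coral making up about 5% of the swept seafloor (the 5% fraction is inferred from the article's own numbers, not stated explicitly).

```python
# Quick check of the pipeline-roll figures described above.
PIPELINE_LENGTH_M = 8_000   # ~8 km pipeline
ROLL_DISTANCE_M = 4         # sideways roll assumed in the scenario
CORAL_FRACTION = 0.05       # inferred so that 1,600 m^2 of the swept area is coral

swept_area = PIPELINE_LENGTH_M * ROLL_DISTANCE_M      # 32,000 m^2
coral_destroyed = swept_area * CORAL_FRACTION          # 1,600 m^2

print(f"swept area:      {swept_area:,} m^2")
print(f"coral destroyed: {coral_destroyed:,.0f} m^2")
```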

28


to be made. A renewable energy source that will save the environment means nothing if it is resting atop a destroyed one. The results I determined were consistent with the logical outcomes that could be derived from available resources. The data and research that I compiled supported my hypothesis, proving that the HSWAC can have unforeseen negative effects on its surrounding environment. A project of this nature is susceptible to error because if any of the sources I researched are deemed invalid, my project also looses credibility. Despite this possibility, I am sure that this undertaking this project was one of the best decisions I could have made to help the environment. The results are a clear reflection that even “green” innovations can have harmful effects.

dead by employees. Every three months enough dead sea life to fill a five gallon bucket is emptied from the system (Jensen, Dale). Endangered sea turtles are at risk for being sucked into the pipe. All of these turtles feed on cnidarians, crustaceans, mollusks, fish and algae; they forage in benthic hard and soft bottom habitats. The turtles as well as their prey could enter the pipe both of which could have traumatic environmental outcomes for the local turtles. Also the squid, octopus and fish that endangered sperm whales feed on could be swept into pipes of this nature. While there would be no substantial negative effects with only one pipe system. Significant damage is plausible if use of systems increases. The only safe solution to this problem would be to cover the intake pipe with a soft wired screen. With holes sized so that small organisms and silt would not cause severe blockage. An R.O.V. would have to be used to clear the screen if obstructions handicapped the pipes operating capabilities. Without a covering system the pipe cannot be safe. Thermal pollution will actually be reduced by the project, because electricity production is only approximately 32% efficient. The HSWAC is 40% efficient; therefore it is reducing more thermal production than it is creating. DISCUSSION

PHOTO BY ELIZABETH BRAJEVICH

DISCUSSION

Although thermal pollution is reduced, the HSWAC needs to be redesigned in several respects to avoid detrimental effects on the unique and valued Hawaiian environment. The results show that the improper return-water temperature is one of the most significant problems with the system. The company is most likely unwilling to return the water at a proper temperature because the capital cost of the additional pipe length needed to do so would be too great. If that is the case, it should not market itself as dedicated to preserving the environment while knowingly jeopardizing that environment because of financial constraints. A puzzling discovery during my study was that limiting the use of air conditioning in Honolulu was never considered as an alternative. Air conditioning is far from necessary for human survival, and a true environmentalist would not sacrifice the well-being of aquatic life to be comfortably cool in a hotel room.

A small but real number of organisms will be affected during construction and operation; this is not a penalty-free project by any means. The pipeline rolling during a hurricane is a legitimate concern. My research shows that the anchoring system used to secure the HSWAC pipeline failed for a pipeline serving another application during Hurricane 'Iwa, a mild category-two storm: "The pipeline was moved sideways about 400 ft. and smashed aside into the hard coral" (Larson, Ingvar). The HSWAC should not be put in place until the financial needs can be met for the necessary safety modifications to be made.

A renewable energy source that will save the environment means nothing if it rests atop a destroyed one. The results I obtained were consistent with the outcomes that could logically be derived from the available resources. The data and research I compiled supported my hypothesis that the HSWAC can have unforeseen negative effects on its surrounding environment. A project of this nature is susceptible to error: if any of the sources I relied on is deemed invalid, my project also loses credibility. Despite this possibility, I am sure that undertaking this project was one of the best decisions I could have made to help the environment. The results clearly show that even "green" innovations can have harmful effects.

References

Larson, Ingvar. Hawaii State. Honolulu Seawater Air Conditioning Final Environmental Impact Statement. Honolulu: 2009. Print.

"Honolulu Seawater Air Conditioning." Honolulu Seawater Air Conditioning, LLC. Web. 13 Jan.-23 Mar. 2010. <http://honoluluswac.com/index.php>.

"ICRF." The Indonesian Coral Reef Foundation. Yayasan TERANGI, 16 July 2006. Web. 3 Mar. 2010. <http://www.terangi.or.id/en/index.php?option=com_cont>.

29


N|OT T|HERE

FREEFALL…

Etched in the unspoiled blackness was a shape, shuddering in one moment, still in the next. Kasrin's body drifted in lazy random patterns, but seemed to stay in the same place, with nothing to judge her change of position by. Her head curled in and arms wrapped protectively around her knees. There was no sound, no thought, no feeling—the silence was undisturbed, natural, and completely incognizant of the existence of such a thing as noise.

…DREAMING.

She dreamed she was falling. Maybe it was part of the human subconscious, wired into the neurons before birth; the dream about falling. She'd been told that somewhere, sometime. But this was her first dream. There were none before, and there would be none after. And she was falling.

There was no rush of wind, no sense of pull, no sickening dread as the ground spiraled closer; Kasrin should not have known she was falling at all, but she was. There was no fear, because there was no imminent death. Instead, Kasrin curled in on herself and let her body shake and shudder and twitch and go still, then repeat the cycle all over again.

It was silent. She was the only thing that existed. Her body was a cacophony of quiet, nerves mutely screaming at her and a too-fast heartbeat. Only her brain remained calm and austere. She was falling, but that didn't matter. Nothing mattered. She was the only one in existence…her and the silence that did not allow for sound.

Sound? What was sound?

—Blink

Kasrin opened her eyes and she was no longer in the blackness. Slowly she uncurled herself; nothing hurt, nothing tingled, but her mind was aflame with raw energy. She immediately knew where she was.

[Kasrin]. [I'm done.]

Kasrin's head hurt, but she stubbornly switched to real-noise. Her throat seemed too dry for it, but she tried. "I'm done," she repeated, out loud in real-noise, her voice hoarse and cracking.

"How was it?" the monitor asked, in real-noise as well. "You did surprisingly well."

Kasrin wondered if she should take offense at the "surprisingly", but was too full of the sensation of her mind's newfound energy to really care. "It was…" Kasrin paused. How could she describe it? It was a state of existence that could never truly be. Now she was back in real-noise, and already her memories of it were fading. It couldn't have ever happened. What had just happened?

"Feelings of forgetfulness and denial are normal," the monitor said. [Would you like to go back to sim-hear?]

"No," Kasrin said out loud. She tried to remember that sensation of no sound, no noise, of the concepts of those things completely nonexistent. Scraps of feeling from floating in the blackness came back to her. She tried to hold on to those. "No, I'm fine."

The monitor gave her a cursory once-over. "You seem to be fine, Kasrin," she said. "You're free to leave."

"Thanks," Kasrin said absentmindedly, walking over to the exit.

"Kasrin," the monitor said when Kasrin was halfway there.

"Yes?" She looked back, a little annoyed at being called again.

The monitor smiled. [Congratulations.]

[Kasrin.]

"Real-noise," Kasrin reminded Sander.

He shrugged lazily. [Thought you might prefer this way.]

"I don't," Kasrin told him.

THE FIRST INSTALLMENT OF ANGELA QIAN'S THREE-PART ORIGINAL SCIENCE FICTION STORY.

[Most people do, after being in there.]

"Stop that," Kasrin snapped. "Anyway, the before and after aren't any different."

Sander yawned, and spoke aloud in real-noise. "Of course it's different. Now you're of age." He smiled at Kasrin, the grin as lazy as his shrug. "Congratulations."

The same word had been used by the monitor. Kasrin didn't like it for some reason. She kept thinking of the silence that was not silence because there was nothing except silence. Why give something a name if it was so omnipresent? There was no alternative, opposite, or other option. There was no state of things except that unbroken stillness. Kasrin knew that was what she'd been made to believe, for the brief wonderful interlude, and she knew that she had forgotten the real sensation. She wanted to go back.

"I know I'm of age," she said moodily. Was it her desire that made her so moody?

"How did you feel?" Sander arched a delicate eyebrow. "Afterwards, I mean."

"I felt like how everyone told me it'd feel." Kasrin paused, lapsing back into sim-hear accidentally. [Like my mind was on fire. It was amazing.]

[So now you've joined the cult.]

[But I still don't—] Kasrin stopped, scowling. "I still don't get why we do it."

"The vitfit?"

"I don't like that word," Kasrin said. "Vitfit sounds so…so vulgar, so colloquial, for something like that."

Sander shrugged. "That's what people these days call it. Vitfit. And you know what it's for. You've been taught it since the day you were born. Before, even."

"I don't get it," Kasrin repeated. "I never got it. We're born. We talk in real-noise and sim-hear. Then we get put in the…vitfit…and afterwards, what?"

[Afterwards, you go back to the vitfit whenever you want. It's like being able to go into bars.]

"Stop that sim-hear," Kasrin snapped.

[You're the only person I know who doesn't like sim-hear. Why are you so against it? There's no difference between it and real-noise.]

Kasrin shrugged her thin shoulders together, face turning petulant. "I just don't. I don't know why. Real-noise is more natural."

"Sim-hear is just as natural as real-noise."

"Just don't talk to me in sim-hear, okay?"

"You do it yourself sometimes."

"Only today, because that vitfit messed me up." Kasrin put a hand to her head and winced. "Why do people like sim-hear so much more after using the vitfit anyway? Even me. I used to hate sim-hear, and now I use it accidentally all the time."

Sander shrugged again. "Sim-hear is encouraged. The government wants us to use sim-hear, and everyone likes sim-hear better than real-noise."

"Not everyone."

"You're the exception."

"I just feel like the vitfit is…"

"Sacred?"

"No." Kasrin thought, and lapsed back into sim-hear without noticing. [I feel like everyone thinks it's sacred, and for a while—after I got out—I thought so too. But just now, I thought of it as something horrible. Something terrifying.]

[You're crazy.] Sander shrugged. [And Kasrin?]

[What?]

[Now you're the one using sim-hear.]

Kasrin scowled.

30


President

Alice Fang

Blog Editor

Editor-in-Chief

Angela Qian Paul Ho

Assistant Editor-in-Chief

Administrative Editor-in-Chief

Ling Jing

Rebecca Su Angela Zou

Design Editor

Jennifer Cheng

Graphic Editor

Michelle Oberman Claire Chen

Assistant Graphics Editor

Wendy Zhang

Senior Editor

Albert Chen Murong He Praneet Mylavarapu Siddhartho Bhattacharya

Chemistry Editor

Michelle Sit Justin Song Sarah Hsu

Biology Editor

Florine Pascal Sarah Watanaskul Becky Kuan

Physics Editor

Lauren Sweet Sharad Vikram

Noor Al-Alusi

Leadership Team

Michelle Kao Siddhartho Bhattacharya Melodyanne Cheng Parul Pubbi Praneet Mylavarapu Albert Chen Sharon Peng Sumana Mahata

Secretary

Maarya Abbasi Yuri Bae (attendance and staff) Sara Shu (articles) Tavia Sin (graphics)

Assistant Webmaster

Tiffany Sin

Staff Authors
Adrianna Borys, Albert Chen, Alice Fang, Angela Qian, Angela Zou, Apoorva Mylavarapu, Avinash Chaudhory, Bethel Hagos, Bhavani Bindiganavile, Cassie Sun, Choohyun (Kristine) Paik, Daniel Liu, David Chang, Eden Romm, Elora Lopez, Emma Dyson, Erin Kim, Ethan Song, Eva Lilienfeld, Florine Pascal, Frank Pan, Hana Vogel, Harshita Nadimpalli, Howon Lee, Hyeimin (Lucy) Ahn, Jourdan Johnson, Justin Song, Kiernan Panish, Kira Watkins,

Kyle Jablon, Lauren Sweet, Ling Jing, Lucy Ahn, Maarya Abbasi, Margaret Guo, Maria Ginzburg, Mariam Kimeridze, Marina Youngblood, Mary Ho, Melodyanne Cheng, Michelle Kao, Michelle Oberman, Michelle Sit, Mimi Yao, Mitali Chansarkar, Murong He, Myung-hee (Rachel) Lee, Nandita Nayyar, Nathan Manohar, Noor Al-Alusi, Parul Pubbi, Paul Ho, Peter Khaw, Praneet Mylavarapu, Rebecca Kuan, Rebecca Su, Rekha Narasimhan, Ruochen Huang, Sara Shu, Sarah (Hye-In) Lee, Sarah Bhattacharjee, Sarah Hsu, Sarah Kwan, Sarah Watanaskul, Selena Chen, Serin You, Shannon Lee, Sharad Vikram, Sharon Peng, Siddhartho Bhattacharya, Steven Shao, Sumana Mahata, Summer Bias, Tavia Sin, Tenaya Kothari, Tiffany Sin, Tyler Simowitz, Yuri Bae.

Graphic Designers
Selena Chen, Sarah Kuan, Mary Ho, Summer Bias, Sara Shu, Hyeimin (Lucy) Ahn, Apoorva Mylavarapu, Sarah Bhattacharjee, Michelle Oberman, Choohyun (Kristine) Paik, Serin You, Cassie Sun, Lucy Ahn, Tavia Sin, Ling Jing, Rama Gosula, Megan Chang, Crystal Li, Amber Seong, Claire Chen, Jennifer Kim, Sikyung Lee, Hanna Lee, Wendy Zhang, Catherine Li, Angela Wu, Sarah Gustafson.

Many thanks to the Scientist Advisory Board, an international team of experts that advises the editors of Falconium and reviews articles before publication. Visit www.falconium.org/advisoryboard for more information on the Board.

31


Vol 2, No 4

Falconium 2010-2011

