The Franklin 11, Spring 2024


What are the risks of integrating AI into cybersecurity systems?

Introduction

The recent boom in artificial intelligence (AI) has led the majority of fields, from medicine to teaching, to reconsider the way they run their systems. AI can increase efficiency and automation, and it is revolutionising many fields. However, in its current form it is by no means a perfect solution: it has a tendency to produce “hallucinations”, presenting false information as factual, and it is prone to bias depending on the data it was trained on.

Cybersecurity is a field of computing dedicated to protecting systems and networks from attacks, and there is currently a huge undersupply of people in these jobs. The demand is expanding rapidly and is outpacing the supply of qualified professionals. In theory, AI should fill many of these roles (if mainly out of necessity). For example, security operation centres (which are responsible for analysing data for malicious activity) would benefit: much of this work is done by the least experienced tier-one analysts, and it is tedious and better suited to AI systems. AI is highly attractive to companies for this reason, and it is widely considered a valuable acquisition: it is able to work at far greater scales than humans can manage, vastly increasing efficiency.

This literature review aims to examine some of the advantages and disadvantages of integrating AI into cybersecurity systems, as well as possible solutions to these issues. It aims to reach a clear conclusion about the potential risks of AI and whether they are outweighed by the potential benefits, and to examine how AI is currently being used in industry.

Different types of models

AI is specifically concerned with constructing machines that simulate human intelligence. However, this is a very broad summary: there are many different subsections of AI. Machine learning refers to the ability of a machine to “learn” from experience rather than requiring explicit programming, and it comes in supervised and unsupervised forms. In supervised learning the machine is first fed a set of training data that comes with annotations (labels) as part of the dataset. Unsupervised learning uses unannotated data, so the machine must make predictions by itself; this type of learning is often used to look for patterns, in areas like financial fraud. Deep learning is also used for pattern recognition and is composed of layers of neural networks, which are groups of processing nodes. The more layers, the deeper the learning: this allows the model to break problems down further through the various layers.

Deep learning architecture attempts to simulate the processes of the human brain, hence the use of neural networks. Machine learning and deep learning are different: machine learning involves parsing data and making decisions from annotated data, whereas deep learning makes decisions based on its neural network (which uses an unsupervised learning system). AI systems work via a combination of these different processes. The term ‘AI’ as used in this article refers to these systems as a whole, assuming that they work through a combination of these processes. However, it is worth noting their differences, since these multiple perspectives are crucial to solving challenges posed by adversarial attacks (through ‘ensemble learning’).
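To make these terms concrete, the short sketch below contrasts a supervised classifier (trained on labelled data) with an unsupervised clustering algorithm (given no labels at all). The synthetic data and the choice of scikit-learn are illustrative assumptions, not taken from any of the systems discussed in this review.

```python
# Minimal sketch of supervised vs unsupervised learning (assumes numpy and scikit-learn).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "behaviour" features: the first 100 events are benign, the next 100 malicious.
benign = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
malicious = rng.normal(loc=3.0, scale=1.0, size=(100, 2))
X = np.vstack([benign, malicious])

# Supervised learning: the training data comes with annotations (labels).
y = np.array([0] * 100 + [1] * 100)              # 0 = benign, 1 = malicious
classifier = LogisticRegression().fit(X, y)
print("supervised prediction:", classifier.predict([[2.8, 3.1]])[0])

# Unsupervised learning: no annotations, so the model must find the pattern itself.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("unsupervised clusters found:", sorted(set(clusters)))
```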

Range of Applications

As AI becomes more widely used, its applications are broadening and certain types of models are becoming more specialised. However, the continual advance of technology poses challenges to industry today. Virtually all industries have been impacted, so it has been difficult to choose an area to specialise in for this project. After several weeks’ research I chose to specialise in cybersecurity, as the modern world would look very different without it in virtually all sectors.

Cybersecurity is vital to maintaining safety online. According to a statistic from November last year, the global cost of cybercrime is estimated to rise to $13.82 trillion by 2028, which demonstrates how critical cybersecurity has become. The recent partnership between the American Productivity and Quality Centre and the IBM Institute for Business Value helps demonstrate the importance of AI in the cybersecurity sector. Many employees in the sector already use AI regularly for its security capabilities: according to one study, 64% claim to be using it currently, with a further 29% considering it. Other applications in the field include behavioural analysis, user authentication (including password protection), and phishing and malware detection and prevention.

The potential for more sophisticated levels of criminal activity arises alongside these advancements. It is therefore vital that scientists undertake research into data protection so that it is able to withstand more advanced cybersecurity breaches.

How are companies using AI in cybersecurity at the moment?

Many companies are currently integrating AI for cybersecurity purposes. IBM, one of the leading companies creating this technology, has several case studies showcasing companies which have integrated these systems. Sutherland Global Services, a company dedicated to delivering customer experience (for example, through language translation or improving digital engagement), has recently integrated AI into its systems.

Effective cybersecurity is of paramount importance to Sutherland, as it handles a lot of sensitive information relating to customers. Before implementing AI, the company relied heavily on manual threat detection, much of it done through human intuition. It has now integrated IBM’s Security QRadar SIEM, an AI-powered system able to detect threats and respond quickly to ransomware. As a result, the company has seen fewer false positives, faster mean detection and response times, and greater threat intelligence.

A further example, Palo Alto Networks, uses a combination of machine learning and AI technology to create a cloud-based security framework which allows users to access data anywhere. Its integration of AI has also enhanced its threat detection and threat intelligence, with machine learning algorithms that analyse web page data serving as one example. Through the use of pattern recognition, the AI can take preventive measures against potential threats more efficiently and offers enhanced security as a result.

Both case studies illustrate that the current use of AI in cybersecurity has achieved mostly positive outcomes. It should be noted that Palo Alto Networks features a disclaimer that it is unable to guarantee 100% security through AI algorithms, which shows that these systems have not been perfected. Some potential disadvantages are explored later in this review.

Advantages

AI can be used to spot cyber threats and potentially malicious activity. Traditional software is simply unable to keep pace with the volume of malware created each week, so AI will be incredibly useful for a number of processes:

● Identification of unknown threats and faster response times. While identifying unknown threats is still challenging for AI, it is more manageable for such systems than for human workers. The adaptable nature of hackers can be overwhelming to a human because of the number operating at any given time, whereas an AI is able to identify these threats much more easily. This drastically decreases the response time to new threats.


● Spotting bots. While some bots are less harmful than others, spotting bots that use stolen account credentials and falsified information can identify potentially dangerous threats.

● Better endpoint protection. Antivirus solutions and other endpoint tools often work based on signatures; these are effective, but only to an extent. Rather than relying on signatures, AI can establish a baseline of behaviour for the endpoint through repeated training processes (a minimal sketch of this idea follows this list).

● Robust backup and recovery. Thorough backup and recovery of data is essential in cybersecurity, as many breaches involve cloud-based data. In the event of an attack, it must be possible to restore data quickly.
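As a rough illustration of the baseline idea in the endpoint-protection point above, the sketch below trains an anomaly detector on examples of ‘normal’ endpoint activity and then flags new events that deviate from that baseline. The telemetry features, numbers and the use of scikit-learn’s IsolationForest are all hypothetical stand-ins for whatever a real security product would use.

```python
# Hypothetical baseline-based endpoint monitoring sketch (not a production detector).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Pretend telemetry: [processes started per minute, MB written to disk per minute].
normal_activity = rng.normal(loc=[5.0, 20.0], scale=[1.0, 5.0], size=(500, 2))

# Learn a baseline of "normal" behaviour from historical data only (no attack examples needed).
baseline = IsolationForest(contamination=0.01, random_state=0).fit(normal_activity)

# Two new observations: one ordinary, one resembling ransomware-style mass file writes.
new_events = np.array([
    [5.2, 22.0],     # close to the learned baseline
    [6.0, 450.0],    # unusually heavy disk activity
])
for event, verdict in zip(new_events, baseline.predict(new_events)):
    print(event, "->", "normal" if verdict == 1 else "anomalous")
```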

Disadvantages

● Creational challenges. AI algorithms rely on large volumes of high-quality data to train on and to improve their accuracy. However, such data can be difficult to obtain, and many enterprises struggle to gather it. There are several reasons for this, such as data silos (data held by a single department in a form incompatible with other data), privacy concerns and regulatory constraints. It is also difficult to normalise security logs drawn from various sources spread across various regions.

● Adversarial attacks. Adversarial attacks manipulate data in order to deceive the AI. They can be evasive, where the adversary crafts malware whose features resemble those of benign files so that malicious files are misclassified, which can pose a serious threat; or causative, where an attacker corrupts the model itself during the training phase. There is even a possibility that the attacker could steal information from the sensitive data the model is trained on and exploit this (a minimal sketch of an evasion attack follows this list).

● Generating false positives. While not an external issue, hallucinations can be problematic if false positives occur too often and dilute the reliability of the results. It is often difficult to achieve sufficient accuracy while keeping false positives at an acceptable level.

● Detection strategies. It can be difficult for some models to detect malware that has never been seen before, especially if the data the model was trained on is completely different from the new malware.

● Model transparency. For ethical and security reasons, AI needs some degree of model transparency so that people are able to trust the results it provides; yet this transparency can itself cause national security problems if not sufficiently tested or controlled.

● Bias. There is potential for weakness and bias within a system, and this is extremely dependent on the data a model was trained on. In many cases it can be near impossible to determine how a system came to the conclusion it did.
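To illustrate the evasion attack described above, the toy example below trains a simple linear ‘malware’ classifier on invented feature data and then shows how an attacker who knows the model’s weights can nudge a malicious sample’s features towards the benign class until it is misclassified. Everything here (the features, the model and the step size) is assumed for the sketch and is far simpler than a real attack.

```python
# Toy white-box evasion attack against a linear classifier (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Invented 4-dimensional "file features"; benign and malicious samples overlap only partially.
benign = rng.normal(0.0, 1.0, size=(200, 4))
malicious = rng.normal(2.5, 1.0, size=(200, 4))
X = np.vstack([benign, malicious])
y = np.array([0] * 200 + [1] * 200)              # 1 = malicious

model = LogisticRegression().fit(X, y)

sample = malicious[0].copy()
print("before attack:", model.predict([sample])[0])   # expected: 1 (malicious)

# White-box evasion: nudge the sample's features towards the benign side of the
# decision boundary, i.e. against the model's weight vector, until it is misclassified.
w = model.coef_[0]
step = 0.1 * w / np.linalg.norm(w)
for _ in range(200):
    if model.predict([sample])[0] == 0:               # now (mis)classified as benign
        break
    sample -= step

print("after attack: ", model.predict([sample])[0])   # expected: 0 (benign)
```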

Solutions

While there are many solutions to these problems, I have outlined four of the strongest ones below. Other solutions include diversity in the input data and randomisation during the training process:

● Reinforcement learning. Reinforcement learning (RL) is an agent-based learning algorithm that is able to combat such attacks. A system of this type must adapt to its environment and is trained in a variety of ways. It uses an agent policy representing the agent’s behaviour, which maps states to actions, and it is an overall more sophisticated algorithm. There is also a reward function, which grades the actions taken by the agent and returns a number reflecting the agent’s progress and performance. Lastly, the value function estimates the expected long-term reward from a given state of the environment, and is a critical part of the RL model.

● Generative Adversarial Networks (GANs). Generators learn to produce plausible attacks, exposing the model to a diverse range of potential threats. This helps the AI defend itself against strategies like adversarial attacks by making the whole system more robust.


● Diverse models. Known as ensemble methods, the results of many models can be combined to improve the reliability of the output while also making attacks more difficult, as they would involve exploiting multiple different models (a brief sketch follows this list).

● Updating during deployment. If models are regularly patched to stay up to date with developments in adversarial attacking techniques, they will be significantly more reliable against most kinds of attack. In addition, to further evolve the strategies employed by the AI, the model can continue to be adversarially trained during deployment.
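The idea behind ensemble methods can be sketched very simply: several structurally different models vote on each sample, so a successful attack would have to fool most of them at once. The example below combines three off-the-shelf scikit-learn classifiers on synthetic data purely to show the mechanics; it is not a real security model.

```python
# Minimal "diverse models" (ensemble) sketch on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 1.0, (300, 5)), rng.normal(2.0, 1.0, (300, 5))])
y = np.array([0] * 300 + [1] * 300)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three different models vote on every sample ("hard" = simple majority vote),
# so an evasion attempt has to fool at least two of them at once.
ensemble = VotingClassifier(
    estimators=[
        ("linear", LogisticRegression()),
        ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("bayes", GaussianNB()),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy on held-out data:", ensemble.score(X_test, y_test))
```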

How successful has AI integration been in other fields?

Businesses in other sectors are currently struggling with integration for similar reasons, such as the availability of data. For example, Walmart faced a challenge when integrating an AI-powered system to predict demand for products, as the accuracy of that system was limited because it was not trained on sufficient data. These issues have recently been overcome: datasets have been expanded to consider future events in their processing, so the system is able to take these trends into account when predicting demand. This also increases accuracy, as the AI is not relying on anomalies within its datasets (since they are so vast). Walmart has also been able to integrate its own AI-driven voice-ordering system. Integration of AI has had a huge impact on the volume of customer contact, reducing it by millions of questions. One of the ways Walmart was able to achieve this was by improving natural language understanding capabilities, and the results can be seen clearly as the use of these services has become more widespread.

Similar approaches have been taken by other retail companies to improve services. Healthcare has also begun integrating AI: much like threat detection in cybersecurity, AI is being used for disease detection. An AI program developed by the Houston Methodist Research Institute in Texas is able to translate patient data from mammograms into diagnostic information, with a reported 99% accuracy.

In a similar way to AI systems in cybersecurity, this software analyses millions of records and uses predictive software to determine a result. While this is not perfect, it will significantly reduce the number of scans undertaken, prioritising those who need them. IBM has also developed an AI, Watson, which uses algorithms similar to those in its cybersecurity-focused AI but applies them to diagnosis. The advantages and disadvantages of AI in cybersecurity therefore carry over into many other fields, since the technology is very similar despite having different purposes. AI is being integrated successfully into many fields and industries, and this process is ongoing. While far from perfect, AI solutions have been widely implemented, and developments in the AI industry as a whole will benefit cybersecurity, among many other fields.

Conclusion

The integration of AI has been hugely successful for many cybersecurity systems in the field, such as those of Palo Alto Networks and IBM. It has also been beneficial outside the field, for example in the retail and healthcare industries. However, this is not to say the process has been without issues: the generation of false positives and bias must be taken into consideration when integrating AI. Adversarial attacks pose a great threat, and it is worth noting that these problems only arise from using AI rather than human intelligence. While we do have ways of dealing with adversarial attacks, these solutions can be very expensive and difficult to maintain.

AI should be integrated gradually into our cybersecurity systems, as its advantages can drastically decrease response times and reduce reliance on human intuition. While there are many risks to employing AI in cybersecurity, further solutions will be realised and implemented as time passes, which will greatly benefit the quality of cybersecurity offered. This research is also useful for fields outside cybersecurity, such as healthcare and retail: these systems use similar algorithms but have different purposes, so many of the same issues will be present across a wide range of fields. By presenting in assembly and publishing some of my findings in the school magazine, I will increase understanding of artificial intelligence among students. This will help to give them a more informed perspective of the changes in the systems we use in our everyday lives, as well as raising awareness of some potential issues within the field of artificial intelligence as a whole, not just within cybersecurity.

Bibliography

Preliminary research

1. CEPS (2021) Artificial Intelligence and Cybersecurity, accessed 11/08/23

https://www.ceps.eu/artificial-intelligence-and-cybersecurity/

2. Patrick Grieve (2023) Deep Learning Vs Machine Learning: What’s the Difference?, accessed 25/08/23

https://www.zendesk.co.uk/blog/machine-learning-and-deep-learning/

3. Statista (2023) Estimated cost of cybercrime worldwide 2017-2028, accessed 19/08/23

https://www.statista.com/forecasts/1280009/cost-cybercrime-worldwide

AI in use for different companies

1. IBM (2023) QRadar SIEM, accessed 19/02/24

https://www.ibm.com/products/qradar-siem

2. IBM (2023) Sutherland Global Services, accessed 14/02/24

https://www.ibm.com/case-studies/sutherland

3. Palo Alto Networks (2023) AI-Powered SASE, accessed 15/02/24

https://www.paloaltonetworks.com/cyberpedia/ai-powered-sase

4. Medium (2023) The 5 Biggest Challenges of Implementing AI in Businesses, accessed 12/02/24

https://medium.com/@marketing_upnyx/the-5-biggest-challenges-of-implementing-ai-in-businesses-93d51b7b9728

AI in use outside cybersecurity

1. Parvez Musani (2023), Walmart's AI-Powered Systems

https://tech.walmart.com/content/walmart-global-tech/en_us/news/articles/walmarts-ai-powered-inventory-system-brightens-the-holidays.html

2. Cheryl Ainoa (2023), Three Ways We’re Using Conversational AI At Walmart, accessed 19/02/24

https://tech.walmart.com/content/walmart-global-tech/en_us/news/articles/three-ways-we-are-using-conversational-ai-at-walmart.html

3. IBM (2023), Humana, accessed 19/02/24

https://www.ibm.com/case-studies/humana

4. Sarah Griffiths (2016), This AI Software Can Tell If You're At Risk From Cancer Before Symptoms Appear, accessed 19/02/24

https://www.wired.co.uk/article/cancer-risk-ai-mammograms

Main Research

1. Nico Klingler (2023) The Ultimate Guide to Understanding and Using AI Models, accessed 03/10/23

https://viso.ai/deep-learning/ml-ai-models/

2. Michael Brett, George Duchak, Anup Ghosh, Kristin Sharp, Frank J Cilluffo, Sharon L Cardash (2017) Artificial Intelligence for Cybersecurity: Technological and Ethical Implications, accessed 24/10/23

https://www.jstor.org/stable/resrep21011.6

3. Gaurav Belani (2021) The Use of Artificial Intelligence in Cybersecurity: A Review, accessed 24/10/23

https://www.computer.org/publications/tech-news/trends/the-use-of-artificial-intelligence-in-cybersecurity

4. Francisco L. Loaiza, John D. Birdwell, George L. Kennedy, Dale Visser (2019) Utility of Artificial Intelligence and Machine Learning in Cybersecurity, accessed 17/10/23

https://www.jstor.org/stable/resrep22692

5. Mark Stamp, Corrado Aaron Visaggio, Francesco Mercaldo, Fabio Di Troia (2022) Artificial Intelligence For Cybersecurity (book)

6. Jai Pradeesh (2023) Artificial Intelligence In Cybersecurity: Unlocking Benefits And Confronting Challenges, accessed 05/11/23

https://www.forbes.com/sites/forbestechcouncil/2023/08/25/artificial-intelligence-in-cybersecurity-unlocking-benefits-and-confronting-challenges/?sh=7daf0c3964cf

7. IBM (2023) Artificial intelligence (AI) cybersecurity, accessed 05/12/23

https://www.ibm.com/ai-cybersecurity


The Benefits of Boredom

Boredom is an emotion that impacts everyone in different ways and is often viewed as unpleasant, but scientists think it could be much more than that. One of the reasons why many people dislike boredom is the sense of a slower perception of time and lessened productivity, which can make it feel ‘scary’. However, boredom doesn't just lead to negative outcomes – it can increase creativity, curiosity and self-control.

Studies have shown that people who experience moderate levels of boredom are more likely to engage in creative thinking and problem-solving. This is because it is an emotion that forces you to remain in the present and engage in introspection. In the absence of activity, being bored is an opportunity to reflect, free up your brain and make space for new creative ideas, using your imagination to think in different ways. Boredom can also motivate you to try new hobbies and pursue your goals.

Another benefit of boredom is heightened self-control, as it enables us to focus and pay attention. Nowadays we use screens a lot and there are many distractions in our minds, so it is unusual to be alone with yourself and not have anything to do. Taking a break and relaxing is extremely helpful for improving your mental health.

To conclude, boredom is an emotional state which, like any other, has many beneficial qualities. It gives you a chance to check in with yourself and improves mental, physical and emotional health. So next time you feel bored, think back to this article and remember to embrace boredom – it could help you find a solution to your problem!

Bibliography

https://www.psychologytoday.com/gb/blog/science-choice/202004/5-benefits-boredom

www.hope-wellness.com/blog/why-being-bored-is-good-for-your-mental-health


Will Artificial Intelligence-Powered Robots Ever Be Able To Replace Human Surgeons?

We are currently living in a fast-paced, innovative society where the possibilities of technological advancement seem endless. Many world-changing ideas and products have emerged in the previous two decades, such as artificial intelligence and virtual reality, as well as contactless payment methods like Apple Pay. These new technologies all have a purpose - to serve people - and one of the fields being transformed by rapidly improving technology is medicine. At present, the average number of doctors per 1,000 people in European nations is 3.7, and the United Kingdom would need 50,000 additional doctors to bring the country up to an equivalent standard. The NHS has faced a large drop in staffing, which means that patients have had to wait an average of 4.5 hours, slowing down the entire diagnosis process.

It is therefore crucial that the country finds a new method to treat patients more efficiently. One method that has been put to trial, engendering significant controversy, is the usage of artificial intelligence (AI) powered robots to perform medical procedures. AI is a technology that enables computers to learn and analyse human input, and refers to the intelligence of machinery, as opposed to human capability. This discussion will analyse in detail both the positives and negatives of robotic surgeons, and whether this treatment can be both cost and patient-effective.

“Artificial Intelligence is not a substitute for human intelligence; it is a tool to amplify human creativity and ingenuity.” This quotation perfectly encapsulates the possibilities of artificial intelligence in medicine. Now more than ever, our NHS requires help with seeing greater volumes of patients more rapidly.

A robot has the potential to speed up a procedure, as it is programmed to perform a specific task and has other commands for anything requiring further investigation, which makes it safe to use. Many types of surgery currently employ the assisted use of robots, such as cholecystectomy (removal of the gallbladder – a very delicate procedure) as well as other excisions and anatomical replacements. AI robots also enable doctors and surgeons to access more results and clearer scans. The da Vinci robot allows the surgeon to see with 3D high-definition vision and 10 times magnification, meaning it can reveal information to doctors that they could never access before. It also ensures that if there is anything the robot cannot do, the cameras can show the doctors the exact issue. The idea of robotic AI-powered surgeons has in fact been trialled before: the Smart Tissue Autonomous Robot (STAR) successfully performed laparoscopic surgery on a pig in 2016. Many hard-to-perfect surgeries, such as prostate cancer tumour removal, have been performed by unmanned robots, but the practice is rare due to the cost and the specific criteria the patient must meet. With all these positives and possibilities for AI in medicine, the potential outcome could be that all procedures are carried out solely and tirelessly by robots, consistently performing tasks with 100% precision and satisfaction.

On the contrary, robotic surgeons would not be able to replace or replicate the actions and efforts of a surgeon if they were allowed to perform procedures. Firstly, the cost of AI robots can be up to $1.5 million for the most capable machines, such as the da Vinci S robot, which allows surgeons to perform complex, minimally invasive procedures with extra precision. The da Vinci machine is fitted with “arms” for holding a camera and instruments. This would not be cost effective for the NHS, and may even necessitate a rise in individual contributions to the NHS through taxation and National Insurance. This would have a further negative financial impact on individuals during a time of high inflation, including those who do not currently and may never require this form of surgery.


Additionally, all machinery has its faults, and certain body positions exposed to the da Vinci surgical device can lead to nerve damage and bruising. Such malfunctions pose a significant risk of causing additional, avoidable damage to patients. My dad, who is a surgeon, asked his colleagues whether they believed that robotic surgeons could perform as well as (if not better than) humans, and 86% said no. It has also been shown that more endorphins are released with the comfort of knowing that a surgery is being done by someone who is trustworthy, which reduces the risk of stress. Robotic surgeons, which do not yet understand how the human neurological system and emotions work, may cause alarm or concern in prospective patients. When one feels scared, the amygdala (a small structure in the middle of the brain) alerts the nervous system and sets the body's response to fear into motion: stress hormones such as cortisol are released into the circulatory system, muscles begin to tense, and you can begin to shake or tremble. This means that a stressed or fearful patient may not feel safe and reassured that a robot alone will be able to perform well, which could take a toll on their overall mental health. Were these robots to attract governmental focus and interest by promising speed and efficiency, the jobs of human surgeons could also be put at risk.

To conclude, there is the potential for both positive and negative outcomes were we to have AI-powered machines performing surgeries and procedures on patients. There are many factors that could cause avoidable damage. However, in light of the UK's current NHS staffing crisis, these robots could open up a whole new world of medical innovation and discovery; for instance, adaptations like magnification could help prevent disease from spreading by identifying benign and malignant tumours. The more likely outcome, though, is that robots will continue to help surgeons with precision - it is unlikely that solo robots performing major procedures will be legalised, due to potential safety concerns and the further testing required. Nevertheless, this topic really emphasises how crucial it is to keep developing technology in order to find new, more efficient methods in medicine that will save lives.

The da Vinci robot, powered by AI

Bibliography:

https://www.bbc.co.uk/news/health-36190411

https://www.bbc.co.uk/news/uk-scotland-50745316


The Significance of the Discovery of the Higgs Particle, and its Impact upon Life Today.

The Higgs particle (more formally known as the Higgs boson), named after Peter Higgs, who proposed it in 1964, was a long-theorised elementary particle considered to be the carrier particle of a field that gives all fundamental particles mass; this field later became known as the Higgs field. Fundamental particles fall into one of two categories: fermions or bosons. Leptons and quarks are examples of fermions - these particles are the building blocks of matter. In contrast, bosons form the basis of fields. The diagram below illustrates how leptons such as electrons, together with quarks, make up an atom.

Despite this being a prevalent theory within particle physics for decades, the Higgs particle was not observed until 2012. It was at this point that the advanced scientific instruments designed and developed at the European Council for Nuclear Research (CERN) were refined enough to detect the Higgs boson. This provided definitive proof of its existence, and was an essential discovery that advanced particle physics.

The length of this process was a consequence of the challenge posed by attempting to find the Higgs boson, which lay in its incredibly short lifespan. Once created by high-energy proton collisions in the Large Hadron Collider (LHC), it almost immediately decayed into other particles, leaving only traces for physicists to look for.

Once they suspected that the boson had been found, scientists had to make sure that what they had detected was in fact the Higgs boson. This was done by checking the expected characteristics of the Higgs boson against the characteristics of the particle found: whether the particle had a short lifespan, whether it lacked an electric charge and whether it had ‘spin’. The term ‘spin’ in this context means an apparent rotation of the particle, similar to that of other fundamental particles. The Higgs boson does not have ‘spin’. This was identified from its observed decay into two photons, which meant that the particle would have to have a spin value of 0 or 2; a photon has a quantum spin value of 1, and the Higgs boson formed two photons. The possibility of it being a spin-2 particle was ruled out as implausible, since the signatures of such particles were not found. In addition, scientists also identified that the Higgs boson should have even parity, meaning that if it were viewed in a mirror it would appear identical. This left the particle identified as the Higgs boson. The discovery validated the Higgs field theory and expanded our understanding of fundamental particles.

All matter has mass. Even in space, where matter would be in a vacuum and have no weight, it would still have mass. This mass comes from the atoms that make up a body, and the mass of these atoms comes from their energy. From Einstein's law E = mc² (energy = mass × the speed of light squared) we can see that mass and energy are equivalent. Most of the energy of atoms comes from one of the four fundamental forces, the strong force: this force holds together the fundamental particles within the nucleus, keeping the quarks that make up the protons and neutrons bound together.


However, the rest of the energy comes from the quarks and electrons that make up the atom, and it is because of the Higgs field, and therefore the Higgs particle, that these have mass. These particles have mass due to their interactions with the Higgs field: the more interactions, the more mass the particle will have. Without the presence of the Higgs field, every particle of the Standard Model of elementary particles would be massless, except for the Higgs boson itself. This is why it is special. Like the Higgs boson, these particles all have individual fields; however, only the Higgs field has a non-zero vacuum energy, also known as its vacuum expectation value (VEV). The VEV is the value the field takes in its lowest energy state. When other fields interact with the Higgs field they also interact with its VEV, which means they gain energy and therefore mass. The Higgs field can be likened to the current of a stream and particles to a fish: as the fish moves through the water it interacts with the current and slows down due to the resistance the current imposes on it. This represents how mass is acquired by particles as they move through the Higgs field. The mechanism by which particles gain mass through their interaction with the Higgs field is called the Brout-Englert-Higgs mechanism, after those who theorised it.

However, some particles, like photons and gluons, have no mass (as shown on the Standard Model of elementary particles). This is because they do not interact with the Higgs field, and they therefore move at the speed of light.

There is often a misunderstanding of the significance of the Higgs boson and its connection to the Big Bang. It is often misinterpreted that the Higgs particle caused the Big Bang, or that there is a religious meaning behind the particle because of the nickname it adopted, the ‘God particle’. In reality, the nickname was coined as a way to express the particle's importance to our understanding of the universe. While it played no part in causing the Big Bang itself, the Higgs boson does support the Big Bang theory.

The Big Bang theory states that the universe derives from one point of infinite density and temperature. In the initial moments after the Big Bang, all particles were extremely hot and energetic, and they did not yet have mass. This was because the Higgs field was in a different form, in which particles could pass through it without acquiring mass; as the universe expanded, these particles started to cool. Consequently, the Higgs field as we understand it today was formed, in which the Higgs mechanism is in action and gives particles mass. This change in form is due to the shape of the Higgs field's potential (1). At very high energies the field sits at a value where particles have zero mass; as the universe cools, the field essentially "falls down" into the lowest point of its potential, acquiring a non-zero VEV, and the particles begin to acquire mass. This process is known as symmetry breaking: in its highest-energy form the field is in a symmetrical state at the middle of the potential, and as it moves down the potential this symmetry is broken and particles gain mass. From this field we can write the Higgs field as H = V(h) + Lm (Higgs field = potential energy of the Higgs field + the mass term of the particle). The formation of the Higgs mechanism allows particles to gain mass and therefore to form structures such as planets and stars, characterising the universe today.

(1) Figure: the shape of the Higgs field potential.


While the Higgs particle was hugely significant in the world of physics, the process of its discovery also had a vast impact on what we now consider everyday life. The Higgs boson was discovered at CERN, an organisation supported by 20 member countries and other observer members. It was at CERN, with the support and involvement of these countries, that the LHC was developed for particle discoveries and the testing of theories. It was because of this development in technology, advancements in physics, and the constant demand for sharing research and scientific information while searching for the Higgs boson that Tim Berners-Lee invented the World Wide Web (WWW) in 1989. The WWW has not only revolutionised the scientific world but has become a normal part of our everyday lives. CERN is a centre of discoveries and theories, and it is thanks to the research and development of the LHC, as well as the process of discovering the Higgs boson and other aspects of CERN's work, that our communications and knowledge today are so advanced.

In addition to this, the accelerators, detectors and computing programmes associated with particle colliders have led to life-changing advancements in the medical industry.

They have led to the development and invention of positron emission tomography machines, better known as PET scanners, which generate 3D images of organs and tissues and help doctors in their endeavours to improve people's lives. Similarly, the modification of medical equipment such as accelerators for hadron therapy, used to treat cancer patients, is another way in which CERN and the advancements in the technology of physics have had an impact on different aspects of science and society.

Laying the foundation for further theories and findings, the Higgs particle revolutionised our comprehension of particle physics. The discovery of the Higgs boson holds profound significance in physics today: it enabled us to truly understand the Higgs mechanism and prove the existence of the Higgs field, which we have to thank for the formation of our universe as we know it, and it explains the origin of mass in the universe. The discovery not only marks a milestone in particle physics but also opens doors to technological marvels and medical innovations. Between 1964 and 2012, the advancements in technology dedicated to discovering the Higgs particle allowed physicists to expand the boundaries of their understanding and knowledge of our scientific world, paving the way for future development by the next generation of physicists. The Higgs boson has not only had a direct impact on physics; its significance stretches into other industries such as medicine. The understanding of technology and physics gained from the Higgs particle discovery has inspired the invention of life-changing medical equipment, demonstrating the particle's far-reaching significance.


Bibliography:

CERN (n.d.). The basics of a boson. [online] Available at: https://home.cern/news/news/physics/basics-boson

Cooke, M. (n.d.) DOE Explains the Higgs Boson. [online] Energy.gov. Available at: https://www.energy.gov/science/doe-explainsthe-higgs-boson

CERN (2023). The Higgs boson | CERN. [online] Home.cern. Available at: https://home.cern/science/physics/higgs-boson

CERN (2023). The Higgs boson: Revealing nature's secrets. [online] Available at: https://home.cern/news/series/lhc-physics-ten/higgs-boson-revealing-natures-secrets

Carroll, S.M. (2015). The Higgs Boson and Beyond.
TED-Ed (2020). The Higgs Field, explained - Don Lincoln. YouTube. Available at: https://www.youtube.com/watch?v=joTKd5j3mzk

Davis, S. (2013). 'God particle': Why the Higgs boson matters. [online] Cbsnews.com. Available at: https://www.cbsnews.com/news/god-particle-why-the-higgs-boson-matters/

Lea, R. (2022). 10 Years of the Higgs Boson: how this particle is still unlocking new physics. [online] ZME Science. Available at: https://www.zmescience.com/science/higgs-boson-10-year/

Ram (2023) The Higgs Boson: Unveiling the Particle that Gives Mass to the Universe. [online] Medium. Available at:

https://medium.com/@ramchaganti/the-higgs-boson-unveiling-the-particle-that-gives-mass-to-the-universe-d56f9f40107e

CERN (n.d.). How does the Higgs boson impact everyday life? [online] Available at:

https://home.web.cern.ch/science/physics/higgs-boson/why

A History of Black Holes

Since the beginning of civilization, space has been an area of great curiosity, and it has been extensively researched with progressively advancing equipment and theories. One such area of space that has generated significant interest over the years is black holes. The theory of black holes relies fundamentally on light and gravitational fields. Over time, differing theories of light have been proposed; our current theory ties together two of the most popular early ones. This theory of the duality of light has allowed us to model theoretical situations which scientists have since investigated in order to confirm their existence. This confirmative research has created an incentive to further explore gravitational fields, and the current theories of black holes are still evolving as new information is discovered.

The term 'black hole' is a relatively recent label for a concept that has been around for over two centuries. John Wheeler first popularised the term in 1969, but theories surrounding the idea have been around since before the early 19th century.

At that time, physicists had two conflicting theories of light. One was the Newtonian theory, which suggested that light was composed of particles; conversely, Huygens' theory proposed that light was a longitudinal wave.


The conflict between these views led to uncertainty about how light should react to gravitational fields (if it was affected at all). If light were a wave, it would be unclear how gravity would affect its motion. However, if light were a particle (a photon), it should be affected in the same way as any other particle: a photon should theoretically fall back down towards the centre of gravity, just as a ball would. The original proposal made by scientists as to why no visible effects of gravity on light could be seen was that light travelled infinitely fast, so gravity would not be able to affect or impede its motion (whether it was a particle or a wave).

This theory was generally accepted until 1676, when Ole Rømer (a Danish astronomer) showed that light travels at a finite speed. Rømer was not, at the time, looking for the speed of light, but was instead trying to map more accurately the orbits of Jupiter's moon Io.

He measured the time between Io's eclipses and found that as the Earth moved closer to Jupiter the time between eclipses decreased, and as the Earth moved away from Jupiter the time between eclipses steadily increased. He realised that, since the relative positions of the Earth and Jupiter could have no effect on the true orbital period of Io, the difference in time must be due to the finite speed of light travelling through space. He calculated the speed of light to be approximately 226,663 km/s, which is about 24.4% lower than the value we accept today (299,792 km/s).
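The percentage quoted above is easy to verify; the short snippet below simply compares Rømer's figure with the modern value.

```python
# Checking the percentage difference between Roemer's estimate and the modern value.
roemer = 226_663   # km/s, as quoted above
modern = 299_792   # km/s
shortfall = (modern - roemer) / modern
print(f"Roemer's estimate is {shortfall:.1%} below the modern value")   # about 24.4%
```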

More famously, Albert Michelson and Edward Morley conducted an experiment in 1887, now called the Michelson-Morley experiment. They were trying to prove the presence of the luminiferous aether by showing that the speed of light varied depending on whether it was travelling perpendicular or parallel to the aether.

It was presumed that light travelling parallel to the aether would travel more slowly than light travelling perpendicular to it, as the aether would provide resistance to light moving in the parallel direction. To try to prove this, Michelson and Morley set up an experiment that split a beam of light into two beams travelling at right angles from a point. These beams were then reflected back to the middle and out again many times to increase the path length. Theoretically, if the aether existed, the distance required for a visible change due to the aether would be less than the distance covered, and therefore any change based on the direction relative to the aether would be clear. The Michelson-Morley experiment failed, and instead disproved their theory, showing that light travels at a constant speed regardless of its direction of travel. The discovery of the finite speed of light had huge implications across physics, one of which was that gravity's effect on light would have to be reviewed.

If, according to Newtonian theory, light was a particle, then light travelling at a constant speed must be affected by the gravitational force.


John Michell wrote a paper in 1783 working on the assumption that a photon, a particle of light, must be affected by gravity. Michell stated that a star with a great enough density would have a gravitational field so strong that light would not be able to escape, since the photons would be pulled back down towards the centre of gravity. He also suggested that although such entities could not be seen, as no light would escape them, we might be able to detect the effects of their gravitational fields from Earth by observing the behaviour of nearby bodies. Although the term was not used at this point, Michell had described the basis of a black hole. During the 19th century the Newtonian theory of light fell out of favour, as Huygens' wave theory seemed to explain everything at the time. Unfortunately, this led many physicists, including Michell, to drop the idea that an object could have such a strong gravitational effect that light could not escape it, as it was still unclear what effect gravity would have on a wave.

Finally, in 1905, Einstein helped lay the foundations of quantum mechanics and created a theory to explain the duality of light. This settled the dispute between those who believed light was a particle and those who thought it was a wave. The duality of light means that at any given point light is observed as a particle, but when we observe its movement it behaves as a wave. Schrödinger suggested that, according to quantum mechanics, particles can behave like waves and can therefore be in several states simultaneously. The double-slit experiment shows how light behaves as both a particle and a wave: as the wave enters the slits, two waves come out on the other side, and at certain points these waves cancel each other out, so the final pattern detected on the back surface shows regions of light and dark.

When the peak of a wave aligns with the trough of a wave of the same amplitude, they completely cancel out, so no light is seen in those areas. Heisenberg developed the uncertainty principle in 1927, which states that we are unable to determine both the speed and the position of a particle such as a photon with perfect accuracy: the more we know about the position of a particle, the less we can know about its speed, and vice versa. The uncertainty principle has an equation:

Δx · Δp ≥ ℏ/2, where Δx is the uncertainty in position, Δp is the uncertainty in momentum and ℏ is the reduced Planck constant.
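To get a feel for the scale of this limit, the short calculation below (using the standard value of the reduced Planck constant) shows the minimum momentum uncertainty for a particle confined to roughly the size of an atom; the chosen Δx is just an illustrative value.

```python
# A feel for the uncertainty principle: confine a particle to roughly atomic size.
hbar = 1.055e-34          # reduced Planck constant, in joule-seconds
delta_x = 1e-10           # assumed uncertainty in position, metres (about the size of an atom)
delta_p_min = hbar / (2 * delta_x)
print(f"minimum uncertainty in momentum: {delta_p_min:.2e} kg m/s")
```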

The duality of light meant that gravity must have an effect on light, since it behaves both like a particle and like a wave; however, the strength of the gravitational field required would have to be much greater than that of any field we had ever encountered. With the theoretical circumstances established, physicists wanted to explore whether a gravitational field of that strength could exist. To understand how such a field could be formed, they turned to the life cycle of a star, one of the few places with enough mass to have a chance of producing such a gravitational field.

In order to create a gravitational field of sufficient strength, the entity must have an enormous density, so it is natural to look at the collapse of a star. When a star runs out of hydrogen and other nuclear fuel, it no longer has a nuclear force balancing the gravitational forces, and it therefore begins to contract. By this logic a star should contract to an infinite density, as it would never be able to balance its own gravitational force. This seemed impossible to many physicists at the time, until Subrahmanyan Chandrasekhar theorised in 1928 that if a star collapses to a small enough radius it would eventually be able to support itself through Pauli's exclusion principle. The exclusion principle relies on the differing velocities of particles and states that no two electrons of the same atom can occupy the same orbital unless the two electrons have opposite spins.


The spin of an electron can be modelled by the equation:

||S|| = √(s(s + 1)) · h/2π, where ||S|| is the spin magnitude, s is the spin quantum number and h is Planck's constant.

As the speed of particles is limited by the finite speed of light, the exclusion principle is also limited. Chandrasekhar calculated that any star bigger than about 1.5 solar masses would not be able to support itself using the exclusion principle, as the difference in velocity required to balance the gravitational forces would be greater than the speed of light. This became known as the Chandrasekhar limit. Any star with a mass below this limit would not reach a density great enough to form a gravitational field strong enough to prevent the escape of light.

If a star had a mass larger than the Chandrasekhar limit, it would contract, as it would have no state of equilibrium. Eventually, due to the high temperatures and pressure, it would explode as a supernova. The core left behind is too small and dense to become a white dwarf, so it instead forms a neutron star, in which matter becomes so compact that protons and electrons are squeezed together and merge to become neutrons. The exclusion principle between these neutrons provides an outward force that prevents any further collapse. However, as with white dwarfs, neutron stars again have a limit to the outward force provided by the exclusion principle. In 1939, Robert Oppenheimer suggested that the limit for a neutron star was about 0.7 solar masses. If the body's mass exceeds this limit, Oppenheimer suggested, the body would collapse to an infinite density. At this point physicists had circled back to the idea that a body could collapse infinitely, as beyond this limit there would be no outward force to prevent it from collapsing in on itself. Oppenheimer also stated that there would be no observational consequences (that could be detected at the time) of this infinite collapse. This is because a sufficiently massive and concentrated body would be completely invisible: the velocity required for matter or a signal to escape would be greater than the speed of light.

Therefore, nothing would be able to escape the gravitational field of the body.

Many scientists went on to extend Oppenheimer's work after the Second World War, and they found that a sufficiently large gravitational field would alter the path of light (when behaving like a wave). As the star contracts, the gravitational field strength increases due to the greater density, and the path of the light is therefore altered more. As the light gets bent further and further inwards, it becomes more difficult for it to escape, and it appears dimmer and redder to the observer. When the star eventually contracts past a critical radius, the gravitational field becomes so strong that light can no longer escape, and everything within that region gets dragged towards the centre of gravity. This region is what we now call a black hole. According to Roger Penrose and Stephen Hawking's research between 1965 and 1970, the general theory of relativity implies that there must be a singularity of infinite density within the black hole. This singularity is the point into which all the mass of the star collapses, the centre of gravity. At this point all matter is compressed infinitely and conceptions of time and space completely break down.

In 1967, Werner Israel expanded on some of the early research done by Hawking and Penrose and found that a non-rotating black hole must be perfectly spherical, with the singularity of infinite density at its centre. Israel calculated that the size of the black hole depends solely on its mass. The edge of the spherical surface of a black hole is called the absolute event horizon: the boundary of the region within which a signal cannot escape, regardless of the direction in which it is emitted. At any point outside the event horizon, light can escape the gravitational field if directed outwards.


At any point directly on the event horizon, it is possible for light to hover forever at the same distance from the centre of the black hole, as it does not have enough energy to escape but has enough not to be dragged back down. The radius of the event horizon of a non-rotating black hole can be calculated by the equation:

r = 2GM/c², where r is the radius of the event horizon, G is the gravitational constant, M is the mass of the black hole and c is the speed of light.
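To give a sense of scale, the short calculation below evaluates this formula for one solar mass; the constants are standard textbook values rather than figures taken from the article.

```python
# Schwarzschild radius r = 2GM/c^2 for one solar mass, using standard constants.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # mass of the Sun, kg

r = 2 * G * M_sun / c**2
print(f"event-horizon radius for one solar mass: {r / 1000:.1f} km")   # roughly 3 km
```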

In 1963, Roy Kerr found a set of solutions to the equations of general relativity that described black holes rotating at a constant rate. The size and shape of such a black hole depend only on its mass and its rate of rotation: the faster it rotates, the more it bulges outwards around its centre (its "equator"), and the greater the mass, the larger the black hole. His solutions also confirmed that a non-rotating black hole would have a perfectly spherical absolute event horizon. In 1971, Stephen Hawking derived Hawking's area theorem, which states that the area enclosed within the event horizon of a black hole should never shrink. His theorem was observationally confirmed by physicists at Cornell, MIT and other institutions in 2021.

All of the early theories and research into black holes and their formation remain the basis of extensive research and new theories today. The fundamental principle of black holes - that a black hole is a star that has collapsed to an infinite density because it is unable to stabilise itself - underpins current research, including observations made with the James Webb Space Telescope (JWST). Along with other space telescopes, JWST has been instrumental in locating over 20 supermassive black holes and has allowed astronomers to study them in greater detail.

Bibliography:

as.cornell.edu/news/hawkings-black-hole-theorem-observationally-confirmed

https://www.sciencedirect.com/topics/pharmacology-toxicology-and-pharmaceutical-science/pauli-exclusion-principle
www.jstor.org/stable/24927336

www.nasa.gov/missions/chandra/nasa-telescopes-discover-record-breaking-black-hole/

Universe Expansion: Changes in Thoughts Over Time

Dutch astronomer Willem de Sitter, describing the period 1915-1930, said that “Never in all the history of science has there been a period when new theories and hypotheses arose, flourished, and were abandoned in so quick succession”. This was a period in which theories of static and expanding universe models thrived, bringing about a new age of science and knowledge of the universe. The decades that followed may not have been as revolutionary in their discoveries but were more consolidatory in nature, using these advances to understand more about the universe. During this time there was much discourse about the Big Bang and the evolution of the universe. Redshift was a significant advancement of this era, and allowed us to provide evidence for the Big Bang.

In the early 1900s, the general agreement amongst scientists was that the universe was static. Einstein's general theory of relativity was the first endeavour to create a mathematical model of the universe - a theory of time, gravity and space. In his laws of general relativity, Einstein explains gravity, predicts that light would bend in a gravitational field, and proposes that space-time is a four-dimensional concept which obeys the Einstein equation [1].


The original equation links spacetime curvature to the stress-energy tensor. The theory proposed that space and time are affected by matter, and are in fact not fixed and unchanging. Einstein applied this theory to all of space and time in an attempt to model the universe, but struggled because he presumed that the universe was static. He proposed a static universe; however, in order to obtain one, his equations needed to be altered by adding a constant term, which became known as the 'cosmological constant'. The resulting equation is a modification of his field equation:

Rμν − ½Rgμν = 8πGTμν, where G is the gravitational constant, Tμν is the stress-energy tensor (the density and flux of energy and momentum) and gμν is the metric tensor (which defines distances and angles) [2]. The modified equation containing the cosmological constant is Rμν − ½Rgμν + Λgμν = 8πGTμν, where Λ represents the cosmological constant.

The constant meant that Einstein had a model that followed his theories of a static universe and of inertia, the reluctance of a body to move. One month after the publication of Einstein's paper 'Cosmological Considerations' (which contained his cosmological constant), de Sitter [3] published a paper which countered it. De Sitter's theory contained none of the hypothetical world matter which Einstein assumed essential for any general model (an assumption that proved detrimental to his model). He criticised Einstein's division between space and time and his assumption that time holds a position separate from space [4]. In a letter in response, Einstein stated, “To admit such possibilities seems senseless”. Einstein's paper was his last contribution to the field; the discoveries and theories that followed came from others, with Einstein forming his opinions after acknowledging their results (he eventually described the cosmological constant as his 'biggest blunder'). Additionally, due to the nature of the equation, the cosmological constant is not diluted as the universe expands, whereas the density of matter drops in inverse proportion to volume.

Development of Einstein's equations [5]:

Original equation: Gμν = 8πGTμν (law of the expanding universe = all matter and energy in the universe)

Equation with the cosmological constant: Gμν + Λgμν = 8πGTμν (law of the expanding universe + cosmological constant = all matter and energy in the universe)

[1] history.aip.org/exhibits/cosmology/ideas/expanding.htm
[2] www.scholarpedia.org/article/Cosmological_constant
[3] blogs.scientificamerican.com/guest-blog/einsteins-greatest-blunder/
[4] arxiv.org/pdf/1402.3212.pdf
[5] bigthink.com/starts-with-a-bang/einstein-general-theory-relativity-equation/

Friedmann later proposed that non-static solutions to Einstein's equations should be considered in models of the universe. In 1922 he developed his own equation for an expanding universe, which also described the behaviour of matter. The Friedmann equations are derived from Einstein's equations and are differential equations. The first Friedmann equation6,

H² = 8πGρ/3 − kc²/S²

shows how the expansion rate changes over time. Here H is the Hubble parameter (the expansion rate), S is a scale factor, k is a number representing the overall curvature of the universe, and ρ is the average density of matter and energy in the universe. This shows that the expansion of the universe is related not just to the curvature of space but also to the contents of the universe, given by ρ.
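To make the relationship concrete, here is a minimal Python sketch of the first Friedmann equation. The density value (roughly today's average density) and the flat-universe choice (k = 0) are illustrative assumptions, not figures from this article.

import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 3.0e8       # speed of light, m/s

def hubble_parameter(rho, k=0.0, S=1.0):
    # First Friedmann equation: H^2 = 8*pi*G*rho/3 - k*c^2/S^2
    H_squared = 8 * math.pi * G * rho / 3 - k * c**2 / S**2
    return math.sqrt(H_squared)

rho_today = 8.6e-27                      # kg/m^3, assumed present-day average density
H = hubble_parameter(rho_today)          # about 2.2e-18 per second
print(H * 3.09e22 / 1000, "km/s/Mpc")    # roughly 68 km/s/Mpc

For a flat universe this reduces to H² = 8πGρ/3, so the expansion rate is set entirely by the average density ρ.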

Vesto Slipher was the first person to recognise and detect Doppler shifts, now known as redshift, using the Brashear Spectrograph. His detection and observation of spiral nebulae was the first piece of concrete evidence to support the theory of an expanding universe.

6 phys.libretexts.org/Bookshelves/Astronomy_Cosmology/Big_Ideas_in_Cosmology_(Coble_et_al.)/17:_Dark_Energy_and_the_Fate_of_the_Universe/17.03:_The_Friedmann_Equation_and_the_Fate_of_the_Universe

5 bigthink.com/starts-with-a-bang/einstein-general-theory-relativity-equation/


His observations indicated that the further away the nebulae were, the larger their redshifts and the faster the speeds at which they were travelling away.

He took a spectrogram of a nebula which showed that it was approaching our system at an immensely high velocity. Some of the velocities recorded by Slipher were so high that other astronomers struggled to believe they were real. From these findings he produced a drift theory7, stating that our galaxy was moving in relation to the nebulae, though further observations contradicted this.

In 1928 Edwin Hubble examined distant nebulae and compared them to closer ones using the Doppler effect. Through this he found a change in the frequency of the light between the closer and more distant nebulae, and he could find their distances using the brightness of these stars. Lower frequencies meant longer wavelengths, which therefore lay further towards the red end of the light spectrum.

This displacement of the lines towards the red end of the spectrum became known as 'redshift'. Hubble's assistant, Humason, recorded the velocities of these nebulae while Hubble recorded the distances, and with these they set out to find a mathematical relationship between the two variables: the distance to a star and the velocity at which it is travelling away from the Earth8. They calculated the velocity using redshift and the equation z = Δλ/λ ≈ Δf/f ≈ v/c. The Hubble constant was subsequently born from this graph, with Hubble's estimate for its value being 500 km/s/Mpc. The constant appears in Hubble's law, v = H0d, where v is the speed of the cosmological object, d is the measured distance and H0 is the constant. It predicts how fast an object such as a star should be moving away from the Earth, and Hubble found that the further away a galaxy is, the faster it moves away.

7 arxiv.org/ftp/arxiv/papers/1108/1108.4864.pdf
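As a small illustration of the two equations above, the sketch below converts a redshift into a recession velocity (v ≈ zc) and then into a distance using Hubble's law (d = v/H0). The redshift chosen and the modern value of H0 are example inputs, not measurements from the article.

c = 3.0e5    # speed of light, km/s
H0 = 73.8    # Hubble constant, km/s/Mpc

z = 0.01     # example redshift, z = Δλ/λ
v = z * c    # recession velocity: 3,000 km/s
d = v / H0   # distance from Hubble's law: about 41 Mpc
print(v, "km/s,", d, "Mpc")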

The slope of the linear fit we get from Hubble's law is Hubble's constant itself, representing a constant rate of cosmic expansion9. We can use the Hubble constant to estimate the age of the universe by rearranging the equation: considering the gradient of the Hubble graph, H0 = v/d, which leads to t0 = 1/H0. Knowing that before expansion the distance between everything was zero, we can find t0 - the Hubble time - using the currently accepted value of Hubble's constant, giving an age of roughly 13-14 billion years.

Equation to find the age of the universe10:
speed = distance / time, so v0 = d0 / t0
Rearranging: t0 = d0 / v0
Considering v = H0d: t0 = d0 / (H0d0) = 1 / H0

8 history.aip.org/exhibits/cosmology/ideas/expanding.htm

9 www.pnas.org/doi/10.1073/pnas.1424299112

10 Edexcel A-Level Physics Year 2


When using H0 = 73.8 km/s/Mpc:

73.8 km/s/Mpc = 73,800 m/s/Mpc
1 Mpc = 3.09×10²² m
H0 = 73,800 / (3.09×10²²)
H0 = 2.39×10⁻¹⁸ s⁻¹

Using t0 = 1/H0:
t0 = 1 / (2.39×10⁻¹⁸)
t0 = 4.19×10¹⁷ s

This value gives an age of the universe of roughly 13-14 billion years.
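As a quick check, here is a minimal Python sketch that reproduces the same estimate of the Hubble time, using the unit conversions given above.

H0_km_s_Mpc = 73.8
m_per_Mpc = 3.09e22                          # metres in one megaparsec
H0_per_s = H0_km_s_Mpc * 1000 / m_per_Mpc    # ~2.39e-18 per second

t0_seconds = 1 / H0_per_s                    # ~4.19e17 s
seconds_per_year = 3.15e7                    # approximate length of a year in seconds
print(t0_seconds / seconds_per_year / 1e9, "billion years")   # roughly 13-14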

Hubble's graph shows the velocity-distance relationship for 46 nebulae. The solution treating the nebulae individually is shown by the solid line and the black dots, while the dotted line and the outlined dots show the solution when the nebulae were grouped11. Since Hubble's discovery, scientists have found that the constant differs significantly from the calculated 500 km/s/Mpc.

Hubble's distances were incorrect by a factor of about 7, making all the distances too small and the constant too large. Today's calculated value of the Hubble constant is ~73.8 km/s/Mpc. Additionally, when drawing the graph he took data near the origin only, but then extrapolated outwards; with little to no data on nebulae that were further away, the regression line was inaccurate for larger distances and velocities. Despite this, Hubble's discovery was revolutionary and brought about a new age of knowledge about the expansion of the universe.
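A rough arithmetic sketch of that correction (the factor of 7 is the article's figure; the comparison is only indicative):

# H0 = v/d, so if the distances were too small by a factor of about 7,
# the constant came out too large by the same factor.
print(500 / 7)   # ~71 km/s/Mpc, in the region of today's ~73.8 km/s/Mpc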

A few decades after Hubble's discovery, there were discussions regarding the Big Bang and the early stages of the expanding universe. In 1964, radio astronomers Arno Penzias and Robert Wilson12 detected microwaves in their radio telescope which seemed to come from the sky at the same intensity in all directions - in an isotropic manner. This was a universal background radiation, something they came across by complete coincidence. They consulted Robert Dicke13, along with other scientists, and came to the conclusion that they had found cosmic background radiation. This is leftover radiation from the Big Bang: the theory is that when the universe began, it went through inflation, expansion and cooling, and the Cosmic Microwave Background (CMB) is the leftover heat. As the universe expands it becomes cooler and less dense, with the CMB being a result of this and evidence for the expansion of the universe. In 1965, Dicke interpreted it as radiation at about 3 K, which was groundbreaking - at that time there was no equipment that could detect microwave radiation at a temperature lower than 20 K. This discovery provided confirming evidence for the Big Bang through the radiation. Scientists today are still debating many aspects of the expansion of the universe, such as the Hubble constant -

11 www.pnas.org/doi/10.1073/pnas.1424299112

12 history.aip.org/exhibits/cosmology/ideas/bigbang.htm

13 news.uchicago.edu/explainer/hubble-constant-explained

while its current predicted value is 73.8 km/s/Mpc, it is still highly debated today, and different scientists hold different opinions on, and values for, this constant. This is because there are multiple methods for finding both Hubble's constant and the density (also referred to as the 'clumpiness') of the matter in the universe. Using cosmic voids14 is one of the newer methods for observing the expansion of the universe and the rate at which it is expanding: researchers modelled how expansion would affect voids of different sizes, changing the expansion rate to see the variation. Faster expansion resulted in smoother, larger voids, whereas slower expansion produced smaller, more 'crumpled' voids. The data gave a Hubble constant similar to the traditional method of using the CMB from the Big Bang, but the measurements of 'clumpiness' were very different.

There have been many theories and scientific advancements in the calculation of the expansion of the universe. The subject is still being discussed and researched today, and the scientists mentioned above, along with many others, brought about a revolutionary age of knowledge about the expansion of the universe and its beginning.

Despite being incorrect, Einstein's static model of the universe and his cosmological constant sparked new theories, such as De Sitter's and Friedmann's, which helped to progress theories and discoveries. Vesto Slipher's discovery of redshift subsequently allowed Hubble to see the correlation between the distance of a star and its velocity. This enabled him to come up with the Hubble constant which, despite being inaccurate at the time, set the stage for much more understanding of the universe and its expansion.

Is life possible on other planets?

Is life possible on other planets? Well, yes, and it is very likely. In fact, it is unlikely that there are no other planets in the universe that sustain life - the universe is possibly endless. If Earth can sustain life, then why should it be the only place that can? There are a few simple rules that a planet must follow to sustain life:

● It must have a large volume of water;

● It must have the correct atmosphere;

● It has to be in the Goldilocks zone, meaning it is the right distance from the star it orbits;

● It needs an energy source, like the sun.

Here are four examples of hypothetical planets that represent what a planet that sustains life might be like:

Aquatus

● 99% water 1% land

● Has an equator

● Two main species, both of which are aquatic:

○ Papilio - fish-like creatures with extremely thin fins to catch prey (as the Papilio are predators); and

○ Parvus - shaped like a butterfly but with stronger waterproof fins for easy gliding

Earth's surface is about 70% water, but what if it was 99% or higher? All animals would have to be suited to water, unless they lived on the 1% or less of the planet that is land.

Orea Montes

● Thousands of mountains taller than Mount Everest;

● Extremely cold, with a lowest temperature of -270°C and a highest of -50°C;

● Three main species:

○ Numbovium - cloud-like to blend into their surroundings. Boneless, so lightweight;

○ Unguibus - Similar to a bird, with one extremely long, sharp beak, used for spearing prey (Numbovium);

14 www.quantamagazine.org/how-nearly-nothing-might-solve-cosmologys-biggest-questions-20230725/

○ Volan - Humanoid creatures with a large wingspan and lots of fatty tissue in order to keep them warm.

Terrestris

● Lots of tall trees and forests;

● Most livable climate for humans;

● Two main species:

○ Bacula - Looks like a stick for camouflage, as it is a prey animal. No eyesight, so it senses through the vibration of the trees;

○ Ursa - A bit like a bear but with the fur pattern of tree bark. Long arms allow it to wrap around trees, shuffle up slowly, and obtain its prey.

Rabiosa

● This planet is most like Earth - humanoids were at the top of the food chain and let climate change get so bad that the few survivors are living in bunkers away from the chaos.

● However, adding to their already significant problems, having tried to import all of the information ever recorded on a computer into one tiny robot (like ChatGPT no. 1000000), their attempt backfired and now a tiny robot with maximum knowledge is trying to take over the world:

○ Superstes - This is the main prey of the evil robot. However, like water bears, these creatures can survive in the most difficult climates and are the only remaining animals besides the humanoids;

○ Malum - This is the robot. It has a microchip inside it which contains every single piece of information that has ever been typed on an electronic device;

○ Inteitus - The humanoids who created climate change on the planet Rabiosa. Like humans, they are extremely intelligent, and we will probably end up like them in a few million years.

For each of the names of the planets and animals, we used a word that we thought described it best and translated it into Latin. Just to make sure you understand: none of these planets or animals exist; we created them for this purpose only.


Bumper Edition - Cool Science Facts

Water can exist in three states at the same time.

This is known as the triple point - the temperature and pressure at which a material exists as a gas, a liquid, and a solid all at the same time.

Helium has the ability to work against gravity.

When helium is cooled to near absolute zero (-460 degrees Fahrenheit, or -273 degrees Celsius), it becomes a superfluid, which means it can flow without friction.

Humans have inherited genes from other species.

Our genome contains up to 145 genes acquired from bacteria, fungi, other single-celled organisms, and viruses.

The oceans produce the majority of the oxygen on Earth.

According to the National Ocean Service, we can thank plant-based marine organisms for all that fresh air. More than half of the world's oxygen is produced by plankton, seaweed, and other photosynthesizers.

Flying squirrels glow pink under UV light.

Flying squirrels were discovered to glow a pink colour under UV light. And they aren't the only ones: scorpions, platypuses, Tasmanian devils and fireflies have all been confirmed to do so as well. Scientists aren't sure what causes all of these animals to fluoresce, but it is suspected that compounds from a group called porphyrins may be present in their fur. These are organic compounds that glow, though many remain unidentified.


Editors’ notes

I was honoured and excited to get the opportunity to help edit 'The Franklin', and having read past editions, I couldn't wait to delve into some new topics. As someone who has been riveted by science from a young age, reading some of the hugely impressive work of fellow NHEHS students was truly inspirational. There are so many fascinating topics covered - I would really encourage all students to take some time to read through this edition, and maybe next time they will be inspired to write their own article.

This edition was a bit easier to put together, now that we knew that starting to collect essays earlier was the way to go. We also all collectively discovered our profound hatred for Word, as it caused problems when transferring articles from the original documents to The Franklin. It was incredibly fun and rewarding to see the 11th edition of this journal come to life before our eyes. A huge thank you to those who contributed and wrote pieces to include, and to the amazing Ms Brown for organising everything.

Another term, another issue of The Franklin! We came into this edition with all our editorial knowledge from the previous one and a renewed enthusiasm for science. However, we also faced some challenges with the highly technical and mathematical nature of some of the Sixth Form articles we received, which required the use of previously untouched areas of Google Docs like footnotes and equation boxes. This issue was lots of fun to compile, and we hope you enjoy discovering some very interesting aspects of physics, computing and more!

Contents

What are the risks of integrating AI into cybersecurity systems? 1
The Benefits of Boredom 6
Will Artificial Intelligence-Powered Robots Ever Be Able To Replace Human Surgeons? 7
The Significance of the Discovery of the Higgs particle and its Impact in Today's Life 9
A History of Black Holes 12
Universe Expansion: the Changes in Thoughts Over Time 16
Is life possible on other planets? 20
Editors' notes 22
