the Oxford Scientist

...individuals are complicated and there needs to be much more research at lots of levels aimed at understanding how to limit excess weight gain as individuals age and reduce BMI in the overweight".

Despite the stark campaign slogan, the study also failed to prove a positive causal relationship between high BMI and cancer. These criticisms were highlighted in an open letter from nutritionist Laura Thomas to the chief executive of CRUK, which called for the reconsideration of the campaign to 'prioritise wellbeing over weight'. The letter was supported by fellows from both the University of Cambridge and King's College London.

With such controversies surrounding collaboration, how can it continue to aid scientific progression in an unbiased manner? Maybe the answer lies in transparency, with a focus on making sure a level of honesty is maintained throughout a project's development, especially when research may be influential in setting public policy.

Megan Boreham is a Biomedical Sciences undergraduate at St Hilda's College.

Lucy King
From Nietzsche to Nissan
The ethics of self-driving cars

Rapid technological development has made self-driving cars a reality. This advancement raises questions about how these cars should make ethical decisions in place of human drivers. While technology can replace, and will undoubtedly supersede, humans in actual driving ability, driving a car involves moral decisions. These choices would have to be programmed: for instance, whether to collide with another vehicle or swerve towards pedestrians in a crash. Such a circumstance would likely be exceedingly rare, but this and similar situations still need to be considered as we move towards a future without a human in both the literal and metaphorical driving seat.

A study conducted by the Massachusetts Institute of Technology (MIT) was one of the first large-scale investigations of public attitudes towards the ethical quandaries posed by self-driving cars. In the "Moral Machine" experiment, researchers created a game based on the famous "trolley problem". Participants had to choose the most favourable result from a series of rather gloomy outcomes in a hypothetical car crash. The announcement of the experiment was a viral hit, and millions of people from over 200 countries contributed, making it one of the largest studies of moral preferences in populations ever conducted.

The "Moral Machine" experiment investigated nine different criteria, including whether a self-driving car should prioritise passengers over pedestrians, people over pets, and the young over the elderly. Results were gathered by asking users questions such as: should the car continue forwards and hit a child, or swerve and hit an old lady? Here, people had to consider both the ages of the potential victims and the moral implications of changing the car's trajectory.