Impact Magazine Autumn 2017


DRIVING IMPROVEMENT WITH OPERATIONAL RESEARCH AND DECISION ANALYTICS

AUTUMN 2017

PROTECTING BRITAIN’S BIODIVERSITY GIS joins the battle to save endangered species

SUPPORTING DECISIONS AT A LARGE MULTISPECIALTY HOSPITAL


Analytics provide considerable cost savings and managerial insight

SCHEDULING SOUTH AMERICA’S WORLD CUP QUALIFYING MATCHES

Mathematical Programming produces a fairer fixture list


THE JOURNALS OF THE OR SOCIETY Linking research, practice, opinion and ideas The journals of The OR Society span a variety of subject areas within operational research and the management sciences, providing an essential resource library for OR scholars and professionals. In addition to the Journal of the Operational Research Society (JORS) – the flagship publication of The OR Society and the world’s longest-established OR journal – the portfolio includes leading research on the theory and practice of information systems, practical case studies of OR and managing knowledge, and journals taking a focused look at discrete-event simulation and applying a systems approach to health and healthcare delivery.

OR Society members receive complete online access to these publications. To learn more about member benefits please visit the membership section of The OR Society website: theorsociety.com

palgrave.com/journals


EDITORIAL

The lead article in this issue of Impact, packed with interesting stories of the benefits that analytical work has brought to a wide range of organisations, tells how South American analysts provided a schedule for the region’s World Cup qualifying tournament. The article, written with four rounds of matches still to be played, mentions that the schedule had contributed to a close and exciting tournament. With one game to go, only one place in the finals had been decided, so the excitement continued to the end. Last night, I found myself, surprisingly, closely following the CONMEBOL results. Football fans in the continent will have had very mixed reactions to the results of the final games. In particular, there will have been relief and rejoicing on the eastern side of the Andes in Argentina, but disappointment, if not despair, on the western side, in Chile. Such is the potential effect of analytical work!

Away from South America, readers will notice an Italian flavour to this issue, with two interesting, but different, examples of good analytical work. Both are concerned with waste! One describes the use of advanced optimization models for waste flow management, to improve accuracy of the planning process, and to obtain more cost-efficient solutions with significant improvement to profitability. The other is concerned with recycling disused railways, and how to find the best new use for them in discussion with the various stakeholders. Here, profit is not one of the many conflicting objectives. For this latter case an approach known as MCDA was used, an approach explained in an expository article by Larry Phillips. This is a happy juxtaposition. Whilst it is clear what M, C and D stand for – MultiCriteria Decision – for A one author has “Aiding” and the other “Analysis”. It shows that it’s not just criteria for which there is choice!

This is the last issue of Impact to be published by Palgrave. Those at Palgrave with whom I have worked have been very supportive from the moment the magazine was conceived. Designing and producing a magazine, rather than a scholarly journal, has been a learning curve for them and, even more so, for me, and I am very grateful for all that they have done.

Graham Rand

The OR Society is the trading name of the Operational Research Society, which is a registered charity and a company limited by guarantee.

Seymour House, 12 Edward Street, Birmingham, B1 2RX, UK
Tel: + 44 (0)121 233 9300, Fax: + 44 (0)121 233 0321
Email: email@theorsociety.com
Secretary and General Manager: Gavin Blackett
President: Ruth Kaufman FORS, OBE (Independent Consultant)
Editor: Graham Rand g.rand@lancaster.ac.uk

Print ISSN: 2058-802X
Online ISSN: 2058-8038
Copyright © 2017 Operational Research Society Ltd
Published by Palgrave Macmillan
Printed by Latimer Trend
This issue is now available at: www.issuu.com/orsimpact

OPERATIONAL RESEARCH AND DECISION ANALYTICS

Operational Research (O.R.) is the discipline of applying appropriate analytical methods to help those who run organisations make better decisions. It’s a ‘real world’ discipline with a focus on improving the complex systems and processes that underpin everyone’s daily life - O.R. is an improvement science. For over 70 years, O.R. has focussed on supporting decision making in a wide range of organisations. It is a major contributor to the development of decision analytics, which has come to prominence because of the availability of big data. Work under the O.R. label continues, though some prefer names such as business analysis, decision analysis, analytics or management science. Whatever the name, O.R. analysts seek to work in partnership with managers and decision makers to achieve desirable outcomes that are informed and evidence-based. As the world becomes more complex, problems tougher to solve using gut-feel alone, and computers ever more powerful, O.R. continues to develop new techniques to guide decision making. The methods used are typically quantitative, tempered with problem structuring methods to resolve problems that have multiple stakeholders and conflicting objectives.

Impact aims to encourage further use of O.R. by demonstrating the value of these techniques in every kind of organisation – large and small, private and public, for-profit and not-for-profit. To find out more about how decision analytics could help your organisation make more informed decisions see www.scienceofbetter.co.uk. O.R. is the ‘science of better’.


Beale Lecture 2018: Thursday 22 February 2018 – Entry free

The OR Society’s Beale Medal is awarded each year in memory of the late Martin Beale. It gives formal recognition to a sustained contribution over many years to the theory, practice, or philosophy of O.R. in the UK.

Lecture: “The fitness and survival of the O.R. profession in the age of artificial intelligence” Professor Richard Ormerod (Beale Medal Winner 2016) The focus of Richard’s research has been the practice of O.R., how it can be described, understood and improved. His lecture will explore how artificial intelligence (AI) will affect professional practitioners in general and O.R. in particular. Today we are inundated with books and articles about the coming of the robots and how they will take our jobs. What is likely to happen and how quickly? What are the implications for training, education and research? AI provides the key to unlocking the door; powerful computers the means.

Opening talk: “Heterogeneous Location and Pollution-Routing Problems” Dr Çagri Koç (PhD Winner 2015) This talk presents new classes of heterogeneous vehicle routing problems with or without location and pollution considerations. It describes evolutionary and local search-based metaheuristics, with suitable enhancements, capable of solving such problems, and provides several important managerial insights.

Thursday 22 February 2018, The Royal Society, 6-9 Carlton House Terrace, London SW1Y 5AG. Timings: 14:00 Tea and biscuits; 14:30 Lectures start; 16:30 Approximate finish. Entry free.

Register your place at: www.theorsociety.com/beale Please contact Hilary Wilkes hilary.wilkes@theorsociety.com with any queries.


CONTENTS

7 FROM O.R. TO THE WORLD CUP RUSSIA (WITH LOVE)
Guillermo Durán, Mario Guajardo and Denis Sauré describe how optimisation techniques helped create a fairer schedule for the South American World Cup qualifying tournament

15 OPTIMISING WASTE FLOWS
Tiziano Parriani and Daniele Vigo report on the use of Operational Research to optimize waste logistics in Italy

25 WHAT TO DO WITH DISUSED RAILWAYS?
Valentina Ferretti explains how Multi-Criteria Decision Aiding supported negotiations among different stakeholders involved in regenerating a disused railway in Italy

31 MANAGING THE OPERATING ROOM SUITE AT UCLA RONALD REAGAN MEDICAL CENTER
Kumar Rajaram and Sandeep Rath inform us of how a sophisticated decision support system has had a major economic and organizational impact

36 DATA SCIENCE AND OPERATIONAL RESEARCH AT DŴR CYMRU WELSH WATER
Kevin Parry describes the work of the Data Science Team at Dŵr Cymru Welsh Water

40 PROTECTING BRITAIN’S BIODIVERSITY WITH GEOSPATIAL ANALYSIS
Dominic Stubbins reports on how GIS technology is helping the National Trust achieve its goals

43 USING HEALTH SYSTEMS ANALYTICS TO HELP THE NHS
Marion Penn, Rudabeh Meskarian and Thomas Monks describe their work to support decisions about how many sexual health clinics should be provided

4 Seen elsewhere: Analytics making an impact

13 In praise of sabbaticals: Mike Pidd encourages us to take time out

19 Universities making an impact: Brief reports of two postgraduate student projects

21 How MCDA is moving decision-making from intuitive and qualitative, to explicit and quantitative: Lawrence D. Phillips explains how organisations can improve their decision-making

47 Ctrl - Alt - Facts - Delete: Geoff Royston argues that we need to do what we can to drain the post-truth swamp


SEEN ELSEWHERE

WAS ANGELINA JOLIE RIGHT?

Researchers (see Decision Analysis (2017), 14, 139–169) have determined optimal cancer prevention strategies among BRCA mutation carriers. Angelina Jolie decided to have a preventative double mastectomy because she carried a mutation in the BRCA gene. Two years later she underwent another preventative surgery, the removal of her ovaries and fallopian tubes. Carriers of BRCA gene mutations have an increased risk of breast and ovarian cancer. Though women make these difficult decisions in consultation with their consultants, there is no guideline to advise women at which age each surgery should be performed. The paper provides the first comprehensive model to optimize patient outcomes (quality-adjusted life years, QALY) and survival probability. The authors created a Markov decision process model of the surgery decision by female BRCA mutation carriers which identifies the optimal strategy to determine ages to undertake a bilateral mastectomy (BM), bilateral salpingo-oophorectomy (BSO), or both surgeries. It is optimal for BRCA 1 carriers to undergo BM between ages 30 and 60, and ovary removal after age 40. For BRCA 2 carriers, risks are lower and therefore surgeries are recommended at a later age.

THE SCIENCE OF THE TOUR DE FRANCE

This is the title of a book by James Witts (Bloomsbury Sport, 2016) in which the role of predictive analytics and Big Data in the world of pro cycling is revealed. In it you can learn how Robby Ketchell, the World Tour’s first data scientist, predicts the future. “Predictive analytics is what it’s about,” Ketchell enthuses. “But it’s not as easy as just producing a calculation. A lot of times we create this model and, in the real world, it doesn’t come anywhere close. So we have to say what we are missing. So you’re looking for more data and refinement of interpretation to perfect the model”. “There are variables galore to analyse and cycling is in a really interesting place right now,” he continues. “There are all these sensors and devices to measure aspects of physiology, movement patterns, biometrics, environment and so on, and doing it in different ways. Key to the future is how you bring that information together. How you aggregate that and bring something meaningful out of it.” Ultimately, Ketchell’s role is to make an unpredictable sport more predictable, as he follows Team Sky’s ethos of leaving nothing to chance.

‘THE BEAST’ SIMULATES RELIGIOUS BELIEFS

Researchers in religion studies, led by Wesley Wildman, a Boston University School of Theology professor of philosophy, theology, and ethics, are using simulations on a computer dubbed “the Beast” to help answer big questions about religion’s benefits (potentially better mental health) and its evils (violence in the name of God). A simulation which predicts how many people would stay in a religion based on its strictness was validated when compared to real-world data about defection rates from 18 Christian denominations, ranging from those with strict obligations like the Mormons (a fast-growing faith) to the more permissive United Church of Christ, which has shrunk from two million members in the 1950s to fewer than a million today. Wildman’s research team has used the Beast to analyse people’s reactions to terrifying events such as natural disasters or disease outbreaks.

WHAT ANALYTICS GROUPS NEED

According to an article in the Harvard Business Review (see http://bit.ly/2t4Z1Be), there are three main obstacles that are holding back the “benefits” of analytics within organizations: structure, culture and approach to problem solving. Analytics groups can be too independent or too embedded; too data-driven or rely too heavily on gut instinct; or they can produce models that are too complex or too simplistic. The authors think they have the answer. They believe an effective business analytics organization balances functional knowledge, business instinct, and data analysis, with an operating philosophy to add complexity only when the additional insights justify it. This kind of organization includes, they argue, an analytics nerve centre (a small team of independent, highly skilled data scientists), representation at the top (a chief analytics officer who brings the voice of analytics straight to the board), and a champion-challenger approach. In this latter aspect, analysts initially focus on creating a model that solves the problem to a minimum acceptable level as simply as possible. This becomes the “champion” and touches off challenges to add complexity and unseat it. A challenger can replace the champion, but only if all stakeholders agree that its benefits are worth the increased complexity.

SITING HIV/AIDS DIAGNOSTIC EQUIPMENT

A team of researchers from the Centre for Operational Research, Management Sciences and Information Systems (CORMSIS), University of Southampton, UK, and South Africa have investigated the siting of HIV/AIDS diagnostic equipment in laboratories across South Africa (see International Transactions in Operational Research (2018), 25, 319-336). Classical location analytical techniques were extended to ensure that laboratories are sited as close as possible to major centres of demand from hospitals and clinics. The approach adopted allowed choices between laboratory sites to be made in a transparent manner. Recommendations were also made for locations of point-of-care devices in the remotest places. Results demonstrated to decision makers showed close agreement with pilot review projects undertaken in four health districts of South Africa. The research has potential to impact healthcare delivery to HIV sufferers in the poorest rural regions of the country.

WHEN SPORTS RULES GO AWRY

Changing the rules of any sport, to make it more exciting, can often result in undesirable consequences. Using simulation and other O.R. techniques could help governing bodies foresee these effects before the changes are implemented, according to Graham Kendall of the University of Nottingham Malaysia Campus and Liam Lenten of Australia’s La Trobe University (see European Journal of Operational Research (2017), 257, 377-394).

MATHEMATICS AT A LITERATURE AND ARTS FESTIVAL

Rhyd Lewis, from the School of Mathematics, Cardiff University, gave a talk in May 2017 on the “Mathematics of Networks and Maps” at the renowned Hay Festival of Literature and the Arts. He explained, inter alia, how satnavs find the quickest route from one town to another, the most efficient way to visit the best pubs in the UK and whether it is true that all living things in the world are six or fewer degrees of separation away from each other. Rhyd showed, pictorially, how many problems in everyday life can be modelled as networks: from the colouring of maps to the way Facebook makes friend recommendations. Rhyd contributed an article about graph colouring to the Spring 2016 issue of Impact.

HELPING THE POLICE WITH THEIR ENQUIRIES

Professor Mark Girolami, Imperial College, is leading a new five-year programme to test and improve predictive policing and tackle other challenges for future cities. Predictive policing involves using maths and statistics to predict times and places that serious crimes will occur based on historical crime data in a given area, allowing police to efficiently allocate resources. The project, called Inference, Computation and Numerics for Insights into Cities (ICONIC), follows on from a successful trial in Los Angeles. It has just been given a £3m grant from the EPSRC. Also involved are the Universities of Oxford, Manchester and Strathclyde, along with the Met, West Midlands, Scotland and New York police forces. The problem is essentially one of identifying where more staff are needed at any given time and, as such, is relevant to many different organisations.

MATHS, A MATTER OF LIFE AND DEATH!

In an essay in The Conversation (see http://bit.ly/2uNPdRK), Christian Yates (Senior Lecturer in Mathematical Biology, University of Bath) describes how mathematics is increasingly becoming the language of biology. It has moved a long way from its origins in 1952 when Alan Turing produced a pair of equations which can describe the patterns of spots and stripes seen on big cats. Today, mathematical biology can simulate heart function and how the heart is likely to interact with candidate drugs, for example. Mathematical immunology is used to determine how our immune systems battle with viruses. There are models that help explain how deadly viruses, such as Ebola, spread, and other models that can be used to inform policy on issues such as overfishing.

AND THE WINNERS ARE…

Three students, Adam Green (University of Cardiff), Anna Scholes (University of Nottingham) and Peter Riley (University of Warwick), who first met on placement as analysts for HMRC, achieved first place in the inaugural INFORMS O.R. and Analytics Student Competition. The problem they analysed was set by Syngenta, one of the world’s foremost seed biotech companies, who were looking for solutions to two important issues. Why were certain strains of soybean varieties underperforming in sales, despite earlier testing that suggested the contrary? How can we use testing data to predict whether a soybean variety will sell more than one million bags in its second year of sales? Based on the data provided by Syngenta, they derived several ways to predict how many bags a variety will sell. The best predictor involved quantifying performance of test varieties against check varieties, in a way which reduced the bias from external agronomic and economic factors. The final took place in April 2017 at the annual INFORMS Business Analytics conference in Las Vegas, where their solution was presented to a judging panel consisting of O.R. professionals and business executives from Syngenta.

MENTAL HEALTHCARE ANALYTICS

At a recent NHS Confederation Conference, Joe Rafferty of Mersey Care NHS Foundation Trust said, “The digital opportunity is about us grabbing data to really make a case to the government that mental health services absolutely need the investment that we all know they need.” He also said that better use of data could support transformation of service provision. “We see no option but to engage in the digital revolution as a way to fundamentally do things differently. By using a data analytics approach we have already started to pick some really interesting signals out of what otherwise would have been noise.”

ANALYSING TERRORISM

Professor Ashraf Labib of the University of Portsmouth is leading a team in collaboration with Polaris, who specialise in operational research, experimental design and software development, to create software that will help security analysts identify the critical information from the huge volume of data they receive and enable them to better detect potential security threats. He said, “Intelligence analysts need to process large volumes of data quickly, extracting crucial information to detect potential security threats. This could be identifying certain key words used to denote people, places or objects that need to be highlighted to security services for further investigation.” The University of Portsmouth researchers are experts in root-cause decision analysis and operational research techniques, information elicitation and criminal intelligence. The software uses an algorithm to capture the patterns and behaviour of the analyst. This is used to evaluate the consistency of analysts’ judgements at individual and group levels, as well as identifying key factors or biases which influence an analyst’s decision-making. The findings will be used to inform analyst training and as a decision aid within the tool to ensure more robust judgements are made.

HOSPITAL DISCHARGE HOLDING AREAS

Researchers in San Francisco have studied the inpatient paediatric unit of the Children’s Hospital of Minnesota (see Service Science (2017), 9, 121–135). Bed and nursing resources are stressed, not only by frequent movement of patients but also by the unit’s patient discharge policy. Their discrete-event simulation model examines how the unit’s efficiency may be improved by a better discharge policy. In particular, the base version of the model is used to investigate the impact of sending various percentages of discharge-ready patients to a discharge holding area where they can safely wait for a few hours until being picked up by their parent or guardian. Doing so frees up inpatient beds, allowing the unit to serve many more patients per year.



FROM O.R. TO THE WORLD CUP RUSSIA (WITH LOVE)

GUILLERMO DURÁN, MARIO GUAJARDO AND DENIS SAURÉ

THE SOUTH AMERICAN QUALIFYING TOURNAMENT for the 2018 FIFA World Cup has been arguably one of the tightest in recent history. In May 2017, with four out of 18 rounds remaining, there were eight teams with plausible chances of qualifying for the final stage, to be held in Russia in 2018. While this can be mainly attributed to the even strength of the participating teams, Operational Research may also have contributed, by means of a much more balanced schedule in comparison to previous qualification tournaments.

BACKGROUND

The FIFA World Cup is both the most important football competition and the most followed sporting event in the world. Held every 4 years, the World Cup brings together national teams from FIFA’s six continental confederations: AFC (Asia), CAF (Africa), CONCACAF (North America, Central America and the Caribbean), CONMEBOL (South America), OFC (Oceania), and UEFA (Europe). The championship is played in two phases, the qualification and the finals. In the qualification phase, teams in each confederation compete to secure a place in the final phase. Only a limited number of teams advance per confederation. While about 200 national teams participate in the qualification phase, only 32 of them advance to the final phase. In the final phase, the 32 qualified teams fight for the world title in a series of matches taking place over a one-month period at venues usually located within a single host country. This tournament receives more international attention than any other single-sport event, attracting dozens of sponsors, intense world media coverage and hundreds of millions of viewers around the globe. In 2014, some 3.4 million spectators filled the stadiums for the World Cup finals matches in Brazil and the television coverage reached 3.2 billion people around the world. With 10 South American teams vying for four places determined directly and a fifth determined by an intercontinental play-off, CONMEBOL is the confederation with the most places per competing team. CONMEBOL holds nine titles out of the 20 World Cup tournaments played between 1930 and 2014, with Brazil having won five times and Argentina and Uruguay twice each. In April 2017, four of the South American teams – Brazil, Argentina, Chile, and Colombia – were among the top-5 teams of the worldwide FIFA Ranking.

The FIFA World Cup is both the most important football competition and the most followed sporting event in the world

TRADITIONAL SCHEDULING

The South American qualification is a double round-robin tournament in which each team plays every other team twice over the course of 18 rounds, each of which consists of five matches. The entire tournament thus consists of 90 matches, and lasts about two years. The 18 rounds are scheduled in closely spaced pairs of match dates called double rounds. Matches on the first round of a double round are played over one or two days, and those in the following round are played four or five days later. In contrast, a double round can be separated by one or many months from the preceding double round. One of the main reasons for the use of double rounds is to allow players in South American national squads, who typically play in top-tier European leagues, to play two qualifying matches on a single trip back to South America. This scheduling practice has been promoted by FIFA not only for CONMEBOL but for other confederations as well.

Between 1998 and 2014, the tournament was scheduled according to a so-called mirrored scheme, that is, in the first nine rounds each team played every other team once, while in the second nine rounds the matchups were repeated in the same order but with the home-away status reversed. Not only did this mirrored format last for many years, but so did the schedule itself, which was identical for all four qualification tournaments leading to the World Cups between 2002 and 2014 (the only variation being that since Brazil was exempted from the process in 2014, the team that in each round would otherwise have played the Brazilians had a bye). Over that period, criticism of the schedule steadily mounted. While some of that criticism arose especially in countries frustrated at not having qualified for a final tournament, some features of the schedule were objectively unbalanced. For example, the so-called breaks (a sequence of two consecutive matches with the same home-away status) were unevenly distributed among the teams: Bolivia had four of its nine away matches scheduled as two double-round breaks, while Argentina and Brazil had no breaks at all. This imbalance was particularly controversial: when an away break is scheduled within a double round, it forces a team to play two matches close together without home advantage, possibly involving some long travel sequences. For example, one of Bolivia’s double-round breaks had it play away against Argentina in round 3 followed by an away game against Venezuela in round 4. This is the fifth longest of the side’s 72 possible travel sequences (7,340 km, slightly more than the distance from London to Mumbai). Away double-round breaks also imply that teams do not play at home for long periods. This is inconvenient not only for teams, but also for the local media and fans, who might not get a chance to see their team for a long time. For example, in the qualifiers for the 2010 World Cup, Uruguay did not play at home for more than 6 months (from September 2008 to March 2009) due to an away break in a double round. A similar situation affected Chile in the qualifiers for the 2014 World Cup, which did not play at home for almost a year, from November 2011 to September 2012. Even when there are no breaks, the order in which the home and away games are spread within a double round matters for the teams. In general, teams prefer to start a double round with a home game, followed by an away game, rather than the other way around. This is for logistical reasons, as usually the players gather to train in their home country immediately preceding the first game of a double round, while after the second game they head back directly to their respective club teams. In this respect, the traditional schedule had an unbalanced distribution of home-away (H-A) and away-home (A-H) sequences in double rounds. For example, Argentina had nine H-A sequences while Chile and Venezuela had only one and, moreover, Brazil did not have any.

Despite these and other shortcomings, and although many alternative schedule proposals were presented for the last four World Cups, none of them received the support of a majority of CONMEBOL countries. Due to the lack of agreement and wide criticism, in January 2015 CONMEBOL decided that the schedule for the 2018 qualifiers would have to be defined by a random draw. A totally random schedule, however, is impractical. Thus, the ten countries were allowed to propose generic schedules that could serve as templates for the draw.

The O.R. approach allowed us to explore a variety of symmetric schemes, some completely new and some previously used in other football competitions, and to generate many candidate schedules that followed such schemes

O.R. SCHEDULING

In collaboration with the Chilean National Professional Football Association (ANFP), we have been applying Operational Research (O.R.) techniques to schedule football leagues in Chile for the last 12 years. This collaboration has had great impact on the local professional leagues: it has translated into savings in travel costs and increased revenues for teams, it has brought much more transparency and a sense of fairness to all stakeholders (in comparison to previous traditional methodologies), and it has helped improve various aspects of the competition such as, for example, avoiding clashes between fans of classic rivals in the same city, thus contributing to public order and better allocation of police resources. Our continued collaboration was recognized in being chosen as a finalist for the prestigious Franz Edelman award in 2016. In light of the positive experience of scheduling Chilean competitions, the ANFP decided to rely on O.R. techniques (and our group) for constructing a proposal schedule for CONMEBOL’s qualifiers.

In our first approach to constructing a proposal, we tried to minimize the main disadvantages of the previous schedule, while keeping its mirrored scheme. However, using an integer programming approach, we quickly discovered that there is no mirrored schedule that eliminates breaks in double rounds (the least number that can be obtained while preserving the mirrored structure is 16). This result challenged us to find a solution that departed from the classical mirrored structure and that was acceptable in practice. The mirrored scheme was firmly established in the tradition of CONMEBOL, and we could foresee it was going to be hard to change it. The mirrored scheme is also easy to understand for the fans and the media. Our efforts then focused on trying to generate a schedule that could resemble the symmetry of the mirrored scheme, while improving its defects. The O.R. approach allowed us to explore a variety of symmetric schemes, some completely new and some previously used in other football competitions, and to generate many candidate schedules that followed such schemes. The approach was based on an integer programming formulation that captured the symmetric conditions, together with other conditions specific to this tournament, by means of constraints, and the desirable minimization of double-round breaks as an objective function. The formulation was coded in a mathematical programming language and solved by a commercial optimization solver. In the overall process, we tested several schemes and mixes of conditions, which led us to find many candidate schedules that completely eliminated the breaks within double rounds. We presented these candidate schedules to ANFP’s officials, who then chose one as the official proposal of Chile to CONMEBOL.
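To give a flavour of this kind of formulation, the sketch below is a deliberately small integer programme, not the authors’ actual model: it uses an invented subset of team names, the open-source PuLP modeller rather than the commercial solver mentioned above, and only the break-minimising objective. It builds a double round-robin calendar and minimises the number of breaks within double rounds.

import pulp

# Illustrative subset of teams; the real tournament has ten
teams = ["ARG", "BOL", "BRA", "CHI", "COL", "URU"]
n = len(teams)
rounds = list(range(2 * (n - 1)))                        # double round robin: 10 rounds
double_rounds = [(2 * k, 2 * k + 1) for k in range(n - 1)]

prob = pulp.LpProblem("qualifier_schedule", pulp.LpMinimize)

# x[i][j][r] = 1 if team i plays AT HOME against team j in round r
x = pulp.LpVariable.dicts("x", (teams, teams, rounds), cat="Binary")
# h[i][r] = 1 if team i plays at home in round r
h = pulp.LpVariable.dicts("h", (teams, rounds), cat="Binary")
# b[i][k] = 1 if team i has a "break" (same venue status twice) in double round k
b = pulp.LpVariable.dicts("b", (teams, range(len(double_rounds))), cat="Binary")

# Objective: minimise the total number of breaks within double rounds
prob += pulp.lpSum(b[i][k] for i in teams for k in range(len(double_rounds)))

# Each ordered pair of teams meets exactly once, so each pair plays home and away
for i in teams:
    for j in teams:
        if i != j:
            prob += pulp.lpSum(x[i][j][r] for r in rounds) == 1

# One match per team per round, and h linked to the home variables
for i in teams:
    for r in rounds:
        prob += pulp.lpSum(x[i][j][r] + x[j][i][r] for j in teams if j != i) == 1
        prob += h[i][r] == pulp.lpSum(x[i][j][r] for j in teams if j != i)

# A break occurs if a team is home twice, or away twice, within a double round
for i in teams:
    for k, (r1, r2) in enumerate(double_rounds):
        prob += h[i][r1] + h[i][r2] - 1 <= b[i][k]
        prob += 1 - h[i][r1] - h[i][r2] <= b[i][k]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("breaks within double rounds:", int(pulp.value(prob.objective)))

The real formulation also encodes the chosen symmetry scheme, the separation of matches against Argentina and Brazil, and the other tournament-specific conditions described in this article and in the further reading.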

Transforming so many years of disagreement among CONMEBOL countries with respect to changing the schedule into a unanimous consensus is a significant achievement, which would probably not have been possible without the use of O.R.

RESULTS

Chile’s schedule proposal was presented at a meeting of the ten countries’ representatives of CONMEBOL in May 2015. Although other countries also presented proposals, ours was unanimously selected. In July 2015, the draw of the qualifiers for the next World Cup took place in Saint Petersburg, in an event televised all around the world. The draw of the South American qualifiers was conducted by Ronaldo and Forlán, former winners of the best player award at the FIFA World Cups of 1998 and 2010, respectively. They drew balls from different pots. The balls in the pots contained the names of the teams and the numbers they would be allocated in the template schedule that we had generated by O.R. They might not have understood the technicalities behind the schedule design, but the moment was undoubtedly a milestone in which operational research met football practitioners.

The resulting schedule greatly improved the balance of home-away sequences among all teams, which is perceived as scheduling fairness. First, every team plays exactly one home game and one away game in every double round. This means that the new schedule has no breaks within double rounds, while the traditional schedule had 18. Second, all teams have at least four and at most five H-A sequences in double rounds over the course of the tournament. This greatly improves on the imbalance of the traditional schedule, and it is the best achievable since the tournament has nine double rounds. In addition, the schedule resembles the symmetry of the mirrored scheme by using a so-called French scheme. In this scheme, matches of the second half of the tournament follow the same order as the matches of the first half, with home-away status reversed (exactly as in a mirrored scheme), with the exception of the matches in the first round, which are scheduled (with the home-away status reversed) in the last round of the tournament.
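A French scheme is easy to construct once a first half is in hand. The short sketch below is illustrative only: the circle method and the naive venue alternation used here are standard textbook devices, not the authors’ generator, and the second half is appended in the way just described.

# Minimal sketch of a French scheme, assuming a circle-method first half
def circle_method(teams):
    """Single round robin: a list of rounds, each a list of (home, away) pairs."""
    teams = list(teams)
    n = len(teams)
    rounds = []
    for r in range(n - 1):
        pairs = []
        for i in range(n // 2):
            a, b = teams[i], teams[n - 1 - i]
            pairs.append((a, b) if r % 2 == 0 else (b, a))   # naive venue alternation
        rounds.append(pairs)
        teams = [teams[0]] + [teams[-1]] + teams[1:-1]        # rotate all but the first team
    return rounds

def french_scheme(teams):
    first_half = circle_method(teams)
    flipped = [[(away, home) for home, away in rnd] for rnd in first_half]
    # Second half repeats rounds 2..n-1 with venues reversed; round 1, reversed, goes last
    return first_half + flipped[1:] + [flipped[0]]

if __name__ == "__main__":
    schedule = french_scheme(["ARG", "BOL", "BRA", "CHI", "COL", "ECU",
                              "PAR", "PER", "URU", "VEN"])
    for k, rnd in enumerate(schedule, start=1):
        print(f"Round {k:2d}: " + ", ".join(f"{h} v {a}" for h, a in rnd))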


TABLE 1: FINAL STANDINGS OF THE SOUTH AMERICAN QUALIFIERS TO THE 2018 FIFA WORLD CUP

Place  Team       Played  W   D  L   F   A   Points
1      Brazil     18      12  5  1   41  11  41
2      Uruguay    18      9   4  5   32  20  31
3      Argentina  18      7   7  4   19  16  28
4      Colombia   18      7   6  5   21  19  27
5      Peru       18      7   5  6   27  26  26
6      Chile      18      8   2  8   26  27  26
7      Paraguay   18      7   3  8   19  25  24
8      Ecuador    18      6   2  10  26  29  20
9      Bolivia    18      4   2  12  16  38  14
10     Venezuela  18      2   6  10  19  35  12

Another positive feature of the schedule is that it prevents any team from playing consecutive matches against Argentina and Brazil, unarguably the two strongest teams of South America. This has been traditionally considered a sport-fairness feature by CONMEBOL (the traditional schedule also had this feature). In the draw, this was handled by using a particular colour of balls for these two teams, so that they would be allocated to predefined placeholders that satisfied this condition in the template schedule.

In May 2017, with all teams having played seven home and seven away games, there was a very tough fight for qualification. In particular, the gap between the 4th and 5th teams, Uruguay and Argentina, was only one point. Up to the same round, in the qualifiers to all World Cups between 1998 and 2014, this difference ranged between two and five points. With four games remaining, 12 more points were still available to each team. As a result, even the 8th team in the standings, Peru, had a reasonable chance to qualify. This is in marked contrast to the 2002 and 2014 tournaments, where after 14 rounds the eighth team was 10 or 11 points behind the fourth, and comparable to the standings in the 1998, 2006 and 2010 qualifier tournaments. While the facts above are for the most part caused by situations independent of the schedule, some small part is arguably due to the more balanced O.R. scheduling approach. At any rate, it cannot be disputed that transforming so many years of disagreement among CONMEBOL countries with respect to changing the schedule into a unanimous consensus is a significant achievement, which would probably not have been possible without the use of O.R.

This work, together with numerous other contributions to sports scheduling, shows that O.R. has great potential to support decision making in the practice of sports. Some competitions involve a huge audience (the South American qualifiers are followed by about 400 million people), so any improvement carries a great social component. This also helps to promote O.R. and to introduce it in educational programmes, as we have experienced using sports scheduling topics at both university and high school levels. Reporting new applications also helps to spread good practices; part of our research team has introduced the use of O.R. to schedule other sports competitions, such as the volleyball and basketball professional leagues of Argentina. Currently, other related problems that attract our attention are producing a better FIFA ranking, referee assignment, and predicting match results.

Guillermo Durán is associate professor at the University of Buenos Aires and adjunct professor at the University of Chile. Mario Guajardo is associate professor in the Department of Business and Management Science at NHH Norwegian School of Economics. Denis Sauré is assistant professor in the Department of Industrial Engineering at the University of Chile.

FOR FURTHER READING
1. Alarcón F., G. Durán, M. Guajardo, J. Miranda, H. Muñoz, L. Ramírez, M. Ramírez, D. Sauré, M. Siebert, S. Souyris, A. Weintraub, R. Wolf-Yadlin and G. Zamorano (2017). Operations Research Transforms Scheduling of Chilean Soccer Leagues and South American World Cup Qualifiers. Interfaces 47: 52-69.
2. Goossens D.R. and F.C. Spieksma (2012). Soccer schedules in Europe: an overview. Journal of Scheduling 15: 641-651.
3. Durán G., M. Guajardo and D. Sauré (2017). Scheduling the South American Qualifiers to the 2018 FIFA World Cup by Integer Programming. European Journal of Operational Research 262: 1109-1115.



Certified Analytics Professional (CAP®)

The OR Society now offers Certified Analytics Professional, an exam-based analytics qualification established by INFORMS in the USA. This is complementary to our own accreditation programme, and does not create an exclusive ‘one-or-the-other’ choice.

What is CAP?

CAP is the premier global professional certification for analytics practitioners. Those who meet CAP’s high standards and pass the rigorous exam distinguish themselves and create greater opportunities for career enhancement. Earning the CAP credential requires meeting the eligibility requirements for experience and education, effective mastering of “soft skills”, committing to the CAP Code of Ethics, and passing the CAP exam. For organisations seeking to enhance their ability to transform complex data into valuable insights and actions, CAP provides a trusted means to identify, recruit, and retain the very best analytics talent.

CAPs in the workforce

■ 20% of Fortune 100 companies have CAPs on staff, including Bank of America, General Motors, Boeing, Chevron, DuPont, IBM, JPMorgan Chase, Lockheed Martin, UPS, and more.
■ Visit www.certifiedanalytics.org to search the database for CAP professionals.
■ Add CAP Preferred to your job postings to receive résumés from the most eligible analytics professionals.

CAP Benefits

■ Building Capability: CAP-accredited employees and professionals provide the unique capability to leverage the power and promise of analytics.
■ Driving Credibility: Having CAP professionals on your team demonstrates you have the top talent in place and are committed to the highest ethical standards of analytics practice.
■ Creating Opportunity: Encouraging employees and professionals to pursue their CAP certification creates new opportunities for success and provides new avenues for organisational analytics-based growth.

■ Focused on seven domains of the analytics process: I Business Problem Framing; II Analytics Problem Framing; III Data; IV Methodology Selection; V Model Building; VI Deployment; VII Lifecycle Management
■ Computer-based exam administered through the CAP test vendor network
■ Managed by the Institute for Operations Research and the Management Sciences (INFORMS)

Are you eligible?

Earning the CAP credential includes meeting eligibility requirements for experience and education.

Degree Level        Degree Area        Experience
MSc/MA or Higher    Related Area       3 Years
BSc/BA              Related Area       5 Years
BSc/BA              Non-Related Area   7 Years

Fees and pricing

For OR Society members, fees will be billed in sterling equivalent at the prevailing PayPal exchange rate.

Exam Fee: OR Society Member                                           $495    £380*
Exam Fee: Non-members                                                 $695    £533*
Annual Maintenance Fee (payable beginning 4th year of certification)  $100    £77*
Member Re-examination Fee                                             $300    £230*
Nonmember Re-examination Fee                                          $400    £307*
Processing Fee on Approved Refunds                                    $100    £77*
Appeals Processing Fee                                                $150    £115*

*GBP conversion estimated as at 6 September 2017.

Why should I hire CAPs?

■ Proven pre-qualified analytics talent
■ Improves your organisation’s analytics capability
■ Maintains continuous professional development
■ Provides long-term professional growth
■ Increases your competitive advantage

Programme at a glance

■ Globally recognised credential based on the practice of analytics professionals
■ Vendor and software neutral
■ Created by teams of subject matter experts from practice, academia and government

For more information about the CAP programme:
www.theorsociety.com/cap
0121 233 9300
www.certifiedanalytics.org
info@certifiedanalytics.org


IN PRAISE OF SABBATICALS

Mike Pidd

One of the pleasures of joining Lancaster University many years ago was that our contracts specified that we had study leave as of right. In effect, this meant that academic staff could take one year off in eight (one after seven) or one term off in eight to focus on whatever took their interest. If they could raise the cash, they could spend time at other universities around the world so as to broaden their experience. The sabbatical system was, of course, a potential opportunity to laze around and do nothing. Since the idea of a sabbatical occurs in the second chapter of Genesis, maybe taking a break from routine has some biblical warrant. Though there was obvious scope for abuse, I was unaware of any of my colleagues abusing the system; in fact, quite the opposite. Given the employment contracts, you’d expect one in eight of the staff to be on sabbatical leave all of the time. However, it rarely worked out that way and most staff took less leave than their entitlement, often due to it being inconvenient for their families. There was no extra money to support this sabbatical system; in essence it meant that the department worked 12.5% harder if all staff took this leave rather than none doing so. I was much more willing than some colleagues to take sabbatical leave, though when I retired I still had some unclaimed leave that I never took.

When I was chairing the panels that determined the research quality of UK business and management schools in 2008 and 2014, I argued very strongly that the availability of sabbatical leave to all academic staff as of right was an indicator of a school’s determination to support both research and staff development. Why did I think this? I suspect that a university that does not offer sabbaticals or research and study leave to its staff is, in essence, saying they cannot be trusted to use this time properly. I certainly gained great benefits from working in other universities during those periods. I had time to write without the need to teach and mark (I enjoyed teaching but hated marking). I could mix with new colleagues and discuss ideas with them. I could breathe in a different air and realise that the way we did things back home could often be bettered.

MEASURING PERFORMANCE

As non-academics will suspect, universities are very strange places in which staff are paid to spend time doing what they enjoy. Surely it cannot be fair to give even more freedom to them? Many metaphors can be brought into play for university life. One such, though rather chilling, is the notion of a research factory or a graduate factory. That is, universities can be seen as places in which the workers produce research and graduates. I much prefer to think of universities as institutions in which staff teach and do research, and this is not the same thing. We can easily measure research papers written, grants obtained and graduates produced. It is much less straightforward to assess the quality of research activity and of the student experience. But, you will cry, universities have to assess the quality of ‘the student experience’ and universities are subject to research quality assessment every 6 or 7 years. Indeed they are, but …

Before I retired I wrote a final book that, unlike the others, has sold extremely badly, despite its obvious and clearly measurable quality. It was called ‘Measuring the Performance of Public Services’, a subject that interested me in the final years before retirement. Some years ago, Alan Coren, father of the now famous Giles and Victoria, researched book titles and concluded that books about cats, golf and World War II sold well. Hence he published a collection of his writings under the title ‘Golfing for Cats’, with a picture of a swastika on the cover. I don’t know how well that book sold, however. I didn’t attempt such research, so mine might well have sold better with a snappier title, and I invite you to think of something suitable. I am still a promoter of measurement and an unashamed advocate of measuring how well our public services are provided. However, one thing is for sure: measuring these things has to be a multi-dimensional task. Most of us know the maxim ‘If it can be counted it will count’. We are aware that easily measurable aspects will come to dominate the less tangible, but possibly more important, things.


A USEFUL 2X2

What does this have to do with university life and sabbatical leave? In universities, as in most public and private institutions of any size, measurement tends to be routine and to focus on the surface, rather than on what’s bubbling away underneath and out of sight. In 1989, James Q. Wilson published an interesting book with the very uninteresting title ‘Bureaucracy’ (though it does have the sub-title ‘What Government Agencies Do and Why They Do It’). Its title is even less inviting than mine, but delving inside reveals some interesting and useful ideas. One is the type of 2-dimensional typology of which MBA types are said to be fond. Its two dimensions concern the observability of work activity and the observability of the results or outcomes of that work. Thus, work activity may or may not be observable, as may the results or outcomes of that work. This leads to a 2x2 matrix with four labels. Organisations with work activity and results that are fairly straightforward to observe, and therefore to measure, he terms ‘Production organisations’. An example might be a contractor that issues driving licences on behalf of a government department. This can operate with standardised processes that can be measured in a relatively objective manner, as can the number of different types of licence produced each period. Thus the internal managers can see how well they are doing and the government department can check whether it is getting value for money. So far, so simple.

However, things start to get more complicated when we consider what Wilson terms ‘Craft organisations’, in which the results or outcomes can be observed, but the work activity itself cannot. In some ways, organisations that provide healthcare can be regarded in this way, since it is (relatively) straightforward to assess the effect of an intervention, but really rather difficult to assess, say, the quality of a surgeon’s work as she operates. It’s tricky, also, when we consider ‘Procedural organisations’, in which the results of what people do are non-observable, at least in the short term, whereas their activity can be observed. Wilson argues that most bureaucracies are like this. People follow rules and procedures and the degree to which they do so can be measured; however, the results of their so doing may well be impossible to assess in anything like current time. Finally, the trickiest of all are those in which neither work activity nor outcome can be observed, which Wilson terms ‘Coping organisations’. I suspect that universities come closest to this. Short of having observers in each lecture, workshop, seminar or tutorial, the activity cannot be properly measured. Nor can that other activity: research. This typically includes periods where nothing much happens and things may even unravel for a while. Likewise, assessing the quality of the results of the teaching, that is, the students, is equally tricky. Coping organisations usually get round this by focusing their assessment on easy-to-use process measures such as the number of hours taught, the number of papers written, or the number of students graduated.

TIME GENTLEMEN, PLEASE

To return to the sheer joy and pleasure of sabbatical time: it supports the essentially unmeasurable aspects of university life for academic staff. It provides time to think, time to make mistakes and a different brick wall against which to beat your head when stuck in a rut despite all efforts to climb out of it. I’m in favour of such time being available to everyone, though they’ll need to find some way to finance it. The vicar of my church, a busy man with a congregation of over 500, has been with us for nearly seven years and we’d like him to stay and not burn out. So he starts a sabbatical next year in which he’ll have the chance to recharge his batteries. Some newspapers provide sabbaticals to their journalists; again, so they can be refreshed. I suppose that, now I’m retired, I’m on a permanent sabbatical.

Mike Pidd is Professor Emeritus of Management Science at Lancaster University.


OPTIMISING WASTE FLOWS

TIZIANO PARRIANI AND DANIELE VIGO

SINCE 2013, THE ITALIAN COMPANY HERAMBIENTE S.p.A. has made use of Operational Research to optimize waste allocation and transport over a broad network of treatment and disposal facilities. With 3.3 million citizens served, and more than 4 million tons of waste treated, the company owned by Hera Group is the largest waste operator in Italy and one of the largest in Europe. Herambiente S.p.A., in collaboration with Optit S.r.l., a spin-off company of the University of Bologna, applied O.R. to its core business. This collaboration led to the design, development and application of a Decision Support System (DSS) for waste flow management called OptiWasteFlow. Thanks to OptiWasteFlow, managers can make use of advanced optimization models to improve the accuracy of the planning process, and to obtain more cost-efficient solutions with remarkable returns in terms of profit.

O.R. AND WASTE MANAGEMENT

The need for O.R. support when taking decisions over waste management systems has grown in recent decades. This is true for the European context in particular, where straightforward source-to-landfill management has given way to more complex multi-echelon networks in which waste flows generally go through more than one preliminary treatment before reaching differentiated final destinations. The Herambiente network is an example of a large integrated waste management system: all kinds of waste types are routed over a network of about 90 directly controlled facilities and several hundred facilities owned by third-party companies and located in the centre-north of Italy.

Decisions at Herambiente are taken at different levels of detail, from strategic to operational. Each level also has its own time horizon and time granularity, with lower levels inheriting decisions taken at a higher level and translating them into more operational ones. At the highest level, yearly aggregated waste flows are allocated by Herambiente in a four-year Industrial Plan, updated every year in a rolling fashion. Decisions taken in the Industrial Plan are then refined in a Budget Plan, where the monthly waste flows are considered for the upcoming year. Finally, from the Budget Plan operational decisions are derived, allocating flows on a weekly basis.

optimal or near-optimal solutions for an Industrial Plan can be obtained within an extremely short computing time

Before the introduction of OptiWasteFlow, each level of this process was supported only by general-purpose IT tools. Especially for the more strategic decision levels, the definition of a good feasible solution was highly labour-intensive, requiring weeks to create the data and elaborate the decisions. Thus, the evaluation of possible what-if scenarios was restricted to an extremely reduced subset of possibilities, and the risk of losing profits was tangible. O.R. opened new possibilities. Once an OptiWasteFlow scenario is configured, optimal or near-optimal solutions for an Industrial Plan can be obtained within an extremely short computing time, while human resources are dedicated to key activities such as solution evaluation, what-if analysis and consequent strategic decision-making.

A SCREENSHOT FROM OPTIWASTEFLOW SOLUTION NAVIGATION INTERFACE.

THE CREATION OF OPTIWASTEFLOW

OptiWasteFlow is the outcome of a process lasting many years, which started in early 2000 with preliminary projects between Gruppo Hera and the O.R. group of the University of Bologna aimed at evaluating the potential of optimization in waste collection and treatment. After several years of testing the models in various contexts, the good results and the increased complexity of the business favoured the decision to create, in collaboration with Optit S.r.l., a complete Decision Support System capable of supporting the entire planning process at Herambiente. The analysis and development started in 2011, and soon the prototypes of the main OptiWasteFlow components were made available to central planners, starting a thorough refinement and model revision process that led to the delivery of the OptiWasteFlow system in 2013. Since then, the system has been used regularly by Herambiente planners to develop industrial and budget plans and to support monitoring and operational decision making. Given the complexity of the process and of the data required to feed OptiWasteFlow, the planning process was initially assisted by Optit’s modellers and analysts, but now the planners are completely autonomous in data preparation, scenario building and solution creation and evaluation. The efficacy of the system is continuously improved by the addition of new features and functionalities that improve usability and exploit the new savings and efficiency opportunities that the extensive what-if analysis activities disclose.

THE ARCHITECTURE OF OPTIWASTEFLOW

The mathematical formulation is always kept invisible to the user, who operates on user-friendly graphical interfaces to set up costs, limitations on the flows at the facilities, as well as all the various parameters necessary to guide the system towards the desired solutions. The user can generate and navigate through several scenarios, each giving a complete and independent representation of an alternative configuration of the system. Scenarios can be optimized on a remote server where solutions are produced in a few minutes. Results are then displayed graphically within the DSS and exported as reports or spreadsheets. From a solution obtained for a certain planning level it is always possible to derive limitations and parameters for a lower level. Therefore, what has been decided for the Industrial Plan can be translated into objectives and constraints for the Budget, and so on.

Hidden behind the DSS, sophisticated mixed integer linear problems are solved by state-of-the-art commercial solvers. In the optimization model, a graph represents the physical network. Waste types are aggregated into commodities that travel between nodes representing sources of waste, treatment facilities, and final destinations. Transformation coefficients at the nodes model the variations occurring to the waste when it enters a facility, because what exits from a treatment facility is generally different from what enters, both in terms of quantity and composition. For each waste type, decision variables represent the amount of waste routed to each possible facility at a given time. The objective function, to be minimized, considers treatment and disposal costs and revenues, logistic costs for the transportation of waste, and revenues from selling the sub-products as electric power or recycled materials. Several different types of limitation can be imposed: constraints can set an upper or lower limit on the overall amount of waste entering or leaving a facility at a given time, or can limit the amount of flow of a given subset of waste, in absolute terms or as a percentage of the overall waste entering that facility. This situation is common for waste-to-energy or composting facilities, where the waste operator must create a mix of waste producing the correct heat of combustion or the appropriate “recipe” for the compost creation. Additional features can also be integrated in the model, like temporary storages at the nodes, digester facilities, delayed flows on time-spatial graphs, logic constraints on the facility nodes, or economies of scale on waste transportation.
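As a rough illustration only (not Optit’s actual OptiWasteFlow formulation, and with all node names, capacities, coefficients and costs invented), the following mixed integer programme captures the core ingredients just described: flows from sources through treatment facilities to final destinations, transformation coefficients at facilities, capacity limits, a facility-activation logic constraint, and a cost-minimising objective. It uses the open-source PuLP modeller rather than a commercial solver.

import pulp

# Invented toy data: tonnes produced at sources, and facility capacities
sources = {"city_A": 100.0, "city_B": 60.0}
treatments = {"sorting_plant": 120.0, "waste_to_energy": 80.0}
finals = {"landfill": 150.0, "recycler": 100.0}

# Cost per tonne on each allowed arc (transport plus gate fee; negative = revenue)
arc_cost = {
    ("city_A", "sorting_plant"): 12.0,
    ("city_A", "waste_to_energy"): 18.0,
    ("city_B", "sorting_plant"): 9.0,
    ("city_B", "waste_to_energy"): 15.0,
    ("sorting_plant", "recycler"): -20.0,
    ("sorting_plant", "landfill"): 30.0,
    ("waste_to_energy", "landfill"): 25.0,
}

# Tonnes leaving a treatment facility per tonne entering it
transform = {"sorting_plant": 0.9, "waste_to_energy": 0.25}
# Fixed cost incurred if a treatment facility is used at all
fixed_cost = {"sorting_plant": 500.0, "waste_to_energy": 800.0}

prob = pulp.LpProblem("waste_flow_allocation", pulp.LpMinimize)
flow = {a: pulp.LpVariable(f"flow_{a[0]}_{a[1]}", lowBound=0) for a in arc_cost}
use = {t: pulp.LpVariable(f"use_{t}", cat="Binary") for t in treatments}

# Objective: variable arc costs plus fixed facility costs, minus revenues
prob += (pulp.lpSum(arc_cost[a] * flow[a] for a in arc_cost)
         + pulp.lpSum(fixed_cost[t] * use[t] for t in treatments))

# All waste produced at each source must be shipped somewhere
for s, produced in sources.items():
    prob += pulp.lpSum(flow[a] for a in arc_cost if a[0] == s) == produced

# Transformation balance at treatments, with capacity only if the facility is used
for t, cap in treatments.items():
    inflow = pulp.lpSum(flow[a] for a in arc_cost if a[1] == t)
    outflow = pulp.lpSum(flow[a] for a in arc_cost if a[0] == t)
    prob += outflow == transform[t] * inflow
    prob += inflow <= cap * use[t]

# Capacity at final destinations
for f, cap in finals.items():
    prob += pulp.lpSum(flow[a] for a in arc_cost if a[1] == f) <= cap

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for a in arc_cost:
    print(a, round(flow[a].value(), 1))

The real system works with many commodities, time periods and facility types, but the structure described in this article (flow variables on a graph, transformation coefficients, capacity and logic constraints, and a profit-oriented objective) is the one sketched here.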

an overall increase of 14% in the Industrial Plan foreseen incomes is obtained

THE IMPACT ON HERAMBIENTE ECONOMICS

OptiWasteFlow nowadays supports the whole Herambiente decision-making process, from the Industrial Plan to the operational level. As far as the Industrial Plan is concerned, the results were the subject of deep qualitative and quantitative analysis performed by the waste operator to assess the validity of the tool. As an initial and fundamental step, the existing treatment network was modelled in OptiWasteFlow and the existing Industrial Plan, formulated with the traditional process, was evaluated with OptiWasteFlow to define an “As-Is” reference solution. Once the accuracy of the As-Is income evaluation had been verified, an “optimized” solution was obtained by running the model with the same restrictions on flows, costs, and amount of waste managed. The two solutions can then be compared to understand the changes suggested by the model to flow allocations and the consequent effects on the main key performance indicators.

The most important result was a considerable increase in the expected operating income. An overall increase of 14% in the Industrial Plan foreseen incomes is obtained, mostly thanks to a more efficient use of facilities resulting in decreased disposal and treatment costs, which constitute the major component of the objective function value. Logistics costs, although included in the model optimization process, are not considered in such revenue evaluations. The reason is that in practice they derive from long-term contracts stipulated by the decision-makers with several external carriers. Hence a reduction in the amount of transported flows does not necessarily have an immediate impact on the actual logistics expenditure of a given year, since such contracts often prescribe guaranteed

IMPACT | AUTUMN 2017

17

© Hera Group

few minutes. Results are then displayed graphically within the DSS and exported as reports or spreadsheets. From a solution obtained for a certain planning level it is always possible to derive limitations and parameters to a lower level. Therefore, what has been decided for the Industrial Plan can be translated into objectives and constraints for the Budget, and so on. Hidden behind the DSS, sophisticated mixed integer linear problems are solved by state-of-the-art commercial solvers. In the optimization model, a graph represents the physical network. Waste types are aggregated into commodities that travel between nodes representing sources of waste, treatment facilities, and final destinations. Transformation coefficients at the nodes model the variations occurring to the waste when it enters a facility, because what exists from a treatment facility is generally different from what is entering, both in terms of quantity and composition. For each waste type, decision variables represent the amount of waste routed to each possible facility, at a given time. The objective function, to be minimized, considers: treatments and disposal costs and revenues, logistic costs for the transportation of waste, and revenues from selling the sub-products as electric power or recycled materials. Several different types of limitations can be imposed: constraints can set an upper or lower limit to the overall amount of waste entering or leaving a facility at a given time, or can limit the amount of flow of a given subset of waste, in absolute terms or as a percentage of the overall waste entering in that facility. This situation is common for waste-toenergy or composting facilities when the waste operator must create a mix of waste producing the correct heat of combustion or the appropriate “recipe” for the compost creation. Additional


© Hera Group

© Hera Group

EXPECTED INCOMES AND LOGISTIC COSTS IN THE AS-IS AND OPTIMIZED SOLUTIONS FOR THE 2014-2017 INDUSTRIAL PLAN (INCOMES +14% AND LOGISTIC COSTS -44% IN THE OPTIMIZED SOLUTION).

FLOW AND INCOME VARIATIONS IN PRESENCE OF RESTRICTIONS ON LANDFILL UTILIZATION (FLOW VARIATIONS W.R.T. THE "AS-IS" SCENARIO FOR RECYCLE, LANDFILL, FILLING MATERIALS AND FERTILIZER, AND INCOME, IN THE OPTIMIZED AND LANDFILL-LIMITED OPTIMIZED SCENARIOS).

Hence a reduction in the amount of transported flows does not necessarily have an immediate impact on the actual logistics expenditure of a given year, since such contracts often prescribe guaranteed minimum flow quantities to the carriers. Nevertheless, it is important to observe that the optimization reduced the logistic flows by more than 40%, which undoubtedly means fewer kilometres travelled by the waste over expensive paths, and in all probability gives the opportunity to negotiate more accurate and convenient terms when contracts are periodically renewed with the carriers. As indicated, an interesting aspect introduced by an optimization-based DSS is the possibility of easily exploring alternative scenarios. The comparison between the As-Is and the Optimized solutions offered a typical example of what-if analysis. In the optimized solution, in fact, the amount of flow


disposed at landfills sees an increase of slightly less than 5%. Generally, the disposal of waste in landfills must be discouraged for reasons that can be difficult to translate into operating costs. What happens if we forbid any increase of the flows routed to landfills? A dedicated scenario examining this possibility can be easily derived from the original one. In the landfill-limited Industrial Plan, a new solution is obtained in which the flow routed to landfills shows zero growth with respect to the As-Is. The extra 5% is now almost entirely planned to be used as filling materials. Such a decision, with overall disposal costs increasing by slightly more than 1%, leads to a reduction

of the overall income, which is now 6% higher than the As-Is. Not surprisingly, the more valuable wastes were already recycled in the Optimized solution, so no change is observed in the flows allocated to those destinations. Thanks to the encouraging early results, OptiWasteFlow is now adopted by Herambiente to support decision making in its core business. The project, started in 2011, is a good example of how O.R. can be integrated into an industrial process. This is an ongoing project in which the DSS and the underlying models are in continual evolution and expansion. Integration with other systems currently in use at Gruppo Hera is under development. Recently, Optit and Herambiente have also been working together to improve the effectiveness of this methodology at the lowest, so-called Operative, levels. At this finest level of detail, short-term legal authorizations for waste transportation are loaded into the system. Weekly target values, derived from the higher levels, are then translated into daily waste flow allocations considering all the restrictions imposed by the authority. Besides the excellent revenues generated, OptiWasteFlow has also fostered the diffusion of O.R.-supported planning within other branches of the Hera Group, such as the waste collection and district heating divisions. Tiziano Parriani is a math modelling specialist at Optit. He designs and develops models and methods for combinatorial optimisation problems arising in several application areas, focusing on logistics and energy management. Daniele Vigo is Full Professor of Operational Research in the DEI “Guglielmo Marconi” at the University of Bologna, Italy. He is a founder and CSO of Optit.


UNIVERSITIES MAKING AN IMPACT

EACH YEAR, students on MSc programmes in analytical subjects at several UK universities spend their last few months undertaking a project, often for an organisation. These projects can make a significant impact. This issue features reports of projects recently carried out at two of our universities: Strathclyde and Lancaster. If you are interested in availing yourself of such an opportunity, please contact the Operational Research Society at email@theorsociety.com

REDUCING UNCERTAINTY IN EMISSIONS FOR DOOSAN BABCOCK’S CLEAN COMBUSTION TEST FACILITY (Connor Wilson, University of Strathclyde, MSc Business Analysis and Consulting)

© Strathclyde Business School

Doosan Babcock is an engineering firm which provides services in the energy sector, including the upgrade and replacement of burners used in combustion power plants. The burner is a highly engineered component, responsible for combining fuel and oxygen for ignition in a power plant furnace. The furnace heats water into steam that then turns a turbine to generate electricity. Doosan Babcock designs, manufactures, tests, and implements burners which are designed to reduce the emissions of noxious and greenhouse gases. To aid in the burner design and implementation process, Doosan constructed a Clean Combustion Test Facility (CCTF) which operates as a full-scale power plant and allows Doosan engineers to test burners under controlled conditions. The CCTF is fitted with sensors which measure parameters such as air flow, fuel flow, temperatures, and emissions. This real-time data is then used to calculate Burner Zone Stoichiometry (BZS) and Nitric Oxide emissions. The BZS is the molar ratio of fuel to oxygen at the ignition point of the burner, and is correlated to the magnitude of Nitric Oxide emissions. Doosan adjusts air flows and fuel flows in the CCTF to test burners at different BZS points, which they then compare to the Nitric Oxide emissions. This allows them to find the ideal operating point for each burner, which reduces emissions. Each of the sensors in the CCTF has some measurement uncertainty associated with it, and Connor's project involved understanding how this uncertainty propagates through to the calculations of BZS and Nitric Oxide emissions. The first project objective was to help Doosan Babcock engineers define uncertainties for the various instruments, and the second was to develop a model that could calculate the uncertainty of the BZS and Nitric Oxide parameters based on the uncertainty of the input sensor data. Using Python, a Monte Carlo simulation involving more than 20,000 iterations was developed to randomize previously collected sensor data before passing the data into the calculations for BZS and Nitric Oxide. The BZS and Nitric Oxide values were collected at each iteration, and the uncertainty was then calculated using population statistics. Once Doosan Babcock was aware of the magnitude of uncertainty in their BZS and Nitric Oxide calculations, they were interested in identifying the sensors which contributed the most uncertainty to the BZS and Nitric Oxide outputs. Final recommendations from the project included possible methods that could be used to reduce the uncertainty of those sensors which were resulting in the greatest uncertainty in BZS and Nitric Oxide calculations. Dr Stuart C Mitchell, Head of T&E Product and Technology Development, says that "Connor's work supported the combustion team to better understand test facility instrumentation errors and how we determine measurement uncertainty during our test campaigns. Based on Connor's dissertation Doosan Babcock is currently considering modifications to the existing plant duct work to improve measurement accuracy."
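As an illustration of the idea of propagating sensor uncertainty into a derived quantity, here is a minimal Monte Carlo sketch in Python. The sensor values, the assumed uncertainties and the simplified stand-in for the BZS formula are invented for this example and are not Doosan Babcock's actual instrumentation or calculation:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20_000  # number of Monte Carlo iterations, of the order used in the project

# Hypothetical sensor readings perturbed by their assumed standard uncertainties
air_flow = rng.normal(loc=10.0, scale=0.15, size=N)    # kg/s
fuel_flow = rng.normal(loc=1.00, scale=0.02, size=N)   # kg/s

# Simplified stand-in for the Burner Zone Stoichiometry calculation:
# ratio of air supplied to the air theoretically required for complete combustion.
STOICH_AIR_PER_KG_FUEL = 9.0  # illustrative constant
bzs = air_flow / (fuel_flow * STOICH_AIR_PER_KG_FUEL)

# Population statistics over the sample give the propagated uncertainty in BZS
print(f"BZS mean = {bzs.mean():.3f}, propagated standard uncertainty = {bzs.std():.4f}")
```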


© Lancaster University

FORECASTING ELECTRICITY DEMAND AT BUSINESS ENERGY SOLUTIONS (Dmitrii Ishutin, Lancaster University, MSc Management Science and Marketing Analytics)

Business Energy Solutions (BES) is a UK-based utilities supplier located on the Fylde coast in Lancashire. Established by prominent entrepreneur Andy Pilley in 2002, it has grown into a successful enterprise with more than 300 staff. BES supplies electricity and gas to business customers across the UK; Dmitrii's project focussed on the electricity domain. The UK electricity industry is highly competitive and complex, especially for smaller suppliers, which have to compete against the traditional Big Six: British Gas, E.ON, npower, SSE, Scottish Power and EDF Energy. In addition, there are numerous regulation and compliance policies that suppliers must sign up to and adhere to. To be a successful supplier, BES must:

• win customers on a competitive retail market
• forecast customers' consumption accurately
• buy electricity on the wholesale market at the best prices
• bill customers and respond to their queries effectively.

Accurate electricity demand forecasting has always been a critical success factor. Precise knowledge of customers' demand allows BES to buy enough energy in advance and prevents the additional expenses incurred in the case of a surplus or shortage of energy. In short, accurate demand forecasting saves a lot of money. Dmitrii's objective as part of his MSc project was to develop a statistical predictive model that would achieve forecast accuracy of at least 90% on a 5-week-ahead forecast horizon. Since electricity demand is metered half-hourly, there are three seasonal cycles, namely daily, weekly and yearly, that had to be analysed and modelled. Dmitrii tested and assessed forecasting methods such as exponential smoothing and autoregressive integrated moving average (ARIMA) models, but eventually a regression-based model proved to be the most accurate. This model considered a consecutive time series of historic demand, its lagged observations and various dummy variables that modelled daily, weekly and yearly seasonal patterns. Dmitrii also developed a forecasting algorithm that could be used by BES independently and produced reliable forecasts with up to 93% accuracy. Upon successful completion of his MSc, Dmitrii was offered a Statistical Analyst position at BES. Since joining BES, Dmitrii has upgraded his regression-based model by introducing weather factors such as temperature and wind speed. Pursuing higher accuracy,

Dmitrii tested two other forecasting approaches with deseasonalised time series. The first approach eliminated the daily period by splitting the original consecutive time series into separate half-hourly time series, so that a distinct forecasting model was computed for each half-hour, amounting to 48 different models. The second approach eliminated both the daily and weekly periods, with a separate forecasting model developed for each half-hour of each day type, amounting to 336 different models. In both cases all forecasted half-hours are assembled together and thus the original time series is restored. Regarding practical outcomes, the 48-model half-hourly demand forecasting approach demonstrated the highest accuracy, of 96%, and is about to be implemented by the business. David Ballantyne, BES Commercial Director and sponsor of the project, stated that "Business Energy Solutions saw a real practical benefit in Dmitrii's demand forecasting model as it could help to reduce day-to-day trading costs, which was a primary objective of the whole project. It also provided valuable insight that forecasting disaggregated half-hourly data can be more accurate than forecasting the original continuous time series. We are happy to see Dmitrii among our staff upon his successful completion of this project."
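A minimal sketch of this style of regression model, written against a synthetic half-hourly series rather than BES data (the lag choices and dummy structure are illustrative, not Dmitrii's actual specification), might look as follows:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic half-hourly demand with daily, weekly and yearly patterns (illustration only)
idx = pd.date_range("2015-01-01", periods=2 * 365 * 48, freq="30min")
rng = np.random.default_rng(0)
demand = (100
          + 20 * np.sin(2 * np.pi * idx.hour / 24)        # daily cycle
          + 10 * (idx.dayofweek < 5)                       # weekday/weekend effect
          + 15 * np.sin(2 * np.pi * idx.dayofyear / 365)   # yearly cycle
          + rng.normal(0, 3, len(idx)))
df = pd.DataFrame({"demand": demand}, index=idx)

# Lagged observations: same half-hour one day ago and one week ago
df["lag_1d"] = df["demand"].shift(48)
df["lag_1w"] = df["demand"].shift(48 * 7)

# Dummy variables for the half-hour of day and the day of week
halfhour = df.index.hour * 2 + df.index.minute // 30
dummies = pd.get_dummies(halfhour, prefix="hh").set_index(df.index).join(
    pd.get_dummies(df.index.dayofweek, prefix="dow").set_index(df.index))
data = df.join(dummies).dropna()

# Hold out roughly the last five weeks to mimic a 5-week-ahead evaluation
holdout = 48 * 35
X, y = data.drop(columns="demand"), data["demand"]
model = LinearRegression().fit(X[:-holdout], y[:-holdout])
mape = np.mean(np.abs(model.predict(X[-holdout:]) - y[-holdout:]) / y[-holdout:])
print(f"Hold-out accuracy (100% - MAPE) = {100 * (1 - mape):.1f}%")
```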


HOW MCDA IS MOVING DECISION-MAKING FROM INTUITIVE AND QUALITATIVE, TO EXPLICIT AND QUANTITATIVE

© Author's own

LAWRENCE D PHILLIPS

DECISION-MAKERS IN ALL ORGANISATIONS attempting to balance benefits against costs and risks can become frustrated when their staff fail to pull together in the same direction. The causes of the disparity are many: benefits are characterised by multiple objectives which often conflict; staff provide advice that only supports their local concerns; expert advice from different sources conflicts; data fail to provide unequivocal guidance; and the underlying issues are too complex. Accountable managers may respond to the impasse by relying

exclusively on intuition and experience in making decisions, a default position that fails tests of quality, transparency, and communicability. To surmount these limitations in balancing benefits, costs and risks, organisations in the private and public sectors are increasingly applying multicriteria decision analysis (MCDA) to aid, but not replace, decision-making. Three aspects of MCDA provide help to decision makers: (1) a process that breaks down the problem into manageable pieces and later reassembles them into a whole that is relevant to the decision, (2) a quantitative model that incorporates the key drivers of a decision, and (3) a facilitated workshop of key players who provide their expert judgements and exercise peer review as the quantitative modelling is developed.

MCDA OF OVER-THE-COUNTER ANALGESICS

To explain the benefits of the MCDA approach, I've chosen an example that has just been made public: a benefit-risk evaluation of familiar pain-killers. Reckitt Benckiser, which markets Nurofen, asked if I could develop an MCDA model comparing



FIGURE 1: THE EFFECTS TREE OF FAVOURABLE AND UNFAVOURABLE EFFECTS (FAVOURABLE EFFECTS: PAIN RELIEF, DURATION OF ACTION, SPEED OF ONSET; UNFAVOURABLE EFFECTS: ADVERSE REACTIONS - SKIN REACTIONS, GI EFFECTS, HEPATIC EFFECTS; SERIOUS ADVERSE REACTIONS - GI EFFECTS, CV EFFECTS, RENAL EFFECTS, ANAPHYLAXIS; OVERDOSE TOXICITY).

over-the-counter analgesics in a decision conference, which is a workshop of key players whose facilitator helps the group, on the spot, to create the model. After establishing that only UK experts in pain control would contribute the content of the model and that Reckitt Benckiser scientists would be present only as observers, I agreed to facilitate the decision conference and a subsequent workshop review by five external experts. All seven outside experts were chosen to represent the diversity of opinion and expertise about analgesics and were recognised authorities in pain relief. The decision-conference group considered six well-known over-the-counter pain-relief drugs: aspirin, diclofenac, ibuprofen acid (ordinary tablet form), ibuprofen S&S (salts & soluble form), paracetamol and naproxen. The experts identified three favourable effects and eight unfavourable effects, as shown in the Effects Tree of Figure 1, and provided operational definitions for all of them. Favourable effects capture the reasons why most people purchase these drugs: they want relief from pain, they want the relief to last for as long as possible, and they want the drug to start acting quickly. No drugs are completely free of possible side effects, and the unfavourable effects for the analgesics were grouped in three categories: adverse reactions, which may be mild or moderate and can be alleviated by discontinuation; serious adverse reactions, which can be life-threatening, requiring hospitalisation or medical treatment; and the potential for a life-threatening overdose.

SCORING THE DRUGS

The drugs were scored on each of the 11 criteria by consulting a variety of published research and studies, including cross-sectional surveys, and the UK summary of product characteristics (SPC). Different metrics measured the performance of the drugs on the favourable effects: the percentage of people reporting a reduction of at least 50% in pain, the time to re-medication for 50% of patients measured in hours, and the time to perceptible pain relief in minutes. The Hiview3 software, initially developed at the London School of Economics & Political Science, available at www.catalyze.co.uk and used for constructing the model, converted each of these metrics to a preference scale, with 100 assigned to the drug that performed best on the given effect, 0 to the least-well-performing drug, and all other drugs assigned preference values in linear correspondence to the input metric. For the unfavourable effects, incidence rates and severity formed the basis for scoring. Data were available for skin reactions, but no controlled studies exist for the other seven effects, so the experts relied on available data in making direct 0-100 assessments on the preference scales, after considerable discussion.

organisations in the private and public sectors are increasingly applying multicriteria decision analysis (MCDA) to aid, but not replace, decision-making

WEIGHTING THE CRITERIA

With scoring of the performance of the drugs on the 11 criteria complete, participants turned to assessing the relative importance of the criteria themselves. This is accomplished with ‘swing-weighting’, in which assessors first consider the swing in performance from the least-to-best positions on each scale, and then decide how clinically important this is. I asked the question, “How big is the difference and how important is it?” repeatedly to provide the weights shown in Figure 2 for the favourable effects. Weights for the unfavourable effects were similarly assessed, and a final assessment of the 100-weighted favourable effect compared to the 100-weighted unfavourable effect completed the weighting. These weights equate the units of preference across all 11 criteria, just as it takes nine Fahrenheit units to equate to five units of Celsius of temperature, and the process of obtaining them requires the assessors to think about trade-offs between the scales.

FIGURE 2: SWING WEIGHTS ASSIGNED TO THE THREE FAVOURABLE EFFECT CRITERIA (PAIN RELIEF 100, DURATION OF ACTION 40, SPEED OF ONSET 90).

RESULTS

The assumption of mutual preference independence between the criteria (input scores on criterion A don't have to be consulted to obtain scores for criterion B, and vice versa, even though they may be statistically correlated) justifies multiplying scores by criterion weights to obtain an overall benefit-risk figure for each drug. Hiview3 displays the result graphically, as in Figure 3. It's important to recognise that these results are based on judgements of relative preference, so the results are also relative, not absolute, benefits and risks. It is the differences in the final results that make sense, but ratios of those numbers don't, just as 30°C is not twice the temperature of 15°C.

You can also do things with quantitative models that are difficult or impossible to do with words

FIGURE 3: FINAL WEIGHTED PREFERENCE VALUES FOR THE SIX DRUGS (NAPROXEN 94, DICLOFENAC 83, ASPIRIN 57, IBUPROFEN S&S 54, IBUPROFEN A 39, PARACETAMOL 13). MORE GREEN MEANS MORE BENEFIT, AND MORE RED MEANS MORE SAFE (NOT MORE RISK).

IMPLICATIONS

Many medical practitioners who have seen the descriptions of all six analgesics tell me they learned something new. For drugs that are so commonly used, this admission suggests that the MCDA process has assembled the data, which lie scattered throughout the vast medical literature, along with clinical judgements of the relevance of the data, and delivered the overall result in a comprehensive, easily understood manner. You can also do things with quantitative models that are difficult or impossible to do with words. For drugs, MCDA models created by groups of experts require participants to identify and define the effects criteria, agree and define the options to be evaluated, and select the data that will form the basis for the scoring. For the most part, those three steps are normally carried out implicitly, so the MCDA lends precision and clarity. Scoring requires judgement to decide which data to use; weighting usually leads to trimming the Effects Tree as participants discard


effects that have no appreciable influence on the overall benefit-risk balance. Most medical people simply don't think this way, though they are genuinely intrigued by this new way of structuring a decision, and quickly adapt to the new procedures. Indeed, in one decision conference, during the process of weighting, one slow-responding participant was challenged by another expert to “Stop prevaricating, and give us a number!” After a promise by the facilitator that the weight would be wiggled around to see its effect on the overall result, the reluctant expert gave a firm response.

MCDA works to help decision makers take informed, quality decisions, which can be explained and communicated easily

The MCDA/decision conferencing process is increasingly used by pharmaceutical companies in two ways. The first is to serve as a guide during product development, when data are sparse and ‘best guesses’ are explored in the model to identify the most promising directions. The model is then used to track the changing benefit-risk profile, enabling losers to be identified and stopped before they reach the expensive clinical trials. Models for drugs that survive the early stages continue to be modified and improved as more data become available, with each MCDA model providing an overall view of the benefit-risk profile as the drug develops. The second use of MCDA is to position a drug compared to the competition. One company, for example, found that although their new drug was second best overall for the main form of the medical condition, it was the best for patients suffering from the severe form of the disease. Paired comparisons teased out the conditions for which the drug could be most helpful to a sub-group of patients. In this sense, MCDA modelling can help position drugs for individual patient-oriented treatment. Finally, at the end of clinical trials, the model can help in preparing the application to a regulatory authority. In summary, MCDA works to help decision makers take informed, quality decisions, which can be explained and communicated easily. It is the following characteristics of MCDA that make this possible:

1. MCDA expresses all effects, favourable and unfavourable, in comparable units of preference-value.
2. It accepts any performance metrics: measurable quantities, scoring systems, relative frequencies, decision outcomes, etc.
3. It distinguishes between data (performance measures) and judgements (relevance of the data to the decision context).
4. It captures trade-offs among the effects.
5. It is based on decision theory, not ad hoc ‘weighting-and-scoring’.
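As a purely numerical illustration of characteristics 1 and 4, the sketch below converts raw metrics to linear 0-100 preference scales and combines them with normalised swing weights. The options, criteria, metric values and weights are invented and are not those of the analgesics model:

```python
# Illustrative additive MCDA model: linear 0-100 preference scales plus swing weights.
criteria = {                       # criterion: (direction, swing weight) -- all invented
    "pain_relief": ("max", 100),
    "speed_of_onset": ("max", 60),
    "gi_effects": ("min", 40),
}
options = {                        # raw performance metrics per option (invented)
    "option_A": {"pain_relief": 55, "speed_of_onset": 30, "gi_effects": 2.0},
    "option_B": {"pain_relief": 40, "speed_of_onset": 15, "gi_effects": 0.5},
    "option_C": {"pain_relief": 70, "speed_of_onset": 45, "gi_effects": 4.0},
}

def preference(value, lo, hi, direction):
    """Linear 0-100 scale: 100 for the best-performing option, 0 for the worst."""
    score = 100 * (value - lo) / (hi - lo)
    return score if direction == "max" else 100 - score

total_weight = sum(weight for _, weight in criteria.values())
overall = {}
for name, metrics in options.items():
    value = 0.0
    for crit, (direction, weight) in criteria.items():
        column = [o[crit] for o in options.values()]
        value += (weight / total_weight) * preference(metrics[crit], min(column), max(column), direction)
    overall[name] = round(value, 1)

# Higher overall value = better benefit-risk balance (relative, not absolute, as in the text)
print(sorted(overall.items(), key=lambda item: -item[1]))
```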

FUTURE DIRECTIONS

I hope it is now clear that the current method of making decisions about drugs, which depends heavily on conventional statistical analysis and data summaries for individual effects, is not up to the job of making transparent, clear decisions. There is no procedure for combining those effects, so it is done intuitively by exercising judgement. However, MCDA provides a way of combining summary statistics on all the effects.

evidence is growing that MCDA can transform the methods and processes for decision making in any type of organisation

One advantage of a decision-theoretic approach is that it could provide the answer to the fundamental question at the heart of all drug decision-making: how probable is it that this new drug is better than a placebo or any other comparator? If patient-level data were to be made available, it would then be possible to carry out probabilistic simulation to provide the probability that the drug's benefit-risk profile is better than that of any comparator. That, I believe, is where we are heading next, and it will revolutionise drug development, regulatory approval, and the marketing of drugs. More generally, evidence is growing that MCDA can transform the methods and processes for decision making in any type of organisation, providing a way to create a shared understanding of a problematic situation, a sense of common purpose, and commitment to the future direction of the organisation in a changing world.

Larry Phillips is an Emeritus Professor of Decision Sciences at the London School of Economics. He serves as a process consultant through his company, Facilitations Ltd, helping organisations to deal with complex problems characterised by conflicting objectives, uncertainty, and complexity.


WHAT TO DO WITH DISUSED RAILWAYS?

VALENTINA FERRETTI

TRAINS ARE NOWADAYS AN EFFICIENT AND FAST MODE OF TRANSPORT, to the extent that high-speed rail, over medium distances, is competitive with air travel. They also represent an environmentally friendly solution, since they cause less pollution than other transport modes and consume fewer environmental resources. However, the current economic crisis has made it necessary to reorganise rail services in many rural and peripheral areas all over the world, sometimes resulting in their suppression and replacement with bus services.

THE PROBLEM

The proliferation of inactive railways across the world is becoming significant (according to a report in 2014, there are more than 240,000 km of disused railways in the US, more than 7,000 km in Spain, more than 7,500 km in Italy and more than 2,000 km in Belgium, to mention just a few examples). This represents a problem worldwide, as abandoned railways usually lead to natural and environmental degradation, structural failings and thus safety problems, and even illegal waste disposal with associated negative aesthetic impacts on the landscape. Consequently, abandoned railway sites, and with them station buildings that fall into disuse, are becoming a focus of redevelopment projects in many countries. Creative and successful examples already exist: for instance, in 1983, the United States introduced the Rail Banking solution, a procedure that provides for the maintenance and preservation of the


© Trevor Mogg / Stockimo / Alamy Stock Photo



© robertharding / Alamy Stock Photo

operational functionality of a railway line, in view of a possible reopening of the track. In periods of inactivity, public or private organisations can be temporarily granted the use of the railway line for recreational purposes. In Belgium, in 1997, the national railway system (SNCB) signed an agreement aimed at granting under concession almost 1,000 km of disused lines for 99 years, resulting in the creation of a network of greenways. Spain followed a similar strategy: 100 greenways were created and old stations were converted to new uses such as bed and breakfasts, bike rentals, railway engineering museums, etc. Disused railways can therefore also represent an opportunity. Finding a new use for neglected infrastructure provides a chance for low-carbon travel experiences: reconversion policies promote new uses, arrest decay processes and re-establish continuity in the environmental system, using already existing linear infrastructure and thus limiting the consumption of new soil, coherently with sustainable development targets. I will show how decision analytics have been used to support railway regeneration in Italy, where 50% of the disused lines have been evaluated as suitable for recovery for touristic and ecological purposes. The focus


is on the 12 passenger railway lines recently abandoned and replaced by bus services in the Piedmont Region (North West Italy). Among these lines, the Pinerolo - Torre Pellice, which stretches for 16.5 km between the city of Pinerolo and the Pellice Valley, crossing six municipalities, was selected as the most strategic one to be studied further for regeneration purposes. We developed this study for the transportation sector of the Piedmont Region Authority, which contacted the Higher Institute on Territorial Systems for Innovation (SiTI, http://www.siti.polito.it/index.php?l=ENG), where I was working as a postdoc fellow in 2015, to support them in the identification of possible reuse strategies for the selected line. The identification and evaluation of feasible alternatives for the regeneration of disused railways is not an easy task. It is an inherently multi-attribute problem, characterized by different dimensions and thus heterogeneous and often conflicting objectives; the presence of multiple stakeholder views, which calls for a participative decision process able to include different perspectives and facilitate the discussion; long time horizons, which add further structural uncertainty to the decision-making process; the irreversible allocation of scarce public resources; and the need for legitimation

and accountability of both results and processes. Based on the above characteristics, we proposed the use of Multicriteria Decision Aiding (MCDA) to support the identification of a comprehensive set of objectives to be achieved and the subsequent design of alternatives to achieve them. MCDA nowadays represents a consolidated approach to decision making in many different contexts, including the analysis of transportation systems. However, it had not previously been applied in the context of abandoned railway regeneration; this study is thus innovative in that it represents the first application of MCDA to railway regeneration.

THE DECISION SUPPORT PROCESS

The decision support process that we developed was designed to achieve the following objectives: (i) provide a transparent and transferable methodological framework able to support collaborative decision-making and planning processes related to the regeneration of disused railways in mixed urban and rural contexts; (ii) offer insights on what needs to be improved on specific alternatives;


(iii) develop a robust recommendation for the Regional Authority with reference to the best regeneration option for the abandoned railway line under analysis. To achieve these objectives, we integrated descriptive and predictive

decision analytics within a prescriptive framework, as shown in Figure 1. We used MCDA with a facilitated modelling approach throughout the whole decision process, from structuring and defining the nature of the problem situation of interest,

FIGURE 1 STEPS OF THE DEVELOPED DECISION SUPPORT PROCESS (DEFINITION AND STRUCTURING OF THE PROBLEM, STAKEHOLDERS' ANALYSIS, VALUE-FOCUSED THINKING, ALTERNATIVES, BEST PRACTICES ANALYSIS, PREFERENCE ELICITATION AND VISUALIZATION TOOLS WITH AN EXPERTS' PANEL, AGGREGATION AND REAL-TIME SENSITIVITY ANALYSIS, AND RECOMMENDATION FOR THE DECISION MAKER, COMBINING DESCRIPTIVE, PREDICTIVE AND PRESCRIPTIVE ANALYTICS)

FIGURE 2 VALUE TREE FOR THE DECISION-MAKING PROBLEM UNDER ANALYSIS (STAKEHOLDERS: PIEDMONT REGION, TURIN'S PROVINCE, MUNICIPALITIES INVOLVED, POPULATION, COMMUTERS, TOURISTS, TOURIST AND COMMERCIAL ASSOCIATIONS, URBAN MOBILITY AGENCY, ITALIAN RAILWAY NETWORK, LOCAL TRANSPORTATION COMPANIES; OBJECTIVE: IDENTIFICATION OF THE BEST SCENARIO FOR THE REGENERATION OF THE ABANDONED RAILWAY “PINEROLO – TORRE PELLICE”; ATTRIBUTES: ENVIRONMENTAL FACTORS SUCH AS CREATION OF NEW GREEN AREAS, COMPATIBILITY WITH THE PRESENT LAND USE, DURATION OF THE CONSTRUCTION WORKS AND LANDSCAPE IMPACTS, AND SOCIO-ECONOMIC FACTORS SUCH AS COSTS, NEW JOBS, IMPACTS ON THE TOURISTIC SECTOR, POTENTIAL USERS AND PRESENCE OF ATTRACTIONS)

to supporting the evaluation of priorities and development of plans for subsequent implementation. The combined use of MCDA and facilitated modelling allowed us to take into account behavioural decision science insights when co-constructing the model with the client: e.g. we ensured a balanced and manageable structure of the value tree (Figure 2) to avoid the splitting bias; we started from the identification of values and objectives before defining the alternatives in order to stimulate the identification of better solutions and expand the decision space; and we used a graphical elicitation tool for weights' determination in order to avoid the range insensitivity bias. This prescriptive intervention helped the participants in the process to understand the reasons for and against each alternative project under consideration. The following five alternatives were defined for the project:

- greenway, i.e. the conversion of the 16.5 km of abandoned railway into a green corridor linking several municipalities in a mixed rural and urban area;

- rail-banking, i.e. ordinary maintenance works on the railway tracks to ensure standards of quality, security and efficiency that are compatible with a possible reopening of the railway tracks in the future;

- extension of the urban railway service, i.e. an extension of line 2 of the urban railway service (which has been created with the aim of improving the efficiency of the connections between Turin, the capital of the region, and peripheral cities) to include the municipalities that were served by the railway;

- old station recovery, i.e. the recovery of the old station building in the Municipality of Luserna San Giovanni,



© Author image

FIGURE 3 PARTICIPANTS IN THE FINAL WORKSHOP DISCUSSING REAL-TIME SENSITIVITY ANALYSIS RESULTS

which has been judged as the most strategic one to be recovered for tourist and recreational purposes;

- no action (status quo), i.e. not taking any action and leaving the abandoned railway exposed to natural degradation, structural failures and to the risk of being used as an illegal landfill.

During the decision-making process, we developed and used a visual and interactive preference elicitation protocol by combining tailored visualization tools with well-founded decision modelling techniques. This allowed participants to explore, in a less cognitively demanding way, the trade-offs characterizing the decision-making context under analysis. To avoid the black-box effect associated with decision modelling and algorithms, we also developed a real-time sensitivity analysis in a focus group setting, by combining visualization analytics with a series of simulations on the weights of the attributes considered in the model. The real-time visualization of the results


by the involved experts supported a thorough discussion during the final focus group (Figure 3). This session was particularly appreciated by the participants in the process, as it allowed them to clearly understand which attributes, if weighted more heavily, have the power to change the ranking of alternatives and thus the final recommendation, as well as to share their arguments and knowledge about the likelihood of the different scenarios. The result of the decision-making process was a recommendation towards the regeneration of the disused railway as a greenway connecting the different municipalities. The results also showed that the rail-banking and the no action options were clearly dominated by all the other alternatives and therefore were not worth further consideration. Surprisingly, the no action option was not the least preferred alternative. This is due to the fact that this option performs very well from the economic point of view, because there are no costs associated

with it, and cost is always a very important concern in the field of public goods transformations. A specific consideration has to be made with reference to the “greenway” and “old station recovery” alternatives. Indeed, the results of the process highlight that the old station recovery is the third best performing alternative, with an overall performance very close to that of the first two alternatives. As a consequence, an action plan for the development of both the “greenway” and the “old station recovery” alternatives was envisaged by the actors involved in the process. This demonstrates that considering disused railways and stations as integrated parts of a system can improve the accessibility of networks of environmental, cultural and historical resources.
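The spirit of that real-time sensitivity analysis can be mimicked in a few lines of Python. The alternatives, value scores and weights below are invented for illustration and do not reproduce the Pinerolo - Torre Pellice model:

```python
import numpy as np

# Invented 0-100 value scores on three attribute groups (columns: environmental,
# socio-economic, cost) for three of the alternatives discussed in the text
alternatives = ["greenway", "old station recovery", "no action"]
scores = np.array([[80.0, 70.0, 60.0],
                   [70.0, 75.0, 65.0],
                   [20.0, 10.0, 100.0]])
base_weights = np.array([0.4, 0.4, 0.2])

def ranking(weights):
    overall = scores @ (weights / weights.sum())   # weighted additive value
    return [alternatives[i] for i in np.argsort(-overall)]

# Sweep the weight given to cost and watch for rank reversals
for w_cost in np.linspace(0.0, 0.8, 5):
    w = base_weights.copy()
    w[2] = w_cost                                  # the other weights are renormalised above
    print(f"cost weight {w_cost:.1f}: {ranking(w)}")
```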

STAKEHOLDERS’ FEEDBACK

The methodological framework proposed in this intervention supported the negotiation among different stakeholders towards a solution on how to tackle the functional regeneration of a disused railway in Italy.

real-time sensitivity analysis facilitated the discussion among the different experts participating in the focus group

Dr Andrea Rosa, an independent transport consultant who participated in the decision-making process, provided the following feedback about the MCDA preference elicitation protocol enhanced by the use of graphical visualization tools: “the


traditional Bisection elicitation protocol has indeed a major drawback: it generates questions that are very abstract and thus not easy to answer for participants and real actors who do not necessarily have a background in Multiple Criteria Decision Aiding. The improved elicitation protocol proposed by the authors has the crucial advantage of making the procedure more intuitive and the questions easier to answer, thus facilitating the real application of Multi Attribute Value Theory in other Operational Research contexts and studies”. The Director of the SiTI research institute, Professor Giulio Mondini, appreciated the benefits of the combined visual and decision analytics approach, stating that “the use of an elicitation protocol (for both value functions and weights) enhanced with interactive visualization analytics had the following positive impacts: it brought clarity to underlying complexity, it enabled the insights revealed to be more readily interpretable and actionable, it facilitated intuitive interaction with data visualization and it allowed non-experts in the Decision Analysis field to interact with the data sets and better understand the methodological background”.

Finally, Elena Comino, Professor in Ecology at the Polytechnic of Torino and a member of the committee board for the management of the river basin in the area under analysis (which holds periodic participative meetings with the local authorities and community to make sure that the collective interests and concerns for the area are taken into appropriate account in the management of the land), also commented positively on the protocol tested in this intervention, saying “I believe that the idea of associating visual pictures to the elicitation of the mid-value splitting points in the value function construction has a lot of potential and may contribute to stimulate the use of collaborative decision support processes to deal with other complex public policy problems”. Moreover, Elena also stated “… this real-time sensitivity analysis facilitated the discussion among the different experts participating in the focus group and generated more awareness in each of us, thus stimulating an important group learning effect”.

Valentina Ferretti is a Fellow in Decision Science working in the Department of Management of the London School of Economics and Political Science since 2015. Valentina's research interests focus on exploring behavioural issues in spatial decision making processes and on developing judgement-based analytics for supporting innovative policy design. The impacts of the study presented in this article have been recognized by the INFORMS 2017 Innovative Applications of Analytics Award (https://www.informs.org/Recognizing-Excellence/Community-Prizes/Analytics-Society/Innovative-Applications-in-Analytics-Award).

For more detailed information about the work described, please see: Ferretti V. and Degioanni A. (2017). How to support the design and evaluation of redevelopment projects for disused railways? A methodological proposal and key lessons learned. Transportation Research Part D: Transport and Environment 52: 29-48.

© robertharding / Alamy Stock Photo


JOURNAL OF BUSINESS ANALYTICS The mission of the journal is to serve the rapidly growing and emergent community of business analytics, both in academia and in industry with practitioners. We seek research papers that clearly address a business problem, develop innovative methods/methodologies and use real-world data to show how the problem can be solved.

Editors-in-Chief: Dursun Delen, Oklahoma State University, USA dursun.delen@okstate.edu Sudha Ram, Eller College of Management, USA sram@email.arizona.edu

T&F STEM @tandfSTEM

@tandfengineering

Explore more today: bit.ly/2g4s9YM


MANAGING THE OPERATING ROOM SUITE AT UCLA RONALD REAGAN MEDICAL CENTER

KUMAR RAJARAM AND SANDEEP RATH

Can operational research be used to effectively manage costs at operating room suites of large multi-specialty hospitals? The answer is a resounding yes. A decision support system has been implemented to minimize daily expected resource usage and overtime costs across multiple parallel resources such as anesthesiologists and operating rooms, which are used to conduct a variety of surgical procedures at a large

multi-specialty hospital: the UCLA Ronald Reagan Medical Center (UCLA RRMC), one of the leading hospitals in the U.S. This system successfully incorporates the flexibility in the resources and uncertainty in surgical durations, and explicitly trades off resource usage and overtime costs. It has increased the average daily utilization of the anesthesiologists by 3.5% and of the operating rooms by


© UCLA Health Sciences



© UCLA Operating Services Department

3.8%, leading to average daily cost savings of around 7%, estimated to be $2.2 million on an annual basis. Furthermore, insights based on this model have significantly influenced decision making in the operating services department (OSD).

PROBLEM DESCRIPTION

Surgical procedures are complex tasks requiring the use of several specialized and expensive resources. The OSD is responsible for managing resources used in surgical procedures. Every day, this department assigns to each surgery an operating room, an anesthesiologist, a nursing team and the requisite surgical materials. The department also determines the sequence in which these surgeries will be performed and the scheduled start times. While performing these actions, the department ensures that the cost of the operating room suite is minimized by reducing resource usage and overtime costs. A significant amount of time is devoted to making these resource management decisions. The complexity of these decisions at any large hospital occurs for four primary reasons. First, operating room resources are expensive, and thus in short supply. Second, surgical procedures are often very specialized. Third, the durations of surgical procedures are very difficult to predict. Finally, the scale of large hospitals, in terms of the number of operating rooms, procedures conducted, the number and types of equipment and anesthesiologists used, makes the simultaneous scheduling of multiple resources a computationally challenging task. “Operating services is one of the largest departments at UCLA RRMC” reports Aman Mahajan, executive director of the OSD, “with around


$120 million in annual revenues representing about 10% of this hospital’s revenues. My department serves around 27,000 patients annually by conducting around 2700 types of elective and emergency surgical procedures across 12 specialties”.

This system has increased the average daily utilization of the anesthesiologists by 3.5% and of the operating rooms by 3.8%, leading to average daily cost savings of around 7%, estimated to be $2.2 million on an annual basis

Emergency surgeries are conducted in three dedicated operating rooms with a separate team of anesthesiologists. Since emergency surgeries are separated from elective surgeries and account for only about 15% of revenues, we focused on elective surgeries, which can only be scheduled to start between 7am and 3pm. To perform these surgeries, OSD uses 23 Operating Rooms (ORs), which are scheduled and staffed simultaneously and shared by the 12 specialties. Some specialties (Plastics,

ENT, Urology, Eyes, General) can use any of the ORs, but for others (Vascular, Neuro, Liver, Thoracic, Cardiac, Trauma, Pediatric) three or fewer rooms are available. There are fixed costs for opening an operating room each day: initial cleaning and equipment setup costs, along with daily nurse and technician staffing costs, whose assignments do not depend on specialty. In addition, overtime costs are incurred for nurses and technicians if the rooms are required to be open beyond 3pm. The 92 anesthesiologists at UCLA RRMC for the 12 specialties are assigned according to the specialty required for the surgery. There are three shifts of equal duration, with regular working hours of eight hours. They can only be assigned to surgeries that begin during their shift. Overtime costs for anesthesiologists are incurred if surgeries in progress exceed the duration of the shift. A certain number of anesthesiologists who are not scheduled to work on a given day are asked to be on standby, so that they can be called to work if necessary. However, when they are assigned from call, there are significant costs for using this option, although they do not incur overtime costs. The anesthesiologists on call are informed of their status the previous day and assigned surgeries that day as required.


It is important to note that in the context of large multi-specialty hospitals such as UCLA RRMC, surgeons are not part of OSD. They are usually from the independently administered specialty departments at the hospital and sometimes from other hospitals. The surgeons bring their patients and use OSD as a service provider, so OSD does not have the option of assigning surgeons to patients. Each surgery-surgeon combination is therefore already set and is considered as a unit.

SCHEDULING SURGERIES

Typically, a request to schedule a surgery is initiated by the surgeon on behalf of the patient with general admissions at the hospital. This request is assigned a date based on the earliest availability in the block reservations for the particular specialty. Once all the elective surgery requests have been received the day before the surgery, OSD decides which operating room to open, finalizes assignment of these rooms and anesthesiologist to surgeries, determines start times of surgeries and effectively specifies the sequence of all the surgeries. These decisions are made the previous day for all the surgeries that need to be conducted in the following day. Consequently, the planning horizon

is a single day. The current planning process to make these decisions uses an experience-based practitioner's heuristic consisting of the following steps.

Step 1: Assign surgeries to operating rooms in sequential fashion, in order of the start times requested by the surgeons, by surgery specialty and duration estimates from the surgeons, until the last surgery in the room can start before the end of the shift for the operating room.

Step 2: Assign one anesthesiologist to each room such that the anesthesiologist can perform most of the surgeries in the room.

Step 3: Assign a few anesthesiologists to surgeries across rooms to ensure all surgeries have been assigned an anesthesiologist by specialty.

Step 4: If this plan cannot be implemented by anesthesiologists on regular duty, assign anesthesiologists from call.

While this practitioner's heuristic is easy to understand and implement, it does not consider two important aspects. First, it does not explicitly consider uncertainty in surgical durations. Second, it does not directly consider that most anesthesiologists and operating rooms can handle more than one specialty. Thus, it does not exploit the flexibility in these resources.
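A compact sketch of Step 1 of this greedy logic is shown below. The data structures, field names and the shift-end rule are simplifying assumptions for illustration only; they are not the hospital's actual scheduling code:

```python
from dataclasses import dataclass, field

OPEN, SHIFT_END = 7 * 60, 15 * 60   # elective starts allowed between 7am and 3pm (minutes)

@dataclass
class Room:
    name: str
    specialties: set                 # specialties this operating room can host
    surgeries: list = field(default_factory=list)
    next_free: int = OPEN

def assign_rooms(surgeries, rooms):
    """Step 1 of the practitioner's heuristic (sketch): take surgeries in order of the
    requested start times and place each in the first compatible room, advancing the
    room by the surgeon's duration estimate while a start before 3pm is still possible.
    Steps 2-4 (staffing rooms with anesthesiologists, then drawing on call) would
    follow the same greedy pattern and are omitted here."""
    for s in sorted(surgeries, key=lambda s: s["requested_start"]):
        for room in rooms:
            if s["specialty"] in room.specialties and room.next_free <= SHIFT_END:
                room.surgeries.append(s["id"])
                room.next_free += s["estimated_duration"]
                break
    return rooms

rooms = [Room("OR-1", {"General", "Urology"}), Room("OR-2", {"Cardiac"})]
surgeries = [
    {"id": 1, "specialty": "General", "requested_start": OPEN, "estimated_duration": 120},
    {"id": 2, "specialty": "Cardiac", "requested_start": OPEN, "estimated_duration": 240},
]
print([(r.name, r.surgeries) for r in assign_rooms(surgeries, rooms)])
```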

Inefficient assignment and scheduling of anesthesiologists and operating rooms to surgeries leads to low utilization (the fraction of the available shift time that is used by a particular resource) and overtime. Average daily utilization across the anesthesiologists is close to 0.75, with around 25% of days having an average daily utilization of less than 0.70. However, despite these lower levels of utilization, the average number of anesthesiologists on call is around 6 per day. Similarly, for operating rooms the average daily utilization is close to 78%, but the average overtime per day is around 18 hours. Taken together, average call and overtime costs for anesthesiologists and rooms at this department are about 33% of revenues. A more effective optimization-based assignment and scheduling process that considers uncertainty in surgical durations and flexibility in the resources could potentially reduce overtime and on-call costs. To achieve these objectives, we developed a decision support system at the UCLA medical center. The core of this system was a large-scale, two-stage mixed-integer stochastic dynamic program with recourse. The first stage allocates these resources across multiple surgeries with uncertain durations, and prescribes the sequence of surgeries on these resources. Assuming that each surgery should be scheduled as early as possible, this consequently provides a scheduled start time for each surgery. The second stage determines the actual start times of surgeries based on the realized durations of preceding surgeries, and assigns overtime to resources to ensure all surgeries are completed using the allocation and sequence determined in the first stage. The size and complexity of the problem precluded a solution using conventional methods.

IMPACT | AUTUMN 2017

33


SUMMARY OF RESULTS BEFORE AND AFTER IMPLEMENTATION OF THE DECISION SUPPORT SYSTEM

ATTRIBUTES | BEFORE | AFTER | % REDUCTION
Average number of anesthesiologists on call per day | 6.0 | 5.6 | 6.7
Average overtime per day for anesthesiologists (hours) | 18.2 | 17.5 | 3.7
Average daily utilization of anesthesiologists (%) | 75 | 77.6 | -3.5
Average number of operating rooms used per day | 20.4 | 18.6 | 8.6
Average overtime per day for operating rooms (hours) | 18.5 | 18 | 2.7
Average utilization of operating rooms per day (%) | 78 | 81 | -3.8
Average daily operating room costs ($) | 57,350 | 52,417 | 8.6
Average daily overtime costs ($) | 22,375 | 21,754 | 2.8
Average daily call costs ($) | 7,145 | 6,527 | 8.5
Average total daily costs ($) | 86,870 | 80,729 | 7.1

Therefore, the problem was solved using a data-driven robust optimization approach that solves large-scale, real-sized versions of this model close to optimality. (For details on the model formulation, properties and solution techniques see Rath, S., K. Rajaram and A. Mahajan (2017). Integrated Anesthesiologist and Room Scheduling for Surgeries: Methodology and Application. Operations Research 65: 1460-1478.)
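In schematic form, with notation invented here rather than taken from the paper, the two-stage structure can be written as a first stage that opens rooms, assigns and sequences surgeries, and a recourse stage that absorbs the realised durations through delayed start times and overtime:

```latex
% Schematic two-stage structure (notation assumed for illustration, not from the paper)
\begin{align*}
\min_{x \in X}\;\; & c^{\top}x \;+\; \mathbb{E}_{\tilde d}\bigl[\,Q(x,\tilde d)\,\bigr]
  && \text{first stage: room opening, assignment and sequencing decisions } x\\
Q(x,d) \;=\; \min_{s,\,o\ge 0}\;\; & \sum_{r} c^{\mathrm{OT}}_{r}\, o_{r}
  && \text{second stage: overtime under realised durations } d\\
\text{s.t.}\;\; & s_{k'} \;\ge\; s_{k} + d_{k}
  && \text{if surgery } k' \text{ follows } k \text{ in the same room under } x\\
& o_{r} \;\ge\; s_{\ell(r)} + d_{\ell(r)} - T
  && \ell(r)\text{ the last surgery in room } r,\; T \text{ the end of regular hours.}
\end{align*}
```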

RESULTS

The results before and after the implementation of the model are summarized in the Table, which shows, for instance, that the average number of anesthesiologists on call decreased by 6.7%, and average overtime hours for the anesthesiologists on regular duty reduced by 3.7%. The improvements in these and all the other metrics reduced average daily operating room costs by 8.6%, average daily overtime costs by 2.7%, and average daily call costs by 8.5%. This translates to an overall average daily cost saving of 7%, estimated to be $2.2 million on an annual basis. There are two main reasons for these improvements. First, the model was



more effective at utilizing the flexibility in the resources. Most anesthesiologists and operating rooms can handle more than one specialty, typically a primary and a secondary specialty. The system identified these operating room/anesthesiologist combinations and allocated surgeries across these different specialties to them. This led to better usage of resources than the previous approach, in which surgeries from a single specialty were assigned to an operating room and anesthesiologist as far as possible. A surgery of a different specialty was assigned to an operating room only when there was a high volume of surgeries on a particular day, and this was often done without explicit consideration of the allotted anesthesiologist's specialty. This frequently required a separate anesthesiologist to perform these surgeries, often assigned from call, which was costlier. Second, this system explicitly considered uncertainty in surgical durations while determining the daily schedule of an operating room. By using an estimation method, it determined which surgeries could

be longer and more uncertain, and which surgeries could be shorter and more certain. It then combined long uncertain surgeries with short certain surgeries to effectively utilize gaps in the schedule in each operating room. This in turn reduced the number of operating rooms each day with the resulting cost reduction being more than any potential increases in overtime costs, thus reducing total costs. In contrast, the previous approach used surgeons’ predictions of surgery durations. To compensate for the errors in these predictions, planners often underutilized operating rooms by leaving sufficient gaps between surgeries. This was done as they did not want to create delays from scheduled start times of succeeding surgeries and incur overtime costs. However, this often led to a larger number of operating rooms being used each day and consequently higher total costs. Finally, the impact of the schedules generated by this system on the surgeons was evaluated. While surgeons were not part of OSD, they are a critical component in the surgical process. First, we found that the average idle time between surgeries was 8 minutes shorter, but the surgeons did not find this reduction significant enough to be disruptive, and in fact some of them preferred this as it made their schedule more efficient. Second, there was only a marginal increase, from 1.54 to 1.57, in the average number of surgeons per OR per day. This suggests that most of the benefits of this system came from making the correct assignment of operating rooms and anesthesiologist to surgeries. Both these aspects were important to verify that the surgeons were not inconvenienced by the model based approach.


MANAGERIAL INSIGHTS

The decision support system generated several managerial insights. First, various scenarios were simulated to consider the impact of reducing variability in surgical durations. The results showed that, rather than investing in capital-intensive medical equipment to achieve radical reductions in the variability of surgical durations, the major cost benefits can be gained by focusing on incremental reductions in variability. Potentially, this can be achieved by more detailed data collection for improved predictive analytics, and by better procedures, such as checklists, improved information technology, following the correct sequence of tasks and standardized operating protocols derived from best practices. Second, this system was used to consider the impact of allowing surgeries to start in operating rooms after 3pm but before the end of the anesthesiologists’ late shift at 7pm. This would require additional fixed technician and nurse staffing costs. Such extensions can be considered if surgical demand on any day is significantly larger than average daily surgical demand. The results suggest that it is beneficial to allow such extensions and that the number of operating rooms used depends on the level of demand. This analysis helps management understand how best to react to different levels of daily surgical demand and also estimate the corresponding changes in total costs. Finally, the benefit of increasing the cross-functionality of the operating rooms was examined. To do so, the system was used to calculate the potential reduction in costs if the number of operating rooms available for each specialty is increased. Such increases can be achieved by investing in special equipment to convert general

surgery operating rooms to have the cross-functionality to accommodate a particular specialty. “The results show”, says Mahajan, “that increasing the number of operating rooms that could be used for a particular specialty can lead to a significant reduction in total resource usage and overtime costs, and these benefits are often more pronounced in certain specialties.” This analysis makes it possible to determine which specialties should be prioritized when additional operating rooms are made available. An additional advantage of making operating rooms cross-functional is that a higher number of daily surgeries can be accommodated more effectively without conducting new surgeries after 3pm. In particular, it was estimated that this approach led to at least an additional 5% reduction from the lowest costs attainable for all the demand scenarios considered. This provides further justification for management to make more operating rooms cross-functional.
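As a rough illustration of the first insight, the planning buffer, and hence the number of rooms that must be opened, shrinks as the pooled variability of the day's workload is incrementally reduced. The loop below uses invented figures (40 OR-hours of expected demand, 8-hour shifts, a 90th-percentile buffer) purely to show the shape of that what-if calculation, not OSD data.

```python
import math

Z = 1.28              # ~90th-percentile planning buffer (assumed)
MEAN_OR_HOURS = 40.0  # expected daily surgical workload in OR-hours (illustrative)
SHIFT_HOURS = 8.0     # regular shift length per room (assumed)

# What-if loop: planned capacity needed as day-level variability is reduced.
for std in (8.0, 6.0, 4.0, 2.0):
    planned = MEAN_OR_HOURS + Z * std
    rooms_needed = math.ceil(planned / SHIFT_HOURS)
    print(f"pooled std {std:>4.1f}h -> plan {planned:5.1f} OR-hours, {rooms_needed} rooms")
```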

The use of the model resulted in considerable cost savings, and has encouraged the management to investigate other problems in this department using a structured and rigorous approach by employing operational research-based methodologies

ORGANIZATIONAL IMPACT

This work has demonstrated the value of a model-based approach and analytical methods in dealing with complexity. The organizational impact of this work has been significant. The use of the model

resulted in considerable cost savings, and has encouraged the management to investigate other problems in this department using a structured and rigorous approach by employing operational research-based methodologies. The managerial insights generated from our model have also contributed to the organizational impact. Our analysis of the effect of variance reduction provided management with further impetus to implement six sigma programs to reduce variability at OSD. It also provides management with clear guidance on when to start new surgeries after the day shift and in how many rooms, allowing mitigation of the impact of varying levels of daily surgical demand on costs. Finally, this work showed the benefits of making some operating rooms cross-functional and how to prioritize implementation among the specialties. Furthermore, it was demonstrated that this could potentially be a very effective way to accommodate changes in daily surgical demand. The methodology has had a major economic and organizational impact at UCLA RRMC’s OSD. “The OSD expects to maintain the described gains and to increase them continuously several years into the future”, says Mahajan. “Furthermore, this system is general enough to be adapted to other multi-specialty hospitals and similar benefits can be expected to accrue”. Kumar Rajaram (kumar.rajaram@anderson.ucla.edu) is professor and the Ho-Su Wu Chair in management at the UCLA Anderson School of Management, Los Angeles, and Sandeep Rath (sandeep@unc.edu) is assistant professor of operations at the Kenan-Flagler Business School, University of North Carolina at Chapel Hill.



© Dwr Cymru Welsh Water

DATA SCIENCE AND OPERATIONAL RESEARCH AT DŴR CYMRU WELSH WATER

KEVIN PARRY


DŴR CYMRU WELSH WATER (DCWW) is the sixth largest water and sewerage company in England and Wales, providing water and wastewater services to over three million customers. The company is owned and managed by Glas Cymru and, uniquely in the UK water and sewerage sector, operates as a ‘not-for-profit’ organisation with no shareholders.

DATA SCIENCE TEAM

The Data Science Team at DCWW has been in place since January 2015. The team supports organisational objectives, which centre largely on ensuring customers receive the best level of customer service at all times. This is achieved by working alongside colleagues across all business areas to understand how solutions to operational and business challenges can be identified through the use of operational research, statistics, and computational methods. The Data Science Team currently consists of three Statisticians, a Data Engineer, a Business Analyst, and two Graduate Analysts. Although still small in numbers, the growth of the team over the past few years reflects a growing interest in data science from across the organisation, with business areas recognising how the application of data science could help them achieve their targets and goals. The team are based at the organisation’s Cardiff office, a busy and lively office setting that promotes a collaborative and engaging working atmosphere. The team also engages in collaborative activities with external organisations. The organisation can be categorised into four main business areas – Water Services, Wastewater Services, Retail and Customer Strategy, and Support Services. Each manages and maintains a distinct part of our operating mandate.

WATER SERVICES

The Water Services business area largely focuses on ‘source to tap’ operational activity. In other words, it covers the maintenance of services from the point of raw water abstraction to potable distribution to our customers in their homes. The business area includes activities relating to water quality sampling, leakage and demand analysis. The asset base managed by the Water Services business area includes 66 impounding reservoirs, such as Llandegfedd, near Pontypool, South Wales, as seen at the head of this article, and around 350 service reservoirs. The organisation is externally regulated to ensure compliance with regulatory standards. To ensure that water provided to customers is wholesome and safe, water samples are taken across the network. Samples are then analysed to ensure they conform to set microbiological parameter limits.

The dashboard tool has been in operation within the organisation for a year and is helping operational teams manage risk by drawing attention to those service reservoirs requiring it the most

One particular example of where the Data Science Team has been able to support the business area, through the use of predictive analytics, is the Service Reservoir Bacteriological Compliance project. Here, factors suggested as contributing to the likelihood of coliform bacteria developing were explored and statistically tested. Risk factors are identified and flagged to the business so that actions can

be taken by the operational teams to address issues before they materialise as a regulatory failure. The Data Science Team undertook statistical analysis of the contributory factors and produced a capable predictive model. The team worked closely with the clean operations teams to outline the findings in a clear and understandable way, and collaboratively designed a Tableau dashboard tool that utilises the predictive model outputs. The dashboard tool has been in operation within the organisation for a year and is helping operational teams manage risk by alerting them to those service reservoirs requiring the most attention. This pinpointed focus also means that the operations teams do not need to spend as much time attempting to identify the high risk service reservoirs themselves, and can therefore spend valuable time on site carrying out appropriate remedial






actions that will reduce the risk indicated by the predictive model.
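The article does not say which statistical model or risk factors were used. Purely as an illustration of the general workflow, the sketch below fits a logistic regression to synthetic service reservoir data — the features (asset age, days since inspection, water temperature, turbidity) are hypothetical — and ranks sites by predicted risk, which is the kind of output a dashboard can then surface to operational teams.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for service reservoir records (not DCWW data).
rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.uniform(5, 80, n),     # asset age (years) - assumed feature
    rng.uniform(0, 365, n),    # days since last inspection - assumed feature
    rng.uniform(4, 20, n),     # water temperature (deg C) - assumed feature
    rng.exponential(1.0, n),   # turbidity (NTU) - assumed feature
])
# Purely synthetic failure mechanism so the example is self-contained.
logit = -4 + 0.02 * X[:, 0] + 0.004 * X[:, 1] + 0.1 * X[:, 2] + 0.5 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank sites by predicted non-compliance risk so visits can be prioritised.
risk = model.predict_proba(X_test)[:, 1]
print("AUC on held-out data:", round(roc_auc_score(y_test, risk), 2))
print("Highest-risk sites (test rows):", np.argsort(risk)[::-1][:5])
```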

WASTEWATER SERVICES

Our Wastewater Services team look after the sewerage network within the organisation and ensure that water is safely treated after it has been used.

The development of the sludge simulation model will allow the business area to evaluate the effectiveness of implementing certain scenarios, at no cost / impact to the real-world process

A recent operational research project centred on the development of a discrete event simulation model that mirrors the sludge treatment process at our wastewater treatment works. Sludge, a by-product of the water treatment process, has properties that allow it to power certain components of the overall treatment process,


bringing both cost and energy efficiencies. Certain components within the process are dependent on other components, meaning the overall process is fairly complex and involved, and so the organisation has seen the value of creating a discrete event simulation model of this context. The development of the sludge simulation model will allow the business area to evaluate the effectiveness of implementing certain scenarios, at no cost or impact to the real-world process. Involving other operational research techniques, such as optimisation, will allow the business area to understand how the treatment process can be run in a way that provides maximum benefit at minimal cost.
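The structure of DCWW's simulation is not described in the article, but a discrete event simulation of a treatment line can be prototyped in a few lines of Python with the SimPy library. The two stages, capacities and processing times below are placeholders, not real process data; scenario testing then amounts to changing a capacity or a time and re-running the model, with no impact on the real-world process.

```python
import simpy

# Toy discrete-event model of a two-stage sludge line (thickening -> digestion).
THICKENING_HOURS = 2.0    # assumed processing time per batch
DIGESTION_HOURS = 6.0     # assumed processing time per batch
ARRIVAL_GAP_HOURS = 1.5   # assumed interval between incoming sludge batches

def sludge_batch(env, name, thickener, digester, log):
    with thickener.request() as req:         # queue for the thickener
        yield req
        yield env.timeout(THICKENING_HOURS)
    with digester.request() as req:          # then queue for a digester
        yield req
        yield env.timeout(DIGESTION_HOURS)
    log.append((name, env.now))              # record completion time

def arrivals(env, thickener, digester, log):
    i = 0
    while True:
        i += 1
        env.process(sludge_batch(env, f"batch-{i}", thickener, digester, log))
        yield env.timeout(ARRIVAL_GAP_HOURS)

env = simpy.Environment()
thickener = simpy.Resource(env, capacity=1)
digester = simpy.Resource(env, capacity=2)
completed = []
env.process(arrivals(env, thickener, digester, completed))
env.run(until=48)  # simulate two days of operation

print(f"Batches completed in 48h: {len(completed)}")
```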

RETAIL AND CUSTOMER STRATEGY

The Data Science Team also works closely with the Retail and Customer Strategy business areas, which encompass the customer contact centre, billing and customer strategy teams. The team are currently supporting the business area with the development

of two predictive model projects. The first looks to identify the factors that influence a customer’s propensity to complain. The objective is that high-risk customers can be proactively contacted first, removing the need for them to submit a complaint. The second predictive model project centres on propensity to pay. Both projects utilise statistical analysis methods and techniques that include customer segmentation and predictive modelling. Understanding customer sentiment (a measure of how our customers feel about our organisation) is important, particularly in relation to activities we undertake (such as repairs and remedial work), and the Customer Sentiment Dashboard developed by the team has provided the Customer Strategy team, and the wider business, with visibility of company-wide sentiment. This gives us visibility of areas or issues that are impacting our customers, and enables us to respond to these issues proactively and quickly.
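Again, the modelling details are not given in the article. The sketch below is one plausible, simplified pattern: segment synthetic customers with k-means and use each segment's historical complaint rate as a crude propensity score for prioritising proactive contact. The features and figures are invented.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for customer records; features are assumptions for the sketch.
rng = np.random.default_rng(1)
n = 1000
customers = np.column_stack([
    rng.poisson(2, n),        # contact-centre contacts in last 12 months
    rng.poisson(1, n),        # supply interruptions experienced
    rng.normal(450, 80, n),   # annual bill (GBP)
])
complained = rng.binomial(1, np.clip(0.05 + 0.06 * customers[:, 1], 0, 1))

# Segment customers, then treat each segment's complaint rate as a propensity score.
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(customers))
for seg in range(4):
    mask = segments == seg
    print(f"Segment {seg}: {mask.sum()} customers, "
          f"complaint rate {complained[mask].mean():.1%}")
```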

The team have been able to replace numerous manual processes with automated solutions that have brought substantial savings in time and effort

SUPPORT SERVICES

The Support Services business area underpins activities undertaken by the Water and Wastewater business areas. The Smart Hub, for example, is the organisation’s control centre that manages and responds to telemetry alarms across the operating area. The team have worked closely with the Smart Hub on several projects, particularly focusing on initiatives that replace extensive manual processes with automation. Using programming, and the development of programmatic processes, the team have been able to replace numerous manual processes with automated solutions that have brought substantial savings in time and effort. This has enabled the Smart Hub staff to spend more time focusing on emerging issues, enabling a quicker response and bringing obvious benefits to the customer.


CARDIFF UNIVERSITY SCHOOL OF MATHEMATICS

The team have a close working relationship with staff and students of the School of Mathematics at Cardiff University. Several of the team members have been awarded undergraduate and postgraduate qualifications from the University, and the team have participated in several activities as part of the MSc Operational Research and Applied Statistics programme. In 2016, the Data Science Team, in conjunction with colleagues from the Retail business area, facilitated a Case Study event at the University. The Case Study Day, a component of the MSc Operational Research and Applied Statistics course, involved providing a real-life business problem for students to address using statistical and operational research methods. Students were given the opportunity to present their analysis and findings to the Data Science Team and Retail colleagues at the end of the day. The day was just as beneficial to the organisation as it was to the School of Mathematics, bringing new insight and a fresh perspective to addressing business problems that the organisation can absorb and consider. Paul Harper, Professor of Operational Research at Cardiff University, says: “Our collaboration with the Data Science Team at DCWW is highly beneficial to both organisations. In particular our students gain immensely from the opportunities to interact with DCWW and apply their O.R. and data science skills to real-life problems, and likewise DCWW gain as a local employer of our graduates”.

The Data Science team strive to provide the organisation with a first-class data science and analytical service, principally through the design, development and implementation of valuable and useful solutions

DATA SCIENCE GRADUATE SCHEME

The developing recognition of the benefits that operational research and statistics can bring has led to the organisation extending its existing graduate programme to include Data Science Graduate roles over the past two years. The Data Science Graduates, based within the team for a two year period, will develop their operational research understanding through the delivery of key projects prioritised by the business. They also gain understanding and experience using data science tools, such as R, and how to tackle projects using an effective analytical workflow. They are also provided with opportunities to communicate the

findings of their analysis to relevant stakeholders within the organisation. Particularly pertinent for operational research projects, the translation of often complex findings into clear and understandable language that business area colleagues are able to interpret and use is a key skill, and the Graduates are encouraged to discover and incorporate innovative storytelling methods (such as new data visualisations) for communicating their findings effectively.

FUTURE PROJECTS

A number of exciting new operational research projects lie on the horizon. Optimisation techniques will shortly be applied within the water services context to understand the optimal place to locate water tankers, vehicles that hold and can distribute drinking water to customers should the network supply be interrupted unexpectedly. There will also be an enhanced focus on applying operational research techniques to our wastewater treatment processes, looking at specific components of the process to understand whether they can be operated slightly differently in order to achieve maximum efficiency and value. The Data Science team strive to provide the organisation with a first-class data science and analytical service, principally through the design, development and implementation of valuable and useful solutions. Kevin Parry is Principal Statistician within the Data Science Team at Dŵr Cymru Welsh Water.





PROTECTING BRITAIN’S BIODIVERSITY WITH GEOSPATIAL ANALYSIS

DOMINIC STUBBINS

©National Trust Images/Trevor Ray Hart


THE BIODIVERSITY OF THE BRITISH ISLES is no longer something that we can take for granted. Recent research published in the State of Nature 2016 report (compiled by a coalition of over 50 leading wildlife charities, including the National Trust: https://ww2.rspb.org.uk/our-work/stateofnature2016/) revealed that around 56% of species in the UK have decreased in number over the last 50 years and, unless direct action is taken, this figure is likely to increase. Hedgehogs, natterjack toads, hen harriers and the beautiful flowering crested cow-wheat are among over 1,200 species of native animals, birds and plants that are at serious risk of becoming endangered.

The reasons behind the decline in Britain’s biodiversity are well known. Increases in the intensity of agriculture have disrupted the natural propagation of corn marigolds; the draining of bogs has damaged the habitat of the large marsh grasshopper; climate change has constricted the range of the mountain ringlet butterfly; and urban development has put pressure on species such as water voles. What is less clear, however, is what individuals, organisations and public bodies can do to stem this decline and, critically, how. One organisation that has taken on the crusade with conviction is the National Trust. Responsible for an estate of over 250,000 hectares in England, Wales and Northern



Ireland, the charity felt that it could play a greater role in protecting and nurturing biodiversity in the UK. It therefore set out to implement new land management approaches to help restore wildlife, as part of a new strategy called ‘Playing Our Part’. However, the National Trust faced a number of familiar operational research challenges that threatened to hamper the achievement of its vision.

A COMPREHENSIVE UNDERSTANDING OF BIODIVERSITY

First and foremost, the National Trust needed to gain a comprehensive understanding of biodiversity on its land. It had a vast amount of habitats data from surveys undertaken at specific properties, and at regional level, over a period of more than 30 years - but this data was spread across multiple systems, in multiple formats, making it difficult to gain a single view. Furthermore, the National Trust recognised that there were gaps in its data coverage, so it would need a fast and accurate way to collect and share more habitats data over time. Most importantly, the organisation needed to be able to analyse and interpret its data, to help it make the right decisions and instigate conservation schemes that would have a direct and measurable impact on biodiversity. The solution to all these challenges came in the form of a geographic information system (GIS) from Esri UK, a technology that enables people to visualise, analyse and interpret data in a spatial way to understand patterns and trends. Using Esri’s ArcGIS Server solution, the National Trust was able to consolidate all of its data to create a single, centralised source of habitats information, for the first time in the organisation’s 120-year history. Better still, the organisation gained a way to visualise, interrogate and query this data to really understand the issues impacting the decline in Britain’s biodiversity.

FIRM EVIDENCE FOR PRIORITISING CONSERVATION SCHEMES

Now a pivotal system at the National Trust, the centralised habitats database gives the organisation a firm evidence base for identifying, planning and prioritising conservation schemes. Employees can analyse the data in the habitats database, using Esri’s ArcGIS Desktop solution, and identify the best opportunities to make habitats bigger, better and more joined up. The trust can identify opportunities for habitat enhancement, working in partnership with other landowners beyond the trust’s boundaries. In short, GIS provides the trust with clear insight that helps it understand where the best opportunities for nature are, so that it can then focus its resources in the right locations and work with the most appropriate partners to have the best possible impact on endangered species. By using ArcGIS to perform detailed habitats analysis, the National Trust is also gaining a far greater understanding of factors like land-use and climate change that may pose a serious, long-term risk for threatened species. It has, for example, modelled the impact of higher sea levels on its 775 miles of coastline using ArcGIS. It is now applying the intelligence it has gained to identify high-nature coastal habitats that may become changed, understand the implications for species and assess opportunities for coastal habitats to move inland.
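The National Trust's analysis is done in Esri's ArcGIS. As a rough open-source illustration of the same kind of spatial query, the sketch below uses the GeoPandas library to buffer a priority habitat and find nearby land parcels, candidate sites for making habitats "bigger, better and more joined up". The geometries and the 500 m threshold are invented for demonstration.

```python
import geopandas as gpd
from shapely.geometry import Polygon

# Invented habitat and parcel geometries, in British National Grid (metres).
habitats = gpd.GeoDataFrame(
    {"habitat": ["lowland meadow"]},
    geometry=[Polygon([(0, 0), (1000, 0), (1000, 800), (0, 800)])],
    crs="EPSG:27700",
)
parcels = gpd.GeoDataFrame(
    {"parcel": ["A", "B", "C"]},
    geometry=[
        Polygon([(1100, 0), (1600, 0), (1600, 500), (1100, 500)]),          # ~100 m away
        Polygon([(5000, 5000), (5500, 5000), (5500, 5500), (5000, 5500)]),  # far away
        Polygon([(0, 900), (800, 900), (800, 1300), (0, 1300)]),            # ~100 m away
    ],
    crs="EPSG:27700",
)

# Buffer existing priority habitat by 500 m and find parcels inside the buffer:
# candidate sites for extending or joining up habitat.
search_zone = habitats.copy()
search_zone["geometry"] = habitats.geometry.buffer(500)
candidates = gpd.sjoin(parcels, search_zone, predicate="intersects")
print(candidates[["parcel", "habitat"]])
```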

Rangers working on the land can collect habitat and biodiversity data in the field on smartphones, to monitor the evolving condition of land, water quality and habitats

EFFECTIVE IMPLEMENTATION AND MONITORING OF PROJECTS

GIS also plays a key role during the implementation of habitat restoration programmes and in monitoring the success of the trust’s interventions. For instance, in the large tracts of degraded Blanket Bog and Upland Heathlands in the Peak District, the National Trust is using ArcGIS at the desktop and in the field to locate and block damaging drainage channels and conduct vegetation surveys. This is a particularly significant project because the organisation is responsible for 28% of England’s entire priority habitats in the uplands, and the bog mosses, which decompose to make the carbon sink that is peat, are beginning to recover as a direct result. Furthermore, the National Trust is now beginning to use Esri’s Collector App for ArcGIS to verify, expand and enhance its centralised habitats database. Rangers working on the land can collect habitat and biodiversity data in the field on smartphones, to monitor the evolving condition of land, water quality and habitats as projects on the ground progress over time. All data collected in the field is synchronised with ArcGIS Server, allowing the National Trust to constantly put better information back into its central systems.


©National Trust Images/Jonathan Buckley



CLEAR COMMUNICATION OF CONSERVATION GOALS

Already, the National Trust has proven, through its efforts, that organisations can take appropriate, focused and effective steps towards improving Britain’s biodiversity. Yet, for there to be a significant halt


in the decline in so many precious species, it will take the efforts of all organisations, public sector bodies and indeed all individuals to make a real difference. So, that’s why good communication is critical.

ArcGIS is a pivotal tool that is helping the National Trust to identify priority habitats and then intervene appropriately to enrich and extend the biodiversity of the land

In the future, the National Trust plans to make use of Esri’s internet-based GIS platform, ArcGIS Online, to educate the general public about the threats to biodiversity and the vital importance of its conservation activities. Combining captivating images, interactive maps and persuasive text, Story Maps are a clear and powerful way for organisations to convey information about research projects or activities to a wide audience. They will be particularly effective in demonstrating the success of the National Trust’s current

initiatives and promoting the need for habitat conservation right across the country. “ArcGIS is a pivotal tool that is helping the National Trust to identify priority habitats and then intervene appropriately to enrich and extend the biodiversity of the land”, says Huw Davies, Head of Conservation Information at the National Trust. In so many ways, GIS technology is helping the National Trust to achieve its goals in what is an excellent example of a successful, operational research-led project. From the consolidation of resources, the collection of data in the field and the analysis of habitat threats to the implementation of conservation schemes and the communication of results, GIS has proven itself as a key weapon in the battle to save Britain’s most endangered species. Dominic Stubbins is Chief Architect at Esri UK with extensive experience in delivering geospatial solutions for government, defence, utilities and the commercial sector. With more than 20 years’ experience, Dominic is passionate about using geographic analysis and data science to help organisations understand and manage the world in which they operate.

©National Trust Images/Chris Lacey



© Meibion / Alamy Stock Photo

USING HEALTH SYSTEMS ANALYTICS TO HELP THE NHS

MARION PENN, RUDABEH MESKARIAN AND THOMAS MONKS

IN HAMPSHIRE, LOCAL AUTHORITIES AND SOLENT NHS TRUST are making use of advanced Health Systems Analytics to visualise their demand, and support their decisions about how many sexual health clinics should be provided to meet future patient need. Rob Carroll, Public Health Manager at Hampshire County Council said: “The advanced Health System Analytics work undertaken by the NIHR CLAHRC (National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care) Wessex Data

Science Hub was a key element of our transformation project to review and redesign local sexual health services in response to reducing financial resources and increasing demand. The analysis gave us a detailed understanding of how far our residents were travelling to access different services within existing clinics and identified inequities in service access in some parts of the county which we have now been able to address. The analysis also enabled us to forecast future demand for services and to model the impact on footfall and operational capacity of introducing alternative models of service delivery,



including home-sampling kits for STI testing. As a result of the analysis we have a much better understanding of current and future demand for sexual health services and the optimal location of clinics to provide equitable access. This knowledge has informed our recent procurement requirements and provided us with the information and confidence to reduce the number of clinics we require to meet the evolving needs of our population, providing savings in relation to estates, workforce efficiency and travel costs across the local system.”

The advanced Health System Analytics work was a key element of our transformation project to review and redesign local sexual health services in response to reducing financial resources and increasing demand

THE LOGISTICS AND OPERATIONS OF SEXUAL HEALTH SERVICES

Sexual health care in Hampshire is provided through a network of hub and spoke clinics. The hubs are large clinics that are located within the urban centres. The spokes are smaller clinics located in the more rural areas of Hampshire. Journey times to urban centres vary substantially across Hampshire. This can make life difficult for patients, particularly as young adults are often the ones with sexual health concerns. They may have limited access to transport or may be put off from making a long journey to receive treatment or advice. The spokes aim to increase the equity of access to services for patients who live



outside of the major cities. The more spokes that are in operation the more likely it is that a patient lives near a clinic. Of course, there are also facility, workforce and travel costs to consider. The more clinics there are, the greater the costs. Large networks of services are also incredibly difficult to manage. A patient who walks to a clinic in the West of Hampshire should receive the same standards of care and access to treatment as patients in the North East. In early 2016, local authorities and Solent NHS Trust in Hampshire faced the difficult task of reviewing the provision of sexual health care. A balance was needed between equity of access, quality of care and costs. The local authorities and the NHS teamed up with Health System Analytics experts from the University of Southampton to explore how to meet these aims. The team were part of the NIHR’s CLAHRC Wessex. The NIHR’s CLAHRC programme is a five year £10 million research programme in health funded until 2018. The Wessex region has developed an

Analytics team to support the NHS in complex issues regarding operations and logistics that affect patient care. The team has experience in analytics for improving the quality and reducing the cost of patient care.

THE UNTAPPED POTENTIAL OF HEALTH SYSTEMS ANALYTICS

Health systems routinely collect a wealth of data about patient usage of services. Records are often kept on patient demographics, referrals, appointments, diagnoses, prescriptions, procedures performed, time spent in hospitals and follow-up actions. Given financial and time pressures, the NHS rarely exploit such data to their full potential. Analytics offers a suite of easy to use tools and solutions for data ‘wrangling’ and exploring new ways of delivering patient care. Analytics can also be delivered quickly. CLAHRC Wessex’s Health Systems Analytics team delivered results within two months.


Potential uses of analytics for health data include building a geospatial understanding of demand for health services; identifying areas with greater care needs; identifying inequities in service provision; increasing the quality of services (such as reduced waiting times or better patient outcomes); forecasting future demand; understanding the causes of poor and good performance; predicting pathway usage and health needs; and identifying safe opportunities for disinvestment.

DESCRIPTIVE ANALYTICS: VISUALISING THE USAGE OF SERVICES

Each financial year the Hampshire service collected information on over 200,000 appointments from 28 regularly held sexual health clinics. To investigate demand, a Geographic Information System (GIS) was used to visualise both the clinic locations and patient population centres. Population demographic information could also be added to maps to illustrate factors such as deprivation, car ownership, and patient age or gender. A GIS also provides estimates of car and public transport (bus) travel times between locations. This enhances the investigation of the equity of service provision. Example visualisations can be seen in Figures 1-2.

FIGURE 1: EXAMPLE GIS VISUALISATION

Figure 1 illustrates a map of clinic hubs and spokes (red dots), demand (blue dots) and car ownership (shaded green) by postcode sector, e.g. SO16 4. It illustrates that large proportions of the population that live further away from the hubs own at least one car, while the population that live in cities near the hubs can rely on more frequent public transport.

FIGURE 2: EXAMPLE JOURNEY TIME VISUALISATION

Figure 2 illustrates car travel time to clinics using a tool called a box and whisker plot. These are simple graphics to illustrate the spread of data. The middle line of each plot is the median (50% of journey times lie above and below it). The shaded box represents the middle 50% of all journey times that patients undertook. The lines or ‘whiskers’ represent the typical range of journey times seen. Lastly, the dots outside the lines represent the odd few patients who undertook unusually long travel times.

So how do we identify which clinics could be closed and identify if potential new clinic locations provide a better option for patients? Enter predictive analytics

SPEEDY AND GREEDY: THE POWER OF PREDICTIVE ANALYTICS

So how do we identify which clinics could be closed, and whether potential new clinic locations would provide a better option for patients? Enter predictive analytics. These are powerful data-centric approaches for asking what would happen if we ran health services differently, or what the population will look like 10 years from now. If we wanted to minimise the number of clinics needed while maximising the number of patient journey times within 30 minutes, there would be 1.9 × 10⁴⁰ combinations to explore in Hampshire. To put it mildly, this is a trifle tedious to do by hand. Predictive analytics offers the ability to produce good solutions in less than one second using a greedy algorithm. To explain this approach, you can think of the algorithm as an individual who can do arithmetic very quickly and who never regrets any of their decisions. Think of the map in Figure 1 as a completed jigsaw where each jigsaw piece represents a discrete geographic region, for example a postcode sector or a super output area. Each piece of the jigsaw has a number of patients who live there and a known journey time to every other piece of the jigsaw. Our individual quickly completes a few calculations and removes the jigsaw piece that has the most patients living within a 30 minute journey time of it. That area is then selected as a location for a clinic. The pieces that are within a 30 minute journey time are also removed. As our individual has no regrets, once a jigsaw piece has been removed it is never



returned. This process is repeated with the remaining pieces until no further pieces of the jigsaw can be removed. The great advantage of a greedy approach is that it does not rely on complex mathematics. It also works extremely well for problems on a regional scale when compared to more complex approaches. The approach is part of the wider analytics field of optimisation and heuristics. These are powerful approaches used to solve complex combinatorial problems. Figure 3 gives an example of the greedy algorithm in action. It illustrates the trade-off in 15-minute journey times versus the number of clinics that are funded. In this case it is possible to reduce the number of funded clinics from 28 to 14 and still have equitable journey times for 95% of the population. If 90% were acceptable then only 9 clinics would be needed, although it is advisable to consider the equivalent results that use public transport times.

FIGURE 3: OUTPUTS FROM THE GREEDY ALGORITHM
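A minimal version of the 'no regrets' greedy loop described above can be written in a few lines of Python. The regions, populations and journey times below are invented; the real analysis works over Hampshire postcode sectors or super output areas with GIS-derived travel times.

```python
# Toy greedy maximal-coverage heuristic for clinic location, following the
# "jigsaw" description above. All data here is invented for illustration.
populations = {"A": 900, "B": 400, "C": 700, "D": 250, "E": 150}
journey = {  # journey[a][b] = minutes from region a to region b
    "A": {"A": 0, "B": 20, "C": 45, "D": 25, "E": 60},
    "B": {"A": 20, "B": 0, "C": 35, "D": 50, "E": 40},
    "C": {"A": 45, "B": 35, "C": 0, "D": 55, "E": 20},
    "D": {"A": 25, "B": 50, "C": 55, "D": 0, "E": 70},
    "E": {"A": 60, "B": 40, "C": 20, "D": 70, "E": 0},
}
LIMIT = 30  # maximum acceptable journey time in minutes

def covered_pop(site, remaining):
    """Patients in still-uncovered regions within LIMIT minutes of a candidate site."""
    return sum(populations[r] for r in remaining if journey[site][r] <= LIMIT)

uncovered = set(populations)
clinics = []
while uncovered:
    # Pick the region covering the most uncovered patients, then remove ("no regrets")
    # both the chosen piece and every piece it covers.
    best = max(uncovered, key=lambda s: covered_pop(s, uncovered))
    clinics.append(best)
    uncovered -= {r for r in uncovered if journey[best][r] <= LIMIT}

print("Clinic locations:", clinics)
```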

YOUR ALGORITHM IS CLEVER, BUT I WANT TO DO SOMETHING SLIGHTLY DIFFERENT…

The great thing about the process of Analytics is that it doesn’t end with a dull report stating “here is the answer”. To support the complexity of decisions in the real world, Analytics must provide adaptable solutions. In a sense, you can think of Analytics as a set of tools to facilitate decision making. In this case, local authorities and the NHS are making use of a predictive analytics tool to adapt the ‘solutions’ proposed by the greedy algorithm. Why is this necessary? It is because of the ‘we couldn’t possibly put a facility there!’ phenomenon. There are many reasons why a mathematically equitable solution doesn’t work in the real world ranging from ‘there are no



suitable premises in area x’ to ‘this facility requires expensive renovation’. Predictive analytics algorithms can be written to take account of such factors, but it is often difficult to (quickly) gather the right data to make this work, or indeed to include everything relevant. Worse still, complex models often make it difficult for decision makers to understand and trust results. The best part is that a predictive analytics tool can be implemented in Microsoft Excel, which is on the desktop of every local authority and NHS employee. The tool, in this case, allows the solution suggested by the greedy algorithm to act as a starting point. Clinics can then be ‘turned on’ or ‘turned off’ to see the impact on travel times. This flexibility allows qualitative information about premises, facility conditions or good old-fashioned politics to be quickly considered. The tool also provides guidance on capacity requirements, by quantifying likely shifts in demand on each clinic.
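The what-if element of the tool boils down to recomputing coverage for whichever set of clinics is switched on. A sketch of that calculation, with invented data, might look like this (the tool described above is implemented in Microsoft Excel rather than Python).

```python
# Invented regions, populations and journey times (minutes) for illustration only.
populations = {"A": 900, "B": 400, "C": 700, "D": 250, "E": 150}
journey = {
    "A": {"A": 0, "B": 20, "C": 45, "D": 25, "E": 60},
    "C": {"A": 45, "B": 35, "C": 0, "D": 55, "E": 20},
}

def coverage(open_clinics, limit=30):
    """Share of patients within `limit` minutes of their nearest open clinic."""
    total = sum(populations.values())
    reachable = sum(
        pop for region, pop in populations.items()
        if any(journey[c][region] <= limit for c in open_clinics)
    )
    return reachable / total

print(f"A and C open: {coverage({'A', 'C'}):.0%} of patients within 30 minutes")
print(f"A only:       {coverage({'A'}):.0%} of patients within 30 minutes")
```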

ANALYTICS IS MAKING A REAL IMPACT ON THE NHS

The financial pressures on local authorities and the NHS are increasing

by the day. Public scrutiny of decisions affecting the equity and quality of patient care is also at an all-time high. The good news is that Health Systems Analytics is showing signs of making a real, tangible impact on the way decisions are made regarding the future provision of healthcare services in the UK. Dr Sarah Williams, Associate Director of Research, Solent NHS Trust said: “This collaborative project between Solent as the Specialist Community NHS Provider, Local Authority commissioners and the Wessex CLAHRC has given the service an effective tool on which to base service delivery decisions; it has been widely used to plan locations of clinics, to minimise travel time and maximise access to patients. Decisions can now confidently be made on firm evidence and we hope to extend this project both in terms of complexity (opening hours and staffing models for example) and by implementing similar approaches in other healthcare service areas.” Marion Penn, Rudabeh Meskarian and Thomas Monks are researchers who work for NIHR CLAHRC Wessex, Faculty of Health Sciences, University of Southampton. They can be contacted via thomas.monks@soton.ac.uk


Ctrl - Alt - Facts - Delete

GEOFF ROYSTON

“The moon is made of green cheese”. “I believe that the moon is made of green cheese”. “We are going to build a beautiful great moon made of green cheese”. One falsehood, one lie, and one piece of bullshit. From President Nixon’s “I am not a crook” and Clinton’s “I did not have sex with that woman”, to the outpourings from the current POTUS, such utterances seem to have grown from a trickle to a torrent. Trumpian samples range from “It (the inauguration crowd) went all the way back to the Washington Monument” to the (Trumped-up) charge that “The concept of global warming was created by and for the Chinese in order to make US manufacturing non-competitive”. Words like spinning, Lincoln and grave come to mind. Although I have chosen examples from the USA, other countries, such as Russia, would have provided plenty, and you will not find it difficult to think of ones much closer to home. This is not just a quantitative increase in egregious untruthfulness; it is also a qualitative shift - the world appears to have moved from one where politicians would fabricate, but worry about being found out, to one where they – and many in their audience - simply don’t much care about fact, relying instead on appeals to emotion.

THE POST-TRUTH ERA

This is the world of “alternative facts”, “truthiness”, and “post-truth”. In 2016, Oxford Dictionaries declared “post-truth” as “word of the year” and in 2017 it has already featured as the main title of three books: Post Truth - The New War on Truth and How to Fight Back, by Matthew D’Ancona; Post-Truth – Why We Have Reached Peak Bullshit and What We Can Do About It, by Evan Davis; and Post-Truth - How Bullshit Conquered the World, by James Ball. It is no coincidence that the word “bullshit” appears in two of these titles. In his book Evan Davis quotes the American philosopher Harry Frankfurt: “the bullshitter’s eye is not on the facts at all, as the eyes of the honest man and of the liar are…. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose…. By virtue of this, bullshit is a greater enemy of the truth than lies are.”

CAUSES

There is nothing novel about some of this. Propaganda has a very long history, arguably as old as civilisation, with several of the “isms” of the 20th century providing some infamous examples, and with its dangers immortalised in fiction by Orwell’s dystopian “1984”. However, the post-truth era does exhibit some new features, and has not arisen now by chance. For example, all three books cited above note that it has been driven by the ability to spread (dis)information rapidly to targeted audiences through the internet and other digital media, and by the propensity of the media to frame dispute as disruptive entertainment with an illusion of contest between equally valid positions. There are also some deeper currents that have brought post-truth to the surface. D’Ancona notes a number of these. First, with origins that can be traced back to Nietzsche and his dictum “there are no facts, only interpretations”, a collapse of trust in traditional sources of authority and information. In such a world what matters is the emotional resonance of a story, not reasoned argument from the evidence. Facts, or falsehoods, are deployed not for themselves but mainly as broad signals of empathy, genuine or contrived, with the concerns of the audience. That provides a link to a second cause of the rise of post-truth, at least in politics – “a resentful demand for change” amongst people who have lost out in the modern world. People who have been, or feel at risk of being, dispossessed - of wealth, of work, of home, of identity - may understandably be receptive to a simplistic narrative that promises them a way out, and may be disinclined to pay much attention to the correctness of factual details, particularly those provided by experts they no longer trust.



A BATTLE OF PERCEPTION

D’Ancona warns that the post-truth era is a global “crash in the value of truth, comparable to the collapse of a currency…. a battle between two ways of perceiving the world … risking the central value of the Enlightenment, of free societies and democratic discourse…. being trashed by charlatans”. That certainly suggests that this is something about which we should be very concerned as citizens. And I would say that it is an issue also where researchers, analysts, and anybody with a professional interest in decisions being informed by evidence, should be amongst the first to the barricades. While a lack of care about veracity may be particularly prominent in the political arena, it can also be found in areas where scientists and analysts commonly operate; consider for example the refusal for decades of the tobacco companies to accept that smoking is harmful to health, or the persistence of the unfounded belief that MMR vaccination is a cause of the development of autism in children. Post-truthers’ distrust of, or even disdain for, experts, their selective use of evidence on the basis of whether or not it supports their ideological position, their liking for pseudo-science and conspiracy theories; these constitute a kind of awful perversion of the motto of the Royal Society: “Take nobody’s word for it”.

REMEDIES

The most straightforward approach to countering “alt facts” is to debunk them. This task is typified by the work of fact-checking organisations and websites such as Full Fact or Snopes. This endeavour continues to develop, and there are interesting possibilities being explored in coping with the flood of misinformation by augmenting human checking with automated analysis. Necessary though fact-checking is, it is not sufficient, as is clear when the underlying drivers of post-truth, as summarised earlier, are considered. To paraphrase Evan Davis: “If your goal is to stop people believing things that are plainly silly then your best strategy is to ask why they want to believe it and to have something useful to say or do about that underlying concern … reducing the social and economic tensions underpinning people’s willingness to believe untrue things would play a major part in that.” Further, the “how”, as well as the “what”, of what is said is crucial. D’Ancona notes: “Like an infection resisting antibiotics, a virulent conspiracy theory can fend off even incontestable facts … the counter-attack has to be emotionally intelligent as well as rigorously rational”. In particular, truth-sayers need to make more use of the post-truthers’ own key weapon - story telling - by coming up with compelling and contagious counter-narratives as vehicles for conveying truth.

CONCLUSION

In some areas - climate collapse, for example - there is what might be termed a long-stop for post-truth: nature does not care whether what we say is true or not, and will take its course regardless. And, eventually, truth tends to out in any sphere - post-truth politics is likely to come up against reality when promises of change fail to materialise. James Ball’s book quotes Will Moy, the director of Full Fact: “There is only so long you can govern in a post-truth way, even if you might be able to campaign in a post-truth way”. Or, to go back to Abraham Lincoln, “You can fool all the people some of the time, and some of the people all the time, but you cannot fool all the people all the time.” Of course, “eventually” might be too late. So we all have some responsibility – both civic and professional - to help in whatever ways we can to speed the draining of the post-truth swamp. Ctrl-Alt-Facts-Delete.

Dr Geoff Royston is a former president of the O.R. Society and a former chair of the UK Government Operational Research Service. He was head of strategic analysis and operational research in the Department of Health for England, where for almost two decades he was the professional lead for a large group of health analysts.


OR ESSENTIALS
Series Editor: Simon J E Taylor, Reader in the Department of Computer Science at Brunel University, UK
The OR Essentials series presents a unique cross-section of high quality research work fundamental to understanding contemporary issues and research across a range of operational research (OR) topics. It brings together some of the best research papers from the highly respected journals of The OR Society.

ACCESS THESE TITLES AT: palgrave.com/series/14725



Annual Analytics Summit 2018 Thursday 12 June

As part of London Technology Week, the Annual Analytics Summit delivers a one-day learning and networking event about how big data and analytics are shaping organisational decision-making. Filled with case studies, innovations and strategies on turning data into decisions, the Annual Analytics Summit is the event for practitioners and decision-makers alike. The summit brings together experts from government, industry and academia, as well as exhibitors from software providers, consultancies and specialist recruitment agencies.

Location: IET, Savoy Place, London WC2R 0BL

www.analytics-events.co.uk #AS18

