Issue 13 | channels.theinnovationenterprise.com
ie.
Innovation
HOW IS DATA STOPPING EBOLA? We look at how data is helping to predict the spread of Ebola and what can be done to stop it. By Gabrielle Morse
16 | Josie King, President at Innovation Enterprise, talks us through the trends that are going to affect Big Data in 2015
23 | Will Quantum Computing Break Bitcoin? With increased speeds, the possibility may increase
LETTER FROM THE EDITOR

Welcome to this issue of Big Data Innovation, the last issue of 2014.

This year we have seen a shift in the way that Big Data is perceived, both in terms of people's understanding of it and the media's perception.

The likelihood is that we are going to see a significant rise in the stature of data in the next 12 months, and that we will see it on the front pages of newspapers for both good and bad reasons. This issue discusses some of the potential changes and developments that we are likely to see. Josie King, the President of Innovation Enterprise, gives us her thoughts on what the major trends will be in Big Data in 2015; how many will you agree with?

We also look at the effect that data is having on stopping the spread of Ebola in West Africa, after the region was hit by the deadly virus in 2014.

In addition to these, we also talk machine learning, quantum computing, data visualization at the BBC, lifts at the new World Trade Center and how data is being used to predict film success.

As always, if you are interested in contributing or have any feedback on the magazine, please contact me at ghill@theiegroup.com.

We hope you enjoy this issue of the magazine and wish you happy holidays.

George Hill
Managing Editor
ghill@theiegroup.com

Managing Editor: George Hill
Assistant Editors: Simon Barton
Art Director: Joe Sanderson
Cover Design: Chelsea Carpenter
Contributors: Elliot Pannaman, Mark Lacey, Gabrielle Morse, Chris Towers, Heather James

Are you looking to put your products in front of key decision makers? For advertising, contact Hannah at hsturgess@theiegroup.com

General Enquiries: ghill@theiegroup.com

2014
CONTENTS
4 HOW ANALYTICS ARE HELPING TO PREDICT BLOCKBUSTERS As movie studios look at new ways of predicting success, we see how analytics can help.
8 DATA VISUALIZATION AT THE BBC We talk to Charlotte Thornton, UX Designer at BBC News, about how the BBC are adapting to new data viz challenges.
12 HOW CAN BIG DATA HELP AGAINST EBOLA? We look at how data could be the key to predicting and controlling the spread of the Ebola virus.
16 2015 BIG DATA TOP TRENDS Josie King talks us through what she believes will be the top trends in Big Data in 2015.
19 IMPROVING DATA WITH MACHINE LEARNING Simon Barton discusses Machine Learning with Spencer Greenberg from Rebellion Research.
23 HOW WILL QUANTUM COMPUTING AFFECT THE INTERNET'S TRUST MECHANISMS? Will quantum computing break Bitcoin and encrypted keys? We investigate the supercomputer trend to find out.
27 ONE WORLD TRADE CENTRE: ANALYSING THE NEW ELEVATOR SYSTEM Big Data is being used to improve the lifts at the new World Trade Center; we look at how this works.
HOW ANALYTICS ARE HELPING TO PREDICT BLOCKBUSTERS Elliot Pannaman Director, Predictive Analytics Innovation Summit
Analytics has permeated nearly every aspect of our lives today, from the way we interact with one another online to the clothes and food that we purchase. It can predict what you are likely to do and how you are likely to do it, but can it predict how you will react to a piece of art? It is an interesting crossover, as films, literature and art are relatively subjective, so how can data predict what will be successful and what won't?
I have spoken to many analytics professionals within the film industry and they have all told me that it is impossible to predict the success of a film as a single entity. Look at John Carter and The Lone Ranger: both had all the ingredients of a hit, but both failed spectacularly.
These films were a kick in the shins for Hollywood; they represent the 4th and 5th most expensive films ever made, yet both were abject failures at the box office. However, viewing habits have a new hero: the very thing that many believed would kill the movie industry could save it. Netflix, Hulu and other subscription services can now see what people are watching, at what times and where. It is this information that provides analytics professionals with a goldmine to help create a blockbuster. Shows like 'House of Cards' and 'Orange Is The New Black' have been created with particular demographics in mind, giving them a higher chance of success. 'We can look at consumer data and see what the appeal is for the director, for the stars and for similar dramas,' said Steve Swasey, Netflix's VP of Corporate Communications, on the subject. It is possible not only to predict a blockbuster from the data, but also to create one.
So with this kind of data available to them, companies like Netflix have a distinct advantage over the traditional Hollywood powerhouses in predicting how a film or show may fare, and can create shows to make the most of these possibilities. Analytics therefore has a huge part to play in the creation of future films and the prediction of how those films will do.
At the moment, Netflix is one of the biggest sources of new movies and TV shows, as these companies can afford to invest in a certain level of originality. This is in comparison to the traditional powerhouses, who, rather than investing heavily in original films, have simply created sequels and prequels to successful films. For instance, so far in 2014, 8 of the top 10 grossing films are remakes or sequels, and this is mirrored by 7 of the top 10 films in 2013. So analytics has not only allowed companies to predict what is going to be successful but, when effectively used, to create more original blockbusters than the traditional Hollywood movie studios. With this in mind, it is important to remember that simply using Netflix and other streaming services to look at what their audiences are watching is never going to be enough to create a blockbuster every time.
Data from previous films such as Silver Linings Playbook and American Hustle would suggest that audiences are interested in watching films featuring Bradley Cooper and Jennifer Lawrence. However, Serena, their most recent film together, wasn't even released in cinemas; instead it went straight to video. Analytics would have marked this film as a success thanks to the casting alone, but the fact is that it was deemed unworthy of a cinema release. There is, however, a way for movie studios to get around this through data. Using sentiment analysis on social media would allow analytics executives to see what is being said about a film by those who have seen it. This way they would have the opportunity to adapt future films around what has been said about the wider plot, which scenes people particularly liked or disliked, or certain plot holes that grated on the audience. By adopting this kind of sentiment analysis, even films that make a loss can have a positive benefit for the movie studio. It would ultimately mean that we could avoid Transformers 15 or The Hangover Part 50.
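To make the idea concrete, here is a minimal sketch of lexicon-based sentiment scoring over posts about a film. The posts, keyword lists and scoring rule are invented for illustration; a real studio pipeline would pull posts from a social media API and use a trained sentiment model rather than a hand-written word list.

    # A hypothetical, simplified sentiment scorer; not any studio's actual system.
    POSITIVE = {"loved", "great", "brilliant", "funny", "gripping"}
    NEGATIVE = {"boring", "awful", "confusing", "predictable", "plot-hole"}

    def score_post(text: str) -> int:
        """Crude per-post score: +1 per positive word, -1 per negative word."""
        words = text.lower().replace(",", " ").split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    def audience_sentiment(posts: list[str]) -> float:
        """Average score across all posts mentioning the film."""
        return sum(score_post(p) for p in posts) / len(posts) if posts else 0.0

    posts = [
        "Loved the cast, but the ending was confusing",
        "Brilliant and gripping from start to finish",
        "Boring, predictable, and full of plot-hole moments",
    ]
    print(audience_sentiment(posts))  # -0.33: slightly negative overall for this sample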
DATA VISUALIZATION AT THE BBC Heather James, Director, Data Visualization Summit
The BBC has always been considered one of the most progressive media companies in the world, and its presentation of data has always been one of its strengths. However, as a publicly funded organization, it needs to make sure that it is making the most of the data it has without wasting time and money. It falls to a select few who have the expertise and drive to do the job. One of those is Charlotte Thornton, UX Designer at BBC News. We spoke to Charlotte at the recent Data Visualization Summit in London to get her views on some of the latest trends in data visualization and how these are having an effect on the BBC.
'A good data visualization needs to be simple, it needs to tell a story, it needs to be rewarding and ultimately, the user needs to be engaged.' Charlotte believes that this is a key facet of any data visualization, and as the BBC is such a media-rich company, it is clear that these are principles that are well adhered to. When looking at its websites and the way that news is presented, the amount of data that needs to be conveyed would often be impossible to communicate with words alone.
It is therefore vital for Charlotte and the team around her to adhere to these principles in order to convey data to their audience in the best possible way. She believes that it is these four points, simplicity, storytelling, reward and engagement, that ultimately make for the best visualizations. This isn't to say that this will always be the same, though. Despite these ultimately being the key aspects of creating an effective visualization, it will come down to the understanding of the data itself. As our society becomes more data literate, visualizations can become more complex whilst still keeping these four main points as the basis of how the information is communicated. Charlotte claims that 'If the users become more literate with the data, the level that you will need to simplify it to will become less'.
This means that as we move forward and data becomes better understood and more prevalent in society, the visualizations that we see on the news, in print, or even just for public services can become more complicated without losing the level of understanding that people have of them.
We know that the way data is portrayed to the audience is important, and that this is about more than simply making it look good. Charlotte tells us that there needs to be a balance between understanding the data's underlying story and finding the best way to portray it. In reality this means that visualizations take more than a single person: finding an individual who can not only understand the data and tell its story but also design compelling and exciting visuals is going to be hard. Charlotte says, 'Ultimately, if you can't find somebody who can accurately convey the story, there is no point in understanding the data in the first place'.
Aside from effective visuals, due to the position that the BBC holds (it is meant to report the news fairly, without political bias), it is important that visuals do not deliberately or inadvertently carry a bias. Charlotte mentions that 'We always try to make sure that we give a comparison'. This means focussing on both sides of the story throughout, rather than on one point. This is imperative with data visualizations for a corporation like the BBC, perhaps even more so than in spoken words or live segments: with those, an error could simply be a mistake or a slip of the tongue, whilst with visualizations, the amount of planning and work that goes into them means that extensive research must have gone into each one.
We were also keen to hear from Charlotte about what she thought the next few years were going to hold for data visualization. One of the key areas that she points out is ‘At the moment we struggle to design across multiple devices and in the future, I think there will be a way to do that easily’.
With the ever-increasing popularity of smartphones in both the developed and developing world, this is certainly going to be the case, especially for the BBC, which, despite being a British-based media brand, has a truly international audience. People want to digest their data in whichever way they choose rather than being consigned to one particular device in order to see a visualization properly. As well as this, Charlotte believes that augmented reality could be the new frontier of data visualization. It would allow for a more immersive
experience, but she does admit that 'it may well be far in the future, but it would be great to have people being able to interact with something in front of them'. Whatever the future holds for data visualization, it is clear that the BBC is going to be one of the early adopters of any new techniques and designs. It currently broadcasts the News at Ten, the most watched news programme in Britain, with around 4.9 million viewers tuning in every night, and the BBC's website is the 7th most popular in the UK, meaning that the skills of Charlotte and others in her team are not only needed now, but will also be imperative as they move forward.
HOW CAN BIG DATA HELP AGAINST EBOLA? Gabrielle Morse Director, Data Science Innovation Summit
Ebola is one of the most talked about and terrifying issues in the world today. Governments do not know how to deal with it and the response from health organizations seems to be only denting the problem. So what technology do we have that could have been used to lessen the impact and help out in the future?

BIG DATA

Through the use of data, Healthmap (a website showing all contagious diseases mentioned in the news) pinpointed the potential outbreak on March 14th, a full 9 days before the World Health Organization (WHO) formally announced the epidemic. It used data from blog posts written by healthcare professionals treating people with Ebola-like symptoms, and from people sharing these reports on social media, to identify that there was an Ebola outbreak in Guinea. This use of openly available data to help predict an upcoming epidemic has been held up as a victory over traditional information gathering and news reporting. If it had been taken more seriously when it was flagged in early March, then perhaps we would be in a stronger position now in fighting the spread of the disease. However, this is not the full extent of the story.
It was only flagged on March 14, after a press conference given in French by Guinea's minister of health on March 13. The difficulty is both that the isolated communities who are affected don't regularly use the internet, and that monitoring systems often only track English-language posts. As most of the initial posts would have been in French, the outbreak would not have been easy to identify with current systems. Monitoring also has its limits: it looks at social posts about certain aspects of an outbreak and will often miss some of the primary groups affected. For instance, those who tend to be affected first by diseases are the very old and the very young, the two groups least likely to be able to use social media in this context.
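As a rough illustration of the language problem, here is a toy keyword monitor with English and French term lists. The posts, keywords and alert threshold are invented for illustration; a real system such as Healthmap uses far more sophisticated multilingual text mining.

    # Hypothetical keyword lists and posts, for illustration only.
    KEYWORDS = {
        "en": {"hemorrhagic", "fever", "ebola", "outbreak"},
        "fr": {"hémorragique", "fièvre", "ebola", "épidémie"},
    }
    ALERT_THRESHOLD = 2  # matching reports needed before raising a flag

    def matches(post: str) -> bool:
        """True if the post contains outbreak-related terms in any monitored language."""
        words = set(post.lower().split())
        return any(words & terms for terms in KEYWORDS.values())

    posts = [
        "Plusieurs cas de fièvre hémorragique signalés près de Guéckédou",
        "Clinic reports unusual fever cases in the forest region",
        "Le marché est fermé aujourd'hui",
    ]
    hits = [p for p in posts if matches(p)]
    if len(hits) >= ALERT_THRESHOLD:
        print(f"Possible outbreak signal: {len(hits)} matching reports")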
However, monitoring alone is not the only way that data can be used to help against Ebola. We saw with the way data was used for the Haiti earthquake relief that it can help to co-ordinate equipment and personnel, putting them in the right place at the right time. This is certainly the case with Ebola. Data from hospitals on the number of people being treated for Ebola means that vital medical supplies and personnel can be sent to the areas where they can make the most difference. Seeing where the virus is spreading through the way a population is moving is an important element of predicting how to stop it. It is for this reason that a West African mobile carrier has given access to data from its many users, allowing people's movements, and thus where future outbreaks may occur, to be seen. It will be through this kind of open data sharing that the most effective work can be done; essentially, the more invasive the data is, the more insight it will give. Although it goes against much of what people want done with their data, it will allow doctors to treat patients more effectively and governments to prevent the virus's future spread. Having this data as an open source of medical information is clearly an important starting point for preventing the further spread of the disease, but there are some drawbacks.
Gregory Piatetsky has done some fantastic analysis of the data available, managing to identify barriers to the data's effectiveness in West Africa, where the virus has been spreading. The biggest is that the data being reported is not necessarily accurate.
For instance, the speed of infection in graph 3 appears to have been diminishing since the middle of September, with the line beginning to flatten according to the numbers being reported. This, however, does not show us the full picture: if there was indeed a slowdown in the infection, we would also see a slowdown in the number of deaths. This has not been the case, and the actual numbers seem to have continued to increase at the same pace. Therefore, it seems logical to assume that this relative plateau has not been caused by fewer patients actually having the disease, but perhaps because the numbers who have it have grown so large that data has not been accurately recorded for each case. Graph 2 represents this, with the number of cases per
day appearing to decrease significantly, but the number of deaths not corresponding. According to Gregory ‘The number of cases/day declines sharply, but the number of deaths does not decline correspondingly. Since cases are more likely to be under reported than deaths, the more likely conclusion is that the number of cases is very much under-reported.’ Data will give us significant insights into how diseases spread and the ways that they can be cured, but in reality, it will only be as effective as the data itself. With weak infrastructures in places such as Sierra Leone and Liberia,
the disease itself can spread but the data that is required to effectively combat it cannot.
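The kind of sanity check Piatetsky describes can be sketched very simply: if reported new cases flatten while deaths keep rising, under-reporting of cases is the likelier explanation. The figures below are made up purely to illustrate the pattern and are not real WHO data.

    # Invented illustrative figures, not real case counts.
    weekly = [
        # (week starting, reported new cases, reported new deaths)
        ("2014-09-01", 950, 480),
        ("2014-09-08", 980, 520),
        ("2014-09-15", 700, 600),  # cases appear to flatten or fall...
        ("2014-09-22", 650, 690),  # ...while deaths continue to climb
    ]

    for week, cases, deaths in weekly:
        ratio = deaths / cases
        flag = "  <- possible under-reporting of cases" if ratio > 0.7 else ""
        print(f"{week}: {cases:4d} cases, {deaths:4d} deaths, deaths/cases = {ratio:.2f}{flag}")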
The challenge that governments now face is not only how to stop the disease in the first place, but how to gather the necessary data on it. If a disease were to spread significantly in one of the more data-driven countries, then having access to readily available data sets would make this simpler; however, by then it may be too late.
WHAT ARE GOING TO BE THE TOP CHANGES WITHIN BIG DATA IN 2015?
Josie King, President, Innovation Enterprise
Big Data has seen a huge leap forward in 2014 regarding how it is represented. Adoption across companies has grown and the importance of Big Data as a business function has increased, but what are we going to see in 2015?
IN-MEMORY DATABASES
As the use of data has increased in the past year, the speed at which results are needed has grown with it. Even where speed is not the main issue, people want to be better informed than before, or to have the ability to make decisions in real time rather than through reports on historical data. In-memory databases give companies the freedom to access, analyze and take action based on data much more quickly than regular databases. This in turn means that decisions can either be made faster, as data is analyzed more quickly, or be better informed, as more data can be analyzed in the same amount of time.
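As a small illustration of the principle, SQLite's built-in ":memory:" mode keeps an entire database in RAM. The sketch below only shows the idea; production in-memory platforms operate at a very different scale.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # the database lives entirely in RAM
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("EMEA", 1200.0), ("EMEA", 950.5), ("APAC", 730.0)],
    )

    # Queries hit memory rather than disk, so results arrive quickly enough to
    # support real-time decisions instead of overnight reports on historical data.
    for region, total in conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"):
        print(region, total)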
NON-DATA SCIENTISTS
As we as a society have become more data driven, one of the main things that has become clear is that finding the necessary talent is difficult. This means that companies are often reliant on either too few staff or outsourced consultants. 2015 is therefore likely to see more automated platforms that allow employees who may not have as much skill with data as others to collect, analyze and make decisions based on that data. This could be anything from simple-to-use interfaces sitting on more complex backends to simpler tasks that create business results.
MORE SENSOR-DRIVEN DATA
The internet of things is evolving and more companies are using it, but it may well hit its tipping point in 2015. This would mean sensor-to-sensor data being collected, collated and analyzed through purely sensor-based collection. This can be done in multiple ways, from the way that objects interact with other objects to the settings that people use on particular devices. Sensor-based data is unlikely to outperform transactional data (which currently makes up the majority of data collection) but is still likely to see a marked increase. It is likely that we are going to see more device-to-device data being created and collected in 2015, which could see this number grow beyond transactional data within the next 5 years.
NO OWNERSHIP IN JUST ONE DEPARTMENT
Data will become a commodity that is not kept in one department alone and used purely by senior company leaders. 2015 is likely to see a democratization of data throughout the organization, meaning that more departments will become adept at using the insight that it can bring. Rather than working towards a central strategy created by senior management, day-to-day activities will be based on data and the insights created from it.

HR ANALYTICS
Once thought of as the definition of making your employees 'just a number', HR analytics is being shown to have significant benefits for both the company and its employees. We believe that 2015 is going to see more companies wake up to the positives that an effective HR strategy can bring. From optimizing workflow to tracking overall employee happiness, we are likely to see increased use of HR analytics in 2015, as those who have adopted it within the last couple of years gain ground on their competitors and those who haven't need to catch up.

DEEPER CUSTOMER INSIGHT
Despite the fact that transactional data is still more numerous than sensor data, 2015 may be the year that we see it being truly looked at in multi-dimensional ways to create even deeper customer insight. This could be anything from geographical data to a deeper understanding of purchasing trends according to different, oblique factors. With new technology allowing metrics to be tracked across even more areas, and wearables creating even more trackable actions, deeper customer understanding is inevitable.
IMPROVING DATA WITH MACHINE LEARNING AN INTERVIEW WITH SPENCER GREENBERG Simon Barton Assistant Editor
Public Domain Day represents the moment when previously copyrighted material becomes available for reuse. In most cases, this occurs between 50 and 70 years after the death of an author, with their books normally entering the public domain on New Year's Day. Millions of books are released each year, meaning that it's very difficult for organizations like Project Gutenberg, a producer of free e-books, to determine which ones are worth digitizing. Through Machine Learning, the scientific discipline of teaching computers to make improved predictions based on a particular data set, an algorithm has been developed that ranks authors. This has allowed organizations like Project Gutenberg to better understand which books they should invest their time and money in. Machine Learning has picked up considerable steam in recent years but remains an incredibly complex field. Despite this, Netflix and Google are just two of the companies who use Machine Learning as a tool for making sense of the huge amount of data now available. I spoke with Spencer Greenberg, Chairman at Rebellion Research, a quantitative asset management firm that applies Machine Learning technology to global
equity investing. Spencer has worked at Rebellion Research for nine years, so it was valuable to get his insights on a highly technical subject that can easily be misunderstood.
Spencer explains his understanding of the subject by saying, 'Machine Learning is the study of how to make predictions from large data sets. As a tool, it's used as a way of making sense of Big Data, helping companies to make accurate forecasts about customer behaviour.' He elaborates further: 'If I have some data and I'd like to make a prediction about something, then an algorithm is going to try and learn how to associate the data to the prediction.' Netflix's use of Machine Learning to predict which shows their audience are likely to watch is a common example, but the on-demand media giant has used it in far more innovative ways than that. Spencer states, 'Machine Learning's been used at Netflix for more innovative applications than just predicting which TV shows and movies their users are likely to watch; they also use it to find out which shows they should make themselves.' The success of House of Cards, Netflix's first original show, is testament to Spencer's comment.
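To make Spencer's description concrete, here is a minimal sketch of an algorithm learning to associate data with a prediction. The viewing-history features, labels and the choice of scikit-learn's logistic regression are illustrative assumptions only, not anything Netflix or Rebellion Research actually uses.

    from sklearn.linear_model import LogisticRegression

    # Hypothetical features per user: [hours watched last month, political dramas watched, comedies watched]
    X = [
        [40, 9, 1], [35, 7, 2], [10, 0, 8],
        [12, 1, 6], [50, 11, 0], [8, 0, 9],
    ]
    # Label: did the user finish the first season of a political drama? (1 = yes)
    y = [1, 1, 0, 0, 1, 0]

    model = LogisticRegression()
    model.fit(X, y)  # learn the association between the data and the outcome

    new_user = [[30, 6, 2]]                # an unseen viewer
    print(model.predict(new_user))         # likely [1]: predicted to watch
    print(model.predict_proba(new_user))   # class probabilities behind the prediction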
In a recent New York Times article it was explained that a high percentage of Netflix's 27 million subscribers had watched David Fincher's The Social Network, that Kevin Spacey was one of the site's most popular actors, and that the original version of House of Cards had done extremely well. Netflix used an algorithm to combine these elements, enabling it to produce a show that its audience couldn't help but enjoy. Netflix isn't the only media giant to incorporate Machine Learning into its processes; the world's most data-rich company, Google, also uses it. Spencer expands on this: 'Google use Machine Learning in their search algorithm as well as their self-driving car application, as Machine Learning is essential in recognising objects as you drive.'
Google and Netflix's use of Machine Learning demonstrates how imperative it is to two of the world's most innovative companies, and how it allows data to be analyzed in a way that maximizes its use and minimizes the time spent analyzing it. 'Machine Learning has its roots in Artificial Intelligence and it's been a slow, gradual progression,' says Spencer. It's been a process that has evolved as algorithms have become more efficient and data more abundant, with increasing computer processing speeds allowing Machine Learning to be applied to vast datasets. In Spencer's words, Machine Learning 'has crept up on us' as technology has improved, with companies now entering a
'Machine Learning boom' with the realisation that the application of algorithms decreases the amount of data 'noise' they are subjected to. In the United Kingdom, card fraud increased by 15% from 2013 to 2014, highlighting the ease with which the public's data can be obtained by fraudsters. This is, however, an area in which Machine Learning can play an important role. Spencer elaborates: 'Your credit card company has the job of figuring out which purchases made on your card are atypical. The way this works is that they have algorithms that are working to find suspicious behaviours when it comes to purchasing.' By applying algorithms, transactions are flagged up when something is bought in a state you've never visited before, or even when something's purchased that's outside of your normal pattern. This process has become faster and makes customer data a combative tool against fraud.
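A toy version of the pattern Spencer describes might look like the sketch below: fit a model on a customer's normal purchases, then flag new transactions that fall outside that pattern. The transaction features and the use of scikit-learn's IsolationForest are assumptions for illustration; real card-fraud systems are proprietary and use far richer data.

    from sklearn.ensemble import IsolationForest

    # Hypothetical features per transaction: [amount in GBP, miles from home, hour of day]
    normal_history = [
        [12.50, 1, 9], [30.00, 2, 13], [8.20, 1, 18],
        [45.00, 3, 12], [22.75, 2, 17], [15.10, 1, 8],
    ]

    detector = IsolationForest(contamination=0.1, random_state=0)
    detector.fit(normal_history)  # learn what this customer's usual purchases look like

    new_transactions = [
        [18.00, 2, 14],     # resembles the usual weekly spending
        [950.00, 3400, 3],  # large purchase, far from home, at 3am
    ]
    for tx, label in zip(new_transactions, detector.predict(new_transactions)):
        status = "flagged as suspicious" if label == -1 else "looks normal"
        print(tx, "->", status)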
There's still an awful long way to go with Machine Learning, but as Spencer points out, 'It's a field which requires a lot of knowledge as it combines three tricky areas: maths, programming and other specific tricks that you acquire when dealing with large datasets'. Still, demand outweighs supply for Machine Learning talent, so there's a real need for more experts to emerge. Spencer is clearly excited to be in a field that has so much room left to develop.
"Content to bridge the gap in your enterprise knowledge"
channels.theinnovationenterprise.com
HOW WILL QUANTUM COMPUTING AFFECT THE INTERNET’S TRUST MECHANISMS? Chris Towers Head Of Big Data Channel
There's been no shortage of innovative ideas across the Internet in recent years. Some of them, including Bitcoin and Off-the-Record Messaging, rely on public-key cryptography to guarantee secrecy and authenticity. Unfortunately for services that use public-key cryptography, quantum computers, systems that currently perform operations on data around 3,600 times faster than regular computers and could see this number rising to billions of times faster, are on the rise and being used to make important breakthroughs. For cryptocurrencies like Bitcoin, however, quantum computers are a frightening prospect and potentially detrimental to their longevity.
The Guardian recently published an article stating that the progress seen in quantum computing was beneficial for science, with Canadian company D-Wave using a quantum computer to work out how proteins fold. Due to the complexity
of D-Wave's discovery, there were even claims from scientists that the findings were so advanced that they couldn't possibly be true, a sentiment that was later found to be false. Public-key cryptography is an expansion of earlier encryption techniques that used a single private key. With a single key shared between the two parties, the key would probably have to be exchanged in a dark alley or a deserted car park. This is neither convenient nor feasible for people sharing private information across different countries, where it's impossible to meet in a physical location.
The invention of public-key cryptography, before the rise of quantum computing, put an end to this problem. Both the sender and receiver of a message have their own pair of keys, generated so that anything encrypted with one key of the pair can only be decrypted by the other. One of these keys becomes the public key, which can be shared online and used to identify its owner; the other stays private. Now, you might be thinking: what's the point in encrypting something if my public key is widely accessible and capable of decrypting my message? The answer is authenticity: if a message is encrypted with a private key, you can guarantee that it's authentic
and sent from the person who is claiming to have sent it. Having this in place is essential for cryptocurrencies like Bitcoin, which use it to validate purchases and to guard against online forgeries. As the public key is derived from the user's private key, it is in principle possible for a normal computer to recover the private key from it. It has, however, been predicted that the timeframe needed to find it would be in excess of the total lifespan of the solar system. It's safe to say that most are willing to wager that their private keys won't be stolen if it is going to take an eternity for someone to unlock them.
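The mechanism can be sketched in a few lines with the third-party Python package cryptography: sign with the private key, verify with the public key. Bitcoin itself uses a different signature scheme (ECDSA over the secp256k1 curve), so this is only an illustration of the general principle, not of Bitcoin's actual code.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()  # kept secret by its owner
    public_key = private_key.public_key()       # safe to publish anywhere

    message = b"send 0.5 BTC to address X"      # an illustrative message, not a real transaction
    signature = private_key.sign(message)       # only the private key can produce this

    try:
        public_key.verify(signature, message)   # anyone holding the public key can check it
        print("Signature valid: the message really came from the key's owner")
    except InvalidSignature:
        print("Signature invalid: the message was forged or altered")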
This notion is going to be challenged significantly, however, with the development of quantum computers, whose processing speeds are around a billion times faster than what today's machines are capable of. The same Guardian article referred to earlier stated that the rise of quantum computers could threaten to change the way we interact with the Internet by making information that is purposefully meant to be secret completely transparent. The fact that quantum computers can find information a billion times faster than normal machines is very significant, as it means it's impossible for encryptors to keep up with the pace set by quantum computers, even if they try to make their codes more complex. This would make public-key cryptography almost pointless, as it wouldn't be able to safeguard against the very thing it's meant to.
The advancements made in quantum computing will not sit well with Bitcoin, which, as mentioned before, uses public-key cryptography to validate purchases and to guard against forgeries. Thankfully for the online payment system, quantum computing has been a known risk for some time, and because of this a number of 'hooks' were added into its encryption code which allow for a safe transition to another, more quantum-resistant algorithm. The problem is that the barriers incorporated by Bitcoin might not last that long.
The discovery of the Majorana fermion, a sub-atomic particle, could allow the construction of an even more efficient quantum computer than we have today. This breakthrough could mean that within ten to twenty years cryptocurrencies like Bitcoin could fall foul of quantum computing unless they update their systems to counteract it. If this were to happen, any attempt to keep information encrypted could be in vain. Former National Security Agency technical director Brian Snow says, 'If such a machine exists and if it is going after people on the net, trying to get to their goodies, you have lost all the trust mechanisms the web has'. If such a machine were to materialize, it would be to the detriment of systems like Bitcoin, whose ability to make transactions would be significantly reduced. As the Guardian article mentions, the pressure on companies would increase, and their ability to keep information private would be tested.
At the moment, however, this is a distant possibility. Currently, quantum computers cost around $15 million and require a huge operating area in order to function. They are in the 'Bletchley Park' stage of development, so far away from being widely available that they seem almost unfeasible. It has been predicted that the primary use of quantum computers will be through cloud-based platforms, meaning that although the use of these machines will undoubtedly spread, there will be more control over what they are used for.
ONE WORLD TRADE CENTRE ANALYSING THE NEW ELEVATOR SYSTEM Simon Barton Assistant Editor
Thirteen years on from 9/11, it remains the single worst terrorist attack ever to hit the United States. Many argue that the construction of the new Four World Trade Center (WTC 4) is more symbolic than anything, but with 60% of it already leased, businesses are clearly embracing the complex. At 971 feet, WTC 4 is currently the second tallest building in the World Trade Center complex behind WTC 1, although the WTC 2 and WTC 3 buildings are planned to surpass it upon completion. The Port Authority of New York and New Jersey plans to lease a considerable proportion of the property, with retail stores taking up much of the first floor. With a lot of people coming through WTC 4's doors every day, it is essential that the building's elevator system has the capacity to serve the companies based within the skyscraper's four walls. Congestion will be at its most noticeable during the morning rush hour, with thousands of people looking to get to their floor. This will put an enormous
amount of stress on the elevator system, which is why Big Data is being used to give the tenants the easiest journey up to their destination.
A situation can arise where there are 20 people in a lift, all with different destinations. This can almost become an extension of the daily commute, with stop after stop to get through before you can finally reach your floor, sit down and start your day's work. The people working at WTC 4, however, needn't worry, as they'll be able to take advantage of an elevator system that can transport them at 20mph, the same vertical speed as an aeroplane during take-off. When an individual starts working at WTC 4 they will be given a unique ID card. This card will know which floor a person works on and will record movements between the levels, identifying patterns in the process. For example, if you typically go from your office's floor to the gym at 2pm, that will be programmable; if you go from the gym to the cafeteria an hour later, that will also be programmable.
The significance of this is that everybody will be tracked and have their movements analyzed, providing insights to companies about their employees. This is all made possible because the ID card will be able to group people by the floor they are going to as soon as they swipe their card through the main barrier each morning. Instead of participating in a mass brawl towards an
elevator, they will be directed to a specific lift, which will be filled with people going to similar floors. It's not just a matter of convenience either; the use of Big Data will also cut energy consumption in the building by 20%, as fewer stops will need to be made.
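A heavily simplified sketch of that grouping logic is shown below. The floor numbers, car list, capacity and assignment rule are all invented for illustration; the real destination-dispatch system is proprietary building software.

    from collections import defaultdict
    from itertools import cycle

    CARS = ["A", "B", "C", "D"]   # hypothetical lift cars
    CAR_CAPACITY = 10

    def assign_cars(badge_swipes):
        """badge_swipes: list of (employee_id, destination_floor) in badge-in order."""
        by_floor = defaultdict(list)
        for employee, floor in badge_swipes:
            by_floor[floor].append(employee)

        assignments = defaultdict(list)
        car_pool = cycle(CARS)
        # Fill each car with people heading to the same floor, so a car makes
        # one or two stops instead of a dozen.
        for floor in sorted(by_floor):
            waiting = by_floor[floor]
            for i in range(0, len(waiting), CAR_CAPACITY):
                car = next(car_pool)
                assignments[car].extend(f"{e} -> floor {floor}" for e in waiting[i:i + CAR_CAPACITY])
        return assignments

    swipes = [("emp1", 48), ("emp2", 12), ("emp3", 48), ("emp4", 12), ("emp5", 48)]
    for car, riders in assign_cars(swipes).items():
        print(car, riders)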
Security runs to the heart of the WTC 4, which is completely understandable considering the complex’s past. The programmable aspect of the ID card will act as a security mechanism for the skyscraper, meaning that the authorities will be aware of everybody’s whereabouts, all of the time. The hope is that this will negate an individual’s capacity to cause harm to the building, so that the unspeakable can never happen again.
The use of data at WTC 4 shows how it can make the lives of ordinary people easier, and not just serve as a tool for companies to target tailor-made products at their consumers. The implementation of Big Data at WTC 4 is much more than just a tool to transport thousands of workers throughout the building's office floors; it will act as a deterrent for security threats and keep the wellbeing of the building's tenants in the best possible place.
Whitepapers
Reach a targeted, localized and engaged community of decision makers through our customizable suite of online marketing services.
+1 (415) 692 5498 US +44 (207) 193 0386 UK
ggb@theiegroup.com
@IEGiles
Email dwatts@theiegroup.com for more information