Big data: The power behind decisions
March 2014
business-technology.co.uk
An independent report from Lyonsdown, distributed with The Sunday Telegraph
Business Technology March 2014
Opening shots Shane Richmond
THE RECENT decision to delay the launch of a new NHS database collecting patient medical records from across England was seen as a victory for patient watchdogs who had complained that the public had not been properly informed of the plans. Critics of the scheme were concerned that individuals might be identifiable despite attempts to anonymise the data, that the database might be sold to private companies without the public being asked or compensated, and that it was wrong to assume consent from patients, relying on people to know what was going on and to take action to opt out.

But the database, when it finally launches, could provide significant public health benefits, allowing doctors to assess the performance of the NHS, track disease outbreaks and test the performance of new drugs. What might have been welcomed as another step forward in smart government instead serves as a warning about how not to launch a database.

Though the term Big Data has become more common recently, the techniques it covers have been around for years. Since the 1990s, Wal-Mart, the US supermarket chain, has collected vast amounts of data on its customers, which it uses to streamline its supply chain. In 2004, for example, Wal-Mart diverted extra supplies of strawberry Pop-Tarts to its Florida stores as Hurricane Frances approached, because the data showed increased demand for the toasted snack alongside obvious emergency items such as torches.
Find us online: business-technology.co.uk
Follow us on Twitter: @biztechreport
“If data collection operates without attention to privacy, it will be resisted”

In his 2008 book, Super Crunchers, Ian Ayres writes that Visa can predict that a person will divorce, months before the event, based on their credit card purchases. Being aware of the risk helps Visa identify customers who might default. Charles Duhigg’s 2012 book, The Power of Habit, tells the story of how Target, the American retailer, became so good at predicting consumer behaviour based on past purchases that it could identify which female customers were pregnant and estimate their due dates, even if the customers did not want Target to know. Given these kinds of insights, it’s no wonder that more and more organisations are seeking to leverage the data they collect.

Of course, Target offered an unfortunate demonstration of what can go wrong with large data collections in November last year, when thieves stole masses of consumer data from its systems, including 40 million credit card numbers and 70 million addresses and telephone numbers. The Target attack was notable for its scale, but otherwise it was another example of the all-too-routine theft of customer data from the computer systems of retailers and service providers. Sony, Adobe, LinkedIn, email marketing firm Epsilon and many others have been targeted by hackers in the last few years.

But the threat of hacking, though serious, is just one problem for companies that want to store data about employees or customers. As the botched launch of the NHS England database shows, transparency and privacy are also real concerns. Those who are collecting data should be transparent about the information they are collecting and what they use it for. There are good reasons why businesses want to better understand their customers and employees and, likewise, why governments would want to understand their citizens. However, if these efforts appear to be operating in secret – or without adequate attention to protecting privacy – then they will be met with resistance and, ultimately, failure.

Shane Richmond travels the world advising businesses on changing technologies, and was head of technology (editorial) at Telegraph Media Group. Twitter: @shanerichmond
Like us: www.facebook.com/biztechreport
Smarter business models on the increase

BIG DATA is helping companies evolve and create new business models to meet challenges in the marketplace and keep ahead of their competitors. Andy Fuller, head of data analytics at Fujitsu UK & Ireland, says: “What people are starting to focus on is how they generate business value out of this. What we are seeing is that certain sectors are really starting to embrace this.

“One of the cases talked about a lot is around utilities, as they are moving towards smart metering, looking to encourage people to be more eco-friendly in the way that they use
energy. Smart meters will generate huge volumes of very simple data. “If you think of someone like cable companies, the Virgin Medias and others in this world, they [know] what channels you are watching, when you are watching, what programmes you are watching and whether you are watching the adverts or not, which gives them the basis to be able to tailor the services they are offering you. “[Elsewhere], financial services, banks and the like have a bit of work to do to rebuild trust with consumers in many ways – one of the things they will end up doing is offering better services
to people. They will be keen not to be seen to be exploiting people’s data.” As companies use big data more and more, the technologies used to analyse data will develop. Wouter de Bie (right), team lead for data infrastructure at Spotify, says: “If you look at big data, it is a very rapidly evolving part of IT. One of the problems is that there are so many options right now to choose from. “Sometimes it is hard to evaluate all of the options
we have to solve particular problems. A large part of the software is still immature compared to software that has been around for a long time.

“There are trends toward real-time processing or real-time analytics. It is a smarter way of processing data, based on Google’s Dremel [software which analyses information]. That is definitely emerging with technologies like Impala, Apache Drill and Apache Spark – smarter ways of attacking problems.”
Big data generates new analytics jobs By Joanne Frearson
New roles are emerging thanks to big data; right: Gartner’s Frank Buytendijk
Publisher: Bradley Scheffer – info@lyonsdown.co.uk
Editor: Daniel Evans – dan@lyonsdown.co.uk
Production Editor: Dan Geary – d.geary@lyonsdown.co.uk
Reporters: Dave Baxter and Joanne Frearson
Client Manager: Alexis Trinh – alexis@lyonsdown.co.uk
Project Managers: Ben Ruffell, Emmanuel Arthur
Syndication: syndication@theinterviewpeople.com, +49 (0) 8161 80 74 977
NEW JOB opportunities are evolving as more companies rely on big data analytics. Firms are creating new roles, such as chief data officer, to help lead them on how to benefit from big data, while universities are offering courses to guide people interested in a career in this field.

Frank Buytendijk, research VP at Gartner, says: “By the end of next year close to 20 per cent of large organisations will have a chief data officer.”

Buytendijk explains one of the main issues for companies will be how to get value out of big data, creating the right strategy and finding the right people to harvest it. He says: “For customer experience it is really about understanding what makes a customer unique and enabling them to take the right offers. For instance, if you are a travel agent and you say to a customer there are 25 five-star resorts in Greece.

“What big data can do is make you smarter by saying, if I know a couple of things more about you, instead of offering you just 25, I can push the ones based on your history or your preferences – then there is a bigger conversion rate. It is more personalised.

“On the internal operations, you can better predict when machines need maintenance and therefore be in operation longer and better. The third [benefit] is new business models – how can you really exploit the information as an asset in its own right? The fourth is risk – how can we detect fraud, how can we qualify risks?”

Using big data is not without its risks, though. The chief data officer will also be responsible for helping make sure consumer information is not used in a way that could be unsettling to people. “Big data does not come alone,” says Buytendijk. “It requires new technologies, new skills, new strategies, new systems and new deployment methodologies. Are we really in control of what is happening with us, and to us?”

Universities are starting to offer training courses to prepare students for this emerging industry. Patrick J Wolfe, professor of statistics at University College London (UCL), says: “As the data grows
you are also getting more noise and more variability, which is another big challenge. The thing that everyone wants to do is turn their data into information. Most of the challenge is sifting through this data. “We have started to think about how we are going to train the next generation of students to take advantage of these opportunities. The questions for us are, what are the new paradigms for statistical thinking, and how do we teach students to operate effectively in this kind of data environment?” In December, UCL set up the Big Data Institute with Elsevier,
a provider of scientific, technical and medical information products and services. “The things we are interested in is mathematics and statistics. We are trying to look across different flavours of big data problems and trying to find some kind of intellectual framework that is common to all of them,” continues Wolfe. “It is a very exciting time. There has been a huge pull created by technology and the fact we can now measure so many things that we couldn’t before. Everybody is rushing to fill the vacuum – it is going to be a very exciting time in the next three to five years.”
ExpertInsight
Big data will transform insurance
Technologies allow companies to access a multitude of insights
INDUSTRY VIEW
Big data is transforming how businesses are managed. Analysts estimate that spending on big data for risk management will grow by 55 per cent, from $470million in 2014 to $750million in 2016, as analytics tools mature and firms deploy more enterprise-wide solutions.

From an exposure and risk-management perspective, the future promised by big data and cloud computing is right here, right now, and it is set to transform the insurance industry in particular. For an industry at the forefront of the world’s increasing risk profile, understanding exposure and risk is no longer a back-office insurance function – it is a core business process whose output is a key driver in decision-making and resilient portfolio growth.

The year 2011 showed the potential for massive insurance claims, driven by multiple major catastrophes occurring within the space of several months around the globe. Analysing data in real time and leveraging big data analytics as part of a resilient risk-management strategy is no longer a nice-to-have for the insurance industry. It is a must-have. It is a competitive differentiator.

Previously, insurance
companies made do with multiple analytics applications and products from multiple vendors to mine and analyse exposure and risk data. These vendors each supported only small components of end-to-end business operations – resulting in untimely, isolated insights around exposure, potential losses and financial performance.

This situation made it clear that a common operating platform needed to be designed to facilitate and streamline the risk-management needs of an entire industry: a platform where an insurance company’s entire risk profile can be stored and processed by models and applications from multiple vendors, all hosted in one place, enabling users to gain key insights to feed their decision-making processes.

It is by harnessing the value of big data technologies that insurers can thoroughly evaluate the financial risk posed by global disasters such as floods, windstorms, hurricanes, earthquakes and terrorist attacks. The solution needs to allow everyone – from catastrophe-modelling specialists to field underwriters to chief risk officers at companies of any size – to access and consistently communicate insights across a multitude of risk and performance indicators.

To address these market needs, RMS has developed RMS(one)™, an industry-centric initiative that provides the underlying technology and architecture allowing insurance companies to model, interpret and efficiently communicate their view of risk. It is a cloud-based platform that facilitates innovation by leveraging the latest technological offerings in the big data space. RMS(one) is the industry’s first open platform,
so customers can access RMS models and applications in parallel with their own, or ones provided by other companies. Because it is cloud-based, customers may choose how much computational capacity they need and when they need it, ensuring they always have access to the performance and timeliness they require.

This is just the start of the journey for insurance companies as they begin to leverage big data technologies to grow their businesses effectively by accessing a multitude of insights all in one place.

Mark Heslop (inset far left) is senior manager, business solutions at RMS
+44 20 7444 7600
www.rms.com
How data analytics is tackling insurance fraud
Firms would benefit by sharing information
INDUSTRY VIEW
Insurance fraud is a hugely important issue; it undermines the stability of the industry and ultimately limits the ability of the insurance industry to offer consumers fair pricing. The ultimate victim of insurance fraud is the honest policyholder – it is far from a victimless crime. The ABI estimates that the annual cost of undetected claims fraud is in the region of £2.1billion, and that’s leaving out the cost of application fraud entirely. KPMG memorably described fraud as a contagion, spreading through the marketplace like a virus.

The good news is that through intelligent use of data analytics, insurers are capable of detecting and stopping more fraud than ever. One of the fastest growing areas has been analytics of public records information. For example, it is now standard for an insurer to instantly check a customer’s public credit history or the electoral roll at point of application. This can prevent a fraudster from stealing an identity, using an address that they don’t live at, or even inventing a completely new identity. What was on the frontier of insurance innovation a few years ago has become the industry norm.

Similarly in claims, the sheer amount of technology built into modern cars is having a real
impact on motor insurance fraud. The fact that insurers can now access accident information recorded by the car itself offers many opportunities. Such telematics data analytics can become an incredibly useful tool when investigating a possible “cash for crash”. It also helps honest and safe drivers by enabling an insurer to more accurately match the offered premium against the risk of a driver and reward good driving behaviour. As the industry embraces data analytics we are also seeing a greater willingness among insurers to share data with each other through contributory databases. Often the most valuable data to help identify and prevent fraud is actually held by a different insurer. Contributory databases have an important role to play in this regard. For example, the LexisNexis® No Claims Discount module provides insurers with the most effective method of verifying no-claims history, which, of course, is an incredibly important rating factor, underpinning premium discounts of up to 75 per cent. With this kind of discount potential, it’s no surprise that the no-claims history is a popular target for application fraud. LexisNexis research into the issue indicated that up to 15 per cent of consumers feel that adjusting no-claims history is a perfectly acceptable activity. Of course, in the real world it’s not, and the costs of an insurer offering a premium at a price that doesn’t reflect the risk it has been asked to take on are ultimately borne by honest consumers. With contributory databases
such as these, insurers can address this type of fraud before it affects their honest policyholders. There’s a real need for insurance firms to share more information with each other; with data analytics we can combine and utilise all forms of data for better results. Fraud may be a contagion, but collaborating through data analytics can stop the rot.

Bill McCarthy is managing director, LexisNexis Risk Solutions, UK & Ireland
+44 (0)1628 528 620
enquiries-info@lexisnexis.co.uk
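As a rough illustration of the kind of check a contributory database enables – the data, names and interface below are invented for illustration, not the actual LexisNexis NCD module – verifying a declared no-claims history amounts to comparing the declaration against the claim-free years insurers have collectively contributed:

```python
# Sketch of a no-claims discount (NCD) check against a contributory
# database. All identifiers and records here are hypothetical.

ncd_database = {
    # policyholder reference -> claim-free years contributed by insurers
    "AB123456": 5,
    "CD789012": 1,
}

def verify_ncd(customer_ref, declared_years):
    """Compare a declared no-claims history against the shared record.

    Returns (verified, overstatement): overstatement is how many years
    the declaration exceeds the contributed history, or None when there
    is no contributed history to check against.
    """
    on_record = ncd_database.get(customer_ref)
    if on_record is None:
        return (False, None)
    return (declared_years <= on_record, max(0, declared_years - on_record))
```

A declaration of five claim-free years from “CD789012”, who has only one year on record, comes back unverified with a four-year overstatement – exactly the premium-inflating adjustment the LexisNexis research describes.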
Big data: helping insurance firms target serial fraud

INSURANCE fraud is a big challenge for the industry. According to the Insurance Fraud Bureau, undetected general insurance claims fraud totals £2.1billion a year, adding an average £50 to the annual costs individual policyholders face. Big data is being used to combat fraudulent claims. Insurance companies can now analyse patterns of behaviour around a claim, as well as look at social media accounts, to determine whether a claim is real or not.

Jane Tweddle, industry principal for banking and
insurance, SAP UKI, says: “Big data is becoming more and more top-of-mind for insurance. Technology is making it more and more possible for insurers to bring a whole load of data together.”

It is possible to start to see patterns in fraud through big data, explains Tweddle. “There is a lot more organised fraud these days, where you have a group of people and each one is witnessing the others’ apparent accidents. Through big data you can start to see patterns in fraud.

“For example, Mr Smith witnessed Mr Jones’s accident one week, and the next week Mr Jones had an accident and Mr Cliff witnessed it,” says Tweddle. “You can also look at geographical occurrences, where certain types of fraud are happening more regularly. It gives you a
much better ability to detect fraud. Big data and modern technologies allow you to look at this data more in real time. Insurers might decide that someone taking out a policy has a higher propensity to commit fraud.

“It does give insurers a lot more options and opportunity around whether they accept a policy or not, and how they manage claims they can see in real time, which might be fraudulent. It is obviously a direct cost-saving to them – if they can increase the fraud they detect, it is an even bigger saving.”

Big data is also being used by insurance companies to analyse social media. Ravi Kumar, senior vice president and head of the insurance practice at Infosys, says: “The specific types of data they would look for to create this real-time decision-making are linked to trends which negate or substantiate the hypothesis. They would look for activity anomalies of the claimant.

“It could mean chats in your Facebook; it could mean a Twitter conversation you are having about
something you are trying to sell off which could be linked to a claim you make. For example, claiming for a fire, but meanwhile you are actually going online and selling white goods. Quite often, people leave a digital footprint thinking no one is observing them.

“Fraud is one of the biggest challenges in the insurance world. You need enough data to prove it was a fraud. There are a whole lot of challenges around it, but we are getting closer to improving fraud risk measurement through big data.”

Analysing big data from telematics is also lending a helping hand in combating fraud. Tweddle says: “The advent of telematics starts to give a lot of data – not least, it will give information in the event of an accident and potential theft of a vehicle. It will give data about location.

“It will give data about what was happening with the vehicle, speed-wise, and other information. It can give them a lot of data to help them assess whether a claim is realistic or not.”

By using big data to tackle fraud, insurance companies can help reduce that £2.1billion wasted through undetected fraudulent claims. Social media, telematics and information about people’s behaviour, analysed through big data techniques, are helping catch more of those fraudsters.
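The witness pattern Tweddle describes – Mr Smith witnessing Mr Jones’s crash, then Mr Jones witnessing Mr Cliff’s – is, at its core, a graph problem. A minimal sketch (the names follow her example; the data layout is invented, not any insurer’s real schema) surfaces a ring of people witnessing each other’s claims by looking for short cycles in the witness-to-claimant links:

```python
from collections import defaultdict

# Toy claims records: who claimed, and who witnessed it.
claims = [
    {"claimant": "Jones", "witness": "Smith"},
    {"claimant": "Cliff", "witness": "Jones"},
    {"claimant": "Smith", "witness": "Cliff"},
    {"claimant": "Brown", "witness": "Green"},  # an ordinary, unrelated claim
]

def find_suspicious_rings(claims):
    """Flag trios who witness accidents for one another in a cycle."""
    graph = defaultdict(set)  # witness -> people whose claims they witnessed
    for c in claims:
        graph[c["witness"]].add(c["claimant"])
    rings = set()
    for a in list(graph):
        for b in graph[a]:
            for c in graph.get(b, ()):
                if a in graph.get(c, ()):  # the loop closes: a -> b -> c -> a
                    rings.add(frozenset({a, b, c}))
    return rings
```

Here Smith, Jones and Cliff surface as a ring while the Brown/Green claim does not; production systems extend the same idea to longer cycles and to shared addresses, garages and phone numbers.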
How crunching the numbers can reduce insurance costs – at both ends
Data analytics can better prepare customers for catastrophe
ANALYSING big data from catastrophes can help not only insurance companies understand how to process claims from potentially hazardous events, but also help people understand what should be done to protect themselves. Claims data is an important part of working out catastrophe risk. Robert Muir-Wood, chief research officer at RMS, says: “Insurers and reinsurers want to mine the data to understand how they can change how they are doing business.

“How they can change pricing and where they are doing business, or how they structure the business, maybe by offering different financial terms. How they can optimise for them the way in which they can run a profitable business.”
After Hurricane Sandy hit New York in 2012, Muir-Wood explains, there were huge fine art losses in lower Manhattan. Big data was used to assess this. “There were enormous amounts being learnt about how that business needs to be written going forward, how fine art dealers need to change how they store all their paintings,” he says.

The complexity of analysing big data on catastrophes varies from event to event. “If we take an earthquake, you pretty much know where the fault
lines are,” explains Paul Miller, head of international catastrophe management at Aon Benfield. “Whereas a flood can be impacted by things such as buildings or a tree falling into a river. On its own the hazard is not enough – you need to combine that with the loss potential.”

Insurance firms can use this information to help clients. Miller says: “If the industry can work with policyholders to better inform them, that can be used as a better mitigation rather than trying to solve a problem once it has occurred.”
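Miller’s point that the hazard alone is not enough – it must be combined with the loss potential – is the core of catastrophe modelling. As a toy sketch (the event probabilities and damage ratios are invented for illustration, not model output), the two combine into an expected annual loss like this:

```python
# Toy catastrophe-model core: combine hazard (how likely an event is)
# with loss potential (what it would do to the insured exposure).
# Probabilities and damage ratios below are purely illustrative.

events = [
    {"name": "river flood", "annual_prob": 0.01, "damage_ratio": 0.30},
    {"name": "earthquake", "annual_prob": 0.002, "damage_ratio": 0.80},
]

def expected_annual_loss(events, exposure_value):
    """Sum, over modelled events, probability x exposure x damage ratio."""
    return sum(e["annual_prob"] * exposure_value * e["damage_ratio"]
               for e in events)
```

For £1million of exposure, this toy portfolio carries an expected annual loss of £4,600 – the flood, though it does less damage per event, contributes more than the rarer earthquake.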
Why insurers need to realise the benefits of big data
INDUSTRY VIEW

Big data has the potential to provide insurers with substantial benefits. In fact, in a recent survey 82 per cent of underwriters believed that insurers that do not capture the potential of big data will become uncompetitive. Despite this realisation, significant obstacles are impeding insurers’ exploitation of the full amount of data available to them.

Insurance has always been a data-intensive business, but the volume and diversity of data now available can be daunting. A top challenge cited in the survey was finding the correct data for a given analysis. Another challenge identified focused on how to access data underwriters know they need but that is not readily available. To analyse new sources of data, many insurers are diverting resources to the data analytics function, resulting in underwriters and pricing analysts being charged with managing and integrating data.

To take full advantage of big data, insurers need to think more innovatively from two perspectives. First, consider solutions from outside the organisation that integrate relevant data into a single analytics platform to support decision-making. This takes the effort of managing and integrating risk data out from under the underwriters, freeing their time to focus on the business of insurance. Second, insurers should step back and think about how data can be used across the entire customer value chain – “inside-out” to better price and assess risk, and “outside-in” to provide exceptional customer journeys. It is with an eye to the big picture that big data will bring sustainable competitive advantage.

Stephen Williams is lead, business information management innovation at Capgemini Financial Services
insurance@capgemini.com
www.capgemini.com/financialservices

The Big Data Rush: How Data Analytics Can Yield Underwriting Gold.
Survey conducted by Ordnance Survey and Marketforce/ Chartered Insurance Institute/ Chartered Institute of Loss Adjusters, April 2013
Big data analytics can be a huge advantage in figuring out who the next big pop star is going to be… Joanne Frearson reports
VIEW
By Keil Hubert

2014 IS supposed to be the year of big data in corporate IT. We all (supposedly) have massive amounts of captured transaction information in the data centre that can be mined somehow to glean some strategic business insight. It’s a great hypothesis, and there’s a lot to be said for making use of information that you already have.

After last year’s sensational run of stories about how major retailers will know that a customer is pregnant before she herself knows, there’s been a lot of interest in the business world in trying to find new ways to draw conclusions about customers’ wants and needs. That makes sense. I’m all for pursuing better customer service. However…

One word of warning: in your quest to leverage your so-called big data reserves, resist the temptation to reward data initiatives until you can conclusively prove that the results of said initiatives actually realise meaningful, independently provable value for the company. There’s a very good chance that your early forays into big data processing will be rife with mistaken conclusions. Worse, the pursuit of the “big” component of big data can readily lead to inaccurate or misleading data.

In 1976, the American social scientist Donald Campbell coined what’s become known as Campbell’s Law when he wrote: “The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended
THE NEXT time you watch Lady Gaga on YouTube or tweet about One Direction, it’s likely someone will be using big-data analytics to process this information in an attempt to figure out what could be the next big chart hit.

Will Mills, VP music and content at Shazam, says: “There are a lot of data signals out there – from radio airplay data, SoundCloud plays, YouTube plays – and a lot of those are watched
very closely and analysed. The data is crossmatched to see if there are any trends. “Looking at the broader picture, it is something the record labels both big and small are taking more seriously.” He explains they are looking at what songs are working and whether they should be moving their marketing budgets around. Shazam has partnerships with Warner Music, and also plans to work more closely with Sony and Universal to share its big-data findings with them.
“Our charts are watched pretty closely by radio programmers and record labels to see what tracks are trending out of the 30 million-plus songs on our database,” he says. “I think there is a lot more intelligence across the music business in terms of how people are working these data points. “There are a lot of cases where record labels will see tracks trending on Shazam and will put three or four tracks out and see which one gets the most resonance with our users, as well as other
Why Campbell’s Law is still key to avoiding unreliable data
to monitor.” When you reward your employees for acting on any given metric or process, the employees gain a powerful incentive to “game” the metric or process in order to maintain or to increase their expected reward, whether or not the metric or process has any actual relevance to the business or to its success. The pursuit of reward is inherently and inevitably corrupting, whether
people realise that they’re being corrupted by it or not.

In simple human terms, we’re seeing the results of Campbell’s Law in primary education here in the USA. In recent years we’ve become fixated on measuring student, teacher and school performance in terms of standardised exam results. In theory, comparing two schools’ scores on the same set of exams should tell you which of the two is more effective. Logically, the better-performing school should then receive more resources than the poorer-performing school.

That’s proven to be a counterproductive approach. In practice, the obsession with improving one’s data has incentivised teachers and schools to stop teaching anything other than test questions – and has encouraged many to blatantly cheat on the students’ exams in order to boost their schools’ performance data.

For the business owner, this means that the pursuit of big data for decision-making can have undesired negative effects for the business. At each step of the analysis process (classifying, storing, managing, analysing, archiving and reporting of data), you run the risk of rewarding counterproductive behaviour at the level where direct employee action is required. For example, if you decide to reward employees that capture certain customer data (like customers’ addresses
or ages), some employees will feel pressure to plug such data into transactions even when it isn’t available – basically, they’ll make something up in order to maximise their chances of being rewarded for having successfully captured the desired information.

In order to make big data work for you in a safe and responsible manner, you have to set clear and unambiguous standards for data capture, handling, storage and disclosure. You need to know (in general terms, at least) what it is that you’re looking for and (more importantly) what you don’t need. Sequester, discard, or otherwise protect all of the content that isn’t germane to your analysis. Test your hypotheses rigorously, and be sceptical about believing any causal relationships within your data until you’ve been able to dispassionately prove that a causal relationship actually exists.

More important still, be absolutely clear with everyone involved in the capture of information that bogus or adulterated data will sour the entire analysis, and will not be tolerated. For a big data project to yield practical results, the data has to be reliable. Look for every opportunity in the data capture and handling processes where someone might be inadvertently incentivised to bulk up the records with bogus data, and then put verifiable controls in place to prevent data corruption from happening.
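One such verifiable control is a concentration check on each captured field: a value that staff are keying in by default to hit a capture target will dominate the field far more than genuine data would. A minimal sketch (the field name, records and 30 per cent threshold are all illustrative):

```python
from collections import Counter

def flag_suspect_values(records, field, threshold=0.3):
    """Return values that account for more than `threshold` of all
    captured entries in `field` - a common signature of staff plugging
    in a default or invented value just to hit a data-capture target."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return []
    return [v for v, n in Counter(values).items() if n / len(values) > threshold]

# Illustrative capture log: six of ten "customers" share one address.
records = [{"address": "1 High St"}] * 6 + [
    {"address": "2 Oak Ave"}, {"address": "3 Elm Rd"},
    {"address": "4 Mill Ln"}, {"address": "5 New Rd"},
]
```

Running the check on this log flags “1 High St” for a human to investigate; no genuine retail customer base has 60 per cent of accounts at one address.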
services. Sometimes that helps with which one they go with as the single; sometimes they move the release date forward or back depending on how much that track is resonating.

“Emerging acts are using the data in their marketing. For an act with a few thousand fans, just finding out who is interested in your music, connecting with those people using data and working out how you can get bigger than that is very important.”

Artists like Lorde, Mills explains, were resonating on Shazam long before streaming services: “Billboard often reference a song as being in the top 10 on Shazam to say ‘hey, this song is going off’,” he says.

Big data is being used to predict who will be the next big pop star. Social signal analytics at The Next Big Sound can predict a year in advance the likelihood of an artist hitting the Billboard 200 list, and are being used as a measure of whether or not it is worth putting money into a certain artist. Liv Buli, data journalist at The Next Big Sound, says: “Typically, how it has been used by our biggest clients is as a kind of monitoring system – they can look at the impact of the various social promotions they are doing, see how an album is selling or how it is streaming, and what all this activity online around the artist is.”

For example, Buli explains, big data can show an artist with an account on Instagram, Facebook and Twitter that their Facebook following is the largest, their Twitter following is growing the fastest and their Instagram following is more engaged. That can help them decide where they should focus their efforts – they may choose to launch a campaign on Instagram, as that is where they are going to get the most engagement.

She says: “Artists can use the data instead of just shooting in the dark and trying different tactics and not really knowing what the impact of them is. This way they can measure that impact.

“We also do a bunch of data science here.
We will look at things like late-night television shows here in the US and what the impact is for artists performing on them. Which is better for them to perform on? Which will get a bigger bang for their buck if they are a bigger artist or a smaller artist? Which one would be most impactful?” Artists can look at how they are growing and figure out whether they are moving at the rate they want to be. Buli says: “We found that artists whose typical Facebook following would grow from 20,000 to 50,000 page likes would do that in 257 days, and artists that would later reach a million Facebook page likes would do that in 117 days. But artists like Lorde did it in 79 days. “Getting from that small stage where you are an unknown act to really a big mainstream artist where you are making a lot of money off your music, there are plenty of drivers to help you figure out how to get your name out there and how to reach people.” Not all record labels are abandoning traditional methods just yet, though. There should be a
Savvy approach on legal issues essential to optimising value INDUSTRY VIEW
balance between using big data and making gut decisions on bands. Simon Wheeler, head of digital at record label XL/Beggars Banquet, which launched Adele, says: “We are not the heaviest users of data. What we do not want is our staff spending so much time looking at reporting systems and printing out pretty graphs – you do not actually need very complex systems to tell you whether you have success or not. “It is about getting the balance between having enough information to inform us what
Above: Rock band Queens Of The Stone Age are using big data to target specific demographics; below left: Lorde is one artist with an engaged audience on social networks; below: Liv Buli of The Next Big Sound
is happening around our bands. I think where it has helped is the shift from purchasing to consumption and how that works for specific artists. Are there levers we can use to improve the profitability? “One thing we did last year which I think was really interesting was working with Queens Of The Stone Age, who are a pretty well-established band. Their current album is the first we have done with them. One of the things they wanted to do was reconnect with a younger audience. Being able to use the data through streaming partners such as Spotify, together with our other sales data, we were really able to target and work with someone like Spotify to get their music in front of a younger audience and build that awareness up. “It was really important to us and that was a very successful campaign. They had their first number one album in America, it was number two in the UK and top five all around the world. In that sense we were able to specify demographics… it was with marketing messages, with content, with playlists.” Big data is being used to help artists become superstars, whether it is tracking social media campaigns or deciding which song should be released. It is unlikely to be used as the sole mechanism to predict success, though – gut feeling will always be a factor in the A&R industry.
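The follower-growth benchmarks Buli quotes above can be compared directly by converting each into a compound daily growth rate. A quick sketch using the figures from the article (the function name is mine, not The Next Big Sound's):

```python
def daily_growth_rate(start, end, days):
    """Compound daily growth rate implied by growing a following
    from `start` to `end` over `days` days."""
    return (end / start) ** (1 / days) - 1

# Figures quoted by Liv Buli: 20,000 -> 50,000 Facebook page likes in...
typical = daily_growth_rate(20_000, 50_000, 257)  # ...257 days (typical act)
fast = daily_growth_rate(20_000, 50_000, 117)     # ...117 days (future million-likes acts)
lorde = daily_growth_rate(20_000, 50_000, 79)     # ...79 days (Lorde)

print(f"typical: {typical:.2%}/day, fast: {fast:.2%}/day, Lorde: {lorde:.2%}/day")
```

The point of a benchmark like this is that an emerging act's current growth rate can be compared against the trajectories of artists who later broke through.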
Use of big data is potentially pivotal to future business success, but no one can afford to overlook the legal issues which abound. Dealing with these upfront is essential. This is as true for those in the retail, financial services, TMT and other sectors as it is for those selling technology and analysis solutions. Who owns and controls the data is fundamental, and yet it remains unclear whether companies will be able to claim proprietary rights in these vast datasets, which test the boundaries of what is protectable under existing copyright, database rights and trade secrets laws. Without identifying that ownership, and without the ability to protect and exploit the data through clear contractual relationships, the value of the asset being created becomes very hard to realise. If the dataset contains or is derived from personal data, the ability to analyse it, let alone exploit it as an asset, can founder if the due diligence and compliance steps are not taken. The sources of the data, identifying the data controllers, when notices were given or, where required, consents obtained, and whose laws apply are a few of the complexities which need addressing, starting with whether it is or was once personal data at all. With an increasing number of global controls on profiling, and sanctions for non-compliance, dealing with the legal issues sooner rather than later is the better maxim here. Paula Barrett, below, is partner and head of Privacy & Information Law Group, Eversheds LLP +44 (0)20 7919 4634 www.eversheds.com
ExpertInsight
My data is bigger than your data How to tap into powerful predictive capabilities from public information INDUSTRY VIEW
Big data has become a big talking point these days, but there are many different interpretations, contrasting views and conflicting arguments surrounding it. Dan Ariely of Duke University provides what is, in my opinion, a great analogy: “Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it.” At Boston we have been providing HPC (high-performance computing) solutions for more than 20 years. HPC has always involved big data for things like large-scale analysis and huge storage requirements. Boston has provided systems to a range of users from various scientific backgrounds, all of whom needed to generate, store and analyse immense volumes of data. One such customer was CERN, which was generating 15 petabytes of data a year (roughly 250 million MP3 songs). Producing data at this scale creates numerous technological challenges and requires some highly innovative HPC solutions that push the boundaries of science and technology. The SKA Project (Square Kilometre Array) is another venture Boston is currently engaged with. This project aims to analyse approximately one exabyte of data a day (an exabyte is 1,000 petabytes – you can do the math on the number of MP3s this equates to!). These are just two examples of the current scientific challenges in generating and analysing data – but still not necessarily what people mean when they speak about big data. Technology is becoming ever more ubiquitous in our society. This has been the main driving factor in the evolution of big data. Our standard means of communication have been digital for some time (email, text messages) – however, thanks to mobile technology and an increasing number of smart devices (TVs, watches, games consoles, cars, some domestic appliances), more of our daily
lives are being carried out online. This sort of information may seem worthless on an individual scale, but if you start to look at it holistically, as an aggregated data set, there is huge opportunity for extracting information that can have a global impact. Using Google Flu Trends as an example: something as simple as an individual searching for a flu remedy online, when combined with geolocation and time-stamp information for similar search terms across multiple states and countries, can suddenly become hugely impactful. This sort of big data analysis could potentially alert us to, and help stop the spread of, epidemics and pandemics more powerful than the flu. I recently sat in on a talk given by researchers at the University of Warwick, who used big data techniques to identify correlations between internet search terms and fluctuations in financial markets – taking the same approach as Google Flu Trends, but mapping search sentiment to movements in the stock market. The long and short of their findings is that they were able to outperform the basic “buy and hold” approach (where people buy stocks and hold onto them over a period of time) almost 20-fold over a number of years with simple strategies, such as selling when the term “debt” appeared as a top search term. I’ve hugely oversimplified their sophisticated algorithm, but the point is they used big data techniques to take everyday data that was freely available and create value from that information. If we look into the future, the Internet of Things is going to have a further impact on big data. Not only will we have mobile phones generating data, but personal sensors, smart watches, cars, homes, fridges – in fact, anything that you have in the house could potentially be a smart device in the near future.
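A heavily simplified sketch of that kind of search-term strategy – go short for a week when search volume for a term like “debt” spikes above its recent average, go long otherwise – can be written in a few lines. The data here is synthetic and the rule is far cruder than the Warwick researchers' algorithm; it only illustrates the shape of the idea:

```python
def search_signal_returns(volumes, prices, window=3):
    """Each period: if search volume exceeds its average over the
    previous `window` periods, hold a short position for the next
    period; otherwise hold long. Returns (strategy, buy_and_hold)
    as overall growth factors over the same span."""
    strategy = 1.0
    buy_and_hold = prices[-1] / prices[window]
    for t in range(window, len(prices) - 1):
        avg = sum(volumes[t - window:t]) / window
        position = -1 if volumes[t] > avg else 1
        weekly_return = prices[t + 1] / prices[t] - 1
        strategy *= 1 + position * weekly_return
    return strategy, buy_and_hold

# Synthetic example where spikes in "debt" searches precede price falls
volumes = [10, 11, 10, 25, 30, 12, 11, 28, 30, 12]
prices = [100, 101, 102, 103, 95, 90, 92, 94, 88, 84]
print(search_signal_returns(volumes, prices))
```

On this toy series the signal-following strategy grows while buy-and-hold loses money, which is the effect the study reported at much larger scale.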
One can only begin to imagine a world where your fridge texts you about low stocks of drinks, having reviewed your calendar and noticed a big sporting event coming up. The sporting event in question could have athletes wearing sensors and GPS tracking devices, automatically collecting measurements and statistics to feed back to coaching and medical teams. The team that wins in the future may be the team with the best analytical capabilities rather
than just the best team on paper. This has already been proven in baseball by the Oakland Athletics, who employed statistical methods to put together the optimal team on about 30 per cent of the budget of the top teams (check out the book or film Moneyball for all the details). Devices being able to query, analyse and make intelligent decisions will be a big part of the future – and these will all be powered by big data solutions. I have a one-year-old daughter – she already has a larger online presence than I do, thanks to my wife posting pictures of her either in a new outfit or using her dinner as face paint! She will have all this information logged, recorded and available at a moment’s notice. I’ve no idea what I did on this date 10 years ago. In 10 years’ time, however, she will. At this stage, I’m not entirely sure whether this eventuality should fill me with intrigue or fear… or both. This lifelong digital presence will become the norm for our children and our children’s children. They’ll have the ability to access all their data, anytime, anywhere – from pictures posted by parents when they were young to videos of their life
achievements captured via Google Glass, and everything in between. One thing is for certain: big data is a truly exciting space to be in and is going to become increasingly applicable and desirable to companies in the future, regardless of their size or budget. Business leaders are rapidly facing up to the implications of this growing phenomenon. If they have not already done so, they are upgrading their data management approaches and processes in order to meet the challenges. They are changing the way they work and they are retasking people within their organisations to take advantage of the insight and business intelligence that can be generated by these critical numbers. If your business is starting to look at big data solutions or is planning to in the future, Boston and Intel have the tools, optimised server solutions, software platforms and the in-house expertise to ensure your big data project is a success. David Power (inset bottom left) is head of HPC at Boston www.boston.co.uk/hpc david.power@boston.co.uk
Safeguarding your supply chains the big data way By Joanne Frearson
BANKS hold more information on us than we might suspect. Debit cards are tell-tale signs of our spending habits and behaviours, and banks know everything from where we like to eat to what kinds of entertainment we are into. They have a plethora of big data to hand – data which can also be used to help reduce risk to businesses. “I have been told by consulting firms that my data is more interesting than Tesco’s,” says Alan Grogan, chief analytics officer at the Royal Bank of Scotland (RBS), speaking at the Gartner Business Intelligence & Analytics Summit 2014. “I know what all you guys are doing, whether it is through debit cards, credit cards or meetings with my relationship sales team. I know where the economy is going before it almost happens. “If I understand intraday analytics, then I can understand where the UK economy risk is. In the UK, manufacturing is obviously a cornerstone – we are the sixth-biggest manufacturer globally. It is 12.5 per cent of gross domestic product (GDP).” RBS is the largest SME and corporate bank in the UK, and it is building an advanced analytics ecosystem using big data to help customers understand market opportunities and threats as they evolve in the supply chain. Grogan says: “During the meltdown, or prior to the meltdown, there was very little customer analytics at
RBS. It was predominantly focused on lending, credit risk and risk-based analytics. “Obviously, with the advent of the Internet of Things and big data, everyone has been focused on how to utilise that for customers. We are using data today at RBS substantially. We are trying to use that in many interesting ways – and obviously we want to focus on costs or drive revenues. “If we use analytics to drive happy customers, to drive better value and insight, ultimately you are making a decision customer-centric, looking at things such as customer advocacy, customer touch-points and customer emotion, all [with a view to] trying to understand their needs and wants and making an effort to drive business. “We send and receive over £7 billion a day in my systems – we can use that, if we so wish, to understand customer needs and wants. We can use it to drive insight
The 2011 tsunami in Japan had a severe knock-on effect on the global automotive industry
and customer experience and channel risk and anti-money laundering [activities].” At the moment much of the analytics RBS is driving is with governmental agencies – Grogan explains that everything is intertwined and has a dependency on the manufacturing sector. He gives the example of how the horsemeat scandal last year hit the food manufacturing industry, and how problems in the supply chain had an impact on the major supermarkets. He also describes how, after the Japanese tsunami in March 2011, there were problems in car manufacturing because a lot of the semiconductors needed to produce automobiles could not get to the UK and US, resulting in a global drop in car production. “What RBS is doing is building a supply chain for our customers on all the information we have,” Grogan says. “If you are a customer of ours you will know where the risk is. You will know where the political risk is, you will know where your currency risk is and you will start understanding things. Hopefully we can start moving the risk around.” According to Grogan, unless a company understands what its supply chain risks are, it is unlikely to be prepared in the event of a problem with a supplier, and may be unable to provide services to its customers, losing business as a result. He says: “We can help companies – we have all the data on billions of pounds going through the economy and are processing this intraday. We know GDP. We know which suppliers are on shaky ground. We can come in and help you, or say, here are some other suppliers. “Understanding intraday GDP is critical. If we do that the UK would be a much safer place.” Big data is being used to understand risks in the economy, and knowing how we spend money can help avoid potential problems before they happen.
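The "shaky ground" signal Grogan describes can be illustrated with a toy version: flag any supplier whose latest payment volume has collapsed relative to its own recent baseline. The threshold, data and function are illustrative inventions, not RBS's actual method:

```python
def flag_shaky_suppliers(payments, drop_threshold=0.5):
    """Flag suppliers whose most recent per-period payment total has
    fallen below `drop_threshold` of their average over earlier periods.
    `payments` maps supplier name -> list of per-period totals."""
    flagged = []
    for supplier, series in payments.items():
        baseline = sum(series[:-1]) / (len(series) - 1)
        if series[-1] < drop_threshold * baseline:
            flagged.append(supplier)
    return flagged

payments = {
    "Acme Semiconductors": [120, 115, 118, 40],  # sharp drop in inflows
    "Steady Logistics": [80, 82, 79, 81],        # stable
}
print(flag_shaky_suppliers(payments))  # -> ['Acme Semiconductors']
```

A bank processing billions of pounds of payments intraday can run this kind of check across an entire supply chain rather than one supplier at a time.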
Getting the future right with Datastax and Cassandra If you lose an entire data centre, your business is not affected INDUSTRY VIEW
Data is changing how businesses progress and adapt to their customers’ needs. A technology called Apache Cassandra™, distributed by enterprise software provider DataStax, is helping companies use their data to give them the edge. Billy Bosworth, CEO of DataStax (inset), says: “What we are doing today is really starting to revolutionise how companies interact with their customers. You are probably a user of our technology right now – you probably use it several times a day and you just do not know it. We are becoming the secret sauce under the covers that is allowing customers to completely and radically change what is possible with their
customers. How we do that is with a database called Cassandra – an open-source technology that helps our customers overcome the obstacles to innovation. They are being crippled by old relational databases, such as Oracle, that are too complex and too expensive to manage the current influx of data. We do it at a fraction of the cost, and as they revolutionise their businesses they are turning to us.” Bosworth explains that one way Cassandra helps clients is by being always on. “What an always-on database means is we can keep your systems running through incredible outages and disasters,” he explains. “We have this capability called Multi Data Centre – what that means is we are going to spread the distribution of the machines and data so, if you lose a couple of machines or an entire data centre, your business will still operate like nothing happened. “You can distribute and leverage the cloud in conjunction with your physical data centres. Customers have ultimate flexibility in how they want to configure their infrastructure with us, so that you can have an always-on service level agreement no matter what happens.” DataStax started only close to four years ago, and it already has more than
400 customers in 39 countries and is used by 25 per cent of the Fortune 100. “The way they are using us is very mission-critical,” says Bosworth. “It is essential for their customer adoption rates.” Cassandra is used at Netflix to maintain user profile data so that the company knows which entertainment shows its users want. Bosworth says: “They released an incredibly popular Netflix original series called House Of Cards. The way studios would have done such things in the past is that they would have gone out to do a couple of focus studies and listened to 20 people and said ‘I think we have a great idea for a movie – let’s pitch it’. “As Netflix is in an always-on
environment, it is able to capture and understand so much about us as users that they do not have to guess what we are going to like. They can now make accurate, predictive decisions about the kind of programming someone is going to want to watch. When they released House Of Cards, the probability of failure was very low because they were able to know what people liked.” DataStax is the secret sauce for progressive companies these days – enabling them to innovate for their customers in ways they were never able to before. www.datastax.com
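The multi-data-centre replication Bosworth describes is configured in Cassandra at the keyspace level, using its NetworkTopologyStrategy. A minimal sketch in CQL, Cassandra's query language (the keyspace and data centre names here are illustrative placeholders, not DataStax's):

```sql
-- Keep three replicas of every row in each of two data centres, so
-- losing a few machines - or an entire data centre - still leaves
-- every row available to the application.
CREATE KEYSPACE customer_profiles
  WITH replication = {
    'class': 'NetworkTopologyStrategy',
    'dc_london': 3,
    'dc_dublin': 3
  };
```

With replicas spread this way, reads and writes can be served from whichever data centre remains reachable, which is what makes the "always-on" service level possible.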
Matt Smith reports on how big data analysis has transformed the professional football landscape. Broadcasters’ cameras aren’t the only lenses focused on players at Premier League football matches. Behind the scenes, imaging systems capture 10 data points for each player every second – 1.2 million per match overall. “From kick-off, every touch of the ball is captured, including the player involved and pitch location,” explains Paul Neilson, head of performance analysis at Prozone Sports, which provides the technology to all 20 Premier League clubs. This data, along with details of 2,000 to 3,000 “events” per match, is used by teams to improve their fitness and performances, scout upcoming opposition, and find new talent. Neilson says that teams are becoming “more scientific” in their recruitment. “In sport, we are looking at an extremely large data set,” he says. “Most of the Premier League teams will now have ten years of Prozone data, and the bigger clubs are doing some very interesting stuff.” He points to Liverpool, Manchester City and Arsenal as teams that have “really invested” in data and have recruited “key people” to analyse it. Another team at the forefront of big data in football is West Ham United. David Woodfine, head of performance analysis, says the east London club goes through data “with a fine-tooth comb”, examining players’ work rates and checking up on key stats like entries into the final third of the pitch, crosses, shots, and shots on target. Statistics are so quickly available they can be shown on a screen in the dressing room at half time. When preparing a game plan for upcoming opponents, he explains, the club will consider scouts’ reports and video from previous games before matching this up with data. “We will look at the statistical reports we can access and see if there are stats that can back up what we are saying from a statistical point of view,” Woodfine says.
Data can also be used to narrow the search for potential transfer targets, based on factors such as physical size or pass completion rate. Suggestions are then evaluated by sending scouts to watch players’ performances. In a similar way, data analysis can also be used to highlight top performers in the club’s academy and youth teams. “We are fortunate here in that the board have backed us to run analysis down through the age groups,” Woodfine says, explaining that players of certain ages can be judged against benchmarks based on their predecessors’ performances. “The stats can back up and add accountability to decisions the coaching staff make. It gives them a warm fuzzy feeling if they pick a player and when they go to the analysts the stats match.” All the analysis in the world would have no effect if players didn’t take it on board, but Woodfine says that it’s now been around so long that they are used to data-based feedback. “I think it is becoming a lot easier now with players,” he says. “This is a generation of players coming through youth teams and under-21 development squads that have analysts. They have been receiving data about their performances for years. These guys are used to receiving data, and they understand it now.”
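Prozone's headline numbers at the top of the piece are easy to sanity-check (assuming 22 players on the pitch and 90 minutes of play, ignoring stoppage time):

```python
players = 22            # both starting line-ups
points_per_second = 10  # data points captured per player per second
match_seconds = 90 * 60

total_points = players * points_per_second * match_seconds
print(f"{total_points:,} data points per match")  # -> 1,188,000 data points per match
```

That works out just under 1.2 million, matching the figure quoted; added minutes and substitutes' data push the real total higher still.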
Ensure your software architecture determines technological advantage INDUSTRY VIEW
Successful implementation of big data analysis depends on your software architecture. Analysis solutions are included in enterprise software created by the market leaders; however, custom-built solutions don’t necessarily have the architecture required for comprehensive analysis. Analysing the composition of stored data is crucial from both cost and quality perspectives. Firstly, understanding whether the data could be stored in a form better suited to optimised analysis is as important as recognising whether junk data is being stored unnecessarily. Effective forms of storage reduce the requirements for processing power, while poor use of storage space is unproductive and costly. Custom-built solutions benefit
from data being stored in SQL relational databases, blobs and table storage. From the technological view, blobs offer the best way to store large amounts of unstructured text or binary data, such as video, audio and images. A NoSQL datastore, such as Azure Table Storage, has capabilities for applications requiring mass storage of unstructured data. Secondly, identifying previously uncaptured data provides quality information to benefit end users. At Enginee.rs, we prefer Windows Azure HDInsight for producing rich user experiences for big data analysis and visualisation. This 100 per cent Apache Hadoop-based cloud service manages structured and unstructured data of any type and
size, maximising the value of big data. A benefit here is the time HDInsight saves: its agility allows us to deploy and provision a Hadoop cluster in minutes. It also maintains enterprise-class security, scalability and manageability superbly. And thanks to the Hortonworks Data Platform enabling seamless integration with the Microsoft BI tool ecosystem, we can take advantage of Apache Hadoop projects including Apache Pig, Hive and Sqoop. When choosing custom-built solutions, ensure your software determines not only cost and quality factors but also technological advantage, by using the vast data analysis features it is built for. www.enginee.rs software@enginee.rs
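The Hadoop model underlying a service such as HDInsight boils down to a map phase and a reduce phase run across many machines. A minimal, locally runnable sketch of that pattern – counting terms in a pile of unstructured text – looks like this (the data and function names are illustrative):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word seen.
    On a cluster this runs in parallel over chunks of the input."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the counts for each distinct word."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

lines = ["big data big value", "data beats opinion"]
print(reduce_phase(map_phase(lines)))
```

Hadoop's contribution is not the logic – which stays this simple – but distributing it, with fault tolerance, over data too large for any single machine.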
Engaging the digital generation through digital market analytics INDUSTRY VIEW
It’s common knowledge that new communication technologies have changed customer behaviour and expectations. Firms now need to innovate to adapt to this change, mainly among the digital generation, who have become active creators of a digital culture and the prime targets of digital marketing. Customer review and involvement are crucial for brand value. Marketing is not a one-way street anymore. Digital marketing analytics (DMA) helps companies optimise marketing
spend through real-time attribution, and draw actionable insights which take into consideration all existing marketing channels and customer behaviour. So what do businesses have to keep in mind to keep up with competitors? The number one rule in marketing is “know your customer”. Only if you know your customer well can you apply highly personalised and targeted marketing to create a powerful customer experience. Nowadays, everything about the customer is mapped: from their social network preference to their shopping behaviour. As customer behaviour changes, businesses need to revamp their approach to marketing, and have to find ways to deal with the flood of information that
is at their disposal. There is no one right solution to finding what the customer really wants; a combination of different approaches needs to be applied. To help companies develop deep insights into their digital touch-points, Evalueserve has partnered with the start-up Kvantum. Kvantum has developed a platform for text analytics, semantic analysis, machine learning and statistical analysis of high-dimensional, large-scale data. Through this platform the marketing and digital analytics team can monitor the ROI of digital channels in real time, identify and track customer behaviour and enable delivery of targeted messages to the right customer. Multiple retail and CPG companies have
thus been able to reduce overall marketing spend by up to 20 per cent, which amounts to multi-million-dollar savings annually. Since the data volume is high and human language is complex, a purely automated solution might not give conclusive insights; Evalueserve and Kvantum therefore leverage collective intelligence, combining machine intelligence with humans who do the intelligent business thinking, to provide comprehensive customer profiles. Marc Vollenweider (inset, left) is CEO of Evalueserve +44 (0)7459 230960 www.evalueserve.com
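Attribution – deciding which marketing channel gets credit for a conversion – is the core calculation behind the real-time ROI monitoring described above. A toy sketch of one common model, "linear" attribution, which splits credit equally across every channel a customer touched rather than giving it all to the last click (data and function name are illustrative, not Kvantum's platform):

```python
def linear_attribution(journeys):
    """Split one unit of conversion credit equally across every channel
    in each converting customer's journey."""
    credit = {}
    for touches in journeys:
        share = 1 / len(touches)
        for channel in touches:
            credit[channel] = credit.get(channel, 0) + share
    return credit

# Each list is one converting customer's path through channels
journeys = [
    ["social", "email", "search"],
    ["search"],
    ["email", "search"],
]
print(linear_attribution(journeys))
```

Comparing models like this against last-touch attribution is typically how a team discovers that an "assist" channel such as social is worth more than its final-click numbers suggest.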
Our personal data may save our lives INDUSTRY VIEW
We are both cursed and blessed by data. Some fear that the unprecedented power to mine our personal data is a threat to our privacy and interests, because it could help corporations and governments know or infer ever more about our lives. But our data can also help us know ourselves in new ways. It can transform how we think about health and participate in our own care. With this will come, of course, new business opportunities. Healthcare is no stranger to big data and analytics. Large organisations, such as the NHS, have had to struggle with storage of and access to patient records for decades. At Cloudera, we help customers like Cerner use open source software based on Apache Hadoop to qualitatively improve doctors’ access to patient information. Our chief scientist, Jeff Hammerbacher, devotes much of his time to a lab at Mount Sinai that is using these technologies to analyse and understand the genomes of cancer patients in order to personalise their therapies. These are impressive applications. However, the most intriguing new uses of data analytics in healthcare may build on data that is not yet meaningfully collected. Consider that doctors today have only a tiny window on patients’ lives: glimpses gathered from infrequent office visits and self-reported statistics. The mundane facts of our days – what we ate, our heart rate, how far we walked – would be a trove of actionable information. But it is simply never recorded. As with many potential applications of data analytics, the
problem is not access to software, hardware or techniques, but having the right data to query in the first place. We’ve seen a first wave of consumer healthcare applications built around data from sensors that most of us carry already, in our smartphones. For example, Apple’s forthcoming Healthbook app can use an iPhone’s GPS and accelerometer to passively track daily exercise, among other things. A clever app from Azumio uses the smartphone’s camera and flash to measure heart rate from a finger. Wearable technology like Android Wear will undoubtedly add sensors capable of measuring more about the body’s state. Specialist sensor phone accessories to measure, for example, glucose levels are already available, such as iBGStar. New hardware represents a business opportunity in itself. But these devices also gather the vital data that will help consumers understand their health. Acting on it requires analysis, and that too will demand a new generation of start-ups ready to offer analytics as a service. To what extent should, or will, the current healthcare system play a part in this new ecosystem around personal health data? Will consumers be able to share this data easily with a doctor? Will they want to? Will consumers trust start-ups with personal data? It remains to be seen, but it seems inevitable that personal sensor data will transform the business of healthcare. Far from being a threat, our personal data may in fact save our lives. Sean Owen is director, data science at Cloudera 020 3178 4857 www.cloudera.com
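The camera-based heart rate trick mentioned above relies on a simple idea: each heartbeat slightly changes the brightness of a fingertip pressed against the lens, so counting peaks in the brightness signal gives beats per minute. A very rough sketch of that final counting step, on a synthetic signal (real apps filter noise far more carefully):

```python
def heart_rate_bpm(samples, fps):
    """Count local peaks in a brightness signal - one per heartbeat -
    and convert to beats per minute given the camera frame rate."""
    peaks = sum(
        1 for i in range(1, len(samples) - 1)
        if samples[i] > samples[i - 1] and samples[i] > samples[i + 1]
    )
    duration_s = len(samples) / fps
    return peaks * 60 / duration_s

# Synthetic 10-second signal at 4 frames/sec with one beat per second
signal = [0, 1, 0, -1] * 10
print(heart_rate_bpm(signal, fps=4))  # -> 60.0
```

The interesting business, as the column argues, is not this arithmetic but collecting, storing and analysing such signals continuously for millions of people.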
Population analytics: A new world of opportunities Data on people movement that delivers accurate results INDUSTRY VIEW
Few technologies have as much potential to change the way organisations think about their customers and stakeholders as population analytics. Information about how populations move from one point to another is emerging as a powerful tool for strengthening communities, building smarter cities and creating better customer relationships. Detailed knowledge about origins and routes, in particular, can help organisations of many kinds gain new insights into their customers and constituents that are very hard to obtain in other ways. Such knowledge is the foundation of a new innovation called population analytics, which promises to revolutionise everything from city management and planning to retail insight and marketing. INRIX Population Analytics is a
data-driven process for answering questions about whole populations. Its power comes from the ability to understand not just where anonymous groups of people are at a given time, but where they’ve been, how they got there and where they may be going next. Knowing where people start their journeys to a major sporting event can help authorities plan and manage transport arrangements with greater precision. Similarly, a retailer who knows what percentage of customers live in a particular postcode has a clearer view of customer demographics, and can modify strategy accordingly. Population analytics is still in its infancy, but is generating excitement as the potential of its applications becomes clear. The use cases are almost limitless, and organisations in almost every industry sector could benefit from it over time. In the short term, though, it is mobile operators who have a critical role in building its foundations. The benefit of mobile data is that it is readily available from a large and evenly distributed number of sources. This brings several advantages, such as scale; the sheer volume of anonymised data means that
sample sizes correlate statistically with real-world population volumes, ensuring that analysis can deliver accurate results. Many organisations can benefit from the ability of population analytics to deliver new insights: • For mobile operators, population analytics represents an opportunity to deliver a range of services to both existing and new customers. By using INRIX’s analytics platform, operators can distil information and insight from their vast collections of data – a potentially significant
new revenue source, and a valuable addition to their services portfolio. • For public sector organisations, origin and destination analysis gives planners a new way to visualise real-world urban problems and create working solutions based on actual data; not just in road and transport planning but across social and medical strata too. • For large businesses, population flow offers a new way to look at old issues; in particular, how to gain far greater understanding of customer behaviour, and use that understanding to maximise revenue and minimise costs. Projects using INRIX Population Analytics are underway. During the London Olympics, INRIX worked with Transport for London (TfL) to analyse real-time population flow across the capital, enabling TfL to make better-informed transport planning decisions. Other examples include a travel time measurement system used by Connect Plus to manage the M25, and providing essential data for the M25 Congestion Relief Scheme Study undertaken by Jacobs UK Ltd on behalf of the Highways Agency. Graham Bradley is business development director, INRIX +44 (0)20 7012 3505 populationanalytics@inrix.com
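As a rough illustration of the origin-and-destination analysis described above, the sketch below aggregates anonymised trip records by origin postcode district. The trip data, district codes and function names are invented for illustration; INRIX’s actual platform is, of course, far more sophisticated.

```python
from collections import Counter

# Invented, anonymised trip records: (origin postcode district, destination).
trips = [
    ("SW19", "stadium"), ("E14", "stadium"), ("SW19", "stadium"),
    ("N1", "stadium"), ("SW19", "shop"),
]

def origin_shares(trips, destination):
    """For one destination, compute each origin district's share of visitors."""
    origins = Counter(origin for origin, dest in trips if dest == destination)
    total = sum(origins.values())
    return {origin: count / total for origin, count in origins.items()}

# A retailer or event planner could read off where its visitors come from.
shares = origin_shares(trips, "stadium")
```

Scaled up to millions of records, the same aggregation is what lets a retailer see what percentage of its customers live in a given postcode.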
Business World
United States
Twitter big data could be used to track HIV incidence and drug-related behaviours with the aim of detecting and potentially preventing outbreaks, according to a study by California university UCLA. The study, published in peer-reviewed journal Preventive Medicine, suggests it may be possible to predict risky behaviour by monitoring tweets, mapping where they come from and linking them with data on the geographical distribution of HIV cases. Sean Young, assistant professor of family medicine at the David Geffen School of Medicine at UCLA and co-director of the Center for Digital Behavior at UCLA, says: “Ultimately, these methods suggest we can use big data from social media for remote monitoring and surveillance of HIV risk behaviours and potential outbreaks.”
Greece
ExpertInsight
Greek insurer Ydrogios Insurance is using SAS Visual Analytics to strengthen its business operations. The product will enable Ydrogios executives to rapidly collect, analyse and present crucial information such as loss rates and frequency, the number of insured risks, and the number and amount of compensation and premiums. The firm will be able to conduct its own big data analysis, correlation and presentation.
Brazil
The Brazilian big data sector is set to become a billion-dollar market by 2018, according to market research by Frost & Sullivan. Market revenues are expected to increase to $965 million in 2018 from $243.6 million in 2013. The research found the biggest user demand for big data is currently for consulting services on how to use it. Service providers were also working with universities to train more experts while developing user-friendly solutions. Guilherme Campos, information and communication technologies industry analyst at Frost & Sullivan, says: “This will quicken adoption in verticals such as finance, telecommunications, manufacturing and retail, which are already mature enough to implement big data analytics.”
Cisco CTO Ian Foddering tells Joanne Frearson how a childhood obsession with Lego led him to sorting out traffic jams in Nice
“MY INTEREST in technology is down to Lego and a fascination with how stuff comes together,” says Ian Foddering, CTO at Cisco in the UK and Ireland. “From there it really just built up.” After completing an engineering degree, Foddering decided to go into technology as it was an area he says was “beginning to take off”. Foddering reckons we are only at the infancy stage of big data. He says: “The
interesting thing is we are in an environment now where more and more things are being connected and more and more data is being produced. There are huge opportunities going forward.” Cisco’s Internet Business Solutions Group (IBSG) predicts some 25 billion devices will be connected by 2015 and 50 billion by 2020. He says: “This all represents more and more data that is going to be produced. “The challenge is then interpreting that data and getting some real insight from it,
Be the best player on the board
Using data to put a problem into context and send the most appropriate message
INDUSTRY VIEW
Software company Postcode Anywhere uses big data to offer users an unparalleled level of customer service. Data points are collected on the journey clients take as they move through the website, and used to help put customer problems in context, giving them a more personalised service.
Jamie Turner, CTO at Postcode Anywhere, says: “We have been capturing all this data and have slowly built up this pretty cool system that lets us figure out what is going on with our customers.
“If you look at it from a more support-driven angle, the bad stuff is the really important matter, because when it looks like things are going south, that is when you want to try to intervene and put things back on track.”
Postcode Anywhere does this by tracking the 12 different stages a customer goes through when they download software from its website. If a person looks like they are struggling in one of the processes, they will get a gentle message from the support team to help them. According to Turner, the data is used in a gamified way. He says: “If you think of going from one step to the next in the journey, if you imagine those steps as squares on a board, instead of rolling a dice, what drives you from one to the next is an event. It might be that you’re logged onto the website, you have registered, you have copied this code on the page, you have installed it.
“You have all of those different moves around the board and it is all numbered according to whether it’s a good or bad thing. Those are what we call our mood numbers. The events are the dice driving the gameplay. It is like snakes and ladders. If it all goes well then you’ve got positive things happening; there will be a rise in mood.
“Whereas if it goes badly and you hit a snake, the mood goes down. What you then start to see is those who go through it in a short amount of time steadily improve their mood, while for those who struggle we can look at where they were in the game – what page was that? What piece of code was that?”
Postcode Anywhere then uses this data to put problems into context and send customers the most appropriate message for that difficulty. By fixing problems quickly, it makes the process simpler for clients, helps Postcode Anywhere retain customers and frees its resources for other things.
Postcode Anywhere will be speaking about how other companies can use big data on June 18 at Internet World, 11.30am to 12pm in the Big Data meets Big Analytics Theatre at the ExCeL Centre, Docklands, London.
0800 047 0493
postcodeanywhere.com
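The “mood number” mechanic Turner describes can be sketched in a few lines: events move the customer around the board, and each carries a positive or negative weight. The event names and weights below are invented for illustration; they are not Postcode Anywhere’s actual values.

```python
# Illustrative event weights -- invented, not Postcode Anywhere's real numbers.
EVENT_SCORES = {
    "registered": +2,
    "copied_code": +1,
    "installed": +3,
    "page_error": -2,   # hitting a "snake"
    "retry": -1,
}

def track_mood(events):
    """Walk a customer's journey, accumulating a running mood score."""
    mood = 0
    history = []
    for event in events:
        mood += EVENT_SCORES.get(event, 0)
        history.append((event, mood))
    return mood, history

# A smooth journey ends on a high mood; a struggling one shows exactly
# where the support team should step in with a gentle message.
good, _ = track_mood(["registered", "copied_code", "installed"])
bad, trail = track_mood(["registered", "page_error", "retry", "page_error"])
```

The history of (event, mood) pairs is what lets support ask “what page was that? What piece of code was that?” when a customer’s mood slides.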
to allow for some valuable and informative decisions to be made. I think the big challenge in my mind is around the cultural element of helping people understand what value they can get from the data at the moment. “People need to be clear with what they are sharing, what the benefits are and what people are exchanging for. I think that is a key requirement going forward. “A lot of people perceive the pendulum to be in the favour of the organisation collecting the data. I think over time what will happen is that the pendulum will swing back and people will understand the value that they can get from this data, whether it is financial or otherwise, will be hugely beneficial. “At the moment I think people are a little confused about what is happening to their data. I think the challenge is that most people see big data as their loyalty reward programme, and while they get an extra level of discount from their future shopping experience what they are struggling with is what else they can get from this.” According to Foddering there are many ways in which big data can be used to help people in their lives. “That is the exciting bit, there are so many possibilities,” he says. “It is not just going to be the case of taking data from one source to provide a picture, but when you start to bring in multiple different sources that we would not naturally think about, we will start to get a level of insight that will really surprise us.” Cisco is currently using big data to help reduce traffic congestion in the French city of Nice to make it a smarter city. “Around 30 per cent of the traffic
Foddering is using big data to combat traffic jams in Nice
congestion in Nice is typically made up of people looking for parking,” says Foddering. “Being able to help people identify more quickly where there is parking can be achieved through big data technology. It allows for a better environment for all concerned. “The work we are doing there plays right the way through to the street lighting, which again is a huge area of opportunity. At the moment there are still a lot of old-fashioned lighting systems. A number of organisations have moved towards LED lighting, but at the same time putting capabilities into that lighting which actually makes it intelligent. “Capabilities that recognise the environment around to actually provide additional services from that street lighting, such as Wi-Fi connectivity, which allows you to provide additional services and collect
additional data from that environment. “Big data can be used to look at traffic flow at certain times of a day and the week and flows can be optimised accordingly, but also at the same time being able to make real-time decisions based on congestion or accident hot spots. “One of the challenges of today is being able to react in the most appropriate way. If you can start to automate that in a way that allows for almost a seamless flow of traffic into, out of or around a city, that is what a lot of organisations and governments are looking to do with smart cities.” Foddering has gone from building Lego cities as a child to creating smart cities through the use of big data at Cisco.
For real estate investors, the future has finally arrived
Cloud platforms are transforming the industry
INDUSTRY VIEW
Investment funds and trusts manage more than £18 trillion of real estate assets globally. Though smaller than the market for stocks, bonds and derivatives, the industry plays a significant role in the world economy. Yet, despite the size and economic importance of the real estate investment industry, its investors have lagged behind those in traditional equity and debt in terms of data standardisation, information exchange and technology adoption. Data in other investment sectors has been highly commoditised. Investment professionals can log onto one of numerous services, such as Bloomberg, Reuters or Yahoo Finance, and access consistent and reliable data about stocks, bonds and derivatives. In today’s market, no such platforms exist for investors in real estate. The situation is partially explained by
the fact that, even though firms invest globally, real estate continues to operate as a local business. There are significant differences in law and practice between countries and asset classes. Global institutions that invest in real estate need to exchange information with numerous local experts in the regions and sectors where they invest, which is challenging in the absence of global data standards. To date, real estate investors have therefore been forced to adopt a do-it-yourself attitude to collecting and standardising the data they need to manage their portfolios from property managers, appraisers, brokers, lawyers and other disparate sources. Numerous large international investment firms have invested significant sums of money to build proprietary data warehousing solutions that enable this process, which has resulted in a hotchpotch of competing data formats and protocols. However, the emergence of cloud computing is making this challenge easier to overcome. Cloud-based platforms allow information to be easily shared between multiple organisations across the boundaries of corporate networks. Because on-premises software
is not required, the cost and technical complexity of using a cloud-based platform is low, eliminating a significant barrier for smaller investment firms. London-based Voyanta Limited is one of several new software companies taking advantage of this trend. Since launching in January 2013, Voyanta’s cloud-based platform has seen rapid adoption among some of the largest real estate investment managers in Europe and North America. According to Voyanta CEO Raj Singh: “Data exchange has been a problem in real estate for decades, but recent advances in cloud technology, particularly in scalability and security, have made it possible for us to build a truly global information exchange platform that can handle the volume and complexity
of the data involved and manage access securely across a wide range of data providers and consumers.” LaSalle Investment Management, manager of $48 billion of real estate globally, is one of the early adopters seeking to leverage the cloud to improve its access to data. According to Imran Nasir, a director at LaSalle: “Our mission is to deliver consistent above-target performance across our segregated mandates and funds, which requires the key decision-makers to have relevant critical information at their fingertips. We believe that cloud computing will allow us to more quickly harness the information held by our property managers to allow LaSalle to deliver outperformance for our clients.”
Engine of the information economy
INDUSTRY VIEW
Many people ask: “What is big data?” Various definitions have been offered and argued over, but what is not in dispute is the vast amount of information being created and stored in today’s connected world. Data is pouring in from every conceivable direction: from inbound and outbound customer contact points, from mobile media and the web. It is not just the volume of data – it is the unprecedented velocity and the variety of formats. The majority of the world’s data is unstructured – whether it be text, images or video. Perhaps the simplest way of putting it is that the amalgam of all this information is big data. The result is that organisations now have untapped reserves of data which could deliver financial benefits in the emerging information economy. Most of the world’s data has been created in the last few years, and by 2020 the amount of information stored worldwide is expected to be 50 times larger than today. This might at first appear to be a problem for businesses wanting to make sense of all this data. The reality is that data is the new oil that will power the information economy. But, just like oil, it needs to be put through the equivalent of a refinery to become valuable. This can
be achieved through big data analytics, which enables organisations to analyse and extract value from all this crude data. The insight derived gives them the power to know more about their business – not only the things they knew they didn’t know, such as valuable customer information, but even the hidden gems they had no idea existed. Imagine being able to analyse data to determine the root cause of manufacturing failures, or to detect fraudulent behaviour before it affects revenue. Advances in computing power mean big data analytics can deliver answers more quickly – what may have taken days or weeks can now be done in minutes or seconds. Cheaper data
storage and cloud-based services also make it accessible to more organisations. Research by SAS and the Centre for Economics and Business Research, an independent consultancy, estimates that big data could deliver additional revenue of £216 billion to the UK economy over a five-year period – equivalent to around 20 per cent of UK net debt. Only 14 per cent of large companies have adopted big data analytics, but this will more than double by 2017, according to research by SAS and e-skills UK, the employer body for the digital industries. A major barrier to adoption is a shortage of skills. The same research predicts the number of big data specialists demanded by business will grow by
nearly 250 per cent over the five years to 2017, yet three in five businesses already struggle to hire people with these skills. The government has highlighted the issue in both its Information Economy Strategy paper and Strategy for UK Data Capability and acknowledged the need for business and academia to work together with them to nurture the skills needed. Otherwise the UK risks losing out to other economies around the world. There is no time to waste in extracting value from our new big data assets – the new oil that will drive the information economy. Mark Wilkinson is MD, SAS UK & Ireland 01628 486933 sas.com/uk/big-data
TAILORED INSURANCE FOR FEMALE DRIVERS
• REWARDS GOOD DRIVING WITH GENEROUS SAVINGS AND BENEFITS
• TELEMATICS OPTION, WITH NO NIGHT-TIME CURFEWS OR MILEAGE RESTRICTIONS
ALL POLICIES INCLUDE:
• HANDBAG/CONTENTS COVER, INCLUDING MOBILE PHONE
• £100,000 LEGAL COVER
• EMERGENCY HELPLINE
• MEMBERSHIP TO EXCLUSIVE LOYALTY AND DISCOUNT SCHEME SAVING £1,000+ ON HOLIDAYS, HIGH STREET BRANDS, CINEMA TICKETS
CONTACT US NOW
GIRLSDRIVEBETTER.COM
0845 512 0731
Including Big Data Show and Interop London
Define and deliver your digital strategy. 17-19 June, ExCeL, London
Register to attend at internetworld.co.uk/register
@iw_expo #iw2014
Inspector Dogberry
By Matt Smith, web editor

Dogberry loves watching the Oscars in black tie at home to capture the spirit of the glamorous event. In true style, the Inspector even gets a pizza delivered, just like Academy Awards host Ellen DeGeneres.
But long before Alfonso Cuarón won best director for Gravity, Matthew McConaughey picked up the award for best actor for his role in Dallas Buyers Club and 12 Years A Slave scooped best picture, the results had already been predicted by data scientists at Farsite, the advanced analytics division of ICC.
“Most people are surprised to hear that the same sophisticated, predictive modelling we use in industries like retail and healthcare can predict Oscar winners quite accurately,” says Ryan McClarren, chief science officer at ICC. “And while social media buzz may be high this week for Leo DiCaprio, sadly, our data shows he is not going home with a statue on Sunday.”
The model Farsite uses analyses more than 40 years of film industry and Academy Award-related information to forecast probabilities for the winners. This information includes real-time data and an array of variables, including total nominations, other guild nominations and wins, buzz and nominees’ previous winning performances.

Big data is being used to drive Ericsson’s Future TV Anywhere platform, which will bring web technologies to the TV operator world. Dogberry will now be able to enjoy his favourite shows, such as Mr Selfridge and the Oscars, through a seamless viewing experience. By 2020 there will be at least 50 billion connected devices, 15 billion of which will offer video to users. It is Ericsson’s ambition to build an ecosystem where TV will be available on any device, anywhere. Per Borgklint, senior vice president and head of business unit support solutions at Ericsson, says: “Pay TV at web speed is a critical element in Ericsson’s TV and media strategy, enabling fast innovation and implementation so that our customers can win in this highly competitive marketplace.”
Twitter: @dogberryTweets

The government has unveiled £73 million of new funding to help the public and academics unlock the potential of big data. The new investments include funding for the Medical Research Council, the Arts and Humanities Research Council, the Economic and Social Research Council and the Natural Environment Research Council. David Willetts, universities and science minister, said: “Big data is one of the eight great technologies of the future. It has the potential to transform public and private sector organisations, drive research and development, increase productivity and innovation, and enable market-changing products and services.”

Out for the count
Big data and cloud provider Bull has launched a game for iPhone and Android to help boost awareness of big data. SUPERCOMPUTOR is designed to hone mental maths capabilities and will be available on the App Store and Google Play. There are three modes of play – addition, multiplication and dual mode (addition plus multiplication). SUPERCOMPUTOR offers players the chance to become the “TermiMathor” by doing the most calculations to reach the target figure shown on the screen, as fast as possible. Players can progress through five levels, and scores can be compared with friends and other players worldwide. Scores can also be shared on Facebook.

SSP Data Solutions
Advances in technology solutions have led to an explosion of unstructured data that can be utilised in previously undiscovered ways. To help insurers make the most of these opportunities, SSP, the international IT company, has created a business unit focused solely on data solutions to support the insurance industry on this journey.
Find out more at www.ssp-worldwide.com/DataSolutions or, to talk to an expert, call 0800 590 705
Editor’s pick
IBM Big Data Hub
www.ibmbigdatahub.com/blogs
IBM’s Big Data & Analytics Hub offers insight from a range of industry experts on big data and its applications within business. As well as case studies providing examples of real-world use of analytics, there’s also a section full of resources for developers looking to make the most of big data within their companies.
O’Reilly Data
http://strata.oreilly.com
Presented with a clean, easy-to-digest design, O’Reilly’s data blog provides regular updates on issues surrounding analysis, from opinion on the latest developments in policy to thoughts on how big data will integrate with the Internet of Things. There’s also a variety of free reports to download and a newsletter.

TechRepublic Big Data Analytics
www.techrepublic.com/blog/big-data-analytics
Trade publication TechRepublic has a blog dedicated to big data and the advantages it offers businesses, covering areas of interest including the potential of open-source software, big data’s role in security analytics, and the benefits of using presales analysis to stay ahead of competitors.
BigData-Startups
www.bigdata-startups.com
BigData-Startups offers a valuable resource to those working with analytics, providing industry information, infographics, videos, and even a course for its members. Alongside all this is an interesting blog that cuts through the buzzwords to examine the fine details of big data analysis.

Hadoop: The Definitive Guide (£3.10 – Android)
Get into the technical side of big data analysis with this highly rated guide to the open-source data processing framework Hadoop.

Big Data Overload (FREE – Android)
Take a break from all that big data analysis by playing a game about it. Can you successfully extract value from the onslaught of data?
BizTech Zone
The future
After the flood: making the deluge work for you
Feel like you’re drowning in data? Help is at hand
Video Bonus: Steve O’Connor, CIO of CSAA
http://vimeo.com/84232549
Watch Steve O’Connor discuss CSAA’s claims management challenges and how Saama’s innovation helped them harness little data and big data.

Today’s insurance providers face unprecedented volume, variety and velocity of data. This is both powerful and daunting. Many organisations recognise the power of this data, but are overwhelmed by how to use it to make effective business decisions. The analogy we use is that providers are facing a deluge of data. Like water, the data can pour in and quickly flood the business, leaving it unable to stay dry and execute. On the flip side, like a good rain in a drought-stricken area, this big data deluge can be transformational and necessary to sustain and grow a business. The key is to learn how to make the deluge work for you.
Insurance claims management is directly affected by this big data deluge. One of the most pressing issues is the massive amount of customer claims data accumulating in different systems. The result is data silos and no reliable, single source for customer-centric claims data. This makes it extremely difficult to gather leading-edge data and analytics that could help improve claims performance. In this scenario, no one wins.
The CSAA Insurance Group faced this very issue. Through the years it had amassed a significant amount of claims data, but it remained difficult to collect and utilise. The information existed in five separate legacy claims applications, and there was no way to get an enterprise-wide view. For this reason and more, CSAA decided to migrate all claims data to Guidewire ClaimCenter. CSAA turned to data and analytics solutions and services company Saama for guidance through the claims data conversion project. Saama had experience with conversion to both Guidewire and Exigen ClaimCore. Using its One-Two Punch methodology, Saama enabled two key objectives to happen in close succession:
• Conversion of claims data to a single, enterprise-wide system via a data warehouse; and
• Set-up of the claims data warehouse for analytics reporting.
As a result of this approach, CSAA was able to reduce the overall implementation cost by 50 per cent and complete the project in record time. Like CSAA, insurance providers need to harness the deluge to ensure a robust claims management experience. Doing so can have numerous benefits for their business.
+1 408 371 1900
www.saama.com
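The consolidation step at the heart of a project like CSAA’s – folding records from several legacy claims systems into one enterprise-wide view – can be sketched as a field-mapping exercise. The system names, field names and records below are invented for illustration; this shows neither Saama’s methodology nor Guidewire’s actual schema.

```python
# Invented field mappings from two hypothetical legacy systems to one
# canonical warehouse schema.
FIELD_MAPS = {
    "legacy_a": {"clm_no": "claim_id", "cust": "customer", "amt": "amount"},
    "legacy_b": {"claimNumber": "claim_id", "insured": "customer", "paid": "amount"},
}

def to_canonical(system, record):
    """Map one legacy record into the canonical warehouse schema."""
    mapping = FIELD_MAPS[system]
    return {canonical: record[source] for source, canonical in mapping.items()}

# Once every source speaks the same schema, an enterprise-wide view
# (and analytics reporting on top of it) becomes possible.
warehouse = [
    to_canonical("legacy_a", {"clm_no": "A-1", "cust": "Jones", "amt": 1200}),
    to_canonical("legacy_b", {"claimNumber": "B-7", "insured": "Smith", "paid": 450}),
]
```

The real work in such conversions lies in reconciling semantics, units and history across systems, but the pattern – map everything into one schema, then report from the warehouse – is the same.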
“It’s difficult to imagine the power that you’re going to have when so many different sorts of data are available” – Tim Berners-Lee

In focus: How telematics is playing a key role in safer driving
Telematics does not just provide a rich source of big data; it can also play a very real and valuable role in encouraging safer driving. Tough, even draconian, restrictions on young drivers are being considered, but it is the view of many in the insurance industry that these could well prove to be counter-productive. Tough rules and threats are not the way forward for young drivers – the key lies in better education and training. A black box in your car should be seen as your
co-driver, not as some sort of technological Big Brother critically looking over your shoulder and watching your every move. Experience and education lie at the heart of safe driving and, as a young driver by definition will have very limited experience, they can be supported through an online education portal. Many inexperienced drivers will be unaware of the mistakes they are making and will also be unaware of how to improve their driving skills generally. At Girls Drive Better we believe that new drivers
need all the help they can get, and the use of telematics can show them the areas in which they could benefit from improvement, as well as highlighting any more serious errors. Telematics has a key
role to play in this supportive function, and provides insurers with the valuable opportunity to make a very useful contribution to increasing driver safety and knowledge. Finally, a considerable incentive is that drivers may save up to 40 per cent on their premiums by agreeing to have their driving monitored.
Simon Jackson is managing director of Girls Drive Better
0845 512 0731
www.girlsdrivebetter.com
Big data analytics can fuel insurance industry
The insurance industry already holds a vast amount of data, and with evolving technology even more information is being captured from new sources such as GPS-enabled devices, social media postings and CCTV footage. How this information is unlocked and examined by insurers to uncover trends, correlations and other useful information presents a premium opportunity to the industry.
Premiums can be better correlated to risks and, by aggregating the right data correctly, the industry can build a much more detailed picture of the risk on both an individual and trend-led basis.
One of the most interesting developments in this area is the rise of telematics. Open GI’s involvement in this area has allowed it to see how this new technology is shaping data collection for insurers on motor lines policies. The potential of telematics and the data it tracks offers a greater opportunity for the end consumer to receive a lower premium. For the insurer, this technology helps them to understand the risk exposure by providing details not only of the miles travelled, but also the roads used, journey time and driving behaviours.
Big data presents a previously unavailable opportunity for the industry to find new insights, shape business processes and, importantly, offer a better service for the end consumer. The data rush is on, and this year will certainly start to shape how the insurance industry evolves in a highly competitive marketplace.
David Kelly is Open GI’s distribution director
01905 754455
www.opengi.co.uk
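A toy version of the kind of telematics rating described above – miles travelled, journey times and driving behaviour feeding a risk score, with safer scores earning a discount – might look like the following. The weights and the discount formula are invented for illustration; a real insurer’s rating model would be far richer and regulated.

```python
# Invented weights: exposure and risky behaviours push the score up.
def risk_score(miles, night_trips, harsh_brakes, speeding_events):
    score = 0.0
    score += miles * 0.001          # exposure: more miles, more risk
    score += night_trips * 0.5      # night driving weighted more heavily
    score += harsh_brakes * 0.3
    score += speeding_events * 1.0
    return score

def premium_discount(score, base=600.0):
    """Safer scores earn a discount, capped at 40 per cent of the base premium."""
    discount = max(0.0, min(0.4, (10.0 - score) / 25.0))
    return base * (1 - discount), discount
```

The 40 per cent cap here simply echoes the headline saving quoted in the article; everything else is an assumption made to keep the sketch concrete.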
The debate
How can insurers benefit from analytics?
Adam Thilthorpe
Adrian Coupland, head of data strategy, SSP
MD, LexisNexis Risk Solutions, UK & Ireland
Global head of impact forecasting, Aon Benfield
Analysing data has the potential, if done correctly, to provide organisations with the opportunity to increase productivity, become more agile and therefore more competitive. Research by McKinsey & Co perhaps sums it up best – big data is “the next frontier for innovation, competition and productivity”. However, it is not without issue and requires a balancing act. Information is a currency and we need to find a balance which provides benefit for both partners in the information transaction. With individuals growing more wary about sharing data, there is a real opportunity for organisations to turn the issue on its head and try to answer the question: “Why should I share?” This way, individuals can understand how their information is going to be used, and decide if and what they want to share in return for an incentive, such as premium service or discounts. Whatever approach is taken, organisations have a responsibility to use data professionally, ethically and legally.
The volume and variety of data available to insurers is growing exponentially, and the associated costs are reducing. However, data is nothing on its own. Combining data with knowledge, insight and domain expertise will differentiate a business solution from a thought or an idea. Through informed thinking and knowledge, insurers will be able to transform their business into one that is customer-centric, from better risk-selection and pricing through to improving their proposition and service into current and new channels. There is a wealth of data available. The key to success is blending that data and looking at its application in innovative ways to get optimum results. Real value is created by using data in new processes, either singularly or through combining a variety of sources of information, including internally held, third-party and non-traditional data, such as social media and behavioural data. The challenge for insurers is how to combine these various data sources, held on a variety of different systems, and how to use the data at the appropriate business point to obtain maximum business benefit from the data.
Analytics brings clarity to information. At its heart, analytics is a powerful form of brand protection – it enables insurers to better assess, predict and manage risk. This insight facilitates profitable decision-making and cuts losses from fraud. At the same time, analytics is also a source of brand differentiation – it enables insurers to understand more about what a customer needs. Using analytic insight within its processes, an insurer can provide a higher level of customer experience. In an industry challenged by a relative lack of brand differentiation, declining investment returns and increasingly antiquated processes, analytics is a route to improved profitability. The challenge of analytics is not necessarily obtaining access to the information required: often some of the most valuable information for tackling fraud is already within an insurer’s systems. Rather, the challenge is having the skills, capability and capacity to turn that information into something that can make an insurer more profitable. That’s where LexisNexis comes in.
Headlines around the globe, from floods in the UK to tornadoes in the US, all trigger one key question: how can the insurance industry better manage catastrophic events in the future? This is when analytical experts and tools step into the limelight. Our in-house experts – ranging from seismologists to climatologists – work with global academia to better understand the frequency and magnitude of natural catastrophes. This is the first step in obtaining a stronger grasp of how hazards behave and the extent of the damage they cause. Next, an insurer needs to relate this hazard information to how it would impact the properties in its portfolio. We develop catastrophe models that combine this hazard and portfolio data to estimate the financial impact of a potential event. This analytical process in turn enables insurers to manage their capital effectively, buy the appropriate reinsurance cover and ultimately pay claims. Insurers can transform the knowledge gleaned from analytics to meet customer, shareholder and regulatory requirements, while preparing their businesses for the hazard events of the future.
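The core calculation in a catastrophe model – combining hazard data (how often events occur and how severe they are) with portfolio data (what is insured and how exposed it is) to estimate financial impact – can be illustrated with a deliberately simplified expected-loss sketch. The event rates, damage ratios and property values below are invented, and a production model would use full event catalogues and vulnerability curves rather than two numbers per event.

```python
# Simplified sketch of the hazard-plus-portfolio calculation described
# above: expected annual loss for a portfolio under a set of modelled
# events. All figures are invented for illustration.

events = [
    # (annual rate of occurrence, damage ratio applied to exposed value)
    (0.01, 0.50),   # rare, severe flood
    (0.10, 0.05),   # frequent, minor flood
]
portfolio = [
    # (insured value, fraction of that value exposed to the hazard)
    (200_000, 1.0),
    (350_000, 0.4),
]

def expected_annual_loss(events, portfolio):
    """Sum, over modelled events, rate x damage x exposed value."""
    exposed = sum(value * exposure for value, exposure in portfolio)
    return sum(rate * damage * exposed for rate, damage in events)

eal = expected_annual_loss(events, portfolio)  # 3400.0 on these inputs
```

Even this toy version shows why the two data sets must be combined: the same hazard produces a very different expected loss depending on how much of the portfolio's value is exposed to it.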
Director of professionalism, BCS
+44 (0)1793 417755 bcs.org/bigdata

SSP
0800 590 705 www.ssp-worldwide.com

Bill McCarthy
LexisNexis
+44 (0)1628 528620 enquiries-info@lexisnexis.co.uk

Adam Podlaha
Aon Benfield
+44 (0)20 7522 3820 Adam.Podlaha@aonbenfield.com

Mark Mullin
Product marketing manager, EMEA, Guidewire Software
The key question insurers should ask themselves is: is my enterprise data ready to take advantage of big data tools and techniques? One common challenge is that data in legacy systems often lacks integrity or is incomplete. Legacy applications are often organised by line of business or functional area, and have proliferated over time through historical needs, mergers and acquisitions, resulting in a lack of consistency. In Guidewire’s experience, the result is that stakeholders do not trust what the data tells them. Data integration via a master data hub is becoming more common, and can allow the insurer to share data across legacy and strategic systems, gaining a consolidated view of the enterprise. This data hub can secure data integrity and prepare a robust data infrastructure through methodologies and tools which gather, analyse, test and validate data throughout the organisation. This means that data is continually integrated and fully tested, and data consumers can use it with confidence.
+44 (0)20 7042 7779 www.guidewire.com

Nathalie Davies
Event manager, Internet World
All businesses with any interest in customer insight will glean useful information from analytics, but insurance, as an industry that relies on intelligence at the core of what it does, benefits more than most. Big data analytics will enable the predictive models that power risk analysis to become more sophisticated, allowing insurers to tailor their cover and quotations to ever more individual levels. With access to an increasing amount of information about the level of risk they are exposed to, insurers will be able to adjust their pricing models according to individual circumstances, which will naturally lead to more competition on pricing for the clients presenting lesser risk. Of course, the opposite is also true, and prices could increase for the riskier segments of the market. Access to real-time information about natural events such as flooding could also enable insurers to plan their response handling in a more efficient and customer-focused manner, for example, boosting call centre numbers following heavy rainfall in certain areas. The benefits of data analytics are seemingly endless but, for me, the possibility of real-time responses to external occurrences is the most exciting one.
+44 (0)20 7921 8412 www.internetworld.co.uk

Paul Bermingham
Executive director of claims, Xchanging
Insurance is the definitive big data industry, generating 2.4 billion gigabytes of data each day. It is an industry built upon understanding data to analyse, track and predict future behaviours. However, the first step in using this wealth of data is to identify how to use it and agree a desired outcome or output: what do you want to achieve? Data analytics can make a huge difference to an insurance business’s bottom line, both through its ability to better understand customers’ needs – offering them improved rates, cost savings and service, and so leading to greater client retention – and through its ability to identify trends and reduce fraudulent claims. By investing in analytics technology, members of the insurance industry have the opportunity to differentiate themselves from their competitors. However, while investing in cutting-edge software to manage complex data sets, it is equally important to invest in the people who will be using these tools, to ensure they do so effectively.
+44 (0)20 3604 3097 paul.bermingham@xchanging.com

David Turner
UK & Ireland country manager, Kapow Software
I think the majority of industries – not just insurers – can benefit from analytics, given the explosion of online data that is being created by both humans and devices. Analytics is the discovery and communication of meaningful patterns in data; consequently, it will only ever be as good as the data that is provided for analysis. At Kapow Software, we have unique technology that allows organisations to easily insert and extract data from websites, portals, intranets, social media, forums, blogs and applications, providing much more context to the data organisations rely on to make business decisions. We effectively allow you to turn the internet into your own database of information. Not only do we help companies such as Zurich, Audi, Intel, HSBC and DHL with their data requirements; many government agencies around the world also use our unique technology.
+44 (0)7427 564 776 DTurner@kapowsoftware.com
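The kind of web data extraction described in the Kapow quote above – pulling structured records out of web pages so they can be used like a database – can be sketched generically with nothing but the standard library. To be clear, this is not Kapow's product or API, just an illustrative parser over an invented HTML fragment.

```python
# Generic sketch of turning semi-structured web content into records.
# This is NOT Kapow Software's API; it is a minimal stdlib example of
# the underlying idea, run on an invented HTML fragment.

from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Collect the text of every element marked class="price"."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())
            self._in_price = False

html = '<ul><li class="price">£420</li><li class="price">£365</li></ul>'
parser = PriceExtractor()
parser.feed(html)
# parser.prices is now ["£420", "£365"]
```

A production extraction tool adds what this sketch lacks – fetching pages, handling logins, navigating portals and coping with layout changes – but the output is the same in spirit: web content reduced to rows an analyst can query.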