Computer Weekly


24-30 May 2011 | computerweekly.com

Tesco checks out retail IT
IT director Mike McNamara talks to Computer Weekly about technology as a key driver for growth (page 4)

Money in outsourcing
Banks are leading by example in IT outsourcing (page 6)

Buyer’s guide: Cloud networking
The practicalities of optimising an Ethernet network for virtualisation and cloud computing (page 10)


Highlights from the week online

most popular
1. Sony swamped by password reset requests as PSN restored
computerweekly.com/246655.htm
2. Future of NHS NPfIT in critical condition following NAO report
computerweekly.com/246684.htm
3. The Guardian restructures IT department
4. Nokia ditches Ovi name in services rebranding exercise
5. George Osborne launches new research centre in London
6. Will Android dominance increase IT security threats?
7. BT recruits apprentices for IT services division
8. What does Microsoft’s Skype buy mean for businesses?
9. Research in Motion recalls 1,000 faulty Playbook tablets
10. CIO interview: Mike McNamara, Tesco
computerweekly.com/246679.htm

whitepapers
> CW+: Computer Weekly Buyer’s Guide to Social Media
This six-page guide explains why CIOs and senior IT professionals have an opportunity to rethink core business processes and examine how to use social computing to create advantages for their business.

> CW+: CIO briefing: How compliance in financial services is promoting IT innovation
CIOs are using regulation to create IT innovation in the financial services sector, the majority of which dedicate more than a third of their annual IT change budget to regulation-specific IT.

> CW+: Harvey Nash CIO Survey
The Harvey Nash/PA Consulting CIO survey 2011 shows the recession has fundamentally changed the IT function as CIOs look to a multisourced, flexible future based around mobile devices.
computerweekly.com/246677.htm

> Upgrade Ethernet to fabric for cloud computing
In a network it is often a good idea to know where server and storage are located. This way, the network path can be optimised to minimise the number of hops between the various server and storage components. A more direct path between server and storage components in the datacentre leads to better performance.
www.computerweekly.com/246686.htm

Get the latest IT news via RSS feed: computerweekly.com/RSSFeeds.htm

opinion
> Personal device momentum will challenge mobile sourcing strategies
Enterprises are embracing mobility at an unprecedented pace, offering employees new ways and more options for how, when and where they do their jobs, writes Brownlee Thomas, principal analyst, Forrester Research.
www.computerweekly.com/246705.htm

case study
> Backing up Avon’s IT infrastructure
When you’re a major international brand, with a market stretching over continents, you need to know that your company’s IT infrastructure is well supported and backed up. For Richard Boyles, manager of disaster recovery at international beauty company Avon, ICM Continuity Services provides that reassurance.
www.computerweekly.com/246413.htm

> CW+: Fujitsu: Computer Weekly Special Report
This nine-page report analyses the challenges facing Fujitsu, its financial performance, the services it offers, its place in the IT market and its future strategy. Packed with graphs and diagrams, the report is essential reading for any organisation working with, or thinking of working with, Fujitsu.
computerweekly.com/246630.htm

> Choose the right cloud service provider to avoid network problems
Totally contained cloud services show how full-service functions held in the cloud can facilitate a business’s processes without costly hardware.
www.computerweekly.com/246567.htm

> Inventing the future of IT
The school of electronics and computer science at the University of Southampton is where optical fibre technology was first developed, a field in which it is still a world leader.
www.computerweekly.com/246309.htm

blogs
> Video review: Palm Pre 2 – too little too late or the start of something?
It’s been a couple of years since the Palm Pre was released. Since then HP has bought Palm, announced it would not release any more phones, and then released more phones after all. So the biggest release of the current crop is the Palm Pre 2. Is the familiar boring, or an old friend?
http://bit.ly/iJaHA7

> Capgemini UK chairman Christine Hodgson on winning Woman of the Year
Christine Hodgson, chairman at Capgemini UK, talks to WITsend about winning the ‘Woman of the Year’ award at the CWT Everywoman in Technology Awards 2011, changing career paths and surviving a business destroyed by fire.
http://bit.ly/j0N6cU

> Coalition announces pledge to halve carbon emissions by 2025 – what role will green IT play?
Now that the Coalition has announced its pledge to halve carbon emissions by 2025, the next question is how it is going to be achieved – Messrs Osborne, Cable and Hammond were reportedly against the plans for fear of affecting the economy. The devil, as they say, is always in the detail.
http://bit.ly/iQhrQt

> Mark Ballard: NHS IT system condemned
In a jaw-dropping condemnation of the NHS National Programme for IT, the National Audit Office has exposed a white elephant in the final stages of collapse.
http://bit.ly/jUT6PE


the week in IT

careers
Cabinet Office names Mike Bracken as director of digital
The Guardian’s former head of digital Mike Bracken has been appointed the government’s director of digital at the Cabinet Office. He will take up the Whitehall position in July. The post of director of digital was advertised earlier this year with a salary of £141,000 per annum.
computerweekly.com/246733.htm

desktop
UK PC industry decline is ‘worst for decade’
The continued decline of the UK PC market is the worst the industry has experienced for a decade, according to Gartner. The latest figures from the research firm show UK PC shipments reached 2.7 million units in the first quarter of 2011, a decline of 18% compared with the previous year.
computerweekly.com/246703.htm

[Chart: Declining PC shipments – 1Q11 and 1Q10 UK shipments by supplier (HP, Dell, Acer, Toshiba, Apple). Source: Gartner]
computerweekly.com/246703.htm

mobility
Orange and Barclaycard launch contactless UK mobile payments
Orange and Barclaycard have introduced the first near-field communication (NFC) mobile payment system in the UK. The “Quick Tap” payment system can be used via Orange’s NFC-enabled Samsung Tocco Lite handset. This allows customers to pay for items under the value of £15.
computerweekly.com/246731.htm

enterprise
HP corporate products and services take off
HP’s latest financial results show that corporate users are still buying enterprise IT products and services. HP’s revenue increased 1% for its second fiscal quarter ended 30 April 2011, compared with the same period the year before. HP says its Commercial Client revenue grew 13% and shipments of multi-function printers grew 60% year-on-year.
computerweekly.com/246695.htm

security
US security firm uncovers SCADA threats to industrial operations
A US security research organisation says it has discovered methods hackers could use to sabotage power plants, oil refineries or manufacturing operations. “This is a global problem. There are no fixes to this right now and [it could] cause real environmental and physical problems and possibly loss of life,” said Rick Moy, chief executive at NSS Labs.
computerweekly.com/246727.htm

public sector
Focus on improving public sector IT
Government CIO Joe Harley and Efficiency and Reform Group chief operating officer Ian Watmore have outlined a roadmap to improve public sector IT. Government attempts to implement large-scale IT systems are doomed to failure in almost every situation, Watmore told a Public Accounts Committee last week. Designing with the customer in mind is key, as is working on smaller projects, or “chunking” larger ones into smaller components, he said. “Stop talking about ‘IT disasters’ as it makes [the] problem worse; [failures] are nearly always due to business challenges, not IT,” he added. Harley said that keeping senior responsible owners (SROs) at the helm of projects throughout their life was crucial to future success. He also reiterated the need to create further cost savings: “We have to make inroads in the cost of IT in government, including progress on datacentre consolidation.”
computerweekly.com/246683.htm

Government outlines plans for identity assurance services
The government has outlined plans for identity assurance services to be used across all online public sector services. The intention is to create a market of private sector identity assurance services, allowing the public to choose the provider of their choice to prove their identity when accessing public services, said Cabinet Office minister Francis Maude.
computerweekly.com/246723.htm

financial results
Dell posts 177% profit boost after increasing enterprise sales
Dell has boosted profits by 177% since last year after increasing sales to enterprise customers. In its Q1 fiscal 2012 results, Dell’s profits were up 177% to $945m, compared with profits of $341m in the first quarter of its fiscal year 2011. During the quarter, sales of desktops and laptops rose by 7% and enterprise services sales increased by 5% to $4.4bn.
computerweekly.com/246691.htm

strategy
CIOs must move beyond keeping the lights on
A global IBM survey of 3,000 CIOs has found most are now confident they have aligned IT with the strategy of their CEO. In the survey, 58% of CIOs said they have integrated technology with the business to help them innovate, while 68% said senior management views technology as key to the success of the business.
computerweekly.com/246667.htm

mobility
Nokia’s sales hit 14-year low as worldwide smartphone sales soar
Nokia’s worldwide market share of mobile devices dropped to the lowest level since 1997, according to research from Gartner. The telecoms giant sold 107.6 million devices in the first quarter of 2011, a decline of 5.5% from last year. Nokia is expected to aggressively lower average selling prices to maintain shipments of Symbian devices while waiting for its first Windows Phone 7 devices to reach the market, says the analyst firm.
computerweekly.com/246709.htm

software
“More than 40,000 new companies have decided to run their businesses better with SAP in the past year”
SAP co-CEO Jim Hagemann Snabe at the Sapphire Now conference
computerweekly.com/246682.htm

public sector
Lancashire council signs £400m outsourcing joint venture with BT
Lancashire County Council has finally agreed a £400m joint venture deal with BT Global Services, over six months after announcing the firm had been chosen as preferred supplier. The 10-year contract covers IT services for the local authority and its schools, along with IT supporting back-office functions such as HR, payroll and procurement, plus a customer service centre.
computerweekly.com/246663.htm


CIO interview | IT management

Tesco checks out the latest retail IT
IT director Mike McNamara tells Angelica Mari about the retail giant’s focus on technology as a key driver for growth

Having a boss who “gets” technology is the dream of every chief information officer, especially when the employer is the third largest retailer in the world by revenue. This is good news for Tesco’s new CIO, Mike McNamara, as the company focuses on delivering improved web solutions and IT supporting new markets and international expansion.

McNamara, an IT veteran who helped set up Tesco.com in the late 1990s, was recently elevated to the top post in retail technology after previous incumbent Philip Clarke took over as chief executive in March. According to the CIO, who is also responsible for operations development at the company, being part of an executive board with extensive experience in IT is an advantage.

“Philip Clarke is, above all, a fantastic retailer more than an IT guy. But I think it is great that we have someone at the top of the organisation who really understands technology and what it can do,” McNamara told Computer Weekly. “He is one of the few FTSE100 top executives who has solid experience in IT and this is clearly a great thing for us, given the huge role that technology plays in our business.”

Former CIO Clarke – voted the “most influential person in UK IT” in Computer Weekly’s UKtech50 2010 – has reached the very top of the organisation and inspired many IT leaders worldwide. But that does not mean technology-savvy staff get special treatment when it comes to succession planning across different business areas.

“We move senior people about in Tesco quite a lot. Many of my direct reports have worked in other parts of the business, in roles such as store managers,” said McNamara. “That’s normal to the way we operate and that approach applies for people in all areas: marketing, operations staff and IT.”

Increased web focus
Earlier this month, Clarke outlined key points of his vision for Tesco. One of the firm’s strategic targets – to be “an international retailer” – was updated to “be an outstanding international retailer in stores and online”. McNamara says a key objective for the company is to further develop Tesco’s online operations, which have an annual turnover of about £1.5bn in the UK alone, serving 400,000 customers every week.

“The change of gear in the strategy that Philip has brought about means far more emphasis on online. We already have a very strong presence on the internet in the UK and have successfully exported some of that food internet business to the Republic of Ireland and South Korea. In the next 12 months, that will spread out to other countries in which we operate,” said McNamara. “We have already recognised that online is very important and we have very solid plans to bring that into our international businesses. Likewise, our general merchandising and clothing online is doing well and we are looking at how to bring that overseas too.”

Mobile developments
According to McNamara, the rise in use of mobile devices – particularly smartphones and tablets – has prompted a pronounced shift in how Tesco sells its products online and in-store, so becoming a multi-channel retailer is another key goal. The company has emphasised mobility in recent years and already offers tools that include an iPhone grocery app for customers, developed by its research and development team. McNamara said there will be further developments in that area in due course.

“We see more and more consumers checking prices online, particularly on big ticket items. As people go into our stores, you see them checking reviews, our prices and the prices of our competitors on their mobile phones,” said McNamara. “There is now total price transparency and social reviews are accessible to customers in our stores. We see that happening and we want to help those processes, not hinder them.”

To that end, Tesco will extend its customer-facing Wi-Fi across the store network in the next 12 to 24 months, while kiosks will also be introduced to help customers shop in-store, online or on the move.

McNamara says the main difference between Tesco and its competitors, when it comes to mobile and web customer-facing offerings, is Tesco’s dogged determination to make that area of the business succeed over a long period of time.

“We took a very practical, customer-first approach and tended to do things that work for customers, ensuring products online cost the same as they would in-store, getting stock availability right, as well as on-time delivery,” said McNamara.

“We didn’t start off with a huge master plan and a massive warehouse doing all sorts of whizzy things. It was a very down-to-earth and incremental approach, and we had customers along the whole journey,” he said. “That has proved to be very successful – we have the biggest internet grocery business in the world and still the only profitable one.”

Major milestones ahead
Over the next 12 months, McNamara will be leading a range of IT projects, but three major undertakings are top of his to-do list. One is completing the work of supporting Tesco Bank. Tesco’s total capital spend in the UK for 2010 was £1.7bn, with an additional £200m spent in the banking business, mainly for re-platforming systems. According to the IT chief, his team is halfway through the migration of former partner Royal Bank of Scotland’s systems onto Tesco’s set-up.

Tesco will also be launching more websites in the next 12 months as part of a major initiative to improve the way it sells general merchandising and clothing online.

Another strand of work is around bringing the Tesco operating model to its international operations. To that end, there will be a huge amount of infrastructure to be rolled out, says McNamara, in relation to items such as network servers and databases. “We also have a lot of software to write and to integrate, as well as a lot of people to train,” he said.

About 80-90% of Tesco’s infrastructure is managed in-house. The company has a highly virtualised set-up and has spent the past couple of years building what McNamara describes as an internal cloud. “Tesco has a fairly good and substantial foundation in terms of virtual infrastructure, which is very easy to expand,” he said. “I can see in the years ahead – not in the next year or two, though – that some of that infrastructure will be outsourced and we will start using some public cloud services.” But he stresses that the possibility of handing over some of Tesco’s infrastructure to a third-party supplier does not mean any negotiations are currently taking place.

Indian IT centre growing
Tesco needs all the IT skills it can get to deliver its ambitious project agenda. According to McNamara, one way the company has found to meet its knowledge needs is by drawing on expertise from suppliers. “We work with many software and hardware suppliers on research and development projects and we have reasonably close links with most major companies,” he said.

Another key pillar of Tesco’s IT strategy – which also gives the firm access to a vast skills pool – is the Hindustan Service Centre (HSC), a Bangalore-based captive facility that provides IT and business services.

“[HSC] is a fantastic capability for us. From an IT point of view, it gives us all kinds of skills to build things, which can be used not just in the UK, but in other countries such as South Korea and China,” said McNamara.

When Tesco acquired its banking subsidiary from partner Royal Bank of Scotland for £950m in 2008, it was expected the retailer would use HSC as a key resource for the core technology work supporting Tesco Bank. “We built the new bank infrastructure in-house. Physically, that infrastructure is sitting in the UK, but it is managed and operated from the HSC in India, so we delivered on that promise,” said McNamara.

The CIO also mentioned that HSC is involved not only in IT work, but in some back-office tasks, as well as innovation. “The Bangalore site brings a huge amount of innovation to Tesco. They get a unique view of the world, as they look at our operations globally and not just one country in particular,” he said.

Tesco already employs thousands of Indian staff at HSC and expects to continue hiring hundreds more every year to keep up with demand. McNamara says the retailer will also be looking to hire more IT staff in the UK this year. ■

“Online [business] is very important”
Mike McNamara, Tesco

more online
CIO interview: Philip Langsdale, BAA
computerweekly.com/246531.htm
CIO interview: Trevor Didcock, Easyjet
computerweekly.com/246126.htm
Retail IT: A Computer Weekly Buyer’s Guide
computerweekly.com/246058.htm
CIO interview: Marcus East, Comic Relief
computerweekly.com/245937.htm
CIO interview: Tony McAlister, Betfair
computerweekly.com/245816.htm


news analysis | outsourcing

Banks create IT outsourcing roadmap
Banks are setting an example for others to follow as they consolidate suppliers and share strategies, writes Karl Flinders



Stability is returning after the recent economic meltdown and banks are leading the way with a new approach to IT outsourcing which could become a roadmap for other sectors to follow.

The large banks, which usually play a pioneering role when it comes to buying and using IT services, were blamed for the severity of the economic slump the world is only just emerging from. But the banks could drive the wider economy into a new way of structuring outsourcing relationships that could benefit businesses and their suppliers across the entire outsourcing sector.

According to Infosys head of Europe BG Srinivas, banks are not only leading the way in investing in IT outsourcing but are changing how they work with outsourcers. Infosys counts some of the world’s biggest banks as customers. These new models for outsourcing relationships, which are seeing supplier portfolios consolidated and strategies shared between businesses and their IT service providers, are already filtering into other sectors as the banks pioneer them.

Srinivas says Infosys is sharing strategic plans with banks and, in conjunction with this, the banks are reducing the number of partners they work with. The sharing of strategic plans helps the supplier prepare for the long term and respond more quickly to customer demands, while the reduction of suppliers lowers procurement costs and reduces risk. “Banks are always the pioneers in outsourcing but companies in other sectors are also doing it,” said Srinivas.

Jean-Louis Bravard, director at sourcing broker Burnt-Oak Partners, was previously head of the global financial services business of EDS. He agrees with Srinivas that banks are revisiting their outsourcing relationships after a turbulent few years. He says banks are the biggest spenders on IT outsourcing as a percentage of revenue, and he believes other sectors will follow their example because banks’ requirements are not unique.

“Before the economic downturn the banks were just focused on revenue generation. Then after 2008 they were just focused on survival and keeping water off the ship. In the middle of last year they started to try to be efficient,” he said. “A lot of banks are trying to reduce the number of suppliers they work with and build relationships that are not one-sided, where suppliers share the pain and the gain.”

Mark Lewis, partner and head of outsourcing at law firm Berwin Leighton Paisner, says banks are currently taking a hard look at their outsourcing strategies with plans for changing them. “Barclays, for example, is open about this and is currently revisiting its longer-term contracts. Part of this includes looking at having longer-term relationships.” But he says this trend is only being seen at more “settled” banks, because other banks have other major challenges as the recovery approaches.

Lewis says openness is very good for outsourcing relationships but there must be parameters: “What is the sense in not sharing information with suppliers and not being open?” He says regulated information cannot be shared, but there is no reason why information about the direction of the business should not be.

John Worthy, technology partner at law firm Field Fisher Waterhouse, says outsourcing supplier consolidation is happening across sectors. “The benefit of that is it means the customer can better manage the relationship and put the right effort into it.” He adds that suppliers will also be able to see that they have a strong and valuable relationship to commit to.

But reducing supplier numbers reduces choice and options if one partner has problems. Back-up plans are important, as the near collapse of Indian supplier Satyam after a massive fraud in 2009 proved. Bravard says banks can overcome this by having a small set of suppliers working in particular technology areas; for network and desktop services, for example, a bank might have just a couple of suppliers each. This means the bank can dip into other resources with which it already has a relationship.

Srinivas says some banks have different supplier groups for different business units. For example, the investment arm of a bank might have three suppliers and the retail operation three others. If required, each operation could use the suppliers in the other’s portfolio without having to start the relationship from scratch.

Outsourcing seems to increase during good times and bad. It has its uses as a cost saver, and then as a business changer if required. Banks are big spenders on IT outsourcing and their activities can influence trends. Could the consolidation of supplier portfolios and the development of closer business/supplier relationships be a legacy of the financial services slowdown? ■

IT dominates UK outsourcing sector
The IT outsourcing industry in the UK is the largest contributor to the UK’s £207bn outsourcing sector, with £41.7bn in sales in 2009. The maturing UK IT outsourcing sector looks set to see demand increase. According to research from Oxford Economics, about 20% of total outsourcing sales in the UK, across all industries, were made in IT- and data-related services. The report was commissioned by the Business Services Association.

Total outsourcing, including 17 different sectors such as property services, recruitment and administrative support, accounted for 8% of total UK GDP. This puts it on a par with the financial services sector, which accounted for 8.1%. A total of 340,000 people are employed in IT outsourcing jobs in the UK.

Peter Brudenall, outsourcing lawyer at Lawrence Graham, says there is room for the IT outsourcing industry to expand. “It will expand as companies realise that outsourcing, despite the negative aspects such as some job losses, allows companies to compete better because they can invest more in their core business. This will create more jobs,” he said.

The report revealed a massive £35bn was spent on outsourced IT in the private sector, while the public sector accounted for £6.7bn.

more online Analysis: CIOs invest in outsourcing and select multiple suppliers computerweekly.com/246652.htm

News: Outsourcing contracts cost businesses more than expected computerweekly.com/246609.htm

News: Accenture tops global outsourcer league computerweekly.com/246717.htm



news analysis | project management

Can the NHS IT project stagger on?
A recent NAO report read like a terminal diagnosis for the NPfIT, but the Department of Health is still standing by the £6.4bn spent

Time for action
The NAO outlined concerns about suppliers missing deadlines, including a £500m contract renegotiation with CSC – which has yet to happen – and the costs of procurements in south England after the termination of Fujitsu's contract, which remain uncertain until November 2011. This is all in addition to the £20bn of efficiency savings the NHS has been tasked by the government to make by 2014-15, and a controversial structural shake-up which will see the termination of the 10 strategic health authorities responsible for the programme's implementation locally.


A damning report from the National Audit Office (NAO) has revealed the state of the NHS National Programme for IT (NPfIT), nine years after its launch. The report effectively amounted to a vote of no confidence, but could there still be a cure for the NPfIT's long-term ailments?

While there have been successes, it is the failures of the NPfIT that have been most heavily documented. Since its inception in 2002, it has left a trail of cancelled contracts, overspends and delays in its wake. The programme was originally intended to create a fully integrated electronic patient records system, but the NAO says it has fallen far below expectations.

Some £6.4bn has been spent on the programme, with a further £5bn earmarked for investment. However, there remains great uncertainty as to whether the remaining roll-out will be delivered on time and on budget. Of the 4,715 NHS organisations in England expecting to receive a system under the NPfIT, 3,197 are still outstanding. The contract with key supplier CSC alone requires delivery of 3,023 GP systems and more than 160 deliveries of the Lorenzo patient administration software by July 2016. Successful implementation of Lorenzo by this date would require a delivery rate of between two and three NHS trusts a month over the next five and a half years, says the report.

Tola Sargeant, research director at analyst TechMarketView, believes the NAO report has come as a loud wake-up call for all involved. "Until earlier this month the Department of Health and CSC appeared surprisingly upbeat, despite delays and lack of progress on Lorenzo and delays at the Pennine Trust roll-out. The direction of travel still seemed the same, and the expectation was that things would muddle through, despite the obvious issues. But now the NAO has come down so strongly and sceptically on the delivery, particularly with CSC, things appear quite different," she said.

"I wouldn't be surprised if something more radical is going to happen now, such as a substantial rethinking of the programme or termination of contracts. They could even put the brakes on the project and use funds for local procurement instead."

Such a move was voiced by Richard Bacon MP on the Today programme on Radio 4, where he called for the programme to be scrapped entirely: "NHS trusts must be set free to choose the systems that meet the needs of patients and medical professionals. They should have the power to source products locally that suit their needs, subject only to common standards," he said.

Sargeant agrees this approach is feasible: "Many trusts don't have modern patient record systems, which they are crying out for, so I imagine they would at least try to find budgets locally as they will need something at some point. We have already seen a trend for some trusts doing this. But there is still an argument for investment in this area."

Hope for NPfIT yet
John Enstone, legal partner at Faegre & Benson, has worked with several IT companies previously involved in the programme and believes there is hope for its survival if its contract structures are substantially changed.

“This has been one of the most difficult and complex projects in the world, with a history of mismanagement. But I don't see why it can't pull out of the fire – I just haven't seen a solution for doing so yet,” said Enstone. “This would involve some additional spending and would depend on creating precise and deliverable contracts with suppliers and proper commercially focused management, albeit with patience and more time. The secret is creating contracts that suppliers can deliver on, which may take some back-pedalling and further investment of money,” he said. “Very few systems cannot be fixed and I don't believe there is the stomach for starting again. But I would like to see someone in government take hold of it and say, ‘We are going to take on the headline problems, stay with it and fix it.’”

The other key task is to bring doctors and healthcare professionals on board, says Enstone: “Someone has to talk to GPs. In the past the approach has been too top-down.”

Chaand Nagpaul, a GP and member of the British Medical Association's working party on NHS IT, says the government needs to form policy in a way that is realistic for health professionals: “These lessons need to be learnt by every government, not just the previous administration. Any [IT] strategy needs to be realistic, not driven by political ambition or aspiration,” he said. But Nagpaul is adamant IT modernisation must push ahead. “We need to ensure there is no counterproductive response to this report, because the NHS needs IT to be cost-effective. IT must be developed in a way that is meaningful and realistic in timescales.”

The Department of Health agrees change is needed after the NAO report, but appeared to suggest the department would stand by the project. “We do think the investment made in the NPfIT will potentially deliver value for money now we have a more flexible approach that allows the local NHS to be in charge of its own requirements,” said a spokesman.

Given the many problems involved with the NPfIT, few would disagree that it will need a dramatic re-scoping if it is to succeed at this stage. But the question remains as to whether the political will exists to do so. Sargeant believes it may: "I would probably put money on some form of more radical scaling back of the CSC contract. Not necessarily complete termination, but a more radical rethink than I would have thought a few months ago." ■
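The delivery rate the NAO cites is simple arithmetic, and worth sanity-checking. A quick sketch, assuming for illustration a roll-out window of five and a half years (66 months) to July 2016 – the window itself is an assumption, not a figure taken from the report:

```python
# Back-of-envelope check of the NAO delivery-rate claim.
# Assumption (illustrative, not from the report): a 66-month window,
# i.e. five and a half years running to July 2016.
lorenzo_deliveries = 160   # "more than 160 deliveries" of Lorenzo required
months = 5.5 * 12          # five and a half years

rate = lorenzo_deliveries / months
print(f"{rate:.1f} trusts per month")  # ~2.4, i.e. between two and three a month
```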

more online News: Future of NPfIT in critical condition following NAO report

computerweekly.com/246684.htm

News: DWP cancels £300m Fujitsu deal over transition deadline failure computerweekly.com/245957.htm

News: NAO blasts inefficient and unpoliced immigration system computerweekly.com/245905.htm



community

Bryan Glick | leader
Is it time to switch off the NHS IT project’s machine?

At Computer Weekly, we’ve loved the NHS National Programme for IT (NPfIT): it has been a great source of articles, and of ideas for these leader columns, over its long existence. Of course, the truth is we wish we hadn’t been given cause to write about it quite so often. For the NPfIT, no news would have been good news, because it has generated more than enough bad news.

The National Audit Office (NAO) report last week (see page 7) drove yet another stake into the barely beating heart of this terminal patient. The Department of Health insists the £6.4bn spent has shown value – and perhaps if you were to concentrate on the success stories, such as the PACS digital X-ray and imaging system, that would be true. But when the NAO says it has “no grounds for confidence” about the £4.3bn planned spend remaining on electronic care records, and questions whether the project should be scrapped forthwith, then “value for money” becomes a point of political opinion.

We’ve been here so often before. NPfIT is “yet another example of a department fundamentally underestimating the scale and complexity of a major IT-enabled change programme,” according to NAO chief Amyas Morse. And that has been the over-riding challenge that the NHS never got to grips with – the sheer scale of the project proved as over-ambitious as many experts warned.

If the NHS is to cut bureaucracy and costs, it has little choice but to streamline and IT-enable administration. Can you imagine an NHS in 2020 that doesn’t have electronic patient records as standard? What would that say about our modern health service?

NPfIT is over in all but name. Localism is the new buzzword, and NHS IT managers are already being given greater autonomy, regardless of what is said about those huge central contracts. The National Programme may have flatlined, but IT in the NHS lives on and needs better care than this unlamented mega-project.

Editor’s blog: computerweekly.com/editor

readers’ letters

Use the right tool for the job
Victoria Barrett, Nexus GB
The recent high-profile data security problems we have witnessed in the UK have highlighted the issues involved in using high street consumer memory products in engineering applications. When building a house, we don’t use ice as the construction material; we go for bricks and mortar, which will stand the test of time. Similarly, when selecting components for a new device we should choose those with a healthy lifespan, which will not “melt away” before the product comes to market.

Each time I hear that a CD, USB memory stick or camera card full of customer data goes missing, I think how easy it is to avoid the loss by choosing a fit-for-purpose product. Equally, every time I encounter an application in which an industrial device is brought to market containing a consumer memory product, it makes me think of the inevitable redesign when the memory product ceases to be manufactured. Moore’s Law makes it inevitable that this redesign will happen every time.

While the use of USB sticks shows an admirably intuitive approach to design, using products out of context will not produce an adequate outcome. The only way to get the right result is to choose a memory product that is specifically produced to meet the needs of the engineering designer.

Make data safer by accessing cloud services via a WAN
Neil Thomas, cloud computing and virtualisation product manager, Cable & Wireless Worldwide
A viable option for addressing security concerns is to ensure that the prime access routes to the cloud computing environment are via the wide area network (WAN), as opposed to the internet. This removes a source of anxiety for enterprises and large organisations that would prefer applications in a cloud environment not to face the internet directly. By placing these services in a secure cloud environment within the WAN, and by using the established methods of data separation between different customers in the cloud computing environment, data becomes intrinsically safer.

For businesses looking for the benefits of cloud migration, WAN connectivity for server administration will allow them to rapidly build and test cloud environments before access is extended to internal or internet users. Ultimately, this approach is the only one that offers such firms flexibility while still adhering to best practice and the business’s own security guidelines.

Teach the public sector how to get the best of shared services
Thomas Senger, vice-president, software and solutions, Kofax
Regarding your article “Hasty move to shared services risks inflexibility in the public sector” (computerweekly.com/245958.htm): Jos Creese is right to urge caution on councils that might rush decisions and become locked into long-term arrangements. Shared services is a buzzword in the technology sector, and one that many service providers love to exploit without properly educating their clients on how the technology can actually deliver the best results.

Service providers must work in partnership with public bodies to ensure they have a better understanding of the efficiency benefits the technology can deliver. By doing so, councils will be able to make more informed decisions about shared services and the specific benefits to their organisation. If councils and technology providers learn to work more closely together and develop effective partnerships, they will have the ability to respond to new legislation and tighter budgets. ■

E-mail your letters and comments to cw-comfeedback@computerweekly.com



community

Andrew Buss | opinion

Rationalising software licensing in the virtual age

It was only a few weeks ago that Freeform Dynamics was discussing software licensing and how complex and difficult it can be to keep on top of the day-to-day practicalities of managing licences and maintaining compliance. This was brought home to me yet again in a conversation I had last week.

I was speaking with the IT director of a mid-size company in the insurance sector that has recently undertaken a consolidation effort. As a result, the company has reduced tens of sites down to just two and has rationalised the back-end systems infrastructure from a kaleidoscope of architectures down to a simpler mix of x86 servers coupled with a single high-end Unix platform running business-critical applications. The organisation concerned serves thousands of brokers in different industries, and to better enable this all the applications and services are now virtualised, so that the infrastructure is flexible enough to cope with unpredictable or “bursty” application usage.

Challenging transformation
One of the things that struck a chord in the conversation was that although the transformation was a challenging multi-year effort, it was possible for the IT team to conceptualise and understand the technical aspects of both the hardware and the software. When it came to the commercial side of things, however, it was a different story. While the hardware was purchased and managed by the IT team, it was felt that software licensing was too complex for this to be practical. With the legal as well as commercial implications, it was neither feasible nor desirable for this to be a core competence of IT. To solve the problem, a third-party licensing specialist was enlisted to handle the negotiations, management, compliance and optimisation of licensing arrangements.

Mindset of software suppliers
Beyond the inherent complexity, the mindset of some software suppliers was considered an additional challenge of the project, a challenge that several of our surveys have highlighted in recent years. The IT director felt that the technology has advanced rapidly to allow a more consolidated, simplified and manageable IT infrastructure to be created. Yet many software suppliers have been dragging their feet on the commercial side, preferring to exploit the mismatch between traditional licensing models and modern systems architectures rather than dealing with it properly. There are exceptions, of course, and some suppliers have been quick to evolve their thinking, but many are still not making it easy. Variation between suppliers then adds to the problem, making it very difficult to create standardised virtual systems that can be deployed predictably anywhere in the datacentre.

These restrictive terms often force architectural workarounds to be developed. An example given was the need to create a server pool dedicated only to running databases, because the licensing meant that if a single virtual machine running a database was deployed on a server, the entire physical server needed to be licensed. This had the net effect of tying supported database workloads to a fixed pool of dedicated servers. This can work well for static consolidation and predictable workloads, but it hardly fits the vision of a dynamic private cloud that software suppliers are shouting about.

Removing complexity
The strategy taken for this project was to remove a lot of software complexity. The team used the consolidation exercise to rethink and simplify the software portfolio required to make up the required IT services. In this case they reduced the number of supported database platforms provided as a standard service to the business down to just two major versions, which were made available as a pool. This had the effect of encouraging business application owners to ensure their applications worked on the supported platforms, as requiring their own was usually five to 10 times more costly.

This rationalisation strategy, while simple in concept, proved to be a challenge. It required a determined mindset and an extended period to come to fruition. Most companies will not need to go to such an extreme and have a forklift upgrade of their existing systems and applications. Much of the benefit can start to be realised by applying the strategy to new deployments and letting it grow naturally, provided there is enough willpower to keep to the strategy over time.

Risk of lock-in
One of the issues with rationalising licensing is that it places a concentrated part of the IT spend in the hands of fewer suppliers, with the potential risk of higher dependency or even lock-in. Managing this will be a major factor in the long-term value delivered to the business. Before embarking on a rationalisation effort, it would be useful to classify the different software by how critical and important it is to the business, what the unique attributes of the solution are, and whether they are really needed, or worth the cost, in practice. Based on the results of this assessment, it might then be useful to spend some time getting to grips with the software suppliers’ sales strategies, how they fit with the needs of your business and how IT is changing to meet them.

Aggressive suppliers
Some will be quite aggressive in trying to get you to maintain your spend or buy more, with a focus on their business rather than yours. This type of approach often results in inflexible agreements that scale in one way only, namely up. Often software that is no longer in use is tied into the overall agreement and is difficult to remove without an impact on the terms of core offerings. Unless the supplier has some unique technology that the business absolutely needs in order to function, it might be best to choose a more flexible alternative.

Other suppliers will be more progressive, and will work with you to get maximum use out of the software you purchase from them. Their business philosophy is more one of long-term mutual value, and this is often reflected in their performance metrics and the way they compensate their staff. This value-oriented approach is much more in tune with the dynamic and flexible nature of virtualised systems and services, allowing budget to be allocated and re-allocated to where it is needed and best used, rather than trying to forecast the way the business will grow years in advance. ■

Andrew Buss is service director of Freeform Dynamics
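The dedicated database pool described above comes down to simple licensing arithmetic. A minimal sketch of that trade-off, with entirely hypothetical prices, hosts and VM names (nothing here is taken from the article):

```python
# Hypothetical per-physical-host database licensing, as described above:
# if any VM on a host runs the database, the whole host must be licensed.
LICENCE_PER_HOST = 40_000  # hypothetical annual cost per physical server

def licences_needed(hosts: list[set[str]]) -> int:
    """Count hosts needing a licence: any host running at least one 'db' VM."""
    return sum(1 for vms in hosts if any(vm.startswith("db") for vm in vms))

# Scattering database VMs across a general pool licenses every host they touch...
scattered = [{"db1", "web1"}, {"db2", "app1"}, {"web2", "db3"}, {"app2"}]
# ...while pinning them to a small dedicated pool caps the licensed footprint.
dedicated = [{"db1", "db2", "db3"}, {"web1", "web2"}, {"app1", "app2"}]

for name, layout in [("scattered", scattered), ("dedicated pool", dedicated)]:
    print(f"{name}: £{licences_needed(layout) * LICENCE_PER_HOST:,} per year")
```

The same arithmetic lies behind the "five to 10 times more costly" figure quoted for application owners who insist on their own platform: every extra platform multiplies the number of hosts that must be licensed.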

contacts
Computer Weekly/ComputerWeekly.com
Marble Arch Tower, 55 Bryanston Street, London W1H 7AA
General enquiries: 020 7868 4282

Editorial
Editor in chief: Bryan Glick 020 7868 4256 bglick@techtarget.com
Managing editor (technology): Cliff Saran 020 7868 4283 csaran@techtarget.com
Services editor: Karl Flinders 020 7868 4281 kflinders@techtarget.com
Head of premium content: Bill Goodwin 020 7868 4279 wgoodwin@techtarget.com
Group technical manager: Rebecca Froley 020 7868 4269 rfroley@techtarget.com
Content editor: Faisal Alani 020 7868 4257 falani@techtarget.com
Chief reporter: Warwick Ashford 020 7868 4287 washford@techtarget.com
Correspondent: Kathleen Hall 020 7868 4258 khall@techtarget.com
Correspondent: Jenny Williams 020 7868 4288 jwilliams@techtarget.com
Production editor: Claire Cormack 020 7868 4264 ccormack@techtarget.com
Senior sub-editor: Jason Foster 020 7868 4263 jfoster@techtarget.com

Display advertising
Sales director: Brent Boswell 07584 311889 bboswell@techtarget.com



buyer’s guide

Making Ethernet part of the fabric
Cliff Saran looks at the practicalities of optimising an Ethernet network for virtualisation and cloud computing

CW Buyer’s guide: cloud networking – part 4 of 5

The latest thinking on datacentre design recommends that businesses deploy virtualisation, where virtual machines can be started (or spawned) and stopped dynamically. But this can have a huge impact on the network.

In a network it is often a good idea to know where server and storage are located. This way, the network path can be optimised to minimise the number of hops (ie the number of network switches and routers that IP traffic must pass through) between the various server and storage components. A more direct path between server and storage components in the datacentre leads to better performance. But to get the most from server and storage virtualisation, the physical location of a virtual machine should not be tied down, as this would affect the flexibility of the virtual infrastructure. As a result, a traditional Ethernet network cannot easily be optimised for virtualisation and cloud computing.

Fabric for VMs
A poll conducted at the Gartner Data Center Conference held in December 2010 found that 83% of respondents were using mobility to reassign new locations or shift workloads, or were using policy-based software rules for optimisation. Gartner believes an approach called “computing fabric” will be required to support the dynamic allocation of virtual machines (VMs), where the network, server and storage act as a single unit connected using a switch.

“Cloud computing and virtualisation make networking difficult. Modern blade platforms, such as HP Virtual Connect and Cisco UCS [unified computing system], are integrated with switches [to simplify networking],” said Andy Butler, a distinguished analyst at Gartner. “Fabric computing relies on the network switch being integrated with the server, the network and the storage.”

According to Brocade, Ethernet networks are not designed for cloud computing. While network managers have previously been able to optimise networks by managing performance at the network’s core, Marcus Jewell, regional sales director at Brocade, says virtualisation makes network traffic unpredictable.

Duncan Hughes, systems engineer at Brocade, said: “For the past 20 years we have been using a three-layer hierarchical [architecture] comprising the access layer, aggregation layer and the core layer. Routing would only be performed at the core layer, so if you needed to communicate with a server on a different part of the network [subnet] you would need to travel the network up through the three layers [three network boxes] and back down again.”

This is inefficient and does not cope well in a virtualised environment. Hughes says that if this approach to networking is used in a virtualised environment, network traffic between virtual machines will bounce up and down the network.

Brocade and other network equipment makers are now selling the idea of a network fabric, which overcomes the network problems caused by virtualisation. The storage networking company has developed what it calls a flat, self-healing, layer-two network fabric, which it says overcomes the limitations of a three-layer network topology. ■

Case study: De Persgroep
The Brocade network infrastructure is helping Belgian media company De Persgroep manage its datacentre networking. The company, which has grown rapidly through acquisition, has increased the number of servers in its datacentre from 300 in 2007, and supports around 1,000 virtual servers today. Along with office automation, the main applications are editorial and advertising systems, plus its websites, some of which support two million users at peak times, with bandwidth of 2Gbps.

De Persgroep’s previous network was experiencing capacity issues due to the growth in virtual servers, and limitations on the number of physical network ports were affecting network performance. Throughput bottlenecks meant that back-up schedules would fail, and overall switch performance and stability were no longer acceptable.

Following a competitive tender, the company selected the Brocade VDX 6720 as the foundation on which to build an Ethernet fabric for its datacentre. A total of 72 Brocade VDX 6720 switches have been deployed across the fabric-based architecture, creating a single logical chassis with a single distributed control plane across multiple racks of servers. This design has provided compelling reductions in capital and operating costs, while simplifying virtual machine (VM) migration. The deployment is already delivering the desired performance and resilience for De Persgroep.

The Ethernet fabric uses dual 24-port network switches fitted to the top of its server racks to enable each rack to operate its own dual redundant network. Servers in the racks are equipped with two network cards each to support the dual redundancy. The racks are connected to aggregator switches, which bring the networks from the servers together and connect them to another datacentre located a few hundred metres away. This provides a so-called active-active system, where both datacentres are operational.

“On our old system you could lose several seconds if a switch failed,” said Wim Vanhoof, ICT infrastructure manager at De Persgroep. Such a delay would be enough to cause an application to crash. “On the new fabric, which uses a virtual switch configuration, fail-over is instant,” he added.
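The hop penalty Hughes describes can be made concrete with a toy model. The sketch below (illustrative topologies only, not any vendor’s reference design) counts the links a packet crosses between two servers under a classic three-tier design and under a flat fabric:

```python
# Illustrative hop counting: classic three-tier Ethernet vs a flat fabric.
# Topologies are toy examples, not any vendor's reference architecture.
from collections import deque

def hops(graph: dict, src: str, dst: str) -> int:
    """Shortest number of links between two nodes (breadth-first search)."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    raise ValueError("no path")

# Three-tier: servers hang off access switches; traffic to another subnet
# must climb access -> aggregation -> core and back down again.
three_tier = {
    "server-a": ["access1"], "server-b": ["access2"],
    "access1": ["server-a", "agg1"], "access2": ["server-b", "agg2"],
    "agg1": ["access1", "core"], "agg2": ["access2", "core"],
    "core": ["agg1", "agg2"],
}
# Fabric: top-of-rack switches form one meshed layer-two domain.
fabric = {
    "server-a": ["tor1"], "server-b": ["tor2"],
    "tor1": ["server-a", "tor2"], "tor2": ["server-b", "tor1"],
}

print("three-tier:", hops(three_tier, "server-a", "server-b"), "links")  # 6
print("fabric:   ", hops(fabric, "server-a", "server-b"), "links")      # 3
```

When virtual machines migrate between racks, the three-tier figure applies to every inter-rack flow, which is why traffic “bounces up and down the network”; the flatter fabric keeps the path short wherever the VM lands.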

more online News: Brocade develops ethernet fabric for cloud computing computerweekly.com/246581.htm

Research: Multinationals show growing interest in cloud computing computerweekly.com/246669.htm

Analysis: The fundamentals of preparing for desktop virtualisation computerweekly.com/246173.htm

[Diagrams: Ethernet fabric architecture and classic hierarchical Ethernet architecture]


buyer’s guide | network infrastructure

Resilience in the datacentre
Resilient switching in converged infrastructure must be able to take a beating, as Steve Broadhead demonstrates

CW Buyer’s guide: cloud networking – part 5 of 5

I

n the world of datacentres and large-scale enterprise networks there has always been some form of perceived trade-off between performance and resilience. Building in resilience is absolutely essential of course, but it has historically affected service and application ability when brought to play. And – in spite of the best planned and designed networks, quality of components and management – problems do arise. Add in the virtual world to the physical one we’ve come to know and trust and the stakes are raised again. The result is that suppliers have been forced to redesign their systems to support the virtual environment, maintaining that level of resilience – or improving it – while also improving round-the-clock access to those services, applications and the data that lies beneath.

Resilient virtual switching One such example is HP’s Converged Infrastructure solution – incorporating servers, storage, networking and management. The datacentre is growing ever more critical to the enterprise, whether physical or virtual, in-house or outsourced. From a supplier’s point of view, this means creating a complete system – a converged infrastructure – based on marrying truly compatible components with the best performance/feature set and with as little compromise as possible. At the heart of HP’s Converged Infrastructure (CI) system is the key to the resilience contained within – what HP calls the Intelligent Resilient Framework – that creates a resilient, fully-redundant virtual switching fabric. Intelligent Resilient Framework (IRF) is designed to combine the benefits of box-type devices (simple, standalone switches, for example) and chassis-based distributed devices, such as a blade switch. The argument is that box-type devices are cost-effective, but can be less reliable and less scalable, and are therefore unsuitable for critical business environments. In contrast, chassis-based devices tick all

Datacentre operators are long used to the trade between resilience and performance these boxes but are more expensive and considered to be more complex to deploy and manage. With IRF, then, HP is looking to merge the benefits of both approaches into one. IRF allows you to build an IRF domain, seen as one, big, logical device. By interconnecting multiple devices through ports (regular or via dedicated stacking) it is possible to manage all the devices in the IRF domain by managing one single IP address (attached to the logical device), which provides the lower cost of a box-type device and the scalability and reliability of a chassis-type distributed device. In a converged infrastructure environment, an IRF-based network extends the control plane across multiple active switches, enabling interconnected switches to be managed as a single common fabric with one IP address. The claim is that it increases network resilience, performance and availability, while simultaneously reducing operational complexity. Another key element of the system is HP’s Virtual Connect Flex-10, comprising two components: 10Gbps Flex-10 server NICs and the HP VC Flex-10 10Gbps Ethernet module. Each Flex-10 10Gb server NIC contains four individual FlexNICs, so a dual-channel module provides eight

Daily news for IT professionals at ComputerWeekly.com

LAN connections with the bandwidth for each FlexNIC is user defined from 100Mbps to 10Gbps in 100Mbps increments. From a practicality perspective, VC Flex-10 reduces cables and simplifies NIC creation, allocation and management. HP’s IRF put to the test A series of tests based around the resilience of HP’s IRF were created, inducing a series of different failures to see how the solution coped with the problems and what this meant in latency/lost packet issues. We also looked at the day-to-day management of the solution, including what happens when planned maintenance is required, in this case carrying out routine firmware upgrades involving switches reboots. Our CI for the test was built around HPs A5820 Ethernet switches supporting IRF, then – at the back-end – a combination of the aforementioned Flex-10 technology and standard HP A6120 blade switches and C3000/ C7000 server enclosures. First IRF test The first test involved seeing what happened when we simulated a failed link between the A5820 switch and a VC Flex-10. In this test both switches are simultaneously active,

First IRF test

The first test involved seeing what happened when we simulated a failed link between the A5820 switch and a VC Flex-10. In this test both switches are simultaneously active, thanks to the LACP Bridge Aggregation mechanism – a key benefit of IRF being its ability to maintain an active-active state. So, in the event of a link failure, the second link of the LACP Bridge Aggregation and the second switch carry the traffic while the broken connection is repaired.

We experienced a 3ms failover time on this connection, yet between servers 9 and 10 we saw no dropped packets whatsoever. Between server 11 and ESX4 we measured a failover time of just 1.3ms. As we brought the link back up, we again saw only a minor failover time across all server-to-server links: just 1ms in total. Reverting to the original configuration and testing all connections while the second module was shut down and restarted, we recorded an aggregate failover time across all links of just 1.2ms.

Second IRF test

For the second test, we checked what happened when we simulated an additional bridge aggregation failure – potentially a traffic killer. Testing with a 64-byte ping while this was happening, we recorded just a 4ms failover time and, while in recovery mode, a further 36ms between server 9 and server 10 and 23.6ms between server 11 and server 9 – easily the most significant latencies we recorded, but both still well below our target level of 50ms. ■

Steve Broadhead is founder and director of Broadband-Testing. This is an edited version of the article – read the full review of the HP Intelligent Resilient Framework online (find the URL in the panel below).

more online
Buyer's Guide: HP's Intelligent Resilient Framework put to the test computerweekly.com/246720.htm

White paper: Storage decisions to ensure virtualisation success computerweekly.com/246158.htm

In depth: Upgrade Ethernet to fabric for cloud computing computerweekly.com/246686.htm



cloud computing

Campaigning for human rights in the cloud

Cliff Saran interviews Kamesh Patel, the head of IT at Amnesty International, about the charity's move to a private cloud

Amnesty International has migrated its web hosting to a private cloud-based infrastructure, provided by Claranet, to support the flexibility it requires as it grows its websites and engages in social media activities.

A private cloud offers Amnesty International a flexible approach to scaling its website, which enables it to support campaigns run over social media sites. A hosted services approach would not have been flexible enough to support scaling bandwidth and servers up and down to handle peak traffic during a social media campaign.

The use of a private cloud, rather than a public cloud, is also significant, given that Amnesty International campaigns on human rights issues that some global organisations and governments would prefer to cover up. The attacks on Wikileaks earlier this year – and the fact that some of its service providers switched off the site – illustrate one of the inherent problems of public cloud services.

Human rights in social media

Kamesh Patel, head of IT at Amnesty International, has worked for the human rights charity for four years, during which time IT at Amnesty has diversified. "Four years is a long time in IT. We now have a lot more activity in the online space, especially with the growth of Facebook, Twitter and MySpace. Unless you are in those areas, you get left behind."

In particular, Patel says Amnesty predominantly uses Facebook and Twitter

to discuss breaches of human rights, to engage with people and to give individuals the ability to take some kind of action, such as signing a petition, drafting a letter to an MP or signing a card of support. The charity gathers real-time news and information via video footage, blogs and forum updates posted by individuals around the world.

To better engage with its members, Amnesty realised it needed to revamp its digital strategy and the IT behind it. The challenge with social media and digital campaigns is that web traffic can peak sharply. Patel says that by moving from hosting to a private cloud, Amnesty is able to control the whole of its infrastructure. "We can define how much we use. You would not get this kind of flexibility in a hosted environment," he said.



Amnesty International's cloud

• A private cloud gives Amnesty the security of having its own infrastructure.
• Claranet provides a single point of contact.
• Service level agreements and IT infrastructure are flexible.
• Amnesty's websites can now support peak traffic and bandwidth arising from social media campaigns.


When the FT pulled Amnesty's ad, the charity's social network traffic spiked

Managing risk in the cloud

Amnesty previously used several hosting firms for its websites, which was proving problematic. Patel explained: "At the time, a number of third parties looked after our websites; each had their own hosting providers with different levels of service. This led to everything being done in silos and was complicated and demanding to manage."

Furthermore, the situation exposed the organisation to significant instability and risk.

Should one of the third parties go under, or should a dispute arise, it would have been easy for any of the developers or agencies simply to turn off Amnesty's website.

"We couldn't continue operating with this threat, so we decided we needed to consolidate our providers and gain back control over our hosting platform," said Patel.

As a result of its previously disparate and complex hosting environment, user data was not integrated across the organisation's various properties, which meant the charity had poor visibility of its users' profiles and their individual activity online.

"Because we lacked visibility of our data, our hands were tied, limiting the way we could interact with our users. With enhanced access to this vital information we would be able to see, for example, if an online visitor was an activist, if they were supporting us financially – and if so, how – and if they had any other areas of interest," said Patel.

"We knew this information would enable us to support their journey to the Amnesty website, and once there, we could ensure they had access to all the content and resources they needed. This would, in turn, help to encourage and facilitate campaigning activity amongst all our users."

"At the last minute the ad was pulled by the FT, infuriating campaigners. Our blogosphere went completely crazy"

Choosing the right provider

Patel wanted to simplify the hosting arrangement by having just one organisation take full responsibility. He said: "We undertook a business case exercise, looking at the risks and at how our websites are hosted. The key mitigation was moving from multiple suppliers to a single point of contact with one account manager."

After a competitive pitch, Amnesty chose Claranet to underpin its technology transformation. "Claranet's SLA covered the whole service and guaranteed application availability. We wanted to dramatically streamline our hosting platform and it was clear that Claranet's all-encompassing SLA and single point of contact would help us to do this."

Claranet will support Amnesty International's digital strategy, which may require IT resources to be deployed flexibly, Patel said. "We are developing a digital strategy – with a desire to implement in an agile way." This means Amnesty requires flexibility for its website, its hardware and its service level agreements.

Social media can be bandwidth- and hardware-intensive. "When we ask an agency to develop new website functionality, we don't have to worry about the hardware or bandwidth," Patel said. When it took over the contract for the Amnesty websites, Claranet worked with the charity's agencies to ensure ample headroom in server capacity and bandwidth to support Amnesty's future plans.

This flexibility was recently tested. "We ran a campaign highlighting Shell's human rights record in Nigeria and, using donations, had bought an ad in the Financial Times. At the last minute, however, the ad was pulled by the FT, infuriating campaigners. As a result, our blogosphere went completely crazy.

"Our social networking site experienced around 400 blog entries that day, compared with between 20 and 40 entries on an average day. Despite the huge spike in demand, our websites didn't crash."

Claranet's virtualised platform meant that Amnesty's server resources could be dynamically allocated to where they were needed. "This ensured our websites could cope with the unexpected demand, and that we were able to provide a smooth online experience for our users. And it continues to do this today," said Patel.

The first phase of the project, completed in 2009, implemented the managed application hosting platform to support the Amnesty website and the organisation's central registration system. For the second phase of its IT revamp, Amnesty is embracing social media tools and changing the content management system on which its website is built. In the final stage, Amnesty will revamp its website in late 2011, marking the end of the three-year project.

By using Claranet's private cloud, Patel says Amnesty International has been able to extend its website. "Before Claranet's hosting platform was in place, any e-commerce – where our members request information packs, CDs and other materials, or set up direct debits – had to be fulfilled manually, as our website and applications couldn't support this capability," he said. "Claranet has helped us to simplify our back-end processes and to automate fulfilment." ■
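The elasticity Patel describes boils down to a simple control loop: watch activity against a rolling baseline and add capacity when a campaign spike hits. The sketch below is entirely hypothetical – the article does not describe Claranet's platform internals – but it illustrates the shape of the decision, using Amnesty's own numbers (20-40 blog entries on a normal day, around 400 the day the FT pulled the ad) as the trigger.

```python
# Hypothetical sketch - not Claranet's platform. Threshold-based scaling:
# compare today's activity with a rolling baseline and scale the web tier
# roughly in proportion to a spike, within the pool's limits.

from collections import deque

class SimpleScaler:
    def __init__(self, min_servers=2, max_servers=20, spike_factor=3.0):
        self.history = deque(maxlen=7)        # rolling week of daily counts
        self.min_servers = min_servers
        self.max_servers = max_servers
        self.spike_factor = spike_factor      # how big a jump counts as a spike

    def servers_needed(self, todays_activity):
        baseline = (sum(self.history) / len(self.history)) if self.history else todays_activity
        self.history.append(todays_activity)
        if baseline and todays_activity > self.spike_factor * baseline:
            scale = todays_activity / baseline
            return min(self.max_servers,
                       max(self.min_servers, round(self.min_servers * scale)))
        return self.min_servers

scaler = SimpleScaler()
for day in [30, 25, 40, 20, 400]:             # the FT incident on day five
    print(day, "->", scaler.servers_needed(day), "servers")
# prints 2 servers for the quiet days, then the full pool of 20 on the spike
```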

more online
News analysis: Comic Relief doing something funny in the cloud computerweekly.com/246048.htm

In depth: Donate your IT skills to charities with IT4Communities computerweekly.com/234986.htm

In depth: Choose the right cloud provider to avoid network problems computerweekly.com/246567.htm



supplier profile

Can Dell grow and conquer?


Can the acquisitive supplier integrate smoothly and position itself for success in a changing IT market? Danny Bradbury reports

It was the company that came from nowhere to revolutionise an entire industry. Dell, originally a provider of desktop PCs, was instrumental in changing the way the IT business worked. On the consumer side, Dell revolutionised product distribution by using e-commerce to compress the supply chain, and it pioneered the commoditisation of enterprise IT hardware. One development enabled Dell to scale up its own sales operation, while the other enabled countless customers to scale out their back-end computing operations. Now the tables are turning and the industry is shifting underneath Dell. Can it adapt?

A history of direct sales

As is the case with so many successful technology companies, Dell's founder was a university drop-out.

Ten years after Bill Gates left Harvard to start Microsoft, and 20 years before Mark Zuckerberg left the same place to make 500 million friends with Facebook, Michael Dell left the University of Texas to focus on his baby, PCs Limited, which he started in his dorm room in 1984.

Even then, when the closest thing to a web experience was dialling into BBS systems on a 2,400-baud modem, Dell was fixated on the idea of selling PCs directly to consumers. He reached them through computer magazines, and assembled PCs for them according to their required specifications. The model remained essentially the same for a quarter of a century, even as the technology to deliver PCs directly changed.

Tim Mattox, vice-president of strategy at the firm, sums it up: "Our advantage was through our direct model. If you're holding six weeks of inventory and the other firm is holding three days, the firm with three days has a significant advantage in operational expenditure."

Dell's direct strategy helps to insulate the company against the ups and downs of the business cycle. A company with minimal inventory is better placed to supply product in a volatile market with frequent technology refreshes.

In a world where people learned to demand products fitting their exact requirements, ordered as easily as possible and delivered as quickly as possible, Dell thrived. Each time customer expectations grew, the company was able to raise the bar, honing its supply chains and using web technology to constantly improve the ordering experience. It applied this methodology in other areas, such as televisions and printers. No wonder, then, that it grew into a firm with a $28bn market capitalisation.

Consumer to enterprise focus

As Dell expanded its operations, it pushed into the enterprise market.

It first offered PCs to corporate customers, and latterly moved into other enterprise equipment, including printers, storage and networking devices.

Mattox describes a culture of openness that he says permeated the company from an early point in its history, and allowed it to open up the datacentre market in unexpected ways. "Driving open standards is often antithetical to our competition's business model. When competition is pushing a proprietary stack and their business prospects rest on that stack, being able to drive the capabilities of an open solution is limited," he says.

Now on its eleventh generation of enterprise products, Dell spearheaded the use of commodity x86 architectures in the datacentre. Until it began this drive, companies had little choice but to scale up their systems, adding more resources to a single computer to create complex architectures that were expensive and required considerable engineering expertise to get right.



They frequently ran into problems in areas such as the use of memory buses by multiple processors. Architectures such as non-uniform memory access (NUMA) were developed to help alleviate these problems, at a relatively high cost to the customer.

At the time, during the mid-1990s, x86 servers were starting to be used for simple tasks such as file and print, but another part of the market was starting to recognise their importance in higher-performance applications. By stringing together lots of cheap Intel-based computers, scientists were able to scale out their computing power, creating powerful clusters of processors that could tackle hefty processing tasks by working in unison.

"They were using toy servers to do massively powerful things, and as we worked with these organisations, we recognised that this could be applied to commercial workloads," Mattox recalls. "We came up with the external moniker of 'scalable enterprise', and the analysts almost laughed us off the stage. They said dedicated processors were the way to go. But it has been proven that x86 servers have been able to take far more of the workload."

Dell has been busily consolidating its activities in the enterprise market. In particular, it is targeting cloud computing as a potential growth market. The company set up a group four years ago dedicated solely to building solutions for large cloud computing service providers such as Yahoo and Microsoft. It has been refining its PowerEdge server technology to make it easier to deploy, using smaller-footprint, more energy-efficient devices that can be packed more readily into the datacentre, in readiness to push its expertise in large-scale cloud deployments further into smaller corporate customers' datacentres.

It is riding the virtualisation wave, in which corporate IT departments create "private clouds" of consolidated virtual machines.

An acquisitive stance

Dell's acquisitions have also bolstered its position in the enterprise sector, and in some cases are helping it to advance its cloud-based ambitions. They include Boomi, which sells products for integrating cloud-based services and on-premise applications; systems management appliance company Kace Networks, which will focus on the firm's SME and public sector business; and Ocarina Networks, which provides de-duplication and content-aware compression products for storage systems. Along with Scalent, another 2010 acquisition, which provides datacentre infrastructure software, Ocarina Networks will be integrated into Dell's commercial business.

In 2008, Dell purchased EqualLogic, which sells iSCSI storage area network (SAN) products, for $1.4bn. This helped to propel Dell into the virtualised storage market, mirroring its efforts in x86-based virtualisation. But Dell needs more traction in enterprise storage. iSCSI SANs are suitable for less intensive virtualised storage, but fibre channel-based SANs are designed for the heavier lifting that its larger enterprise clients will be looking for.

The company tried to move into higher-end virtualised storage with its 2010 bid for 3PAR, a specialist in utility storage whose products start at 2.3Tbytes in capacity and scale up to 600Tbytes. However, HP muscled in with its own bid for 3PAR, and eventually won what became a bitter public bidding war. The fact that Dell went for the 3PAR acquisition highlights a gap in its portfolio.

Still, Dell won't be stopping its acquisitive streak here. With $11.6bn in cash, the company has plenty of headroom to consolidate its position in the enterprise market, and executives have committed to more acquisitions as Dell attempts to double the size of its datacentre business to $30bn in revenues.

Most of the holes in Dell's portfolio are datacentre-related, according to Jayson Noland, analyst at Robert W Baird. "It needs something to complement EqualLogic, and it failed with 3PAR, but I expect it to try again," he predicts. Noland also suggests the company needs to consolidate its role in the networking market in the longer term. "It is coming at that market with partnerships, but you need to own, not partner," he says.

Building a services culture

Dell's biggest acquisition to date has been in services: it paid $3.9bn for Perot Systems in 2009. Analysts reportedly said that the 68% premium Dell paid for the services giant was too rich. Nevertheless, it brought some much-needed traction to Dell's services revenues, which have increased 50% overall, due largely to its investment in Perot. Service revenues have grown slightly more in its large enterprise business.

Perot was a services company focused on IT as its core business, says Ryon Packer, director of portfolio management at what is now called Dell Services. Yet Perot was not so big that it would negatively affect the Dell culture. "We saw a cultural mesh between the companies," he says. "We were still focusing on the IT services, both applications and infrastructure, that aligned to where the market views Dell and gives it permission to play."

Even in its services business, though, Dell still needs to plug some gaps. With 23,000 employees, Perot is strong in healthcare (48% of revenue) and government services (27%), which is why Dell's services revenue soared by 116% in its public sector business during the second quarter of fiscal 2011.

With 88% of its revenue hailing from domestic markets, Perot is also heavily biased towards US customers. Dell needs to increase its services acquisitions to flesh out other areas, especially if it wants to compete with heavy hitters such as IBM and HP. The former purchased PricewaterhouseCoopers' consulting arm, while the latter purchased EDS – which, coincidentally, was founded by Ross Perot.

In a bid to bolster its rapidly growing services operation, Dell is also likely to be looking for potential targets with a strong regional presence outside the US. "We're looking for areas where there is a strong local presence or very specific intellectual property, or where we can deliver capability that is needed," says Packer. "It takes quite a while to build large teams in regions, and to leap in front and establish ourselves in chosen markets we would be looking at time-to-market issues. The focus will be on inorganically handling issues where the organic stuff will take too long."

Senior Dell executives have confirmed that Dell's acquisition path will focus on fleshing out its services portfolio.

Future challenges

As it continues to purchase companies, Dell faces another challenge, says Ronnie Moas, president and research director at Standpoint Research. "Dell has made more than a dozen acquisitions since 2006, and there is always that integration risk," he says. "Can it integrate all of the acquisitions it has made, and deliver on the bottom line?"

Dell has done its best to bolster its expertise in mergers and acquisitions. In summer 2009, it hired mergers and acquisitions executive David Johnson from IBM to help marshal it through its aggressive acquisition phase, leading to a public spat with Big Blue. ■

more online
Whitepaper: Download the full seven-page report on Dell computerweekly.com/245463.htm

News: Dell posts 177% profit boost after increasing enterprise sales computerweekly.com/246691.htm


News: Dell to invest $1bn in cloud computing services computerweekly.com/246229.htm



downtime

Like the name – can't believe I didn't think of that

Two fame-seeking parents have tried to cash in on their kids early. While the rest of us have to wait until we are old and infirm for our kids to begin to repay us for our sweat and blood, an Israeli couple are seeking fame by giving their kids silly names. Their latest daughter has been named Like, as in the feature on Facebook that is clicked when a user likes something.

The parents describe the choice as "modern and innovative". Just like the way technology companies describe products they cannot sell. Apparently, Like has a sibling called Pie. Poor kids. But I blame the grandparents. They should never have called their son T**t. Somebody was always going to pay.

Teen wishes maths teacher dead on Facebook and gets suspended

A 13-year-old New Hampshire student was suspended for five days after she posted a status update on Facebook saying she wished Osama Bin Laden would kill her maths teacher.

Downtime has some bones to pick with both the mother of the student and the maths teacher in question. Her mum said that she "frequently checks her daughter's Facebook page but didn't log on on the day of the incident". How convenient. She also said she felt the punishment was harsh as it "denies her daughter an education". How does keeping the kid at home for a few days deny her an education? It's just five days – no big deal, especially as the kid threatened her teacher with the most wanted man in the world... who's dead, by the way.

Heard something amusing or exasperating on the industry grapevine? E-mail cw-downtime@computerweekly.com

The mother also said that "they called her daughter to the principal's office and asked her to show them her Facebook profile, and she complied because she generally does what she's told". Yeah, if that's the mum trying to put across that her kid is a good'un, then forget it.

Some kids use knives and guns to threaten people, but not this kid, oh no, she used the most deadly weapon in the world... Man. Apparently, the girl's suspension is over, but she has not returned to class for fear of facing the teacher she made the comment about. Her mum said: "She's anxious to go back and terrified to go back all at the same time."

Fear? Terrified? What kind of a psychopathic maths teacher is this?! What's going on in these lessons? Someone needs to find out what this teacher is doing to these kids for them to go to such lengths. I mean, the kid is so terrified that her mum is now asking the school to remove her from the class and give her a private maths teacher.

Let's just hope the next one isn't nuts too, and that the kid doesn't have to call upon other dead terrorists/tyrants/evil beings to do her bidding.

Free for meetings: lunchtime, teatime, dinnertime, any time

Downtime is glad to be reporting from sunny Marble Arch. The trauma of commuting to darkest Sutton is now behind us. Visitors no longer have to ask, "Sutton? Would you mind meeting at my office instead?"

And one of the Downtime crew has decided to sample the finer West End restaurants, courtesy of willing IT companies. He is currently out at lunch, has a dinner appointment this evening and a breakfast meeting tomorrow, but I think he might have space in his diary for afternoon tea or an aperitif. Since he's from Newcastle, do invest in a good Geordie dictionary... Why-aye.

Social media can kill: don't be a planker

There has been a lot in the news lately about the security risks of social media, but the death of a 20-year-old Australian proves that online games of one-upmanship can be fatal.

An internet craze dubbed "planking" involves posting pics online of participants lying down in a variety of weird and wonderful places. The term "planker" is said to have been coined in Australia and refers to participants lying with arms against their bodies to resemble a plank. Pictures of people balancing atop burger joint signs, giant beer cans, street signs and a whole host of famous landmarks can be funny, but the competitiveness among participants is making it an increasingly dangerous pastime.

Following the death of Acton Beale, who fell while posing for a planking pic on a seventh-floor balcony, the internet craze has come under fire from police concerned it will get out of hand. Even the Australian prime minister, Julia Gillard, has urged anybody who "likes a bit of fun" to focus first on keeping themselves safe, according to reports. The trick is knowing when a bit of internet fame is just not worth it.

Insensitive but oh so funny... We're not sure we need to add anything to this, but it put a smile on our faces, despite the fact that it really is an evil thing to do. Would we have the guts to do it? No. Would we think about doing it? Yes. Would we tell people we had done it, and then after they've finished laughing say that we didn't really, but how awesome would it be if we actually did, and how much trouble we'd get into? Hell yeah, all over that one.

Not high-tech and hip? All you need is Necomimi

Downtime is always pleased to see technology being put to good use, such as enabling fashion accessories that express the wearer's moods.

Just in case we have lost the ability to interpret what other people are feeling, Japanese scientists have come up with a high-tech fashion accessory in the form of white ears that move according to the wearer's thoughts and emotions, according to Reuters. The "Necomimi" – cat's ears in English – use two brain-wave sensors to detect, interpret and communicate the emotional state of the wearer through movements. For example, the ears shoot up when the wearer is nervous or attentive, but flatten when the wearer is relaxed.

Downtime is pleased to note that Necomimi are merely the frivolous offshoot of serious neuro-control science being developed in labs around the world to help people with disabilities. High-tech fashion junkies, however, will have to wait a while before Necomimi are available outside Japan, but they should start saving now, because the ears will not be cheap. ■

Read more on the Downtime blog computerweekly.com/downtime


