net.work October 2008


october 2008

the way business is moving

Life converged The smartphone investigated

FEATURES://VIRTUALISATION_SILVER BULLET OR LEAD BALLOON?/SECURITY ROI/ MASHUPS TO DISSEMINATE GOVERNMENT INFO/Google Chrome_Game changing?

Inside:

Infrastructure_Communications_Enterprise intelligence_Risk management_Storage_Mobility_Product Update



contents

october 2008 - wireless & mobility issue


Regulars 03 Ed’s Column 04 News 50 Product Updates

10 LIFE CONVERGED
Converged mobile phone devices offer everything from mobile media to email. Smartphones let you take your work home and your life to the office. The questions this reality raises touch more than just technology. net.work investigates.

14 VIRTUALISATION – SILVER BULLET OR LEAD BALLOON?
Companies of all sizes have begun experimenting with virtualisation in the hope of gaining some relief from rising energy costs and the growing complexity of their IT environments. But what does it really have to offer?

18 MASHUPS TO DISSEMINATE GOVERNMENT INFO
The British Government recently made an unprecedented move by releasing a number of previously unavailable data sets together with APIs for use in composite applications or mashups. Will the Web 2.0-savvy public show up?

34 SECURITY ROI
Return on investment, or ROI, is a big deal in business. It's a good idea in theory, but when it comes to security it's mostly bunk in practice.

KEYNOTES
40 NAS vs SAN – The debate rages on.
46 BUSINESS EVOLUTION – New web technologies, open source, employees as consumers.
48 STAYING ON COURSE – The guide to identifying, managing and reducing complexity.
50 CHROMING THE WEB – Google Chrome: game changing or folly?

SECTIONS
22 INFRASTRUCTURE
26 COMMUNICATIONS
30 ENTERPRISE INTELLIGENCE
34 RISK MANAGEMENT
38 STORAGE
42 MOBILITY
46 ENTERPRISE 2.0




Managing Editor_Darren Smith darren@technews.co.za Editorial Department Contributing Editor_Andrew Seldon andrew@technews.co.za Sub-Editor_Zamani Mbatha zamani@technews.co.za

ed’s column

portal Editor_Katie Wetselaar katie@technews.co.za Buyers’ Guide_Liz Seed liz@technews.co.za Columnists Brett Haggard, Simon Dingle, Paul Booth Contributors Reshaad Ahmed, Bruce Schneier, Justin Spratt, Gary Lawrence, Graham Duxbury, Paul Morgan, Armand Steffens Sales Department Sales Director_Jane van der Spuy jane@technews.co.za Sales Executives Shirley McGeer shirley@technews.co.za Malckey Tehini malckey@technews.co.za Tracy Karam tracy@technews.co.za Circulation Manager_Carmen Sedlacek carmen@technews.co.za Subscriptions_Justin Grove justin@technews.co.za

Welcome to the October issue of net.work (net dot work). I trust you enjoyed the first (September) issue. Feedback has been very positive, and serves to reinforce our decision to do the extreme makeover that we did. Some extracts include: "Firstly I'd like to extend my congratulations to you for an excellent new publication. The first issue of net.work is impressive. I love the new layout. It appears more reader friendly and more attractive to the eye. You should be very proud!" Karen Heydenrych, Predictive Communications. "Oh my. I am so impressed – the publication feels nice to the touch, great on the eye and oh so pretty! Well done! The best tech publication I have read through in a long, long time!" Natassia de Villiers, Kilimanjaro Communications. "Just seen the first issue of net.work. Looks fantastic." Alastair Otter, Tectonic.


Design & Layout Design_Infiltrate Media Production_Technique Design Repro & Printing_Intrepid Printers Photography_ Whitecliffs Photography Publisher Technews Publishing www.technews.co.za Reg No. 2005/034598/07 1st Floor Stabilitas Chambers 265 Kent Avenue, Randburg, 2194 PO Box 385, Pinegowrie, 2123 Tel: +27 (0)11 886 3640 Fax: +27 (011) 787 8052 URL: www.netdotwork.co.za Contacts Letters to the Ed_netdotwork@technews.co.za Subscriptions_subs@technews.co.za Copyright © 2008 by Technews Publishing. All rights reserved. No part of this publication may be reproduced, adapted, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior written permission of the publisher. Opinions expressed in this publication are not necessarily those of the editors, publisher, or advertisers.

Thank you to all who took the time to send your comments through. The feedback was very much appreciated, and those with a good eye for these things may note the subtle tweaks and changes here and there throughout this issue.

This month's content
In this issue of net.work, we focus on a number of topics, including mobility and wireless, virtualisation, mashups and some good common-sense insight into security ROI. Check out:
• LIFE CONVERGED – Converged mobile phone devices offer everything from mobile media to email. Smartphones let you take your work home and your life to the office. The questions this reality raises touch more than just technology. net.work investigates.
• VIRTUALISATION – SILVER BULLET OR LEAD BALLOON? – Companies of all sizes have begun experimenting with virtualisation in the hope of gaining some relief from rising energy costs and the growing complexity of their IT environments. But what does it really have to offer?
• MASHUPS TO DISSEMINATE GOVERNMENT INFO – The British Government recently made an unprecedented move by releasing a number of previously unavailable data sets together with APIs for use in composite applications or mashups. Will the Web 2.0-savvy public show up?
• SECURITY ROI – Return on investment, or ROI, is a big deal in business. It's a good idea in theory, but when it comes to security it's mostly bunk in practice.

Till next month, Darren Smith, Managing Editor. PS: All feedback, brickbats and praise, is welcome. Contact me directly on darren@technews.co.za.

Next month
Next month we'll report back on VMworld and Storage Expo; we'll take a long, hard look at business continuity and disaster recovery; we'll investigate where virtualisation is taking us and what it means to our businesses; and we'll try to get some insight into the impact virtualisation may have on business continuity and data integrity. We'll also focus on how effectiveness and efficiency are driving storage into the clouds. For example, a recent survey of 875 organisations by Storage Expo found that the main driver of their current storage policy is storage effectiveness (60%), necessitated by the need for reliability, scalability and access speed. The second most important driver was storage efficiency (33%), resulting from the need to balance cost against capability. The least popular driver was green criteria (7%). Which raises the question: are organisations prioritising effectiveness over efficiency when it comes to setting policy and making purchasing decisions?



picture this News in pictures, brought to you by Technews.co.za

Superimposed Rover on rim of Victoria Crater

Cryptic Studios ventures into new dimensions with IBM

Cryptic Studios, a leading independent developer of massively multiplayer online role-playing games, has chosen IBM servers to power its highly anticipated massively multiplayer online (MMO) action game, "Champions Online". IBM System x offers the responsive, scalable architecture that Cryptic Studios needs to deliver immersive and action-packed experiences.

NASA's Mars Rover Opportunity is setting its sights on a crater more than 20 times larger than its home for the past two years. To reach the crater the rover team calls Endeavour, Opportunity needs to drive approximately 12 kilometres to the southeast, matching the total distance it has travelled since landing on Mars in early 2004. The rover climbed out of Victoria Crater earlier this month. This image superimposes an artist's concept of the Mars Exploration Rover Opportunity on the rim of Victoria Crater, to give a sense of scale. Image credit: NASA/JPL/Cornell

Atec Atec Systems and Technologies, an IT service and infrastructure provider, has acquired 65% of SA Technologies (Satec), a cabling and networking specialist. Vodacom Vodacom has announced a $700m acquisition of the carrier services and network solutions subsidiaries of Gateway Telecommunications SA. The deal would potentially give Vodacom access to 40 markets including Nigeria. Gateway, which provides interconnection services via satellite and ground networks to telecommunications companies, has offices in 17 countries and recorded sales last year of over $250m. In addition, Vodacom has also acquired 51% of StorTech, a managed enterprise data centre services company providing a range of storage and security services. StorTech was formerly part of MB Technologies, but latterly management-owned. Altron Bytes Document Solutions, part of the Altron Group, has acquired NOR Paper Cape Town and NOR Paper Johannesburg. NOR is a supplier of papers and has a turnover in excess of R250m.


The Forgotten Computer
IBM this month celebrated the 50th anniversary of the Stretch supercomputer – a machine that was not a commercial success in its era, but helped revolutionise the computer industry by pioneering technologies that power everything from today's laptops and iPods to the world's largest supercomputers. The Stretch computer was IBM's audacious '50s-era gamble to create a monster computer, 100 times faster than the IBM supercomputer of the day, the 704. When introduced, it was considered a failure, being only 30 to 40 times faster than other systems. Fewer than 10 were built and the project was shelved. But the story doesn't end there. Stretch was packed with technology breakthroughs so innovative that they would not die; technology we take for granted today because it is pervasive throughout the computing landscape. Just a few examples:
• Multiprogramming, which enables a computer to juggle more than one job at a time.
• Pipelining, or lining up instructions in a queue, so that the computer doesn't have to wait between operations.
• Memory protection, to prevent unauthorised memory access – absolutely crucial in providing computer security.
• Memory interleaving, breaking up memory into chunks for much higher bandwidth.
• The eight-bit byte, establishing a standard size for the unit of data representing a single character.
These innovations help form the foundation

of modern computing. Following the demise of Stretch, they found a home in IBM’s next big project – the super successful System/360 mainframe – and from there entered the wider world of mainstream computing.



CERN’s Large Hadron Collider

The Large Hadron Collider (LHC), a 27-kilometre-long particle accelerator straddling the border of Switzerland and France, recently began its first particle beam tests. The European Organization for Nuclear Research (CERN) prepared its first small tests in early August, leading to a full-track test in September – with the first planned particle collisions to follow before the end of the year. Sadly, the LHC, near Geneva, will now be out of action for at least two months, after part of the giant physics experiment was turned off while engineers probed a magnet failure. A CERN spokesman said damage to the £3.6bn ($6.6bn) particle accelerator was worse than anticipated.

Afrigator.com acquired
Afrigator.com, a local start-up that has carved out a niche as a hub for user-generated content – blogs, podcasts and videos relevant to the African continent – has been acquired by MIH Print Africa. This comes hot on the heels of investments by 24.com in local social networking company Blueworld, and a minority-stake acquisition by cellular giant Vodacom in local social platform Zoopy. The Naspers group's social media strategy has really taken off in recent months, with local Internet guru Matt Buckland moving to 24.com as GM of online publishing and social media (Laaik.it, play.24.com, Spaces, 24.com blogs etc), Rafiq Phillips moving to MIH Swat and, of course, the acquisitions of Blueworld and Afrigator.

Cisco Opens Innovation Centre in South Africa
Cisco has opened a technology centre in Pretoria, South Africa, with an investment of $27m. It said the Cisco Innovation Hub Technology Center (CIHTC) will develop local skills, intellectual property, entrepreneurship, and development capabilities. Steve Midgley, managing director of Cisco South Africa, said, "Cisco is investing in this initiative to ensure that when the broadband revolution really kicks off, South Africa has enough of the right kind of skills, solutions creation

capabilities, and intellectual property to leverage the many benefits broadband will bring to the country and help leapfrog other economies.” The company already has several ICT initiatives in Africa. In 2007, it announced the Broad-Based Black Economic Empowerment initiative and opened its East African headquarters in Nairobi. The Nairobi office also serves as a training and competency centre to help countries in Africa to develop ICT strategies for productivity growth and social inclusion. In June, in collaboration with Absa Bank, the company launched a financial solution for SMBs.

Net 1 UEPS Technologies SA-based but Nasdaq-listed Net 1 UEPS Technologies has acquired 80,1% of Austrian-based BGS Smartcard Systems AG, a provider of smart card-based payment systems. Net 1 UEPS Technologies has also announced that it intends to apply for an inward listing on the JSE. Altech Altech has won a High Court ruling in favour of a 'self-provisioning' capability for all network service providers, thus forcing ICASA to extend VANS licences to any operators that wish to build their own networks. DVT AltX-listed DVT has proposed a name change to DTH Dynamic Technology Holdings. Ifca Technologies Kian Keong Yong has been appointed as acting CEO of Ifca Technologies, following the resignation of Craig Christensen. Telkom SA Telkom Media has announced that it has been issued a pay-TV licence by ICASA. A consortium, including a fund connected to the PIC, has emerged as a possible candidate for a stake in Telkom Media, which is 66% owned by Telkom SA.



news Brought to you by Technews.co.za

Epson Albert Fayard has been appointed as GM for Epson Southern Africa, following the move of Hans Dummer to Epson's London office.

T-Mobile’s G1 Google phone

Gateway Communications Gateway Communications has established a Central African regional office in Cameroon. GijimaAST Robert Gumede has resigned as executive chairman of GijimaAST, although he stays on as non-executive chairman. Microsoft Mteto Nyati has been appointed as Managing Director of Microsoft SA. ACET Processing PIC Solutions, a specialist credit risk management consultancy in the MEA region, has sold its ACET Processing entity to its management via an MBO transaction, which will enable the unit to expand its service offerings as a provider of solutions in consulting, analytics, software and training. Kelly Group The Kelly Group has acquired Torque IT, an ICT skills development and training company, thus strengthening the former's presence in this sector. Mustek Mustek has announced that it has established a Middle East office, based in Dubai, that will service the Middle East and North African regions.

Say What#@? WWWOW 2,4 million South African adults accessed the Internet in the past 7 days (Sep 2008), an increase of 30% over a year ago. (AMPS 2007, 2008).


A week can be a long time

When it comes to news, a week can be a long time. When it comes to tech news, it can feel like a millennium. The big stories playing out include the launch of T-Mobile's Android-based G1 'Google phone', the 'Sarah Palin hacker' hysteria, the shutdown of CERN's £3.6bn ($6.6bn) LHC, and the cloud computing strategies of Oracle, IBM, VMware and others …

NetApp announces new go-to-market strategy for South Africa
Storage and data management specialist NetApp has reworked its business model and go-to-market strategy in South Africa, a move designed to improve the customer experience and better align with corporate goals and messaging. Now structured under the leadership of Regional Director Martyn Molnar, NetApp's South African business is moving to a 100% indirect sales model and driving forward its local partner base and activities. "The industries that are strong in South Africa, such as mining, exploration, media and finance, are traditionally ones with large and rapidly growing data volumes. That makes them predestined for our high-performance, integrated and scalable storage and data management solutions," said Andreas König, Senior Vice President and General Manager EMEA, NetApp. "By moving to an indirect sales model in South Africa, we are able to meet our service level agreements with our customers and help more end users achieve business benefits."

NetApp’s Martyn Molnar

Plessey Appoints New CEO
Plessey has announced the appointment of Zellah Fuphe as Chief Executive Officer of Plessey South Africa. Fuphe joins Plessey from Worldwide African Investment Holdings, where she served as Managing Director for the past four years. Says Fuphe, "I am pleased to join Plessey at this important stage of its growth. The deregulation and liberalisation of the South African telecommunications industry is an exciting time, which offers significant opportunities for the company." [Pictured: newly appointed CEO of Plessey South Africa, Zellah Fuphe.]

SAS, Tata Consultancy Services partner
SAS and Tata Consultancy Services (TCS) have announced a global partnership. Together, the companies will deliver to joint customers complete and innovative technology solutions that improve performance, enhance profitability, and deliver insight and intelligence. "There is increasing demand for solutions that can help organisations glean timely and useful intelligence from volumes of existing data, thereby gaining a competitive edge through actions that boost organisational performance," says Dr Santosh Mohanty, Global Head, Technology Excellence, at TCS. "Business analytics – BI, analytics, data integration and performance management – meets that demand. SAS and TCS are a powerful combination for fulfilling the promise of business analytics and helping global companies succeed."




TeliMatrix TeliMatrix has announced that it is proposing a name change to MiX Telematics, in order to better reflect the company’s new corporate identity and branding.

HP cuts 25 000 jobs in wake of EDS acquisition
HP has announced plans to restructure the EDS business group it acquired for $13.9bn in May. It said this will result in 24 600 job cuts, approximately 7,5% of the combined companies' workforce. HP said the three-year restructuring plan will better align the combined companies' overall structure and efficiency, with annual cost savings of approximately $1.8bn. It expects a $1.7bn restructuring charge in the fourth quarter. It said most of the cuts will come from within EDS, and nearly half will be jobs in the US. The divisions expected to be worst hit include human resources, legal departments, and finance.

Naspers MIH Print Africa, a subsidiary of Naspers, has acquired Afrigator Internet (Pty) Ltd, a social media aggregator and blog directory company that was created for African consumers.


T-Systems Mardia van der Walt-Korsten, CEO of T-Systems SA, won ‘The Business Woman of the Year’ award in the corporate category. Teraco Data Environments This is a new company created by a number of ex-Storm investors and management to set up two data centres, one in Cape Town and the other in Johannesburg. Seacom Seacom, the venture capital cable company that is laying an Africa East Coast cable, has announced that it is on schedule for its ‘go live’ date of June 2009.

Say What#@? Yebo for more. The number of adults with a prepaid cellphone subscription has quintupled (5x) from 2000 to 2007. In 2000, 3 million adults claimed to have a prepaid cellular phone subscription. (Source: AMPS 2000 and 2007)



Lumidigm Signs Agreement with Brand New Technologies
Lumidigm, a provider of world-class biometric sensors, has signed a reseller agreement with Brand New Technologies, a South African firm that specialises in biometric security solutions. "We are pleased that Brand New is introducing the benefits of Lumidigm's multispectral technology to their customer base in Africa," said Seth Miller, a business development specialist at Lumidigm. Under the agreement, Lumidigm and Brand New will jointly develop opportunities for biometric applications in multiple industries, including mining, healthcare, finance, and civil identification. Cooperative marketing efforts are already under way. "We became interested in Lumidigm's multispectral technology because of its real-world capabilities," said Dave Crawshay-Hall, Chief Technology Officer at Brand New. "The Lumidigm sensors close a gap in our product offerings and make Brand New uniquely able to provide complete and robust biometric solutions."

Cisco Acquires Instant-Messaging Start-Up
Cisco has agreed to acquire open-source instant-messaging start-up Jabber for an undisclosed sum to enhance its unified communications and collaboration product portfolio. Colorado-based Jabber provides a messaging platform that supports different devices, users, and applications and allows collaboration across Microsoft Office Communications Server, IBM Sametime, AOL AIM, Google, and Yahoo. After the acquisition, Jabber will become part of the Cisco Collaboration Software Group (CSG). Cisco said the acquisition will allow it to incorporate Jabber's presence and messaging services in the network and offer aggregation capabilities to users through both on-premise and on-demand applications across multiple platforms including

Cisco WebEx Connect and Unified Communications. Doug Dennerline, senior vice president of Collaboration Software Group at Cisco, said, “Enterprise organisations want an extensible presence and messaging platform that can integrate with business process applications and easily adapt to their changing needs. Our intention is to be the interoperability benchmark in the collaboration space.”

IBM opens research centre
IBM has opened a research centre to develop collaborative applications and other types of social software. It said the Center for Social Software in Cambridge, Massachusetts, will develop and commercialise best practices in social networking and will create jointly funded research collaborations with government, academia, industry, and venture capitalists. The centre is part of IBM's Tomorrow at Work initiative for the research and development of technologies that align with new workplace trends and cultural changes. IBM will identify new business models for its web 2.0 collaboration portfolio, which includes social discovery, social search, and scalable architectures for social software including cloud computing. Microsoft will also open Research Center New England, also in Cambridge, this fall, which will look at the sociological, psychological, and technical aspects of social networks. Irene Greif, IBM fellow and director of the IBM Center for Social Software, said, "The Center for Social Software is a channel for the social computing community and our customers to collaborate on the most innovative social technologies being developed today. We view the centre as a magnet for the top social computing scientists around the world to visit, share work and innovate."


Cray unveils Windows-based Supercomputer
Supercomputer company Cray and Microsoft have collaborated to develop a desk-side, low-end, bladed office supercomputer for customers who require compute-intensive environments. The Cray CX1 supercomputer comes pre-installed with Windows HPC Server 2008. It uses standard office power and incorporates up to 8 nodes and 16 Intel Xeon processors, delivering up to 64GB of memory per node and up to 4 terabytes of storage. The system can be configured with a mix of compute, storage, and visualisation blades to meet individual requirements, and is interoperable with Linux. Cray said system configuration in the CX1 will be driven by wizards and GUI interfaces for straightforward set-up. The company is also building a web community where users can offer support and advice to each other. The CX1 supercomputer is the first product launched by Cray following its partnership with Intel as its chip supplier, after it ended its collaboration with long-time chip partner AMD. The supercomputer will compete with HP's "Shorty" and low-end machines from Silicon Graphics. Vince Mendillo, director of HPC at Microsoft, said, "Windows HPC Server 2008 in combination with the Cray CX1 supercomputer will provide outstanding sustained performance on applications. The combined solution will enable companies to unify their Windows desktop and server workflows."

O3b Plans Satellite Network for Emerging Markets
US-based telecoms start-up O3b Networks plans to deploy a satellite network providing high-speed internet connectivity to emerging markets in Asia, Africa, Latin America, and the Middle East. It plans to provide a high-capacity satellite connection to the internet, which can be used by telephone companies and ISPs to offer mobile and web-access services. It has raised approximately $60m from investors including Google, HSBC Holdings, Allen & Company, and Liberty Global. The company, founded by telecommunications entrepreneur Greg Wyler, expects to launch 16 satellites, which will be operational by the end of 2010. The infrastructure will provide internet backhaul at speeds of 10Gbps. The company has approached French aerospace group Thales about the construction of the satellites.

NEC and Unisys Develop Server Platform
NEC and Unisys have completed the development of a common platform for enterprise servers, the result of a 2006 alliance agreement between the two companies. Systems based on the platform, which uses Intel Xeon processors, can scale up to 16 sockets and 96 processor cores, supporting up to 1 terabyte of memory. The systems are targeted at high-end enterprise computing tasks, including database-intensive activities such as online transaction processing,


business intelligence, and ERP. NEC is manufacturing systems based on the platform at its subsidiary NEC Computertechno in Japan. It said both it and Unisys will market the system as their own branded server lines.

Citrix Outlines Cloud Computing Strategy
Citrix Systems has outlined its cloud computing strategy and introduced a new product family for hosting, managing, and delivering cloud-based computing services. It said the Citrix C3 family integrates "cloud proven" virtualisation with networking products and offers providers a set of service delivery infrastructure building blocks for cloud-based services. It includes a reference architecture that combines the capabilities of various Citrix product lines to offer a service-based infrastructure for large-scale, on-demand delivery of both IT infrastructure and application services. The architecture consists of four key components – XenServer Cloud Edition, NetScaler, WANScaler, and Workflow Studio – to provide value-added security, high-availability and multi-tenant services. Citrix has also partnered with cloud computing technology and utility computing services firm 3Tera. The new C3 offering will be combined with 3Tera's AppLogic cloud computing platform to provide enterprise-grade cloud computing applications to customers of all sizes.

Say What#@? Everywhere you go, the better connection. In 2000, 16% of South African adults had a cellphone. By 2007, this proportion had grown to 56%. (Source: AMPS 2007B RA)

NITEL NITEL has announced the appointment of Tom Iseghohi as its Chairman. ProScan Grp The ProScan Group has acquired BCA Bar Code Alliance, a label and print solutions provider of over 20 years' standing. The ProScan Group is a specialist in supply chain, automated data collection and mobile computing technologies in southern Africa. Siemens Siemens has restructured into three sectors focused on industry, energy and healthcare, with Siemens IT Solutions and Services continuing to operate as a division across all three entities. Siemens recently exited the telecommunications sector entirely. Smile Communications South African-based Smile Communications has been awarded a telecommunications licence in Uganda. Smile is led by former MTN director Irene Charnley and owned by a consortium that includes Saudi Arabian investors. TBA (no company name as yet) 'The big telcos' (Broadband Infraco, MTN, Neotel, Telkom and Vodacom) have announced that they intend to build a cable down the west coast of Africa to supplement the current SAT-3 system. The joint venture is likely to kill off other floundering initiatives, including Uhurunet.



feature


By Simon Dingle

Converged mobile phone devices offer everything from mobile media to email. Smartphones let you take your work home and your life to the office. The questions this reality raises touch more than just technology.



Smartphones are the pinnacle of convergence. They connect wirelessly at broadband speeds using the likes of HSPA and have increasingly useful interfaces. They do email and productivity tools, take pictures and video, play media and keep you in touch with your social networks. Mobile computing is shifting off the laptop and onto the mobile phone. And if your business device is always with you and always on, where does work stop and play begin?

Of course, some applications still require full-sized keyboards and big-screen real estate. You aren't about to edit video or type a novel on your mobile phone. But these interface issues are on their way out too. "HTC has a phone due for launch this year with a VGA output," says Colin Erasmus, Business Group Lead: Windows Client for Microsoft South Africa. "There are also external keyboards available for smartphones." Other technologies in the pipeline include projectors built into smartphones and keyboards that are projected onto just about any solid surface. Bringing down the prices of these new technologies is the only barrier. Imagine putting your phone down on a table, having your monitor projected onto the wall and a full-sized keyboard onto the desk. If the device has sufficient processing power, what would you need a laptop for?

Small and cheap
Netbooks are filling the gap for affordable, ultraportable computers in the interim, until smartphones take the final leap in terms of interface. The ASUS Eee PC, HP 2133 and Acer Aspire One are examples of tiny, cheap and connected devices that let you get the job done on the road. They don't quite fit into your pocket, however. "Microsoft has made a commitment to supporting netbook and nettop devices as a supporting strategy in terms of the 'next billion users' concept that the company is driving along with other vendors," says Erasmus. "This entails getting connected devices into the hands of impoverished people." With the large penetration of mobile phones in South Africa, for one, Erasmus says the cellular market provides an entry point into the mass market of the developing world. "Converged devices are fairly expensive, however," he states. "But we are seeing prices come down and good implementations of cellphones across the board, especially in education." Eventually, Erasmus reckons, South Africa will be like Taiwan, where more smartphones are sold than laptops. He says consumer use of smartphone devices is also pushing extended use into the business environment.

Unifying communications
Information worker technologies are making their way into the mobile arena. As the world's leading provider of productivity technologies, Microsoft is gearing everything from Exchange to Windows Live Messenger for the mobile revolution. "Unified communications has tremendous promise in terms of being able to work anywhere," says Danie Gordon, Unified Communications and Collaboration Product Solution Manager for Microsoft South Africa. "People are being as productive on the road as they are in the office. But if we look at the mobile phone versus the laptop, we are not able to offer the full range of functionality on the former yet. Having a full meeting remotely on a smartphone with voice, video, slides and other interaction isn't possible just yet, given the simple real estate of the device. But that experience is very well catered for on laptops," he continues. "Smartphones are already good at bringing your presence, instant messaging and other things to the mobile meeting space." Microsoft has its own mobile platform in the form of the Windows Mobile operating system, but is using standards to allow other platforms to speak to its technologies, such as the recently launched Exchange connector for Nokia phones.




Google on mobile With local offices established in SA, Google has said that mobile will be a priority focus for the search giant in the region. Google continues to expand its mobile offerings worldwide and recently launched AdWords for mobile, allowing advertisers to target users on the mobile web.


“One in six Google searches in South Africa happens on a mobile device. That’s the highest ratio in the world,” says Stafford Masie, country manager for Google South Africa. “Mobile phones today have the same computational power as desktop computers did eight years ago,” he adds. “You have to bet on the trend to mobile.” Google is manifesting its commitment to mobile in the development of its Android mobile operating system for smartphones. The first Android phone will be available shortly. “What we’re trying to do with Android is bring iPhone-type functionality to the masses at prices

they can afford," says Masie. "In the shorter term, and in terms of South Africa, we've launched universal search for mobile. SA was the first country in the world to receive this service," he continues. "Now when you search for something on a mobile device, Google no longer just returns natural web search results, but presents images, videos that can play on a mobile device and geo-based content. We're also seeing that mobile AdWords enjoy a very high click-through rate." Google's focus on mobile is indicative of where things are going, and other vendors are bound to follow suit, if they haven't already.

Business becomes life
Deon Liebenberg, Regional Director: Sub-Saharan Africa for Research In Motion (RIM), the company that develops the BlackBerry, agrees that consumerism will lead the smartphone transition. "It's about empowering the employee to work anywhere," he says. "But I still speak to CIOs who want to block Facebook and the like on their users' devices. And I say, 'So you expect John to take time-sensitive work emails on a Saturday morning, but you won't allow him to update Facebook on a Friday?'" The move to mobilising the workforce requires a change in attitude from employers and their employees. Instead of seeing work and life as separate and modular, we must begin to see the two as converged – just like the devices we use. Bosses need to accept that their workforce will use social networks and communicate with friends at work. Employees must in turn accept working during 'off' time. At Gartner Symposium 2008, it was emphasised that employers should not only allow the likes of social networking interaction at work, but also realise the business benefits of Web 2.0 tools such as Facebook and Twitter.


And taking work on the road maximises productivity. "With the new mobile approach we emphasise turning downtime into uptime," explains Liebenberg. "You can use those five minutes in the car to do something useful." "Gone are the days where you carry around a GPS device, an MP3 player and a camera," he adds. "Now you carry around one converged device that does all those things." It comes into work and goes on holiday with you.

The killer app
Email is the first thing most users want to set up on a smartphone. BlackBerry pretty much invented the portable email market. But now other mobile email clients, along with IMAP, mobile browsers and other technologies, allow you to get mobile email on any device. So why pay for a BlackBerry subscription? "Because users, and particularly South Africans, are conscious of price," answers Liebenberg. "You might be able to get your email and other data on any device, but how much is it costing you in terms of mobile bandwidth? With a BlackBerry subscription you get unlimited bandwidth to your

BlackBerry device for around R69 a month." Liebenberg opines that having unlimited bandwidth to the mobile device is key to fully realising smartphone usage. There is also a shift from search to discovery that is being acknowledged by savvy vendors. "BlackBerry believes in having dedicated applications for things like Facebook and Google services. The reason for this is that it is more efficient to be notified of events than to have to check on them. A Facebook application will alert you when there is something new happening on the platform, instead of you wasting time checking the website," he explains. With increasingly busy lives and the reality that we are all trying to do more with less, mobility is critical and is growing at a rapid pace. "According to Gartner, more smartphones will be sold than laptops globally this year," says Liebenberg. "By 2011 there will be 300 million of them around. People don't want desktops. Everything is merging into one, converged device. Smartphones are the future." Your smartphone is probably already capable of meeting most of your portable business needs. The question is, are you ready for it?

[Opinion] Modern smartphones are feature-rich and sport processing power comparable to desktop computers of just a few years ago. They play media, have pervasive Internet connectivity, take photos and do almost anything your desktop or laptop can. Smartphones are already able to handle a good deal of your work while on the road and go everywhere you do – but are you ready to converge your work and play?


feature

Virtualisation

Silver bullet or lead balloon? Companies of all sizes have begun experimenting with virtualisation in the hope of gaining some relief from rising energy costs and the growing complexity of their IT environments. But what does it really have to offer?

By Brett Haggard

Virtualisation is on the agenda of most businesses today. Unfortunately, few of them have thought their virtualisation efforts through properly, evaluated what they are trying to achieve with their consolidation projects or, most importantly, made provision for the changes their IT strategy will have to undergo in order to embrace this new way of thinking. It's becoming clear that virtualisation on its own will not reduce an organisation's power consumption, or the costs of running its IT environment – in fact, if virtualisation isn't done correctly it can end up doing the exact opposite. But let's take a step back. Why exactly is virtualisation such a hot button at present, and what can it do for a business?

It just makes sense
Virtualisation is, quite simply, the route to better value from one's IT investments. "Budgets are shrinking and demands on IT performance are rising," says Gordon Love, regional manager at Faritec. "The net effect is that IT departments are having to do more with the same or fewer resources, while all the time keeping the costs of maintaining their infrastructural elements to a bare minimum.



"Virtualisation is the most cost-effective, risk-averse and quickly implementable solution to this overriding problem modern IT departments face," he adds. "By consolidating multiple workloads onto far less physical hardware, we have seen virtualisation become capable of delivering reductions in both the costs of running a datacentre (impacting real estate, power and cooling requirements) and the costs of administering the hardware and software that resides within it," Love explains. "Furthermore, through the ability to securely back up and migrate entire virtual environments, virtualisation has helped us to eliminate unplanned downtime and ensure that our clients' IT environments can recover far more quickly from unplanned outages," he says. But virtualisation isn't just making inroads into saving money and reducing complexity. With Stratus Technologies' announcement last month of a software-based high-availability solution for the commodity Intel x86 server realm, it has also become a solution to small and medium-sized businesses' uptime woes.

Increasing uptime
Stratus' new offering is designed to afford SMBs, and larger organisations with branch infrastructure, greater reliability, improved efficiency of their server resources and a reduction in the cost of managing their server infrastructure. Importantly, the solution is designed to fit in with the budgetary constraints these organisations are normally hamstrung by. The solution is called Avance, and David Laurello, CEO and chairman of Stratus, who was in South Africa for the launch of the product last month, says it enables SMBs and branch offices to begin benefiting immediately from uptime that would normally be outside their annual IT budgets. Laurello says businesses are facing three challenges when it comes to IT infrastructure today. "Firstly," Laurello says, "as organisations have become more reliant on their IT infrastructure as an enabler for business, so they have realised that they cannot tolerate downtime. "A second priority is cost savings," he continues, "and one of the most effective ways for a business

to save money in their IT environment is for them to consolidate their applications onto less physical infrastructure. "Lastly," he says, "SMBs generally don't have a wealth of IT skill at their disposal and larger corporates generally don't have IT resources present at each of their branches. "Overall, we're finding that organisations are opting for strong remote management and problem resolution capabilities across the enterprise to deal with this issue," he says. Laurello says Avance gives businesses of all sizes the ability to make inroads into each of these areas. "It is a veritable silver bullet when it comes to solving some of the most pressing IT issues being faced by businesses today," he adds.

Good potential for South Africa
Stratus believes that Avance has a wealth of potential in the South African market. "Avance is a solution that's strongly differentiated from competitive offerings by virtue of its simplicity," says Dick Sharod, country manager for Stratus in South Africa. "Unlike competitive solutions, it is designed to be installed within 15 minutes and be fully operational in more or less the same time-frame. It is also designed to work in an environment where a high level of IT skill is not present. So there's no complicated scripting and ongoing support required to keep the solution up and running," he says. Sharod adds that since Avance touches on a far larger segment of the availability market than Stratus' classic continuous-availability solutions, the potential customer base at its disposal is also much bigger. "Because the South African market consists of so many small and medium-sized organisations, we believe that massive potential exists for our partners to drive the solution locally," he says. Sharod says his company is also predicting that Avance will give partners an entirely new way of engaging with customers on the service front. "Today," he says, "the companies handling the support services for enterprise customers' branch infrastructure have nothing else to compete on but price. "By guaranteeing their customers a higher level of



uptime than before, and then using their existing infrastructure to deliver better uptime, reductions in management overhead and the ability to consolidate their server resources, service and support organisations will have something new and extremely powerful to differentiate themselves on," Sharod says. "We are extremely positive about the impact Avance can have in the sub-Saharan African markets," he concludes.

Fixing threading with virtualisation
One of the biggest problems facing the IT industry today is the disconnect between new hardware and traditional software. To keep raising the performance of their processors, vendors like Intel and AMD have implemented multi-core architectures. The problem is that taking advantage of these architectures requires multi-threaded application code – something that is still a relatively rare commodity in the market today. Virtualisation's ability to run multiple distinct virtual machines on these new processors provides a good workaround, however. By placing four virtual machines on a single four-core computer and allocating a core to each virtual machine, companies can run four instances of single-threaded code, thus making optimal use of the hardware.
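The core-per-workload idea can be sketched in miniature without a hypervisor. The Python fragment below is a minimal illustration under stated assumptions: it pins ordinary single-threaded worker processes to individual cores using Linux process affinity, standing in for the allocation of one core per virtual machine. It assumes a Linux system (where os.sched_setaffinity is available) and is not code from any vendor mentioned in this article.

```python
# Minimal sketch: one single-threaded worker pinned to each core,
# mimicking four VMs allocated one core each on a four-core machine.
# Assumes Linux, where os.sched_setaffinity is available.
import multiprocessing
import os


def single_threaded_work(core: int) -> None:
    # Restrict this worker to a single core (pid 0 = the calling process),
    # much as a hypervisor would dedicate a core to one virtual machine.
    os.sched_setaffinity(0, {core})
    total = sum(i * i for i in range(10_000_000))  # stand-in workload
    print(f"core {core}: done ({total})")


if __name__ == "__main__":
    cores = range(min(4, os.cpu_count() or 1))
    workers = [multiprocessing.Process(target=single_threaded_work, args=(c,))
               for c in cores]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

Run on a four-core machine, the four workers proceed in parallel without contending for the same core – the same effect the sidebar describes, only achieved with virtual machines instead of processes.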

Thinking right
While the benefits are clear, the big question remains: are businesses thinking about virtualisation in the right way? Andrew Fletcher, enterprise storage and server boss at HP South Africa, says his company recently conducted some research into the virtualisation space and found the results quite sobering. "We found that although more than 85 percent of technology decision-makers are engaged in virtualisation projects, most of them expect to have only completed the switchover of a quarter of their environment by the end of 2010," he says. And while many of those surveyed believe they will reach the 75 percent mark, Fletcher says only one-third of them recognise virtualisation as a valuable business tool. The rest – two-thirds of respondents – relegate virtualisation to the role of technology enabler and nothing more, he says. And that's worrying. Fletcher says virtualisation is a valuable business tool – it should be viewed as a move by the business to reduce costs and gain agility, not as a move by its IT department to make life easier. From HP's perspective, doing virtualisation right means managing and automating mixed physical and virtual environments, and having access to the tools that can control how applications behave in this new environment, how operations are managed in it, and how the various infrastructure and client architectures are kept in check.

An end-to-end portfolio
To rectify some of the issues its research revealed, HP recently embarked on a massive virtualisation drive, claiming to have the broadest portfolio of virtualised environments available in the market today. Its approach to virtualisation, Fletcher says, is focused on removing the technology inhibitors that reduce virtualisation's impact on the business. "It highlights how applications and business services can perform well regardless of where and how they are hosted, networked or managed; and dramatically simplifies management across a combined virtual and physical world. It also addresses the issue of pooling infrastructure resources across an organisation," he says. Fletcher says the aim is to lower operational cost, mitigate the risk of a heterogeneous environment and free up resources so that the business can embark on the rollout of new business services.

Breaking new ground
HP's virtualisation announcements comprise a set of enhancements to its HP-UX operating system, a new blade server designed expressly for virtualisation, and a new set of scalable NAS File services that 'virtualise' the connection between servers and storage to create a consolidated, 'shared data' architecture. HP's claims for these new offerings are bold. With its revamped HP-UX, it reckons that companies can expect to gain in the region of a 50 percent performance improvement for Oracle



applications, SAP applications and other mission-critical applications of a similar kind. On the blade server front, it says a typical configuration of its new BL495c can support a minimum of 32 virtual machines (VMs) out of the box. Furthermore, it says that filling an HP BladeSystem c7000 enclosure with 16 BL495c server blades allows for up to 512 VMs to be serviced, versus the 256 and 112 VMs offered by its closest competitors. Lastly, its HP StorageWorks 4400 NAS File services solution's 'virtualised' connectivity results in every server in a network having access to all data storage. "This allows for pools of resources to be applied wherever necessary," Fletcher says. The theory is that since applications can access data sets from any server, it becomes easier and faster to move them from one server to another to increase productivity.

Utility-based pricing arrives
Rounding out its offerings, Fletcher says the strong interest the company has noticed in the utility-based pricing model has urged it to offer just this kind of service to its customers. "With this model, customers have instant access to additional capacity right inside the servers they're currently using – it simply needs to be activated," he says. Customers can choose either to enable this additional capacity temporarily at times of high load, or to enable it permanently as and when their needs grow. If all the new virtualisation solutions HP has launched over the past month truly do what it claims they do, it could be onto a good thing. But as HP's research notes, truly virtualised environments are some way off for the vast majority of organisations. Like any tectonic shift in the market, virtualisation will take time to fully pan out. Whether it will all come too late remains to be seen. With cloud computing on the horizon, and the two largest players in the IT market at present, Google and Microsoft, pushing hard in this direction, virtualisation may be a moot point in a couple of years' time.

[Opinion] Virtualisation is proving to be more of a beast than the market ever anticipated. While it started out as a relatively simple concept, as it has gained traction it has had to conform to the same rules of governance and business benefit as preceding offerings. That has meant businesses need to take time to understand it, implement tools to manage and tune it, and choose solutions, from the myriad available, that are capable of integrating with each other. Its value is unquestionable – whether or not organisations have the appetite for adapting to the changes it will ring in is an entirely different story.



feature

Mashups to disseminate government info

The British Government recently made an unprecedented move by releasing a number of previously unavailable data sets together with APIs for use in composite applications or mashups. The 'Show us a Better Way' initiative by the UK's Cabinet Office aims to get input from the Web 2.0-savvy public on how government services can be improved using mashups. (Source: ComputerWire)

This welcome but unusual public consultation includes a competition with prize money of £20 000 ($40 000) to seed-fund the best mashup entry. Many view the mashup as the 'next big thing' for Web 2.0. It is basically a website, or more usually a web-based application, comprising two or more resources assembled from different sources on the web but presented to the user as a single, seamless 'experience' or application. Today, most developers experimenting with mashups use consumer-centric content from the likes of eBay, Amazon, Google and others, but in future corporate developers will combine web service elements from a variety of sources, ranging from commercial offerings to bespoke, in-house, line-of-business services enabled by service-oriented architecture.

Web services proliferate
As one might expect, the software for building consumer-oriented mashups is typically offered by the provider of the web service being consumed as part of the mashup, so developers will work with the documented web service calls from, say, Amazon Web Services, the eBay Developers Program, Windows Live Dev, and so on. According to ProgrammableWeb.com, over 50% of mashups consume the Google Maps API, with Flickr (a popular site for photographs) coming in second at 10% and Amazon third at 8%. Citizen-centric mashups, by contrast, require government data and APIs. The sets released by the UK government are non-personal data that include information from healthcare providers, notices from the London Gazette (the government's official journal and newspaper of record), a list of schools from the Edubase database, maps from Ordnance Survey, and carbon footprint information from Defra. The move by the government is another step along the way to compliance with the EU Directive on the re-use of public sector information (PSI). The aim of the directive is to encourage the re-use of PSI by removing obstacles that stand in its way. The main themes of the directive are improving transparency, fairness and consistency. It was introduced two years ago to stimulate the development of innovative new information products and services across Europe, so boosting the information industry.
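To make the pattern concrete, here is a minimal mashup sketch in Python: one function pulls records from a government schools feed, another geocodes each record, and the merged list is what a mapping front end such as Google Maps would plot. Both endpoints, the field names and the JSON shapes below are hypothetical placeholders, not the actual APIs released by the UK government.

```python
# Minimal mashup sketch: combine two web services into one result.
# Both URLs below are hypothetical placeholders, not real endpoints.
import json
from urllib.request import urlopen

SCHOOLS_API = "https://example.gov.uk/api/schools?area=camden"   # placeholder
GEOCODE_API = "https://example.com/geocode?postcode={postcode}"  # placeholder


def fetch_json(url: str) -> dict:
    # One web service call; a real mashup would add error handling/caching.
    with urlopen(url) as response:
        return json.load(response)


def school_map_points() -> list:
    points = []
    # Source 1: the released (non-personal) government data set.
    for school in fetch_json(SCHOOLS_API)["schools"]:
        # Source 2: a geocoding service, keyed on the school's postcode.
        geo = fetch_json(GEOCODE_API.format(postcode=school["postcode"]))
        points.append({"name": school["name"],
                       "lat": geo["lat"], "lng": geo["lng"]})
    # The merged list is the single, seamless 'experience' – ready to be
    # handed to a mapping API for display.
    return points


if __name__ == "__main__":
    print(school_map_points())
```

The point of the sketch is the shape of the thing: neither source knows about the other, and the value is created entirely in the combination.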

The penny has dropped

The government's move certainly fits into this strategy. It shows that the penny has dropped with regard to the use of Web 2.0 technologies in delivering public services, and there are already several benefits as a direct result. First, we have a number of data sets that were not available to the public before. These should be of use to people and businesses, and of interest to those who scrutinise government services. Second, we have the first technology-led public consultation of its kind. This is a good move in the age of crowdsourcing, which taps into the 'crowd' – individuals connected by the Internet – to help solve problems, suggest innovations, and provide or improve services. Crowdsourcing is an efficient way to multiply the power of thought supporting a project, and is likely to result in more innovative solutions than a traditional project team limited to a handful of people. Third, the move is good for the image of the public sector, which is often seen as lagging behind the private sector in adopting new technology. Furthermore, the government should gain good insight into what people want when it comes to public services, and deliver more user-friendly online services. If the initiative is successful and third parties start to provide access to the data through mashups, we will be able to view the information in combinations that were not previously possible, e.g. schools data linked to maps, or carbon footprint information linked to the introduction of legislation, thereby providing researchers with richer information for analysis.

[Opinion] The ‘Show us a Better Way’ initiative should ease the burden on the government that comes with being the sole holder or disseminator of public sector data, releasing some of the demand pressures on the sector for electronic access to information. Now, if only the penny would drop with the South African authorities.




Security ROI, by Bruce Schneier

22 INFRASTRUCTURE – Finding practical and effective solutions to help cut costs, reduce power consumption and deliver world-class data centre facilities.

26 COMMUNICATIONS – Learning about best practices for traditional communications services, how to evaluate and select outsourcing alternatives, and leading the transition to VoIP.

30 ENTERPRISE INTELLIGENCE – The best of BPM, CRM, ERP, e-commerce, business intelligence, project management, application management, and portal software.

34 RISK MANAGEMENT – Identifying, mitigating and resolving threats as they become more sophisticated and cause more damage to businesses than ever before.

38 STORAGE – Learning how key storage and storage network technologies work together to drive your business. Clarifying how to use storage protocols and technologies.

42 MOBILITY – Wireless, driving both technology and strategy, is the future of IT and should be at the strategic heart of your organisation's IT plans.

Section key: QA = Q&A; WP = Whitepaper; CS = Case Study; UD = Update; OP = Opinion; TL = Thought Leader; RS = Research.



infrastructure Finding practical and effective solutions to help cut costs, reduce power consumption and deliver world-class data centre facilities.

UD Chips ahoy! As expected, firm details of Intel’s new Nehalem processor and a sneak peek at its upcoming Larrabee processor dominated the three days of announcements from Intel’s Developer Forum (IDF) held in the US recently. net.work went to San Francisco for the event. By Brett Haggard

Ever since chipmakers like Intel and AMD began edging up against the physical limits of the clock speeds their processors can run at, the semiconductor space has become more interesting. Instead of being judged on the clock speed of their silicon, vendors are getting judged on the number of features they can pack into their new offerings, the degree to which architectural changes can eke extra performance out of a chip and, lastly, how many cores (distinct processing units) they can cram onto a single die. Although AMD contends its architecture will allow it to take the performance crown back in the coming months, Intel remains the leader for the time being.


With the availability of its new processor line, Nehalem, and the upcoming Larrabee platform that it says is set to revolutionise the way we think about conventional computers, AMD might have a tougher time than it anticipates. Just plain faster Nehalem is the code name for the new family of processors Intel showed off at IDF. Officially dubbed Core i7, the processor family is based on the 45-nanometre manufacturing process that its Penryn processors were built on, but features an evolved Core architecture. While clever design tricks have been the way Intel and

AMD have delivered most of their increases in performance, sometimes you can’t argue with the benefits derived from making things just plain faster. One example of this shines through with the on-die triple-channel DDR3 memory controller that Intel has built for Nehalem. Coupled with its new ‘QuickPath Interconnect’ – a new high-speed interconnect that resides between the processor, chipset and memory – Nehalem’s memory bandwidth climbs to three times that of previous-generation Core microarchitecture solutions. The new microarchitecture also features evolved Hyper-Threading technology capable

of running up to eight threads on four cores, and a couple of new instructions, prompting Intel to rename the instruction set SSE 4.2 for Nehalem. For the more technically minded, these new instructions deliver accelerated string and text processing, accelerated searching and pattern recognition across large datasets, and new communications capabilities. The really interesting parts of Nehalem aren’t about raw performance, however – we’ve seen faster chips before. Intel designed Nehalem to be faster while also being more power efficient. And that’s something that’s remarkable.
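The triple-channel claim is easy to sanity-check with back-of-envelope arithmetic; a quick sketch, where the DDR3-1333 transfer rate is our own assumption for illustration (Nehalem parts shipped with several memory speeds):

```python
# Theoretical peak bandwidth of a triple-channel DDR3 memory controller.
# The DDR3-1333 rate is an assumed example, not a quoted Intel figure.
channels = 3
bytes_per_transfer = 64 // 8      # one 64-bit DDR3 channel moves 8 bytes
transfers_per_second = 1333e6     # DDR3-1333

peak = channels * bytes_per_transfer * transfers_per_second
print(f"{peak / 1e9:.1f} GB/s")   # ~32.0 GB/s theoretical peak
```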


“The really interesting parts of Nehalem aren’t about performance – it’s designed to be faster, while being more power efficient. And that’s something that’s remarkable.”

Focusing on power consumption As the energy crunch continues to affect the cost of IT, green computing is becoming very much in vogue. Fitting in with that trend, Nehalem was designed with a non-negotiable rule in mind – any feature proposed for Nehalem that would increase the processor’s power consumption by 1 percent had to deliver a corresponding 2 percent or greater increase in performance. If a feature couldn’t equal or beat this ratio, it wasn’t added, regardless of how desirable. So Intel’s engineers used one million transistors (more than the entire 486 processor contained) and built

a new on-die Power Control Unit (PCU) for Nehalem. This new PCU allows individual Nehalem cores to be near-completely shut off when they are in deep sleep states, letting them consume a truly negligible amount of power. The other benefit of handling power management on the processor die is that the cores can be woken up extremely quickly. Nehalem thus goes from zero to hero in the performance stakes very quickly. Turning power saving on its head But saving power isn’t everything. With Nehalem, Intel has turned power management on its head and actually uses the headroom in the thermal envelope created by deep-sleeping cores to boost performance. Called Turbo mode, the feature works as follows. When only one or two of the cores in the Nehalem processor are active (and the others are shut off by the PCU), the chip runs far cooler than it was designed to. Taking advantage of this, Turbo mode automatically allows the active cores to go up a single clock step of 133MHz. If only a single core is active, the PCU allows for one additional clock step, pushing the clock speed on that one core up by 266MHz. These new features make Nehalem a really impressive addition to the processor line-up in Intel’s stable, and one that will no doubt do wonders in the server market and in desktops that run heavily threaded applications.
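The clock-step rules are simple enough to model; a rough sketch using the article’s step size and step counts (the function and the 2 666MHz base clock are our own illustrations, not Intel’s design):

```python
# Nehalem Turbo mode as described above: idle cores create thermal headroom
# that active cores may spend in 133MHz clock steps. Illustrative only.
BCLK_STEP_MHZ = 133

def turbo_clock(base_mhz, active_cores):
    """Approximate clock of each active core under Turbo mode."""
    if active_cores == 1:
        steps = 2   # a lone active core may climb two steps (266MHz)
    elif active_cores == 2:
        steps = 1   # one or two active cores gain a single step
    else:
        steps = 0   # all cores busy: no headroom to exploit
    return base_mhz + steps * BCLK_STEP_MHZ

print(turbo_clock(2666, 1))  # 2932 MHz
print(turbo_clock(2666, 4))  # 2666 MHz
```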

The processor family is destined for the server space first (a six-core Xeon is already available to the market) and will soon be available for high-performance desktops. Intel has said that the long-term view is for this technology to migrate into the mobile space too, where the power savings and performance boosts will definitely come in handy. Looking ahead to Larrabee A subject of much speculation in the market is Intel’s upcoming Larrabee processor – a chip that’s rumoured to have the graphics vendors in the market quite worried. That’s because Larrabee is designed to cater for much of the functionality currently dealt with by graphics cards, right on the processor. It’s a bold set of goals for Intel, however, since its previous attempts to get into the graphics market haven’t been all that successful. Unfortunately, with the product due to begin shipping sometime next year, the details are still sketchy. Here’s what we know so far:
• Larrabee will be the industry’s first many-core x86 Intel architecture.
• Initial product implementations of the Larrabee architecture will target discrete graphics applications, support DirectX and OpenGL, and run existing games and programs. Additionally, a broad range of highly parallel applications, including scientific and engineering software, will benefit from Larrabee’s native C/C++ programming model.
While Nehalem has been interesting to look at, the market is waiting in eager anticipation of Larrabee. If Intel gets Larrabee right, it has the potential to change the way PCs are architected today.

[Opinion] New chips and their accompanying performance boosts are always a good thing. That said, the software market still has a great deal of catching up to do in order to consume all of the performance chip vendors are providing. Right now there simply isn’t enough software capable of taking true advantage of the multiple cores and multithreading abilities of modern processors. The Nehalem family does, however, provide a native speed improvement of about 10 to 15 percent for applications that haven’t yet been written with multithreading in mind. On the solid-state drive front, even though prices have come down, we’re still alarmed at how much extra SSDs cost compared with conventional mechanical drives. On the upside, prices are falling, and the more SSDs vendors sell, the quicker they will fall. It’s clear that SSDs will become commonplace in every computer, but it’s going to take longer than two or three years.



infrastructure Finding practical and effective solutions to help cut costs, reduce power consumption and deliver world-class data centre facilities.

UD Information Infrastructure In an event that was very unlike its traditional launches, IBM recently keynoted its Information Infrastructure strategy at three parallel events, the main one being in Montpellier, France, where IBM has one of its major development centres. It was the largest launch of new storage hardware, software and services the company has ever undertaken. This sweeping initiative was intended to address the major shift in the flow of the world’s data. By Paul Booth, Global Research Partners

Andy Monshaw, General Manager of IBM System Storage worldwide.

The announcement, and rightly so, was deliberately devoid of the underlying and numerous – nearly 50 – specific product announcements that supported this initiative and which were the building blocks for the integrated solutions a customer may require. IBM’s Information Infrastructure strategy is based on a programme started early in 2006, with an investment of some $2bn, involving over 2 500 storage technical professionals, engineers and researchers from nine countries including France, Germany, Israel, Japan, the UK and the USA. These new technologies and services for businesses and government are designed to tackle the massive growth and mobility of data, skyrocketing energy costs, security concerns and more demanding users. Acquisitions key As part of this strategy, and in a departure from IBM’s historic norm, it was realised that numerous acquisitions would be involved in the fulfilment of this project. Several of these takeovers were completed earlier this year, with one or two closing in just the last few weeks or months. The eight key buy-outs include Arsenal, Cognos, Diligent, FilesX, NovusCG, Optim, Softek and XIV. Indications at the launch suggested that this new acquisition philosophy would be part of IBM’s ongoing strategy and that further announcements could be expected to complement its own internal activities and the exploitation of its inherent intellectual property. Explosive data growth IBM believes that the storage requirement for organisations is growing at well in excess of 100% per annum (much higher than the 54% CAGR analysts are currently suggesting), and that current solutions would be unable to cope with this type of growth, necessitating the new approach that has now been announced. For instance, personal information held by organisations is expected to grow 16-fold by 2020; 80% of stored data is now unstructured by nature; and image storage, with its high-definition requirements, can push storage requirements up 1 000-fold. In addition, the conventional IT and data storage infrastructure is not designed to efficiently serve the estimated two billion people that will be using the Web by 2011. IBM expects one trillion connected objects, e.g. appliances, cameras, cars, pipelines and roadways, to constitute a new Internet of ‘things’ by that date. The business issues From a business perspective the issues can be summarised under

three headings:
• I have to store it – a compliance issue;
• I want to store it – it has business value; and
• I can’t store it – I don’t have the budget.
From an IBM viewpoint, its objectives, seen from a user’s perspective, are to reduce reputational risk (information compliance), deliver continuous information access (information availability), support information retention policies (information retention) and enable secure sharing of information (information security). Some highlights of the specific announcements, which fall in the areas of compliance; Internet-scale availability; consolidation and retention; and security, include:
• new disk storage systems;
• new storage virtualisation software;
• new data de-duplication software and hardware; and
• onsite and remote data protection offerings, to name but a few.
In addition to the above, IBM also has incubating in its labs such projects as a long-term archiving solution, solid-state technologies, memory breakthroughs and some real-time decision-making capabilities that will further enhance future storage offerings.

[Opinion] I believe IBM has read the market well and done an excellent job of positioning itself to take advantage of the emerging trends within the storage arena. The timing with the current financial crisis may be a little unfortunate; however, businesses can only defer their decisions for a limited period, otherwise the consequences could be disastrous.



communications Learning about best practices for traditional communications services, how to evaluate and select outsourcing alternatives, and leading the transition to VoIP.

TL open season Altech’s win against ICASA and the DoC has, for all intents and purposes, opened up the South African telecoms landscape somewhat. Will it radically change anything, however? By Brett Haggard The South African telecoms deregulation process finally got into its stride last month, when Pretoria High Court Judge Norman Davis ruled in favour of Altech in its dispute with the Independent Communications Authority of SA (ICASA) and the Department of Communications around its right to self-provision infrastructure as a VANs licensee. In a nutshell, Altech took communications minister Ivy Matsepe-Casaburri’s 2005 “determination of dates” for the further liberalisation of the sector as an indication that VANs could begin self-providing their own infrastructure.


The minister and ICASA, on the other hand, argued that this would lead to an “absurd result”. And in many ways it has – it has greatly devalued the efforts of players such as Neotel, which spent a great deal on attaining a licence to compete with Telkom. Only three years after Neotel was awarded its licence, VANs licensees are on a similar footing to it. Open up The ruling in Altech’s favour is a landmark in that it allows the hundreds of Internet service providers and other VANs licensees to build their own

networks independently of licensed infrastructure operators such as Telkom, Neotel and the mobile operators. Largely, though, the ruling is more of a storm in a teacup than anything else. While industry commentators have speculated that the ruling will result in new players entering the telecoms market with the right to self-provision, the reality is that very few, if any, have the appetite or capital to compete with Telkom, Neotel or any of the mobile operators on a national basis. That’s not to say that there will not be competition, however.


It’s just unlikely that more than a couple of players will look at competing with the big dogs nationally. Unique scenario Will Hahn, principal analyst at Gartner, says that South Africa’s telecoms landscape is difficult to call because there’s no blueprint the country can follow. “South Africa can look in vain for a model to follow from another country, but unfortunately the country is in such a unique position that it will have to forge its own way,” he says.

While he agrees that the ruling is a good thing, he says it’s not likely to create a scenario where dozens of new players enter the market and begin digging up roads to lay their own infrastructure. Telkom and the mobile providers are well established and Neotel is some way into its infrastructure roll-out. Keeping the big boys in check That said, Hahn adds that it’s important for countries to create viable, low barriers to entry for the telecoms market, so that the larger players have a sense that if they’re not doing the best job, competitors are waiting in the wings. “The big boys have to know that they’re not allowed to inflate prices or confuse issues through bundling and the like. They need to know that there may be new entrants waiting in the wings, capable of coming into the market and forcing prices back down,” he says. More than anything, Hahn says the ruling is a good indicator that the local telecoms regime is maturing. “I do believe, however, that it’s time for the regulators to get together and decide, be it in private or otherwise, what the landscape will look like. Such an exercise would instil a greater sense of confidence in the South African telecoms market,” he says. There have been a number of moves by international players to merge with South African companies and, Hahn says, the majority of those moves haven’t worked out. “When one doesn’t know where the regulator is coming from next, it’s an issue,” he says.

Interesting opportunities What the ruling does allow for, says Hahn, is for interesting models to begin developing. “Take for example municipalities’ ability to lay down infrastructure and become telecoms providers in their own right,” he says. “The ruling opens up an opportunity for these municipalities to lay down the infrastructure and either hand it over, or lease it, to a licensee with a strong appetite and the ability to become a telco. In fact, I wouldn’t recommend that the municipalities try to run these networks themselves. They would be far better served by a player that has experience in this game,” he says. Certainly, Hahn says, it’s plausible for VANs to begin

dipping their toes in the water in some places – metropolitan areas where large potential user bases exist, for example. The same holds true for under-serviced areas. He says it’s unlikely the ruling will result in a proliferation of players competing with the big telcos on a national scale. Back to ICASA ICASA has said it will not appeal the ruling and that it will continue with the licence conversion process. The DoC has, however, requested leave to appeal the decision, which puts everything on hold until a final ruling has been made. If the appeal is successful, it’s back to the drawing board. If, on the other hand, the court upholds the original ruling, eyes will turn back to ICASA and the licensing of wireless spectrum, since this would be the most cost-effective way for new players to come into the market. Conditions on that front look pretty onerous at present, so it’s unclear whether even that will provide an avenue for VANs to begin viably providing telecoms services. One thing’s for sure, however: the local telecoms space will be interesting for some time to come.

[Opinion] The ruling is a good thing, since it will provide some interesting niche opportunities and go some way towards keeping the larger players in line. It’s unlikely to change the telecoms space drastically, but nothing is cast in stone – we can’t completely rule out an investment from an international telecoms giant. A more open market does mean increased competition, and that’s always good for the consumer.



communications Learning about best practices for traditional communications services, how to evaluate and select outsourcing alternatives, and leading the transition to VoIP.

OP Diversifying Vodacom? Vodacom has traditionally been a cellular player within the African continent with a direct presence in the DRC, Lesotho, Mozambique, South Africa and Tanzania. Do recent announcements indicate the start of its diversification? By Paul Booth, Global Research Partners

Vodacom’s predominant market has been the South African one, which accounts for some 63% of its total subscriber base of 24,8m (as at the end of its financial year, 31 March 2008). Recently, however, Vodacom has embarked upon a strategy that focuses on horizontal growth through Vodacom Business, i.e. IT, data centres and the fixed-line space, and on increasing and extending its African presence, which had previously been limited to its GSM activities. Two recent acquisitions To this end, Vodacom has already made two initial transactions. The first was the purchase of 51% of Storage Technologies Services (StorTech), a managed enterprise data centre services company that provides a range of storage and security services. This business was previously part of MB Technologies but had more recently been management-owned. The second, and much more significant, is the $700m buy-out of the carrier services and business network solutions units of Gateway Telecommunications SA (GTSA), a black-owned multinational telecommunications service provider. (The broadcasting services of GTSA were not part of the deal.) Gateway is Africa’s largest independent provider of interconnection services through satellite and terrestrial network infrastructure for both local and international telecommunications companies. It has customers in 40 African countries; has offices in several of these, including

Angola, Cameroon, Ghana, Nigeria, Mozambique and Tanzania; and had sales in 2007 of $257m. Only last year, Gateway acquired GS Telecom for $37,5m and committed to a 155Mbps circuit on SEACOM, the first fibre-optic cable being laid along Africa’s east coast and connecting to Europe. Also involved in the deal is the purchase of 100% of Gateway Telecommunications (Plc), Gateway Communications (Pty) Ltd, Gateway Communications Mozambique LDA, Gateway Communications (Tanzania) Ltd and GS Telecom (Pty) Ltd and their respective subsidiaries. Future ownership question marks Obviously the above is very significant in the development of Vodacom’s new strategy; however, it is overshadowed by the question marks over the company’s future ownership:
• Vodafone is looking to take an extra stake in Vodacom;
• Globacom (Nigeria) is looking at a deal with Telkom SA that includes Vodacom;
• Oger Telecom (Cell C’s owners, with Saudi Telecom as its major shareholder) has made approaches to Telkom SA; and
• Mvelaphanda has also made a bid for Telkom SA that precludes Vodacom.

[Opinion] Will all this uncertainty put the brakes on further Vodacom activities, or will it carry on regardless? Certainly the recent moves have created a real competitor to MTN in Africa. There are some interesting times ahead.



enterprise intelligence The best of BPM, CRM, ERP, e-commerce, business intelligence, project management, application management, and portal software.

TL The Connected Republic It sounds like utopia: a society where citizens collaborate freely and overcome the limitations of monolithic bureaucracy and top-down government. This ‘Connected Republic’ view of the world, however, is now close to becoming a reality in many countries, thanks to new technologies, according to a white paper published by Cisco’s Internet Business Solutions Group (IBSG), the vendor’s global strategic consulting arm. By Reshaad Ahmed, Senior Manager of Cisco’s Internet Business Solutions Group

According to this white paper, the possibilities of the Connected Republic go beyond e-government’s original goal of improved service delivery and could even herald an age of democratic renewal where people, not bureaucracies, call the shots. The Connected Republic 2.0 shows how network technology is already persuading governments and their departments to become more responsive, flexible and accountable. The technology that brought us Wikipedia and Skype can also provide citizen empowerment. Putting citizens in the driving seat There are a number of examples where modern communication technology is


enabling those normally outside the official hierarchies to take control of, and contribute to, issues that affect their lives. In the Philippines, for example, the country’s 16 million cellphone users are now able to report smoke-belching public buses and other vehicles via SMS. These citizens are also able to send text messages to seek emergency assistance and report wrongdoing by police officers. In Hawaii, the city of Honolulu is using unified communications solutions to enable citizens to report potholes – 176 000 of which have been repaired since 2005. And in the UK, a Fix My Street service allows residents to report graffiti, littering, defective street lighting and other urban ills to their local council. Residents are furthermore able to track the progress of the problem and the performance of the responsible agency. These are but a few examples that show how mobile telephony, the Internet and social networking sites can have a positive impact on society. In South Africa, we are taking steps


in the right direction to enable similar services, which are being driven out of need more than anything else. One such example is the ‘crime line’, which allows citizens to report crime – or intended crime – via SMS and the Internet. A new generation of citizens The Connected Republic is a model that invites citizens and the public sector to change the way people think about technology, society and government. It replaces a rigid, top-down, uni-directional model of communication between the centres of power and the public with a multitude of two-way conversations. In the same way that the highly interactive Web 2.0 model is replacing broadcast media as the paradigm of choice, a new generation of technologically savvy citizens is refusing to be passive, isolated consumers of media. Instead, they are active participants. This ‘Power of Us’ poses a great challenge to the public sector. Put simply, the institutions whose role is to serve society will need to play catch-up if they are to remain relevant. To do so, public sector organisations will need to follow three principles. They should take a platform approach and maximise the number of people and organisations that can collaborate to create public value. They should ‘empower the edge’ so that frontline organisations and workers have the freedom they need to deliver locally appropriate solutions. And they should tap into this social revolution by harnessing the power of citizenry – ‘Us’ – to create knowledge, solve problems and deliver better services. The shift from hierarchy and centralised control to a self-organising community is imminent, and is best summarised by the principle of ‘small pieces, loosely connected’.

Overcoming challenges This may seem a tall order for some organisations and, apart from the technical issues, there are several potential pitfalls along the path toward the Connected Republic. Ensuring people’s privacy is obviously a central issue. Creating effective identity management systems is another, as is securing the vast amounts of data moving across the network. From a political perspective, protecting the freedom of people to make choices for themselves and their families needs to be high on the policy agenda. So, too, does the concern about closing gaps in education, resources and skills that, left unattended, will result in disconnected communities and a loss of social cohesion, as the technological ‘haves’ participate in and shape society to the exclusion of the digital ‘have nots’.

[Opinion] Despite these obstacles, the Connected Republic seems full of promise. The network will take centre stage to become the platform for productivity, social inclusion and community. Profound transformation and system change must take place. It will take time, careful investment and sustained leadership, but it is essential if institutions are to maximise the public value they deliver to citizens and create the Connected Republic.


enterprise intelligence The best of BPM, CRM, ERP, e-commerce, business intelligence, project management, application management, and portal software.

CS data visibility The deployment of a robust business intelligence solution for the local arm of leading international medical technology company Smith and Nephew is complete.

The first phase, which has been two years in the making, involved the development and implementation of an advanced querying and reporting solution for Smith and Nephew’s large and disparate salesforce in South Africa. The solution answers the company’s need to give its sales teams improved access to important sales information and to facilitate faster, more accurate reporting and informed decision-making.

Paul Morgan, Managing Director of ASYST Intelligence.

Meaningful intelligence “We chose Business Objects as a preferred solution because it enables easy access via the web and gives our sales force the ability to run their own reports and drill down to answer questions about the data themselves,” explains Mark Elliott, Systems Analyst at Smith and Nephew. “Management was also looking for a way of tapping into the vast information stored in various systems across its business units and turning it into meaningful intelligence,” he adds.

Smith and Nephew approached ASYST Intelligence towards the end of 2005 to assist in the development and implementation of a BI solution that could address these challenges. ASYST Intelligence, a South Africa-based Business Objects Platinum Partner, used Business Objects XI Release 2 and WebIntelligence to migrate the necessary reports and created a customised web-based business intelligence portal. The solution allows the Smith and Nephew sales force to access information via a secure web-based portal from any Internet-connected PC, draw relevant reports, prepare specific reports and analyse data in a no-fuss, user-friendly manner, no matter where in the country they are. It also allows them to provide management with the required scheduled reporting and delivers a dashboard view of activities. “The solution offers advanced functionality to help reduce report proliferation and maintenance. But, more importantly, it makes it easier for users to answer any business question competently and accurately wherever they may be.”

Connecting the disconnected “This is particularly useful for sales personnel, who are usually on the road most of the time and are largely disconnected from the rest of the business. It is up to the sales personnel to answer client queries, most times on the spot, and provide feedback to the business on what’s happening in the market. This solution ensures that they have the right tools and information at their fingertips, empowering them to make informed decisions and be more proactive. From a management perspective, the solution provides a single view of what’s happening in the field, facilitating more informed business decisions,” explains Elliott. Commenting on the solution, Paul Morgan, Managing Director of ASYST Intelligence, says: “We spent a considerable amount of time with Smith and Nephew to gain a proper understanding of the business, the challenges it was facing in terms of information management and what its expectations of BI were. The fact that Smith and Nephew is a global organisation with a vast infrastructure, and different formats and types of information stored in numerous different systems, meant that we needed a solution that is easy to deploy and integrate with the different systems already in use across the business.”

[Opinion] Morgan: “The solution meets these requirements and at the same time retains the granularity and flexibility needed in complex global deployments. Training end-users was a key aspect in the successful deployment of the project. Our team spent time up-skilling Smith and Nephew’s sales personnel and management team to ensure that they understood how to use and manipulate the solution for maximum benefit.”


risk management Identifying, mitigating and resolving threats as they become more sophisticated and cause more damage to businesses than ever before.

TL Security ROI Return on investment, or ROI, is a big deal in business. Any business venture needs to demonstrate a positive return on investment, and a good one at that, in order to be viable. By Bruce Schneier // Chief Security Technology Officer BT

It’s become a big deal in IT security, too. Many corporate customers are demanding ROI models to demonstrate that a particular security investment pays off. And in response, vendors are providing ROI models that demonstrate how their particular security solution provides the best return on investment. It’s a good idea in theory, but it’s mostly bunk in practice. Before I get into the details, there’s one point I have to make. “ROI” as used in a security context is inaccurate. Security is not an investment that provides a return, like a new factory or a financial instrument. It’s an expense that, hopefully, pays for itself in cost savings. Security is about loss prevention, not about earnings. The term just doesn’t make sense in this context. The bottom line But as anyone who has lived through a company’s vicious end-of-year budget-slashing exercise knows, when you’re trying to make your numbers, cutting costs is the same as increasing revenues. So while security can’t produce ROI, loss prevention most certainly affects a company’s bottom line. And a company should implement only security countermeasures that affect its bottom line positively. It shouldn’t spend more on a security problem than the problem is worth. Conversely, it shouldn’t ignore problems that are costing it money when there are cheaper mitigation alternatives. A smart company needs to approach security as it would any other business decision: costs versus benefits. Annualised loss expectancy The classic methodology is called annualised loss expectancy (ALE), and it’s straightforward.



Calculate the cost of a security incident in both tangibles like time and money, and intangibles like reputation and competitive advantage. Multiply that by the chance the incident will occur in a year. That tells you how much you should spend to mitigate the risk. So, for example, if your store has a 10 percent chance of getting robbed and the cost of being robbed is $10 000, then you should spend $1 000 a year on security. Spend more than that, and you’re wasting money. Spend less than that, and you’re also wasting money. Of course, that $1 000 has to reduce the chance of being robbed to zero in order to be cost-effective. If a security measure cuts the chance of robbery by 40 percent – to 6 percent a year – then you should spend no more than $400 on it. If another security measure reduces it by 80 percent, it’s worth $800. And if two security measures both reduce the chance of being robbed by 50 percent and one costs $300 and the other $700, the first one is worth it and the second isn’t. The actuarial tail The key to making this work is good data; the term of art is “actuarial tail.” If you’re doing an ALE analysis of a security camera at a convenience store, you need to know the crime rate in the store’s neighbourhood and maybe have some idea of how much cameras improve the odds of convincing criminals to rob another store instead. You need to know how much a robbery costs: in merchandise, in time and annoyance, in lost sales due to spooked patrons, in employee morale. You need to know how much not having the cameras costs in terms of employee morale; maybe you’re having trouble hiring salespeople to work the night shift. With all that data, you can figure out if the cost of the camera is cheaper than the loss of revenue if you close the store at night – assuming that the closed store won’t get robbed as well. And then you can decide whether to install one.
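The arithmetic is simple enough to express directly; a minimal sketch of the ALE reasoning, using the robbery figures from the article (the function names are ours):

```python
# Annualised loss expectancy (ALE): incident cost times its yearly odds.
def ale(incident_cost, annual_probability):
    return incident_cost * annual_probability

# The most a countermeasure is worth is the ALE it eliminates.
def countermeasure_value(incident_cost, base_prob, reduced_prob):
    return ale(incident_cost, base_prob) - ale(incident_cost, reduced_prob)

# A store with a 10% chance of a $10 000 robbery should spend at most $1 000.
print(ale(10_000, 0.10))                         # 1000.0
# A measure cutting the odds from 10% to 6% is worth at most $400 a year.
print(countermeasure_value(10_000, 0.10, 0.06))  # 400.0
```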

Cybersecurity is considerably harder, because there just isn’t enough good data. There aren’t good crime rates for cyberspace, and we have a lot less data about how individual security countermeasures – or specific configurations of countermeasures – mitigate those risks. We don’t even have data on incident costs. One problem is that the threat moves too quickly. The characteristics of the things we’re trying to prevent change so quickly that we can’t accumulate data fast enough. By the time we get some data, there’s a new threat model for which we don’t have enough data. So we can’t create ALE models. The million dollar question? But there’s another problem, and it’s that the math quickly falls apart when it comes to rare and expensive events. Imagine you calculate the cost – reputational costs, loss of customers, etc. – of having your company’s name in the newspaper after an embarrassing cybersecurity event to be $20 million. Also assume that the odds are 1 in 10 000 of that happening in any one year. ALE says you should spend no more than $2 000 mitigating that risk. So far, so good. But maybe your CFO thinks an incident would cost only $10 million. You can’t argue, since we’re just estimating. But he just cut your security budget in half. A vendor trying to sell you a product finds a Web analysis claiming that the odds of this happening are actually 1 in 1 000. Accept this new number, and suddenly a product costing 10 times as much is still a good investment. It gets worse when you deal with even more rare and expensive events. Imagine you’re in charge of terrorism mitigation at a chlorine plant. What’s the cost to your company, in money and reputation, of a large and very deadly explosion? $100 million? $1 billion? $10 billion? And the odds: 1 in a hundred thousand, 1 in a million, 1 in 10 million? Depending on how you answer those two questions – and any answer is really just a guess – you can justify spending

“Security is not an investment that provides a return, like a new factory or a financial instrument. It’s an expense that, hopefully, pays for itself in cost savings.”

anywhere from $10 to $100 000 annually to mitigate that risk. Jigging the numbers Or take another example: airport security. Assume that all the new airport security measures increase waiting time by – and I’m making this up – 30 minutes per passenger. There were 760 million passenger boardings in the United States in 2007. This means the extra waiting time at airports has cost us a collective 43 000 years. Assume a 70-year life expectancy, and the increased waiting time has “killed” 620 people per year – 930 if you calculate the numbers based on 16 hours of awake time per day. So the question is: if we did away with increased airport security, would the result be more people dead from terrorism or fewer? This kind of thing is why most ROI models you get from security vendors are nonsense. Of course their model demonstrates that their product or service makes financial sense: they’ve jiggered the numbers so that they do.
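The waiting-time figures check out; a quick verification of the arithmetic, using the article’s stated assumptions:

```python
# Checking the airport-security numbers above with the article's figures.
boardings = 760_000_000           # US passenger boardings, 2007
extra_hours = boardings * 0.5     # 30 minutes of added waiting each

years_waited = extra_hours / (24 * 365)
print(round(years_waited))                    # 43379 collective years

print(round(years_waited / 70))               # ~620 "lives" at 70 years each
print(round(years_waited / (70 * 16 / 24)))   # ~930 counting 16 awake hours/day
```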

[Opinion] This doesn’t mean that ALE is useless, but it does mean you should 1) mistrust any analyses that come from people with an agenda and 2) use any results as a general guideline only. So when you get an ROI model from your vendor, take its framework and plug in your own numbers. Don’t even show the vendor your improvements; it won’t consider any changes that make its product or service less cost-effective to be an “improvement.” And use those results as a general guide, along with risk management and compliance analyses, when you’re deciding what security products and services to buy.
Resources:
Are Security ROI Figures Meaningless? http://tinyurl.com/4k8aqt
The Problem of Measuring Information Security http://tinyurl.com/47e8yv
Calculating Security Return on Investment http://tinyurl.com/4gyo4g
Bejtlich and Business: Will It Blend? http://tinyurl.com/3hol5r
How to Calculate Return On Investment (ROI) for Web Security http://tinyurl.com/3elh37



risk management Identifying, mitigating and resolving threats as they become more sophisticated and cause more damage to businesses than ever before.

UD Full Disclosure In eerily similar cases in the Netherlands and the United States, courts have recently grappled with the computer-security norm of “full disclosure,” asking whether researchers should be permitted to disclose details of a fare-card vulnerability that allows people to ride the subway for free. By Bruce Schneier, Chief Security Technology Officer of BT.

The ‘Oyster card’ used on the London Tube was at issue in the Dutch case, and a similar fare card used on the Boston ‘T’ was at the centre of the US case. The Dutch court got it right, and the American court, in Boston, got it wrong from the start – despite facing an open-and-shut case of First Amendment prior restraint. The US court has since seen the error of its ways – but the damage is done. The MIT security researchers who were prepared to discuss their Boston findings at the DefCon security conference were prevented from giving their talk.

The ethics of full disclosure The ethics of full disclosure are intimately familiar to those of us in the computer-security field. Before full disclosure became the norm, researchers would quietly disclose vulnerabilities to the vendors – who would routinely ignore them. Sometimes vendors would even threaten researchers with legal action if they disclosed the vulnerabilities. Later on, researchers started disclosing the existence of a vulnerability but not the details. Vendors responded by denying the security holes’ existence, or calling them just


theoretical. It wasn’t until full disclosure became the norm that vendors began consistently fixing vulnerabilities quickly. Now that vendors routinely patch vulnerabilities, researchers generally give them advance notice to allow them to patch their systems before the vulnerability is published. But even with this “responsible disclosure” protocol, it’s the threat of disclosure that motivates vendors to patch their systems. Full disclosure is the mechanism by which computer security improves. Medieval guilds Outside of computer security, secrecy is much more the norm. Some security communities, like locksmiths, behave much like medieval guilds, divulging the secrets of their profession only to those within it. These communities hate open research, and have responded with surprising vitriol to researchers who have found serious vulnerabilities in bicycle locks, combination safes, master-key systems and many other security devices. Researchers have received a similar reaction from other communities more used

to secrecy than openness. Researchers – sometimes young students – who discovered and published flaws in copyright-protection schemes, voting-machine security and now wireless access cards have all suffered recriminations, and sometimes lawsuits, for not keeping the vulnerabilities secret. When Christopher Soghoian created a website allowing people to print fake airline boarding passes, he got several unpleasant visits from the FBI. Fundamentally fragile This preference for secrecy comes from confusing a vulnerability with information *about* that vulnerability. Using secrecy as a security measure is fundamentally fragile. It assumes that the bad guys don’t do their own security research. It assumes that no one else will find the same vulnerability. It assumes that information won’t leak out even if the research results are suppressed. These assumptions are all incorrect. The problem isn’t the researchers; it’s the products themselves. Companies will only design security as good

as what their customers know to ask for. Full disclosure helps customers evaluate the security of the products they buy, and educates them in how to ask for better security. The Dutch court got it exactly right when it wrote: “Damage to NXP is not the result of the publication of the article but of the production and sale of a chip that appears to have shortcomings.” In a world of forced secrecy, vendors make inflated claims about their products, vulnerabilities don’t get fixed, and customers are no wiser. Security research is stifled, and security technology doesn’t improve. The only beneficiaries are the bad guys. If you’ll forgive the analogy, the ethics of full disclosure parallel the ethics of not paying kidnapping ransoms. We all know why we don’t pay kidnappers: It encourages more kidnappings. Yet in every kidnapping case, there’s someone – a spouse, a parent, an employer – with a good reason why, in this one case, we should make an exception. The reason we want researchers to publish vulnerabilities is because that’s how security improves. But in every case there’s someone – the Massachusetts Bay Transit Authority, the locksmiths, an election machine manufacturer – who argues that, in this one case, we should make an exception.

[Opinion] We shouldn’t. The benefits of responsibly publishing attacks greatly outweigh the potential harm. Disclosure encourages companies to build security properly rather than relying on shoddy design and secrecy, and discourages them from promising security based on their ability to threaten researchers. It’s how we learn about security, and how we improve future security.



storage Learning how key storage and storage network technologies work together to drive your business. Clarifying how to use storage protocols and technologies.

TL Data confusion Analysts and industry experts agree that the world is in for an information explosion. Is your infrastructure ready to take the load? By Simon Dingle


Storage is about more than just having the physical capacity for your data. Stored data is useless if it cannot be used to extract useful information. It’s not surprising that we all generate more data today than ever before, and that our needs for storage are growing exponentially – but predictions for the future are startling. The world is gearing up for an information explosion. “The dynamics are changing,” says William Raftery, Senior Vice President of EMC Global Product Sales, who was in SA recently. “Thousands of bytes of data are generated every second and the rate increases exponentially from one month to the next. Are organisations ready for

the information explosion?” Confusion is rife when it comes to data. Who is responsible for what, and where does the buck stop in terms of compliance? “The role of the CFO is different today from what it was years ago,” explains Raftery. “The CFO used to report finances. Now they are strategic partners to the CEO, driving compliance and security issues. Then the CIO is the person carrying the burden of the organisation’s information load, but doesn’t have the relevant portfolio.” The first element of the confusion is in determining information roles – in terms of who is responsible for what.


“The first step in organising information and gearing it for value is in turning around and looking at the business itself; assessing business units, processes and looking at the process of information lifecycle management.”

The new information economy We are entering the next phase in terms of data usage and storage. In this new paradigm, information is the be-all and end-all of business. Compliance drives coherent storage and cataloguing of data. This new information economy is driven by accountability. It will also herald the end of disorganised data. “85 percent of the world’s information is unstructured,” says Raftery. “If your information isn’t structured, how do you know what is critical and what is junk – and how do you extract value?” “This is part of the reason a new information systems strategy is being built around the new information economy. It’s

about who’s accountable,” affirms Raftery. “But it’s also about doing more with less. It’s still about TCO and ROI. But added to this is risk avoidance. And, of course, how to extract value from information.” In terms of South Africa, Raftery says that local challenges closely reflect those of international companies. “I speak to South African businesses and I hear the same questions I do in the rest of the world,” he states. And if the questions are the same, then so are the answers. Make that info matter According to Raftery, the first step in organising information and gearing it for

value is in turning around and looking at the business itself: assessing business units and processes and looking at the process of information lifecycle management. “The next phase involves securing information,” he says. “It’s no longer about peripheral security. Technologies need to be built in at the storage-device level to look after the data housed there. With security in place, we can now look at building policy and classification. Who gets to do what internally? And how can we eliminate security breaches?” Raftery uses the example of a Boston company that lost the details of 40 million American credit cards due to a physical security breach. But don’t think that small and medium-sized businesses can afford to let down their guard either. Data loss prevention can be implemented at a technology level, but workflow must be addressed too, so that encryption and other methodologies can be implemented along the workflow path. “Look at the top 10 priorities of the modern CIO and you’ll find storage is number one, followed by virtualisation and then security,” says Raftery. And when those three come together we get a glimpse of the shape that storage solutions are starting to take.

[Opinion] A majority of the world’s data is unstructured and disorganised, and we’re generating exponentially more of the stuff every day. Virtualisation and security solutions are offering new frameworks for storing information, but the real question is not where you store data, but how you store it to ensure that it can be used to provide valuable information. A new information economy is emerging, and in this paradigm your business must be physically geared for value, with the help of technology to unlock that value.



storage Learning how key storage and storage network technologies work together to drive your business. Clarifying how to use storage protocols and technologies.

UD The NAS vs SAN debate rages on The low costs associated with disk-based data storage in the past motivated companies to simply add more disk space when capacity problems arose. Now, in the light of rapidly rising costs linked to this resource, Armand Steffens, Netgear business development manager at Duxbury Networking, revives the long-running debate as to which storage technology is best from a cost-efficiency point of view – network-attached storage (NAS) or the storage area network (SAN). By Armand Steffens, Netgear business development manager at Duxbury Networking.

Vendors have been warning corporate users for some time now to revise their data storage strategies in the light of imminent price rises on disk-based storage resources. Today, a more scientific approach is needed to the management of short-, medium- and long-term data storage. In essence, companies need to actively manage data from the moment it is created until it’s no longer needed, rather than passively warehousing it forever at ever-increasing cost. They will also have to come to terms with the two technologies that are vying for pride of place in the corporate storage landscape: network-attached storage (NAS) and the storage area network (SAN). At first glance NAS and SAN technologies may seem similar. A SAN is a dedicated network that is interconnected through the Fibre Channel protocol using Fibre Channel switches and Fibre Channel host bus adaptors. Devices such as file servers connect directly to the SAN through the Fibre Channel protocol.


NAS, on the other hand, connects directly to the network using TCP/IP over Ethernet CAT 5 or equivalent cabling. In most cases, no changes to the existing network infrastructure need to be made in order to install a NAS solution. The NAS device is attached to the corporate network and assigned an IP address just like any other network device. Best option SAN protagonists have always underlined the highly redundant features of SANs: the implementation of multipathing and the ability to create fully redundant fibre meshes, obviating a single point of failure. However, the initial investment in a SAN is expensive, due to the high cost of Fibre Channel switches and host bus adaptors and the need for additional software – because SAN file sharing is operating-system-dependent and the facility does not exist in many operating systems. When cost is a factor, NAS supporters point to the fact that NAS products are able to run

over an existing TCP/IP network and, as such, are cheaper to implement than SANs. What’s more, NAS challenges the traditional file server approach by creating systems designed specifically for data storage. Instead of starting with a general-purpose computer and configuring or removing features from that base, NAS designs begin with the bare-bones components necessary to support file transfers and then add features. Ease of use is an obvious feature of NAS. NAS products simply plug into the IP network

and appear as ‘just another device’ using the file system and operating system used by client PCs. These include Windows, Unix, Linux, NetWare and Macintosh clients. In addition, users can access files held in personal and shared directories on the NAS server and interrogate it in the same way they do a normal Windows or Unix server – with similar security and permissions. And unlike general-purpose file servers, client licence fees are not required, reducing cost of ownership as the network grows.
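The practical difference shows up in how a client addresses the storage. A rough sketch (the mount point and device path are examples for illustration, and reading a raw device requires root privileges):

```python
# NAS: file-level access. The NAS runs the file system; a client that has
# mounted the share (here assumed at /mnt/nas) simply works with paths.
with open("/mnt/nas/reports/q3.txt", "w") as f:
    f.write("quarterly numbers\n")

# SAN: block-level access. The host sees a raw disk (here assumed at
# /dev/sdb) and must supply its own file system; the device itself only
# returns unstructured blocks.
with open("/dev/sdb", "rb") as disk:
    first_sector = disk.read(512)
print(len(first_sector))  # 512
```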


“NAS challenges the traditional file server approach by creating systems designed specifically for data storage. Instead of starting with a general-purpose computer and configuring or removing features from that base, NAS designs begin with the bare-bones components necessary to support file transfers and then add features.” Cost effective NAS servers are also the most cost-effective way of adding storage capacity to the network, scaling from entry-level 1U rack-mount units to fully redundant NAS storage servers incorporating RAID (Redundant Array of Independent Disks) technology. With constant interest rate hikes, return on investment (ROI) is a big consideration today. NAS systems attempt to reduce the cost associated with traditional file servers. Rather than use general-purpose computer hardware and a full-featured network operating system – such as NetWare – NAS devices generally run an embedded operating system on simplified hardware. Because of this, NAS storage servers install without the need to change the standard network configuration. And as they are designed specifically for network storage, a NAS tends to be easier to manage than a file server and can be up and running very quickly.

Reliability, ease of administration NAS systems strive for reliable operation and easy administration. They often include built-in features such as disk space quotas, secure authentication, or the automatic sending of email alerts should an error be detected. Many NAS systems also support the Hypertext Transfer Protocol (HTTP), allowing users to download files using their web browser. NAS systems also commonly employ HTTP as an access protocol for web-based administrative user interfaces.
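Fetching a file from such a NAS needs nothing more than a stock HTTP client; a one-line sketch with a hypothetical hostname and path:

```python
# Download a file from a NAS that exposes its shares over HTTP.
# The URL is a made-up example, not a real device's layout.
from urllib.request import urlretrieve

urlretrieve("http://nas.example.local/public/manual.pdf", "manual.pdf")
```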

[Opinion] Besides providing a reasonable alternative to traditional file servers in client/server networks, the new breed of NAS networking products has succeeded in cutting costs for most users. For example, entry-level NAS products containing 250 to 1 000 gigabytes of storage can be purchased for less than R5 000. Besides cost, a NAS promises reliable operation and easy management. Expect NAS technology to keep evolving as the field matures even further over the next decade.



mobility Wireless, driving both technology and strategy, is the future of IT and should be at the strategic heart of your organisation’s IT plans.

Justin Spratt, Manager Mobile Voice Solutions at Internet Solutions.

QA Mobility & wireless The mobile landscape has changed dramatically in recent times. Devices we once considered mobile are now considered baggage as people opt for real mobility that meets their needs instead of a PC in a smaller box. That said, the latest netbooks seem to be bridging the gap between what we considered mobility just a short six months ago, and the traditional laptop.

net.work posed a few questions to Justin Spratt, Manager: Mobile Voice Solutions at Internet Solutions, to get his take on the subject. [net.work] Do mobile workers really need a desktop replacement? What work is not done on the Internet or in your inbox? A small, simple device will do most of the time. [Spratt] There are, and will continue to be (for at least the next five years), two types of mobility platforms that the ‘Digital First World’ (clearly underdeveloped nations are completely different and focus solely on mobile telephony) will demand: The handheld The handheld must have the form factor of a phone (which is the primary requirement) that delivers email

(secondary) and finally camera and music (I would say these are joint third – the split being even in the market). Web browsing is a tiny fraction of usage, but there is massive pent-up demand; it just needs a standard interface, something WAP couldn’t achieve. (Why? There are a plethora of mobile phones, each with its own framework – but that’s a whole story on its own.) The prime example of the best mobile device is the iPhone, and this is definitely the window through which to see what things will look like in the future. Look at the haptics (touch screen – you will see massive improvements here); location-based services (LBS – uses GPS co-ordinates); music; seamless integration with “the hub” (the laptop – see below). It has some problems, in that it is trying to be all things to all people (read: feature bloat) and with its 3G chipset, but the market is adopting it in spite of these due to the ‘Mac cult/Jobs’ love affair. Even Jobs doesn’t try to sell the iPhone as the only device. For Jobs, the Mac (read: laptop) is the hub and everything else is a node that connects into that hub to get its information. Interestingly, Ray Ozzie, Microsoft’s new(ish) Chief Architect, subscribes to this theory too. Ozzie calls this ‘Software + Services’ – the ‘software’ meaning the desktop is the hub. The laptop The laptop is replacing the desktop, that is for sure, but the market is also demanding that it be ultra-mobile too – hence the push for greater battery life (Dell has just issued a laptop that claims 19 hours of battery life), Wi-Fi chips, and extension boards for 3G cards and dongles. The user interface on phones is too small to do everything and the experience too taxing, so people need the laptop as the interface to do





This device must then also be interoperable with all other nodes – TV; music system(s); digital photo frames; DSL routers and so on. So it is the laptop that is still the epicentre of the Digital First World, by a long way. The fact that PDA adoption rates were so small for almost a decade is a testament to this. Only once they had phone functionality was there that watershed moment of ubiquity. Even now, I would argue that although most of our time is spent on phones, they are still not good enough to do the work and the fun stuff. In summary, the laptop is replacing the desktop, and thus we still need a desktop replacement. I don’t see this requirement going away in the foreseeable future. The reason is that the usability of phones (Virgin Mobile UK just spent R15m on usability just for its website, so it counts!) is still low enough that interactions with the device average no more than a few minutes, while a large chunk of work and fun requires longer than this. Enter: the laptop. As much as there will be convergence (I am a big believer), people will never use only one device for all their interactions. Ever. Looking further into the future, I believe that eventually the laptop will become just a node too, and every personal device will interact with ‘The Cloud’ (aka the Internet – web browsing, email and all our files).

[net.work] Laptops, PDAs, smartphones and gadgets of all types will soon converge into a small form factor ultra-mobile device. What will it look like? Do the netbooks of today do the job?

[Spratt] One word: iPhone.


It is far from perfect (the aforementioned feature bloat is but one example), but we have definitely moved to a new S-curve that offers a window onto what future devices will look like. The main barrier now to further convergence is battery life, but the composite of phone, email, camera, web browsing, music, LBS (location-based services and searches) and so on is what the device will have. I am also not a believer in one size fits all, so you will see different variants of the iPhone, much like you have with the iPod (not everyone liked driving Ford’s black Model T!). As for netbooks, they will not become prevalent until ‘The Cloud’ becomes robust enough, and for me that is still a few years away (see the recent outages with Gmail, Google Docs, Amazon’s EC2 and so on – as much as I love SaaS* and The Cloud, it still has some maturing to do). Why? Because I still need my serious processing and storage to lie somewhere, and with a netbook that can only be ‘The Cloud’, or I need to buy ANOTHER computer, which won’t happen. ‘The Cloud’ is the solution, and that needs maturity. Until then, the form factor will be the mobile phone, and more so, the iPhone. (*SaaS = software as a service – delivered over the Internet versus onsite deployments)

[net.work] And on the topic of convergence: the user drive behind convergence will be towards task-specific devices instead of the everything-in-an-ugly-box trend we suffer from today. Is this a realistic assessment?

[Spratt] Yes.

There will be a drive to get a ‘Swiss-army-knife’ solution, but it will never completely work. Demand for task-specific devices will remain high. One example: I don’t really want to take my iPhone into the gym – it looks silly, is too big and is too expensive to sweat on. For gym, I want a tiny little iPod Nano.

[net.work] Given the prices of these devices, they will remain tools for specific users and uses, and will not become a mainstream option for most users.

[Spratt] I don’t think price will be the major issue – economies of scale can be derived very early on, on much lower volumes than used to be the case. But I do think that they might stay niche products based on usage.

[Opinion] In the opinion of Professor John Ratcliffe (joint author of the report, with colleague Ruth Saurin, and chairman of The Futures Academy): “In the uncertain world of today and tomorrow, one major risk to business is being caught out by inevitable surprises. To avoid this, a new mindset reinforced by fresh ways of thinking about the future is needed by all those involved in constructing, occupying and managing the workplace. This report will enable the industry to face the challenges and grasp the opportunities that lie ahead over the next few decades. Businesses that can embrace these foreseeable changes will have a competitive advantage.”



By Simon Dingle

Business Evolution

New web technologies, open source and a number of other factors are changing the way we work. Key trends need to be accommodated in the modern workplace, while protecting business interests.

Several key trends are bearing down on the modern business, changing the way the organisation interacts and transacts. Business is evolving in terms of its technology, with consumerism driving new tools and solutions. Employees have sophisticated technology stacks at home and aren’t afraid of trying out new software and services. They expect the same at work. Gartner recognises ‘Five Major Mutually Reinforcing Discontinuities’ that are changing the way we work: software as a service, open source, Web 2.0, consumerisation and ‘global class applications and systems technologies’. These trends feed each other and offer both opportunities and threats for business.

Ploughing ahead
One of the dynamics emerging in the modern workplace is employees taking technology into their own hands. “Business units are becoming more independent,” says Gartner analyst Jeffrey Mann. “Workers are able to use free or cheap consumer services to get things done. They don’t have to wait for the IT department.” This is driving fundamental changes in business models and budgets. “The concern this raises is in terms of where data is held,” explains Mann. “Many of the services being used are in other countries. Do you really want your data leaving the organisation, or your country?” Cloud computing is increasing in use, like it or not, and offers high value to businesses looking to escape the bonds of owning their own infrastructure and having to keep it up to date. Mann therefore suggests that attitudes surrounding location need to change. “Do you trust your suppliers?” he asks. “And if so, then that trust needs to go further than it does right now in terms of these new applications and services.” That’s not to say that shifting your approach will be easy. “Most IT professionals want the world to proceed in an orderly, incremental fashion, with no massive overnight changes and plenty of time to adapt to external change,” says Mann. “Significant discontinuities can be a nightmare – for example, when assumptions about the useful life of an asset shift early in a project, plummeting from years to months. Investors can be ruined, and people can lose their jobs. Any of the five major intersecting discontinuities on the horizon can upset the balance of power between users and their IT organisations. And they amplify each other.”

Time to get over Web 2.0
As one of these discontinuities, Web 2.0 is much talked about in terms of business. The new web is not complicated, but is made so by the legion of consultants and strategists who make their living from the market’s confusion surrounding new technologies – much the same as happened with service orientated architecture (SOA). “Although the Web 2.0 name is popular and represents the Web of today, the world seems hungry for 3.0, whatever that is,” says Mann. “Although Web 2.0 suffered from being overly broad, the special interests driving the hype surrounding 3.0 have the opposite problem in that they are often too focused.”


Regardless of what the next big buzzword is, the Web will remain one of the major catalysts in technology and one of the major sources of innovation. “Looking too closely at the details of Web 2.0 by examining technology, communities and business models misses the way phenomena emerge from the primordial ‘Web soup’,” he continues. “In a small-scale economy with, for example, five people in a remote farming community with no contact with other humans, there might be little need for a market economy where prices rose and fell as a function of supply and demand. The village could agree to let one person specialise in egg production and another in meat production and they could work out a meat-egg exchange policy. “Similarly, they could figure out who would grow the wheat and provide other commodities. And that would be that. The Web has hundreds of millions of actors. Pre-ordaining form, structure, processes and so forth for all ‘social interaction sites’ would have been catastrophic for the founders of the Web. Billions of social interactions a day shape the social context of the Web. Patterns and social orders emerge, guided by the invisible hand of the social marketplace and will continue to do so as long as new sites are allowed to appear and older sites are allowed to morph with the times,” states Mann.

The emergence imperative
Given the importance placed on frameworks, emergence is the most important phenomenon in this discussion. According to Mann, this refers to the free market economy setting prices, and the social market determining shape and structure.

The Web impacts on interaction, is fed by consumerisation, bolsters the demand for popular applications and services and is underpinned by the changes being brought to the entire stack by the rise of open source. Software as a Service (SaaS) is touched by all of these factors and is a culmination of where enterprise technology is going. “The report card on SaaS is mixed,” says Mann. “In some cases, there are clear advantages such as lowered cost, improved vendor accountability, lowered barriers for entry and exit, and greater flexibility. But those advantages are situational. They will vary depending on the needs of various elements within the business. And there are risks in terms of the business goals of the provider and the consumer never being identical, the costs are not guaranteed and can escalate, and it will likely be harder to integrate SaaS-based solutions with in-house developed capabilities.” Allowing employees, as consumers, to take the lead in terms of identifying new usable services and applications for business must seemingly be balanced with oversight from management in terms of compliance, security and future-proofing.

[Opinion] Business is being changed by several factors, not least of which is the people within the organisation, as consumers, taking things into their own hands when it comes to plugging holes with new technologies. Most of these will be easily accessible online applications and services offered for free, or cheaply. It is possible for business to benefit from this dynamic, but of course key elements such as security must be maintained.

www.netdotwork.co.za

47


By Gary Lawrence - CA Africa

Staying on course:

The guide to identifying, managing and reducing complexity
In this month’s column, Gary Lawrence, country manager of CA, touches on the bottom line – how to do more with less and deliver greater value for your organisation.

Doom and gloom certainly seems to be at the top of the global news agenda in 2008. Pick up any newspaper or magazine, turn on the television or tune into your favourite radio station and you will no doubt be confronted with news about record oil prices, tumbling property prices and a general squeeze on the economy. IT has not escaped the downturn. In fact, IT organisations are under even more pressure to deliver tangible results faster with the same, or in many instances reduced, budgets. Whilst IT budgets may be finite, the demands on IT often seem to be infinite. In today’s highly challenging environment, simply working faster is not enough. IT managers need to utilise resources better, revise management processes for greater efficiency and focus even more on their people.

Specifically, IT managers should consider:
• Managing what the business already has: consolidating infrastructure, knowing what resources exist and how they are being used, reducing redundancy and increasing utilisation.
• Managing change: consider a change management process that can support the rapidly changing needs of the business whilst at the same time minimising the cost and disruption of change.
• Measuring results: measure and understand the impact that IT has on key business processes such as order throughput.
• Practising governance: IT managers need to ensure that their people, processes and technology all support business initiatives and meet legal and regulatory mandates. Governance should be seen as a next logical step, given the central role that technology plays in meeting legislative and regulatory demands. It should not be seen as a hindrance, but rather be embraced to earn that seat at the boardroom table.

Project and Portfolio Management
Project and Portfolio Management (PPM) is also gaining popularity as a way to plan, manage, measure and prioritise IT projects across the enterprise. Once the combination of PPM and governance is in place, it is a small step to apply change management and financial management principles. By doing so, you can ensure that IT investments deliver as intended and can be managed effectively. This will also allow for sufficient flexibility to accommodate change while providing an acceptable level of return to the business. PPM and IT asset management provide some of the best capabilities for managing risk and costs. These two disciplines provide the IT governance oversight needed to ensure proper resource usage, management of programme and project cross-dependencies, and control of contractual obligations and financial investments. Last, but not least, something that is often overlooked: people. If you are to derive the greatest possible return on your IT investments, you need great IT staff. By the nature of its description the IT manager’s role is technical, but people remain the most valuable asset.

[Opinion] It is all about striking a balance between finding and retaining the best people and deploying the right technology that will deliver the most benefit to the business in a cost effective manner.




By Simon Dingle

Chroming the Web
Google has launched its own browser in the form of Chrome. Some are calling it ‘game changing’ while others are querying why Google would chuck yet another browser into the market.

When Google launched in the late ’90s, many found it confusing that the company chose to introduce a new search engine into the market. We already had Yahoo!, MSN Search, Lycos and others. The market was tied up. Who in their right mind would choose to launch a new search engine then? Now Google has launched Chrome – a new web browser in a market dominated by Internet Explorer and Firefox. Only this time, no one is laughing. But many still wonder why. To answer that question, Google has prepared an elaborate online comic, illustrated and adapted by famous cartoonist Scott McCloud, which can be viewed at http://www.google.com/googlebooks/chrome/. The gist of it is that other browsers just weren’t cutting the mustard in terms of new web applications and services.

For the greater good
Anders Thorhauge Sandholm was the project manager on Chrome and says that a change was needed in the way browsers address content. “It’s not as if everyone was banging their heads against the wall because the Internet wasn’t working properly,” he says. “On the other hand, things have changed online that affect the way we use the web.”


Chrome is extensible via plug-ins in much the same way that Firefox is.

“The first websites were static and text-heavy. They had content on them and links to other pages, and that was about it,” continues Anders. “Since then there has been a lot of development that nobody could have foreseen, and now websites are interactive and have multimedia content that is pulled from all over. The web is a different place now and people do a lot of things in their browser that they did elsewhere before, like email, setting up meetings, working on presentations and banking. If the browser crashes or freezes it can be very frustrating.” Chrome therefore takes a new approach to JavaScript and memory management, allowing each tab within the browser to run as its own process. If one tab crashes it won’t bring down the whole browser – just that tab. Chrome has also been optimised for online applications, and it’s no surprise that Google Apps and Gmail work better on Chrome than on any other platform. Chrome is also open source, allowing other browser developers and the community at large to build on it. What Google is doing through this is improving the way everyone experiences the web – not just Chrome users. And this, of course, makes a whole lot of sense for a company whose business is focused on the Internet. Google also has the Chrome Bot, which crawls the ’net and tests websites for use with Chrome, boosting the browser’s compatibility. Presumably, the benefits of this will flow through into other browsers making use of the open source bits of Chrome.
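Process-per-tab isolation is easy to demonstrate outside a browser. Here is a minimal Python sketch of the principle – not Chrome’s actual code (Chrome is written in C++), and the URLs and function names are invented for illustration:

```python
# Each "tab" runs as its own OS process, so a crash in one cannot take
# down the others or the parent "browser" process.
import multiprocessing

URLS = ["http://example.com", "http://crashy.example", "http://example.org"]

def render_tab(url: str) -> None:
    """Pretend to render a page; one URL deliberately 'crashes'."""
    if url == "http://crashy.example":
        raise RuntimeError("simulated renderer crash")
    print(f"{url}: rendered OK")

if __name__ == "__main__":
    tabs = [multiprocessing.Process(target=render_tab, args=(u,)) for u in URLS]
    for tab in tabs:
        tab.start()
    for url, tab in zip(URLS, tabs):
        tab.join()
        # A non-zero exit code marks the crashed tab; its siblings finish fine.
        print(f"{url}: exit code {tab.exitcode}")
```

Running it shows the middle ‘tab’ exiting with an error while the other two complete normally – precisely the failure containment Chrome’s design is after.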

The meta operating system
If you’re doing your work, banking, email and just about everything else using web applications, the question is where this leaves operating systems and conventional applications. It seems all you really need these days is a browser. Google won’t be lured into commenting on the competitive landscape in this regard. It also won’t make any promises in terms of a mobile version of the browser. “Google Chrome is in beta and is being designed primarily for the desktop,” says Anders. “Right now the goal is to make it as good as possible for that platform and to get it as stable as possible before launch.”

Conflict of interests
It seems obvious that one of the first extensions we’ll see will be an ad-blocker. Already, one can block ads in Chrome using proxy services. This interferes with Google’s primary business of selling advertising. Other vendors would’ve curbed such functionality, but Google insists that Chrome remains a truly open source offering and, as such, completely open to extension. Perhaps Google really isn’t evil?

[Opinion] Google Chrome changes the way browsers address memory management and is optimised for web applications. To call it game changing is a tad premature. What is changing the game is web applications themselves and people’s propensity to work within the browser. Google Gears allows this to happen in offline environments too, and is built into Chrome. As an open source application, Chrome is helping the web get better. But let’s not get ahead of ourselves in terms of thinking that operating systems as we know them are on their way out. Yet.



product update
Product updates on desktops, laptops, accessories, gadgets, software and more. A byte of fresh technology.

New devices from Motorola Enterprise Mobility
Motorola’s Enterprise Mobility business has launched two new devices into the sub-Saharan Africa market, designed to enhance efficiency and convenience for mobile workers in specific fields. The products represent the convergence of technologies and devices to deliver a complete package purpose-designed for demanding mobility applications.

The new CA50
The CA50 is intended for ‘indoor’ use, while the rugged MC75 is ideal for field work, incorporating a range of features including GPS. The CA50 is a VoIP-enabled wireless scanner designed to enhance customer service, help reduce operational costs and increase worker productivity. It provides retail, healthcare and hospitality service workers with an integrated voice and data device that fills a large communications gap found on store floors, across hospitals and in restaurants. “This device illustrates Motorola’s efforts to fuse legacy scanning technologies with new voice and wireless capabilities so that all employees are connected, with the latest technology at their fingertips to better serve their customers and patients,” says Mark Kelly, country manager, Motorola South Africa.

The MC75 EDA
The Motorola MC75 Enterprise Digital Assistant (EDA) is a rugged mobile computer which delivers simultaneous voice, data, GPS navigation and camera-based document capture, says Kelly. Based on Intel’s XScale processor and Microsoft’s Windows Mobile 6.0 operating system, the device is interoperable with existing enterprise infrastructure and provides a flexible development platform. Features include support for 3G/HSDPA, 802.11a/b/g and a 2-megapixel camera capable of capturing 1D and 2D barcodes.

HP Mini-Note demand soars
Mini notebooks are an emerging category of mobile computing devices, with worldwide shipments predicted to reach 5,2 million units this year and 8 million units in 2009, according to Gartner. It is further predicted that there could be as many as 50 million mini notebooks shipped in 2012. Leading IT infrastructure distributor Axiz has just released the HP 2133 Mini-Note, which weighs only 1,27 kg and is a mere 33 mm thin. Adele Oosthuizen, Axiz Category Lead: HP PSG, says the demand for these mini notebooks is being driven by several factors, including their small form factor, small screen, light weight, price, ease of use and their basic, but adequate, PC functionality. “The mini-note market is the fastest growing segment of the mobile PC industry. HP’s offerings were designed with a strong emphasis on the education market as the devices offer schools affordable computing for every student, and are indeed flexible enough for students to use from the classroom to the family room. At just slightly over a kilogram, the HP Mini-Note PC is smaller and lighter than many school text books. It includes a suite of wireless, multimedia and security capabilities to allow students to learn everywhere they go.” Of course, business and mobile professionals value the same mobility, usability and cost concerns as the education market, she adds, and the HP 2133 has also proven extremely sought-after in this market sector. The device offers full wireless technologies such as integrated Wi-Fi Certified WLAN and optional Bluetooth, all of which make researching and communicating more convenient through access to the Internet, email, instant messaging and VoIP. Experience video, still-image capture, web conferencing or video-enhanced instant messaging with no additional hardware to buy or carry. An integrated VGA webcam enables both video and still-image capture so you can add photos and video clips to presentations, documents and email. Optional security solutions are available to help defend your data and your notebook. “The HP 2133 comes equipped with all the productivity tools you need to conduct business efficiently on the go, including an 8.9-inch diagonal, scratch-resistant WXGA display, a user-friendly QWERTY keyboard that is 92 percent of full size, an integrated Secure Digital reader and a touchpad with scroll zone,” Oosthuizen says. The simple, refined design and all-aluminium case make it sleek and sturdy yet super lightweight. Reliable features such as HP DuraKeys, a magnesium alloy support structure and HP 3D DriveGuard help give you a reliable and durable mini-note that can really go the distance.

Contact: Axiz, Tel: +27 (0)11 237 7196, www.axiz.com


T-Mobile G1 launches
T-Mobile has unveiled the world’s first mobile phone powered by Android, the open-source mobile platform led by Google. The T-Mobile G1 is a touch-screen mobile phone that includes a QWERTY keyboard and features applications like Google Maps with Street View, Gmail and YouTube. The phone will be available in the US from 22 October and in the United Kingdom in early November, with other international launch dates scheduled throughout 2009.

Android Market evolves
Talking to developers, a common topic is the challenge of getting applications into the hands of users. Apple has done this well with its iTunes services, and Google is set to do the same, having released early details of its Android Market – an open content distribution system that will help end users find, purchase, download and install various types of content on their Android-powered devices. The concept is simple: leverage Google’s expertise in infrastructure, search and relevance to connect users with content created by developers. Developers will be able to make their content available on an open service hosted by Google that features a feedback and rating system similar to YouTube’s. Google chose the term ‘market’ rather than ‘store’ because it feels that developers should have an open and unobstructed environment in which to make their content available. With the addition of a marketplace, the Android ecosystem is becoming even more robust.

Protecting smartphones
Kaspersky Lab has released Kaspersky Mobile Security 7.0, which offers powerful protection for data stored on smartphones in the event that the device is lost, as well as protecting against network attacks, malware and SMS spam. Key new functions allow users to completely block a lost device or to remotely delete the data on it. Moreover, if a smartphone is stolen, the SIM-Watch function prevents the thief from accessing data on the phone without the original SIM card; as soon as the original SIM card is replaced, a message notifies the owner of the new telephone number. Kaspersky Mobile Security 7.0 scans incoming files and network connections in real time, performing antivirus scans of the whole device on demand or on schedule. The product also blocks messages from unwanted or incorrect numbers, as well as SMS messages containing words or phrases entered by the user into a blacklist.

Contact: AfricaSD, Tel: +27 (0)12 665 2513, asd@africasd.com, www.africasd.com
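The SMS blacklist behaviour described above boils down to two checks. A toy Python sketch of the idea – the numbers and phrases are invented, and this is in no way Kaspersky’s implementation:

```python
# Block an SMS if its sender is blacklisted or its text contains a
# user-defined phrase (hypothetical numbers and phrases for illustration).
BLOCKED_NUMBERS = {"+27115550100"}
BLOCKED_PHRASES = ["win a prize", "free airtime"]

def should_block(sender: str, text: str) -> bool:
    if sender in BLOCKED_NUMBERS:
        return True                                    # unwanted number
    lowered = text.lower()
    return any(p in lowered for p in BLOCKED_PHRASES)  # blacklisted phrase

print(should_block("+27115550100", "Hello"))             # True: sender listed
print(should_block("+27825550199", "Win a PRIZE today")) # True: phrase match
print(should_block("+27825550199", "See you at three"))  # False: delivered
```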



Style and substance
It’s not new-new, but it is worth highlighting again, if only because it is so sexy. The top-of-the-line BlackBerry smartphone features premium design and unprecedented performance. Crafted from premium materials, inside and out, that radiate elegance with a dramatic presence, the BlackBerry Bold is designed to give business professionals and power users unprecedented functionality and performance. It is the first BlackBerry smartphone to support tri-band HSDPA high-speed networks around the world, and comes with integrated GPS and Wi-Fi as well as a rich set of multimedia capabilities.

HP’s new TouchSmart boasts natural-touch interface
The new HP TouchSmart PC and MTV Artist Edition Notebooks have arrived, apparently exclusive to HP Experience Sandton City. Featuring a sleek new design, the second-generation HP TouchSmart PC provides a friendly, fun approach to personal computing. Entertainment is this all-in-one PC’s forte and focus. A new, natural interface is activated by the tap or stroke of a finger. Photos and music collections can be displayed in tiles or like a fan for fast access. Photos can be enhanced and uploaded, and videos can be captured and posted to social networking sites in a flash, without using a keyboard. The touch-sensitive, high-definition, 22-inch diagonal widescreen PC is Energy Star qualified too, so you get to be green – as your colleagues might well be when they see one on your desk.

Contact: HP Experience, Sandton City, Tel: +27 (0)11 783 2053, info@hpexperience.co.za, www.hpexperience.co.za

Diesel generators
Meissner, suppliers of UPSs in South Africa for over 40 years, has launched a range of diesel generators. The Himoinsa range of generators, imported from Spain, incorporates IVECO, Perkins, Scania and Volvo engines with Stamford alternators, and as such is aimed at the more discerning customer for professional applications. Contact: Meissner, Tel: +27 (0)11 824 0202, www.meissner.co.za

The next evolution of PC-speaker acoustics
Logitech has introduced its Logitech Z-5 omnidirectional stereo speakers for PC and Mac computers, delivering pure digital audio. Easily moved with your laptop, the Z-5 omnidirectional speakers can be quickly connected to any PC or Mac via USB; there’s no need for an external power adaptor or batteries. And to let you wirelessly navigate and enjoy all your entertainment options, Logitech’s newest speaker system comes with a sleek remote control. Launch your favourite entertainment application, adjust the volume and change your selection from across the room. Contact: Logitech South Africa, Tel: +27 (0)11 656 3375, www.logitech.com


Centralised control
Falcon Electronics has announced the KM0832 KVM switch, which gives IT administrators in large corporations advanced control of multiple servers. Operators working at up to eight keyboard, mouse and monitor consoles can simultaneously and independently take direct control of up to 32 computers. With a combination of daisy chaining and cascading, up to eight consoles can access and control more than 8 000 computers. The unit is completely multi-platform and supports PC, Sun and Mac. It also offers great hardware flexibility, with PS/2 and USB compatibility on both the server and the console side. The high-density port configuration allows datacentre administrators to leverage its space-saving design as well as use their current infrastructure cabling. Contact: Falcon Electronics, Tel: +27 (0)11 630 1024, www.fe.co.za
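The ‘more than 8 000’ figure is easy to sanity-check. A back-of-the-envelope Python calculation, assuming one plausible topology (eight daisy-chained top-level units, each of whose 32 ports cascades to a further 32-port switch) – the actual supported layout may differ:

```python
# Rough capacity check for a cascaded/daisy-chained KVM setup
# (assumed topology for illustration, not the vendor's documented one).
PORTS_PER_SWITCH = 32    # ports on one KM0832
DAISY_CHAIN_LENGTH = 8   # assumed daisy-chained units at the top level

second_level_switches = DAISY_CHAIN_LENGTH * PORTS_PER_SWITCH  # 256 switches
computers = second_level_switches * PORTS_PER_SWITCH           # attached PCs

print(computers)  # 8192 -- consistent with "more than 8 000 computers"
```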



advertorial
Graham Duxbury is the CEO of Duxbury Networking, Formula 1 commentator, South African champion and Daytona Speedway USA Hall of Fame inductee.

Formula 1 gets it wrong – again
By Graham Duxbury, Formula One in Focus

Lewis Hamilton stood on the top step of the podium at Spa-Francorchamps, having won an exciting, spectacular, tension-filled Belgian Grand Prix in his McLaren. He accepted the trophy, listened to the British national anthem and enthusiastically sprayed the champagne. Many millions witnessed his triumph on TV. Then, more than two hours after the race, when the spectators had left the track and viewers had switched off their TV sets, came the announcement: a 25-second penalty was imposed, demoting him to third place. Why? At a post-race stewards’ enquiry, he was judged to have gained an unfair advantage by cutting the final corner – the infamous bus-stop chicane – two laps before the end of the race and passing Kimi Raikkonen as a consequence. The victory therefore went to Felipe Massa, who had crossed the finish line in second place (after Raikkonen crashed out on the last lap), and second position went to Nick Heidfeld, whose BMW was initially classified third. Hamilton claimed he had done nothing wrong, and his view was echoed by millions of TV viewers who remain outraged at the injustice of it all. Of course, these TV viewers, myself included, aren’t privy to all the data – particularly the cars’ telemetry traces – that could have helped swing the decision in Ferrari’s favour.

A sense of disappointment
The problem, in my opinion, lies not with the correctness – or otherwise – of the stewards’ decision, but with the sense of disappointment that all Formula One fans feel at a time like this. They sat anxiously on the edge of their chairs, glued to their TV sets, and cheered their favourites on those incredibly exciting last laps when, on a wet, slick track, they fought each other with every ounce of skill they possessed. The fans were there, riding shotgun, sitting with Kimi or Lewis and fighting the battle with them. Then they basked in the warm afterglow of having seen true mastery in action. A classic GP at a classic circuit. A few hours later the FIA burst their bubble. In many ways, Spa 2008 reminded me of the 2003 Brazilian GP. It was also an exciting, incident-packed and controversial race. It was unusual in that the final podium ceremony featured only two drivers instead of three. Like Spa this year, it was also a race in which rain played a major role. Even champion Michael Schumacher spun and crashed out.

A radical strategy
Several teams opted for a radical pit strategy, filling their cars with fuel early in the race, hoping that they would save enough under the expected safety car periods to avoid having to stop again. This strategy paid off for Giancarlo Fisichella, whose usually uncompetitive Jordan rose from last in the field, after an early pit visit, to first place when other cars stopped. This was shortly before Mark Webber crashed, was hit by Fernando Alonso, and the resultant wreckage blocked the track. The race was stopped prematurely and the stewards awarded victory to Raikkonen. Fisichella was given second place and Alonso, although in an ambulance, was awarded third. Fisichella had to wait several days before the officials reviewed the lap scoring data and reversed their decision. He was belatedly named the race winner at an FIA meeting in Paris, half a world away. Of course, delayed decisions are ‘bad news’ for F1. Commenting on Hamilton’s demotion, former world champion Niki Lauda says it was the ‘worst decision in F1’s history’. This is a sentiment supported by many in the pit lane who believe Hamilton did everything right after cutting the chicane, as he lifted off the power, allowing Raikkonen to retake the lead on the straight as the rules require. Race fans have swamped F1 internet sites and blogs saying that F1 needs more overtaking and more excitement. “But all the decisions that are taken seem to be against this,” says one. “It will have a negative impact on F1 racing as drivers will, in future, opt for caution instead of overtaking.” Many comments have also called into question the FIA’s impartiality, claiming the rule-makers are on Ferrari’s side. Renault’s director of engineering, Pat Symonds, says something needs to be done to ensure decisions are made promptly. “I think motor racing should be like football, not like cricket. Let’s have action, let’s know what is going on in real time, not wait for two days to find out the result.” Duxbury Networking, tel: +27 (0)11 351 9821, fax: +27 (0)11 646 3079, www.duxbury.co.za


