Intermedia Vol 39 No 1 March 2011


The world’s most influential telecom and media policy, regulatory affairs, and compliance journal March 2011 Volume 39 Issue 1

Andrew Odlyzko on net neutrality The hidden secrets of search-land Social networking: should the workplace protect itself? Media access in developing countries Determining the reasonably efficient operator





Debate key policy issues The International Institute of Communications Community

Contents Viewpoint, news and analysis Neutrality and non-neutrality: From the past to the future page 4

The INTERNATIONAL INSTITUTE of COMMUNICATIONS is an independent, non-profit, dynamically-engaged forum for industry, government and academia to analyse and debate trends in communications and their impact on society. Mission: to provide a global framework for dialogue and to promote access to communications for all peoples of the world. Intermedia editorial enquiries Please contact Stephen McClelland, Editor in Chief - s.mcclelland@iicom.org Subscription and membership enquiries Please contact Joanne Grimshaw, Project Executive - j.grimshaw@iicom.org

News brief pages 7, 16-17, 34-37 Regulators: don't mislead on broadband PTC '11: Reading the Net future IIC TMF Washington 2010: Prospects for cloud computing IIC TMF Washington 2010: Meeting spectrum demand News analysis pages 8-15 Searching for the missing link: what next for search engines and is there a problem with the way they do things? The web's USD 100 billion surplus: how the economics of the web may be changing everything Towards the Connected Consumer: mobile makes a big difference, if it works

Media access Citizen Access to Information: Emerging trends from the Developing World page 18 Are developing economies left behind in the information revolution - or empowered?

Telecom competition Margin squeeze: defining a reasonably efficient operator? A keystone of telecom competition policy analyzed page 24


Social networking policy When are companies liable? page 29 Employers are increasingly exposed to social networking snafus by their staff


Is there a web surplus and how much is it? Page 14

First person: Martin Cave, page 38 - Regulatory economics expert is happy with regulation but warns against too much government interference


© 2011 Authors and IIC. Compilation of contents © 2011 IIC. All rights reserved

The IIC website has many features and webinars linked to the above. See http://www.iicom.org for details








by Andrew Odlyzko

Neutrality and non-neutrality From the past to the future, there is very little new under the sun

Most current discussions about telecom policy appear to lack proper perspective, from both economic and historical points of view. There are extensive discussions of different approaches to net neutrality, with many European observers dismissing it as just an American issue. On the other hand, some Americans wonder why Europeans have to make a big issue out of international mobile roaming fees. Simultaneously, in seemingly totally different areas, such as pharmaceuticals, there are hot controversies about wildly varying prices for the same drugs from the same manufacturers in different countries. What is generally missing is a perspective that puts all these developments into a single coherent view. Yet such a perspective is not hard to present. History, going back centuries, is full of similar pricing controversies, in many cases amazingly similar to current ones. This long history, fortified by economic explanations for the existence of such controversies, guarantees that they will persist into the indefinite future, as no definitive solution exists. They will therefore provide generous meal tickets for generations of lobbyists, lawyers, and economists.

Discrimination is fundamental The fundamental issue is the degree to which service or goods providers are allowed to discriminate among their customers. Basic economic theory tells us, and empirical evidence reaffirms, that more control allows providers to increase revenues and profits. Both theory and evidence also show that under some conditions, this is to the benefit of customers. The real question is about the appropriate balance, and this question has been debated, explicitly or implicitly, over centuries. The basic arguments have not varied much, the providers always pleading high costs or need to expand, customers resisting on the grounds of ruinous charges or unfairness.


Probably the only novel element in recent times has been the argument, based on property rights, that providers should have complete freedom in pricing. This does seem to be new. Traditionally, law has imposed special nondiscrimination and duty-to-serve obligations on those employed in certain industries. The understanding was that untrammeled control for providers in such industries, in particular transportation, inn-keeping, and communication, was similar to a lack not just of property rights but of law itself. It would expose society to arbitrary actions that repressed economic activity, either by making enterprises unprofitable, or introducing prohibitive degrees of uncertainty. Aside from basic non-discrimination principles of law, there were often statutory requirements that extended these principles and made them explicit. Yet many of these requirements compromised with the economic need to allow some forms of discrimination. Thus, for example, the 1833 charter of the London and Birmingham Railway in England envisaged a degree of structural separation, with the company providing only the rails, and other carriers being able to bring their own wagons and locomotives onto those rails to provide actual carriage service for customers. This charter (3 William IV, c.36) provided that the railway "shall not partially raise or lower the Rates or Tolls payable under this Act, but all such Rates and Tolls shall be so fixed as that the same shall be taken from all Persons alike, under the same or similar Circumstances." Yet the maximal tolls set by this Act (the railway could charge less, but not more) varied by a factor of three between goods such as cotton clothes and compost. Also, the extent to which providers could discriminate, or could engage in vertical integration, varied historically, depending on economic circumstances. Thus, for example, English canals were in most cases barred from being carriers. This prohibition was lifted in the



1840s, when canals were impacted negatively by railway competition. Moreover, even before this change, the tolls the canals collected depended on the cargo (as with railways such as the London and Birmingham one), and so were totally uncorrelated to costs, and thus required (as in the case of railways) the equivalent of the modern "deep packet inspection" to determine what was being carried.

Balance the incentives These examples illustrate the general historical trend for society to balance the competing incentives, depending on the circumstances of each case. In general, the greater the costs, the more power to discriminate is given to the providers, although that is often controlled by fairness rules. (For example, private colleges in the United States practice an extremely discriminatory policy, with affluent parents paying the full tuition, and those earning less benefiting from discounts, called scholarships. However, this discrimination is carried out through policies that those colleges assure the public are rule-based and "fair.") Hence, precedent and economic logic suggest that neutrality rules should be based on detailed evidence of costs, something that is seldom seen today.

"It is worthy of remark that in order to check the use of this line for conveying coals for shipping, and to confine it to inland traffic, parties inter-

It was supposed by all parties that it would be impossible to carry coals at such a low rate without loss; but this rate not only turned out profitable, but formed ultimately the vital element in the success of the railway." This quote illustrates not only the political interference in policy making that was (and surely will continue to be) ubiquitous, but also the difficulty in regulating rapidly developing technologies. Tolls that were "supposed by all parties" to be ruinous turned out to be profitable, as incremental changes in the newly developed steam railway lowered costs. Hence, there is no reasonable hope of coming up with a fixed set of rules that will work in general. This means that regulations will have to be flexible. Yet they should not be arbitrary or too heavily affected by politics, in order not to inhibit economic activity, another difficult compromise.

Future regulation That regulation will be required by society there seems little doubt. Claims that there haven't been any serious problems concerning net neutrality in telecom and so there is no need to regulate are easy to dismiss. Not only have there been problems recently, but history shows that providers, if allowed, will discriminate, just as the economic incentives suggest. Very often they will do it even when there is competition, much less a monopoly. (There are many examples of competition leading to increased discrimination.) They will even do it when there is regulation, by looking for ways around the rules. Thus, for example, English canals that were barred from 5

IPTV Analyisis News Analysis Viewpoint

The process of finding a compromise between the incentives to discriminate and to have equal treatment was seldom free of politics, and the outcomes most likely were seldom exactly optimal. This is nicely illustrated by the following citation from a report of a UK 1867 Royal Commission, writing about the Stockton and Darlington Railway, which opened in 1825, and was a key step in the development of this industry:

ested in rival ports contrived to insert a clause limiting the charge for the haulage of all coal to Stockton for shipping, to one halfpenny per ton per mile, whereas the rate of fourpence per ton per mile was allowed for all coals transported for land sale.


www.iicom.org March 2011 Volume 39 Issue 1

the business of serving as carriers, sometimes managed to exert extra control by gaining control of warehouses. In some cases regulation simply could not deal with the complexity of the industry at the desired level. The early English railways were obliged by their statutes to allow carriers to use the rails for movement of their own wagons. However, in practice those railways frustrated such requirements by not providing water for the carriers' locomotives or sidings for loading and unloading cargo. The result was that Parliament conceded to railways monopoly carriage rights.

Regulation in the future is likely to be even more difficult. While the world is not moving at the proverbial "Internet time," many changes are taking place faster than before. In addition, some of the most interesting changes that could lead to a reshaping of our economy are taking place in software, in the architecture of information systems such as cloud computing and social networks. In the traditional world of physical infrastructures, regulators had some tools at their disposal that are less applicable now. Service providers, even when they clothed themselves in the mantle of private property rights, frequently had to rely on the coercive power of the state to acquire land or rights of way, and the state could condition the grant of such rights on proper behavior. In the online world, that is not available.

And it is very tricky, for example, to decide whether a change in a web search algorithm is driven by the need to control activities that pollute search results, or by the desire to enhance the prominence of a service controlled by that search engine. Thus we are bound to be involved in very messy proceedings, and the only safe prediction is that lobbyists, lawyers, and economists will do well.

Andrew Odlyzko is a professor at the University of Minnesota. Best known for his studies on Internet economics at Bell Labs and AT&T Labs, he is now exploring analogies between the ongoing transformation of society and earlier technology revolutions, with papers available at http://www.dtc.umn.edu/~odlyzko. The subject of this note is dealt with in more detail in two papers by the author that appeared in Review of Network Economics: 1. Network neutrality, search neutrality, and the never-ending conflict between efficiency and fairness in markets, http://www.bepress.com/rne/vol8/iss1/4/ 2. The evolution of price discrimination in transportation and its implications for the Internet, http://www.bepress.com/rne/vol3/iss3/4/


Events Diary
March 2011: Mobile Financial Services, 15-16 March, London, UK (www.mobile-financialservices.com); Telecoms Regulation Forum, 21-23 March, London, UK (www.iir-telecoms.com/event/regulation); IPTV World Forum 2011, 22-24 March, London, UK (www.iptv-forum.com)
April 2011: IMS World 2011, 12-14 April, Barcelona, Spain (worldforum.imsvision.com); IIC Telecommunications and Media Forum, 13-14 April, Brussels, Belgium (www.iicom.org)
May 2011: LTE World Summit, 17-18 May, Amsterdam, The Netherlands (ws.lteconference.com); 5th Annual Roaming World Congress, 23-26 May, Madrid, Spain (www.iir-telecoms.com/event/roaming)
June 2011: 6th Annual European Spectrum Management Conference, 14-15 June, Brussels, Belgium (http://www.eu-ems.com); 3rd Annual Internet of Things Conference Europe, 28-29 June, Brussels, Belgium (http://www.eu-ems.com)



News brief Regulators: Don't mislead on broadband London - The magic words of "up to" can mean so much, at least in the world of service providers making claims in advertising for broadband speeds. In the UK, for example, a study released by Ofcom in March indicated that "the average broadband speed increased from 5.2Mbps (May 2010) to 6.2Mbps (November/December 2010) but was less than half (45%) of the average advertised broadband speed of 13.8Mbps."1 Ofcom studied 11 packages supplied in the UK from the UK's seven largest ISPs. "Very few ADSL broadband customers achieved average actual download speeds close to advertised 'up to' speeds. Just three per cent of customers on 'up to' 20 or 24Mbps DSL services received average download speeds of over 16Mbps, while 69% received average download speeds of 8Mbps or less," says the regulator, which acknowledges that at lower speeds the gap was narrower. The best performer was cable operator Virgin Media, using DOCSIS technology and a fibre-based network to street cabinet level with onward distribution via coaxial cable, which nearly achieved advertised speeds. BT Infinity, a fibre-to-the-cabinet network available to 15% of the UK population, also achieved performance nearer to advertised speeds and showed effective capability on upload speeds as well.

Crackdown? Regulators around the world are clearly increasingly irritated by practices that confuse consumers and frequently give the industry the same sort of reputation as used-car selling. In fact, a significant part of the confusion is related to the deployed technology. In the Ofcom figures, most of the discrepancy emerges from the performance of copper-based DSL networks compared with newer fibre-based systems, although all technologies see some differences. ADSL often sees problems because of varying quality and line length of the copper distribution network. The performance differences were most marked in the high-speed ADSL marketplace.

1 http://stakeholders.ofcom.org.uk/market-data-research/telecoms-research/broadband-speeds/speeds-nov-dec-2010/

Recommendations The apparent deception of consumers received much media attention. Ofcom's recommendations are, unsurprisingly, that advertising should be more truthful, but they do not formally outlaw the "up to" tag. A new code of practice is being developed in conjunction with the Advertising Standards Authority. In particular, the recommendations focus on a Typical Speeds Range (TSR), representing the range of speeds actually achieved by at least half of customers (around the median), which should be used whenever speeds are quoted in broadband advertising; Ofcom also says that if a maximum "up to" speed is used in an advert, then the TSR must have at least equal prominence. The theoretical maximum "up to" speed stated must also be a speed actually achievable by what Ofcom calls "a material number of customers". Ofcom also wants advertisers to include a qualification alerting consumers that they can confirm the likely speed they will receive when buying their service.
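By way of illustration, here is a minimal Python sketch of how such a range might be derived from panel measurements. It assumes the TSR is taken as the 25th-75th percentile band around the median, which is one plausible reading of "the range achieved by at least half of customers" rather than Ofcom's formal definition, and the speed figures are invented.

```python
import statistics

def typical_speeds_range(speeds_mbps):
    """Estimate a Typical Speeds Range: the band of speeds achieved by at least
    half of customers, centred on the median. Here it is taken as the 25th-75th
    percentile band -- an illustrative assumption, not Ofcom's formal definition."""
    data = sorted(speeds_mbps)
    lower, median, upper = statistics.quantiles(data, n=4, method="inclusive")
    return lower, median, upper

# Hypothetical panel of measured download speeds (Mbps) for an "up to 20Mbps" package
measured = [3.9, 4.5, 5.1, 5.8, 6.0, 6.2, 6.9, 7.4, 8.0, 9.5, 11.2]
low, med, high = typical_speeds_range(measured)
print(f"TSR: {low:.1f}-{high:.1f} Mbps (median {med:.1f} Mbps)")
```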

But do they realize? Other regulators have also mulled the issue, which again highlights broadband under-performance compared with the all-fibre networks being built in east Asia. In the US - where there are similar considerations to the UK - there are concerns that confusion exists. Indeed, a 2010 survey2 on behalf of the FCC revealed that four out of five Americans do not know what their broadband download speeds actually are. An FCC position paper3 has already considered the issue, reporting: "Consumers need a better, publicly agreed-upon measure of broadband performance that reflects the network operation and end-user experience. The [National Broadband Plan]'s Recommendation 4.3, calling on the FCC to work [with various groups]...to develop broadband measurement standards is designed to address this issue." Fixed broadband speed analysis is at least probably more reproducible than the other elephant in the room: claims for mobile broadband as 4G approaches - but regulators no doubt think that is for another day. - Stephen McClelland
2 http://fjallfoss.fcc.gov/edocs_public/attachmatch/DOC-298516A1.pdf
3 Broadband Performance, OBI Technical Paper No 4, FCC. http://fjallfoss.fcc.gov/edocs_public/attachmatch/DOC-300902A1.pdf



"Very few ADSL broadband customers achieved average actual download speeds close to advertised ‘up to’ speeds. Just three per cent of customers on ‘up to’ 20 or 24Mbps DSL services received average down-

load speeds of over 16Mbps, while 69% received average download speeds of 8Mbps or less," says the regulator, but acknowledges at lower speeds the gap was narrower. The best performer was cable operator Virgin Media using DOCSIS technology and a fibre-based network to street cabinet level with onward distribution via coaxial cable which nearly achieved advertised speeds. BT Infinity, a fibre-to-the-cable network available to 15% of the UK population also achieved performance nearer to advertised speeds and showed effective capabilty on upload speeds as well.



by Stephen McClelland

News analysis Searching for the missing link London - When BeautifulPeople.com, a dating website, went global at the end of 2009, it needed to stand out from the crowd. After all, online dating is hardly a business lacking in well-established competition, and the business needed to do more than appear as just another dating site. In fact, the site did have a well-honed USP: it said it did not want "ugly" people to sign up. Nor was it quiet about the policy. In fact, it launched what was to become a major viral campaign after the holiday season with a clear message: it was prepared to expel people who had gotten too fat, based on peer reviews by other members. If the announcement was arguably controversial, it was certainly high impact. And perhaps this was by design. Whilst the original announcement of the policy had been made and distributed through news agencies such as PR Newswire on 24th October 2009 ("BeautifulPeople.com goes global. Ugly people banned from the world's most sought-after beautiful network as it launches worldwide"), viral behaviour over successive weeks ensured it was picked up by major news outlets including the BBC, CNN, Fox and CTV, which further amplified the developing story. By early January 2010, the BBC website carried the story complete with a link to the BeautifulPeople website itself. Again the key details were included, including that controversial USP: "As a business, we mourn the loss of any member, but the fact remains that our members demand the high standard of beauty be upheld," said site founder Robert Hintze. "Letting fatties roam the site is a direct threat to our business model and the very concept for which BeautifulPeople.com was founded." According to marketing experts such as Dixon Jones, CEO of Majestic SEO, a supplier of search engine optimization software, speaking at Europe's major conference in the area1, traffic to the BeautifulPeople.com site skyrocketed principally because of news reports and discussion - and above all, links back to BeautifulPeople.com. Within days hundreds more domains linked to the site and generated a prodigious spike in traffic - and interest.

Attention-seeking behaviour The BeautifulPeople.com story is by no means unusual, but it exemplifies what is happening in the world of digital marketing. Digital marketing is not one discipline but a set of sophisticated techniques and practices that range from subtle influence of opinion formers online to sophisticated email broadcasting and, now, to emerging social networks and how they can be made to work. At the heart of most such marketing are search engine optimization techniques that get attention from the major global search engines such as Google, the global behemoth processing in excess of one billion search requests per day. 1 Search Engine Strategies, London, 21st-25th February 2011. http://www.searchenginestrategies.com/london/

Getting attention on the web has never been more important. The B2B and most particularly the B2C environment selling fast-moving consumer goods needs the visibility that search engines give when consumers are typing in enquiries. First page - and preferably the first position ranking - on Google is desired. For most marketeers, to be anywhere other than the first page is to be nowhere at all (see Figure 1). Dixon Jones cautions that ranking is not the same as traffic. Nevertheless, an entire commercial ecosystem has developed, carefully scrutinizing search engines' strategy. Google keeps most of its practices and, in particular, its major search algorithms a trade secret, but it drops hints and broad approaches that the rest of the industry follows with the zeal of Cold War Kremlin-watching. For this community, the official update and commentary blogs emanating from Google Principal Engineer Matt Cutts have far more significance than anything Google's CEO says in public about the entire company. The ranking algorithm itself is complicated and may weigh as many as 200 separate factors to determine how well Google will rank a given page. Alongside relevance, a significant part however is the apparent readiness with which other sites - especially those deemed high quality - will link in. These so-called backlinks cannot be directly controlled by the site or its manager and so must be sought after, requested, or even begged for. Controversially, they can even be bought. Clearly, though, if the site is important enough anyway, other people will want to link to it: this is the point of the search engine philosophy after all.
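To make the role of backlinks concrete, the sketch below shows the kind of link-based scoring that Google's original PageRank idea popularized: a page inherits authority from the pages that link to it, weighted by the linkers' own authority. It is an illustrative toy, not Google's actual algorithm (which, as noted, is secret and far more elaborate), and the site names in the link graph are invented.

```python
# Toy link-based ranking in the spirit of PageRank: pages earn score from the
# pages that link to them, weighted by the linker's own score. Illustrative only.
def rank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:  # distribute this page's score across its outbound links
                share = damping * score[page] / len(targets)
                for target in targets:
                    new[target] += share
            else:  # dangling page with no outbound links: spread its score evenly
                for p in pages:
                    new[p] += damping * score[page] / len(pages)
        score = new
    return score

# Hypothetical link graph: a well-linked retailer versus pages few sites link to.
links = {
    "retailer.example": ["newspaper.example"],
    "newspaper.example": ["retailer.example", "blog.example"],
    "blog.example": ["retailer.example"],
    "spamfarm.example": ["retailer.example"],
}
for page, s in sorted(rank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {s:.3f}")
```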



It is big business. Specialist agencies spend most of their time encouraging the creation of such authoritative backlinks. Some, it seems, overstep the ethical line - and in doing so call attention to the wider picture. Latest in the frame is giant US retailer JCPenney. As one of the largest retailers in the US with a formidable online presence, it found itself last month in the middle of a New York Times investigation of its use of spamming links.2 The New York Times investigator called the operation "the most ambitious attempt to game Google's search results that he [had] ever seen." The issue centred on the use of a huge number of backlinks pointing to the JCPenney site. As the New York Times article puts it: "Some of the 2,015 pages are on sites related, at least nominally, to clothing. But most are not. The phrase 'black dresses' and a Penney link were tacked to the bottom of a site called nuclear.engineeringaddict.com. 'Evening dresses' appeared on a site called casino-focus.com. 'Cocktail dresses' showed up on bulgariapropertyportal.com. 'Casual dresses' was on a site called elistofbanks.com. 'Semi-formal dresses' was pasted, rather incongruously, on usclettermen.org. There are links to JCPenney.com's dresses page on sites about diseases, cameras, cars, dogs, aluminum sheets, travel, snoring, diamond drills, bathroom tiles, hotel furniture, online games, commodities, fishing, Adobe Flash, glass shower doors, jokes and dentists - and the list goes on." In short, there was little or no relevance - and, equally important, the backlinked pages themselves had little or no authority. It was an attempt to spam Google into thinking that JCPenney should rank more highly. During the investigation, the New York Times contacted Matt Cutts' team at Google and the result was more or less instantaneous: a deranking of JCPenney for the many terms in question. JCPenney spokespeople said the link-building exercise was done without the company's knowledge. Parting with its digital marketing agency of seven years' tenure, the company has lost no time in weeding out the offending links (and, it seems, many quite legitimate ones) and has begun the vigorous re-engagement of a major brand that may have suffered reputational damage online.

Figure 1: Percentage of Google traffic by search result position. Source: Chitika.

2 The Dirty Little Secrets of Search, http://www.nytimes.com, 12th February 2011

3 http://www.blogstorm.co.uk

The worldwide SEO community has been variously shocked and baffled at what has become known as the "Penney problem". Most in the community are reputable and want to distance themselves from such "Black Hat" practices. But it is difficult, as Patrick Altoft, Search Director at digital agency Branded3, points out in his widely read blog3: "Nobody wants to use spam links but if everybody else is doing it and not getting caught then people start to think it's OK. Google has been telling retailers not to use link spam for years but the longer people can see blatant spam working the harder it will be for people to resist trying it out." Other experts say the strategy was "stupid" insofar as otherwise sophisticated and experienced people in the industry used an embarrassingly transparent ruse: employing a link broker to get the links. Ironically, the offending strategy seems only to have been put in place relatively recently, to push an already strong brand still further: given the JCPenney profile, it would have been the natural link target of many legitimate pages and would have ranked appropriately for many categories.

Getting to No 1 No one denies that Black Hat practices remain a problem in the industry. Occasionally, too, the lure of links from high-authority sites means some link builders will even offer low-paid webmasters money to make the link. Is it worth it? It might be, according to the available data. One analysis was recently made by Chitika, which attempted to answer the question specifically: "How much is the top spot on Google actually worth?" According to data from the Chitika network, it's worth a ton – double the traffic of the #2 spot, to be precise.4 (see Figure 1)


The Chitika blog continues: "In order to find out the value of SEO, we looked at a sample of traffic coming into our advertising network from Google and broke it down by Google results placement. The top spot drove 34.35% of all traffic in the sample, almost as much as the numbers 2 through 5 slots combined, and more than the numbers 5 through 20 (the end of page 2) put together. 'Obviously, everyone knows that the No 1 spot on Google is where you want to be,' says Chitika research director Daniel Ruby. 'It's just kind of shocking to look at the numbers and see just how important it is, and how much of a jump there is from 2 to 1.' The biggest jump, percentage-wise, is from the top of page 2 to the bottom of page 1. Going from the 11th spot to 10th sees a 143% jump in traffic. However, the base number is very low – that 143% jump is from 1.11% of all Google traffic to 2.71%. As you go up the top page, the raw jumps get bigger and bigger, culminating in that desired top position." In the light of the Penney problem, getting there may be a link too far. The New York Times itself suggests that the way the web and Internet structures are seen to work has taken a reputational hit, commenting: "When you read the enormous list of sites with Penney links, the landscape of the Internet acquires a whole new topography. It starts to seem like a city with a few familiar, well-kept buildings, surrounded by millions of hovels kept upright for no purpose other than the ads that are painted on their walls."
4 http://insights.chitika.com/2010/the-value-of-google-result-positioning/


Google itself has probably suffered an image problem, although it acted rapidly when notified. "The fact that Google didn't catch this sort of spam is a big PR nightmare for them," says Branded3's Patrick Altoft. The search engine giant is considered to be omniscient in its monitoring of potential abuse, with algorithms designed to weed out such behaviour. Indeed, part of Google's rise to near domination of the market was the concept that linking from highly regarded pages was a strategy more in tune with how humans would naturally rank information.

We want high quality Policymakers might be relieved that, for the moment, they don't have to get involved in such disputes, but jurisdictions such as the UK have recently extended self-regulatory oversight to websites and even to the use of user-generated content. Google meanwhile has continued to revise its search algorithm and in late February announced a significant change, named the Panda update, aimed at giving higher-quality results by essentially derating lower-quality sites, particularly those which rely on copying and syndicating content from others so as to embrace as many keywords in their text as possible and thereby get ranked on search engines. Ezine-type sites pose a particular problem in this respect. This algorithmic change was a timely move in view of the JCPenney problem but was probably unrelated to it. Google's own blog, authored by Matt Cutts, pointed out: "This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on." The change is estimated to affect around 12% of all search queries.

Should you retarget? Few doubt, however, that there will always be someone who will exploit a loophole. Equally, emerging techniques such as behavioural monitoring may raise unforeseen issues anyway. Take, for example, the retargeting of banner ads related to a particular product at users who have just searched for and looked at the details of that product. Generally, the retargeting takes place on the same content network, but users may get the impression the provider is following them around the web. Is that an ethical marketing strategy? Retargeting - usually mediated by dropping and reading cookies on user browsers - has its proponents and opponents, and even within Europe it receives a significantly different policy reception from country to country. Cookies may be used in various ways. One search marketing expert enthusiastically points to the benefits: people who have checked things out make better prospects to sell to. But consider the following: you've applied via the web for a test drive of a new 4x4 under pressure from your spouse, but spent most of the session browsing another page that concerns a sleek sports car. The chances are the system will know about your browsing behaviour, and it is conceivable that when you arrive at the dealer for the test drive, the sports car will be waiting alongside the 4x4 even though you didn't request it. Scenarios like this - when not causing marital disharmony - test the boundaries of what is permissible. Some advise capping such behaviour so as not to be seen to be "stalking" the customer.
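For readers curious about the mechanics, here is a minimal sketch of cookie-based retargeting using Python's Flask framework. The endpoints, cookie name and product identifiers are invented for illustration; real ad networks layer consent handling, expiry policies and cross-site machinery on top of anything this simple.

```python
# Minimal sketch of cookie-based retargeting; illustrative only.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/product/<product_id>")
def view_product(product_id):
    # When a visitor views a product page, remember it in a browser cookie.
    resp = make_response(f"Product page for {product_id}")
    seen = request.cookies.get("recently_viewed", "")
    items = [p for p in seen.split(",") if p] + [product_id]
    resp.set_cookie("recently_viewed", ",".join(items[-5:]), max_age=7 * 24 * 3600)
    return resp

@app.route("/ad")
def serve_ad():
    # Later, on another page in the same content network, the ad slot reads the
    # cookie and retargets a banner for the product the visitor looked at.
    seen = request.cookies.get("recently_viewed", "")
    items = [p for p in seen.split(",") if p]
    if items:
        return f"Banner ad retargeting product {items[-1]}"
    return "Generic banner ad"

if __name__ == "__main__":
    app.run()
```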



Search gets social In the world of search, however, nothing stands still for very long. The present challenge is to bring the newest of the new media waves - social networking - into the orbit of search engines, although this is not a trivial task and marketeers are still trying to understand the value of social media. "Search is crucial to social (networking) and social is now very crucial to search," says Katy Howell, CEO of PR agency Immediate Future in London, emphatically. Google launched Google Social Search in 2009 and in February announced some major improvements to the service. Marketeers are already seeing significant potential in including social media in search engines. Search engine results pages already display social media results and will be able to link to social networks to give an across-the-board view of what your contacts are commenting on. Marketeers already know the power of this referral system (see Figure 2), she asserts. For one thing, Ms Howell says, "checking keywords in line with social insights will ensure your brand shares the consumer's language, leading to more targeted search queries. And content that is heavily shared or liked across social platforms will improve social advocacy and influence rankings." She adds, "This will lead to higher visibility in social search. Marketeers will also be able to leverage the power of peer influence by making reviews and recommendations shareable and, where possible, containing them within owned media. Clearly, too, ongoing monitoring will identify online search risks (for example, negative tweets, posts or conversation) that can be managed before they create a negative search legacy."

Arguing that traffic is what counts, Majestic SEO's Dixon Jones sees another trend emerging as social network platforms such as Facebook and Twitter become dominant in their own right, and points to intriguing developments in the ecosystem such as the paper.li phenomenon. Swiss-based Paper.li says it organizes links shared on Twitter and Facebook into an easy-to-read, newspaper-style format ready to be further shared. The result is a DailyMe-style portrayal that is capable of being searched but that also generates enormous amounts of traffic through the social networks, beyond the traditional idea of purely search-engine-based rankings. The mixing and remixing potential may go a long way. Meanwhile, linking, it seems, is still everywhere, and still finding new ways of being used. Search, its successes and its controversies seem not to be going away anytime soon.

Figure 2: The degree of trust in various forms of advertising, April 2009. Percentages show respondents who trusted each form "completely" or "somewhat". Source: Nielsen Global Online Consumer Survey, April 2009.
Recommendations from people known: 90%
Consumer opinions posted online: 70%
Brand websites: 70%
Editorial content (eg newspaper article): 69%
Brand sponsorships: 64%
TV: 62%
Newspaper: 61%
Magazines: 59%
Billboards/outdoor advertising: 55%
Radio: 55%
Emails signed up for: 54%
Ads before movies: 52%
Search engine result ads: 41%
Online video ads: 37%
Online banner ads: 33%
Text ads on mobile phones: 24%



by Stephen McClelland

News analysis Towards the "connected consumer"? London - It's hard to know where to begin with the mobile story these days - a story that shows no sign of slowing its global expansion. But in this expansion, some areas are more expansive than others. Big-picture trends are hotly debated: one consultancy has reported the results of a major survey* of 6,000 customers, which yields some surprising perspectives. The inevitable conclusion remains a picture of near-seismic activity in a sector that will continue for some time yet.

Tectonic shifts of mobile There are several tectonic shifts evident in the industry, says AnalysysMason, but some of these appear to be related, in a larger or smaller context, to operator strategy and network operation, particularly in the mobile broadband space. In fact, mobile broadband per se in developed countries may already be seeing some aspects of early maturity and a slowing market, at least compared with the recent high growth rates. However, operators would be unwise to try to position mobile broadband as some sort of substitute for fixed-line broadband. Consumers, in short, seem to want the mobility of mobile broadband rather than anything else. As Martin Scott of AnalysysMason points out: "Attempts to sell mobile broadband as a substitute for fixed are likely to fail as there is a strong perception among consumers that mobile broadband is not as fast, more unreliable and more pricey than fixed broadband."


Ironically, many people might not even be using mobile broadband in a mobile sense. According to the AnalysysMason survey, some 72% of mobile broadband subscribers surveyed in Europe said they used the service mostly, or solely, at home or at work, and for US subscribers this figure increased to more than 79%. AnalysysMason said it expected this figure to increase, with customers downloading 90% of mobile data indoors by 2015. So much for the mobile part: AnalysysMason predicts this will in turn mean significant opportunities for fixed operators to move into the connected space of users, because when indoors, mobile data consumers customarily access data through WiFi/DSL networks, and fixed operators should be able to analyze customer demand across mobile devices.

Getting to M2M Operators have been improving their game - customer churn is down, for example, as the industry sees more stability in pricing and longer-term contracts. However, the apps world may be giving operators a major headache. AnalysysMason reports, for example, that one major European operator is getting only a 65% 'first time the user tried' completion rate on apps, with consequent increased churn and more customer service needed. The problem is particularly severe with smartphones and inexperienced users, and may signal that the industry needs to work harder and in a more user-friendly way.

Meanwhile, the industry is searching for new business models - and ones that may not necessarily involve people. More recently, the focus has been on predicting the take-up of machine-to-machine (M2M) applications, in which devices are embedded seamlessly in services; in this area at least, no figure for M2M deployment in the 2015 to 2025 timeframe seems too high. AnalysysMason itself forecasts over 2 billion devices in 2020, much lower than several other estimates, yet still representing a compound annual growth rate of 36%. In fact, as Steve Hilton of AnalysysMason says, "M2M solutions are nothing new – the technology sector has dabbled for at least ten years. However, whereas most deployments have focused on the commercial segment – particularly in the automotive/transport sector, where the first solutions appeared – we expect this balance to shift markedly from commercial to consumer applications." He continues: "While the commercial segment accounts for well over half of today's M2M deployments, consumer connections will grow to outpace commercial ones during the next decade. Consumer demand for real-time access to information will foster new services, including home security, health management, vehicle monitoring and connected engine diagnostics. The Average Revenue per User (ARPU) from M2M services is substantially lower than that from mobile voice or traditional mobile data services: about EUR3.65–5.15 per SIM per month." The ARPU per kilobyte of data can be very high, however, he says, and operators should make money by treating M2M as incremental traffic on an already built broadband network.
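As a rough sanity check on that growth rate, the sketch below computes the implied compound annual growth rate. The 2010 baseline of roughly 90 million M2M connections is an assumption made for illustration; the article states only the 2020 figure and the 36% rate.

```python
# Quick CAGR check for the M2M forecast. The 2010 baseline of ~90 million
# connections is an assumption for illustration; the article quotes only the
# 2020 figure (over 2 billion devices) and the 36% growth rate.
def cagr(start_value, end_value, years):
    return (end_value / start_value) ** (1 / years) - 1

base_2010 = 90e6      # assumed baseline, not from the article
target_2020 = 2e9     # "over 2 billion devices in 2020"
rate = cagr(base_2010, target_2020, years=10)
print(f"Implied compound annual growth rate: {rate:.1%}")  # roughly 36%
```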



MARKET POSITION OF DIFFERENT MOBILE BROADBAND CONNECTION TYPES (2011-2014)

USB modem. Market position in 2011: mass-market users of mobile broadband. Estimated share of connections (2011): >75%. Market position in 2014: relatively small portion of the market, as other connection forms take precedence; low income... Estimated share of connections (2014): ~40% and declining.

Inbuilt connection. Market position in 2011: heavy users, typically users of tablet PCs or laptops; a growing niche. Estimated share (2011): <15%. Market position in 2014: the standard connection form for consumers making heavy usage of single devices, including laptops and tablets. Estimated share (2014): ~20%.

MiFi device. Market position in 2011: users with multiple-device connectivity; with growing awareness, users will buy MiFi devices, especially on contract renewal. Estimated share (2011): <5%. Market position in 2014: mass-market connectivity solution, though starting to lose share to handset-based connectivity. Estimated share (2014): ~15%.

Handset with MiFi capabilities. Market position in 2011: early adopters with ad hoc requirements to connect multiple devices; penetration will remain low until battery issues and tariffing problems are resolved. Estimated share (2011): <5% (tethering). Market position in 2014: individuals with multiple devices to connect; with battery and tariffing issues resolved, this is a key way for users to connect multiple devices. Estimated share (2014): ~25% and growing.

Source: AnalysysMason


And now - MiFi
But perhaps the most intriguing of all the mobile trends relates to a sense of convergence directly in the consumer space, largely mediated by the increasing number of devices that single users have that all need connectivity, probably on an ad hoc basis. Consumers clearly don't want multiple connection options and separate subscriptions. One approach has been the development of the so-called personal hotspot or MiFi, a generic term for compact wireless routers. Verizon, for example, has already announced that its iPhone will feature a personal hotspot.

AnalysysMason's Tom Rebbeck suggests: "To capitalise on this increased interest in MiFi/personal hotspots, operators should try to position themselves as the central point of management for multiple devices that need Internet access. In the longer term, this may mean providing multi-SIM contracts (that is, adding multiple devices with inbuilt connections to a single contract). But until billing systems can manage this feat, MiFi/personal hotspots will be the way for operators to become the key player for users with multiple devices." AnalysysMason concludes that many options are likely to be available in the next three years, with no single one dominating (see the table above). - Stephen McClelland

*Mobile Insight and the Connected Consumer (AnalysysMason 2011)



by Jacques Bughin

News analysis The web's USD 100 billion surplus Consumers derive significant value from all they do on the Web, and since advertising pays for much of this, it involves no immediate out-of-pocket cost. We all experience these benefits each time we log onto a social network or watch a free Web video. But how much is all of this Web use worth? About EUR150 billion a year, according to new McKinsey research involving a survey of 4,500 Web users across Europe and the United States, as well as conjoint analysis of their willingness to pay for various online activities. Consumers do pay for some of this: EUR30 billion for services such as music subscriptions and gaming Web sites. In a sense, they also pay for the "pollution" of their Internet experience by intrusive pop-up advertising and perceived data privacy risks, an amount we estimate to be EUR20 billion after asking consumers what they would pay to avoid further clutter and privacy concerns. That leaves a substantial consumer surplus of EUR100 billion a year, a total that we project will grow to EUR190 billion by 2015 as broadband becomes ubiquitous around the world and as new services and wireless devices come to the fore. For Web service providers, this is a large parcel of value to leave on the table. In fact, it amounts to more than three times the EUR30 billion companies pay providers to advertise on their Web sites and is almost as much as the EUR120 billion consumers pay for wired and wireless broadband access. One reason for this seeming largesse may be that once a Web service is created, the cost of distributing it is very low, and most Web companies are satisfied covering their basic costs with advertising. In the off-line world, things are different, of course: the surplus is more evenly divided between consumers and suppliers, since in many markets (books, movies, or cable TV, for example) consumers pay for content.
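The bookkeeping behind the headline surplus figure can be laid out explicitly; the sketch below simply restates the rounded numbers quoted above.

```python
# Consumer-surplus arithmetic using the rounded figures quoted in the article
# (EUR billions per year); illustrative bookkeeping only.
gross_value_to_consumers = 150   # estimated value consumers place on Web services
paid_for_services = 30           # music subscriptions, gaming sites, etc.
pollution_cost = 20              # what consumers would pay to avoid ads/privacy risks

net_consumer_surplus = gross_value_to_consumers - paid_for_services - pollution_cost
print(f"Net consumer surplus: EUR{net_consumer_surplus} billion a year")  # EUR100 billion
```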

Three ways Web economics could shift Web players may try to recapture some of this large, growing source of value. A not-too-distant analogy is the way broadcasters gradually shifted from free programming to pay-TV to capture a bigger slice of value. While it's not clear how things will play out on the Web, at least three scenarios seem worth contemplating.

Service costs rise One obvious possibility is that Web players will charge more for services, as they already do for certain premium offerings, such as multiplayer video game sites or subscription-based access to unlimited music libraries. So far there's been strong resistance to this approach from consumers: only about 20% of online users pay for any services, and our research shows that expanding the scope of fees to an amount equalling the value of the surplus would reduce usage by as much as 50 percent, torpedoing the economics of Web services.

Advertising grows Another strategy would be to ramp up Web advertising, and here, the “pollution factor” may be the key. At present, Web companies are reaping more in advertising revenues than consumers are willing to pay to avoid them (EUR30 billion versus EUR20 billion). This imbalance suggests that today’s levels of advertising are sustainable and that there could be room for more ads or other monetization plays, such as asking consumers to provide more personal data to access services. It’s hard to say how much more, though, because there’s no data on how consumers would respond if Web pollution grew a great deal. Is there a tipping point where their willingness to pay to eliminate pollution would increase so dramatically that business models would shift in response? For example, if ad revenue grew to EUR40 billion or EUR50 billion, it is not clear whether consumers’ willingness to pay to avoid the new ads would grow so much that Web service providers would be better able to extract more surplus by charging users more, as opposed to selling still more ads.



Monetization Web players operate in multisided markets that allow them to collect revenues from both their advertisers and their users. They may be betting that by creating a large consumer surplus today with free services and big audiences, they will bolster their online brands, leading to higher profits or market value down the road. The rationale for this approach is pretty compelling, though a for-pay walled garden would work only for premium brands and services. Even for those, reach will be limited—as will companies’ ability to use their Web platforms to launch other businesses.

Preparing for change Of course, we’re still in the early days of the Web economy, and only recently has the consumer surplus swelled with the rise of blockbusters such as Facebook and alwayson connectivity. Clearly, this is a market that’s far from equilibrium, so players should be planning for major change and preparing their strategies accordingly.

Exhibit: Consumer surplus by category of Web service (relative values as shown in the original chart): Email 3.2, Search 3.1, Social networks 2.2, Instant messaging 2.1, Internet phone 1.4, Web mapping 1.1, Videos 0.9, Comparison shopping 0.8, Music 0.8, Wikis 0.8, Yellow pages 0.8, Advanced uploading 0.7, Podcasts and content reading 0.7, Blogs 0.6, Games and gambling 0.6, Directories 0.5. Groupings: Communications 44%, Web services 38% and Entertainment 18% of the total surplus (a further 52% callout appears in the original graphic).

Service players trying to stay ahead of market shifts must be attuned to rapid market consolidation: the top 100 providers accounted for 45% of Web traffic in 2010, up from only 20% in 2007. To stay ahead, leading players are already broadening their base of services on robust proprietary platforms, particularly services that can be offered at low cost via the cloud and mobile devices; Twitter and Facebook are prime examples of such multi-use platforms. As more business and individual activities move online, early movers should be well positioned to capture higher advertising revenues and perhaps, over time, higher service fees. In turn, advertisers may have better revenue options because of Web innovations. Some are already moving beyond distracting display ads; they're designing branded content promotions to attract the attention of users and shaping marketing campaigns around messages that travel virally among socially networked "friends," thus making these campaigns more acceptable to the consumer. For consumers, the benefits of Web surpluses will continue. Engagement with consumers is the key to value creation in multisided markets, so they should expect continuing service innovations and tolerable advertising levels that keep the prices for Web use and access low.

Jacques Bughin is a director in McKinsey's Brussels office. © McKinsey Quarterly 2011






PTC '11 Reading the Net future Honolulu - It doesn't take very much observation to see - as PTC '11 demonstrates - that in spite of the global economic pressures, telecom deployment in Asia-Pacific remains in an upswing. The region is still in a broadband deployment phase both nationally and internationally and can boast widespread and advanced mobile services.

One trillion is a round number Connectivity for everyone is key, but behind - or rather above - the network deployment, future trends need review. The relationship between the two is complicated and ideal for PTC, as a service provider conference, to examine. For Cisco's Robert Pepper, there are three big trends: "The first trend will be total ubiquity [of the network]. We have 2 billion on the Internet and only 1 billion on broadband...so the question is how far we can replicate the mobile miracle, and have 5 billion on broadband. More importantly, by 2015, we are looking to 50 billion devices on the Internet going to one trillion by 2015." He continues: "Second is un-tethered [communications]...I am not using the word "mobility" because the difference between the two is velocity. The third is the demand for "ultra-broadband". It is not a megabit connection...we are talking about Gigabit Ethernet more or less everywhere at some point. We are seeing huge demand drivers for consumers, driven [especially in the Asia Pacific] by data, especially video services."

Collaborative commons? Blair Levin, now of the Aspen Institute and late of the US Broadband Plan, takes a more societal viewpoint: "Focussing on societal implication, when we were at the FCC we had a lot of talk about convergence of voice, video and data. That has happened, but it is not the most important thing. Far more important is the commons of collaboration, which has huge implications for the economy and society." He says he can see hugely positive and also negative impacts as a result of this shift in society, and continues: "I am not sure [broadband networks] will be ubiquitous but they will [need to] become ubiquitous because if you want a productive society you have to have everyone on, and everyone skilled in how to use it. It is also true that when you have a civic society it becomes the commons for that, so if you want to know what is going on in terms of news, you are no longer listening to the radio or watching broadcast television, you are relying on this medium. That is really being driven by three revolutions: data, computing, and communications. In the data revolution billions of connected devices create data - finding a data point would in the past have been like finding a needle in a whole galaxy of haystacks. The computer revolution makes it easy to analyze those data points. That revolution, along with the communications revolution, leads to the idea of the commons platform."

Goodbye Gutenberg "As for the future of it, it was really noticeable during the [US] Broadband Plan [that] whereas most of the talk was about the deployment of networks, the real upside for America (and I would argue for most countries) is not in the deployment of networks but in the use of applications for the public good. It is noticeable that the sectors of the economy that are the last ones to adopt broadband are the ones that are dominated by government either as the sole buyer, or major supplier: education, healthcare, and public safety are examples. All these could be improved by broadband, and yet we are way, way behind. The fundamental way we distribute educational content to children is through a platform that was developed 500 years ago by Gutenberg. A digital platform is so much better for education and, by the way, a lot cheaper, and yet we are very slow to adopt it. And that is true across the board with the public sector. I think there is a big upside [because of broadband deployment], but whether government can overcome the forces of inertia and do that remains to be seen."

Intelligent consumers Smart devices may imply a power shift between suppliers and buyers. In retailing, this may be marked, as consumers have more information power at their fingertips than ever before. Craig Walker, Entrepreneur in Residence at Google Ventures, suggests the breakout point may well be the mass-market appearance of the smartphone.



That can fundamentally change many things, such as customer relationships: "Having those types of connected, interactive [capabilities] means everything can be a much more interactive experience, particularly for the consumer, who now has unprecedented information at his or her fingertips. That is what really excites me: everyone walking around with smart, connected, location-aware devices. So what services and products are going to emerge to marry the elements and make the whole experience effective?" Blair Levin likewise acknowledges the impact of the information revolution on the traditional knowledge industries. There is no more "sacred" knowledge in this power shift, he suggests - even for the professional classes. "Lawyers, for example, have to increase their value proposition because everyone - including the client - has access to the [same] data and can see what is going on. This access becomes routine."

An interactive interaction?

He continues: "In education, a child reading a conventional textbook today perhaps doesn't understand something - so what is their ability at that moment in time? Contrast that with the same material as an e-book. After every paragraph the child can click a link or can see a one-minute video of the best teacher in the world explaining the concept. Or he or she can click a link for a Skype connection to a tutor who may be a college student offering tuition on the subject for a demand-led fee of USD4 per hour. The child is essentially empowered at all times to find resources to deal with that particular problem. Meanwhile, the textbook company has become an educational content company [to enable this]. This sort of collaboration is happening in many places, but it is not happening in education, and it is not happening in healthcare."

Robert Pepper agrees. "A teacher from 1911 could walk into a classroom today and feel very comfortable because nothing has changed." Dr Pepper suggests that emerging economies - particularly around the Pacific Rim - are embracing advanced technologies in the public sector space "faster than North America and Europe where there are huge [public sector] constituencies threatened by them."

Not the most important problem? Blair Levin contends it is a policy question: "A central point is policy in the United States. We are going to spend billions and billions of dollars to solve an important problem - but not the most important problem - that is, how to get rural America hooked up. We are paying subscale, inefficient companies to offer services. What we need to do to drive connectivity and leadership in the next couple of decades is make sure, from a strategic basis, that we are leaders in connectivity." He continues: "Google is doing this with the Google Fibre initiative. This is perhaps the most stunning thing of 2010. I am schizophrenic about this. Eleven hundred communities have signed up. But these communities are saying they want a Gigabit capability, roughly equivalent to a small community saying it wanted an eight-lane highway in 1911. Part of me thinks: what are they doing? On the other hand, I think there was a really profound understanding that this is the future. You really have to be visionary in how you use this stuff. From a US perspective, we need a critical mass of communities so we can lead in next generation applications." "I think it is about networks fit for purpose," suggests Robert Pepper. "We actually need networks that are adapted and dynamic and that can be, in some ways, aware of the applications. The applications are going to drive the demand on the networks, but the networks have to be capable of supporting those applications...so the underlying infrastructure is absolutely crucial...it is necessary but not sufficient for where we want to go, which is the demand-driven use of applications, which actually will be anywhere-anytime." He continues: "One of the magical things about mobiles is that I can go anywhere in the world and switch the phone on, and the network can instantly detect that I am on the network, my device is authenticated and I pay my bill...and that all happens within seconds. That is pretty amazing and that is what is required for all applications." - Stephen McClelland
http://www.ptc.org



He continues: "In education, a child reading a conventional textbook today perhaps doesn't understand something - so what is their ability at that moment in time? Contrast that with the same material as an e-book. After every paragraph the child can click a link or can see a one-minute video of the best teacher in the world explaining the concept. Or he or she can click a link for a Skype connection to a tutor who may be a college student offering tuition on the subject for a demand-led fee of USD 4 per hour. The child is essentially

empowered at all times to find resources to deal with that particular problem. Meanwhile, the textbook company has become an educational content company [to enable this]. This sort of collaboration is happening in many places, but it is not happening in education, and it is not happening in healthcare."



by Samia Khatun, Klara Debeljak and Dr. Gerry Power

Citizen Access to Information Emerging trends from the developing world

The Global Monitoring Report on Education for All 2010 and Reaching the Marginalised, both published by UNESCO, report shocking figures about the state of education in the developing world. Less than 55% of school-age children in developing countries attend secondary school, 72 million children are still out of school and, if current trends continue, there will still be 50 million out of school by 2015.

On another front, across Africa, Asia, the Middle East and Latin America, there is a profound transformation in the way people are gaining access to information. Rapid media liberalization and the increasing availability – and sometimes ubiquity – of new technologies (particularly mobile telephony) are combining with broader political and social changes to bring about this transformation (West, 2008). Global mobile phone subscriptions are expected to grow at an annual rate of 7.9% over the period 2007-2012, boosting the number of global mobile phone subscribers to 4.5 billion in 2012, with a penetration rate hitting 64.7%, up from 46.8% in 2007. Most of that increase is happening in developing countries, where it is having a transformative effect. Since 2000, mobile ownership has grown by 70% every year across the 50 poorest countries of the world. The International Telecommunication Union (ITU) (2009) reported that mobile phone subscriptions in Africa increased dramatically, from around 4% to 28% of the total population, between 2002 and 2007.1 According to the ITU's 2010 report, global mobile cellular subscriptions had already reached an estimated 4.6 billion by the end of 2009, corresponding to a penetration of 67 per 100 inhabitants globally.

1 ITU, Measuring the Information Society, ICT Development Index, 2009, p.4.

We know that investment in education produces significant returns in poverty reduction, economic growth, child survival and democracy. We also know that there is huge potential for mobile platforms to meet the information needs of people living in poverty. However, the extent to which the benefits of increased access to information and communication technologies will be shared equitably among citizens of developing countries has been challenged (Etzo & Collender, 2010).

In September 2010, a conference was held at the UN headquarters in New York to discuss the Millennium Development Goals (MDGs) and to review progress to date, with five years left to the 2015 deadline. The conference finished with the adoption of a global action plan to achieve the eight anti-poverty goals by their 2015 target date. Progress on the MDGs is reliant upon populations in developing countries understanding and acting upon MDG-relevant information. The World Bank (2003) believes that "When the crucial information and communication needs of the poor go unmet, quality of life may significantly degrade, resulting in social exclusion, marginalization, isolation, alienation and humiliation" (pg 36). Similarly, Wilson, Warnock and Schoemaker (2004) argue that "Reaching the MDGs in 2015 will require a belated recognition that communication is a prerequisite and central to all aspects of sustainable development" (pg 4).

However, despite a growing consensus that information provision is important and that, for example, ICTs can improve the delivery of services and facilitate the management and transfer of knowledge, the role of information has largely been left off the MDG agenda. We believe that information is not only imperative, but also that "Citizen Access to Information" is a catalyst to achieving the MDGs across a broad range of development indicators, and that tracking citizen access to information over time would aid resource allocation efforts to support all of the MDG objectives. To obtain sufficient understanding of citizens' access needs in relation to development outcomes



it is crucial to understand that the availability of information and access to different technology platforms in developing countries varies significantly across different population sub-groups: between men and women, between the educated and the uneducated, and between those living in rural and urban areas. For example, while 82% of all adults living in urban areas in Ghana own a mobile phone, the share of mobile owners in rural areas is much lower, at 65%. In Zambia the gap is even wider: 81% of adult residents of urban areas own a mobile phone, in comparison with only 53% in rural areas. To address these issues, InterMedia has developed a citizen-centred research framework, the Citizen Access to Information MDG Tracker™, which identifies which population sub-groups lack what information and provides direction on where the gaps need to be filled.

Information access & the MDGs

The MDGs are eight international development goals that all 192 United Nations member states and at least 23 international organizations have agreed to achieve by the year 2015. They include targets on poverty, education, gender equality, child health, maternal health, HIV/AIDS and other diseases, environment and global partnerships for development. However, reference to information provision on any of these topics is largely missing in the MDGs. Information and communication are in fact mentioned only twice, once as one of the targets within Goal 8 and once as one of the indicators for measuring progress within Goal 6:

»» Goal 8: Develop a global partnership for development. Target 8.F: In cooperation with the private sector, make available the benefits of new technologies, especially information and communications (UN Official List of MDG Indicators 2008).
»» Goal 6: Combat HIV/AIDS, malaria and other diseases. Indicator for monitoring progress 6.3: Proportion of population aged 15-24 years with comprehensive correct knowledge of HIV/AIDS (UN Official List of MDG Indicators 2008).

Likewise, the public debate around information provision and its relevance to the achievement of the MDGs (as well as development issues in general) has so far been limited. When this debate has taken place, it has been largely focussed on connectivity issues and the digital divide, rather than on access to the relevant content and its quality.

While access to different media and technology platforms or sources is certainly important, it alone does not guarantee access to information, nor does it warrant access to quality information that is relevant to people's day-to-day lives. In addition to measuring citizens' access to various platforms it is thus imperative to understand who is using which platforms, what is the purpose of use, how often they use them and what content is being consumed. To capture this more holistic definition of access to information we propose to expand the definition of access to information to include five dimensions and related sub-dimensions. We employ the term ICM - Information, Communication and Media resources - which incorporates both traditional mass media (radio, television and print) and newer platforms (internet and mobile) as well as informal resources such as "word of mouth".

Our approach: 5 dimensions of access to information

Our approach is based on an understanding of access to information as a composite measure of access to source/s or platform, exposure, evaluation, content and self-reported differences in citizens' reporting of the impact of their use of ICM resources (Power, Khatun and Debeljak 2011). The data that were used to test this approach were collected within InterMedia's AudienceScapes research initiative, which was co-funded by the Bill and Melinda Gates Foundation, and aimed to improve development outcomes through knowledge sharing and dissemination of research and analysis in a user-friendly format (see www.audiencescapes.org). Within the initiative, InterMedia conducted nationally representative surveys in Kenya, Ghana, Zambia and Tanzania, which focused on how the general population in all four countries obtains, shares and uses information on development-related issues. The analysis provided insights into understanding citizens' access to information in a number of key development areas, such as education, health, governance, agriculture and personal finance, and enabled the InterMedia team to examine our approach to ICM resources and the five dimensions of "Citizen Access to Information", discussed here.
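By way of illustration only, the sketch below shows one way a single survey respondent's ICM access could be represented across the five proposed dimensions and rolled up into a composite indicator. The field names, 0-1 scales and equal weighting are assumptions made for this sketch; they are not InterMedia's actual MDG Tracker instrument or scoring method.

```python
# Illustrative only: a hypothetical representation of the five proposed
# dimensions of "Citizen Access to Information" for one survey respondent,
# with a naive equal-weight composite score. All names and scales are
# assumptions for this sketch, not the authors' actual methodology.
from dataclasses import dataclass

@dataclass
class ICMAccessProfile:
    source: float      # Dimension 1: access to/ownership of a source or platform (0-1)
    exposure: float    # Dimension 2: frequency/recency of exposure to content (0-1)
    content: float     # Dimension 3: receipt of content on the specific sub-topic (0-1)
    evaluation: float  # Dimension 4: perceived trust/relevance of the content (0-1)
    response: float    # Dimension 5: self-reported response or behaviour change (0-1)

    def composite_score(self) -> float:
        """Equal-weight average across the five dimensions (assumed weighting)."""
        dims = [self.source, self.exposure, self.content, self.evaluation, self.response]
        return sum(dims) / len(dims)

# Example: a rural respondent who borrows a phone, hears a weekly radio
# programme on malaria, trusts it, but reports no change in behaviour.
respondent = ICMAccessProfile(source=0.5, exposure=0.6, content=0.8,
                              evaluation=0.9, response=0.0)
print(round(respondent.composite_score(), 2))  # 0.56
```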



Figure 1 Kenya: Types of Mobile Phone Access (own a phone; phone users who do not own a phone; purchase SIM cards to use in others' phones; did not use a phone in the past year). AudienceScapes National Survey of Kenya, July 2009 (N=2000 adults 15+)

Figure 2 Ghana: Types of Mobile Phone Access. AudienceScapes National Survey of Ghana, July 2009 (N=2051 adults 15+)

In this article we use examples from these four studies to demonstrate the variation in citizen access to information in developing countries, and illustrate the proposed dimensions and sub-dimensions of our approach.

Dimension 1: Access to source

The first dimension, access to a source, is a precondition for access to information and also a basic criterion for the other four dimensions proposed in our model. These sources include traditional media, i.e. television, radio and print media, newer platforms (mobile phones and internet), as well as non-media sources such as "word of mouth". When assessing citizens' access to individual sources, particularly to traditional media and newer platforms, it is however important not only to measure their overall access to these sources and platforms, but also to distinguish between those who have access and those who own the technology. The AudienceScapes research, for example, illustrates that while overall access to mobile phones in Kenya is almost universal (90% of all adults say they have used a mobile phone in the past year), a wide gap exists between those who personally own a mobile phone (60%) and those who borrow mobile phones from others (30%). A similar pattern also emerged in Ghana (Figure 1 and Figure 2). Citizens' consumption of information is also influenced by the place where they access various sources, and it is thus crucial to understand whether this access occurs in a public or a private space. For example, content on HIV/AIDS and reproductive health may be


more comfortably consumed by young people in the privacy of their home than in the internet cafe. Further, in a world of growing convergence between platforms it is necessary to accurately identify the medium of access. This can be particularly challenging in the case of mobile phones, which are now used not only for making calls and sending short messages, but also to listen to the radio, watch television or access the internet. Furthermore, mobile phones have in recent years become an increasingly important platform for sharing development focused content and even play a growing role in the delivery of various development services. For instance, in 2007 Safaricom launched a mobile money service M-Pesa, which over the past few years enabled millions of previously unbanked people to have access to financial services without needing to visit a bank branch (CGAP 2009). In November 2010, the Bill and Melinda Gates Foundation announced several grants to scientists who are pioneering the use of mobile phones to improve health care in poorer communities. Among others, grants were given to scientists who aim to develop a disposable malaria biosensor based on a SIM card platform, which will make diagnostic testing more widely available in remote areas. A grant was also given to a project aiming to develop a mobile-phone based tool which will quickly identify women at risk during labour and delivery and assist with emergency transfer to a hospital, thereby reducing maternal and infant mortality rates (Gates 2010). Finally, we need to take into account possible elements which may restrict access to sources and platforms at certain times, such as electricity cuts




or weak signals, which may result in the content being incomprehensible. For example, 24% of all Zambian adults who said they do not listen to the radio quoted the lack of radio signals as the main reason, while 27% cited problems with power.


Dimension 2: Exposure to Content


As mentioned, access to a medium or technology alone does not imply access to information. In the case of Zambia, about three quarters (73%) of all adults confirmed that they have a working radio in their household, yet considerably fewer - 56% - reported listening to the radio every day (see Figure 3). Four percent of those who do have a working radio in their household, on the other hand, said they never listen to it. Establishing the frequency of exposure (i.e. how often a person is consuming content), the amount of time exposed, the recency of exposure and the criteria for confirming accurate recall of the content and specific format is therefore critical in understanding how exposure to content relates to development outcomes.

Figure 3 Zambia: Frequency of radio use. AudienceScapes National Survey of Zambia, 2010 (N=2000 adults 15+)


Dimension 3: Content

Figure 4 % who never received information on health topics, or received it more than 12 months ago, in Ghana. AudienceScapes National Survey of Ghana, 2009 (N=2051 adults 15+)


Nevertheless, exposure to content does not guarantee that the content provided will be of specific value or interest to the citizen. To make sure that citizens' needs are met, a high level of specificity is required and it is imperative to identify the precise attributes of the content consumed by the citizens, namely a specific source, date, genre and the sub-topic.










For example, instead of asking the respondents about "health content" they might have heard on the radio, it is critical to ask about information on acute respiratory infection (ARI) which was provided during the programme for young mothers on Channel 4 on Friday at 6pm.

The importance of specificity is illustrated by an example from our AudienceScapes research in Ghana, which explored citizens' access to information on various health sub-topics. As Figure 4 shows, more than a third of all adults in Ghana never received any information on diarrhoea, or received it more than a year ago, a shocking result for a disease that still kills 1.5 million children under five every year and accounts for 16% of child deaths worldwide (UNICEF/WHO, 2009). On the other hand, Ghanaian citizens appear much better informed particularly about malaria, with only 7% claiming they never received any information on malaria, or received it more than a year ago. If respondents had been asked about the information they received about health topics in general, the results would no doubt be different, and the level of specificity, which enables an actionable response and filling the gaps where needed, would be significantly reduced.

Dimension 4: Evaluation of Content

To understand the potential of received information to effect individual and social change, it is necessary to establish measures of the quality of the content according to the citizen. How appealing does the citizen consider the content? How interesting, trustworthy, objective, diverse and relevant to them? AudienceScapes research in all four countries revealed that different types of sources enjoy different levels of trustworthiness for different topics. For example, in Kenya radio is perceived as the most trustworthy source of information on health issues, followed by medical doctors; health information obtained from these sources is trusted by more than nine in ten Kenyan adults. At the other end of the spectrum are traditional leaders and the internet, which are seen as trustworthy sources of health information by less than a third of all adult Kenyans (Figure 5).

Figure 5 % who find the source at least somewhat trustworthy for information on health issues in Kenya. AudienceScapes National Survey of Kenya, 2009 (N=2000 adults 15+)

Figure 6 Ghana: Access to information services and access to formal financial services (% who received information on formal financial services in the last year vs % who used formal financial services in the last year, by population sub-group). AudienceScapes National Survey in Ghana, 2009 (N=2051 adults 15+)

Table 1: Five dimensions of citizen access to information (Source: Power, Khatun and Debeljak, 2011)
Dimension 1 - Source or platform: Access v Ownership; Public v Private; Restricted Access (Time, Electricity, Signal); Medium
Dimension 2 - Exposure to content: Frequency; Amount of Time; Recency; Recall (format specific); Indexing (Dose Effect)
Dimension 3 - Content: Named Source; Genre; Topic v Subtopic; Date
Dimension 4 - Evaluation of content: Appeal; Interest; Trust; Diversity; Relevance; Objectivity
Dimension 5 - Self-reported response: Self-report on Political participation, Health, Education, Gender, Livelihoods

Dimension 5: Self-Reported Response

The final dimension of our model builds on the previous four dimensions and explores the relationship between reception of information on individual topics and self-reported behaviour. For example, to what extent do citizens identify that their knowledge levels, attitudes or self-reported decision-making or behaviour can be attributed to information they received from a particular source? Are those who are well informed about a specific topic also more likely to act on this information?

The AudienceScapes data on citizens' information and the use of formal financial services in Ghana, for example, show a positive correlation between the amount of information on formal financial services received by citizens and their actual use of these services. In addition, the data also reveal large variations between different population sub-groups, in terms of the amount of information they received, as well as their self-reported behaviour. Gender, location (urban/rural) and



level of education all play an important role in defining the amount of information different citizen groups have on formal financial services, and how likely they are to use them. Knowing these variations is crucial as it provides a more nuanced understanding of where the information gaps are, and where information resources need to be strengthened.

Access to information

The goal of the research programme proposed here is to provide a framework that employs citizens' access to information as the lens through which to understand the relationship between ICM resources and development outcomes, particularly in achieving the MDGs. The five-dimensional index described above (Table 1) is designed to capture the complexity of citizen access to information in developing countries and to provide policy makers and other development actors with a detailed understanding of the information gaps and needs among different population sub-groups. It facilitates input of the research findings into information dissemination campaigns, establishes clear benchmarks and tracks progress over time. It also enables a tailoring of communication activities to the information needs of specific population sub-groups and, by increasing the efficiency of these policy interventions, positively contributes to development outcomes and progress on the MDGs.

Samia Khatun is a Research Assistant at InterMedia in London supporting a range of assignments across Africa and Asia. She holds a BSc honours degree in Politics and Economics from Brunel University and an MSc in Political Economy of Development from the School of Oriental and African Studies (SOAS), University of London.

Klara Debeljak is a Project Manager at InterMedia in London. She works across multiple teams and has completed a range of qualitative and quantitative research projects in Africa, South Eastern Europe, Central Asia, and South-East Asia. She managed Bill & Melinda Gates Foundation-funded AudienceScapes research in Kenya, Zambia and Tanzania and authored the AudienceScapes report on Communicating with Policymakers about Development in Zambia.

Dr Gerry Power is Managing Director of InterMedia in London. Prior to joining the InterMedia team, Gerry served as Director of Research and Learning at the BBC World Service Trust, working across radio, television, internet and mobile platforms.

InterMedia (www.intermedia.org) is a research-based consultancy providing strategic guidance and insight into the behaviours and views of people globally, with particular expertise among hard-to-reach populations.

References
BBC World Service Trust (2006) African Media Development Initiative. London: BBC World Service Trust.
Besley, T. and Burgess, R. (2003) Halving Global Poverty. London: LSE, accessed 05/01/2011, http://econ.lse.ac.uk/staff/rburgess/wp/jep11.pdf
CGAP (2009) M-Pesa: Connecting Urban and Rural Communities. CGAP, accessed 29/12/2010, http://www.cgap.org/p/site/c/template.rc/1.26.11223/
Etzo, S. and Collender, G. (2010) "The Mobile Phone 'Revolution' in Africa: Rhetoric or Reality?", African Affairs, 109(437): 659-668.
Gates, B. (2010) Cell Phone Science. Bill & Melinda Gates Foundation, accessed 04/01/2011, http://www.gatesfoundation.org/foundationnotes/pages/bill-gates-cell-phone-science-101109.aspx
ITU (2009) Measuring the Information Society, ICT Development Index. Geneva: ITU.
ITU (2010) Measuring the Information Society, ICT Development Index. Geneva: ITU.
Pisani, E. (2009) The Wisdom of Whores: Bureaucrats, Brothels and the Business of AIDS. London: W.W. Norton and Company.
Power, G., Khatun, S. and Debeljak, K. (2011) Citizen Access to Information: Capturing the Evidence across Zambia, Ghana and Kenya. Manuscript submitted to Volkmer, I. (ed) Handbook of Global Media Research. London: Wiley Blackwell.
UNDP (2006) Communication for Empowerment: Developing Media Strategies in Support of Vulnerable Groups. New York: UNDP, accessed 20/10/2010, http://www.undp.org/oslocentre/docs06/Communicationforempowermentfinal.pdf
UNICEF/WHO (2009) Diarrhoea: Why children are still dying and what can be done. New York & Geneva: UNICEF and WHO.
Waage, J., Banerji, R., Campbell, O. et al. (2010) The Millennium Development Goals: A Cross-Sectoral Analysis and Principles for Goal Setting after 2015. Lancet, September 13th.
West, J. (2008) The Promise of Ubiquity: Mobile as Media Platform in the Global South. Paris: Internews Europe, accessed 23/11/2010, http://www.internews.eu/publications/promiseubiquity
Wilson, M., Warnock, K. and Schoemaker, E. (2007) At the Heart of Change: The Role of Communication in Sustainable Development. London: Panos.
World Bank (2003) ICT and the MDGs: A World Bank Perspective. Washington, DC: World Bank, accessed 06/10/2010, http://www-wds.worldbank.org/external/default/WDSContentServer/WDSP/IB/2004/09/15/000090341_20040915091312/Rendered/PDF/278770ICT010mdgs0Complete.pdf



by Richard Cadman

Margin squeeze: defining a reasonably efficient operator* Is this the biggest challenge in telecom margin squeeze?

What standard should a regulator or competition authority apply when determining if a dominant firm has been margin squeezing its competitors: the Equally Efficient Operator (EEO) or Reasonably Efficient Operator (REO)? Traditionally competition authorities, at least, have applied the EEO on the basis that economic efficiency is only served if competitors are at least as efficient as incumbent firms. However, the European Commission (EC) has on several occasions introduced the concept of a REO, most recently in its Recommendation on regulated access to Next Generation Access networks,1 where the EC writes: "In the specific context of ex ante price controls aiming to maintain effective competition between operators not benefiting from the same economies of scale and scope and having different unit network costs, a "reasonably efficient operator test" will normally be more appropriate." (paragraph 26) The problem is that a REO has not been properly defined, making it difficult for regulators to apply the test. In this article we seek to define a REO to meet two objectives. First, the REO standard should promote efficient entry in markets where there is a dominant firm. Secondly, the REO standard should be sufficiently transparent that it can be applied by the dominant firm when setting its own prices.

Policy and legal background

The EEO standard has emerged in the context of competition policy, with its emphasis on protecting consumers from the abuse of a dominant position that one or more firms might enjoy in a relevant market. Competition policy and law do not seek to introduce competition into previously monopolistic markets, but to protect consumers should one firm in a relevant market become dominant. Competition policy is concerned primarily with economic efficiency and so it follows that any competitor to a dominant firm should be at least as efficient. To use competition law to protect a less efficient competitor would be harmful to economic efficiency and consumer welfare. The various cases of alleged margin squeeze that have been examined by the courts and the European Commission under competition law have, therefore, expected the competitor to be equally efficient.

Regulatory policy, by contrast, has the objective of promoting sustainable competition, typically in previously monopolistic markets. In the European Union, the Common Regulatory Framework for the electronic communications market explicitly sets the objective of national regulatory authorities as promoting competition in Article 8.2 of the Framework Directive. Regulatory policy is applied under very different economic and market conditions from those of competition policy. Table 1 contrasts the economic circumstances in which competition and regulatory policy apply.

Table 1
Market economics - Competition policy: Normal market. Independent competitors. Market failure may occur if one firm becomes dominant. Regulatory policy: Incumbent former monopoly, usually still dominant in the upstream essential input which it provides to itself and downstream rivals.
Policy objective - Competition policy: Protect competition and consumers from the abuse of a dominant position. Regulatory policy: Promote sustainable competition where it has not historically existed.
Timing - Competition policy: Applied ex post, usually following a complaint of anti-competitive behaviour; only applied ex ante in the case of a merger which may create a dominant firm. Regulatory policy: Applied ex ante. Regulated firms may have property rights affected to ensure access by competitors to essential inputs on fair and reasonable terms.

The idea of a REO, which we also refer to as an explicitly ex ante margin squeeze test, has emerged because in a regulated market where competition is being introduced it would be difficult, if not impossible, for an entrant at the time of entry to be as efficient as the incumbent. As the European Commission suggests in the paragraph quoted above, an entrant will not benefit from the same economies of scale and scope as an incumbent. How then should a margin squeeze test be adapted to meet the REO standard, whilst not promoting inefficient entry and being sufficiently transparent for the incumbent to know what costs it should price against?

* This article is based on a consultancy project conducted by the author and Richard Carter for three Dutch electronic communications firms: BBNed, Online and Tele2. All views expressed are those of the authors. I thank Richard Carter for his comments on this article.
1 European Commission (2010) 'Commission Recommendation of 20th September 2010 on regulated access to Next Generation Access Networks' SEC (2010) 1037

What is an REO?

A margin squeeze occurs when the monopoly or dominant provider of an essential input, which is also active in the retail market, sets its price such that an efficient competitor cannot make a reasonable profit. The monopolist can either set the wholesale price too high, compared with its own retail price, or set its retail price too low. Formally, a margin squeeze is said to occur when the following condition is met:

R - (C + M) ≤ 0

Where R = revenue, C = costs of inputs, and M = retail margin. This formal definition is normally interpreted to mean that the competitor is at least as efficient as the incumbent, as both the costs and the margin are typically calculated on the basis of the incumbent's own costs. However, as the EC implies, competitors' lower economies of scale and scope may mean that the entrant cannot meet such a standard. In our proposed definition of a REO we consider four costs which an entrant bears but an incumbent does not, at least to the same extent, such that, even if the entrant is as efficient in all other aspects, it cannot be as efficient as the incumbent.

The costs we are considering here are only the costs of the retail operation of the incumbent and entrant. When we consider issues such as economies of scale, we are not concerned with the economies of scale of the upstream business where the incumbent has SMP, but only with those in the downstream retail business. Any benefits of economies of scale in the product where the incumbent is dominant should of course be passed on to the entrant through regulated prices.

Equivalence of Input

The starting point of any margin squeeze test must be the cost of the relevant input or inputs ('C' in the condition above). It is essential that the same input is used to calculate the margin in the test as that which is actually used by the entrant. Where a country has followed the UK model of Equivalence of Input (EOI), in which the incumbent is required to provide the same product internally and externally, this ought not to be a problem. However, where incumbents are free to use a different product themselves than that which is used externally, then the REO standard should be based on the input actually used by the entrant.

A margin squeeze test based on the input costs of the incumbent would therefore require the entrant to be more efficient than the incumbent to overcome the disadvantages of not having as extensive a network as the incumbent. As the cost of building a local access network is one of the most significant economic barriers to entry, regulators would be failing in their duty of promoting competition if the margin squeeze test is based on the incumbent's input costs rather than the entrant's.

To give an example: suppose a REO standard is being used to determine if there is a margin squeeze associated with voice calls. The incumbent operator is almost certain to have most "interconnection" at the local exchange level, which is the lowest cost point of interconnection. An entrant, however, may find it efficient to build out only to a proportion of local exchanges and so be more reliant on single and double tandem interconnection, and will therefore have a higher cost of interconnect than the incumbent, even if all other aspects of its operation are as efficient.
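By way of illustration, here is a minimal numerical sketch of the basic condition R - (C + M) ≤ 0, run first on the incumbent's own input cost and then on the input cost actually borne by the entrant. All figures are invented for the example and are not drawn from any real case.

```python
# Hypothetical illustration of the basic margin squeeze condition
# R - (C + M) <= 0; all figures are invented for the sketch.

def margin_squeezed(retail_price: float, input_cost: float, retail_margin: float) -> bool:
    """True if the test detects a margin squeeze: R - (C + M) <= 0."""
    return retail_price - (input_cost + retail_margin) <= 0

R = 30.0           # incumbent's downstream (retail) price
M = 8.0            # retail costs of an efficient downstream operation
C_incumbent = 20.0 # incumbent's own cost of the upstream input
C_entrant = 23.0   # entrant's actual input cost (e.g. more tandem interconnection)

print(margin_squeezed(R, C_incumbent, M))  # False: no squeeze measured on the incumbent's costs
print(margin_squeezed(R, C_entrant, M))    # True: a squeeze on the input the entrant actually uses
```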



Economies of scale

By definition a market entrant will produce at a lower scale than an incumbent. Thus, even if the entrant's cost curve is as efficient as the incumbent's, its lower volume means that its unit costs will be higher. Even if the entrant were to have a more efficient cost curve it would still have higher unit costs until the volume it produces was sufficiently high to challenge the incumbent.

However, as the incumbent will not know its rival's cost curve, we propose that a REO should be based on the cost curve of the incumbent, but adjusted for the entrant's lower production volume. This of course raises the question of what is the appropriate volume for a REO-based margin squeeze test. Our proposal is that unit costs should be based on a volume equivalent to a market share of 20%-25%. Thus if the market has total sales of 1,000 units, the unit costs should be based on the incumbent's cost curve assuming a production volume of 200-250 units. We propose this level of volume because academic research has shown that most of the benefits of competition accrue when there are four firms in the market and there is a diminishing return as more firms enter. In their seminal article, Bresnahan and Reiss2 explore entry and competition in concentrated markets, specifically various retail service markets in discrete geographic markets in the USA. Although their research does not relate directly to electronic communications markets, their findings are nevertheless interesting. Using an econometric model, they seek to measure how the level of profit changes with the entry of the nth firm in a market. Their analysis confirms their hypothesis that post-entry competition increases at a rate that decreases with the number of entrants and that most of the increase in competition comes with the entry of the second and third firms. "Our empirical results suggest that competitive conduct changes quickly as market size and the number of incumbents increase. […] Surprisingly, once a market has between three and five firms, the next entrant has little effect on competitive conduct."

2 Bresnahan, T. F. and Reiss, P.C. (1991) 'Entry and competition in concentrated markets', The Journal of Political Economy, Vol. 99, No. 5, pp. 977-1009.

Economies of scope

Just as the incumbent will benefit from economies of scale, so too will it benefit from economies of scope. It is almost certain that the incumbent will offer a wider range of products than the entrant. At the very least the incumbent will have both a wholesale and a retail product, whereas the entrant will only offer a retail product. Economies of scope allow a firm to spread overhead costs over a wider range of products when those costs grow non-linearly with each additional product.


To provide a simple example, suppose the incumbent sells two products (wholesale broadband access and retail broadband access) and the entrant sells only retail broadband access and buys in wholesale access from the incumbent. Without economies of scope the incumbent has twice the level of overheads as the entrant and it divides them equally between the two products. If the incumbent's total overheads are €2 it would assign €1 to each product. The entrant has the same level of overhead per product and so has overhead of €1. Each firm's retail price is then equally affected by the level of overheads. The entrant has its own overheads of €1 plus €1 of overheads included in the wholesale price. Now suppose that the incumbent enjoys some economies of scope such that its total overheads are less than twice the level of the entrant. For example, say that its total overheads are now €1.8. Again it divides its overheads equally between the two products: €0.9 to each of the upstream and downstream product. Now the incumbent's retail price includes its total overheads (€1.8) and the entrant's retail price includes half the incumbent's overheads included in the wholesale price (€0.9) plus all of its own, higher, overheads (€1) in its retail margin. Once again, even if the entrant is equally efficient in all other aspects, the simple fact that it has fewer products to allocate overheads to means that it is at a competitive disadvantage: the entrant has to cover total overheads of €1.9 whereas the incumbent has to cover total overheads of €1.8. One way to account for economies of scope in the REO standard would be to spread the incumbent's overheads across its competitive products only, rather than its competitive and SMP products, thus placing it on the same footing as an efficient entrant.
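A quick arithmetic check of the worked example above, using the same figures, together with the proposed adjustment of loading the incumbent's overheads onto its competitive products only. The code simply restates the illustration; it is not an empirical calculation.

```python
# Worked check of the economies-of-scope illustration; all figures come from
# the example in the text, and the "adjusted" case sketches the proposed remedy.

incumbent_overheads = 1.8   # euros in total, reflecting economies of scope
entrant_overheads = 1.0     # euros, retail product only

# Incumbent allocates overheads equally across its wholesale and retail products.
overhead_in_wholesale_price = incumbent_overheads / 2   # 0.9
overhead_in_incumbent_retail = incumbent_overheads / 2  # 0.9

# Total overheads each firm must recover through its retail price:
incumbent_total = overhead_in_wholesale_price + overhead_in_incumbent_retail  # 1.8
entrant_total = overhead_in_wholesale_price + entrant_overheads               # 1.9
print(round(incumbent_total, 2), round(entrant_total, 2))  # 1.8 1.9 -> equally efficient entrant still disadvantaged

# Proposed adjustment for the REO test: allocate the incumbent's overheads to
# its competitive (retail) products only when calculating the margin M, so the
# test margin carries the full 1.8 of overhead rather than 0.9.
margin_overhead_for_test = incumbent_overheads
print(margin_overhead_for_test)  # 1.8
```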



Search and switch costs

Last, but by no means least, we turn to the question of compensating the consumer for any search and switch costs he or she incurs when changing supplier. It is a reasonable assumption that there is greater consumer awareness of an incumbent than there is of an entrant, even when consumers are large businesses themselves. After all, by definition the incumbent is already in the market, whilst an entrant may be a recent newcomer. Consumers and businesses considering changing suppliers will incur costs of finding out about potential alternatives and, if they then change supplier, may incur direct costs of actually changing suppliers: so-called search and switch costs. New entrants need to compensate consumers for such costs either through some form of discount or through extensive advertising to lower search costs in the first place. Entrants may also have to absorb some of the switch costs themselves by providing migration services free of charge. Over and above the direct search and switch costs are what might be considered psychological switch costs, or the "nobody ever got fired for buying the incumbent" principle. Risk-averse customers may prefer the incumbent simply because it is the incumbent and so would require even greater compensation from an entrant to buy from the entrant. Search and switch costs are likely to be lower for consumers and businesses buying from the incumbent firm and so it may incur much lower costs to compensate consumers. Thus, even if an entrant was as efficient as the incumbent in all other aspects of its business it would still incur higher retail costs than the incumbent. Unless these costs are allowed for in a margin squeeze test, the entrant will be disadvantaged and competition will not be promoted.

The obvious next question is what level of search and switch costs should be accounted for in a REO standard? This level could be calculated empirically through observing prices in the market and through research surveys. However, there are some useful indicators already available.

The Dutch electronic communications market regulator, OPTA, has implicitly recognised that the additional marketing costs incurred by entrants are in the region of 5-10%. In its decision on fixed telephony markets of December 2008, OPTA stated:

OPTA acknowledges that having the disposition of a last mile network is not sufficient for successful market entry. A solid customer case, reputation and name are indispensable. For historic reasons KPN has an extensive customer base, a well known brand and a reliable reputation. [OPTA note: In the context of WLR OPTA acknowledges that alternative providers can only set a retail price 5 to 10% less than the KPN retail price.] A significant part of the KPN customer base is therefore very loyal and not sensible for incentives to switch to another provider (end user inertia). Therefore KPN is able to ask a 'price premium', i.e. the customer is prepared to pay a higher price to KPN for the same service than to other providers.3

In a different market, the UK energy regulator, Ofgem, established that regional incumbents can maintain a six to ten percent average price differential over competitive suppliers. Ofgem found no cost basis for this premium.4 The actual level of discount used to compensate for search and switch costs may be considered on a market-by-market basis. Where the vertically integrated firm is much larger than its competitors, the discount may have to be higher, with the discount falling, and potentially being set at 0%, as the market shares of firms converge.

3 Reference in Dutch, Para 413. Informal translation.
4 Ofgem (2008) Energy Supply Probe: Summary of initial findings and remedies.

Timing

Adjustments to a margin squeeze test to reflect the costs of a reasonably efficient entrant should only persist for the time within which the entrant may be expected to achieve sufficient scale and consumer acceptance such that it can become equally efficient. At this point, the advantages of incumbency may be considered to have been removed and the market to have become normally competitive. If the adjustments continue beyond this time, inefficient entry may be encouraged and consumer welfare may be harmed, as regulatory protection of entrants may keep downstream prices above the competitive level. How long should such a period be? We have considered two options. Under the first option an ex ante margin squeeze test would be based on a REO for a fixed period, say three or five years. However, we have rejected this option as it requires



the NRA to judge in advance how quickly a market will become effectively competitive. Our preferred option is to hold a periodic review of the need for continuing with an ex ante margin squeeze test in the light of competitive developments in the market and the metrics to be used. Such a review could be tied in with the market review cycle, in much the same way as price controls are. This option allows the NRA to make a pragmatic decision about whether competitors have benefited from a period of promotion of competition and whether their market position is a result of their own actions or whether market failures remain that can be corrected through the presence of an ex ante margin squeeze test.

The key criterion for whether the REO standard can be replaced with the EEO standard is whether effective competition in the downstream market is sustainable in the absence of ex ante regulation in the immediate upstream market. For example, suppose that the retail broadband access market was being examined. If the Wholesale Broadband Access (WBA) market is effectively competitive and not subject to ex ante regulation, then a margin squeeze test between these two markets could revert to the ex post EEO standard. However, if WBA is only competitive because the incumbent is required to provide unbundled access in the Wholesale Network Infrastructure Access (WNIA) market, then the REO standard would need to be maintained in any margin squeeze test between WNIA and WBA products, unless the dominant upstream firm was not active downstream. Significant Market Power (SMP) in the upstream market would otherwise allow the SMP operator to leverage that dominance into the downstream market.

Conclusion: A formal definition

Earlier we formally described a margin squeeze test as:

R - (C + M) ≤ 0

Where R is the downstream price, C is the cost of the upstream input and M is the margin, equivalent to the costs of an equally efficient downstream operator. We have argued in this article that a margin squeeze test using the REO standard should be based on the input used by the entrant and


adjusted for consumers’ search and switch costs, for which an entrant needs to compensate, and for the incumbent’s economies of scale and scope. On this basis, the ex ante margin squeeze test can be presented formally as:

(R x (1 - D)) - (C_E + M) ≤ 0
Volume ≈ X% market share
Scope ≈ Y products

Where the additional variable D is the discount, in percent, that entrants need to offer consumers to compensate for search and switch costs and to overcome the risk premium faced by the entrant. The subscript E (C_E) refers to the input cost to the entrant. The scale and scope of production are also explicitly stated: volume equivalent to a market share of X%, and scope equal to average overheads for Y products. We have proposed various values to be applied to the economies of scale and the discount for search and switch costs. Bringing those values into the condition set out above, we propose that a REO standard should use the following inequality as the margin squeeze test:

(R x (1 - D)) - (C_E + M) ≤ 0, with D = 0.05 to 0.10
Volume ≈ 20%-25% market share
Scope ≈ competitive products only
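For illustration, a minimal sketch of the proposed REO condition with the suggested discount range applied to invented prices and costs; the retail margin is assumed to have already been taken from the incumbent's cost curve at a 20%-25% market-share volume, with overheads spread over competitive products only, as proposed above.

```python
# Hypothetical sketch of the proposed ex ante REO margin squeeze test:
# (R x (1 - D)) - (C_E + M) <= 0, with D = 0.05-0.10. All figures invented.

def reo_margin_squeezed(R: float, C_entrant: float, M_adjusted: float, D: float = 0.05) -> bool:
    """True if the REO test detects a margin squeeze."""
    return R * (1 - D) - (C_entrant + M_adjusted) <= 0

# Assumed inputs for the sketch:
R = 30.0          # incumbent's retail price
C_entrant = 21.0  # cost of the upstream input actually used by the entrant
M_adjusted = 7.5  # retail margin from the incumbent's cost curve at ~20-25% of market
                  # volume, with overheads loaded onto competitive products only

for D in (0.05, 0.10):
    print(D, reo_margin_squeezed(R, C_entrant, M_adjusted, D))
# D=0.05: 28.5 - 28.5 = 0    -> True (squeeze at the margin)
# D=0.10: 27.0 - 28.5 = -1.5 -> True (squeeze)
```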

This condition, which applies to the immediately downstream market, should remain in place so long as a firm is dominant in the upstream market and operates in the downstream market and therefore has the ability to leverage its dominance. Our REO standard is designed to promote competition, in line with regulators' objectives under Article 8.2 of the Framework Directive, but also to be transparent such that incumbents can set a price using their own cost curve in the retail market. This test therefore provides advantages to all players in the market: regulators, entrants, incumbents and, most importantly, consumers who will benefit from increased competition over time.

Further development may be needed to refine the parameters of the test so that ex ante margin squeeze cases based on the REO standard do not get mired in legalistic debates about, for example, definitions of market share. However, we hope that our proposed REO standard can be used as a basis for further discussion and development.

Richard Cadman is Director, SPC Network Ltd. and is completing a PhD at the ESRC Centre for Competition Policy, University of East Anglia, UK



by Pamela McDonald and Iain Connor

When are companies liable? Social media may be an ongoing risk in the workplace

For businesses, it is hard to ignore the statistics. Online promotion of a brand through social media is one of the most effective forms of marketing that exists today. Many businesses use social media websites and blogs as a means of communicating with their customers and raising brand awareness. It is easy to see the attraction. It humanises brands, enables companies to gain rapid feedback on products and ideas in order to develop them, and so fosters a sense of collaboration between the company and the consumer. And all of this for the relatively small price of a website and a website designer. But the ability of Joe Bloggs to post comments on website bulletin boards, in chat rooms and on blogs owned by businesses opens the way for those businesses to be held liable if they are not alive to the potential pitfalls which can arise when users post comments on their sites which fall within the realms of defamation and the business has been monitoring, modifying or editing the content.

When chatter turns libellous

In the case of Smith v ADVFN plc & Others the court held that comments made on bulletin boards are "like contributions to a casual conversation" and "are much more akin to slanders" than libel. While courts may appear to give less weight to chatter and gossip on blogs than to defamatory statements published in books or articles, the truth is that, no matter what the forum, if users publish disparaging, untrue allegations which damage or undermine a person's or business's reputation then a claim for defamation could be made. There is no requirement for the claimant to prove that he has suffered actual damage or, in certain circumstances, that the defamatory words are false. The claim can be made against both the author of the defamatory statement and, if the statement is made on a website, the company which owns the website as "publishers". When dealing with bulletin boards, the actual blogger is likely to have limited resources to pay damages and, given the inherent cost of identifying the blogger, a quicker and more effective remedy is likely to lie against the business. Further, the business which owns the website usually has far deeper pockets, which increases the claimant's chance of recovering damages. Any litigation is concerning for business, but defamation on the internet is particularly high risk. This is because the law is still evolving in this area, which means the court can make assumptions when assessing how widely the offending material was read (when assessing damages) and, to the extent defences such as "hosting" are available, apply them strictly.

Although the law provides some protection for website hosts who have not actively published site content, the courts are closely considering the criteria to be met for this protection to apply. To be afforded a defence against defamation and other claims, it seems the best advice to website hosts is to do nothing at all to moderate, modify or edit user-generated content ("UGC"), despite how tempting it may be to tweak posts. The choice for a business is to either moderate diligently to keep all potentially libellous posts off the site, or not to touch the site at all.

The Free Speech debate

It is difficult to strike the balance between freedom of expression and the right of individuals and businesses not to have their reputations unfairly harmed by defamatory statements. At the moment, both sides of the free speech debate claim that current defamation laws are unfair. Big businesses and rich individuals are accused of stifling freedom of expression with the prospect of costly litigation, whereas businesses say individuals hold all the cards because of the way litigation funding works. For example, the wide availability of 'no win no fee' funding arrangements makes it easier for claimants to bring claims, given that if they lose, no payment is due. However, the extent to which these funding arrangements will be available in the future is uncertain given the European court's recent ruling that they impinge on the right to freedom of expression (provided by article 10 of the European Convention on Human Rights). Generally though, Britain is seen as one of the most claimant-friendly defamation jurisdictions in the world. Despite views to the contrary, it is not hard to find examples of 'libel tourism' where people from foreign jurisdictions appear to take advantage of Britain's libel laws by

issuing proceedings before the English courts. For example, in 1996 Boris Berezovsky, a Russian oligarch, issued proceedings against Forbes Magazine for publishing an article entitled “Godfather of the Kremlin”. Berezovsky won the case even though Forbes is based in New York and sold less than 2,000 (of the nearly 800,000 offending magazines) in the UK. The growing concern for open-ended liability for defamation claims was recently given prominence by British Deputy Prime Minister Nick Clegg who promised to publish a draft Defamation Bill in Spring 2011. Clegg’s speech made clear that freedom of speech was to be bolstered in order to support investigative journalism and scientific debate. Clegg promised that the new Bill would “provide a new statutory defence for those speaking out in the public interest, whether they be big broadcasters or the humble blogger”. The Bill will also clarify the current defences of fair comment and justification giving defendants to defamation claims more protection. It remains to be seen whether this new wave of protection for bloggers will impact on businesses that use social media. One thing that is clear is that the protection afforded to businesses by the hosting defence given under the E-Commerce Regulations has, in recent case law, been carefully defined meaning companies should pay close attention to the extent to which they moderate their sites and the policies and procedures they have in place to deal with defamatory posts.

The Hosting Defence

The starting point of a defamation claim is that the website owner is a "publisher" and therefore could be liable. However, regulation 19 of the E-Commerce Regulations 2002 provides a



defence for businesses who provide 'information society services' (i.e. websites) which consist of the storage of information. Businesses which own their own websites and provide a platform for users to publish ("hosts") are distinguished from ISPs which merely act as conduits to transmit information (which are protected under regulation 17). Hosts are immune from liability for defamation claims provided they have no knowledge of the unlawful activity or, upon receiving knowledge, act expeditiously to remove the offensive post. This is mirrored by the innocent dissemination defence in Section 1 of the Defamation Act 1996, which protects secondary publishers of information. Section 1 protects hosts so long as they have not published the defamatory statement themselves and have not been made aware that their website is facilitating the publication of a defamatory statement.

The growth in popularity of social media has meant that any business that enables blogging, bulletin boards, chat rooms and the like is reliant on the hosting defence to protect it from claims of defamation resulting from comments that are posted on its site by its users. However, the core principles of defamation law and its defences have not changed. The law applies in the same way to printed material as it does to online media, despite the growth of technology and social media. Whilst the hosting defence appears to provide adequate protection to businesses that offer such sites, its application by the court shows that website hosts must exercise caution in the extent to which they intervene and edit the content posted on their site.

Case law

The infamous Godfrey v Demon Internet case was the first case in the UK to hold an ISP liable for the content of the site it hosted. The court made clear that where such posts are published, website hosts will find themselves in "insuperable difficulty" if the posts are not removed "expeditiously" upon notification that the posts break the law. The defendant in this action was not afforded protection because they took time to investigate the complaint and so were deemed to be aware that they were facilitating the publication of a defamatory statement.

This case was followed by an application for summary judgment in Karim v Newsquest Media Group, which concerned comments posted on the bulletin board of the defendant's website relating to a newspaper article entitled "Crooked solicitors spend client money on a Rolex, loose women and drink". The article qualified for absolute privilege in law (a complete defence which attaches to words spoken in legal proceedings and reports arising from them) and the case was thrown out after the court found that the story was published contemporaneously and was a substantially fair and accurate account of tribunal proceedings in which Mr Karim was struck off the Law Society roll after being found guilty of dishonesty. However, in addition, as Newsquest (the website host) removed the offensive posts as soon as it was alerted to them, Mr Justice Eady held that this was sufficient to afford them the protection of the hosting defence. The ruling suggests that publishers of UGC (which included posts in chat rooms) will not be held responsible for potentially libellous material posted on their company website so long as the post is removed as soon as possible. In this case, that was done on the same day that Newsquest received the complaint.

By way of contrast, website owners should consider Mr Justice Stadlen's judgment in Kaschke v Hilton and Gray, which related to an allegedly defamatory blog by John Gray (the first defendant) on the website LabourHome.org, which was operated by Alex Hilton, the second defendant. The blog suggested that the claimant was previously linked to a terrorist group which was responsible for deaths in Germany in the 1970s. Mr Justice Stadlen held that the hosting defence would fail if Mr Hilton had interfered with the blogs in any way. The High Court assessed how far exemptions for service providers should go under the E-Commerce Directive and provides helpful guidance on the level of intervention the court will allow before it denies website owners the defence. Mr Hilton had initiated a system under which entries on the website could be, and were, described as 'recommended'. The court considered that because he could himself adjust the score required to place a post "in a more prominent position than they would otherwise be on the website" he was taken to have accepted editorial control over the content, or responsibility for it. More concerning still was the court's finding that amending spelling and grammar, and, "on a few occasions remov[ing] blog posts


www.iicom.org March 2011 Volume 39 Issue 1

on grounds of bad language, political provocation or offensiveness falling short of defamation” went beyond the mere storage of information anticipated by the hosting defence and as such the defence was unavailable. This was despite legal statements on the website warning bloggers that they are responsible for any libellous statements made and Mr Hilton’s diligence in removing posts upon it being first complained of.

Some good news?

Whilst Kaschke restricts the extent to which companies can take control over their websites and still rely on the hosting defence, the judgment did go some way to assist businesses that would like to take control over part of their website and still be protected by the defence. The court restricted the target of the defence "by reference not to the website as a whole or the homepage or even the general storage of blog posts on web pages made available on the website" but to the specific blog complained of. It follows that if businesses do find themselves on the receiving end of a claim for defamation because of UGC, the defence might, depending on the level of intervention in respect of the blog complained of, be available even if the business has been operating the homepage, the website as a whole and all the other blogs on it in a manner which goes beyond mere storage. The downside, however, is that when considering the blog complained of, the court is entitled to draw inferences from evidence submitted relating to the host's treatment of the website as a whole. This has not yet been tested in the courts, so it remains to be seen how much weight the court will place on evidence of active historical intervention going beyond mere storage as grounds for denying businesses the defence.


So, should businesses moderate their sites or not?

Kaschke provided a warning for businesses making use of social media that the hosting defence may not be as wide a protective shield as it appears. When a site is moderated, either before content appears or shortly thereafter, the operator of the site assumes responsibility for the material that appears. The more businesses moderate, intervene and take editorial control over users' comments posted on their sites, the more likely they are to be held liable if defamatory comments are made, as they will be seen to have enabled, facilitated and contributed to the defamation.

The E-Commerce Directive explicitly prohibits member states from imposing an obligation on website hosts to monitor the information which they transmit or store. So there is no general requirement to monitor websites, and there is less risk of being denied a defence in law if hosts do not do so. This might be seen as an incentive for businesses not to monitor their sites, despite the risk of offensive or defamatory comments being posted. Further, the E-Commerce Directive only blocks the defence where the host has (or should have) "knowledge of unlawful activity", so knowledge of an offensive (rather than defamatory) post does not prevent the defence being used. On a practical level, full-scale monitoring and moderation of websites, whether pre- or post-publication, is expensive and labour-intensive.

However, the inherent nature of certain businesses means that the risk of creating a site which facilitates UGC without moderation is simply too high. Brands that market to children, for example, must consider the safety of their users and must not provide a platform for bullying or enable children to post information which would identify them personally. Even businesses which do not operate in sensitive markets may wish to moderate their sites in order to maintain their brand's reputation and ensure the site remains useful for its intended purpose. After all, a site which is flooded with irrelevant posts or spam will deter users from coming back.

Despite the harm it may cause businesses not to moderate their websites, the message from the courts is that intervention comes with a risk of losing the protection of the hosting defence. Many businesses simply choose to accept this risk because the alternative is too damaging to their brand. Was Starbucks, for example, really faced with this dilemma last year when several racist comments were posted on its Facebook page? It acted fast to remove the offensive posts, which most would agree was the right thing to do, regardless of the potential for later arguments about whether or not it was moderating the site. In addition, actively moderating and having a clear notice and takedown policy is likely to mitigate a claim for damages even if it does not provide a complete defence.

One method that businesses are increasingly using to protect themselves from the damage that can be done by defamatory comments posted on their websites is to employ the services of companies such as Disqus, which monitor comments posted on their clients' websites. This is a helpful management tool for businesses not able (or willing) to deal with the voluminous and often complex posts made by their users. Employing a third party to manage UGC can shift the liability on to the monitor, who should indemnify the business for any failure to remove content expeditiously.

Even with a third party moderating a business's website, there are still risks of claims being brought. So what can businesses do to protect themselves from libel claims? The following guidance applies to businesses whether or not they choose to moderate their websites; an illustrative sketch of how some of these measures might be automated follows the list.

Protective measures

»»Have a good notification system in place (such as a 'Report Abuse' button) and set a response time for removing posts that have been subject to complaint.

»»Disable commenting on articles or blogs which are particularly contentious or emotive, and implement software preventing profanities and spam from being posted (rather than doing this manually).
»»Alert users to the fact that user-generated content does not necessarily reflect the views of the company and that users are responsible for any comments they make regarding people or businesses.
»»Disclaim liability for any posts and reserve the right to remove posts without explanation or consultation.
»»Enforce a compulsory registration system, subject to acceptance of terms, which requires users to provide an email address and username before posting, to encourage accountability. For example, The Guardian and The Independent newspapers employ Disqus to manage their websites in this way.
»»Have social media principles and policies which state either that the company does not check comments before they appear or, alternatively, that no material is uploaded until it has been moderated (although the latter may restrict the relationship between the business and its customers).
»»Keep a record of all posts removed.
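The sketch referred to above follows. It is a hypothetical illustration of how the notification, filtering, registration and record-keeping measures might be automated; the names, thresholds and word list are invented for the example rather than drawn from the article, and nothing here is legal advice.

```python
# A minimal, hypothetical sketch of automating some of the measures above.
# RESPONSE_DEADLINE, BANNED_PATTERN and the other names are invented for
# illustration; a real site would adapt this to its own platform and policy.

import re
from datetime import datetime, timedelta

RESPONSE_DEADLINE = timedelta(hours=24)  # assumed internal target for reviewing complaints
BANNED_PATTERN = re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE)  # placeholder word list

removal_log: list[dict] = []  # keep a record of all posts flagged or removed

def accept_post(registered: bool, accepted_terms: bool, text: str) -> bool:
    """Compulsory registration plus automatic profanity/spam filtering before posting."""
    if not (registered and accepted_terms):
        return False
    return not BANNED_PATTERN.search(text)

def handle_abuse_report(post_id: str, reported_at: datetime) -> datetime:
    """'Report Abuse' workflow: log the complaint and schedule a removal review
    within the response deadline, so the site can show it acted expeditiously."""
    review_by = reported_at + RESPONSE_DEADLINE
    removal_log.append({"post_id": post_id,
                        "reported_at": reported_at.isoformat(),
                        "review_by": review_by.isoformat()})
    return review_by
```

The key point the code illustrates is the audit trail: recording when a complaint arrived and when the post came down is what allows a host to show it acted expeditiously.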


Any litigation is concerning for business, but defamation on the internet is particularly high risk because, as explained above, the law is still evolving in this area. It remains to be seen how the law will develop in light of the Nick Clegg reforms and the implications this will have for businesses. Until further clarity is provided, businesses which choose to use social media should be alert to the potential issues that may arise and have a clear policy in place.

Pamela McDonald and Iain Connor work in the intellectual property team at international law firm Pinsent Masons, providing advice on the development of IP strategy and all aspects of its implementation. Iain is a partner and specialises in contentious intellectual property matters, advising on all aspects of High Court litigation and dispute resolution. He has a broad range of intellectual property experience, having worked on matters involving the infringement of copyright, database rights, design rights, moral rights, trade marks and passing off.

IIC TMF Washington 2010: Prospects for cloud computing

Washington, DC – Speakers at the Washington, DC, IIC Telecommunications and Media Forum's session on cloud computing all seemed to agree that cloud computing, while recently emerging as a new buzzword in the world of information technology, is merely a modern form of network-based computing that has existed for several years. The national security and individual privacy issues that cloud computing raises, therefore, have been, and can continue to be, addressed using traditional government policies and approaches. The general consensus of the speakers, however, was that the best approach to safeguarding both national security and individual privacy in relation to cross-border data flows will be through the implementation of new multi-stakeholder agreements.

Growth and drivers

Eric H. Loeb, Vice President of the International External and Regulatory Affairs Team of AT&T Services, began the presentation by defining cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." Mr. Loeb explained that the "cloud" is defined by the amount of control the user has over resources, and that it is important to view the "cloud" as part of a continuum. Examples of cloud computing services include software (e.g. email), platform (e.g. sales services such as force.com) and infrastructure (e.g. network and storage), each of which has varying degrees of user control. As the cloud grows due to more pervasive broadband, processing power and storage capacity will also grow.

The recent growth in cloud computing services has been driven by the many benefits it provides to consumers, small businesses and large enterprises, such as quick access to scalable computing resources. From the consumer perspective, cloud computing is attractive because it gives users the option to pay as they use and enables them to have their content backed up and available anywhere. Similarly, entrepreneurs and small businesses with innovative ideas but limited infrastructure and resources see cloud computing as a relatively inexpensive means of accessing a virtually infinite amount of computing resources. Mr. Loeb added that cloud computing also makes it feasible for developing countries to implement e-Gov initiatives that need to scale over time, e.g., e-Health and e-Education. From a large enterprise perspective, data mobility and cloud computing services have become essential to operations. Sheba Chacko of Global Operational and Americas Regulation, BT, explained that large corporations have been circulating data globally for years and would find it very difficult to function without doing so. For example, companies offering 24-hour customer support often need employees in different parts of the world to address consumer questions and complaints in a timely manner.

Costs and concerns

The benefits of cloud computing, however, do not come without certain costs and concerns. AT&T, for instance, has spent several billion dollars investing in its super Internet data centers around the world. In addition, cloud computing raises various consumer concerns, such as loss of control over personal data. Justin Brookman, Senior Resident Fellow at the Center for Democracy and Technology, explained that consumer concerns related to cloud computing are magnified by the fact that there are very few regulations or policies in place to protect consumer privacy and data stored in the "cloud." For example, governments are not required to obtain a subpoena in order to gain access to a user's data stored in the "cloud" or by a third party, unlike data stored on a consumer's hard drive. Similarly, consumers have little say in what providers are allowed to do with their data. Government officials could, for instance, place pressure on a provider to relinquish a user's data. Moreover, providers could use stored data for advertising purposes and for third party solicitations. Without stronger consumer protection laws in place, Mr. Brookman noted, consumers will be hesitant to adopt additional cloud computing services.



Government responses

The laws and policies that have been enacted to address cloud computing and data transfers have generally been put in place to protect the government. Stewart A. Baker, a Partner at Steptoe & Johnson LLP, enumerated what he deemed to be the "two and a half" government responses and approaches to protecting national sovereignty in the face of data mobility. First, governments have traditionally taken the approach of prohibiting data from traveling from their country to a foreign country, citing security reasons. Most nations, however, do not view this as a feasible option in the long run because data will inevitably travel to other countries. A second approach has been to treat data according to the rules of the location where the hardware is owned. Finally, pursuant to the "half" method, governments have attempted to regulate other jurisdictions, for example by fining providers based in one country for violations of another country's laws. Mr. Baker noted that this "half" approach is currently employed by the European Union.

Multi-stakeholder approach

Daniel J. Weitzner, Associate Administrator at NTIA's Office of Policy Analysis and Development, explained that cloud computing services arise out of a unique set of institutions. Thus, it is important to understand how these institutions function in order to create a policy to address the multiple issues that arise out of cross-border data flows. The more stakeholders are actively and genuinely engaged, the easier it will be for others to adopt and adhere to these agreements. Although some parties suggest that the multi-stakeholder approach is too slow, once the agreements are in place it will be much easier to make changes and amendments than with formal treaties. Harriet P. Pearson, Vice President, Security Counsel and Chief Privacy Officer of IBM Corporation, noted that the U.S. was once very isolated and protective of its data. Now that the nation is committed to sharing information and working with others, it will be easier to form multilateral coalitions and discuss appropriate policies to address accountability, transparency and jurisdiction.

The more transparent the process is, the more confidence consumers will have in cloud computing services. Mr. Weitzner indicated that the Department of Commerce's Internet Policy Taskforce will soon issue a report addressing data privacy issues. In the meantime, he stressed the importance of ensuring that interested stakeholders, such as companies, governments and consumers, are engaged in the process and willing to enter into voluntary negotiations. Ms. Pearson specifically encouraged private companies to begin developing best practices.

None of the speakers could predict what policies the U.S. will adopt, but all agreed that the best approach to safeguarding both national security and individual privacy will be through the implementation of multi-stakeholder agreements.

Ann LaFrance and Angela Kung, Squire, Sanders & Dempsey L.L.P.



IIC TMF Washington 2010: Meeting spectrum demand

Washington DC – At the IIC Telecommunications and Media Forum in Washington in December 2010, the panelists explored ways to address the challenge of rising spectrum demand. There was consensus that more spectrum is needed to meet growing demand for mobile broadband. Several approaches to meeting that demand were discussed, including technical solutions, spectrum management, and re-purposing spectrum.

Demand for spectrum

The rising demand for mobile internet access and the predicted increase in machine-to-machine communications are creating a need for more spectrum. The U.S. is moving from a spectrum surplus in 2009 to a projected spectrum deficit in 2013. An October 2010 Federal Communications Commission (FCC) Staff Technical Paper projected that U.S. mobile data usage in 2014 will be 35 times 2009 levels. The expected shortage of spectrum for wireless broadband services is a major concern for the wireless industry. Professor Thomas Hazlett of George Mason University stated that we are in a mobile data crisis. Jim Patrick of the Canadian Wireless Telecommunications Association said there is not enough wireless capacity to meet emerging demand and noted that Industry Canada recently launched a consultation focusing on this matter.

The panelists discussed multiple reasons for the increase in demand. The number of mobile broadband users is increasing, and there has been significant growth in the number of wireless connections over the past 10 years. There is more mobile data usage per user, and smartphone users use more data. Part of the demand is due to new devices that did not exist two years ago, such as e-readers (e.g., the Kindle and the Nook), tablet computers (e.g., the iPad and the Galaxy), and wireless health devices. Use of mobile devices for apps, video, gaming, and internet access is leading to increased demand.
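As a rough consistency check on that projection (the arithmetic here is ours, not a figure from the FCC paper), a 35-fold increase over the five years from 2009 to 2014 implies a compound annual growth rate of a little over 100 percent:

```latex
\[
  \left(\frac{\text{usage}_{2014}}{\text{usage}_{2009}}\right)^{1/5} - 1
  \;=\; 35^{1/5} - 1 \;\approx\; 1.04
\]
```

In other words, mobile data usage roughly doubling every year over the period.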

Solutions

The panelists explored a number of solutions to address the rising demand and the need for additional spectrum. Professor Hazlett said we need to consider which strategies can be used to get maximum bandwidth into the market and divert that bandwidth to its highest-valued use. Technological developments have the potential to maximize the efficient use of spectrum and contribute to alleviating spectrum demand. Mr. Patrick presented several options for network management, including more cell sites, modulation and propagation schemes, and different internet protocols. Another option is traffic management practices, but that gets close to the network neutrality debate.

Dean Brenner of Qualcomm said technological solutions can help address the demand for spectrum, but will not meet this demand on their own. He described Qualcomm's research and development efforts on supplemental downlink, which takes unpaired spectrum and uses it to provide additional downlink capacity for paired spectrum in another band. He noted that two-thirds of traffic on a mobile system is on the downlink. Technology for supplemental downlink is standardized for HSPA, and Qualcomm is working on it for LTE. Mr. Brenner also discussed spectrum sharing techniques and femtocells.
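To see why unpaired, downlink-only spectrum is attractive, consider a small worked illustration based only on the two-thirds share quoted above: with total traffic T,

```latex
\[
  \text{downlink} = \tfrac{2}{3}\,T, \qquad
  \text{uplink}   = \tfrac{1}{3}\,T, \qquad
  \frac{\text{downlink}}{\text{uplink}} = 2 .
\]
```

With symmetric paired spectrum, the downlink therefore becomes the binding constraint first, which is why adding unpaired spectrum as supplemental downlink puts capacity where it is most needed.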

Spectrum management

The panelists also discussed spectrum management. Governments around the world are evaluating ways to free up additional spectrum, but this takes time, as spectrum planning can take 5 to 10 years to bring spectrum to market. The U.S. National Broadband Plan ("NBP") calls for making available 300 MHz of spectrum over 5 years and 500 MHz over 10 years. The U.K. announced it will try to release 500 MHz over 10 years. Canada is undertaking a spectrum inventory and is developing a long-term roadmap for releasing particular spectrum bands for wireless broadband services.

Re-purposing spectrum

One way to find additional spectrum is to re-purpose spectrum used by the government. In the U.S., President Barack Obama asked the National Telecommunications and Information Administration (NTIA), which oversees the federal government's spectrum, to look at spectrum availability and identify additional spectrum. Charla Rath of Verizon noted that this is a difficult area because incentives are different in the public sector and incentive auctions cannot be done with federal government users. Re-purposing spectrum held by commercial licensees could provide another source of additional spectrum. Ms. Rath praised the FCC's NBP because it took some very hard questions, such as the reallocation of broadcast television spectrum, and put them on the table for discussion, which has tended to change the nature of the debate.

Professor Hazlett believes the market could re-purpose this spectrum more efficiently. He suggested doing so through overlay rights. Both Ms. Rath and Professor Hazlett discussed the use of spectrum overlays in the PCS band, where PCS licenses were created as overlay licenses on spectrum partially occupied by incumbent microwave users. Wireless companies with PCS licenses then negotiated with the microwave users to relocate and compensate them. Professor Hazlett contended that overlays would be an efficient way to re-purpose broadcast spectrum. The spectrum overlay process could include arbitration mechanisms and judicial functions, if necessary.

The panelists discussed using incentive auctions to re-purpose broadcast television spectrum. FCC Wireless Telecommunications Bureau Chief Ruth Milkman said the FCC would like to have incentive auctions as a tool in its policy toolbox and would like to see Congress provide that authority to the FCC. Incentive auctions would allow incumbent licensees, if they chose, to relinquish some or all of their spectrum, which the FCC would auction, with the licensee receiving some of the auction proceeds. Bureau Chief Milkman emphasized that incentive auctions would be a voluntary approach in which broadcasters come in with proposals and the FCC then conducts a forward auction.

Professor Hazlett raised concerns about incentive auctions and suggested instead that the FCC should auction overlay licenses in broadcast spectrum and allow parties to negotiate private agreements. In his view, the FCC's plan to conduct incentive auctions as a two-sided auction of broadcast spectrum in which the FCC would serve as a market-maker is an expensive way to re-purpose this spectrum.

Robert Kelly of Squire, Sanders & Dempsey L.L.P., who moderated the panel, noted that there is a natural tension between the market and the regulator. Some on the panel advocated for government to play the main role in making additional spectrum available, while others on the panel and in the audience argued for a larger role for the market and private transactions, such as secondary market transactions for spectrum. Given the large projected need for additional spectrum, there was agreement that technical solutions, spectrum management and re-purposing spectrum will all be needed to meet the demand.

Bob Kelly and Steve Lederman, Squire, Sanders & Dempsey L.L.P.



Martin Cave

Regulatory economics expert

There have been many changes in the regulatory and policy environment in Europe and elsewhere. What are the drivers and what is the context?

I am pretty satisfied with the way the European regulatory framework works. I think it is well considered. It is a bit of a miracle that it has come through, given the nature of decision making in the European Union! But it is performing well, in general. And it has been copied quite widely throughout the world. So I think the current situation with regulatory issues is broadly OK, although there are a few things about investment that concern me.

We often think of a tension between the regulatory environment and the private-sector investment environment. We live in a regulated industry - is it light touch or too heavy for investment in network deployment to take place?

You have to remember the comments of the famous American economist George Stigler, who advised his acquaintances in business to 'get regulated' as they would make more profit if they did! Those people who say 'we don't want to be regulated' have to be careful what they wish for. Having said that, there is a tension between regulation and investment, and it is a very basic and familiar one. If you know you are going to be regulated to supply your assets at a very low price to your competitors, then you won't build them. In the copper world, where the assets were already underground, this wasn't a problem, but in the fibre world there is an explicit decision to build involved, and this is a problem. There are ways of resolving this - two, actually. One is effectively regulatory holidays: you give the incumbent who makes the fibre investment an opportunity to have more power. The other way, outside Europe, is the use of public finance. It is quite extraordinary how three years ago we regarded the telecom sector as a source of funding for the government and now it is a kind of sink for funding from the government. We are a bit behind the game as far as Europe is concerned because of the parlous state of most European economies.

What is the main motivation for government spending?

If you think about it, there are three reasons why governments might intervene. One is industrial policy - to become more competitive; the second is social policy - to deal with the digital divide; and the third is as part of a recovery plan. The recovery plan motive was quite strong a couple of years ago, but now it is most definitely an industrial policy plan: to become more competitive and enhance your prosperity and, perhaps more importantly, eat your neighbour's lunch.

Is there a conflict between sector-specific and competition approaches for regulators?

I think in Europe sectoral legislation continues to rule the roost. There are two areas where competition laws are important. The first is in relation to mergers and consolidation, where competition authorities can block, but have not shown much appetite recently for doing so. The second is in relation to state aids, obviously important in view of my earlier comments in relation to public investment. But if you look at issues related to fibre, it is really where you should unbundle, what the price of the unbundled assets should be, whether you should focus on passive or active assets, and those issues are very much matters of sectoral legislation. Competition law is very much in the background to pick up the pieces where things go wrong, for example in the case of Deutsche Telekom, where competition authorities effectively overrode the sectoral regulator in areas that the sectoral regulator seemed willing to countenance. But that is a kind of safety net, rather than a prime mover in the regulatory framework.

Are you concerned with competition in this market because the capital-intensive nature of the business causes problems for competition, and business models inevitably get pushed towards consolidation?

It is an interesting question. I think the jury is still out on it. I used to believe that fibre would lead to a very clear re-monopolization of the sector, and that if access products were available they would lead to much less investment by competitors. But now it looks as if, in some jurisdictions, by mandating access to ducts you are creating a kind of investment race, and you might end up with duplicated investment and consumers basically being able to choose. That is one way in which things have changed. The other way that things have changed is where incumbents are facing a sort of consortium. Something like this seems to be happening in Italy, where three operators have got together to build a fibre network. In such circumstances operators may prefer to ally with the incumbents as well, but if they don't, you may end up with a situation where there are in fact two fairly complete fibre networks in urban areas. I think it is very important to emphasize that this sort of competition is going to be confined to relatively affluent city districts, and elsewhere the population will be relying on a single fibre. This makes it very important to see what happens in the wireless world.

How is it possible to give any sort of regulation to a government-owned network? Surely, it is a contradiction in terms?

I think the key is really related to the separation debate. You [should] confine the public funding and the monopoly aspect to as small a part of the value chain as possible. And then, on the back of the government owning the network, it is possible to enforce a high degree of separation which prevents discrimination against the other competitors in the value chain. My own view is that in Australia they have gone too far and re-nationalized too much. In Singapore, however, they have managed to leverage a relatively small amount of public spending to build a network that is largely owned by the private sector and separated in really quite clever ways.

So you think there are different flavours of doing this?

I think there are ways of doing this that have the effect of enhancing competition, but full-blooded renationalization does not seem to me to be a sensible way forward.

But surely you have to make a necessary accommodation of the incumbent in some sense?

It is a form of accommodating behaviour, but in other respects it may be tougher on the incumbent, because what you may be doing in effect is taking over their access network. On the other hand, by that very act you are separating their network from the other aspects of their enterprise, and that may put their retail activities at a disadvantage compared with what happened before. So it is good for incumbents in some ways, especially if they get a good price for their assets, but it is bad for them in other ways, if they lose their unique selling proposition, which is that they are the organization that has been responsible historically for the network.

What will happen in the regulatory environment over the next two years?

As far as fibre is concerned, there is still going to be a lot of regulatory turbulence, with a lot of regulatory organizations taking their own decisions about what to do. More important is what the spectrum regulators will do, because if they can make spectrum available, it will facilitate the development of the immense future potential of mobile broadband.

Prof Martin Cave is BP Centennial Professor at the London School of Economics.

Printed in England by H. Charlesworth & Co Ltd, Flanshaw Way, Flanshaw Lane, Wakefield WF2 9LP UK

International Institute of Communications, 2 Printers Yard, 90a Broadway, London, SW19 1RD,UK

Archive password: halibut. Annual subscription £175. Online: www.iicom.org

Tel +44 (0) 20 8417 0600 Fax +44 (0) 20 8417 0800


The IIC publishes Intermedia to provide a forum for a wide range of people and views. Intermedia does not necessarily reflect the opinions of IIC officers, trustees and members. Credit quotations as source: Intermedia, Journal of the International Institute of Communications © 2011. ISSN 0309-118X



International Institute of Communications
PRESIDENT Fabio Colasanti
STAFF Andrea Millwood Hargrave, Director General; Carol Geldart, Director of Programmes; Stephen McClelland, Editor in Chief, Intermedia, and Director of Publishing, Media and Electronic Communities; Joanne Grimshaw, Projects Executive
BOARD MEMBERS Andrew Barendse (South Africa); Bernard Courtois (Canada); Tim Cowen (UK); Alasdair Grant (Hong Kong); Ann LaFrance (US); Augusto Preta (Italy)

