The Web 2.0 user and their ability to filter and differentiate content.
Gabriel Soto, MA Graphic Branding and Identity, London College of Communication, 2011
The Internet, and more specifically Web 2.0, has opened the possibility for those who used to be passive receivers of communication in the traditional mass media to actively participate and become content generators.
Participatory platforms such as blogs, social networks, video-sharing sites, wikis, podcasts and forums allow any person with internet access to create, post and share written, visual, audio and audiovisual content that is immediately available to millions of people, without any revision by an editor or curator.
Nowadays the expert and the amateur share the stage, performing at the same time for a constantly growing and widely diverse audience with free, quick, self-driven access. The present essay is an exploration of how the Internet user/reader/receiver is able to differentiate between and be selective about the content on the most popular Web 2.0 platforms. The aim is not to make value judgements on the expert v. amateur dilemma, because it is already a fact that content produced by professional experts and by amateurs coexists on the net.
Each of these sites has its own default interface layout – with some customization possibilities – through which this vast amount of content is presented, and where only the avatar image, the username and the content itself change from post to post. Through the same interface, then, one can find content produced by an expert or accredited professional, a reliable or accurate source, alongside amateurish content or inaccurate information produced by random anonymous users.
The education, culture, context and experience of readers determine their ability to be critical and selective about the content and information they put themselves in front of. This applies to both digital and traditional media. Nevertheless, the amount of content available since the start of the digital era is larger and more diversely sourced, which poses a greater challenge for readers trying to find and select reliable, good-quality information when they need it.
Andrew Keen in 'The Cult of the Amateur' (2007) states that Web 2.0 is producing a "flattening of culture that is blurring the lines between traditional audience and author, creator and consumer, expert and amateur."1 In addition, his major concern is the eventual disappearance of the professional and the contamination of traditional media institutions by amateurs, as he claims: "the very traditional institutions that have helped to foster and create our news, our music, our literature, our television shows, and our movies are under assault (…) Newspapers and newsmagazines, one of the most reliable sources of information about the world we live in, are flailing, thanks to the proliferation of free blogs and sites."2 Emily Bell, in an exchange with Keen titled 'Is today's internet killing our culture?', argues: "Amateur is not going to replace professional – it is idiotic and misleading to suggest it will. But it will supplement and expose mainstream media – in fact it already does."3
Traditional media like newspapers, magazines, journals, TV channels and radio stations have created digital versions on Web 2.0 platforms such as Twitter, YouTube and Facebook, adopting their language to publish professionally produced content. Official newspaper websites are being redesigned to have a more approachable, 2.0-style appearance, enabling user comments and ratings.
Verified Sources
Keen (2007) states, "These days, kids can't tell the difference between credible news by objective professional journalists and what they read on joeshmoe.blogspot.com",4 yet the difference is made clear by tools such as Twitter's "Verified Badge", used "to establish authenticity of well known accounts so users can trust that a legitimate source is authoring their Tweets"5 (e.g. the New York Times Twitter account, @nytimes6), or official YouTube channels (e.g. the Channel 4 YouTube channel7), which are customized with logos and brand imagery within the default layout of YouTube's interface and add some interaction features not available to regular users' channels. So Web 2.0 sites are developing tools that give a certain hierarchy to content, highlighting that produced by certain sources – which is, in fact, not that democratized.
Content Editing
Users want fast information and Web 2.0 provides it in several ways. Wikipedia, for instance, a free online encyclopaedia written and edited by users, is one of the most visited sites on the Internet, yet also one of the most controversial. It seems to be the ultimate democratization of knowledge, created by and for the user, but it is precisely for that reason that some of its content is inaccurate, incomplete or crosses the line between objective information and subjective opinion.
With time, Wikipedia has had to close up its open editing possibilities a little. Hierarchies and control mechanisms were created to improve quality. Nicholas Carr (2011), one of the most influential critics of the digital era, talks about this case in an interview with the Spanish newspaper 'El País': "one of the lessons is that total freedom does not work very well. Besides, there is no doubt about Wikipedia's usefulness. I think it has gained quality and reliability in recent years."8
Due to the free access to a vast amount of user-generated information, readers are less likely to pay for it, which affects the traditional media industry. John-Paul Flintoff (2007), analysing Andrew Keen's book, states: "If Britannica and publications like it should disappear we'll be obliged to rely on the unreliable patchwork of information parcelled out on Wikipedia by people who often don't even reveal their identity."9 In 2005, the scientific journal 'Nature' published a special report titled 'Internet encyclopaedias go head to head', an investigation in which 42 entries (i.e. articles) were chosen from the Wikipedia and Encyclopaedia Britannica websites to be reviewed and compared by a panel of experts. "Eight serious errors, such as misinterpretations of important concepts, were detected in the pairs of articles reviewed, four from each encyclopaedia. But reviewers also found many factual errors, omissions or misleading statements: 162 and 123 in Wikipedia and Britannica, respectively."10
It was proven that Wikipedia has errors, although what really stands out from the results of this investigation is the fact that Britannica has them too, and in a comparable amount: roughly four errors per reviewed article in Wikipedia against roughly three in Britannica. In this case, then, user-generated and expert-edited information are not as far apart in terms of accuracy as they appear to be. This undermines one of the main arguments of Wikipedia's critics (e.g. Flintoff and Keen) when they contrast it with Britannica as the example of a reliable source of information edited by experts.
Contrast and Comparison
It is important to bear in mind that Internet users still have parallel offline media experiences. They watch TV, go to the cinema, read books or attend concerts – maybe not as much as previous generations or in the same manner, but they still do – so they have an unconscious idea of the quality standards and conventions of traditional, professionally produced media. Then, by comparison and contrast, the user naturally filters content. It can be said that online media can be validated by its offline presence. Offline presence is paid for, and an average user will not be able to afford that kind of presence outside the net. In any case, that boundary tends to disappear, as The Guardian's journalist Oliver Burkeman (2011) acknowledges after the 2011 South by Southwest festival of film, music and technology in Austin, Texas, referring to the common views of web entrepreneurs: "they herald the final disappearance of the boundary between "life online" and "real life", between the physical and the virtual."11
According to 'Wired Magazine',12 fifty-one per cent of total Internet traffic in the United States by mid-2010 was used for video browsing and streaming. This demonstrates that the kind of content the user is looking for on the web is entertainment: short videos on YouTube, live video streaming on sites such as Ustream and Justin.tv, TV stations with online streaming available, like the BBC in the United Kingdom or CBS in the United States, and online movie rental sites like Netflix, amongst many other forms of audiovisual content.
The degree to which the user can be aware of the quality of content depends on the nature of that content. For instance, it is easier to discern the quality of audiovisual media than of the written word, because the difference between professionally produced audiovisual pieces, made with a budget and proper equipment, and amateur low-cost videos is much more noticeable. Any Internet user browsing YouTube would immediately tell the difference between a Hollywood movie trailer and a homemade video.
Social Filtering
Every piece of content can be easily shared, rated and commented on, and even those comments can be rated, shared and commented on in turn. Everybody gets to give an opinion. Content generators like bloggers seem desperate for feedback and attention, and for everybody to share their posts. But readers are in the same rush: they too are desperate to socialize everything they find and see in order to get some feedback of their own, just for sharing content produced by someone else. It is a feeding frenzy at both ends.
This behaviour therefore results in a mass filtering mechanism driven by users, as Ross Dawson (2006) states: "With massively more content available, we need the means to filter it (…) Fortunately, Web 2.0 is in fact just as much about user filtered content as about user generated content. As far more people participate in the web (…) the collective ability of the web to filter content is swiftly growing, and will more than keep pace with the growth in content."13 What is understood as "good content" gets spread, while "bad content" gets left behind.
Producer / consumer dichotomy
Keen (2007) states that users "can use their networked computers to publish everything from uninformed political commentary, to unseemly home videos, to embarrassingly amateurish music, to unreadable poems, reviews, essays, and novels."14
Since users are both producers and consumers, they know the possibilities available to access and use the tools needed to easily publish content on several sites, which raises the awareness that any other user could be the one creating and publishing the content being viewed. Being aware of their own field of action makes them aware of the field of action of their equals. In other words, the "it could have been me who published that" factor keeps the user from blindly relying on Web 2.0 content.
Quick referencing
It is practically impossible to avoid user-generated content. Google search results, for example, show Web 2.0 sites mixed with "official" websites, in an order defined by the search engine's complex relevance algorithms. Wikipedia entries, for instance, are among the most common results. Due to the immediacy of the Internet, the user can easily search for and find several references on a specific topic, which gives the possibility to quickly contrast and dismiss information or content judging by quality, source or the mere appearance of the site. Nicholas Carr (2008) states in his article 'Is Google Making Us Stupid?'15 that "there are signs that new forms of "reading" are emerging as users "power browse" horizontally through titles, contents pages and abstracts going for quick wins"16 – and he does not mean it in a positive way, but the fact that it is happening is what makes it relevant to this argument.
In conclusion, Web 2.0 users have developed mechanisms to distinguish and filter the constantly growing amount of content they view every time they go online. Several factors, internal and external to each user and to the content itself, define the degree to which those mechanisms are effective. Most of these mechanisms are developed and used unconsciously, and because of the ever-changing nature of the digital world, even more varied and complex forms of consuming and understanding content will emerge.

Education and awareness about how to approach – and even produce and share – content on the internet are vital to ensure that today's and the next generations take the best advantage of the possibilities the net has to offer, and of those it will offer in the not so distant future.
References
1 Keen, A., 2007. The cult of the amateur: how today's internet is killing our culture. New York: Doubleday.
2 Keen, A., 2007. The cult of the amateur: how today's internet is killing our culture. New York: Doubleday.
3 The Guardian, 2007. Is today's internet killing our culture? Andrew Keen v Emily Bell. [online] Available at: <http://www.guardian.co.uk/commentisfree/2007/aug/10/andrewkeenvemilybell> [Accessed 29 March 2011].
4 Keen, A., 2007. The cult of the amateur: how today's internet is killing our culture. New York: Doubleday.
5 Twitter Support. About Verified Accounts. [online] Available at: <http://www.twitter.com/help/verified> [Accessed 29 March 2011].
6 New York Times Twitter account. [online] Available at: <http://www.twitter.com/nytimes> [Accessed 25 March 2011].
7 Channel 4 YouTube channel. [online] Available at: <http://www.youtube.com/channel4> [Accessed 25 March 2011].
8 Carr, N., 2011. Interview by Barbara Celis. [online interview] El País, Spain, 29 January 2011. Available at: <http://www.elpais.com/articulo/portada/mundo/distraido/elpepuculbab/20110129elpbabpor_3/Tes> [Accessed 30 March 2011].
9 Flintoff, J.P., 2007. Thinking is so over. The Sunday Times.
10 Giles, J., 2005. Internet encyclopaedias go head to head. Nature. [online] Available at: <http://www.nature.com/nature/journal/v438/n7070/full/438900a.html> [Accessed 2 April 2011].
11 Burkeman, O., 2011. SXSW 2011: The internet is over. [online article] The Guardian. Available at: <http://www.guardian.co.uk/technology/2011/mar/15/sxsw-2011-internet-online> [Accessed 29 March 2011].
12 Anderson, C. and Wolff, M., 2010. The Web Is Dead. Long Live the Internet. [online article] Wired Magazine. Available at: <http://www.wired.com/magazine/2010/08/ff_webrip/all/1> [Accessed 29 March 2011].
13 Dawson, R., 2006. Web 2.0 and user filtered content. [online article] Available at: <http://rossdawsonblog.com/weblog/archives/2006/09/web_20_and_user.html> [Accessed 29 March 2011].
14 Keen, A., 2007. The cult of the amateur: how today's internet is killing our culture. New York: Doubleday.
15 Carr, N., 2008. Is Google Making Us Stupid? What the internet is doing to our brains. [online article] The Atlantic Online. Available at: <http://www.theatlantic.com/doc/200807/google> [Accessed 2 March 2011].
16 Carr, N., 2008. Is Google Making Us Stupid? What the internet is doing to our brains. [online article] The Atlantic Online. Available at: <http://www.theatlantic.com/doc/200807/google> [Accessed 2 March 2011].
Bibliography
Anderson, C. and Wolff, M., 2010. The Web Is Dead. Long Live the Internet. [online article] Wired Magazine. Available at: <http://www.wired.com/magazine/2010/08/ff_webrip/all/1> [Accessed 29 March 2011].
Burkeman, O., 2011. SXSW 2011: The internet is over. [online article] The Guardian. Available at: <http://www.guardian.co.uk/technology/2011/mar/15/sxsw-2011-internet-online> [Accessed 29 March 2011].
Carr, N., 2008. Is Google Making Us Stupid? What the internet is doing to our brains. [online article] The Atlantic Online. Available at: <http://www.theatlantic.com/doc/200807/google> [Accessed 2 March 2011].
Carr, N., 2011. Interview by Barbara Celis. [online interview] El País, Spain, 29 January 2011. Available at: <http://www.elpais.com/articulo/portada/mundo/distraido/elpepuculbab/20110129elpbabpor_3/Tes> [Accessed 30 March 2011].
Dawson, R., 2006. Web 2.0 and user filtered content. [online article] Available at: <http://rossdawsonblog.com/weblog/archives/2006/09/web_20_and_user.html> [Accessed 29 March 2011].
Flintoff, J.P., 2007. Thinking is so over. The Sunday Times.
Giles, J., 2005. Internet encyclopaedias go head to head. Nature. [online] Available at: <http://www.nature.com/nature/journal/v438/n7070/full/438900a.html> [Accessed 2 April 2011].
Keen, A., 2007. The cult of the amateur: how today's internet is killing our culture. New York: Doubleday.
Naughton, J., 2010. The internet: Everything you ever need to know. [online article] The Guardian, 20 June 2010. Available at: <http://www.guardian.co.uk/technology/2010/jun/20/internet-everything-need-to-know> [Accessed 28 March 2011].
Sanger, L., 2011. Cofundador de Wikipedia se volvió detractor de la enciclopedia virtual. Interview by Macarena García. [online interview] El Tiempo, Colombia, 2 April 2011. Available at: <http://www.eltiempo.com/tecnologia/internet/cofundador-de-wikipedia-critica-la-enciclopedia-en-linea_9112920-4
The Guardian, 2007. Is today's internet killing our culture? Andrew Keen v Emily Bell. [online] Available at: <http://www.guardian.co.uk/commentisfree/2007/aug/10/andrewkeenvemilybell> [Accessed 29 March 2011].
Visual Summary