Computer Says No
As Instagram announces tougher rules against what it considers to be ‘inappropriate’ material, fashion and beauty photographer Dave Kai Piper takes a highly personal look at social media in general.
PHOTO-SHARING SERVICE Instagram has declared a blanket review of its policies regarding what it considers to be ‘inappropriate’ content, and if you happen to be a photographer who posts sometimes edgy material, it could mean that your view counts will plummet. While the intention, which is part of a broader set of ‘integrity’ announcements by parent company Facebook, appears to be to limit the availability of pornographic imagery, in reality anything construed to be ‘sexually suggestive’ could be demoted and hidden from the Explore and hashtag pages. Instagram is also planning to hide the number of likes a photograph receives, to focus attention on the image rather than on engagement, with this programme now rolling out in Canada.
While the new ruling will doubtless prevent the circulation of some genuinely offensive material, it risks creating a situation in which machine-learning algorithms and content moderators decide which photos and videos are recommendable. ‘Non-recommendable’ photos will still be visible to individual followers through the feed and in Stories, but the broad reach that might once have been enjoyed will be taken away.
It raises the question of how many classic studies of the past, by the likes of Bill Brandt, Man Ray and Edward Weston, along with the edgier work of Helmut Newton and Bob Carlos Clarke, would be allowed to be shared if those photographers were working today. It could also easily affect how contemporary photographers such as fashion and beauty specialist Dave Kai Piper extend their reach. We asked Dave for his views on social media in general and on how he sees the new ruling.
DAVE KAI PIPER - OPINION
Jump back three years: it’s my birthday and my partner surprises me with a trip to London to visit a private Helmut Newton gallery. It was an amazing day, with hours to revel in the work of a photographer who today would be the very person targeted by mainstream media’s incoming rule changes. Would a photographer like Newton be able to use Instagram?
I’ve always liked images that have a strong storyline and that can provoke thought and tension, which is why I found an affinity with Newton. Elements of his work fall into mine, and I’ve been shooting a blend of fashion and nude images for years. I’ve been banned on Facebook before and learned a while back that my account is not really my account. Eventually I felt that putting time and effort into a medium I had little control over was not really for me. I know that as a photographer I’ve lost out on growing my business and my community via these free social platforms, and I’m fine with that: I’m not an ‘influencer’ and I never want to be.
A few days ago TechCrunch reported that Instagram, through its parent company Facebook, is having a crackdown on edgy memes and scantily clad girls, but somehow I guess this won’t affect Kim Kardashian or Cardi B. You see, this is about public perception and money. The big corporations want to be seen to be making their communities safer, but it seems to me that they don’t want to lose money by doing it.
Instagram says: ‘We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines.’ TechCrunch says this: ‘That means if a post is sexually suggestive, but doesn’t depict a sex act or nudity, it could still get demoted. Similarly, if a meme doesn’t constitute hate speech or harassment, but is considered in bad taste, lewd, violent or hurtful, it could get fewer views.’
Does this also mean that Facebook is happy to share inappropriate content as long as it generates revenue? How can something be inappropriate and yet not be against community guidelines? It’s a tricky line to toe in the face of shareholders and public demands.
USER BASE
I don’t have a single problem with a company looking to protect or shape its user base. They have a free-to-use product, and it’s very good too. The tricky issue is, as Facebook’s Henry Silverman explains: ‘As content gets closer and closer to the line of our Community Standards at which point we’d remove it, it actually gets more and more engagement. It’s not something unique to Facebook but inherent in human nature.’
So, if you are Instagram, how do you encourage people to share anything they want, all day long, which by definition means having no restrictions on the upload process, while also stopping clickbait and fake news? The new direction Facebook is taking focuses on borderline content: material that is not removed, but hidden until it becomes popular enough to be reviewed again, this time by a human rather than an algorithm. So what is borderline content?
‘Instagram has no guidelines about what constitutes borderline content - there’s nothing in Instagram’s rules or terms of service that even mentions non-recommendable content or what qualifies.’ Say hello to AI censorship. Use a certain hashtag or certain wording, or have the computer system flag your image, and you’re in trouble.
As photographers, where do we stand? There are two types of people who use social media: content consumers and content providers, with creatively minded people producing more content than most. Sharing our views and images is fun and important, as it helps us grow as visual communicators and learn new things. Which is all great, but we need to play within the moving scope of where the social media bubble is going. The larger the network grows, the more divisive it potentially becomes. Facebook understands this, so it shows you what it thinks you want to see. This kills diversity and increases polarisation in the user base. We’ve known this for a long time, and the darker aspect is that when a user base grows exponentially, the quality of its users decreases at the same time.
Large communities need strong leadership to avoid being led by the masses. When you have so many opinions pulling in so many different directions, all you get is a boring average of nothing. And yet average is just what the social media giants want us to be.
Empowering the masses with control is a dangerous game. ‘When people interact, they end up agreeing, and they make worse decisions,’ says university researcher Daniel Richardson. ‘They don’t share information, they share biases.’ Take this thought process and apply it to the huge numbers of people using social media, and it’s alarming. The good thing is that it’s not all doom and gloom. The trick is to stay away from the huge groups of trolls and loud-mouthed people just causing excess noise. You need to take some responsibility for the things you put out there, and the same for the things you see. Find the smaller groups of trusted people you can share your content with, and enjoy the things they are sharing with you. Treat the social world as it treats you: with a great big dose of caution.
TechCrunch article: www.techcrunch.com/2019/04/10/instagram-borderline/
Daniel Richardson: www.bbc.com/future/story/20160113-are-your-opinions-really-your-own