
Rising Menace

Julie Inman Grant, Australia's first eSafety Commissioner, is on the frontline against image-based abuse, online child sexual abuse, and cyberbullying.

When 17-year-old law student Noelle Martin took a random selfie of herself dressed up to go out one night in 2012, she could never have envisaged the nightmare that was to come.


That image, stolen from her Facebook page—along with others taken from her social media accounts or from friends' accounts—would be used to create an entirely new identity for the shy and academically gifted daughter of a close-knit Australian-Indian family.

Porn Star.

I felt like I was going to be sick. It has changed the course of my life.

Googling Noelle Martin’s name now takes you into a spiral of ever more depraved images. They include morph porn, where her face is superimposed on explicit images; “tribute” pages, where men ejaculate on images of her and then photograph those images alongside their penis; and even porn videos where digital technology has been used to place her face on another body engaged in various sex acts.

The websites carry her name, personal information about where she lives and what she studied, and even include photos of her 11-year-old sister. They also carry comment trails that read like they have been dredged straight from the bottom of a festering swamp of misogyny.

Noelle remembers a random night at her student digs at Sydney’s Macquarie University when she made the hideous discovery of her online alter-ego through a reverse image search in Google.

To say she was shocked barely touches the sides. “Honestly, I felt like I was going to be sick,” she says. “It has changed the course of my life.”

While she immediately went to the police, Noelle received the first of many exposures to victim-shaming. In any event, there was almost nothing authorities could do to remove the by-now countless images. Her name was, and will forever be, associated with pornography.

Julie Inman Grant is Australia’s first eSafety Commissioner—heading the only government agency in the world dedicated purely to looking after the safety of its citizens online.

Eighteen months into her five-year term, Julie has already been a visible and effective combatant against the global scourge of child pornography, cyberbullying and image-based abuse. The last of these is sometimes referred to as “revenge porn”, although Julie argues there is often no action on which to base “revenge”, and the images are rarely created for prurient interest.

“Let’s call it what it is—online abuse.”

We are lagging 10 years behind in terms of properly educating our community on safety issues.

When Canberrans awoke to the front-page news in August 2016 that local female students had found their images illegally uploaded to the AussieSluts website, it hit home that no one with an electronic footprint was safe. In fact, the ring had uploaded unauthorised images of underage students from across 70 Australian schools. The images were hosted on a Dutch server and were finally removed in April of this year after a concerted investigation. But as anyone knows, images are never really removed from the internet; they are merely rearranged.

The appointment of Julie in 2017 signalled the Turnbull Government was eager to get on the front foot.

Encompassing the role of the Children’s eSafety Commission in an expanded and world-first agency, Julie oversees a staff of 80 who are on the frontline of proactively combatting some of the most insidious online activity imaginable. The Office also serves as a safety net, providing Australians with an easy and supportive place to report serious cyberbullying, image-based abuse and illegal online content, including child sexual abuse material.

As Commissioner, Julie has significant regulatory powers to penalise and fine social media companies, perpetrators and other content hosts for failing to take down violating content. She works with NGOs, law enforcement and industry around the globe to combat all forms of online abuse.

It is a huge job—but Julie has an almost perfect professional pedigree for the position.

A Seattle-born graduate of the Boston University College of Liberal Arts—later gaining a master’s degree in International Communication—Julie went to Washington DC with “big ideals and even bigger hair, because it was the ‘90s.”

In 1991, she worked as a Legislative Assistant to a Congressman who had a small software start-up in his state needing assistance. It was Microsoft. Julie met Bill Gates her second day on the job and would eventually rack up 17 years for the computing behemoth.

She would help organise the first White House Summit on children’s online safety during the Clinton years. She would move on to Twitter, managing online safety and extremism, and more recently helped Adobe on digital transformation and cybersecurity.

The invitation to head Australia’s dedicated eSafety agency was one she could hardly pass up—and was made logistically easier by the fact she had moved here in 2000 and married an Australian HR specialist.

Julie is the first to concede the challenges inherent in policing the internet, chief among them the task of retrofitting safety measures onto a network so enormous and amorphous as to almost defy comprehension—much less regulation.

“We play something of a game of ‘whack-a-mole’ and I think we are lagging 10 years behind in terms of properly educating our community on safety issues.”

“I took the job on because I want to create a different kind of government agency—one that is fast and nimble and innovative and provides citizens with compassionate service where we can genuinely help and where we can genuinely disrupt the trade in child sexual abuse images and help law enforcement track down the perpetrators and save the kids. That is why I am here.”

One area that offers hope is her campaign to get tech platforms to adopt safety-by-design as part of their product development process. That is to encourage everyone coming into the marketplace to anticipate the worst excesses of trolls and deviants— and to circumvent damage to users through in-built safety features.

Had Facebook Live integrated more safety protections into its live streaming before it was rushed to market (to compete with Meerkat and Periscope, which already had live-streaming services in operation), it might have hired its 3,000 content moderators before a dozen live rapes, murders and suicides were broadcast from the platform.

“We are going to have a set of checklists for the person developing an app in their garage who doesn’t have a team of people to help and they might not be thinking ‘hey, maybe I should have an online abuse button or cyberbullying policy that goes with my Terms of Service. Maybe I should be using machine learning and artificial intelligence to track whether or not abuse is being hurled or a groomer is trying to get a child’s address’.”

Julie has less patience for the social responsibility credentials of the bigger social media companies.

“They are businesses. They are business first—they invest in infrastructure and engineering—and it’s about monetising the product. But what companies like Twitter and Facebook are starting to realise is that you can’t keep monetising something if users don’t want to come to your platform because it is so threatening and toxic.”

We are letting complete strangers have access to our children in our own living rooms.

She hopes continued public discourse and market forces will force them to take the issue of safety, responsibility and respect to a more suitable level of investment.

While accepting that social media is inherently democratising and can engender a real sense of community, Julie said she had come to the conclusion “that it exposes the sad underbelly of the human condition.”

And against this, the mother of three under 12 says, parents are the frontline of defence.

“If you give your three-year-old an iPad, you need to be responsible.”

These are the words of a woman who has seen footage of five-year-olds coerced into simulating sexual acts online.

“Parents need to be as engaged in their children’s online lives as they are their every-day lives.”

“We are letting complete strangers have access to our children in our own living rooms. We need to understand that.”

Parents also needed to come to grips with the prevalence of young people sending naked or sexually explicit material online.

“This is becoming a more normalised courtship ritual and new form of peer pressure that young girls in particular are feeling when a boy asks for a naked shot to ‘show me how you love me’. They don’t have the maturity, experience or cognitive reasoning abilities to understand the potential ramifications. So we as parents need to be talking to them about the risks of this behaviour.”

The eSafety website has a number of resources— including a menu for parents and educators as well as age-appropriate interactive learning spaces for children and culturally-appropriate tips for young adults.

There are also quick links to report cyberbullying, illegal content or image-based abuse.

So far, the office has helped 900 children have serious cyberbullying content removed through the life of the scheme—that’s an 80 per cent success rate without the use of formal powers.

The office has also had 270 reports of image-based abuse since last October.

Meanwhile, even tougher enforcement measures have just been approved with Parliament passing new civil penalties legislation. This legislation provides the eSafety Office with formal powers on image-based abuse—three-year prison terms and $105,000 fines for those found using unauthorised images. Those fines rise to $525,000 for corporations abusing images.

If you ask Noelle Martin about the civil penalties legislation, she says it is long overdue.

“We need to do everything, absolutely everything we can, to crack down on these people who engage in image-based abuse. It is beyond disgusting. It is beyond description. New civil penalties will send a strong signal at a federal level and I think will act as a serious deterrent. Mainly, I think we need new laws to keep up with this problem because otherwise we will always be one step behind.”

Noelle made the decision in 2016 to go public with her story and now campaigns in the image-based abuse arena.

“I felt I had no choice but to take a stand. I felt I couldn’t hide from it anymore, and I decided to be open and call it out rather than keep it private and carry that shame and burden with me. It was the hardest thing I have ever done.”

And yet, it has also been empowering.

“I will always believe that I have a right to dignity and privacy and my reputation and I should be free to engage in technology without fear of abuse.”

“It is not up to me to change who I am or what I wear or my online behaviour. I did absolutely nothing wrong. It is up to the abusers to be called out for their behaviour because it is illegal and appalling.”

To this day, she does not know who is behind her deliberate targeting, although she suspects it may have been someone known to her—she had never even had a boyfriend when it began. And she is far from alone: a staggering one in five Australians has been impacted by image-based abuse.

Her public exposure has led to more victim-shaming and online abuse, and at one point Noelle was so damaged by her experience that she sought to end her life.

These days she draws strength and solace from the messages of support she receives from other victims.

“I do know that by speaking out I have helped people. I don’t think I will ever escape it, but I am trying to take control of it.”

Victims of image-based abuse no longer need to suffer in silence. Help is available through the eSafety Office, where information, advice and a team of dedicated professionals can guide victims to the right outcomes, at www.esafety.gov.au. •
