Montecito Moms

Sarah Powers: Podcaster

by Dalina Klan

We could call her Podcaster Powers, but this Montecito Mama is close to being a native – and we are thrilled to introduce her to the MJ this week!

Sarah (Wysel) Powers moved to Montecito with her family in 1986 when she was five years old. Her mornings were spent walking to school (Montecito Union) and stopping at Pierre LaFond for a giant cookie or chocolate croissant. Flash forward 35 years… she is still stopping at Pierre’s (for a coffee and maybe still that massive cookie), but now with her own kids!

Powers comes from a professional writing and communications background – everything from ghostwriting to corporate communications. In 2016, she and her business partner, Meagan Francis, co-founded their company, Mom Hour Media, and launched the podcast, “The Mom Hour.”

First, what is a podcast? Powers explains: “It is an audio program you can listen to using a phone or tablet or computer. It is like the radio programs of decades past, but you don’t have to catch them on a specific day and time!”

The best part about podcasts is that, for the most part, they are free and openly available, and there are podcasts on every topic imaginable. Some aim to teach or educate; some are simply for entertainment.

Powers and her partner say their podcast is conversational: “Many people like to listen while they drive, or, in the case of our audience, while they rock a baby in the middle of the night or push a stroller around the neighborhood! There’s an intimacy to the listening experience and, at the same time, it’s extremely flexible when it comes to when and how you consume your podcasts.”

“The Mom Hour” is an hour-long weekly audio program for parents. Powers and Francis bring their perspectives as moms of many kids (eight between them!), with different backgrounds and points of view, and take a conversational, nuanced approach to motherhood and parenting topics. They are not parenting experts in the clinical or professional sense, but they offer real-life experience and judgment-free encouragement to newer moms who are wondering if their experience is normal or looking for light at the end of a sleep-deprived, tantrum-filled tunnel.

To date, they have done more than 630 episodes on topics such as making friends as a mom, potty training, sharing about kids on social media (or not), managing screen time, talking to kids about money, and more. The show has been downloaded more than 15 million times, and they now have more than a dozen contributing voices who add their stories and perspectives on the podcast.

There are different ways to make money as a professional podcaster, but working with advertisers is the most common. At “The Mom Hour,” Powers sells advertising spots on her show in the same way a television network might: brands have a message they want to reach a target demographic (i.e., moms of young kids), and they take a couple of “commercial breaks” in each episode.

Powers says that being her own boss has given her the flexibility to continue to be a mom while also focusing on her career: “I don’t share everything about our kids; however, we do lean into topics that are relevant to our lives and create episodes around ideas we’re genuinely interested in.”

Powers and husband Bryan (who works for Mercer Advisors) have three kids: Luc (14) and Reid (12) attend Santa Barbara Middle School, and Violet (10) is a 4th-grader at Montecito Union.

If you have a mobile phone, you already have a podcast app on it – even if you’ve never used it! Search for “The Mom Hour” in any podcast app and it’s easy to find.

And the best part about it: Powers can do it all from home, which is now back here in 93108. After attending college in the Chicago area and then getting married and living in Phoenix and Orange County, she and her husband came back because of (you guessed it) COVID.

“We muddled through online school and shifting work paradigms through the spring of 2020, and with summer we made the decision to move to Santa Barbara. My kids get tired of hearing me point out old places I went to school, had a job, hung out with friends – but as time goes on, they’re now layering their own Santa Barbara experiences over my own and it’s really special to see.”

So, what’s a perfect weekend for this Montecito native? “My 10-year-old and I love to do what we call a ‘Village Saturday.’ We park the car and bring a basket, and see how many quaint errands we can do on foot: the Montecito Library is a must, plus a stop at the Post Office, the hardware store, the grocery, and the dry cleaner. And I usually pick up sandwiches at Wine & Cheese. I know it goes by a new name, but local habits die hard.”

Dalina Klan is a former television news producer and writer. She is a Montecito native and graduated from Westmont College with a degree in Communication Studies and Theatre Arts.

I’m doing these commencement speeches now. I just did Vanderbilt and I’m doing one on Sunday. And it’s such a different world from the world I grew up in, from the America I grew up in. Anyway, I fell into journalism at an incredible moment. It was 1986, the People Power Revolution sent an electric charge through society…

[ The People Power Revolution was a series of popular demonstrations in the Philippines that saw the country transition from authoritarian rule under Marcos to a democracy. It was during this period that Ressa embarked on a career in journalism. A time when, as Ressa writes: “…in the 1980s, another agreed-upon fact, a foundation of our shared reality, was that without good journalism, without the sound production of facts and information, there would be no democracy. Journalism was a calling.” ]

When I was in my twenties, it was either go back to the United States or stay [in the Philippines] and do this startup. Rappler was much, much later, but do this company called Probe. And it was incredible. It’s the best way I could have learned television because we did it the way we thought it should be done, instead of following what others had done. If I had been in the U.S., it would be very, very different than what I wanted to do. And journalism was the best. I loved it.

The reason why I can’t see going into politics is because what I loved about journalism is that it continued the ideals I grew up with. And this is my worry now for this next generation. What are the ideals? It is that in the end truth wins, in the end there’s a meritocracy. In the book I wrote about this, the empty mirror: that you search for knowledge, you search for a view of the world, and you have to have enough confidence to do that, but not so much confidence that it becomes an arrogance that simply fills the mirror with your image. When you take yourself out of that picture, your image no longer obscures the objective truth that lies behind it. So, it’s like the Buddhist version of the myth of the cave of Plato.

Once I fell into journalism, I loved that the head of state would ask me, what are the principles? And that once you get beyond the ceremonial pro forma questions you need to ask, that powerful figure and I are getting used to each other as people. How incredible to have somebody who has to make these really tough decisions, and you can ask them any question you want. I mean for television there’s a form, and I wrote about this, it’s the most unnatural way of being natural. But it was such an opportunity to continue to learn, to continue to understand why things happen. That’s why I fell into journalism.

GL: Social media was something that, from early on in your career, you invested in heavily, right? You saw it as an opportunity for citizen empowerment. But then, as you say in your book, it ended up “tearing down everything you hold dear.” Do you think that your strong embrace of Facebook in building Rappler added to Facebook’s power?

MR: Definitely. And that’s why it felt like a betrayal. But when I look back over it, it’s the same mistake lawmakers make. News organizations have a set of standards and ethics and we self-regulate. Why would we expect tech companies – whose primary motivation is profit, who have no standards or ethics manual, who don’t actually embrace putting guardrails on, like protecting the public sphere – to behave like news organizations? In retrospect, I should have known that. But I guess I thought they were like CNN, because I was there in the beginning days of CNN; I was there during a tailwind. I was there when we were making mistakes because we were growing so fast.

I set up the Manila bureau and the Jakarta bureau and my team in Jakarta was one of 12 teams that tested out new technology and new equipment. So, I gave Facebook that same courtesy. When you’re a fast-growing organization, you make mistakes. But if you are guided by the right principles, you come out of it and you fix it. This is why I realized that news organizations, journalists, are different. Every decision we made for the public put us at risk every step of the way. When in the time of Duterte we published the three-part weaponization of the internet series, I had an idea that it could be dangerous. I rolled that past our board; I secured board approval because I thought the series may actually threaten the business. Now if they had said no, I would’ve fought it. And ultimately our shareholders believe in and give journalists the power. Because I’ve worked in organizations where the businesspeople win over the editorial all the time. We created an organization that isn’t like that… But, we were the first ones that were attacked in the courts for it and hopefully we’re winning.

On that part, I can’t really talk about it because I’m kind of like on a gag order, but… I had 10 criminal warrants of arrest starting in 2019 and eight of them came within three months of each other. I just kept getting arrested and then posting bail. But all of them came out of the Philippine depository receipts, it was five tax evasion charges from that event in November 2015. It was the civil cases and the criminal cases.

GL: So, in 2017, you met with Mark Zuckerberg [Facebook CEO] and that’s when you said “97% of the Philippines are on Facebook” and he said, “Maria, where’s the other 3%?” Were you shocked that he said that, or did you already start to know by then that he was a big part of the problem?

MR: I was still working with Facebook at that point, and I laughed when he said that. I still was hoping that they would do the right thing and I thought they just didn’t understand it. I don’t know why I gave Facebook the benefit of the doubt. I didn’t know until post-Cambridge Analytica, and we became fact-checking partners of Facebook in 2018. What happened in January 2018 was that they pivoted, and Mark Zuckerberg began saying that they were all about family and friends. So, they started throttling news. A news organization in Slovakia dropped 60% of page views. We tracked this on a daily basis with Rappler. We dropped maybe 15% based on their choices.

Then when Cambridge Analytica happened, that was March 2018 … we tracked the networks for disinformation. And the way we were able to take apart the tactics was because we were plugged into the APIs, which they shuttered as soon as Cambridge Analytica happened. And ostensibly it was because of data privacy issues. But that’s not true. I think it was because they didn’t want anyone else exposing exactly what was happening. And 2019 was when I really went frontal, and I started demanding answers.

When you look at how long it took me, it’s because I’m old power. I grew up at a time when if there is an issue, you sort it out, you pound it out behind the scenes. And in that sense, I wasn’t working like a journalist, I was working as the head of an organization, and it was a tough thing. So, I guess it was 2019 when I started getting arrested. In fact, the first time I got arrested, there were Facebook people in the office, and I thought that would send ripples, but it sent no ripples.

GL: Can you explain what Astroturfing is, or the fake bandwagon effect?

MR: Sure. AstroTurf is fake grass, right? So, what we did was we started looking, we started pulling the data and one of the things I think people didn’t realize was that it’s not just the post itself. Let’s say ABC News posts something, and then what happens is in the comment section – this is where the propaganda comes in – lies come in, and they come in quick and fast. There’ll be 20,000 comments… [but they’re not real] and it makes it look like it’s a grassroots statement: We hate what ABC is saying. But in the end, it’s all been done by the same small group. It is insidious manipulation. And what it makes people feel is that, oh, if this many different people are saying this and they believe this, it must be right.

We applied what used to happen with traditional media to this technology platform – which allows exponential lies and in fact spreads lies faster, distributing the lies more. So that was one of the first things they did, astroturfing. And how did I discover this? Because I saw it on my own feed. I saw suddenly so many people turn against me. I guess I still do, but I had a great reputation in the Philippines. The reason why the largest network asked me to come back home and run the largest news group is that I was reforming so many of the things that they were working on. But I watched people switch overnight from the long track record I had to – “She’s lying, she’s a criminal.” – and I was like, this is not normal. And that’s what made me begin to pull the data thread and see where it would lead. So, Astroturfing is like fake grass. It is fake. It’s trying to create a groundswell that is manufactured and controlled.
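[ A rough sketch of the pattern Ressa describes – a flood of comments that traces back to a small set of coordinated accounts – can be illustrated in a few lines of Python. The data shape, account counts, and numbers below are hypothetical, chosen only to make the idea concrete; this is not Rappler’s actual methodology. ]

```python
from collections import Counter

def author_concentration(comments, top_n=20):
    """Share of all comments written by the top_n most active accounts.
    In an organic thread this share is usually small; in the coordinated
    campaigns described above, a handful of accounts dominate."""
    counts = Counter(author for author, _ in comments)
    top = sum(n for _, n in counts.most_common(top_n))
    return top / len(comments)

# Hypothetical thread: 20,000 hostile comments posted by just 50 accounts
thread = [(f"user_{i % 50}", "We hate what ABC is saying") for i in range(20_000)]
print(f"{author_concentration(thread):.0%} of comments come from 20 accounts")  # 40%
```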

GL: So you have this theory, about friends and then friends of friends and then friends of friends of friends. Do you think that the original business plan of Facebook always intended to lead to this place? It started out as a way to connect with our friends. But ultimately that became a vehicle for gossip or for influencing people. And now we’re more influenced by our friends than we are by ads or by authority. Do I have that right?

MR: Yes. So a book called Socialnomics, which was very early on, kind of laid out that you listen more to your family and friends. If a family member tells you to go have hummus, you would be more prone to go get hummus than if there was a blinking ad or if you watch it on CNN, right? So that was already kind of established as human behavior. Your question is, did they mean for this to go there? No, I don’t believe that they meant for that to happen. I don’t think they’re evil. Although when they see it, when they see genocide and they don’t do anything, I think they cross the line. But in general, was this their vision? No, they didn’t know. What they did have is iterative A/B testing. They tested different ideas in a way that journalists don’t. Because we don’t test things.

What we do, I guess the closest we come, is we (reporters) learn how to tell stories better, but it’s about the story. It isn’t about manipulating the audience; it is about capturing their attention. So, I think what they did is they just wanted to make more money. It’s a profit motive and the way you make more money is by keeping people scrolling on your site. It’s about that metric. It’s time on site, and the way that happens is that you recommend things. Where they went crazy is they never thought about the harm that A/B testing could do. So, in the book I actually take what every social media organization uses, which is friends of friends. It’s chapter seven, “How friends of friends broke Democracy.” That one algorithm is a recommendation engine for growth for all the social media platforms. What they didn’t realize was friends of friends pulled society apart, pulled apart the public spheres so that you can no longer have a functioning democracy. That’s the gap that opens. Tech people have a name for this – Eli Pariser calls it echo chambers.

Polarization is another way to describe this. You pull them out, you don’t hear. Democracy is all about listening to all sides and then making up your mind independently. What these algorithms have done is splintered our public sphere. It’s like taking one editor and replicating that editor a million times. The last part is an algorithm that makes you stay scrolling longer, and it radicalizes. We’re splintering this way and then we’re radicalized downward. This is the most interesting thing about America today. They think the problems are out there, but they don’t realize, this is radicalization. It’s radicalization, but it’s not in security matters, it’s in politics and it’s an extreme radicalization. And part of the reason I think I saw it earlier is because I studied this. This is what my first book was about.
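[ The “friends of friends” recommendation Ressa refers to can be reduced to a simple two-hop graph traversal. The sketch below is only an illustration of that growth heuristic – real platforms combine many more signals – and the network data is invented. ]

```python
from collections import Counter

def friends_of_friends(graph, user, limit=5):
    """Recommend accounts two hops away, ranked by mutual-friend count.
    Because every suggestion comes from inside the user's existing cluster,
    the same rule that drives growth also reinforces echo chambers."""
    direct = graph.get(user, set())
    candidates = Counter()
    for friend in direct:
        for fof in graph.get(friend, set()):
            if fof != user and fof not in direct:
                candidates[fof] += 1  # one mutual friend in common
    return [name for name, _ in candidates.most_common(limit)]

# Tiny invented network
graph = {
    "ana":   {"ben", "carla"},
    "ben":   {"ana", "dario", "edu"},
    "carla": {"ana", "dario"},
    "dario": {"ben", "carla", "edu"},
    "edu":   {"ben", "dario"},
}
print(friends_of_friends(graph, "ana"))  # ['dario', 'edu'] – dario shares two mutual friends
```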

GL: You refer to the Philippines as ground zero for the terrible effect that social media can have on a nation’s institutions and its culture and the minds of its people. Can you tell me why, in your opinion, the rest of the world needs to pay attention to what has happened in the Philippines?

MR: I think the time to have paid attention was in 2016. I was in Mountain View and I pointed out to the Google News Initiative that all the data we had indicated this was happening and I think people thought I was crazy and I was like, this is coming for you. And when did it hit America? When did you finally have evidence of it? January 6th… and we haven’t solved any of these problems.

I mean, let me move it forward. AI is machine learning. This is the first time that humanity was subjected to common-use AI, which in social media is a curation and growth model. In December last year when generative AI was rolled out, again, we didn’t learn a lesson, governments didn’t learn the lesson that you cannot test these things in public.

It’s like giving a drug company carte blanche to test all their drugs in the town square. And then if half the town square dies, “Oh I’m sorry you died. This is important for us to keep testing the drugs.” So, what they’ve done now with generative AI, we’ve let it out of the gate, it’s testing AI in public and you are expecting people who will be affected by this, to test it for these large American companies.

And the harm be damned, because here’s the reality... the basic tenet of the first generation AI was to personalize your feed. I also thought that was insane because I was like, wait, that will create huge problems – if everyone has their own personal feed, how do we have a public sphere? When you each have your own version of reality, that’s called an insane asylum. Think about it, it doesn’t make sense. Their basic premise doesn’t work, and they keep rolling it out.

I just gave a commencement speech, and the hard part is: How do you tell kids to live by values that are being thrown out? Because at the very base level, information is corrupted from the beginning.

GL: Not only do they keep rolling it out, but people keep investing in it. And politicians understand that this is a way to control the airwaves. They understand that this is a way to discredit their opponents and to keep control of the narrative. But we keep using Facebook. So, what do we do? Would you advise people to get off Facebook?

MR: No, no, you can’t. We can’t. A news group can’t, because social media is now the primary distribution system. Especially if you’re a digital-first and digital-only news site, which Rappler is, there was no way that we could do that. Okay, what do we do? Well, let me give you one stat. In August 2022, a few months before ChatGPT was rolled out, there was a survey of about 800 folks from Silicon Valley who were working on AI, and 50% of those surveyed said that if you roll this out today, there would be a 10% or greater chance that it would lead to an extinction event. Extinction, like human-being extinction event.

So, Tristan Harris and I were talking, and he said, “Maria, if you are told that there’s a 10% or greater chance that the plane would crash, would you board the plane?” So, this is what we’re doing. And generative AI is significantly different because of the parameters that they use. What makes generative AI different is that it doesn’t do phrases anymore. In the past it used to be phrases, chunks. Now they do every word trying to replicate the way a human brain thinks, and the growth is off the scale. In the past, to invest – Silicon Valley would want hockey stick growth. But this is exponential.

GPT-2 was 1.5 billion parameters for every word you would go through. GPT-3, which we used to create about 50,000 pages for our May 2022 elections, was 175 billion. It went from 1.5 billion to 175 billion. GPT-4, which they just released, is 1 trillion parameters. And then they’re going to be releasing GPT-5 at the end of this year. Developers have said that the reason ChatGPT hallucinates is that they put so many variables in place it’s impossible for them to understand. But what GPT-4 can do is it can code itself. It is growing on its own. They cannot control it. That’s why these engineers thought that releasing it in the wild could lead to an extinction event. We don’t know what it’s going to become. But all you see coming out of Silicon Valley are all of the, “Oh this is incredible. It can write all your drudgery notes.” This is part of what I talked about at Vanderbilt. Good God, if you are outsourcing your writing, how will you learn to write?

If you don’t learn the drudgery of writing, how are you actually going to write novels? How are you going to write stories if you don’t do this every day?
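[ For scale, here is the simple arithmetic behind the parameter counts Ressa cites. The GPT-2 and GPT-3 figures are published; the 1-trillion figure for GPT-4 is her statement in the interview, not a confirmed specification. ]

```python
# Parameter counts as cited in the interview (the GPT-4 figure is unconfirmed).
params = {"GPT-2": 1.5e9, "GPT-3": 175e9, "GPT-4": 1e12}

pairs = list(params.items())
for (prev_name, prev_count), (name, count) in zip(pairs, pairs[1:]):
    print(f"{prev_name} -> {name}: roughly {count / prev_count:,.0f}x more parameters")
# GPT-2 -> GPT-3: roughly 117x more parameters
# GPT-3 -> GPT-4: roughly 6x more parameters
```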

So what needs to happen is something very simple: accountability. You have to stop the impunity. They must be accountable for every single harm that comes along.