YouTube CEO Susan Wojcicki on Kara Swisher’s Recode Decode podcast

On the latest episode of Recode Decode, YouTube CEO Susan Wojcicki joined Recode’s Kara Swisher onstage at the Lesbians Who Tech summit in San Francisco. Wojcicki talked about how YouTube is trying to keep young users safe by changing some of its policies, how she felt about the Google walkout last fall, and how the company is using AI both to police toxic comments and to shape which videos get recommended by the YouTube algorithm.

“We have to use humans and machines,” Wojcicki said. “And it’s the combination of using humans to generate basically what we’ll call the golden set or the initial set, the set that our machines can learn from. And then it’s the machines that go out and actually extend all this amazing knowledge that the humans have, and to be able to do that at scale.”

You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts, and Overcast.

Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Susan.


Kara Swisher: Susan and I have known each other, we just figured out, for about 20 years.

Susan Wojcicki: Yep.

We met a long time ago. I’ll tell this story very quickly. She and, it was Larry Page ... Who else was with us?

David Drummond.

David Drummond, who was the ...

Megan.

Megan ... Came to New York to talk to book publishers about putting book stuff on Google in the very early days, and there was an electrical outage that ...

Yes, on the entire eastern seaboard.

And so they had to stay in my mom’s apartment, and ended up spending the night on the floor, which, with Larry Page, even for an hour, is a lot. And we had a really great talk.

Yeah, in the dark.

In the dark about where the internet was going.

No electricity.

Yeah. So it was kind of a fascinating time to get to know each other. And since then, obviously, a lot has changed. So I want to just say, Susan is one of my favorite people in Silicon Valley. That said ...

You’re going to ask me lots of hard questions.

Yes, I am. That said, it’s really a difficult time for Silicon Valley right now, and I want to talk about some big issues about where YouTube is, where Google is, and the culpability of technology in the disaster that we find ourselves in. But she’s super fun at a party.

Anyway, so let’s start with that. This week, yesterday, you had the latest controversy around YouTube, and there’s a controversy a day, not just YouTube, but Facebook and others ... It was around pedophiles and the use of YouTube as a tool for pedophiles. So why don’t you talk a bit about what you all have done, and then I’ll have some questions about what happened in this situation.

Sure. Sure. So, first of all, I just want to say that we take kids’ safety incredibly seriously. The last two years have really been focused on the responsibility of our platforms, and everything we’ve done has been through that lens: How can we make sure that what we’re doing, the way we’re growing, is done responsibly?

And so this specific incident that happened around child safety is something I take very seriously. I’m a mom. I actually have five children, from four to 19, so I understand kids. At least, as a parent I understand it, and I really want to do the right thing.

So, we had been working on child safety since we first launched YouTube Kids, and then we became aware of some comments. The videos were okay. The videos were not violative, but the comments with those videos were, and as soon as we were made aware of them ... The number was in the low thousands ... We removed comments off of tens of millions of videos. We basically did that almost instantly, just in an abundance of caution, to make sure that we were able to stop it.

And then in the last week, and just yesterday, we spent time looking at our policies, and we announced some really significant changes, one of which is that we are no longer going to allow comments on videos that are featuring young minors anymore, and older minors that are engaged in risky behavior. And we also launched a new classifier that has 2X the potential to find comments that we think will be problematic.
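For the technical audience, the comment policy Wojcicki describes reduces to a simple gate at serving time. Here is a minimal sketch of that decision logic; the field names and the idea of per-video flags are hypothetical stand-ins, since YouTube’s actual data model and minor-detection classifiers are not public:

```python
# Hypothetical sketch of the comment policy described above. In practice
# the flags would come from classifiers or human review; names here are
# illustrative assumptions, not YouTube's real system.
from dataclasses import dataclass

@dataclass
class Video:
    features_young_minor: bool    # e.g., set by a classifier or a reviewer
    features_older_minor: bool
    shows_risky_behavior: bool

def comments_enabled(video: Video) -> bool:
    """No comments on videos featuring young minors, or on videos
    featuring older minors engaged in risky behavior."""
    if video.features_young_minor:
        return False
    if video.features_older_minor and video.shows_risky_behavior:
        return False
    return True
```

Note that this is a blanket, policy-level switch rather than a per-comment judgment, which is exactly the trade-off the two discuss a few exchanges later: it also turns off comments for innocent creators.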

Okay. Couple questions.

Yes.

”Made aware.” Why didn’t you know that? When I have a platform of my making or something, I know what’s happening on my platform. I know what’s happening in the comments. Silicon Valley in general has been sort of doing this. It’s not YouTube as much as Facebook, but it’s the excessive “I’m sorry” tour. Like, oops, sorry. Oops, sorry. Oops, sorry. And it’s the questions, like “made aware.” “We were made aware.” “We didn’t realize it.”

Talk about what goes into these platforms, and I realize the amount of video going over them. I realize how big it is. I realize the technical problem going in, but sometimes it feels as if they don’t have any sense of anyone ... It seems like they live a life that’s very safe. It’s typically white, young men, and they don’t understand the lack of safety for so many other people, not just children. So talk about that.

Yeah. Well, so I will say we have been very focused on child safety. It wasn’t like we were suddenly made aware and then we started focusing on child safety or comments. We have been doing that for the last couple years. The reason we were actually able to launch a classifier that had double the capacity is because we had been working on that to make sure that we were able to remove those comments. So all of this has been ongoing work.

The policy change to say we are no longer going to allow comments on videos featuring young minors is ... We’re a platform that is really balancing between freedom of speech and also managing with our community guidelines. I think there are going to be many young people out there who are going to be upset, because they’re going to feel like they’re posting videos and they no longer have the ability to use comments the way that other creators can, to get feedback on their videos about what’s useful, what went well, what their next video should be about, what was successful.

And so this change took away some of the ability of people who are innocent: young people who are posting innocent videos, or their parents who are posting them. In the end, that was a trade-off that we made, because we felt like we wanted to make sure that protecting children was our No. 1 priority.

So doing a sweeping thing versus consulting, say, the parents, you can’t decide between what ...

Parents can always decide. So parents can always turn off the comments. That’s a functionality that everybody has. We also have classifiers that search through the comments and automatically turn them off if we find anything that’s violative, so we give tools to our creators. It’s a process where we’re trying to always get better, but in this specific case, we had to make a choice between serving people who are innocent and creating videos themselves, and making a decision about their safety.

Great, but I get back to the idea that comments are just vile on YouTube. Anyone who’s a woman gets ...

I have a channel.

Yes, you have a channel. They’re vile. When you look at them, don’t you say, “Why is this even occurring? Why am I even allowing this to happen?”

Well, first of all, we have community guidelines, and YouTube has had community guidelines since the very beginning. Those community guidelines cover things like hate speech and promotion of violence, and all kinds of other core areas that we believe in. But comments are a really important part of the platform for creators and fans, and they can always turn them off if they want to.

Here’s the thing. “They can always turn it off.” That’s really not ... Like, it’s off or on. It’s not that you are monitoring your platform in a way that’s responsible.

We do. We do. We do run classifiers. We look at the comments. We actually remove hundreds of millions of comments every quarter that we think are problematic, so we are monitoring it. But you have to realize the volume that we have is very substantial, and we’re working to give creators more tools as well for them also to be able to manage it.

When you talk about the volume, there is the volume, but y’all created it, you know what I mean? Like, I’m sorry — and it’s not you in particular, but you all — sometimes it feels like you have all reaped the benefit, but not the responsibility of having a platform, you know what I mean? And again, I think that a lot of the focus has been on Facebook on this, because it looks like, in their case, it’s a lot of sloppy management. That they just didn’t put the rules in place, or the tools in place, or things like that, and I do ...

Again, I get that it’s a massive problem, so what are the solutions to this? Because one of the solutions is regulatory. If all of a sudden tomorrow Section 230 of the Communications Decency Act was taken away from you all, you’d have to figure it out rather quickly, right? You’d no longer have that broad immunity. What would you do?

Well, for the last two years, we have been really, really focused on responsibility, and everything that we do ... Whenever we talk about growth, we always talk about responsible growth. Every change that we make, we’re thinking through the lens of responsibility. And if you look at the number of changes that we’ve made, it’s substantial.

I’m happy to talk about more, in all the different areas, whether it’s ... And you have to realize there are many areas. We just touched briefly on child safety. There’s misinformation, there’s foreign government interference, there’s hate. There are many different areas that we’re focused on, and we’ve made a lot of progress. And I want to say there’s more progress to be made, I 100 percent acknowledge it.

But we, in the last year, have built very significant teams to be able to address this, and to address it not just from a people standpoint, but, this is a technical audience, from a technical point of view, of building up classifiers and machines to be able to identify problems and remove that content.

So talk about the technical solutions, because that’s always been the ... “AI is going to fix it,” whatever. And Casey Newton just wrote a great piece in The Verge about Facebook’s content moderators, the people who go through Facebook videos, and the people who watch the conspiracy videos suddenly believe in conspiracies. They’re being badly affected, they’re having sex in the stairwells, it’s like crazy. They watch this stuff and they get warped.

Obviously, humans alone are not going to be how you figure this out, but you have to put humans on it, too. Talk about this, what could be done.

Well, we have to use humans and machines. And it’s the combination of using humans to generate basically what we’ll call the golden set or the initial set, the set that our machines can learn from. And then it’s the machines that go out and actually extend all this amazing knowledge that the humans have, and to be able to do that at scale.

So we have 500 hours being uploaded every minute to YouTube, and the only way to solve this, at the end of the day, is going to be with a combination of humans and machines. We actually started releasing a transparency report, and if you look at the transparency report, you can see how the machines have ramped up in the last year. We’re about to release the Q4 report, but in Q3, we removed almost 8 million videos. Of those, 75 percent were removed by machines, and of that 75 percent, the majority didn’t even have a single view. And so that shows you that when you can do this at scale, it really makes a difference.
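To make the humans-plus-machines workflow concrete, here is a minimal sketch of the “golden set” pattern Wojcicki describes, assuming a generic text classifier. The model choice, example comments, and thresholds are stand-ins for illustration; YouTube’s actual models are not public.

```python
# Sketch: humans label a small "golden set" of comments; a model trained on
# it then extends those judgments across the full comment stream at scale.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1. The human-generated golden set (tiny here; real sets are far larger
#    and continuously refreshed).
golden_comments = [
    "post your address in the replies",  # labeled violative by a reviewer
    "great video, thanks for posting",   # labeled fine
    "what camera do you use?",           # labeled fine
]
golden_labels = [1, 0, 0]  # 1 = violative, 0 = fine

# 2. Machines learn from the golden set.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(golden_comments, golden_labels)

# 3. The model triages new comments; humans only see the uncertain middle.
def triage(comment: str, remove_at: float = 0.95, review_at: float = 0.5) -> str:
    p = model.predict_proba([comment])[0, 1]  # estimated P(violative)
    if p >= remove_at:
        return "auto-remove"   # high confidence: the machine acts alone
    if p >= review_at:
        return "human review"  # uncertain: route to a person
    return "keep"
```

This division of labor matches the transparency-report numbers she cites: machines handle the high-confidence bulk, most of it before anyone has viewed the video, while humans supply and refresh the labels the machines learn from.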

But recommendations, for example. We just made a change to how we handle recommendations, where we have raters. The raters go through videos; we make sure they’re representative of all parts of the US, and we publish the guidelines. Those raters then identify a set of videos that might technically meet the requirements of our community guidelines, but come close to violating them. There’s about 1 percent of content that brushes up against the community guidelines. So what we do is identify a set of them with humans, and then we use machines and machine learning to expand that set, and based on that, we are basically very unlikely to recommend those videos.
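And here is a rough sketch of that recommendations change, under the same caveat: the borderline score, threshold, and penalty below are hypothetical stand-ins, not YouTube’s actual ranker. The key design point is that borderline videos are demoted in ranking, not removed.

```python
# Sketch: a model trained on rater labels scores how close a video comes to
# violating the guidelines; ranking then downweights flagged videos instead
# of taking them down. Threshold and penalty values are illustrative.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    relevance: float         # score from the main recommendation ranker
    borderline_score: float  # 0..1, from a model trained on rater labels

def rec_score(c: Candidate, threshold: float = 0.8, penalty: float = 0.01) -> float:
    # "Very unlikely to recommend": heavily penalize, but never delete.
    if c.borderline_score >= threshold:
        return c.relevance * penalty
    return c.relevance

candidates = [
    Candidate("a", relevance=0.90, borderline_score=0.10),
    Candidate("b", relevance=0.95, borderline_score=0.90),  # borderline
]
ranked = sorted(candidates, key=rec_score, reverse=True)
print([c.video_id for c in ranked])  # ['a', 'b'], despite b's higher relevance
```

The distinction matters for the exchange that follows: demotion governs what the algorithm amplifies, not what the platform hosts.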

I’ll get to recommendations, I have a personal beef with you about that.

Okay. I can’t wait to hear.

My son, who is 13 years old, started watching Ben Shapiro videos. And he’s like the gateway drug to the next group. And then it goes right to Jordan Peterson, then it goes down and in three clicks he was in Neo-Nazi stuff. It was astonishing. And then I had to listen to it at dinner. And I was sort of like, “I’m going to kill Susan Wojcicki first.”

Okay. Here I am.

But it was sort of like ... as I said, I think you’ve heard me say this: It feels like all you tech companies have built cities, these beautiful cities, but you decided not to initially put in police, fire, garbage, street signs, and stuff like that, and so it feels like The Purge every night. It’s a good joke, but it’s true. And then I’ve got this kid who’s like, “Well, Ben Shapiro’s sort of smart.” I’m like, “No he’s not! Not even slightly! He’s clever but he’s an idiot.”

Anyway, it’s just exhausting. But it has a huge effect on him. Do you feel like the people, on a bigger thing — and it’s not your total responsibility, Susan — but do you feel like the people in Silicon Valley have a sense of this, of the impact they have, and are capable of dealing with it? Or will regulatory measures just have to come into place? Because it’s already starting in Europe, it’s starting in California here with privacy bills, do you feel like you’re all able to do that?

We have already made a huge difference, and we will continue. And that will have a big impact on how our platforms work. And we actually use this analogy of a city too, where we feel that we started out as a smaller city, where people kind of all knew each other on the internet, and then very quickly, we grew into this major metropolitan city. So we’ve really ramped up: Google has committed to having 10,000 people working on controversial content, which we staffed last year. We have tightened our guidelines, and we have made very significant changes in how we handle our policies, like the recommendations change.

So getting to your son ...

He’s lost.

No, we can work on your son here, I have a son too and I get some of these discussions also at the dinner table. I think what you’re describing is — and the way we think about it, too — look, there’s a set of content that has to meet the community guidelines. Ben Shapiro is going to meet the community guidelines. I don’t think you’re suggesting that we remove him from the platform. Are you?

I would, but I can’t. No, no.

Okay.

You know, last time I saw you, I was like, “Get Alex Jones off that platform,” and you’re like, “Well the community guidelines,” and then you got him off.

No, he’s not on the platform.

So I was right, but ...

It was the terms of service, actually.

It was the terms of service. He broke the terms of service.

He broke our terms of service, yes.

But what I mean is, I’m getting to the larger picture, and then I do want to get to issues around diversity and other things.

Sure.

Do you feel like you all have the capabilities? Because it sometimes feels like, again, it’s a small group of people that’s deciding on a large ... is there enough diversity, is there ... within your management structure to do that, to make these decisions?

Well, diversity is a huge and important part of YouTube, and that’s what I love about YouTube is the fact that we’ve been able to tell so many stories. And that’s why in many ways we want to make sure that we’re protecting the freedom of speech but also enforcing the community guidelines at the same time. And I do think we’ll make very significant progress. I think we have made significant progress, and we will continue to do that.

I’m talking about in your management structure, do you feel like, you all live in a zip code of Palo Alto or whatever, do you feel like there’s ...

No, actually half of YouTube lives in San Francisco.

Well, that’s good.

Yes.

But do you feel like there’s been enough ... I mean, for a long while, Susan, it was you and Sheryl [Sandberg] that were it.

Yeah. Well, so actually I just ran some stats before we ...

That’s just women. I’m not talking about people of color and different ages and ...

Sure. I think it’s incredibly important. I’ve been really focused on diversity at YouTube and bringing more leaders, more women, and more people of color and underrepresented minorities to YouTube. I 100 percent agree this is essential. And I’m saying it from many perspectives. I think it’s important from a business perspective, because you need that point of view of representing everyone and understanding them, and it draws the best talent.

I just looked at my management team. When I got there, the directors at YouTube were about 15 percent women. Now it is double that, around 30 percent. I’m not saying that we’re at 50 percent, and women are 50 percent of the population, but in my leadership, that number has doubled.

What about the rest of the ...

But how do you press that at the rest of Google, because here you are doing this, and I want to also get to people of color, and ...

Well, YouTube is going to keep growing.

Right.

So I think of YouTube as a company within Google, and I have control of YouTube as the CEO. My goal is: Let’s make YouTube really diverse, and show what it is to have a diverse tech company, and then deploy different techniques, whether it’s bringing in people who are supporting underrepresented communities, increasing our recruiting at colleges we might not have gone to beforehand, or just making sure that it is a really inclusive place where the people who are there feel supported. And that also leaves a pathway for other tech companies; it leaves a pathway for Google, etc.

Now you’ve been around a long time, do you think it’s changed a lot? Because it doesn’t feel like it has. I mean, let’s be honest.

I think it has changed.

How?

Actually I think you’ve been helpful in changing it.

Yes, I know that, but I ... I can only say, “You fucking assholes” so many times before it loses ...

That’s helpful.

It’s helpful, but at some point, do you imagine it shifting? Because whenever I feel like it has, I see the same group of people. And what happens is, I think, for example at Facebook, so I’ll use another company, because I think you’re a more obstreperous group at Google: The cohesion within the group at Facebook led directly to what has happened there. They always bragged about getting along, and I was like, “Well, who’s the pain in the ass here?” and they did say, “You, Kara,” but I don’t work there, and I’m not a billionaire either. So what has to change within tech to do that?

Well, I wrote this Vanity Fair piece called, “Breaking up the Silicon Valley Boys Club.” And the first point I made was, it needs to come from the CEO level. And I’ve really seen, the CEO has to make it a priority. They have to say, “This really matters, I’m going to give it the resources, I’m going to meet with the underrepresented groups, I’m going to focus on having a diverse management team.”

I realize I’ve been at Google for 20 years, right? And so if I compare when I first started to today’s world, I think, A) there’s much more focus on how diversity really matters from a business standpoint. I think the technology is more diverse; we see it reaching more people. And I also think that with the culture, any kind of discrimination or sexism, people are much more aware of it and much less willing to tolerate it, on both sides: men and women, leadership and employees. I think part of that has been, sadly, that we’ve had #MeToo and all these different examples of companies that have not done a good job, and that has been really ...

Including Google.

Yes, including Google. But I’ll say the positive ... Although these were very negative events, and I’m so sad when I see that they happen, but the positive is it’s raised awareness of how companies cannot operate this and the consequences if they do. That is really important, to bring awareness that this is just not acceptable behavior in Silicon Valley.

All right. We’re going to finish up here. There’s no timer, so I can just keep going on. Three things I want to get to: disinformation, regulation, and the Google walkout.

Okay. Okay, great.

So the Google walkout, there’s people here ... I did a great podcast with a bunch of them. How did you see that within Google among the top ... You’re on the top management team at Google. You’ve just ended forced arbitration, correct?

Yes. Yes.

How did you look at that? Do you think ...

First of all ...

How many people here are from Google that were part of that? Do you all realize it’s a bigger deal or you were like, “It’s just some noisy employees?”

No, no. It was a big deal. It was a big deal. First of all, seeing the news was really upsetting. I think that was the sentiment of the entire management team, that it was really upsetting. The way I approach things is, “How can we make things better?” What I appreciate about Google is that we saw that people said, “I want to make this a better company. I’m upset. I want to go. I want to make change. I’m asking for change.” That means that it’s a company that people care about and that they want to make it a better place. So as a management team, we wanted to give people the leeway to tell their stories, to hear what wasn’t going well. We said, “Sure, if people want to go to the walkout, they should go to the walkout, and we’re going to listen to the stories, and we’re going to listen about how we can do better.”

I don’t want to say that everything is solved. It’s not. Right? There’s always going to be more to do. But if you look at the changes that we made really quickly, there were a lot of changes that were made quickly. The changes were, first of all, we started with not having forced arbitration around sexual harassment. That was the first change. We revamped the process of how it’s reported. I think people told their stories. And even though we had a process, we realized that process can be better and we made different changes there. We said we would report on it. We said, “Let’s have mandatory training.” We renewed our commitment to diversity. And there’s more. There’s ongoing work.

All right. Will you all, do you think, do you imagine, I know you’re not the person in charge of this, put an employee on the board of Google?

Yeah, so I’m not the person in charge of this.

Will you advocate for it? Do you think it’s a good idea?

Eh, I mean, I don’t know enough. I have never been on a board where there’s an employee.

No, it’s all white guys. I know.

No, there’s diversity.

Your board is pretty good. Your board is ...

Our board is pretty good. I’m also on the board of Salesforce and I think they have really good diversity there. That’s been great to see and great to see that commitment. So I don’t know. I mean, I’m not on the board and ...

What do you think about the idea of an employee on the board?

Well, it depends who the employee is, right? I mean, I’m an employee. I think it matters who the employee is. I mean, boards usually review plans. They don’t necessarily come up with the plan to be able to make that change. So I think having focus at the Google level, at the Alphabet level, as the team has currently been doing, is probably the right place to be able to make a lot of change. I don’t want to say there aren’t benefits of having an employee on the board ... I don’t know. I’ve never been on a board with an employee, so I just don’t know enough about that.

The last thing — I’m not going to be able to get to privacy because I’ve got a short amount of time — is contractors.

Contractors, yes.

Contractors around … This is important, because a lot of people here are contractors.

Yes, yes, yes. We have temps, vendors, and contractors: TVCs.

Should they be treated the same way and not be these sort of second-class citizens of Silicon Valley? This kind of employment is increasing, not just at Google, but everywhere.

Yeah, yeah. Yeah. First of all, the company, and many companies in Silicon Valley, need to employ TVCs to be able to scale and to ramp up expertise that the company itself may not have. Right? That could be anything from the bus drivers and people who are, I don’t know, working across security ...

Yeah, and...

... or are lawyers who are advising us on very complicated areas where we don’t have full expertise. It’s a broad range of expertise that we have there.

We do have a supplier code of conduct. We want to make sure that all of the suppliers are treating the people we employ through them properly, and that when we do employ them, they meet a certain level in terms of our supplier code of conduct. I think that is a really important leverage point: It causes them, as a vendor, to make sure that they’re meeting the right requirements, so that the TVCs, the temps, vendors, and contractors that we’re hiring, are getting fair and great working conditions.

All right, this is my last question. Do you believe this is Silicon Valley’s year of reckoning? This sort of has been a year of reckoning. Do you feel like the leaders get it? You know, I’ve always said to you, and I think you agree with me, that Silicon Valley has always been a mirror-tocracy rather than a meritocracy. Do you believe they get it, that perhaps they did not hang the moon and they really do have to make these serious changes? Do you think that, since you’re at the very top of it?

I think they get it. I mean, it’s been a year where we have been ... I think any good leader needs to step back and say, “How can we do this better?” There was a period of time, I’d say maybe two years ago, where we just said, “Look, something has to change here and we have to make a change.” I went to my team and we said, “Look at what we call our trust and safety area, and at how we handle this from a product standpoint. We need to have senior people. We need to have dedicated teams, and they need to be big teams with the best people on them. We have to have 100 percent commitment to responsibility and to solving these challenges.”

I’ve been at Google now for 20 years, and these have been hard years. These have not been easy years. We have a lot of challenges, and they tend to always happen when someone is out or when we’re on vacation. But it doesn’t matter. You have to come back, wherever you are in the world. You need to be there. You need to be on it. We’ve been working around the clock to solve these issues. I’m not saying we’re done. I’m not saying we’ve solved everything. But we have a 100 percent commitment to solve them. That commitment will lead to progress, and that will lead to better products for everybody, and I’m committed to doing it.

Great. Susan Wojcicki. [applause]

Okay. Thank you. Thank you.


https://www.recode.net/podcasts/2019/3/11/18259303/youtube-susan-wojcicki-child-comments-videos-google-walkout-kara-swisher-decode-podcast-interview