Facebook Chief Operating Officer Sheryl Sandberg apologized Thursday for the social media giant’s data breach and admitted the company failed to do enough to protect the data of tens of millions of its users.
Yet she said the company does not know whether political consulting firm Cambridge Analytica, which used Facebook data to target voters in the run-up to the 2016 elections, still possesses user data from the company, and if so, what the data is.
“We were given assurances by them years ago that they deleted the data. We should’ve followed up. That’s on us. We are trying to do a forensic audit to find out what they have,” Sandberg said.
Cambridge Analytica’s misuse of Facebook data, which may have affected up to 87 million users, according to a blog post from the company this week, has sparked widespread anger at Facebook and its founder, Mark Zuckerberg, who will testify before Congress next week about how the company protects user data.
In an interview with the PBS NewsHour’s Judy Woodruff, Sandberg acknowledged the company “under-invested” in the safety and security of user data. Sandberg said Facebook is now working to rectify that.
“We were very focused for the last 10 years on building on social experiences [but] we were not focused enough on the possible misuses of data,” Sandberg said. “What we are doing now is looking much more holistically at all the ways Facebook data is used and making a lot of proactive changes.”
Other highlights from the interview:
John Yang:
Just a little while ago, Judy sat down with top Facebook executive Sheryl Sandberg.
Judy Woodruff:
Sheryl Sandberg, thank you very much for talking with us.
Sheryl Sandberg:
Thank you for coming to Facebook.
Judy Woodruff:
So, Facebook acknowledged yesterday that most of your two billion users could have had their profile, their personal profile information harvested, stolen.
That is a stunningly damaging piece of information, isn’t it?
Sheryl Sandberg:
Well, let’s be clear what happened here.
In this case, we had a feature that enabled you to find your friends. You could find your friends by their name, or their e-mail or their phone number. That was a good use case and really important to a lot of people.
Some people who shouldn’t have done so scraped that data and made a directory of it. But what matters here is that all the information they received was already public. They didn’t scrape any private data.
So it was information people had already listed on Facebook publicly. Now, that doesn’t make it OK. We shut down this use case. We are glad we found it. But it wasn’t private data for all of those people.
Judy Woodruff:
So was damage done by this or not?
Sheryl Sandberg:
Well, this plays into the overall situation we’re in, which is people not trusting how data is used on Facebook. And we know that we didn’t do a good enough job protecting people’s data.
I am really sorry for that, and Mark is really sorry. And we’re taking strong action. In fact, that announcement is part of the strong action we’re taking. We announced two weeks ago that we were going to take a very broad look at how Facebook data was used.
We were going to find problems, shut them down and tell people about it. And that is why that announcement happened. But that’s not all we are doing. We have shut down many other use cases in groups, in events, in pages and search. And starting Monday, we will begin rolling out to everyone in the world on Facebook at the top of their news feed a very clear and easy way to see what apps they have shared their data with and an easy way to delete those.
And as part of that, we are going to let people know if their data might have been accessed by Cambridge Analytica.
Judy Woodruff:
And I want to ask you about that, but why weren’t these steps taken sooner? And is there any chance that right now data is being lifted, taken, harvested, that people don’t want?
Sheryl Sandberg:
It’s a really important question.
And I think the answer on why this was — didn’t happen sooner really goes back to what Facebook was trying to do. We were very focused for the last 10 years on building social experiences.
And those are important. Those are why your friends know it’s your birthday, why you can share playlists. But we were not focused enough on the possible misuses of data. When we saw specific problems, we shut those specific problems down.
So in the Cambridge Analytica case, the friends-of-friends sharing that enabled that, we shut that down in 2015. But what we didn’t do until now — and, to be clear, we are late — but what we’re doing now is looking much more holistically at all the ways Facebook data is used and making a lot of very proactive changes.
Judy Woodruff:
But on Cambridge Analytica, how certain are you that that data is — is destroyed, that it’s not available to anyone anywhere for use anymore?
Sheryl Sandberg:
With Cambridge Analytica, we don’t know what data or if they have any data at all right now. We were given assurances by them years ago that they deleted the data. We should have followed up. That’s on us.
We are trying to do a forensic audit to find out what they have. We started that. The U.K. government is now doing their own investigation. They get precedence. So, we are waiting. We don’t know at all what data they have.
The 87 million people we notified are people whose data might have been accessed by Cambridge Analytica. So we’re giving the most conservative possible estimate and notifying those people. But once we do our audit, if we can hone it and be more specific, we certainly will be.
Judy Woodruff:
Cambridge Analytica, of course, working with the Trump campaign for president. To what extent did Facebook play a role in electing Donald Trump?
Sheryl Sandberg:
The questions on this election, they are big and they are deep. And, certainly, we have done a lot of soul-searching on the role we played with the foreign interference that we didn’t see or catch early enough in our election.
I think people are going to be trying to answer that question for a long time. And it is an important question. We are very focused on learning the lessons and applying them. So you might have seen this week we took down another 270 pages from the Russian IRA.
Those pages are in Russian.
Judy Woodruff:
This is the Internet Research Agency.
Sheryl Sandberg:
The Internet Research Agency.
Now, that is the same organization that tried to interfere in our election and did put content on our Web site and interfered.
We were too slow then, but we found these pages now proactively and we took them down. And some people say, well, these were in Russian, targeted to Russia. And our point is clear. This is a troll farm. This is completely deceptive information, and there is no place for it on Facebook, in the United States, in Russia, anywhere in the world.
We found this. And we are proactively going to find things like this in other parts of the world.
Judy Woodruff:
Do you believe Facebook played a role in the Trump election?
Sheryl Sandberg:
Well, certainly…
Judy Woodruff:
In electing Donald Trump?
Sheryl Sandberg:
Look, certainly, every candidate at every level used Facebook. We also registered two million people to vote. So, of course we played some role in the election.
But what that role was and how that was influenced is something that people will study for a long time, and those are very important questions.
Judy Woodruff:
But, in a way, isn’t the horse out of the barn? You have now lost the confidence of many of your users, who are saying they don’t know whether they can trust Facebook, that they have to be careful about what they post.
How do you win back the trust that you had from so many people?
Sheryl Sandberg:
Trust is a really important thing. You are asking a really important question.
And, you know, for me personally, the fact that people would not trust us, I take responsibility for that. And that hits hard.
And here’s what we are doing. We are shifting the way we think about running our company. We are finding the problems ourselves. The problem you started with, and everything we announced this week, we found ourselves, because we’re taking a proactive approach.
I am not going to sit here, Judy, and tell you there won’t be future problems. There will. We are at the beginning of what is a comprehensive review. We are trying to work quickly, but we are trying to work thoroughly.
So we are going to announce more things. People are going to continue to find more things. But here is our commitment. Our commitment is that this isn’t a one-time change or a one-time exercise. This is ongoing, because security and safety are an arms race. You build something, someone tries to abuse it. Then we’re going to build the next thing, and someone is going to try to abuse it.
And we are going to take a much more proactive stance. We are investing to the point that it changes our company’s profitability, so that we can get ahead of that.
Judy Woodruff:
You are saying you didn’t focus on some of these problems soon enough.
Some people, though, look at this and say, this raises questions about the whole Facebook business model, the concept of Facebook in the first place, because you are basically saying to people, build this great community up, communicate with all your friends and family, and we are going to make that information, some of it, available to our advertisers.
There is a disconnect there. People are looking at that and saying, can the two things stay — coexist? Can you have something that is a community and a commercial venture at the same time, where people are profiting off that information?
Sheryl Sandberg:
It’s a critical question, and I’m glad you asked.
We believe very deeply in our advertising model, because, just like TV, it’s what enables us to make this product available to people all around the world for free. Two billion people use the product. If it weren’t advertising-based, most of those people would not be able to.
But then the deeper question you are asking is, can we run an ads business where we serve targeted ads in a way that protects people’s privacy? And the answer to that is a very clear yes. We have always built privacy into our ads model.
We do not sell data or give your personal data to advertisers, period. What happens on Facebook is, someone wants to advertise. We are able to show targeted ads that you will hopefully be interested in without passing any personal data.
Our commitment to that remains very strong. And we believe that being able to offer a free service is very important for the community we build.
Judy Woodruff:
Well, we know clearly some good has come out of what Facebook does. At the same time, you have got critics out there saying you, Mark Zuckerberg, your CEO, and others in the leadership of Facebook let your success go to your head, in effect, that you were doing so well, you were growing so fast, making so much money, that you forgot about one of the essential promises you made to your users.
And that is their privacy. How do you explain what went wrong inside the leadership, the thinking of Facebook?
Sheryl Sandberg:
Well, we made big mistakes, and we know that.
And I think it really is that we were very focused on social experiences, and pretty idealistic, that we believed in a world where people could share and experience things together. And we just weren’t thinking enough about the bad use cases. And that’s on us.
We are learning those lessons. So, for example, if you think about fake news on Facebook, it happened quickly, and we weren’t doing enough. Now we are. We are now working very closely with third-party fact-checkers, so that we can identify things as false.
Judy Woodruff:
But, last fall — and I will just interrupt quickly — Mark Zuckerberg said sometime last year that it was crazy to think the Russians were using Facebook.
It turned out to be the case. I mean, there is a — the appearance is that Facebook didn’t want to see some of the problems until it absolutely had to see them.
Sheryl Sandberg:
Well, Mark apologized for that comment. He knows it was way too flip.
And we are taking strong action. So going back to fake news, what we are doing now is that if someone is about to post something that is false, we warn them, hey, our third-party fact-checkers have said this is false. If you have posted something, we go back to you and warn you.
We dramatically reduce the distribution. And we have a partnership set up with the AP in all 50 states, ahead of the U.S. midterm elections, to mark false news.
Judy Woodruff:
To people watching all this, though, Sheryl Sandberg, what do you say to folks who say, well, it happened before at Facebook, several big mistakes; how do we know there won’t be another one?
What are you changing inside the way you make decisions inside this company?
Sheryl Sandberg:
Well, I’m never going to sit here and say there won’t be more mistakes, and I’m never going to sit here and say there won’t be content we don’t want on Facebook.
There are two billion people who post every day. We have a no-hate policy, but someone is going to post a bit of hate, and we are going to work hard to get it down.
But here is what I will say, that we are fundamentally shifting the way we think about this. We are no longer just trying to build social experiences. We are also — I mean, we always were concerned about privacy, but not enough. We are also taking more proactive steps to get ahead of the possible misuse.
And you are already seeing us do that.
Judy Woodruff:
And what about the bigger question out there, one of the bigger questions out there, which is, should one private company have control over this many interactions, two billion people, growing? How much of a role is there for government?
I mean, you and Mark Zuckerberg have said you are open to some government regulation. How much are you open to and how could — how far are you willing to go to let an outsider or even a competitor come in and change the fact that you are dealing with a massive number of humans on this planet?
Sheryl Sandberg:
We have given a lot of thought to that.
And we do operate under lots of regulation all over the world. We are in a lot of dialogue now, and always, but particularly now, because there are big questions out there about what role tech companies should have, particularly at our size and scope.
We’re not just open to regulation. We’re moving ahead of it. So, the most likely regulation in the United States is the Honest Ads Act. It may or may not pass. We have already built the tool. It’s live in Canada. It will be live in the U.S. before the election.
And what it means is that anyone can look at any page on Facebook and see all the ads they are running. And for election ads, you will be able to see how much was spent, who paid for it and the demographics.
And we are going to build a four-year lookback. Going forward, we have to start showing the data, so, four years from now, there will be four years of data. That is completely industry-leading transparency. And that, we are doing now. We are open to the regulation as well, but we can’t wait.
We have to do more now.
Judy Woodruff:
But for the critics who say Facebook needs competitors, it needs to be taken over by somebody else, it needs much more regulation, what do you say?
Sheryl Sandberg:
I say all of those are open questions. We will see what happens. We obviously have competitors we compete with.
But the most important thing I say to them is that we understand that we were behind. We are getting much more proactive. You are already seeing that from us this week. It’s never going to be perfect. This is an arms race.
We are going to build something. Someone is going to try to abuse it. We are going to try to get all the hate content off. And we are doing better and better. But what you are going to see from us is a real commitment and a real belief in what we do every day.
Judy Woodruff:
How hard has this been?
Sheryl Sandberg:
You know, it should be hard, because we have a really big responsibility here. We know that.
Judy Woodruff:
Sheryl Sandberg, COO of Facebook, thank you very much.
Sheryl Sandberg:
Thank you for coming here to be with me.