No violent organisation can use our network: Facebook

Monika Bickert, head of Facebook’s product policy, is responsible for the team that sets content standards on the world’s largest social media site, with over 1.3 billion users. She decides what people can post and say, and guides the enforcement of these policies. Every time a piece of content is reported to Facebook, it is her team’s job to review it and apply those standards. Excerpts from an exclusive interview with Nandagopal Rajan.

Q: Is it a challenge keeping Facebook clean?
A: Our job is to provide a space for people to connect and share things that are important to them, and at the same time make sure that we are keeping those people safe and free from abuse. That is extremely challenging given the size of our population and its global diversity. It is something we are very committed to, and we are working hard every day to get it right and get better at it.

Q: Is there some part that is especially challenging and is it getting tougher to do your job these days?
A: There are two primary difficulties with the job. People come from very different backgrounds and legal systems. They sometimes have different ideas about what is okay to share and what is not. But we have to have one set of standards to apply across the globe for the entire community. We have to do that because we want them all to have the ability to interact across borders with the same speech. We are trying to draw these lines for what type of content is acceptable, and that becomes very challenging.

The second challenge is that we have 1.3 billion people, and that means we do get a lot of reports. We want to be able to respond to those reports efficiently and consistently. We want to make sure we are making the same decision about a piece of content whether it is reported from India or Mexico, wherever the reviewer is based. For that reason we have to craft our policies to be very objective. We can’t tell our reviewers to take down a post that is tasteless or impolite, because people will have very different ideas of what that means. This is why the policies at times have to be more blunt than we would like them to be.

Q: How does the reporting system at Facebook work?
A: We want people to tell us when something is not right on the site. That is why we make it so easy to report content. In fact, you can report any piece of content on Facebook, whether you are using the network on desktop or mobile. When someone reports a post, it is sent to an employee who has the language skills and training to apply our content standards. For instance, if someone reports a threat of self-harm, that post goes to a person who has training in dealing with such instances and understands the language it is posted in. After we review the content and decide whether or not it violates our policies, we either remove it or leave it on the site. In either case we send a message to the person who reported it.

Q: Is there a time frame in which the process is completed?
A: It really depends on what is being reported. Of course it is important that we respond in time, but it is more important to us that we get it right. We know we will not always get it right. There are real people reviewing the reports, and they will make mistakes from time to time. But our accuracy rates are very high and we are proud of that. One of the things we do to ensure a fast turnaround is have reviewers around the world. We actually have around-the-clock review of posts.

Q: But isn’t the time taken to respond especially important in cases of self-harm?
A: I am very happy that you asked this question, as I am very proud of what we do to keep people safe from instances of self-harm. If somebody reports a threat of self-harm, the first thing we do is provide the person who reported it with resources they can use to reach out and help the person. That happens automatically. The next thing we do is prioritise the review of this content. Any time we think a person’s life is in jeopardy, as with self-harm or terrorism-related content, that type of report goes to the front of the queue. We also have a network of safety organisations around the world, and we might reach out to them to keep the person safe.

Q: We have had recent instances of terror groups using social networks to reach out to people and propagate their message. Is there a way to pre-empt this?
A: We have taken a very strong stance against terrorism. If you look at our community standards, we say clearly that we do not allow violent organisations or members of those organisations to use Facebook. If we find that they are using Facebook for any reason, even to talk about something personal, we will remove them from the site and ensure that they do not come back. Also, we do not allow content that praises or supports terror organisations or their acts. We will absolutely remove that content.

Q: Terror changes colour across the world. So, are you looking at all terror organisations through the same lens?
A: We understand that the landscape is always changing and new violent organisations do emerge in locations around the globe. So any group that has engaged in violent acts anywhere in the world is banned from Facebook.

Q: How do you tackle cases of cyber bullying or harassment?
A: These are difficult issues, and we rely heavily on our community to tell us what is going wrong. Fortunately, with a community as large as ours, it is basically the world’s largest neighbourhood watch programme. And people do tell us when they see something going wrong.

Q: Where does the offline legal system kick in? What is the threshold for that?
A: There is behaviour online and behaviour offline. We see only the behaviour online and that often does not tell the whole story. There will always be situations when we will be unable to determine what is going on. That is why we interact a lot with safety groups who deal with everything from bullying to terror.

Q: What happens when someone tries to impersonate a person, especially a celebrity?
A: It is common for people to create a page to discuss or celebrate a public figure. But we have to ensure that no one is misleading others into believing that a page belongs to a public figure when it does not. So we do two things: first, we remove pages that are impersonating a public figure; and second, if it is a fan page or a page that is discussing that person, we place a tag next to it saying ‘unofficial’.

Q: Do you get a lot of content take down requests from governments?
A: We publish a transparency report that has all the information on this. We do get requests from governments. Sometimes the content is in violation of our standards and is removed. At times the content is within our standards but is nevertheless illegal in that country. In such cases we make the content unavailable within that geography.

Q: Isn’t it tough to apply the same scale for content across the globe, given that you have different sensibilities and sensitivities?
A: It is extremely challenging. We know that people might have different ideas about what is acceptable. That means they might be upset by content or vehemently disagree with it. But that is not necessarily a bad thing. Facebook is a powerful tool for speaking up when you see something that is inaccurate or offensive. We have seen instances where somebody will say something ignorant and others in the community will speak up against that. That kind of counter-narrative can be extremely effective in raising the overall awareness of society about what is going on. Tools like Facebook can be powerful instruments to bring in positive social change.

Q: Do you also interact with security agencies of countries and use their information to tackle terror for instance?
A: We have a number of safety organisations, NGOs, community groups and academicians that provide their insights to us on a number of different policies. Our ears are always open, but the biggest source of information for us is our community. We listen to this feedback and that drives the continued refinement of our policies.
