
Thread: Anti-Cybercrime Law

  1. #101
    Confidential

    By: Michael L. Tan - @inquirerdotnet Philippine Daily Inquirer / 05:09 AM April 04, 2018

    There’s a quiet cultural revolution going on in offices throughout the country, one that involves the very way we think. It has meant an audit, not of numbers, but of all kinds of procedures and technologies.

    I thought long and hard about a short title for my column that would describe what this cultural revolution is all about and settled on that one word, “confidential,” which is the way we used to handle matters, mainly around correspondence. For large offices, you would write or stamp “Confidential” on a document and seal it well before passing it on. To some extent that’s still being done today, sometimes with almost ridiculous but ineffective zeal.

    Look at the mail you’re getting at home and you’ll find that many envelopes are still stapled shut, so that you have to remove the staple wire before opening them or risk ripping through the document inside, which might be a check, now mangled, that the bank could reject when you try to deposit it.

    The other way documents were “secured” was to tape them end to end. Being on the receiving end of so many of these documents, I would tell my staff to call the sender and politely complain about how they had mummified the document, making it practically impossible to open.

    Now comes a law that could either worsen, or rationalize, the way we handle confidential information. The full name of that law would take an entire paragraph so I will just use the short one, the Data Privacy Act of 2012. It took some years to get a bureaucracy going, together with implementing rules and regulations, all to ensure the protection of personal information in all of the country’s offices.

    Overseeing this gargantuan task is a National Privacy Commission, which has been pressing hard on institutions to establish data protection offices that will ensure that all sensitive personal information is secured, and not just by stamping “confidential” and stapling or mummifying documents.

    Call centers and data

    The law was actually forced upon us by the business process outsourcing (BPO) industry, which employed 1.3 million Filipinos last year and counting. Most of them are in the contact subsector, better known as call centers, and they have access to all kinds of sensitive information provided not just by Filipinos but by residents of countries throughout the world. Remember the last time you called to clarify a credit card billing you thought was mistaken, and how they asked you about everything from your mother’s maiden name to your last credit transaction.

    Think of all the forms you’ve had to fill out over the years. To get some kind of insurance, for example, you may have been asked to submit to a full medical checkup where you were asked all kinds of questions about the causes of death in your family, and your own existing illnesses, surgical procedures … all the way up to your sex life: how many people have you had sex with, were they male or female, and what kind of sex did you have?

    All that information is stored in computer databases.

    Educational institutions have all kinds of sensitive data. I’m thinking of two offices in particular. There’s the health service’s medical records, and there’s the registrar, which still has the “jackets” — enrollment forms and, until recently, grades — of each and every person who has ever enrolled in UP, whether you finished or not. Diliman, being the main campus, has files that go back more than a century. I have a feeling many UP alumni would worry more about their grades, rather than their medical records, being kept private.

    The new law spells out penalties, including imprisonment, for leaking out sensitive personal information. Hold your breath as I mention some of them: “race, ethnic origin, marital status, age, color and religion, philosophical or political affiliation,” “health, education, genetics or sexual life,” “offenses committed or alleged to have been committed,” “social security, current health records, licenses or its denials, tax returns.”

    Each of those items could be further broken down when you think of what “sensitive” can mean. Under this law, we cannot provide the grades of a student to anyone, not even the parents, once the student has reached legal age (18), unless signed consent is given.

    Whistleblowers

    The reason I said all this involves a cultural revolution is that the idea of data privacy is almost nonexistent in our culture. The norm is to share information. Have a crush on someone and want to send a gift on his or her birthday? No problem, everyone knows what office and who in that office can get you that information.

    Birthdays seem relatively benign but think hard about it and you might be opening the doors to unwanted attention.

    Salaries are another example, a source of a lot of resentment within an office, so in UP Diliman we are now banishing the days of open pay slips. Instead, the pay slips come in a sealed security envelope, like the ones banks give you when you get a new ATM card with a password.

    Medical records are particularly sensitive. For years now, as dean and then as chancellor, I’ve had to sign all applications for reimbursement of medical expenses and I’ve always been uncomfortable at how the papers go through several offices and personnel, with all kinds of sensitive information. Culture comes into the picture with people, often with the best of intentions, telling friends, “Did you know Professor xxx has cancer? Maybe we can help raise funds for her.”

    Golden rule

    In other cases, malicious intentions may come in. It’s been so tiring having to respond to anonymous “sumbong” (whistleblowing) made to the government’s hotline, with distorted information that clearly came from access to official files in government agencies. These question everything from bonuses to out-of-town conferences, often out of a desire for character assassination rather than a concern over corruption. The Filipino term “maiinit na mata” (hot eyes) is an appropriate description of the sources of data breaches.

    “Data breach” is a term that will enter our vocabularies as the Data Privacy Act is implemented, and breaches are fairly easy given that so much information is now stored in computer databases. The Act requires that these databases be secured, with layers of passwords and other security measures, but even the best of systems will still be heavily challenged by culture, to the point where we may have to require all staff — secretaries, receptionists, even drivers — to sign nondisclosure agreements, meaning they are bound by law not to give out sensitive information, not just from computers and documents but from meetings and even conversations.

    But more than nondisclosure agreements, we will have to look for other ways to tackle cultural attitudes to privacy. It will mean, for example, reminding someone during a conversation that they just named someone and gave away sensitive information about that person. It will mean, too, periodic meetings discussing data privacy and breaches and the foundation of it all, the golden rule: Do not do unto others what you do not want done to you.
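
    To make the column’s “layers of passwords and security measures” concrete, here is a minimal sketch of what securing a single sensitive field at rest might look like, using Python’s cryptography library. The record layout and field names are invented for illustration; a real deployment under the Act would also need key management, access controls and audit logging.

    ```python
    # Minimal sketch: encrypting one sensitive field before storage.
    # All field names are hypothetical; this illustrates field-level
    # encryption at rest, not a compliance recipe.
    from cryptography.fernet import Fernet

    # In practice the key would live in a key-management system,
    # never alongside the data it protects.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    record = {
        "employee_id": "2018-0042",   # non-sensitive identifier
        "diagnosis": "hypertension",  # sensitive health information
    }

    # Encrypt the sensitive field; store only the ciphertext.
    record["diagnosis"] = cipher.encrypt(record["diagnosis"].encode())

    # Only a role with access to the key can read it back.
    print(cipher.decrypt(record["diagnosis"]).decode())  # "hypertension"
    ```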

  2. #102
    Digital privacy vs public security

    By: Drexx D. Laggui - @inquirerdotnet Philippine Daily Inquirer / 12:10 AM September 11, 2016

    THE #WarOnDrugs has reached a crescendo now that President Duterte has issued Proclamation No. 55 declaring a state of national emergency on account of lawless violence.

    According to open-source reports, elements of the illegal drug trade have stepped up their battle against the government by establishing a partnership with terrorists and kidnap-for-ransom organizations in their efforts to distract or discourage law enforcers from doing their jobs. The bombing of a Davao City night market on Sept. 2 is a heinous example of their criminal synergy.

    Criminals today have the advantage

    The #WarOnDrugs is not easy. The numbers are not in favor of our government. The 2015 Annual Report of the Philippine National Police revealed a volume of 201,010 for index crime, and 474,803 for nonindex crime, for a total of 675,813 reported crimes to the police alone.

    Index crimes include reports of crime against persons like murder, physical injury and rape, as well as crimes against property, such as robbery, theft, car theft and even cattle rustling. Nonindex crimes include illegal drug use, cybercrime, physical injury and damage to property.

    With regard to drug-related cases, conviction rates over the past five years have been poor, according to Justice Secretary Vitaliano Aguirre II. In 2015, there were 43,462 cases but only 782 convictions, a success rate of under 2 percent.

    What is not widely publicized is that when a serious crime has been committed, every law enforcer knows he has 48 hours to develop solid leads, suspects or arrests. Otherwise, the chances of solving the crime drop by half.

    To add to the challenge, there are now some 101 million Filipinos. Combining all law enforcement and military personnel and all those from other government agencies, there are only about 400,000 of them.

    In contrast, according to the Dangerous Drugs Board, there are roughly 1.7 million Filipinos engaged in illicit drug use, more than four times the number of our law enforcers and peace officers.

    Rethink crime fighting

    The time has come for our country to rethink the state’s crime-fighting processes and the tools required to support it. Our government needs to step back and see the big picture, to think unconventionally against enterprising criminals.

    Criminals have come to be as progressive as commercial businesses in adopting technology to improve their capabilities, to be more effective and efficient in the conduct of their crimes.

    For example, criminals have used the internet to conduct surveillance of their targets and plan their crimes, as well as to assess the net worth of potential victims and drug users to determine whether they are profitable targets.

    Wikimapia or Google Maps can give criminals the ability to assess escape routes or vulnerabilities of bank branches or homes of kidnap victims, while Facebook or Instagram or Twitter can yield great information about the lifestyle patterns, family and friends, or the financial capacity of the victims. Waze can even give them the fastest escape route from the crime scene.

    Most important for the criminals, however, is technology’s ability to let them conspire invisibly. Criminals have exploited technology to plan, communicate and commit crimes in the virtual world, meeting one another without the risk of physical presence and even sharing their profits, all online.

    Technology: secret weapon of the state

    Fortunately, technology can work for the law enforcers, too. The 101 million Filipinos have 110 million mobile devices.

    Find the phone, find the criminal.

    Nothing is more personal to the criminal than his or her mobile phone. The mobile phone knows more about an individual, about his secret life, than he would care to admit, or than his friends or family know. A mobile phone tracks one’s activity, location, relationships with friends or family or victims, favorite food, political affiliation, likes or dislikes, fears or fantasies, gender or sexual preferences, health, spending habits, travels, comments to friends and family that reveal sentiments and social influence, cars and home, fortunes and misfortunes, and can collect them all as big data.

    Psychometrics

    Analyzing big data involves a process called psychometrics, the measurement of a person’s mental traits, abilities and processes.
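
    As a toy illustration only, psychometric-style scoring amounts to combining normalized behavioral signals into a trait estimate. The features, weights and trait below are all invented; real psychometric models are fitted to large labeled datasets.

    ```python
    # Toy illustration of psychometric-style scoring from behavioral
    # data harvested from a phone. Every name here is hypothetical.

    def trait_score(features: dict, weights: dict) -> float:
        """Weighted sum of normalized behavioral features (0..1 each)."""
        return sum(weights[name] * features.get(name, 0.0) for name in weights)

    # Hypothetical signals, each scaled to the range 0..1.
    features = {"late_night_activity": 0.8, "location_variance": 0.6,
                "new_contacts_per_week": 0.9}
    weights = {"late_night_activity": 0.3, "location_variance": 0.3,
               "new_contacts_per_week": 0.4}

    print(f"risk-taking score: {trait_score(features, weights):.2f}")
    ```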

    Finding the phone to find the criminal means that there needs to be a continuous observation of a criminal in a place, the person or group that he or she interacts with, or any ongoing activity in order for the law enforcer to gather information about the criminal and the crime. Finding the phone means online surveillance and intelligence gathering.

  3. #103
    Vigilance for data privacy rights

    The concept of using high-tech tools for online surveillance is scary for everybody, and reasonably so. Edward Snowden expressed this succinctly on Reddit: “Arguing that you don’t care about the right to privacy because you have nothing to hide is no different [from] saying you don’t care about free speech because you have nothing to say.”

    Online surveillance is not a new idea or initiative, however.

    Section 12 of the Cybercrime Prevention Act of 2012 provides for real-time collection of traffic data. “Law enforcement authorities, with due cause, shall be authorized to collect or record by technical or electronic means traffic data in real-time associated with specified communications transmitted by means of a computer system. Traffic data refer only to the communication’s origin, destination, route, time, date, size, duration, or type of underlying service, but not content, nor identities.”
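
    To visualize the boundary the provision draws, here is a rough, purely illustrative sketch of the fields “traffic data” covers and the ones it excludes; the Python class and its field names simply follow the statute’s wording and are not from any real system.

    ```python
    # Sketch of the distinction Section 12 draws: traffic data
    # (metadata) versus content. Purely illustrative.
    from dataclasses import dataclass

    @dataclass
    class TrafficData:
        origin: str        # e.g. source IP address
        destination: str   # e.g. destination IP address
        route: str
        time: str
        date: str
        size: int          # bytes transferred
        duration: float    # seconds
        service_type: str  # underlying service, e.g. "email"
        # Deliberately absent: message content and user identities,
        # which Section 12 excludes from real-time collection.

    sample = TrafficData(origin="203.0.113.5", destination="198.51.100.7",
                         route="AS9299", time="05:09", date="2016-09-11",
                         size=512, duration=2.5, service_type="chat")
    print(sample)
    ```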

    In February 2014, however, the Supreme Court struck down this provision as unconstitutional. The court explained that online surveillance was not evil in itself; rather, the law was unclear on what “due cause” meant, and that ambiguity could lead the state to abuse the power and use it as a tool for general warrantless searches against anybody and everybody.

    The law has thus been interpreted to be enforceable only if there is a warrant issued “with specificity and definiteness,” so that our law enforcers would not be given unlimited surveillance powers. All that remains is for a criminal investigation to test this law, and that will not be far off.

    The Supreme Court decision is consistent with the provisions of the Data Privacy Act of 2012, which declares that “it is the policy of the state to protect the fundamental human right of privacy, of communication while ensuring free flow of information to promote innovation and growth.

    “The state recognizes the vital role of information and communications technology in nation-building and its inherent obligation to ensure that personal information in information and communications systems in the government and in the private sector [is] secured and protected.”

    The Data Privacy Act applies to everyone, protecting “individual personal information in information and communications systems in the government and the private sector.”

    At its core, this law prescribes appropriate jail time and fines for any violation against any person’s personal information, which “refers to any information whether recorded in a material form or not, from which the identity of an individual is apparent or can be reasonably and directly ascertained by the entity holding the information, or when put together with other information would directly and certainly identify an individual.”

    Sensitive personal info

    It goes even further by defining sensitive personal information and then prescribing even harsher penalties. Sensitive personal information includes anything:

    • About an individual’s race, ethnic origin, marital status, age, color and religious, philosophical or political affiliations

    • About an individual’s health, education, genetic or sexual life, or any proceedings for any offense committed or alleged to have been committed by such person, the disposal of such proceedings, or the sentence of any court in such proceedings

    • Issued by government agencies and peculiar to an individual, including but not limited to social security numbers, health records, licenses or their denial, suspension or revocation, and tax returns

    • Specifically established by an executive order or an act of Congress to be kept classified.

    However, recognizing that no human right is absolute, this law also states that it does not apply to any private or sensitive information that is “necessary in order to carry out the functions of public authority which include the processing of personal data for the performance by the independent, central monetary authority and law enforcement and regulatory agencies of their constitutionally and statutorily mandated functions.”
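
    Read together, the definitions above amount to a classification rule. A rough sketch follows, with category lists that merely paraphrase the statute; the function and field names are hypothetical.

    ```python
    # Rough sketch of the Act's sensitive-information categories as a
    # field classifier. Category contents paraphrase the statute.
    SENSITIVE_CATEGORIES = {
        "identity": {"race", "ethnic_origin", "marital_status", "age",
                     "color", "religious_affiliation",
                     "philosophical_affiliation", "political_affiliation"},
        "personal_history": {"health", "education", "genetics",
                             "sexual_life", "offenses",
                             "court_proceedings", "sentences"},
        "government_issued": {"social_security_number", "health_records",
                              "licenses", "tax_returns"},
    }

    def is_sensitive(field_name: str) -> bool:
        """True if a record field falls in a sensitive category."""
        return any(field_name in fields
                   for fields in SENSITIVE_CATEGORIES.values())

    print(is_sensitive("tax_returns"))       # True
    print(is_sensitive("office_extension"))  # False
    ```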

    Update antiwiretap law

    It is also interesting to monitor developments in today’s 17th Congress, specifically the five bills that have been separately filed by Senators Gregorio Honasan, Ping Lacson, Grace Poe and Sonny Angara. These seek to update Republic Act No. 4200, or the Anti-Wiretapping Law of 1965.

    Their proposals are unified in using the force-multiplying power of technology to go after the criminals in drug-related cases and those charged with plunder, kidnapping, money-laundering, robbery, piracy, rebellion, treason, espionage, provoking war and sedition.

    Privacy vs security and Equilibrium-Adjustment Theory

    Obviously, zipping along the fine line between privacy and security will be challenging across an undefined and foggy cyberterrain.

    To help the Philippines along, a navigational aid like the Theory of Equilibrium-Adjustment may be used by our government decision-makers. Back in 2011, professor Orin Kerr from George Washington University Law School proposed that a government balance the application of laws with the protection of human rights. This means that a government shall tighten or relax the law’s protections in response to changing technology and social acceptance.

    When new technologies expand law enforcement’s capabilities, the law does (and should) respond by placing new controls on the government; when new technologies give criminals the advantage, the law does (and should) respond by loosening the government’s restraints.

    Negative legal right

    To complement the Theory of Equilibrium-Adjustment, we must also embrace the paradigm that the Data Privacy Act is just one of the regulators of our human right to privacy. Once one realizes that the law is a negative legal right (i.e., it explicitly says what we must not do with respect to the rights of another person), one comes to the further realization that other, previously unrecognized factors affect our privacy rights and interests.

    These structural constraints include economic, physical and technological barriers, and are associated with costs that act as nonlegal regulators. These factors are described by professor Harry Surden of Stanford Law School in his essay “Structural Rights in Privacy.”

    New guide

    Combining both legal ideas above in the context of Section 12 of the Cybercrime Prevention Act and the Data Privacy Act, a new guideline emerges: If a proposed law enforcement system for real-time collection of traffic data makes it too cheap (in terms of financial cost, social acceptance, technology and logical controls) for our government to collect investigation data that would otherwise have been physically impossible or too expensive to gather, then the use of that high-tech system violates our expectations of privacy. Otherwise, it is acceptable.
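
    A toy encoding of that guidance follows, only to show the shape of the rule; the cost model and threshold are invented, and any real test would weigh social and technological costs that do not reduce to pesos.

    ```python
    # Toy encoding of the cost test derived above. Thresholds and the
    # cost model are hypothetical; the point is the shape of the rule:
    # surveillance that is dramatically cheaper than its physical-world
    # equivalent trips the privacy wire.

    def violates_privacy_expectations(system_cost: float,
                                      physical_equivalent_cost: float,
                                      threshold_ratio: float = 0.01) -> bool:
        """Flag systems that make collection 'too cheap' relative to
        what the same investigation would cost by physical means."""
        if physical_equivalent_cost == float("inf"):  # physically impossible
            return True
        return system_cost / physical_equivalent_cost < threshold_ratio

    # A dragnet that costs pesos per target, versus tailing suspects
    # in person:
    print(violates_privacy_expectations(system_cost=10.0,
                                        physical_equivalent_cost=1_000_000.0))
    # True: collection has become too cheap to square with
    # privacy expectations.
    ```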

    Defining and enforcing the privacy interests of the Philippines is not a one-time activity but a dynamic and contentious process. To paraphrase: our country’s privacy interest is not a destination, but a journey.

    Security vs security

    Referring to the landmark February 2014 decision of the Supreme Court, one will realize that we are not looking at the question of privacy versus security after all. Rather, it’s actually a question of security versus security.

    The question of security against criminals versus security against law enforcement abuse may be easier to answer. When law enforcers are empowered with “due cause” to collect or record traffic data in real time by technical or electronic means, do you trust that they have just reason or motive to do so?

    That they will conduct their online surveillance with faithful adherence to a lawful procedure all the time? Are you hopeful that the operational risks against the abuse of the online surveillance system have proper countermeasures? Do you have confidence that the countermeasures against abuse are sufficient and correct all the time?

    Only when the answer to all these questions is “yes” can the question of security versus security be reliably answered.

    (Drexx D. Laggui, principal consultant of Laggui & Associates Inc., conducts vulnerability assessments, internet penetration testing and computer forensics.)

  4. #104
    For the first time, Facebook spells out what it forbids

    Associated Press / 07:30 PM April 24, 2018

    NEW YORK - If you’ve ever wondered exactly what sorts of things Facebook would like you not to do on its service, you’re in luck. For the first time, the social network is publishing detailed guidelines to what does and doesn’t belong on its service – 27 pages’ worth of them, in fact.

    So please don’t make credible violent threats or revel in sexual violence; promote terrorism or the poaching of endangered species; attempt to buy marijuana, sell firearms, or list prescription drug prices for sale; post instructions for self-injury; depict minors in a sexual context; or commit multiple homicides at different times or locations.

    Facebook already banned most of these actions on its previous “community standards” page, which sketched out the company’s standards in broad strokes. But on Tuesday it will spell out the sometimes gory details.

    The updated community standards will mirror the rules its 7,600 moderators use to review questionable posts and decide whether they should be pulled off Facebook, and sometimes whether to call in the authorities.

    The standards themselves aren’t changing, but the details reveal some interesting tidbits. Photos of breasts are OK in some cases – such as breastfeeding or in a painting – but not in others.

    The document details what counts as sexual exploitation of adults or minors, but leaves room to ban more forms of abuse, should it arise.

    Since Facebook doesn’t allow serial murderers on its service, its new standards even define the term. Anyone who has committed two or more murders over “multiple incidents or locations” qualifies.

    But you’re not banned if you’ve only committed a single homicide. It could have been self-defense, after all.

    Reading through the guidelines gives you an idea of how difficult the jobs of Facebook moderators must be. These are people who have to read and watch objectionable material of every stripe and then make hard calls – deciding, for instance, if a video promotes eating disorders or merely seeks to help people. Or what crosses the line from joke to harassment, from theoretical musing to direct threats, and so on.

    Moderators work in 40 languages. Facebook’s goal is to respond to reports of questionable content within 24 hours. But the company says it doesn’t impose quotas or time limits on the reviewers.

    The company has made some high-profile mistakes over the years. For instance, human rights groups say Facebook has mounted an inadequate response to hate speech and the incitement of violence against Muslim minorities in Myanmar.

    In 2016, Facebook backtracked after removing an iconic 1972 Associated Press photo featuring a screaming, naked girl running from a napalm attack in Vietnam. The company initially insisted it couldn’t create an exception for that particular photograph of a nude child, but soon reversed itself, saying the photo had “global importance.”

    Monika Bickert, Facebook’s head of product policy and counterterrorism, said the detailed public guidelines have been a long time in the works.

    “I have been at this job five years and I wanted to do this that whole time,” she said.

    Bickert said Facebook’s recent privacy travails, which forced CEO Mark Zuckerberg to testify for 10 hours before Congress, didn’t prompt their release now.

    The policy is an evolving document, and Bickert said updates go out to the content reviewers every week. Facebook hopes it will give people clarity if posts or videos they report aren’t taken down. Bickert said one challenge is having the same document guide vastly different “community standards” around the world.

    What passes as acceptable nudity in Norway may not pass in Uganda or the US.

    There are more universal gray areas, too.

    For instance, what exactly counts as political protest? How can you know that the person in a photo agreed to have it posted on Facebook?

    That latter question is the main reason for Facebook’s nudity ban, Bickert said, since it’s “hard to determine consent and age.” Even if the person agreed to be taped or photographed, for example, they may not have agreed to have their naked image posted on social media.

    Facebook uses a combination of the human reviewers and artificial intelligence to weed out content that violates its policies. But its AI tools aren’t close to the point where they could pinpoint subtle differences in context and history – not to mention shadings such as humor and satire – that would let them make judgments as accurate as those of humans.

    And of course, humans make plenty of mistakes themselves.

  5. #105
    The Companies Cleaning the Deepest, Darkest Parts of Social Media

    We spoke to the documentary-makers behind 'The Cleaners,' the film about the people who take down content after you report it.

    This article originally appeared on VICE UK.

    Every minute of every single day, 500 hours of video footage is uploaded to YouTube, 450,000 tweets are tweeted, and a staggering 2.5 million posts are posted to Facebook. We are drowning in content, and within all of that content, there's undoubtedly going to be a chunk deemed offensive—stuff that's violent, racist, misogynistic, and so on—which gets reported.

    But what happens once you've reported that content? Who takes care of the next steps?

    Directors Moritz Riesewieck and Hans Block have made a documentary exploring exactly that question, and the answer is much more depressing than you might have imagined. The Cleaners got its UK premiere at Sheffield Doc/Fest in June, so I caught up with the pair shortly after to discuss how social media organizations are cleaning up the internet at the cost of others' lives.

    VICE: Tell me about what drew you to this subject and why you wanted to make a film on it.

    Hans Block: In 2013, a child abuse video appeared on Facebook, and we asked ourselves how this had happened, because that material is obviously out there in the world but not usually on social media sites. So we began to ask whether people were filtering the web or curating what we see. We found out that there were thousands of humans doing the job every day in front of a screen, reviewing what we're supposed to see or not see. We learned that a lot of the work is outsourced to the developing world, one of the main spots being Manila in the Philippines, and almost nobody knows about it.

    We found out very quickly when trying to contact the workers that it's a very secretive industry, and the companies try to stop the workers from speaking out. The companies use a lot of private policies and screen the workers' accounts to make sure that nobody is talking with outsiders. They even use code words: Whenever workers are working for Facebook, they have to say they are working for the "honey badger project." There's a real atmosphere of fear and pressure because there are reprisals for the workers; they have to pay a €10,000 [$11,672] fee if they talk about what they're doing. It's written into their nondisclosure agreements. They even fear they will be put in jail.

    But you managed to track down some workers and ex-workers, and gained their trust for the film. Are these guys moderating all content or just stuff that gets reported?

    Moritz Riesewieck: There are two ways the content is forwarded to the Philippines. The first is a pre-filter, an algorithm, a machine that can analyze the shape of, say, a sexual organ, or the color of blood, or certain skin colors. Whenever the pre-filter picks up on something that looks inappropriate, it sends that content to the Philippines and the content moderators double-check whether the machine was right. The second route is when a user flags the content as inappropriate.
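
    A rough sketch of the two routes Moritz describes, with hypothetical names and thresholds (the real pre-filter is proprietary and its workings are, as the film shows, secret):

    ```python
    # Sketch: an automated pre-filter and user reports, both feeding
    # one human review queue. Names and thresholds are invented.
    from collections import deque

    review_queue = deque()

    def prefilter(post: str, score: float, threshold: float = 0.7):
        """Route 1: a classifier score (nudity, blood, weapons...)
        above threshold sends the post for human double-checking."""
        if score >= threshold:
            review_queue.append(("prefilter", post))

    def user_flag(post: str, reason: str):
        """Route 2: any user report goes straight to the queue."""
        review_queue.append((f"flagged:{reason}", post))

    prefilter("post-123", score=0.92)
    user_flag("post-456", reason="violence")
    print(list(review_queue))
    ```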

    So this pre-filter algorithm is effectively capable of racial profiling people under the justification of what? Trying to detect terrorists or gangs?

    Hans: We've been trying to work out exactly how this machine is working, and this is one of the big secrets the companies have. We don't know what the machine is trained for in terms of detecting content. There are obvious things like a gun or a naked sexual organ, but some of the moderators told us that skin color is being detected to pick up on things like terrorism, yes.

    What happens when something is deleted? Is it just removed from the user who uploaded it or is it removed universally?

    It is taken down universally, although in one case it's different: child pornography. Whenever content moderators review child pornography, they have to escalate it and report the IP address, the location, and the name of the user—all the info they have, basically. This gets sent to a private organization in the States, which analyzes all the info and forwards it to the police.

    Are the content moderators adequately trained? Both in understanding the context of what they are reviewing and also in terms of the significance that some of their decisions can have?

    Moritz: I would say this is the biggest scandal about the topic, because the workers are very young; they are 18 or 19 and have just left school. [The companies] are recruiting people off the street for these roles, requesting only a very low skill profile, basically the ability to operate a computer. They are then given three to five days of training, within which they have to learn all the guidelines coming from Facebook, Google, YouTube, etc. There are hundreds of examples they have to learn. For example, they have to memorize 37 terror organizations—all their flags, uniforms, and sayings—in three to five days. They then have to give the guidelines back to the company, because the company is afraid someone will leak them.

    Another horrible fact is that the workers have only a few seconds to decide. To fulfill the quota of 25,000 images a day, they get three to five seconds on each. You're not able to analyze the text of an image or thoroughly make sure you're making the right decision when you have to review so much content. When you click, you then have another ten options to click, based on the reason for deletion—nudity, terrorism, self-harm, etc. The companies then use the content moderators' labeling to train the algorithm; Facebook is working very hard to train AI to do the job in the future.
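
    How moderator clicks become training data can be sketched roughly like this; the label set paraphrases the "ten options" mentioned above, and the storage format is invented:

    ```python
    # Sketch: each moderator decision doubles as a labeled example
    # for a future classifier. Labels and format are hypothetical.
    DELETE_REASONS = ["nudity", "terrorism", "self_harm", "violence",
                      "hate_speech", "spam"]  # illustrative subset

    training_set = []

    def record_decision(content_id: str, deleted: bool, reason=None):
        """Append one moderator decision as a labeled example."""
        label = reason if deleted else "keep"
        assert label == "keep" or label in DELETE_REASONS
        training_set.append({"content_id": content_id, "label": label})

    record_decision("img-001", deleted=True, reason="self_harm")
    record_decision("img-002", deleted=False)
    print(training_set)
    ```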

    So the workers are providing the training to an algorithm that will eventually take their own job?

    Hans: It won't be possible for AI to do that kind of job, because it can analyze what is in the picture, but what is necessary is reading the context, interpreting what you are seeing. If you see someone fighting, it could be a scene from a play or a film. That sort of thing is something a machine will never be capable of.

    Are the workers trained for the severity and trauma of what they are going to see—death, abuse, child pornography, etc?

    Moritz: They are not trained for that. There is one moderator in the film who only properly realized what she was doing there on her first day, after training and signing the contract. She ended up reviewing child pornography on her first day and told the team leader she was unable to do this, and the response was, "You've signed the contract. It's your job to do that." There's no psychological preparation for them to do their work. They have a quarterly session where the whole team is brought into one room and a psychologist asks, "Does anyone have a problem?" Of course, everyone is looking at the ground and afraid to talk about their problems, because they are afraid to lose their job. There's a reason Facebook outsources this work to the Philippines: there's huge social pressure attached. The salary is not just the worker's own—it often supports a whole family of eight to ten people. It's not easy to leave the job.

  6. #106
    ^^^ (Continued)

    What's the scale of the operation out there?

    We can't know the exact number, but we think that, for Facebook alone, it's around 10,000 people. If you then add in all the other companies that are outsourcing this work to the Philippines, it's around 100,000. It's a big, big industry.

    Bias is something that is explored interestingly in the film. You have a content moderator who describes herself as a "sin preventer" and is very religious, while another person is very pro-President Rodrigo Duterte and his violent anti-crime and drugs stance. Are they imparting their personal views, politics, and ethics onto something that should be objective?

    Hans: Whenever Facebook is asked a question about the guidelines, they like to claim that the guidelines are objective and that they can be executed by everyone. That's not true, and that's what the film is about. The cultural background you bring matters a great deal. There are so many areas in the guidelines that require you to interpret and use your gut feeling to decide about what you see.

    Religion is a big part of the Philippines; Catholicism is very strong. The idea of sacrifice is crucial in their culture. The idea is that sacrificing yourself to get rid of the sins of the world will make it a better place. So, for a lot of the workers, it is viewed as a religious mission. They use religion to give the job meaning, and that helps them do it a bit longer, as when you're traumatized through work you need to find meaning. On a political level, Duterte is very strong there and people believe in what he is doing. Almost all the content moderators we spoke to were really proud he won the election. Some of the people see this work as an extension of his work, so they will just delete what they don't like in line with the country's political views.

    All eyes are on Facebook at the moment post-Cambridge Analytica. How do you think things are going to pan out in the area of content moderation?

    Moritz: Zuckerberg, whenever he is asked in a testimony, says he will hire another 20,000 people across the network to work on content safety. But this is not the solution. It's not about the number of workers—you can hire another 20,000 low-wage Filipino workers and it won't fix any problem with online censorship. What he needs to do is hire really well-trained journalists. Facebook is no longer just a place to share vacation pictures or invite someone to your birthday; it's the most important communication infrastructure in the world. More and more people are using Facebook to inform themselves, so it's very important who decides what is published. Imagine if someone else decided what was published on the Guardian's front page tomorrow—that would be a disaster and a scandal, but this is the status quo at the social media companies right now. This has to change, but change costs money, and the only goal of the company is to make more money... that's why they hire low-wage workers in the Philippines.

    Some of the moderation requests, such as terrorism, are even coming directly from the US government, right?

    Absolutely. The list of terrorist organizations that have to be banned on Facebook comes from Homeland Security, but obviously, in different parts of the world, there are very different ideas of who is a terrorist and who is a freedom fighter. This is a lie Facebook has always told: that it is a neutral platform, just a technical tool for the user. It's not true; they are making editorial decisions every day.

    How did you find the people who had left this job were getting on? Are they forced into a silence or tracked or anything?

    Hans: When we were researching, there was one moment when the [content moderation] company was taking photos of us and they were then distributed to everyone in the company—even the former workers—with a warning that talking to us will lead to them being in trouble. There's a really big atmosphere of fear that the company is spreading. One employee contacted us via Facebook, and he was really angry with us and telling us to leave the city, or something will happen to us. So even the former workers are still scared of speaking after leaving. We had lawyers on our team to protect them, however; we knew what was written into their contracts and what we could say in our film.

    One of the greatest tragedies captured in the film is that it tells the story of a worker who died by suicide. As far as you're aware, was that an anomaly or is that happening on a larger scale and not being reported?

    Moritz: It is a wider thing happening in the industry. The suicide rate is very high. Whenever we spoke with content moderators, almost everyone knew of a case where someone committed suicide because of the work. It's a problem in the industry. That's why it was important to include that in our film. They need to hire proper psychologists to protect these workers. The man in question who took his own life worked in reviewing self-harm content and had asked to be removed from doing that several times.

    Are these social media companies aware that people are killing themselves over this work, do you think?

    Hans: Good question. Yes, I think they do know because we made the film, but this is also a problem with outsourcing; it's so easy for Facebook to say, "We don't know about that because it's not our company and we are not responsible because we don't hire them and we're not responsible for the working conditions." This is the price we, and Facebook, are paying for cleaning social media—people killing themselves. We have to pressure these companies as loudly as we can.


 