
She was called the n-word and given instructions to slit her wrists. What did Facebook do?



Or you can tell a mother that you hope her son is raped and not be kicked off the world's most popular social media platform. Or you can tell a mother whose 5-year-old daughter has died that "if your children continue to die, it's God trying to say you don't deserve them."

You can write to this mother: "F**k you c**t. are you dead yet c**t? fingers crossed. we spit on c**ts like you. c**t dog. you die c**t. you piece of sh*t c**t dog. You are ignorant dumb dogs. Die c**t."

You can still stay on Facebook.

These are the findings of a six-month CNN investigation into the bullying and harassment that parents, doctors, and others who advocate for vaccination have faced on Facebook.

Facebook says it is conducting its own investigation following CNN's findings.

Facebook: We do not tolerate bullying and harassment

Facebook officials – already under fire for the platform's role in the 2016 election, among other issues – say they do not tolerate bullying and harassment.

"We have a responsibility to keep people safe on our services – whether from terrorism, harassment or threats," Facebook CEO Mark Zuckerberg said last November.

"Bullying and harassment happen in many places and come in many different forms," according to Facebook's community standards. "We do not tolerate this kind of behavior because it prevents people from feeling safe and respected on Facebook."
Her son passed away. And then the anti-vaxxers attacked her.

The CNN investigation called those claims into question. Our investigation found that Facebook sometimes allows users to stay on its platform even when they repeatedly violate Facebook's standards for bullying and harassment and verbally abuse others in the most hateful and violent ways.

For example, a woman who received a message calling her the n-word and telling her to slit her wrists reported the message to Facebook. Facebook determined that the message violated its community standards and that the sender was a repeat offender – but the sender was still allowed to remain on the platform until CNN began asking questions.

"It's terrifying," says Andrew Marantz, author of the new book "Antisocial."

"Facebook likes to call itself a community, and if they want to keep that promise, they should, at a bare minimum, try to protect people from things like this."

Zuckerberg says his company needs to do more.

"We have a lot of work to do about bullying and harassment," Facebook's chief executive told reporters in May.

"I was scared"

Odom, who received the n-word message, agreed.

Odom, a mother of three young children in Austin, Texas, urged others on Facebook to get vaccinated to help protect people like her brother, who has Crohn's disease and can't get certain immunizations.

She thought she was used to attacks from anti-vaxxers, but in April Odom logged into Facebook Messenger and read a message that made her physically ill.

"Can we immunize babies [sic] from becoming ignorant n****** like you?" a user wrote to her, spelling out the n-word. "You uneducated tw*t, here's something you need to learn … How to cut those wrists properly. Kill yourself, kill your kids, kill your parents, kill everyone."

Then the user gave illustrated instructions for how to do it.

"I didn't know if it was a threat. I was scared," Odom said. "I called my husband and sent him a screenshot, and he was scared too."

Odom immediately reported the message to Facebook. Not only did the message violate Facebook's standards, but the user had violated them before, according to a Facebook spokesperson.

The user's penalty: for 30 days, she was not allowed to send messages on Messenger, according to a Facebook spokesperson. Otherwise, she was allowed to continue participating and posting on Facebook as usual.

Odom said she was "fired up" by Facebook's response.

"Calling someone a horrible racial slur and telling them to kill themselves is not OK," she said.

In August, four months after Odom reported the message, CNN sent it to Facebook and asked why the user had not been removed. Facebook removed the user's profile in September.

"We investigated the account further and decided that action needed to be taken on it," a Facebook spokesperson wrote in response.

The spokesperson did not explain why Facebook had not removed the user on its own. The spokesperson, who spoke on condition of anonymity, declined to make a Facebook executive available for an interview for this story.

CNN Investigation

Our investigation began in March with a story about anti-vaxxers attacking vaccine advocates on Facebook. Following the publication of that story, a Facebook spokesperson asked CNN to submit the content we had collected while reporting it.

With the permission of the recipients, CNN sent the spokesperson 65 pieces of content that vaccine advocates said they found abusive, including posts, comments, and direct messages.

Messages and comments received by mothers advocating for vaccines on Facebook after the deaths of their children.
The spokesperson sent the content to Facebook's community standards team, which found that 39 of those pieces of content, posted by 31 individual users, violated its community standards, most with regard to bullying.

Facebook removed only one user – the one who wrote the message to Odom. Other users remain on the platform, even though they have violated Facebook standards.

With the permission of the recipients, CNN sent the 39 pieces of content to ethicists, lawyers, and social media experts.

"Whoever sends these messages, you have to be off social media forever. All of them," said Arthur Kaplan, head of medical ethics at New York University. "Can't Facebook do more to police this? They have to."

"This is absolutely excruciating," says Mary Anne Franks, a professor at the University of Miami School of Law who specializes in social media. "Facebook can do much better."

Marantz, the author of the book, recalled a famous Zuckerberg motto: move fast and break things.

"Facebook is one of the most profitable companies in the world. They can move fast and break things. Why can't they move fast and fix things?"


When CNN asked why Facebook removed only one of the 31 users from the platform, the spokesperson gave two answers. First, the spokesperson said Facebook could not identify all of the users behind the content.

The spokesperson did not explain how Facebook was able to identify the person who sent the message to Odom but not the other users.

Facebook debuts vaccine pop-ups to stop the spread of misinformation

The spokesperson also did not explain why Facebook could not identify the users; CNN identified most of them using information in the screenshots, such as names, profile pictures, places of employment, and schools attended.

Second, the spokesperson noted that Facebook generally does not remove anyone for a single violation of its bullying and harassment standards. Instead, multiple violations, or strikes, are required.

"We don't want people to game the system, so we don't share the specific number of strikes that leads to a temporary block or permanent suspension," according to a Facebook report, "Enforcing our Community Standards." "The effects of a strike vary depending on the severity of the violation and the person's history on Facebook."

In addition to the user who sent the message to Odom, the CNN investigation found at least two users who appeared to have violated Facebook's standards multiple times.

One user told CNN that he had been in "Facebook's prison" seven times, and another said that she had been similarly punished on Facebook "quite often."

When CNN pointed out that these two users were still on Facebook despite what seemed like a lot of strikes, the spokesperson replied that Facebook would launch an "in-depth investigation" of the users who wrote the violating content.

Ethicists question why Facebook did not remove the users who committed some of the most egregious abuse – especially those who promoted violence.

Posts and comments received on Facebook by vaccine advocates.
"Some of them deserve an immediate ban," says Timothy Caulfield, a professor of health law and policy at the University of Alberta in Canada. "If that doesn't cross a line of civility, what does?"

Caulfield and Kaplan, the NYU ethicist, said it is especially important for Facebook to be strict about abuse by anti-vaxxers because it can intimidate users who post facts about vaccines.

They cited this year's measles outbreak – the largest in 27 years, with more than 1,200 cases, according to the US Centers for Disease Control and Prevention. Experts say the outbreak flourished largely because of anti-vaccine misinformation on social media, especially on Facebook.

"The vaccine space is extremely sensitive as highly vulnerable lives – newborns, cancer patients and those with immune diseases – hang in the balance," Kaplan wrote in an email to CNN. "Facebook should have no tolerance or place for anti-vaxx zealotry and threats. They should be monitored or reported, verified and then blocked and banned pronto. Public health demands an end to vaccine misinformation and threatening behavior."

This year, Facebook took steps to reduce the impact of anti-vaccine misinformation. The platform announced in March that it would begin lowering the ranking of groups and pages that spread such misinformation in its News Feed and Search results, and in September Facebook announced that educational pop-ups would appear when a user searches for vaccine-related content, visits vaccine-related Facebook groups and pages, or taps a vaccine hashtag on Instagram.

The Difficulty of Detecting Harassment

In January 2018, Zuckerberg set himself a challenge: fix Facebook.
"The world feels anxious and divided, and Facebook has a lot of work to do – whether it's protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent," Zuckerberg wrote in a Facebook post. "My personal challenge for 2018 is to focus on fixing these important issues. We won't prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools."
In a series of Community Standards Enforcement Reports, Facebook has revealed the results of its efforts.

For example, in the first three months of 2019, the social media platform took down 5.4 million pieces of content that violated its standards against child nudity and child sexual exploitation. In almost every case – 99% – Facebook detected the content on its own, without being alerted by users.

But Facebook has had much less success detecting bullying and harassment.

In the first quarter of 2019, the platform took action on 2.6 million pieces of content for violating its bullying and harassment standards, but users had to report the content 86% of the time; Facebook detected and flagged it on its own only 14% of the time.

The problem: while detecting nudity algorithmically is relatively easy, it is much harder to detect when someone is being harassed online, because of the subtleties that separate harassing speech from speech that does not violate community guidelines.

"When you get into hate speech and harassment, the nuances of language become even more difficult – understanding when someone is condemning a racial slur as opposed to using it to attack others," Zuckerberg wrote in a note last November.

Facebook says that because harassment is difficult to detect, it has to rely on users to report abuse.

"When it comes to bullying and harassment, context really matters," the spokesperson added. "It is hard to tell the difference between harassment and, say, a lighthearted joke without knowing the people involved or the nuance of the situation. That's why we rely heavily on reports – if you or someone you know is being bullied or harassed, we encourage you to report it."

A recent Facebook report came to the same conclusion.

"In areas like bullying and harassment, where context is so vital to understanding whether content violates our policies, we don't expect our automated systems to be able to find this content at a scale similar to other policy violations," says the Facebook transparency report. "[In] many instances, we need a person to report this behavior to us before we can identify or remove it."
Last year, Guy Rosen, Facebook's vice president of product management, said the platform was trying to improve. "We are determined to improve our understanding of these types of abuse so we can get better at proactively detecting them," Rosen wrote. "We know overall that we have much more work to do to prevent abuse on Facebook. Machine learning and artificial intelligence will continue to help us detect and remove bad content."

Relying on mourning mothers

Catherine and Greg Hughes' 1-month-old son, Riley, died of whooping cough.

While Facebook urges users to report abuse, Catherine Hughes says she simply couldn't.

Hughes' 1-month-old son, Riley, died of whooping cough. Since then, she has urged others to get vaccinated to protect babies like Riley, who are too young to have received all their shots.

Hughes estimates that since her son died in 2015, she and her husband have received thousands of abusive Facebook posts, comments, and messages; she estimates that about half came from users in the United States.

They have been called baby killers. Hughes has been called a whore, and worse. They have been told to kill themselves. They have received death threats.

The comments kept rolling in even on the day of her son's funeral.

"Those messages, while we were at Riley's funeral, particularly kicked us in the gut. You are having the worst day of your life, and they thought their beliefs were more important than basic human ethics," she said.

Hughes noted that even if she were not grieving the loss of her child, she could never report the sheer volume of posts, comments, and messages. Plus, she said, many of them were comments on her own profile, and although such comments can be hidden or deleted, Facebook does not allow users to report bullying or harassing comments made on their own profile or timeline.

"When it comes to the culture of bullying, you can't just leave it to the victims – the grieving parents – to do the work of eliminating this abhorrent behavior," said Hughes. "Facebook has done incredible things. They are so forward-thinking. Why can't they invent the technology and algorithms to find a solution to this problem?"

Serese Marotta's son, Joseph, died of influenza in 2009 at the age of 5.

Serese Marotta agrees.

Her 5-year-old son, Joseph, died of the flu in 2009. After she urged others to get the flu vaccine, Marotta was attacked by anti-vaxxers. She was called obscene names, and one user sent a death threat. Like Hughes, she did not report the comments and messages.

"I firmly believe that Facebook should be able to figure out how to monitor this, and I think it is their responsibility to do so," says Marotta, who lives in Syracuse, New York, and is the chief operating officer of Families Fighting Flu. "If they can come up with algorithms to monitor child pornography, they should be able to come up with an algorithm to monitor this type of behavior."

Marantz, the author of the book, agrees.

"If these people are supposedly the best computer engineers in the whole world, and they can't write a program that flags a comment telling someone to kill herself and calling her the n-word – that seems implausible to me," he says.

Marantz said that since the platform uses data mining and algorithms to target ads at users, it should use those same tools to protect users, instead of relying on them to report when they are being harassed.

"It really takes a lot of chutzpah and victim-blaming to claim that the onus should be on the victims," he said.

Suggestions for Facebook

When CNN brought this criticism to Facebook's attention, the spokesperson sent a comment by email.

"We want members of our community to feel safe and respected on Facebook, and we will remove material that appears to purposefully target private individuals with the intention of degrading or shaming them. We try to empower our users with controls, such as blocking other users and moderating comments, so they can limit their exposure to unwanted, offensive or hurtful content. We also encourage people to report bullying behavior on our platform so we can review the content and take proper action," the spokesperson wrote.

The spokesperson also pointed to Facebook's statement on bullying and harassment on its community standards page, which mentions the platform's Bullying Prevention Hub, a resource for teens, parents, and teachers seeking support for issues related to bullying and other conflicts.

A year ago, Facebook introduced "new tools and programs so people can better control unwanted, offensive or hurtful experiences on Facebook," including the ability to report when a friend or family member may be being bullied or harassed.

"Once reported, our Community Operations team will review the post, keep your report anonymous and determine whether it violates our Community Standards," Antigone Davis, Facebook's global head of safety, wrote in the post.

Facebook reviews more than two million pieces of content every day, according to the Facebook spokesperson.

The spokesperson reiterated the technical challenges of detecting bullying before it is reported, since a bullying post and a harmless post can contain similar language.

Caulfield, the health law and policy expert, said there is no doubt that there are technical challenges to policing bullying, all the more so because Facebook has such an enormous volume of content.

"Facebook certainly has a resources and management problem," he said. "But given where we are culturally, in this era of untruths and disinformation, the calculus has changed, and we need to start being more aggressive in monitoring these platforms. I think clearer rules and clearer action are needed."


Three experts have suggestions for how Facebook could overcome the technical challenges of detecting bullying.

Marantz, the author of the book, said more human eyes on content could go a long way toward solving the bullying problem.

"Tomorrow, by fiat, Mark Zuckerberg could make Facebook slightly less profitable and immeasurably less amoral: He could hire thousands more content moderators and pay them fairly," Marantz wrote in a recent New York Times op-ed.

Tom Wheeler, the former chairman of the Federal Communications Commission, has an idea for how software could help. He suggests the government should force Facebook and other social media sites to make the outputs of their algorithms available to third parties.
Such an "opening up," he wrote last year in a New York Times op-ed, would involve something called an open application programming interface. APIs are a way for outsiders to access parts of a platform's database.
While Facebook allows access to some data through its APIs, it allows far less access than other platforms, such as Twitter and YouTube, according to Darren Linvill and Patrick Warren, researchers at Clemson University.

The Clemson team said Twitter's more open APIs allowed them to identify suspected Russian trolls and flag them to Twitter.

"Patrick and I have killed a few dozen Russian trolls because Twitter is open, and we can't do the same on Facebook," said Linvill, an associate professor in Clemson's College of Behavioral, Social and Health Sciences.

The researchers noted that while Facebook's relatively closed APIs give its users more privacy, they make it harder for researchers to help Facebook achieve goals like reducing bullying.

"There are consequences to what Facebook is doing," says Warren, an associate professor at Clemson's business school. "There are negative consequences for the public interest."

Wheeler, now a visiting fellow at the Brookings Institution and a senior fellow at the Harvard Kennedy School, said that if Facebook were forced to open up the outputs of its algorithms, outsiders could do their own research into bullying on the platform.

"That way, you could design your own algorithm that says, 'I want to look for the use of the n-word,' or whatever you want to look for," Wheeler told CNN.

Wheeler said that while this would not get rid of bullying on Facebook, it would be a step in the right direction.

"We can't solve this problem with a magic wand, but we have to start somewhere," he said.

Danielle Citron, a legal scholar and this year's recipient of a MacArthur Foundation "genius grant" for her work fighting online harassment, also has a suggestion for Facebook.
Online spies have infiltrated anti-vaccine Facebook groups and found group members issuing calls to attack vaccine advocates, including mothers mourning their dead children.

The result is waves of harassment, with dozens of comments or more coming in at once, according to victims.

"I call these cyber mobs. It's death by a thousand cuts. It's like a thousand bee stings," Citron said.

Citron said Facebook should come up with a way to report the mob itself, rather than reporting each individual piece of content.

"Facebook should treat the cyber mob as a real phenomenon," she said. "There should be a way for users to let them know this storm is happening – where they can say, 'Look at my page, a cyber mob has descended on it.'"

Citron, a professor at Boston University School of Law who has served as an unpaid adviser to Facebook, said that over the past decade Facebook has come "a long way" on bullying and harassment issues.


But she said Facebook faces an inherent conflict in policing itself.

"They don't want to de-platform people. They want their business," she said.

Citron said Facebook should remember that it loses business if victims of abuse are scared off the platform.

"It's not good for people to be terrorized and chased offline," she said.

Marotta, the vaccine advocate whose 5-year-old son died of the flu, said she was not scared off Facebook – but she has spoken with many parents who have received violent threats and are terrified.

"Obviously, we have freedom of speech in this country, but when you're talking about causing harm to another person, I don't think that should be allowed," she said. "Facebook is doing a disservice by allowing this type of bad behavior to spread. They are facilitating it, and they shouldn't be."

CNN's John Bonifield, Minali Nigam and Kristen Rogers contributed to this report.

