In what will come as a surprise to few, two former Microsoft employees have filed a lawsuit against the company, claiming they developed PTSD (Post-Traumatic Stress Disorder) in their roles on its Online Safety team. They say the condition stems from reviewing child pornography and other horrific material, reports The Daily Beast. The lawsuit centers on Microsoft’s alleged failure to provide them with adequate healthcare and treatment.

The two plaintiffs, former employees Henry Soto and Greg Blauert, were part of the aforementioned team. In their roles, they were granted ‘God-like’ status, with access to any customer’s communications at any time, and were entrusted with reviewing material flagged as potentially illegal. They had worked on the safety team since 2008.

The duo were also responsible for screening users’ communications for child pornography and evidence of other crimes. In that capacity, Soto had to view “horrible brutality, murder, indescribable sexual assaults,” while Blauert was tasked with reviewing “thousands of images of child pornography, adult pornography” and other graphic content.

Addressing this, the complaint filed by their lawyers reads:

Many people simply cannot imagine what Mr Soto had to view on a daily basis as most people do not understand how horrible and inhumane the worst people in the world can be. 

[Blauert was required to] review thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators.

Viewing some of the most sick-minded and twisted content on the planet did not leave their health unscathed. Both men now suffer from symptoms of PTSD, including nightmares, anxiety, and hallucinations. When they raised the issue with the Redmond giant, the company enrolled them in a Wellness Program. According to the complaint, Microsoft’s safety program supervisors instructed them to take smoke breaks and long walks to cope with their condition, and even suggested they play video games during their breaks.

But the wellness program reportedly did them little good, as the therapist assigned to them was, they allege, inadequate. They are therefore accusing Microsoft of negligence toward the mental health and trauma of its review team, arguing that supervisors fail to grasp the consequences of viewing such taxing and profane content daily. The two spent roughly 18 months in the department, and the work took a toll on their psyches.

A Microsoft spokesperson responded to queries by saying that the company ‘disagrees’ with the claims in the lawsuit. The spokesperson added that the imagery shown to reviewers is altered to reduce its realism, and that the company limits the amount of time employees spend moderating such content. Commenting on the lawsuit, the spokesperson further said:

[Microsoft] takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.

Through the lawsuit, Soto and Blauert are seeking damages for their condition, hoping to afford better medical care for the hallucinations and trauma that accompany their PTSD. Having spent so long on the team, the duo are also proposing changes to the wellness program: they believe more time off and regular consultations with a trained mental health professional would protect reviewers’ health.

Meanwhile, technology behemoths are steadily shifting the task of moderation from humans to computers and artificially intelligent systems. These programs can be trained to judge content and weed out suspicious material, leaving a far smaller pool of flagged items, relative to the massive library of horrific material, for humans to review. The ultimate aim is to protect children and to help the authorities solve cases.
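As a rough illustration of that triage idea (and not Microsoft’s actual system), the sketch below shows how an automated classifier might route only high-scoring items to a human review queue; the function names, threshold, and placeholder scoring are assumptions made for this example.

```python
# Minimal sketch of an automated-triage moderation pipeline. Illustrative
# only: the names, threshold, and scoring stub are assumptions, not any
# company's real system.

REVIEW_THRESHOLD = 0.8  # assumed cut-off; real systems tune this carefully


def classifier_score(content: bytes) -> float:
    """Placeholder for a trained model returning a 0..1 risk score."""
    return 0.0  # a real implementation would run ML inference here


def triage(items: dict) -> tuple:
    """Route only high-risk items to the (small) human review queue."""
    review_queue, auto_cleared = [], []
    for item_id, content in items.items():
        if classifier_score(content) >= REVIEW_THRESHOLD:
            review_queue.append(item_id)  # escalate to human reviewers
        else:
            auto_cleared.append(item_id)  # never shown to a person
    return review_queue, auto_cleared


if __name__ == "__main__":
    queue, cleared = triage({"img-1": b"...", "img-2": b"..."})
    print(f"{len(queue)} items for human review, {len(cleared)} auto-cleared")
```

The design point is simply that human reviewers would see only the small fraction of content the model flags, rather than everything passing through a service.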
