Have you ever filtered content out of your Facebook feed or your search results because it was too sexually graphic or disturbingly violent? Ever wondered, with that First Amendment and all, who decides whether your assessment of the content is accurate, or whether it meets the standards set by U.S. laws like the Communications Decency Act?
Wired magazine recently ran a story on the people paid relatively paltry sums to keep the Internet clean for the rest of us. These workers are often overseas, and often employed by vendors contracting for the big tech companies. In other words, large tech companies pay people to stare all day at material we can't bear to look at even once.
"Employees are given a battery of psychological tests to determine their mental baseline, then interviewed and counseled regularly to minimize the effect of disturbing images. But even with the best counseling, staring into the heart of human darkness exacts a toll." Knowing this, should employers be legally obligated to do more to minimize the harm to these employees? If you are a large tech company, can you minimize your legal exposure by outsourcing this work to vendors in other countries?
Where might you look for answers? Think like an attorney representing these companies, and read the legal publications targeted at in-house counsel.
Try The In-house Counsel's Essential Toolkit or Inside Counsel to get started.