TikTok, Google, and other tech firms are being sued by a California school board for creating a “destructive environment for children” and leaving educators and parents alike to deal with the growing mental health crisis among young people. The lawsuit, filed on behalf of the San Mateo County Board of Education and Nancy Magee, superintendent of San Mateo County Schools, targets several of the companies behind the major social networking sites.
The lawsuit alleges that the aforementioned companies use artificial intelligence and machine learning to deliver harmful content to children and youth. “Just as we had ‘Big Tobacco,’ now we have ‘Big Tech’ exploiting children. Just follow the tech lobby’s swift and forceful attacks on our California Legislature’s recent efforts to put in place common-sense rules addressing the tracking and profiling of users under the age of 18,” Joe Cotchett, an attorney with the firm representing the school board, said in a statement.
The lawsuit alleges that the tech companies have created an “unprecedented mental health crisis” and knowingly profited from their platforms by fostering addiction and serving harmful content to younger users.
“Every day schools are dealing with the consequences, which include distracted students, increased absences, more children diagnosed with ADHD, cyberbullying being brought into classrooms, and even physical damage to our San Mateo schools, one example being the vandalism caused by the so-called TikTok ‘Devious Lick Challenge’ at the beginning of the school year,” Magee said.
A Google spokesperson told FOX Business: “We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well-being. For example, through Family Link, we provide parents with the ability to set reminders, limit screen time, and block specific types of content on monitored devices.”
For its part, a Snap Inc. spokesperson said in a statement: “Nothing is more important to us than the well-being of our community. At Snapchat, we curate content from known creators and publishers and use human moderation to review user-generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content.”