WORCESTER, Mass. (WBZ NewsRadio) — Worcester Schools are halting use of artificial intelligence safety-check software that flags when students look up words that might indicate they're thinking about harming themselves.
School Committee member Laura Clancy said they decided to pause the program, which was installed on Worcester's school-issued laptops, because the board has too many unanswered questions.
"[It's] liability issues," Clancy said. "We wanted our legal counsel to look into things. The filtering system, we were concerned about false reporting. The other thing we were concerned about is, if we wanted it to be 24-hour monitoring, who's monitoring it on the weekend?"
School committee member Tracy Novick said she believes Worcester students need one-on-one support, not artificial intelligence monitoring their internet activity.
"We are spending money on people sorting through Google searches," she said. "It doesn't actually give students mental health support, which is what they actually need."
WBZ NewsRadio's Kim Tunnicliffe reports:
Written by Brit Smith
(Photo: Getty Images)