student security – The 74, America's Education News Source

Data Privacy Advocates Raise Alarm Over NYC’s Free Teen Teletherapy Program (Thu, 12 Sep 2024)

This article was originally published in Chalkbeat.

New York City’s free online therapy platform for teens may violate state and federal laws protecting student data privacy, lawyers from the New York Civil Liberties Union and advocates charged in a letter Tuesday to the city’s Education and Health Departments.

Teenspace, a $26 million partnership between the city Health Department and teletherapy giant Talkspace that launched in late 2023, connects city residents between ages 13 and 17 with free therapists by text, phone, or video chat.

In less than a year, roughly 16,000 students have signed up, Health Department officials said. Sign-ups disproportionately came from youth who identified as Black, Latino, Asian American and female, and who live in some of the city’s lowest-income neighborhoods.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


Information shared with a therapist is subject to stringent protections under the federal Health Insurance Portability and Accountability Act, or HIPAA. But before connecting with a therapist through Teenspace, teens go through a registration process that asks for personal information like their name, school, mental health history, and gender identity. Advocates are concerned such information is being improperly collected and could be misused.

For one, teens enter the registration information before securing parental consent – a possible violation of federal student privacy laws, the letter contends.

And families don’t get a chance to review the privacy policy – which discloses that registration information can be used to “tailor advertising” and for marketing purposes – before entering the registration information, advocates allege. There’s an option for teens to request that their data be deleted from the company’s platform, but it’s hard to find, according to advocates.

“It’s all very invasive,” said Shannon Edwards, a parent and founder of AI For Families, an organization that seeks to help families navigate artificial intelligence, who co-authored the letter along with NYCLU and the Parent Coalition for Student Privacy. “It’s also very unclear that parents understand what they’re getting themselves into.”

Advocates also pointed to the risk of a potential data breach – something the city has experienced repeatedly in recent years.

Advocates say similar concerns about Talkspace have been circulating for years and questioned whether city officials did sufficient due diligence or built in enough additional privacy safeguards before inking the contract.

“It’s the opacity of the relationship here, and the failure to make manifest what the city is doing to ensure there isn’t this data accumulation and sharing for inappropriate purposes,” said Beth Haroules, a senior attorney at the NYCLU who co-authored the letter.

Health Department spokesperson Rachel Vick said the agency has “taken additional steps to protect the data of Teenspace users and ensure information is not collected for personal gain, including stipulations that require all client data to remain confidential during and after the completion of the city’s contract and barring use of data for any purpose other than providing the services included in the contract.”

Client data is destroyed after 30 days if a teen doesn’t connect with a therapist, officials said.

A spokesperson for Talkspace referred questions to the Health Department.

The extent to which Teenspace is subject to state and federal laws governing student privacy in educational settings is somewhat murky, given that the contract is with the city’s Health Department, not its Education Department.

But NYCLU attorneys contend “the City cannot absolve itself of its responsibility to provide the protections inherent in federal and state laws…simply because the contract sits with DOHMH instead of DOE. The service is promoted on public school websites, and it is DOE’s responsibility to ensure that student data is protected, regardless of which City agency signs the contract.”

Parents may be more inclined to trust the platform because it has a “stamp of approval” from the school system, Edwards added.

A Health Department spokesperson didn’t specify whether the program is subject to education privacy laws, but said it’s “not a school based service.”

Teenspace has been the city’s highest-profile effort to address the ongoing youth mental health crisis.

“We are meeting people where they are with a front door to the mental health system that for too long has been too hard to find,” said Ashwin Vasan, the city’s health commissioner, in May.

Some teens have praised the program, noting it’s a way to bring mental health care to young people who may not otherwise have access.

But some mental health providers have argued it can’t replace the kind of intensive care a clinician provides, especially for kids with severe mental health challenges.

Company officials shared in May that they had helped 36 teens navigate serious incidents including reports of suicide attempts and abuse – cases they referred to child protective services, in-person therapists, or hospitals.

Talkspace CEO Jon Cohen previously told Chalkbeat the company uses an artificial intelligence algorithm to scan transcripts of therapy sessions to help identify teens at risk of suicide.

Even advocates critical of Teenspace’s privacy protections acknowledge the severe shortage of mental health providers and say teletherapy can play a role in filling the gap.

“We know you cannot find providers … there is such a need,” said Haroules. But advocates said the city can do more to ensure its vendors are meeting strict standards for data privacy, especially with such sensitive information.

“Everyone thinks, well, mental health is important for kids, these kinds of services are required … when on the other side is: ‘How are they getting to it?’” said Edwards. “It doesn’t matter what the app is, there has to be a standard.”

This was originally published by Chalkbeat. Chalkbeat is a nonprofit news site covering educational change in public schools. Sign up for their newsletters.

When Should Teachers Call the Police? (Mon, 09 Sep 2024)

This article was originally published in CalMatters.

Update: The bill was ordered to the inactive file on the last night of the session Aug. 31. It had been amended to keep mandatory police notification requirements if a student assaults or threatens a teacher. The bill would have still let teachers choose to call the police if a student is using or possessing controlled substances, and it would also decriminalize willful disturbance by students.

During Zuleima Baquedano’s first year as a teacher, she faced an important choice. 

One of her students had difficulty controlling her emotions. One day, the student had a meltdown and kicked Baquedano to the ground.

The principal asked Baquedano if she wanted to call the police, because the incident legally counted as assault. But not long before, the student had moved in with her family after being in and out of foster care, was beginning the diagnostic process for her disability and had been working with Baquedano on coping mechanisms. 




“Any contact with police would have really put all of that in jeopardy,” Baquedano said. “Calling the police, getting Child Protective Services involved and all that would have completely just ruined any kind of progress she’d made.” 

Baquedano decided against calling the police. “I’m never going to regret advocating for her, despite the fact that several teachers told me I couldn’t let her get away with it, and that she did this on purpose when they didn’t even know her,” she said. 

She had a choice because she worked at a charter school in Los Angeles. Staff at traditional public schools don’t have the same freedom: Under California law, they are required to make a police report if a student assaults them — and can be prosecuted if they don’t. 

A bill before the Legislature in its final week would give staff at traditional public schools that same choice.

But what supporters see as a common sense bill, opponents see as going too far, raising partisan tensions in an election year in which crime and education are top of mind for many voters. 

A difficult path to the Senate

Assemblymember Ash Kalra, a San Jose Democrat, has been trying to get similar legislation passed for four years.

“The data very clearly shows that when law enforcement is required to come onto campus, those that they choose to arrest are disproportionately people with disabilities and students of color,” Kalra said in an interview. 

One analysis found that students with disabilities make up 26% of school arrests, despite being 11% of total enrollment. According to another report, students of color are handcuffed by police at a disproportionate rate — 20% of Black students compared to 9% of white students.

“This bill is really a turning point in addressing issues around school climate,” said Oscar Lopez, an associate managing attorney at Disability Rights California, a sponsor of the bill. 

This is the first time Kalra’s bill has made it to the Senate, and it wasn’t easy. It barely squeaked out of the Assembly by a vote of just 41-22, with seven Democrats voting “no.” 

“It’s unfortunate that a common sense bill like this has struggled so hard to make it through the Legislature,” Kalra said. 

And opposition is organizing.

Last week, Senate Republicans released an analysis of the bill, listing concerns about school safety, drug possession and the relationship between schools and law enforcement.

“The bottom line is this is going to make our school campuses less safe,” Senate GOP Leader Brian Jones of San Diego told CalMatters. “It’s going to endanger our students, teachers, administrators and even the law enforcement professionals who have to serve on these campuses.”

Law enforcement officials worry that AB 2441 could open the door to eliminating school resource officers. 

“School officials and law enforcement should work together, especially when it comes to pupils whose behavior violates the law and puts school safety in jeopardy,” said Cory Salzillo, legislative director of the California State Sheriffs’ Association. “Removing requirements just runs counter to that notion.”

If AB 2441 were to pass, there would still be times when staff are required to call the police. Under federal law, local education agencies must call law enforcement if a student has a firearm or is caught selling controlled substances. 

Some opponents have also raised concerns about school administrators’ ability to discern between students who are selling controlled substances or just possessing them — a task they think should be left to law enforcement.

“Schools are not isolated in the community, so when there are crimes being committed, even if it’s simple possession of a controlled substance, that’s something that law enforcement should be aware of,” Salzillo said. 

The California Department of Public Health plans to announce a new fentanyl education campaign on Wednesday. 

“Fentanyl is so dangerous that we need to be all hands on deck on dealing with that crisis on our school campuses,” Jones said. “Removing this requirement of reporting is just unbelievable to me at this point in time.” 

Because of an amendment to the bill, staff would also need to notify law enforcement if someone needed immediate medical attention. 

After the Senate Republican Caucus released its analysis — and sent it to its entire press list for the first time — supporters of the bill accused them of fear mongering and spreading misinformation. 

“There’s been a lot of untruths shared and promoted by the opposition to this bill,” said Rachel Bhagwat, legislative advocate at ACLU California Action, a bill sponsor. 

Jones denied that’s what’s happening. 

“California voters and taxpayers are fed up with the criminal justice system in California right now,” he said. “They’re fed up with the progressive wing that’s continuing to decriminalize crime.” 

Preventing the school-to-prison pipeline

Research shows that when young people face severe discipline at school — such as police interaction, suspension or expulsion — they are less likely to graduate high school and more likely to go to prison. 

“The interpretation of normal, age-appropriate behaviors as being threatening and criminal and dangerous is leading to a situation where young people are not getting educational opportunities in school, and they’re being funneled into further criminal contact and the criminal system,” Bhagwat said. 

Under current state law, staff are required to try other methods — such as meeting with parents, speaking with a psychologist, creating an individualized education plan or restorative justice programs — before resorting to something more severe. 

“Between counseling and other programs, there are methods to use that don’t involve punitive consequences such as a misdemeanor crime,” Naj Alikhan, senior director of marketing and communications for the Association of California School Administrators, wrote in a statement to CalMatters.

The bill would also get rid of a clause that makes it a crime to “willfully disturb” public schools and meetings. Under this provision, students could be criminally prosecuted for running in hallways or knocking on doors. 

“It’s somewhat of a vague term,” Kalra said, “and it’s been used against students who might have behavior issues. There’s a lot of different reasons why a student may be causing a disturbance and we want to give schools the ability to decide how they want to handle those situations.” 

An amendment to the bill would make it an infraction for someone to prevent a school staff member from calling the police. 

Baquedano — who testified before the Senate education committee in July and now teaches in Santa Ana — said that if the bill passes, there are serious situations, like having a deadly weapon or being in possession of drugs, where she would still call.  

“There’s an assumption that we’re going to stop calling the police, and that’s not the case,” she said. “The idea that we wouldn’t have that common sense is a little insulting.” 

It’s a decision Baquedano said teachers deserve to have. 

“People should trust us — the professionals in the situation, who’ve been trained, who’ve gone through education to do this — they should be trusting our judgment,” she said. “We’re the ones who best know our students. We spend all these hours with them a year, sometimes more than parents do.”

Kalra remains optimistic that AB 2441 will pass the Senate this week and make it to Gov. Gavin Newsom’s desk. 

“You would hope,” he said, “that legislators would understand the need for us to support all students, and I’m hopeful that at least we can get this bill through to see that it’s not going to create some doomsday outcome.”

This was originally published on CalMatters.

Whistleblower: L.A. Schools’ Chatbot Misused Student Data as Tech Co. Crumbled (Mon, 01 Jul 2024)

Just weeks before the implosion of AllHere, an education technology company that had been showered with cash from venture capitalists and featured in glowing profiles by the business press, America’s second-largest school district was warned about problems with AllHere’s product.

As the eight-year-old startup rolled out Los Angeles Unified School District’s flashy new AI-driven chatbot — an animated sun named “Ed” that AllHere was hired to build for $6 million — a former company executive was sending emails to the district and others warning that Ed’s workings violated bedrock student data privacy principles. 

Those emails were sent shortly before The 74 first reported last week that AllHere, flush with investor capital, was in serious straits. A June 14 statement on the company’s website revealed a majority of its employees had been furloughed due to its “current financial position.” Company founder and CEO Joanna Smith-Griffin, a spokesperson for the Los Angeles district said, was no longer on the job. 

Smith-Griffin and L.A. Superintendent Alberto Carvalho went on the road together this spring to unveil Ed at a series of high-profile ed tech conferences, with the schools chief dubbing it the nation’s first “personal assistant” for students and leaning hard into LAUSD’s place in the K-12 AI vanguard. He called Ed’s ability to know students “unprecedented in American public education” at the ASU+GSV conference in April. 




Through an algorithm that analyzes troves of student information from multiple sources, the chatbot was designed to offer tailored responses to questions like “what grade does my child have in math?” The tool relies on vast amounts of students’ data, including their academic performance and special education accommodations, to function.

Meanwhile, Chris Whiteley, a former senior director of software engineering at AllHere who was laid off in April, had become a whistleblower. He told district officials, its independent inspector general’s office and state education officials that the tool processed student records in ways that likely ran afoul of L.A. Unified’s own data privacy rules and put sensitive information at risk of getting hacked. None of the agencies ever responded, Whiteley told The 74. 

“When AllHere started doing the work for LAUSD, that’s when, to me, all of the data privacy issues started popping up,” Whiteley said in an interview last week. The problem, he said, came down to a company in over its head and one that “was almost always on fire” in terms of its operations and management. LAUSD’s chatbot was unlike anything it had ever built before and — given the company’s precarious state — could be its last. 

If AllHere was in chaos and its bespoke chatbot beset by porous data practices, Carvalho was portraying the opposite. One day before The 74 broke the news of the company turmoil and Smith-Griffin’s departure, an industry publication spotlighted the schools chief at a Denver conference talking about how adroitly LAUSD managed its ed tech vendor relationships — “We force them to all play in the same sandbox” — while ensuring that “protecting data privacy is a top priority.”

In a statement on Friday, a district spokesperson said the school system “takes these concerns seriously and will continue to take any steps necessary to ensure that appropriate privacy and security protections are in place in the Ed platform.” 

“Pursuant to contract and applicable law, AllHere is not authorized to store student data outside the United States without prior written consent from the District,” the statement continued. “Any student data belonging to the District and residing in the Ed platform will continue to be subject to the same privacy and data security protections, regardless of what happens to AllHere as a company.” 


A district spokesperson, in response to earlier questioning from The 74 last week, said it was informed that Smith-Griffin was no longer with the company and that several businesses “are interested in acquiring AllHere.” Meanwhile Ed, the spokesperson said, “belongs to Los Angeles Unified and is for Los Angeles Unified.”

Officials in the inspector general’s office didn’t respond to requests for comment. The state education department “does not directly oversee the use of AI programs in schools or have the authority to decide which programs a district can utilize,” a spokesperson said in a statement.

It’s a radical turn of events for AllHere and the AI tool it markets as a “learning acceleration platform,” which were all the buzz just a few months ago. In April, Time Magazine named AllHere among the world’s top education technology companies. That same month, Inc. Magazine recognized Smith-Griffin for her work in artificial intelligence in its Female Founders 250 list. 

Ed has been similarly blessed with celebrity treatment. 

“He’s going to talk to you in 100 different languages, he’s going to connect with you, he’s going to fall in love with you,” Carvalho said at ASU+GSV. “Hopefully you’ll love it, and in the process we are transforming a school system of 540,000 students into 540,000 ‘schools of one’ through absolute personalization and individualization.”

Smith-Griffin, who graduated from the Miami school district that Carvalho once led before going on to Harvard, couldn’t be reached for comment. Smith-Griffin’s LinkedIn page was recently deactivated and parts of the company website have gone dark. Attempts to reach AllHere were also unsuccessful.

‘The product worked, right, but it worked by cheating’

Smith-Griffin, a former Boston charter school teacher and family engagement director, founded AllHere in 2016. Since then, the company has primarily provided schools with a text messaging system that facilitates communication between parents and educators. According to the company, the tool relies on attendance data and other information to deliver customized, text-based “nudges.” 

The work that AllHere provided the Los Angeles school district, Whiteley said, was on a whole different level — and the company wasn’t prepared to meet the demand and lacked expertise in data security. In L.A., AllHere operated as a consultant rather than a tech firm that was building its own product, according to its contract with LAUSD obtained by The 74. Ultimately, the district retained rights to the chatbot, according to the agreement, but AllHere was contractually obligated to “comply with the district information security policies.” 

 The contract notes that the chatbot would be “trained to detect any confidential or sensitive information” and to discourage parents and students from sharing with it any personal details. But the chatbot’s decision to share and process students’ individual information, Whiteley said, was outside of families’ control. 

In order to provide individualized prompts on details like student attendance and demographics, the tool connects to several data sources, according to the contract, including an online tool used to track students’ special education services. The document notes that Ed also interfaces with the Whole Child Integrated Data stored on Snowflake, a cloud storage company. According to the district, the Whole Child platform serves as a central repository for LAUSD student data to help educators monitor students’ progress and personalize instruction. 

Whiteley told officials the app included students’ personally identifiable information in all chatbot prompts, even in those where the data weren’t relevant. Prompts containing students’ personal information were also shared with other third-party companies unnecessarily, Whiteley alleges, and were processed on offshore servers. Seven out of eight Ed chatbot requests, he said, are sent to places like Japan, Sweden, the United Kingdom, France, Switzerland, Australia and Canada. 

Taken together, he argued the company’s practices ran afoul of data minimization principles, a standard cybersecurity practice that maintains that apps should collect and process the least amount of personal information necessary to accomplish a specific task. Playing fast and loose with the data, he said, unnecessarily exposed students’ information to potential cyberattacks and data breaches and, in cases where the data were processed overseas, could subject it to foreign governments’ data access and surveillance rules. 

Chatbot source code that Whiteley shared with The 74 outlines how prompts are processed on foreign servers by a Microsoft AI service that integrates with ChatGPT. The LAUSD chatbot is directed to serve as a “friendly, concise customer support agent” that replies “using simple language a third grader could understand.” When querying the simple prompt “Hello,” the chatbot provided the student’s grades, progress toward graduation and other personal information. 

AllHere’s critical flaw, Whiteley said, is that senior executives “didn’t understand how to protect data.” 

“The issue is we’re sending data overseas, we’re sending too much data, and then the data were being logged by third parties,” he said, in violation of the district’s data use agreement. “The product worked, right, but it worked by cheating. It cheated by not doing things right the first time.”

In a 2017 policy bulletin, the district notes that all sensitive information “needs to be handled in a secure way that protects privacy,” and that contractors cannot disclose information to other parties without parental consent. A second policy bulletin, from April, outlines the district’s authorized use guidelines for artificial intelligence, which notes that officials, “Shall not share any confidential, sensitive, privileged or private information when using, prompting or communicating with any tools.” It’s important to refrain from using sensitive information in prompts, the policy notes, because AI tools “take whatever users enter into a prompt and incorporate it into their systems/knowledge base for other users.” 

“Well, that’s what AllHere was doing,” Whiteley said. 

L.A. Superintendent Alberto Carvalho (Getty Images)

‘Acid is dangerous’

Whiteley’s revelations present LAUSD with its third student data security debacle in the last month. In mid-June, a threat actor known as “Sp1d3r” began to sell for $150,000 a trove of data it claimed to have stolen from the Los Angeles district on Breach Forums, a dark web marketplace. LAUSD confirmed to Bloomberg that the compromised data had been stored by one of its third-party vendors on the cloud storage company Snowflake, the repository for the district’s Whole Child Integrated Data. The Snowflake data breach may be one of the largest in history. The threat actor claims that the L.A. schools data in its possession include student medical records, disability information, disciplinary details and parent login credentials. 

The chatbot interacted with data stored by Snowflake, according to the district’s contract with AllHere, though any connection between AllHere and the Snowflake data breach is unknown. 

In its statement Friday, the district spokesperson said an ongoing investigation has “revealed no connection between AllHere or the Ed platform and the Snowflake incident.” The spokesperson said there was no “direct integration” between Whole Child and AllHere and that Whole Child data was processed internally before being directed to AllHere.

The contract between AllHere and the district, however, notes that the tool should “seamlessly integrate” with the Whole Child Integrated Data “to receive updated student data regarding attendance, student grades, student testing data, parent contact information and demographics.”

Earlier in the month, a second threat actor known as Satanic Cloud claimed it had access to tens of thousands of L.A. students’ sensitive information and had posted it for sale on Breach Forums for $1,000. In 2022, the district was victim to a massive ransomware attack that exposed reams of sensitive data, including thousands of students’ psychological evaluations, to the dark web. 

With AllHere’s fate uncertain, Whiteley blasted the company’s leadership and protocols.

“Personally identifiable information should be considered acid in a company and you should only touch it if you have to because acid is dangerous,” he told The 74. “The errors that were made were so egregious around PII, you should not be in education if you don’t think PII is acid.” 

L.A. parents and students, we want to hear from you. Tell us about your experience using AllHere’s Ed.

Exclusive: Dems Urge Federal Action on Student Surveillance Citing Bias Fears (Thu, 19 Oct 2023)

A coalition of Democratic lawmakers on Thursday called on the U.S. Education Department to investigate school districts that use digital surveillance and other artificial intelligence tools in ways that trample students’ civil rights. 

In a letter to the department, the coalition expressed concerns that AI-enabled student monitoring tools could foster discrimination against marginalized groups, including LGBTQ+ youth and students with disabilities. The Education Department’s Office for Civil Rights should issue guidance on the appropriate uses of emerging classroom technologies, the lawmakers wrote, and crack down on practices that run afoul of existing federal anti-discrimination laws. 

“While the expansion of educational technology helped facilitate remote learning that was critical to students, parents and teachers during the pandemic,” the lawmakers wrote, “these technologies have also amplified student harms.” 




Lawmakers asked the Education Department’s civil rights office whether it has received complaints alleging discrimination facilitated by education technology software and whether it has taken any enforcement action related to potential civil rights violations. 

The letter comes in response to a recent national survey of educators, parents and students, the findings of which suggest that schools’ use of digital tools to monitor children online has harmed students based on their race, disability, sexual orientation and gender identity. The survey, conducted by the nonprofit Center for Democracy and Technology, found that while activity monitoring has become ubiquitous in schools and is intended to keep students safe, it’s used regularly as a discipline tool and routinely brings youth into contact with the police.

Findings from the CDT survey, lawmakers wrote, “raise serious concerns about the application of civil rights laws to schools’ use of these technologies.” Letter signatories include Democratic Reps. Lori Trahan of Massachusetts, Sara Jacobs of California, Hank Johnson of Georgia, Bonnie Watson Coleman of New Jersey and Adam Schiff of California. Trahan, who serves on the House Energy and Commerce Committee’s Innovation, Data and Commerce Subcommittee, has previously called for tighter student data privacy protections in the ed tech sector. 

The monitoring tools, such as those offered by for-profit companies GoGuardian and Gaggle, rely on artificial intelligence to sift through students’ online activities and alert school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Two-thirds of teachers reported that a student at their school was disciplined as a result of activity monitoring and a third said they know a student who was contacted by the police because of an alert generated by the software. 

Children with disabilities were more likely than their peers to report being watched, and special education teachers reported heightened rates of discipline as a result of activity monitoring. The findings, researchers argue, raise concerns under federal laws that entitle children with disabilities to equal access to an education. Even beyond the technologies, students with disabilities are subjected to disproportionate levels of school discipline, including restraint and seclusion, when compared to their general education peers. 

Half of all students said their schools responded fairly to alerts generated by monitoring software, a sentiment shared by just 36% of LGBTQ+ youth. In fact, LGBTQ+ youth were more likely than their straight and cisgender peers to report that they or someone they know was disciplined as a result of monitoring. And nearly a third of LGBTQ+ youth reported that they or someone they know was outed because of the technology. 

More than a third of teachers said their school monitors students’ online behaviors outside of school hours — and sometimes on their personal devices. 

In a similar student survey, released this month by the American Civil Liberties Union, a majority of respondents expressed worries that the monitoring tools — despite being designed to keep them safe — could actually cause harm and a third said they “always feel” like they’re being watched. 

The 74 has reported extensively on schools’ use of digital surveillance tools to monitor students’ online behaviors, and the tools’ implications for youth civil rights. The company Gaggle previously flagged to administrators student communications that referenced LGBTQ+ keywords like “gay” and “lesbian.” The company says it halted the practice last year in the wake of pushback from civil rights activists. 

Given the survey findings, the lawmakers urged the Education Department to clarify “how educators can fulfill their civil rights obligations” as they develop policies related to artificial intelligence, whose rapidly evolving role in education more broadly — including students’ use of tools like ChatGPT — has become a topic of debate. 

“This research is particularly concerning due to linkages between school disciplinary policies and incarceration rates of our nation’s youth,” the coalition wrote, adding concerns that the tools can create hostile learning environments. 

]]>
ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds /article/chatgpt-is-landing-kids-in-the-principals-office-survey-finds/ Wed, 20 Sep 2023 04:01:00 +0000 /?post_type=article&p=715056 Ever since ChatGPT burst onto the scene last year, a heated debate has centered on its potential benefits and pitfalls for students. As educators worry students could use artificial intelligence tools to cheat, a new survey makes clear its impact on young people: They’re getting into trouble. 

Half of teachers say they know a student at their school who was disciplined or faced negative consequences for using — or being accused of using — generative artificial intelligence like ChatGPT to complete a classroom assignment, according to a new survey from the Center for Democracy and Technology, a nonprofit think tank focused on digital rights and expression. The proportion was even higher, at 58%, among those who teach special education. 

Cheating concerns were clear, with survey results showing that teachers have grown suspicious of their students. Nearly two-thirds of teachers said that generative AI has made them “more distrustful” of students and 90% said they suspect kids are using the tools to complete assignments. Yet students themselves who completed the anonymous survey said they rarely use ChatGPT to cheat, but are turning to it for help with personal problems.


“The difference between the hype cycle of what people are talking about with generative AI and what students are actually doing, there seems to be a pretty big difference,” said Elizabeth Laird, the group’s director of equity in civic technology. “And one that, I think, can create an unnecessarily adversarial relationship between teachers and students.”   

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat. 

The results on ChatGPT’s educational impacts were included in the Center for Democracy and Technology’s broader annual survey analyzing the privacy and civil rights concerns of teachers, students and parents as tech, including artificial intelligence, becomes increasingly ingrained in classroom instruction. Beyond generative AI, researchers observed a sharp uptick in digital privacy concerns among students and parents compared with last year. 

Among parents, 73% said they’re concerned about the privacy and security of student data collected and stored by schools, a considerable increase from the 61% who expressed those reservations last year. A similar if less dramatic trend was apparent among students: 62% had data privacy concerns tied to their schools, compared with 57% just a year earlier. 

Those rising levels of anxiety, researchers theorized, are likely the result of the growing frequency of cyberattacks on schools, which have become a primary target for ransomware gangs. High-profile breaches, including in Los Angeles and Minneapolis, have compromised a massive trove of highly sensitive student records. Exposed records, investigative reporting by The 74 has found, include student psychological evaluations, reports detailing campus rape cases, student disciplinary records, closely guarded files on campus security, employees’ financial records and copies of government-issued identification cards. 

Survey results found that students in special education, whose records are among the most sensitive that districts maintain, and their parents were significantly more likely than the general education population to report school data privacy and security concerns. As attacks ratchet up, 1 in 5 parents say they’ve been notified that their child’s school experienced a data breach. Such breach notices, Laird said, led to heightened apprehension. 

“There’s not a lot of transparency” about school cybersecurity incidents “because there’s not an affirmative reporting requirement for schools,” Laird said. But in instances where parents are notified of breaches, “they are more concerned than other parents about student privacy.” 

Parents and students have also grown increasingly wary of another set of education tools that rely on artificial intelligence: digital surveillance technology. Among them are student activity monitoring tools, such as those offered by the for-profit companies Gaggle and GoGuardian, which rely on algorithms in an effort to keep students safe. The surveillance software employs artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Among parents surveyed this year, 55% said they believe the benefits of activity monitoring outweigh the potential harms, down from 63% last year. Among students, 52% said they’re comfortable with academic activity monitoring, a decline from 63% last year. 

Such digital surveillance, researchers found, frequently has disparate impacts on students based on their race, disability, sexual orientation and gender identity, potentially violating longstanding federal civil rights laws. 

The tools also extend far beyond the school realm, with 40% of teachers reporting their schools monitor students’ personal devices. More than a third of teachers say they know a student who was contacted by the police because of online monitoring, the survey found, and Black parents were significantly more likely than their white counterparts to fear that information gleaned from online monitoring tools and AI-equipped campus surveillance cameras could fall into the hands of law enforcement. 

Meanwhile, as states nationwide pull literature from school library shelves amid a conservative crusade against LGBTQ+ rights, the nonprofit argues that digital tools that filter and block certain online content “can amount to a digital book ban.” Nearly three-quarters of students — and disproportionately LGBTQ+ youth — said that web filtering tools have prevented them from completing school assignments. 

The nonprofit highlights how disproportionalities identified in the survey could run counter to federal laws that prohibit discrimination based on race and sex, and those designed to ensure equal access to education for children with disabilities. In a letter sent Wednesday to the White House and Education Secretary Miguel Cardona, the Center for Democracy and Technology was joined by a coalition of civil rights groups urging federal officials to take a harder tack on ed tech practices that could threaten students’ civil rights. 

“Existing civil rights laws already make schools legally responsible for their own conduct, and that of the companies acting at their direction in preventing discriminatory outcomes on the basis of race, sex and disability,” the coalition wrote. “The department has long been responsible for holding schools accountable to these standards.”

]]>
Survey Reveals Extent that Cops Surveil Students Online — in School and at Home /article/survey-reveals-extent-that-cops-surveil-students-online-in-school-and-at-home/ Wed, 03 Aug 2022 04:01:00 +0000 /?post_type=article&p=694119 When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have partnered with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.


Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new report from the Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent decision overturning Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In its letter to lawmakers, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In materials posted on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but they cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” Bark said in its letter. “Irrespective of training, there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter on Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice that founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate advanced legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after leaked internal research showed that the social media app Instagram had a harmful effect on youth mental well-being, especially among teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the foundation wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

]]>