Meta and YouTube Ordered to Pay $3M to Young Woman in Social Media Addiction Trial /article/meta-and-youtube-ordered-to-pay-3m-to-young-woman-in-social-media-addiction-trial/ Fri, 27 Mar 2026 16:30:00 +0000

This article was originally published in The 19th.

After nine days of deliberation, a Los Angeles jury on Wednesday found Google and Meta liable for harms stemming from the design of their social media products and ordered them to pay $3 million in compensatory damages to a plaintiff who said that Instagram and YouTube caused depression, body dysmorphia and suicidal thoughts.

Meta was assigned 70 percent of the damages and YouTube the rest. The amount owed the plaintiff may rise, as the jury will deliberate over potential punitive damages for egregious conduct, per The New York Times.

This is the first trial to tackle the legal question of whether features of social media, such as autoplay, infinite scroll and beauty filters, can cause harm to users.

“This momentous verdict shows that tech companies will be held accountable for the harm they cause. These companies have spent years choosing profit over people’s well-being, and now a jury has decided they must pay the price for their actions,” said Maddy Batt, a legal fellow at Tech Justice Project, a law firm specializing in suits against AI chatbots.

The plaintiff, KGM, filed her lawsuit using a pseudonym in 2023. KGM, now 20, says she has been addicted to social media since she was a child. It was one of three cases selected out of thousands as “bellwether trials” to test out a new theory of liability.

Batt cautioned that the outcome of this trial doesn’t mean “an automatic legal win” for the thousands of pending cases, as determining causation varies greatly given the circumstances. “Each individual plaintiff still does have to show, if they go to trial, that any negative mental health outcomes they personally experienced were linked to social media,” she said.

It is a huge boon to tech accountability advocates to see this success though, Batt said, and could lead to tech companies changing their products because of the amount of money in play to settle cases or pay damages. This jury decision, coupled with a $375 million verdict against Meta announced yesterday, is the first step to achieving that goal.

New Mexico Attorney General Raúl Torrez sued Meta in 2023, alleging the company misled constituents over how safe its platforms are for children. State prosecutors focused specifically on Instagram’s potential to facilitate the sexual exploitation of kids.

On Tuesday, a jury sided with New Mexico, saying the company also engaged in deceptive trade practices. Meta was ordered to pay $5,000 per violation — $375 million total. Torrez will seek further remedies at a future bench trial, and hopes to compel changes to the platform. Meta said it plans to appeal.

Batt pointed out that this trial is the first time tech leaders like Mark Zuckerberg have had to make a case and submit to questioning in front of a jury of their peers. (The CEO did not take the stand in the New Mexico case.) Large tech companies have faced a public backlash over the past decade, and much of it has revolved around their products’ impact on the mental health of young people.

Frances Haugen, a whistleblower, leaked internal research documents from the company previously known as Facebook showing girls reported their eating disorders worsening after using Instagram. Social media use can prompt girls to compare and criticize their own bodies, and many companies struggle to moderate such content on their platforms.

Over two-thirds of teenage girls reported using Instagram, more than boys. A quarter each of Black and Latinx teens said they use Instagram and YouTube “constantly,” according to a survey by Pew Research Center.

Google argued that YouTube was not social media, while Meta disputed the causes of KGM’s anxiety, depression and body dysmorphia. Meta’s lawyers deconstructed KGM’s home environment, alleging her parents’ divorce and treatment by her mother were the root cause of her emotional pain. The companies also argued that it wasn’t the way their products were designed that caused problems, but rather the specific content seen.

KGM originally named the companies behind Snapchat and TikTok in the lawsuit, but those parties settled for an undisclosed sum before the trial started. The trial focused on Instagram and Facebook, both Meta products, and YouTube, which is owned by Google.

The burden was on KGM’s lawyers to prove that Meta and Google were negligent in their design of social media products and show that those same products caused the plaintiff’s mental health issues. The jury agreed with those arguments.

KGM testified that features like notifications kept pulling her back, and she was unable to stop whenever she tried to limit her usage. She said she started her first Instagram account at age 9 and joined YouTube at age 10, even though legally kids aren’t supposed to have online accounts before they’re 13. Almost all of her Instagram posts had image filters on them, and KGM said she didn’t feel bad about her body until she began using the platform.

The tech accountability watchdogs who rallied behind KGM are ecstatic over this win. “The era of Big Tech invincibility is over,” said Sacha Haworth, executive director of The Tech Oversight Project, in a statement.

For parents who have lost their kids to what many describe as social media-related harms, this is a moment of vindication.

“For years, families have been told this was a parenting issue, but the jury saw the truth: these companies made deliberate decisions to prioritize growth and profit over kids’ safety,” said Shelby Knox, director of online safety campaigns at nonprofit ParentsTogether.

Social media companies have been battling allegations of harm, particularly to kids, for years. Most of the claims are easily dismissed under Section 230, the law that says a platform isn’t held liable for third-party content it hosts. But these bellwether cases are testing whether the design of products like YouTube, Facebook and Instagram are inherently harmful. Plaintiffs have pointed to the impacts of features such as infinite scroll and face filters as harmful regardless of the content being shared.

The case concludes as Congress works to pass a package of internet bills intended to protect children online, but that critics say may lead to the removal of digital content and resources — a particular concern given the Trump administration’s policy positions.

In her statement, Haworth at The Tech Oversight Project called on lawmakers to pass the Kids Online Safety Act, one of the most hotly debated pieces of tech legislation in recent years. It has failed to pass the House since it was first introduced in 2022, but now is being considered as part of the aforementioned package.

“It’s good that people are suing these companies and winning in court to reduce their power and force them to change their policies,” said Evan Greer, director of digital rights nonprofit Fight For The Future, to The 19th. But she’s concerned about how the verdict in KGM’s case will be used to advocate for laws that she says could threaten free speech online.

Greer pointed to the way activists are using social platforms to monitor abuses by Immigration and Customs Enforcement, advocate for human rights and discuss accusations of sexual abuse against people like Jeffrey Epstein. “We need policies that address corporate abuse without kneecapping the ability of front-line activists to use social media to change the world,” she said.

Jess Miers, associate professor of law at the University of Akron School of Law, is concerned about the long-term consequences of the verdict. While these cases focus on the way platforms are designed, she said, in practice there isn’t a strong delineation between content and feature design.

“Autoplay is only engaging because of what it plays,” she told The 19th. “Infinite scroll only retains users because of what it surfaces.” She pointed out many apps use these kinds of features, but those aren’t the ones being sued.

Thus, liability tied to design will inevitably trickle down to judgments about content. “The only practical way to reduce the risks alleged in these suits is to restrict or suppress categories of content that might later be characterized as harmful or ‘addictive,’” she noted.

And what’s the content most likely to be labeled as harmful? “History shows they expand to cover disfavored speech—whether that’s reproductive health information, gender-affirming care, or speech about policing and immigration enforcement,” she said.

“The people most likely to be affected are those who already rely on the Internet as a primary space for connection and support,” Miers said — like disabled people, LGBTQ+ youth or people looking for accurate information on contraception.

This article was originally reported by Jasmine Mithani of The 19th.

‘Commons’ Founders Say Phone-Free Schools Rob Kids of Agency /article/74-interview-commons-founders-say-phone-free-schools-rob-kids-of-agency/ Wed, 25 Feb 2026 15:30:00 +0000

Over the past few years, the phone-free schools movement has rapidly gained steam, with states and school districts pushing to limit smartphone access during school hours. As of early 2026, many states have restricted or banned student mobile phone usage in K-12 classrooms. Companies like Los Angeles-based Yondr, which offer special magnetic pouches that lock phones away, are experiencing brisk business.

While the policies are almost uniformly popular, a few observers see a downside. The movement “happened so quickly there wasn’t a thoughtful, nuanced approach” to the problem of helping young people manage digital distraction, said Julia Gustafson, a public health expert who spent five years developing school partnerships for Yondr.




She and partner Shannon Godfrey last year founded The Commons, a technical solution to distraction that they believe offers the benefits of a bell-to-bell mobile phone ban while also teaching students how to manage their digital habits and build skills that give them greater agency, without hiding their devices in a pouch.

On its website, The Commons describes itself as “airplane mode for schools,” creating what amounts to a large geofence around a campus that essentially turns off the Internet during the school day. Schools can “whitelist” sites they need, such as Google Classroom, Khan Academy, Duolingo and the like, but others are inaccessible. Students keep their phones with them, but they must adjust the app’s settings to turn individual apps or games on.

Students who look for ways around the system trigger a notification that offers a “nudge,” giving them the opportunity to turn the apps off. If they don’t, alerts go to administrators, who can easily track down the student and address the issue.

At bell time, the geofence deactivates, said Gustafson. When students walk off campus, it deactivates as well. “It’s tier-one social norming,” she said. “Students are building the skills they need every single day, along with their peers doing the same thing. It makes the right choice the easy choice, by automatically silencing those distractions.”

Godfrey, whose background is in ed tech, said the app helps schools minimize distractions while helping students practice “healthier tech habits,” something bans don’t address. The habits, she said, “can transfer beyond the school walls” and help students develop life skills that will be valuable as adults. 

The 74’s Greg Toppo talked recently with Gustafson and Godfrey about what they see as the inadequacy of phone-free schools policies and, in Gustafson’s words, how such policies send “a completely mixed message” to kids about the power of technology. 

The conversation has been edited for length and clarity.

The 74: Let’s talk about the phone-free schools movement. I can’t remember the last time I saw something catch fire so quickly and grow so rapidly. I gather that you folks have a slightly different point of view on this in terms of distraction and keeping kids focused on school.

Julia Gustafson: It’s been simmering under the surface for a long time. People have noticed that there’s something wrong with how people are engaging with their phones, but more importantly, the addictive applications that are on the phones. COVID was a catalyst to people waking up and understanding that there truly is something wrong here. Going beyond that, it’s been a movement, both on a parental level and a school level, because we’re seeing teacher attrition rates higher than they ever have been. How can we support our teachers, and how can we support our students and parents getting intimately involved? 

It always takes a little while for research to catch up, and research has now finally caught up. That being said, the way in which it’s being handled, talking about it as bans and prohibition, is a surrender to not understanding what to do about a truly wide-spanning public health topic. A ban or prohibition is action, versus what we were doing before, which is inaction. But no one has really taken a thoughtful approach to thinking about how we can do this differently, with guardrails to support people’s interactions with phones. 

Shannon Godfrey: My background has been in education technology, and so I’ve seen the positive of when tech is used appropriately in the classroom to aid student success. Julia and I together come in with that thoughtful approach. But when you look at some of the research around neuroscience or behavioral science, adolescents haven’t yet developed the skills for self-regulation, impulse control, attention management. And most of the apps that are competing for their attention are intentionally engineered to make it hard to disengage — and that’s something we know adults struggle with too.

So to Julia’s point, this is really a societal problem and a public health issue. But the difference with adults is that we’ve had time and context to develop coping strategies. We’ve developed systems to manage the distractions, and it’s getting more difficult for students to be able to handle that. 

Our “a-ha” moment [was] having experience helping schools go phone-free, and seeing that the short-term, immediate impact was phenomenal, but really talking with schools about the exceptions [that didn’t work]. How do we start to use tech positively when we’re using Duolingo or mobile optimized apps in the classroom? How do we make sure that students are really developing some of the skills beyond the four walls of schools? We are having a lot of these conversations. We need something a little bit more intentional, and I think that is something tech can solve.

Julia, you used the word “surrender” earlier. I’m guessing that you would say a phone-free strategy doesn’t teach the skills of “saying no” and limiting your time on an app — or even learning about what the app is trying to do. I wonder if you could talk a little bit about that.

Gustafson: When policymakers are addressing demands from both parents and schools, what they’re lacking is that context in which technology is integral for anyone to be successful now, but also into the future. And so when you ban or prohibit something, that’s sending a completely mixed message to the students that the technology is to be embraced and it’s going to make you a leader — but needs to be locked away. Then we need dual-factor authentication to log into our Chromebooks, and so we take out this “prohibited” device, open it up to use the dual-factor authentication — but then are bombarded with 200 notifications from TikTok. So boom, this rebound consumption happens, and you’re locked into that as a distraction, vs. going on your phone, using it as the tool that it was designed to be, and being able to move forward.

I was listening to a radio program about phone-free schools the other day and one of the panelists said that if school is a place where we prepare young people for their life after school, there’s only one kind of job where they ask you to put your phone away: a low-paid service job. Do you have any ideas on that?

Gustafson: That goes back to what I was saying at the beginning: Using technology appropriately is integral to someone’s ability to be a leader in today’s society, so it is a huge mixed message when you’re telling somebody to lock a device away throughout the day instead of actually being able to utilize it when there are practical applications — and denying them that opportunity to learn the right time, place and manner to use that piece of technology. And you can think about that for phones, but you can also think about that for tablets and computers, which can be equally distracting during the school day.

Godfrey: We’ve had the opportunity to meet with students and have focus groups. Students are savvy and they’re smart. A lot of times with phone bans, are we saying that students can’t learn self-regulation and they can’t learn impulse control? When you talk with students, they’re saying, “Hey, I want schools to help me learn self-regulation. I just don’t necessarily agree we should pretend the phone doesn’t exist.” And in our focus groups, we have students come back and say, “Why is it so wrong if I believe the phone is my device of choice? Maybe it’s the only thing I can afford. Maybe it’s just what I’m used to using because they’re so sophisticated now and just readily available. But if I’m using it for academics, and I choose the academic app or to upload my Google Classroom or to submit an assignment or a chat in Google Classroom, why can’t I use my phone for that if that’s the appropriate time? Why can I use my computer in class but not my phone? If you’re helping us learn time, place and manner, then why is the phone so wrong?” You’re almost saying one thing but then asking them to do another. 

Let’s talk about The Commons: If I’ve got a game on my phone that doesn’t need Internet access, I’ve got access to that as well. What’s your thinking on that?

Gustafson: We do track the amount of time students are spending on that on their phone during the school day, so if a student downloads a game that doesn’t need Internet access, we can see on the admin dashboard that Greg has spent two hours on his phone today. That’s a little odd. Let’s go check in and see what the scoop is. So that’s one of the ways that we can try to prevent students from doing that. And then I’ll also just add that The Commons isn’t the school’s cell phone policy. This is a measure that gets inserted into the school cell phone policy to just help make it easier for that right time, place and manner, and for students to comply with it. So if I’m sitting there playing a game for two hours on my phone, I’m sure that someone is going to notice that, and that’s when that policy comes into play.

Turning off the Internet, for lack of a better term, seems like a smart move — with obviously these other sites whitelisted for school use. I guess somebody might squint and say it’s kind of the same thing as putting a phone in a pouch. What’s the difference?

Gustafson: The pouch doesn’t have any guardrails. So if a teacher decides, “Hey, everyone, take out your phone for Duolingo” in language class, it’s unfettered access all over again. You might get 100 notifications. It all comes back. But with The Commons app, you have the guardrails up at all times. You don’t actually need to lock a phone away. You don’t need to spend time taking a phone out of a pouch or getting it or retrieving it, plus it constantly has guardrails on so the focus can always be on the task at hand.

Can you dig in a little bit more deeply? What are students learning?

Gustafson: Behavioral economics really is the science about making the right choice the easy choice, by helping people make decisions that are ultimately the best for them. And so in the case of school, it’s being able to stay off of distracting applications. 

What are you actually learning to do better using this app?

Gustafson: We just interviewed some teachers right before the holiday break. What they were saying is, “We see that students just have more control over their phones. They’re not fiddling with them as much. They have better impulse control.” And that’s a huge win. We talked about behavior change. So much of this is an impulse for people to reach out to their device without actually understanding that they’re doing it until they’re already in their phone. If we can start controlling those impulses and allow people to develop the skill set of controlling their phone use when their phone is still next to them — because that’s the skill they’re going to need when going into college or their career — that’s a huge win for us.

Godfrey: We’re giving them a feedback loop. They’re taking their real device — they don’t have to lock it away and pretend it doesn’t exist — and learning how to manage it in the wild. Our students are recognizing that when I set foot on campus, this is time to put our phone away. It’s sometimes that subtle nudge I need, but it’s helping me build this habit. It’s helping me remember, “Yep, this is school time. This is my time to engage, my time to learn, my time to focus.”

And it’s been phenomenal. I’m getting better grades, and I’m playing with kids during recess, and we’re checking out basketballs, and I’m noticing my peers are interacting with us, and we’re paying attention to the teachers.

So the phone is sitting in front of me. I don’t have to put it in a pouch. I don’t even necessarily have to put it in my backpack. Yet all the things that I would use it to have fun with aren’t there. They essentially aren’t working. So how am I learning impulse control? 

Gustafson: Because of all the addictive apps on the phone, people are hardwired to reach for it, even if it doesn’t buzz, even if it doesn’t do anything — sometimes even the sight of it. It’s now wired in my brain that the minute I have a sense of boredom I’m pulling out my phone to cure that boredom. By reducing all of the fun and addictive apps on it, we’re actually helping rewire the brain to not want to continue. 

So it’s saying, “In certain conditions, this phone is not the same kind of machine.”

Gustafson: If for eight hours during the day when they’re at school, we’ve shifted their brain to understanding that this is a boring device and they have control over it — they have impulse control over that device — they’re now having the awareness to practice those same skills outside the walls of the school. 

One of the appeals of a phone-free school is that it’s very clean and easy for the adults. If every kid’s phone is in a bag, I don’t have to worry about it. What The Commons is doing, in a sense, could make life more complicated for certain adults, having to chase down the kid who’s on TikTok, or using some site they shouldn’t be. 

Godfrey: It’s interesting. From our experience and talking with schools, we see that a lot of programs with pouches roll out really successfully at the beginning, but then there are damages to pouches happening, or students sneaking a fake phone into the shoe rack. They’re working the system. Our schools are spending more energy playing Whack-a-Mole, and as those inconsistencies continue to creep up, the fidelity of the program starts to go away. And as the fidelity goes away, students are realizing that they can get away with it. And so then they do.

With our schools, what we’ve been able to do for the first time is actually help focus our administrators on where to put their attention: Where are students actually struggling with being able to put their phone down? Are these students who actually need more support and intervention? And when we also look at grades, attendance and some of these other data points and factors, if the phone is traditionally a root cause to a lot of these problems, how do we really support that student before they get off task and have a greater risk of not graduating?

ICE Taps into School Security Cameras to Aid Trump’s Immigration Crackdown /article/ice-taps-into-school-security-cameras-to-aid-trumps-immigration-crackdown-74-investigation-shows/ Tue, 10 Feb 2026 11:30:00 +0000

This story was co-published with 

Police departments across the U.S. are quietly leveraging school district security cameras to assist President Donald Trump’s mass immigration enforcement campaign, an investigation by The 74 reveals. 

Hundreds of thousands of audit logs show police are searching a national database of automated license plate reader data, including from school cameras, for immigration-related investigations.

The audit logs originate from Texas school districts that contract with Flock Safety, an Atlanta-based company that manufactures artificial intelligence-powered license plate readers and other surveillance technology. Flock’s cameras are designed to capture license plate numbers, timestamps and other identifying details, which are uploaded to a cloud server. Flock customers, including schools, can decide whether to share their information with other police agencies in the company’s national network. 

Multiple law enforcement leaders acknowledged they conducted the searches in the audit logs to help the U.S. Department of Homeland Security enforce federal immigration laws, with one saying the local assist was given without hesitation. The Trump administration’s aggressive DHS crackdown has had a significant impact on schools.

Educators, parents and students have been swept up, with immigrant families being targeted. School parking lots are one place the cameras at the center of these searches can be found, along with other locations in the wider community, such as on utility poles at intersections or along busy commercial streets.

The data raises questions about the degree to which campus surveillance technology intended for student safety is being repurposed to support immigration enforcement, whether school districts understand how broadly their data is being shared with federal agents and if meaningful guardrails exist to prevent misuse. 

“This just really underscores how far-reaching these systems can be,” said Phil Neff, research coordinator at the University of Washington Center for Human Rights. Out-of-state law enforcement agencies conducting searches that are unrelated to campus safety but include school district security cameras “really strains any sense of the appropriate use of this technology.”

Flock devices have been installed by more than 100 public school systems nationally, government procurement records show, and audit logs from six Texas school districts show campus camera feeds are captured in a national database that police agencies across the country can access. School district Flock cameras are queried far more often by out-of-state police officers than by the districts themselves, according to the records.

School police officers use Flock cameras to investigate “road rage,” “speeding on campus,” “vandalism” and “criminal mischief,” records show. There is no evidence school districts themselves use the devices for immigration-related purposes — or that they’re aware other agencies do so. 

Typical Flock automated license plate reader, mounted to a pole and powered by a solar panel (Wikipedia, CC)

Previous reporting revealed that police agencies nationwide were tapping into Flock camera feeds to help federal immigration officials track targets. In some cases, local law enforcement agencies enabled direct sharing of their networks with U.S. Border Patrol. 

Immigration officials’ unprecedented use of surveillance tactics to carry out their controversial mission has drawn widespread scrutiny. That school district cameras are part of that dragnet has not been previously reported. 

Randi Weingarten, president of the American Federation of Teachers, called the revelation an “egregious end run around the Constitution” that will add to the pressure on Congress to rein in U.S. Immigration and Customs Enforcement. By accessing campus feeds, she said, immigration authorities are violating the rights of students, parents and educators “to be free from unreasonable search and seizure.”

The teachers union sued the administration in September after it ended a longstanding policy against conducting immigration enforcement actions in and around schools.

“Schools are sacred spaces — and ICE knows it needs a judicial warrant to access them,” Weingarten said in a statement. The teachers union filed its lawsuit, she said, “so schools remain safe and welcoming places, not targets for warrantless surveillance and militarized raids.”

 High school students in Bloomfield, New Jersey, walk out of class on Feb. 3 to protest heightened federal immigration enforcement actions in the state. (Photo by Kyle Mazza/Anadolu via Getty Images)

‘The scale of it is phenomenal’

At the Huffman Independent School District northeast of Houston, records reveal it was the campus police chief’s administrative assistant who granted U.S. Border Patrol access to district Flock Safety license plate readers in May.  

Police departments nationwide also routinely tapped into the eight Flock cameras installed at the 30,000-student Alvin Independent School District south of Houston. Over a one-month period from December 2025 through early January, more than 3,100 police agencies conducted more than 733,000 searches on the district’s cameras, The 74’s analysis of public records revealed. Of those, immigration-related reasons were cited 620 times by 30 law enforcement agencies including ones in Florida, Georgia, Indiana and Tennessee. 

Dr. Ronald E. McNair Junior High School in the Alvin Independent School District. (Djmaschek, Wikipedia)

Flock offers a list of standardized reasons that agencies must choose from when running a search. For the Alvin school district’s cameras, immigration-related reasons identified by The 74 include “Immigration (civil/administrative)” and “Immigration (criminal).”

The data put into focus the scale of digital surveillance at school districts nationally and “just how dangerous these tools are,” said Ed Vogel, a researcher and organizer with The NOTICE Coalition: No Tech Criminalization in Education.

“The scale of it is phenomenal, and it’s something that I think is difficult for individual people in their cities, towns and communities to fully appreciate,” said Vogel, who’s also with the surveillance-monitoring Lucy Parsons Labs in Chicago. 

The Flock camera audit logs and other public records about their use by school districts were provided exclusively to The 74 by The NOTICE Coalition, a national network of researchers and advocates seeking to end mass youth surveillance. The 74 also filed public records requests to obtain information on schools’ use of Flock cameras and conducted an analysis to reveal the extent of the immigration-related searches. Those findings were shared with the law enforcement agencies and school districts mentioned in this story. 

Three of the 10 agencies that conducted the most immigration-related searches in the Alvin school district logs participate in the 287(g) program, which deputizes local officers to perform certain immigration enforcement functions and has also become a point of controversy. The program has expanded significantly during Trump’s second term.

Alvin school district Police Chief Michael Putnal directed all questions to district spokesperson Renae Rives, who provided public records to The 74 but did not acknowledge multiple requests for comment. 

Amanda Fortenberry, the spokesperson for the Huffman school district, said in an email the district is “reviewing the matters you referenced,” but declined to comment further.

Flock Safety, which operates across 7,000 networks nationally, didn’t respond to The 74’s requests for comment, nor did the Department of Homeland Security.

‘We will assist them — no questions asked’

Camera settings information obtained by The 74 through public records requests suggests that Alvin school district police officers are unable to search their own devices for immigration-related purposes. But the school system allows such queries routinely from out-of-state police officers, audit logs reveal. 

Flock searches for civil immigration reasons that appeared in the Alvin school logs, such as trying to locate someone who is unlawfully present in the U.S., were more than twice as frequent as those conducted for investigations involving immigrants suspected or convicted of committing a crime.

Also included among the reasons given for immigration-related searches are “I.C.E.,” in reference to Immigration and Customs Enforcement; “ERO proactive crim case research,” an apparent reference to ICE’s Enforcement and Removal Operations division; and “CBP Investigation,” an apparent reference to U.S. Customs and Border Protection.

In Carrollton, Georgia, officers routinely use Flock’s nationwide lookup to track suspects outside their jurisdiction, Lt. Blake Hitchcock said in an interview. Immigration-related searches that appear in the Alvin school district’s audit log by the Carrollton Police Department were conducted to assist federal agents at the request of the Department of Homeland Security, Hitchcock said. He declined to elaborate on specifics.

Federal agents “were working directly” with a Carrollton police officer who had access to the Flock cameras “and they asked him to run it and they did,” Hitchcock said. If federal agents ask his office to help them with an immigration case, Hitchcock said, “we will assist them — no questions asked.”

Flock searches are typically broad national queries, and officers do not select individual cameras, he explained. Instead, with each search request, the system automatically checks every camera that Flock customers share with the nationwide database, including those operated by school districts.

Because a school district is part of the national lookup, Hitchcock said, its cameras will be searched any time another participating agency conducts a nationwide inquiry. He said Flock’s nationwide search is helpful to track people who “go from jurisdiction to jurisdiction to commit crimes.” He pointed to a case in 2020 when Carrollton officers used Flock cameras to rescue a 1-year-old who was kidnapped at gunpoint some 60 miles away.

In Galveston, Texas, Constable Justin West confirmed that immigration-related searches that appeared in the Alvin school district’s audit logs from his department were tied to the county’s participation in the federal 287(g) program.

County deputies with federal immigration enforcement powers “have been working on arresting targeted criminal illegal aliens,” West wrote in an email, and use Flock cameras “to determine locations and travel patterns of the illegal aliens being sought.”

Galveston deputies’ Flock searches that appeared in the Alvin school district audit logs led to several arrests, West said, while several of the investigations remain ongoing. Flock logs show the Galveston County searches were conducted for both criminal and civil immigration investigations. 

While the Trump administration maintains its immigration crackdown centers on removing dangerous criminals, the share of detainees with no criminal record surged to 43% in January, and immigrants with no pending civil immigration actions against them have similarly been detained.

Other agencies that participate in the 287(g) program that were heavily represented in the Alvin ISD logs include the Texas Department of Public Safety and the Florida Fish and Wildlife Conservation Commission, each of which conducted more than 60 immigration-related searches that queried the school district’s cameras in the one-month period. 

The Texas Department of Public Safety and the Florida Fish and Wildlife Conservation Commission were among four agencies that did not respond to The 74’s inquiries about their searches. The other two were the Lowndes County Sheriff’s Office in Georgia and the Greene County Sheriff’s Office in Ohio. 

In Mesquite, Texas, searches labeled “Immigration (criminal)” were “conducted as part of an investigation to locate a suspect wanted on felony criminal charges,” Lt. Curtis Phillip said in an email. While the suspect “was believed to be unlawfully present in the United States,” Phillip said his department doesn’t use Flock cameras “for the purpose of enforcing federal civil immigration law.”

“When a search is conducted across the shared network, the activity may appear in the audit records of all participating system owners, even when the investigation itself is unrelated to schools or school-based activity,” Phillip said. “There is no efficient mechanism to exclude specific entities, such as school districts, from those searches.”

In Grant County, Indiana, 238 immigration-related searches included in the Alvin ISD audit logs were conducted “by one of our deputies” as part of “a confidential investigation,” Jay Kay, chief deputy of the county sheriff’s office, said in an email. He didn’t elaborate further. The reason given for the search in the Flock audit log was “Immigration (civil/administrative) – Test.”

It’s not clear whether every search tagged as immigration-related necessarily was. John Samples, captain of the Little Elm, Texas, police department, said a detective selected “immigration” as a search reason while assisting the Department of Homeland Security on a sex crimes investigation and a separate terrorism-related case. That word choice, Samples said, was “not the best course of action” and will be “corrected on our end.”

The police department in Texas City, Texas, denied it used the system to enforce federal immigration laws. While the agency monitors “several thousand Flock Cameras across the United States,” Captain Brandon Shives said his department’s searches in the Alvin ISD logs should not have been categorized as immigration-related and were the result of a “clerical error.”

‘Your community and beyond’

Flock Safety has repeatedly stated that it does not provide the Department of Homeland Security with direct access to its cameras and that all data-sharing decisions are made by local customers, including school districts. 

“ICE cannot directly access Flock cameras or data,” the company said in a recent blog post. “Local public safety agencies sometimes collaborate with federal partners on serious crimes such as human trafficking, child exploitation or multi-jurisdictional violent crime,” but decisions about “how data is shared are made by the customer that owns the data, not by Flock.”

The company acknowledged in August that it ran pilot programs with DHS to assist federal human trafficking and fentanyl distribution investigations, but said “all ongoing federal pilots have been paused” after the initiative faced scrutiny and legal pushback.

Public records provided by the Alvin school district, which began purchasing Flock cameras in 2023 and has since spent more than $50,000 on its eight devices, include Flock marketing materials that tout the ability to share data with other police agencies. 

“Not only do we place cameras where you need them,” the document notes, “we offer access to available cameras in your community and beyond your jurisdiction.”

In fact, nationwide sharing is a staple of Flock’s business model, said Dave Maass, director of investigations at the nonprofit Electronic Frontier Foundation. Maass has spent the last decade researching how police use automated license plate readers like Flock, including at least one case last year in which officers searched for a woman who had an abortion in Texas, where the procedure is illegal.

“That’s something that’s a selling point for them,” Maass said, adding that his research has shown that police agencies agree to provide outside officers access to their Flock data with little deliberation. 

“You know how maybe your grandparents approve every friend request they get on Facebook?” Maass said. “It’s like that. It’s always been like that. You’ll have an agency that will request access to other places and other places will just not even question it. They’ll just hit ‘sure, approve.’”

‘A unique level of responsibility to protect their students’

Flock Safety provides audit logs that allow law enforcement customers to see how their automated license plate reader cameras are being used. The reports “support accountability and public trust by making usage patterns visible and reviewable,” the company said in the recent blog post. 

None of the law enforcement officials contacted by The 74 said they used the audit logs to ensure people with access to their data queried the information for legitimate and legal purposes. Given the overwhelming volume of law enforcement searches that are included in the Alvin school district audit logs in just a month, Maass said, such reviews would be practically impossible.

Adam Wandt, an attorney and associate professor at New York City’s John Jay College of Criminal Justice, said license plate readers can be invaluable tools for solving serious crimes and finding missing persons. 

But he also acknowledged the devices present significant privacy concerns and questioned whether the broad sharing of school-controlled camera data violates federal student privacy rules. The revelation that school-owned Flock cameras are being queried for immigration enforcement purposes, he said, “will cause significant discussions to be had in the near future within many school districts” that contract with the company. 

“School districts are in a unique position, they have a unique level of responsibility to protect their students in specific ways,” including their privacy, Wandt said.

Vogel of the NOTICE Coalition said students and parents should demand transparency from their school districts about whether they employ Flock license plate readers and whether the data from those cameras are being fed to immigration agents. 

“These are just tools, and whoever has control over them gets to define how they’re used,” Vogel said. “I have a feeling that immigration enforcement was not one of the reasons that was discussed when they said, ‘We need to get a contract with Flock Safety.’”

]]>
Opinion: It’s Time to Embrace AI Literacy for Kids /article/its-time-to-embrace-ai-literacy-for-kids/ Sun, 08 Feb 2026 11:30:00 +0000 /?post_type=article&p=1028182 Artificial intelligence has become an incredibly polarizing topic, with one side eager to integrate it into every aspect of life and the other side running from it as fast as it can. Is this new technology an existential threat or a transformational opportunity? According to a Pew Research Center survey from September, “Americans are more concerned than excited” about the proliferation of AI and want to exert more control over its use.

About 62% of U.S. adults report interacting with AI several times a week, and adults and children alike engage on a regular basis with AI without even realizing it. Children are growing up in a world where this technology is unquestionably a part of daily life, shaping their lives in ways no one can yet fully understand. Giving them a clearer understanding of how AI works has never been more important.

This fall, the three of us met at an event at the National Children’s Museum, which brought together technology leaders, museum educators, policymakers, teachers and academic researchers focused on guiding our kids safely and productively into our technology-driven world.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


Our key takeaway? Regardless of where you stand on this issue, a common ground must be forged now. Constructive dialogue must happen, and it needs voices from both sides to produce a healthy outcome for our children. Helping kids understand AI means being both optimistic and cautious, recognizing its promise while acknowledging its shortcomings and risks.

What if, alongside helping our youngest learn to use AI, we placed greater emphasis on teaching them how it works? By nurturing children’s critical thinking skills, we give them the power to understand it as a tool—where it can augment human effort, and where it fails miserably.

AI is ushering in a new wave of innovation, but it is also enabling new forms of deception and manipulation. It provides access to a wealth of knowledge and opportunities, but the resulting information overload can undermine learning, cognition, creativity and human connection.

Society as a whole, from educational institutions to policymakers to parents at the dinner table, needs to invest in children’s AI literacy now. In doing so, we can instill some of the most important lessons: how to be creative and discerning in the world in which they live, preparing them for a future full of new opportunities.

According to the World Economic Forum’s Future of Jobs Report, employers expect that 39% of workers’ core skills will change by 2030, with technological skills gaining importance most rapidly. AI will open up new fields of biomedical research. It will help us feed our growing global population. But it will also force many of us to rethink our jobs and educational pathways.

So, on a global scale, an investment in our children’s AI literacy not only ensures a competitive workforce but also safeguards national prosperity, security and the responsible use of powerful technologies. Whether you think AI is exciting or threatening, children must be introduced to age-appropriate concepts about it so that they can build fluency and prepare for the future.

Another takeaway from our conversation? Adults must learn alongside — and sometimes from — our kids. As adults, we have the responsibility of fostering children’s safe use of this powerful tool. But let’s give ourselves the grace to acknowledge that we don’t understand AI either.  We didn’t grow up with it, and experts and technology leaders believe that generative AI has surpassed the understanding of its creators.

There is a window of opportunity to bring everyone to the table. As parents, educators and lifelong learners, we need to have deeper conversations about AI — especially how it shapes children’s learning, development and daily lives. We don’t have to fully comprehend it or agree with all its intended uses; we just have to be open to talking about it and taking action. By approaching this with curiosity, we can thoughtfully consider appropriate uses and guardrails for kids—something we didn’t do early enough when America’s children first began using online tools like social media.

There are organizations starting to address AI literacy and technology education for families. Sesame Street and Google collaborated to release a resource on the healthy use of digital technology. Common Sense Media, with support from the National Parents Union and EDSAFE AI, has a series of lessons about digital citizenship and AI arranged by grade level, and a guide for parents as well. The website provides research-based articles, podcasts and other resources to help parents navigate age-appropriate technology use. Children’s museums are developing hands-on, screen-free experiences to help demystify the processes underlying AI. There needs to be more of this, supporting children’s understanding of the fundamentals, not just how to use its applications.

AI’s purpose is not to replace human life, but to enhance it. Yet, the current conversation — especially around children’s use of AI — is too passive, treating these complex systems as inevitable rather than intentional creations. Educators, industry leaders and policymakers need to insist on a richer, more engaging dialogue about how it shapes kids’ learning, choices and experiences. 

Whether it’s the weather report from a smart device or personalized help from a chatbot, AI literacy is now essential for young people to navigate civic life. No matter your viewpoint, it is time to embrace AI literacy. The stakes are too high for anything less than universal, active participation in preparing children for the world they’re inheriting and will soon lead.

]]>
Today’s Kids Can’t Tell Time /article/todays-kids-cant-tell-time/ Tue, 03 Feb 2026 16:40:37 +0000 /?post_type=article&p=1028040
]]>
Study: 98% of Teens Attend Schools Limiting Cellphones, but Most Still Use Them /article/study-98-of-teens-have-school-cellphone-bans-but-majority-dont-follow-them/ Wed, 28 Jan 2026 18:30:00 +0000 /?post_type=article&p=1027779 As schools implement cellphone restrictions, new research shows that teens mostly support the policies — but that doesn’t mean they follow them. Students spend an average of an hour and a half on their phones in school every day, no matter how restrictive the policies are, despite the consequences.

A University of Southern California study published Monday surveyed roughly 1,700 parents and 364 students ages 13 to 17 last fall. Researchers used the annual survey to analyze students’ cellphone use and attitudes, along with parents’ perceptions of the restrictions. Most states have some form of ban or limitation on cellphones during instructional time.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


About 98% of students attend schools with cell phone restrictions, according to the study. Some 76% of teens and 93% of parents said they support some type of ban. 

But the researchers found that students still use their cellphones in school. About two-thirds of teens at schools with complete phone bans said they use their device during the day, including in class, and more than half of students whose school restricts cellphones during instructional time don’t follow the rules.

“The results are pointing towards both parents and teens wanting to have at least some form of restrictions on cell phone use in classrooms — neither are reporting major downsides,” said Anna Saavedra, one of the study’s researchers. “(Students and parents) are really supportive of the restrictions and they even support making rules stronger. Part of the challenge has been that even though schools have these rules, teens are telling us that they’re breaking them.”

Most students reported two categories of cellphone bans: either prohibiting use for the entire day or only during instructional time. Nearly 75% of teens said that no matter the policy, their school still lets them keep their phones with them. Some 5% said their school doesn’t permit cellphones on school property. 

The study also found that teens use their phones in school for an average of 1.5 hours a day regardless of the type of ban. That roughly matches other research that found students ages 13 to 18 spend an average of 70 minutes on their smartphones during the school day, typically using social media or gaming apps.

Restricting cellphone use only during class instruction is a rule that 68% of students and 53% of parents support. About 24% of teens and 7% of parents said they would prefer no restrictions.

Overall, 42% of teens and 76% of parents said their schools’ rules are “just right.” About 48% of students and 8% of parents thought they were too strict. Half of students said their school’s rules were different and stricter than the previous year’s. 

Most teachers enforce phone policies, according to the study. Nearly two-thirds of students said their teacher gives a verbal warning if someone breaks the rules. Other common consequences include taking the device away for the rest of class or for the entire day; notifying parents; giving detention; or requiring a parent to pick up the phone.

Though the rise of smartphones has been linked to negative student outcomes like poor academic achievement, the teens and adults surveyed by USC said they don’t believe cellphone policies have much of an effect. The majority said the rules had no impact in areas such as sense of community, relationships with teachers and bullying or fighting. The majority of students also said there was no effect on academic performance, making friends or their likelihood of attending school.

About 28% of the teens said the rules made the classroom learning environment better, while 26% said they made it worse. One-third of students said the policies improve academic integrity or reduce cheating, while 19% said the opposite.

A recent University of Pennsylvania study of 20,000 educators found that stricter cellphone policies are associated with more positive outcomes reported by teachers. Nearly half of schools in the study have a “no show” rule — where students can have their phones if they keep them out of sight — but this policy isn’t as effective as more restrictive rules.

“The stricter the policy, the happier the teacher and the less likely students are to be using their phones when they aren’t supposed to,” said University of Pennsylvania Professor Angela Duckworth about the data. “We’re also finding that focus on academics is higher in schools that do not permit students to keep their phones nearby, including in their backpacks or back pockets.”

Disclosure: The Overdeck Family Foundation provides financial support to The 74.

]]>
Indiana Senators Push Forward Social Media Limits for Minors, Stricter School Tech Policies /article/indiana-senators-push-forward-social-media-limits-for-minors-stricter-school-tech-policies/ Wed, 21 Jan 2026 19:30:00 +0000 /?post_type=article&p=1027221 This article was originally published in

Indiana’s Senate Education Committee on Wednesday advanced two bills aimed at reshaping how young Hoosiers interact with technology — one that would restrict minors’ access to social media platforms and another that would require schools to strengthen technology plans and give parents greater control over at-home device use.

, authored by committee chair Sen. Jeff Raatz, R-Richmond, passed the panel 11-2 and was recommitted to the Senate Judiciary Committee, where it must be approved before moving to the full chamber.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


Democrats Sen. Andrea Hunley, D-Indianapolis, and Sen. Shelli Yoder, D-Bloomington, were the only “no” votes against the social media bill.

The measure contains multiple provisions, but a highly-discussed section would substantially restrict minors’ access to social media. Under the proposal, social media companies like Meta would be required to obtain written parental permission before a minor under age 18 could create an account.

A previous version of the legislation contained similar social media restriction language but ultimately stalled in the House.

Indiana Secretary of Education Katie Jenner testifies before the Senate Education Committee on Jan. 7, 2026. (Photo by Casey Smith/Indiana Capital Chronicle)

Supporters argued in committee that the bill is a response to growing concerns over social media’s impact on children’s mental health and school environments.

Indiana Secretary of Education Katie Jenner emphasized the toll she said social media is taking on students across Hoosier schools.

“For most of us in the room, social media arrived when we were already well into adulthood,” Jenner said, adding that “our children growing up today do not have that same luxury” of a childhood free from constant comparison, cyberbullying, algorithm-driven content and addictive features.

But critics raised concerns about enforcement, privacy and rights of students.

Samantha Bresnahan with the American Civil Liberties Union of Indiana, for example, argued that such restrictions could infringe on minors’ constitutional rights and require intrusive data collection to verify age and consent.

Parents get more say

, authored by Sen. Spencer Deery, R-West Lafayette, takes a different tack on technology.

The measure, which passed the education committee 12-1, would require Indiana’s traditional public and charter schools to include in their technology plans a description of how they will enable parents to exercise control over school-provided devices when they are not in school and strengthen internet use and wireless communication policies.

If approved, schools must adopt policies by Jan. 1, 2027, that would let parents increase the strength of content filters on school-issued devices and limit the time students can use those devices outside school hours. The bill also directs schools to prohibit use of school equipment “for noneducational purposes during instructional time.”

Hunley was the lone vote against the proposal.

“I think that our school boards can already do this if they would like to,” she said. “I’m a big fan of home rule and local control, and I think that the level of government that’s closest to the school building should be the one to make this decision and enact this policy, not the state.”

Sen. Stacey Donato, R-Logansport, voted in favor but urged additional consideration of how parental controls might apply during e-learning days.

“We talked about the parental controls on an e-learning day, that (parents) may not want a YouTube video or a TikTok or pick-your-poison that may be used in structure for the educational experience,” she told Deery. “I just encourage you to look into that.”

Democrats also pressed for clarity on potential costs to schools.

Sen. Fady Qaddoura, D-Indianapolis, asked whether districts might need to spend money to implement stronger parental controls.

Deery said his office could not identify any examples where Hoosier schools would take on additional costs because most already contract with vendors that offer such functionality.

“We’ve yet to find any institution that does not have a contract with a vendor that does not offer this,” he said. “I’ve confirmed with virtually all of the major vendors. So, I’m not aware of (any costs schools would incur).”

Indiana Capital Chronicle is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Indiana Capital Chronicle maintains editorial independence. Contact Editor Niki Kelly for questions: info@indianacapitalchronicle.com.

]]>
Opinion: 2,739 Ed Tech Tools Later, Where Are the Outcomes? /article/2739-ed-tech-tools-later-where-are-the-outcomes/ Wed, 03 Dec 2025 13:30:00 +0000 /?post_type=article&p=1024367 Step into any school district today, and you’ll see it: a dizzying maze of educational technology tools. On average, districts access 2,739 tools annually. Ed tech providers roll out flashy features, sometimes without clear evidence that they actually improve student learning. And yet, when results fall short, districts are left paying for products that don’t deliver.

As districts navigate mounting financial pressures within a shifting K-12 funding landscape, the stakes could not be higher. The opportunity to invest in solutions that deliver outcomes has also never been greater.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


The call is simple: Buy what works, build for impact and hold everyone accountable to outcomes. Recent research conducted by EdSolutions revealed critical insights about how to make this happen.

When contracts focus on results over bells and whistles, every dollar stretches further toward meaningful learning gains. The question is no longer “What can this program do?” but “What outcomes will my students achieve as a result?”

This is the moment to move beyond feature checklists and unclear expectations for dosage. Districts and providers alike must embrace outcomes-based contracting, an approach that puts student learning at the center.

It’s not just about shifting financial incentives; it’s about ensuring shared accountability for implementation integrity. Every dollar should drive measurable student gains, not just fund another tool. Districts must weigh evidence of effectiveness as heavily as price when assessing value. Providers must clearly define the conditions — professional learning, supports, and implementation as designed — required to achieve results and ensure the product price reflects the full cost, including these conditions.

In today’s crowded edtech landscape, district leaders say they want to buy what works. A 2024 EdSolutions survey of 400+ educators shows most rely on evidence tools — 60% cite EdReports, 47% What Works Clearinghouse, 37% Evidence for ESSA — when considering options. Yet when it comes to actual purchasing, our analysis of district requests for proposals found price, not evidence, still drives decisions.

Why? Because quality evidence is scarce. Of 14 widely used K-12 math and literacy products we analyzed, only four earned the highest ratings for effectiveness. With limited proof and tight budgets, districts default to comparing features and costs — $20 vs. $40 — rather than asking which tool actually helps students learn.

Districts need to flip that script and push beyond price by asking: Does the evidence hold up in our context? Are the promised outcomes worth the investment? Providers need to shift the conversation by proving their products deliver results, not just bells and whistles. And funders need to step in to underwrite rigorous, independent studies that give the field the confidence it badly needs.

Buying the right product is just the first step. Without strong implementation support, even the best tools flop. Take a district that invests in a new math platform: It looks affordable on paper, but training is optional, usage is inconsistent and students don’t get the required practice. Results stall, teachers grow frustrated and the district ends up paying for something that never stood a chance.

The evidence is clear. Researchers from Northwestern University found that when teachers receive even modest implementation support, student gains are dramatically larger than when products are used off the shelf. Yet too many providers treat professional learning and requirements to use products as designed as “extras” rather than essentials.

Implementation has real costs: time, resources and training to use tools as designed. Providers should be transparent about these requirements and build them into their pricing and messaging. If a product’s effectiveness depends on dosage, training or fidelity, those elements aren’t optional; they’re part of the product itself.

Outcomes-based contracting transforms the provider-district relationship. By tying payments to student outcomes, districts must commit to implementing as designed, while providers must commit to delivering tools that actually work. Both parties have skin in the game.

The OBC approach sparks the critical conversations that traditional contracts don’t always surface:

  • What outcomes do we expect, and how will we measure them?
  • Who is this product designed for, and is that population similar to our target population?
  • What implementation steps are non-negotiable and by whom?
  • What professional learning and time commitments are required?

Instead of retrofitting products for the wrong contexts, OBC clearly and strategically defines the outcomes and expectations upfront. Instead of hiding implementation requirements in the fine print, OBC makes them explicit and actionable. This goes beyond accountability for outcomes, creating a unique opportunity to improve both product design and teaching practice together by working through real-world usability challenges to achieve the product’s research-backed intent. It’s a win-win.

Budgets are tight, communities want results and funders demand proof. Traditional contracting rewards features and sales; OBC rewards outcomes. It’s time to flip the script — and pay for what works.

]]>
From Biker Bars to Schools, Yondr Founder Sees Phone Pouches as ‘Impulse Disrupters’ /article/from-biker-bars-to-schools-yondr-founder-sees-phone-pouches-as-impulse-disrupters/ Tue, 02 Dec 2025 11:30:00 +0000 /?post_type=article&p=1024188 If you’ve been in a school recently, you’ve likely seen students tucking their mobile devices into those colorful, magnetic pouches.

As of last month, a majority of states had enacted phone restrictions in K-12 classrooms, with 27 banning phones in classrooms outright. In many cases, schools are asking students to drop their phones in Yondr pouches for the school day, at a cost of about $30 per student annually.

What you may not know is that the pouches have been floating around for more than a decade, first appearing in an Oakland biker bar — and that the man behind them had philosophers and novelists on his mind as he developed the idea.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


More than a decade later, Graham Dugoni sees the pouches as a low-tech, countercultural way to help young people begin to see unexplored frontiers in their own lives.

Born in Oregon in 1986, Dugoni briefly played professional soccer in Norway and the U.S. before taking his first real “adult” job in finance in Atlanta. He recalls a “Kafka-esque” experience toiling away in a windowless office — in his free time, he began immersing himself in philosophy and teaching himself jazz piano. 

Philosophers like Foucault got him thinking about technology and society, while jazz — with its improvisations and emphasis on self-expression — pushed him to explore broader themes of personal freedom.

A pivotal moment happened in 2012, when Dugoni, by then based in California’s Bay Area, was enjoying a music festival. He watched in shock as an intoxicated concertgoer danced uninhibitedly while a perfect stranger filmed him with a smartphone, then uploaded the video to social media. Dugoni began searching for a way to make such interactions impossible, wondering how he could create phone-free spaces that foster genuine connection — and a measure of privacy.

“To see someone just having a good time and being uninhibited and watching them be filmed and posted online,” he said in an interview, “I just followed it out logically. Where does that go?”

He’d read enough about the corrosive effects of technology to know that while tech can help create a more open, democratic society, “You don’t get something for nothing.” He knew that giving up privacy in the public sphere could have “a tremendously huge impact on people’s ability to communicate, to express themselves freely, to be swept up into a shared moment.”

In 2014, Dugoni developed the first magnetic pouches out of materials from his local hardware store and began selling them door-to-door — his first customer was a biker bar in Oakland that wanted to dissuade patrons from filming its burlesque shows. Around the same time, he signed his first school.

Then, in 2015, he got a call from comedian Dave Chappelle’s manager, who wanted to use the pouches at his shows to enforce a no-phones policy. That helped push Yondr into public consciousness, with schools, artists and venues soon queuing up.

Students placing mobile phones into Yondr pouches. The California-based company’s pouches are now used by about 2.5 million students in all 50 states and 45 countries. (Yondr)

The disruption of the COVID-19 pandemic began shifting parents’ attitudes around mobile phones and schools. And Jonathan Haidt’s 2024 book The Anxious Generation, which urged schools to go phone-free, pushed the company to even bigger prominence. Yondr now boasts about 150 employees. The company, which is privately held, doesn’t share revenue figures, but a spokeswoman said it has seen “sustained triple-digit growth” over the past three years. Its pouches are used by about 2.5 million students in all 50 states and 45 countries, and the company said the figure could triple once total sales are tallied by the end of the year.

TIME included the pouches in its annual list of best inventions — under the “Social Good” heading, which also included a new malaria vaccine and a 3D-printed resin water filter for people without access to safe drinking water.

By now, many students understand the importance of going phone-free, even if the locking pouch impinges on their social life. “It’s not the best, but I think it’s for the best,” one student said last spring.

The 74’s Greg Toppo recently chatted with Dugoni, 39, to ask him about the company’s origins, his philosophy and why he sees phone-free schools as spaces where kids can be kids, focus on their studies and develop vital relationships.

Their conversation has been edited for clarity and length.

I wanted to ask you about that 2012 music festival where you came up with the idea for the pouches. What was on your mind? 

I was looking at the smartphone, and the fact that everyone had a recording device, but also access to the Internet. I knew that that was a fundamentally new human experience, and that, from a pure sociology standpoint, there are going to be questions asked because of that that have never been asked before. No one’s had to ask questions about what degree of privacy can you assume in the public sphere. No one had to think about what effect would the ability to be recorded or show up online in any context do to social interaction, to the idea of privacy, to the idea of intimacy. 

This new tool, I felt, was ushering in these questions. But I was walking around San Francisco in my waking life, and no one else was aware of them. In an education setting, it was happening in a different way to the same degree: the push to put more tech into the classroom, faster, which was really nonsensical in a lot of ways. But at a music festival, to see someone just having a good time and being uninhibited and watching them be filmed and posted online, I just followed it out logically. Where does that go? 

I had read enough of people like Foucault and things like that to understand what that ultimately leads to. In a lot of tech society, there’s this idea that transparency in all things is going to create a more open society and more democratization. And like anything, you don’t get something for nothing. You give something up. And that’s how I saw it playing out. If there’s no degree of privacy in the public sphere, I saw it having a tremendously huge impact on people’s ability to communicate, to express themselves freely, to be swept up into a shared moment — things that are deeply valuable for an individual’s psychology, but also the collective consciousness and experience of civil society.

You guys strike me as a privacy company, first and foremost, but also a tech company that’s turning back the clock, in a way. Is that the way you see yourselves?

Not really. I would say we’re a bit of a counterculture company, really. And I would say we’re definitely not a tech company.

I purposely, early on, did not go with early venture capital money because there’s a certain profile that those companies have to follow. What I’m about, especially for young people in a school setting, but also people in daily life, is a sense of choice and a sense of freedom, and especially showing this younger generation that there is a way to walk through the world that’s not completely mediated by screens and the Internet. 

It’s not poo-pooing technology or what it can do. The question is, really, how do you integrate it into our lives? And I don’t think anyone has a perfect answer for it. But I’ve always felt that phone-free schools and spaces, that Yondr started — we created that concept — is a really good way to give people some sense of what that is, because people have to experience it. 

How quickly did you start thinking about schools as users of these pouches?

Our first customer was a venue, and we got a lot of notoriety early on from working with certain artists, like Dave Chappelle. But really, at the same time that we started working with a few venues, we got our first school customer around the Bay Area. So from the very beginning, the two pillars of the company have been centered on those two — that’s been lost in the general story a bit. Now, going around the Bay in 2014, talking about a phone-free school, you can imagine how many doors got shut in my face. But even then, from talking to teachers, I knew it was a huge problem — it just hadn’t floated up into general awareness enough for superintendents to take any notice of it. But teachers knew, even back then.

So where was this brave new school that came to you and said, “We need to do this”?

Well, they didn’t come to me. I went to them. I was going door-to-door. The first school that said Yes was Peninsula High School in San Bruno, south of San Francisco.

And what did they see that nobody else did?

I would say principals and teachers fell into two camps, for the most part, around phones. One group saw it as so far gone that this was a bell that could not be unrung. On the other side, you had teachers and people who knew it was a huge deal, but they were trying to figure out a solution. For a lot of reasons, it’s a difficult thing to unwind. It’s wrapped up with social behavior, social psychology, habits, all of those things. So this principal fell into that camp: someone who had the gusto, the energy and wanted to try to do something. I came to them and said, “Look, I think there’s a way to do this, and I think I can help you do it.” Now, I didn’t know anything about how to actually make it work, so it didn’t work so great in the early days. But we’ve spent the last 11 years figuring out all the things that have to go with it to make this work for a school, a district, and now whole states.

As you said, the ethos at the time was to get more tech in schools, not less. I can see what you were up against.

The drive, at the end of the day, to make things faster, easier, cheaper and more available, it’s very tantalizing. You’re turning kids and people in general into information-retrieval machines, which is very different than critical thinking.

What changed? Obviously COVID had a hand in this. What else? 

Eleven years ago, everything was different, and our team was out on the ground, going into schools. And basically the way we’ve grown as a company to where we are now — we operate in all 50 states, we’re in 45 countries and millions of students use Yondr every day — we did it brick-by-brick, school-by-school. We went in and helped them actually do it, figure out a policy, help them implement it, learn from them how to do it. We’ve had a huge ground game over the years. Up until COVID, we were building that out. We were building around pockets of teachers at first, who helped us figure it out, and then we realized we had to expand into the whole school to make it work. Then it started to grow. And we’re building up just by word of mouth, teachers and principals saying, “Hey, this works, and this company has helped us.” 

Then COVID hit, and that basically flattened out our business. We almost went under. But it also had an incredibly positive effect in the aftermath, because so many teachers — and parents especially — saw what it meant for their kid to be behind a screen for that long. They saw what was happening. So out of COVID, the conversation completely flipped. Whereas before our team was out kind of evangelizing, saying, “Hey, here’s what a phone-free school is, a phone-free space is” — we invented the term — we have people kicking it back to us now and saying, “Yeah, we get it. There’s a problem here, and we’re looking for a solution.” The zeitgeist really changed and people’s awareness clicked over. 

I guess Jonathan Haidt’s book didn’t hurt.

It added a lot of fuel to the fire, but it was, in terms of us, all the schools mentioned [in the book], they’re Yondr schools. So we already knew it. But the general awareness that it generated was tremendous.

A couple of weeks ago, I was in a school in Boston that’s using these pouches. My favorite comment from a teacher was, “My students are laughing at my jokes again.” What are some of the reactions that you remember?

Those are the little stories we look for. We have the case studies that show improvements in academic performance, teachers getting more teaching time back, students feeling safer on campus. But the way I see what we do is that it’s a broader cultural shift inside of a school. And so stories like you just mentioned, we hear that all the time: Teachers are seeing the students’ eyes again. We hear a lot that the body language, the posture of students inside the hallways, totally changes. We hear a lot of times that more books have been checked out in the first three weeks at a library than the entire previous school year.

One that’s most interesting to me, in a way, is we’ve heard from a lot of schools that more lunches are being eaten at the cafeteria. It’s not because the kids are less distracted. It’s because a lot of kids are afraid of eating lunch in the cafeteria because they don’t want to be filmed or recorded in an embarrassing moment and posted online. 

What I like about those stories is they help people who are not in the day-to-day, like teachers are, realize what an existential situation these young kids are stepping into. And it reframes that: A phone-free school is not taking something away from students. We’re trying to give them a space to be kids and to focus on their studies, develop the social relationships, a sense of identity that they’re going to need. And phone-free space is part of that.  

Speaking about technology, you recently said it has “this total neutralizing effect on people’s ability to express themselves, because there’s no such thing as intimacy without privacy.” That seems like a big part of this project.

It’s very difficult to find frontiers in modern society anymore: Places you can go where there’s unexpected things, there’s adventure, there’s a sense of unexplored territory. That’s especially hard for this younger generation, which has grown up always being able to look around corners. Things are curated and manicured, and they know where people are at all times. You can look at it through the lens of privacy, which is real, but also through that lens of just what’s unexplored. And when you go to a show that uses Yondr, it’s unexplored. What happens there is for the people who are there. And it makes the experience richer. It leaves a deeper impression on the people there. 

What about the ways students try to get around these pouches? How do you view that? Do you view that as helping you problem-solve or rethink the pouches themselves?

Of course it happens. We’ll talk to principals and be super candid: “You know the students who are going to buck against a new policy, and you know there are going to be students who smuggle a phone through their sock, or whatever.”

I always want to hear the stories. I smirk a little bit, because it’s good to see that students are using their ingenuity and being creative. But it’s not really about that. The broader message is that it precipitates a cultural shift in the school, where the expectation is that the school is phone-free, bell-to-bell. What we found is that after two or three weeks, that becomes the new normal. Once you establish that inside a school, and a culture that supports it, that’s the point. So if a student finds a workaround, or they want to bring in a phone, the important thing is that the community is ready to deal with that in a way that is appropriate for them. If you reinforce the benefits of a phone-free culture, eventually you win everyone over as they start to see the results.

So we’re not naive about it. We know we’re not going to win over every 16-year-old overnight. But we can convince them and show them that they might enjoy it once they’ve experienced it.

I was listening to a call-in show about phone-free schools the other day, and one of the panelists pointed out that if school is a training ground for students’ real lives, the only jobs where they’re going to have to put away their phones are low-paying service jobs. I’d never thought about it in those terms. Does that give you pause?

There’s something much more fundamental than that happening. I’ve talked to a lot of people in different state agencies. I can tell you they’re having an extremely difficult time hiring young people right now, and a lot of that comes down to their ability to focus, to think critically and to just socialize. Those are skills that you’re less likely to develop if you have a crutch in your pocket that makes those things less risky or easier. A lot of modern technology, it ultimately makes something easier. Now, that’s fine. We do a lot of trade-offs in our life for convenience. But when you get down to what education is about, it’s not just about using a tool. You have to be able to build up critical thinking muscles and some of the aptitude that’s going to carry you through life. 

People say, “Well, we should teach kids how to use these devices.” Absolutely. How do you plan to do that? If you have something in your pocket soliciting your attention all the time, that becomes basically wired into your central nervous system and always offers you a path of least resistance when anything difficult comes along, how do you plan to educate someone, especially a digital native who has no experience of the world without it? So it’s more, “How do you believe human psychology works, and how do you actually develop habits and patterns of thinking?” 

The pouch is more of an impulse disrupter. A student feels the phantom vibration in their pocket. They reach for it. Hand feels the pouch. You’re allowing a new pathway to emerge and develop that leads to a new habit. Because it’s hard to make the argument that young people are not going to have enough exposure to the Internet and their phones to learn how to use them. You can make a lot of arguments to say that six to eight hours a day without it to focus on their studies and being a kid is probably a good thing, given what we know. 

Last question: Talk about your tech habits.

I’ve had a flip phone for 10 years. I’m not saying everyone should do that. That’s my own choice. It makes a lot of things in life very inconvenient, very difficult. But on balance, it helps me because I have fewer inputs than the average person. My morning, I’m not flipping open the news and getting carried away to some place about things I can’t affect in any positive way, which is a big part of the modern world as well. If you allow everything to solicit your attention and your empathy, what are you left with to affect the things positively that you can control? 

That’s a funny effect of digital media in general: There’s a lot of important things, and it doesn’t mean you shouldn’t care about them, but what can you affect? For me, that’s a choice I have. So I operate in front of the computer, or I do phone calls. It slows my world down. I place a big emphasis company-wide on writing, on clarity of writing, and clarity of thinking that comes out of that. 

And then in my own home life, it’s all about boundaries. Technology as a theme — this is not just the Internet — it’s not totally neutral. Albert Borgmann and Martin Heidegger write about this: It’s not something that knocks at the door and asks permission to enter. You have to create boundaries. And to me, boundaries are best created in a physical way. So I use a computer in one room in my house. That’s it. So my mental associations are, if I’m here, I’m doing work. 

]]>
Opinion: Can AI Keep Students Motivated, Or Does it Do the Opposite? /article/can-ai-keep-students-motivated-or-does-it-do-the-opposite/ Fri, 24 Oct 2025 18:30:00 +0000 /?post_type=article&p=1022394 Imagine a student using a writing assistant powered by a generative AI chatbot. As the bot serves up practical suggestions and encouragement, insights come more easily, drafts polish up quickly and feedback loops feel immediate. It can be energizing. But when that AI support is removed, some students struggle to stay motivated.

These outcomes raise the question: Can AI tools genuinely boost student motivation? And what conditions can make or break that boost?


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


As AI tools become more common in classroom settings, the answers to these questions matter a lot. While general-use tools such as ChatGPT or Claude remain popular, more and more students are encountering AI tools that are purpose-built to support learning, such as Khan Academy’s Khanmigo, which personalizes lessons. Others, such as ALEKS, provide adaptive feedback. Both tools adjust to a learner’s level and highlight progress over time, which helps students feel capable and see improvement. But there are still many unknowns about the long-term effects of these tools on learners’ progress, an issue I continue to study as an educational psychologist.

What the evidence shows so far

Recent studies indicate that AI can boost motivation, at least for certain groups, when deployed under the right conditions. One study showed that when AI tools delivered high-quality performance and allowed meaningful interaction, students’ motivation and their confidence in being able to complete a task – known as self-efficacy – increased.

For foreign language learners, one study found that university students using AI-driven personalized systems took more pleasure in learning and had less anxiety and more self-efficacy compared with those using traditional methods. Another study, with participants from Egypt, Saudi Arabia, Spain and Poland who were studying diverse majors, suggested that positive motivational effects are strongest when tools prioritize autonomy, self-direction and critical thinking. These individual findings align with a broader analysis that found positive effects on student motivation and engagement across cognitive, emotional and behavioral dimensions.

A meta-analysis from my team at the University of Alabama, which synthesized 71 studies, echoed these patterns. We found that generative AI tools on average produce moderate positive effects on motivation and engagement. The impact is larger when tools are used consistently over time rather than in one-off trials. Positive effects were also seen when teachers provide scaffolding, when students maintain agency in how they use the tool, and when the output quality is reliable.

But there are caveats. More than 50 of the studies we reviewed did not draw on a clear theoretical framework of motivation, and some used methods that we found were weak or inappropriate. This raises concerns about the quality of the evidence and underscores how much more careful research is needed before one can say with confidence that AI nurtures students’ intrinsic motivation rather than just making tasks easier in the moment.

When AI backfires

There is also research that paints a more sobering picture. A study of more than 3,500 participants found that while human–AI collaboration improved task performance, it reduced intrinsic motivation once the AI was removed. Students reported more boredom and less satisfaction, suggesting that overreliance on AI can erode confidence in their own abilities.

Other research suggested that while learning achievement often rises with the use of AI tools, increases in motivation are smaller, inconsistent or short-lived. Quality matters as much as quantity. When AI delivers inaccurate results, or when students feel they have little control over how it is used, motivation quickly erodes. Confidence drops, engagement fades and students can begin to see the tool as a crutch rather than a support. And because there are not many long-term studies in this field, we still do not know whether AI can truly sustain motivation over time, or whether its benefits fade once the novelty wears off.

Not all AI tools work the same way

The impact of AI on student motivation is not one-size-fits-all. Our team’s meta-analysis shows that, on average, AI tools do have a positive effect, but the size of that effect depends on how and where they are used. When students work with AI regularly over time, when teachers guide them in using it thoughtfully, and when students feel in control of the process, the motivational benefits are much stronger.

We also saw differences across settings. College students seemed to gain more than younger learners, STEM and writing courses tended to benefit more than other subjects, and tools designed to give feedback or tutoring support outperformed those that simply generated content.

There is also evidence that general-use tools like ChatGPT or Claude do not reliably promote intrinsic motivation or deeper engagement with content, compared to learning-specific platforms such as ALEKS and Khanmigo, which are more effective at supporting persistence and self-efficacy. However, these tools often come with subscription or licensing costs. This raises questions of equity, since the students who could benefit most from motivational support may also be the least likely to afford it.

These and other recent findings should be seen as only a starting point. Because AI is so new and is changing so quickly, what we know today may not hold true tomorrow. In one recent paper, the authors argue that the speed of technological change makes traditional studies outdated before they are even published. At the same time, AI opens the door to new ways of studying learning that are more participatory, flexible and imaginative. Taken together, the data and the critiques point to the same lesson: Context, quality and agency matter just as much as the technology itself.

Why it matters for all of us

The lessons from this growing body of research are straightforward. The presence of AI does not guarantee higher motivation, but it can make a difference if tools are designed and used with care and understanding of students’ needs. When it is used thoughtfully, in ways that strengthen students’ sense of competence, autonomy and connection to others, it can be a powerful ally in learning.

But without those safeguards, the short-term boost in performance could come at a steep cost. Over time, there is the risk of weakening the very qualities that matter most – motivation, persistence, critical thinking and the uniquely human capacities that no machine can replace.

For teachers, this means that while AI may prove a useful partner in learning, it should never serve as a stand-in for genuine instruction. For parents, it means paying attention to how children use AI at home, noticing whether they are exploring, practicing and building skills or simply leaning on it to finish tasks. For policymakers and technology developers, it means creating systems that support student agency, provide reliable feedback and avoid encouraging overreliance. And for students themselves, it is a reminder that AI can be a tool for growth, but only when it is paired with their own effort and curiosity.

Regardless of technology, students need to feel capable, autonomous and connected. Without these basic psychological needs in place, their sense of motivation will falter – with or without AI.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

]]>
AI Is Being Used in Schools, but Statewide Guidance Is a Work in Progress /article/ai-is-being-used-in-schools-but-statewide-guidance-is-a-work-in-progress/ Wed, 03 Sep 2025 16:30:00 +0000 /?post_type=article&p=1020249 This article was originally published in

Brayden Morgan says artificial intelligence is here to stay and everyone should embrace it.

“We have to adapt. We have to stay up to date,” said the 17-year-old high school senior and student member on the Anne Arundel County Board of Education. “We have to learn about it and make sure our students know how to use it [the] right way [and] that they’re learning and not being enabled on technology.”

That may be easier said than done.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


The technology better known as “AI” is already being used by students and teachers in Maryland schools. But the state has yet to develop specific statewide guidelines on how to effectively use the powerful new computing tool, or what guardrails are needed to protect students from using it inappropriately, such as plagiarizing essays and other work.

State education officials have been working behind the scenes for more than a year on language, and county school systems have made tentative steps toward developing their own policies. But it’s been slow going.

Brayden Morgan, the student member on Anne Arundel County Board of Education, says schools need to adapt to the presence of AI. (William J. Ford/Maryland Matters)

Jing Liu said there are a couple of reasons many school districts in Maryland, and the nation, don’t yet have artificial intelligence policies in place.

“The AI space is developing really, really fast. All the AI tools are developed at lightning speed,” said Liu, an associate professor in education policy at the University of Maryland, College Park.

Liu, who also directs the school’s Center for Educational Data Science and Innovation, said evidence-based research needs to be done quickly to help inform policymakers and school district leaders on how to design AI policy. He said a policy would include certain tools used to meet education standards and guardrails to ensure appropriate uses.

“I think we are still at a very early stage in terms of understanding their [AI] impact,” Liu said. “There hasn’t been a lot of research looking at the impact of particular AI users on teacher and student learning outcomes.”

Maryland education officials said they have been working behind the scenes on AI guidance for more than a year.

A document from the state Board of Education summarizes artificial intelligence frameworks, including potential benefits such as tutoring and personalized learning assistance, support for creativity and collaboration, and operational and administrative efficiency. Some of the risks are plagiarism and academic dishonesty, overreliance and loss of critical thinking, and perpetuating societal biases.

State Superintendent Carey Wright said in an interview Thursday that statewide guidance on AI could be released by the end of the school year. In the meantime, Wright has advice for educators and other school leaders on effectively using AI in schools.

“The things that I would hope they’re doing is developing lesson plans that are aligned to our standards. That’s key because our statewide assessment is aligned to our standards,” she said.

“We don’t want just a hodgepodge of things being taught,” Wright said. “So, anything that they can do that is going to make their life easier, but also guiding children in [what’s] appropriate and what’s not appropriate, in terms of the use of AI.”

School district look

A few school districts have implemented AI guidelines.

Prince George’s County school leaders began rolling out guidance last school year that stresses professional learning, ethical considerations and curriculum integration.

For the 2025-26 school year that began last week, there will be follow-up meetings with stakeholders, training workshops for staff and school administrators to start assessing how to implement AI instruction in the classroom.

Students in at least one district can read about it in their new student code of conduct. The guidelines highlight definitions, educational and ethical uses, academic integrity, and supervision and monitoring.

There’s also a warning for prohibited conduct: “Any misuse of AI tools will be subject to disciplinary action. In certain circumstances, law enforcement may be notified.”

Frederick County public schools superintendent Cheryl Dyson talks with a student at Gov. Thomas Johnson High School last week, during the first day of school for students in the county. (William J. Ford/Maryland Matters)

Frederick County Superintendent Cheryl Dyson said the school board there is working on an artificial intelligence policy.

Last year, Dyson said curriculum writers used AI to generate topics related to the curriculum that young people would be interested in.

As for teachers, Dyson said they will not only know whether students produced their own work but will also help guide students to think critically.

“When you learn [the abilities of] a student, you can tell when something is an anomaly,” she said. “It’s really about explicit teaching of the writing process, or any process really about learning, because we want to know what they [students] know, not what the computer knows.”

Maryland State Education Association president Paul Lemle provided an example on how he utilized AI last year as a social studies teacher. Lemle asked students to compare a few political ads, but he also required that they make ads of their own.

“It was OK for them to use AI in that assignment,” he said. “If they wanted to use it to research the ads that they were comparing, fine. If they wanted to use it to suggest lines in their script, fine.

“But the AI couldn’t tell them this kind of ad will work in this kind of political context. They had to make that decision for themselves,” Lemle said.

Tiffany Carpenter, 25, said she began to use artificial intelligence during the first week of school to help with a lesson plan for the entrepreneurship class she teaches at Dr. Henry A. Wise Jr. High School in Prince George’s County. Part of that plan, she said, was utilizing AI to design a logo.

“AI’s logo-making isn’t always perfect, so I just use that as a template so that I can get a start,” said Carpenter, who’s going into her fourth year teaching at Wise. “It’s giving you ideas, not the final product. That’s what I tell and show my students. AI is just a tool to help. They still have to do the work and learn from it.”

But there is no foolproof safeguard: the risk remains that students will use the technology to easily obtain good grades without doing the learning.

‘Do more together’

With an ongoing statewide and nationwide teacher shortage, aspiring teachers will need to be taught about how to effectively integrate artificial intelligence into the classroom.

Darilyn Mercadel is doing just that at Bowie State University, where she is teaching several classes and is the coordinator of elementary education programs in the school’s College of Education.

Even before they enroll in college, many high school students already know how to use AI through programs such as ChatGPT, developed and released in 2022 by OpenAI. ChatGPT can translate complex topics into simpler sentences, and users can ask questions through text, audio or even image prompts.

But Mercadel stressed there are other programs, such as Adobe Firefly, which generates graphics and edits photos; Intellectus, which analyzes and breaks down data; and the Siri voice recognition program on iPhones.

Mercadel said students use a tutoring service program called “CircleIn.” Not only does the program provide academic support, it can also be used to collaborate with peers. But to ensure aspiring educators, whom she calls “preservice students,” understand a particular topic, Mercadel will also give an oral exam.

“There are other means that we can assess students besides your usual paper, or your typical test that we used to give before AI,” she said. “One thing about AI is that it’s a genie that’s out of the bottle. We’re not going to place that back.”

AI has cut Mercadel’s course-planning time from about two weeks to an hour. She stressed the human element remains integrated because “AI can’t do things that humans can do, which is the emotional side and the understanding of our students. It gives me more time to get to know my students and families.”

Mercadel, who taught early childhood education in Prince George’s and Montgomery counties, said another positive with AI is interacting with students who may be hearing impaired.

“Now students who can’t manage the occupational hand-holding of a pencil are able to tap or swipe to communicate. That’s a huge difference,” she said. “It’s assistive technology. We have these tools that make it easier, but that also gives us time to do more together.”


ChatGPT says it’s OK

As a policy, Maryland Matters does not use AI to write stories, but we did ask ChatGPT, one of the most popular programs, if AI should be allowed in K-12 classrooms for students and teachers. We were told in a split second, “Yes … with clear boundaries, ethical safeguards, and developmentally appropriate practices in place.” It went on to list:

Why AI should be allowed in K-12

1. Enhances teaching and learning

  • For Teachers:
    • Saves time (e.g., grading drafts, generating lesson plans, adapting materials).
    • Enables differentiated instruction and support for diverse learners.
  • For Students:
    • Offers personalized learning (e.g., adaptive math platforms).
    • Improves accessibility (e.g., AI captioning, translation for ELLs).
    • Can support creativity and exploration (e.g., storytelling, art, simulations).

2. Builds essential future-ready skills

  • AI literacy will be as critical as digital literacy.
  • Students need to understand AI’s capabilities, limits, and ethical dimensions.
  • Introducing AI responsibly prepares students for a workforce increasingly shaped by automation and machine learning.

3. Promotes equity (if managed carefully)

  • With proper funding and access, AI can help close gaps in learning support.
  • It can provide real-time feedback and scaffolding for students who need it most.

But use must be carefully controlled
AI should not be used indiscriminately. The risks of misuse, bias, over-reliance and data misuse are real, especially for younger students. Uses, by grade level, should be:

  • Kindergarten-2nd graders should use teacher-controlled tools for tasks like speech-to-text, story generation and reading apps.
  • Grades 3-5 should have high supervision with AI image tools, math help and idea generation.
  • Grades 6-8 call for guided creative and assistive use for essay feedback, science simulations and chatbots.
  • Grades 9-12 allow responsible, independent use for research, peer review, coding help and ethical analysis.

When AI should not be used

  • For final grading decisions or disciplinary choices.
  • As a substitute for human connection, mentorship, or social-emotional support.
  • In a surveillance capacity (e.g., emotion tracking, behavior prediction).
  • Without full transparency with parents and students.

Final position: Allow AI, with guardrails

AI belongs in the classroom — not as a replacement for teachers or learning, but as a tool for enhancing both. Its use must be:

  • Age-appropriate
  • Ethically grounded
  • Transparent
  • Optional and opt-in where appropriate
  • Accompanied by teacher training and student AI literacy education

Maryland Matters is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Maryland Matters maintains editorial independence. Contact Editor Steve Crane with questions: editor@marylandmatters.org.

]]>
Kids Shouldn’t Access Social Media Until They’re Old Enough to Drive, Book Says /article/kids-shouldnt-access-social-media-until-theyre-old-enough-to-drive-book-says/ Tue, 02 Sep 2025 10:30:00 +0000 /?post_type=article&p=1020144 Jean M. Twenge holds an unusual place among Ph.D. psychologists. For the past two decades, she has toggled between the obscurity of the academy and the glare of mainstream fame.

The author of two college textbooks and five books for non-academic readers, she is equally at home researching and writing about adolescent mental health, sleep disorders, digital technology, homework and narcissism. She was one of the first experts to warn that smartphones could have negative consequences for our mental health. A decade after the advent of the iPhone, Twenge went viral in 2017 with an Atlantic article that asked, provocatively, “Have Smartphones Destroyed a Generation?”


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


A professor at San Diego State University, she has collaborated for years with the researcher and author Jonathan Haidt, whose 2024 book The Anxious Generation was a mega-bestseller that has helped build momentum for school cellphone bans in a growing number of states.

And she is one of the few experts in the education and mental health world to have appeared on HBO.

Cover of Jean M. Twenge’s new book, 10 Rules for Raising Kids in a High-Tech World 

Twenge’s 2017 book, iGen, looked at how modern teens are somehow both more connected than previous generations and less prepared for adulthood. In it, she theorized that depression rates among teens are rising because they spend more time online, less time with friends in person, and less time sleeping — a problematic combination.

The dilemmas Twenge identified in 2017 are only getting worse: By 2023, the typical American teen was spending nearly five hours a day using social media, recent research finds, with severe depression rates rising. In one study, girls who were heavy users of social media were three times as likely to be depressed as non-users.

Her new book, 10 Rules for Raising Kids in a High-Tech World, out Tuesday, offers practical guidelines for parents raising kids in the age of ubiquitous connectivity and sophisticated — some would say addictive — social media.

Twenge doesn’t shy away from challenging harried parents to do better. Among her suggestions: No one — parents included — should have electronic devices in the bedroom overnight. Likewise, she says, the first handheld device a kid should receive is a “basic phone” that allows calls, texts and not much else.

“It’s a really big myth out there that if kids are going to communicate, it has to be on social media,” she said. “That’s just not true.”

Ahead of its publication, Twenge spoke with The 74’s Greg Toppo about her rules, her work with Haidt and her belief that we need stiffer laws that keep young people off social media until they’re old enough to drive.

Their conversation has been edited for length and clarity.  

I wanted to start with a quote from your book. It’s a parent’s description of his 10-year-old after she got her first smartphone: “She suddenly wasn’t playing with her younger siblings as much. Novels were promptly cast aside. She wasn’t around to help with dinner anymore. She danced less, laughed less. She was quieter. Our home was quieter.” That’s so heartbreaking, but I’m guessing it’s not unusual.

I don’t think it is. Many, many parents describe how their kids are different after they give them a smartphone. And it’s especially heartbreaking when that’s a 10-year-old, but even when it’s a 16-year-old who might otherwise be ready. It’s very noticeable how they change after they get that phone in their pocket.

Were there any particular data points about smartphones and social media that persuaded you they were causing a mental health crisis?

It was a slow process for me, and it wasn’t an immediate conclusion when I first started to see these trends in adolescent mental health. It was first a process of ruling out obvious causes, like the economy, which wasn’t aligned at all, and any other big events that might happen. I would trace it, really, to the big surveys that I work with on teens, where there was just this combination all at once of not just rising depression, but teens spending less time with each other in person and less time sleeping. And then realizing, “Well, wait: What might explain all of those things happening at the same time?”

And it seemed clear that a good amount of that answer is probably smartphones and social media, particularly after I found a Pew Research Center poll about the ownership of smartphones, which showed ownership crossing the 50% mark in the U.S. at the end of 2012. And that’s right around the same time all these changes were happening.

I want to dig into a few of your rules. No. 3: “No social media until age 16 or later.” That seems a lot tougher than what most families practice. Why 16? And what do you say to parents who worry about their kids’ social isolation and FOMO or Fear Of Missing Out?

I have not found that with my kids — that they’ve been socially isolated for not having social media. Most other parents I talked to who have put off social media have also not found that with their kids. Social media is just one mechanism for communicating. There’s so many others. Kids can call each other, they can text each other — they do a lot of texting. They can FaceTime each other, they can get together in person. Usually that ends up tilting toward texting, but it does not have to be social media. It’s a really big myth out there that if kids are going to communicate, it has to be on social media. That’s just not true.

And that leads to rule No. 4, where you advocate “basic phones” — your phrase — before smartphones. In a world where even school assignments need Internet access, is that practical for most families?

Yeah, because kids have laptops. And if the family can’t afford to buy them a laptop, almost all schools provide a laptop. So they have Internet access on their laptop even if they don’t have it on their phone. And laptops have come so far down in price too, that if you haven’t bought a laptop recently, or if you use Mac laptops like I do and my kids do now, you might not realize how inexpensive a basic laptop can be. So that’s another big thing: Maybe 10 years ago, if a kid didn’t have Internet access on their phone, then they didn’t have Internet access at all. That’s just not true in the current landscape.

Although you do have problems with school laptops.

Oh, yes. I mean, this is a thing! They get Internet access on the laptop, whether it’s a school laptop or a personal one, and then that opens a whole other can of worms. Absolutely true. Laptops are the bane of my existence as a parent, particularly the school laptop, although they’ve gotten a little bit better, at least in my district. 

Actually, that was going to be my next question, this parental controls thing. It sounds like your district is being responsive.

Well, on that issue, they still don’t have a coherent phone policy during the school day. In the high school, it’s especially bad. That’s something I’m hoping will change. It is changing in a lot of schools around the country, thankfully. A lot more schools are doing “no phones during the school day, bell to bell,” which is what needs to happen.

A big message of the book is phone-free schools. And I know you’ve worked with Jonathan Haidt, who has pushed for schools to get rid of phones. A few critics have said that this is a simplistic solution to a complex problem, and that it’s not entirely clear that phones are actually causing the mental health issues that Haidt has become a best-seller writing about. How do you respond to that criticism?

There are a couple of things to unpack there. For one thing, even if you take mental health out of the equation, kids should still not have their phones at school for academic and focus reasons, for the reason of developing social skills by talking to their friends at lunch, for the reason that a bell-to-bell ban is actually easier to enforce than a classroom-by-classroom ban. There are so many reasons for it that don’t even include mental health. 

The second question is [about] the research on phones and social media and mental health: We’ve known for quite a while that teens who spend more time on social media are more likely to be depressed or unhappy. Almost every single study finds that. Where you sometimes get more debate is, “O.K., that’s correlation. What about causation?” But in the last 10 years, we’ve gotten a lot more studies, and the studies that ask people to cut back or give up social media for at least three weeks to a month or so, almost all of those studies show an improvement in well-being. And I don’t want to get too in the weeds here, but that’s actually a little bit shocking, because by definition in those experiments, you’re taking people who are at average use and having them cut back to low.

That’s actually not where we see the biggest effects in the correlational studies. The heaviest users are much more likely to be depressed than the average or light users. So, you know, you can’t ethically do an experiment that would really answer the exact question: You can’t take 12-year-olds, randomly assign them to spend eight hours a day on social media, and then see what happens. At least I hope not.

In the book, you talk about the 10 rules “creating a firewall for kids against anxiety, attention issues and constant insecurity.” I think most parents would get behind that. But let’s be honest, they’re users of these tools themselves. How do we craft rules around web dependence and social media without being hypocrites?

Parents have to be role models. Parents are also allowed a small amount of what I call “digital hypocrisy.” Because they’re adults, they have jobs, they may be responsible for elderly parents, etc. But that said, parents should think about their technology use as well. They should get their phones and electronic devices out of their bedroom at night. They should also consider doing things like not having social media on their phone. If they want to use Facebook or Instagram or Twitter, do it on your laptop. That’s what I do. I mean, I don’t have much social media to begin with. I have X, but I don’t have it on my phone, and that’s very much a purposeful decision. During family dinners, unless there’s a really specific reason for me to have my phone with me, it’s upstairs.

That seems to be an easy one: Phones away at dinner.

Well, you’d think so, but you’ve got to get the whole family on board, and sometimes husbands are not really into that.

I want to skip to Rule No. 8: “Give your kids real-world freedom,” which will probably be met with some resistance. I have a 4-year-old grandson, and when I read your recommendation to let 4-to-7-year-olds go find items a few aisles away in the grocery store, I shouted, “Hell no!”

Why? Why is there, do you think, a resistance to that idea?

I have nightmares about this child being snatched from me at Safeway. I guess I want you to just pull me back from the edge, if you would.

I mean, that is not just unlikely to happen — the chances of that are so infinitesimal it probably shouldn’t even factor into our decision making. There’s one stat in there, and I forget the exact number, but someone calculated that if you wanted your kid to get kidnapped, how many hours — it turned out to be years — would they have to be in your front yard for that to happen? It’s something like 100,000 years. 

O.K., well that helps.

And a four-year-old loves that stuff! They love being grown up. I mean, look, even if you don’t do the grocery store thing, make sure they learn how to tie their own shoes, that they know how to get dressed. I remember when my girls were that age, and it occasionally amazed me when I would be with other moms in various situations and their kids couldn’t dress themselves at that age, and that’s where it starts. 

At pretty much every age, the great thing is that giving kids independence makes it easier for parents. It is easier as a parent if your 4-year-old can dress themselves. It is easier if your teenager makes dinner once a week. It’s good for everybody.

A lot of people might see this freedom rule as somehow contradictory to some of the other rules, in which you talk about adults being “in control.” Can you parse that?

For sure. Jon has said this as well — and I completely agree: We have overprotected kids in the real world and underprotected them online, and these principles are just trying to get those two to balance. When you’re talking about the real-world freedom thing, it’s not a matter of letting kids completely run wild and do whatever they want. We’re talking about giving kids some of the freedoms that parents themselves had when they were kids, and to build independence in a way that is really good for kids and good for them as they grow up.

I can’t even remember who said this to me when I had young kids: “You’re not raising children, you’re raising adults.” And that’s just so true. That is your job as a parent. Giving kids some freedom and independence is a really, really key part of raising an adult.  

I wrote a whole book about learning games, and one of the powerful ideas that I took from that reporting is that many adults don’t realize how social video games have become. You acknowledge that, saying gaming is the primary way that some kids spend time with friends. But I gather that you see the risks as well. And I wonder if you could talk about that.

It really comes back to the principle of “Everything in moderation.” Many games are not as obviously toxic as social media. Games tend to be more in real time, more interactive. But is it a good idea for kids to be spending five or six hours a day gaming? Probably not. There have to be some limits.

You quote Sean Parker, Facebook’s founding president, admitting they’re “exploiting a vulnerability in human psychology” to keep users on the app. Given social media’s sophistication, are mere parental rules sufficient? I mean, don’t we need a bigger hammer, like legislation and policies?

Absolutely! Yes! Yes! It would be absolutely amazing for parents and for kids if we had laws that verified age for social media. I mean, ideally, that would be age verification to make sure they’re 16 or older, to raise the minimum age to 16. But even if we just enforced existing law with the minimum of 13, that would be progress, given the enormous numbers of 10-, 11- and 12-year-olds who are on social media, often without their parents’ permission — often explicitly against their parents’ permission — and actually against the law [Children’s Online Privacy Protection Rule] that was passed in 1998.

What is the biggest obstacle to getting better regulation, or, to your point, to enforcing the existing regulations?

It’s interesting. The barrier is not the inability to verify age or the inability to verify age without a government ID. There are so many companies that will verify age now. It can be done in many different ways. The biggest barrier is tech companies themselves. Any time a state passes a law about verifying age on social media or even pornography sites, the companies sue — every single time. They have sued to keep those laws from going into effect.

Are there any emerging technologies that parents should be concerned about? Do your rules need updating for AI or virtual reality or whatever comes next?

AI chatbots are what a lot of parents are rightly worried about. And yes, you could certainly modify or add to the rules and say, “No AI chatbots until 16 or 18 — probably 18.” And of course, it depends on what we’re talking about. It is common for kids to use ChatGPT when they need to look up something for homework or even have it write their essays — that’s a whole other horrible discussion. But what I’m specifically referring to is the many chatbots out there right now that are supposed to be AI friends, or worse, AI girlfriends or boyfriends. There’s already been a tragic case of a child who died by suicide, apparently due to one of these AI girlfriends. It’s just really scary to think of kids having their first romantic relationship with an AI chatbot. It’s terrifying.

The good news is, if you follow that rule about your kids having basic phones, if you give them one of the phones that’s designed for kids, those phones do not allow AI relationship chatbots. It’s on their banned apps, just like social media and pornography and violence apps. Parents have such a tough job, and it’s nice that there are at least a few tools out there that can make their lives easier and keep their kids off of things like AI girlfriend and boyfriend chatbots.

In keeping with the theme of overwhelmed parents, I wonder: If I were to come to you as a parent and say, “Oh my God, Jean, 10 rules is a lot. If I could only do two or three, where would I start?” Is that even a smart thing to do? And if so, where would you start?

I would say, “No electronic devices in the bedroom overnight.” Start there, because the research is so solid on it, and it’s such a straightforward rule, and it works for everybody, of all ages. Your teenager can’t say, “Well, you do it differently,” or, “You get to be on social media.” No, actually, my phone is outside my bedroom when I sleep at night too. So that’s a great place to start. And then, just because they have so much utility, I would probably say the second rule, about basic phones, because even with all of the mess of the laptops, I’m just so happy and grateful that my kids did not have the Internet or social media in their pocket until they were older.

As a parent and a grandparent, I really appreciate you using your real life to inform a lot of these rules. In a way, it hardens them a bit, makes them more durable. Anything I haven’t asked you about that you feel needs to be in the mix?

Two things I’ll throw out there just in terms of pushbacks: With “No phones during the school day,” the pushback is often “What about school shootings?” And it’s actually less safe for students to have access to their phones during an active shooter situation. And I go through the reasons for that in that chapter. 

And then the real-world freedom piece: When you look at the things that I’m suggesting in terms of how to give your kids freedom, obviously letting them go off on their own in the real world is important, and you should do that too. But there are lots of things in that list of suggestions you can do without even leaving the house: teens making their own doctor and hairstylist appointments, for example, or middle-school kids, or even elementary school kids, cooking dinner for the family. Those are great experiences for kids to have without too much parental interference. 

You do have to — and I know this by experience — step back, especially with the cooking piece, and let them do it by themselves and learn how to make mistakes. It’s tempting to just be there when they’re doing that, but you learn quickly that if you leave them alone, they’ll figure it out. And then you can go do something else. Go and read that book you’ve been meaning to read for a while. Go for a walk. Watch TV. Have some relaxation time that you wouldn’t otherwise get. 

I wrote a piece a couple weeks ago on unschooling, this idea of pulling kids out of school and letting them find their own level and their own interests. This almost strikes me as unparenting.

It is — and I’m not a huge fan of unschooling, because it’s a rare kid it would actually work for — but it is. It’s the general idea that not being up in your kids’ business all the time is better for both parents and kids. It’s something we really have to consider more.

]]>
Proposed Indianapolis Public Schools Policy Offers Guidelines on AI Use /article/proposed-indianapolis-public-schools-policy-offers-guidelines-on-ai-use/ Fri, 27 Jun 2025 14:30:00 +0000 /?post_type=article&p=1017428 This article was originally published in

Indianapolis Public Schools is considering a policy on artificial intelligence that would guide the district as it experiments with AI tools for teachers and staff.

The policy — which the school board could vote on later this month — follows a yearlong pilot program in which 20 staff members used a district-approved AI tool to better understand its uses and challenges. Although the policy does not address specific acceptable student uses, it lists general guiding points for staff to ensure AI tools are appropriately used for teaching and learning.


“There’s still a lot to learn from a broader group of adult users before we’re putting students in an environment that maybe doesn’t match curriculum or what teachers are learning at the same time,” Ashley Cowger, the district’s chief systems officer, said. “We want to make sure that staff feel well equipped to determine what the boundaries are for use of AI in a classroom.”

The basic guidelines are one step into a new landscape that IPS and other districts now navigate. While districts could use tools like Google Gemini to cut down on time-consuming administrative tasks, they must also balance concerns over student data privacy.

Plus, IPS will launch a second phase of its pilot AI program this upcoming school year in which even more staff will use a generative AI tool — the chatbot Google Gemini. However, the district is not yet adopting a districtwide tool for staff.

“We are focused on playing the long game so that we’re not finding ourselves in a situation where we’re procuring a bunch of different systems and then those systems don’t meet our needs in a year or two,” Cowger told the school board in May.

The district did not respond to questions from Chalkbeat Indiana about the second phase of the pilot program and future rules for student use of AI tools by deadline.

AI could support lesson plans, create reports

The draft policy states that AI must be used to produce equitable outcomes while also adhering to applicable federal laws such as the Family Educational Rights and Privacy Act, which mandates the privacy of student records.

Staff must only use AI tools approved by the district, which would license the appropriate AI products, according to the draft policy. An “AI Advisory Committee” of administrators, teachers, and technology and legal experts would also provide input on the districtwide use of AI.

It’s unclear when the advisory committee would be created.

Acceptable uses of AI listed in the draft policy include using it to:

  • Draft communications such as emails and newsletters
  • Create data summaries or reports
  • Support lesson planning
  • Automate “repetitive, low-risk tasks”

The acceptable uses were shaped by this past school year’s pilot, which concluded that using AI helps staff do more complex analytical tasks with less human brainpower and capacity, Cowger told the board.

“It also allowed us to simplify administrative tasks,” she said. “Our schools send out newsletters every week. We also do district communication regularly. We have tools like a generative AI tool that can help us at least craft a first draft that doesn’t require 100% of human brainpower all the time.”

The first phase of the pilot included teachers, administrators, and central office staff.

Google Gemini will cost $177 per user in the second phase of the pilot program for 2025-26, Cowger said. The second, broader phase could help the district figure out how far it would like to stretch its use of AI going into the 2026-27 school year.

Cowger said the district’s “responsible use agreements” for district-issued technology, such as laptops, will also need to be updated to “encompass the world of AI.”

And although the district negotiated a cheaper cost per user for Gemini in the second phase of the pilot, Cowger said officials will have to think about the future potential cost if AI use grows districtwide.

Using free versions of AI tools comes with the risk of sharing sensitive student information — such as a student’s personalized education plan — on the internet, Cowger said.

The district has also outlined a “roadmap” for professional learning for staff that will be used in the upcoming school year.

“From hearing feedback from the pilot group over the course of this year we heard a lot of what people want to know. People want to know how the tools actually work. They understand it’s not magic, but they also don’t need to know all of the science behind it,” Cowger said. “Something in the middle that they need to understand.”

This story was originally published by Chalkbeat. Chalkbeat is a nonprofit news site covering educational change in public schools. Sign up for their newsletters.

]]>
Opinion: The Road to Educational Equity: Can Ed Tech Solve the Digital Divide? /article/the-road-to-educational-equity-can-ed-tech-solve-the-digital-divide/ Thu, 12 Jun 2025 14:30:00 +0000 /?post_type=article&p=1016827 In a nation where ZIP codes often determine opportunity, the promise of educational equity remains out of reach for millions of students. Despite years of reform, the link between a child’s environment and academic outcomes persists.

Today, as schools integrate digital tools into everyday learning, a new dimension of inequality has come into focus: access to technology. While some students benefit from personalized platforms and high-speed connectivity, others are still left behind, struggling to participate in a system that increasingly assumes digital access. The debate is no longer whether ed tech can improve education, but whether it will reach those who require it the most.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


The integration of technology into classrooms has the potential to improve learning, but only if access is universal. In reality, disparities in broadband connectivity, device availability, and digital literacy continue, especially in rural and low-income regions.

A 2024 report indicates that 43% of adults earning less than $30,000 annually lack broadband access, and nearly half of households making under $50,000 struggle to afford internet service.

This creates a “homework gap” that disproportionately impacts students in underserved communities, limiting their ability to complete assignments and engage with digital learning resources.

Beyond infrastructure, the challenge extends to technology deployment. Schools with more resources can invest in training educators, curating high-quality digital content, and supporting students with tailored interventions. In contrast, under-resourced schools may lack the technical assistance and instructional direction required for effective ed tech integration. Without thoughtful implementation, technology risks becoming a superficial fix rather than a meaningful equalizer.

To bridge the gap, tech access should be treated as a foundational right, not a privilege. That means investing in affordable internet for all households, making sure every student has access to a reliable device, and providing the support systems that make digital learning meaningful and accessible.

Ed tech, when designed and deployed with equity in mind, can be an effective tool to close learning gaps. AI-powered and gamified learning platforms, for example, offer the ability to personalize content to meet students where they are, regardless of age, ability, or background. 

Adaptive platforms, for instance, can recognize when a student is falling behind and adjust material in real time. Through milestones and rewards, gamified modules can keep students motivated. This is especially helpful for students who might otherwise lose interest in a rigid, one-size-fits-all approach. These features can have a particularly strong impact on classrooms with a wide range of learning needs but only a small number of teachers.

Too often, though, innovative learning technologies are piloted in affluent districts with the budget and infrastructure to support them, while the students who could benefit most remain out of reach. Without targeted strategies to expand access and usage, ed tech risks strengthening the very disparities it aims to address.

True equity means creating educational technology that represents the diversity of the learners themselves. This includes considering various cognitive styles, linguistic backgrounds, and cultural situations. Platforms should offer multilingual support, dyslexia-friendly fonts, sensory-sensitive modes for neurodiverse kids, and culturally relevant material. Without these design considerations, ed tech may inadvertently exclude the very students it aims to uplift, even when devices and internet access are available.

The answer lies not just in the tools themselves, but also in how and where they are deployed. Equity-focused implementation requires a commitment to both access and impact: ensuring students can use the technology, and that the technology truly supports their learning journey.

 This is not a challenge educators can tackle alone. It requires coordinated action from policymakers, district leaders, nonprofit partners, and the tech community itself.

Public investment should prioritize infrastructure development in under-served areas, such as expanding broadband coverage and subsidizing device distribution. Equally important is funding for professional development, helping teachers integrate digital tools into their pedagogy in ways that are culturally responsive, developmentally appropriate, and aligned with academic goals.

At the policy level, educational equity must be embedded into procurement decisions, funding formulas, and accountability frameworks. Leaders must ask not just whether technology is available in schools, but whether it is making a measurable difference for students who have historically been left behind.

Collaboration across sectors is critical. Nonprofits can help support communities in navigating the digital learning landscape. Tech providers can design solutions with accessibility and inclusion built in from the start. And local governments can act as conveners — aligning resources, reducing duplication, and ensuring families are supported beyond the school day.

There is no silver bullet to educational inequity, but there is momentum. Across the country, districts are experimenting with community Wi-Fi programs, public-private partnerships, and learning models that prioritize flexibility and student engagement. These efforts prove that with the right intentions, innovation and inclusion can go hand in hand.

What’s needed now is sustained commitment. We should resist the temptation to view ed tech as a short-term fix or an optional add-on. Instead, it must be approached as a core element of a broader equity agenda, one that prioritizes student outcomes, not just new tools.

Ed tech holds enormous promise, but only if we build systems that ensure its promise reaches every student. That starts with recognizing that the digital divide is not just a tech problem, it’s an equity problem. And equity is something we must design for from the beginning.

Opinion: Instead of Banning Cellphones in School, Our Connecticut District Embraced Them /article/instead-of-banning-cellphones-in-school-our-connecticut-district-embraced-them/ Tue, 20 May 2025 16:30:00 +0000 /?post_type=article&p=1015905 To many teachers and administrators, the biggest enemy of education sits in the pockets and backpacks of their students. Viewed as a classroom distraction, cellphones are being banned in K-12 districts across the country to ensure that social media and artificial intelligence apps are inaccessible during the school day.

While the intentions behind the bans are understandable, are schools unknowingly holding back students in the long run? 

At Meriden Public Schools in Connecticut, we were frustrated by our students’ growing dependency on their cellphones and the potential misuse of AI and other tech tools. But Meriden is also a district that pioneers innovation by embracing new technology and teaching methods. 

The reality is, technology isn’t going away — it’s only going to become more prominent in students’ everyday lives. By one estimate, AI and technology are expected to transform 86% of businesses in the next five years, making digital literacy a must-have skill for tomorrow’s workforce. As district administrators, we held the responsibility to foster responsible, productive digital citizens. We just had to find the right balance between traditional and tech-reliant learning.

The district’s acceptable-use policy provides a solid framework that encourages the responsible use of all technologies while allowing administrators the flexibility to pilot new tools. To help teachers and staff navigate the ever-changing AI landscape, our school leaders and instructional technology team created a library of documents and guidelines, including AI FAQs and an academic honesty and integrity checklist to use with students.

In addition, ensuring the effective use of technology has meant expanding our digital citizenship curriculum. All Meriden students complete grade-appropriate lessons each year, which cover topics including online safety, cyberbullying and how to build a positive online profile. While younger pupils participate in offline simulations to learn about the responsible use of social media in the future, older students can take classes in digital photography, video production and other tech-related topics.

Refining our technology guidelines required us to revisit our cellphone usage rules. While many districts have implemented bans, Meriden chose to take the opposite approach. School leaders realized that it’s not the device that matters, but quick and easy access to high-quality digital content. Meriden students have always been able to access digital curriculum through their Chromebooks in the classroom, but they prefer the convenience and familiarity of their smartphones.

So rather than sitting in a pouch all day, cellphones are now being used as learning tools. Meriden students use their phones to create photos, audio recordings and videos to demonstrate learning, monitor assignments and grades, and regularly communicate with teachers, counselors and coaches. They also rely on their phones to access critical AI learning tools, including one that generates personalized study guides and practice questions and another that teaches ethical digital practices and allows them to conduct research in a controlled environment.

To promote the effective use of AI, cellphones and social media, the district provides educators with training on integrating technology into learning and on student data privacy. While teachers can request that phones be “off-and-away” during class time, many have made them a part of their lessons. For instance, in math classes, students are encouraged to take photos of the examples and use them as guides when solving complex problems. In dual-enrollment public speaking classes, students record their speeches, which helps them work on timing, pacing and delivery. Similarly, in physical education classes, students use their phones to demonstrate proper form and receive feedback on personalized workouts.

Embracing technology allows educators the flexibility to facilitate small-group instruction during class time. While one group of students learns alongside the teacher, their classmates work on digital content at their own pace and grade level with a virtual tutor.

Some tools have helped educators automate daily tasks, such as generating rubrics and creating learning materials, while others streamline the grading process, alert teachers when students are copying and pasting text rather than doing original writing, and help ensure that students receive targeted, personalized instruction. Now, teachers can spend more time interacting with students and less on administrative duties.

As new tools and policies are implemented, the district has continued to keep parents in the loop with information sessions and regular communication. That open dialogue has prevented the pushback many districts have received. Most parents have been receptive to our “off-and-away” cell phone policy, not just from a safety aspect, but an educational one as well.

AI is already reshaping tomorrow’s workplace, and for the sake of students’ success, schools have to take the fear out of technology. Administrators should feel empowered to try different tools, show educators how AI can assist them in their daily operations and design curriculum that thoughtfully incorporates new technology. 

School leaders must do more than equip students with digital literacy skills — they need to teach them how to use digital tools appropriately and responsibly, to be good stewards of technology. There’s power in those cellphones sitting in students’ pockets and backpacks. It’s up to educators to get them to use it the right way.

Boys Outperform Girls in Middle School STEM, Reversing Gender Gap, Study Finds /article/boys-outperform-girls-in-middle-school-stem-reversing-gender-gap-study-finds/ Tue, 13 May 2025 12:30:00 +0000 /?post_type=article&p=1015122 Boys are surpassing girls in middle school math and science achievement, according to new research comparing three of the nation’s top academic assessments.

A report by the testing company NWEA shows a gender gap in eighth grade STEM achievement has returned following the pandemic.

Historically, boys have tested better than girls in math and science in middle school, said Megan Kuhfield, one of the NWEA report’s authors. But the gender gap disappeared in 2019, according to results from the Trends in International Mathematics and Science Study (TIMSS), an assessment administered across dozens of countries every four years. For the first time since 1995, girls outperformed boys in eighth grade math and science that year.




But TIMSS scores released in December 2024 showed that girls’ performance declined substantially more than boys’ in eighth grade science and math. The study found the same trend in two national tests: NWEA’s MAP Growth assessment and the National Assessment of Educational Progress (NAEP).

Across all three tests, gender gaps in math and science went from almost nonexistent in 2019 to favoring boys starting in 2022. The MAP Growth assessment — which is administered annually — shows that the gaps widened mainly between 2021 and 2024, when students returned to classrooms.

Kuhfield said the research is concerning because decades of progress in narrowing the gender gap in STEM achievement were wiped out in four years.

“It’s really hard to say definitively what’s happening here,” she said. “It’s the million-dollar question — why did we see these gaps close by 2019 and then reopen during the last five years?”

Researchers discovered that girls suffered more during COVID-19, but Kuhfield said if that was the main cause, reading test scores would have followed a similar pattern. Girls still outperformed boys in literacy on the latest NWEA and NAEP assessments, according to the study.

“That kind of led me to two other theories that are going on kind of in my head,” she said. “One being: Maybe there’s something about how teachers are interacting with students in the classroom — reinforcing old stereotypes of pushing boys [more] towards advanced math. We don’t have evidence of this.”

Kuhfield said her other theory is that there’s been a shift in education to focus on boys’ academic achievement as researchers have found they are falling behind.

The NWEA study includes recommendations for schools to improve equity in STEM education. Researchers suggest examining classroom dynamics and instructional practices to ensure boys aren’t receiving more teacher attention, and providing academic and emotional support — particularly to girls — to improve math and science skills.

Technologists Welcome Executive Order on AI in Schools But Say More Detail is Needed /article/technologists-welcome-executive-order-on-ai-in-schools-but-say-more-detail-is-needed/ Mon, 05 May 2025 16:30:00 +0000 /?post_type=article&p=1014744 This article was originally published in

Education software experts say they’re cautiously optimistic about a Trump administration drive to incorporate AI into classrooms, but such a program needs clear goals, specific rules — and enough money to fund the costly systems.

“AI is, inherently, really expensive,” said Ryan Trattner, CEO of AI-assisted studying tool Study Fetch. “It’s not something that scales like a normal piece of software where it might be the same price for 1,000 people to use it as 100,000.”
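Trattner’s scaling point can be sketched with toy numbers: a conventional site license is roughly a fixed cost at any scale, while AI inference adds a marginal compute cost on every query. Everything below — the function names, prices, and query volumes — is hypothetical, for illustration only.

```python
# Hypothetical cost models contrasting conventional software with AI tools.
# All figures are made up for illustration.

def conventional_cost(users: int, license_fee: int = 50_000) -> int:
    # A flat site license: roughly the same price at any scale.
    return license_fee

def ai_tool_cost(users: int, license_fee: int = 50_000,
                 queries_per_user: int = 200,
                 cost_per_query_cents: int = 1) -> int:
    # Each query incurs marginal inference (compute) cost, so the total
    # grows linearly with usage instead of staying flat.
    return license_fee + users * queries_per_user * cost_per_query_cents // 100

for n in (1_000, 100_000):
    print(f"{n:>7} users: conventional ${conventional_cost(n):,}, "
          f"AI ${ai_tool_cost(n):,}")
```

Under these toy assumptions, serving 1,000 users costs $52,000 but serving 100,000 costs $250,000: the same tool is not “the same price” at both scales, which is exactly the budgeting problem Trattner describes.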

Among a handful of education-related executive orders last week, President Donald Trump released an order to incorporate artificial intelligence education, training and literacy in K-12 schools for both students and teachers.




The move is in line with other actions Trump has taken to promote quick growth of artificial intelligence in the U.S., including rolling back the 2023 Biden administration executive order that aimed to promote competition within the AI industry while creating guidelines for responsible government use of the technology. Introducing AI to grade school children is meant to create an “AI-ready workforce and the next generation of American AI innovators,” the order said.

A task force made up of members from various federal departments — like the Departments of Agriculture, Education, Energy and Labor, as well as the directors of the Office of Science and Technology Policy, the National Science Foundation and other federal agency representatives — will be developing the program over the next 120 days.

Some makers of AI tools for students said they are cautiously optimistic about more widespread use of AI in schools, saying it would better prepare kids for the current workforce. But they say success with this program hinges on the ability to measure outcomes for AI learning, an understanding of how AI plays a role in society and a set of clear federal guidelines around AI, which the U.S. does not currently have.

Many students, parents and teachers are already using AI in some portion of their learning, often through AI-powered tutoring, counseling, training, studying or tracking tools mostly available from private companies.

Bill Salak, chief technology officer at AI learning and studying platform Brainly, said that many AI tools built for education right now aim to fill gaps in schools where teachers are often spread thin. They may be using AI tools to help them make lesson plans, presentations or study guides. Brainly was founded on the idea of simulating student-run study groups, and is a supplement to classroom learning, Salak said.

Salak is happy to see an initiative that will prompt educators to incorporate AI literacy in schools, saying he feels we’re in a “rapidly changing world” that requires much of the workforce to have a baseline understanding of AI. But he says he hopes the task force gets specific about their goals, and develops the ability to measure outcomes.

“I do think there will be further mandates needed, especially one in which we revisit again, like, what are we teaching?” he said. “What are the standards that we’re holding our teachers to in terms of outcomes in the classroom?”

Specific objectives may come after the 120-day research period, but the executive order currently says that the initiative will develop online resources focused on teaching K-12 students foundational AI literacy and critical thinking skills, and identify ways for teachers to reduce time-intensive administrative tasks, improve evaluations and effectively teach AI in computer science and other classes. It also seeks to establish more AI-related apprenticeship programs targeted at young people.

Trattner of Study Fetch said he’s eager to see a green light from the administration for schools to invest in AI education. The Study Fetch platform allows students and teachers to upload course material from a class, and receive customized studying materials. Trattner said that initially many educators were worried that AI would allow students to cheat, or get through classes without actually learning the material.

But he said in the last year or so, teachers are finding specific tasks that AI can help alleviate from their long to-do lists. Generative AI chatbots are probably not the best fit for classrooms, but specific AI tools, like platforms that help students learn their curriculum material in personalized ways, could be.

“Everybody knows this, but teachers are extremely overworked, with multiple classes,” Trattner said. “I think AI can definitely help educators be substantially more productive.”

But cost is something the committee should consider, Trattner said. The executive order calls for the development of public-private partnerships, and said the committee may be able to tap discretionary grant funding earmarked for education, but it didn’t outline a budget for this initiative. AI tools are often more expensive than other software that schools may be used to buying in bulk, Trattner said.

Some AI tools are targeted toward other parts of the school experience, like College Guidance Network’s Eva, an AI counseling assistant that helps users through the college application process, and helps parents with social and emotional dynamics with their children.

Founder and CEO Jon Carson said he’s not sure that this executive order will make a big impact on schools, because schools tend to follow state or local directives. He also feels like the current administration has damaged its authority on K-12 issues by attempting to shut down the Department of Education.

“In another era, we might actually even bring it up if we were talking to a school district,” Carson said. “But I don’t think we would bring this up, because the administration has lost a lot of credibility.”

Carson hopes the committee plans for security and privacy policies around AI in schools, and folds those principles into the curriculum. Federal guidance on AI privacy could help shape everyone’s use, but especially students who are at the beginning of their experience with the technology, he said.

A successful version of this program would teach students not just how to interact with AI tools, but how they’re built, how they process information, and how to think critically about the results they receive, Salak said. Educators have a right to be critical of AI, and the accuracy of information it provides, he said. But critical thinking and validating information is a skill everyone needs, whether the information comes from a textbook or an algorithm.

“In a world where there’s so much information readily accessible and misinformation that is so readily accessible, learning early on how to question what it is that AI is saying isn’t a bad thing,” Salak said. “And so it doesn’t need to be 100% accurate. But we need to develop skills in our students to be able to think critically and question what it’s saying.”

The specific recommendations and programming stemming from the Artificial Intelligence Education Task Force likely won’t come until next school year, but Salak said he feels the U.S. workforce has been behind on AI for a while.

“I really hope that we’re able to overhaul the agility at which the education institution in America changes and adapts,” Salak said. “Because the world is changing and adapting very, very fast, and we can’t afford to have an education system that lags this far behind.”

Opinion: Career-Connected Learning: Engaging Students by Teaching Real-World Skills /article/career-connected-learning-engaging-students-by-teaching-real-world-skills/ Tue, 11 Feb 2025 19:30:00 +0000 /?post_type=article&p=739790 The average American student spends roughly 15,000 hours in school between kindergarten and 12th grade, far more than the time needed to master almost anything. Imagine a school that reimagines these 15,000 hours to give graduates not only the foundational knowledge necessary to navigate life, but also the skills to pursue a career.

Such a school could expose students to a multitude of career fields, allow students to choose learning opportunities that reflect their passions, and facilitate credential-building experiences that support students in launching careers they care about – all before entering college or the workforce. 

This type of learning isn’t hypothetical, and it isn’t always restricted to high school. Innovative communities across the country are proving the power of career-connected learning – which integrates real-world skills and experiences into curricula – to give students of all ages the 21st-century know-how needed to thrive and lead in the future. 




Just outside Austin, Texas, IDEA Round Rock Tech recognizes that students must have access to computer science courses to be prepared for the region’s tech economy. The school implemented a comprehensive COMP3 (computer science, computational thinking, and general computing) progression for all of its pre-K through high school students. Programming languages like Python and JavaScript bolster students’ access to tech jobs (if they want them) and build the foundational logic and problem-solving skills they’ll need in any career.
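To make “foundational logic and problem-solving” concrete, here is a hypothetical beginner exercise of the sort such a progression might assign in Python. It is an illustration only, not material from IDEA’s actual curriculum.

```python
# A toy computational-thinking exercise: break a problem (tallying
# letter frequencies in a phrase) into small, checkable steps.
def letter_counts(text: str) -> dict[str, int]:
    counts: dict[str, int] = {}
    for ch in text.lower():        # step 1: normalize case
        if ch.isalpha():           # step 2: ignore spaces and punctuation
            counts[ch] = counts.get(ch, 0) + 1  # step 3: tally
    return counts

print(letter_counts("Round Rock"))
# {'r': 2, 'o': 2, 'u': 1, 'n': 1, 'd': 1, 'c': 1, 'k': 1}
```

The point is less the specific program than the habit it drills: decompose a problem, handle the edge cases, and verify the result.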

At the Brooklyn STEAM Center in New York City, 11th and 12th grade students from across the borough spend their afternoons “learning by doing.” Located at the Brooklyn Navy Yard, a robust industry ecosystem with over 400 businesses, STEAM students choose from six in-demand industries, engaging in professional work, developing robust industry networks, and ultimately creating tangible pathways to a career.

Students’ personal stake in the industry and opportunities they pursue is helping STEAM build toward its founding goal of transforming the “school to prison pipeline” into “school to career.” It’s working: 83% of STEAM’s first graduating class earned a career credential, 100% had a fully-developed post-secondary plan, and 95% enrolled in a four-year college.

Career-connected learning solves for the future by engaging students today. At the national nonprofit where I work, which is committed to extraordinary learning for all children, my colleagues and I are hearing from too many students that school is falling short. It is not engaging, relevant or connected to their real lives. They are telling us directly in surveys and in our continued tracking of Gen-Z engagement. They’re also telling us by simply not showing up to class.

By giving students agency to pursue the kinds of relevant, rigorous learning experiences they care about, career-connected learning can help solve the youth disengagement crisis. 

In Chicago, families designed Intrinsic Schools to address the troubling reality that only a small share of kids entering local public schools would earn a four-year degree by the time they were 25. Intrinsic built a unique school design where students personalize and own their learning with support from innovative technology that helps students and teachers know where to focus and adjust day-to-day.

For Isaaq, who went on to graduate from the University of Chicago with a degree in computer science and psychology, this flexible design was key in pursuing his budding passion for math. While taking three math classes concurrently – unheard of in a traditional curriculum that stresses sequential, paced progression – Isaaq launched a club around video games and used his math skills to code a real-time rankings system he’d been told “couldn’t be done.”

This student-centered design looks different for every kid, but gets results for most of them: more than 90% of the class of 2023 enrolled in college, compared to the national college enrollment rate of 39%.

Rural communities are also tackling student engagement with career-connected learning. In Colorado’s Clear Creek School District, students were increasingly disengaged in school as their community confronted a serious water crisis. Spurred by students’ advocacy for project-based learning, Clear Creek High School transformed 34 of its classes to tackle real-life challenges, in part by learning more about the careers that influence them. 

In AP Bio, students began learning about filtration systems and water quality. Some students delved into communications, fundraising, and liaising with school and business leaders. In just one school year, students’ belief that they’ve “seen adults in my school listen to the ideas and voices of youth when making decisions” grew from 45% to 54%. And the momentum generated by Clear Creek students led to a commitment of at least $150,000 to mitigate the water issues.

In each of these communities, career-connected learning is giving students a say in what, where, and how they learn. IDEA Round Rock, Brooklyn STEAM, Intrinsic, and Clear Creek are refusing to accept the limitations of a school model designed over a century ago, with students batched by age, curriculum standardized, and uniformity prized. Instead, these schools are elevating student voices and re-designing their education offerings to meet the needs of modern youth. 

Importantly, all of these schools arrived at their career-focused innovations through a process that starts by listening to students and engages the whole school community to reshape school to meet student needs. When we listen to students, they tell us they want to grow new skills, explore new opportunities, and build their own futures—starting in K-12.

These schools aren’t anomalies. Career-connected learning can take root in any community—red or blue, urban or rural, coastal or heartland—willing to come together to design learning that responds to the demands and opportunities of the 21st century. Our students are spelling out what they want from school today. It’s up to educators to listen to them and create schools that make their 15,000 hours count.

Tech Aims to Reduce Teacher Burnout – But it Can Sometimes Make it Worse /article/tech-aims-to-reduce-teacher-burnout-but-it-can-sometimes-make-it-worse/ Tue, 04 Feb 2025 19:01:00 +0000 /?post_type=article&p=739095 This article was originally published in

When we set out to study pandemic-related changes in schools, we thought we’d find that learning management systems that rely on technology to improve teaching would make educators’ jobs easier. Instead, we found that teachers whose schools were using learning management systems reported higher levels of burnout.

Our findings were based on a survey of 779 U.S. teachers conducted in May 2022, along with subsequent focus groups that took place in the fall of that year. Our study was peer-reviewed and published in April 2024.




During the COVID-19 pandemic, when schools across the country were under lockdown orders, they adopted new technologies to facilitate remote learning during the crisis. These technologies included learning management systems, which are online platforms that help educators organize and keep track of their coursework.

We were puzzled to find that teachers who used a learning management system such as Canvas or Schoology reported higher levels of burnout. Ideally, these tools should have simplified their jobs. We also thought these systems would improve teachers’ ability to organize documents and assignments, mainly because they would house everything digitally, and thus, reduce the need to print documents or bring piles of student work home to grade.

But the data told a different story. Instead of being used to replace old ways of completing tasks, the learning management systems were simply another thing on teachers’ plates.

A telling example was seen in lesson planning. Before the pandemic, teachers typically submitted hard copies of lesson plans to administrators. However, once school systems introduced learning management systems, some teachers were expected to not only continue submitting paper plans but to also upload digital versions to the learning management system using a completely different format.

Asking teachers to adopt new tools without removing old requirements is a recipe for burnout.

Teachers who taught early elementary grades had the most complaints about learning management systems because the systems did not align with their students’ developmental level. A kindergarten teacher from Las Vegas shared, “Now granted my kids cannot really count to 10 when they first come in, but they have to learn a six digit student number” to access Canvas. “I definitely agree that … it does lead to burnout.”

In addition to technology-related concerns, teachers identified other factors such as administrative support, teacher autonomy and mental health as predictors of burnout.

Why it matters

Teacher burnout has been a persistent issue in education, and one that became especially acute during the pandemic.

If new technology is being adopted to help teachers do their jobs, then school leaders need to make sure it will not add extra work for them. If it increases teachers’ workloads, then adding technology increases the likelihood that a teacher will burn out. This likely compels more teachers to leave the field.

Schools that implement new technologies should make sure that they lighten teachers’ workloads by offsetting other tasks, not simply add more work to their load.

The broader lesson from this study is that teacher well-being should be a primary focus with the implementation of schoolwide changes.

What’s next

We believe our research is relevant for not only learning management systems but for other new technologies, including emerging artificial intelligence tools. We believe future research should identify schools and districts that effectively integrate new technologies and learn from their successes.

The Research Brief is a short take on interesting academic work.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

]]>
Opinion: A Better Understanding of What People Do on Their Devices Is Key to Digital Well-Being /article/a-better-understanding-of-what-people-do-on-their-devices-is-key-to-digital-well-being/ Tue, 24 Dec 2024 17:30:00 +0000 /?post_type=article&p=736576 This article was originally published in

In an era where digital devices are everywhere, the term “screen time” has become a buzz phrase in discussions about technology’s impact on people’s lives. Parents are anxious about how much time their children spend on screens. But what if this entire approach to screen time is fundamentally flawed?

While researchers have made advances in measuring screen use, a detailed 2020 critique of the research identified problems in how screen time is conceptualized, measured and studied. Those problems cloud our understanding of how digital technology affects human cognition and emotions. My ongoing research with a cognitive psychologist builds on that critique’s findings.

We categorized existing screen-time measures, mapping them to attributes like whether they are duration-based or context-specific, and are studying how they relate to health outcomes such as anxiety, stress, depression, loneliness, mood and sleep quality, creating a clearer framework for understanding screen time. We believe that grouping all digital activities together misses how different types of screen use affect people.

By applying this framework, researchers can better identify which digital activities are beneficial or potentially harmful, allowing people to adopt more intentional screen habits that support well-being and reduce negative mental and emotional health effects.

Screen time isn’t one thing

Screen time, at first glance, seems easy to understand: It’s simply the time spent on devices with screens such as smartphones, tablets, laptops and TVs. But this basic definition hides the variety within people’s digital activities. To truly understand screen time’s impact, you need to look closer at specific digital activities and how each affects cognitive function and mental health.

In our research, we divide screen time into four broad categories: educational use, work-related use, social interaction and entertainment.

For education, activities like online classes and reading articles can improve cognitive skills like problem-solving and critical thinking. Digital tools like mobile apps can support learning by boosting motivation, self-regulation and self-control.

But these tools also carry risks, such as distracting learners and contributing to poorer recall compared with traditional learning methods. For young users, screen-based learning may even have negative effects on development and their social environment.

Screen time for work, like writing reports or attending virtual meetings, is a central part of modern life. It can improve productivity and enable remote work. However, prolonged work-related screen use may also lead to stress, anxiety and cognitive fatigue.

Screen use for social connection helps people interact with others through video chats, social media or online communities. These interactions can promote well-being and even bring health benefits, such as decreased depressive symptoms and improved glycemic control for people with chronic conditions. But passive screen use, like endless social media scrolling, can lead to harms such as cyberbullying, social comparison and loneliness, especially for teens.

Screen use for entertainment cuts both ways. Mindfulness apps or meditation tools, for example, can support relaxation and emotional regulation. Creative digital activities, like graphic design and music production, can reduce stress and improve mental health. However, too much screen use may harm well-being by limiting physical activity and time for other rewarding pursuits.

Context matters

Screen time affects people differently based on factors like mood, social setting, age and family environment. Your emotions before and during screen use can shape your experience. Positive interactions can lift your mood, while negative feelings can deepen with certain online activities. For example, we found that features of the digital environment affect how readily people become distracted on their devices. Alerts and other changes distract users, which makes it more challenging to focus on tasks.

The social context of screen use also matters. Watching a movie with family can strengthen bonds, while using screens alone can increase feelings of isolation, especially when it replaces face-to-face interactions.

Family influence plays a role, too. For example, parents’ screen habits shape their children’s, and structured parental involvement, along with mindful social contexts, can help reduce excessive use and support healthier digital interactions.

Consistency and nuance

Technology now lets researchers track screen use accurately, but simply counting hours doesn’t give us the full picture. Even when we measure specific activities, like social media or gaming, studies don’t often capture engagement level or intent. For example, someone might use social media to stay informed or to procrastinate.

Studies on screen time often vary in how they define and categorize it. Some focus on total screen exposure without differentiating between activities. Others examine specific types of use but may not account for the content or context. This lack of consistency in defining screen time makes it hard to compare studies or generalize findings.

Understanding screen use requires a more nuanced approach than tracking the amount of time people spend on their screens. Recognizing the different effects of specific digital activities and distinguishing between active and passive use are crucial steps. Using standardized definitions and combining quantitative data with personal insights would provide a fuller picture. Researchers can also study how screen use affects people over time.

For policymakers, this means developing guidelines that move beyond one-size-fits-all limits by focusing on recommendations suited to specific activities and individual needs. For the rest of us, this awareness encourages a balanced digital diet that blends enriching online and offline activities for better well-being.

, Doctoral student in the College of Health and Human Development,

This article is republished from The Conversation under a Creative Commons license. Read the original article.

]]>
AllHere CEO Arrested for Fraud /article/allhere-ceo-arrested-for-fraud/ Fri, 20 Dec 2024 15:55:32 +0000 /?post_type=article&p=737531
]]>
Computer Programs Monitor Students’ Every Word in the Name of Safety /article/computer-programs-monitor-students-every-word-in-the-name-of-safety/ Sat, 26 Oct 2024 12:01:00 +0000 /?post_type=article&p=734595 This article was originally published in

Whether it’s a research project on the Civil War or a science experiment on volcano eruptions, students in the Colonial School District near Wilmington, Delaware, can look up just about anything on their school-provided laptops.

But in one instance, an elementary school student searched “how to die.”

In that case, Meghan Feby, an elementary school counselor in the district, got a phone call through a platform called GoGuardian Beacon, whose algorithm flagged the phrase. The system, sold by educational software company GoGuardian, allows schools to monitor and analyze what students are doing on school-issued devices and flag any activities that signal a risk of self-harm or threats to others.




The student who had searched “how to die” did not want to die and showed no indicators of distress, Feby said — the student was looking for information but was in no danger. Still, she values the program.

“I’ve gotten into some situations with GoGuardian where I’m really happy that they came to us and we were able to intervene,” Feby said.

School districts across the country have widely adopted such computer monitoring platforms. With the youth mental health crisis worsened by the COVID-19 pandemic and school violence affecting more K-12 students nationwide, teachers are desperate for a solution, experts say.

But critics worry about the lack of transparency from companies that have the power to monitor students and choose when to alert school personnel. Constant student surveillance also raises concerns regarding student data, privacy and free speech.

While available for more than a decade, the programs saw a surge in use during the pandemic as students transitioned to online learning from home, said Jennifer Jones, a staff attorney at the Knight First Amendment Institute.

“I think because there are all kinds of issues that school districts have to contend with — like student mental health issues and the dangers of school shootings — I think they [school districts] just view these as cheap, quick ways to address the problem without interrogating the free speech and privacy implications in a more thoughtful way,” Jones said.

According to the most recent youth risk behavior survey from the federal Centers for Disease Control and Prevention, nearly all indicators of poor mental health, suicidal thoughts and suicidal behaviors increased from 2013 to 2023. During the same period, the percentage of high school students who were threatened or injured at school, missed school because of safety concerns or experienced forced sex also increased, according to the CDC.

And the threat of school shootings remains on many educators’ minds. Since the Columbine High School shooting in 1999, more than 383,000 students have experienced gun violence at school, according to one tally.

GoGuardian CEO Rich Preece told Stateline that about half of the K-12 public schools in the United States have installed the company’s platforms.

As her school’s designee, Feby gets an alert when a student uses certain search terms or combinations of words on their school-issued laptops. “It will either come to me as an email, or, if it is very high risk, it comes as a phone call.”

Once she’s notified, Feby will decide whether to meet with the student or call the child’s home. If the system flags troubling activity outside of school hours, GoGuardian Beacon contacts another person in the county — including law enforcement, in some school districts.

Feby said she’s had some false alarms. One student was flagged because of the song lyrics she had looked up. Another one had searched for something related to anime.

About a third of the students in Feby’s school come from a home where English isn’t their first language, so students often use worrisome English terms inadvertently. Kids can also be curious, she said.

Still, having GoGuardian in the classroom is important, Feby said. Before she became a counselor 10 years ago, she was a school teacher. And after the 2012 Sandy Hook Elementary School mass shooting, she realized school safety was more important than ever.

Data and privacy

Teddy Hartman, GoGuardian’s head of privacy, taught high school English literature in East Los Angeles and was a school administrator before joining the technology company about four years ago.

Hartman was brought to GoGuardian to help with creating a robust privacy program, he said, including guardrails on its use of artificial intelligence.

“We thought, ‘How can we co-create with educators, the best of the data scientists, the best of the technologists, while also remembering that students and our educators are first and foremost?’” Hartman said.

GoGuardian isn’t using any student data outside of the agreements that school districts have allowed, and that data isn’t used to train the company’s AI, Hartman said. Companies that regulate what children can do online are also required to adhere to federal rules regarding the safety and privacy of minors, including the Family Educational Rights and Privacy Act and the Children’s Online Privacy Protection Rule.

But privacy experts are still concerned about just how much access these types of companies should have to student data.

School districts across the country are spending hundreds of thousands of dollars on contracts with some of the leading computer monitoring vendors — including GoGuardian, Gaggle and others — without fully assessing the privacy and civil rights implications, said Clarence Okoh, a senior attorney at the Center on Privacy and Technology at the Georgetown University Law Center.

In 2021, while many schools were just beginning to see the effects of online learning, The 74, a nonprofit news outlet covering education, published an investigation into how Gaggle was operating in Minneapolis schools. Hundreds of documents revealed how students at one school system were subject to constant digital surveillance long after the school day was over, including at home, the outlet reported.

That level of pervasive surveillance can have far-reaching implications, Okoh said. For one, in jurisdictions where legislators have expanded censorship of “divisive concepts” in schools, including critical race theory and LGBTQ+ themes, the ability for schools to monitor conversations including those terms is concerning, he said.

A report by the Electronic Frontier Foundation, a nonprofit digital rights group based in San Francisco, illustrates what kinds of keyword triggers are blocked or flagged for administrators. In one example, GoGuardian had flagged a student for visiting the text of a Bible verse including the word “naked,” the report said. In another instance, a Texas House of Representatives site with information regarding “cannabis” bills was flagged.

GoGuardian and Gaggle both also dropped LGBTQ+ terms from their keyword lists after the foundation’s initial records request, the group said.

But getting a full understanding of the way these companies monitor students is challenging because of a lack of transparency, Jones said. It’s difficult to get information from private tech companies, and the majority of their data isn’t made public, she said.

Do they work?

Years before the 2022 shooting at Robb Elementary School in Uvalde, Texas, the school district purchased a technology service to monitor what students were doing on social media, according to newspaper reports. The district sent two payments to the Social Sentinel company totaling more than $9,900, according to the paper.

While the cost varies, some school districts are spending hundreds of thousands of dollars on online monitoring programs. Muscogee County School District in Georgia paid $137,829 in initial costs to install GoGuardian on the district’s Chromebooks. In Maryland, Montgomery County Public Schools dropped GoGuardian for the 2024-2025 school year after spending $230,000 annually on it, according to the Wootton Common Sense.

Despite the spending, there’s no way to prove that these technologies work, said Chad Marlow, a senior policy counsel at the American Civil Liberties Union who authored a report on education surveillance programs.

In 2019, Bark, a content monitoring platform, claimed to have helped prevent 16 school shootings in a post describing its Bark for Schools program. The Gaggle company website says it saved 5,790 lives between 2018 and 2023.

These data points are measured by the number of alerts the systems generate that indicate a student may be very close to harming themselves or others. But there is little evidence that this kind of school safety technology is effective, according to the ACLU report.

“You cannot use data to say that, if there wasn’t an intervention, something would have happened,” Marlow said.

Computer monitoring programs are just one example of an overall increase in school surveillance nationwide, including cameras, facial recognition technology and more. And increased surveillance does not necessarily deter harmful conduct, Marlow said.

“A lot of schools are saying, ‘You know what, we’ve got $50,000 to spend, I’m going to spend it on a student surveillance product that doesn’t work, instead of a door that locks or a mental health counselor,’” Marlow said.

Some experts are advocating for more mental health resources, including hiring more guidance counselors, and school policies that support mental health, which could prevent violence or suicide, Jones said. Community programs, including volunteer work or community events, also can contribute to emotional and mental well-being.

But that’s in an ideal world, GoGuardian’s Hartman said. Computer monitoring platforms aren’t the only solution for solving the youth mental health and violence epidemic, but they aim to help, he said.

“We were founded by engineers,” Hartman said. “So, in our slice of this world, is there something we can do, from a school technology perspective that can help by being a tool in the toolbox? It’s not an end-all, be-all.”

Stateline is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. Stateline maintains editorial independence. Contact Editor Scott S. Greenberger for questions: info@stateline.org.

]]>
Can AI Bring Students Back to the Great Books? /article/can-ai-bring-students-back-to-the-great-books/ Sun, 15 Sep 2024 11:01:00 +0000 /?post_type=article&p=732858 Is your teenager annoyed by Nietzsche? Confused by Conrad? Through with Thoreau? Now she can talk to the expert inside her e-book.

The creators of a new, artificial-intelligence-assisted publishing effort called Rebind hope that offering interactive, personalized guidance and commentary from well-known writers, scholars and celebrities will help bring classic books alive for students.

They’re also aiming to help adults who might otherwise struggle in solitude through these weighty volumes.




In the process, they predict, the titles could capture a much bigger audience, one that someday may be able to talk back to the experts and even influence how scholars interpret literature. 

The challenge is whether they can make the AI work without being creepy or intrusive.

The price: $29.95 per book, with multi-book subscriptions available. They also plan to offer discounts to schools and find philanthropic partners as underwriters. 

Among the key selling points of Rebind’s e-books is a clever synthesis of original commentary and “lite” AI that seamlessly matches the experts’ utterances to readers’ queries. So a student studying George Washington’s speeches could pose a question to none other than historian Doris Kearns Goodwin — or at least the version of her already pressed between the covers of an e-book on presidential speeches.

The improbable effort grew out of an equally improbable meeting between the philosopher and John Dubuque, great-grandson of the founder of the retail chain Plumbers Supply. Dubuque had spent 14 years as its CEO and sold the company in 2021, at age 38.

Suddenly retired, he set about reading philosopher Martin Heidegger’s famously difficult Being and Time, hiring an Oxford scholar for twice-weekly private tutoring sessions. 

“I had this amazing experience and realized at the end of it, ‘It’s too bad more people can’t access this,’” he said. “This is the only way I ever could have read this book.”

Dubuque also began playing with ChatGPT, asking it to summarize passages from equally difficult books like Alfred North Whitehead’s Process and Reality. He was deeply impressed with the AI, warts and all, and concluded that if someone could tame it for students, cut down on “hallucinations” and focus it on the books, it’d be a game-changer.

He shared his ideas with Kaag, who had helped him get through William James’ The Varieties of Religious Experience.

John Kaag

Kaag had just published Sick Souls, Healthy Minds: How William James Can Save Your Life, which resonated with his benefactor. Kaag, who as a kid had been a poor reader with a stutter, recounted to Dubuque how his mother would sit at their kitchen table and help him muscle through assignments. 

They realized that many people want to tackle classics like Moby Dick and James Joyce’s Ulysses, Dubuque said, but get intimidated by big, difficult books. “So they just give up and read things that they can read, not the things that they really want to read.”

‘We’re choosing the people and they’re choosing the books’

Kaag soon recruited his friend Clancy Martin, an author and professor at the University of Missouri in Kansas City, who signed on to help find “Rebinders” for at least 100 AI-assisted e-books, offering readers what amounts to a one-on-one conversation with a novelist, critic or historian about the book.

The endeavor already boasts an impressive stable of author-experts: the Irish novelist John Banville on Joyce’s Dubliners, Goodwin on U.S. presidents’ speeches, a novelist on Adventures of Huckleberry Finn, Deepak Chopra on Buddhism and an environmentalist on John Muir.

But there are also some unlikely pairings: Margaret Atwood on A Tale of Two Cities, Roxane Gay on Edith Wharton’s The Age of Innocence, producer, actor and writer Lena Dunham on E. M. Forster’s A Room With a View, and the critic Laura Kipnis on Romeo and Juliet.

“We’re choosing the people — and they’re choosing the books,” said Martin.

Clancy Martin

To avoid copyright fights, the company is limited, for the moment, to books in the public domain, published before 1928. But Rebind is also in conversation with the world’s three largest publishers about offering contemporary books like 1984, Fahrenheit 451 and David Foster Wallace’s 1996 novel Infinite Jest.

Kipnis, who last spring wrote an essay about becoming a Rebinder, has said the endeavor “will radically transform the entire way booklovers read books.”

Acknowledging her misgivings about AI more broadly, she finally admitted to herself that perhaps this particular bet is worth pursuing. “The nihilist in me thinks if humans are going to perish, we might as well perish reading the Classics,” she wrote.

On occasion, Kaag, 44, and Martin, 57, have tried to politely steer a few scholars away from their first choice, with mixed results: When he offered the gig to novelist Garth Greenwell, for instance, Martin promised he could tackle any book he liked. So Greenwell proposed Henry James’ The Golden Bowl — a classic, but not exactly James’ most widely read novel.

“I said, ‘O.K., Henry James is a great idea,’” Martin recalled. “‘What about The Portrait of a Lady?’”

Sorry, Greenwell said. It was The Golden Bowl or nothing. 

Martin threw out a few other titles: The Turn of the Screw? Daisy Miller?

Eventually, he said with a laugh, they resolved it: “He’s doing The Golden Bowl.”

So far, only a few prominent authors have opted not to participate — the literary novelist Andre Dubus III, a close friend of Kaag’s, told him he was “dancing with the devil.”

Kaag said he’s getting a mixture of “really good” emails and “really serious hate mail” from colleagues fearful of AI. He takes that fear to heart, having spent much of his career in the classroom. His classes, he said, have always been “very personal and very one-on-one.”

But he shifted his thinking a few years ago, after suffering from heart troubles that culminated in a cardiac arrest at age 40: “I just thought to myself, ‘I really would like to explore things that I hadn’t explored before.’”

Invoking Dubuque’s intimate tutoring sessions, he thought, “You can only scale one-on-one tutorials, or one-on-one conversations, so far.”

If AI can make that happen and bring the joy of reading to more people, he thought, perhaps it’s worth trying something new. “So to me, I don’t think it’s scary.”

‘Basically every question that I could possibly imagine’

Each book begins with a high-production-value video offering a sneak peek of what lies within. In the case of Henry David Thoreau’s Walden, we get sweeping drone shots of Walden Pond, complete with the Rebinder — in this case Kaag himself — taking a swim. He lives in nearby Concord, Mass., and has taught the book for more than a decade at the University of Massachusetts Lowell.

For the Walden Rebind, Kaag recorded 30 hours of audio commentary, answering “basically every question that I could possibly imagine” a college student asking. 

The volume of commentary ranges widely, from 10 hours for Dubliners to nearly 80 hours for Ulysses.

As for how Rebind will be used, Kaag sees it not as a replacement for class discussions, but as preparation, a tool that can field questions readers might be too embarrassed to ask in class.

The way Rebind works will be familiar to anyone who reads e-books, but with a revelatory twist: Readers can highlight and annotate text, but they can also open up a chat window anywhere and type or dictate questions about a passage or sentence. They can wonder aloud about ideas or passages they’re curious about, or simply type: “I’m lost.”

AI analyzes the query and matches it to the pre-loaded commentary, telling readers, if they click on a little icon, which parts of the answer are original and which are the AI smoothing out the syntax to be responsive to the query.

Screenshot of an exchange with author John Banville about the novels of James Joyce. Rebind can specify the parts of an answer that are an expert’s actual words and those generated by AI to personalize it to the reader’s query.

Antero Garcia, an associate professor in the Graduate School of Education at Stanford University and vice president of the National Council of Teachers of English, said he likes the transparency that comes with that breakdown. “I actually hope more AI does something like that, where you can see the sources of things” it presents to readers.

But he worries that tools like Rebind could draw users more into reading as a solitary pursuit. “If I’m lost in Dubliners, that’d be great to go to my English teacher or to a friend and, God forbid, have a reading group or a book group and just have a conversation about this text,” he said. 

Garcia said he was reluctant to overstate the isolating effects of AI, “but I do think there’s something missing as a result of relying on AI to guide us in our reading, rather than relying on reading being an inherently social thing.”

In the long term, Rebind actually seeks to integrate social elements that allow students in a class to “read and work together” within a text. Eventually, they hope to give teachers space for their own commentary. Future versions may offer Rebinders feedback from readers and the opportunity for deeper discussions via AI-moderated book clubs.

One feature stands out as potentially game-changing: If a reader wants to basically journal within the e-book, revealing his or her personal challenges along the way, that prompts the AI to search for commentary that helps: If you’re reading Walden, for instance, and type in, “This book makes me think of my times of loneliness and depression,” the e-book will reply: “I can understand how Thoreau’s reflections on solitude and the challenges of living authentically might resonate with feelings of loneliness and depression.”

That’s then followed up with a brief discussion of Thoreau’s encouragement “to remain attentive, even when things don’t particularly seem bountiful.”

The new e-books will also allow users to take notes, then use them to challenge the Rebinder to a conversation. While that could easily become a big privacy risk, Dubuque said Rebind will never sell user data, since it’s inviting users to “share the deepest, most meaningful things in their life and really give themselves to these books.” Profiting off those details is “not an option.”

‘Dancing with the devil’?

At the moment, the interactions are all through text, but the Rebinders have all given permission to have their voices reproduced so they can someday “chat” directly with users. “We have voice clones,” Dubuque said. “They’re very good.”

John Dubuque

But for now audio remains an open question, an option they’re not quite ready to offer. On the one hand, who wouldn’t want to chat about Dubliners with Banville? On the other hand, that could be weird. A small portion of the conversation wouldn’t be Banville at all, but a crusty, Irish-accented Banville-bot.

Dubuque predicted they’ll eventually end up using voice, but he wants to do it carefully.

“We’re very sensitive to the ‘ick factor’ of AI.”

His plan is to release the first books next month. 

Though it’s a for-profit company, with Dubuque its only funder, Martin said he also sees it as an effort to ensure that more young people get the chance to read great books under the guidance of great teachers. “Most of us don’t get to go to Columbia or to Yale or to Princeton,” he said. Fewer still get to study with scholars like Goodwin, Atwood, Banville or Gay.

But Garcia, the Stanford scholar, urged caution.

“There’s something fraught about this pursuit of scale,” he said. “In trying to deliver good books or good learning experiences to people, we ultimately get funneled into this pathway: The way to get it to the most people is to take away that human element or dilute that human element through AI. It feels like that’s when you lose the spirit of it.”

For his part, Martin wants to make Rebind “the most fun, most dynamic and most interesting way” to read books. It won’t supplant the solitary experience of reading, he said, it’ll offer something different: the choice to read a book in solitude or to “have a whole rich conversation about it with someone.”

Or both. 

]]>
South Carolina Board of Education Passes Statewide Cellphone Ban for Public Schools /article/south-carolina-board-of-education-passes-statewide-cellphone-ban-for-public-schools/ Fri, 06 Sep 2024 14:30:00 +0000 /?post_type=article&p=732492 This article was originally published in

COLUMBIA — South Carolina school districts must ban students from using their cellphones during the entire school day, but exactly how they go about it is up to district officials, according to a policy the state Board of Education passed Tuesday.

At the very least, districts must require students to keep their phones and connected devices, such as smartwatches, turned off and in their backpacks or lockers from the time the first bell rings in the morning until the dismissal bell in the afternoon, according to the state policy.

But the state board said districts can decide whether to enact sterner rules, as well as the consequences for violating them.




Districts that do not put a policy in place that is at least as strict as the one the state board passed Tuesday could lose their state funding.

“We’re saying, ‘This is what state law says, and so you’ve got to implement it,’ but we are leaving a lot of discretion, a lot of latitude, to districts on how exactly they do it,” board member Christian Hanley said.

The decision follows a clause the Legislature included in the state spending plan requiring the state board to create a policy prohibiting cellphones for K-12 students in the state’s public schools. Legislators left the specifics up to the board, which in turn left many of the details to local school boards.

Although state board members supported the idea of banning cellphones in schools, they said they worried about unintended consequences of the new policy, such as putting another task on overworked teachers, increasing the number of out-of-school suspensions or cutting students off from their parents during emergencies.

“Implementation of such a policy over a school day scares me,” said board chair David O’Shields. “Why? Because once we create this policy, it is the requirement of every district to follow suit, and there is the law of unintended consequences, and it frightens me.”

School boards will have to put in place a policy at least as strict as the one the state board enacted, according to a memo the department sent to superintendents in June. Districts must submit those policies to the department to ensure compliance.

The state board, which passed the policy 15-1, added a stipulation that districts must report back about how implementation went in case the board finds a need to adjust its policy ahead of next school year.

“All of these things look good, but just because it looks good doesn’t mean it is good,” O’Shields said.

The policy

In the state policy, the board did decide lunch and other breaks should be considered part of the school day, meaning students must leave their cellphones stowed away during those times.

Districts may choose to take it further by telling students not to bring their devices to school at all, or by buying lockable pouches to store them. Some may also decide to include bus rides, field trips or athletic events as times when students cannot access their phones, according to the policy.

The policy also leaves room for exceptions.

If students have an assignment they cannot complete on school-provided devices, districts can allow students to keep their phones with them to use as part of their classwork.

Students with disabilities who need access to phones or tablets to learn would still be allowed to use the devices. And students with certain outside jobs, such as volunteer firefighters, can seek a written exception from their superintendent to use their phone during the day, according to the policy.

Enforcement also will largely be up to school districts. The policy requires “disciplinary enforcement procedures,” with increasing consequences for repeat offenders, but it doesn’t specify what that means.

State board members did discourage using out-of-school suspension as punishment for violating the policy. Taking a student out of school because they are breaking a rule meant to keep them focused on their classwork feels counterintuitive, said state Superintendent Ellen Weaver.

“The whole idea behind this policy is that we want students in classrooms getting instruction,” Weaver told reporters. “Taking students out of that instructional space really doesn’t make a whole lot of sense as far as I’m concerned.”

Still, different situations may warrant different punishments, so board members wanted to leave that decision up to the districts, said board member David Mathis.

Timing

Some board members felt they did not have enough time to create the policy.

Board member Beverly Frierson was the sole “no” vote, not because she disagreed with it but because she thought the board was too rushed to give the policy the consideration it needed, she said.

O’Shields, the board chair, worried teachers may have to spend too much time policing cellphones. Still, he agreed some kind of action was necessary.

“I know we need control, and there is an addiction, no doubt,” O’Shields said.

The policy has support from legislators, teachers’ advocates and Gov. Henry McMaster. Since 2020, McMaster has included this clause in his state budget recommendations. This was the first time legislators agreed to put it in the final plan.

“The research is clear,” McMaster wrote in a letter to the board Tuesday. “Removing access to personal electronic devices during the school day improves student academic performance and removes distractions that exacerbate anxiety among our adolescents.”

“Our responsibility is to create an environment where teachers can teach, and students can learn,” the letter continued.

In a statewide survey the education department conducted, 55% of teachers and administrators who responded said they supported a total ban on cellphones during the school day. Another 37% said they wanted students to have limited access during class time, with the chance to check their phones between classes or at lunch.

Along with being distracting while students are trying to learn, phones can erode their social skills and encourage bullying, Weaver said.

“I think the dividend that we will see this pay for schools and for our students’ future will be worth it in the end,” Weaver said.

SC Daily Gazette is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. SC Daily Gazette maintains editorial independence. Contact Editor Seanna Adcox for questions: info@scdailygazette.com.

Ohio School Districts Use Surveillance Software to Monitor Student Devices

Wed, 28 Aug 2024

This story mentions suicide. If you or someone you know needs support now, call, text or chat the .

Ohio’s largest school district recently started using surveillance software on students’ devices.

Columbus City Schools partnered with Gaggle — a Texas-based student safety technology company that provides constant surveillance — at the end of last school year, district spokesperson Jacqueline Bryant said in an email.

“This is an added layer of security to ensure students are not visiting unapproved sites,” she said in an email. “Gaggle employs advanced technology and human insight to review students’ use of online tools 24/7/365 days a week and provides real-time analysis, swiftly flagging any potentially concerning behavior or content; this includes signs of self-harm, depression, substance abuse, cyberbullying, or other harmful situations.”


Gaggle is currently partnered with about 1,500 school districts across the country, but the company would not say how many of those districts are in Ohio, Gaggle spokesperson Shelby Goldman said.

“We have a practice to not answer questions about specific school districts,” she said in an email.

Ohio’s three largest school districts — Columbus, Cleveland and Cincinnati — use Gaggle. Cleveland did not answer questions the Ohio Capital Journal sent about Gaggle.

Cincinnati Public Schools started using Gaggle in 2013, and it is active for all grades, according to the district. It costs the district $323,780 to use Gaggle.

“Cincinnati Public Schools prioritizes the safety and well-being of its students and staff, and utilizes Gaggle to monitor threats for individual student safety and the safety of each school community,” according to the district. “The District monitors content on District-provided devices and applications based on specific language and phrases, generating trigger alerts for review, rather than continuous monitoring.”

Gaggle, which started in the 1990s, monitors school platforms such as Google Workspaces and Microsoft Office 365, but does not look at students’ personal email addresses or private social media accounts.

“Gaggle is an early warning system that identifies children in crisis so that schools can intervene before a tragedy happens,” Goldman said in an email. “Gaggle partners with school districts to help them monitor student activity on the technology (devices and accounts) provided by the school district.”

The company estimates it helped save , according to its report from last fall.

“We believe finding the right balance between monitoring for safety purposes and protecting student privacy and confidentiality is important, and we’re committed to continuing to support districts in achieving both,” Goldman said in an email.

Gaggle uses artificial intelligence to spot things that could be an issue, and a human safety team reviews them before contacting the school.

“Our reviewers are looking at the context to determine if an item is related to an actual concern or maybe a simple reference to something that is harmless when in context,” Goldman said in an email.

Gaggle can flag things as early warning signs or as an imminent threat, which is treated with a higher level of urgency. It alerted Ohio school districts to 1,275 student incidents that required immediate intervention in 2021, according to an .

Columbus City Schools, which has about 47,000 students, is implementing Gaggle in middle and high schools. Students can’t opt out of it.

The district signed two contracts with Gaggle — the first for $58,492.40 in January and the second for $99,180 in June, according to school board documents.

During the district’s Gaggle pilot from April 2022 to December 2023, Gaggle’s safety team reviewed 3,942 pieces of content, which led to 226 “actionable student safety concerns that were sent to Emergency Contacts,” according to a school board document.

Even though Sharon Kim’s two children are in elementary school and won’t yet be affected by the district’s Gaggle implementation, she is concerned about the district using surveillance technology.

“School should be a safe place for our kids,” Kim said. “They spend so much time in their lives at school, it should be a place where they feel safe, not where they feel like they’re being monitored and surveilled every single minute of the day. I really feel that this kind of surveillance is a huge hindrance to that.”

Ohio Capital Journal is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Ohio Capital Journal maintains editorial independence. Contact Editor David Dewitt for questions: info@ohiocapitaljournal.com.
