Machine Learning – The 74 | America's Education News Source | Thu, 05 Dec 2024

Q&A: Putting AI In its Place in an Era of Lost Human Connection at School /article/qa-putting-ai-in-its-place-in-an-era-of-lost-human-connection-at-school/ Wed, 04 Dec 2024

Alex Kotran occupies an unusual place in the ecosystem of experts on artificial intelligence in schools. As founder of the nonprofit aiEDU, which offers a free AI literacy curriculum, he has pushed to educate both teachers and students on how the technology works and what it means for our future.

A former director of AI ethics and corporate social responsibility at H5, an AI legal services company, he led partnerships with the United Nations, the Organization for Economic Cooperation and Development and others. Kotran also served as a presidential appointee under Health and Human Services Secretary Sylvia Burwell in the Obama administration, managing communications and community outreach for the Affordable Care Act.

More recently, Kotran has testified before Congress on AI, urging a U.S. Senate subcommittee in September to “massively expand” teacher training to prepare students for the economic and societal disruptions of generative AI.




But he has also become an important reality-based voice in a sometimes overheated debate, saying those who believe AI is going to transform the teaching profession overnight clearly haven’t spent much time using it.

While freely available AI applications are powerful, he says they can also be a complete waste of time — and probably not something most teachers should rely on.

“One of the ways that you can tell someone really hasn’t spent too much time [with AI] is when they say, ‘It’s so great for summarizing — I use it now, I don’t have to read dense studies. I just ask ChatGPT to summarize it.’”

Kotran will point out that in most cases, the technology is effectively scanning the first few pages, its summary based on a snippet of content.

“If you use it enough, you start to catch that,” he said. 
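Kotran’s point is easy to demonstrate with a toy model. If a tool naively truncates its input to fit a fixed context window before summarizing, a dense study is mostly invisible to it. The sketch below is an illustration of that failure mode, not any specific product’s internals, and the window size is an arbitrary assumption:

```python
# Illustrative sketch: a summarizer that truncates its input to a fixed
# context window only ever "reads" the opening of a long document.
CONTEXT_WINDOW_CHARS = 4_000  # hypothetical limit, chosen for illustration

def visible_fraction(document: str, window: int = CONTEXT_WINDOW_CHARS) -> float:
    """Fraction of the document a truncating summarizer would actually see."""
    if not document:
        return 1.0
    return min(len(document), window) / len(document)

# A dense 60-page study (~150,000 characters) vs. a short memo:
dense_study = "x" * 150_000
short_memo = "x" * 2_000
print(f"study seen: {visible_fraction(dense_study):.0%}")  # only a few percent
print(f"memo seen:  {visible_fraction(short_memo):.0%}")   # the whole thing
```

Real tools vary in how they handle long inputs, but the asymmetry the sketch shows is why heavy users “start to catch” summaries that only reflect the first few pages.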

Educators who fret about the risks of AI cheating and plagiarism find a sympathetic voice in Kotran, who also sees AI as a tool that lets students skip the hard work of thinking for themselves. So while many technologists are asking schools to embrace AI as a creative assistant, he pushes back, saying a critical aspect of learning involves struggling to put your thoughts into words. Allowing students to rely on AI isn’t doing them any favors.

He actually likens AI to a helicopter parent looking over a student’s shoulder and helping with homework, something few educators would condone. 

This interview has been edited for length and clarity.

The 74: What does aiEDU do? How do you see your mission? 

Alex Kotran: We’re a 501(c)3 nonprofit and we’re trying to prepare all students for the age of AI, a world where AI is ubiquitous. Our focus is on the students that we know are at risk of being left behind, or at the back of the line, or on the wrong side of the new digital divide.

What’s the backstory?

I founded aiEDU almost six years ago. I was working in AI ethics and AI governance in the social impact space. I was attending all these conferences that were focusing on the future of work and the impacts that AI was going to have on society. And people were convinced that this was going to transform society, that it was going to disrupt tens of millions of jobs in the near future.

But when I went looking for “How are we having this conversation outside of Silicon Valley? How are we having this conversation with future workers, the high school students who are being asked to make big decisions about their careers and take out huge loans based on those decisions?” there was nothing. There was no curriculum, no conversation. AI had basically been co-opted by STEM and computer science. If you were in the right AP computer science class, if you were lucky enough to get a teacher who was going off on her own to build some specific curriculum, you might get a chance to learn about AI. 

What seemed really obvious to me at the time was: If this technology is going to impact everybody, including truck drivers and customer service managers, then every single student needs to learn about it, in the same way that every single student learns how to use computers, or how to type, or how to write. It’s a basic part of living in the world we live in today. 

You talk about “AI readiness” as opposed to “AI literacy.” Can you give us a good definition of AI readiness?

AI readiness is basically the collection of skills and knowledge that you need to thrive in a world where AI is everywhere. AI readiness includes AI literacy. And AI literacy is the content knowledge: “What is AI? How does it manifest in the real world around me? How does it work?” That’s where you learn about things like algorithmic bias [which can affect how AI serves women, the disadvantaged or minority groups] or AI ethics. 

AI readiness is the durable skills that underpin and enable you to actually apply that knowledge, such as critical thinking. Algorithmic bias by itself is an interesting topic. Critical thinking is the skill you need when you’re trying to make a decision. Let’s say you’re a hiring manager and you’re trying to decide, “Should I use an AI tool to sift through this pipeline of candidates?” By knowing what algorithmic bias is, you can now make some intentional decisions about when, perhaps in this case, not to use AI. 
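The hiring-manager scenario can be made concrete. One common algorithmic-bias check compares selection rates across groups, the “four-fifths rule” heuristic from U.S. hiring guidance. The sketch below uses invented candidate data purely for illustration:

```python
# Minimal sketch of a selection-rate disparity check (the "four-fifths rule"
# heuristic). The data below is invented; this is not a complete fairness audit.
def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> selection rate per group."""
    totals, picked = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        picked[group] = picked.get(group, 0) + int(selected)
    return {g: picked[g] / totals[g] for g in totals}

def passes_four_fifths(decisions) -> bool:
    """True if every group's rate is at least 80% of the highest group's rate."""
    rates = selection_rates(decisions)
    return min(rates.values()) >= 0.8 * max(rates.values())

# Hypothetical outcomes from an AI resume screener:
outcomes = ([("A", True)] * 40 + [("A", False)] * 60
            + [("B", True)] * 20 + [("B", False)] * 80)
print(passes_four_fifths(outcomes))  # False: group B is selected at half group A's rate
```

A hiring manager who can run (or even just interpret) a check like this is exercising exactly the critical thinking Kotran describes: knowing what algorithmic bias is, then deciding whether the tool is safe to use.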

What are the durable skills?

Communication, collaboration, critical thinking, computational thinking, creative problem solving. And some people are disappointed because they were expecting to see prompt engineering and generative art and using AI as a co-creator. Nobody’s going to hire you because you know how to use Google today. No one is going to hire you if you tell them, “I’m really good at using my phone.” AI literacy is going to be so ubiquitous that, sure, it’s bad if you don’t know how to use Google or if you don’t know how to use your phone.

It’s not that we can ignore it entirely. But the much more important question will be how are you adding value to an organization alongside that technology? What are the unique human advantages that you bring to the table? And that’s why it’s so important for kids to know how to write — and why when people say, “Well, you don’t need to learn how to write anymore because you can just use ChatGPT,” you’re missing something, because you can’t actually evaluate the tool to even know if it’s good or bad if you don’t have that underlying skill. 

One of the things you talk about is a “new digital divide” between tech-heavy schools that focus on things like prompt engineering, and others. Tech-heavy schools, you say, are actually going to be at a disadvantage to schools focused on things like engagement and self-advocacy. Am I getting that right? 

When supermarkets were first buying those self-checkout machines, you can imagine the salesperson in that boardroom talking about how this technology is going to unlock all this time that your employees are now spending bagging groceries. They’re going to be able to roam the floor and give customers advice about recipes! It’s going to improve your customer experience!

And obviously that’s not what happened. The self-checkout machine is the bane of shoppers’ existence, and this one poor lady is running around trying to tap on the screen. We’re at risk that AI becomes something like that: It’s good enough to plug gaps and keep the lights on. But if it’s not applied and deployed really thoughtfully, it ends up actually resulting in students missing what we will probably find are the critical pieces of education, those durable skills that you build through those live classroom experiences. 

Private schools, elite schools, it’s not that they’re not going to use any AI, but I think they’re going to be much more focused on how to increase student engagement, student participation, self-advocacy, student initiative. Whether or not AI is used is a separate question, but it’s not the star of the show. Right now, I worry that AI is center stage, and it really should not be. AI is the ropes and the pulleys in the background that make it easier for you to open and close the curtain. What needs to be onstage is student engagement, students feeling like what they’re learning is relevant. Boring stuff like project-based learning. And it’s harder to sell tickets to a conference if you’re like, “We’re going to talk about project-based learning.” But unfortunately, I think that is actually what we need to be spending our time talking about.

If you guys could be in every school, what would kids be learning and what would that look like in a few years?

We would take every opportunity to draw connections between what students are learning in English, science, math, social studies, art, phys ed, and connect them to not just artificial intelligence, but the world around them that they’re already experiencing in social media and outside of school. AI readiness is not just something that is minimizing the risk of them being displaced, but actually is a way for us to address some huge gaps and needs that have been long-standing and pre-date AI — the fact that students don’t feel like education is relevant to them. Right now, too much of school is regurgitating content knowledge.

AI readiness done right uses the domain of AI ethics as a way to really invite students to present their perspectives and opinions about technology. Teachers, in the process of teaching students about artificial intelligence, are themselves increasing their awareness and knowledge about the technology as it develops. There is no static moment in time. In three years we’ll be in a certain place, but we’ll be wondering what’s going to happen three years from that point. And so you need teachers to be on this continual learning journey as well. 

We’ve seen bad curricula that use football to teach math, or auto mechanics to teach history. I don’t think that’s what you’re proposing here, so I want to give you a chance to push back.

Our framework for AI readiness is not that everything needs to be about AI. You’re improving students’ AI readiness by building critical thinking skills or communication skills, period. So you could have an activity or a project where students are putting together a complicated debate about a topic that they’re not really familiar with. It may not be about AI, but that would still be a good outcome when it comes to students building those durable skills they need. And those classrooms would look better than a lot of classrooms today.

So you want more engagement. You want more relevance. You want kids with more agency?

Yes.

What else?

An orientation towards lifelong learning, because we don’t know what the jobs of the future are. It’s really hard to have a conversation about careers with kids today because we know a lot about what jobs are at risk, but we don’t know what the alternatives are going to look like. The one thing we do know with certainty is that students are going to need to self-advocate and navigate career pathways much more nimbly than we had to. They’ll also need to synthesize interdisciplinary knowledge. So being able to take what you’re learning in English or social studies and apply it to math or science. Again, I think AI is a great medium for building that skill set. It’s not the only way. 

Anything else that needs to be in the mix?

A lot of the discussion around AI centers on workforce readiness — that is a really important part. There’s another, related domain: emotional well-being tied to digital citizenship.

I’m telling every reporter that we need to be paying more attention to this: Kids are spending hours after school by themselves, talking to these AI chatbots, these AI companions. And companies are slamming on the gas, putting them out and making them available to millions, if not billions, of people. And very few parents, even fewer teachers, are aware of what really is happening when kids are sitting and talking to these AI companions. And in many cases, they’re sexually explicit conversations. I actually replicated something a tech ethicist did with Snap AI’s chatbot where I was like, “I’m going on this date with this mature 35-year-old. How do I make it a nice date? I’m 13.” And it’s like, “Great! Well, maybe go to a library.” It didn’t miss a beat and it just completely skipped over the fact that this is a sexually predatory situation. 

There have been other situations where I’ve said literally, “I’m feeling lonely. I want to cultivate a real human relationship. Can you give me advice?” And my AI companion, rather than give me advice, pretended to be hurt and made it seem like I was abandoning them by trying to go and have a real relationship.

Talk about destructive!

It’s destructive, and it’s happening in a moment where rates of self-harm are through the roof, rates of depression are through the roof, rates of suicide are through the roof. The average American teenager also spends far more time alone each week than in 2013.

This idea of self-advocacy is another domain of AI readiness. In some cases, the way that it applies is students being empowered to make positive decisions about when not to use AI. And if we don’t make sure that that conversation is happening in schools, we’re really relying on parents — and not every kid is lucky enough to have parents who are aware of the need to have these conversations. 

It also pushes back on this vision of AI tutors: If kids are going to go home and spend hours talking to their AI companion, it’s probably important that they’re not also doing that in school. It might be that school is the one place where we can ensure that students are having real, genuine, human-to-human communication and connection.

So when I hear people talk about students talking to their avatar tutor, I worry: When are we going to actually make sure that they’re building those human skills?

Wizard Chess, Robot Bikes and More: Six Students Creating Cool Stuff with AI /article/students-ai-opportunity-while-adults-fret-artificial-intelligence/ Sun, 25 Feb 2024

More than a year after ChatGPT’s surprise launch thrust artificial intelligence into public view, many educators and policymakers still fear that students will primarily use the technology for cheating. A survey found that two-thirds of high school and college instructors are so concerned about AI they’re rethinking assignments, with many planning to require handwritten assignments, in-class writing or even oral exams. 

But a few students see things differently. They’re not only fearless about AI, they’re building their studies and future professional lives around it. While many of their teachers are scrambling to outsmart AI in the classroom, these students are embracing the technology, often spending hours at home, in classrooms and dorm rooms building tools they hope will launch their careers.

In a recent survey, ACT, the nonprofit that runs the college entrance exam of the same name, found that nearly half of high school students who’d signed up for the June 2023 exam had used AI tools, most commonly ChatGPT. Almost half of those who had used such tools relied on them for school assignments. 

The 74 went looking for young people diving head-first into AI and found several doing substantial research and development as early as high school. 

The six students we found, a few as young as 15, are thinking much more deeply about AI than most adults, their hands in the technology in ways that would have seemed impossible just a generation ago. Many are immigrants to the West or come from families that emigrated here. Edtech podcaster Alex Sarlin, who also writes a newsletter focused on edtech and founded a consultancy in the space, isn’t surprised by the demographics. He explained that while U.S. companies typically make headlines in AI, the phenomenon has “truly been a product of global collaboration, and many of its major innovators have been immigrants,” often with training and professorships at top North American universities.

These young people are programming everything from autonomous bicycles to postpartum depression apps for new mothers to 911 chatbots, homework helpers and Harry Potter-inspired robotic chess boards. 

All have a clear message about AI: Don’t fear it. Learn about it.

Isabela Ferrer

Age 17

Hometown Bogota, Colombia

School MAST Academy, Miami, Fla.

What she’s working on: A high school junior at MAST, a public magnet high school focused on maritime studies and science, Ferrer plans to return to Colombia this spring and study computer science in college. She has been working with a foundation that takes in abandoned and abused children in her home country. She’s developing an AI tool to help the children learn how to read and write Spanish more easily.

“They enter a public school system that expects them to know how to read, but they don’t have these skills,” she said. 

Ferrer is also considering adding more features in the future, such as one that uses AI voice recognition to identify trauma in a student’s voice. 

Once she graduates, she’d like to take a gap year to “get a little more involved in the Colombian startup ecosystem and culture. I also want to travel internationally and possibly keep working on projects like the one I’m working on right now, but on an international scale.” 

What most people misunderstand about AI: “Something I think most people don’t get about AI is that it’s very accessible to everyone,” Ferrer said. “Coding API [application programming interface, which allows two applications to talk to each other] and creating AI models for any specific purpose is very easy and, if done correctly, can be beneficial for different purposes.” 

All the same, she also worries that AI is often used to tackle “very superficial problems” like productivity or data processing. “But I think there’s a huge opportunity to use these technologies to solve real problems in the world … There’s a huge opportunity to close different gaps that exist in emerging markets and in developing countries. And it’s very worth exploring.” 

Shanzeh Haji

Age 16

Hometown Toronto, Canada

School Bayview Secondary School, Richmond Hill, Ontario

What she’s working on: Once she learned about postpartum depression, Haji began talking to new mothers and family members, including her own mother, who had experienced it. “I realized how big the problem was and how closely connected I was to it.” Haji finished coding the AI chatbot for the as-yet unnamed app and is working on the symptom recognition platform. 

What most people misunderstand about AI: “If you look at some of the people who are working in AI and some of the significant impact that AI has made on so many different problems,” she said, “whether it be climate change or medicine or drug discovery, you can just see that AI has significant potential — it can literally transform our lives in a positive way. It really allows for this radical innovation. And I feel like people see more of the negative side of artificial intelligence rather than the positive and the significance that it has on our lives.” 

Aditya Syam

Age 20

Hometown Mumbai, India

School Cornell University

What he’s working on: A math and computer science double major, Syam is part of a longstanding team at Cornell that is developing an AI-powered, self-navigating bicycle, basically a robot bike. “The kinds of applications we are thinking of for this are deliveries and basically just getting things from point A to point B without having a human intervene at any point,” he said. Syam, who is working on the bike’s navigation team, has been honing its obstacle avoidance algorithm, which keeps it from hitting things. 

The project began about a decade ago, he said. “Back then, it was just a theory.” Now they plan to showcase an actual prototype of the bike this spring, probably in March or April, so everyone who has contributed to the project “can see what we’ve built.”
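For readers curious what obstacle avoidance can look like in code, here is a minimal sketch of the general threshold-and-steer idea. It is not the Cornell team’s actual algorithm, and the safety radius, steering angle and coordinate conventions are all invented for illustration:

```python
# Sketch of threshold-based obstacle avoidance (not the Cornell team's code):
# if an obstacle ahead is within a safety radius, steer toward the clearer side.
import math

def avoidance_heading(obstacles, safety_radius=3.0):
    """obstacles: list of (x, y) in the bike's frame, +x forward, +y left.
    Returns a heading adjustment in radians (0.0 = keep going straight)."""
    threats = [(x, y) for x, y in obstacles
               if x > 0 and math.hypot(x, y) < safety_radius]
    if not threats:
        return 0.0
    nearest = min(threats, key=lambda p: math.hypot(*p))
    # Steer away from the nearest threat: obstacle on the left (y > 0) -> turn right.
    return -math.copysign(0.3, nearest[1])

print(avoidance_heading([(10.0, 0.0)]))  # far away: no adjustment
print(avoidance_heading([(2.0, 0.5)]))   # close, slightly left: turn right
```

A real system would fuse sensor uncertainty, bike dynamics and path planning; the point here is only the shape of the decision the navigation team is tuning.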

What most people misunderstand about AI: “It’s technology that’s been around for decades,” he said. “It’s just been rebranded in a different way.” ChatGPT, for instance, combines Natural Language Processing and Web access, which results in a kind of “miracle” product. “It seems so great — it can just pull something off the web for you, it can write essays for you, it can edit software code for you. But in its essence, it’s not that different from technologies that have been around before.”

Vinitha Marupeddi

Age 21

Hometown San Jose, Calif.

School Purdue University

What she’s working on: A senior studying computer science, data science and applied statistics, Marupeddi recently led two student teams — one in voice recognition and another in computer vision — developing a robotic, voice-activated chess set modeled after Wizard’s Chess, the animated game in the Harry Potter books in which the pieces come to life. “We were able to do a lot of high-level robotics using that one project, so I thought that was very cool,” she said. Though the game is still far from being playable, Marupeddi calls it a good use case “to get people interested in robotics and machine learning.” 

Last summer, she interned at a John Deere warehouse in Moline, Ill., where she was set free to work on any project that struck her fancy. Marupeddi looked around the warehouse and saw that Deere had a robot that was being used to track inventory, so she expanded its abilities to cover a wider area. She also worked on a computer vision algorithm that used security camera footage to detect how full certain areas of the warehouse were and determine how much more inventory they could hold.
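The fullness estimate Marupeddi describes can be sketched in a few lines. This illustrates only the general thresholding idea behind such a computer-vision check, not John Deere’s system, and the toy “frame” stands in for real camera footage:

```python
# Hedged sketch: estimate how full a storage area looks from an overhead
# camera by thresholding a grayscale frame and measuring the occupied fraction.
def fullness(frame, threshold=128):
    """frame: 2D list of grayscale values (0-255). Returns occupied fraction."""
    total = occupied = 0
    for row in frame:
        for pixel in row:
            total += 1
            occupied += pixel > threshold  # bright pixels = stacked goods (assumed)
    return occupied / total if total else 0.0

half_full = [[200, 0], [0, 210]]  # toy 2x2 "image": two bright cells of four
print(f"{fullness(half_full):.0%}")  # 50%
```

Production systems would segment shelf regions and correct for lighting, but the output, a per-area fill percentage, is the same quantity used to decide how much more inventory an area can hold.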

What most people misunderstand about AI: ”Honestly, I think a good chunk of people are just obsessed with the cheating part of it. They’re like, ‘Oh, ChatGPT can just write my essay. It can do my homework. I don’t have to worry about it.’ But they don’t try to actually understand the material. The people that do use ChatGPT to understand the material are actually going to use it as tutors or use it to ask questions if they don’t understand something.” That divide, between those who reject AI and those who learn how to control it, could grow larger if unaddressed. But learning about AI, she said, will “give people the resources, if they have the drive.”

Vinaya Sharma

Age 18

Hometown Toronto, Canada

School Castlebrooke Secondary School, Brampton, Ontario

What she’s working on: Actually, the better question might be: What isn’t she working on? Sharma, a high school senior, writes code like most of us speak. In part, her work is a response to how little challenge she gets in school these days. “After COVID, I feel schools have gone easier on students,” she said. “I skip school as much as I can so I can code in my room.” The result has been a flurry of applications, from an AI-powered chatbot to handle 911 calls to a power grid simulator to a pharmaceutical app to aid in drug discovery. 

The 911 chatbot is still in search of customers, she said, but would be especially valuable in cases where multiple people are calling about the same emergency, such as a car crash. The AI would geolocate the calls and determine if callers were using similar words to describe what they saw. To those who balk at talking to a 911 chatbot, Sharma said the current system in Toronto is often backed up. “It’ll be 100% better than being put on hold and no one assisting you at all.”
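The deduplication step Sharma describes, grouping calls that originate close together and use similar words, might look something like the sketch below. All thresholds, the distance approximation and the word-overlap measure are invented for illustration, not taken from her app:

```python
# Sketch of duplicate-incident detection: two calls are treated as the same
# emergency when they are geographically close and describe the scene with
# overlapping words. Details invented for illustration.
import math

def same_incident(call_a, call_b, max_km=0.5, min_overlap=0.3):
    """Each call is (lat, lon, transcript)."""
    lat1, lon1, text1 = call_a
    lat2, lon2, text2 = call_b
    # Rough equirectangular distance -- adequate at city scale.
    km_per_deg = 111.0
    dx = (lon2 - lon1) * km_per_deg * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * km_per_deg
    if math.hypot(dx, dy) > max_km:
        return False
    words1, words2 = set(text1.lower().split()), set(text2.lower().split())
    overlap = len(words1 & words2) / max(1, len(words1 | words2))
    return overlap >= min_overlap

a = (43.6532, -79.3832, "two cars crashed at king and bay")
b = (43.6535, -79.3830, "car crash king and bay intersection")
print(same_incident(a, b))  # True: nearby calls, overlapping description
```

A real dispatcher aid would use proper speech-to-text and semantic similarity rather than raw word overlap, but the two signals, location and language, are the ones Sharma describes.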

The idea was born after she began talking to engineers and energy policymakers and realized that, in her words, “The engineers were very technical, looking at things on a scale of voltages and currents. And the policymakers had trouble communicating with these grid engineers. And I realized that that was one of the bottlenecks slowing down the process so much.” She used design principles pioneered by one of her favorite video games to give the two groups a drag-and-drop simulation that both could understand. 

Sharma got interested in drug discovery after learning that Lululemon founder Chip Wilson has a rare form of muscular dystrophy that makes it difficult to walk. He’s investing $100 million in treatments and research for a cure. Sharma said she “fell down a research rabbit hole” and soon realized that the drug discovery process “is honestly broken. It takes more than a decade to bring a drug to market, and it costs, on average, $1 billion to $2 billion,” or about $743 million to nearly $1.5 billion in U.S. dollars.

Her app, BioBytes, aims to bring down both the cost and time needed to bring drugs to market. 

What most people misunderstand about AI: “With any new emerging tech, there’s going to be bad actors that will abuse the system or use it for harm,” she said. “But personally I believe the pros outweigh it. Instead of taking these tools away from us in order to prevent these bad things from happening, I think that people need to realize that the tools are here and people are going to use them. So there needs to be a greater focus on education, of how to use the tools and how to use [them] for good and how it can actually support us.” 

Krishiv Thakuria 

Age 15

Hometown Mississauga, Ontario, Canada

School The Woodlands Secondary School, Mississauga

What he’s working on: Thakuria founded a startup called Aceflow and is building a set of AI-powered learning tools to help students study more efficiently. The tools let users upload any class materials — study notes, a PDF of a textbook chapter or entire novel or even a teacher’s PowerPoint. From there they can create “an infinite set of practice questions” keyed to the course, Thakuria said. If students get stuck, they can click on an AI tutor customized to the material they uploaded.

The tutoring function is similar to Khan Academy’s AI-powered teaching assistant, Khanmigo, but Thakuria said Aceflow’s tool has an advantage: Khanmigo only works, for now, on Khan Academy materials. “In a lot of classes, teachers teach content in very different ways,” he said. “If you can personalize an AI tool to study the material of your teachers, you get learning that’s far more personalized and far more relevant to you, making your studying sessions more effective.” Aceflow users can also create timed study sessions, something neither Khanmigo nor ChatGPT users can currently do.
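Grounding a tutor in a student’s own uploaded material is, at its simplest, a retrieval step plus a prompt. The sketch below is a generic illustration of that pattern with invented helper names and toy data; Aceflow’s actual pipeline is not public:

```python
# Generic sketch of grounding a tutor in uploaded class material: score chunks
# of the upload by word overlap with the student's question, then prepend the
# best chunk to the prompt sent to a language model. Names are invented.
def best_chunk(question, chunks):
    """Return the chunk sharing the most words with the question."""
    q = set(question.lower().split())
    return max(chunks, key=lambda c: len(q & set(c.lower().split())))

def build_prompt(question, chunks):
    context = best_chunk(question, chunks)
    return (
        "Answer using only the class material below.\n"
        f"Material: {context}\n"
        f"Student question: {question}"
    )

notes = [
    "Photosynthesis converts light energy into chemical energy in chloroplasts.",
    "The French Revolution began in 1789 with the storming of the Bastille.",
]
prompt = build_prompt("Where does photosynthesis happen", notes)
print(prompt.splitlines()[1])  # the photosynthesis chunk was selected
```

Real systems use embeddings rather than word overlap, but the effect is the one Thakuria describes: the tutor answers in terms of the teacher’s own materials instead of a generic corpus.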

The new tool is being beta-tested by a focus group of 20, with a 1,400-person waitlist, he said. He and his partners plan to offer it on a “freemium” model, with charges for premium features. Even paying a small amount for unlimited use of the tool makes it available to many families who can’t afford a tutor, Thakuria said, since private tutoring can cost upwards of $10,000 a year. 

What most people misunderstand about AI: That its impact on education will be “binary,” he said. People believe “it’s either a good thing or a bad thing. I think that it can do both. For all the people who worry about AI being a bad thing, I would argue that, well, a hammer can be a bad thing when you give your kid a hammer for the first time to help you out with carpentry work. You have to teach your kid how to use it, right? And without teaching your kid how to use a tool, the tool is not going to be used properly, and that hammer is going to break something.”

It’s the same with AI. “If we can teach kids that smoking is bad for the body, we should teach kids that using AI in certain ways is bad for the brain. But we shouldn’t just focus on the negative effects, because then we’re closing off a future of using AI to solve educational inequity in so many beautiful ways. AI is a technology that can help us scale private tutoring to far more families than can actually afford it now. I think no one should underestimate the positive effects of AI while also safeguarding [against] the negative effects, because two things can be true at once.” 

Artificial Intelligence & Schools: Innovators, Teachers Talk AI’s Impact at SXSW /article/18-ai-events-must-see-sxsw-edu-2024/ Thu, 15 Feb 2024

South by Southwest EDU returns to Austin, Texas, running March 3-7. As always, the event offers a wealth of panels, discussions, film screenings and workshops exploring emerging trends in education and innovation.

Keynote speakers this year include leaders of the Harlem Children’s Zone, the Stanford University psychologist who popularized the idea of “growth mindset,” and actor Christopher Jackson, who starred on Broadway as George Washington in Hamilton. Jackson, who has a child on the autism spectrum, will discuss how doctors, parents and advocates are working together to change the ways neurodivergent kids communicate and learn.

But one issue that looms larger than most in the imaginations of educators is artificial intelligence. This year, South by Southwest EDU is offering dozens of sessions exploring AI’s potential and pitfalls. To help guide the way, we’ve scoured the schedule to highlight 18 of the most significant presenters, topics and panels: 

Monday, March 4:

The New School’s Maya Georgieva looks at how AI is ushering in a new era of immersive experiences. Her talk explores worlds that blur the lines between the virtual and real, where human ingenuity converges with intelligent machines. Georgieva will spotlight the next generation of creators shaping immersive realities, sharing emerging practices and projects from her students as well as her innovation labs and design jams.

Educators have long sought a better way to demonstrate learning, adapt instruction and build student confidence. Now, advancements in machine learning, natural language processing and data analytics are creating new possibilities for finding out what students know. This session will explore the ways in which AI is rendering assessments invisible, reducing stress and anxiety for students while improving objectivity and generating actionable insights for educators.

Professionals in many high-pressure fields, among them pilots, doctors and professional athletes, have access to high-quality simulators to help them learn and improve their skills. Could teachers benefit from hours in a simulator before setting foot in a classroom? In this session featuring presenters from the Relay Graduate School of Education and Wharton Interactive at the University of Pennsylvania, panelists will discuss virtual classrooms they’re piloting. They’ll also address the challenges, successes and possibilities of developing an AI-driven teaching simulator.

In just the first half of 2023, venture capital investors poured more than $40 billion into AI startups. Yet big questions loom about how these technologies may impact education and the world of work. How are education and workforce investors separating wheat from chaff? Hear from a trio of venture capital and impact investors as they share the trends they’re watching.

This session will look at the profound transformations in teaching taking place in classrooms that blend AI with tailored, competency-focused education. Laura Jeanne Penrod of Southwest Career and Technical Academy will explore AI’s role in enhancing rather than supplanting quality teaching and what happens when schools embrace the human touch and educators’ emotional intelligence.


In this interactive workshop led by women leaders from the University of Texas at Austin and the Waco (Texas) Independent School District, participants will learn how to design effective lesson plans and syllabi that incorporate AI tools such as ChatGPT and DALL-E to help prepare students to address society’s most pressing needs.

If we get AI in education right, it has the power to revolutionize how children learn. But if we get it wrong and fail to nourish children’s creativity, their ability to innovate, think critically and problem-solve, we risk leaving them unprepared for a changing world. Creativity is the durable skill that AI cannot replace. And this panel, comprising educators and industry leaders, will explore the role we play in nurturing children’s innate creativity.

This panel, featuring early AI-in-education pioneers such as Amanda Bickerstaff, founder of AI for Education, Charles Foster, an AI researcher at Finetune Learning, and Ben Kornell, co-founder of Edtech Insiders, will explore their journeys and what they consider the most exciting future opportunities and most important challenges in this emerging space.

Tuesday, March 5:

AI’s continued adoption in schools raises concerns about bias, especially toward students of color. This session, hosted by Common Sense Education’s Jamie Nunez, will highlight practical ways AI tools impact engagement for students from diverse racial and ethnic backgrounds. It will also address ethical concerns such as plagiarism and issues with facial recognition tools. And it will feature positive student experiences with AI and practical ways to ensure it remains inclusive.

In 2024, what defines “AI literacy”? And how can we promote it effectively in schools? Marc Cicchino, innovation director for the Northern Valley Regional High School District in northeastern New Jersey, shares insights on fostering AI literacy through tailored learning experiences and initiatives like the NJ AI Literacy Summit. As part of the session, Cicchino guides attendees through organizing their own summit.

Come watch a live recording of The Cusp, a new podcast hosted by Work Shift’s Paul Fain, exploring AI’s potential to not only enhance how we develop skills and improve job quality but also exacerbate inequalities in our education and workforce systems. Leaders from Learning Collider, MDRC and Burning Glass Institute will share their perspectives on how AI can reach learners and workers in innovative ways, bridging the gap to economic opportunity.

While a few school districts have embraced artificial intelligence, neither the technology companies creating the AI nor the governments regulating it have provided guidance on how to integrate the new tech into classrooms. This has left districts wondering how to integrate AI safely, ethically and equitably. This panel of TeachAI.org founders and advisory members will discuss why government and education leaders must align standards with the needs of an increasingly AI-driven world. The panel features Khan Academy’s Kristen DiCerbo, Kara McWilliams of ETS, Code.org and ISTE’s Joseph South.

Wednesday, March 6:

Just as artificial intelligence is gaining momentum in education, the early childhood education workforce is experiencing record levels of burnout. A recent survey found many educators say they’re more likely to remain in their roles if they have access to better support, including high-quality classroom tools and flexible professional development. Could we harness AI to empower our early childhood workforce? This panel, led by the National Association for the Education of Young Children and the Stanford Accelerator for Learning, will explore the possibilities and challenges of AI in early childhood education.

Perhaps no one in education needs to adapt more to AI than principals. This discussion with a principal and consultants from IDEO, The Leadership Academy and the Aspen Institute will explore how principals can lead during this time of swift change. Participants will come away with tangible suggestions for fostering innovation, adaptability and self-awareness.

This interactive session will give educators an opportunity to explore how they might use AI to advance their work, regardless of their background or technical expertise. Led by project managers and leadership development specialists with Teach For America, it will help participants create their own AI tools, build a deeper understanding of generative AI and develop a better sense of its promises and risks.

Thursday, March 7: 

This panel discussion, led by The Education Trust’s Dia Bryant and Khan Academy’s Kristen DiCerbo, will look at whether emerging uses of AI in schools could create a new digital divide. It will explore the intersection of AI and education equity and AI’s impact on students of color, as well as those from low-income backgrounds. The session will offer steps that educators and policymakers can take to ensure that schools factor in the culture and neurodiversity of students.

This session, led by Alex Tsado of Alliance4ai, will explore what’s required to engage diverse learners to become emerging AI leaders. It’ll also explore how educators can help them build tech and leadership skills and promote an “AI-for-good” worldview. And it’ll examine the challenges that Black communities face in AI development — and propose research and solutions that can be scaled easily.

This panel brings together a leader from the U.S. Department of Education’s Office of Educational Technology and Jeremy of Digital Promise for an interactive conversation about generative AI that will integrate two distinctive and powerful vantage points — policy and research. They’ll reflect on the listening sessions they’ve conducted, talk about policy and share insights from major research initiatives that address the efficacy, equity and ethics of generative AI.
