Texas Universities Deploy AI Tools to Review How Courses Discuss Race and Gender

Records obtained by The Texas Tribune show how universities are using the technology to reshape curriculum under political pressure, raising concerns about academic freedom.

Texas A&M University in College Station on Nov. 25, 2025. The Texas A&M University System is using an artificial intelligence tool to sift through its course offerings and flag those that could raise concerns under new rules restricting how faculty teach about race and gender. (Courtney Sacco for The Texas Tribune)


A senior Texas A&M University System official testing a new artificial intelligence tool this fall asked it to find how many courses discuss feminism at one of its regional universities. Each time she asked in a slightly different way, she got a different number.

“Either the tool is learning from my previous queries,” Texas A&M system’s chief strategy officer Korry Castillo told colleagues in an email, “or we need to fine-tune our requests to get the best results.”

It was Sept. 25, and Castillo was trying to deliver on a commitment Chancellor Glenn Hegar and the Board of Regents had already made: to audit courses across all of the system’s 12 universities after conservative outrage over a gender-identity lesson at the flagship campus intensified earlier that month.

Texas A&M officials said the controversy stemmed from the course’s content not aligning with its description in the university’s course catalog and framed the audit as a way to ensure students knew what they were signing up for. As other public universities came under similar scrutiny and began preparing to comply with a new state law that gives governor-appointed regents more authority over curricula, they, too, announced audits.

Records obtained by The Texas Tribune offer a first look at how Texas universities are experimenting with AI to conduct those reviews. 

At Texas A&M, internal emails show staff are using AI software to search syllabi and course descriptions for words that could raise concerns under new system policies restricting how faculty teach about race and gender. 

At Texas State, memos show administrators are suggesting faculty use an AI writing assistant to revise course descriptions. They urged professors to drop words such as “challenging,” “dismantling” and “decolonizing” and to rename courses with titles like “Combating Racism in Healthcare” to something university officials consider more neutral, like “Race and Public Health in America.”

While school officials describe the efforts as an innovative approach that fosters transparency and accountability, AI experts say these systems do not actually analyze or understand course content, instead generating answers that sound right based on patterns in their training data.

That means small changes in how a question is phrased can lead to different results, they said, making the systems unreliable for deciding whether a class matches its official description. They warned that using AI this way could lead to courses being flagged over isolated words and further shift control of teaching away from faculty and toward administrators.

“I’m not convinced this is about serving students or cleaning up syllabi,” said Chris Gilliard, co-director of the Critical Internet Studies Institute. “This looks like a project to control education and remove it from professors and put it into the hands of administrators and legislatures.”

Setting up the tool

During a board of regents meeting last month, Texas A&M System leaders described the new processes they were developing to audit courses as a repeatable enforcement mechanism. 

Vice Chancellor for Academic Affairs James Hallmark said the system would use “AI-assisted tools” to examine course data under “consistent, evidence-based criteria,” which would guide future board action on courses. Regent Sam Torn praised it as “real governance,” saying Texas A&M was “stepping up first, setting the model that others will follow.”

That same day, the board approved new policies requiring presidents to sign off on any course that could be seen as advocating for “race and gender ideology” and prohibiting professors from teaching material not on the approved syllabus for a course.

In a statement to the Tribune, Chris Bryan, the system’s vice chancellor for marketing and communications, said Texas A&M is using OpenAI services through an existing subscription to aid the system’s course audit and that the tool is still being tested as universities finish sharing their course data. He said “any decisions about appropriateness, alignment with degree programs, or student outcomes will be made by people, not software.”

In records obtained by the Tribune, Castillo, the system鈥檚 chief strategy officer, told colleagues to prepare for about 20 system employees to use the tool to make hundreds of queries each semester. 

The records also show some of the concerns that arose from early tests of the tool.  

When Castillo told colleagues about the varying results she obtained when searching for classes that discuss feminism, deputy chief information officer Mark Schultz cautioned that the tool came with “an inherent risk of inaccuracy.”

“Some of that can be mitigated with training,” he said, “but it probably can’t be fully eliminated.”

Schultz did not specify what kinds of inaccuracies he meant. When asked if the potential inaccuracies had been resolved, Bryan said, “We are testing baseline conversations with the AI tool to validate the accuracy, relevance and repeatability of the prompts.” He said this includes seeing how the tool responds to invalid or misleading prompts and having humans review the results.

Experts said the different answers Castillo received when she rephrased her question reflect how these systems operate. They explained that these kinds of AI tools generate responses by predicting likely sequences of words, not by analyzing the underlying content.

“These systems are fundamentally systems for repeatedly answering the question ‘what is the likely next word’ and that’s it,” said Emily Bender, a computational linguist at the University of Washington. “The sequence of words that comes out looks like the kind of thing you would expect in that context, but it is not based on reason or understanding or looking at information.”

Because of that, small changes to how a question is phrased can produce different results. Experts also said users can nudge the model toward the answer they want. Gilliard said that is because these systems are also prone to what developers call “sycophancy,” meaning they try to agree with or please the user.

“Very often, a thing that happens when people use this technology is if you chide or correct the machine, it will say, ‘Oh, I’m sorry’ or like ‘you’re right,’ so you can often goad these systems into getting the answer you desire,” he said.

T. Philip Nichols, a Baylor University professor who studies how technology influences teaching and learning in schools, said keyword searches also provide little insight into how a topic is actually taught. He called the tool “a blunt instrument” that isn’t capable of understanding how certain discussions that the software might flag as unrelated to the course tie into broader class themes.

“Those pedagogical choices of an instructor might not be present in a syllabus, so to just feed that into a chatbot and say, ‘Is this topic mentioned?’ tells you nothing about how it’s talked about or in what way,” Nichols said.

Castillo’s description of her experience testing the AI tool was the only time in the records reviewed by the Tribune when Texas A&M administrators discussed specific search terms being used to inspect course content. In another email, Castillo said she would share search terms with staff in person or by phone rather than email.

System officials did not provide the list of search terms the system plans to use in the audit.

Martin Peterson, a Texas A&M philosophy professor who studies the ethics of technology, said faculty have not been asked to weigh in on the tool, including members of the university鈥檚 AI council. He noted that the council鈥檚 ethics and governance committee is charged with helping set standards for responsible AI use.

While Peterson generally opposes the push to audit the university system’s courses, he said he is “a little more open to the idea that some such tool could perhaps be used.”

“It is just that we have to do our homework before we start using the tool,” Peterson said.

AI-assisted revisions

At Texas State University, officials ordered faculty to rewrite their syllabi and suggested they use AI to do it.

In October, administrators flagged 280 courses for review and told faculty to revise titles, descriptions and learning outcomes to remove wording the university said was not neutral. Records indicate that dozens of courses set to be offered by the College of Liberal Arts in the Spring 2026 semester were singled out for neutrality concerns. They included courses such as Intro to Diversity, Social Inequality, Freedom in America, Southwest in Film and Chinese-English Translation.

Faculty were given until Dec. 10 to complete the rewrites, with a second-level review scheduled in January and the entire catalog to be evaluated by June. 

Administrators shared with faculty a guide outlining wording they said signaled advocacy. It discouraged learning outcomes that “measure or require belief, attitude or activism (e.g., value diversity, embrace activism, commit to change).”

Administrators also provided a prompt for faculty to paste into an AI writing assistant alongside their materials. The prompt instructs the chatbot to “identify any language that signals advocacy, prescriptive conclusions, affective outcomes or ideological commitments” and generate three alternative versions that remove those elements.

Jayme Blaschke, assistant director of media relations at Texas State, described the internal review as “thorough” and “deliberative,” but would not say whether any classes have already been revised or removed, only that “measures are in place to guide students through any adjustments and keep their academic progress on track.” He also declined to explain how courses were initially flagged and who wrote the neutrality expectations.

Faculty say the changes have reshaped how curriculum decisions are made on campus.

Aimee Villarreal, an assistant professor of anthropology and president of Texas State鈥檚 American Association of University Professors chapter, said the process is usually faculty-driven and unfolds over a longer period of time. She believes the structure of this audit allows administrators to more closely monitor how faculty describe their disciplines and steer how that material must be presented.

She said the requirement to revise courses quickly or risk having them removed from the spring schedule has created pressure to comply, which may have pushed some faculty toward using the AI writing assistant.

Villarreal said the process reflects a lack of trust in faculty and their field expertise when deciding what to teach.

“I love what I do,” Villarreal said, “and it’s very sad to see the core of what I do being undermined in this way.”

Nichols warned the trend of using AI in this way represents a larger threat. 

“This is a kind of de-professionalizing of what we do in classrooms, where we’re narrowing the horizon of what’s possible,” he said. “And I think once we give that up, that’s like giving up the whole game. That’s the whole purpose of why universities exist.”

The Texas Tribune partners with Open Campus on higher education coverage.

Disclosure: Baylor University, Texas A&M University and Texas A&M University System have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune’s journalism.

