Teachers may benefit from hands-on learning as much as students when it comes to understanding generative AI. But educators need a clear vision, not just tech training, to build AI tools that effectively solve problems in their classrooms.
That’s one takeaway from an ongoing study of educator-designed AI pilots in California. Researchers from the Center on Reinventing Public Education at Arizona State University tracked more than 80 teachers and administrators in 18 California schools, including district, charter, and private campuses, who created and piloted AI tools through the Silicon Schools Fund’s “Exploratory AI” program in the 2024-25 school year.
Teams of teachers and administrators from each school received six training sessions to learn how generative artificial intelligence works, identify problems it can solve, and build and test tools to, among other things, differentiate lessons for students at different academic levels, encourage teacher collaboration, and improve student behavior.
“It was really freeing to just play around with AI and explore use cases,” said David Whitlock, who led one development team as a vice principal at Gilroy Prep charter school in Gilroy, Calif. “One of the big benefits of all this AI stuff is we can now adapt our tech to meet students and staff where they’re at versus them having to adapt to a new platform.”
CRPE found that even with relatively limited training, teachers learned to build and customize tools quickly. Whether teachers truly integrated AI tools into their instructional practice, though, depended on whether AI was being used to solve a specific problem rather than for “efficiency for efficiency’s sake.”
“The underlying instructional model that a school is using really seems to matter,” said Chelsea Waite, a senior researcher at CRPE. “AI could be a core accelerator, fueling the teachers’ capacity to deliver on an instructional goal, but in other places it was more like a paint job. In the absence of a clear vision, it ended up seeming like an interesting tool but not much else.”
Teachers built and tested new AI tools
The CRPE analysis comes as many teachers report feeling unprepared to use AI in their classrooms.
“Among our building staff, so many people think AI is taking away from our interactions with each other. It’s taken away that human touch,” said Jackie Wilson, the executive director of Summit Tamalpais High School, a charter school in Richmond, Calif., who participated in the pilot. “So we wanted to ensure that our bot was going to prompt people to want to engage more with fellow humans and learn more about how to communicate better with them, how to resolve conflict, how to increase the efficacy of their team dynamics if it was in a work environment, to manage stress, and to build their capacity as leaders.”
Wilson and her team created a chatbot that helps teachers use an Enneagram personality assessment to plan collaborations. It’s since become a fixture in the school’s professional development meetings and even parent-teacher conferences.

The development team at Gilroy Prep, part of a four-campus Navigator Schools charter network, wanted to tackle a common problem. Like many districts, the charter network uses restorative justice practices for discipline, but struggles to make time for teachers to facilitate the process while also informing parents and administrators about behavior incidents.
Whitlock, who has since become Navigator Schools’ technology innovation director, and his colleagues created an app that allows teachers to generate a restorative activity based on a discipline incident’s description and severity, the grade and reading level of the students involved, the behavioral goals desired (like empathy or responsibility), and the time available for the restorative practice.

The app has proven popular with teachers trying to respond to behavior problems on the fly.
Ally Funk, then a 6th-grade science, technology, engineering and math teacher at Gilroy Prep, used the app last year after a pair of students acted up during a field trip. The app generated a related reading with reflection and discussion questions, as well as a model letter to parents on the discipline incident and how to reinforce the lesson at home.
“Once I hit start, it comes up with a reading passage and questions to go with it, and then a whole message that I can kind of proofread and send to parents,” Funk said. “That way, I’m not having to overthink my workload over students that just didn’t want to participate in a fun field trip.”
Funk, who was on the development team in 2024-25 and has stepped in as an assistant principal at Gilroy Prep this school year, said the tool took weeks of trial and error to fine-tune. While staff could upload the school’s behavior policies and decision matrix, for example, they could not for privacy reasons enter personal student data. That meant it was limited in its ability to detect patterns.
“A chatbot is only as knowledgeable as what you teach it, and so you have to keep either feeding it information or practicing the outcome you want,” Funk said.
Gilroy Prep teachers regularly use the restorative practice generator, which is being expanded across campuses in Navigator Schools’ charter network in the 2025-26 school year. But Funk said the app only works within the context of strong student-teacher trust in the schools.
“I still think there obviously needs to be human interaction,” she said. “This restorative assignment generator just gives a piece of paper with questions based on their behavior. You have to have the relationships to build it on. So if you haven’t built [student-teacher] relationships, that should be priority no. 1.”