AI is Officially Here, There, Everywhere, and Nowhere

Districts playing catch-up can still adopt sound policies for AI


When it comes to digital technology, educators and school systems haven’t historically been fleet of foot. But artificial intelligence is partially bucking the trend. Many teachers are embracing it, even as school systems, true to form, move slowly or barely at all.

Among the myriad ways school systems can respond, there are two obvious poor choices. On one end of the spectrum, they could turn entirely away from AI—which districts like New York City, Los Angeles, and Seattle initially moved to do. On the other, they could rush to use AI for its own sake rather than for a clear educational purpose. There’s plenty of pressure to put AI in the classroom—both from vendors hawking AI products and superintendents wanting to show bold leadership. It would be all too easy for districts to jump on the AI trend and repeat the mistakes of the past. Remember fads like open classrooms in the 1970s and whole language in the 80s?

AI isn’t like CD-ROMs—it’s a rapidly evolving, transformational technology. School systems should act quickly but strategically to find a sensible, educationally sound path. The best policies will integrate AI with intentionality and help students and schools make progress over the long haul.

What’s the best way forward? Don’t focus on AI. Focus on the problems that matter—and see where AI can help.

Initially Adrift

District responses to AI have been all over the map, and many districts have lurched from one approach to another. Several big-city districts banned ChatGPT almost immediately after it was launched in November 2022. But months later, most had rolled back their bans and instead started to encourage the use of AI.

For example, Walla Walla Public Schools in Washington State initially banned ChatGPT. Then, the district repealed the policy and trained its teachers in how to use AI tools.

“[I was] a little bit red-faced, a little bit embarrassed that we had blocked [ChatGPT] in the spring,” Keith Ross, the district’s director of technology and information services, told a local-news outlet. “[It] really shed light that we need to not wait on this and get moving and find out how to supply the tool to the students.”

Recent surveys of teachers and administrators reveal similar contradictions. In an EdWeek Research Center survey conducted in late 2023, about one in five teachers said their district lacked clear policies regarding AI products, and the same share reported that students were not allowed to use it. That same survey also found that more than half of teachers believed AI use in school would grow in the coming year.

A survey of district technology leaders by edtech company eSpark in November 2023 found that only 4 percent of districts had a formal, documented policy governing the use of AI. Thirty-nine percent of respondents said their districts were working on one, but 58 percent said their districts had yet to start developing such a policy. Meanwhile, 87 percent of district technology leaders reported participating in a webinar or presentation about AI in schools in the previous six months. Some 52 percent said their teachers were independently incorporating AI into their practice, but only 9 percent said their districts were doing anything systematic with AI.

It’s not hard to see why. The AI product landscape is teeming with new options for teachers to try, and few have been thoroughly evaluated by their districts. The barriers to entry for creating an AI education startup are extremely low right now—even if the sustainability and impact of such efforts are open questions. According to Reach Capital, a venture capital firm specializing in education companies, there were at least 280 education tools that “incorporate generative AI as a core engine of their product” as of September 2023. More are emerging every month, and many offer “freemium” access so that teachers can try them for free.

Along with ChatGPT, free AI tools for teachers like MagicSchool and Ethiqly have become integral to the daily work of Rachel Morey, who teaches English Language Arts at Walnut Creek Middle School in the suburbs of Erie, PA. She has used these programs to “brainstorm lesson plans, write tests, create worksheets, adapt texts to meet the needs of diverse learners,” she said, as well as to support students in writing essays and to deliver feedback. One of the biggest appeals of AI, she said, is how much time it saves her.

Tools and Guidance Emerge

How can districts close the policy and practice gap? An important first step is safeguarding sensitive student and teacher data and ensuring that clear guidelines are in place regarding plagiarism and academic work. These issues are separate from how schools actually use AI, and they draw on sophisticated technological and legal expertise. Right now, rather than focusing on detailed specifics—which is almost impossible given how quickly AI is evolving—districts need to level up and focus on key principles to help educators, students, and administrators use AI-powered products responsibly.

These are complex questions, but districts do not need to figure it all out on their own. In October 2023, the Consortium for School Networking, a professional association for school technology administrators, and the Council of the Great City Schools jointly published a “K–12 Generative AI Readiness Checklist.” The detailed questionnaire covers AI readiness from a half-dozen angles, including leadership, data, operational, and legal readiness, and was developed in partnership with Amazon Web Services.

That same month, TeachAI published its “AI Guidance for Schools Toolkit.” The initiative was created by more than 60 individuals, governments, and organizations, including Code.org, ETS, the International Society for Technology in Education, Khan Academy, and the World Economic Forum. Its three-part framework for implementing AI in schools, which starts with guidance and policy to address the risks to learning that AI poses, notes that “the first step should be ensuring that AI use complies with existing security and privacy policies, providing guidance to students and staff on topics such as the opportunities and risks of AI, and clarifying responsible and prohibited uses of AI tools, especially uses that require human review and those related to academic integrity.”

States have gotten in the game as well. The North Carolina Department of Public Instruction, for example, released guidance that prods districts to “review current EdTech providers deploying generative AI to vet their safety, privacy, reliability, and efficacy, to determine if they are appropriate to be used for your school, and which users they will be open to based on their Terms of Service and school or district policies.” Ohio published a five-part AI Toolkit for school districts, which it created with the aiEDU nonprofit organization.

Principles to Design a Path to Progress

Despite the slow pace of district-level policies, it’s also reasonable to worry that districts may move too quickly and rush to use AI without intention, just to say they are doing something with it. According to Scott Muri, superintendent of Ector County Independent School District in Texas, “What’s missing from [several of the frameworks and conversations] around AI is the vision. What are we trying to do or achieve? Where are we going?”

As education thought leader Tom Vander Ark said, “Schools need to shift the primary question from ‘how do we integrate AI into our school’ to ‘what does great learning look like and how can we use AI to support that? And what kind of work can students do with smart tools?’”

The Readiness Checklist framework thankfully starts there, as the first question asks, “Does the use of Generative Artificial Intelligence (Gen AI) align to achieving your district’s mission, vision, goals, and values?” This isn’t a rhetorical question. The answer may be no.

The risks here are great. Far too often, districts base edtech decisions on a search for technology for its own sake. School systems should not frame their efforts as an “AI initiative” unless the focus is preparing students for a world with AI or safeguarding against its downsides. Instead, leaders should follow a tried-and-true design-thinking process to successfully innovate and put AI to its best use.

That means starting with the problem the district needs to solve and the goal it seeks to achieve. Leaders should ask: is what they’ve identified a priority? Some problems relate to serving mainstream students in core subjects, while others arise from gaps at the margins, such as not offering a particular elective. Both areas are worthy of innovation. But schools shouldn’t embrace a classroom technology unless it saves teachers time, extends their reach, or deepens their understanding of their students.

With the problem or goal identified, school systems then need to be specific about what success would look like. How would they know if they had made progress? What’s the measure they would use?

From there, the focus should be identifying the student and teacher experiences needed to make progress toward the goal. And only then should schools consider the physical and virtual setup to deliver those experiences. In other words, the “stuff”—the content, curriculum, analog and digital technologies, including those powered by AI—should come at the end of the process, not the beginning.

By considering a potential role for AI within this greater context, schools can avoid succumbing to a short-lived fad without sitting on their hands and watching the world pass them by. In these early years of our AI-powered futures, the goal should be measured investments that will stand the test of time.

Michael B. Horn is an executive editor of Education Next, co-founder of and a distinguished fellow at the Clayton Christensen Institute for Disruptive Innovation, and author of From Reopen to Reinvent.
