May/June 2026 IASB Journal

A Board Member’s Guide to AI

By Jim Batson, Ed.D.

It feels like we can’t turn around without hearing about AI. From daily news cycles to conference agendas, it has largely dominated the conversation of the past few years, and nearly drowned out everything else. Artificial Intelligence has become inescapable.

When I considered writing about AI, I didn’t want to add just another article to the noise. The reality is, AI is already in use throughout your district. As public school board members, we have an obligation to ensure that the district provides a safe and effective environment in which our students can learn. The actions you take and decisions you make as a board should contribute to the well-being and success of your students.

Most of the AI conversation today centers on what is considered Generative AI, or “Gen AI.” This includes tools like Google Gemini and ChatGPT. While the concept of AI has been around since John McCarthy coined the phrase “artificial intelligence” in 1956, OpenAI and its product, ChatGPT, made Gen AI commonplace, reaching 100 million users in just two months.

This generative form of AI is relatively user-friendly and generates text, images, or even videos in response to prompts. It’s already familiar to many people, including students. Other, lesser-known forms of AI are also becoming popular.

Analytical AI, through tools like Google NotebookLM, provides repositories for storing documents and other data that you can then search, summarize, or otherwise analyze using common, everyday language requests.

Developer AI, through tools such as GitHub Copilot and Microsoft Copilot, helps programmers generate computer code in a fraction of the time it would take to write by hand.

Agentic AI, found in tools such as Zapier and Microsoft Copilot, can organize and execute tasks on a user’s behalf.

Conversational AI, also known as chatbots, has become a popular yet concerning application. Chatbots provide information and feedback in a conversational manner and often take on a human-like persona. While chatbots offer many benefits, there are significant concerns about people, especially young people, becoming emotionally attached to their chatbot “friend.” In a recent Pew Research Center study, over half of students surveyed said they had used chatbots for homework help, and 12% said they had received emotional support from their chatbot.

AI in Schools
Students, staff, and teachers alike can use AI to assist with their learning and with operational support for the district. In the recent CoSN 2025 State of EdTech study, the overwhelming majority (94%) of EdTech Leaders see AI’s potential for positive impact in education. Generative AI (Gen AI) was ranked the top tech priority, with 80% of respondents working in districts with some sort of Gen AI initiative.

Student uses of AI come in many forms, such as a virtual personal tutor that provides step-by-step guidance on complex math or science tasks. Students could use a chatbot to practice a foreign language, or use Gen AI for writing assistance as a brainstorming partner, an editor, or even a summarizer of long, complex documents and articles. A student can also become better organized by breaking large projects into multiple tasks or creating a study plan for an upcoming final exam.

Teachers can use AI to differentiate instruction by adjusting the reading level of a complex document, or to aid lesson planning by developing customized rubrics. AI could also help align lessons to learning standards more efficiently or quickly develop quizzes from complex curriculum materials.

Administrators could analyze large volumes of data, draft communications to parents, and then effectively translate them into multiple languages. Meeting transcripts can be easily summarized, and scheduling efforts can be streamlined.

Board members can use AI to summarize large documents and analyze large volumes of data.

Protecting Schools and Students
As of this writing, the state of Illinois and the Illinois State Board of Education have not provided specific guidelines or expectations related to the use of AI for school districts in our state. Public Act 104-0399, which took effect on January 1, 2026, requires ISBE to publish comprehensive statewide guidelines for AI in K-12 schools by July 1, 2026. These guidelines are expected to include guidance on key issues such as AI literacy, academic integrity, AI bias, and data privacy.

While 34 states currently have some official policy or guidance for K-12 districts, Illinois is not yet one of them. Even so, many existing policies and laws already shape the use of AI. Laws such as FERPA, ISSRA, COPPA, IDEA, Title VII, and others protect student privacy and ensure access to educational materials. A district’s own policies will have documented guidance on topics such as bullying, safety, and use of technology.

In Illinois, we also have comprehensive student data privacy rules in the Illinois School Code that define how a student’s data is protected, known as the Student Online Personal Protection Act, or SOPPA. Because an AI tool uses the data it gathers to learn, any personal information loaded into an AI tool may be stored and potentially used by others. This makes student data privacy a major concern when deploying AI applications. SOPPA requires districts to negotiate data privacy agreements with every service provider, prohibiting the sharing of student data and ensuring that the vendor and any of its subcontractors protect all student data they obtain through the use of their applications.

Because AI systems learn by gathering data, it is critical that all AI tools used in schools meet state and federal data privacy requirements. AI tools can be categorized as public or private. Those considered Private AI should be FERPA- and SOPPA-compliant, thereby keeping data private. A commonly used example of this is the Gemini version in the Google Workspace for Education suite of applications. Other examples are MagicSchool AI, School AI, and Khanmigo. Regardless of the application, if it is intended for students and gathers student information, it requires a SOPPA agreement with the vendor.

The dark side of AI, however, continues to emerge, with concerns over bullying and deepfakes becoming more prevalent. While these sorts of behavioral issues have been around for a long time, the quality and ease of generating realistic images and videos now make inappropriate content quick to create and easy to distribute, leading to significant emotional and personal damage. A district’s policies, handbooks, and guidelines should be reviewed with these new technologies in mind.

What’s Next for the Board of Education?
So, what should a board member do? Three things should be considered.

First, and most important, support the district’s efforts to embrace AI. This is especially important for AI-related professional development for everyone, including students, staff, teachers, administrators, and parents.

Second, ensure you are familiar with policies, guidelines, and legal requirements related to AI. Board policies are your responsibility, so you should understand them.

Third, ask questions about what your district is doing in response to AI. These questions should inform the board, not critique or second-guess the work of the administration. The more you understand, the better a decision-maker you will be.


Jim Batson, Ed.D. enjoys a diverse career in technology and education. He spent two decades in education technology leadership, most recently as Director of Technology for Fenton CHSD 100 in Bensenville. He has also held adjunct faculty and guest lecturer roles at community colleges and universities throughout Illinois. Batson is also in his 23rd year of school board service, and is currently Board President for CHSD 128 in Vernon Hills.