Responsible AI Use in School Board Governance
By Gary Fasules
In the mid-1980s, I wrote my master’s thesis on how Fortune 500 companies used marketing models. At that time, most relied on simple spreadsheet analysis — not because the problems or analysis were simple, but because data and processing power were limited.
Forty years later, that limitation has disappeared. Data now grows exponentially, roughly doubling every two years, and computing power continues to expand at a similar rate. These two factors are crucial to the rapid advancement of artificial intelligence. Meanwhile, today’s high school students are being introduced to computer science, data science, and AI at a much faster pace than before.
Given these developments, school boards across the country are working to figure out how to address artificial intelligence through district policy. The Illinois Association of School Boards has issued policy guidance on AI use in schools. (See sample PRESS Policy 6:235, Access to Electronic Networks; Administrative Procedure 6:235-AP1, Acceptable Use of the District’s Electronic Networks; and Administrative Procedure 6:235-AP3, Development of an Artificial Intelligence (AI) Plan and AI Responsible Use Guidelines.)
However, an important governance question remains largely unexplored: While boards are developing policies on AI use in their districts, are they also considering how AI might affect their governance work?
There will always be advances in technology that expand the tools available to school boards and districts. While the tools may evolve, the responsibilities of governance remain the same. The question is not whether artificial intelligence will be present in our district; it already is. The governance challenge is whether boards themselves can responsibly use emerging technology while upholding the principles of good governance.
AI is now a tool accessible to board members, but how do boards ensure it supports meaningful discussion rather than replacing it? How do they keep the line between governance and administration clear? And how do boards of education maintain the collective decision-making that is central to effective leadership?
The answer lies in anchoring any board’s use of AI firmly within established principles of good governance. By doing so, boards can approach any emerging technology with clarity and discipline rather than uncertainty or reaction.
Let’s explore how boards can approach artificial intelligence through the lens of governance.
IASB clearly outlines key principles of effective governance: maintaining the distinction between governance and administration, acting with integrity, engaging the community responsibly, and upholding transparency and accountability. As AI becomes more prevalent in education, these principles will remain unchanged. In fact, they become even more crucial. For board presidents and board members, responsible use of AI should be grounded in three governance standards: role integrity, ethical discipline, and community-centered accountability. These standards align directly with the IASB’s expectations and foundational principles for good governance. Today, school boards across the country face decisions in an environment where communities expect greater transparency, accountability, and trust.
For boards, AI is not just a technical tool; it is a governance responsibility.
Many board members are skeptical of, and even distrust, AI. That skepticism is understandable. However, when used properly and within clear governance boundaries, AI can be a valuable support tool for school boards. When used carelessly, it can undermine role clarity, ethical responsibilities, and public confidence.
The IASB guidance clearly states that AI-enabled tools must be used safely, ethically, and equitably, in accordance with board policy. For school boards, this means AI use is no longer just a technical matter; it is a governance concern. Board members must ensure that all AI use, whether by students, the administration, or the board, adheres to established policies and legal requirements, while maintaining the important distinction between governance and administration.
Role Integrity: Maintaining the Governance-Administration Boundary
The first principle, role integrity, is fundamental. Effective governance relies on clearly distinguishing the board’s duty to set direction and provide oversight from the administration’s duty to execute and run district operations. As AI tools become more accessible, it is increasingly important that boards ensure their use supports thoughtful governance rather than slipping into operational decision-making.
AI must reinforce — not blur — the distinction between governance and administration.
AI tools can inadvertently blur that line if board members are not cautious in how they use them. When a board member begins using AI to examine operational options, develop implementation strategies, or propose logistical solutions, the discussion can quickly shift from governance to management. Although this shift may seem subtle, it alters the board’s expected role and crosses the boundary between the board’s and the superintendent’s responsibilities in a school district.
A key to maintaining the separation between governance and administration is proper AI prompt design. How a board member formulates a prompt largely determines whether AI supports thoughtful governance or slips into operational decision-making.
Weak prompts often result from ambiguity, unclear roles, or leading language. These issues can cause responses to shift from governance questions to operational directives. On the other hand, effective prompts are neutral, open-ended, and clearly aligned with the board’s governance responsibilities.
Governance-focused prompts help board members understand issues, anticipate community concerns, and prepare for meaningful engagement. These prompts explore impacts, values, tradeoffs, and oversight considerations. Operational prompts, on the other hand, seek guidance on how programs should be designed, which options to choose, or how changes should be carried out.
As mentioned earlier, the line between governance and operational prompts can be subtle; sometimes only a few words make the difference. Consider the issue of school attendance boundary changes.
A governance prompt might be: “What concerns might parents raise if the district considers changing school attendance boundaries?” This prompt raises potential concerns about student well-being, academic opportunities, transportation logistics, property values, and fairness in decision-making. Now, change just a few words: “How should the district change school attendance boundaries?” The prompt has shifted to an operational focus. Instead of identifying community concerns, it begins to suggest how the district might implement the change by using data analysis, prioritizing student stability, reducing transportation challenges, ensuring fairness among schools, and engaging the community. The difference may seem small, but this shift moves the board from understanding an issue to directing its implementation.
Therefore, board members should never use AI to design solutions, evaluate staff performance, determine logistics, or signal commitments. Maintaining this boundary safeguards the integrity of the board’s role and supports proper administrative authority.
Ethical and Collective Governance Discipline: Protecting Confidentiality and Public Trust
Maintaining role integrity is the primary step for board members in responsible AI use. However, staying within the board’s governance role does not remove the ethical issues associated with AI. Questions about confidentiality, transparency, and the integrity of board discussions remain key to effective governance. Therefore, any board member who chooses to utilize AI must do so with a strong commitment to ethical standards.
AI use must reflect board integrity, confidentiality, and public accountability.
Privacy and confidentiality are vital ethical responsibilities of the board when using AI. Board members should never consider AI platforms as private spaces for discussion. Prompts must be sanitized to remove district names, individual names, identifying details, district-specific data, or any sensitive nonpublic information. This rule applies whether board members are using personal devices or district devices and networks.
Just as board discussions are governed by open meetings laws and ethical standards, AI use must adhere to the same principles and caution. Neutral wording of prompts is crucial. Prompts should avoid making assumptions that could influence the AI’s response or suggest a predetermined outcome. Responsible AI use demonstrates respect for individuals, protects the district, and fosters public trust. AI does not replace ethical responsibility; it requires it.
The Illinois Open Meetings Act aims to ensure that boards act collectively and publicly. It encourages transparency during discussions, provides the public access to those conversations, and requires advance notice of the issues to be addressed. However, artificial intelligence is generally used by individuals and often outside of group debates. If not handled carefully, this difference can lead to subtle governance challenges.
For example, a board member might privately use AI to shape arguments, draft statements aligned with perceived community concerns, or organize talking points on an issue. While this may seem like simple preparation, it could unintentionally sway board discussions. Boards have long operated under the principle of “one gets, all get,” meaning that information available to one member should be accessible to all, so decisions are based on a shared understanding.
Therefore, board members who choose to use AI individually must exercise discipline and good judgment. AI should not drive the conversation. Instead, it should aid the board member’s personal preparation and understanding, not serve to advocate a position or frame an argument that shifts deliberation outside the public conversation. Maintaining open and honest deliberation among board members is crucial for ethical conduct and effective governance.
Community-Centered Accountability: Strengthening Listening and Deliberation
Ethical discipline ensures that board members use AI in a way that protects confidentiality, maintains transparency, and preserves the integrity of board deliberations. However, role clarity and ethical use alone are not enough. School boards exist to serve the communities that elect them and the students those communities entrust to their care. One of the most important responsibilities of a board is to listen carefully to community concerns, consider those perspectives thoughtfully, and deliberate openly before making decisions that affect the district.
AI should strengthen listening and informed deliberation — not replace community voice.
Given this, when might it be appropriate for boards to use AI in their governance role? Ideally, during preparation for engaging the community on complex or sensitive issues, such as consolidating bus routes to address transportation costs. Even in these situations, AI should be used only to support understanding and preparation, not decision-making.
When used properly, AI can assist the board in identifying common community concerns, explaining why these issues matter to families, and highlighting questions board members should be prepared to address. In the above situation, governance-focused prompts ask what the board needs to understand before engaging with the community, such as values, tradeoffs, and potential impacts. Operational prompts, on the other hand, seek to determine how an issue should be implemented, by whom, and on what timeline. Keeping this distinction clear helps maintain role clarity and public trust.
An example of a well-constructed governance-focused prompt related to transportation consolidation could be:
I am a school board member preparing for community engagement regarding a possible consolidation of bus routes to reduce transportation costs. My goal is to understand potential community concerns and prepare to listen respectfully. Do not advocate for a specific outcome, provide operational detail, or make commitments. Provide a one-paragraph response written in a professional tone.
When this prompt was submitted to an AI platform, the response highlighted potential concerns related to student safety, equity, family routines, and trust in the decision-making process. It did not suggest a specific course of action or propose implementation strategies. Instead, it emphasized listening, which is exactly the board’s role at this stage.
Although this example emphasizes transportation, the same method applies to many governance decisions. Boards can adopt this framework when preparing for discussions on budget changes, attendance boundary adjustments, facilities planning, program growth or reductions, or other issues that greatly affect the community. While specific issues may vary, the governance discipline stays the same: Follow board policy, define the board’s role clearly, frame questions neutrally, and use AI to prepare for listening rather than justifying outcomes.
From a governance standpoint, AI should function solely as a support tool. It must only assist board members in anticipating concerns, organizing their thoughts, and engaging more thoughtfully with the community. It does not replace judgment, substitute for deliberation, or grant authority.
Ultimately, AI will not improve governance. It will reveal it. AI will amplify the clarity or the confusion that board members have when approaching their responsibilities. When rooted in role integrity, ethical discipline, and community-focused accountability, AI can enhance preparation, improve listening, and enable more thoughtful public engagement. When misused, it can muddy roles, introduce bias, and weaken the trust that boards are chosen to protect.
Illinois school boards have experienced significant technological changes before, including the expansion of cellular networks, the adoption of online board portals, and the livestreaming of meetings. Each of these shifts altered how boards communicated, but none changed how they were expected to govern. The same holds true for AI.
Effective governance in the age of AI will not be measured by how the technology advances, but by how disciplined board members stay. The core principles of good governance remain unchanged despite new tools. If board members decide to use AI, they must do so with restraint, clarity, and a strong commitment to their governance responsibilities. AI should never be employed to generate solutions, manage staff, or approve commitments. Public confidence depends on maintaining that distinction — and on the board’s willingness to uphold it.
Gary Fasules is a Director of Outreach and Training with IASB. He works with districts statewide on training initiatives and building customer engagement programs. This is the first in a multi-part series by Fasules connecting school board members with the world of artificial intelligence.
AI Survey for School Leaders
This article by Gary Fasules, IASB Director of Outreach and Training, explores how boards can approach AI within their governance responsibilities. IASB invites you to share your perspective through a brief survey on AI use, governance, and community engagement. Your responses will help shape upcoming AI articles that reflect what board leaders are experiencing across the state. Access the survey here: AI and School Board Governance Survey.