Oliver Page
Case study
July 23, 2025
As artificial intelligence tools explode in availability and popularity, educators across the country are bringing them into classrooms, often without official approval or oversight. From AI writing assistants to automated grading platforms and real-time tutoring bots, teachers are exploring new ways to support students using the latest innovations. The problem? Many of these tools are unvetted, untracked, and unaccounted for.
Welcome to the era of Shadow AI in schools, a quiet but rapidly growing cybersecurity and compliance risk.
Just like Shadow IT (the use of unauthorized hardware or software), Shadow AI refers to artificial intelligence tools that staff or students begin using independently, without involvement from school IT departments or administrators. And while the intent is often good (boosting efficiency or engagement), the consequences can be serious: student data exposure, FERPA violations, and insecure systems being quietly introduced into your district’s network.
AI tools are released and adopted at lightning speed. New chatbots, grading apps, and curriculum enhancers hit the market weekly. Districts, however, move more slowly, bound by policy reviews, board approvals, and procurement processes.
This lag means that teachers often experiment with AI before a formal review process even exists, creating a vacuum in governance.
Unlike legacy software that required installation or licensing, many AI tools are accessible via browser or smartphone app. There's no tech ticket to submit, no approval needed. One login with a school email, and a teacher or student can be off and running.
Most IT systems don’t detect this usage unless explicit monitoring tools or firewalls are in place, meaning these tools can live undetected in a classroom for months.
Teachers want what’s best for their students. When they discover a tool that helps struggling readers or automates repetitive tasks, they’re inclined to test it immediately. But without understanding where the data goes, what permissions are granted, or whether privacy terms comply with district policies, well-meaning adoption can lead to high-stakes data exposure.
Allowing unvetted AI tools into the educational environment opens up serious cybersecurity and compliance gaps, including:
- Student data exposed to third-party vendors, with no visibility into where it goes or how it is stored
- FERPA violations when student records or work are shared without the required safeguards
- Insecure systems quietly introduced into the district’s network
None of these gaps stems from a malicious decision, but each one introduces real vulnerabilities without district knowledge or approval.
Rather than punish early adopters or ban AI tools altogether, districts should take a proactive, collaborative approach to governing AI usage in classrooms. Here’s how:
Teachers and staff need a clear, simple way to:
- Disclose the AI tools they are already using in the classroom
- Request review and approval of a new tool before adopting it
This doesn’t have to be complex. A simple form integrated into the edtech approval workflow can work.
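As one concrete illustration, here is a minimal sketch of the fields such a request form might capture, expressed as a Python data class. The field names and the example submission are illustrative assumptions, not a prescribed standard:

```python
# A minimal sketch of the fields an AI tool request form might capture.
# Field names here are illustrative assumptions, not a prescribed standard.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AIToolRequest:
    tool_name: str                 # the product name as teachers know it
    vendor_url: str                # vendor homepage or privacy policy link
    requested_by: str              # staff email, for follow-up questions
    intended_use: str              # brief description: grading, tutoring, etc.
    student_data_shared: bool      # does student work or PII leave the district?
    grade_levels: list[str] = field(default_factory=list)
    date_submitted: date = field(default_factory=date.today)

# Example submission, as it might arrive from a simple web form:
request = AIToolRequest(
    tool_name="ExampleTutorBot",
    vendor_url="https://example.com/privacy",
    requested_by="teacher@district.example",
    intended_use="Real-time reading support for struggling readers",
    student_data_shared=True,
    grade_levels=["3", "4"],
)
print(request)
```

Whatever the format, the goal is a record the IT team can act on: who is asking, what the tool does, and whether student data leaves the district.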
Even experienced educators may not understand what makes AI tools uniquely risky. Offer brief training sessions (30 minutes or less) that explain:
- Where data entered into an AI tool goes once it leaves the classroom
- What permissions a tool requests, and what those permissions actually grant
- How to tell whether a tool’s privacy terms comply with district policies and FERPA
Maintain a vetted list of AI tools that meet your district’s compliance, data privacy, and security standards. This empowers teachers to innovate within safe parameters.
Consider using third-party evaluation services or partnering with cybersecurity organizations to streamline vetting.
If your district uses firewall or DNS logging tools, configure them to flag new or unknown AI domains. While this won’t catch everything, it helps surface emerging usage patterns for investigation.
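For districts that can export DNS query logs, a lightweight script can do this kind of flagging. The sketch below assumes a CSV export with timestamp, client_ip, and queried_domain columns, and two hand-maintained lists (known AI service domains and district-approved ones); the file path and column names are assumptions to adapt to whatever your firewall or DNS filtering product actually produces:

```python
# A minimal sketch of flagging unreviewed AI domains in DNS query logs.
# The log format, file path, and domain lists below are assumptions;
# adapt them to what your firewall or DNS filtering product exports.
import csv

# Domains the district has already vetted and approved.
APPROVED = {"approved-ai.example.com"}

# Known AI service domains to watch for (maintain this list over time).
KNOWN_AI_DOMAINS = {"chat.openai.com", "gemini.google.com", "claude.ai"}

def flag_unreviewed_ai_queries(log_path: str) -> list[tuple[str, str]]:
    """Return (client_ip, domain) pairs for AI domains not yet approved."""
    flagged = []
    with open(log_path, newline="") as f:
        # Assumed columns: timestamp, client_ip, queried_domain
        for row in csv.DictReader(f):
            domain = row["queried_domain"].lower().rstrip(".")
            if domain in KNOWN_AI_DOMAINS and domain not in APPROVED:
                flagged.append((row["client_ip"], domain))
    return flagged

if __name__ == "__main__":
    for client, domain in flag_unreviewed_ai_queries("dns_queries.csv"):
        print(f"Review needed: {client} queried {domain}")
```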
Ensure your Acceptable Use Policy (AUP) includes references to AI tools. Clearly define acceptable behavior, prohibited usage, and student responsibilities.
Then reinforce this through digital citizenship programs, making AI literacy part of cybersecurity literacy.
The rise of Shadow AI in schools doesn’t mean educators are reckless; it means they’re resourceful. Teachers are eager to embrace tools that help students learn. It’s up to leadership to meet that energy with clear policies, trusted guidance, and secure systems.
Districts don’t need to fear AI, but they do need to govern it. That starts with visibility, communication, and practical approval pathways.
If your school or district needs help designing an AI tool vetting process or staff awareness training, CyberNut can help. We offer templates, review workflows, and educator-friendly training materials to turn Shadow AI into safe AI.
Visit cybernut.com to schedule a consultation or access our free AI Policy Starter Kit for schools.