Department of Education AI Toolkit

The Department of Education[1] released a toolkit for Safe, Ethical, and Equitable AI Integration last October. I finally made the time to read it and love what I found. As with most things in education, if it’s sensible in that context, it is worth considering in every context.

Of all institutions to look to for guidance on burgeoning technology, the Department of Education is one of the most fascinating. Consider its mission:

The mission of the Department of Education is to promote student achievement and preparation for global competitiveness by fostering educational excellence and ensuring equal access for students of all ages.[2]

Achieving this goal requires promoting relevant skills and technology while balancing the need for a measured, sound pedagogical approach. The world of education is paradoxical through and through: it’s progressive in its desire to chase after new systems to improve student outcomes while maintaining a strict conservatism in its overall structure. It wants to standardize what students learn while emphasizing the necessity of flexible, open-ended approaches. It is always looking to adjust itself to educate students better but can’t afford a misstep that could ruin the education of a generation. American society hugely undervalues teachers, but those educators have the most significant impact on the future of that same society. Education is quite the odd duck, but this inherent tension can lead to incredibly thoughtful approaches that, in their best iterations, guide and don’t dictate.

The department’s AI Toolkit is ideal in this sense. It is true to its title in that it doesn’t contain rigid rules that attempt to fit every scenario or, worse, only the typical scenario. Instead, it is a set of principles, thought experiments, and workshops that an administrator or other educator can bring to their peers to determine a sensible approach for their school system with full awareness of the tradeoffs they may be making.

It starts with a metaphor, stated in a charming way that has “written by committee” all over it:

Consider the metaphor of a mountain trek to represent the journey of incorporating AI in education.

Everyone knows you must declare your metaphors.

Whether you’re bullish or bearish, this metaphor is apt for the future of AI. The document briefly expands:

Like preparing for a challenging climb, achieving AI success requires careful planning, teamwork, and risk management.

This understates the connection. Trekking up a mountain can be disastrous if one is ill-prepared, misled, over-confident, or if the whims of nature cause a fuss. The current state of AI makes it difficult to determine which companies are selling quality hiking poles and which are hawking oxygenated boots that claim to eliminate all issues with altitude sickness, letting you traipse up in one go without any acclimation. Because the accessibility and power of these AI tools are so new, we accept their abundant and terrible mistakes. The issue is that those mistakes, so innocuous and humorous when it’s a single person asking about a pizza recipe, become calamitous when scaled up to a school, a district, a state, a country. Would we feel the same way about a tent that needed to keep us warm and dry at 12,000 feet? What about keeping your children warm?[3]

If nobody has yet summited a peak, it’s unwise to get in line behind the now-motionless queue of dead bodies that thoughtlessly threw themselves at the challenge. Or, worse, they were told by their mentors, who were told by their supervisors, who were told by a man with a monocle, that the mountain was where fortune lies. In the same way, we can’t unquestioningly commit to AI in education just because some extremely wealthy companies tell us it’s time to do so. There’s no doubt that the tools in their current state are fascinating and already beneficial when appropriately applied; I also have little doubt that new tools will come about, though I’m not confident that the current approach is capable of an acceptable level of accuracy and dependability. Regardless, there’s a vast difference between someone using ChatGPT as a glorified WebMD or to write emails and systemically adjusting education in a way that will alter millions of lives.

But I digress.

The Toolkit succeeds in helping educators collaborate on planning and risk management. Its ten modules are organized into three parts:

  1. Mitigating Risk
  2. Building a Strategy for Integration
  3. Maximizing Opportunity and Guiding Effective Use

These three sections create a balance that addresses the tension explored above: focus first on the possible risks and fully understand them before developing a strategy; once the plan is ready for implementation, begin to optimize and make sure all users are well-equipped to handle these changes.

As stated above, this strategy works well for education and applies to any other context. Beyond that, the broad three-part framework listed above is good guidance for any new tool or process. How detailed one needs to be depends on the potential impact of the change, but I plan to think through these steps in my regular operational and project work.

Of course, the modules comprising these three broader sections are detailed and focused explicitly on AI. If you look through the document, concentrate on the discussion questions near the end of each module. These provide the best insight into the potential concerns educators should have and highlight the remarkable scope this Toolkit can cover.

Consider this suggested question from Module 6: Planning Your AI Strategy.

How do we balance attention to important problems that do not have immediate technological solutions (e.g., truancy) with attention to problems that are a fit for AI and other emerging technologies?

This is an excellent question. Broadened to other disciplines, someone in a planning session can ask: What are we at risk of forgetting about because of this new project, and how do we avoid doing so?

Many objections are leveled against government bureaucracy, but the resources it produces are high-quality, straightforward, and extremely useful, even beyond their intended scope. I hope educators make use of this Toolkit, that many institutions more broadly use the NIST AI Risk Management Framework to guide their discussions and decisions, and that we can find a way to balance the excitement and fervor for new technology with the need to take it slowly and avoid own-goals that significantly impact the safety, privacy, and future of our students.

1. Look: I had an existential crisis while drafting this blog post. I regularly read newsletters about education and EdTech as part of my job and am acutely aware of the hubbub around the potential elimination of the Department of Education. Couple that with relaxed restrictions on AI development from the new administration, and I ended up in a loop of questions: Does any of this matter? For all I know, once Linda McWrestling is in charge of public education, these guidelines and toolkits will be formally retracted. I decided that wasn’t the blog post I wanted to write; all this politicking doesn’t change a good idea, and I hope thousands of schools and districts have seen this Toolkit and will consider it important regardless of what Sam Altman tells Trump is safe.
2. Let’s ignore this part near the end, which is surely due to change.
3. If only we could agree on what “But think about the children” meant.
