We’re now at a turning point for AI and education

As we settle into the summer months, education may not be top of mind for everyone, but the 2025-26 school year is just around the corner. Moreover, in the world of AI, the pace of change doesn't pause for vacation.

Earlier this year, in January and April, the federal government issued two major directives that are already reshaping the AI landscape. Although a few months have passed since their release, their impact is only beginning to unfold, and for those of us committed to supporting educators and learners, it's time to pay attention.

These shifts, largely driven by the second Trump administration, are not just regulatory headlines or noise. They have direct and significant implications for how AI will affect education in the months and years ahead.

If you're in the business of supporting educators and learners, whether through content, curriculum design, platforms, or policy, this is a moment to pay close attention. Here's what happened, what it likely means, and what we can do about it.

What changed: An order and a memo

1. January 23, 2025: Executive Order 14179

President Trump signed Executive Order 14179, titled "Removing Barriers to American Leadership in Artificial Intelligence." This order revoked President Biden's 2023 AI executive order, which had emphasized safety, equity, civil rights protections, and transparency in AI development and deployment.

In its place, the new order mandates that federal agencies “revise, suspend, or rescind” policies that might restrict innovation in AI. It prioritizes global competitiveness and removes many of the procedural guardrails introduced to manage ethical risks and mitigate bias.

It also sets a deadline for agencies to propose new AI action plans that support American dominance in the field.

2. April 3, 2025: OMB Memos M-25-21 and M-25-22

Building on the executive order, the Office of Management and Budget issued new guidance on how federal agencies should acquire, deploy, and govern AI. These memos focus on streamlining AI adoption, reserving rigorous risk-based governance primarily for "high-impact" systems.

Agencies must still designate chief AI officers and submit inventories of AI use, but the tone and direction have shifted markedly: the priority now is speed, innovation, and cost-efficiency.

Why this matters for education now

AI is already reshaping teaching and learning, from personalized learning platforms to automated grading tools and AI-generated (and human-curated) curriculum content. And federal policy doesn't just influence regulation; it sets the tone for what types of innovation are encouraged, funded, and widely adopted.

These two actions have several implications for education:

  • Signal a green light for edtech innovation: Developers of AI-powered educational tools are now operating in an environment with fewer regulatory constraints. This may encourage more experimentation and rapid deployment of new tools into classrooms.
  • Reduce focus on equity and ethics: The revoked Biden-era executive order had strong provisions on ensuring AI didn't deepen racial, linguistic, or ability-based disparities in education. That emphasis has faded, and we risk seeing a proliferation of tools that perform well technically but are less attuned to the needs of the diverse learners in real classrooms.
  • Pressure on educators to adapt: With more AI-related tools hitting the market faster, teachers may be expected to integrate something new into their practice with, yet again, little time, training, or clarity. This could widen the already growing gap between well-resourced schools and those that lack infrastructure or support.
  • Limited accountability mechanisms: The new policies shift the burden of risk assessment and governance onto agencies and, by extension, the local districts and educators who adopt AI tools. In many cases, teachers and administrators won't have the resources or expertise to independently vet these technologies.
  • Growing importance of AI literacy: As AI tools become more embedded in the classroom, both students and educators need a foundational understanding of what these tools can and cannot do, for better and for worse. Without this literacy, we risk not only misusing powerful tools but also falling behind in the essential skill of discerning fact from fiction in an AI-influenced information environment.
  • Heightened concerns around data privacy and security: With increased AI integration comes greater data collection. In a landscape with loosened federal safeguards, the onus may now fall on local actors to ensure that sensitive student data is handled responsibly and securely.

What we can do: 5 steps to consider

1. Double down on ethics by design

Make fairness, transparency, and inclusion a core part of your product or content development process, not just for compliance, but for trust and impact. Make it part of your product or content value proposition.

2. Create resources for educators

Develop toolkits, training guides, and decision frameworks to help teachers understand AI capabilities, limitations, and ethical concerns. Given the pace at which things are moving, think about how you can offer these aids in a "just-in-time" fashion with frequent revisions and updates.

3. Prioritize inclusive data

Ensure the data informing your AI tools represents diverse learners across race, language, ability, and socio-economic status. While such measures may be less popular in the current political environment, those of us committed to education and student success understand that these concepts apply to all students, regardless of background, identity, or experience.

4. Build evaluation tools

Offer rubrics or third-party evaluation mechanisms so schools and districts can assess the quality and bias of AI tools before adoption.

5. Advocate locally

Federal policy may have shifted, but many state and district leaders still care deeply about safe and equitable learning environments. And, with the moves to dismantle the Department of Education, the center of gravity is shifting to states and districts anyway. Partner with those local leaders.

Begin with awareness, end with action

The guardrails may have come off, but the responsibility to students and teachers, and to our future, remains. Those of us who build, fund, or advocate for educational content and tools have a critical role to play in ensuring that AI serves all learners and their teachers well.

That begins with awareness, but it must end in action.

Eric Stano
Eric Stano is the vice president of consulting, curriculum, and product strategy at Magic EdTech.
