For the past couple of years, unrelenting change has come fast.
Even while schools are stuck dealing with deep challenges, COVID-19 pandemic relief funding is running its course. Meanwhile, new technologies seem to flow out in an unstoppable stream. These often have consequences in education, from an increase in cheating on assignments enabled by prose-spewing chatbots, to experiments that bring AI into classrooms as teaching assistants or even as students.
For some teachers and school leaders, it can feel like an onslaught.
Some educators connect AI to broader changes that they perceive have been harmful to students, says Robin Lake, director of the Center on Reinventing Public Education. Through interviews, she's found that some educators link AI to social media and cellphones. So they're having an understandably emotional response, she adds: "It's kinda scary if you think about it too long."
But amid this ever-shifting stream of change, Lake is among those who believe new technology can be steered toward a more promising channel: reducing disparities in education in the U.S.
However, if that's going to happen, it's imperative that education leaders start pushing AI to transform teaching and learning in ways that are beneficial, particularly for low-income and historically disadvantaged students, observers like Lake argue.
If artificial intelligence doesn't help solve disparities, advocates worry, it will worsen them.
Hazard Lights
AI has been used in education since at least the 1970s. But the recent barrage of technology has coincided with a more intense spotlight on disparities in student outcomes, fueled by the pandemic and social movements such as the protests over the killing of George Floyd. AI has fed hopes of greater equity, thanks to its promise to personalize learning and to boost efficiency and sustainability for an overworked teaching force.
In late 2022, the White House released a "Blueprint for an AI Bill of Rights," hoping that it would strengthen privacy rights. And last year, the U.S. Department of Education, along with the nonprofit Digital Promise, weighed in with recommendations for making sure this technology can be used "responsibly" in education to increase equity and support overburdened teachers.
If you ask some researchers, though, it's not enough.
There have been fears that AI will inadvertently magnify biases, whether by relying on algorithms trained on biased data or by automating assessments that sort students into different learning paths while ignoring their experiences.
Now, some early data suggests that AI could indeed widen disparities. For instance: Lake's organization, a national research and policy center that's associated with Arizona State University's Mary Lou Fulton Teachers College, released a report this spring that looked at K-12 teachers' use of virtual learning platforms, adaptive learning systems and chatbots. The report, a collaboration with the RAND Corporation, found that educators working in suburban schools already profess to having more experience with and training for AI than those in urban or rural schools.
The report also found that teachers in schools where more than half of students are Black, Hispanic, Asian, Pacific Islander or Native American had more experience using the tools, but less training, than teachers who work in majority-white schools.
If suburban students (on average, wealthier than urban or rural students) are receiving more preparation for the complexities of an AI-influenced world, it opens up really big existential questions, Lake says.
Big Promises, or Problems
So how can advocates push AI to deliver on its promise of serving all students?
It's all about strategy right now: making smart investments and setting down smart policy, Lake says.
Another report from the Center on Reinventing Public Education calls for more work to engage states on effective testing and implementation in their schools, and for the federal government to put more detailed guardrails and guidance in place. The report, "Wicked Opportunities," also calls for more investment into research and development. From its perspective, the worst outcome would be to leave districts to fend for themselves when it comes to AI.
Part of the reason urban districts are less prepared for AI may be the complexity and sheer number of issues they face, observers speculate. Superintendents in urban districts say they are overwhelmed, Lake says. She explains that while they may be excited by the opportunities of AI, superintendents are busy handling immediate problems: pandemic recovery, the end of federal relief funding, enrollment declines and potential school closures, mental health crises among students, and absenteeism. What these leaders want is evidence that suggests which tools actually work, as well as help navigating edtech tools and training their teachers, she adds.
But other observers question whether AI is truly the answer to the broader structural problems schools face.
Introducing more AI to classrooms, at least in the short term, means more teaching through screens and virtual learning, argues Rina Bliss, an associate professor of sociology at Rutgers University. But many students are already getting too much screen and online time at home, she says. It degrades their mental health and their ability to work through assignments, and educators should be cautious about adding more of either, Bliss says.
Bliss also points to a "print advantage," a bump in how much is learned from print materials compared to screens, which has to do with factors like engagement with the text and how quickly a student's eyes can lock onto and stay focused on material. In her view, digital texts, especially when they are connected to the internet, are "pots of distractions," and increasing screen-based instruction can actually disadvantage students.
Ultimately, she adds, an approach to instruction that overrelies on AI could reinforce inequality. It's possible that these tools are setting up a tiered system, where affluent students attend schools that emphasize hands-on learning experiences while other schools increasingly depend on screens and virtual learning. These tools shouldn't replace real-world learning, particularly in under-resourced schools, she adds. She worries that excessive reliance on this technology could create an "underclass of students" who are given artificial stopgaps to big problems like school understaffing and underfunding. It wouldn't be responsible to lean on AI as the quick fix for all our economic shortages in schooling, Bliss argues.
So how should educators approach AI? Perhaps the correct posture is cautious hope and deliberate planning.
Nobody knows precisely how AI will impact education yet, argues Lake, of CRPE. It is not a panacea, but in her estimation there's a real opportunity to use it to close learning gaps. So it's important to craft plans to deliver on the potential: "A lot of people freeze when it comes to AI, and if they can instead think about what they want for their kids, their schools, and whether AI can help, that seems like a productive path to me, and a much more manageable one," Lake says.
There's nothing wrong with being hopeful, she adds.