News and Insights
Roads Diverged: The AI Path We Choose Will Make All the Difference for Education
February 19, 2026
A Q&A with Kate Johnson and Betheny Gross
AI is rapidly reshaping our world. With mixed reactions and myriad efforts across the education sector, countless individuals and organizations are focused on identifying ways we can leverage this tool to improve access, instruction, and outcomes. Yet the siloed systems in the U.S. create unique challenges in the integration and application of AI-related strategies and tools. WGU Labs Research Director Dr. Betheny Gross has spent nearly 25 years researching what makes K-12 and post-secondary education systems work. Here she shares her thoughts on the bold path the sector must choose to make necessary change for the future.
What have you observed over your 20-plus years in education research about how technology has evolved in shaping education?
When I started in research, the technology in education was simply in the back-office systems, and some of the earliest technologies I witnessed in the field were statistical efforts to measure quality. Tech was used by the system to improve performance. We saw all those waves of smart boards, but those kinds of technologies really didn’t change anything. They, maybe, removed some of the chalk dust that might have been in classrooms. Computers and the internet changed the way information was shared, the way students interacted with information, and the way teachers interacted with information. Then we saw this wave of Learning Management Systems (LMS) and collaborative technologies, which, again, were facilitative of things that were already happening.
As I was finishing my time with K-12 research, I did several studies related to the movement around personalized learning, and this is where folks were really trying to figure out if they could change the instructional model with technology. Trying to think about how we can use technology, computers, computing power to make a more individualized experience for students, to give students more autonomy and choice in their learning. I’m sure there are plenty of people who might disagree with me about this, but my general summation is it didn’t quite take.
Now we’re in this age of AI. At WGU Labs, we think this could really be transformative. We could build an entirely new model of post-secondary education. We think. That’s what we’re in the process of figuring out. Right now, we’re still very optimistic. That said, I still don’t think this is going to change K-12. I would love to be proven wrong.
Why do you think K-12 will have a harder time with AI?
To build something different, you need a piece of the market that’s underserved or untapped. And the thing about K-12 is every kid goes to school already. AI will be used in the K-12 space, but, I expect, it will be used to perform the main functions and the main roles that we recognize today a little bit more efficiently. I’m not anticipating a big change in the way teaching and learning happens in K-12. In higher ed, I would say, maybe not higher ed, but in post-secondary education, I’m more optimistic, because one of the things that we noted early on, especially in our work, is that there are somewhere on the order of 25 million people in the United States who do not pursue any post-secondary learning at all. That is a market that is unserved, and we can build for that market, and in so doing, build something truly different.
You made a distinction there between post-secondary and higher ed. Can you tell me why, and what that means when it comes to AI?
We tend to identify higher ed with the existing institutions that today provide four-year degrees. These systems are highly defined in how they deliver education, and who they deliver it to, in part because they have a long tradition themselves, and also in part because there’s a whole system of regulations and rules that governs and constrains what they’re able to do. Post-secondary education is something broader. It’s all the opportunities for learning after you exit the primary and secondary system, the K-12 system.
There are all kinds of learning experiences, and they can come through a variety of different providers and institutions, and I don’t think we have yet seen the full spectrum of what’s possible. We’re in an era where we have to think creatively about this, because people need to consistently come back and get a top-up on skills, explore new areas and disciplines as the labor market evolves much more rapidly. So, the idea that you can go to school for four years and be set for the rest of your career is fading. This creates a lot of opportunities for us to really rethink how we design and deliver and provide learning experiences throughout people’s lives.
When we first started working together, we talked about how AI-powered institutions could make higher education more affordable and accessible—almost like a gym membership: pay-as-you-go and focused on earning skills and credentials over time. Given how big of a barrier cost is, how much could this change help solve the affordability problem and change financial aid in higher education?
There was a need to rethink financial aid even before AI was on the scene. There was a need to provide a more diverse range of post-secondary offerings, to have clearer expectations and to give students access to resources. I hope that this more diverse portfolio of educational offerings is a stimulus for the government and regulatory agencies to rethink the rules we’re putting on accessing aid. They are doing some of that with the Workforce Pell.
What do you think are the biggest gaps in research or narratives about AI and education?
I’m a little concerned with the vision and narrative that’s out there. I’m surprised that we’re still talking about kids cheating, and what that demonstrates is that we haven’t imagined a different vision of teaching and learning. We’re panicked that these things are corrupting our existing patterns and practices, and kids are using ChatGPT to do essays. I’m not surprised by the consternation, but I am surprised that we haven’t figured out how to adapt to this, and that, to me, shows a lack of vision.
We’re not talking enough about what could be. And we’re not doing enough to help people shift to a new mode. The reality is that this is all changing fast. You just simply cannot sit on your hands; if you take six to eight months to figure something out, then the next thing is going to eclipse that. So, you have to give over some control and just ride the wave.
There are so many places where AI meets the academic integrity piece of education, which so much of the institutional model is really built around. What do you think AI does in terms of integrity in the learning process, from both sides, to help or hurt students and faculty or teachers?
When students do not take on the cognitive load – trying to reason through something, processing it, making mistakes, and making corrections – then they don’t learn. AI can absolutely undermine the rigor of the learning process, and the main person who is hurt by that is the student. They came here to do something, and even if they get that little credential, they may not get hired or stay hired, because, at some level, the lack of skill shines through.
For the sake of students, we need to figure out how to create a learning experience where they can’t offload the cognitive process to ChatGPT. At Labs, we see how AI gives a chance for us to create those kinds of immersive experiences at scale and across a range of disciplines in an affordable way. We’ve always known that immersive engagement is critical for learning. It was just far too costly to do that across all these systems, and they’re hard things to design. But if we can use the support of generative AI to help us build those experiences, then we’ve made some progress for the students.
What then is the most promising use of AI? And is it in rethinking that immersive experience, is it in course development, or another area?
We can create a whole learning system, a learning system that includes the curriculum, the instructional delivery, the student support, and the assessment. We can create that to be responsive to every individual student. Can we do it right now? Not so much, but we’re getting there. That’s the promise we’ve got. We can do assessments that are observational, continuous, and authentic. We can provide engaging instructional experiences. We can provide students with continuous support that knows them and can be responsive to them. And we can make the whole system adaptive, so that when a student is struggling with one thing, AI can adjust the plan and focus on it some more.
One of the biggest challenges has always been that the systems between K-12, post-secondary, and workforce don’t align – they don’t work together. How could AI, and its reshaping of post-secondary, help to break down those silos and better connect the systems so that the learning feels relevant and pathways are clearer?
We’ve been thinking about that at WGU Labs, too. What are the data science and machine learning capabilities that are going to make it more possible to take real-time data about the labor market – about employment and opportunities – and then provide information to individuals, potential employees, and people who are looking for jobs? We’re going to be able to get that information more readily, make it available in real time, and translate it into the skills needed to access a sector more rapidly, more dynamically.
If we build a Gen AI agent that is able to help individuals, workforce agencies or learning providers to interpret those data, then we’ll be able to adapt our programs much more readily to have the education that people need to get the jobs that are available right now. Individuals will be able to identify what they need to do and where they need to go to get the skills to access those jobs.
If others in the field are reading this, what would you like them to consider as they think about AI’s application in education?
I think it’s that lack of vision, the mainstream narrative of fatalism with AI. I read things like “this is happening to us, and we have no control here.” I know that these big companies that we have no influence over are creating these AI systems, but we do have the opportunity to shape how we use those systems to design learning opportunities.
I don’t want to give over control of what the learning experience is to large tech companies. We as educators should own that and shouldn’t give up. These are tools. We can use them. And we can shape them, too.
Dr. Betheny Gross oversees Labs’ research initiatives, including evaluations of education technology and higher education policy to improve student access and success. Over the last 20 years, she has examined K-12 district and state policies to increase students’ access to high-quality learning. Her recent research draws on principles of equity by design and examines systemic strategies to address the opportunity gap in learning by providing underserved families with the informational, financial, and navigational support to access rich learning opportunities in and out of school. Her work has been published in several journals, including Harvard Law and Policy, Educational Evaluation and Policy Analysis, American Education Research Journal, Journal of Education Finance, and Journal of Policy Analysis and Management.
Get in touch
We’d love to talk with you about how your organization is innovating and adapting for the future of education, and how we may be able to help.
