Last week I had the opportunity to participate in Old Dominion University’s Fall Industrial Advisory Board meeting for the Engineering Technology program. It was more than a meeting; it was a glimpse into the future of education and the workforce. As we talked about the curriculum, one question dominated the conversation: What does AI mean for students entering a world where machines can write, analyze and even build?
My answer is that AI, education and the workforce are no longer separate conversations. They are a single continuum. Students need to think about AI as more than a tool or a subject; it’s the medium through which their work will be delivered. When you’re learning how to design a circuit, program a PLC or model a supply chain, you’re also learning how to collaborate with an agent that thinks differently than you do.
## A conversation at Old Dominion University
At ODU we discussed how the traditional model of education—four years of theory followed by a lifetime of practice—no longer fits. Technology cycles move too quickly. The skills you learn as a freshman may be obsolete by the time you graduate. Instead, education must become a partnership between students, faculty and industry that lasts for decades. In our board meeting we broke down what that partnership looks like:
* **Continuous learning:** AI isn’t a field you enter once; it’s an evolving environment. Students should leave campus with the mindset that they’ll be reskilling and upskilling throughout their careers.
* **Applied experience:** Classroom theory must be complemented by hands‑on projects that use AI to solve real problems. Internships and co‑ops that integrate AI tools into workflows teach students to translate academic concepts into operational value.
* **Cross‑disciplinary collaboration:** AI is not just for computer science majors. Mechanical, electrical, civil and logistics engineers all need to understand how to frame questions for models, interpret results and guide agents toward ethical outcomes.
* **Ethics and context:** AI can only be as good as the data and context we give it. Students must learn how to evaluate the quality of data and the biases that may be present, and how to use AI responsibly in a way that enhances human judgment rather than replaces it.
## Bridging education and the workforce
The AI market doesn’t wait for anyone. Companies across logistics, manufacturing and construction are already embedding AI into their systems to automate routine tasks and to assist with complex planning. The graduates we’re training now will walk into environments where AI is expected to handle scheduling, quality control and predictive maintenance. The question isn’t whether AI will be part of their job; it’s whether they are prepared to lead that transformation.
Employers want people who can ask the right questions of AI systems, understand the difference between correlation and causation, and make decisions based on AI‑generated insights. That’s why we’re working with ODU to develop capstone projects that pair students with local companies to solve live problems using AI agents. When students iterate with real data and see the impact of their work on a shop floor or warehouse, the lessons stick.
## The role of universities and industry
Universities shouldn’t try to forecast every technology trend. Instead they should build a curriculum that teaches students how to learn, how to think critically and how to work in teams that span disciplines. Industry partners need to engage in that process by sharing their needs and by giving students access to systems and mentors. At the board meeting I heard from employers who were desperate for graduates who understand both the physics of a system and the dynamics of AI. They don’t want someone who can recite textbook definitions; they want someone who can engineer a solution with a data‑driven partner by their side.
## Looking ahead
AI in education isn’t about replacing human teachers or eliminating the liberal arts. It’s about augmenting our capabilities so we can solve bigger problems. Students will still need to learn calculus, thermodynamics and statics, but they’ll also need to learn how to prompt a language model, interpret an anomaly detection plot and evaluate the impact of an algorithmic decision on safety and fairness. That’s a tall order. It means that universities and industry must work together like never before.
The future workforce will belong to people who can bridge the gap between human understanding and machine intelligence. Every student needs to know that now.