As AI tools rapidly make their way into classrooms and workflows, engineering educators are grappling with how to use them without losing the fundamentals. EE World interviewed Robert W. Heath Jr., 2025 IEEE/RSE James Clerk Maxwell Medal recipient and Charles Lee Powell Chair in Wireless Communications at the University of California, San Diego, who shares a candid perspective on where AI helps, where it hinders, and what it really means to educate and be an engineer in the age of intelligent tools.
EE World: How do you balance traditional engineering skills with AI and data-driven approaches?
Robert: In courses, we are seeing that students are increasingly relying on AI tools to solve problems that are the foundation of traditional engineering skills. Unfortunately, when they don’t have access to these tools, they seem to struggle with those same problems. It seems that many students are using AI as a shortcut to complete an assignment rather than as a tool to help them learn the material.
EE World: In your view, should AI be taught as a standalone subject or integrated across existing engineering disciplines?
Robert: There should be courses on the fundamentals of AI made available to a broad set of engineering disciplines. This includes the principles of machine learning, both shallow and deep, the use of large language models, and the mathematical foundations of AI, like signal processing, linear algebra, and probability. There is also room for courses that make use of data-driven tools to solve problems and may use those approaches to augment more traditional types of engineering.
EE World: How do you define what it means to be an engineer in the age of AI?
Robert: An engineer uses mathematics, physics, and other scientific disciplines to solve problems. AI is another tool that can be applied to solve problems or to help build tools that can solve problems. An engineer should be able to validate the solution, even if obtained using AI, to establish why that solution meets all of the specifications and design requirements.
EE World: Are there ethical or professional identity concerns about students overly relying on AI tools in their problem-solving? Are we headed toward an age of the virtual engineer (AI) doing the grunt work, so the physical engineers (people) have more opportunities?
Robert: The first ethical concern is whether the use of AI tools is properly acknowledged in any assignment. Many faculty ask students to provide the code used to generate and plot a result, or the names of collaborators, for a homework assignment. This is an issue of transparency. I would similarly ask that students provide the prompts, and details of how they evaluated the output, for whatever AI tools they employ.
The second question is more subtle. The promise now is that AI does the grunt work and the engineers do more of the higher-level work. But I think that this will just translate into a new definition of grunt work. There will always be more desirable and less desirable work.
EE World: How do we prepare students to critically evaluate the output of AI tools, rather than blindly trust them?
Robert: We need to figure out how to incorporate AI tools to complement traditional engineering teaching. The hope is that we can teach critical evaluation as part of traditional problem-solving.
EE World: Should AI instruction be centralized in one department (e.g., computer science) or be dispersed across all engineering departments?
Robert: The fundamentals of AI naturally fall in the departments of electrical and computer engineering, telecommunications, and computer science, where they originally developed. These are the natural homes for teaching courses that talk about the development of these tools from a technical perspective. The broad use of these tools by society can be handled separately in different departments.
EE World: Do you utilize AI? How do you keep up with AI instruction?
Robert: I read several newsletters and experiment with AI tools frequently in my work.
EE World: How have colleagues responded to AI-related content — are they excited, intimidated, skeptical? How does that compare to hires fresh out of school?
Robert: Our main experiences, so far, are with students who are using AI (especially LLMs) to help with their assignments. In the last two years, there has been a dramatic uptick in the ability of AI to solve the mathematical problems found in many parts of electrical engineering, including circuits and signal processing, among other areas. At the same time, we have seen a decline in core mathematical skills among the students enrolling in these courses, said to be a legacy of COVID. Now we find that students without the core mathematical skills are relying on AI to solve their homework problems. Unfortunately, they don’t seem to use AI to learn the material, and exam success rates are dropping.
EE World: What advice would you give to engineering departments just beginning to explore AI integration?
Robert: It is too early for AI integration beyond an experimental level. We do not yet have enough data to put together best practices that will guide us on the right ways to incorporate AI.
EE World: Do you think AI will eventually change ABET accreditation standards or licensure expectations?
Robert: Not in a direct way. ABET accreditation standards have evolved over the years and were no doubt influenced by technical developments. For example, there was a transition in the 1970s from slide rules to calculators, but to my knowledge that did not directly drive a change to ABET requirements.