One of the questions I've seen and heard asked of late, both explicitly and implicitly, is who is going to teach the general undergraduate student population how to use AI. Given the recent Cengage Group report that the majority of recent graduates wish they had been trained on how to use Generative AI, this is a skill colleges and universities will want to incorporate into the curriculum.
Remember: We’re looking at a general student population — not future coders. The world's departments of Computer Science are already working on that problem while grappling with the fact that their colleagues have created algorithms that can do much of what they are teaching their students to do.
Much, but not all.
So here’s what we need our students to learn: They need to learn how to consider a problem deeply and think through its issues. Then, they need to take what they have considered and use it to frame a prompt — a well-defined request accompanied by specific constraints that instruct the Large Language Model how to respond.
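To make that concrete, here is one minimal sketch, in Python, of what pairing a well-defined request with explicit constraints might look like. The `build_prompt` helper and the sample constraints are illustrative assumptions, not a prescribed template or any particular tool's API:

```python
def build_prompt(request: str, constraints: list[str]) -> str:
    """Combine a clearly stated request with numbered constraints
    that tell the model how to shape its response."""
    lines = [request, "", "Follow these constraints:"]
    lines += [f"{i}. {c}" for i, c in enumerate(constraints, start=1)]
    return "\n".join(lines)

# The thinking happens before the typing: the student has already
# decided what they want and how the answer should be bounded.
prompt = build_prompt(
    request="Summarize the main argument of the attached article.",
    constraints=[
        "Limit the summary to 150 words.",
        "Quote at most one sentence verbatim.",
        "List two questions the article leaves unanswered.",
    ],
)
print(prompt)
```

The structure, not the syntax, is the lesson: a vague request invites a vague answer, while constraints force the student to decide in advance what a good response looks like.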
This is what every research methods course — whether specific to a major or embedded in the Freshman Composition sequence — tries to teach its students to do.
We are not looking at a significant search for new personnel or at long-term retraining of those already in place. They already have the skills.
They need help reimagining them.
To facilitate this reimagining, what faculty in these areas need is some basic support and training on how to incorporate Generative AI tools and tasks into their curriculum — so they can move past the plagiarism question and begin to see this as an opportunity to finally get students to understand why they have to take their Composition or Methods class.
Administrators will have to figure out how to put the tools in faculty hands, provide the training they need, and better reward them for imparting the high-tech, business-ready skills that the AI revolution is demonstrating they provide.