
While the recent wave of white-collar layoffs may have spooked employees and job seekers, Robert Seamans, a professor at New York University’s Stern School of Business, says his current MBA students aren’t worried.
“I don’t really get the sense that they’re afraid of the job market or think there’s going to be dramatic change,” Seamans told a gathering of technology executives at the CNBC Technology Executive Council Summit in New York City last week. “These students are already participating in the job market before they return to school and are used to the ups and downs of being in the workforce.”
Seamans said his focus is on ensuring students in the classroom have the skills they will need upon graduation, including generative AI and machine learning more generally. Performing experiments with AI in a group setting is one way to familiarize students with the technology, its benefits, and limitations.
For example, he recently asked his students to write a short essay on whether a mandatory return to the office is good or bad for the workforce. He then had them use a large language model of their choice to strengthen their argument. In the second part of the assignment, students wrote another paper, but this time they were asked to respond to the LLM adversarially with critical feedback. This is what Seamans calls the "black sheep" approach.
"We're trying to get people to understand that they can interact with AI in many different ways," Seamans said. He added that in these experiments, many students preferred the adversarial method, likely because it more closely models the variety of opinions and ideas found in a real workplace. "We don't know what the best practices are yet, but that's why we want you to keep trying different things," he said.
AI “as a tool, not a crutch”
Earlier in the day at the TEC Summit, a group of high school and college students spoke with CNBC's Contessa Brewer about how they're being exposed to AI (or not) in the classroom. Their responses showed that despite students' enthusiasm for the technology, schools are advising them to take it slow.
Aarnav Sathish, a high school senior, said his teachers strictly prohibit the use of AI in the classroom. But outside of school, the 17-year-old uses ChatGPT to help with homework assignments, and he is quick to add that he wants to use AI "as a tool, not a crutch."
Edin Okonkwo, 19, an undergraduate at Columbia University, said her professors also discourage students from using AI, preferring that they develop the subject-matter skills each class is meant to teach. But like Sathish, she uses AI outside the classroom for tasks she finds repetitive or already knows how to do. "If I have to write a bunch of emails that all feel the same, I use it to sound a little different," Okonkwo said. "If you don't already know how to code in that coding language, don't use it to code."
Brothers Carson and Andrew Boyer both attend Georgia Tech, but they have had different experiences with AI. Carson, 19, a freshman studying engineering, said his professors are reasonably accepting of AI. He finds it most useful in his Chinese classes, where he uses ChatGPT to practice conversation. "It's like having a Chinese tutor," he said.
Andrew, a 21-year-old fourth-year student, said his professors encourage the use of AI but "don't want people to copy and paste their work." He was recently surprised when a professor allowed students to use the internet and AI during a midterm exam in his information security class. At first that seemed like a good thing, he said, but he soon realized the professor had designed the exam so that AI couldn't answer many of the more nuanced, visual questions.
“I think the average grade in the class was about 60 points,” Andrew said. “At Georgia Tech, we are evolving and leveling up our research toward more advanced concepts that we need to understand and be able to implement ourselves.”
If there's one thing New York University's Seamans wants tech executives to keep in mind about the young people joining their organizations, it's that despite the focus on AI, employers are ultimately dealing with human beings who need empathy and understanding.
"Everyone who joins your company brings a unique set of human skills," he said. "Some people are great public speakers or group leaders, and some people are great in finance. What they want is the chance to work with this technology and be a contributing member of whatever team they're on. AI is going to change, and what they really want is a workforce of engaged minds and a workplace where this kind of thinking is encouraged."
