Artificial intelligence use has been associated with everything from fear of judgment and loneliness to misogyny and illiteracy — a baffling array of outcomes that's often alarming, but defies easy categorization.
Now the plot thickens. In a new study published in the journal BMC Psychology, South Korean scientists surveyed 504 college-level Chinese art students and found that the ones who exhibited higher rates of narcissism, psychopathy, and Machiavellianism were more likely to rely on ChatGPT and the AI art generator Midjourney than their better-balanced peers.
The paper, by psychology researchers Jinyi Song of South Korea's Chodang University and Shuyan Liu of Baekseok University, framed AI use among art school students as akin to academic misconduct behaviors like cheating, lying, and plagiarism. Those behaviors, the researchers explained, are also associated with the aforementioned "dark" personality traits, which are drawn from the "Dark Triad" model used to assess negative personality characteristics.
Drawing from six art-focused universities in Sichuan province that represented a diverse set of disciplines, including visual art, music, drama, and dance, the researchers found that students who scored higher for dark personality traits were more likely to try to pass AI-generated work off as their own — a major problem in the world at large, and especially so in the arts and academia.
Those same students who scored highly on the "Dark Triad" questions were, as the paper explains, also more anxious about their academic performance and more likely to procrastinate on assignments, which led to greater reliance on AI tools for their schoolwork. We've seen similar trends surrounding student procrastination and AI in the past as well.
Along with measuring the traditional "Dark Triad" traits, the researchers also asked survey questions about how materialistic the cohort was. As they found, those who scored higher for materialism, or for whom external rewards and praise were a motivating factor, were also more likely to use AI to achieve those ends.
The authors of the paper suggest that colleges and universities should redesign curricula so that assignments are "less susceptible to plagiarism" and AI mimicry. The researchers added, as others before them have also suggested, that schools should figure out better ways to teach students about the "associated hazards and ethical quandaries" surrounding AI, which will hopefully help them realize that using the nascent technology as a shortcut or crutch is counterproductive to education.
More on AI in schools: College Students Are Sprinkling Typos Into Their AI Papers on Purpose