By Remi Kalir and Aria Chernik

A new research study finds frequent generative AI use among Duke students, along with strong opinions about trustworthiness and calls for educational guidance.
“If I use AI to guide me in problem solving for math or physics, I have to double check the work and correct the AI,” one student responded. “I use it mostly as a guide or a starting point, not to give me finished answers to skip my work.”
Researchers with Duke’s Center for Applied Research and Design in Transformative Education (CARADITE) found that students routinely use AI as a writing assistant, a problem-solving tool, and a thought partner. The study, “Gauging Duke Students’ Perspectives on Generative AI,” offers a timely look at the complex and evolving relationship between education and AI, revealing how AI is increasingly embedded, but not uncritically accepted, in students’ academic and everyday lives.
Co-designed by CARADITE undergraduate research assistants Emma Ren (T’27) and Barron Brothers (T’26), the study collected data throughout April 2025 and analyzed responses from 104 undergraduate and graduate students representing more than 40 fields of study. It is the first IRB-approved, large-scale survey examining Duke students’ use of and opinions about AI.
CARADITE recently published initial findings and related pedagogical connections to coincide with the start of the new academic year, when all Duke undergraduate students will have no-cost access to ChatGPT. The study’s findings are especially relevant for faculty, staff, and administrators seeking to better understand and support students during this technological shift.

Frequent AI Use in Students’ Academic and Everyday Lives
CARADITE’s findings show that AI use among students is frequent and varied across educational and personal contexts. Nearly 70% of students reported using AI as a thought partner at least once a week to help explain complex topics, and close to half (48%) use AI for problem-solving in science and math. Over half (52%) use AI weekly to paraphrase or summarize their writing, and a similar share (50%) rely on AI for feedback on their academic writing assignments.
Furthermore, 42% of surveyed students reported using AI to help complete writing assignments at least once per week. As one student noted, “While I worry that I am losing my natural ability to write, I feel like using AI has improved my writing and made it more concise and I finish work much faster with it relieving a lot of stress.”
As for other frequent uses, about half of surveyed students regularly use AI to summarize documents (49%) and troubleshoot technical issues (46%), and 40% report using AI weekly for search and for help finding academic sources.
Overall, student AI use emerged as intentional and complex, with findings suggesting considerable variation in how and why students turn to AI. One student commented, “I use AI to write sentences or paragraphs for me sometimes and I’m afraid of plagiarism so I never cite it,” while another noted, “I like it most and feel most comfortable using it (ethically speaking) when consulting AI for feedback on something I have already written or assisting me in coding.”

Student Opinions about Learning, Limits, and Trustworthiness
Alongside high rates of use, Duke students expressed both confidence in and skepticism about the capabilities of AI, particularly when it comes to accuracy, trustworthiness, and educational value.
An overwhelming 94% of students agreed that AI responses are not equally accurate across disciplinary contexts, and 75% said AI often provides inaccurate answers to prompts. Students also noted that AI’s simplified responses can hinder deeper understanding, with 62% reporting that AI tends to oversimplify content. “It is inaccurate,” one student shared, “and I am worried about the consequences.”
However, a majority of students also agreed that AI is trustworthy when assisting with academic writing and when used as a thought partner. Nearly three-quarters of students (73%) believe AI responses are trustworthy when paraphrasing or summarizing writing, and 57% agreed that AI can be trusted to provide feedback on writing assignments. Roughly six in ten respondents agreed that AI is trustworthy when drafting professional correspondence (65%), coding (61%), and explaining complex topics or concepts (61%).
In written comments, students requested that Duke faculty and administrators provide clear guidelines about the responsible use of AI. “I would ask them to define clear guidelines in the beginning of class about what exactly is considered ‘AI generated content,’” one student shared. “As an educator,” another student wrote, “it is your responsibility to educate your students about the ethics of AI, as well as its unreliability.”
Despite these concerns and requests, most Duke students perceive AI as a lasting and transformative influence on their higher education experience. Eight out of ten students (80%) believe AI will enable more personalized learning within five years, and over half (56%) said AI will play a significant role in shaping the industries they plan to enter. Ultimately, many students offered measured optimism about how AI might productively complement their educational experience at Duke. “I’d be interested in discussing ethical ways to use it,” a student commented, “if they [faculty] think there are applications of AI in their course that are not cheating, that contribute to learning rather than take away from it.”
For questions or comments about this study, please contact CARADITE Faculty Director Aria Chernik or Associate Director Remi Kalir at caradite@duke.edu.