How Duke School of Nursing Is Leading the Way in AI-Enhanced Teaching

What does it look like when instructors lean into the possibilities of generative AI—not as a challenge to be overcome, but as a tool for deepening student engagement and preparing learners for their future workplace?

At the Duke School of Nursing, some members of the faculty and staff are not just asking this question—they’re answering it through experimentation and collaboration. What follows are three powerful examples of how AI is being used to transform learning, support student success, and model what it means to be bold, ethical, and curious in an evolving digital landscape.

AI-Powered Student Support

When Tina Johnson, Manager of Educational Technology, School of Nursing, saw students struggling to navigate online courses, she didn’t wait for a silver bullet—she built a chatbot.

Using Microsoft’s Copilot Studio, she developed a generative AI assistant trained on a course’s Canvas content, offering students 24/7 assistance embedded directly on the course’s homepage. Prompting tips like “ask one question at a time” and “use clinical reasoning language” help students get useful, course-specific answers while reinforcing critical thinking.

Her approach was as thoughtful as it was technical—she organized course content carefully and built in transparency for students about the tool. The result: fewer emails to instructors, less confusion, and a stronger sense of guidance in an online learning environment.

Learning and Leading with AI

For Assistant Clinical Professor Elaine Kauschinger, generative AI represents more than a tool—it’s a shift in mindset. Inspired by Gartner Research, conversations with Duke colleagues, and discussions at conferences, she introduces her students to AI through a holistic, ethical lens. Already grounded in virtual patient simulations, her pilot program has expanded to include text-to-video tools (like Vidnoz), custom GPTs embedded in her courses, and prompt engineering lessons.

Kauschinger teaches her students how to use and build with AI through platforms like ChatGPT and M365 Copilot. Her students complete an assignment in which they create their own custom GPTs tailored to the course content and their own clinical learning goals. These AI models then serve as personalized, adaptive “study buddies,” helping students navigate the overwhelming volume of material they’re expected to master. By designing AI tools that meet their specific needs, students don’t just consume content; they also develop a deep understanding of the technology increasingly shaping the nursing profession.

The decision to bring AI into the classroom raised hard questions for Kauschinger. When weighing whether to train bots on copyrighted textbooks, she invited librarians and copyright experts into the conversation. Ultimately, she decided it was unethical to train her chatbot on the ebook used in her class, so she trained it on her own notes, lectures, and materials instead.

Her next ventures? In the short-term, she plans to create her own Socratic chatbot powered by experiential learning theory that will help students sharpen clinical judgment through dialogue. Looking even further ahead, she envisions students building multimodal AI agents that integrate voice, text, and images to support clinical reasoning. She imagines a tool like this taking the classic EKG—once a static image requiring manual interpretation—and transforming it with AI into a dynamic learning partner capable of synthesizing and analyzing results in real time.

“Stay curious,” she advises other faculty. “If you aren’t getting things wrong, you’re not being creative.” She acknowledges that “we are all nervous,” but urges colleagues to “stay curious through this grand new learning experience.”

AI as a Tutor and a Colleague

Clinical Professor Michael Zychowicz is exploring AI in nearly every facet of his teaching—from grading short-answer assessments to preparing for course accreditation reviews. He has incorporated the Canvas-based chatbot developed by Johnson into his courses and is currently refining it for next semester to better align with adult learning theory and Socratic questioning techniques. His goal is to enhance the bot’s effectiveness as a student tutor and individualized course guide, enabling it to prompt students with questions like, “What topic would you like to study today?” 

Zychowicz also envisions a more immersive AI future. As a consultant for MedVR Education, he’s piloting avatar-patient simulations where students engage in lifelike conversations with “patients” through AR, VR, and flat-screen platforms. He’s already created a GPT-based agent that simulates patient dialogue—a tool in beta with exciting implications for low-stakes clinical simulation. Students specify a body system, and the AI generates a random diagnosis and a complete patient profile. They talk with the AI patient to gather a health history, and the AI then leads an interactive discussion on differential diagnoses, treatment plans, and patient education, providing rationale and feedback for each. Tools like this allow students to practice skills like history taking and diagnostic reasoning.
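For readers curious how such an agent might be scaffolded, the workflow above—pick a body system, have the model invent a hidden diagnosis and patient profile, interview the “patient,” then debrief—maps naturally onto a system prompt for a chat-based model. The sketch below is purely illustrative: the prompt wording, function name, and “DONE” trigger are assumptions for demonstration, not details of the actual Duke tool.

```python
# Illustrative sketch (not the actual Duke agent): scaffolding the opening
# of a simulated-patient conversation as a chat-message list.

def build_patient_sim_messages(body_system: str) -> list[dict]:
    """Build the starting message list for an AI-patient simulation."""
    system_prompt = (
        "You are a standardized patient for nurse practitioner training.\n"
        f"1. Silently choose a random diagnosis involving the {body_system} "
        "system and invent a consistent patient profile (age, history, "
        "medications, presenting complaint).\n"
        "2. Answer the student's history-taking questions in character; do "
        "not reveal the diagnosis.\n"
        "3. When the student types 'DONE', step out of character and lead a "
        "discussion of differential diagnoses, treatment plan, and patient "
        "education, giving rationale and feedback for each."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user",
         "content": "Hello, I'm the student nurse. What brings you in today?"},
    ]

messages = build_patient_sim_messages("cardiovascular")
# These messages would then be sent to any chat-completion API, with each
# reply appended to the list as the student interviews the "patient".
```

Keeping the diagnosis hidden in the system prompt is the key design choice: it forces students to elicit information through questioning, mirroring a real intake interview.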

In the first week of class, he opens a conversation about AI—not just to lay ground rules, but also to get students genuinely excited about its potential. Knowing they’ll encounter AI in their professional lives as nurses, he encourages them to use it, but thoughtfully: as a brainstorming partner or a way to refine and organize their ideas, not as a replacement for their own thinking. He also invites students to imagine the future of AI in healthcare, helping them develop curiosity and critical awareness.

His advice to his peers: “Don’t get frustrated. There will be a learning curve. AI will not always give you the answer you want – using it is an iterative process.” He goes on to say, “Be bold. Push the limits of what you think it can do.”

Human Values at the Center

Despite the buzz around AI tools and the tactics for using them, the faculty and staff members from Duke’s School of Nursing featured here share a common commitment to human-centered learning. Each emphasizes the need for transparency, ethical awareness, and curiosity in the brave new world of AI. They don’t just deploy AI—they teach students how and why to use it, grounded in real-world considerations like HIPAA, FERPA, copyright, and responsible data use.

Perhaps most importantly, in the classroom, they model what it means to be learners themselves by remaining open to experimentation, being reflective in their practice, and facing uncertainty with creativity and care.


Interested in exploring AI in your own teaching?
LILE is here to support you. Whether you’re curious about course-integrated chatbots, want help developing AI assignments, or just want to talk through the ethics of AI in the classroom, please reach out to us at lile@duke.edu! You can also find resources from Duke on teaching with AI, including LILE’s teaching guides on the AI at Duke website.