Detailed Guidelines for Faculty Use of Generative AI:
Integrating AI in Medical and Health Sciences Education

We recognize the transformative potential of Artificial Intelligence (AI) to enhance medical and health sciences education. These guidelines aim to provide faculty with a framework for the responsible and effective integration of AI into teaching practices, while upholding our commitment to ethical conduct, student-centered learning, and academic excellence. These guidelines are not exhaustive and will be updated periodically as technology evolves. 

I. Core Principles 

  • Human-Centered Approach: AI should augment, not replace, the essential role of faculty in fostering critical thinking, clinical reasoning, and the development of professional identity in our students. 
  • Ethical and Transparent Use: AI tools should be used in a manner that is transparent, fair, and accountable. Faculty should be aware of the limitations and potential biases of AI and communicate these to students. 
  • Student-Centered Learning: The primary goal of integrating AI is to enhance student learning outcomes and engagement. AI should be used to personalize learning experiences, provide timely feedback, and promote active learning. 
  • Faculty Development: The institution will provide ongoing support and resources to faculty to develop the necessary skills and knowledge to effectively integrate AI into their teaching. 
  • Rigorous Evaluation: The effectiveness of AI in medical education will be continuously evaluated to ensure that it is meeting its intended goals and improving the quality of education. 
  • Data Privacy and Security: All use of AI must comply with relevant data privacy regulations (e.g., HIPAA) and institutional policies. Student data should be handled with the utmost care and used only for educational purposes. 
  • Academic Integrity: The use of AI should promote academic integrity. Faculty should clearly define the appropriate use of AI in assignments and assessments and take steps to detect and prevent misuse. 

II. Ethical Use and Academic Integrity 

  • Upholding Academic Integrity: 
    • Faculty must model and explicitly teach students about the ethical boundaries of using AI in academic work and future medical practice. This includes emphasizing that AI is a tool to augment, not replace, critical thinking, independent work, and professional judgment. 
    • Practical Examples:  
      •  Case Studies: Present scenarios where inappropriate AI use could lead to ethical breaches (e.g., using AI to diagnose without understanding the underlying principles, relying solely on AI-generated summaries of patient histories without critical review). 
      • Assignment Design: Design assessments that require higher-order thinking skills (analysis, synthesis, evaluation) that are difficult for current AI models to replicate without significant human input. 
      • Plagiarism Discussions: Explicitly discuss what constitutes AI-assisted plagiarism in the context of medical education (e.g., submitting AI-generated case write-ups as one's own). 

III. Distinguishing Human and AI-Generated Content 

  • Transparency is paramount. All teaching materials, assignments, and assessments that incorporate AI-generated content must be clearly labeled. This ensures students understand the source of the information and can critically evaluate it. 
  • Practical Examples:  
    • Lecture Slides: Clearly indicate if AI was used to generate any text, images, or diagrams within lecture slides (e.g., "Image generated with AI assistance"). 
    • Assessment Instructions: If AI tools are permitted or used in any part of an assessment (e.g., AI-assisted literature review), provide clear guidelines on how to cite the AI and distinguish its contributions from the student's own work (a sample citation format appears after this list). 
    • Feedback on Student Work: If AI is used to provide initial feedback on student submissions, inform the students that this is the case and emphasize that faculty review and final grading will be human-led. 
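
    A sample citation convention, adapted from common APA-style guidance for generative AI (adjust to the course's preferred citation style; the in-text sentence is illustrative):

      Reference entry: Developer. (Year). Tool name (Version) [Large language model]. URL
      In-text example: "An initial differential was brainstormed with [tool] (Developer, Year) and then independently verified against the cited literature."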

IV. Transparency and Disclosure 

  • Informing Students about AI Use: 
    • At the beginning of a course or when introducing an AI tool, faculty should clearly explain how the tool will be used, its purpose, its limitations, and any expectations for student interaction with it. 
    • Practical Examples:  
      • Course Syllabus: Include a section on AI tools used in the course, outlining their function and the rationale for their use (sample language follows this list). 
      • In-Class Announcements: Before using an AI tool in a lecture (e.g., for real-time polling analysis or generating visual aids), explain its function and how the results will be interpreted. 
      • Assignment Guidelines: If students are permitted to use AI for specific tasks (e.g., brainstorming, summarizing), provide clear instructions and limitations. 
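
      Sample syllabus language (illustrative only; adapt to course policy and the tools actually approved): "This course uses approved generative AI tools to draft practice questions and provide first-pass feedback on written work. All AI output is reviewed by faculty before it affects instruction or grades. You may use approved AI tools for brainstorming and summarizing unless an assignment states otherwise; all submitted work must be your own, and any AI assistance must be cited as described in the assignment guidelines."
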
  • Faculty may use a consistent symbolic notation to flag AI use within written assignments and activities (for example, an asterisk, a footnote marker, or a bracketed tag such as "[AI-assisted]"); whichever convention is chosen should be defined for students and applied consistently.

  • Disclosing AI-Generated Content in Materials: 
    • Beyond simply stating that AI was used, faculty should, where appropriate, provide context about the extent and nature of AI involvement in generating specific content. 
    • Practical Examples:  
      • Developing Case Studies: If an AI was used to generate initial drafts of patient scenarios, faculty should mention this and highlight the human oversight in refining the case for clinical accuracy and pedagogical value. 
      • Creating Quiz Questions: If AI was used to generate a pool of potential questions, faculty should assure students that these questions were reviewed and validated by subject matter experts. 

V. Skill Development and Training 

  • Faculty Training on Responsible AI Use: 
    • Institutions should invest in comprehensive training programs that go beyond basic tool usage. These programs should focus on pedagogical implications, ethical considerations, bias detection, and effective integration strategies for AI in medical education. 
    • Practical Examples:  
      • Workshops: Offer workshops on topics like "Designing AI-Integrated Assessments," "Ethical Considerations of AI in Medical Pedagogy," and "Evaluating the Accuracy and Bias of AI-Generated Medical Information." 
      • Online Modules: Develop self-paced online modules covering the fundamentals of AI, its applications in medical education, and best practices for responsible use. 
      • Peer Mentorship: Pair faculty who are experienced with AI integration with those who are new to it. 

VI. Equipping Students with AI Literacy 

  • Medical students need to develop critical AI literacy: an understanding of how AI works, its potential benefits and limitations in healthcare, and the ethical considerations surrounding its use in clinical practice. 
  • Practical Examples:  
    • Dedicated Curriculum Modules: Integrate modules on AI in medicine, covering topics like machine learning basics, AI applications in diagnosis and treatment, ethical implications, and data privacy. 
    • Critical Appraisal Exercises: Train students to critically evaluate AI-generated medical information and research findings. 
    • Simulated AI Interactions: Use simulations where students interact with AI diagnostic tools or decision support systems, followed by discussions on the AI's strengths, weaknesses, and potential biases. 

VII. Review and Updates 

These guidelines will be reviewed and updated periodically as AI technology evolves and new best practices emerge. Faculty are responsible for staying informed of any changes. 

VIII. Expanded Ideas for Consideration

Personalized Learning and Intelligent Tutoring:

AI for Personalized Learning: 

  • Detail: AI can analyze student performance data to identify learning gaps and tailor educational content and pacing to individual needs. However, faculty oversight is crucial to ensure that personalization aligns with learning objectives and doesn't inadvertently narrow the curriculum. 
  • Practical Examples:  
    • Adaptive Quizzes: Implement quizzes that adjust difficulty based on student responses, providing targeted remediation for areas of weakness (see the sketch after this list). 
    • Personalized Resource Recommendations: AI systems can suggest relevant articles, videos, or learning modules based on a student's performance in specific topics. 
    • Learning Path Customization: Allow students to progress through certain modules at their own pace, with AI providing support and resources as needed. 
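
A minimal Python sketch of the adaptive-quiz idea above. The question bank, difficulty tiers, and step-up/step-down rule are illustrative assumptions, not a prescribed implementation; a production system would draw on a validated item bank and a psychometric model.

    import random

    # Hypothetical question bank keyed by difficulty tier (content is illustrative).
    QUESTION_BANK = {
        "easy":   [("Which organ produces insulin?", "pancreas")],
        "medium": [("Which pancreatic cells secrete insulin?", "beta cells")],
        "hard":   [("Which pathway mediates insulin-stimulated GLUT4 translocation?", "pi3k/akt")],
    }
    TIERS = ["easy", "medium", "hard"]

    def next_difficulty(current: str, was_correct: bool) -> str:
        """Step up one tier after a correct answer, down one after a miss."""
        i = TIERS.index(current)
        i = min(i + 1, len(TIERS) - 1) if was_correct else max(i - 1, 0)
        return TIERS[i]

    def run_quiz(n_questions: int = 5) -> None:
        difficulty = "medium"
        for _ in range(n_questions):
            question, answer = random.choice(QUESTION_BANK[difficulty])
            response = input(f"[{difficulty}] {question} ")
            correct = response.strip().lower() == answer
            print("Correct." if correct else f"Expected: {answer}")
            difficulty = next_difficulty(difficulty, correct)

    if __name__ == "__main__":
        run_quiz()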

Intelligent Tutoring Systems: 

  • Detail: Intelligent tutoring systems can provide immediate, targeted feedback on student work, helping them identify and correct errors in real-time. Faculty should carefully select and evaluate these systems to ensure they align with learning objectives and provide pedagogically sound feedback. 
  • Practical Examples:  
    • Anatomy Dissection Simulations: AI-powered simulations can provide immediate feedback on the accuracy of student identifications and dissections. 
    • Clinical Reasoning Exercises: Intelligent tutoring systems can guide students through diagnostic processes, offering feedback on their choices and reasoning.

Automated Grading and Assessment:

AI for Automated Grading: 

  • Detail: AI can efficiently grade objective assessments (e.g., multiple-choice questions) and provide initial feedback on more complex tasks (e.g., short essays, coding assignments). This frees up faculty time but requires careful design of rubrics and validation of AI accuracy. 
  • Practical Example:  
    • AI-Assisted Essay Scoring: Employ AI to provide feedback on grammar, style, and adherence to basic rubric criteria, with faculty focusing on evaluating the depth of understanding and critical analysis (a minimal sketch follows). 
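
A minimal sketch of the first-pass feedback pattern described above, assuming a hypothetical call_model() helper that wraps whatever LLM service the institution has approved; the rubric criteria shown are illustrative. Note that the prompt deliberately asks for formative comments only, leaving all grading to faculty.

    # First-pass essay feedback against a rubric; grading stays with faculty.
    # call_model() is a hypothetical stand-in for an approved LLM endpoint.

    RUBRIC = {
        "structure": "Clear introduction, body, and conclusion.",
        "evidence":  "Claims are supported by appropriately cited sources.",
        "mechanics": "Grammar, spelling, and medical terminology are correct.",
    }

    def call_model(prompt: str) -> str:
        """Placeholder for an institution-approved LLM call (hypothetical)."""
        raise NotImplementedError("Connect to your approved AI service here.")

    def first_pass_feedback(essay: str) -> str:
        criteria = "\n".join(f"- {name}: {desc}" for name, desc in RUBRIC.items())
        prompt = (
            "Give formative feedback on the essay below against these criteria. "
            "Do NOT assign a grade; flag concerns for faculty review.\n"
            f"{criteria}\n\nEssay:\n{essay}"
        )
        return call_model(prompt)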

Review of AI-Generated Assessments: 

  • Detail: Faculty must critically review all AI-generated assessment items and grading to ensure accuracy, fairness, and alignment with learning objectives. AI should be seen as an assistant, not a replacement, for faculty expertise in assessment design and evaluation. 
  • Practical Examples:  
    • Faculty Validation of Question Banks: Before using AI-generated questions in exams, faculty should review them for clinical accuracy, relevance, and potential bias. 
    • Spot-Checking AI-Graded Assignments: Regularly review a sample of AI-graded assignments to ensure the AI is applying the rubric correctly and consistently (see the sketch after this list). 
    • Addressing Student Appeals: Have a clear process for students to appeal AI-generated grades, with faculty providing the final judgment. 
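
One way to operationalize the spot-check above: randomly sample AI-graded submissions, have faculty re-grade the sample, and flag large divergences. A minimal Python sketch, with illustrative field names and thresholds:

    import random

    def sample_for_review(ai_grades: dict[str, float], frac: float = 0.1) -> list[str]:
        """Randomly select a fraction of AI-graded submissions for faculty re-grading."""
        k = max(1, int(len(ai_grades) * frac))
        return random.sample(sorted(ai_grades), k)

    def agreement_report(ai_grades: dict[str, float],
                         faculty_grades: dict[str, float],
                         tolerance: float = 5.0) -> list[str]:
        """Flag sampled submissions where AI and faculty grades diverge by more
        than `tolerance` points, and print a simple agreement rate."""
        flagged = [sid for sid in faculty_grades
                   if abs(ai_grades[sid] - faculty_grades[sid]) > tolerance]
        rate = 1 - len(flagged) / len(faculty_grades)
        print(f"Agreement within {tolerance} points: {rate:.0%}")
        return flagged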

AI-Driven Simulations and Practical Applications:

AI-Driven Healthcare Simulations: 

  • Detail: AI can generate highly realistic and interactive simulations of clinical scenarios, allowing students to practice their skills in a safe and controlled environment. These simulations can adapt to student actions, providing personalized learning experiences. 
  • Practical Examples:  
    • Virtual Patient Encounters: AI can power virtual patients with complex medical histories and dynamic responses to student questioning and treatment decisions (see the persona sketch after this list). 
    • Surgical Training Simulators: AI can provide real-time feedback on procedural steps, surgical technique, and instrument handling.
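
A minimal sketch of how a virtual patient persona might be specified as a structured system prompt. The case details are fictional, and chat() is a hypothetical stand-in for an approved chat-model interface:

    # Illustrative persona for an AI virtual patient; all case details are fictional.
    PERSONA = """You are role-playing a standardized patient.
    Name: J.D. (fictional). Age: 58. Chief complaint: chest pain for 2 hours.
    History: hypertension; 30 pack-year smoking history.
    Behavior: answer only what the student asks; do not volunteer the diagnosis;
    stay in character; if asked for the answer, redirect to the interview."""

    def chat(system_prompt: str, student_question: str) -> str:
        """Placeholder for an approved chat-model call (hypothetical)."""
        raise NotImplementedError

    # Example use: reply = chat(PERSONA, "Can you describe the pain for me?")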

AI for Virtual Patients: 

  • Detail: AI-powered virtual patients can offer consistent and repeatable scenarios for students to practice history taking, physical examination, diagnosis, and treatment planning. The AI can track student progress and provide feedback on their clinical reasoning. 
  • Practical Examples:  
    • Standardized Patient Interactions: AI can create virtual patients with specific conditions and communication styles, allowing students to practice their interviewing and communication skills. Note that building such virtual patients requires significant training effort and cost, and their responses can include inarticulate answers, hallucinations, and bias.
    • Branching Case Scenarios: AI can draft scenarios that evolve based on student decisions, providing a more dynamic and engaging learning experience (see the sketch at the end of this section). 
    • Remote Learning Opportunities: AI-powered virtual patients can provide valuable clinical simulation experiences for students in online or hybrid learning environments.
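
A minimal sketch of a branching case as a plain state machine. In practice an AI model might draft the nodes, but every branch should be faculty-validated for clinical accuracy before use; all content here is illustrative.

    # Branching-case engine; node content would be AI-drafted, faculty-validated.
    CASE = {
        "start": {
            "text": "A 58-year-old presents with 2 hours of chest pain. First step?",
            "choices": {"order ECG": "ecg", "discharge home": "bad_outcome"},
        },
        "ecg": {
            "text": "ECG shows ST elevation in leads II, III, aVF. Next?",
            "choices": {"activate cath lab": "good_outcome", "observe": "bad_outcome"},
        },
        "good_outcome": {"text": "Timely reperfusion; the patient stabilizes.", "choices": {}},
        "bad_outcome": {"text": "The patient deteriorates. Debrief with faculty.", "choices": {}},
    }

    def run_case(node: str = "start") -> None:
        while True:
            step = CASE[node]
            print(step["text"])
            if not step["choices"]:
                return
            for option in step["choices"]:
                print(f"  - {option}")
            # Unrecognized input keeps the current node and re-prompts.
            node = step["choices"].get(input("Your choice: ").strip(), node)

    if __name__ == "__main__":
        run_case()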