Some of the key critical questions to ask about any AI text generator are:
These resources were compiled by the librarians at KU Medical Center Library.
Guidance from KU's Center for Teaching Excellence. KUMC/KU access only.
The U.S. Department of Education Office of Educational Technology's policy report addresses the clear need for sharing knowledge, engaging educators, and refining technology plans and policies for artificial intelligence (AI) use in education. The report describes AI as a rapidly advancing set of technologies for recognizing patterns in data and automating actions, and it guides educators in understanding what these emerging technologies can do to advance educational goals while evaluating and limiting key risks.
Available in English and Spanish: examples of syllabus policies used by instructors from a variety of disciplines.
A resource for learning to prompt AI more effectively. Use prompts at your own risk; outputs may not be correct.
The University of Pittsburgh offers many tips and ideas for using AI in teaching.
Describes highlights from a webinar (linked below) on ChatGPT and other AI tools.
Asks how AI tools could hurt or benefit students with disabilities.
Points you to some colleges’ guidance on ChatGPT.
Topics include what educators need to know about research, writing, tutoring, and grading tools such as ChatGPT and other generative AI; how colleges can best prepare their faculty members to teach with and about AI; and how to teach students to use AI tools wisely.
This material is from the Center for Teaching Excellence's comprehensive AI Resources & Guidance.
The University of Kansas does not have a specific policy on the use of generative artificial intelligence in teaching and learning. The University Senate Rules and Regulations do, however, provide guidance on academic integrity:
Academic misconduct by a student shall include, but not be limited to, disruption of classes; threatening an instructor or fellow student in an academic setting; giving or receiving of unauthorized aid on examinations or in the preparation of notebooks, themes, reports or other assignments; knowingly misrepresenting the source of any academic work; unauthorized changing of grades; unauthorized use of University approvals or forging of signatures; falsification of research results; plagiarizing of another's work; violation of regulations or ethical codes for the treatment of human and animal subjects; or otherwise acting dishonestly in research.
The KU Code of Ethical Conduct also provides guidance, emphasizing the importance of demonstrating accountability, modeling ethical standards, fostering honest pursuit of knowledge, ensuring originality of work, and attributing ideas drawn from others’ intellectual work.
To supplement those, we encourage faculty members to include a statement about permissible use of generative artificial intelligence in their classes.
The statement should include guidelines on how students may use generative AI in your classes and when they should not use it. (See Maintaining academic integrity in the AI era for advice on what to include.)
Students should understand how generative AI tools were created, how they work, and why they must be used with caution, so plan to use class time to discuss generative AI. Don't worry about having an extensive understanding of generative AI or knowing all the terminology. Just be honest with your students and encourage them to ask questions. (See our list of readings and tools for ideas on possible discussion topics or ways to use generative AI.)
No one expects you to be an expert in the use of generative AI, but you should have a basic understanding of how it works and what it can do. Many students are already using tools like ChatGPT, and you should have a sense of how they might use these tools with your assignments.