The use of generative artificial intelligence (AI) tools has proliferated quickly. ChatGPT, for example, is an application that produces written text at a desired length, format, and style, depending on the user’s prompts. Copilot is an AI generator provided by Minnesota State to Office 365 users. These applications and others like them perpetually “learn” how to respond to prompts, expanding their knowledge bases through the millions of pieces of information fed to them.
Students at all levels are increasingly using tools like ChatGPT. With the power and easy access these tools offer, misuse may occur, and there is growing concern among instructors and others over how students are using them. In response to this concern, AI detection tools have also emerged. One such tool is Turnitin, which offers an extension that flags submitted student work as potentially AI-generated. Several other popular AI detection tools perform a similar function. The information and guidance in this document address all AI detection tools and applications in a general sense.
Within the past year, the Minnesota State System Office activated a pilot of the AI detection feature in Turnitin. Some instructors adopted this tool as part of their grading process for written assignments. At the end of the Fall 2023 semester, the System Office ended the Turnitin AI detection pilot.
Although many studies, as well as current data results from AI detection companies themselves, show the limited usefulness of AI detection tools such as Turnitin, Saint Paul College has reinstated the Turnitin AI detector tool at SPC for the remainder of the Turnitin contract term, which ends in June 2024. The College Academic Technology Team (CATT), which will be formed this semester and will include faculty representation, and leadership at the College will continue to determine the usefulness and equity of this tool for the long term.
Use caution when employing AI detection tools. Concerns have emerged from various academic institutions about the reliability of Turnitin and other AI detectors, as well as their ethical and equity implications. Reports from these institutions indicate that the tool yields a sizeable percentage of “false positives” in flagging AI-generated or plagiarized work: it often reads original, valid student writing as plagiarized. In particular, students whose first language is not English are more likely to see their work erroneously flagged. See Vanderbilt University’s guidance on the Turnitin AI detector for more information about this problem. Turnitin has not been forthcoming with details about how its AI detector works. This lack of transparency, along with the substantial number of false positives, creates problems when relying on this tool for grading. For these reasons, many institutions across the country have opted to discontinue use of the Turnitin AI detector and similar tools, and this tool should never be used as the sole point of reference to determine the validity of student work.
How to Use It
Remember that an AI detection tool is just one data point in assessing student work. It is important not to rely on this tool alone to raise a plagiarism accusation against a student. If you use an AI detector, continue to use standard evaluation methods, compare the current submission against the student’s previous writing, and draw on your years of experience as an instructor when assessing student work. Trust is the foundation of good instruction.
If you suspect a student’s work has been generated with AI, here are some steps you could take:
- Talk with the student, in a non-accusatory manner, and ask them about their processes and sources they've used for the assignment.
- If, after speaking with the student, you feel certain the work is not their own, ask them to resubmit the assignment for a lower grade if the situation allows for some flexibility.
- Submit an academic integrity violation here: Academic Integrity Violation Form
Saint Paul College is committed to becoming an anti-racist, trauma-informed institution and believes that ensuring equity, inclusion, and a sense of belonging for all students is vital to the work we do. Although we want students to create original work at all times, communication and clear expectations help provide a sense of belonging for our student population.
Become proactive rather than reactive.
- Familiarize yourself with available AI tools (e.g., Microsoft Copilot) and with the conversations and implications around the still-evolving landscape of AI academic tools.
- Write a clear syllabus policy about what is and isn’t okay when completing work in your courses. (Some ideas to get you started from Minnesota State)
- Long term, rethink current assessment techniques and teaching strategies to educate students about good AI use and when it is unacceptable. Don’t change everything in your courses all at once; give yourself time to consider ways you can work within this paradigm shift and adapt to the ever-changing AI environment in which our students live. We encourage you to reach out to James Smrikarov, Instructional Designer, or Eric Kline, Academic Technologist, in AEI for support as you navigate this cultural and academic shift.