Turnitin's Guide for Approaching AI-Generated Text in Your Classroom "shares eleven practical strategies that educators can begin using in classrooms now to meet the needs of students who are likely using text generated by AI when completing assignments."
In March 2024, Turnitin hosted a webinar for NIC faculty, Interpreting Turnitin Similarity and AI Reports, to provide training in how to use Turnitin's Similarity and AI reports. The 60-minute presentation is viewable in your browser.
The following articles discuss the failure rates of AI detection software. The desire to defeat detection of AI use has created a market for a new class of generative AI tools. The Advanced AI Detector and Humanizer from Undetectable.AI is one of the better known. A user inputs AI-generated text, and the tool processes it and generates a likelihood-of-detection score. The user can then ask the software to "humanize" the original AI-generated input to improve the likelihood of defeating detection.
The following statement appears in the student research guide, Copyright, Fair Use, and Plagiarism for Students:
Unless your instructor allows the use of AI without attribution, failing to acknowledge your use of AI in your work is considered a form of plagiarism.
Plagiarism is using someone else's work or ideas and representing them as your own. This includes, but is not limited to, ideas, writings, artworks, and data. Text and other outputs generated by AI do not represent work that is solely your own. The standards for citing AI use are evolving. Both APA and MLA have published standards, and new, independent citation forms are being advocated by academics and professionals. If in doubt about how to cite your use of AI, ask your instructor.
The concept of plagiarism as it applies to AI generated content may be confusing to some. To help clarify when and how plagiarism applies to work created with AI assistance, Microsoft Copilot provides the following guidance in response to a prompt entered by a librarian.
Prompt: Explain how the concept of plagiarism can apply to content created with generative AI tools.
Copilot:
The concept of plagiarism in the context of generative AI tools is a significant and evolving topic. Plagiarism traditionally refers to the act of using someone else's work without proper attribution, passing it off as one's own. With the advent of generative AI tools, this concept extends to the use of AI-generated content. If someone uses content created by AI without proper disclosure, it could be considered a form of plagiarism, especially if the content is presented as entirely original and human-created.
Generative AI tools can produce text, images, and other forms of content that may closely mimic human-generated work. As such, it's important to consider the ethical implications and establish clear guidelines for use. For instance, academic institutions and publishers may require disclosure of AI assistance in the creation of content to maintain transparency and academic integrity.
In the professional setting, using AI tools to assist with content creation is becoming more common, but it's crucial to understand and follow any existing policies regarding AI-generated content. This includes giving credit to the AI tool used and being transparent about the extent of AI involvement in the final product.
It's also worth noting that there are tools and resources available to detect AI-generated content, which can be used to ensure that AI assistance is properly disclosed and that the content meets the required standards of originality.
In summary, the concept of plagiarism applies to generative AI tools when there is a lack of transparency and proper attribution regarding the use of AI-generated content. As these tools become more integrated into various fields, it's essential to develop and adhere to guidelines that uphold ethical standards and intellectual property rights.
("Explain how the concept of plagiarism can apply to content created with the assistance of generative AI tools" prompt. Microsoft Copilot, Windows 11 23H2 version, Microsoft, 1 Mar. 2024, https://copilot.microsoft.com/.)
To further understand why plagiarism is a critical concern when it comes to content generated by generative AI tools, consider:
AI tools like Copilot and GPT-4 learn from vast amounts of training data gleaned from websites, books, academic papers, and other sources. During training, these tools absorb patterns, correlations, and language structures from the data they ingest, and do so without regard for copyright, attribution, or plagiarism risk. As a result, they can unintentionally paraphrase, or reproduce verbatim, passages of copyrighted text. For example, a lawsuit revealed that OpenAI software generated New York Times articles nearly word for word. Providing proper attribution through citation protects authors, including students, from allegations of academic or professional fraud or copyright violations.
Just as it would be unethical to have a family member or another student complete an assignment for you, failing to disclose the extent to which you have used AI in your work is also unethical and a potential violation of academic codes. The librarian who created this Research Guide used Microsoft Copilot to help create several sections. For each use, they provided attribution for the work performed by AI or with the assistance of AI. If you are unsure whether your use of AI requires disclosure, ask your instructor for their course policy.
Failing to acknowledge the use of AI to complete an assignment can violate academic integrity standards in several ways:
("What are some ways that the failure to disclose the use of AI tools to complete an assignment can be a violation of academic integrity?" prompt. Microsoft Copilot, Windows 11 23H2 version, Microsoft, 1 May 2024, https://copilot.microsoft.com/.)