There are usually few meaningful shortcuts in our learning process. Learning is a “time-skill, where the ticking away of the unforgiving seconds plays a dominant part in both learning and application of the skill” (Canning, 1975, p. 277); this is now known as sequential and map learning, where a skill becomes implicit and automatic. For the best retention, we need regular, focused practice to develop unconscious, implicit learning (Snyder et al., 2014, p. 162).
And it is this time-oriented nature of learning, mastery gained over a long period of time, which is why I think the use of AI in the academic world is going to be problematic.
- Problem 1: There are no shortcuts to learning. Putting a question into ChatGPT will not allow us to gain mastery. It will not build our academic critiquing muscles. Our comprehension will not grow. It will just mean we can ask questions.
- Problem 2: Lacking a bullshit detector. ChatGPT is a predictive text model. In the brief amount of testing that I have done (read more here), I have often found that ChatGPT is inaccurate, misleading, or plain wrong. When the AI reaches the bounds of its 'knowledge', I think it reverts to type and falls back on text prediction. It then gives us a string of seemingly logical next words, but these will not necessarily 'make sense' in the academic meaning of the words. Without our own personal understanding of the materials, we will not be able to judge whether the AI result is accurate ...or not.
- Problem 3: No idea who the experts are. When we use ChatGPT we lack those markers of academic writing which arise from having 'done the work'. The AI is rarely able to provide us with real citations and references to support the learning that comes from familiarity with the field. The process of mining for the views of qualified, careful researchers, those professional enough to have had a paper pass peer review and be published, facilitates our learning, helps us to map our field, builds our critical thinking skills, and helps us to understand where new trends are emerging.
- Problem 4: Poor writing. If we don't practice our own writing, we will not get better at expressing ourselves. Like driving a car, an understanding of the rules harnessed to sound practice will improve our performance.
- Problem 5: Not our original work. The expectation is that we OWN our own writing. We can use tools, but we need to (a) acknowledge that we have done so, and (b) ensure our academic writing is our own, not the product of someone else's AI software. If we use AI, I feel the real owner of the writing is the AI's owner, not ourselves.
Most academic institutions require students to state, on submission to plagiarism-detection software, that assignments contain their own original work. This is a key issue, well explained by TurnItIn:
"Similar to contract cheating, using AI to write an assignment isn’t technically plagiarism. No original work is being copied". Actually, I am unsure about the validity of this: the creators of AI have paid plenty to create the AI software, and I feel that technically they may own the output. BUT there are also growing ownership disputes about the large language models, copyrighted materials, and so forth which various stripes of AI have been trained on (Morriss, 2023). Perhaps it might be better to say that ownership is murky. "But at the same time, it isn’t the student’s original work. In many ways, AI writing and contract cheating are very much alike; in the realm of contract cheating, students submit a request to an essay mill and receive a written essay in return. With AI writers, the only difference is that a computer generates the work. To that end, using AI writing to complete an assignment and represent it as your own work qualifies as academic misconduct" (TurnItIn, 2024).
If we want to truly gain mastery, then we need to work at it. We must accept there are no shortcuts to the learning process. We must do the work. We have to read so we get to know the experts. We learn to evaluate, developing our critical eye and honing our bullshit detector. We write to improve our writing. We own our output.
Sam
References:
Canning, B. W. (1975). Keyboard skill-a useful business accompaniment. Education + Training, 17(10), 277-278. https://doi.org/10.1108/eb016409
Morriss, W. (2023, December 15). Who owns AI created content? The surprising answer and what to do about it. Reuters. https://www.reuters.com/legal/legalindustry/who-owns-ai-created-content-surprising-answer-what-do-about-it-2023-12-14/
Snyder, K. M., Ashitaka, Y., Shimada, H., Ulrich, J. E., & Logan, G. D. (2014). What skilled typists don’t know about the QWERTY keyboard. Attention, Perception, & Psychophysics, 76(1), 162-171. https://doi.org/10.3758/s13414-013-0548-4
TurnItIn. (2024). What is the potential of AI writing? https://www.turnitin.com/blog/what-is-the-potential-of-ai-writing-is-cheating-its-greatest-purpose