Those who don’t create intellectual property should respect the creations of those who do
Overall Assessment
The article presents a personal moral argument against AI use in education, framed as a news piece but functioning as an opinion column. It uses emotionally charged language, selective evidence, and a truncated quote to portray AI as inherently unethical. No effort is made to present balanced perspectives or to ensure technical accuracy.
Headline & Lead 30/100
The headline functions as an opinion statement rather than a neutral news summary, using moralistic language to frame the issue, which misleads readers about the article’s nature.
✕ Sensationalism: The headline frames a broad ethical claim as a moral imperative, using absolutist language that oversimplifies a complex debate around AI and intellectual property.
"Those who don’t create intellectual property should respect the creations of those who do"
✕ Editorializing: The headline is phrased as a moral judgment rather than a news headline, positioning the author’s opinion as universal truth.
"Those who don’t create intellectual property should respect the creations of those who do"
Language & Tone 20/100
The tone is heavily opinionated and moralistic, using emotionally charged language and personal anecdotes to condemn AI practices, with no attempt at neutrality.
✕ Loaded Language: The author repeatedly uses emotionally charged terms like 'stealing', 'theft', and 'robbing' to describe AI training, equating it with criminal behavior without nuance.
"requires ChatGPT to scrape the internet for examples of my work and use them without permission or payment: something The Writers Union of Canada – like all writers’ organizations worldwide – has been trying very hard to educate people not to do."
✕ Appeal To Emotion: The anecdote about childhood stealing is used to emotionally equate AI model training with moral failure, rather than engaging with technical or legal distinctions.
"I told my mother about this entertaining activity, only to be mortified when she reframed it as “stealing” and marched me back to the store to apologize to Mr. Lou."
✕ Narrative Framing: The article constructs a moral arc from childhood innocence to adult ethical failure, framing AI use as a societal decline in honesty.
"I learned my lesson when I was six years old, but some folks never do."
✕ Editorializing: The author injects personal judgment throughout, such as describing the university guideline as "something extraordinary" in a plainly disapproving sense, rather than reporting neutrally.
"Recently, I was idly scrolling when I came upon something extraordinary."
Balance 25/100
The article relies solely on the author’s perspective and selectively presents a university guideline without balance or verification, lacking any opposing or expert viewpoints.
✕ Cherry Picking: The article cites a single university guideline out of context, without seeking comment from the institution or presenting broader academic consensus on AI use.
"In “Artificial Intelligence: A Guide for Students,” Thompson Rivers University, situated in Kamloops, British Columbia, suggests the following:"
✕ Vague Attribution: Claims about the positions of writers' organizations worldwide are made without specific citations or evidence.
"something The Writers Union of Canada – like all writers’ organizations worldwide – has been trying very hard to educate people not to do."
✕ Loaded Language: Refers to AI-generated content as 'filching' and 'theft' without engaging counterarguments from AI developers or legal scholars.
"They don’t consider that they are robbing those who created the art they are filching"
✕ Omission: Fails to include any voices from AI researchers, educators, or legal experts who might offer alternative interpretations of fair use or training data.
Completeness 30/100
The article lacks essential context about AI training, copyright law, and educational policy, while truncating a key quote to strengthen a misleading narrative.
✕ Misleading Context: The article cuts off a university guideline mid-sentence, omitting its cautionary advice about citations and copyright restrictions, creating a false impression of endorsement.
"But asking st"
✕ Omission: Fails to explain how AI models actually train on data or whether such use qualifies as copyright infringement under current law, and omits mention of ongoing legal cases on the topic.
✕ Cherry Picking: Focuses on a single example from one university without contextualizing it within broader educational policies or AI ethics frameworks.
"In “Artificial Intelligence: A Guide for Students,” Thompson Rivers University, situated in Kamloops, British Columbia, suggests the following:"
AI is framed as a hostile force stealing from creators
The article uses emotionally charged language like 'theft' and 'robbing' to equate AI training with criminal behavior, and presents AI as inherently unethical without engaging counterarguments.
"requires ChatGPT to scrape the internet for examples of my work and use them without permission or payment: something The Writers Union of Canada – like all writers’ organizations worldwide – has been trying very hard to educate people not to do."
AI use in education is framed as inherently illegitimate and unethical
The article truncates a university guideline to make it appear as if institutions endorse IP theft, then condemns the practice as morally bankrupt, ignoring existing academic guidance on citation and fair use.
"But asking st"
Writers are framed as marginalized creators whose rights are being ignored
The author positions writers as underpaid, vulnerable creators whose work is being exploited, appealing to empathy and solidarity with artistic professionals.
"who are for the most part underpaid and sometimes – especially in the case of musicians, writers, and painters – actually poor."
AI is portrayed as operating through deception and unauthorized use
The author frames AI development as fundamentally dishonest, relying on 'scraping' without consent, and implies systemic untrustworthiness in how AI tools are built and promoted.
"requires ChatGPT to scrape the internet for examples of my work and use them without permission or payment"
Educational institutions are portrayed as failing to uphold ethical standards
The article singles out one university's guideline without context and implies negligence or moral failure in academia’s approach to AI, suggesting a decline in ethical rigor.
"In “Artificial Intelligence: A Guide for Students,” Thompson Rivers University, situated in Kamloops, British Columbia, suggests the following:"
A Toronto-based writer has expressed concern about a university guideline that permits students to generate AI content in the style of living artists for critical discussion. The article highlights ongoing debates about intellectual property, AI training data, and academic integrity, though no response from the university or technical experts is included.
The Globe and Mail — Business - Tech