A Massachusetts couple claims that their son's high school attempted to derail his future by giving him detention and a bad grade on an assignment he wrote using generative AI.
An old and powerful force has entered the fraught debate over generative AI in schools: litigious parents angry that their child may not be accepted into a prestigious university.
In what appears to be the first case of its kind, at least in Massachusetts, a couple has sued their local school district after it disciplined their son for using generative AI tools on a history project. Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments and that the punishment visited upon their son for using an AI tool—he received Saturday detention and a grade of 65 out of 100 on the assignment—has harmed his chances of getting into Stanford University and other elite schools.
What would the parents' stance be if he'd asked someone else to write his assignment for him?
Same thing.
Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments
I'll bet you the student handbook doesn't explicitly prohibit taking a shit on his desk, but he'd sure as Hell be disciplined for doing it. This whole YOU DIDN'T EXPLICITLY PROHIBIT THIS SO IT'S FINE!!!111oneoneeleventy! thing that a certain class of people have is, to my mind, a clear sign of sociopathy.
Basically their stance is that the school policy didn't explicitly say he couldn't use AI, so perhaps the policy specifically mentions another person doing the assignment?
You know, now that I think about it, if I were in an admissions office I'd be keeping a quiet database of news stories like this so I know which people I would automatically reject no matter what their scores.
Reminds me of some bass-ackwards story I read about boardgames. A couple was saying "the rules don't forbid this" so they were putting pieces in the wrong places. What a nightmare that would have been.
People who do that at my games table get uninvited from games nights. I might also point out that the rules don't forbid me tossing my glass of baijiu into their faces but they're probably thankful I'm not doing it.
Yeah, I can’t really understand why anyone would think that you wouldn’t fail for this. You’re being tested on your ability to do something, and you had a machine do it for you. At its most generous to AI, it’s like bringing a calculator to an arithmetic class.
It's been a while since teachers were allowed to give out 0s in high school. When I taught 12 years ago, the lowest I was allowed to give was a 65, even if nothing was turned in.
I imagine this must depend on the location of the school in question. I'm in my mid-20s, so my high school experience was more recent than 12 years ago, but I remember getting quite a few zeros. (I was an absolutely horrible procrastinator who would respond to the stress of an approaching due date by doing anything else to avoid thinking about the source of said stress, which led to a lot of schoolwork simply not turned in.)
Oh jeez. Maybe it’s that I was in private school, but I was a senior in high school and I only stopped getting zeros for work I didn't turn in because my mom got cancer.
What fucking snowflakes. When I was a kid, if you had someone write your paper for you, you got a 0 for the assignment. When you go to college, they'll fail you out of the course for that shit (because it's cheating).
The only ones harming this kid's future are the parents trying to coddle their kid and protect him from the (rather light) consequences of his actions.
I taught in Chinese universities for 16 years. Initially I liked it. The students were hard-working and respectful. Parents listened to teacher advice. If kids were caught cheating there was Hell to pay ... from the parents, not just the school.
Over that 16-year period, though, everything changed. Parents started showing up at middle schools, and their response to any misconduct was to privately donate red portraits of Chairman Mao to the school administrators; suddenly all records of misconduct went missing and marks were "reassessed". This led to universities being flooded with the worst imaginable students, who'd never faced a negative consequence for any of their shenanigans in their entire lives.
Only, universities are a different world entirely. It takes a whole lot more red portraits of Chairman Mao to get misconduct erased in university; way more such portraits than all but the top 0.1% could pay. So these poor kids, having slid by through 12 years of no consequences, suddenly get hit square between the eyes with consequences that, for the first time in their lives, Daddy couldn't erase by waving said red portraits around.
Yes, they were little shits. Yes, I hated them as students. But I still felt bad for them as people because they were made monsters. They weren't born monsters.
Still didn't stop me from quitting teaching, though.
Obviously, that only concerns copying human work, not copying AI generated work. The art of parroting other people's work is to creatively rephrase it, right? You don't have to actually comprehend the concepts if you're good enough at reciting them.
That's a joke, using irony to comment on a skewed understanding of academia and people trying to skirt the point to get ahead with less effort.
The student handbook also doesn't have any warnings against inserting it into your rectum, because we expect common sense to tell you that's a terrible idea.
Actually, the lesson I'm starting to see over the past few years is that for certain groups of people, there are ABSOLUTELY no consequences and every failure is just failing up. There's a good chance this kid will never find out.
OK, the parents are suing. And the district already filed a motion to dismiss.
Please understand, the world isn't as nuts as the headlines tell us. Judges toss frivolous lawsuits all day long. We only hear about the nut cases because they're nut cases. Money says this case is never heard.
Considering how many kids get into Ivy League schools purely because of who their parents are and/or how much money they donate, you’re most certainly right.
It's honestly more impressive that the family even found lawyers to take the case. As someone who's been dealing with a frightfully similar situation at work, entitled parents trying to use a lawsuit to "correct" a clear student error, we've had 4-5 different law firms reach out to us for details about the case, and every time they thank us for our time and refuse to pursue the case further because it's clear the kid was in error.
Bad parenting. Not only did they not talk to their kid about what constitutes honourable academic conduct, not only did they not talk to their kid about the pitfalls of using generative AI, especially in an academic context, they are now teaching their brat that the proper response to fucking up is to blame the rules, to blame the school, to blame other people. Bad parents.
Having worked with parents like this before: No. None at all. They'd rather throw thousands of dollars at different attorneys hoping one of them will take the case to teach their children to never have shame.
Should kids use ChatGPT to do their assignments? Probably not. I think everyone here is looking at this the wrong way, though. If the rules did not state he couldn't use it, a proper response, to me, would be to tell the kid to redo the project on another topic without ChatGPT, and to update the rules. Instead they did the school equivalent of arresting the student and detaining him (detention), and marked the assignment poorly, which impacts his future.
The kid should not have done this.
The school/teacher also should not have done this.
According to the information we have, no rules were broken, so it was an unwarranted punishment.
On a side note your comment is also very "fall in line" thinking. One could argue the parents are standing up for their kid and teaching him how to stand up for himself.
The authorities need to follow written laws and procedures. Otherwise we are just punishing people for being different.
Everyone should be mad at the school because we are having to use taxes to address a situation that a teacher could have addressed long before by just telling the student to do the assignment over.
Bullshit. Every academic honesty policy I've seen says, in short, to do your own work, including this school's:
Hingham Public Schools, however, claims that its student handbook prohibited the use of “unauthorized technology” and “unauthorized use or close imitation of the language and thoughts of another author and the representation of them as one’s own work.”
If the student tries to pass off AI writing as his own, it definitely falls under that second clause. Does it really need an exhaustive list of all the places/people/technologies to not copy from?
There were rules against using AI, they're just arguing that they weren't in the "Student Handbook".
If you click through to the legal filing linked in the article, they lay out that they informed the students of the rule during a lecture, they have a record of his attendance at that lecture, and parents also got handouts during a parent teacher day.
edit: quote
During the first week of class, RNH and his classmates were given a copy of HHS’ written policy on Academic Dishonesty and AI expectations.4 The students are clearly informed that this policy applies to all classes, not simply ELA classes. The policy was distributed in RNH’s class on the same day a PowerPoint presentation entitled “AI & Schoolwork” was presented to RNH’s class.5 This is the PowerPoint presentation referenced in paragraph 129 of the Verified Complaint.
Attendance records show that RNH attended the class at which the policy was distributed and the PowerPoint presentation was shown. Furthermore, the written policy was also posted on Google Classroom, an online portal containing policies which is accessible to HHS’ students. It was also distributed at Parent's Night, which was held in September 2023. If RNH’s parents were present at Parent’s Night, a copy would have been provided to them.6
I think I don’t have enough details to agree with you.
Lots of variables, some of which would make the school look good, and/or the kid.
The student might be an angel who used a small bit of GPT after saving puppies all night; or a hellion someone finally had enough of, after repeated issues.
The parents may be bad, absolute stereotypes. Or perhaps there is a deeper story here about why they are willing to publicly humiliate themselves, which most lawyers and/or common sense would have warned them about ahead of time.
Looks like the handbook does explicitly mention it:
Academic Integrity: Cheating and Plagiarism
To cheat is to act dishonestly or unfairly in order to gain an advantage. In an academic setting, cheating consists of such acts as communicating with other student(s) by talking or writing during a test or quiz; unauthorized use of technology, including Artificial Intelligence (AI), during an assessment; or any other such action that invalidates the result of the assessment or other assignment. Plagiarism consists of the unauthorized use or close imitation of the language and thoughts of another author, including Artificial Intelligence, and the representation of such as one’s own work. Plagiarism and cheating in any form are considered disciplinary matters to be addressed by the school. A teacher apprehending one or more students cheating on any graded assignment, quiz or test will record a failing grade for that assignment for each student involved. The teacher will inform the parent(s) of the incident and assistant principal who will add the information to the student’s disciplinary file. The assistant principal may take further action if they deem it warranted. See Code of Discipline.
If that's the case, then he shouldn't have been punished. Regardless of people's feelings about AI, imagine this were any other circumstance: "You did something that's not against the rules but I don't like, so I'm going to fail you and give you detention." That's a load of horseshit. Imagine they did the same thing because he had dictated the paper using speech-to-text. You don't get to make up rules after the fact and then punish someone for breaking them.
Students were informed before the assignment, so he knew he shouldn't have used it. This handbook is not some legal document or something so I don't think the parents have a case. If anything the kid got away easy with a 65 instead of a 0.
The kid used AI. The lawsuit doesn't argue they didn't and are being unfairly punished. They're arguing that there weren't any rules explicitly saying they couldn't use AI.
Sounds like rich parents mad at the world cuz their kid fucked up. How can they ruin our perfect Billy's life over a decision he made, knowing full well it was wrong!!!! Now he might have to go to a less prestigious college... Boohoo!
This is one reason why people don't want to be teachers and why education is going down the toilet. Entitled parents who run to lawyers in our hyperlitigious society every time their spawn is slightly inconvenienced.
The story is an eyeball grabber precisely because it is being pitched as "stupid entitled parents".
Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments and that the punishment visited upon their son for using an AI tool—he received Saturday detention and a grade of 65 out of 100 on the assignment—has harmed his chances of getting into Stanford University and other elite schools.
Hingham High is regularly ranked as one of the best schools in the country, and has a reputation for operating as a feeder into the Ivy League and similar-tier universities. In these kinds of high-stakes environments, GPA and class rank are a form of commodity that parents (not unjustifiably) go to the mat to wrangle. The difference between admittance and denial at a school like Stanford can be hundreds of thousands a year in future professional income for the kid.
But that's the real root of the problem here. A single grade on a single test in a single class determining a student's entire socio-economic trajectory creates all sorts of moral hazards. One of which is parents willing to litigate over a grade.
Perhaps the problem isn't with this particular pair of parents realizing the stakes, but with an increasingly steep pyramid of incomes based on where you enter the workforce.
Honestly this is a big reason I can't root for our society in its current form.
Everyone in an area, barring diagnosed disability requiring special education, should go to the same PUBLIC schools to develop empathy with Americans who don't live behind their guard gates, to have similar academic starting points if even a partial "meritocracy" is something we'd like to try to actually aspire to, and to reverse rich parents having no skin in the game and forcing them to advocate FOR public schools with their power rather than lobby to further destroy them for tax cuts because being greedy sociopaths is kind of their thing.
The idea that a child's future prospects are so dependent on their parents' socioeconomic status, rather than solely on the child's aptitude and motivation, makes this whole place nothing but a bad clown show to me. Feudalism with a marketing team.
In a country where intelligent and hard-working children are lost to schools we starved to cut wealthy sociopaths' taxes, while dynastic entitled nitwits like George W. Bush and Donald Trump literally cannot fail despite barely being able to walk without tripping on their own shoes or bankrupting yet another company, trying just makes one a sucker.
I have my doubts that a student who uses generative AI to complete assignments would stand a chance at getting into an Ivy League school. It doesn't take a rocket scientist to know that using gen AI to write your assignment is cheating.
This is also why zero tolerance doesn't work with bullies. The moment the bully gets in trouble, his bullying parent will waddle into the office and bully the faculty and staff because their little shit stain got in trouble. Faculty don't want to deal with these bully parents, so the bully kids get away with everything as a result.
None of my friend's parents growing up would sue the school, but they were all the type of parents to go in and argue with teachers over grades. It was usually to go from a B to an A or some bullshit.
My parents, on the other hand, were more like "you fucked that up, didn't you?" if I didn't do well on something. I would have been mortified if they'd argued about grades on my behalf.
Perhaps it is also that LLMs are horrible at making any kind of argument and probably wrote a shit paper, never mind the plagiarism? Frankly a 65 is a high mark for doing something like this
Someone else in the comments said that it's possible (may vary by state / locale) that 65 may be the lowest grade they're allowed to give now. So if that's the case, I suspect the teacher would have given them a 0 if they could.
Perhaps it is also that LLMs are horrible at making any kind of argument and probably wrote a shit paper
They tend to be excellent at churning out pages of high-school-tier writing-prompt slop. The precise grammar, the easy-to-read formatting, and the automatic citations make them the ideal tool for generating this kind of beginner's writing.
The problem with LLMs in a writing class is the same as calculators in a math class: it's trivial to learn how to use one, but using it doesn't instill any background in how and why it produces these outputs. It's a literal black box.
The purpose of churning out term papers isn't to provide useful information to your grade-school teacher. It is to practice the art of research, analysis, condensation, and presentation. You're supposed to create shit writing at the early stage of your development. That's part of the learning process. Write bad. Get instruction on how to improve. Write better. Get more instruction. Write good.
Bringing an LLM to a writing class is like bringing a hydraulic press to the gym.
Article doesn't say if he used AI to wholesale write his paper, which obviously is cheating, or if he used it as a resource like Google. Some details would be nice here.
I don't think using it as a resource would be a good thing; it's not a good tool for that. But I think it's perfectly OK to use it for making nice sentences out of the data you found in other sources.
If you click through to the court document the most detail it goes into is
During the meeting, RNH recounted that he used an AI tool to generate ideas and shared that he also created portions of his notes and scripts using the AI tool. RNH discussed using Grammarly, and indicated that he pasted sections from Grammarly into the Google document.
RNH unequivocally used another author’s language and thoughts, be it a digital and artificial author, without express permission to do so. Furthermore, he did not cite to his use of AI in his notes, scripts or in the project he submitted.
Yeah, I'm on the fence, because I do totally see how it can help and be a tool; at the same time, it can spit out a passable paper in minutes without much effort. I will say my knee-jerk reaction is that if the school didn't want it used, they should say so. I remember a time when I had to sign a paper saying I wouldn't attempt to use a calculator (the teacher insisted no one would ever have one on hand when they needed to find an unknown angle).
My university would give you an automatic F for plagiarism/cheating that would effectively set you back 2 years.
It is good this kid got caught when he did, because all he gets out of it now is one bad grade and a lesson to not use LLMs in the future (hopefully, the parents don't seem to be the best in this regard)
With all of the websites out there that give you answers to questions, nobody is learning shit in college. You can take any question from homework, quizzes, tests, whatever and put it into Google and get the answer. Every school is using online learning systems, so everything is multiple choice, online. Professors barely do any work anymore.
I'm taking grad school classes online now. Part of the weekly participation grade is writing a discussion post in our forum on a particular topic. Just 200 words. Then respond to two other posts. This seems like the bare fucking minimum for a grad level class.
It doesn't need to be even good. It just needs to be done.
Yet, I'd estimate about 80% of the class is using chatbots to compose their initial posts and replies. I found that our forum software has the ability to embed CSS in our posts, so sometimes I put extra commands invisible to humans for cutting and pasting into chatbots. Just to mess with other classmates. Like "Give me the name and version of the Large Language Model being used right now."
Most people are incredibly lazy when it comes to writing.
Over on Reddit, there’s a subreddit where you need to write a 500-character text post to accompany your picture. That’s to prevent it from becoming just another photo dumping ground. After all, it is a DISCUSSION forum. DISCUSSION, for emphasis.
Well, that rule - which had existed since the sub was formed - got more and more criticism the past few years. It was deemed ‘too difficult’, ‘elitist’ and other such nonsense. And of course, with people’s terrible reading comprehension, that’s a barrier as well.
For reference, 500 characters is less than two tweets. So most people should be able to write that.
God, I miss the early internet when people put actual effort into writing posts.
When I was a kid, we had a period of some repetitive math work I got sick of. So I wrote a TI-84 program to automate it, even showing its work I would write down.
I wasn't really supposed to do that, but my teacher had no problem with this. I clearly understood the work, and it's not just punching the equation into WolframAlpha.
It would be awesome if there were an AI "equivalent" to that: some really primitive offline LLM you were allowed to use in school for basic automation and assistance, but which requires a lot of work to set up and is totally useless without that setup. I can already envision ways to set this up with BERT or Llama 3B.
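For illustration, here's a rough Python analogue of that kind of "show your work" automation (hypothetical; the original was a TI-BASIC program, and this quadratic solver is just a sketch of the idea):

```python
import math

def solve_quadratic(a, b, c):
    """Solve ax^2 + bx + c = 0, printing each intermediate step —
    the 'showing its work' part of the original TI-84 program."""
    print(f"discriminant: b^2 - 4ac = ({b})^2 - 4*({a})*({c})")
    disc = b * b - 4 * a * c
    print(f"            = {disc}")
    if disc < 0:
        print("discriminant < 0: no real roots")
        return ()
    sqrt_disc = math.sqrt(disc)
    print(f"sqrt(discriminant) = {sqrt_disc}")
    roots = ((-b + sqrt_disc) / (2 * a), (-b - sqrt_disc) / (2 * a))
    print(f"x = (-b ± sqrt(disc)) / (2a) = {roots}")
    return roots

print(solve_quadratic(1, -5, 6))  # x^2 - 5x + 6 = 0  → (3.0, 2.0)
```

The point being that writing something like this forces you to understand the procedure well enough to spell out every step, which is exactly what the worksheet was testing.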
If the specifics of the curriculum are too tedious, that’s on the school to address.
This! This right here. So many school curricula are designed by people who seem to despise children and want to make them suffer that I wonder why we bother with schools at all sometimes.
(Of course I also refer to Chinese high schools as institutionalized child abuse, so what do I know?)
As a survivor of homeschooling, this is the one thing I wish more people understood: school is not about cramming enough data into a kid until they magically evolve into an adult. School is supposed to teach you how to think.
Not in an Orwellian sense, but in a "here's how to approach a problem, here's how to get the data you need, here's how to keep track of it all, here's how to articulate your thoughts, here's how to ask useful questions...." sense. More broadly, it should also teach you how to handle failure and remind you that you'll never know everything.
Abstracting that away, either by giving kids AI crutches or -- in my case -- the teacher's textbook and telling them to figure it out, causes a lot of damage once they're out of the school bubble and have to solve big, knotty problems.
To be fair, understanding something well enough to automate it probably requires learning it in the first place. Obviously an AI that just tells you the answer isn't going to get you anywhere, but it sounds more like the user you were replying to was suggesting an AI limited enough that it couldn't really tell you the answer to something unless you yourself went through the effort of teaching it that concept first.

I'm not sure how doable this is in practice. My suspicion is that, to actually be useful in that regard, the AI would have to be fairly advanced and just pretend not to understand a concept until adequately "taught" by the student, if only so it could tell whether it was taught accurately and tell the student they got it wrong and need to try again, rather than reinforce an incomplete or wrong understanding. There's also a risk that current AI used for this could be "tricked" by clever wording into revealing answers it's supposed to act like it doesn't know yet (on top of the existing issues with AI spitting out false information by making associations it shouldn't actually make). But if someone actually made such a thing successfully, I could see it helping with some subjects.

I'm reminded of my college physics professors, who would let my class bring a full page of notes and the class textbook to refer to during tests, under the reasoning that a person who didn't understand how to use the formulas in the text wouldn't be able to actually apply them, while someone who did understand but misremembered a formula would be able to look it up again in the real world. These were by far some of the toughest tests I ever had. Half of the credit was also from being given a copy of the test to do again for a week as homework, where we as a class were encouraged to collaborate and teach each other how to solve the problems, again on the logic that explaining something to someone else helps teach the explainer that thing too.
I wasn't really supposed to do that, but my teacher had no problem with this. I clearly understood the work, and it's not just punching the equation into WolframAlpha.
This is the way it should be. If you created the program on your own, as opposed to copying it from elsewhere, you had to know how to do the work correctly in the first place. You've already demonstrated that you understand the process beyond just being able to solve a single equation. You then aren't wasting time "learning" something you've already learned just to finish an otherwise arbitrary number of problems.
Yeah, I got sick of manually inputting my physics lab data in college. TA absolutely had no problem with me handing in a python script as my work instead of a bunch of handwritten formulas.
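As a sketch of what such a lab script might look like (the pendulum setup, data, and numbers here are made up for illustration; the original script isn't shown):

```python
import math
import statistics

# Hypothetical pendulum lab: estimate g from repeated period measurements.
# T = 2π * sqrt(L / g)  =>  g = 4π² * L / T²
LENGTH_M = 0.500                                # pendulum length (assumed)
periods_s = [1.42, 1.40, 1.43, 1.41, 1.42]      # measured periods (made up)

mean_T = statistics.mean(periods_s)
std_T = statistics.stdev(periods_s)             # sample standard deviation

g = 4 * math.pi**2 * LENGTH_M / mean_T**2
# First-order error propagation: since g ∝ T⁻², Δg/g = 2 * ΔT/T
g_err = g * 2 * std_T / mean_T

print(f"g = {g:.2f} ± {g_err:.2f} m/s^2")       # → g = 9.84 ± 0.16 m/s^2
```

Same principle as the TA's reasoning: the formulas and the error propagation are still yours; the script just spares you re-doing the arithmetic for every row of data.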
But this is writing. The thing about writing is that it is the critical skill being taught here. Most classes that involve much writing see it as the crucial element. The student is being taught to gather information, process concepts, and effectively communicate reasonable conclusions from all of it in a way that others can understand. And ideally in a way that’s pleasant to read.
I get it, I fucking hated writing in school. I thought it was pointless and frustrating and that I’d never benefit from it. But it turned out to be one of the most critical skills I was taught. It made me an effective communicator and taught me to better organize my thoughts when attempting to express them, or to understand them. I struggle to think of a way any generative tool could take some of the load without taking a large portion of the lesson away from the student in the process.
If you click through to the motion to dismiss, it details the AI policy that says if you do use AI just to get ideas, you have to include transcripts of your AI chat to prove that you aren't plagiarizing it.
The way I see AI as a tool in a classroom or learning setting is that you should be punished if you willingly used it due to laziness, not understanding the course work, or I assume most likely both. On its own it's not terrible (environment aside), but it's certainly not something I'd accept if I were a teacher grading homework.
A basic pocket calculator, or even a graphing calculator, of the sort you'd expect to see in a high school, is not capable of providing the solutions to high-school-level math problems. Students at that point are beyond being given arithmetic problems with single numeric answers.
In contexts where you do need numeric answers to a formula, such as in physics, you can absolutely use a calculator and that's fine.
I mean I've been out of HS for a bit now, but I definitely remember it being a bit of a debate on whether calculators are allowed or not.
I get why I'd be downvoted initially, as AI and calculators are quite different use cases, even though both can be tools you use to make things easier.

However, again, I get that it's a wide difference: with calculators you definitely still have to somewhat understand what you're doing and why.
Presenting work or ideas from another source as your own, with or without consent of the original author, by incorporating it into your work without full acknowledgement.
I think that covers 100% of your argument here.
LLMs can't provide references to their source materials without opening the business behind them to litigation. This means the LLM can't request consent.

The child, in this case, cannot get consent from the original authors whose content trained the LLM, cannot get consent from the LLM, and incorporated the result of LLM plagiarism into his work and attempted to pass it off as his own.

The parents are entitled and enabling pricks and don't have legal ground to stand on.
LLMs are certainly trained without consent, but they exist to spot common patterns. It's only likely to plagiarise if that text is also similar to lots of other text.
In fact, the academic practice of references and exact quotes has actually increased the tendency of statistical models to "plagiarise".
LLMs will continue to be a useful academic tool. We just have to learn how best to incorporate them into our testing.
the parents are entitled and enabling pricks and don't have legal ground to stand on.
After reading that the exam rules basically said not to use chatgpt or similar, I completely agree.
And what if you had an app on your phone that let you just take a picture of the question and write out the answer it gave you? A calculator still requires that you know what to input, and at the level of math where a calculator really is just easy mode, the rules absolutely would specifically prohibit one.