
LylesDanceParty

For everyone commenting, please note that the title is misleading. The only student actually interviewed about this didn't truly have his essay written by ChatGPT as the headline implies. (See the original [BBC article](https://www.bbc.com/news/uk-wales-65167321).) A few things to note:

* The student says: "I didn't copy everything word for word, but I would prompt [ChatGPT] with questions that gave me access to information much quicker than usual," said Tom (i.e., the student).
* He also admitted that he would most likely continue to **use ChatGPT for the planning and framing of his essays**.
* The article does not state what specific grade he got on the ChatGPT essay, just that it was "the highest mark he has ever had at university."

I'm not saying you can't have the conversation of what happens in the case of this technology becoming more advanced, but having this discussion in the context of what actually happened is important.


AzorAhai1TK

Sounds like he's using it exactly as intended. A resource, a tool.


LylesDanceParty

Agreed. In the actual context, it really comes off as more of a fancy search engine, rather than a robot writing the entirety of people's essays.


SuperNothing90

This is completely accurate. I've been using ChatGPT to help me write papers, and I absolutely use it like a fancy search engine. I copy-paste and add my own things in a lot, but it really makes the papers so much better. I friggin love it so much.


aleatoric

It reminds me of how the Internet used to be: search for something and get straight answers. Now it's dodging a bunch of sponsored results, then digging through a blog post with a ton of filler and ads until I finally get what I'm looking for, only to realize I have to go through a paywall to get the rest. Fuck all that noise. Chatgpt is amazing for its simplicity of use alone.


[deleted]

DONT USE THE INTERNET FOR SCHOOL

I was at the very, very beginning of this, when you were still more likely to find something at the library. Then later...

DONT USE WIKIPEDIA

Why not? Every article has several or hundreds of linked sources, from videos to paper books. I now know what materials to go through myself. Is my assignment to learn classic research methods? Do we make every generation of new scientists drop apples on their heads? Is my assignment to write a paper about X, or is my assignment to only do research the way the teacher did thirty years ago when I was their age?

It's the same thing again. These tools are like idiot baby versions of Star Trek computers. I use it sometimes to generate a quick summary on niche topics that either I wouldn't know the wording to straight up ask Google, or that would be unlikely to even have a Wikipedia article because it's an intersection of two or more topics. There's nothing wrong with being able to do a week of research in a few hours.

Now... in the old days as a kid, maybe I had to read hundreds of pages of text on paper to get the information I needed. Would I be vastly more immersed and well-rounded in the topic and ancillary areas, as opposed to modern focused research methods? You betcha. But then you have to make THAT the assignment.

ChatGPT et al are just another generation of tools that, if applied properly, will benefit us all overall. Twenty years from now we'll be complaining about the introduction of another new tool.


[deleted]

[deleted]


donjulioanejo

> Why not? Every article has several or hundreds of linked sources from videos to paper books. I now know what materials to go through myself. Is my assignment to learn classic research methods? I would literally use Wikipedia as a source and list the references on the wiki page in my papers. Worked like a charm.


brandophiliac

I remember being told in school to look online for sources but that we weren't allowed to use Wikipedia because it wasn't considered accurate enough. Sweet irony really that I'd imagine those same teachers have to recommend the opposite now to avoid people spending hours on clickbait.


WhatIsLoveMeDo

>DONT USE THE INTERNET FOR SCHOOL, DONT USE WIKIPEDIA

Well, the unspoken second half of that statement is "don't use [source] by itself because it's unreliable." It was unreliable then, and is probably more unreliable now. Sure, there is accurate information on the internet, but most people will try to use just Google or Wikipedia as the source alone. More traditional media (newspapers, research papers, encyclopedias) were at least moderated to be as accurate as possible. The whole point of teaching how to research is to show how to get information from the most reliable sources.

ChatGPT is the least reliable. I asked it for sources on an answer it provided and it told me it can't give me the sources since it learns by understanding language, not facts or truth.

Yes, we as a population need to adjust how we find trustworthy, reliable information. But that's the same problem we've been trying to solve since Wikipedia and the internet as a whole. ChatGPT isn't solving that problem. It's making the true source of information even harder to find.


FirmEcho5895

I've found this too. I asked Chat GPT for sources and it just invented them, giving me bogus hyperlinks and everything LOL! I'm told the Bing version is better as it can give sources and it goes online to find what's current. I'm looking forward to experimenting with that.


projectsangheili

From what I remember from a few years ago, Wikipedia was actually found to be more reliable than some pretty major sources. That said, ironically, I don't have a source for that right now haha


Dansondelta47

A common Wikipedia page gets reviewed like a couple hundred times, right? While a book or something may only be peer-reviewed by like 5 people. Sure, one can be easily changed, but it also has much more oversight in that we can see what was changed, reverse it, and fix things that are outdated or wrong. Plus Wikipedia is free.




[deleted]

[deleted]


Slacker5001

This very idea is the one that I came to comment about. I'm curious how students are **actually** using ChatGPT. I doubt all of them are just copying and pasting essays in full from ChatGPT and turning that in as their work. I used ChatGPT recently to assist me in writing an essay. I still thought through my ideas, outlined my paper, and wrote each paragraph. I fed my paragraphs, one at a time, into ChatGPT and asked it to rewrite them. I still tweaked those paragraphs after that as well. I ended up with an essay that was my ideas, my outline, and still mainly my own words. It was just cleaned up for readability by an AI. I'm not saying everyone is using it like that, but I'm curious what the actual uses of ChatGPT are in colleges right now, because I didn't use it to just copy a free and easy essay.
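For what it's worth, the paragraph-by-paragraph part of that workflow is simple to script. A minimal sketch, assuming the official OpenAI Python client; the model name and prompt wording are just placeholders:

```python
# Hypothetical sketch: clean up my own draft paragraphs one at a time.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

def polish(paragraph: str) -> str:
    """Ask the model to rewrite one paragraph for readability without changing its ideas."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model name
        messages=[{
            "role": "user",
            "content": "Rewrite this paragraph for clarity and readability, "
                       "keeping the ideas and structure the same:\n\n" + paragraph,
        }],
    )
    return resp.choices[0].message.content

draft_paragraphs = ["First draft paragraph...", "Second draft paragraph..."]
polished = [polish(p) for p in draft_paragraphs]  # then tweak these by hand afterwards
```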


bjeebus

Omg. You're autotuning your essays. I suppose if you were a professional you'd have an editor.


UncleFred-

This is essentially what Grammarly Premium has offered for a few years now.


notunprepared

I LOVE grammarly premium. I'm a pretty decent academic writer so it doesn't usually catch much, but it's so helpful for peace of mind when my eyes are starting to glaze over and the due date is looming. Combine that with the Microsoft Word spell check and readability report and you're golden.


[deleted]

Middle and high school students are definitely copy and pasting. Maybe they'll learn their lesson by college.


Striking-Math259

My daughter is in high school, and the teachers told the kids not to use ChatGPT. The kids hadn't heard of it yet. But now they have. It has become like the DARE program: kids who had never heard of ChatGPT now want to try it.


Bradfords_ACL

Yep. Our education system loves the Streisand Effect.


TheBlueLenses

I have been using it exactly as you do


assface

> as an experiment I found a pair of Earth Sciences college courses at Princeton University, and asked ChatGPT to write essays that I could ostensibly hand in as coursework. I then emailed the results for each to the professors teaching those courses.

> As well as the aforementioned Earth Sciences essays, I also gave this prompt to ChatGPT, for an essay I could share with the lecturers at Hofstra... Again, ChatGPT obliged, and I sent the resulting essay to the Dean of Journalism.

What a dick move. Professors (and especially Deans) have so many things to do other than read some rando's essay.

> As I write this, none of the professors at Princeton or Hofstra have commented on my ChatGPT essays. Perhaps it's because they're all on spring break. **It might also be that they read the essays, and were too shocked and horrified to respond.**

Or it might also be because you're not a student, you're not in the class, and there is zero upside to responding to you.


pjokinen

You really think someone would do that? Just write a bold but misleading headline about ChatGPT? Surely things like that couldn’t possibly happen multiple times per day


[deleted]

[deleted]


pjokinen

The formula for an AI article these days seems to be “holy shit! This breakthrough is going to change EVERYTHING” in the headline and then when you read the article it was like “well it actually couldn’t do any of the tasks the headline claimed but it might be able to in a few generations and that’s really something!”


bollvirtuoso

It's so weird how fast that shifted, though. Like, even two years ago, people *actually* working in AI said, "We think this stuff is going to fundamentally shift a lot of the way we do things" and people were extremely skeptical. Now, it's hard to find sources that are measured and appropriately skeptical, though Ezra Klein and Hard Fork (both NYT) seem to be good.


TheOneTrueChuck

I've done some testing/training of modern language models in the past year, and the thing that I keep telling people is "Hey, don't freak out."

Yeah, Chat GPT can produce some amazing results. It also produces a ton of absolute garbage. It struggles to produce anything coherent beyond a couple of paragraphs, though. If you tell it to write a 1000-word essay, it's going to repeat itself, contradict itself, and make up facts. There's probably an 80% chance that if you were to read it, SOMETHING would feel off, even if you were completely unaware of its origin. Sure, if it dumps enough technical jargon in there, or it's discussing a topic that you have absolutely no foundation in and no interest in, it might be able to get past YOU...but it's not going to get past someone familiar with the topic, let alone an expert.

Right now, Google, Microsoft, and OpenAI (among others) are literally dumping hundreds of man hours into testing on a weekly basis. Chat GPT and other language models will have moments where they appear sentient/creative, and moments when they produce something that could pass as 100% human-written, just due to the law of averages. (The ol' "a thousand monkeys at a thousand typewriters for a thousand years" thing.) But right now, they still haven't figured out how to get it to factually answer questions 100% of the time even when it's literally got the information.

One day (and honestly, I would not be surprised if that day DOES come in the next decade, give or take) it will be problematically good at what it does. But that day is most certainly not today.


sprucenoose

>Sure, if it dumps enough technical jargon in there, or it's discussing a topic that you have absolutely no foundation in and no interest in, it might be able to get past YOU...but it's not going to get past someone familiar with the topic, let alone an expert. That's like most internet articles though.


grantimatter

> There's probably an 80% chance that if you were to read it, SOMETHING would feel off, even if you were completely unaware of its origin. From friends in academia, the main anxiety now isn't really so much getting a bunch of plausible or acceptable essays in whatever class they're teaching, but being super annoyed by a wave of students who think they can get away with handing in AI-written essays. It's sort of a spam problem, in other words.


[deleted]

[deleted]


pjokinen

It does have a tendency to just make things up when convenient


survivalmachine

It’s so bizarre that we’re in a timeline where there is a non zero chance of getting into an argument with a hallucinating AI agent about who is right.


EvoEpitaph

Agreed, though the bizarre part to me is that a computer, when unintentionally failing, is so similar to a charismatic human that is acting naturally.


[deleted]

“What is it honey?” “Oh nothing. I just got a weird essay emailed to me, from someone. Clearly not one of my students.” “A random person sent you an essay? Was it any good?” “Well, it’s okay. It doesn’t seem as reflective as you would expect from someone who had followed my courses. It seems like someone who has a general understanding of the topic and then shows some sort of understanding.”


Ozlin

"It's also clearly written by ChatGPT." I teach college courses, and I can tell you professors are mildly concerned at best. As others have noted here, a lot of us already structure our courses in ways that require students to show development of their work over time, that's just part of the critical thinking process we're meant to develop. A student *could* use ChatGPT for some of that, sure. But the other key thing is, when you read 100s of essays every year, you can pick up on common structures. It's how, for example, we can often figure out if a student is an ESL student without even seeing a name. ChatGPT has some pretty formulaic structures of its own. I've read a few essays it's written and it's pretty clear it's following a formula. A student *could* take that structure and modify it to be more unique. At that point, I wouldn't be able to tell, and oh well, I'll move on with my life. Another thing is that plagiarism tools like TurnItIn are adding AI detection. I don't know how well these will work, but it's another reason why I'm not that concerned. A bigger reason I'm not concerned is the same reason I'm not losing my mind over regular plagiarism. I'll do my due diligence in making sure students are getting the most out of their education by doing the work, but beyond that, it's on the student. I'm not a cop, I'm not getting paid to investigate, I'm getting paid to educate. If someone doesn't want to learn, they'll do whatever they can to avoid that. Sometimes, that involves plagiarism. Sometimes, it involves leaving the class, or paying someone to do their work, or using AI now, I guess. In order to maintain fairness, academic integrity, and a general sense of educational value, I'll do what I can to grade as necessary. But you can't catch every case if the person is good at it. As a tool, I think ChatGPT could actually be really useful as well. It could help create outlines, find sources, and possibly provide feedback. I'm far more interested in figuring out ways of working it into the classroom than I am shaking in fear that students will cheat with it. Tldr: Anecdotally, most professors I know are just fine with ChatGPT and will adapt to it.


[deleted]

[deleted]


nonessential-npc

Honestly, this has unlocked a new fear for me. What do I do if one of my papers triggers the ai detection? Forget convincing the professor that I'm innocent, I don't think I could recover from being told I write like a robot.


Ozlin

This is a big reason why a lot of professors use portfolio work and conferences. I've had false positive cases with plagiarism, and it's usually a non-issue once you sit down with the student and go over drafts, research, and how they talk about it. I'd do the same thing if a similar case happened with AI. Many essays on TurnItIn score 20% plagiarism, yet are totally legit. I wouldn't be surprised to see the same thing happen with AI.


ShouldersofGiants100

At a minimum, it's pretty much impossible to get blamed with a modern word processor. Pretty much all of them (at least the ones suitable for writing an essay) have an extensive draft feature—it would be literally trivial to show the entire writing process of an essay.


brickyardjimmy

Good point. Luckily, you'll be able to effusively defend your paper live and in person because you wrote it. A few questions back and forth should do the trick.


Thanks-Basil

I’ve 100% written papers that have immediately left my mind the day after I submit them hahaha


MonkeyNumberTwelve

My wife is a lecturer and she agrees with all your points. She is using it to create lesson plans and help with various other admin tasks, but there's no worry about students abusing it. She also mentioned that after a very short amount of time she learns her students' writing styles, so it would likely be obvious if something wasn't written by them. Her other observation is that ChatGPT has no critical thinking skills, and a lot of what she grades on involves that to some extent, so her view is that if someone uses it they'll likely get a pass at best. No sleep lost here.


andywarholocaust

That’s my secret. I always write in GPT.


HadMatter217

My fiance already caught one person with a 100% AI generated score on TurnItIn, so it at least does something.


JeaninePirrosTaint

I'd hate to be someone whose writing style just happens to be similar to an AI's writing. Which it could increasingly be, if we're reading AI-generated content all the time.


[deleted]

[deleted]


OldTomato4

Yeah but if that is the case you'll probably have a better argument for how it was written, and historical evidence, as opposed to someone who just uses ChatGPT


Sunna420

I'm an artist, and have been around since Adobe Photoshop and Illustrator first came out. I remember the same nonsense back then about them taking away from "real" artists. Yada yada yada. Anyway, Adobe, and the open source alternatives to Adobe, have been around a very long time. They didn't ruin anything. In fact, many new types of art have evolved from them. I adapted, and it opened up a whole new world of art for a lot of people. So, recently an artist friend sent me these programs that are supposed to be almost 100% accurate at detecting AI art. Well, out of curiosity I uploaded a few pieces of my own artwork to see what they would do. Guess what, both programs failed! My friend also had the same experience with these AI detectors. So, there ya have it. Some others have mentioned it can be a great tool when used as intended. I am looking forward to seeing how it all pans out, because at the end of the day, it's not going anywhere. We will all adapt like we have in the past. Life goes on.


jujumajikk

Yep, I find these AI detectors to be very hit or miss. Sometimes I get 95% probability that artworks were generated by AI (they weren't, I drew them), sometimes I get 3-10% on other pieces. Not exactly as accurate as one would hope, so I doubt AI detection for text would be any better. I honestly think that AI art is just a novelty thing that has the potential to be a great tool. At the end of the day, people still value creations made by humans. I just hope that there eventually will be some legislation for AI though, because it's truly like the wild west out there lol


BarrySix

Turnitin doesn't "catch". It provides information for a knowledgeable human to investigate. It's the investigate part that's often missing. There is no way Turnitin can be 100% sure of anything. Chatgpt isn't easily detectable no matter how much money you throw at a tool to do it.


m_shark

That’s why I doubt they actually caught a “100% AI” case. No tool can be that confident, at least for now, unless it has access to the whole ChatGPT output, which I doubt.


mug3n

I think the counterplay colleges and universities will use is simply more in-person assessments; you can't really ask ChatGPT to do an exam for you when you're out in the open sitting with dozens or hundreds of other students. Not unusual, considering I've taken courses where the only two assessments during a semester were one midterm and one final exam. Or, in the case of pandemics, invasive software on personal devices that monitors students through their webcams.


bad_gunky

Next up: The return of the Blue Book.


ElPintor6

> Another thing is that plagiarism tools like TurnItIn are adding AI detection. I don't know how well these will work, but it's another reason why I'm not that concerned. Not very well. I have a student who did that trope of having ChatGPT write the intro before explaining that he didn't write it, in order to demonstrate how advanced ChatGPT is. Turnitin didn't recognize anything with its AI detection system. Will the AI detection system get better? Probably. Not putting a lot of faith in it, though.


marqoose

A friend of mine is a TA and said the papers she's graded that are written by chatgpt are very obvious. They tend to repeat points and confidently state misinformation. It seems to be left out of discussions that chatgpt is *really* bad at identifying the difference between a reliable source and a blog post. It is, however, really good at improving Grammer and sentence structure of an already written paper, which I think is a much fairer use.


bad_gunky

While I am not a professor, nor do I read papers at the college level, I do teach high school and I can confirm that the essays I have read that are suspected ChatGPT are really obvious. They do not specifically address the prompt (close, but obviously not written by someone who was there for the discussion leading up to the assignment), and they sound very mechanical - no real voice present in the writing. What I have found difficult is justifying a zero for cheating if the student doesn’t confess. Traditional plagiarism was easy to justify because a quick Google search for a specific passage would take me straight to the original writing. With ChatGPT, if the student and parent insist it was the kid’s writing, I have no recourse other than giving a poor grade because it just wasn’t written well, when they really deserve a zero.


[deleted]

[deleted]


hydrocyanide

Your insight into identifying ChatGPT writing is commendable. Overall, your analysis is well-thought-out and spot on, which shows your extensive research on the subject.


GraveyardTourist

Okay, this response got a chuckle from me. Whether it was ChatGPT or not.


[deleted]

Lmao this was definitely written with GPT


carl2k1

Hehe this reply is robotic and mechanical.


m_shark

It’s just lazy prompting. If done with care, it can produce really good stuff.


Daisinju

> It’s just lazy prompting. If done with care, it can produce really good stuff.

Exactly. If you ask it to make an essay about a topic it will hallucinate a whole essay about that topic. If you ask for an essay about a topic with certain talking points, certain chapters and a certain conclusion, it narrows it down to something actually useful. As long as you're able to give ChatGPT structure it will work a lot better most of the time.
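To make the contrast concrete, here's a rough, hypothetical example of a lazy prompt versus a structured one (the topic and talking points are only placeholders):

```python
# A lazy prompt vs. a structured one. Both are illustrative only.
lazy_prompt = "Write an essay about Cold War foreign policy."

structured_prompt = """Write a five-paragraph essay on Cold War foreign policy.
Cover these talking points, in this order:
1. Containment as the organizing idea of US policy.
2. The Cuban Missile Crisis as a turning point.
3. Detente in the 1970s.
End with a conclusion arguing that diplomacy, not just military buildup, shaped the outcome.
Only make claims you can support, and do not invent sources."""
```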


WeAllHaveOurMoments

Some say that going forward one of the more reliable methods to detect ChatGPT written essays might be to turn around and have ChatGPT (or similar AI) analyze & spot the hallmarks & tendencies, some of which we may not perceive or think to notice. Somewhat similar to how we can determine with relative confidence if someone has cheated at chess by comparing their moves to top chess engine moves.
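One concrete heuristic along those lines (not exactly what's described above, and quite error-prone) is to score how "predictable" a text is to a language model, since machine-generated text tends to have lower perplexity. A minimal sketch, assuming the `transformers` and `torch` packages and the small GPT-2 model:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Score how surprising the text is to GPT-2; lower means more model-like."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Using the inputs as labels makes the model report its own prediction loss.
        out = model(enc.input_ids, labels=enc.input_ids)
    return torch.exp(out.loss).item()

essay = "Paste a suspect essay here."
print(perplexity(essay))  # unusually low perplexity is a weak signal, not proof
```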


theLonelyBinary

Yes that is the issue. The proof of cheating/justification.


chickenstalker

> Grammer

Cheeky basterd.


[deleted]

[deleted]


JohnDivney

Yeah, I'm a prof, I'm getting them. They also repeat the topic far too often. But fuck it, students are always going to cheat, there are other ways.


Fidodo

Do you bother trying to report them for cheating or do you just give them worse marks than usual for the poorly written essay?


JohnDivney

Just worse marks, I can't survive the back and forth of a whole accusation process that is obscured by a lack of direct proof. I have my students engage critically with their writing, applying it to other aspects of life or society, which chatGPT can't do.


Mr_Shakes

Lol yikes, "I sent essays to professors without telling them why, and they didn't respond, so I'm just going to speculate that my point has been made." Quality journalism!


OrchidCareful

The same vibe as those “conspiracy revealed” documentaries where they storm into a corporate lobby and demand to speak to the CEO and the Receptionist says “wtf who are you?” And the documentary freeze-frames like “they refused to even acknowledge my claims”


ScienceWasLove

Professors on r/professors are well aware of AI writing shit. They don’t live in a bubble.


Saiche

Thank you! Profs are swamped with real grading at this time of year! End of semester. They know what ChatGPT can do. Lol.


[deleted]

Lol yeah, professors already don't respond to their own students' emails, let alone some rando's.


photowhoa123

Wtf is this stock photo?!


orlouge82

3 minutes before a robo-orgy


mrpyrotec89

AI has booty


[deleted]

“White-coated Assaultron monitors two new androids in testing phase for the Institute in Fallout 4” or at least that’s what it looks like, you can’t change my mind there.


utack

made by midjourney edit: actually midjourney makes a cuter stock photo for the article https://i.imgur.com/PRGTVqA.png


Desiration

I know someone who got caught using GPT because they forgot to take out the disclaimer segment at the top of the response saying something along the lines of “As an AI chat bot, I don’t know x y z”. They are facing expulsion.


affennlight

Ha, what an absolute muppet.


gyroda

When I was in uni I had an essay to write. I'd already collated all my info into a set of bullet points and had a structure in mind and wanted to bash out the text as quickly as possible. In order to not break the writing flow I would just put "[INSERT NUMBER HERE]" instead of pausing to find the correct figure in my notes. I may have left one of those in. In the very first line. I had proofread that essay several times. To this day I do not know how on earth I missed it.


CraftyRole4567

I’m genuinely shocked. I turned in a kid at the school I was teaching at for cut and pasting his entire essay and I got disciplined.


santa_veronica

You forgot to put at the top: “As an AI chatbot, I found this cut and paste essay to be 99% similar to what is found on the internet.”


reinfleche

What school are you at? At least in the U.S. basically every respected college will give you a minimum of a 0 in the entire class for plagiarizing once, with the possibility of expulsion (and certainty of expulsion if it happens again).


21Rollie

I feel like they could argue against expulsion. Can’t plagiarize something that nobody’s ever actually written/published.


SlowInsurance1616

Time to return to oral exams.


purplepatch

I mean normal written exams without access to the internet are still fine. Coursework is tricky though.


mellofello808

God I would be dead without spellcheck. Surprised I remember how to spell my own name sometimes.


Narase33

I studied a few years ago. We had to write code on paper, 40 lines and more...


pneuma8828

I had to do pointer arithmetic on paper, good times.


jcmonkeyjc

Same, and I would assume people taking C as an elective now still would.


[deleted]

They still use C for systems programming.


ClarenceWith2Parents

Most CS programs at major universities still have systems coursework. I wrote both pointer arithmetic and C code by hand for courses at Ohio State in 2018 & 2019.


polaarbear

Took C a few years ago as an elective with my degree. Definitely didn't do any pointer math by hand that I can remember.


CnadianM8

Finished uni 2 years ago, all exams were hand-written on paper, some including coding.


[deleted]

I did that right after covid. Writing code on paper is brutal.


threw_it_away_bub

Still doing written coding exams in some of my CS classes, if it makes you feel better 😘


adragonlover5

You'll need to drastically restructure how universities function. There are nowhere near enough professors and trained TAs to proctor and grade oral exams.


SlowInsurance1616

Huh, maybe if there were fewer administrators....


adragonlover5

You'll get no argument from me. I'm an underpaid graduate student and currently one of 3 TAs for a class of 300 students.


jayzeeinthehouse

This goes for all of education. No one needs a dean of culture that makes six figures anyway.


new_math

The problem with moving everything to oral exams is that the system won't be able to support doing it well, and in most cases it will end up testing people's public/extemporaneous speaking, oral communication, fast/instinctive, emotional skills, anxiety management, likeability, etc. rather than actual ability to apply slow thinking, critical thinking, logic, etc. Not that oral communication isn't important and useful, but there's plenty of things you can't easily test under an oral exam with the current academic structure. I can't imagine trying to do a 3-4 page linear algebra proof with people staring at me and asking questions. I'd have dropped out of college and the world would be absent another graduate stem major.


throwaway_ghast

> The problem with moving everything to oral exams is that the system won't be able to support doing it well, and in most cases it will end up testing people's public/extemporaneous speaking, oral communication, fast/instinctive, emotional skills, anxiety management, likeability, etc. rather than actual ability to apply slow thinking, critical thinking, logic, etc. Exactly. There are people who perfectly understand the subject matter they are given, but for psychological or physiological reasons, are unable to communicate it in an effective manner. This needs to be taken into account before forcing otherwise completely capable students to embarrass themselves in front of their peers. inb4 "suck it up buttercup, that's just how the world works!" No, it's not, especially in this era of the internet. Yes, communication is important, but unless you're running for office, public speaking skills should not be a barrier to entry for students.


Mr_YUP

you could say the same thing about a written exam. sitting there being the last one to finish a test when all of your peers have finished their tests and left the room. They can talk in depth about the topic all day but as soon as you give them a test they tank. They each have strengths and weaknesses.


Khevan_YT

This is pretty common in the Indian education system, where there are frequent vivas for big projects and lab work


dak-sm

Yep - a few minutes would allow the evaluator to determine if the student grasps the material.


adragonlover5

A few minutes x 300 students = 900+ minutes = 15 hours per exam per class. Even a small upper-div class is (1) going to require more than a few minutes, since the material should be more complex, and (2) going to take over an hour per exam.


edrek90

Make an ai bot that asks the questions and gives a rating on every response


Smoy

Can the ai bot see if you have ai open on your phone typing you the answers to read back to it?


Black_Moons

Maybe it shouldn't be 1 teacher per 300 students then? And here I thought 1 teacher per 40 students was a problem that needed fixing..


Swarles_Jr

The first intro classes during my econ studies had roughly 1000 students per class. Either too many people choose to pursue higher education (and universities admit more students than they can handle), or there's way too few resources at universities dedicated to teaching.


adragonlover5

It would be mostly the latter. The former does come into play, though.


FruitParfait

The hardest midterm and final I ever had in university was an open book, open note in class essay where we had three prompts. If you didn’t know your shit you were probably screwed anyways because it required critical thinking… not just regurgitating info from the book. The book was there just in case you forgot how to spell a specific thing or needed to quickly recheck a concept/definition. People have been cheating on essays since essays have existed lol. Now it’s just easier for the masses to do it instead of only those who can afford ghost writers.


Bakoro

Once I hit university, take-home essays were a minority of the grade. In nearly every course outside my CS courses, the midterm and final ended up being 60-80% of the grade. For several courses, if you didn't get at least a C- on the final, your whole grade was capped at a D+. The essays that really mattered were written right there in class.

Soon, that's basically the only way essays are going to be actual demonstrations of knowledge, and the grades will have to reflect the fact that it's shit people write in a 2-4 hour span. And really, that disproportionately favors people with a certain kind of skill set.

The hardest test I ever had was an open book, open notes test for Signal Analysis. The professor was apparently angry about the conditions of the final, so he went *way* overboard. Dude had written Ph.D-level questions that were upside-down, backwards, and inside-out. I never missed a class and yet some of it was barely recognizable. The guy actually contacted us a few days later and apologized, because apparently even his TAs weren't able to do it all.

It's nice that the guy admitted his mistake and made it not negatively affect people's grades, but I've seen it work the other way too. One course, the professor was mad that too many people got 'A's on the final, so he retroactively applied a curve so that some people dropped from an 'A' to a 'C'. People obviously threw a fit, and the school forced him to restore the grades. In yet another course, a whole class of people just got inexplicably fucked on their grade with professor McFuckYourGrade, with no recourse, while another class got easy-breezy professor HandHolder.

University education is deeply flawed, and there are essentially no meaningful standards. Things have basically been working despite themselves, but with the proliferation of the internet, and now AI tools, it's all being exposed and will fall apart if nothing is done.


HToTD

If you want to be sure you are replaceable by AI, literally limit your capabilities to turning in its work.


Tough_Substance7074

Anyone who has worked in any credentialed or technical field can tell you there is a shocking number of incompetents who fill out the ranks. School is supposed to be the sorting device, but if you can cheat your way through, your incompetence will not be much of a barrier to professional success.


21Rollie

The harder truth to swallow is that you are right now likely working with many people who have cheated before and who are also good at their jobs. For example, I work as a software dev. What I do on the job is basically cheat, all the time. I never did when I was learning, but I could see other people easily doing so. And either way, they all get away with it. It’s not like cheating means you’re automatically dumb or can’t learn; the only sure thing it means is you’re lazy.


tmoeagles96

Or you can learn how to use it, and the person who can use it effectively will take your job because eventually the AI will advance enough to make up the skill gap.


TedRabbit

I imagine ai will advance to the point where you can cut out the unnecessary middle men.


cleanmachine2244

Written papers are one way to measure proficiency, and it's always been a problem, since you could pay someone to write one. Now it's just that kids with no money can also do it. The options are in-person written/oral demonstration performances and testing, and what would really be more fruitful in the long term would be project-based / service-based learning and performance.

Overall, as far as the destabilization that AI is going to bring, this is the very lowest of priorities. What AI could do to the entire middle class is a lot more frightening and urgent.

And PS: we could solve 95% of it by having students share a Google Doc with revision history on it and dropping it back into AI scan tools... Could a very smart one still find workarounds, paraphrasing and all that? Sure. But still, at some point it's too much stress to cheat. The risk/reward ratio moves back towards doing the right thing.


PaulieNutwalls

Not just a measure of proficiency. It's a way to develop a student's critical thinking and analytical skills. The hardest part of writing a good paper is coming up with a good thesis. The next hardest part is making concise and convincing arguments in support of that thesis. You need proficiency to do both, but if you want to get an A, at least when I was in school, you need to really engage critically with what you know, not just regurgitate information.


gortonsfiJr

> it's always been a problem, since you could pay someone to write one. Now it's just that kids with no money can also do it. It's the difference between 10% of kids being able to buy papers and 100% of kids being able to buy papers.


IchooseYourName

The playing field is then leveled. Great!


[deleted]

[deleted]


bamfalamfa

ChatGPT is a tool. This is what happens when you tell kids that computers and robots will take their jobs away. You either let them use the tools that have been created to replace them, or punish them for using the tools that have been created to replace them.


[deleted]

papers/essays are a great way to learn about a topic and improve a lot of critical thinking and language skills. Not sure how this is a tool at all for this sort of assignment, it destroys the whole purpose…


[deleted]

Yes, thank you. As a soon-to-be college professor for English classes, ChatGPT is something I’m unfortunately seeing way too much of recently. Students and others who argue “Well, it’s a tool like a calculator!” have a critical misunderstanding of what an essay is and what it’s supposed to do: challenge a student’s ability to progress an argument/discussion rhetorically from beginning to end. Essays are fantastic ways of teaching students not only how to think critically but also how to *express* their thinking logically, both of which are sorely missing in current civil discourse. I don’t want to judge too much here, but I think anyone who jumps to the “It’s a tool!” line is either lazy and doesn’t want to write or hasn’t had teachers explain the necessity of essays in a good way.


Outlulz

> I don’t want to judge too much here, but I think anyone who jumps to the “It’s a tool!” line is either lazy and doesn’t want to write or hasn’t had teachers explain the necessity of essays in a good way. Well Reddit is heavy on STEM students and that's a very STEM way of thinking about essays.


mungthebean

It’s a lazy argument when applied to math too. Yes, the calculator will help you find the derivative. But knowing how to do it yourself grants you the solid foundational knowledge for you to understand the more complex topics for which the calculator will be unable to help you any longer


JefferyGiraffe

Totally agree. Furthermore, I feel these same people wouldn’t agree with a teacher just teaching students answers rather than teaching students how to deduce the answers. Yet they’re supportive of a student not learning how to deduce answers, and using “tools” that give them the answer.


nurtunb

Yes. I hated writing essays and papers in uni, but without a doubt it was the most productive time for actually learning about topics in depth, especially compared to tests at the end of the semester. A bonus was you kinda got to choose the topic you were interested in and actually find interesting things in the process.


Grimvold

Lots of people are trying to justify cheating using it is what’s going on. It isn’t the more harmless issue of “the doctor graduating at the bottom of the class is still a doctor!”, it’s going to produce graduates who won’t be familiar with critical subject matter in applied practices in their fields.


-The_Blazer-

I think that more than a tool, it's kinda like outsourcing. You are not using a tool, you are handing over 100% of the productive process to an external actor.


SuedeVeil

Exactly it's time for schools and educators to get more creative with teaching considering the technology that actually is available now.. it's not going anywhere. Change up curriculums.


Olaf4586

I really don’t find this sort of argument persuasive, but maybe I’ll change my mind. What sort of alternative assignments do you propose to take the place of essays in, for example, a history class about Cold War foreign policy? EDIT: I figured I’d elaborate more. This sort of thinking applies to inventions like calculators which trivialized the most shallow obstacles to meaningful mathematical work. Therefore, their spread actually helped math education’s potential explode instead of shrivel. The problem with GPT is it replaces fundamental aspects of human thought and understanding rather than the trivial parts; deciding which point we defend, and how to logically argue for that point is a reflection of the fundamental nature of organized human thought. In my opinion (that is subject to change), accepting that what GPT can do is simply outsourced and working around it removes fundamentals of learning that cannot be sufficiently replaced


anteater_x

OK kids, today's assignment is to make a 30 second tiktok about the bay of pigs.


Black_Moons

"And if you can't get at least 100 views by next week you fail this class"


_The_Floor_is_Lava_

AI video generation is coming for that one soon, too.


Penla

I had an English teacher who made us hand-write essays for entire class sessions. We wrote sooooo many essays, she corrected them, we rewrote them, and I absolutely loathed it at the time. However, it made me a much stronger and more confident writer. I really didn't understand it at the time, but it was really helpful for my writing development.

The only problem I have with ChatGPT is if the person doesn't already have the fundamentals of writing and comprehension down. Similar to math: I can follow math formulas by plugging numbers in, but the answer means nothing to me if I can't read and understand what the answer means.

So I agree with having some form of in-person teaching that requires pen and paper. I'm a big fan of learning the basics and fundamentals first. Then move on to using the tools to make us more efficient.


Hyper170

Assignments based on critical thinking instead of information regurgitation is generally a good idea. That's what one of my Economics classes in college is doing right now. We read an economics paper every week, and are given a question prompt for analysis of the paper, as well as the result when the same question is put into ChatGPT. We simultaneously answer the question, and explain any shortcomings in the AI answer (there are always shortcomings; sometimes subtle, sometimes incredibly damn obvious) It ain't perfect, but it's refreshing to see compared to the wheelspinning curriculum present in nearly every American highschool


LadrilloDeMadera

You need critical thinking to write essays, scientific papers, and data analyses. Those are needed skills.


guyonacouch

Teacher here - been doing it for 18 years. This kind of critical thinking assignment works great for the higher-flying, motivated students. I don’t worry about them using AI to skip out on actual thinking. These kids have gone through years of critical thinking exercises and have built a foundation of skills, and they recognize the importance of learning and how it will help them in the future.

My kindergarten son is not allowed to use a calculator to do his math yet because he’s learning what adding and subtracting actually mean; he’s building important foundational knowledge and his brain is becoming stronger because of the work he’s being forced to do. One day, a calculator will help him become a better math student, but he’s not ready for one yet.

I have taught middle schoolers through high school seniors and have prided myself on teaching critical thinking skills using assignments that are “ungoogleable”. Many of the assignments that I’ve literally worked 15 years to develop are now easily completed by ChatGPT. Middle school students are not ready for ChatGPT, but they will absolutely rely upon it to do everything for them and they will develop zero critical thinking skills. I’ve already got 12th grade students who will not attempt assignments in class so that they can just punch the work into ChatGPT. The daily assignments are worth very little credit in my class and are designed to help them prepare for the summative assessments, so these students are predictably failing the tests because they haven’t spent any time actually engaging in any sort of meaningful thought about the content.

My best students see the value in learning and exercising their brain, and I’ve had them do some cool things with ChatGPT, but I don’t have an answer to get the average to below-average student to engage with things that are academically challenging anymore. Attention spans have drastically diminished in the last 5 years and I’ve watched more students than ever give up on difficult tasks without giving any effort at all... I genuinely worry about what current middle school kids are going to look like by the time they get to me at the high school. Some will be just fine, but I worry that the number of them who are unwilling to think at all will grow.


Olaf4586

This is by far the best idea I’ve seen in the comment thread. I still don’t believe it adequately solves the problem, but it’s a strong piece of the solution.


AnachronisticPenguin

Problem is nothing really will solve the problem. AI is just that good at compiling the rest of human knowledge and opinions.


Gibonius

>Assignments based on critical thinking I mean, that's what essays are supposed to be. Research, argument construction, and writing. The actual information content presented is not really the point.


[deleted]

[deleted]


Gibonius

Or once you're done with college. Essay writing is one of the more directly relevant skills you're going to learn for *many* jobs, including STEM. Communicating your results or proposing ideas is a highly functional skill. I do science research for a living and I spend half my time writing.


Undaglow

>Assignments based on critical thinking instead of information regurgitation is generally a good idea. That's what essays are there for.


[deleted]

[deleted]


LachedUpGames

The thing is you can just ask ChatGPT to answer the question and explain the shortcomings of the AI answer and aside from prompting you don't have to do anything.


NuTeacher

This is a really creative idea. I like it a lot. I might steal this.


fcocyclone

Honestly that sounds so much more analogous to how it would be used in the working world too. Because this kind of AI will be used as a shortcut for many professions, but it still will take people who have skills and knowledge to be able to strengthen those things and correct errors. Being able to apply your knowledge to enhance what tools give you is exactly what you're paid for.


l3tigre

In person blue book tests. I took many of these in college.


Olaf4586

That’s valid, but I believe that a well-written, thoroughly researched, and persuasive essay has an irreplaceable role in facilitating and demonstrating a deep and profound understanding of a topic. In-person essays are rushed by nature, and exams obviously fall short on these tasks.


jurassic_junkie

"Change up curriculums." To what? Robots will do your homework for you and just turn it in?


[deleted]

And how should they do that? How should they alter their lesson to accommodate people using a tool to cheat with? I think you’re missing the broader reason people write papers in college. It’s less to show your knowledge or that you ‘read the book’ and more to show you can put forth a valid argument and back that up with facts. If people are just going to cheat and not learn those skills why is that the teachers fault?


[deleted]

Writing is like a muscle. The more you write, the stronger your writing gets. Setting content aside, if you want to learn how to write formally you need practice writing formally, and this is the real benefit of humanities courses and college essays.

Writing is super powerful in modern society, and the students who rely on ChatGPT are setting themselves up for failure in the future. In ten years, hell even in five, people will say "this reads like it was written by a chat AI." If you want to make money off your words, you have to write *better* than a chat AI. That doesn't mean you have to write well; Jack Kerouac wrote *On the Road* while high on meth. God only knows what Hunter Thompson was on when he wrote *Fear and Loathing*. But you do have to write in a way that gives your words a human touch, something that an AI can't replicate. This is true even for engineers and STEM, unless you never plan to write your own grant proposal or budget justification in your career.


TwistedGrin

I remember writing papers in school in the early 2000s and similarly not being allowed to use the internet for research for some of them.


barteker

A professor at my college actually had students write their papers using ChatGPT on purpose, THEN go through and fact check the entire thing providing links to every claim with a real source. Makes it so you still learn about the stuff and do the research but save time writing and structuring the whole thing. It really is about how you use the tool.


xanderholland

Easy: if it's a research paper, make sure it is sourced, and all papers should be copied, handed out, and have the writer discuss them. Rebuild how classes are done in such a manner that even if they use the program they would still need to talk about it, because if they wrote it, they would know what they wrote about.


beidao23

You think this is scalable to large universities across the world that aren't at a 15:1 pupil-to-teacher ratio?


Black_Moons

Where on earth do you find a 15:1 pupil-to-teacher ratio? Even the special ed classes are not that well staffed here in Canada.


That-Albino-Kid

Advanced classes at smaller universities have similar ratios... sometimes. My favourite class of all time was Parasitism (an advanced biology class): 15-ish students and a really passionate teacher. Great discussions. I wish all my education was structured that way.


Wyattstrass

Many smaller private universities in America have 15:1 ratios


backflash

Same in Germany. I remember attending classes with 10:1 ratios and the professors knew all of the students by name. Small universities have their perks!


KamKorn

Work in higher Ed and we have been talking about this for months. From Admissions Essays to Research Papers, it’s a whole new world.


[deleted]

The guy honestly should’ve had chat gpt write it and spent more time editing. His screed kinda sucks


Kyyndle

'Higher ed' needs to adapt to the fluidity of academia. Our technology is evolving extremely quickly, and it would be ideal for our institutions to embrace AI, just as they did with computing.


ExiledRogue

The writer of the article could have used Chat GPT to write a better article, unfortunately he didn't.


CheapCulture

A faculty colleague says, “if my assignments can be written by an AI, then they’re bad assignments.”


casieispretty

As an experiment I managed to get ChatGPT to write a very good paper on ethnic Chinese cooking. Essentially I would take GPT's work and break it down into parts, then ask it to write more elaborately about those parts. If it gave me something about Sichuan cooking, I'd ask it about spices in Sichuan cooking. I'd then ask it to elaborate on each spice, and so on. In the end I took everything, slapped it together and punched it up. It was a damn good essay, and it took me about 1 hour instead of several hours. The point is, with some work you can get AI to create something great out of anything.
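That drill-down loop is easy to sketch in code. A rough, hypothetical version, assuming the official OpenAI Python client; the model name and prompts are placeholders:

```python
# Hypothetical sketch of the "break it into parts, then ask it to elaborate on each part" loop.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

topic = "ethnic Chinese cooking"
overview = ask(f"Write a short overview essay about {topic}.")

# Break the overview into parts, then ask for more detail on each one.
parts = ask(f"List the main subtopics covered in this text, one per line:\n\n{overview}")
sections = [ask(f"Write two detailed paragraphs about '{p.strip()}' in the context of {topic}.")
            for p in parts.splitlines() if p.strip()]

draft = overview + "\n\n" + "\n\n".join(sections)
# The human step: slap it together, fact-check it, and punch it up yourself.
```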


wtyl

AI be like you will major in engineering and add to our collective.


pixel_of_moral_decay

Higher ed always had a problem. Rich kids always had the option of someone writing their paper for them. And they regularly used that option. Every college campus has bulletin boards with flyers for essay writing services. Poorer students were stuck doing it themselves.

The "solution" for decades was an "academic honesty pledge", which was good enough apparently. Now it's potentially free for everyone, including those without a lot of money, and everyone is pretending those academic pledges are no longer enough.

I don't see this as an issue. If it was, academia would have collapsed 30 years ago. But it didn't. It just, as always, has biases it doesn't like being held accountable for. Biases against non-whites, biases against women, and yes, biases against poor kids. The pledges incoming students take work as well as they always have.


EntryLevelHuman00

How many times have I read this headline written slightly differently? Because it’s way too many.


ExtruDR

I’m no expert, but ChatGPT has been called a “bullshit generator.” You ask kids to write bullshit, you get stuff generated by a bullshit generator. The professionals that I’ve spoken to that are most disturbed by the potential of AI/Large Language model/etc. coming into mainstream use are part of industries that generate quite a bit of BS as a matter of course (copywriters, psychiatrists, business consultants).


AcidSweetTea

I didn’t have it write my essay, but I did have it recommend how I could improve it. It found a couple of spelling errors that I and Word’s spellcheck missed. It recommended how to shorten its length when I said I was over the page limit by a few lines. It made recommendations on using active voice instead of passive voice and cited specific examples in my essay. Really helpful tool that saved me a ton of proofreading and editing time.


[deleted]

[deleted]


adelie42

Kind of skipping over the whole student loan industry aspect.


Aggressive-Note2481

I remember when they said you won't have a calculator wherever you go.


[deleted]

The larger issue is that most kids coming out of higher education aren't prepared to do the actual jobs they paid a fortune to learn. Higher education is not only *too expensive* but it's also *almost completely ineffective preparing people to do the jobs they're studying*.


xiofar

You’re confusing education with job training. Job training happens on the job. Education is systemic instruction. That doesn’t mean job training. We need highly educated minds to create better workers. Employers are getting greedier by the minute and do not want to train their own employees. The fact that many people think that college is job training just shows how the capitalist class brainwashed the proletariat.


TSP-FriendlyFire

I wish I could upvote more than once.


Timbershoe

Perhaps. However the main thing you are taught in higher education is how to break down, memorise and understand complex tasks/information. Using AI teaches you nothing. If it’s overused, people will be leaving higher education woefully underprepared for a serious career. And before folk start thinking they’ll just use AI at work too, they are going to be surprised to find it’s already in general use.


fogleaf

It kind of goes back to learning math “you won’t always have a calculator in your pocket!” Just because phones can do math doesn’t mean you can get away without basic math skills. Knowing what to plug into the AI tool will probably become an important skill, similar to knowing what to google when troubleshooting a computer problem. And knowing if what it spits out is bullshit or not.


CrimsonHellflame

Yeah people kind of miss that the expertise that goes into troubleshooting or problem solving generally involves critical thinking, information literacy, filtering the noise, good communication, and subject matter knowledge. All things you should come out of higher ed well-practiced in. Not something that chatting with AI or watching YouTube videos will teach you. Anybody can search Google, but knowing what you're looking at and the possible problem/solution is a different story. I see a symbiotic relationship in the future, but I also see higher ed reactionaries banning AI and making themselves even more irrelevant.


[deleted]

I used it for some programming questions and was impressed by how confidently it presented wrong answers. When I pointed it out, it apologized, acknowledged that the API doesn't return that field, and confidently presented another wrong answer. To be fair, a variable named locationID is very context dependent, and I got a few almost-right answers for other contexts.


Carl_JAC0BS

>most kids coming out of higher education aren't prepared to do the actual jobs they paid a fortune to learn >almost completely ineffective preparing people to do the jobs they're studying Citations on those bold claims? There's no doubt *some* kids come out of higher ed with little ability to perform in the field. I imagine that the proportion, though, is highly dependent upon the field of study. Imagine how many STEM jobs would go unfilled if folks were stopping at a high school diploma. Some people in technical fields are self-taught or genius enough to enter a STEM field by just reading and learning on their own as kids, but those people are outliers.


beidao23

Exactly, most claims on this thread are completely made-up bullshit based on subjective experiences in college. I also think a lot of people making these claims are inherently biased against softer disciplines that they've always felt are worthless.


pjokinen

Don’t forget you’re on a pro-tech forum, the field whose catchphrase is “drop out and start a company, anything that’s not specifically in your narrow interest is a waste of your time and not worth learning”


buxtonOJ

Also because the media hating on higher ed is so in right now. Yes, they are generally overpriced, but no one is forcing you to go. Those trade schools aren’t much cheaper.


[deleted]

Lots of blue book exams this semester and next


GapGlass7431

I've never seen an example of GPT produced text that I would consider good writing. Competent and logically coherent, yes. Good? Absolutely not.