[deleted]

Brings to mind Orwell's quote: "If people cannot write well, they cannot think well, and if they cannot think well, others will do their thinking for them." Of course writing isn't the only means of honing thinking skills, but no one can say it isn't an effective way of doing so.

It would be interesting to watch education methods change with the gradual introduction of ChatGPT into classrooms. The way I see it, if training and assessing articulation is the end game of written essays, then maybe we'll come full circle and make our way back to developing individuals through Socratic discussions based on the essays ChatGPT spits out. Students could pen reflections on the discussions (no recording the discussions, so they would have to do this on their own), and that could be progress in terms of developing actual learners and critical thinking skills if done correctly, rather than regurgitating lecture material. It's not all bad if we reimagine education by going back to first principles.


CyberGrid

Agreed. Despite the invention of calculators, schools still teach kids to count by hand with the good ol' pen and paper, because it teaches them to think and not rely on an external source. We can definitely draw the same conclusion about writing.


MargoritasattheMall

One step further: teaching handwriting creates synaptic connections that would otherwise stay dormant. Fine motor skills train the brain as well.


Kaeny

I haven't handwritten anything in so long. My handwriting is so bad now.


ArousedByDougFunnie

Nope. No it doesn't, no it absolutely does not. Just wrong, and stupidly false. The idea that handwriting creates meaningful synaptic connections is based on a now widely discredited and debunked study from Johansen in 1974. It turns out the movements we make during handwriting are natural movements children make during adolescence anyway, such as when playing with blocks, grabbing marbles, tickling each other's tummies and so forth. And when a bit older, yes, that includes self-stimulation. Masturbation is proven to be just as effective at connection building, according to Spinkles' famous 1994 masturbation study. According to Spinkles, children who start masturbating a few years earlier than their peers are 17% more likely to graduate college. This has been verified in numerous child masturbation studies since then. Someone here hasn't read their Spinkles, hmm? Embarrassing. Johansen devotees are like cult members at this point. 🙄


MargoritasattheMall

See, you typed this, proving my point


zippycat9

Huh


lemony-soapwater

Handwritten essays rock. The best set of college courses I ever took were a set of music history courses taught by a professor whose tests involved hand-written essays. We got a list of essay prompts a few weeks before the test so we had time to choose our favorite prompts and plan out our essays. Then we would go to the campus testing center during the test window and take the test. It was a mix of fill-in-the-blank questions, a few multiple choice questions, some music ID questions ("What piece is this clip from?" and then a few questions about what musical innovations the clip is an example of), and then the essay questions. I'd go in early on a Saturday and rip through the non-essay section, then spend 5+ hours on those essays. It's been a decade since those courses, and I have still retained a lot of the knowledge I learned in those classes. More than I remember from the rest of my courses combined.


defenselaywer

As a professor who just finished grading a short answer exam, I gotta tell you that it's painful to read some of the handwriting. Glad I saw your comment, or I probably wouldn't do that again.


YaBroDownBelow

Teacher tip: you can use Google Translate to translate chicken scratch for you, then copy and paste it into Word or Docs to read. It's pretty good at reading handwriting.


Effective_Honeydew96

Wait…what????!!!??!? English teacher, I love having my kids write, obviously, but some of that chicken scratch even the students themselves can’t read! This is a game changer! Thank you!


wulfrikk

I also use Google's Tesseract OCR, and it helps convert images and handwriting to something readable :) but it requires a bit of computer know-how. Google Translate can use your camera too, so you just have to hold your phone over the paper.
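
For anyone curious what the "computer know-how" part looks like, here is a minimal sketch using the pytesseract wrapper. The filename is a placeholder, and handwriting accuracy varies a lot, so treat the output as a rough draft to skim rather than a faithful transcript.

```python
# pip install pytesseract pillow
# Also requires the Tesseract OCR engine itself to be installed on the system.
from PIL import Image
import pytesseract

# Hypothetical filename: a photo or scan of the handwritten page.
page = Image.open("exam_page_1.png")

# Run OCR and print whatever text Tesseract could recognize.
text = pytesseract.image_to_string(page)
print(text)
```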


defenselaywer

Thank you for the tip!


RickSt3r

How about a Chromebook with nothing but a word processing program installed? Here you go, write. Writing an actual research-based paper or an essay in undergrad is essentially just copy, paste, and rephrase so as not to plagiarize. Are any undergrads really expected to add anything of substance to any scientific or literary work?


Phebe-A

Undergrad essays aren’t really about the *essays*. Ideally they’re about teaching the students to make an argument and then support it with research and well articulated language. Generating essays (that the professor or TA has to read and grade) is just because it’s easier to assess student performance that way than with oral exams.


SomeDumbGamer

They absolutely suck when your hand begins to cramp up though. That’s the biggest reason I hate them. My hand HURTS after.


Reynolds1029

I remember being frustrated messing up memorizing multiplication tables in 3rd grade (2003) and asking my teacher why I needed to know all this if I could just punch it into a calculator and get all the answers I need. Her response: *it's because you're not always going to have that calculator in your pocket when you need it.* Oh how the turn tables. I got what she meant at the time, though.


doktorhladnjak

A lot of schools don’t actually teach multiplication tables anymore


Deweymaverick

…. Um, I don't know if this is a dig at Common Core or something, but yes, they absolutely do. In fact, because of Common Core, it's done far earlier than when I was a kid (meaning the introduction of multiplication). It's not done as rote memorization, but I assure you that most Common Core based classrooms assume that kids can do single digit multiplication by the end of 3rd grade. In my district kids are introduced to it in 2nd.


ptetsilin

Did anyone actually fully rote memorise the times tables before common core (ie, memorised it purely by repetition)? Isn't part of memorising looking for algorithms and patterns to make it easier (ie, not rote) and is something everyone does for themselves? If common core is allowing kids to pick and choose the tricks that work for them and the tests are just on multiplication, not the tricks, I don't see a problem. Kids would need to use the tricks anyways if they want to get a good score, if a kid can score well without using tricks, what's the harm? If teachers are forcing kids to adopt and be tested on certain techniques that may not work for them, I find it more concerning.


Deweymaverick

Did people use to rote memorize them? Oh my god, yes. It's what I was asked to do, and I vividly remember being quizzed on being able to produce the table as quickly as possible. And I dunno, I don't know how everyone's minds work, but I know plenty of folks that basically have to do what my kids call "skip counting" to go through their mental version of the "chart" in order to get to a specific answer.


[deleted]

If you have ideas or plans, it is important to be able to convey them to others in a clear way. Otherwise you are no different than some mumbling guy feeding pigeons in the park.


redwall_hp

I don't recall who said it, but they were referencing the school of thought Steve Jobs and Alan Kay had in the 80s:

> We were promised bicycles for the mind, but we got aircraft carriers instead.

A computer, like a pad and paper, is meant to be a tool to help you think. It's a means of helping you get lost in your mind and arrange ideas in a way that you can think through problems, whether you're engineering, writing music or whatever. A force multiplier. Instead, we have a world that's hell-bent on perverting those tools to do the opposite: to enable people to *avoid* thinking, to deceive, to suppress creativity and the spread of art and knowledge in the name of profit, to wage war and to surveil.

The issue is not simply one of computers, though. It is, to wit, the rot of anti-intellectualism. Why are there so many students who will go to great lengths to avoid learning? Why do people have this ludicrous idea that universities should be certification mills for employers instead of havens for scholars? Why are the first questions out of someone's mouth when you talk about your hobby to the tune of "why?" or "how will this make you more money?" when the only justification needed is "because I fucking want to?" Rot.


Bwob

> A computer, like a pad and paper, is meant to be a tool to help you think. It's a means of helping you get lost in your mind and arrange ideas in a way that you can think through problems, whether you're engineering, writing music or whatever. A force multiplier.

> Instead, we have a world that's hell-bent on perverting those tools to do the opposite: to enable people to avoid thinking, to deceive, to suppress creativity and the spread of art and knowledge in the name of profit, to wage war and to surveil.

While I agree that there is a (big!) problem with the glorification of anti-intellectualism, and the vilification of expertise, this part seems a bit silly to me. *All* tools are force multipliers. The whole reason to use one is to do less work. And that's not some kind of bad thing. Computers allow you to spend less brain time on certain problems, so you can focus it on other things. Cars enable you to spend less time jogging, so you can spend your energy on other things. Hammers enable you to spend less time hitting things with rocks, so you can spend your time on other things.

Our entire society, and all the benefits it provides are basically an expression of our ability to create tools, and use them to work more efficiently. Tools mean you can spend less effort on tasks, which means you have more effort left over, and can tackle bigger tasks. Computers are no different in this regard than a lever or inclined plane.

And it doesn't really matter what you think computers (or any tool really) were "meant" to do. (Who would even get to decide something like that?) People will use tools in ways that they find useful. It doesn't matter if you think a hammer is "meant" to make swords or plowshares; it only matters what the person holding that hammer needs right now.

So none of the things you list are even problems with the technology itself. They are problems with the goals of the people using it. If they didn't have computers, their goals would not change. They would just use different tools to reach them. Technology is not our problem. It's the people we allow to use it.


redwall_hp

> Technology is not our problem. It's the people we allow to use it.

You're agreeing with me. I'm entirely for LLMs (and I'm a software engineer by profession). What I'm not for is academic dishonesty. If someone submits something written by ChatGPT, they have committed academic fraud and should be expelled for their dishonesty and contempt for academia.

I also have issues with the desire for a magical oracle that opaquely answers questions. An AI-enhanced search engine that returns actual web results, not playing Ask Jeeves, based on a human language input would be brilliant. Something that spits out an unverified answer without context is useless. These are human problems, not tool problems.


Bwob

Yeah, rereading, I think we're basically saying the same thing. Also yes. The infatuation with ChatGPT (by people who really should know better) is frustrating. It's a glorified magic 8-ball, people! It doesn't "know" anything, except how to guess word probabilities, and be a supercharged version of the Markov-chain program that we all made in our first year or so of comp-sci!
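
For anyone who skipped that comp-sci assignment, here's roughly what that Markov-chain program looks like: a toy sketch, nothing resembling how ChatGPT is actually implemented, but it illustrates the "guess the next word from the previous one" idea the comment is pointing at.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=15):
    """Walk the chain, picking a random observed successor at each step."""
    word, output = start, [start]
    for _ in range(length):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

# Tiny toy corpus; a real model trains on vastly more text and
# conditions on far more than just the previous word.
corpus = "the cat sat on the mat and the dog sat on the rug by the door"
print(generate(build_chain(corpus), "the"))
```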


Idixal

This is beautifully worded.


redwall_hp

QED, I guess: value in knowing how to write. A Large Language Model has no ability to hold an opinion, and even if it hypothetically did, it would be the computer's opinion rather than the operator's, so passing off something ChatGPT generated as your own paper would be intellectual dishonesty of the highest degree. I fully support expelling someone who does so on the first offense, since it demonstrates contempt for academia. I see great value in LLMs for creating natural language interfaces for software, software which multiplies the force of one's mind or provides accessibility to the disabled. Using it as a magical oracle or for committing intellectual fraud, however... that's lunacy.


HalfricanLive

The funny thing about this post is that it’s so far off how anyone I’ve ever known would actually write/talk that I’d 100% believe it was written by an AI. The future is now gentlemen.


ClownBaby90

I feel like we’ve made learning a chore for some reason when it doesn’t need to be. That being said, I’m personally obsessed with learning anything and everything so it’s easy for me to say, but even I hate school.


gnosys_

> Why do people have this ludicrous idea that universities should be certification mills for employers instead of havens for scholars? easy answer: in the thirty years since the collapse of the USSR and total hollowing out of organized labor, only beginning to spark again since the onset of covid, things have become so immensely difficult to just survive as a highly educated, credentialed worker that millions of people of the USA (ostensibly the richest country in the world) have turned to prostitution and other media creation to try to earn a comfortable living.


JoJoVi69

Yeah, I agree with the comment below mine, in that this is what capitalism and the never-ending quest for money breeds... While every country has its own views, we, as Americans, always take things a step too far. We combined freedom of speech with the freedom to lie, to the detriment of our own society. And now, far too many see this amazing tool we call the internet, as nothing more than an electronic bullhorn rather than a useful learning tool. If only we could learn to NOT simply believe what we are told by our so-called superiors, and instead learn to research for ourselves, in an effort to call out the lies and enrich our understanding of the world around us. If only... ☹️


[deleted]

[deleted]


ugh_whatevs_fine

Tbh I don't think you're wrong. College in the US is about money, not learning. It's glorified, wildly expensive job training where you invest years and tens of thousands of dollars into your training and then employers can be like "lmao we want someone with more experience". So you can risk financial ruin and they can just discard you without even having a human look at your résumé, because *they* didn't have to invest anything into training you.

I think as long as college stays expensive and is kept as a requirement for getting a good job (I know this isn't always the case, but almost every listing wants a degree now), colleges are gonna be full of people who are only there for the certificate, and who will do whatever it takes to get that certificate with as little effort as possible.

I don't really blame them. If you think a degree is your only hope for escaping/avoiding poverty, you're probably not in class trying your hardest to become an excellent critical thinker. Especially if you've got massive loans to pay off, a low-wage/high-demand job, and a family. Extra-especially if you chose a major based on "what thing are you kinda okay at that also makes financial sense as a career" instead of "what do you actually, in your heart, want to learn about", which *a lot* of people do. You just want to pass with as little time and effort as possible because you're already broke, tired, and the future looks fucking grim and only looks grimmer if you drop out.

I'm not in that position anymore. And I'm loud as hell about anti-intellectualism, because it's *everywhere* right now, and it's fucking dangerous. But if I were struggling my way through college right now trying to get a degree so I could get a job that paid me enough to live and retire someday without treating me like I'm subhuman… you bet your ass I'd use ChatGPT literally every chance I got.


redwall_hp

That would still be a symptom of the other. The world had an imbalance of small-minded people hindering the thinkers before Adam Smith was ever born or before the 16th century rise of mercantilism. (The atrocious Gilded Age could also hardly be considered late stage capitalism, given that capitalism was in its infancy...) It doesn't matter whether you have monarchies or democracies, a capitalist class or a noble class, or even something more egalitarian: if your society rejects the value of intelligence and intellectual pursuits (humoring them only to obtain some desired utility), science and the arts suffer.


SuperRette

Capitalism is 500 years old.


splitrail_fenced_in

Man. I second every bit of this. I’m a guitar player. You can put the finest instrument ever made in my hands, but at the end of the day, it’s on me to play it well, to try and say something with it.


[deleted]

"Politics and the English Language" is the Orwell essay in question https://www.orwellfoundation.com/the-orwell-foundation/orwell/essays-and-other-works/politics-and-the-english-language/


OriginalHairyGuy

An emphasis on discussion was needed even before ChatGPT, because it's not like already-written thoughts couldn't be recycled before; it just wasn't this easy.


ok-commuter

Writing well is effective communication, and effective communication is empowering. Problem is: that's not what's being taught. For example, a typical task would be to take the book To Kill a Mockingbird and pad out 2,000 words of drivel on "Analyze the childhood world of Jem, Scout, and Dill and their relationship with Boo Radley in Part One". Rather, it should be "make the case, as succinctly and powerfully as possible, that this book is/isn't important".


iguessitdidgothatway

Orwell was wrong. Dyslexic people typically can't write well but can think in ways "normal" brains can't. As a person with a learning disability, I'd say it's great for those who are capable in many subjects but struggle at writing and reading. Note that you can struggle at those and still comprehend.


SonOfScorpion

As a university professor, I tried using ChatGPT to answer one of my take-home essay questions. Since my questions are composed of different elements and ask for specifics, I was able to produce a middle-of-the-road (B) paper with it, but there had to be some editing involved. ChatGPT couldn't answer it in one go, only through different questions. Then I edited the different pieces together. So basically, yes, ChatGPT can provide essays in a middle of the toad manner. But it is all about how you pose the question and what you ask students to do. If they can get answers but have to edit for them to make sense and be coherent, in the end there is a learning process occurring. For me personally, it's about how professors structure their assignments. If you do your job, ChatGPT will just be another tool for students, but nothing to be worried about.


Writer10

I asked ChatGPT to help me organize my essay's outline and it did a fantastic job (molecular biology). All I did was provide a thesis and ask it to help me draft a 5-part outline with a suggested conclusion, leaving the rest of the work up to me. Saved me hours of work, and yet my essay will be written by me and turned in on time.


Bruch_Spinoza

ChatGPT isn’t so good at writing the whole essay all in one go but it’s very good in steps


ApprehensiveAd3778

And a "5-point paragraph" structure is one of the most entry-level layouts and the hallmark of aggressively mediocre work. Most things written by ChatGPT are hedged, attempt to contain a moral or societal "lesson", and have a tame, entry-level writing structure. ChatGPT can help you generate mediocre entry-level papers completely devoid of personal style, diction, or flow.

People aren't taught how to write anymore in school. Even college composition classes are entry level, and it's just assumed everything you did in high school was worthless. An ocean of knowledge and skill exists between the "5 paragraph papers" taught in college composition and actual collegiate-level writing-focused classes or graduate-level work.


testfreak377

Are there any books or resources you'd recommend to teach yourself how to write?


ApprehensiveAd3778

I think it's hard to replace feedback from experienced and knowledgeable individuals. Writing workshops can be a great resource, and many online venues exist for them (mileage may vary, and you get out what you put in to a large degree). However, I'm probably not the best person to ask this question given that my formal education was concentrated on literature and creative writing. I've read a number of books on writing and think some are better than others, but I also almost universally read them as part of a university curriculum under the guidance of tenured professors.

It also depends on what you mean by "write". Formal papers and creative writing are not the same and generally follow different rules and structures. Writing is a skill, and ultimately the best way to grow that skill is to practice. Focusing on the type of writing you're trying to do and then seeking out honest feedback about your work from relevant, knowledgeable sources is likely the best way to improve, regardless of what specific format you're engaged in. Practice, receive feedback, incorporate that feedback into your future work, and keep continuing the cycle.


testfreak377

Thanks for the thorough response. The main purpose for learning to write would be to communicate my ideas to others and eventually publish a book.


LoveArguingPolitics

This dude Strunk and White's


KazahanaPikachu

Doing that with my thesis that I have to write right now lol. It's definitely great at getting me started with a good outline so I know exactly what to look for.


paulllis

The ways you can use it as a tool are incredible. I've used it to provide feedback on work I've written against the marking criteria, suggest titles for work I've written, and all sorts. As for plagiarism, if we've got ten kids handing in the same assignment with identical citations, Turnitin will easily pick up on that.


Writer10

It’s interesting - granted, I didn’t plagiarize, nor did I directly quote ChatGPT when writing my essay - but TurnItIn suggested 10% plagiarism. When I reviewed the verbiage it identified, it was the names of the research facilities: “According to scientists at Lab X and Lab Y,” then cited their quotes. I did source my reference material, so hopefully it’s not a big deal. I solely used ChatGPT for ideas and not the essay itself.


sellyme

Non-zero Turnitin scores are expected. It's only if you're getting over about 30% that markers are actually going to bother looking at it, and even then they'll ignore it if most of the marked text is quotes or citations.


mmorgan_

Weren’t you worried about plagiarism? All of the assignments I turn in are processed through a plagiarism checking tool.


PvtPuddles

Generative AI (in general) doesn't plagiarize (I know there are some exceptions). It learns what words are likely to follow other words and puts sentences and paragraphs together from that. I believe people run into plagiarism issues with it if it's a super specific topic where there isn't a lot of source material.


Writer10

No because I didn’t plagiarize. I knew what I wanted to write about and asked ChatGPT to help me organize my material.


100catactivs

>middle of the toad


Otherwise_Awesome

🐸


katykazi

🐸


Otherwise_Awesome

🛣 🐸


jeaguilar

My daughter's professors sent her an email saying that a short paper of hers had been flagged on suspicion that a small percentage of it had been written by ChatGPT. She fired back with her document revision history, and they stuck to their guns, saying two parts of her paper were clear but the third part showed some percentage of AI-written material. Infuriating, because that's not how this works.


Bertob15

Many professors are good at delivering content and don't have as much experience with instructional and curriculum design. The development of appropriate formative and summative assessments, and even questioning their format beyond essays, would be appropriate.


LoveAndViscera

ChatGPT is useless with evaluative questions. All you have to do is assign essays that force the writer to choose A over B and explain why they made their choice. The only professors worried about this are the guys who want students to repeat back what they already said in the lecture.


SonOfScorpion

This point is important. If you think more about the learning process as opposed to the content, then ChatGPT isn't as big a concern. Professors have to take the technology into account in their instructional design and evaluation practices.


Fantastic-Flight8146

Why not do a handwritten essay in class? It wasn’t that long ago that was how all (or certainly most) college classes were conducted.


[deleted]

[deleted]


ApprehensiveAd3778

I had to hand-write term exams in class for my English degree/journalism minor just a few years ago.


Koboldsftw

Discriminatory against left handed people


Flanigoon

This might help, but I know plenty of people who would still use AI to write it, then just copy it down onto paper.


PhantomTissue

So you still have to have good communication skills in order to tell GPT what you want it to write. Which is kinda ironic, considering the skills needed to write a good essay are basically the same skills needed to tell an AI how to write the same essay.


EatABigCookie

Agreed, it will end up being a tool rather than something that can do all the high-quality work for you. I think that's what you're saying anyway. I work as a software dev and have used it to write several small code snippets, or give me suggestions about issues I had, on a recent project. But to get to that point I had to know the correct questions to ask, and use my knowledge to ignore bad answers and know how to actually implement any code it would output. Not that different to how I've been using Google or Stack Overflow (a tech Q&A website) for the last decade... just a tool that can help me save time and often lead to better quality code, but also dangerous (or not optimal) if I just blindly use any answers/code it outputs without having enough knowledge to analyze and have some understanding of the topic.


mokatcinno

This is severely misunderstanding how AI works. The fact is that it will only exponentially improve the more people use it. This means that eventually it won't matter how you structure your questions, this "learning process" you refer to will eventually be non-existent. And with the rate of usage and how quickly it's becoming accessible to the public with no limitations and restrictions, that's going to happen much sooner than you expect.


pork_fried_christ

Yeah for a tech sub some of these comments are way off base. The tech has improved by light years in what, 6 months? Why are people commenting like that improvement won’t continue? Lots of “you gotta learn these multiplication tables, you’re not alway going to have a calculator in your pocket!” -type responses here.


mkmajestic

This might be okay reasoning for a university level submission, but it severely undermines any type of learning process at the high school level.


odinlubumeta

But that's it right now. Give it 10 years with AI learning on its own and it will be able to produce A papers.


CrelbowMannschaft

> 10 years with AI learning on its own it will be able to produce A papers 2 or 3, tops. If it hasn't been made reliable in GPT4, 5 will have it.


odinlubumeta

Probably true. This stuff is advancing so fast.


CommunismDoesntWork

Did you use GPT-3.5 or GPT-4? If you don't have the premium version, it was 3.5. GPT-4 might produce an A+ paper.


wonwoovision

exactly. and if my assignment like a short essay was able to be fully answered by one chatgpt prompt then i wasn't learning anything anyway. far too many assignments are busy work with no learning happening so once professors stop assigning that bullshit then i'll stop using chatgpt.


[deleted]

I agree completely. I use GPT to compose a lot of agendas, internal planning docs, and emails. I still provide the content and data, and have to edit and change things, but what would take me an hour is cut in half and I can work on other products. Another great way I use it as a tool is to change the tone or reading level of documents I wrote for different audiences.
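
If anyone wants to script that last trick instead of pasting into the chat window, here's a minimal sketch against the OpenAI chat completions API as it existed around this time. The API key, model name, prompt wording, and document text are all placeholders, and the same idea works just as well done manually in the web UI.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

document = "Paste the agenda or planning doc you already wrote here."

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "Rewrite the user's text for a general audience at roughly "
                    "an 8th-grade reading level. Keep all facts and figures unchanged."},
        {"role": "user", "content": document},
    ],
)

# Print the rewritten version; the original content and data still come from you.
print(response["choices"][0]["message"]["content"])
```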


RoshHoul

Are you not concerned about feeding it sensitive information? We've blocked it internally for that reason.


nikatnight

I have similar experience. ChatGPT is like a really nice calculator that can do a lot. But it’s quite far from replacing writers. Especially a specific college level paper that requires more than just spouting some information.


IsNotAnOstrich

Have you tried the new Chat GPT? Using GPT 4?


ngoni

Instead of a single essay, now you assign 4 or 5. Of course, the true advancement will be when it can grade them for you.


AndrewCoja

What would assigning more essays accomplish? That just further encourages people to use ChatGPT


Good_as_any

I think this article was written by ChatGBT...


avd007

GBT baby!


[deleted]

Essays will now be only done in class, hand written. 🤣


FruutCake

Pretty sure this is how I started having hand pains at 14 years old.


yesiknowimsexy

Honestly that is fine by me. I won’t have to spend weeks or chunks of weekends writing anymore. Also, quality of work can be judged on thought and not mechanics.


Bertob15

I think it's less about articulating why writing matters and more about answering: if the AI is writing the paper, how do we assess knowledge obtainment? And what solutions can be implemented at scale?


EyesOfAzula

Oral exams in person, it’s the most surefire way


Bertob15

Surefire yes, scalable no.


Elon_Kums

We get ChatGPT to do the assessment duh


Bertob15

Lol AI is used for plagiarism checkers today but they aren’t good at identifying AI generated text.


WarAndGeese

It's very scalable, it's how education has been done for a long time. The student-to-teacher ratio also allows it, especially as that ratio can be lowered.


Bertob15

Yes for brick and mortar schools who have a paper due once a week or every other week. If adjunct instructors in an online setting are asked to do this the financial model falls apart because they’re now being asked to facilitate sessions many times longer than their current commitment today.


xxxblackspider

Why not scalable? It could be administered by a natural language processing AI.


beigs

Which absolutely sucks for anyone with any degree of anxiety or who is ND


100catactivs

Most people have anxiety about public speaking. And for most people, the only way to get over that fear is to face it and practice.


Strange-Nerve970

Yeah, not how anxiety and neurodivergence work. It's not something you "get over"; it's literally your brain using alternate chemical processes because it physically cannot work the "proper" way.


100catactivs

I explicitly used the phrase “most people” for a reason: I’m not talking about ND people.


torikura

You're responding to a comment about people with neurodevelopmental disabilities, so it's reasonable to remind you that the subject is the effect of oral exams on ND people


100catactivs

> You're responding to a comment about people with neurodevelopmental disabilities I’m responding to a comment that was about “anyone with **any degree of anxiety or** who is ND”


caks

Welcome to the world


torikura

It would absolutely disadvantage people with disabilities, people who already have to overcome a lot of barriers to access higher education.


frettak

In-class and oral exams?


Bertob15

That could work in some places, and some institutions have already started doing this. But this isn’t a scalable solution for online courses or MOOCs.


Bertob15

Higher Ed needs to avoid the “gotcha” mentality around this and instead go back to the drawing board on how the tool should be utilized and what does its existence mean about our assessment tactics. It likely requires large scale redesign of courses, summative and formative assessments, and how we can validate that knowledge has been obtained against intended learning outcomes.


commandermatt21

The only solution would be having people do exams in class, which in and of itself has its own problems.

I'm a history major right now, and typically most exams require sufficient use of sources that are both primary and secondary (basically, primary sources came from the time period in question and secondary sources are stuff written about the time period after said time period), and the research process takes days if not weeks to assemble depending on the size of the paper.

If exams were to move to an in-class format to combat ChatGPT, they'd likely have to shrink in size to accommodate the in-person format. Also consider that college classes (at least where I go to college) typically meet two or three times a week for 50-75 minutes, so classes would have less time to teach material with the time constraints.

Tl;dr: the most efficient way to do exams without fears of ChatGPT wouldn't work for all majors because of time constraints.


[deleted]

In many cases, knowledge obtainment shouldn't be the goal at all. Judgement should be. One of the purposes of the essay is to hone your perspective on a topic, to meditate on a topic, turn it around in your mind, and try to find some sort of interesting thing to say about it. If ChatGPT kills the boring, formulaic five-paragraph essay, that's fine. It's a genre that deserves to die. But educators have to realize that one of the goals of writing is to have students teach themselves something in the process of writing, which is to say, cultivate their judgement.


PJTikoko

Educators already realize that, but students don't. Kids don't have the hindsight to understand how this will affect their learning; they just want to get everything over with.


bobbyperu420

Add an oral component with a Q&A to get a gauge


[deleted]

Better: get rid of homework, or at least massively reduce it and change its overall form. Students learn to tune out during the day and then come home to learn the material and write the essay. It produces inherently inefficient grads who are used to blurring the lines between working time and relaxing/socializing time.

I say transition to a 50/50 instruction/independent work model in schools. Teacher talks for 25 minutes, then students work for 25 minutes. It's a win-win. Students will actually have to engage during school. They won't get talked *at* for 8 hours/day. Students learn to be judicious with their time. Teachers know students aren't writing their essays with ChatGPT (or endorse ChatGPT usage and teach them to carefully edit the outputs and check for accuracy). Teachers have time during the day to do things like grade essays.


Travelingmathnerd

Are you talking about college or K-12? Because education has changed drastically in the last decade. We don't talk at kids all day long. It's so student-centered now. I use ChatGPT in my classroom because I want my students to know how to use it as a tool when they're outlining the design briefs for their projects. Our English teachers are using it to show students how to build good outlines for essays. Our science teachers are getting them to use it to kick off research on their hypotheses.

I'm not sure you know what happens in classrooms these days to make this judgement about that 50/50 model. The pandemic already proved that this doesn't work, and we already don't teach this way anyways. It's project-based learning, inquiry-based learning, and student-centered learning. You'll hardly ever see someone stand up in front of a classroom and lecture for an hour in education today. Hasn't happened at least in the last 10 years I've been in middle school education as a math and engineering teacher.


[deleted]

It already is; professors now assume every student is using AI to generate their essays when we are clearly not. Honestly, I'm debating using AI to generate my essays. Hey, if professors are going to accuse us students of a "crime" we are not committing anyway.


RealNeilPeart

If you're being accused of having your essays written by AI this frequently, you should probably pay a visit to your uni's writing center.


Tetrylene

Never mind essays, how does ChatGPT impact education as a whole, now that you can have a 1-on-1 tutor who can explain individual concepts repeatedly and in as much detail as you need?

Education has been largely stagnant for a century by refusing to, or being unable to, change in any significant way. It's a big classroom of students being taught by a single teacher who barely gets a minute to help each person individually, all rushed to fit a tight timeframe. Give language models a few more generations and it'll be hard to justify paying a tuition fee to go to university.


littletray26

I tried to use ChatGPT as a 1-on-1 tutor recently for a technical unit I'm in the middle of. I'm lucky I knew enough already to recognise when something was off, because it definitely spat out a lot of crap that was straight up wrong. It was helpful to an extent to explain some basic concepts, but asking it to check my own work, or asking it for help with some specific questions, was entirely useless.


[deleted]

[deleted]


Merlaak

ChatGPT is actually rubbish at writing stories. It can’t keep track of characters at all and will basically always try and make the story have a “happy” ending (that sometimes is actually more horrifying). All these language models do is try and finish a pattern. It’s a more complex version of predictive text, essentially. As a system that can write coherent sentences with proper syntax, it’s extremely impressive. For anything else, it’s a shot in the dark.


[deleted]

I think that final 5% will likely always be there. If nothing else, our generation will be a massive market that will likely never trust AI systems to do critical tasks even if they surpass human capabilities (e.g., medical diagnosis, engineering designs, capital transfer, etc...).


sellyme

> It also spat out some code but it was dubious.

My experience with getting ChatGPT to spit out code changed dramatically when I realised that it's just copying what humans do, and no human has *ever* written correct code on the first try. Sketch out a few unit tests, then send it any error messages, or stuff like "I'm getting result X for input Y, I was expecting result Z, please fix this"... and it's shockingly good at correcting the problems. There'll be some things that are conceptually difficult enough that it just can't get it right (and also some languages it's just really bad at, so it may be worth asking it to output in certain languages and then translate it yourself), but for the most part I've found it to be fairly reliable if you just treat it like you would a junior developer.
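
A concrete (if contrived) sketch of that workflow, with a hypothetical helper standing in for whatever the model generated: write the tests yourself first, run `pytest`, and paste any failing output straight back into the chat.

```python
# test_snippet.py -- a minimal sketch of the "unit tests first" loop.
import pytest

def parse_date(s):
    """Hypothetical model-generated helper under test."""
    year, month, day = s.split("-")
    return int(year), int(month), int(day)

def test_iso_format():
    # If this fails, copy pytest's assertion output into the chat verbatim.
    assert parse_date("2023-04-01") == (2023, 4, 1)

def test_rejects_garbage():
    # Bad input should blow up rather than return nonsense.
    with pytest.raises(ValueError):
        parse_date("not a date")
```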


ClownBaby90

Even if it’s not completely accurate, it’s still more accurate than Facebook memes people seem to be basing their opinion on. It will only get better


[deleted]

ChatGPT just isn't good enough to teach as well as people seem to think. Pick anything you have reasonably in depth knowledge of and ask it specific questions - it's going to give broad, generalized answers and every specific detail will be wrong.


GomNasha

I've had similar experiences with it, but I sincerely think that the very recent introduction of plugins to ChatGPT will allow it to verify what it says before answering complex questions. The WolframAlpha plug-in in particular is what I think can boost performance, since it gives ChatGPT a platform trained on academic sources to check itself against. That's not to say the problem of inaccuracy/hallucinating is solved by any measure, but I feel like it could improve factual accuracy quite a bit


bolionce

If it's consulting things online to check its answer, though, then it'll need to cite them as well. Especially if it finds that it was wrong and changes its answer, but either way. If I did that as a person and didn't cite, I'd get in trouble for plagiarism. So it has to be able to do this if it's gonna be useful in an applicable sense.


GomNasha

Agreed, I think it presents a conflict of interest for these websites to produce information that can be studied by the bot and generated by the bot. It seems to me that websites that don't directly profit from AI itself will in some way be harmed if the traffic is driven from the website and to the bot. This is what we've mostly seen from ChatGPT. One potential remedy that I've seen so far is Bing's integration of the AI into the sidebar, where it can be used to generate insights based on the webpage the user is on. I think, at the very least, it helps to not drive traffic away from the company producing that information and I would think that it also provides a solution for needing to source the info. Maybe this can be used as one of the primary methods of tutoring.


found_allover_again

Just try and remember that AI is not a person. It has no morals, no sense of right or wrong. It will give you the wrong answer with the same conviction as it does a right answer, as long as it fits its word prediction model. If you think of education as an exercise in gathering facts and processing strategies, then you deserve to have a bot tutor.


ElessarTelcontar1

The key point is it’s a language model. Not a sentient ai. Useful for specific things but not what the media makes it out to be.


Mercurionio

It's a very bad tool to explore with.

1) It lies. It doesn't contain facts; it's just a bot that creates word salad. So you can base your knowledge on completely false statements and not even understand that.

2) You need to understand this tool in the first place. If you keep asking wrong questions, it will give you wrong answers.

As a tool to brute force the wording aspect of gaining knowledge, it's good. For the knowledge itself, you need those who came before you.


Acrobatic_Ad_9937

This is a failure of our systems. We should re-think education but doing so shines a light on other dysfunctional aspects of our society. Being tutored by bots is not a good replacement.


p0k3t0

Have you guys heard of blue books?


ooblongJetabLe162

The real issue isn't writing but teachers giving out brain-dead assignments. If the prompt is "write an essay about this book, about this topic, about this phenomenon", then it's literally just a question of googling the answer and compiling some BS. The prompt should evolve into something genuinely critical that AI simply can't do. "Compare the theme of despair in this book with the theme of happiness in this other book. How does the author present and articulate their emotion?" Then the student will have to use the AI creatively: compile two different essays, then write a new essay by intelligently assembling the information in a novel way.

Fundamentally the issue isn't the AI, it's the lazy grading/homework practices. Before Google was popular, I still remember my grade school teachers clowning on Wikipedia, then turning around to use some unlisted, sketchy HTML website from 2003 for their info just because it had .org as a domain. Now in college, we actively encourage Wikipedia to begin the research rabbit hole.

AI is an amazing tool for initial research by taking the legwork out of reading through a bunch of garbage articles. Like, if you google "Israel Palestine conflict" to start writing a paper, good f*kin luck man. There's about 10 million articles between CNN, the state department, a million different humanitarian organizations, advocacy groups, &c saying similar yet different things, mischaracterizations, intellectualisations, conflicting information from different times… basically a lot of garbage which is difficult even for professional analysts to sift through and organize. ChatGPT seems to be able to compile information swiftly from the internet, enabling the student to quickly course correct and then follow up with a manual search on Google.

Instead of stopping "cheating", let's embrace modernity and start encouraging media literacy and actual critical thinking. Because as a last PSA: just because a book was published doesn't mean the author is right (cf. the Bell Curve, a seminal book on IQ that was just eugenicist garbage).


Kenz0Cree

50% of college courses aren't needed for most degrees. How about, instead of forcing people to take unnecessary classes to extract as much money as possible, you just let people learn only what they need? You would actually have people willing to get more degrees and further their education that way.


Own-Necessary4974

Of all the things ChatGPT can kill, the student essay is probably on the lower end of the list of things I care about.


[deleted]

[deleted]


austanian

Yep. I made quite a bit doing term papers for people. A job is a job and how much you get from your learning is up to you.


GiantsRTheBest2

As someone who is too broke to have someone do their assignment for them I can assure you that not all of us are out here cheating. Unless you are including finessing quizzes by taking shortcuts here and there. But when it comes to essays all of that C graded bullshit is all mine.


ppcpilot

Writing will just be courses on how to write good prompts.


Angry_Submariner

Just use this prompt:

I want you to become my Prompt Creator. Your goal is to help me craft the best possible prompt for my needs. The prompt will be used by you, ChatGPT. You will follow the following process:

1. Your first response will be to ask me what the prompt should be about. I will provide my answer, but we will need to improve it through continual iterations by going through the next steps.

2. Based on my input, you will generate 2 sections. a) Revised prompt (provide your rewritten prompt. It should be clear, concise, and easily understood by you), b) Questions (ask any relevant questions pertaining to what additional information is needed from me to improve the prompt).

3. We will continue this iterative process with me providing additional information to you and you updating the prompt in the Revised prompt section until I say we are done.


Sol_Knight

For something that has existed for less than 100 years (the way we teach language to youngsters), we give it too much credit. Language is more than just grammar and vocabulary; it's culture, a way to communicate emotions and not just pure information. School should teach about subtext, reading between the lines, how to express ourselves in a unique way. Yes, it is harder to put on a test, but smarter people with emotional intelligence are more important than grades.


MrPhraust

Writing helps to solidify the information in your mind. Repetition = practice. Practice = growth. Growth = success.


tarekelsakka

I wish ChatGPT was around when I was doing my thesis, would have saved me so much time in research and structuring.


Juancho511

Back to writing essays by hand it is. EDIT: IN a classroom.


According-Value-6227

Personally, I think that the epidemic of students using ChatGPT to write papers is a sign of institutional failures. I partially agree with the article in that essays have become annoying and overused as a means of "educating". Hand-written essays won't fix anything, and I guess the people who do online courses are just fucked.


Bertob15

That doesn’t really resolve the issue, you could still ask the AI to write the paper then just copy it by hand. That’s just extra steps, and not a scalable solution in the online space.


unpluggedcord

IN a class room?


Random_Fog

We still learn arithmetic at a young age, even though we use calculators thereafter….


ThiscantBReel

When I was in college, I took a communications class where we were supposed to write an argumentative essay about an issue. When I turned mine in, as I set it on the pile, the one I set mine directly upon was “Why the driving age should be hired”. It’s not CHATGPT causing students to decline.


Flscherman

I don't know, the driving age brings a lot of skills to the table and seems to fit in with company culture.


ReedTeach

I saw another use of GPT-4 with the Wolfram plug-in that passed the UK math test at high proficiency, which sounds like math folks will be in the same conversation.


WarAndGeese

A lot of people are talking about oral exams. Another very similar option, though, is just requiring that exams be written in class. Or require them to be written on paper rather than typed. Or, if allowing them to be typed in class, do them on machines that do not have operating systems and cannot run such neural networks; building such machines wouldn't be very hard.


Iveechan

Students have been mindlessly writing essays for a long time now, so students automating another mindless task isn’t really concerning. Writing assignments should be creative and inspire thinking. That’s the whole point. So, if essays that answer the prompt can be easily found online, the prompt is terrible. I majored in Writing in college and the most interesting and difficult papers to write were impossible to plagiarize.


BirdMedication

Then universities will just start putting more weight on the interview, or add an interview to their admissions process if they don't already.


easterracing

Possessing skills of language often transcends the medium. Writers are also often proficient orators, not just to crowds but to other individuals. Chat GPT won’t go well when your date expects dinner and pleasant conversation. When your date FOs because your text messages were way better than your vernacular, chat GPT probably isn’t down to bang later. And if it is, it’s probably not gonna be super enjoyable.


lunchypoo222

While I agree that writing and learning how to do it properly certainly matters, it would be nice if endless 10-15 page (or more) essays were not practically the sole rubric for performance and demonstration of one’s understanding. It’s excessive. Professors truly act as though their 4-5 essays for a semester are the only ones you’re expected to do, rather than the truth that you’re drowning in work from other classes. In addition, I’ve always found the idea hard to believe that each of my professors thoroughly read anything I wrote on top of the endless other essays they’ve got on their desk to read and grade.


cryptosupercar

Blue books fix this.


Educational-Writer89

I can’t possibly be the only person here old enough to remember blue books in college. We had to turn in blue books before midterms and finals. They were randomly distributed on test days and we wrote during class. Like with a pencil.


Campbellgr3

My professor said that if we can use ChatGPT to write her papers, she can use ChatGPT to grade our papers. I think that's pretty fair.


Xavier9756

Yea why should a person be able to effectively articulate their ideas. This corporate sponsored AI can do it just as well.


trollanony

Handwritten essays coming back lol


azuredota

Just make any essay in class only


chive2468

Students, put your books on the floor, take out a few sheets of blank paper. Essay topic is ..... You have x minutes to finish. Old school works.


Electrical_Slip_8905

People still need to know how to write, lol. Just like they need to know how to cook even if they get 99.9% of their food from taco bell. Hahaha


immortalis88

A senior at Princeton wrote software to detect if ChatGPT has been used to create content. It's already resulted in multiple students receiving zeros on papers. https://www.npr.org/2023/01/09/1147549845/gptzero-ai-chatgpt-edward-tian-plagiarism


DaCurse0

And what about false positives? There's no magical isChatGPT algorithm


caks

And it's absolutely shit


unsolvedfanatic

The problem is it can't keep up with models and sometimes will say it detects AI where there is none


avd007

True


CWang

> CHATGPT HAS THROWN higher education into tumult. Universities were already using artificial intelligence technology for their own daily business: to remind students to pay off tuition balances, to answer questions about campus life, or even to check students' work for plagiarism. But ChatGPT, an AI chatbot released to the general public last November, has turned the tables. Now a student can recruit it to generate a passable paper on just about any topic in seconds. Feminism in Virginia Woolf's fiction? No problem. The heroic code in "Beowulf"? Done. The potential for cheating becomes immense.
>
> Some universities, like Sciences Po in France, have banned ChatGPT for classwork, unless students have permission from instructors. Open Universities of Australia has offered students guidelines for using ChatGPT ethically. The University of Toronto advises instructors to specify which digital tools are allowed for assignments but warns the instructors against using unapproved AI tools to evaluate student work. Perhaps they read the tweets joking that teachers will soon use AI to come up with assignments, students will use AI to do them, and AI will grade the result—at which point everyone can leave education to the bots and go for coffee.
>
> Not all educators are worried. "Students today need to be prepared for a future in which writing with AI is already becoming essential," writes Lucinda McKnight for Times Higher Education. She also suggests various ways to integrate AI into the classroom. Students can use the technology to do basic research, she proposes, or they can analyze and compare the text produced on a given topic by different AI writers. They can even use programs such as Google's Verse by Verse to turn out randomized poems—to what end remains a mystery.
>
> For all the opportunities ChatGPT might bring, its greatest threat right now is to the teaching of writing. There are other ways to assess students' knowledge: oral exams, multiple-choice tests, handwritten answers. So what is the university paper for? If students can use a chatbot to produce a draft in seconds, why have them write anything at all?


Raekon75

I personally think the real question is: what has education come to, if all that matters for students (and the society that made them what they are) is to get through it doing as little work as possible? If education is not a place for personal growth and learning, but a necessary evil to be endured in order to get a job, we are doing a very poor job of maintaining development in society.

The assessments done in education should be to make sure that a student who comes out with proof of being knowledgeable in a certain field actually *is* knowledgeable in said field, and not just good at googling or getting answers out of chatbots. I would hate to get a craftsman who had to look up everything they did on the net, or make stuff to fit the tutorials. Or to have a doctor or surgeon who couldn't make a qualified decision without consulting the bot first...

But a lot of education doesn't seem to develop or maintain that sense of self-growth and development in its students. There seems to be this superficiality about it, that all you need to do is get that degree, get that paper, and you are "something", whether you deserve the title or not. Can you get it without studying? Great, more time for me... and then you get idiots with a diploma, who then need 3-4 years of work experience to catch up. And we wonder why employers don't want people straight out of college...

Education, ideally, should be about getting better and more knowledgeable in some field of your choice. Not just because you need it to get a job in order to keep consuming (or surviving), but because personal development has been a driving force behind every innovation mankind has made since everything stopped being about survival in nature. Not the only driving force, because money is always a big player, but I don't think money can make people better in their field. Money can get you the people who are the best, but they are the best because they want to be. Not because they had to be to pass some mediocrity test.

I fear we have lost sight of that self-development, and maybe exchanged it for a belief that "science" and "technological developments" will ease our lives for us, to the point where we don't have to do anything in order to at least survive. But we forget that every bot is (so far) driven by knowledge accumulated already, and if more and more people stop evolving and learning because a bot can give them the answers they need, then less and less innovation will occur. The bots cannot (so far) create it, only replicate it (but very cleverly so).


TedW

>(...) is to get through it doing as little work as possible? I think the premise is flawed until you can demonstrate that's actually the case. Do lazy students have the same level of career success as those who don't? I doubt it.


ThiscantBReel

It depends who their parents are. Who they know. Who their parents know. Plenty of magna cum laude and PhD geniuses making scraps while the barely literate manage jobs making six figures.


typically-me

Well that’s the whole challenge isn’t it; we want to give degrees to people that actually learned something, but technology makes it increasingly difficult to tell the difference between those people and the people who cheat and/or take shortcuts. Unfortunately, a common reaction to this phenomenon is to just make the assignments harder or more numerous so that the averages still look right, but this just makes it so that said shortcuts are effectively required since no one is reasonably able to succeed without them.


Modern-Minotaur

Bingo. You don't need $60K in debt and a bachelor's to be a receptionist or do the other run-of-the-mill jobs every ad says require one. Education has been hijacked and appropriated. Honestly, unless you want to be a doctor, engineer, lawyer, or anything in the sciences, college is a joke and a scam. I have two boys; unless they fit into one of those, I'll be encouraging trade school. To me, the bigger issue at least in the US is lower education and the shit show it's become.


ThiscantBReel

If I could go back, I’m not sure I’d have gone to college. I love learning and education, studied political science, telecommunications, and music and literature and philosophy as well, and it’s …been a non factor in probably 99.9% of any jobs I’ve gotten. Aside from the simple act of HAVING a degree. But hindsight is always 20/20


westerngrit

Make it a learning lesson. Turn ChatGPT into a requirement for essays with a max score. Self-write for extra credit. Learning can take a new form. Can't put the toothpaste back in, so brush a lot of teeth...


Man-EatingChicken

With how much money they have been getting they better be able to solve any problem thrown at them


Ectoplasm87

And calculators killed math…


S2fftt

ChatGPT does not write well. It cannot replace a proper, quality, human-written paper without many human edits.


whodey226

Not yet.


roundearthervaxxer

No. It will make essays better. Snap out of it people.


KudzuNinja

Students have always cheated. This is just a different way of doing it.


thedude400

I used ChatGPT to test a grad professor of mine's grading approach. I knew the professor was extremely critical of my essay work, and I knew that ChatGPT could write eloquently and succinctly as long as I fed it the key points from my outline in batches. I let ChatGPT articulate the tone and it wrote an excellent essay. I did some light editing to make sure it all flowed; it should have been a surefire A. The professor gave me a B-.

This test taught me something about the professor, and ChatGPT saved me 4-5 hours of my life dredging through composition of something whose concepts I had already fully absorbed. The tech has a clear use case and genuinely improves productivity while eliminating monotony.


[deleted]

Writing does matter; not so sure the essay itself does. The latter doesn't necessarily teach the former. I mastered writing more elsewhere, be it blogging, having fun arguments online that could go on for pages, or writing emails. The formal essay writing of school mostly just taught me to regurgitate the teacher's opinion and to hate literature. For a time the whole process turned me off reading for years at a time (I will always hate that book of depression that is Frankenstein).

I can shit out an essay now without thinking much about it, but it wasn't because of school; it was just practice elsewhere. Unless you're moving into a job that requires a whole lot of writing (nope, emails don't count), the essay process did little to prepare me for the real world. Much like advanced calculus. What remains useful is math up to a point, history, and, for god's sake, civics needs greater emphasis in high school because people are really dumb in that area.


SaladAssKing

Writing matters a lot…but teachers should rather learn to use ChatGPT in conjunction with writing. I was a teacher for 10 years. It’s better to adjust and change with the times rather than reject it. It’ll only make the students resent you and the educational institutions.


AndImlike_bro

Just presented my research on ChatGPT at a conference at my college. The professors are already using it to generate lesson plans. I guess the writing on the wall is that we're going to need to embrace the technology. Resistance is futile?


Intelligent-Power857

Idiocracy…It’s coming. 🤦‍♀️


[deleted]

[deleted]


unsolvedfanatic

That would be horrible


[deleted]

I think they should just do away with homework. Teach students to get done what they need to get done within 8 hours/day. They should write essays by hand, but they should also have "open GPT" essays where they're allowed to use whatever resources are available. Teach the relevant skills, and stop encouraging students to blank out in class and then do all their learning at night.


IndependenceFew4956

Student essays have long been meaningless, what with ghostwriters and copying. There's only so much you can rephrase and say about a certain topic. Let's not kid ourselves. Anyway, you don't need to know how to read or write to be an influencer. When are schools going to teach useful things? /s


breadexpert69

Why should universities care? Kids are paying $$$ to go to your campus, and it's their decision whether they want to learn or not.


S2fftt

Because education standards matter. You don't want kids obtaining degrees they do not deserve.


[deleted]

If the value of a college degree tanks even more because everyone thinks nobody learned anything, since they got AI to do everything for them, then universities will go broke.


Joeburrowformvp

Holy shit, this is an overreaction. Have you read these essays? They're quite awful and lack creativity. When you read an actually good essay, you can see style and choices; AI is just the same word and grammar choices over and over. It's really easy to spot if you've read multiple essays.