I’ve actually started to recognize the pattern of whether something was written by AI
It’s hard to describe but it’s like an uncanny valley of quality, like if someone uses flowery SAT words to zhuzh up their paper’s word count but somehow even more
It’s like the writing will occasionally pause to comment on itself and the dramatic effect it’s trying to achieve
Yeah, this is true! It likes to summarize things at the end in a stereotypical format
It’s not a bad format either; AIs seem to enjoy the five-paragraph essay format above all others, even for casual conversations.
The LLM isn’t really thinking; it’s autocomplete trained so the average person would be fooled into thinking the text was produced by another human.
I’m not surprised it has flaws like that.
BTW, here on Lemmy there are communities for AI pictures. Someone created a similar community but with art created by humans.
While the AI results are very good, when you start looking at and comparing them with non-AI art, you start seeing that the AI, even when it’s unique, still produces cookie-cutter results.
I have an issue with using AI to write my resume. I just want it to clean up my grammar and maybe rephrase a few things in a way I wouldn’t, because I don’t do the words real good. But I always end up with something that reads like I paid some influencer manager to write it. I write 90% of it myself so it’s all accurate and doesn’t have AI errors. But it’s just so obviously too good.
You are putting yourself down unnecessarily. You want your resume to talk you up. Whoever reads it is going to assume you embellished anyway. So if you just write it plainly, they’ll think you’re unqualified or just don’t understand how to write a resume.
“While the thing you entered in the prompt, it’s important to consult this other source on your prompt. In summary, your prompt.”
Writing papers is archaic and needs to go. College education needs to move with the times. Papers are useful in doctorate work, but everything below that can skip them.
Learning to write is how a person begins to organize their thoughts, be persuasive, and evaluate conflicting sources.
It’s maybe the most important thing someone can learn.
The trouble is that if it’s skipped at lower levels doctorate students won’t know how to do it anymore.
Are they going to know how to do it now if they’re all just ChatGPT-ing it?
Clearly we need some alternative mode of demonstrating mastery of subject matter. I’ve seen some folks suggesting we go back to pen-and-paper writing, but part of me wonders if the right approach is to lean in and start teaching what they should be querying and how to check the output for correctness. Honestly, though, that still requires being able to check whether someone is handing in something they worked on themselves at all, or whether they just had something spit out their work for them.
My mind goes to the oral defense: have students answer questions about what they’ve submitted to see whether they familiarized themselves with the subject matter before cooking up what they handed in. But that feels too unfair to students with stage anxiety, even if you limit these kinds of papers to once a year per class or something. Maybe something more like an interview, with accommodation for socially panickable students?
I’m in software engineering. One would think that English would be a useless class for my major, yet at work I still have to write a lot of documents: proposals for new features, explanations of existing ones, instructions for others, etc.
BTW: when using AI to write essays, you generally have a subject that is well known and that many people have written about in similar ways, and all of that was used to train it.
With technical writing you are generally describing something brand new and very unique, so you won’t be able to make the AI write it for you.
When I come across a solid dev who is also a solid writer, it’s like they have superpowers. Being able to write effectively is so important.
You can’t have kids go through school never writing papers and then get to graduate school and be expected to churn out long, well-written papers.
Unexpected pencil and paper test comeback
Already happening. My kid in high school has more tests and papers required to be hand-written this year.
And yes, TurnItIn legitimately caught him writing a paper with AI. Even the best kids make the stupid/lazy choice
When I was in college (2000-2004), we wrote our long papers on computers but we had what were called “blue books” for tests that were like mini notebooks. And many of the tests were basically, “Here is the topic. Write for up to an hour.”
And now my hand cramps if I write anything longer than a check. I can also type quickly enough that it basically matches the speed of my train of thoughts but actually writing cursive with a pen now, I get distracted and think, “Wait, how does a cursive capital ‘G’ go? Oh yeah. Hold on. What was I going to write?”
I pity the kids who have always typed, for what their hands will go through on written tests
No way professors/TAs are going back to grading tests by hand.
Naw, they’ll use OCR
Most professors I dealt with when I did campus IT couldn’t get their office printer to work.
Not a problem, the next IT campus recruitment will list “OCR Scanner Operator” as a requirement and as a part of the job description. ;-)
I have it write all my emails. I’m so productive and everyone loves them. That or they’re also using ChatGPT, and it’s just two computers flattering each other.
I had it write an operation manual for a client I particularly hate. Told it to make it sound condescending by dumbing it down just to the point where I could deny it. The first few times it just sounded like a 5th grade teacher talking to a kid while in a bad mood, but eventually it figured out if it just repeated itself enough it got the effect I wanted.
Things like: user is to disconnect power before attempting to repair. It is vital that the step of disconnecting power before attempting to repair is carried out.
I’m also sent long GPT-generated documents and I summarise them in bullet points with GPT-4. Truly the future we all imagined. (I learnt to take extra time to write an FAQ as an introduction to anything I write, specifically because I know they will GPT through the document, so I provide that stuff in advance.)
Machine learning tool used by people too lazy to do their actual job accuses everyone else of using machine learning tools.
Yeah that’s pretty funny given the circumstances. “Our AI found your AI.” Cool, so maybe none of this is working as intended. I’d be willing to bet nothing changes but the punishments for students.
Someone posted to the class discussion forum with the bit about being an AI bot still included.
I wish it was a joke.
I didn’t do great in that class, but that was me getting 70% for not wanting to try to explain a mathematical concept in 500 words! They won’t take that away from me.
And nothing of value was produced.
To be fair, that value didn’t change much from pre-AI.
And those papers get used as training data for the next iteration of AI. Reinforcement learning!
Students? Even teachers are doing it…
“Likely”
Good. Academia lost its way anyways