
LettucePrime OP t1_j9nw6se wrote

Oh, Sam's calculator shtick. Yeah, I fundamentally disagree with that. Writing is not the same as arithmetic. The goal of an essay is not to convey information, but for the student to internalize the concepts & present their interpretation & interaction with the ideas in a compelling & unique way. AI-assisted tools, at least ones as capable as ChatGPT, negate this process to the detriment of most of academia. The struggle is the process.

5

khamelean t1_j9nwfmw wrote

“present their interpretation & interaction with the ideas in a compelling & unique way.”

So to convey information then…

2

LettucePrime OP t1_j9nwrjp wrote

Yes. The student's information. The AI cannot interpret or interact, nor can it, by definition, be unique. The AI cannot be used by a student as a crutch to get out of developing their own assessments, as is done presently - & the essay is still an excellent medium to do this.

1

khamelean t1_j9nwvus wrote

“the essay is still an excellent medium to do this.”

If that were true, then this wouldn’t be an issue.

1

LettucePrime OP t1_j9nx9qv wrote

I wonder why high school math students are still required to show their work when Wolfram Alpha can factor polynomials no problem.

2

khamelean t1_j9ny3zc wrote

Because they “won’t always have a calculator on them”…

Wolfram Alpha can also show its workings, btw.

−1

LettucePrime OP t1_j9nyzj2 wrote

Yeah, I'm very aware. I'm also aware it's detrimental to getting the knowledge into your head. It's the same reason we don't teach kids basic arithmetic with four-function calculators. An essay is "showing your work" on its topic.

2

jedi_tarzan t1_j9q6dhv wrote

I disagree with some of this. "The struggle is the process" smacks of "I suffered, so so should you."

If our tools and technology progress past the point where a certain test is useful, we move on and make new tests.

The math comparison is not useless. No one thinks writing and arithmetic are the same, so pointing it out isn't moving the discussion forward. The core point of the comparison is that when technology can perform part of the process, we change what it is we care about teaching. I don't know about you, but essays were often basically take-home busywork.

As far as writing essays go, LLMs didn't exist when I was in school, but CliffNotes did. Sparknotes did. Enough internet to plagiarize from with some clever editorializing. "Academia" has always had this problem. Some students will learn to the degree that they need to. And what industries are harmed by students fudging their essays? What jobs?

Won't those jobs also have access to the same tools? I'm in a very high-level technical field and I now regularly use LLM tools to get me started on templates for YAML files, Terraform modules, etc. If anything, learning how to use them will be the skill.

2

LettucePrime OP t1_j9sho3d wrote

EDIT: I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.

Later in the thread I used a better comparison: Wolfram Alpha is not used to teach pre-calculus. Four-function calculators are not used to teach basic arithmetic. We gate a student's "generative ability" based on the skills we want them to develop. Trigonometry does not measure a student's ability to draw a sine function, but rather their ability to represent, measure, & manipulate one. The robot can draw the line to match your function; that's the easy part. Making sure your function is correct is the part you need to learn.

The essay is the function, not the line. It is the proof of the struggle with something new that will produce necessary skills for development. At the very least, it's proof that the user can read a new thing & generate a cogent output from it, which is such an impressive accomplishment in nature that teaching it to machines has caused significant economic & social disruptions.

It's evidence of a user's ability to interrelate information - a process so complex it must be done essentially from scratch every time the user alters even one parameter of their data set. Where mathematical reasoning, at least in elementary math, grows linearly in complexity, allowing students to compress portions of the process generatively, no such linearity exists in any other discipline. No one studying Faust is saying: "I learned about 17th century English Literature last year. I'll just plug Paradise Lost into the machine to return a comparison between Milton's & Goethe's portrayals of aberrant desire."

Lastly, it's evidence of the user's ability to communicate, which can be considered a complex test of metacognition, a much simpler test of the arbitrary constraints of syntax, & a gauge for how fulfilling the experience was for the user. At the end of the day, that is what it's about.

We need people to have all of these skills. Many of them are difficult to learn. Most of them overlap with ChatGPT's advertised features. We are asking our education system to revolutionize itself in response to a new toy, in an extremely short time, while extremely underfunded & extremely overtaxed. This is a recipe for a goddamn catastrophe.

You asked what the actual fallout of the last several decades of neglecting liberal arts education has been, &, if I may be perfectly frank, I think it's produced a fucking wasteland. Our industries are corrupted by a revitalized mercenary fetish for cutting overhead & maximizing dividends at a human cost. Our public gathering places are being bulldozed & replaced with more profit-sucking real estate. Our actions are monitored, dissent is catalogued, & punishment is meted out on an industrial scale. When it happens to us, so often we are incapable of placing it in a larger context. When it happens to others, we struggle with our incomplete grasp of empathy & our susceptibility to reams of misinformation. All of this, helmed by engineers, computer scientists, lawyers, entrepreneurs, politicians, & citizens who are simultaneously over- & under-educated.

I have a personal example. My dad held a degree in Nuclear Engineering & had nearly 30 years' experience in systems analysis, quality assurance, continuous improvement, & adjacent managerial disciplines in the Energy, Aerospace, & Manufacturing industries. He died a year & a half ago. The disease was systematized ignorance. The Delta variant was just a symptom.

2

jedi_tarzan t1_j9vg0ri wrote

> I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.

I read it all, though. You're probably tired of hearing it but I'm still sorry for your dad. That sucks.

Anyway... damn. Yeah. You've made excellent points.

I'm a technophile, I believe in progress, and I watch on the daily as outmoded corporate interests stymie or erase technical progress that could become cultural progress. I watch politicians flagrantly ignore scientific evidence for climate change or disease, while watching journalists encourage them.

So, perhaps much of my opinion surrounding AI is derived from those feelings and sympathies. But I think you're right on much, maybe all, of what you've said.

So now, I don't know. I don't know the best way forward. I don't think AI is going away. Companies are working out tools to detect AI-generated text, but technological progress also demands that the generators themselves get smarter.

Thank you for writing that all out, though.

2