LettucePrime OP t1_j9sho3d wrote
Reply to comment by jedi_tarzan in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
EDIT: I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.
Later in the thread I used a better comparison: Wolfram Alpha is not used to teach pre-calculus. Four-function calculators are not used to teach basic arithmetic. We gate a student's "generative ability" based on the skills we want them to develop. Trigonometry does not measure a student's ability to draw a sine function, but rather their ability to represent, measure, & manipulate one. The robot can draw the line to match your function; that's the easy part. Making sure your function is correct is the part you need to learn.
The essay is the function, not the line. It is the proof of the struggle with something new that will produce necessary skills for development. At the very least, it's proof that the user can read a new thing & generate a cogent output from it, which is such an impressive accomplishment in nature that teaching it to machines has caused significant economic & social disruptions.
It's evidence of a user's ability to interrelate information - a process so complex it must be done essentially from scratch every time the user alters even one parameter of their data set. Where mathematical reasoning, at least elementary math, grows linearly in complexity, allowing students to compress portions of the process generatively, no such linearity exists in any other discipline. No one studying Faust says: "I learned about 17th century English literature last year. I'll just plug Paradise Lost into the machine to return a comparison between Milton's & Goethe's portrayals of aberrant desire."
Lastly, it's evidence of the user's ability to communicate, which can be considered a complex test of metacognition, a much simpler test of the arbitrary constraints of syntax, & a gauge for how fulfilling the experience was for the user. At the end of the day, that is what it's about.
We need people to have all of these skills. Many of them are difficult to learn. Most of them overlap with ChatGPT's advertised features. We are asking our education system to revolutionize itself in response to a new toy in an extremely short time while extremely underfunded & extremely overtaxed. This is a recipe for a goddamn catastrophe.
You asked what the actual fallout of the last several decades of neglecting liberal arts education has been, &, if I may be perfectly frank, I think it's produced a fucking wasteland. Our industries are corrupted by a revitalized mercenary fetish for cutting overhead & maximizing dividends at a human cost. Our public gathering places are being bulldozed & replaced with more profit-sucking real estate. Our actions are monitored, dissent is catalogued, & punishment is divvied out on an industrial scale. When it happens to us, so often we are incapable of placing it in a larger context. When it happens to others, we struggle with our incomplete grasp of empathy & susceptibility to reams of misinformation. All of this, helmed by engineers, computer scientists, lawyers, entrepreneurs, politicians, & citizens simultaneously over & under-educated.
I have a personal example. My dad held a degree in Nuclear Engineering & had nearly 30 years' experience in systems analysis, quality assurance, continuous improvement, & adjacent managerial disciplines in the Energy, Aerospace, & Manufacturing industries. He died a year & a half ago. The disease was systematized ignorance. The Delta variant was just a symptom.
jedi_tarzan t1_j9vg0ri wrote
> I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.
I read it all, though. You're probably tired of hearing it but I'm still sorry for your dad. That sucks.
Anyway... damn. Yeah. You've made excellent points.
I'm a technophile, I believe in progress, and I watch on the daily as outmoded corporate interests stymie or erase technical progress that could become cultural progress. I watch politicians flagrantly ignore scientific evidence for climate change or disease, while journalists encourage them.
So, perhaps much of my opinion surrounding AI is derived from those feelings and sympathies. But I think you're right on much, maybe all, of what you've said.
So now, I don't know. I don't know the best way forward. I don't think AI is going away. Companies are working on tools to detect AI-generated text, but technological progress also demands that the AI generators get smarter.
Thank you for writing that all out, though.