Submitted by Parking_Attitude_519 t3_10mvqor in technology
glonq t1_j65g2s9 wrote
Yeah, I'm sure that will solve the problem
/s
wockyman t1_j65hp6x wrote
It's a great brainstorming tool, and it's described some specifics of complicated subjects to me more clearly than my profs did. Universities need to adapt to this, not blanket ban it. Unfortunately (based on how poorly universities have adapted to other challenges in the last couple decades) it'll likely just be one more thing that drives them into the same category as newspapers and broadcast tv.
Pilferjynx t1_j66189h wrote
It's like the calculator. You use it by first knowing the key ideas and then expressing the values with the tool's computational power.
w-g t1_j66fo2e wrote
It's not that simple -- it's of course natural to ask whether the teachers are requiring rote tasks or memorizing data. But it's likely that in the future AI systems will be able to produce meaningful texts with somewhat credible argumentation. I know several teachers who do want students to think (instead of doing rote tasks) and are also worried about chatGPT. For example, you may want to assess by asking students to write an essay with the specific goal of making a point, or a rebuttal of something that was already read in class, or whatever. The problem is that chatGPT can do that -- albeit a crappy job. But the crappy job may be just enough for the student to pass.
So the question is how to do assessment, knowing that students will have access to AI tools -- not chatGPT, but the evolved versions of it and also the other AI tools yet to come. Because we are not supposed to expect people to not think for themselves...
wockyman t1_j66oado wrote
Oh I know it's not simple, but I do believe it's required. We're going to have to reconsider some of our long-held assumptions about what education is for and where assessment fits into that. ChatGPT does a C+ job if given a blind prompt. But if you talk to it for a while about a subject and get it to define and clarify first principles, it can do an A- job of producing meaningful analysis. It will blindly concoct falsehoods sometimes. It'll give you a list of general sources, but it won't cite. But I agree, we'll likely blow past those limitations in a couple of years or less. So when everyone can basically talk to the computer from Star Trek: TNG, we're going to have to change the curriculum. I expect to see more practical, project-based classes that result in a complex final product. Like a bunch of mini-dissertations.
BlindProphet0 t1_j66tfq8 wrote
I love using it for brainstorming. It helped me in my current ethics class by rephrasing some of the more complex concepts.
danielalvesrosel t1_j65lztg wrote
Just goes to show how poorly some assignments are designed. If we have the means to do something better, why not build on top of it?
Old_comfy_shoes t1_j671km9 wrote
"ChatGPT, how does one continue to use chatgpt, without any universities finding out."
DogsAreOurFriends t1_j68ggfb wrote
It is simply in place so that they can sanction someone caught using it.