verybakedpotatoe t1_j6dcnf5 wrote

I find it hard to believe that it can actually obtain any of these certifications when it can't perform any original analytical examination of anything.

Go ahead and try asking it to perform analysis on publicly available information and it will fail. It can repeat analysis that anybody else has already done, and maybe recombine and repackage it into a useful summary, but it is wholly unable to answer a simple question like, "If I am leading a sow with seven piglets, how many feet are there?".

It's the saxophone backpack problem all over again; they never seem to have figured that part out. It's not really creative or intelligent, not yet. It's just brute-force fitting pre-generated reasons together until they fit better.

>EDIT: I tried to get it to explain the logic it used. I am somehow even more disappointed.

53

joec_95123 t1_j6dl36h wrote

There are 2 feet, OP's, assuming OP still has both feet. The sow and the piglets have hooves.

21

weirdgroovynerd t1_j6dm3sw wrote

This whole limb-counting exercise has become rather...

... boaring!

See?

Can AI deliver fire dad jokes like that?!

19

gurenkagurenda t1_j6dkaah wrote

I use ChatGPT to solve analytical problems all the time. The key is that you have to tell it to show its work:

> If I am leading a sow with seven piglets how many feet are there. Take it step by step

>> The sow has 4 feet

>> Each piglet has 4 feet

>> Total number of piglet feet = 4 * 7 = 28

>> Total number of feet = 4 + 28 = 32

It’s able to keep track of the analysis far better this way, and it also lets you check its work for errors.

Now obviously it left my feet out, but I think that’s a reasonable ambiguity, and not one that you usually find in professional exams. If I ask it to account for that, it gets it right.
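
If anyone wants to reproduce this outside the chat UI, here's a rough sketch using the OpenAI Python SDK. The model name and the exact wording are just placeholders on my part, not necessarily what ChatGPT itself runs:

```python
# Rough sketch: ask the same question, but tell the model to show its work.
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY set in the environment;
# the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

question = (
    "If I am leading a sow with seven piglets, how many feet are there? "
    "Take it step by step and show your work."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
)

print(response.choices[0].message.content)
```

The "take it step by step" part is what nudges it into writing out the intermediate arithmetic instead of jumping straight to a total, which also makes its mistakes easy to spot.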

6

verybakedpotatoe t1_j6dvcyq wrote

It didn't go so well for me. I need to master the special sauce to get better results.

32 is close and the reasoning is almost there, but the correct answer is 34 feet because I am leading them.

I started with the 'man from St Ives' riddle and tried to create a novel and simple version of it with a clear answer. I think I would have accepted 32 as a good effort, or even just 2 if it said they all have hooves, but 8 and 11 are just wrong.
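
For reference, the count I was after, spelled out (assuming the usual two feet for the person doing the leading):

```python
# 2 for me, 4 for the sow, 4 for each of the 7 piglets
print(2 + 4 + 7 * 4)  # 34
```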

9

TheRealDynamitri t1_j6flx29 wrote

What’s a “saxophone backpack problem”?

Tried googling but no joy

2

Nonya5 t1_j6ddypy wrote

The total number of feet would be 34, including your feet.

Now it gets it right. Humans don't always get things right on the first try either.

0

theduckspants t1_j6derw4 wrote

It just told me

"There are a total of 29 feet (8 for the sow, 7 x 4 for the piglets)."

So it thinks a sow has 8 feet and that 7x4 is 21

Then I asked what 7x4 is. It said 28.

Then I asked how many feet a sow has. It said a sow typically has four feet.

Then I re-asked the original question and it said, "There are a total of 29 feet (4 for the sow, 7 x 4 for the piglets)."
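
If you want to reproduce that kind of back-and-forth from code, the trick is keeping the whole exchange in the message history so the follow-ups have context. A minimal sketch with the OpenAI Python SDK (model name is just a placeholder):

```python
# Minimal multi-turn sketch: each follow-up sees the full conversation so far.
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()
messages = []

def ask(question):
    messages.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

print(ask("If I am leading a sow with seven piglets, how many feet are there?"))
print(ask("What is 7 x 4?"))
print(ask("How many feet does a sow have?"))
print(ask("If I am leading a sow with seven piglets, how many feet are there?"))
```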

6

clintCamp t1_j6dkrfn wrote

Which is why it would make a great virtual doctor: it could identify basic ailments that can be handled with over-the-counter medication or a prescription, but also direct you to a real doctor when things get more complicated. Most common human ailments are well documented (that's how other doctors figure them out), which is why this would work well. The only thing I could see going awry is when it tries to make things up to make you happy. It would probably still be better than real doctors at analyzing drug interactions and the like, though, since real doctors screw up like any human.

−1

trentgibbo t1_j6fb4be wrote

You're missing the problem. It doesn't know whether something is more complicated or not. It might think a rather mundane issue is serious, or vice versa.

8

AgeEffective5255 t1_j6g3z9i wrote

That doesn't stop it from encountering the same problem human doctors encounter: not having all the relevant information. We blame the people all the time, but the structures in place allow errors to happen; most of the time you can't catch a patient who is hiding symptoms or unknowingly visiting multiple doctors, so do you think ChatGPT will?

1

clintCamp t1_j6gb0vd wrote

If it were set up right, it would read in the patient's medical profile and full history, then use its full medical knowledge to ask relevant questions to narrow down potential causes, or refer them for specific testing that would update their profile. Unlike the real medical field, a ChatGPT doctor could be updated with the latest research often, so it wouldn't keep using outdated info the way MDs in real life do.

1