Submitted by RadioFreeAmerika t3_122ilav in singularity
skob17 t1_jdrex9t wrote
Reply to comment by CommunismDoesntWork in Why is maths so hard for LLMs? by RadioFreeAmerika
One prompt takes only one path through the network to generate each token. It's still a few hundred layers deep, but it's only one pass, so the model can't iterate over a complicated math problem to solve it step by step.
Ok_Faithlessness4197 t1_jdrrdia wrote
Yes it can; you just need to prompt for a chain of thought. As another user mentioned, it can then work through complicated math problems. The issue lies in its inability to decide, without human input, when such an increase in computation is necessary.
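A minimal sketch of what a chain-of-thought prompt externalizes (my own illustration, not from the thread): long multiplication broken into per-digit partial products, so each intermediate result gets written out instead of being computed in one pass.

```python
def partial_products(a: int, b: int):
    """Decompose a * b into the per-digit partial products of long
    multiplication -- the intermediate steps a chain-of-thought prompt
    asks the model to write out instead of answering in one shot."""
    steps = []
    for place, digit in enumerate(reversed(str(b))):
        steps.append(a * int(digit) * 10 ** place)
    return steps

steps = partial_products(87176363, 198364)
# Summing the externalized steps recovers the exact product.
print(sum(steps))
```

Each step is simple enough to state in a single line of generated text; the hard part the commenters discuss is getting the model to take these steps unprompted.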
ArcticWinterZzZ t1_jdt0urg wrote
I don't think that's impossible to add. You are right: chain of thought prompting circumvents this issue. I am specifically referring to "mental math" multiplication, which GPT-4 will often attempt.
liqui_date_me t1_jdt531o wrote
You would think that GPT would have discovered a general-purpose way to multiply numbers, but it really hasn't, and it isn't accurate even with chain-of-thought prompting.
I just asked GPT4 to solve this: 87176363 times 198364
The right answer should be 17,292,652,070,132 according to Wolfram Alpha.
According to GPT4 the answer is 17,309,868,626,012.
This is the prompt I used:
What is 87176363 times 198364? Think of the problem step by step and give me an exact answer.
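For reference, exact integer arithmetic (Python ints are arbitrary-precision, so no overflow) confirms the Wolfram Alpha figure and shows how far off GPT-4's answer was:

```python
a, b = 87176363, 198364
exact = a * b
gpt4_answer = 17_309_868_626_012

print(exact)                # 17292652070132
print(gpt4_answer - exact)  # 17216555880 -- the size of the error
```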
ArcticWinterZzZ t1_jdtlkru wrote
Even if it were to perform the addition manually, long addition runs in the opposite order from the way GPT-4 generates text: carries propagate from the least significant digit upward, while the model has to emit the most significant digit of the answer first. It's unlikely to be very good at it.
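A sketch of that mismatch (my own illustration): schoolbook addition produces digits least-significant-first so carries can propagate, and the final answer only appears after reversing them, which is the opposite of the left-to-right order in which the model emits tokens.

```python
def add_digits(x: int, y: int) -> str:
    """Schoolbook addition: digits are produced least-significant-first
    (so carries can propagate), then reversed at the end -- the reverse
    of the order an autoregressive model writes the answer."""
    xs, ys = str(x)[::-1], str(y)[::-1]
    carry, out = 0, []
    for i in range(max(len(xs), len(ys))):
        d = carry
        d += int(xs[i]) if i < len(xs) else 0
        d += int(ys[i]) if i < len(ys) else 0
        carry, digit = divmod(d, 10)
        out.append(str(digit))
    if carry:
        out.append(str(carry))
    return "".join(reversed(out))

print(add_digits(87176363, 198364))  # matches str(87176363 + 198364)
```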