jagedlion t1_j8jy2ud wrote
Reply to comment by Graega in ChatGPT Passed a Major Medical Exam, but Just Barely | Researchers say ChatGPT is the first AI to receive a passing score for the U.S. Medical Licensing Exam, but it's still bad at math. by chrisdh79
So it does many of the things you listed.
It greatly compresses the training data into a tiny (by comparison) model. It runs without access to either the internet or the original training data. Its ability to run 'cheaply' is directly related to how complex the model is. Keeping the system efficient is important, and that's a major limit on how much it can store.
It was trained on 45TB of internet data, compressed and filtered down to around 500GB. That's already a pretty limited database. Then it goes further and actually 'learns' the meaning, so what's ultimately stored is 175 billion 'weights', which comes to about 700GB (each weight is 4 bytes). Still, that's a fairly 'limited' inference size. Not run-it-on-your-own-computer size, but not terrible. They say it costs a few cents per question, so, pretty cheap compared to the cost of actually hiring even a poor-quality professional.
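The weights-to-gigabytes arithmetic above is just parameters times bytes per parameter. A minimal sketch, using the GPT-3-scale figures from this comment and assuming 32-bit floats (real deployments often use lower precision):

```python
# Back-of-envelope model storage size: parameters x bytes per parameter.
num_weights = 175_000_000_000  # 175 billion parameters (GPT-3 scale)
bytes_per_weight = 4           # assuming each weight is a 32-bit float

size_bytes = num_weights * bytes_per_weight
size_gb = size_bytes / 1e9     # decimal gigabytes

print(f"{size_gb:.0f} GB")     # prints "700 GB"
```

Halving the precision to 16-bit floats would halve the footprint, which is why quantization matters so much for running models cheaply.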
It does therefore have to 'study' ahead of time.
The only thing you listed that it doesn't do is read just one source; it reads many. But the rest? It already does it.