Osemwaro OP t1_j06z76a wrote
Reply to comment by AlexeyKruglov in [D] Why are ChatGPT's initial responses so unrepresentative of the distribution of possibilities that its training data surely offers? by Osemwaro
Yeah, u/farmingvillein suggested that before you. The temperature parameter behaves like temperature in physics though: low temperatures (i.e. values below 1) decrease entropy by biasing the distribution towards the most probable tokens, while high temperatures increase entropy by making the distribution more uniform.
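Here's a minimal sketch of what I mean (the logits and temperature values are made up purely for illustration). Dividing the logits by T before the softmax sharpens the distribution for T < 1 and flattens it for T > 1, which you can see in the entropy:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: divide logits by T before normalizing."""
    z = logits / temperature
    z -= z.max()  # subtract the max for numerical stability
    p = np.exp(z)
    return p / p.sum()

def entropy(p):
    """Shannon entropy in nats."""
    return -np.sum(p * np.log(p))

logits = np.array([2.0, 1.0, 0.5, -1.0])  # arbitrary example logits

for T in (0.5, 1.0, 2.0):
    p = softmax(logits, T)
    print(f"T={T}: probs={np.round(p, 3)}, entropy={entropy(p):.3f}")
```

Running this, entropy is lowest at T=0.5 (mass piles onto the top token) and highest at T=2.0 (the distribution approaches uniform), matching the physics analogy.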