It's a prediction engine sold as something intelligent. It does this at such an incredible scale that the blunt underlying approach is washed out, and LLMs can give you the impression of sophisticated reasoning, in the sense of how a human judges intelligent behavior.
LLMs are built on linguistics and probability chains, very far from reasoning in the sense of the cognitive processes we intuitively perform.
There might be such a mechanism put on top of LLMs in the future (that's what Altman is selling everyone, telling us it's just around the corner), and it could enhance how this simulacrum performs. But ask it often enough and it will simply predict an answer, or, in simpler words, invent bullshit.
Once it lands in a low-confidence area, it starts to bullshit users better than any slimy used-car salesman you've ever met. If the prediction happens to land correctly, the whole spiel goes on; if not, it apologizes and recalculates the probabilities.
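To make the point concrete, here is a toy sketch of the idea, nothing like a real LLM: a made-up bigram lookup that only knows conditional next-word probabilities, picks the most likely continuation, and, when it hits a word it has never seen (the low-confidence case), still confidently emits *something*. All words and probabilities below are invented for illustration.

```python
import random

# Made-up conditional next-word probabilities; a real LLM learns billions
# of parameters, but the principle is the same: predict the likeliest token.
NEXT_WORD_PROBS = {
    "the": {"cat": 0.5, "dog": 0.3, "answer": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "slept": 0.3},
}

def predict_next(word):
    """Return the highest-probability continuation, or a confident-sounding
    filler picked at random when the word was never seen (the 'bullshit' case)."""
    probs = NEXT_WORD_PROBS.get(word)
    if probs is None:
        # Low-confidence area: the model has no grounding, yet still answers.
        return random.choice(["obviously", "certainly", "undoubtedly"])
    return max(probs, key=probs.get)

print(predict_next("the"))      # "cat" - the most probable continuation
print(predict_next("quantum"))  # an invented, confident-sounding token
```

The key point of the sketch: there is no separate "I don't know" path. Whether the input is familiar or completely unknown, the machinery produces an answer either way.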