Comment by sega_sai
12 days ago
If you think that the output of the current LLM is the ground truth, then yes, what they are doing is biasing.
Bias tilting, in other words. The opposite direction would be "checking and reasoning".