LLMs can correct but not find reasoning errors
Gladys Tyen and colleagues show that LLMs can correct reasoning errors but struggle to find them. Their work examines the self-correction process and proposes a backtracking method that improves an LLM's performance when it is told where the mistakes are.
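The backtracking idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a reasoning trace split into steps and a known index of the first mistaken step, and `regenerate` is a hypothetical stand-in for a fresh LLM completion from the correct prefix.

```python
from typing import Callable, List

def backtrack(steps: List[str],
              mistake_index: int,
              regenerate: Callable[[List[str]], List[str]]) -> List[str]:
    """Keep the steps before the first mistake, re-generate the rest.

    `regenerate` is a placeholder for an LLM call that continues the
    reasoning from the last known-correct step (the paper's method
    samples continuations rather than editing the wrong step in place).
    """
    prefix = steps[:mistake_index]     # steps known to be correct
    continuation = regenerate(prefix)  # fresh completion from the prefix
    return prefix + continuation

# Toy usage with a stub "LLM" that returns a fixed corrected continuation.
trace = ["2 + 3 = 5", "5 * 4 = 21", "21 - 1 = 20"]  # step at index 1 is wrong
fixed = backtrack(trace, 1, lambda prefix: ["5 * 4 = 20", "20 - 1 = 19"])
print(fixed)  # ['2 + 3 = 5', '5 * 4 = 20', '20 - 1 = 19']
```

The key point the example mirrors is that the model is never asked to locate the error itself; the mistake index is supplied externally, which is the setting where the authors report improved performance.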