Hacker News
player1234
7 months ago
on:
Why language models hallucinate
There is no such thing as confidence about the actual facts, only confidence in the probable output given the input. Factual confidence is impossible with the current architecture.
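The distinction the comment draws can be sketched in a few lines: a language model's "confidence" is a softmax over next-token logits, i.e. P(token | input), which is entirely separate from whether the chosen token is factually correct. The prompt, candidate tokens, and logit values below are made up for illustration.

```python
import math

# Hypothetical logits a model might assign to candidate next tokens
# after the prompt "The capital of Australia is" -- illustrative numbers only.
logits = {"Sydney": 4.1, "Canberra": 3.7, "Melbourne": 1.2}

# Softmax turns logits into a probability distribution over tokens.
# This distribution is the only "confidence" the model computes.
m = max(logits.values())
exp = {tok: math.exp(v - m) for tok, v in logits.items()}
total = sum(exp.values())
probs = {tok: e / total for tok, e in exp.items()}

# The highest-probability token wins regardless of factual truth:
# in this made-up example the model would confidently say "Sydney",
# even though the correct answer is Canberra.
best = max(probs, key=probs.get)
```

Nothing in this computation references truth; a high P(token | input) only means the token is a likely continuation of the input under the training distribution.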