Lugh@futurology.today to Futurology@futurology.today · 8 months ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy to deliver only linear improvements in performance. (arxiv.org)
Umbrias@beehaw.org · 8 months ago
Hallucinations are not qualia. Go talk to an LLM and look at its hallucinations yourself (you can use DuckDuckGo's implementation of ChatGPT), and you'll see why the term is used to mean something fairly different from human hallucinations.