Counterpoint: I have a $5 RISC-V computer, powered off USB, with a dedicated AI chip that can run basic image recognition, voice synthesis, time-series prediction, whatever-you-want algorithms. That's what people need to think about when they think about the future of AI. All this shit with piping things into datacenters to throw away energy talking to the stochastic parrot is (a) not even remotely going to be the main use for AI, and (b) not going to be necessary for tons of tasks (within a couple of years it likely won't be necessary even for ChatGPT-quality interactions). ChatGPT is the ENIAC of AI. Complaining about AI's energy use by pointing at the energy behind ChatGPT queries is the same kind of mistake as the (likely apocryphal) 1943 Thomas Watson line about there being a world market for maybe five computers. Sure, there was maybe a world market for five ENIACs, tops, but get with the picture, people. AI is much bigger than LLMs.