• 0 Posts
  • 15 Comments
Joined 5 days ago
Cake day: December 18th, 2024

  • Start bulking up: eat well, follow a solid exercise routine, and get a bit of help from anabolic steroids. Pose shirtless, flexing your biceps, with a formula-filled blackboard in the background for Instagram and Twitter. Become the math bodybuilding icon. Make jokes like “my muscles are not differentially equal to yours”. Build an audience, and after that you can expand into sponsorships and OnlyFans. You can also do IRL prostitution and earn thousands of dollars per night. The key is to target either old hags or rich homosexuals.

    Good luck. May your biceps look like the bell curve of a Gaussian distribution.


  • Large-context-window LLMs can do quite a bit more than gap-filling and completion: they can edit multiple files at once.

    Yet they’re unreliable, because they hallucinate all the time. Debugging LLM-generated code is a new skill, and it’s up to you whether to learn it; I see a fairly even split among devs. I think it’s worth it, though it once took me two hours to find a very obscure bug in LLM-generated code.
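
    As an illustration (my own, hypothetical, not the actual bug from that session), here is a classic Python pitfall that generated code reproduces regularly; the function names are made up:

        # Hypothetical example of a subtle LLM-style bug: a mutable default
        # argument is created once at function definition, so state silently
        # leaks across calls.
        def collect_errors(error, seen=[]):      # BUG: `seen` is shared
            seen.append(error)
            return seen

        print(collect_errors("timeout"))         # ['timeout']
        print(collect_errors("refused"))         # ['timeout', 'refused']  (!)

        # Fix: use None as a sentinel and build a fresh list per call.
        def collect_errors_fixed(error, seen=None):
            if seen is None:
                seen = []
            seen.append(error)
            return seen

        print(collect_errors_fixed("timeout"))   # ['timeout']
        print(collect_errors_fixed("refused"))   # ['refused']

    Code like the buggy version runs fine in a quick test and only misbehaves on the second call, which is exactly why such bugs take hours to track down.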


  • Large gains came from scaling hardware and data; the training algorithms themselves didn’t change much, though transformers allowed for much higher parallelization (sketched at the end of this comment). There are no signs of the process becoming self-improving, and agentic performance is horrible, as you can see with Claude (15% of tasks successful).

    What happens in the brain is a big mystery, so it cannot be mimicked. Biological neural networks do not exist: the synaptic cleft is an artifact, living neurons are round, and axons are the result of dehydration with ethanol or xylene.
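
    To make the parallelization point concrete, here is a minimal sketch of my own (plain NumPy, toy dimensions, made-up variable names) contrasting an RNN’s inherently sequential recurrence with self-attention, where every position is handled in a few batched matrix products:

        import numpy as np

        rng = np.random.default_rng(0)
        T, d = 6, 4                      # toy sequence length and model width
        x = rng.normal(size=(T, d))      # input token embeddings

        # RNN-style recurrence: step t depends on step t-1, so the time loop
        # cannot be parallelized across positions.
        W_x, W_h = rng.normal(size=(d, d)), rng.normal(size=(d, d))
        h = np.zeros(d)
        rnn_out = []
        for t in range(T):
            h = np.tanh(x[t] @ W_x + h @ W_h)
            rnn_out.append(h)

        # Self-attention: queries, keys, and values for ALL positions come
        # from independent matrix multiplies, so the whole sequence is
        # processed at once; this is the property that let training scale
        # across GPU hardware.
        W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
        Q, K, V = x @ W_q, x @ W_k, x @ W_v
        scores = Q @ K.T / np.sqrt(d)            # all (T, T) pairs, no loop
        weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
        attn_out = weights @ V                   # (T, d)

        print(np.array(rnn_out).shape, attn_out.shape)   # (6, 4) (6, 4)

    The attention path has no dependency between time steps, so the same math runs as a handful of large matrix multiplies; the RNN path produces output of the same shape but must walk the sequence one step at a time.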