  • It has access to a Python interpreter and can use that to do math, but it shows you when that's happening, and it did not when I asked it.

    I asked it to do another operation, this time specifying that I wanted it to use an external tool, and it did.

    You have access to a dictionary; that doesn't prove you're incapable of spelling simple words on your own. Goddamn, people, what's with the hate boners for AI around here?
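
    For what it's worth, here's a toy illustration of what "visible" tool use means (everything below is made up for the example, not any real chatbot's API): the answer either goes through a Python interpreter and says so, or it's just the model's unverified guess.

```python
import re

def python_tool(expression: str) -> str:
    """Stand-in for a sandboxed Python interpreter; only handles 'a + b' here."""
    a, b = map(int, re.fullmatch(r"\s*(\d+)\s*\+\s*(\d+)\s*", expression).groups())
    return str(a + b)

def answer(question: str, use_external_tool: bool) -> str:
    """Answer an arithmetic question, announcing tool use when it happens."""
    if use_external_tool:
        result = python_tool(question)
        return f"[used python interpreter] {question} = {result}"
    # Without the tool, the model would just predict digits from its weights.
    return f"{question} = <model's unverified guess>"

print(answer("123456 + 654321", use_external_tool=True))
print(answer("123456 + 654321", use_external_tool=False))
```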









  • If I train an LLM to do math, for the training data I generate a + b = c statements, never showing it the same one twice.

    It would be pointless for it to "memorize" every single question and answer it gets, since it would never see that question again. The only way it could generate correct answers would be if it gained a concept of what numbers are and how the addition operation combines them to produce a new number.
    Rather than memorizing and parroting, it would have to actually understand addition in order to generate correct responses.

    It's called generalization, and it's why large amounts of data are required (if you show the same data again and again, memorizing becomes a viable strategy). A toy sketch of that kind of data generation follows at the end of this comment.

    If I say "Two plus two is four", I am communicating my belief about mathematics.

    Seems like a pointless distinction; you were told it, so you believe it to be the case? Why can't we say the LLM outputs what it believes is the correct answer? You're both just making some statement based on your prior experiences, which may or may not be true.
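
    Here is roughly what that data generation could look like, as a toy sketch (the function name and parameters are mine, not from any real training pipeline):

```python
import random

def make_addition_dataset(n_examples: int, max_operand: int = 10_000, seed: int = 0):
    """Build unique 'a+b=c' strings; no (a, b) pair is ever shown twice."""
    rng = random.Random(seed)
    seen = set()
    examples = []
    while len(examples) < n_examples:
        a, b = rng.randint(0, max_operand), rng.randint(0, max_operand)
        if (a, b) in seen:
            continue  # duplicates are skipped, so rote memorization can't help
        seen.add((a, b))
        examples.append(f"{a}+{b}={a + b}")
    return examples

print("\n".join(make_addition_dataset(5)))
```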