• FrowingFostek@lemmy.world · 8 months ago

    I like the idea of these companies dumping all this money into a technology to replace people, only for people to not buy the product that tried to replace people.

  • PonyOfWar@pawb.social · 8 months ago

    Honestly one of the AI applications I see real potential in. They can train the NPCs with an extensive backstory and the interactions with them could be way more dynamic than what we currently get for NPCs. Something like a more advanced version of “Starship Titanic”, if anyone remembers that.

    • MotoAsh@lemmy.world · 8 months ago

      You are imagining a supercomputer’s LLM running an NPC.

      It literally cannot be that fancy. Maybe they can fake it and fool a few rubes, but no, there will be no deep characters run by this.

      • PonyOfWar@pawb.social · 8 months ago

        The way it works right now is usually over the cloud. I’ve already tried out a bit of “Convai” as a developer, a platform where you can create LLM NPCs and put them in Unreal Engine. It’s pretty neat, not perfect, but you can definitely give characters thousands of lines of backstory if you want and they will act in character. They will also remember any conversations a player had with them previously and can refer to them in later convos.

        It can still be fairly obvious that you’re talking to an LLM, though, if you know what to ask and what to look for. Due to its cloud-based nature, there is also some delay between the player input and the response. But it has a lot of potential for dialog systems where you can do way more than just choose between four predefined sentences, especially once running these things locally won’t be a performance issue.
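        To be clear, this isn’t Convai’s actual API — just a rough sketch of the pattern I’m describing (backstory injected as a system prompt, full conversation history resent on each call so the NPC can refer back to earlier exchanges), with the actual cloud model call stubbed out:

```python
class LLMNPC:
    """Minimal sketch of an LLM-driven NPC with backstory and memory."""

    def __init__(self, name, backstory):
        self.name = name
        # The backstory goes in as a system prompt so the model stays in character.
        self.history = [{"role": "system", "content": f"You are {name}. {backstory}"}]

    def say(self, player_line):
        self.history.append({"role": "user", "content": player_line})
        # In a real game this would be the cloud LLM call, sending the
        # *entire* history (backstory + all prior turns) each time --
        # that resending is what makes the NPC "remember" past convos.
        reply = self._generate(player_line)
        self.history.append({"role": "assistant", "content": reply})
        return reply

    def _generate(self, player_line):
        # Stubbed response; a real backend returns model output here,
        # with some network delay, as noted above.
        return f"{self.name} considers your words: '{player_line}'"


guard = LLMNPC("Toren", "A weary gate guard who distrusts outsiders.")
guard.say("Have you seen a hooded traveller?")
guard.say("What did I just ask you?")
# history now holds the system prompt plus both full exchanges,
# which is the context a follow-up call would be grounded in.
```

        Running this locally instead of over the cloud would just mean swapping the stub for a local model call — the memory mechanism stays the same.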

    • fruitycoder@sh.itjust.works · 8 months ago

      There are a couple of indies and mods working on that! The trick is definitely lowering the compute needed, maybe through a set of fine-tuned models (which might also reduce the number of anachronisms).

  • AutoTL;DR@lemmings.world · 8 months ago

    This is the best summary I could come up with:

    In the midst of all of that success, NVIDIA is working on smaller and larger initiatives, but they all seem to have one thing in common: they are AI-centered.

    One of these smaller initiatives comes from Ubisoft Paris, where a small team is testing out how to use AI, specifically Nvidia’s Audio2Face application and Inworld’s Large Language Model (LLM), to try to make a new generation of NPCs.

    As we see many studios, especially under Microsoft, begin to form unions, like the recent announcement from Activision QA workers, it might be possible to alleviate some of the risks around introducing AI.

    This could then allow the player to have a genuine conversation of discovery that provides a bespoke unique experience but would always still be true to the human writer’s intention.

    However, with the improvement of ChatGPT over time and image and video generation, there seems to be a more open mind around the idea of having some games use integrated large language models (LLMs) for NPC interactions.

    There have even been mods for popular games like Grand Theft Auto 5, where you can talk to NPCs with ChatGPT running to answer queries.


    The original article contains 575 words, the summary contains 193 words. Saved 66%. I’m a bot and I’m open source!

  • mrfriki@lemmy.world · 8 months ago

    The company with the most filler content in their games meets the company most willing to sell filler content. A match made in heaven, I’d say.

    • olutukko@lemmy.world · 8 months ago

      No, they will invent a $3000 video card and some Nvidia feature that makes the game impossible to run on anything other than that $3000 card.

  • fruitycoder@sh.itjust.works · 8 months ago

    If it’s not open source, I’m not that interested. The game industry is full of cool but fucking useless tech because way too much of it is proprietary.