• PeterPoopshit@sh.itjust.works
    1 year ago

    It seems self-hosted AI always needs at least 13 GB of VRAM. Any GPU with more than 12 GB of VRAM conveniently costs about $1k per GB for every GB of VRAM beyond 12 (sort of like how any boat longer than 18 feet usually costs $10k per foot for every foot of length beyond 18 ft). There are projects that do it all on CPU, but still, AI GPU pricing is bullshit.
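    For anyone curious where numbers like "13 GB" come from: a rough back-of-the-envelope is weights = parameter count × bytes per parameter, plus some headroom for activations and the KV cache. This is a hedged sketch, not an exact formula — the 20% overhead figure is an assumption, and real usage varies by runtime and context length:

    ```python
    # Rough VRAM estimate for locally hosting an LLM.
    # weights = params * bits / 8; the 1.2x multiplier is an assumed
    # ~20% overhead for activations / KV cache (varies in practice).

    def vram_gb(params_billion: float, bits_per_param: int) -> float:
        weights_gb = params_billion * bits_per_param / 8  # 1B params at 8-bit ~ 1 GB
        return round(weights_gb * 1.2, 1)

    print(vram_gb(13, 8))   # a 13B-parameter model quantized to 8-bit
    print(vram_gb(7, 16))   # a 7B-parameter model at fp16
    ```

    By this estimate a 13B model at 8-bit already overflows a 12 GB card, which is roughly why the 12 GB consumer tier feels like a hard wall.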