• chicken@lemmy.dbzer0.com
    1 year ago

A year ago local LLMs just weren’t there, but the stuff you can run now with 8 GB of VRAM is pretty amazing, even if it’s not quite as good as GPT-4 yet. Honestly, even if it stops right where it is, it’s still powerful enough to be a foundation for a more accessible and efficient way to interface with computers.