• 0 Posts
  • 49 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • evo@sh.itjust.works to Android@lemdro.id · Pixel 2023 December feature drop · 11 months ago

    > where the fuck “generative” is in the title

    > LLMs and diffusion models have been in apps for months.

    Show me a single example of an app that has an LLM on device. Find a single one that isn’t making an API call to a powerful server running the LLM. Show me the app update that adds a multi-gigabyte LLM to the device. I’ll wait…
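    That test boils down to something like the sketch below (Kotlin; the data-dir path and the ~1 GiB threshold are illustrative assumptions, not any real app’s or SDK’s API):

    ```kotlin
    import java.io.File

    // Sketch of the criterion above: a genuinely on-device LLM implies multi-gigabyte
    // weights sitting in the app's own storage, not thin glue code that POSTs the
    // prompt to a server. Path and threshold are made-up placeholders.
    fun looksLikeOnDeviceLlm(appDataDir: File): Boolean {
        val multiGigabyte = 1L shl 30        // ~1 GiB; real model weights are larger
        return appDataDir.walkTopDown()
            .filter { it.isFile }
            .any { it.length() > multiGigabyte }
    }

    fun main() {
        // Hypothetical usage: point it at an app's files dir pulled off a device.
        println(looksLikeOnDeviceLlm(File("/data/data/com.example.app/files")))
    }
    ```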

    Feel free to not respond when you realize you are wrong and you have no clue what everyone else is talking about.



  • At a glance I was confused/angry about why this would only be for the Pixel 8 Pro and not the standard Pixel 8, considering they both have the same Tensor G3.

    However, from my own testing, it seems very likely that the full 12 GB of RAM the Pro has (vs. the 8 GB in the Pixel 8) is needed for some of these tasks, like summarization.
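    A rough back-of-envelope estimate (assuming ~3.25B parameters for the larger Gemini Nano variant, 4-bit weights, and ~1 GB of runtime overhead; illustrative numbers, not official figures) shows why the extra headroom matters:

    ```kotlin
    // Back-of-envelope RAM estimate for keeping an on-device model resident.
    // All figures below are assumptions for illustration, not Google's numbers.
    fun main() {
        val parameters = 3.25e9              // assumed size of the larger Gemini Nano
        val bytesPerWeight = 0.5             // 4-bit quantized weights
        val weightsGb = parameters * bytesPerWeight / 1e9
        val runtimeOverheadGb = 1.0          // KV cache, activations, runtime (assumed)
        val modelGb = weightsGb + runtimeOverheadGb

        println("Weights alone  ≈ %.1f GB".format(weightsGb))    // ≈ 1.6 GB
        println("Model resident ≈ %.1f GB".format(modelGb))      // ≈ 2.6 GB

        // Android plus foreground apps already claim several GB, so pinning another
        // 2-3 GB for the model is a much tighter fit on 8 GB than on the Pro's 12 GB.
    }
    ```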


  • evo@sh.itjust.works to Android@lemdro.id · Pixel 2023 December feature drop · edited · 11 months ago

    Do you know how to read?

    > Gemini Nano now powers on-device generative AI features for Pixel 8 Pro

    Technically autocomplete can be considered Gen AI, but it obviously lacks the creativity that we all associate with Gen AI today. You don’t need a generally useful model to do autocomplete.

    The point is that it didn’t take a generally useful Gen AI model to do autocomplete before, but Google is now shipping features (beyond autocomplete) that use such a model. Gen AI on device is novel.