• Cris@lemmy.world · 1 day ago

    Yeah, translation is the only thing here I have any desire at all to use

    • Pennomi@lemmy.world · 23 hours ago

      There are other practical things to use a browser-based AI for, like accessibility improvements for sites that didn’t provide proper alt text or other accessibility features.

      But I think an important one that few people are talking about is an AI “babysitter” for my grandma so she doesn’t fall for phishing scams. Ad blocking does a lot to protect people there, but smarter detection would be good for a significant chunk of society. Not that this exists yet.

      • Cris@lemmy.world · 14 hours ago

        Yeah, I think accessibility is one of the few genuinely valuable potential use cases for generative AI.

        If you can generate alt text completely locally, in a private, open-source way, that’d be pretty cool. It’d be nice if it functioned as an extension that only does anything when you intentionally call it, at least by default. Maybe mapped to a keyboard shortcut, but I don’t know enough about what visually impaired users need to have a meaningful perspective on how the user experience should be implemented.
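
        As a rough sketch of the kind of thing I mean (purely hypothetical: the "describe-image" command name, the message types, and the localhost captioning endpoint are all made up, and it assumes some local vision model is already running behind that endpoint), a WebExtension background script could look something like this:

        ```ts
        // Background script of a hypothetical "describe image" extension.
        // Assumes a "describe-image" keyboard command is declared in manifest.json
        // and a local captioning model is listening on http://localhost:8080/caption.
        browser.commands.onCommand.addListener(async (command) => {
          if (command !== "describe-image") return;

          const [tab] = await browser.tabs.query({ active: true, currentWindow: true });
          if (!tab?.id) return;

          // Ask the content script which image currently has focus.
          const imageUrl = await browser.tabs.sendMessage(tab.id, { type: "get-focused-image" });
          if (!imageUrl) return;

          // Caption it with the local model; nothing leaves the machine.
          const res = await fetch("http://localhost:8080/caption", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ url: imageUrl }),
          });
          const { caption } = await res.json();

          // Hand the text back so the content script can fill in the missing alt
          // attribute or announce it through an ARIA live region.
          await browser.tabs.sendMessage(tab.id, { type: "set-alt-text", imageUrl, caption });
        });
        ```

        The point being that nothing runs until you press the shortcut, and the model never sees a page you didn’t explicitly ask about.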

        But I’ve yet to see any companies talk about how that’s what they’re gonna use AI for. To me, AI has hypothetical usefulness specifically for tasks that are really important but that in practice no one actually puts resources into, or has any resources to put into them.

        I also kinda wonder if local-only AI moderation tools could make it a lot easier for fediverse mods to cover a lot of ground, or could abstract really disturbing content to reduce the mental load of moderating out the worst stuff folks post on the internet.
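
        Something like this is what I picture for that second part; again purely hypothetical (the localhost scoring endpoint, the response shape, and the .report-queue selector are all made up), but it shows how abstracting the disturbing stuff could run entirely on the mod’s own machine:

        ```ts
        // Hypothetical sketch: blur reported images that a local classifier flags,
        // until the moderator explicitly clicks to reveal them.
        // Assumes a local model behind http://localhost:8080/score returning { score: 0..1 }.
        const REVEAL_THRESHOLD = 0.8;

        async function screenReportedImage(img: HTMLImageElement): Promise<void> {
          const res = await fetch("http://localhost:8080/score", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ url: img.src }),
          });
          const { score } = await res.json();

          if (score >= REVEAL_THRESHOLD) {
            // Abstract the content instead of showing it outright.
            img.style.filter = "blur(24px)";
            img.title = "Flagged by local model (click to reveal)";
            img.addEventListener("click", () => { img.style.filter = ""; }, { once: true });
          }
        }

        // Run over everything in the report queue.
        document.querySelectorAll<HTMLImageElement>(".report-queue img")
          .forEach((img) => void screenReportedImage(img));
        ```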

        But no one is interested in AI tech so they can build useful things; they’re interested in it because they can steal people’s intellectual labor and build products without having to pay the people who would produce that intellectual labor.

      • BroBot9000@lemmy.world · 23 hours ago

        You can make software accessible without AI; companies just don’t want to spend the money on it. It’s a business decision every company makes to save money on accessibility, a calculated cost to them.

        Education is the answer, not creating a nanny state with fucking AI babysitting and spying on everyone.

        This is some big brother level of bullshit just because, oh no, old people might be stupid enough to fuck themselves over. If they are that technically incapable, maybe it’s time to set them up with a device without any extra options.

        • Pennomi@lemmy.world · 21 hours ago

          You might make software accessible without AI, but I sure as heck don’t trust everyone else to do it right. Unless you control all software (spoiler: you don’t), your point is merely theoretical.

          Automated browser-level accessibility to cover for other people’s errors is a perfectly reasonable solution to the problem that most developers are fucking lazy.

          Plus, there is no “big brother bullshit” when it comes to locally run AI. Your data never leaves your device, and the model weights are freely auditable. You seem to fundamentally misunderstand how this works.

          • BroBot9000@lemmy.world · 21 hours ago

            And you are fucking gullible to think giving AI agents root-level access for the sake of accessibility is reasonable, or that the big companies are even going to allow local models in the future.

            AI apologists get blocked.

            • Pennomi@lemmy.world · 20 hours ago

              Don’t put words in my mouth. Root-level access is obviously stupid. AI doesn’t have to be owned by corporations. If you think all AI is the same, you are the gullible one: open your eyes and actually look at the options out there. Apache-licensed models exist, and no corporation can control that.

              You could say the same thing about all software: when it’s run by corporations, it’s almost certainly there to harvest your data. But there is a ton of ethical software too!

          • Cris@lemmy.world · 14 hours ago

            I believe their point was that site developers should just include alt text, without AI ever being needed.

            But in practice, sometimes they don’t, and that’s been an issue for ages. And in practice, a lot of images on the internet are user-generated or user-submitted, and most sites don’t have a culture of users writing alt text for their images. Mastodon does, but even Lemmy doesn’t really.

        • Pasta Dental@sh.itjust.works · 21 hours ago

          This education thing has been said again and again since, what, the 90s? And guess what, people still fall for scams! For the same reason that an antivirus is a crucial security component for the non-tech-enthusiast crowd, an AI scam detector that runs locally will likely be a crucial tool in the next few years.