I think Microsoft’s planned Recall feature, where they screenshot everything you do so it can be analysed by AI, isn’t as bad as everyone makes it sound. It’s only bad because Windows is closed source, so nobody can verify whether what they say is true.

But if Microsoft isn’t lying and none of the data ever leaves your PC (which is supported by the fact that you need a pretty beefy machine to use it), then it’s one of the more privacy-friendly things they’ve done recently. And I think they were fully aware that they could only sell “a thing that records everything you do” if they could convince people that it doesn’t share that data. Guess they failed.

If it were open source I might even think about using it myself, if the hardware (and consequently power) requirements weren’t so absurdly high.

  • jet@hackertalks.com · 1 month ago

    The big thing here is consent. If you run it yourself, i.e. opt into it, then it’s consensual.

    Microsoft has demonstrated over a long period of time that they are happy to force “optional” anti-consumer things onto people through:

    • Bad defaults
    • Silent updates changing settings
    • Nag screens
    • More nag screens that pop up randomly hoping you misclick
    • Deceitful UI (the only buttons are “Yes!” and “Ask me later!” — no plain “No”)
  • Boozilla@lemmy.world · 1 month ago

    Upvote for unpopular opinion.

    This “feature” is like a cop following you or your vehicle 24/7. Sure, you aren’t planning on doing anything illegal. But do you really want a cop following you 24/7?

  • catalog3115@lemmy.world · 1 month ago (edited)

    Oh! You have misunderstood the whole concept of privacy. Here’s a thought experiment for you:

    Let’s assume Microsoft is not lying 🤥. The data (the screenshots) remains on-device and is passed to some AI model, e.g. an image-to-text model, which generates text on-device. But nowhere does Microsoft guarantee that the text generated, i.e. the output of those AI models, won’t be sent to Microsoft. They only say the screenshots and the AI models remain on-device; the output/metadata can still be sent to Microsoft.

    That is the issue. Earlier there were many apps Microsoft couldn’t pry into because they were encrypted etc. Now they don’t need to break any encryption; they just need the metadata. That’s easy to transfer and use.
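    A minimal sketch of the point being made here (this is hypothetical illustration, not Microsoft’s actual code; the function and strings are made up): even if a multi-megabyte screenshot never leaves the device, the text an on-device model extracts from it is tiny, and shipping that tiny text elsewhere is trivially cheap.

    ```python
    # Hypothetical sketch of the commenter's concern. Nothing here is
    # Microsoft's real pipeline; the names and values are invented.

    def on_device_image_to_text(screenshot: bytes) -> str:
        """Stand-in for an on-device image-to-text model.

        A real model would run locally on the NPU; here we just fake
        the kind of short text output such a model might produce.
        """
        return "user opened banking app at 14:32"

    # A ~4 MB screenshot that (per Microsoft's claim) stays on-device.
    screenshot = b"\x89PNG" + b"\x00" * 4_000_000

    # The model's output: a few dozen bytes of text/metadata.
    metadata = on_device_image_to_text(screenshot)

    # The asymmetry: the metadata is orders of magnitude smaller than
    # the screenshot, so transmitting it costs almost nothing and needs
    # no encryption-breaking -- which is exactly the privacy worry.
    assert len(metadata) < 100
    assert len(screenshot) > 1_000_000
    ```

    The design point: a privacy guarantee that only covers the raw screenshots says nothing about the derived text, which is the part that is actually easy to collect.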

    • al4s@feddit.de · 1 month ago

      An unpopular opinion can have more or less thought put into it, be genuinely interesting or not, and get up- or downvoted accordingly. Just like a photograph in a photography sub can have more or less thought put into it, an interesting or boring subject, and get up- or downvoted accordingly.

      A genuine photograph, and people downvote it… in a community named “photography”.

      Sounds like utter nonsense, doesn’t it?