• Pete Hahnloser@beehaw.org · 36 points · 7 months ago

    That closing quote is ominous:

    “Recall is currently in preview status,” Microsoft says on its website. “During this phase, we will collect customer feedback, develop more controls for enterprise customers to manage and govern Recall data, and improve the overall experience for users.”

    I read that as, “so, yeah, we built in all the telemetry connections we swear we’ll never use … just for testing, ya know?”

    • smallpatatas@lemm.ee · 29 points · 7 months ago

      “more controls for enterprise customers to manage and govern Recall data”

      ahh ok so this is employee monitoring software

      • klangcola@reddthat.com · 13 points · 7 months ago

        Probably more like what MangoKangaroo and B0rax talked about: enterprises can opt out of this telemetry for compliance or intellectual property protection reasons.

        So only the commoners get mandatory full-scale surveillance, ehm, I mean “AI enhancement”.

  • Jayjader@jlai.lu · 5 points · 7 months ago

    In light of the recent forays by AI projects/products into the realm of coding assistants, from Copilot to Devin, this reads to me as a sign that they’ve finally accepted that you can’t make an AI assistant that provides actual value from an LLM trained purely on text.

    This is Microsoft copying Google’s CAPTCHA homework. We trained their OCR for gBooks, and we trained their image recognition on traffic lights, buses, and stop signs.

    Now we get to train their AI assistant on how to click around a Windows OS.

    • maegul (he/they)@lemmy.ml · 1 point · 7 months ago

      Yep! I didn’t pick up on any explicit link … but the coupling of AI and Recall is no coincidence. It’s serfdom.

  • MangoKangaroo@beehaw.org · 41 points · edited · 7 months ago

    I’m curious whether the increasingly invasive telemetry of modern Windows will have legal implications surrounding patient privacy here in the US. I work IT in the healthcare field, and one of our key missions is HIPAA compliance. What, then, will be the impact if Microsoft starts storing more and more in-depth data offsite? Will keyboard entries into our EHR be tracked and stored in Microsoft’s servers? Will we subsequently be held liable if a breach at Microsoft causes this information to leak, or if Microsoft just straight-up starts selling it to advertisers? Windows is our one-and-only option for endpoint devices, so it’s not like we can just switch.

    I genuinely don’t have the answers to these questions right now, but it may become a serious conversation for our department if things continue on their current trajectory. Or maybe I’m just old and paranoid and everything will be okie dokie.

    • SapientLasagna@lemmy.ca · 6 points · 7 months ago

      Like most of Microsoft’s more odious features, this one can be turned off through GPO/Intune policy across an organization. As such, the liability will mostly fall on the organization to make sure it’s off. The privacy and security impacts will be felt by individuals and small businesses.

      They claim that the data is only stored locally, so far. We’ll see, I guess.
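
      For anyone who can’t push a GPO/Intune policy, here’s a minimal sketch of setting the per-user registry policy value that was reported to disable Recall’s snapshotting during the preview. The key and value names below are that reported policy (an assumption on my part, not verified against Microsoft’s final documentation):

          # Minimal sketch: write the reported per-user policy value said to
          # disable Recall snapshot collection in the preview builds.
          # Assumption: the WindowsAI\DisableAIDataAnalysis policy name is the
          # reported one and still honored; check current Microsoft docs first.
          import winreg

          KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsAI"
          VALUE_NAME = "DisableAIDataAnalysis"  # 1 = snapshot collection off (reported)

          # Create or open the policy key under HKCU and write the DWORD value.
          with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                                  winreg.KEY_SET_VALUE) as key:
              winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1)
          print("Policy value written; may need a sign-out or restart to take effect.")

      Setting the same value under HKEY_LOCAL_MACHINE is, presumably, what an org-wide GPO/Intune policy would do machine-wide.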

      • MangoKangaroo@beehaw.org · 4 points · 7 months ago

        Sadly, a lot of the privacy switches are exclusive to Enterprise and Education users, and our endpoints are running Pro (we have our previous supervisor to thank for that). I guess I’ll hope this is one of the ones we can just toggle off without any fuss.

    • B0rax@feddit.de · 18 points · 7 months ago

      I guess it will be like before: there’s a different version of Windows for these use cases, like Windows LTSC.

  • dumbass@leminal.space · 11 points · edited · 7 months ago

    I open Windows and it starts recording: opens Plex, plays M*A*S*H for 13 hours straight, PC shut down.

  • eveninghere@beehaw.org · 7 points · edited · 7 months ago

    Well, so, you use a password generator, and a screenshot of the password gets saved.

    This makes most password generators useless, because they display the password on screen for user feedback. You can turn this MS AI off, but I’d have no idea if a bug left it running.

  • Fluid@aussie.zone · 17 points · 7 months ago

    A lawsuit waiting to happen… someone needs to bring a class action against MS for systemic breaches of privacy. Think of all the critical infrastructure, government, medical, policing, etc. systems processing sensitive, private, and in some cases classified information.

  • MudMan@fedia.io · 8 points · 7 months ago

    I mean, no thanks.

    But they did this already, right? Their “Timeline” feature in Windows 10 recorded a log of your activities to display in the Win+Tab view. I switched it off immediately, but the point is this is a new approach to a feature they’ve shipped before.

    Everybody must have turned it off, though, because it hadn’t been present in Win 11 until now. It’s still a dumb idea.

    • jcarax@beehaw.org · 2 points · 7 months ago

      Wish I had a choice at work. Technically I can run Linux or macOS, but I’d need to run a Windows VM for a few things anyway.