• dependencyinjection@discuss.tchncs.de · 4 months ago

    Well, most of the requests are handled on device with their own models. If it’s going to ChatGPT for something it will ask for permission and then use ChatGPT.

    So the Apple Intelligence isn’t all ChatGPT. I think this deserves to be mentioned as a lot of the processing will be on device.

    Also, I believe part of the deal is that ChatGPT can save nothing and that Apple is anonymising the requests too.

    • Blue_Morpho@lemmy.world · 4 months ago

      Well, most of the requests are handled on device

      Doubt.

      Voice recognition, image recognition, yes. But actual questions will go to Apple servers.

      • dependencyinjection@discuss.tchncs.de · edited · 4 months ago

        Doubt.

        Is this conjecture, or can you provide some further reading, in the interest of not spreading misinformation?

        Edit: I decided to read the info from Apple.

        With Private Cloud Compute, Apple sets a new standard for privacy in AI, with the ability to flex and scale computational capacity between on-device processing, and larger, server-based models that run on dedicated Apple silicon servers. When requests are routed to Private Cloud Compute, data is not stored or made accessible to Apple and is only used to fulfill the user’s requests, and independent experts can verify this privacy.

        Additionally, access to ChatGPT is integrated into Siri and systemwide Writing Tools across Apple’s platforms, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to jump between tools.

        Say what you will about Apple, but privacy isn’t a concern for me here. Perhaps some independent experts will verify this in time.

        • Blue_Morpho@lemmy.world · 4 months ago

          Which is exactly what I said. It’s not local.

          That they are keeping the data you send private is irrelevant to the OP’s claim that the AI model answering questions is local.

      • dependencyinjection@discuss.tchncs.de · 4 months ago

        Brother I do not care about your doubts.

        I want hard facts here.

        Do you think that if you enter into a contract with a company like Apple, they’ll just be like, “aww shit, they weren’t supposed to do that. Anyway, let’s carry on”?

        No. This would open OpenAI up to potential lawsuits.

        Even if they did save stuff, it gets anonymised by Apple before even being sent to ChatGPT’s servers.

        • Fedizen@lemmy.world · 4 months ago

          The hard fact is OpenAI is already exposing itself to lawsuits by training on copyrighted material.

          So the burden of proof here should be: “what makes them trustworthy this time?”

          • micka190@lemmy.world · 4 months ago

            There’s kind of a difference between “we scraped the internet and decided to use copyrighted content anyways because we decided to interpret copyright law as not being applicable to the content we generate using copyrighted content” (omegalul) and “we explicitly agreed to a legally-binding contract with Apple stating we won’t do that”.