They getting ALL the data
What data? The data that the user affirmatively agrees to send them that is anonymized? That data?
I’m sure you understand this, but anonymized data doesn’t mean it can’t be deanonymized. Given the right kind of data, or enough context, someone can figure out who you are fairly quickly.
Ex: you could “anonymize” GPS traces, but they would still show the house you live in and where you work unless you strip out a lot of the info.
http://androidpolice.com/strava-heatmaps-location-identity-doxxing-problem/
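To make the GPS point concrete, here's a minimal sketch (with made-up coordinates) of how trivially a stripped-down trace gives away a home address: even with all names removed, the most frequent overnight location is almost certainly where the owner sleeps.

```python
# Illustrative sketch with hypothetical data: infer "home" from an
# "anonymized" GPS trace by clustering overnight samples.
from collections import Counter

# (hour_of_day, lat, lon) samples from a stripped trace
trace = [
    (2, 52.5201, 13.4050),   # night samples -> home
    (3, 52.5201, 13.4050),
    (23, 52.5202, 13.4049),
    (10, 52.5306, 13.3847),  # daytime samples -> workplace
    (14, 52.5306, 13.3847),
]

def likely_home(points, night=(22, 6)):
    """Most common coordinate (rounded to ~100 m) during night hours."""
    start, end = night
    at_night = [
        (round(lat, 3), round(lon, 3))
        for hour, lat, lon in points
        if hour >= start or hour < end
    ]
    return Counter(at_night).most_common(1)[0][0]

print(likely_home(trace))  # -> (52.52, 13.405)
```

That's the whole "attack": no names needed, just a frequency count over timestamps the anonymization left in.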
Now with LLMs, sure, you could “anonymize” which user said or asked for what… but if something identifying is sent in the request itself, it won’t be hard to deanonymize that data.
So you would rather submit your non-anonymized data? Because those bastards will find a way to deanonymize it. Is Apple doing the right thing or not?
What? No. I would rather use my own local LLM where the data never leaves my device. And if I had to submit anything to ChatGPT I would want it anonymized as much as possible.
Is Apple doing the right thing? Hard to say, any answer here will just be an opinion. There are pros and cons to this decision and that’s up to the end user to decide if the benefits of using ChatGPT are worth the cost of their data. I can see some useful use cases for this tech, and I don’t blame Apple for wanting to strike while the iron is hot.
There’s not much you can really do to strip out identifying data from prompts/requests made to ChatGPT. Any anonymization of that part of the data is on OpenAI to handle.
Apple can obfuscate which user is asking for what, as well as specific location data, but if I’m using the LLM and I tell it to write up a report while including my full name in my prompt/request… that’s all going directly into OpenAI’s servers and logs, which they can eventually use to help refine/retrain their model at some point.
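This is why client-side scrubbing only gets you so far. A hypothetical sketch: regex filters catch structured PII like emails, but a plain name in free text sails straight through, so the prompt content itself re-identifies you.

```python
# Sketch of naive client-side PII scrubbing (hypothetical patterns).
# Structured identifiers are easy; names in free text are not.
import re

PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def scrub(prompt):
    for label, pat in PATTERNS.items():
        prompt = pat.sub(f"[{label}]", prompt)
    return prompt

prompt = "Write a report for Jane Q. Example, reachable at jane@example.com."
print(scrub(prompt))
# The email gets masked, but "Jane Q. Example" is untouched.
```

Catching arbitrary names reliably needs NER-grade models, and even then the surrounding context (employer, project, dates) can still identify you.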
Do you have proof they’re sending it to OpenAI?
I believe I heard it’s done on device or on iCloud servers then deleted.
I mean, that’s the claim at least
IIRC they demonstrated an interaction with Siri where it asks the user for consent before enriching the data through ChatGPT. So yeah, that seems to mean your data is sent out (if you consent).
I don’t know about the US, but in European GDPR parlance, if it can be reversed then it is NOT anonymized, and it is illegal to claim otherwise. The correct term is pseudonymized.
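The distinction is easy to show in code. A hypothetical sketch: replacing a user ID with a salted hash looks "anonymous", but as long as the operator keeps (or can rebuild) the mapping, every token is reversible, which is exactly GDPR's pseudonymization, not anonymization.

```python
# Sketch: hashed user IDs are pseudonymous, not anonymous, because
# whoever holds the salt can rebuild the token -> identity mapping.
import hashlib

def pseudonymize(user_id, salt="operator-secret"):
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]

# The operator can keep this lookup table...
mapping = {}
for uid in ["alice@example.com", "bob@example.com"]:
    mapping[pseudonymize(uid)] = uid

# ...and reverse any "anonymized" token at will.
token = pseudonymize("alice@example.com")
print(mapping[token])  # -> alice@example.com
```

Only data where no such reversal is possible, by anyone, with any auxiliary information, counts as anonymized under GDPR (Recital 26); pseudonymized data remains personal data.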
The point is that they can use that data for further training. They want to build a monopoly like Google is for search.
There’s Bing, and some others. I’m using Kagi.
Google has a significant amount of marketshare, but it doesn’t really have the ability to determine the terms on which a consumer can get access to search services, which is what lets a monopoly be a monopoly.
If you look at the announcement, they’re pretty damn boxed in. They can’t scrape the local device, or iCloud. OpenAI only gets queries that the dumber Apple models think would be better served by OpenAI. And each of those queries is prompted with a dialog that says “Do you want me to use ChatGPT to do that? Cancel / Use ChatGPT”
That said, on stage, Apple briefly mentioned that ChatGPT Plus users would have more functionality. I’ll bet money that’s the real play: LLM model subscriptions in the App Store. Apple loves that sweet sweet App Store and subscription money.
Question is, do they take a cut like with Spotify, or is basic, free GPT-4 access payment enough?