
This is how long (and why) OpenAI's Operator holds onto your deleted data

OpenAI’s Operator on a website

Operator, OpenAI's new AI agent, will save your deleted data for two months longer than deleted data from ChatGPT.

OpenAI has some noteworthy privacy policies nestled in the fine print of Operator's help page. One says that data from your Operator interactions — chats, browsing history, and screenshots — is kept on OpenAI's servers for up to 90 days after a user deletes it, per TechCrunch, which first spotted the discrepancy. ChatGPT retains deleted data for only 30 days.

Data retention policies are standard practice, and it's common for them to range from months to years depending on the nature of the data. But Operator fundamentally has access to personal and sensitive data because it performs tasks on a user's behalf: browsing the web, logging onto sites (with your supervision), and taking regular screenshots of your screen in order to visually process the task at hand.

This data is automatically stored until you choose to delete it in your account settings. But even after you do that, "deleted chats and associated screenshots will be deleted from our systems within 90 days," according to the help page.

Why does Operator save data longer than ChatGPT?

Naturally, this raises questions about why Operator data is saved for longer than ChatGPT data. An OpenAI spokesperson told Mashable, "as agents are a relatively new technology, we wanted to make sure our teams have the time to better understand and review potential abuse vectors." This allows the OpenAI team to improve security measures and protect from misuse, the spokesperson continued.

What does Operator do with my data?

On that note, OpenAI and "authorized service providers" can also access your Operator content. This is the same as ChatGPT's policy. But with Operator, that means OpenAI can also see screenshots, which adds a new level of Big Brother surveillance. OpenAI says this is in order to investigate illegal activity or misuse, provide technical support, or "handle legal matters."

Unless you've opted out, OpenAI also uses your Operator content to train its models. The same setting that applies to ChatGPT also applies to Operator, so if you've already toggled off model sharing with ChatGPT, your Operator data stays with you. To change this setting, go to your ChatGPT account page, then Data Controls, and click "Improve the model for everyone." In the popup window, toggle off this setting and hit Done.

Given the responsibility granted to Operator, OpenAI has taken other security measures. When Operator encounters a login page, it pauses and hands control back to the user in "take over mode." In this mode, Operator stops taking screenshots. It also enters "watch mode" when navigating certain sites, such as Gmail, that require the user's supervision.



from Mashable https://ift.tt/rMBsiCc
via IFTTT

