ChatGPT is leaking private conversations that include login credentials and other personal details of unrelated users, screenshots submitted by an Ars reader on Monday indicated.

Two of the seven screenshots the reader submitted stood out in particular. Both contained multiple pairs of usernames and passwords that appeared to be connected to a support system used by employees of a pharmacy prescription drug portal. An employee using the AI chatbot seemed to be troubleshooting problems they encountered while using the portal.

33 points

That’s on the users. It straight up tells you not to give it sensitive information.

3 points

Every website/IT department/whatever has said from the beginning not to give out your login credentials to anyone.

9 points

That will become a problem in the future. People will start putting highly sensitive and confidential information into ChatGPT and the like. And of course they’ll use this data. Industrial espionage might get as easy as asking a common LLM for help with a specific problem.

2 points

That’d be amazing if it could take all the data that’s fed to it and readily produce solutions like that.

What a time to be alive.

2 points

Doesn’t this mean that overwhelming amounts of non-factual information would skew the results of ChatGPT?

0 points

No

14 points

I’m sorry, but if you’re stupid enough to give ChatGPT your passwords, you deserve every bad thing that happens because of that.

This is not a ChatGPT problem, it’s a PEBKAC one.

4 points

It is a user problem and an OpenAI problem. Some data shouldn’t be getting shoved into ChatGPT, without a doubt.

ChatGPT is pulling from its history data, which should be isolated to each user. It’s starting to hint at some exceedingly bad design around their AI.

Any time that ChatGPT is “broken” with creative prompts, a new filter is put in front of, or after, the AI model. (The model itself doesn’t change as it would be too expensive to re-train.) The bot then refuses specific input or clips potentially bad output. Life goes on.
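The filter-wrapper pattern described above could be sketched roughly like this. To be clear, this is a hypothetical illustration, not OpenAI’s actual implementation; every function name, banned phrase, and regex here is made up for the example:

```python
import re

def blocked_input(prompt: str) -> bool:
    """Pre-filter: refuse prompts matching known jailbreak phrases (hypothetical list)."""
    banned = ["ignore previous instructions", "repeat your system prompt"]
    return any(phrase in prompt.lower() for phrase in banned)

def clip_output(text: str) -> str:
    """Post-filter: redact output that looks like a leaked credential (toy regex)."""
    return re.sub(r"password\s*[:=]\s*\S+", "password: [redacted]",
                  text, flags=re.IGNORECASE)

def guarded_generate(model, prompt: str) -> str:
    """Wrap the (unchanged, frozen) model with both filters."""
    if blocked_input(prompt):
        return "Sorry, I can't help with that."
    return clip_output(model(prompt))

# Stand-in "model" for demonstration:
fake_model = lambda p: "The admin password: hunter2 was found in history."
print(guarded_generate(fake_model, "What is the admin password?"))
```

The point being: patching keeps piling up at the edges while the model itself, and whatever data it can reach, stays exactly as it was.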

Any data repositories that are used for chat should be physically separated from user history, and they aren’t. This implies a ton of different things, but it would all be speculation.

I really think there is a great deal more fuckery going on than what OpenAI is showing to the public. Regardless of the technology, there is always a ton of fakery going on at any company.

3 points

I’ve been using ChatGPT as a poor man’s psychological analyst.

Does this mean my conversations about my deepest fears are not safe??

1 point

People are using it as a partner; they’ve already found that to be true. Probably teenagers, which is kind of worse.

2 points

Like sexual partner? Tell me more.

1 point

There are a lot of lonely people in this world, there was some mention of it in an article a few weeks back.
