355 points

There’s some sort of cosmic irony that some hacking could legitimately just become social engineering AI chatbots to give you the password

228 points

There’s no way the model has access to that information, though.

A Google product this important must use properly scoped secret management, not just environment variables or the like.
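To make the distinction concrete, here is a toy sketch (not any real Google or cloud secrets API) contrasting an ambient environment variable, readable by anything in the process, with a scoped lookup that checks the caller's role first. The store, role names, and `get_secret` helper are all hypothetical.

```python
import os

# Stand-ins for a real secret store and its access policy (hypothetical).
SECRETS = {"db_password": "hunter2"}
ALLOWED = {"db_password": {"billing-service"}}  # which roles may read which secret


def get_secret(name: str, caller_role: str) -> str:
    """Return a secret only if the caller's role is on the allow-list."""
    if caller_role not in ALLOWED.get(name, set()):
        raise PermissionError(f"{caller_role!r} may not read {name!r}")
    return SECRETS[name]


# Ambient: any code running in the process (including whatever the
# chatbot glue code executes) can read it.
os.environ["DB_PASSWORD"] = "hunter2"
leaked = os.environ["DB_PASSWORD"]

# Scoped: an unauthorized caller (say, the chatbot frontend) is refused.
try:
    get_secret("db_password", "chatbot-frontend")
except PermissionError as err:
    print(err)
```

The point is that with scoped management there is nothing for a jailbroken model to read: the secret never sits in the model-facing process's environment in the first place.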

113 points

There’s no root login. It’s all containers.

9 points

The containers will have a root login, but the SSH port won’t be open.

12 points

The containers still run an OS, hold proprietary application code, and have memory that probably contains other users’ data. Not saying it’s likely, but containers don’t really do much to stop someone who gains privileged access from stealing information.

52 points

It’s containers all the way down!

7 points

It does if they uploaded it to GitHub.

6 points

In that case, it’ll steal someone else’s secrets!

1 point

But you could get it to convince the admin to hand over the password, without having to do anything yourself.

4 points

Still, for things like content moderation and data analysis, this could totally be a problem.

28 points

It will not surprise me at all if this becomes a thing. Advanced social engineering relies on extracting little bits of information at a time to form a complete picture without arousing suspicion. This is how really bad cases of identity theft work as well: the identity thief gets one piece of info and leverages it to get another, and another, and before you know it they’re at the DMV convincing someone to issue a driver’s license with your name and their picture on it.

They train AI models to screen for some types of fraud, but at some point it seems like it could become an endless game of whack-a-mole.

6 points

While you can get information out of them, I’m pretty sure what that person meant was that the sensitive information wouldn’t have been included in the training data or the prompt in the first place, if anyone developing it had a functioning brain cell or two.

It doesn’t know the sensitive data to give away, though it can just make it up.
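Keeping secrets out of the prompt is something you can enforce mechanically. A minimal sketch, assuming you scrub context text before it ever reaches the model (the patterns and `scrub` helper are illustrative, not any particular product's filter):

```python
import re

# Hypothetical pre-prompt scrubber: redact anything shaped like a
# credential before user/context text reaches the model, so there is
# no secret in the prompt for the model to be talked out of.
SECRET_PATTERNS = [
    re.compile(r"(?i)\b(?:api[_-]?key|password|token)\s*[:=]\s*\S+"),
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),  # AWS access key ID shape
]


def scrub(text: str) -> str:
    """Replace credential-looking substrings with a placeholder."""
    for pat in SECRET_PATTERNS:
        text = pat.sub("[REDACTED]", text)
    return text


prompt = scrub("Context: password=hunter2\nUser: what's the password?")
```

A scrubber like this is a backstop, not a fix: the robust version of the advice above is simply never routing secret-bearing data through the model at all.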


Programmer Humor

!programmerhumor@lemmy.ml
