froztbyte
I’m @froztbyte more or less everywhere that matters
there it is: sammy has gone and said people are just prompting the model wrong (I recall that exact bit being said here earlier)
but in true sammy grift fashion: you just need to be asking the right questions to trump intelligence. “why do you want to suck, as a human?” sammy asks, not understanding a moment of humanity
when I’m debugging fucked-up web pages (too often), my approach is to load the whole thing with the network inspector open (to catch requests), then right-click inspect on the element or something close to it. from there I get the element name/path/whatever, and then dig around in the request view to see what actually happened
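(a minimal sketch of the same idea, not something from the original post: if you’d rather have requests logged as they happen than squint at the network tab afterwards, you can monkey-patch fetch from the devtools console. this only catches fetch() calls, not XHR or static assets.)

```typescript
// paste into the devtools console before reloading the page;
// every fetch() the page makes gets logged with its status, final URL,
// and content-type, so a broken element can be matched to its request
const origFetch = window.fetch.bind(window);
window.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
  const res = await origFetch(input, init);
  console.log(res.status, res.url, res.headers.get("content-type"));
  return res;
};
```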
my god, some of the useful idiots there are galling
It looks like it’s reasoning pretty well to me. It came up with a correct way to count the number of r’s, it got the number correct, and then it compared it with what it had learned during pre-training. It seems that the model makes a mistake towards the end and writes STRAWBERY with two Rs and comes to the conclusion it has two.
says the tedious poster, entirely ignoring that this is an extremely atypical baseline response, which means the model is clearly operating under prior instructions about which methods to employ to “check its logic”
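(for the record, the count the quoted poster is going on about is trivially checkable; a throwaway snippet, not from the thread:)

```typescript
// "strawberry" has three r's, whatever the model talked itself into
const rs = [..."strawberry"].filter((c) => c === "r").length;
console.log(rs); // 3
```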
fucking promptfans. at least I have that paper from earlier to soothe me
I definitely don’t have the spoons to read this most recent of his emanations (yes, I am picking that word), but from just the start of it alone… god
the orange man isn’t even in the seat yet and all these motherfuckers are loudly shouting who they are