-9 points

Are you saying that's not true? Anything to substantiate your claim?

10 points

Kay mate, rational thought 101:

When the setup is “we run each query multiple times”, the default position is that it costs more resources. If you claim they use roughly the same amount, you need to substantiate that claim.

Like, that sounds like a pretty impressive CS paper: “we figured out how to run inference N times but pay roughly the cost of one” is a hell of an abstract.
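
To put rough numbers on it, here’s a back-of-envelope Python sketch. Every figure in it is a made-up assumption (token counts, per-token price), so treat it as a sketch of the scaling rather than a cost estimate: sample each query N times and you pay roughly N times the compute.

```python
# Back-of-envelope only: all token counts and prices below are invented
# for illustration, not taken from any vendor or paper.

def call_cost(prompt_tokens: int, output_tokens: int,
              price_per_1k_tokens: float = 0.002) -> float:
    """Rough cost of one model call under flat per-token pricing."""
    return (prompt_tokens + output_tokens) / 1000 * price_per_1k_tokens

one_sample = call_cost(prompt_tokens=500, output_tokens=800)
n = 45                          # hypothetical number of samples per query
best_of_n = n * one_sample      # each sample re-runs the full generation

print(f"one sample:  ${one_sample:.4f}")
print(f"{n} samples: ${best_of_n:.4f}  (~{n}x, as you'd expect)")
```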

10 points

“…we pay for one, suckers VCs pay for the other 45”

22 points

“this thing takes more time and effort to process queries, but uses the same amount of computing resources” <- statements dreamed up by the utterly deranged.

-8 points

I often use prompts that are simple and give consistent results, and then use additional prompts for more complicated requests. Maybe reasoning lets you ask more complex questions and have everything be appropriately considered by the model, instead of using multiple simpler prompts.

Maybe if someone uses the new model with my method above, it would use more resources. I'm not really sure. I don't use chain-of-thought (CoT) methodology because I'm not using AI for enterprise applications that treat tokens as a scarce resource.
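
Some rough token arithmetic, purely for illustration and with every count a guess on my part, comparing a few simple prompts against one long reasoning call:

```python
# Pure guesswork for illustration: none of these token counts are measured.
# Compare a handful of short, simple prompts with one chain-of-thought call
# whose hidden reasoning tokens still get computed (and billed) either way.

simple_prompts = 4 * (200 + 300)   # 4 small calls, ~200 in / ~300 out each
cot_prompt = 400 + 3000 + 500      # 1 call: prompt + reasoning tokens + answer

print("simple prompts, total tokens:", simple_prompts)  # 2000
print("one CoT call, total tokens:  ", cot_prompt)      # 3900
```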

Was hoping to talk about it but I don't think I'm going to find that here.

9 points

Was hoping to talk about it but I don't think I'm going to find that here.

If only you’d asked ChatGPT “is awful.systems a good place to fellate LLMs”

8 points

Was hoping to talk about it but I don't think I'm going to find that here.

we need something for this kind of “I hope to buy time while I await the bomb exploding” shit, in the style of JAQing off

14 points

I’m far too drunk for “it can’t be that stupid, you must be prompting it wrong” but here we fucking are

Was hoping to talk about it but I don't think I'm going to find that here.

oh no shit? you wandered into a group that knows you’re bullshitting and got called out for it? wonder of fucking wonders

14 points

I often use prompts

Well, there’s your problem

14 points

“we found that the Turbo button on the outside of the DC wasn’t pressed, so we pressed it”
