-7 points

They don’t care whether they earn money in the next 5-7 years.

And they will hit the point of a great model doing human work for less than a monthly salary. It’s just a matter of time.

9 points

I’m incredulous.

There was that thread asking what people are using LLMs for and it pretty much came down to “softening language in emails”.

For most jobs LLMs can provide a small productivity bump.

IMO if an LLM can do most of your job then you’re not producing much value anyway.

5 points

I’d rather say that it’s a matter of exponentially increasing funding and computing power.

5 points

Without enough funding, they absolutely will care.

That’s between $33 billion and $47 billion at current costs. Someone needs to fund that.

I’d also note that their models seem to be getting worse, with outright irrelevant answers, worse performance, failures to follow instructions, etc. Stanford and UC Berkeley did a months-long comparison, and even basic math is going downhill.

5 points

LLMs are not advancing enough anymore. There just isn’t any more useful human-generated text to train new models on. The net is already full of AI-generated slop. OpenAI currently spends $2.35 to make $1. It’s fundamentally unsustainable.

-2 points

It cost a billion dollars to develop solar cells before the first product was even sold.

They cost $100,000 when they first went on sale.

They go for under 10 bucks per square today.

And it’s like that for any technology ever invented.

7 points

Solar panels are useful though.

2 points

Yes, but solar cells are in the end very simple products made of very simple resources, with one limited task: converting one type of energy into another. That said, there is still research into making them more efficient and cheaper, and that research isn’t cheap.

But generative AI / LLMs take an insane amount of resources to train and maintain, are complex to create, serve a very complex task, and a slight increase in quality takes progressively more resources (say, 10% better quality for 50% more energy use - I don’t have the numbers anymore, but IIRC they were even worse). A better LLM would therefore be much, much more expensive, while people are apparently already underwhelmed with the latest models. With growing competition, fast-rising costs and meagre quality updates, while these companies are already unable to sustain themselves financially, I truly don’t see it. Honestly, this is why I think Microsoft is cramming its subpar Copilot into everything - to sort of justify all the money it has pumped into this.

2 points

It’s also like that for nearly every technology that has failed. For every Amazon that ran in the red until it grabbed enough market share to turn a profit, there are 1,000 firms that went tits-up, never having made a cent. (The actual constant may vary from 1,000, but it’s pretty damn big regardless.)

1 point

I am honestly very very curious: how?

