2 points

https://www.lifewire.com/strong-ai-vs-weak-ai-7508012

Strong AI, also called artificial general intelligence (AGI), possesses the full range of human capabilities, including talking, reasoning, and emoting. So far, examples of strong AI exist only in sci-fi movies.

Weak AI is easily identified by its limitations, but strong AI remains theoretical since it should have few (if any) limitations.

https://en.m.wikipedia.org/wiki/Artificial_general_intelligence

As of 2023, complete forms of AGI remain speculative.

Boucher, Philip (March 2019). How artificial intelligence works

Today’s AI is powerful and useful, but remains far from speculated AGI or ASI.

https://www.itu.int/en/journal/001/Documents/itu2018-9.pdf

AGI represents a level of power that, to date, remains firmly in the realm of speculative fiction.

1 point

Ah, I understand you now. You don’t believe we’re close to AGI. I don’t know what to tell you. We’re moving at an incredible clip; AGI is the stated goal of the big AI players. Many experts think we are probably just one or two breakthroughs away. You’ve seen the surveys on timelines? Years to decades. Seems wise to think ahead to its implications rather than dismiss its possibility.

2 points

See the sources above, and many more. We don’t need one or two breakthroughs; we need a complete paradigm shift. We don’t even know where to start with AGI. There’s a bunch of research, but nothing has really come out of it yet. Weak AI has made impressive strides in the past few years, but the only connection between weak and strong AI is the name. Weak AI will not become strong AI as it continues to evolve; the two are completely separate avenues of research. Weak AI is still just advanced algorithms. You can’t get AGI with code alone. We’ll need a completely new type of hardware for it.

1 point

Before deep learning recently shifted the AI computing paradigm, I would have written exactly what you wrote. But lately, the opinion that we need yet another type of hardware to surpass human intelligence seems increasingly rare. Multimodal generative AI is already fairly general. For it to count as AGI in your view, would you want to see the addition of continuous learning and agentic behavior? (Or are you looking for “consciousness”?)

That said, I’m all for a new paradigm, and favor Russell’s “provably beneficial AI” approach!

1 point

This is like saying putting logs on a fire is “one or two breakthroughs away” from nuclear fusion.

LLMs have nothing in common with intelligence. They do not resemble intelligence. There is no path from that nonsense to intelligence. It’s a dead end, and a bad one.
