A prototype is available, though it’s Chrome-only and English-only at the moment. You select some text and click the extension, which will try to “return the relevant quote and inference for the user, along with links to article and quality signals”.

Under the hood, it uses ChatGPT to generate a search query, calls Wikipedia’s search API to find relevant article text, and then uses ChatGPT again to extract the relevant passage.
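In rough terms, the flow could look something like the sketch below. This is a guess based on the description above, not the extension’s actual code: the prompts, model name, and function names are all made up, though the Wikipedia search endpoint and OpenAI chat completions endpoint are real.

```typescript
const OPENAI_URL = "https://api.openai.com/v1/chat/completions";
const WIKI_API = "https://en.wikipedia.org/w/api.php";

async function askChatGPT(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(OPENAI_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder; the post just says "ChatGPT"
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

async function factCheck(selection: string, apiKey: string) {
  // Step 1: have the model turn the highlighted text into a search query.
  const query = await askChatGPT(
    `Write a short Wikipedia search query for this claim: "${selection}"`,
    apiKey,
  );

  // Step 2: run the query against Wikipedia's public search API.
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srsearch: query,
    format: "json",
    origin: "*", // enables CORS for anonymous requests
  });
  const search = await (await fetch(`${WIKI_API}?${params}`)).json();
  const top = search.query.search[0]; // results carry { title, snippet, ... }

  // Step 3: have the model extract the passage relevant to the claim.
  const quote = await askChatGPT(
    `Claim: "${selection}"\nArticle snippet: "${top.snippet}"\n` +
      `Quote the part relevant to the claim and state what it implies.`,
    apiKey,
  );

  return {
    quote,
    url: `https://en.wikipedia.org/wiki/${encodeURIComponent(top.title)}`,
  };
}
```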

26 points

How would this be different from any browser that has Wikipedia search built in?

15 points

Presumably it would evaluate claims in the text without the user having to do the search. Sounds cool to me.


It says the user has to highlight text and then click the extension.

I can currently right-click and choose “Search on Wikipedia” from the context menu. I believe this works in both Firefox and Chromium-based browsers.

Fuck AI.

3 points

I prefer a floating button next to the text over having to set my default search engine to Wikipedia or download an addon that only adds it to the context menu, lol. I need to complete my unholy trinity of levitating context buttons.
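For what it’s worth, the context-menu addon variant really is tiny. Here’s a generic sketch using Chrome’s real contextMenus API (the menu id and wording are illustrative, not from any particular addon):

```typescript
// Background/service worker script; needs the "contextMenus" permission
// declared in the extension manifest.
chrome.runtime.onInstalled.addListener(() => {
  chrome.contextMenus.create({
    id: "wikipedia-search", // illustrative id
    title: 'Search Wikipedia for "%s"', // %s expands to the selected text
    contexts: ["selection"],
  });
});

chrome.contextMenus.onClicked.addListener((info) => {
  if (info.menuItemId === "wikipedia-search" && info.selectionText) {
    chrome.tabs.create({
      url:
        "https://en.wikipedia.org/w/index.php?search=" +
        encodeURIComponent(info.selectionText),
    });
  }
});
```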

4 points

It could run the search in the background

3 points

AI!

1 point

It doesn’t have a friendly-looking duck as a logo.

18 points

AI is going to start writing entire fake research papers and books by fake authors, just so it can be cited as a source for a high-school kid using it to cheat on a 500-word essay.

3 points

We have become like really shitty gods. All this power to do freaking nothing. At least the Greek gods made lightning and thunder.

18 points

Is it that hard to fact-check things?? Not to mention, a quick web search uses much less power than AI inference…

19 points

Well, the hard truth is that AI is convenient and it sells.

1 point

a quick web search uses much less power than AI inference

Do you have a source for that? Not that I’m doubting you, just curious. I read once that the internet infrastructure required to support a cellphone uses about the same amount of electricity as an average US home.

Thinking about it, I know that LeGoog has yuge data centers to support its search engine. A simple web search hits their massive distributed DB and returns answers in subsecond time, whereas running an LLM (NOT training one, which is admittedly cuckoo-bananas energy-intensive) executes on a single GPU, albeit a hefty one.

So on one hand you have a query hitting multiple (comparatively) lightweight machines to look up results, plus all the networking gear in between. On the other, a beefy single-GPU machine.

(All of this is from the perspective of handling a single request, of course. I’m not suggesting that Wikipedia would run this service on only one machine.)

8 points

Based on this article, it seems that on average an LLM query costs about 10x as much as a search engine query.

1 point

Man - that’s wild. Thank you for coming through with a citation - I appreciate it!

3 points

A simple web search is going to hit their massive distributed DB to return answers in subsecond time.

It’s going to hit an index, not the actual data, and it’s going to return approximate rather than exact results. Tons of engineering has been done around basic search precisely to get more data locality.

I read a blog post at some point (please don’t ask me where) about Bing vs. Google when Bing started using ChatGPT, and it basically boiled down to “Google has the tech to do it; they don’t roll it out because they don’t want to eat the electricity bill. This is MS spending money to buy market share.” The cost difference between serving a search and having ChatGPT answer a question was something like 10x. It might not stay that way forever, though, what with beating models down to work in ternary and such: that’s not just massive quantisation but also much easier maths. Convolutions don’t need much maths when all you deal with is -1, 0, and 1; IIRC you can throw out the multiplication unit and work with nothing but shifts and adds.
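To make the “throw out the multiplication unit” point concrete, here’s a toy sketch (mine, not from the half-remembered post): with weights restricted to -1, 0, and 1, a dot product reduces to adds and subtracts. (Shifts additionally come into play when weights are quantised to powers of two; pure ternary only needs the adds.)

```typescript
// Toy illustration of ternary-weight arithmetic: with weights in {-1, 0, 1},
// a dot product needs no multiplier at all, only adds and subtracts.
function ternaryDot(weights: Int8Array, activations: Float32Array): number {
  let acc = 0;
  for (let i = 0; i < weights.length; i++) {
    if (weights[i] === 1) acc += activations[i]; // +1: add
    else if (weights[i] === -1) acc -= activations[i]; // -1: subtract
    // 0: skip the element entirely
  }
  return acc;
}
```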

11 points

While I see this as one of the rare nice uses of AI, if the goal is just to fact-check some text found on the web, the extension could also just fetch the page directly instead of using an AI.

Using LLMs here might be overkill, I think…

4 points

The problem arises when a site words things differently from the Wikipedia article. Wikipedia’s search engine isn’t that good, so that mismatch could make the extension fail often enough to stunt retention.

1 point

Wikipedia’s search engine isn’t that good

That’s a pretty big understatement. It’s pretty awful.

11 points

I’m skeptical, given how confidently many recent AI models make wrong claims. Fact-checking seems to be a rather poor use case for current AI models, IMO.

7 points

This looks less like the LLM making a claim and more like using an LLM to generate a search query and then read through the results to find anything that might relate to the highlighted text.

It leans into the things LLMs are pretty good at (summarizing natural language; constructing queries according to a given pattern; checking through text for content that matches semantically instead of literally) and links directly to a source instead of leaning on the thing that LLMs only pretend to be good at (synthesizing answers).

