18 points

sometimes-correct summary without needing to click on a single result

Crazier than it sounds. We don’t see the page contents AT ALL by default.

3 points

Yet again destroying the internet even further.

No no no, you don’t want to go see pages with creative content. Stay here in my walled little garden, I have such wonders to show you

15 points

If you haven’t already, folks, switch your default search engine over to a SearX instance. You’ll gain back the ability to actually find useful results. It’s not so good for shopping, though.

2 points

I wasn’t aware of this search engine. Thanks!

14 points

If I were a content creator, why would I still let Google crawl my site? It probably won’t bring any traffic to my site anyway.

15 points

Site owners haven’t figured that out yet. They still cling to the notion that search optimization works. And it still does, to some extent.

Like, if you’re a small business owner providing local services in your city and you get customers that find you through Google, what can you do except continue to optimize for Google?

2 points

True, that makes sense in your example. But in cases like newspapers and journalism, which earn ad revenue when people visit their articles, they will eventually lose that revenue once Gemini answers everything. Then again, as you said, if they don’t let Google crawl them, they lose the ad revenue now. Tough choice.

76 points

Billions of queries becoming way more energy intensive for a feature almost nobody asked for, now the default. What the fuck are we even doing

15 points

And it will hallucinate and give wrong answers

20 points

Appeasing shareholders and investors.

13 points

How? Are they expecting more ad income to offset the energy costs?

10 points

“Google, how do I calculate the circumference of a sphere?”

“Sign up for online math classes with University of Arizona today!”

9 points

Probably injecting ads “naturally” into the conversation.

34 points

Awesome. Truly spectacular.

Generative AI is so energy intensive ($$$) that Google is requiring users to subscribe to Gemini.

Google is entirely dependent on advertising sales. Ad revenue subsidizes literally everything else, from Android development to whichever 8-12 products and services they launch and subsequently cancel each year.

Now, Google wants to remove web results and just use generative AI instead of search as its default user interface.

So, like I said: Awesome.

12 points

While I agree in principle, one thing I’d like to clarify is that TRAINING is super energy intensive; once the network is trained, it’s more or less static. Actually running the network isn’t dramatically more energy-intensive than any other indexed database lookup.

10 points

Training will never stop, tho.
New models will keep coming out, datasets and parameters are going to change.

2 points

I firmly believe it will slow down significantly. My prediction for the future is that there will be a much bigger focus on a few “base” models that will be tweaked slightly for different roles, rather than “from the ground up” retraining like we see now. The industry is already starting to move in that direction.

22 points

It’s static, yes, but the static price is orders of magnitude higher. It still involves loading the whole model into VRAM and performing matrix multiplication on trillions of numbers

2 points

Indexing and lookups on datasets as big as the ones companies like Google and Amazon run also take trillions of operations to complete, especially once you account for the constant reindexing that has to be done. In some cases, encoding data into a neural network is actually cheaper than storing the data itself. You can see this in practice with Gaussian splatting point cloud capture, where networks are trained to guide points in the cloud at runtime, rather than storing the positions of trillions of points over time.

5 points

To be fair, I wouldn’t count “loading the whole model into VRAM” as part of the cost, since they can just keep it resident between requests, and the parameter count might be down to hundreds of billions, or dozens of billions, instead of trillions… but even after all those improvements it should still be orders of magnitude more expensive than normal search, which just makes their decision even crazier.

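The back-of-envelope reasoning in the last few comments can be sketched in a few lines. This is only an illustrative estimate: the common rule of thumb of roughly 2 FLOPs per parameter per generated token for dense transformer inference, the model sizes, and the answer length are all assumptions, not measured figures for any real deployment.

```python
# Rough per-query compute estimate for LLM inference.
# All numbers below are illustrative assumptions, not measurements.

def inference_flops(params: float, tokens_generated: int) -> float:
    """Dense transformer inference costs roughly 2 FLOPs per
    parameter per generated token (one multiply + one add)."""
    return 2.0 * params * tokens_generated

# Assumed sizes matching the thread's "dozens/hundreds of billions
# of parameters", answering with a 300-token summary.
small = inference_flops(70e9, 300)   # 70B-parameter model
large = inference_flops(500e9, 300)  # 500B-parameter model

print(f"70B model:  {small:.1e} FLOPs per answer")
print(f"500B model: {large:.1e} FLOPs per answer")
```

Even the smaller assumed model lands in the tens of teraFLOPs per answer, which is consistent with the point above: whatever a classic inverted-index lookup costs, generating an answer token by token through a model of this size is a very different compute regime per query.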

Technology

!technology@beehaw.org
