Don’t worry folks, if we all stop using plastic straws and take 30 second showers, we’ll be able to offset 5% of the carbon emissions this AI has!
Google’s GHG emissions in 2023 were 14.3 million metric tons, which is a ridiculously small percentage of global emissions.
Commercial aviation emissions are about 935 million metric tons per year.
So IDK about plastic straws or Google. But really, if people stopped flying around so much, that would actually make a dent in global emissions.
Don’t get me wrong, google is a piece of shit. But they are not the ones causing climate change, neither is AI technology. Planes, cars, meat industry, offshore production… Those are some of the truly big culprits.
But they are not the ones causing climate change
The owners of google are capitalists. They are as responsible for climate change as any other capitalist.
Capitalists serve customers and do not operate in a vacuum. This finger pointing does nothing productive.
I’d rather we give up AI than give up meat or flying. I am not taking a ship halfway across the world.
You do you. But other people may have other priorities.
Anyway, how many times would a user have to use an AI to even come close to the emissions of a single commercial flight across the Atlantic? It must be a freaking lot.
You giving up AI, or even forcing all humankind to do so, might as well do nothing as far as climate change is concerned.
The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn’t necessary and I’m not convinced it results in a “better search result” for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.
As a buzzword or whatever this is leagues worse than “agile”, which I already loathed the overuse/integration of.
Before AI it was IoT. Nobody asked for an Internet connected toaster or fridge…
I always felt like I was alone in this thinking. I think anyone with a bit of a security mindset doesn’t want everything connected; besides, it makes devices more expensive and easier to break. It’s certainly very convenient for planned obsolescence.
And yet it’s still garbage…like their search
With adblock enabled, I feel like their results are often better than, for example, DuckDuckGo’s. I recently switched to DDG as my default search engine, but I regularly find myself using Google instead to get the results I’m looking for.
Interesting, I’m actually the exact opposite. I always start with Google, because it’s usually good enough, but whenever it takes 2-3 tries to get something relevant, I switch to ddg and get it first try.
My issue is mostly with image search results. DDG’s images tend to be less relevant than Google’s. DDG also lacks “smart” results (idk the official term).
For example when you search “rng 25” on Google, it will immediately present you with a random number between 1 and 25. On DDG you have to click on one of the search results and then use some website to generate the number.
Or when searching for the results of a soccer game, Google will immediately present all the stats to you, while on DDG you will only find some articles about it.
Of course it really depends on the kind of search and I’m sure DDG will regularly have better results than Google too.
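For what it’s worth, that kind of instant answer is trivial to sketch. This is a minimal illustration of what the “rng 25” widget described above does; the query format and the 1..N range are assumptions modeled on that description, not Google’s actual parser:

```python
import random
import re

def instant_answer(query: str):
    """Sketch of an 'rng N' instant answer: parse the query and,
    if it matches, return a random integer between 1 and N."""
    m = re.fullmatch(r"rng\s+(\d+)", query.strip())
    if m:
        return random.randint(1, int(m.group(1)))
    return None  # no instant answer; fall through to normal search results

print(instant_answer("rng 25"))         # a random integer between 1 and 25
print(instant_answer("soccer results")) # None
```

The point of such widgets is that the answer is computed inline, with no need to click through to a third-party site.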
I skimmed the article, but it seems to assume that Google’s LLM uses the same architecture as everyone else’s. I’m pretty sure Google uses its own TPU chips instead of the regular GPUs everyone else uses, and those are generally pretty energy efficient.
That, and they don’t seem to be considering how many responses are simply cached because the questions are identical. A lot of Google searches are going to be identical just because the search suggestions funnel people into the same form of a question.
Exactly. The difference between a cached response and a live one, even for non-AI queries, is an order-of-magnitude (OOM) difference.
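The caching effect described above can be sketched in a few lines. This is a toy model, not Google’s infrastructure: the `answer` function is a hypothetical stand-in for an expensive model or search backend, and the counter just shows that identical queries only hit it once:

```python
import functools

# Hypothetical stand-in for an expensive backend; the query string
# and cost model are illustrative assumptions, not Google internals.
BACKEND_CALLS = 0

@functools.lru_cache(maxsize=1024)
def answer(query: str) -> str:
    global BACKEND_CALLS
    BACKEND_CALLS += 1  # incremented only on a cache miss
    return f"result for {query!r}"

# Search suggestions funnel users into the exact same query string,
# so repeat queries are served from the cache, not the live backend.
for _ in range(3):
    answer("why is the sky blue")

print(BACKEND_CALLS)             # 1
print(answer.cache_info().hits)  # 2
```

Only the first request pays the compute cost; the other two are cache hits, which is why aggregate per-query energy estimates that assume every query is a live inference can overshoot badly.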
At this point, a lot of people just care about the ‘feel’ of anti-AI articles even if the substance is BS though.
And then people just feed whatever gets clicks and shares.
In fact, Gemini was trained on, and is served using, TPUs.
Google said its TPUs allow Gemini to run “significantly faster” than earlier, less-capable models.
Did you think Google’s only TPUs are the ones in the Pixel phones, and didn’t know that they have server TPUs?
I hadn’t really heard of TPU chips until a couple weeks ago, when my boss told me how he uses the USB versions for at-home ML processing of his closed-network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy…but I looked the USB things up and they’re wildly efficient, and he says they work just fine for his applications. I was impressed.
The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.
It runs Tensorflow Lite, so you can also build your own models.
Pretty good for a $25 device!
AI is just what crypto bros moved on to after people realized that was a scam. It’s immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it’s being backed by major corporations because it means fewer employees they have to pay.
There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren’t just a scam. However, most of these are not consumer facing and the average person won’t really hear about them.
It’s unfortunate that what you said is very true on the consumer side of things…
energy for a solution in search of a problem,
Except this time it’s being backed by major corporations because it means fewer employees they have to pay.
Ah yes, the classic “it’s useless, and here’s a use for it” logic.
I take it you haven’t had to go through an AI chatbot for support before, huh?
Crypto has been hitting all time highs this year; there’s just more bros than before.