AI Work Assistants Need a Lot of Handholding

Getting full value out of AI workplace assistants is turning out to require a heavy lift from enterprises. ‘It has been more work than anticipated,’ says one CIO.

aka we are currently realizing we’re paying for the privilege of being the first to test an incomplete product.

Mandell said if she asks a question related to 2024 data, the AI tool might deliver an answer based on 2023 data. At Cargill, an AI tool failed to correctly answer a straightforward question about who is on the company’s executive team, the agricultural giant said. At Eli Lilly, a tool gave incorrect answers to questions about expense policies, said Diogo Rau, the pharmaceutical firm’s chief information and digital officer.

I mean, imagine all the non-obvious stuff it must be getting wrong at the same time.

He said the company is regularly updating and refining its data to ensure accurate results from AI tools accessing it. That process includes the organization’s data engineers validating and cleaning up incoming data, and curating it into a “golden record,” with no contradictory or duplicate information.

Please stop feeding the thing too much information, you’re making it confused.

Some of the challenges with Copilot are related to the complicated art of prompting, Spataro said. Users might not understand how much context they actually need to give Copilot to get the right answer, he said, but he added that Copilot itself could also get better at asking for more context when it needs it.

Yeah, exactly like all the tech demos showed – wait a minute!

[Google Cloud Chief Evangelist Richard Seroter said] “If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said. “You can’t just buy six units of AI and then magically change your business.”

Never mind that that’s exactly how we’ve been marketing it.

Oh well, I guess you’ll just have to wait for chatgpt-6.66, which will surely fix everything, while voiced by charlize theron’s non-union equivalent.

14 points

ChatGPT’s reaction each morning when I tell it that it’s now the year 2024 and Ilya no longer works at OAI

10 points

Anyone who has seen tech hype like this before knows exactly what to expect.

This is why companies should pay for experience. They don’t, and we all get to go through the funhouse again.

36 points

So, if you structure your data in such a way that using AI is completely unneeded, it’s the perfect system to use AI on.

20 points

Wait, this is just a wiki with extra steps!

14 points

and extra bills

12 points

I wonder why business types/economists(*) don’t just stand up and go ‘this is all a scam’. It started with just needing a website (plus some idea that it might reduce advertising costs or replace the secretaries who used to handle information requests). Fine, you can buy one, and maintenance and running costs are cheap. Then you need an ecommerce site, so you need a team of devs, admins, and security people, but it brings in some revenue. But then every five years there is something else, and suddenly you are running a huge team of STEM people who are all working on the pivot to Apps/AI/Quantum-resistant whatever/NFTs/Cryptocurrencies/GDPR/continuous development/checking whether the daily updates of all your huge dependencies break something/microservices/huge cloud bills/training all these tech people/moderating your sites for pedophiles, racists, other extremists, and random satanic-scare-style worries.

Bottom line, somebody must have discovered something like ‘wait, why are we spending this much money on all this crap? We sell pet food, for fuck’s sake. Ecommerce didn’t reduce our costs, since now we need to hire and maintain delivery people and cars.’

*: I know why, because their jobs are also in on the scam.

25 points

“If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said.

If your data house is in order, why do you need AI assistants to find your neatly organized information for you anyways?

13 points

To have a dead simple UI where you, a person with no technical expertise, can ask in plain language for the data you want, presented the way you want, along with some basic analysis you can tell it to make sound important. Then you tell it to turn that into an email in the style of your previous emails, send it, and take a 50-minute coffee break. All this, allegedly, with no overhead besides paying a subscription and telling your IT people to point the thing at the thing.

I mean, it would be quite something if transformers could do all that, instead of raising global temperatures to synthesize convincing looking but highly suspect messaging at best while being prone to delirium at worst.

22 points

Also, speaking from experience trying to do any database work for large corporate clients, no data house is in order. It’s basically saying “assume a spherical cow, then AI works”.

25 points

Was wondering if they’re using RAG, and they are, but in the worst possible way:

Complicating matters is the fact that Copilot doesn’t always know where to go to find an answer to a particular question, Spataro said. When asked a question about revenue, Copilot won’t necessarily know to go straight to the enterprise financial system of record rather than picking up any revenue-related numbers that appear in emails or documents, he said.

Thing might be rendered useful if you could constrain it to search a particular source or site. And even better: instead of hallucinating, it could just give you a link and a citation. We could call it a search engine.

14 points

If you think of LLMs as being akin to lossy text compression of a set of text, where the compression artifacts happen to also result in grammatical-looking sentences, the question you eventually end up asking is “why is the compression lossy? What if we had the same thing but it returned text from its database without chewing it up first?” — and then you realize that you’ve come full circle and reinvented search engines.
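The “full circle” above can be made concrete with a toy sketch (all document names and contents are invented for illustration): a plain inverted index that returns stored text byte-for-byte along with its source, i.e. lossless retrieval where the model is lossy generation.

```python
# Toy contrast to lossy generation: store text, return it verbatim with a citation.
# Everything here (doc ids, contents) is made up for the example.
from collections import defaultdict


class VerbatimIndex:
    """A minimal inverted index: lossless retrieval instead of lossy synthesis."""

    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of doc ids containing it
        self.docs = {}                    # doc id -> original text, unmodified

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        terms = query.lower().split()
        sets = [self.postings[t] for t in terms if t in self.postings]
        # Require every query term to match; otherwise return nothing.
        hits = set.intersection(*sets) if sets and len(sets) == len(terms) else set()
        # Return the stored text unchanged, plus where it came from.
        return [(doc_id, self.docs[doc_id]) for doc_id in sorted(hits)]


index = VerbatimIndex()
index.add("finance/q3.txt", "Q3 revenue was 4.2 million dollars")
index.add("hr/expenses.txt", "Expense reports are due on the 5th")

print(index.search("revenue q3"))
```

No compression, no artifacts: the answer comes back exactly as written, with a pointer to the system of record it came from.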


TechTakes

!techtakes@awful.systems


Big brain tech dude got yet another clueless take over at HackerNews etc? Here’s the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community
