“We will continue to evaluate long-term, scalable solutions that will help us make an informed decision on a future voice ordering solution by the end of the year,” the statement said.
Soon:
Are there any examples of user-facing AI that isn’t a complete train wreck?
I’m sure some use as a tool to assist someone who can filter outliers exists for scientific pattern matching; I’m mostly wondering about stuff customers interact with that just works without any news stories.
OpenAI seems to be functioning.
The problem with speech-to-text is the background noise and the many variations of speech. I’ve played around with a couple of models. I can get one to work with my voice with little effort in training, but when my window AC kicks in or my computer fan hits its highest setting, it becomes a problem because the training is very dependent on the noise floor. I think they are likely extremely limited in the audio gear available, in combination with the compute hardware, to make it viable. Human hearing has a relatively large dynamic range and we have natural analog filtering. A machine just doing math can’t handle things like clipping from someone speaking too loudly, or understand the periodicity of all the vehicle and background noises like wind, birds, and other people in the vicinity. Everything that humans can contextualize is like a small learned program and alignment that took many years to train.
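To make the noise-floor point concrete, here is a minimal sketch of one common workaround: mixing recorded background noise into clean training clips at several controlled signal-to-noise ratios, so the model never sees just one noise floor. The file names and the `soundfile` dependency are assumptions for illustration, not anything from the comment above.

```python
import numpy as np
import soundfile as sf  # assumed dependency; any WAV reader would do

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Mix background noise into a speech clip at a target signal-to-noise ratio (dB)."""
    # Loop or trim the noise so it matches the speech length.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[: len(speech)]

    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    # Scale the noise so that 10 * log10(speech_power / scaled_noise_power) == snr_db.
    scale = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    mixed = speech + scale * noise

    # Renormalize only if the mix would clip in a fixed-point format.
    peak = np.max(np.abs(mixed))
    return mixed / peak if peak > 1.0 else mixed

# Hypothetical files: a clean training clip and a recording of the window AC.
speech, sr = sf.read("clean_speech.wav")
noise, _ = sf.read("ac_noise.wav")

# Augment the same clip at several noise floors so the model doesn't latch onto one.
augmented = [mix_at_snr(speech, noise, snr) for snr in (20, 10, 5, 0)]
```

This doesn’t solve clipping or truly non-stationary noise, but it is the cheap first step most training pipelines take before reaching for better microphones or front-end filtering.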
You will not see the full use cases of AI for quite a while. The publicly facing tools are nowhere near the actual capabilities of present AI. If you simply read the introductory documentation for the Transformers library, which is the basis of almost all the AI stuff you see in public spaces, it clearly states that it is a simplified tool that bypasses complexity in an attempt to make the codebase approachable to people in various fields. It is in no way a comprehensive implementation. People are forming opinions based on projects that are hacked together using Transformers. The real shakeups are happening in business, where companies like OpenAI are not peddling the simple public API; they are demonstrating the full implementations directly.
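For readers who haven’t used it, here is a rough sketch of the gap the comment is gesturing at in Hugging Face Transformers: the one-line `pipeline()` convenience helper versus the lower-level tokenizer/model calls it wraps. The checkpoint name is just a commonly used example model, not something referenced in this thread, and the sketch is about API layers, not about any vendor’s private systems.

```python
from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification
import torch

# High-level convenience API: one call hides tokenization, batching, and post-processing.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("The drive-through AI added 260 nuggets to my order."))

# Roughly the same steps spelled out with the lower-level building blocks,
# where truncation, device placement, and decoding of logits are all explicit.
name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer(
    "The drive-through AI added 260 nuggets to my order.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(model.config.id2label[int(probs.argmax())], float(probs.max()))
```

Whether the low-level path counts as the “full capabilities” is the commenter’s opinion; the code only shows that the friendly wrapper and the underlying machinery are different layers of the same library.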
Nugget Overload
New band name.
This is the best summary I could come up with:
McDonald’s is removing artificial intelligence (AI) powered ordering technology from its drive-through restaurants in the US, after customers shared its comical mishaps online. A trial of the system, which was developed by IBM and uses voice recognition software to process orders, was announced in 2019. It has not proved entirely reliable, however, resulting in viral videos of bizarre misinterpreted orders ranging from bacon-topped ice cream to hundreds of dollars’ worth of chicken nuggets. McDonald’s told franchisees it would remove the tech from the more than 100 restaurants it has been testing it in by the end of July, as first reported by trade publication Restaurant Business.
“We will continue to evaluate long-term, scalable solutions that will help us make an informed decision on a future voice ordering solution by the end of the year,” the statement said. The technology has been controversial from the outset, though initially concerns centred on its potential to make people’s jobs obsolete.
However, it has become apparent that replacing human restaurant workers may not be as straightforward as people initially feared - and the system’s backers hoped. The AI order-taker’s mishaps have been documented online. In one video, which has 30,000 views on TikTok, a young woman becomes increasingly exasperated as she attempts to convince the AI that she wants a caramel ice cream, only for it to add multiple stacks of butter to her order.
Another popular video includes two people laughing while hundreds of dollars’ worth of chicken nuggets are added to their order, while the New York Post reported another person had bacon added to their ice cream in error. The ending of this trial, though, does not mean an end to concerns about AI reshaping the workplace. IBM said it would continue to work with McDonald’s in the future.
“This technology is proven to have some of the most comprehensive capabilities in the industry, fast and accurate in some of the most demanding conditions,” it said in a statement.
“While McDonald’s is re-evaluating and refining its plans for AOT we look forward to continuing to work with them on a variety of other projects.”
The original article contains 422 words, the summary contains 348 words. Saved 18%. I’m a bot and I’m open source!