Pshh, I’m working on an AI blockchain cloud based customer first smart learning adaptive agile Air Fryer that will blow the competition away.
Image generation models are generally more than capable of doing that; they’re just not trained to do it.
That is, with just a bit of hand-holding, showing SDXL appropriately tagged images, you get quite sensible results. Under normal circumstances it simply never gets to associate any input tokens with the text in the pixels, because people rarely, if ever, describe verbatim what’s written in an image. “Hooters” is an exception; it’s hard to find a model on Civitai that can’t spell it.
Yeah, it’s been improving over the past few months. It’s hit or miss, though.
I will probably use these images in a corporate PowerPoint. I’m not asking for your permission, I’m warning you. Sorry, it’s too good. (I will credit you as the CTO of some company ending in -SYS or -LEA if you want.)
Please tell me it has individually packaged chicken nugget pods with DRM and more plastic waste than food 🤤
Boy do I have the product for you: https://youtu.be/F_HOrMmWoMA?si=sNxyxbwaKOnZiPqW
Me too, but I make pathfinding algorithms for video game characters. The truly classic Artificial Intelligence.
I still remember fighting grunts in the original Half-Life for the first time and being blown away. Your work makes games great!
It’s been a while since I looked at how Valve does it, but it could be called a primitive expert system. And while the HL1 grunts were extraordinary for their time, HL2’s Combine grunts are still pretty much the gold standard. Without the AI leaking information to the player via radio chatter, it would feel very much like the AI is cheating, because yes, HL2’s grunts are better at tactics than 99.99% of humans. It also helps that you’re a bullet sponge, so them outsmarting you, like leading you into an ambush, doesn’t necessarily mean you’re done for.
OTOH, they’re a couple of pages of state machines that would have no idea what to do in the real world.
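As a toy illustration of what “a couple of pages of state machines” means in practice, here’s a heavily simplified sketch in Python. The states and transition conditions are my own invention, not Valve’s; real HL grunts use far richer logic (squad slots, cover, schedules):

```python
from enum import Enum, auto

class GruntState(Enum):
    IDLE = auto()
    ALERT = auto()
    ATTACK = auto()
    RETREAT = auto()

def grunt_step(state: GruntState, sees_player: bool, low_health: bool) -> GruntState:
    """One tick of a hypothetical grunt's finite-state machine."""
    if low_health:
        return GruntState.RETREAT   # self-preservation trumps everything
    if sees_player:
        return GruntState.ATTACK
    if state is GruntState.ATTACK:
        return GruntState.ALERT     # lost sight of the player: stay wary
    return GruntState.IDLE
```

Outside the narrow game world these conditions describe, such an agent is, as noted, completely helpless.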
Also, for the record: “AI” in gamedev basically means “autonomous agent in the game world not controlled by the player”. A “follow the ball” algorithm (if it can even be called that) playing Pong against you is AI in that sense. Machine learning approaches are quite rare, and when they do appear you’d use something like NEAT, not the gazillion-parameter neural nets used for LLMs and diffusion models. If you tell NEAT to, say, drive a virtual car, it’ll spit out a network with a couple of neurons that is very good at doing that but useless for anything else. That doesn’t matter, though: you have an enemy AI for your racer. Which is probably even too good, again, so you have to nerf it.
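That “follow the ball” Pong opponent really is just a few lines. A minimal sketch in Python (function and parameter names are mine):

```python
def pong_ai_step(paddle_y: float, ball_y: float, speed: float) -> float:
    """Move the paddle one tick toward the ball's y position.

    The per-tick `speed` cap is the whole game: lowering it is
    exactly the kind of nerf that keeps the AI beatable.
    """
    delta = ball_y - paddle_y
    if delta > speed:
        return paddle_y + speed
    if delta < -speed:
        return paddle_y - speed
    return paddle_y + delta  # close enough: snap onto the ball
```

With an uncapped speed it plays perfectly, which is precisely why it needs nerfing.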
LLMs (or really ChatGPT and MS Copilot) having hijacked the term “AI” is really annoying.
In more than one questionnaire or discussion:
Q: “Do you use AI at work?”
A: “Yes, I build and train CNN models (to find and label items in images), etc.”
Q: “How has AI influenced your productivity at work?”
A: ???
Can’t mention AI or machine learning in public without people instantly thinking about LLMs.
I imagine this is how everyone who worked in cryptography felt once cryptocurrency claimed the word “crypto”
Luckily that was only the abbreviation and not the actual word. I know that language changes all the time, constantly, but I still find it annoying when a properly established and widely (within reason) used term gets appropriated and hijacked.
I mean, I guess it happens all the time in fiction, and in the sciences you sometimes run into a situation where an old term just does not fit new observations, but please keep your slimy, grubby, way-too-adhesive klepto-grappers away from my perfectly fine professional umbrella terms. :(
Please excuse my rant.
I’m still mad that ML was stolen and doesn’t make people think about the ML family of programming languages anymore.
The term machine learning was coined in 1959 by Arthur Samuel, an IBM employee and pioneer in the field of computer gaming and artificial intelligence. The synonym self-teaching computers was also used in this time period.
https://en.m.wikipedia.org/wiki/Machine_learning
It wasn’t so much stolen as taken back.
I had a first-stage interview with a large multinational construction company where I’d be “the only person in the organization sanctioned to use AI”.
They meant: use ChatGPT to generate blogs.
We are just taking “crypto” back to mean something useful. It was just a matter of some stupid people losing enough money.
I hope in a few years we can take “AI” back too.
“AI” will never shake the connotations science fiction has given it. The association is always going to skew towards positronic brains and Commander Data.
In the world of Actual Machines, “AI” is a term that should barely be tolerated in advertising departments, let alone anything remotely close to R&D
Investors are such emotionally led beings
As an older developer, you could replace “machine learning” with “statistical modeling” and “artificial intelligence” with “machine learning”.
I think people are hesitant to call ML “statistical modeling” because traditional statistical models approximate the underlying phenomena; e.g., a logarithmic regression would only be used to study logarithmic phenomena. ML models, by contrast, seldom resemble what they’re actually modeling.
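To make that concrete, a logarithmic regression of the kind mentioned above really is just a least-squares line fit after transforming the inputs, so the model form directly mirrors the assumed phenomenon. A minimal sketch (pure Python, names are mine):

```python
import math

def fit_log(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Least-squares fit of y = a*ln(x) + b.

    Transforming x -> ln(x) reduces the problem to ordinary linear
    regression, which has a closed-form solution.
    """
    ts = [math.log(x) for x in xs]
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    a = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) \
        / sum((t - mt) ** 2 for t in ts)
    b = my - a * mt
    return a, b
```

You would only reach for this if you believed the data actually grows logarithmically; a deep net makes no such claim about the shape of the phenomenon.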