ChatGPT generates cancer treatment plans that are full of errors: a study finds that ChatGPT provided false information when asked to design cancer treatment plans. Researchers at Brigham and Women’s Hospital found that treatment plans generated by OpenAI’s chatbot contained numerous errors.
These studies are for the people out there who think ChatGPT thinks. It’s a really good email assistant, and it can even get basic programming questions right if you are detailed with your prompt. Now everyone stop trying to make this thing like Finn’s mom in Adventure Time and just use it to help you write a long email in a few seconds. Jfc.
I use ChatGPT primarily for programming, and it’s particularly well suited to it.
“Even get basic programming questions right if you are detailed with your prompt”
is underselling its capabilities in that regard. GPT-4 especially has been able to help me with everything from obscure Adobe ExtendScript scripts to rarely seen ‘unsafe’ C# OpenGL perspective-matrix math. All with prompts of a sentence at most.
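For anyone unfamiliar with the kind of matrix math being referred to: here is a rough Python sketch of the standard OpenGL-style perspective projection matrix. The function name and layout are mine, not from any commenter's actual code; it's just an illustration of the sort of task described.

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix (row-major).
    fov_y_deg: vertical field of view in degrees; aspect: width/height;
    near/far: positive distances to the clipping planes."""
    # Focal-length term from the vertical field of view.
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0,                            0.0],
        [0.0,        f,   0.0,                            0.0],
        [0.0,        0.0, (far + near) / (near - far),
                          2.0 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                           0.0],
    ]

# Example: a 90-degree FOV at 16:9, so f == 1.0.
m = perspective(90.0, 16 / 9, 0.1, 100.0)
```

The -1 in the last row is what makes the projection perspective rather than orthographic: it moves the view-space depth into the w component so the GPU's perspective divide produces foreshortening.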
I’m specifically referring to ChatGPT. GPT-4 is a different beast that I’m sure is quite adept.
ChatGPT is GPT-3.5 and GPT-4, as far as I’m aware.
GPT-3.5 is also very capable when it comes to programming, for any well-known framework or language. It’s not as capable as GPT-4, but it still holds up well.
I use it for D&D. It’s fantastic at coming up with adventures, NPCs, story hooks, taverns, etc.
All of those things are made up.
I’m going to need it to turn those emails back into the bullet points used to create them, so I don’t have to read the filler.