After reading this article, I predict Firefox will have a ChatGPT-style AI feature built in at some point. This could be very useful if done right. I just hope there is an OFF switch for anyone who doesn’t want to use it.
@noroute@lemmy.world @yoasif@fedia.io Local LLM inference can be very fast on recent consumer hardware. No need to send anything off-device; just like their translation feature, it can all run locally.
As an example, with no optimization or GPU support, my @frameworkcomputer@fosstodon.org laptop (AMD) generates around 5 characters/sec from a 4 GB pre-quantized model.
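For anyone curious what CPU-only, on-device inference looks like in practice, here is a minimal sketch assuming the llama-cpp-python bindings and a pre-quantized GGUF model already downloaded to disk. The model path and prompt below are placeholders, not the exact setup described above:

```python
# Minimal CPU-only local inference sketch using llama-cpp-python.
# Assumes a pre-quantized GGUF model on disk; the path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-7b.Q4_K_M.gguf",  # hypothetical ~4 GB quantized model
    n_gpu_layers=0,  # 0 = run entirely on the CPU, no GPU offload
    n_ctx=2048,      # context window size
)

# Generate a short completion; everything stays on the local machine.
out = llm(
    "Explain in one sentence why on-device translation is useful.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```

Throughput will vary with CPU, RAM bandwidth, and quantization level, but the point stands: nothing leaves the device.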