Check out my new community: !tech_memes@lemmy.world
Why would an RTX 4090 make Python faster?
Don’t worry, this post was written by a first-year computer science student who just learned about C. No need to look too closely at it.
If you create a meme like this, at least inform yourself about how computers and software work.
If the 4090 doesn’t make stuff faster, then why do my games run faster with it? /s
What makes you think Python is unoptimized and bloated?
Did you know the vast majority of AI development happening right now is in Python? The stuff that literally consumes billions of dollars’ worth of even-beefier-than-4090 GPUs like the A100. Don’t you think that if they could do this more efficiently in C or assembly, they would? They would save billions.
The reality is that there’s no benefit to moving away from Python to lower-level languages. The poor optimization you’re looking for isn’t there. In fact, if they tried this in lower-level languages, it would take even longer to optimize and yield worse results.
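To make that concrete, here’s a minimal sketch (assuming PyTorch is installed and a CUDA-capable GPU is present; otherwise it falls back to CPU). The Python lines only dispatch the work; the actual matrix multiply runs in precompiled CUDA kernels, so rewriting this in C wouldn’t buy you anything:

```python
import torch

# Use the GPU if one is available, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

# The Python interpreter only *launches* this operation; the multiply
# itself runs in a precompiled cuBLAS/CUDA kernel (or compiled CPU code).
c = a @ b
print(c.shape, c.device)
```

Interpreter overhead here is a few microseconds per call against milliseconds of GPU compute, which is why the choice of host language barely matters.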
TBF, using AI as an example isn’t the best choice when it consumes an ungodly amount of power.
Every single high-performance Python library is written in C; you just don’t see it.
They do do it in C. The packages are written in C; Python is just used as a wrapper so that data scientists with less coding skill can use it easily.
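You can see this for yourself with a rough, machine-dependent timing sketch (using numpy as the stand-in for “C in a wrapper”). The numbers will vary by machine, but the gap is always there:

```python
import time
import numpy as np

n = 10_000_000
data = list(range(n))
arr = np.arange(n)

t0 = time.perf_counter()
total_py = 0
for x in data:          # every iteration goes through the Python interpreter
    total_py += x
t1 = time.perf_counter()

total_np = arr.sum()    # one call that loops in numpy's compiled C code
t2 = time.perf_counter()

assert total_py == total_np
print(f"pure Python loop: {t1 - t0:.3f}s  numpy sum: {t2 - t1:.3f}s")
```

Same result, but the numpy call is typically an order of magnitude or more faster, because the loop runs in C instead of Python bytecode.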
That’s basically the entire data science joke: it’s C in a Python trench coat.