Until commercial games and professional software start shipping ARM versions, this isn’t going to work out. And even if they did, I’m not entirely sure it’s possible to use the same GPUs on ARM at all, so even if they really push it, people who do PC gaming are going to reject it.
Because Intel has a fab and could sell capacity.
Uh, OK I guess? If their promised actually good shit is finally actually good.
I chose this manufacturer because it promises 5 years of software updates, much longer than others, and I like its operating system features more.
Coming soon to PCs near you.
I have a Surface Pro X. I can’t install Google Drive on Windows. I can’t install Linux. Affinity apps don’t get graphics acceleration because of some missing DirectX support; neither do Blender or Fusion 360. Darktable and RawTherapee only work under emulation. How is this a $1000+ laptop? All of those things work flawlessly on an underspecced base MacBook Air with 8GB of RAM (up until you need all the RAM to keep five Chrome tabs open, anyway).
I know there’s some hyperbole here, but my point still stands: the author is right when they say that Microsoft hasn’t given up… because it feels like they’re not even trying. Apple said EVERYBODY MAKE ARM APPS NOW, and compatibility problems lasted a year. Not ten years.
Apple said EVERYBODY MAKE ARM APPS NOW, and compatibility problems lasted a year. Not ten years.
Because Apple’s priority has never been legacy support and backwards compatibility but Microsoft’s whole business model and key advantage with Windows is legacy support and backwards compatibility. It’s a different beast when you’re marketing to the enterprise instead of personal users.
Apple said EVERYBODY MAKE ARM APPS NOW
Uh, no. What they did is make sure x86 software still works perfectly. And not just Mac software - you can run x86 Linux server software on a Mac with Docker, and you can run DirectX x86 PC games on a Mac with WINE. Those third party projects didn’t do it on their own, Apple made extensive contributions to those projects.
I’d like to go into more detail but as a third party developer (not for any of the projects I mentioned above) I signed an NDA with Apple relating to the transition process before you could even buy an ARM powered Mac. Suffice to say the fruit company helped developers far and wide with the transition.
And yes, they wanted developers to port software over to run natively, but that was step 2 of the transition. Step 1 was (and still is) making sure software doesn’t actually need to be ported at all. Apple has done major architecture switches like this several times and is very good at them. This was by far the most difficult transition Apple has ever done, but it was also the smoothest one.
It’s 2024, and I still have software running on my Mac that hasn’t been ported. If that software is slow, I can’t tell. It’s certainly not buggy.
The only real issues I’ve seen lately are upstream with QEMU, and those will probably be sorted soon, if they aren’t already. I’m absolutely amazed at how well they implemented x86_64 compatibility.
I’ve found that a few open source apps that stubbornly ship Intel-only binaries can be compiled as universal apps in Xcode. For example, OpenEmu.
I recently bought an Apple Silicon laptop and I’ve noticed that only a handful of my apps run in Intel translation mode. Intel apps are significantly slower to launch because they front-load the code translation, but I’ve been surprised by how much software is fully native or universal now. Apple has done a good job of giving developers the tools to write portable code, but it has spread the pain over several revision cycles, for example by dropping 32-bit software support two versions ago. I am not convinced Microsoft has the nerve to break compatibility, but Linux will be fine.
MS is doing it. They are killing Outlook and moving to New Outlook. I can’t think of any other app that is as important as that one.
I don’t mean Microsoft apps. I know those will work fine. I mean all the effort Microsoft has put into making sure that Windows remains compatible with very old third-party software, including drivers.
Software tends to be written against an operating system or a slightly more high-level API, not against an instruction set. We’re not in the 1960s any more; compilers exist.
Even back during the x86 to x86_64 switchover people didn’t rewrite software; we mostly just cleaned up legacy code, removing old architecture-dependent tricks that wouldn’t work in 64-bit mode and hadn’t been necessary for proper performance for a couple of hardware generations anyway.
There are always going to be exceptions, but if your calculator app needs more work than a recompile, you did something wrong. Possible sources of nastiness in otherwise well-written software include anything that relies on manual memory layout, mostly alignment issues (endianness isn’t an issue between x86 and ARM; both run little-endian in practice), and very occasionally some inline assembly, for the simple reason that inline assembly has become exceedingly rare. I say “well-written” because there’s e.g. C code out there hitting so much undefined behaviour that it compiles on exactly one compiler and runs on one architecture, but even then it’s a case-by-case call whether to rewrite or clean it up. I’d tend towards a rewrite in Rust, but there are conceivable exceptions. Also, you should’ve fixed that shit up years ago.