jerakor
For an RN the average is $100k and the top end is ~$160k.
The answer is that the overall healthcare system needs to be rebuilt. The problem, though, is that the hospitals all know the incoming administration will be cool with corner cutting but also won’t pay out very well.
The trend already is for insurance to never pay out. Currently you have to bill 4X the price of something just so the insurance company can write off how big a savings it got when it pays out only a quarter of the billed amount, which hopefully ends up being the actual cost.
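A toy sketch of the markup math described above, with made-up numbers (the $1,000 cost is purely illustrative):

```python
# Toy illustration of the 4X billing markup (all numbers made up).
actual_cost = 1_000            # what the procedure really costs
billed = actual_cost * 4       # bill 4X so the insurer can "negotiate" down
insurer_pays = billed // 4     # insurer pays a quarter of the billed amount

# The insurer advertises a 75% "savings" while paying roughly the real cost.
discount = 1 - insurer_pays / billed
print(f"Billed: ${billed}, paid: ${insurer_pays}, savings: {discount:.0%}")
```

The point is that the "discount" is an artifact of the inflated sticker price, not an actual concession.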
So you have a hospital administration attempting to extract value from their workers, because they know that no one else will pay. Besides, if a strike goes on, they can just get the new administration to handwave hiring “holistic nursing” professionals who take a fake online test. Those hires can just read out whatever the AI says: that the problem is a combination of verbal irregularity and overall health being a bit behind.
You think MOST hospital staff make OVER 150k in the Portland area? That is an incredibly hot take.
The average salary is about $95k, but that also includes the “high” earners, and I put “high” in quotes because it still isn’t like the tech sector. A medical assistant at Providence starts at $48k/yr. A desk worker starts at about $30k/yr.
That means the average worker can afford $650 to $2,000 a month in rent or mortgage. Even at the top end, that isn’t enough to get a 2-bedroom apartment if they are single with a kid. It isn’t enough to even approach buying a house unless they are splitting it with someone making more than they do.
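A quick sketch of the affordability math, assuming the common 30%-of-gross-income rule (the exact rule behind the $650–$2,000 range above is my assumption; the salaries are the ones mentioned earlier):

```python
# Rough monthly rent budget using the common "30% of gross income" rule.
# The 30% figure is an assumed rule of thumb, not from the comment above.
def max_monthly_rent(annual_salary, share=0.30):
    return annual_salary / 12 * share

for salary in (30_000, 48_000, 95_000):
    print(f"${salary:,}/yr -> ~${max_monthly_rent(salary):,.0f}/mo")
```

This lands in the same ballpark as the range quoted above: roughly $750/mo at the bottom and about $2,375/mo at the very top of the average.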
I see this a lot, which is wild to me, because I feel like S4 finally felt like real Star Trek, just rushed, and some of the damage to some characters couldn’t be fixed. All the major plot points that make Enterprise relevant to Star Trek happen in S4.
I’m curious where you put Discovery? That is the one I struggle the most with. My primary issue there is that I have to actually like and want to be invested in a character, and as far as I’m concerned, ten episodes into Discovery, if the ship blew up, all hands lost, I can’t think of anyone in the Federation I’d feel sad for. Enterprise, though, has Trip and Phlox, who are S tier, a few fantastic guest stars, and no character that is bottom-bin material to me, no matter how much fanfic-quality writing they tried to force on T’Pol.
You make a lot of good points in here but I think you are slightly off on a couple key points.
These are ARM, not x64, so they use SVE2, which can technically scale to 2048-bit vectors rather than AVX-512’s 512. Did they scale it that far? I’m unsure; existing Grace products are 4x128, so possibly not.
Second, this isn’t meant to be a performant device; it is meant to be a capable device. You can’t easily just build a computer that can handle the compute complexity this device is able to take on for local AI iteration. You wouldn’t deploy with this as the backend; it’s a dev box.
Third, the CXL and CHI specs have coverage for memory scoped outside the bounds of the host cache width. That memory might not be accessible to the CPU, but there are a few ways they could optimize that. The fact that it’s an all-in-one-box custom solution means they can hack in some workarounds to execute the complex workloads.
I’d want to see how this performs versus an i9 + 5090 workstation, but even that already goes beyond the price point of this device. Currently a 4090 can handle ~20b params, which is an order of magnitude smaller than what this can handle.
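Back-of-the-envelope VRAM math behind that ~20b figure (the precision assumptions are mine, and this counts only the weights, not KV cache or activations):

```python
# Rough GB needed just to hold model weights.
# Assumption: 1 byte/param at 8-bit quantization, 2 bytes/param at FP16.
def weights_gb(params_billions, bytes_per_param):
    return params_billions * bytes_per_param

print(weights_gb(20, 1))   # 20  -> ~20 GB, near a 4090's 24 GB limit
print(weights_gb(200, 1))  # 200 -> 10x the params needs a far larger pool
```

So a model an order of magnitude larger simply cannot fit in a single consumer card’s VRAM at any common precision, which is where a big unified-memory box earns its keep.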
It’s not a real problem for a system like this; the system uses CXL. Their rant is just because they didn’t take the time to click down into what the specs are.
The system uses the CXL/AMBA CHI specs under NVLink-C2C. This means the memory is linked directly to the GPU as well as to the CPU.
All of their complaints are pretty unfounded in that case and they would have to rewrite any concerns taking into account those specs.
Check https://www.nvidia.com/en-us/project-digits/ which is where I did my next level dive on this.
EDIT: This all assumes they are talking about the bandwidth requirements of allocating all memory as CPU allocations rather than enabling concepts like LikelyShared vs Unique.
Ukraine was the 3rd largest nuclear power in the world, and is famous for its history with nuclear energy.
The issue here is that them starting the enrichment process is grounds for the start of WW3, and they wouldn’t complete the effort in time to defend themselves. You’d have to give them entirely complete nukes, and even that would just mean it’s nuke-launchin’ time for a number of folks.
Debian tends to be a liiiiitle bit behind Fedora, and because gaming on Linux is accelerating in popularity, being ahead can provide big gains in performance.
Can you manually handle all of that? Sure. I mean, I have Mint on my side desktop with a custom kernel, but I recognize that I am dropping a V8 into a minivan.