The “Apple TV” is Apple hardware.
How do you not do that? It’s all on your local network, how would it not work offline…?
Replace the 3060 with an equally-priced AMD card and you’ll actually get something decent for your money. Nvidia is horrible at these “lower” price points.
It is, but then again many (most) are hosted on GitHub.
Even if all of that worked, why would I want it?
What I mean by that is that they will do their customers a huge disservice to avoid a slight financial inconvenience (packaging and validating an existing fix for different CPU series with the same architecture).
I don’t classify fixing critical vulnerabilities in products as recent as the last decade as “goodwill”, that’s just what I’d expect to receive as a customer: a working product with no known vulnerabilities left open. I could’ve bought a Ryzen 3000 CPU (maybe as part of cheap office PCs or whatever) a few days ago, only to find out now that it has this severe vulnerability with the label WONTFIX on it. And even if I bought it 5 years ago: a fix exists, port it over!
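For what it’s worth, you can at least check which microcode revision a Linux box is actually running, since the fix ships as a microcode update. Here’s a minimal sketch in C that just reads the “microcode” field x86 Linux exposes in /proc/cpuinfo (illustrative only, not AMD’s tooling):

```c
/* Sketch: print the running microcode revision on x86 Linux.
 * Assumes /proc/cpuinfo exposes a "microcode" field (it does on x86). */
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("/proc/cpuinfo", "r");
    if (!f) {
        perror("fopen");
        return 1;
    }
    char line[256];
    while (fgets(line, sizeof line, f)) {
        if (strncmp(line, "microcode", 9) == 0) {
            printf("%s", line); /* e.g. "microcode : 0x8701030" */
            break;              /* all cores usually report the same revision */
        }
    }
    fclose(f);
    return 0;
}
```

Compare the printed revision against the latest linux-firmware release to see whether your vendor ever bothered shipping the fix for your CPU series.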
I know some people say it’s not that critical of a bug because an attacker needs kernel access, but it’s a convenient link in an attacker’s vulnerability chain, and once exploited it’s almost impossible to detect and remove.
That’s so stupid, especially because they have fixes available for Zen- and Zen 2-based Epyc CPUs.
Intel vs. AMD isn’t “bad guys” vs. “good guys”. Either company will take every opportunity to screw their customers over. Sure, “don’t buy Intel” holds true for 13th and 14th gen Core CPUs specifically, but other than that it’s more of a pick your poison.
Privacy is not just black and white.
I’d want Bluetooth for music from my phone though. And it’d be nice if my phone’s cellular and GPS didn’t get blocked.
x86/x64 code is pretty much 100% compatible between AMD and Intel. On the GPU side it’s not that simple, but Sony would’ve “just” had to port their GNM(X) graphics APIs over to Intel (Arc, presumably), just like most PC games work completely fine and in the same way across Nvidia, AMD and Intel GPUs. But they have to do that to some extent anyway even with newer GPU architectures from AMD, because the PS4’s GCN isn’t 1:1 compatible with the PS5’s RDNA2 on an architectural level, and the PS4’s Jaguar CPU isn’t even close to the PS5’s Zen 2.
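To illustrate the CPU side: the very same compiled x86-64 binary runs unchanged on either vendor, and one of the few differences you can even observe from user space is the CPUID vendor string. A minimal C sketch (assumes GCC/Clang for <cpuid.h>):

```c
/* Sketch: the same x86-64 binary runs on both vendors; only the CPUID
 * vendor string differs. Requires GCC/Clang's <cpuid.h>. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    char vendor[13];              /* 12 chars + NUL */
    memcpy(vendor + 0, &ebx, 4);  /* vendor string order: EBX, EDX, ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    /* Prints "GenuineIntel" or "AuthenticAMD"; everything else about
       the compiled code executes identically on both. */
    printf("%s\n", vendor);
    return 0;
}
```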
Other than that, you’re right. Sony wouldn’t switch to Intel unless they got a way better chip and/or way better deal, and I don’t think Intel was ready with a competitive GPU architecture back when the PS5’s specifications were set in stone.