

Definitely worthy advice. Many times it’s just bad hardware, not the OS.
“it works on Linux” isn’t black and white. My 2070 would not work with freerdp in full screen and nouveau drivers. It works with Nvidia drivers, though.
The solution is an easy one: just become rich and you’ll profit, too.
Obligatory: So he can focus on releasing the Epstein files, right?
But the government is distracting the public with such actions. Let me put it like this: imagine your house is on fire, and a fireman who provided, and still provides, fuel to the arsonist strolls by and tells you that by next week they will pour a glass of water on your house.
Hey, no need to hurry with these pointless (in the short term) actions. I’m sure Palestinians won’t mind another month of genocide while they wait for a recognition that will change exactly nothing for them.
See my other reply
I was thinking about Apple’s M CPUs, which have a fixed instruction length and benefit from it. It was explained on AnandTech years ago; here is a brief paragraph on the topic. Sadly, the AnandTech article(s) isn’t available anymore.
Since this type of chip has a fixed instruction length, it becomes simple to load a large number of instructions and explore opportunities to execute operations in parallel. This is what’s called out-of-order execution, as explained by Anandtech in a highly technical analysis of the M1. Since complex CISC instructions can access memory before completing an operation, executing instructions in parallel becomes more difficult in contrast to the simpler RISC instructions.
Yes, but RISC knows the exact position of each instruction in the cache and how many instructions fit in the instruction cache or pipeline. Like you said, it doesn’t help with the data cache.
From what I remember, one of the problems with CISC is that it has variable-length instructions, which are harder to fetch and decode ahead, since you have to decode all instructions up to the current one, whereas with RISC you know exactly where each instruction is in memory/cache.
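A toy sketch of that point (hypothetical encodings I made up, not any real ISA): with a fixed width, the start of instruction *i* is just `i * width`, so a decoder can find many instruction boundaries at once; with variable lengths, each boundary depends on decoding the instruction before it.

```python
# Toy illustration: why fixed-length instructions make finding
# instruction boundaries trivially parallel. Encodings are hypothetical.

FIXED_WIDTH = 4  # bytes, as in AArch64


def fixed_boundaries(code_len: int) -> list[int]:
    # Every boundary is known up front: instruction i starts at i * 4.
    # A wide decoder can grab all of these in the same cycle.
    return list(range(0, code_len, FIXED_WIDTH))


def variable_boundaries(code: bytes) -> list[int]:
    # Hypothetical variable-length scheme: the first byte encodes the
    # instruction's total length (1, 2, or 4 bytes). You cannot know
    # where instruction i+1 starts until you have decoded instruction i,
    # so the scan is inherently serial.
    lengths = {0x01: 1, 0x02: 2, 0x04: 4}
    boundaries, pos = [], 0
    while pos < len(code):
        boundaries.append(pos)
        pos += lengths[code[pos]]
    return boundaries


if __name__ == "__main__":
    print(fixed_boundaries(16))  # [0, 4, 8, 12]
    stream = bytes([0x02, 0x00, 0x01, 0x04, 0x00, 0x00, 0x00, 0x01])
    print(variable_boundaries(stream))  # [0, 2, 3, 7]
```

Real x86 decoders work around this with predecode hints and other tricks, but the serial dependency is the structural disadvantage being described above.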
RISC is perfectly good for desktops, as demonstrated by Apple. Microcontroller-class chips are suitable for light desktop tasks, but they are nowhere near modern x64 CPUs. For now.
ARM chips are more oriented towards servers and mobile devices for now. Sure, we saw Apple demonstrate desktop use, but not much is there for desktops yet. RISC-V is far away, and Chinese CPUs are not competitive. “It’s coming” doesn’t help in the short term and is questionable in the mid term. 🤷‍♂️ Yes, alternatives will come eventually, but it takes a lot of time and resources.
We don’t have that many other processors, though. If you look at the desktop, there is AMD, and there is Apple silicon, which is restricted to Apple products. And then there is nothing. If Intel goes under, AMD might become the next Intel. It’s time (for the EU) to invest heavily in RISC-V, the entire stack.
Looks like “if everybody was armed it could be prevented” doesn’t work all that well, who could have known…
Probably not just capital, but also political influence and other means, I guess. But yes, it’s always a problem if it’s like that. If you ask me, there should be a generic, independent payment backbone on top of which many providers could offer payment services, similar to how the internet works.
The issues here are trust, security, adoption and so forth. It’s not easy to compete here, I’d say.
STASI enters the chat.
I guess that is one way to avoid releasing Epstein files 🤪
Multibooting is annoying if you multitask. Separate computers are better, but then they take up space and power. I wish there was a perfect solution… I’m switching to Linux but still have long-running Windows projects I have to develop. I’ve settled on running Linux bare-metal and Windows in virt-manager. Seems to be working for me so far.
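For anyone wanting to try the same setup, a minimal sketch of getting virt-manager running on a Debian/Ubuntu-based host (package names and group membership are distro-specific, so check your own distro’s docs):

```shell
# Install virt-manager plus the KVM/libvirt backend (Debian/Ubuntu names).
sudo apt install virt-manager qemu-kvm libvirt-daemon-system

# Let your user manage VMs without root; re-login for this to take effect.
sudo usermod -aG libvirt "$USER"
```

From there you create the Windows VM through the virt-manager GUI; enabling VirtIO drivers in the guest helps disk and network performance noticeably.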