

The number 21.
made you look
Unfortunately, it’ll still take a while; this chip from MS only has 8 qubits on it.
And the largest number ever factored by a quantum computer is… 21, and that record was set 13 years ago.
Or just plain union organizers.
I’m not sure if there is just some “point of diminishing returns” or whatever where JPG actually becomes more efficient.
There is, but it’s at high quality levels. If you’re using WebP for thumbnails or other lower-quality situations (which was the original intended use), then it’ll give you better quality than JPEG for a given filesize.
For lossless use it’s even better; the format is much more limited than PNG, but in the common cases it beats it.
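If you want to sanity-check that on your own images, here’s a minimal sketch using Pillow (the filename and quality value are placeholders, and quality scales aren’t directly comparable between encoders, so treat it as a rough comparison rather than a benchmark):

    import os
    from PIL import Image

    img = Image.open("photo.png").convert("RGB")

    # Lossy: same nominal quality setting for both encoders.
    img.save("thumb.jpg", "JPEG", quality=40)
    img.save("thumb.webp", "WEBP", quality=40)

    # Lossless: PNG vs lossless WebP.
    img.save("big.png", "PNG", optimize=True)
    img.save("big.webp", "WEBP", lossless=True)

    for name in ("thumb.jpg", "thumb.webp", "big.png", "big.webp"):
        print(f"{name}: {os.path.getsize(name)} bytes")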
I take it that there isn’t much motivation to move to 128 bits because 64 is big enough; it’s only 8 cycles (?) to fill a 512-bit register (that can’t be right?).
8 cycles would be an eternity on a modern CPU; they can do multiple register-sized loads per cycle.
If we do see a CPU with 128-bit addresses anytime soon, it’ll be something like CHERI, where the extra bits are used for flags.
I think CHERI is the only real attempt at a 128-bit system, but it uses the upper 64 bits for metadata, so the address space is still 64 bits.
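For the curious, a toy sketch of what “pointer with metadata in the upper half” looks like. This is not the real CHERI encoding (that uses compressed bounds and a tag bit); the field layout here is made up purely for illustration:

    # Toy 128-bit "capability": address in the low 64 bits, made-up
    # metadata (permissions + length) in the upper 64 bits.
    ADDR_MASK = (1 << 64) - 1
    LEN_MASK = (1 << 48) - 1

    def make_cap(address, length, perms):
        meta = ((perms & 0xFFFF) << 48) | (length & LEN_MASK)
        return (meta << 64) | (address & ADDR_MASK)

    def cap_address(cap):
        # Dereferencing only ever uses the low 64 bits, so the
        # addressable space is still 64-bit.
        return cap & ADDR_MASK

    def cap_length(cap):
        return (cap >> 64) & LEN_MASK

    def cap_perms(cap):
        return cap >> (64 + 48)

    cap = make_cap(0x7FFF_DEAD_0000, 4096, perms=0b011)
    print(hex(cap_address(cap)), cap_length(cap), bin(cap_perms(cap)))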
It’s kind of like Wikipedia: you have to cite the original source (state or council maps) rather than a third-party source.
The map data for my suburb is all kinds of wrong on Google Maps; for starters, there’s a park around the corner from me that’s marked as a house. Blindly copying that into OSM would be a disservice.
The ones that fall out of the sky rarely hit anything. Ukraine is vast.
By definition they hit something; it’s just not considered important when said something is a patch of grass or a medium-sized rock.
I think the biggest issue would be a lack of interfaces to the C-side code; they’re slowly being fleshed out, and each one enables more functionality for the Rust modules.
e.g. the test Ext2 driver an MS dev wrote last year after enough of the filesystem interfaces got hooked up.
But even then, I don’t think the maintainers would accept one that replaces the existing C driver, that’d break non-Rust builds and architectures, and that’s a sure-fire way to get Linus on your case. Best you can hope for is one that complements a C driver, and even then I think you’d need a good reason to have two drivers for the same hardware.
The best way forward, if they insist, is to refactor small bits without interfering with the existing code-base.
I’m not sure they’re even doing that, I think the policy is that Rust code can depend on C code, but C can’t depend on Rust. So at the moment nothing can actually be rewritten in Rust, it’s only additions like new drivers.
Until Elon shambles in and plugs in an ethernet cable of course
Do they really not realize this literally covers everything NOAA does?
That’s the whole point; they do.
You just leave those bits out when making your own CPU.
On a more serious note, having a CTO at Microsoft, of all places, jump in on your side is kind of embarrassing.
That comment was from a few years ago and wasn’t in relation to Linux, and the company he co-founded made some pretty useful things (and revealed the Sony rootkit in 2005) before MS bought them.
He named the recovery barges after sci-fi spaceships (modern sci-fi, not old Nazi stuff).
It’s pretty clear to me that Elon’s never read a Culture novel, they’re antithetical to him.
Not to the same extent; part of the point of AT is that everything shares the same feed, while the AP relays only offer the view of co-operating servers.
Now, Mastodon does offer its own streaming API that lets you see every post on a given server, and at least the fedibuzz relay offers that as well; if you use that API endpoint, you’ll see every post from every server it knows about in real time.
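If you want to watch that stream, something like this against the public streaming endpoint works as a sketch (the instance name is a placeholder, and some servers require an access token for this endpoint):

    import json
    import requests

    # Server-sent events from a Mastodon instance's public firehose.
    URL = "https://mastodon.example/api/v1/streaming/public"

    with requests.get(URL, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        event = None
        for line in resp.iter_lines(decode_unicode=True):
            if line.startswith("event:"):
                event = line.split(":", 1)[1].strip()
            elif line.startswith("data:") and event == "update":
                status = json.loads(line.split(":", 1)[1].strip())
                print(status["account"]["acct"], status["url"])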
A lot of the GPU drivers back then sucked since it was a rather large departure from how stuff worked under XP.
Like in one release we went from a model where only a single app could use the 3D bits of the GPU at once, to the system itself relying on them to present the UI and letting multiple apps share the GPU at the same time.
Not really; with a GPU compositor it’s basically free.
“Fans claim it may offer an improvement” isn’t exactly a definitive statement.
From what I understand, the research actually shows little to no improvement for either mode, which is a bit odd because we know the eye performs better the brighter the surroundings (since brightness causes the pupil to contract, increasing the depth of field).
Maybe it’s a sign that we just need more research into its effectiveness in actual interfaces.