Their GPUs are already bricks. Just throw the GPUs.
Two thoughts come to mind for me:
Or look at Python and its urllib, urllib2, the new urllib, and the requests package on PyPI.
We already sort of saw this in Rust with crossbeam and standard channels, until, of course, the standard library implementation was replaced with crossbeam’s.
Hey look, the classic “America bad” comment on a post critical of China!
Are these people bots or something? It’s possible to be critical of both at different times.
While I agree, it makes connecting to localhost as easy as http://0:8080/ (for port 8080; omit the port entirely for port 80).
I worry that changing this will cause more CVEs like the octal IP addresses incident.
Edit: looks like it’s only being blocked for outgoing requests from websites, which seems like it’ll have a much more reasonable impact.
Edit 2: skimming through these PRs, at least for WebKit, I don’t see tests for shorthand IPs like 0 (and I have no Apple device to test with). What are the chances they missed those…?
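For anyone who wants to check those shorthand forms themselves: Python’s socket.inet_aton is a thin wrapper over the classic C parser whose quirks these browser changes are chasing, and it accepts the legacy one-, two-, and three-part formats, so it’s a quick way to see what 0 actually resolves to (illustrative only, not the browsers’ actual parser):

```python
import socket

# inet_aton accepts the legacy BSD shorthand forms ("a", "a.b",
# "a.b.c") that a strict four-octet parser would reject.
for shorthand in ("0", "127.1", "0.0"):
    packed = socket.inet_aton(shorthand)
    # "0" and "0.0" parse as 0.0.0.0; "127.1" parses as 127.0.0.1,
    # with the last number filling the remaining bytes.
    print(f"{shorthand!r} -> {socket.inet_ntoa(packed)}")
```

If the WebKit tests only cover dotted-quad strings, forms like these could slip through exactly the way the octal addresses did.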
Imagine how different the story would be if they compensated people for this data. “10% off Geforce NOW if you let us use your gameplay footage as training data!” (for example)
This is obviously cheaper and there’s way more data to train with, but it just continues to skirt a line in copyright law that desperately needs to be tested.
Still working on an assertions library that I started a few weeks ago. I finally managed to get async assertions working:
expect!(foo(), when_ready, all, not, to_equal(0)).await;
It also captures values passed down the assertion chain and reports them on failure, without requiring all types to implement Debug, since it uses autoref specialization.
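For anyone unfamiliar with autoref specialization: the specialized impl is written so its receiver needs one fewer auto-ref than the fallback’s, so method resolution prefers it whenever its bounds hold. A minimal standalone sketch (hypothetical names, not the actual crate internals):

```rust
use std::fmt::Debug;

struct Wrap<T>(T);

// Specialized case: receiver is &Wrap<T>, which matches the call
// site with no extra autoref - but only applies when T: Debug.
trait DebugRepr {
    fn repr(&self) -> String;
}
impl<T: Debug> DebugRepr for Wrap<T> {
    fn repr(&self) -> String {
        format!("{:?}", self.0)
    }
}

// Fallback case: implemented on &Wrap<T>, so the receiver is
// &&Wrap<T> and needs an extra autoref; it only wins when the
// specialized impl's Debug bound can't be satisfied.
trait FallbackRepr {
    fn repr(&self) -> String;
}
impl<T> FallbackRepr for &Wrap<T> {
    fn repr(&self) -> String {
        "<non-Debug value>".to_string()
    }
}

struct Opaque; // deliberately does not implement Debug

fn main() {
    // i32 implements Debug, so the specialized impl is chosen.
    assert_eq!((&Wrap(42)).repr(), "42");
    // Opaque does not, so resolution falls back to the placeholder.
    assert_eq!((&Wrap(Opaque)).repr(), "<non-Debug value>");
    println!("ok");
}
```

The macro expansion writes the `(&value).repr()` call for you, which is what makes the trick usable without the caller knowing either trait exists.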
Hopefully it’ll be ready for a release soon.
Honestly, regardless of what happens to Intel, I’m hopeful about Qualcomm providing a real alternative in the CPU space, especially one as meaningfully different as an entirely different instruction set. More diversity between competing products can only be a good thing, since it gives consumers more meaningful choices.
People talk at the urinal?
People on Chrome adding Reddit to their Google searches already use Google. People not using Google who don’t search “Reddit” are going to see fewer Reddit results.
No, this won’t kill Reddit, but it certainly isn’t helping them get more traffic.
Joke’s on Reddit. I’ve been blocking their results in the search engine I use for months!
I wonder if this will end up being pursued as an antitrust case. If anything, it’ll reduce traffic to Reddit from non-Google users, so hopefully that kills them off just a little faster.
Looks like the article requires an account. Is there an archived version?
GN’s charts usually compare against a few gens of somewhat comparable products, so I wouldn’t be surprised to see a 12th gen CPU or two on the charts. I’d also expect to see some 7000 series Ryzen chips and maybe a 5000 series one. I believe they normally include these older gens for people who skipped a gen or two to see what they’d get out of an upgrade.
To be clear - when I say low-memory machines, I’m referring to devices with, say, 128 MiB of storage and memory (which I’ve actually developed for before). If you’ve got storage in the gigabytes, there’s no way optimizing for size matters lol.
My understanding is that it should almost only ever be set for WASM. Certain low-memory machines may also want it, but that’s extremely rare.
I’m not sure who’s recommending it; I’ve only ever seen it recommended for WASM applications.
Your lack of imagination
I don’t know why you think these ideas were mine, but I do work for a rather large company that has invested a lot of resources in finding solutions using these models. These ideas came from people far smarter than I am.
The rest of your comment has so little to do with what I said that I’m inclined to believe it’s AI generated.
You’re right. Once it settles into its niches and the hype dies down, it won’t be overhyped anymore because everyone will have moved on.
I’ve been working with generative AI for years now, and we still struggle to solve real-world problems with it. It isn’t useless or anything, but it’s way too unreliable, and this isn’t one of those things time will solve - it’s being used on problems that have no perfect solutions, like human interfacing and generating culturally appropriate, visually accurate images. I’d expect it to improve at those tasks over time, but the scope needs to shrink from every problem humanity has ever faced to the problems these models are actually good at solving.
As much as I dislike Nintendo, the Switch is an excellent console despite its hardware. It’s no surprise it’s been as popular as it has for so long. These days, though, there are a lot of competitors in the handheld space with much better hardware, so it really maintains its position through a combination of branding and game exclusivity.
I’m curious what their next console will be. I probably won’t buy it, but I wouldn’t be surprised if it was also a huge success.
There are emergencies the adults at the school won’t understand. This has happened a few times with my spouse: the nurse and teachers kept brushing off issues they didn’t understand, ranging from asthma to strep throat.
Otherwise, I agree that the phones should be put away during class.
Sorry if I’m missing some sarcasm here, but if this is all you have to contribute, then as a professional software developer, I’d much rather work with the author of the article on a daily basis.