FOSS in general needs better means of financial support. While the software is free and libre, developer time is not, and ultimately they gotta eat and pay bills. I hope they get positive results and don’t catch much unnecessary flak.
My first programming experience, an online class, was in a Linux VM. Linux made programming easy and delightful; Windows always made it a huge pain. As time went on, more of what I did was easier on Linux, and now everything is.
Well this is a tremendous step in the wrong direction. The economic problem is the ad-supported model in the first place, no matter how it’s run. This is the same thing Google does: they keep user data to themselves and sell the ad placement. So now Mozilla has the same economic incentives as Google. Unfathomably bad move.
The moment that shocked me was when printers, network cards, and even motherboard-integrated Ethernet didn’t work on Windows without driver downloads, but on Linux were plug and play. A full reversal of the situation.
Note the versions: none of the results gives you the official operators page for the current version, 16. They give 9, which went EOL in 2021.
A major caveat I’ve noticed some people misunderstand: it’s corporate CLAs that are problematic. The Apache Foundation also requires contributors to sign a CLA, but it’s there to provide a legal failsafe and a way to update to, say, an Apache License 3.0 if need be one day. Apache’s non-profit, open-source mission aligns with respecting the rights of contributors and the community. Corporations, on the other hand, not so much.
Have you used it recently? With previous versions I would’ve agreed, but 5.0 was a huge improvement. If I didn’t know, I’d likely have assumed it to be a native feature.
I’ll take a look at Vivaldi’s approach though, I’ve heard good things about those features previously.
If you want vertical tabs with the ability to organize them into trees, I suggest the Sidebery extension. It legitimately makes me nervous to think the functionality could ever go away; it improves my productivity that much.
You can bookmark trees, collapse them, search them, load/unload them manually; I could go on. It makes it easy to organize dozens or hundreds of tabs. I have trees for emails, news, forums, projects, etc. When I’m done, I just fold one up. The top tab bar can also hide tabs that aren’t in the active tree, so you can still navigate the visible tabs normally.
GNOME always seemed like an odd choice considering how little customization is available. It feels like a prescriptive approach: you will use your computer the way GNOME deems appropriate, whereas KDE tries to accommodate however you want to use it.
3 or 4 years, including on Nvidia machines. I’ll admit it took some fiddling to get working a while ago. Nowadays I use my desktop’s AMD iGPU as the main display driver and offload rendering to the Nvidia card for demanding programs or games; best of both worlds.
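For reference, a minimal sketch of how launching one program on the dGPU can look (the two env vars are NVIDIA’s documented PRIME render offload switches; `glxgears` is just a placeholder for whatever game or program you actually run):

```python
import os
import subprocess

# Run one program on the Nvidia card while the AMD iGPU keeps
# driving the display. Everything else stays on the iGPU.
env = dict(os.environ)
env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"

subprocess.run(["glxgears"], env=env)  # placeholder program
```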
If I’m understanding this right, and this is basically an API that lets you pick which app store administers an app, it could be quite helpful, not harmful. I currently have F-Droid, the Play Store, and the Samsung store, and I assume they try to update apps by fully qualified name, since multiple stores show and try to update a single app instance, sometimes with weird results.
Compression is actually a mathematical field that’s fairly well explored, and this isn’t compression. There are theoretical limits on how much you can compress data, so the information always has to live somewhere: either in the dictionary or in the input. Trained models like these are gigantic, so even if it were perfect recall, the ratio still wouldn’t be good. Lossy “compression” is another issue entirely, more of an engineering problem of determining how much data you can throw out while making acceptable compromises.
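To make the “theoretical limits” part concrete, here’s a toy sketch of my own (assuming a memoryless, byte-by-byte model): Shannon entropy puts a floor on the average bits per byte any compressor of that kind can achieve.

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Order-0 Shannon entropy: the floor, in bits per byte, for any
    compressor that codes bytes independently of their context."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits_per_byte(b"abababababababab"))  # 1.0 -> lots of room
print(entropy_bits_per_byte(bytes(range(256))))    # 8.0 -> no room at all
```

Context-aware compressors can beat the order-0 number on structured input, but the same kind of bound applies to them under their own models; the information never disappears.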
This is a classic problem for machine learning systems, sometimes called overfitting or memorization. By analogy, it’s the difference between knowing how to do multiplication and just memorizing the times tables. With enough training data and enough storage, an AI can feign higher “intelligence”, and that is demonstrably what’s going on here. It’s a spectrum as well. In theory, near-identical recall is undesirable, and there are known ways of shifting away from that end of the spectrum. Literal AI 101 content.
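The times-tables analogy in toy code (entirely my own illustration, not anyone’s actual training setup):

```python
# A "memorizer" stores the training pairs verbatim; a "generalizer"
# has learned the underlying rule. Both look perfect on the training
# set, but only one of them is actually doing multiplication.
train = {(a, b): a * b for a in range(1, 13) for b in range(1, 13)}

def memorizer(a: int, b: int):
    return train.get((a, b))  # None for anything it never saw

def generalizer(a: int, b: int) -> int:
    return a * b

print(memorizer(7, 8), generalizer(7, 8))      # 56 56   (in the tables)
print(memorizer(14, 15), generalizer(14, 15))  # None 210 (only the rule scales)
```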
Edit: I don’t mean to say that machine learning as a technique has problems; I mean that implementations of machine learning can run into these problems. And no, I wouldn’t describe these systems as intelligent any more than a chess algorithm is intelligent. They just have a much broader problem space, and the natural language interface leads us to anthropomorphize them.
I think about it like a tree structure for both. With a GUI you have to move your mouse around to various targets; with a CLI each character branches off into another subtree. Mathematically, you can reach more options faster with a CLI.
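Rough numbers to make that concrete (mine, purely illustrative):

```python
# With ~50 usable keys, n keystrokes can in principle distinguish
# 50**n commands; a mouse pick chooses among maybe a few dozen
# on-screen targets per move. The keyboard tree fans out far faster.
keys, targets = 50, 30
for n in range(1, 4):
    print(f"{n} steps: CLI {keys**n:>7} vs GUI {targets**n:>6}")
# 1 steps: CLI      50 vs GUI     30
# 2 steps: CLI    2500 vs GUI    900
# 3 steps: CLI  125000 vs GUI  27000
```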
Normal people don’t, but when you get into absolutely massive enterprise archiving there’s no rival for the density and cost-effectiveness. It sucks for general-purpose storage, but for write-once, hopefully-never-read use, they’re ideal.
Curiosity, back around 2010 before I was a teenager. No clue how I heard about it, but the concept of replacing the entire operating system was fascinating. I figured it must be really good if it was such a well kept secret.
A few years later, when I started to learn programming, Linux was the obvious winner. The online course taught C in a Linux environment, and I was amazed that the default Ubuntu build at the time had everything built in, whereas the Windows equivalent required Visual Studio and licensing adventures.
It really stuck as a daily driver after Windows 7, where a clear trend emerged: Windows got in my way, Linux got out of my way. Simple as.
I use Arch as my daily driver, and I would highly recommend against it for new users. 99% of the time it’s just fine; 1% of the time some edge case sneaks by and you update before a fix is pushed. In those cases, I’ve had installations end up deeply broken, far beyond what I’d expect a normal user to recover from.
For actual recommendations, something Debian-based for sure: vanilla Debian, Mint, or Mint Debian Edition. If you wanna live on the edge, Sid is rolling, but in my experience it was more stable than Arch.
Recently got an Onyx Boox Ultra and it’s incredible compared to my previous Kobo. Basically, it’s 10" with stylus input and a keyboard case. The special sauce is that it runs Android, complete with the Google Play store. The display tech is advanced enough that normal apps, for instance Connect for Lemmy, work fine. I have mine set up with Syncthing, Home Assistant, and Obsidian, and it all just works, mostly. I’d recommend using a third-party launcher and not touching the Onyx account, though.
I’ve had great experiences with Kobo, though. I literally went through 4 models because they kept upping their game. They’re less sketchy than Onyx and are very open: you can load your own books in nearly any format and modify the device, since it runs Linux. You can even completely replace the OS.
The comments from that article are some of the most vitriolic I’ve ever seen on a technical issue. Goes to prove the maintainer’s point though.
Some are good for a laugh though, like the assertions that Rust in the kernel is a Microsoft sabotage op or that LLVM is for grifters and thieves.