![](https://sh.itjust.works/pictrs/image/d963a355-eee9-43db-a8a7-ce9455732720.jpeg)
He talks an awful lot about making sense for someone who clearly doesn’t.
Would it kill them to make the same size of phone with a better battery?!
I did a similar upgrade last year. I don’t recall any problems under Debian. I now have Bluetooth and Wi-Fi, which my old mobo did not support.
Of course, you should be sure to do a full backup.
What happens when AI advances to the point where it can do everything it does today (and more) without using copyrighted training material?
This is inevitable (and in fact some models already use only licensed training data), so I think it’s a bad idea to focus so much on this angle. If what you’re really worried about is the economic impact, then this is a dead-end argument. By the time any laws pass, it will likely be irrelevant because nobody will be doing that anyway. Or only the big corporations who own the copyrights to a bajillion properties (e.g. Disney) will do it in-house and everyone else will be locked out. That’s the exact opposite of what we should be fighting for.
The concept of “art” changes based on technology. I remember when I first started fiddling with simple paint programs: just scribbling a little shape and using the paint-bucket tool to fill in a gradient blew my mind. Making an image like that 100 years prior would have been a real achievement. Instead it took me a minute of idle experimentation.
Same thing happened with CGI, synthesizers, etc. Is sampling music “art”? Depends what you do with it. AI should be treated the same way. What is the (human) artist actually contributing to the work? This can be quantified.
Typing “cat wearing sunglasses” into Dall-E will give you an image that would have been art if it were made 100 years ago. But any artistry now is limited to the prompt. I can’t copyright the concept of a cat wearing sunglasses, so I have no claim to such an image generated from such a simple prompt.
We are targeting a first Alpha release for early adopters in 2026.
I will watch this from afar with great interest.
TL;DR: you won’t notice the difference. That’s the beauty of Stable. :)
I had that same thought. Water bears are visible to the naked eye?! I had no idea.
If you need new drivers then Debian is not the easiest distro. I love Debian but I do occasionally consider distro-hopping again to get some complex things working (like ROCm).
I do think Debian is an excellent starting place, though. If it suits you, great! If not, you’ll have a better idea of what you need to look for going forward. Hopping distros isn’t the end of the world, after all.
If you want cutting edge, don’t use Mint. But that’s not their focus at all. Mint is for people who just want their computer to work with minimal hassle.
These don’t seem like competing needs. When I think “just work with minimal hassle”, I don’t think “I need to restrict myself to outdated hardware”.
I’m perfectly happy running old packages in general. I’m still on Plasma 5, and it works just as well as it did last year. But that’s a matter of features, not compatibility. Old is fine; broken is not.
Everything old is new again. Sounds a lot like certain sects of Christianity. They say you need to accept Jesus to go to heaven, otherwise you go to hell, for all eternity. But what about all the people who had no opportunity to even learn who Jesus is? “Oh, they get a pass”, the evangelists say when confronted with this obvious injustice. So then aren’t you condemning entire countries and cultures to hell by spreading “the word”?
Both are ridiculous.
On the one hand, I’m not even running 4K yet, and it is vanishingly unlikely that I will own a >4K display within the lifetime of my PS5, so this makes no difference to me.
On the other hand, I would like to see blatant false advertising punished every time it happens. “Nobody really cares” isn’t much of an excuse when they clearly thought people cared enough to put it prominently on the box. Being able to play high-end video 10 years down the line is a legitimate selling point for a gaming console that doubles as a media box.
Am I out of touch with Qualcomm’s increasingly confusing naming schemes, or is that awfully expensive for a 7sG2?
What’s this? A software app store?
It’s ironic how on Linux, my distro’s app repository is always my first stop when looking for software, while on Mac or Windows it’s my last resort.
Commercialized app stores are full of spam, and Microsoft and Apple both decided that app store apps should not have the full capabilities of normal apps. It’s the exact opposite on Linux.
Not sure I understand this one. I’m finding it difficult to read this as anything other than “yes, most people understand negation as negation, and not as something entirely different”. Are there any languages or cultures where negation is the same as inversion?
How would you even invert an adjective that doesn’t exist on a one-dimensional scale? For example, good<->bad makes sense, because they are clear opposites. But happy<->sad does not make sense, because emotions don’t exist on a single axis and do not have clear opposites. “Not happy” encompasses all states besides happiness. Could be angry, could be sad, could just be neutral. Like the old saying goes, “the opposite of love is not hate; it’s indifference”.
Thanks for the recommendation! I was looking at the Fedora family since AMD officially supports RHEL 9. Hadn’t gotten as far as to figure out how well that transfers to Fedora and its derivatives. Good to hear that it works.
If you’re only testing on one set of hardware, it isn’t going to tell the whole story. The results might be very different on an AMD vs Nvidia GPU, or even on a brand-new vs a 1–3 generations old GPU.
Probably the most important thing for gaming is driver support and ease of installation. This sometimes runs directly counter to other general-purpose needs.
I’m still on the hunt for a distro where everything I need is easy to install. I don’t think any exist, primarily because GPU drivers suuuuuuuck, especially when you need CUDA or ROCm to work.
This is the great thing about open source. It benefits everyone. Any good idea that does not have significant drawbacks should get broad adoption. And that’s generally how it plays out.
Reputations live on for many years (decades, even) after they stop being justified.
Emulation.
Definitely going to incur a performance hit relative to native code, but in principle it could be perfectly good. It’s not like the GPU is running x86 code in the first place. On macOS, Apple provides Rosetta to run x86 Mac apps, and it’s very, very good. Not sure how FEX compares.
Correct.
Batteries will still lose charge very slowly, so at some point the battery controller will top itself back up. This is nothing to worry about, and I’m not sure macOS (or Linux) will ever display the true charge level of a battery. I believe there is some wiggle room built in at the firmware level.
Agreed. English is a stupid language in many ways. Why do we shoehorn in gender when it is not relevant? Why does it deserve to be baked into the language? How the ever-loving hell can you expect someone to understand someone else’s gender implicitly in arbitrary scenarios? Even when you can see someone face to face, if they’re not strictly following narrow gender norms, your accuracy is going to be dogshit. Why bother?
I understand the feeling that parading around pronouns and taking time out of our days to explicitly establish them (when it’s generally, again, not relevant) is tedious and confusing. I barely have the brainspace to remember names. The obvious answer is to use neutral language whenever it is sufficient in context. Which is, again, most of the time.
I think it goes beyond the Internet, and beyond trans inclusion. Even if you’re a bunch of cis folks talking face to face, it still makes sense to default to neutral pronouns. I don’t always know (and certainly don’t always care) what someone’s sex or gender is face to face, and that ain’t new.
The singular “they” is awkward, but it’s like two hundred years too late to come up with something better.