Link to the thread: https://programming.dev/post/8969747
Hello everyone, I followed this thread yesterday and noticed a few very negative reactions towards the choice of Java. I follow Java’s evolution from a distance, but it seems to have been evolving in a good direction over the last few years, and performance-wise it would make sense for the back-end of a Lemmy-like platform.
Is that indeed the case? I was just curious to see so much negativity towards one of the most popular languages.
They’re also extremely toxic. An example from 4 months ago, when they vandalized cppreference.com:
The meme is that most Rust devs merely shout slogans like “memory safety” without knowing what they mean, precisely because many of them come from web-dev backgrounds and have never touched a pointer in their lives (this video by Prime Time shows why that’s problematic: https://www.youtube.com/watch?v=Wz0H8HFkI9U ; the guy has no clue what `std::unique_ptr` is). Easy, “appealing to hobbyists” languages are always an issue, as the community usually ends up becoming toxic and full of wrong practices being normalized, and a prime example of that is PHP.

Another example is how Lemmy initially struggled to handle 10k~20k users during the Reddit exodus, despite the backend being written in the “ultra-fast memory-safe totally-will-replace-C++” Rust. Why? See this: https://github.com/LemmyNet/lemmy/issues/2877 . They were doing stuff like joining huge-ass tables before the filtering. If phiresky hadn’t saved them with his SQL prowess, Lemmy would have literally died, and its backend being written in Rust would not have changed a single thing.
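To make the join-before-filter mistake concrete, here’s a toy C++ sketch of the same idea (the `Post`/`Comment` names and the nested-loop “join” are mine, purely for illustration; the actual Lemmy fix was done in SQL, where the planner does this via predicate pushdown and indexes):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

struct Post    { int64_t id; int64_t community_id; };
struct Comment { int64_t id; int64_t post_id; };

// Anti-pattern: join everything, then filter. The nested loop does
// posts.size() * comments.size() work before a single row is discarded.
std::vector<std::pair<Post, Comment>>
join_then_filter(const std::vector<Post>& posts,
                 const std::vector<Comment>& comments,
                 int64_t wanted_community) {
    std::vector<std::pair<Post, Comment>> joined;
    for (const auto& p : posts)
        for (const auto& c : comments)
            if (c.post_id == p.id)
                joined.push_back({p, c});
    // Filter applied only after the full join has been materialized.
    std::erase_if(joined, [&](const auto& pc) {
        return pc.first.community_id != wanted_community;
    });
    return joined;
}

// Fix: filter first, so the join only ever sees the surviving rows.
std::vector<std::pair<Post, Comment>>
filter_then_join(const std::vector<Post>& posts,
                 const std::vector<Comment>& comments,
                 int64_t wanted_community) {
    std::vector<std::pair<Post, Comment>> joined;
    for (const auto& p : posts) {
        if (p.community_id != wanted_community) continue;  // "pushdown"
        for (const auto& c : comments)
            if (c.post_id == p.id)
                joined.push_back({p, c});
    }
    return joined;
}
```

A real database does this with indexes and query planning rather than nested loops, but the asymptotic point is the same, and no choice of backend language compensates for it.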
Rust gives hobbyists the illusion that their projects will suddenly become fast and bug-free if they write them in Rust, and they don’t even hide that mentality as you can see that on almost every single project that’s written in Rust they list “written in Rust” as the main selling argument. This is probably the only language I’ve seen where this happens.
Now as for the “Java bad”, I’m kind of guilty of it too. I very much dislike how academia is obsessed with UML diagrams and the “Java way” of seeing OOP and interfaces everywhere. CPUs and GPUs do not think in OOP. They do not see “objects”. They see data, registers, caches and branches, but certainly not your “beautiful abstract class”. When you think you did a good job of crafting a “clean” UML diagram with lots of “nice interfaces”, which you then implement in C++ with virtual polymorphism and `dynamic_cast` abuse, you’re torturing the CPU with indirections, cache misses and branch mispredictions. Dynamic polymorphism, and virtual inheritance in particular, should not be the standard way to solve problems, yet that’s exactly what academia teaches, and most of those who push those ideas coincidentally also happen to come from Java backgrounds; that’s why the “Java bad” meme is still alive.

That said, beyond academia, I think it’s obviously stupid to religiously shit on Java. Lots of advanced features are coming out, Android is a thing thanks to Java, and lots of web services run with high reliability thanks to it. Also, obviously, one has a much better chance of landing a high-paid software engineering job knowing Java than knowing only Rust.
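Here’s a toy sketch of what I mean (the `Entity`/`Particle` names are made up; this is the textbook virtual-dispatch vs. data-oriented contrast, not code from any project mentioned here):

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// "Clean OOP" version: every update is a virtual call through a pointer,
// and the objects themselves are scattered across the heap.
struct Entity {
    virtual ~Entity() = default;
    virtual void update(float dt) = 0;
};

struct Particle : Entity {
    float x = 0, v = 1;
    void update(float dt) override { x += v * dt; }
};

void update_oop(std::vector<std::unique_ptr<Entity>>& entities, float dt) {
    for (auto& e : entities)
        e->update(dt);  // indirect call + likely cache miss per object
}

// Data-oriented version: contiguous arrays and a predictable,
// branch-free loop the compiler can vectorize.
struct Particles {
    std::vector<float> x, v;
};

void update_dod(Particles& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i)
        p.x[i] += p.v[i] * dt;
}
```

The second loop is what the hardware actually wants to execute; the first is what the UML diagram wants you to write.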
I’m kind of guilty of that too - in mindset - I just don’t go around and shitpost about Java.
My dislike for Java also came from academia, since I had to use it in school. Though my main problem was really Java’s general toolchain. We had to use Eclipse or NetBeans, and we had to write everything as “JavaBeans”: instead of a normal “Person” class, you’d have to have a “PersonBean”, and everything was so weird. And all the packages and references would constantly break or go missing, both in the project and even in the IDE itself…
After moving to C# and using Visual Studio, NetBeans just feels like trying to build a house with a rock instead of a hammer, on an already crooked foundation.
Though that was a long time ago. I assume things have improved, but I never really had any reason to go back to Java.
C# will definitely spoil Java for you. Even with modern Java, there’s just no going back from the .NET ecosystem without feeling like you’re time-traveling 10 years into the past.
What does this even mean? One dopey teenager defaces a website, so now everyone associated with Rust is toxic?
This whole argument is just young edgelords bickering with old edgelords, in an eternal and pointless cycle.
Just curious: what other languages have had “one dopey teenager” from their community go and deface cppreference.com? (Which, by the way, has happened multiple times with Rust kiddies, not just 4 months ago.)
It doesn’t really change their point though.
Re: “the guy has no clue what `std::unique_ptr` is”, are you saying that because of his assertion that `unique_ptr` has a non-zero cost, whereas Rust’s `Box` does not?

He’s actually correct about that, although the difference is fairly minimal, and I believe it is outweighed by the unwinding (i.e. panic/exception handling) code that needs to be generated in both cases. But with unwinding disabled, you can see clearly that Rust generates exactly the same code for a `Box` as for a raw pointer, whereas C++ does not:

The reason I looked into this is a Chandler Carruth talk primarily about `unique_ptr`, called “There Are No Zero-Cost Abstractions”, which explains in detail why C++ fundamentally can’t optimize `unique_ptr` to generate the same code as a raw pointer.

That’s a bad apples-to-oranges comparison: `unique_ptr` frees its memory upon destruction, which the raw-pointer version doesn’t do. The least you could do is use rvalue references. The class layout of `unique_ptr` is also hard to optimize away (unless via LTO), because `consume` isn’t in the same translation unit and the compiler has to keep your binary ABI-compatible with the rest of your binaries. (Also, you’re using Clang 9, by the way; we’re at version 17 now.) This is much fairer: https://godbolt.org/z/v4PYcd8hf
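The ABI point is easiest to see at the call boundary itself. A minimal sketch (`consume` is the name used above; the rest is illustrative, and the specifics assume the Itanium C++ ABI on x86-64):

```cpp
#include <memory>

// A type with a non-trivial destructor is never passed in a register
// under the Itanium C++ ABI: the caller materializes the unique_ptr in a
// stack slot, passes its address, and destroys it after the call returns.
void consume(std::unique_ptr<int> p);  // arrives via a hidden indirection

// The raw pointer is just a 64-bit value in a register (rdi on x86-64).
void consume(int* p);

void caller() {
    consume(std::make_unique<int>(42));  // spill to stack, pass address
    consume(new int(42));  // plain mov + call; leaks unless consume()
                           // frees it, which is the fairness complaint
}
```

With the bodies in another translation unit and no LTO, the compiler has to honor that calling convention no matter how smart it is, which is exactly why the layout “can’t be optimized away” at this boundary.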
Then, if you additionally make the functions’ bodies accessible to the compiler, and add a `free` to the raw-pointer version (for fairness, if you insist on having `consume` or `foo` destroy the resource), you should get almost identical assembly. There is still one extra indirection, visible as an extra `mov`, because the C++ compiler still doesn’t see how you use them, but IMO that should be a textbook case for LTO, and the non-zero difference should disappear altogether once you actually use those functions; if it doesn’t, you absolutely should file a bug report.

Carruth, while an excellent presenter, has been on a “C++ standard committee bad, why don’t we do more ABI-breaking changes, y’all suck, Abseil and Carbon rule” rant spree, which basically materialized in Google stopping active participation in Clang (I haven’t followed the drama since, so I’m not sure whether Google backtracked on that decision). It’s hard to consider him objective about this when he also has the Carbon project, and his recent Carbon talks are painful to watch: it’s hard to ignore how he’s gone from the “C++ optimization chad” he used to be to a Google marketing/sales person.
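For what it’s worth, here is roughly the shape of the fairer comparison I mean (a sketch of the idea, not the exact godbolt code; `consume_raw`/`consume_smart` are my names):

```cpp
#include <memory>

// Both versions free the resource, and both bodies are visible for
// inlining (as LTO would make them across translation units).
inline void consume_raw(int* p) {
    // ... use *p ...
    delete p;  // the free() the raw-pointer version was missing
}

inline void consume_smart(std::unique_ptr<int>&& p) {
    // ... use *p ...
    // Taking an rvalue reference sidesteps the pass-by-value ABI rule:
    // only a pointer-sized reference crosses the call boundary.
}

void caller() {
    consume_raw(new int(1));
    // The temporary unique_ptr is destroyed (and the int freed) at the
    // end of this full expression, in the caller.
    consume_smart(std::make_unique<int>(1));
}
```

With both bodies inlined, the optimizer typically folds the two callers down to the same allocate/use/free sequence; whatever gap remains is the extra `mov` I mentioned, which is precisely the textbook LTO case.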
Thank you for the detailed answer