C++ Zooms Past Java in Programming Language Popularity Contest – Slashdot

This is so obviously wrong. I don’t believe TIOBE for a second.
If this were true, it would correlate with things like the number of GitHub projects and job offerings.
And it just doesn’t.
I am also sceptical.
Do TIOBE themselves try to justify their figures by correlating them with other metrics?
I don’t think this report says as much about C++ gaining popularity as it does about Python rising hugely.
Python cannibalizes Java’s former user base. What was Java’s claim to fame?
Write once, run anywhere? Python does that, with a much leaner runtime.
No memory management to worry about? Python does that.
Just In Time compilation (aka, slow apps)? Python does that, albeit with an even slower interpreter (offset by faster hardware today and promises of faster implementations in the future).
The best thing about Python: it’s
Yes, and I can’t believe how low JavaScript is… 7th position, behind Visual Basic.
And assembly (9th) tops PHP (10th) and Go (12th)… I haven’t seen handwritten assembly for a while now, and I work in embedded systems. Swift (15th) and Kotlin (23rd) also seem low for the official languages of the big two mobile platforms.
Also, bash is in 36th position, behind Prolog, D, and PL/SQL, seriously?!
Considering how many different processors and assembly languages there are out there
Well, regarding bash, perhaps people rarely google for it? At the moment, though, I’m thinking about dropping ksh/sh and focusing on bash. As in: do not avoid bash-specific features, but embrace them. Frankly, I have used bash nearly exclusively for the last 20 years or so. One project used ksh
Kotlin is gaining popularity on the server side as well. I don’t see companies with an established JVM code base moving away from JVM, but I do see them migrating from Java to Kotlin. This was the route the company I worked at pre-pandemic was going, as well as at the company I work at now. So any loss of Java popularity should be made up by an increase in Kotlin popularity in the charts.
Yes, and I can’t believe how low JavaScript is…
If you look at TIOBE’s criteria, the list is more like a list of the “currently most publicly confusing languages”.
hey, it’s clickbait time again!
What is going on in the wider world is different from (a) what academia and media see and (b) the rather limited subset of the world seen by those with the time to come to Slashdot to kvetch and chat about it.
I’ll use whatever language someone will pay me to use. That’s been Java and C# for more than 20 years.
And JavaScript, unfortunately.

for most things, C++ or another C-alike language is strictly better than C.

Talk to an FAA-appointed Designated Engineering Representative specializing in certification of software and you’re likely to get a very different opinion.
Over a decade ago, before Oracle started getting its hands on Java and made it non-free (as in cost), Java was a good choice when starting a project to create new software: a full-featured modern language with wide support that could be deployed anywhere.
However, after a few big Java bugs, and Oracle making it expensive to develop beyond core Java (OpenJDK is fine, just not considered official for businesses), other languages started to become attractive. Especially with more advanced IDEs
Java isn’t going away and keeps getting better. The one most likely to be knocked down hard is Python. There are two major factors going against it that don’t apply even to Node, let alone C/C++/C#/Java/Rust/Go:
1. The community seems to be saturated with minimally employable users. By this I mean most of the Python developers we’ve seen in our East Coast tech hub are utter garbage at their own language of choice compared to C, C++ and Java developers. I’ve worked with only 2 Python-slinging data scientists who happened to have real chops at writing code, and surprise! their skills included heavy Java development, so they understood SWE work way better than most of their peers.
2. Any fundamental disruption in chip economics will hurt Python more because it’s so inefficient compared to the others. This is likely coming via a major military campaign against Taiwan. If the cost of good CPUs shoots up hard, Python is going to suddenly cost a lot more than more CPU-efficient languages like its main competitors. It’ll continue to be popular mainly in web dev and writing glue code, but on the big data side it will be too expensive.
1. An anecdote
2. Too tenuous to be serious.
You might be correct (I personally doubt it), but your arguments are worthless.
2? Not tenuous. You can measure these things:
Ranking Programming Languages by Energy Efficiency
https://haslab.github.io/SAFER… [github.io]
https://kaspergroesludvigsen.m… [medium.com]
https://hackaday.com/2021/11/1… [hackaday.com]
You might also be correct (I personally doubt it), and your argument is even more worthless.
These are silly comparisons. If you’re writing a lot of heavy for loops in an interpreted language, you’re doing it wrong. If, for some reason, you do need lots of for loops, use a JIT or compile your code before you run it.
1. This is the curse of ‘the’ hot language, *particularly* when it’s ‘the’ language for teaching programming. Language popularity causes this phenomenon. It’s not that there’s anything about Python that fundamentally makes its developers less capable; it’s just that, as the ‘default’ language for the early programming career, it attracts all the less capable developers.
2. Computers aren’t going to get *slower* due to such scenarios, though they may stagnate. It’s not going to suddenly cost more than it already does. It’s accurate that a compute-intensive section written in ‘pure python’ is going to underperform most alternatives, but the vast majority of development isn’t compute intensive. To Python’s credit, it has a respectable ability to invoke native code (whether it’s a shared object providing a Python interface, the built-in ctypes to call native code from core Python, or going the extra little bit to the third-party cffi for easier support of header-defined C interfaces). Lots of the ‘compute’-intensive Python stuff is just manipulating input and output to popular compiled libraries, like numpy; see the sketch after this comment.
I see Java and C# suffering from the fact that there is now a wave of traditionally compiled languages offering a lot of the Java benefits without the awkwardness of an external managed runtime. Rust and Go produce traditional executables that are straightforward to package and run. Interpreted languages suffer the same awkwardness, but at least that means the user gets the feature of being able to tweak directly without a build step. The compiled managed-runtime languages offer much of the worst of both worlds.
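To make the native-code point concrete, here is a minimal sketch of the native side of that pattern, with a hypothetical function name and build line; the hot loop lives in compiled code, so an interpreter pays one dispatch per call rather than one per element:

// sumsq.cpp - hypothetical kernel exposed through a C ABI so an
// interpreted language (e.g. Python via the built-in ctypes) can call it.
// Assumed build line (g++ on Linux): g++ -O2 -shared -fPIC -o libsumsq.so sumsq.cpp
#include <cstddef>

extern "C" double sum_of_squares(const double* data, std::size_t n) {
    double acc = 0.0;
    for (std::size_t i = 0; i < n; ++i)   // the hot loop runs as compiled code
        acc += data[i] * data[i];
    return acc;
}

From Python the call would be roughly ctypes.CDLL("./libsumsq.so").sum_of_squares(...) after declaring the argument types; the interpreter dispatches once per array instead of once per element.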
> 2. Computers aren’t going to get *slower* due to such scenarios, though they may stagnate. It’s not going to suddenly cost more than it already does.
The industry works on the assumption that hardware is now cheaper than labor costs except at places like Facebook and Google which have truly extreme scalability requirements.
What I was referring to is that we are on the precipice of WWIII, and China controls our supply of chips going forward unless we are willing to risk catastrophic casualties in our Pac
If we got the unions out of the way, American labor could easily compete with the Chinese. Our costs would be slightly higher because we would not pour wastewater directly into the environment, but that might be a small price to pay in the longer term.
Presuming that supply chain completely collapses, it means that their capabilities pause. They do not suddenly lose access to the massive volume of gear in their datacenters today. The volume of data they can process today is the same volume they could keep processing. If anything would be at threat it would be data *storage*, but processing isn’t suddenly going to backslide unless something were to happen to the standing installation of equipment, and in that case matters are probably so screwed that the
Why would China cut off outside sales, rather than continuing sales and taking the profits for themselves?
Energy costs are going up. Direct costs such as energy to run the computer, and indirect costs such as aircon and the network. That alone means that compute-inefficient languages are going to cost more.
Also, despite all the rah-rah bs from rust evangelists, it’s stuck at #20. I can live with that.

Most of our good CPUs and GPUs come from China. It’s pretty much a guarantee that unless we’re extremely lucky or China wimps out, Taiwan will be getting cut off from global supply chains by the PLA.

The designs for those chips are from the USA; you’re talking about fabrication. TSMC is at this moment putting its most advanced manufacturing (3nm process) in Arizona.
Taiwan’s TSMC to bring its most advanced chip manufacturing to Arizona [cnn.com]
What will affect the high-tech sector is China’s control of materials. That’s land (mining for “rare” metals), which, unlike technology, is absolutely not portable.
As you note, none of this has or will have anything to do with Python or any other software.
LISP is (in i
Point 2 is obviously nonsense because computers aren’t going to get slower.
Point 1 is less obviously nonsense, but it is nonsense anyway. The TIOBE index is calculated by counting search hits for each programming language on twenty-five search engines. Python seems to be the language of choice for amateur programmers. I classify “amateurs” as hobbyists and people who need to do a bit of programming in a job that isn’t programming. As an example, most of the people on YouTube with maths channels will use Python for the
By this I mean most of the Python developers we’ve seen in our East Coast tech hub are utter garbage at their own language of choice compared to C, C++ and Java developers.
This is true with anything that has a low entrance bar. Python is designed to get you up and coding what are basically advanced shell scripts rather quickly; it is like Perl, but less elegant and capable. Consequently, you have a lot of people who have picked up the “Learn 2 Code Python 4 Dummeez” ebook and have some grasp of one type of

The stuff just works, yo

Right up until you have customers. I worked for a company that thought the way you do. PHP just works. Of course the 4 second average loading time of their web pages was a problem. One they didn’t notice until they were declaring bankruptcy. People don’t like to wait. Just because the web page eventually loads doesn’t mean the backend code “just works, yo”. If it takes seconds to load, it doesn’t just work, because by that time your customers have all left. Whether software works or not isn’t binary. Do
If you are rolling out something for heavy usage, you are going to design the system differently from the ground up. It will need to scale, have specialized caching, and so forth. For the average website however that is serving a small audience, PHP works great because it allows template-style coding and does great for most of those websites. This is why despite being a technically unimpressive language, it makes stuff work just great for most uses. If I were writing for a high-volume website, I would not u

Of course the 4 second average loading time of their web pages was a problem

While Python is slow, if your page load is that slow, the language of choice is the least of your contributing factors.
Python has become the PHP of the 21st century. It is popular because of an “It is easy!!!” hype. Yeah, the SYNTAX is easier than C++ or Java or LISP for that matter, but that has little to do with complexity in the real world. It is at the end of an illustrious list.
Illustrates that the whole “many eyes make all bugs shallow” claim is utter bullshit, but that’s another discussion.
Want a good C/C++ program? Ask an experienced C/C++ programmer. Want a good Java program? Ask an experienced C/C++ programmer. Why? Because they have fewer illusions about what’s going on “under the hood”.

Along with the irony that the original Java was written in C and the modern Java compiler is now written in Java, while the JRE is STILL WRITTEN in C.

You are confused. The JRE is the Java VM host. Java is a language for the Java VM. I feel like a good “duh” is in order here.
It’s like saying it’s ironic that a SNES emulator isn’t written in 65c816 assembly, and 65c816 processors aren’t even real.

Want a good C/C++ program? Ask an experienced C/C++ programmer. Want a good Java program? Ask an experienced C/C++ programmer.

Having a hard time finding a job, I gather? There are plenty of people who know both, and none of them will tell you that. There are things to learn about programming on the JVM that you just won’t know without any experience. That being said, if you can program in C/C++, switching to Java isn’t hard. The converse can be much harder unless someone else sets up your environment and toolchain for you.
Having done both, java sucks. Absolutely sucks. And despite claims to the contrary, it’s still slow when you benchmark actual code.
It’s one of the reasons you don’t see your graphics card drivers written in java.
Wait… do you even remember what the log4shell exploit was?
It was against a java library.
Your point about a good java program requiring an experienced programmer who has ‘fewer illusions of what’s going on under the hood’ is wrong – Java, by necessity of its bundled-library problem, has so many more problems than other languages. Being good at java means knowing what these libraries are, not necessarily understanding the security implications thereof.
And having looked at the code of some of these java lib
log4shell was a bug in log4j, 3rd-party code written to add enhanced functionality to supplement/replace the default Java logger. It was not part of J2EE anyway; it was 3rd-party stuff included by the Apache Software Foundation that people added (and it was started by one guy).
“Pass a bunch of data to my neat logger and let it be interpreted by the runtime with no sanity checks. What could possibly go wrong?”
Sure makes little bobby tables look almost benign.
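The same class of mistake, untrusted data handed to something that interprets it, has a long-standing C/C++ cousin in the format-string bug. A minimal illustration:

// Format-string bugs: untrusted input *interpreted* instead of treated
// as plain data, the C/C++ analogue of log4shell's core mistake.
#include <cstdio>

void log_line(const char* user_input) {
    // BAD:  std::printf(user_input);
    // If user_input contains %x or %n directives, printf will read (or,
    // with %n, write) memory it was never meant to touch.

    // GOOD: a fixed format string; the input stays data.
    std::printf("%s\n", user_input);
}

int main() {
    log_line("hello %x %x %n");  // printed verbatim, never interpreted
    return 0;
}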
Bubbles, is that you?
> Java isn’t going away and keeps getting better.
Throwing features at something doesn’t necessarily make it “better”, because it also makes it more complicated, creating a steeper learning curve for newcomers.
Most ‘coders’ I saw before I retired from the software industry were hacks, in some way, shape, or form.
The Java guys put most of their effort into learning and reusing the vast library of pre-existing Java code already out there. They rushed out prototypes that would compile but were utterly un-scalable. Most of them had a reputation for coding entirely on their own and being awful to work with in a team setting. That being said, at least the work more or less got done.
Python was rolled out as a buzzword,
Yep, fully agree. Permanent training wheels will just mean that people never get good at what they do. And management will hire even cheaper to compensate for the apparently reduced opportunities for errors. If that were the fix, PHP, for example, would not have been a problem. It turns out you can make just as bad mistakes with memory safety; they just fall into more categories and hence appear individually smaller. But the problem remains the same overall, and the only thing that can fix it is competent coders.
Part of the problem I see is a lack of the sense of a new frontier to coding which animated the first couple generations of programmers. It was new, exciting, and unleashed lots of possibilities. Now it increasingly resembles a regular job, more like working with regulations or codes than creative work. One of the reasons I like the idea of lots of libraries — call it the “CPAN model” — is that it abstracts away drudgery if done well, and shifts the focus on any project from the known and tedious stuff to
Yep, that is certainly a factor. Of course, CPAN is well established and most stuff there will probably have long-term support. If you look at the mess that modern web “engineering” is, not so much.

Permanent training wheels will just mean that people never get good at what they do.

Except ‘what they do’ shouldn’t be focused squarely on memory management and memory safety. These things are tedious to micromanage, and dealing with them doesn’t inherently mean you have ‘better’ code.
You can specifically say things like ‘the implementation of language X has a garbage collection system that is too big an impact to performance’ or ‘the data structure overhead incurred by language Y’s memory safety system is too much to tolerate’, but the flat out rejecti
Memory management and memory safety are a barrier to entry that makes sure people who do not have what it takes fail early. Apparently you either do not get that or you do not want a barrier to entry, thus advocating for things staying bad.
No, it doesn’t; it just means that you get lots of security vulnerabilities and out-of-memory issues.
It’s not some ‘intelligence test’ that getting wrong keeps people out of the field. It’s easy to make it *work*, and getting it to work is all that a lot of development situations will notice. The mistakes are particularly insidious because they often don’t make themselves known until way, way later than when the developer started.
There’s no more than a handful of people who know what they are doing; everyone else just has the illusion of control. Unless you are the handful of people who know what they are doing, you can’t tell them apart. You can believe you are one of the handful, but you are almost certainly doing so based on far too little statistical evidence.
Greybeards with many decades of C development for products actively stressed by exploiters are exceedingly rare. Some embedded guys who for most of their career never had to deal with hu
Unless you are the handful of people who know what they are doing, you can’t tell them apart.
More generally, no one can recognize what they do not understand. However, none of this stuff is rocket science; it can be taught, and can be pursued by those in the field, if we start promoting people for competence alone instead of silly stuff like having the right selection of Funko Pops so they “get along with the team.”

However, none of this stuff is rocket science; it can be taught….

I used to think the same thing in my early college days in the early 1990s (I’d been programming for five years at that point): it’s not hard, so it’s just a matter of education. But then I saw a stream of students who just could not wrap their heads around simple concepts such as iteration, lamented how they hoped it got easier after Visual Basic, and then changed majors when they found out VB was the easy class.
It blew my mind.
My Introduction to Java class went from 40+ students at the beginning of the semester down to about six or seven by the end of the semester, and the level of difficulty was trivial at worst. There are just a TON of people for whom simple programming is an insurmountable black art.
There are just a TON of people for whom simple programming is an insurmountable black art.
In my experience, not too many given that they have basic cognitive ability. Lots of people just hate it or want an easier way. However, like all things, it cannot be taught to everyone. Same is true of English literature classes.
As a college programming instructor with ~20 years of experience: hard agree with the GP post.
Anytime this comes up, someone makes the “basic cognitive ability” qualifier, and it very much smacks of a circular definition.
I teach at a nationally-top-ranked community college in a computer science program, and perennially only about 5% of students entering the major succeed at earning the 2-year degree. I also didn’t believe programming was so commonly insurmountable until I started teaching. It’s the coldest spla

Memory safety is just a plain old basic requirement for every program of any notable complexity. Giving your expert programmer a language that doesn’t do it automatically is like buying a racehorse to carry sacks of potatoes around your farm.

Memory safety isn’t that hard.
Your mother probably taught you the basic principle – if you use something, be sure to put it back when you’re finished with it.

Your mother probably taught you the basic principle – if you use something, be sure to put it back when you’re finished with it.

I bet your code is riddled with security holes and memory leaks. Use smart refs (or smart pointers or whatever they are called now) please. It isn’t 1994 anymore. You don’t have to call free or delete yourself.
absolutely not. The same servers I wrote over a decade ago are still running with zero memory leaks (and they sit directly on the bare metal, not requiring middleware like a web server). No need to kill off a thread every 300-500 times to “reclaim leaking memory”, because it really is possible to write code in C that doesn’t leak. So hundreds of threads start at once, then run forever without being killed or needing the OS (BSD and Linux) to reap leaked RAM.
smart pointers are for stupid people – and they
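For anyone who hasn’t seen the style the earlier reply was pushing, here is a minimal, generic sketch of RAII with the standard smart pointers (nothing here is from any particular codebase):

// RAII with std::unique_ptr: ownership is released automatically on
// every exit path, so there is no free/delete to forget.
#include <memory>

struct Connection { int fd = -1; /* socket handle, buffers, ... */ };

std::unique_ptr<Connection> open_connection() {
    return std::make_unique<Connection>();  // caller becomes sole owner
}

void serve_one() {
    auto conn = open_connection();  // owned by a local variable
    if (conn->fd < 0)
        return;                     // nothing leaks on the early return
    // ... handle the request ...
}   // conn's destructor frees the Connection here, on every path

int main() { serve_one(); }

Whether that counts as training wheels or basic hygiene is exactly the argument above; the mechanism itself is just a destructor running at scope exit.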

It sounds easy, but the reality is that a lot of bugs in C and C++ are memory related.

Yes, but that’s because most “programmers” are shite, same as everything else. Remember George Carlin’s “90% of everything is crap”? He was an optimist.
Developers are not made better by having to micromanage alloc/free and separately having to track the sizes of various data structures or limits on those sizes.
It just means they are stuck with extra tedium and a very high chance of getting it wrong.
If you have a language that delivers the general capability and performance of C, but with sane data structures and better automatic memory allocation than slapping stuff on the stack, there’s not really a downside. In practice some of those features co
If we are trying to avoid tedium, we are in the wrong field. It helps to manage data structures directly and get clarity on what they are. Relying on the language as a diaper to catch all possible spillage just pushes the lack of clarity and the laziness into some other area. In addition, trendy languages come and go, whereas if we simply wrote everything in C++, with specialized libraries for those data structures as necessary, the code could be built upon in the future.
We are trying to avoid tedium all the time. C was invented to get away from the tedium of basically writing directly to the processor instruction set.
I agree that understanding the memory layout, and having the language implementation make sure things are properly aligned and such, is valuable. However, for the “size of data structure” problem, that isn’t really useful micromanagement; there’s negligible downside to letting the language handle it (having a size_t member as an innate facet of variable-sized things). Mem
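A small sketch of that point, contrasting a separately tracked length with a container that carries its size as an innate facet:

// Manually tracked sizes vs. containers that carry their size with them.
#include <cstddef>
#include <numeric>
#include <vector>

// C-style: the pointer and the length travel separately, and nothing
// stops a caller from passing a stale or simply wrong n.
double sum_raw(const double* p, std::size_t n) {
    double s = 0.0;
    for (std::size_t i = 0; i < n; ++i) s += p[i];
    return s;
}

// The size is part of the object itself, so it cannot drift out of sync.
double sum_vec(const std::vector<double>& v) {
    return std::accumulate(v.begin(), v.end(), 0.0);
}

int main() {
    std::vector<double> v{1.0, 2.0, 3.0};
    return sum_raw(v.data(), v.size()) == sum_vec(v) ? 0 : 1;
}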
I remember the old Pascal-versus-C debates. Man, those were great! About as fun as Mac-versus-PC or middle eastern politics. But the same struggle applies, and the same basic fact will determine the outcome: the more flexible language wins. C++ at this point gives you every option and tightly integrates with machine code and any other “literal” hardware or software. Yes, most people do it wrong, but just like we had an anti-smoking campaign and a crusade against drunk driving, we can put up big billboards a
This is the same TIOBE index that claims Assembler is more popular than PHP? And that Visual Basic and C# are more popular than JavaScript?
I just did a c++ smart pointer implementation.
And the only thing I did not remember was where exactly the “operator” keyword belongs in a cast operator. So: I only googled once. (A sketch follows below.)
However, my toy language of the year (more precisely, next year) is Wren. See: http://wren.io/ [wren.io]
No Java for 3 or 4 years now – I guess we will be at Java 20 when I start again, lol.
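Since that is the exact detail the parent had to google: in a conversion (“cast”) operator the operator keyword comes right before the target type, and no separate return type is written. A toy illustration (the Handle class is made up for the example):

// Where `operator` belongs in a cast operator: `operator TargetType()`,
// written with no separate return type.
template <typename T>
class Handle {
    T* p_ = nullptr;
public:
    explicit Handle(T* p) : p_(p) {}
    operator T*() const { return p_; }    // conversion operator to T*
    T& operator*() const { return *p_; }  // ordinary dereference operator
};

int main() {
    int x = 42;
    Handle<int> h(&x);
    int* raw = h;  // implicit use of the conversion operator
    return (*raw == *h) ? 0 : 1;
}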

I just did a c++ smart pointer implementation.

Why, though?
Because I have a C program, which implements a small virtual machine/byte code interpreter.
It has its own GC, simply wrapped around standard malloc/free.
I want to control the memory area in which the VM allocates its heap objects, to be able to have a SmallTalk like image that I can write out to disk.
As I did not want to hunt down all the ways the VM uses pointers, I now compile it with a C++ compiler instead of C, and I have changed all the typedefs involving VM internals (about 15) to my smart pointer.
Hence t
Interesting. But couldn’t you have used one of the C++ standard library smart pointers instead of implementing your own?
Not that I’m aware of; I don’t think there is a pointer in the standard library that references memory via a shared “base pointer”.
It would have spoiled the fun of dabbling a bit with C++ anyway
True, I don’t know enough about C++ smart pointers to rely on that detail.
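For the curious, a minimal sketch of the base-relative idea described above, with all names hypothetical: the “pointer” stores an offset from a shared heap base rather than an absolute address, so the whole heap image can be written to disk and remapped somewhere else. Boost.Interprocess ships a related type, offset_ptr, though that one stores the offset relative to the pointer’s own address rather than a global base:

// Hypothetical base-relative "smart pointer": it stores a byte offset
// from a shared heap base instead of an absolute address, so a
// Smalltalk-style image can be dumped to disk and remapped elsewhere.
#include <cstddef>

inline char* g_heap_base = nullptr;  // set once when the VM heap is mapped

template <typename T>
class BasePtr {
    std::ptrdiff_t off_ = -1;  // byte offset into the heap; -1 means null
public:
    BasePtr() = default;
    explicit BasePtr(T* p)
        : off_(p ? reinterpret_cast<char*>(p) - g_heap_base : -1) {}

    T* get() const {
        return off_ < 0 ? nullptr
                        : reinterpret_cast<T*>(g_heap_base + off_);
    }
    T& operator*() const  { return *get(); }
    T* operator->() const { return get(); }
    explicit operator bool() const { return off_ >= 0; }
};

After the image is reloaded, only g_heap_base needs updating; every BasePtr stays valid because it never held an absolute address.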
My first paid coding job was C in the late 1980s, did many languages in between, lots of C++, and now I’m wrapping up my career with a job doing C in the 2020s.
It is funny how much and how little has changed.
If Java drops sufficiently in the rankings, maybe one day it will be cool to be a Java developer again.
That depends on there being good java developers again, probably. Given how much garbage many of the commonly used java libraries are, I’d not hold my breath.
Unless you’re doing embedded, or telecom, or finance, or
But, yeah, not going to find much c++ if you’re doing windows development.
I mean, it rates C as the second most popular language and, as we all know, C was originally a prank.
Seriously, this is valid C:
for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);
There’s no way this was ever a language intended for real-world applications.
It’s all documented here:
https://www-users.cs.york.ac.u… [york.ac.uk]
Therefore, this article itself must be a prank.
Ok. Just in case you are serial about this, that article was posted on April 1. You may draw your own conclusions from that.

Ok. Just in case you are serial about this, that article was posted on April 1. You may draw your own conclusions from that.

Comment about the insane syntax stands.
“Name a single language that doesn’t accept that very same syntax” …. Python?
But if it was in Perl, that wouldn’t be a prank!
(JK, but Perl does let you write code that looks like that without defining a bunch of obfuscation macros first)
Languages that started out (relatively) free increased their popularity, which had the network effect of many wheels and other parts getting developed, making it easier to build cars. All of it turned into legacy to be maintained. Popularity also came from Python’s ease of use via its interpreter, while the web programming languages’ popularity derived from being first movers.
All the latest and greatest languages have a large legacy porting hurdle to jump. If I’m starting a new high-level application I’m going to select
Java dropping a notch is not a ‘zoom’, so the choice of wording puts an interesting spin on what happened. More interesting is the performance of other languages like Rust or even MATLAB. In any case, Java will be around for many, many decades to come … like FORTRAN.
As we all know, popularity is what matters in tech.
Of course, that only goes for the absolute most undisputed popular concepts, products, and companies. Second place when ‘going for’ popularity is a tie for dead last.
Shooting for ANY other metric, there are pros and cons to what you actually produce.