JavaScript is killing the world, basically.
Stupid claim.
JavaScript is a language.
The runtimes executing it are often - after the analysis and JIT compilation overhead - about as fast as C.
So: you have to compare the JIT compilation overhead with the compile-and-link overhead of C (or similar).
Blaming languages instead of their implementations - if you want to blame anything at all - is plain stupid.
How about this:
A)
- 10 programmers meet in two offices at the same place every day
- a team of 4 and 6
- they program in Smalltalk
- they release a major feature every month
- most of them have a commute time below one hour
- their software is used by 1000 people in the company they work for (or imagine an airline having 10,000 people using that software)
Versus B)
- 25 programmers meet in 3 offices at different places
- roughly same team sizes
- they program in "insert your favourite not-for-weenies hardcore language here"
- they deliver a "major feature" every 3 months
- their commute to work is roughly 2 hours (yes, artificially complicated)
- their software is used by 100 people ... imagine a power company doing specialized power contracts
So ... what the funk does the perceived speed advantage of the final "executable" have to do with the amount of CO2 the programmers produce during their work?
My computer is sitting here doing nothing except showing "windows". It still uses ~50 watts. Sure, this is a laptop, not a blade in a data center.
If people think that programmers/datacenters running big iron software do not "optimize": then they are stupid as shit.
Your random Android or iOS app might use a linear search through its data instead of a binary search on a sorted list ... are you sure that this costs much more energy? The data is in memory. The CPU (in general) does not care whether it idles or walks over some memory; neither does the MMU or the memory itself.
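A minimal sketch of the two searches over a small in-memory list (the word list here is made-up filler data, Python chosen for brevity):

```python
# Linear vs. binary search over a small, cache-resident list.
# 1000 entries mirrors the vocabulary-app example below.
from bisect import bisect_left

words = sorted(f"word{i:04d}" for i in range(1000))

def linear_search(items, target):
    # O(n): walk the list front to back
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    # O(log n): requires a sorted list
    i = bisect_left(items, target)
    if i < len(items) and items[i] == target:
        return i
    return -1

# Both find the same element; the difference is ~1000 comparisons
# versus ~10, all on data that fits comfortably in the CPU cache.
assert linear_search(words, "word0750") == binary_search(words, "word0750")
```

At this scale the asymptotic difference is real but the absolute cost of either search is tiny, which is the point being made.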
So, my current Android toy app is a vocabulary trainer. Guess what: the whole data set fits into the CPU cache. Ten or 20 times over, perhaps 100 times. Because 1000 core words of "insert your mother language" and their translations into "insert your target language" are, in UTF-8, far below 200 kB of data.
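A quick back-of-the-envelope check of that size claim - the per-entry byte counts are assumptions for illustration, not measurements:

```python
# Rough size estimate for a 1000-entry vocabulary data set.
n_entries = 1000
avg_source_bytes = 12       # assumed average UTF-8 size of a core word
avg_translation_bytes = 12  # assumed average UTF-8 size of its translation

total_bytes = n_entries * (avg_source_bytes + avg_translation_bytes)
print(total_bytes)  # 24000 -> roughly 24 kB, far below 200 kB

# A typical laptop L2 cache is on the order of 1-2 MB per core,
# so a data set this size fits dozens of times over.
```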
If we had better CPUs, perhaps we would have better OSes and could make an fOpenDirectToCPUCache("filename", "rb") call?
Energy is wasted on the "big picture iron" ... which has to stand by for "Black Friday" and other "emergency sales".
Power consumption is most certainly not spiked by "oh, Python is so slow" or "oh, JS is so slow", but by not calling the right library.
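To illustrate "not calling the right library": a hand-rolled sort in the interpreter versus the standard library's sorted(), which runs in optimized C. The data here is arbitrary filler:

```python
# Hand-written insertion sort vs. Python's built-in sorted() (Timsort in C).
import random

def insertion_sort(xs):
    # Classic O(n^2) insertion sort, executed entirely in the interpreter.
    out = list(xs)
    for i in range(1, len(out)):
        key = out[i]
        j = i - 1
        while j >= 0 and out[j] > key:
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

data = [random.randrange(10_000) for _ in range(2_000)]

# Same result either way; the library call does orders of magnitude
# less interpreter work per element.
assert insertion_sort(data) == sorted(data)
```

Same language, same result - the difference is purely in which code path does the work.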
There are big power consumption areas. Like training large LLMs. And? There is nothing much to do about it.
If you want better programmers then perhaps teach them how to use libraries, instead of making them fail exams by asking what an "epsilon free grammar" is.
And if you hate Python so much because of its significant whitespace - what about going back to RPG? Oh, you do not know what RPG is ... I guessed so.
Or you could program in Nim, which looks like Python (with static typing) but ultimately compiles to machine code via C - or look at Mojo.
Languages are not the problem. Algorithms are. And that goes over a big spectrum.
Why use an SQL database when a network/graph database covers all your needs? Oh, because the NoSQL guys are stupid and you do not want to be associated with them ...
Bottom line: the problem is MONEY. Some people have MONEY, and they want RETURN on the BUCK. The rest is of no concern to them.
I have a billion invested; you fail to pay the promised 100 million? But you are confident, and I believe you (and you are actually delivering), that investing another billion will pay not just 200 million but more? So, let's go.
Money, and short-term ROI, is the problem. You could call it artificial scarcity of resources. If you could just hire an independent contractor to speed up your already existing software ... but you can't; you have to go through management, HR and so on ...
So, energy consumption, or the speed of your software, is not the fault of the language. If you do not believe it, I will send you the link to the interpreted C runtime programmed by John Carmack himself. So: we have C, and we have C, and one is faster than the other. How can that be?
Bottom line
Of course we can utilize existing hardware better.