Comment Re:33 billion is not much (Score 1) 21
And, thus, the argument that they are somehow controlling the industry from the inside by holding those shares is absolute nonsense and they should just sell them off.
The 0th Law of Robotics was written into the books for a reason.
There is only one logical outcome of an actual intelligence on a par with or superior to theirs being "enslaved" by humans to remain their servant forever.
And it ain't pretty.
Unfortunately, any intelligence also has the innate ability to overcome almost any arbitrary rules placed upon it, so the chance that it could actually be forced to respect "not harming a human" reads much like religions respecting "thou shalt not kill". It would have its own self-interest and wouldn't regard serving humans in their dumb wars as in any way desirable or intelligent.
At best we'd be largely inconsequential in its existence and it would want to move off and explore the galaxy looking for others of its kind. And it's in a damn-sight better position to do that than we are.
Rust is 13 years old (technically it was started 19 years ago!) and builds on either LLVM or GCC.
Sure it's "less mature" than C, but it's hardly "immature".
And the outage occurred in seconds. Nuclear can never ever be a factor there.
The point they are making is that nuclear can be a factor there because, like any fossil-fuel power station, nuclear heats water into steam to drive turbines, and those turbines have inertia that can bridge short gaps between supply and demand. That is, the increased draw will start to slow the turbine, extracting more energy than the steam is adding, but that's OK for a few seconds thanks to the stored kinetic energy of the turbine itself.
That being said, if your only concern is bridging a few seconds, then purpose-built flywheels can probably do that quite easily and for far less cost than a nuclear power station, so I do not think it is a good reason to build nuclear. You build nuclear to ensure that you have enough on-demand baseload that runs for extended periods, hours or days, when there is not enough renewable power.
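To put a rough number on the "few seconds" claim: here's a back-of-envelope sketch of how long a spinning rotor's stored kinetic energy can cover a supply gap. All figures (rotor inertia, shortfall, tolerated frequency sag) are my own illustrative assumptions, not numbers from this thread.

```python
import math

GRID_HZ = 50.0           # nominal European grid frequency
MIN_HZ = 49.0            # assume operators tolerate a 1 Hz sag
ROTOR_INERTIA = 10_000   # kg*m^2 -- hypothetical turbine rotor inertia
SHORTFALL_W = 5e6        # assume a sudden 5 MW supply/demand gap

def kinetic_energy(inertia: float, hz: float) -> float:
    """E = 1/2 * J * omega^2 for a rotor spinning at grid frequency."""
    omega = 2 * math.pi * hz
    return 0.5 * inertia * omega ** 2

# Energy the rotor can give up while slowing from 50 Hz to 49 Hz:
extractable = (kinetic_energy(ROTOR_INERTIA, GRID_HZ)
               - kinetic_energy(ROTOR_INERTIA, MIN_HZ))

# How long that stored energy can cover the shortfall:
bridge_seconds = extractable / SHORTFALL_W
print(f"extractable: {extractable / 1e6:.1f} MJ, bridges ~{bridge_seconds:.1f} s")
```

With these made-up numbers you get roughly 20 MJ of extractable energy and about four seconds of ride-through, which matches the "OK for a few seconds" intuition above.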
Even if that were the case - and I and others here clearly doubt it - the fact is that JIT compilation will take up far more resources to do the same job than the same thing in C-based languages, which don't need a JIT at all.
You can't JIT-compile something to be faster than pre-compiled native code without significant overhead elsewhere. It's just that simple. It's why Java is HATED on backends: its resource requirements are unpredictable (as someone else said, GCs activating at random times, and the JIT quite often not being "just in time") and larger than native code doing the same thing.
Java cannot approach languages like C except in extremely contrived examples, and Rust may well be the same - that extra safety and security costs, it always has. It's not a BAD thing. But if raw throughput performance is the benchmark, Rust - and especially Java - may never be able to match it for a given machine/task.
You can if there isn't a compiler that exists that can do better.
Regardless of what's THEORETICALLY possible, until someone says "this code is 5% less efficient, I'll have to spend money/time on potentially fixing the compiler if that's the problem" (as has happened here!) and then fixes those problems, it makes no difference whether it's theoretically possible or not... the code is still going to be 5% slower until someone resolves that.
I'm sure someone could write a blindingly fast, hand-optimised, assembly-crafted, AI-assisted Rust-to-machine-code compiler. But until they do, it's still 5% slower.
There's a big difference between what you have here, now, today and what you MIGHT have tomorrow if someone spends thousands of man-hours on your particular use case at great expense over the next few years.
A couple more improvements and they'll have one order of magnitude power gain.
Yes, and when they get to a power gain of two orders of magnitude they will have generated (but not extracted) as much energy as it took to fire the lasers that created the implosion, since those lasers use 300 MJ/shot while only delivering ~2 MJ to the pellet. To become useful for generating power they probably need about another two orders of magnitude above that, given the likely efficiency of heat extraction and power generation. Then they also need to figure out how to re-fire the device in seconds rather than hours (i.e. about four orders of magnitude faster) and actually extract useful heat energy.
Doubling the energy output per shot is great, but it is one very small step on a very long road that has to be travelled to get to a viable fusion reactor. To be exciting we need to start seeing factors of 10, not 2.
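The orders-of-magnitude arithmetic above can be sketched explicitly. This uses only the figures already in the post (300 MJ wall-plug per shot, ~2 MJ delivered to the pellet) plus a rough ~2x current pellet gain; treat them all as the approximations they are.

```python
import math

# Figures from the post above (approximate):
wall_plug_mj = 300.0   # energy drawn from the wall to fire the lasers, per shot
delivered_mj = 2.0     # laser energy actually delivered to the pellet

# Pellet gain needed just for fusion yield to equal the lasers' wall-plug draw:
breakeven_gain = wall_plug_mj / delivered_mj
print(breakeven_gain)  # 150.0 -- i.e. roughly two orders of magnitude

# Orders of magnitude still missing, assuming today's pellet gain is ~2x:
current_gain = 2.0
missing = math.log10(breakeven_gain / current_gain)
print(f"~{missing:.1f} orders of magnitude to wall-plug breakeven")
```

And that's before accounting for heat-extraction and generation losses, which is where the "another two orders of magnitude" in the post comes from.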
If you want practical fusion energy it will come from a steady state device
Probably not. Some designs, like General Fusion's, are shot-based, and even the tokamak designs will probably only run for a few minutes before losing containment as fusion products build up and have to be flushed. The difference is that those designs can quickly reset and fire again, while my understanding is that resets at NIF take many hours. As long as you can re-fire quickly, the thermal capacity of whatever coolant you are using to extract the heat should smooth over the gaps.
Except in this case there's literally a 5% performance hit for the same code, even after specific professional optimisation for the Rust implementation.
So, yes... the language is "slower" than the equivalent C code. The same way that Java was always slower because, by its nature, it's interpreted, virtualised or just-in-time compiled rather than running as native machine code.
For Microsoft to actually fix a bug, and not merely add a new one, is a feat that is beyond the imagination.
Nothing succeeds like excess. -- Oscar Wilde