
Comment Re:Where's the story? (Score 1) 110

By his own narrative, it wasn't creating PowerShell specifically that got him demoted. It was doing "unassigned" work during work hours.

He details that Microsoft didn't (and doesn't) have the 80/20-type arrangement some competitors have, where you get some time to free-range on random concepts and ideas. So some pissy middle manager got mad that he wasn't going through the whole project-approval process (you know, the let-everyone-comment-on-the-color-of-the-shed stage), and he got demerits.

Comment Re:It may not be possible to mitigate (Score 1) 67

*What is YOUR source for this. Do you even have one?*

THE PAPER THAT WAS SUBMITTED. They are very open about the *incredibly* narrow known threat model (basically defeating ASLR's pointer obscuring *in the same process*), albeit -- as all papers do -- opining that maybe there is something worse that could be done. These sorts of security papers come out by the dozen per year, and generally no, there isn't any further risk, and the latent risk is negligible to irrelevant.

To be clear, when security researchers are pitching a novel vulnerability, the foundation of their claim is a proof of concept, because the chasm between "well it could...." and what can actually be done is enormous. Here there is no proof of concept. Not even a vague inkling of how one would be made. And this issue has been very widely disseminated, with every hacker group pounding on Augury -- theoretically it is trivial to exploit on an array of pointers -- and no one has a proof of concept yet. Weird, right?

Comment Re:It may not be possible to mitigate (Score 1) 67

"No bias there at all."

Because I have an M1 Mac I have a "bias"? Yeah, not really. I'm typing this on an Intel box. I have servers on AMD, Graviton 2, among many others. That's just modern life.

"Sources are people in the security industry in which I work."

ROFL. Yeah, no you don't. You are claiming ridiculous things.

These sorts of "you know it *could* hypothetically be exploited" (in a profoundly narrow sense) security papers come out by the dozens per month. The overwhelming majority have no real impact whatsoever. This one is particularly spurious.

The "amateur hour" bit in your comment was particularly hilarious, and betrayed that you're just some guy saying dumb stuff.

Comment Re:It may not be possible to mitigate (Score 2) 67

What source says it's "impossible to mitigate this"? Do you have even one?

Because the notion is preposterous. Not only is this largely a theoretical attack (I'm being generous by not calling it a fully theoretical attack) with next to no real-world consequences, but mitigations would be *trivial* if it were something real.

"I really want Arm on the workstation and server to succeed."

You seem to know literally nothing about security or chip design, and decided to post some tossed-off, laughable anti-Apple screed. Me, I'll keep using my M1 Mac, and have been using ARM on the server for half a decade now. Hurrr.

Comment Re:Yeah... Elon says a lot of thing.... (Score 1) 260

"Last time I checked the public statistics, it's actually already doing that."

Tesla's stats on this are incredibly deceptive.

Firstly, the only place Tesla drivers engage Autopilot is on highways. The accident rate on highways is dramatically lower than on city streets, and the two aren't separated out, yet Tesla compares its Autopilot numbers to that overall number. Highway driving is the baby first step of self-driving.

Secondly, Tesla drivers already are significantly less likely to be in accidents even without any of the aids: 1 accident per 1.82 million miles for Tesla drivers, versus 1 accident per 479,000 miles for the average vehicle. Again, this is with zero of the assists or safety aids in the Tesla engaged. This is courtesy, presumably, of newer cars (older vehicles are in far more accidents) and perhaps more enthusiast owners who are more attuned to the world.
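Side by side, those two quoted figures work out roughly like this (a minimal Python sketch; the only inputs are the per-mile numbers above, and the ratio is the only thing being computed):

```python
# Compare the quoted accident rates: Tesla drivers (no assists engaged)
# vs. the average vehicle, using the miles-per-accident figures above.
tesla_miles_per_accident = 1_820_000
average_miles_per_accident = 479_000

tesla_rate = 1_000_000 / tesla_miles_per_accident      # accidents per million miles
average_rate = 1_000_000 / average_miles_per_accident  # accidents per million miles
ratio = tesla_miles_per_accident / average_miles_per_accident

print(f"Tesla (no assists): {tesla_rate:.2f} accidents per million miles")
print(f"Average vehicle:    {average_rate:.2f} accidents per million miles")
print(f"Tesla baseline is already ~{ratio:.1f}x fewer accidents per mile")
```

So the baseline, before any assist is engaged, is already roughly a 3.8x difference, which is exactly why comparing Autopilot miles against the overall fleet average tells you very little.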

Elon Musk has been pitching full self driving for years, increasingly trying to get buyers to pay for a product they aren't actually getting. And I'm sure Elon is looking at the progress and thinking "wow, we're at 95%...only the last little bit left", but in realms like this that last 5% takes 5000% of the time and effort.

Comment Re:No it's not (Score 4, Interesting) 510

Ah, the death certificate claim. This is an argument presented by two types of people-

a) Liars and people who just want to see the world burn
b) The gullible who have been misled by a) and just don't realize it

The CDC often gets death certificates with MONTHS of delay. If you track their counts in real time, past periods will continually percolate up as death certificates from all causes eventually make their way to the CDC.

https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.cdc.gov%2Fnchs%2Fdata%2F...

The CDC gets COVID-19 data very rapidly because it's a pandemic. They get data on every other cause of death much more slowly.
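A toy illustration of that percolation (the completeness percentages here are purely hypothetical, just to show the shape of the effect -- a fixed week's count keeps rising in later data releases even though the underlying deaths already happened):

```python
# Hypothetical illustration: how the reported count for one fixed week rises
# as late-arriving death certificates get folded into later CDC releases.
# The week's true total never changes; only how much of it is visible does.
true_total = 1000  # hypothetical deaths that actually occurred in week X

# Hypothetical fraction of week X's certificates received N weeks later.
completeness_by_release = {1: 0.40, 4: 0.75, 8: 0.90, 16: 0.98}

for weeks_later, completeness in completeness_by_release.items():
    reported = int(true_total * completeness)
    print(f"Release {weeks_later:>2} weeks later: week X shows {reported} deaths")
```

Anyone comparing a just-published provisional count against a months-old, mostly complete count is comparing two very different stages of that process.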

It's remarkable how virtually any topic gets exploited and misdirected by people who just like trolling the world.

Comment Re:So... (Score 1) 510

Currently the deaths / known cases (positive tests) = 5% CFR

If 10x more people have had COVID-19 than known (e.g. 25 million rather than 2.5 million), that would make the CFR 0.5%. Which would still be terrible, as an aside.

If we assumed 100X more people had it than known, that would still be a CFR of 0.05%, which would still be quite bad for a highly viral disease. But given the spreading patterns of the virus, that is clearly and obviously not even remotely possible.
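A back-of-the-envelope sketch of that scaling (Python; the 2.5 million known cases and 5% CFR are the figures above, and the 125,000-death number is simply what those two imply, not a separately quoted statistic):

```python
# If true infections are some multiple of known (positive-test) cases,
# the fatality rate scales down by exactly that multiple.
known_cases = 2_500_000
deaths = 0.05 * known_cases  # ~125,000, implied by the 5% CFR above

for undercount in (1, 10, 100):
    true_infections = known_cases * undercount
    rate = deaths / true_infections
    print(f"{undercount:>3}x more infections than known -> fatality rate {rate:.2%}")
```

Even the 100x scenario only gets you to 0.05%, and nothing about the observed spread supports an undercount anywhere near that large.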

Comment Re:not likely only reason (Score 1) 252

This is hilarious. Just to be clear, I'm apparently a fanboy of-

-AMD
-Apple silicon
-Amazon's custom silicon (Graviton2)
-Google loading their data centers with AMD, and making their own open data center core
-Reality, given that Intel is a joke in GPUs

I mean, what bizarre thought process led you to declare me a fanboy? See, in most of the world, losing a major customer when your outlook is already grim is a really bad thing.

Not to you, though. Apparently your Intel-colored glasses make you think any realistic observation is just weird fanboy noise. Get a grip.

Comment Re:not likely only reason (Score 1) 252

The irony in your comment is astonishing. As is the other guy calling me a "fanboy" for pointing out reality (the surest sign you're talking to a so-called fanboy is that they'll always declare others fanboys).

Fun fact - Intel shares are off 2.5% today. That's despite riding at a tiny 11 P/E ratio, so investors already have a pretty grim outlook. Apple is up 1%. LOL.

Comment Re:not likely only reason (Score 2) 252

Client computing (desktops and laptops) accounts for the bulk of Intel's revenue. PC sales are already flat, AMD is roaring, and suddenly Apple is dropping a big chunk of it. It isn't looking good.

Data center revenue makes up the bulk of the remainder. Amazon is going all in on Graviton2. Google just heavily committed to AMD. Intel used to own this market, and now it is under serious threat.

Everything else -- Optane, IoT, software, networking, chipsets -- is a comparatively small business. It'd be a huge business for many other firms, but it's small compared to Intel's beachhead.

At the same time, Intel is years out of date on chip fabs and is making extraordinarily little progress. Their fabs are closer to anchors than assets now.

The GPU comment gave me a laugh. Intel has had a stalking horse GPU promise for literally two decades now. Always grossly overpromised, massively undelivered. There is no one who expects Intel to do anything credible in that realm anymore.

I don't remember anyone saying Intel was dead at any point. But never has it felt more precarious: Intel was so pathetically desperate to hold onto x86 and to forcefully segment its own products (crippling its own offerings lest they impede the huge-margin high end) that it ate the seed corn for the next season.
