
Comment Re:sic transit gloria Microsoft (Score 1) 216

The company was born in the American culture with a team of young Americans (for better or worse) and this included the Yankee attitude that a person's personal computer was THEIRS and their data was THEIRS and what a person did with his computer was nobody else's business. Instead of leasing a computer and its OS, you could buy a computer and buy a Microsoft OS and then use it privately and securely and what you did with it was nobody else's business. The company's current CEO is not from the American culture (no, not skin color, CULTURE) and seems to not understand the entire POINT of the 1970s and 1980s microcomputer revolution. Not understanding the point, and not "getting it" on the independence and privacy stuff means he's gonna drive the company into a serious ditch at some point.

Unfortunately, I don't think the issue is Nadella, because the push-everyone-to-subscriptions-and-cloud isn't unique to Microsoft by any means. Google has pretty much graveyarded all of their desktop software except Chrome and Google Drive, which is functionally their variant of Remote Desktop (and yes, their Android dev tools). Adobe obviously made Creative Cloud a one-way trip, Salesforce has been cloud/subscription only from day one, and while AWS isn't the biggest share of Amazon revenue, it is their most profitable.

So, while I agree that Nadella has moved away from the mindset of the user being sacrosanct, I don't think I'd blame it entirely on his culture. He wasn't a trendsetter on pushing Azure and OneDrive; I think it was an overall trend of improving revenue...and, in his defense, I think that phones and iPads and smart TVs and MacBooks mostly replaced the home PC that most people had in the late '90s and early 2000s, leaving that revenue stream to dry up over the next decade. That wasn't Microsoft's bread and butter of course - business licensing was and is - but the trend was that users were moving away from local machines and admin access, and actively choosing devices that had a "mommy knows best" paradigm.

The concept of a user's computer belonging to the user and not to the OEM was, to an extent, a product of its era - if you were spending $2,000-$6,000 for a computer, there was a clear use case for it, and it was entirely possible to opt out of owning a computer and go about your daily life. Even if you owned a computer, it was dial in, do your business, disconnect. By the time Nadella came in, the market shift was already underway: users had actively been choosing to move away from devices that gave them total hardware and software access, and toward those that let them safely play in a walled garden they couldn't mess up. On the business side, the accountants liked the move from capex to opex, the techs liked avoiding hardware drama, and the C-suite liked having people to shift blame onto in the event of a failure/outage/security breach...and if Nadella wasn't going to cash in on it, Schmidt and Bezos would.

So...yeah, I too mourn the fact that owning one's own hardware and the mindset of "the user is sacrosanct" is almost nonexistent at this point. However, I don't know that I'd blame Nadella for responding to an overall shift in what the market wanted. It's just sad that the market did so.

Comment Re:There may be an alternate explanation (Score 1) 216

I think it is possible that Microsoft is simply lying here, and the whole insane claim is just an attempt to make AI, as offered by Microsoft, appear to have capabilities it clearly does not have.

Devil's Advocate: it might be a means of giving a worthwhile demonstration. Windows still has no shortage of code from the early 2000s - while I consider it a useful feature that most of the MMC snap-ins work the same today as when I was in college learning Server 2003, the devs seem to think that "old == bad", rather than "old == done". If they can successfully use CoPilot to refactor WINDOWS, in a way that is still drop-in compatible with Windows software, it may be a solid demonstration to potential customers with similarly ancient and convoluted code bases that they too can *finally* move to something modern.

Of course, this hinges on whether Microsoft is *successful*, but if they can pull it off, even partially, it may be a good marketing tool for the demographic who needs it.

Similarly, I wouldn't be surprised if Microsoft's internal use of CoPilot for this project would allow for some Hollywood Accounting shenanigans, where they can use interdepartmental billing to show that CoPilot has a good amount of usage, to help justify their obscene AI hardware spend over the past few years.

Just a thought. Obviously, they could also be insane and stupid enough to really attempt this.

Absolutely, they could attempt this! ...whether they will *SHIP* the output is a different story, but attempting it? Absolutely.

My guess is that they'll probably refactor some of the more difficult, low-level, mostly-universal things. Resolving some multithreading issues would be one place they're likely to focus, the file manager another, and they're probably going to do some sort of settings consolidation project where the vestigial .cpl applets (and possibly the MMC snap-ins) finally get merged into Settings, getting rid of some of the larger headaches. What they might also do is vibe-code a compatibility layer of some kind that keeps the old codebase in some sort of VM/LXC container situation - their own Rosetta - to keep older software usable, while focusing on UWP/MS Store software as the first-class solution they keep wanting but have never been able to push too hard.

Comment Re:Cause and Effect. (Score 3, Insightful) 54

Not so. We have moved on from such large time uncertainties in networking in the last 10 years, at least for specialized applications.

Also, do not confuse the reference clock being slow with time propagation across the global network (i.e., the Internet) being slow.

It is fine for the NIST clock to be inaccessible to the wider network for short periods of time, which would inevitably introduce drift on the downstream systems. Once those systems reconnect to NIST, they can correct themselves.
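To put rough numbers on that, here's a minimal sketch (with a hypothetical drift rate - the actual figure depends entirely on the downstream hardware) of how much error a downstream system could accumulate while cut off from the reference, and the trivial correction on reconnect:

```python
# Minimal sketch: worst-case drift accumulated by a downstream clock while
# the NIST reference is unreachable, and the step correction on reconnect.
# The 50 ppm figure is an assumed ballpark for a cheap crystal oscillator,
# not a measured value for any particular system.

DRIFT_PPM = 50.0  # assumed free-running oscillator drift, parts per million

def accumulated_drift(seconds_offline: float, drift_ppm: float = DRIFT_PPM) -> float:
    """Worst-case time error (seconds) built up while disconnected."""
    return seconds_offline * drift_ppm * 1e-6

def resync(local_time: float, reference_time: float) -> float:
    """On reconnect, step the local clock back to the reference."""
    return reference_time

# One hour offline at 50 ppm: up to 180 ms of error, erased on resync.
error = accumulated_drift(3600)
```

The point being that downstream drift during an outage is bounded and self-correcting; the reference itself drifting is the part that isn't supposed to happen.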

What is not fine is for the NIST reference clock to itself fail to stay synchronized with the atomic clocks which create the standard. That's what I'm finding shocking (although I suspect that old school NIST engineers are embarrassed and already on the case).

Comment Re:What could possibly go wrong? (Score 4, Informative) 216

lol. I like how you claim OO inheritance is an anti-pattern. Do you know what an anti-pattern is? It's a recommendation by some self-appointed experts that others not do certain things, because it might be too difficult for new students to learn how to do them correctly. A race to the lowest common denominator.

This is how we ended up with languages like Java that separated interface and object definitions; C# then copied Java. Why would anyone even do that? Because they wanted to fix perceived flaws in C++ that they didn't think students would be able to grasp, and Sun wanted to corner the next generation of programmers. Other examples include removing pointers from the language and (originally) peppering the libraries with hidden mutexes because concurrency is hard.

Fast forward to today, and ask yourself why Windows 11's file manager is so dog slow. It's not a new technology concept. It's actually slower than the file manager in MS-DOS 4, which ran on a PC easily a million times slower than yours is today. What gives, Microsoft?

You're a member of a cult of safety zealots without merit. You think that safety is the most important feature a language can have, and you're willing to sacrifice expressivity for it. Then you attack other languages that make different choices, and you want everybody to play by your rules, demanding that they henceforth contort their logic until it fits your safety-first nonsense. Good luck with that. I'll take performance, performance, performance, any day of the week.

Comment Re:Cause and Effect. (Score 2) 54

I am actually shocked. As the article claims, it's not the clocks themselves that were compromised; it's the network outage that caused the problem.

That makes sense, because the time protocol requires estimating routing delays, so when the outage happened there may have been a change to the estimated delays in the network. And that seems frankly ridiculous, if true, since there's no reason the network relaying the atomic clock readings should be this easy to perturb on an ordinary operating day.

So the fact that the reference clock is off by this much suggests either a flaw in the protocol (trying to do something that doesn't make sense, like continuously re-estimating the delays through another route) or a flaw in the system design (does the computer that averages the 16 atomic clock ticks actually have an atomic clock itself? It would be stupid to fall back on a cheap commercial crystal oscillator in case of a complete outage).
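For reference, the delay estimation being discussed works roughly like the standard NTP on-wire calculation (RFC 5905 notation; the timestamps below are made-up illustrative values). The offset estimate assumes the path delay is symmetric in both directions, which is exactly the assumption an outage-induced reroute can break:

```python
# Sketch of the standard NTP offset/delay estimate.
# t0 = client send, t1 = server receive, t2 = server send, t3 = client receive.
# The offset formula assumes symmetric path delay; if an outage reroutes only
# one direction, half the asymmetry shows up as a bogus clock offset.

def ntp_offset_delay(t0: float, t1: float, t2: float, t3: float):
    """Return (clock offset, round-trip delay) per the NTPv4 on-wire spec."""
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Symmetric 5 ms path, clocks in perfect agreement:
# send at 0.000, server receives at 0.005, replies at 0.006, reply lands
# at 0.011 -> offset 0, round-trip delay 10 ms.
offset, delay = ntp_offset_delay(0.000, 0.005, 0.006, 0.011)
```

Run the same numbers with one leg rerouted to take longer than the other and a spurious nonzero offset falls out, which is one plausible mechanism for the kind of error being described.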

4 microseconds is an eternity in some applications. The idea that it is so easy to tamper with the reference time by messing with the network is unacceptable. NIST has some work to do here.

Comment Re: Elephant in the Room (Score 1) 40

Vendor swag (and especially company swag) can be great, but it has a shelf life due to the branding. It's pretty embarrassing to sport a T-shirt from a former employer at your new employer (duh!).

What would be cool is a site with tips and tricks for removing branding of different kinds, so you can keep the swag without the stigma and the shelf-life hit.

E.g., how do you remove logos from fabric without damage? Easy enough when a logo is just sewn on, but some printing methods are more stubborn than others and probably require chemicals.
