Comment Re:Nice work ruining it... (Score 1) 94

I hope I'm wrong; but my concern is that MS' decision here might end up being a worst-of-both-worlds outcome:

Devices that are restricted to type-C by mechanical constraints that require the smaller connector have a greater incentive to just skimp on ports; while devices big enough for type-A now have a greater incentive to retain mixed ports, because type-C ports now mandate further costs on top of the slightly more expensive connector. If you want to give someone a place to plug in a mouse; poster child of the 'even USB 1.1 was overqualified for this' school of peripherals; you'll either be keeping type A around or running DP, or DP and PCIe, to that port. Fantastic.

Comment Re:Nice work ruining it... (Score 1) 94

I specifically mentioned that case ("You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements"); and noted it as undesirable because it encourages the perpetuation of dongle hell. I'd certainly rather have a type A than no USB port (and, at least for now, I've still got enough type A devices that the port would be actively useful; but that may or may not be true forever; and is less likely to be true for 'want to pack efficiently for travel' cases than for 'at the desk that has my giant tech junk drawer' cases).

As for the controller chip; that's a matter of...mixed...truth with USB-C. The USB part of the port will run from the USB controller, or an internal hub; but any alt-mode behavior (like DP support) is related to the USB controller only in the sense that there's a standardized way for it to surrender most of the high-speed differential pairs to the alternate-mode signal. Actually running DP from the GPU to the port is a separate problem. For power delivery; I assume that at least some controllers will implement the negotiation for you (since it's mandatory even for devices that will neither request nor provide more than a relative pittance at 5v); but there is absolutely going to be a per-port cost difference, in terms of the support components and size of traces, between a port that is expecting to provide an amp, maybe 2, of +5v to peripherals and a port that is expecting to take a hundred watts at 20v and feed it to the power input for the entire device.
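
Back of the envelope, using just the figures in that last sentence (illustrative arithmetic, not values pulled from the USB PD spec):

    # Rough per-port power envelopes, using only the numbers mentioned above;
    # illustrative arithmetic, not anything taken from the USB PD spec.
    peripheral_port_w = 5 * 2    # ~5 V at up to 2 A sourced to a mouse or flash drive
    charging_port_w = 20 * 5     # ~100 W at 20 V sunk to run/charge the whole device
    print(peripheral_port_w, charging_port_w, charging_port_w / peripheral_port_w)
    # -> 10 100 10.0: an order of magnitude more power for the traces and
    #    support components around that port to be designed around.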

Comment Nice work ruining it... (Score 5, Insightful) 94

This seems both loaded with perverse incentives and like it doesn't even necessarily solve the problem that it claims to solve.

Most obviously, MS is saying that if a type-C port doesn't support a display and device charging, it's forbidden. So it's mandatory for all type-C ports to include the expense of power delivery circuitry capable of handling your device's potential load, and either a dedicated video out or DP switching between type-C ports if there are more ports than there are heads on the GPU. You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements. Further; if a system supports 40Gbps USB4, all its ports are required to do so; including higher peripheral power limits, PCIe tunneling, and TB3 compatibility. You think it might be nice to have a port to plug flash drives into without allocating 4 PCIe lanes? Screw you, I guess.

Then there's what the alleged confusion reduction doesn't actually specify: USB3 systems are only required to support 'minimum 1' display. They need to have the supporting circuitry to handle that one display being on any port; but just ignoring a second DP alt mode device that gets connected is fine; no further requirements. Data rates of 5, 10, or 20Gbps and accessory power of either greater than 4.5W or 7.5W are also fine (except that 20Gbps ports must exceed 7.5W). USB4 systems have higher minimum requirements, two 4K displays and 15W of power, but are similarly allowed to mingle 40 and 80Gbps; and it's entirely allowed for some systems to stop at 2 displays and some to support more, so long as the displays that are supported can be plugged in anywhere.
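
To make the shape of those rules concrete, here's a rough sketch that encodes the minimums exactly as I've paraphrased them above; the names and thresholds are my reading of the reporting, not the actual WHCP documents, so treat it as illustrative only:

    # Sketch of the per-port minimums as paraphrased above; NOT the actual
    # WHCP requirement documents. Field names and thresholds are illustrative.
    from dataclasses import dataclass

    @dataclass
    class TypeCPort:
        gbps: int             # 5, 10, 20, 40, or 80
        power_out_w: float    # accessory power the port can source
        drives_display: bool  # can a display be plugged in here?
        charges_device: bool  # can the device be charged through here?

    def port_problems(port: TypeCPort, usb4_system: bool) -> list:
        problems = []
        # Every type-C port has to do data, charging, and display duty.
        if not port.charges_device:
            problems.append("cannot charge the device")
        if not port.drives_display:
            problems.append("cannot drive a display")
        if usb4_system:
            if port.power_out_w < 15:
                problems.append("USB4 port below the 15 W accessory minimum")
        else:
            # USB3 systems: accessory power must exceed 4.5 W, or 7.5 W for 20 Gbps ports.
            floor = 7.5 if port.gbps >= 20 else 4.5
            if port.power_out_w <= floor:
                problems.append(f"accessory power must exceed {floor} W")
        return problems

    # The cheap 'just plug a mouse in' port I wish were still allowed:
    mouse_port = TypeCPort(gbps=5, power_out_w=4.5, drives_display=False, charges_device=False)
    print(port_problems(mouse_port, usb4_system=False))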

Obviously the tendency to do type-C ports that are totally unlabeled, or labeled with a teeny cryptic symbol, was not unduly helpful; but this seems like taking what could have been a fairly simple distinction (like the one that existed all the way back in the firewire/USB 1.1 days, or on thunderbolt/USB systems, or slightly more informally on non-Intel systems without thunderbolt), between "the fast port that does the things" and "the cheap port that is in ample supply"; and 'reducing confusion' by just banning the cheap port that is in ample supply (unless it's type A, for space consumption and to prevent connector standardization).

Are you really telling me that there wasn't something you could come up with to just tell the user which ones are power/video/PCIe and which ones are normal random accessory USB ports? I hope you like docking stations; because it seems like there will be a lot of those in our future.

Comment Not strictly a bet on the tech... (Score 1) 97

It seems mistaken to just blithely assume that technology will obviously just progress harder until a solution is reached.

When you talk about simulating something you are expressing an opinion on how much power you'll have to throw at the problem; but, more fundamentally, you are expressing optimism about the existence of a model of the system that delivers useful savings over the actual system without too much violence to the outcome.

Sometimes this is true and you can achieve downright ludicrous savings by just introducing a few empirically derived coefficients in place of interactions you are not prepared to simulate and still get viable results. In other cases either the system of interest is less helpful or your needs for precision are higher and you find that not only are rough approximations wildly wrong; but the cost of each attempt to move the model closer to the system goes up, sometimes dramatically.

We have no particular mystical reason for assuming that the brain will be a worst-case scenario where a model of acceptable accuracy ends up just being a precise copy; but we also have no particularly strong reason for optimism about comparatively well-behaved activation functions clearly being good enough and there being no risk of having to do the computational chemistry of an entire synapse(or all of them).

There's a further complication if you are specifically catering to the 'apparently being smart enough to make lots of money on payment processing or banner ads or something doesn't keep you from feeling death breathing down your neck, does it?' audience: we know vastly less about simulating a particular person than the little we know about constructing things that have some properties that resemble humans in the aggregate under certain cases; and the people huffing Kurzweil and imagining digital immortality are probably going to want a particular person; not just a chatbot whose output is a solid statistical match for the sort of things they would have said.

Comment People misunderstand friction... (Score 1) 47

I suspect that the misunderstanding is an old one; but 'AI' tools really bring into stark relief how bad people are at distinguishing between genuine friction, inefficiency because parts of the system are rubbing against one another in undesired ways, and 'friction' the noble phenomenon that improves the signal-to-noise ratio by making noise just inconvenient enough to generate that you usually only bother after you've already thought about the problem for a minute on your own.

It's the difference between being able to tap a colleague when you've been puzzling over a problem and need a fresh pair of eyes and That One Guy whose first reflex in the event of the slightest sensation of uncertainty is to poke you over the cubicle divider to ask a trivial question. The former is how collaboration happens; the latter was never taught to self-soothe as an infant.

You see the same thing at work in the 'general'/'office productivity' pitches for 'AI' tools: the "hey copilot; please make a slide deck about Project XYZ"/"turn these bullet points into an email that makes it sound like I worked real hard on the email". In an absolutely ideal world it's theoretically a good thing if I don't have to spend time combing over all the Project XYZ points in order to fuck around in PowerPoint; but in the real world, having to sacrifice some of my own time for every minute of an entire meeting's worth of other people's time that I'm about to consume is a valuable alignment of incentives: if vaguely plausible faff is free and unlimited, it's only my good taste, or the patience of someone who outranks me enough to tell me that I'm done now, that keeps an entire meeting from expanding to fill the available space. If I have to do a little work to create it, my own desire not to munge at slide decks also protects you.

(The "AI" bros, of course, without the slightest hint of irony or self awareness, will, on the next breath, turn around and pitch a 'summarization' tool to go along with their 'generation' tool; so that I can inflate a modest supply of things actually worth talking about into a torrent of shit; then you can 'summarize' the torrent of shit back into something that hopefully matches the modest supply of things I actually needed to talk about; and we can play the most computationally expensive game of telephone in human history.)

Comment Eat shit because it's cheaper. (Score 2) 165

What seems particularly depressing about these stories of 'replacement' is that they aren't really about replacements; they're about inferior substitutions people think that they can get away with(and, unfortunately, may be correct).

Even if 'AI' were, in fact, a human-or-better replacement for humans there would obviously be a teensy little social problem implied by the relatively abrupt breakdown of the notion that people who possess useful skills and are willing to apply them diligently can be economic participants in ways that make their lives at least endurable; but it wouldn't necessarily be a problem for the optimistic theory that the incentives generally align to encourage quality. Sure, most of the people theorizing that implicitly assumed that humans would be doing the better or more innovative work; but the thesis didn't require that.

What we are getting is worse. The disruption is being drawn out a bit, because 'AI' is not in fact generally fit for purpose; but the incentives have turned toward delivering shit. 'Creative' is an obvious target because that's the designation for a swath of jobs where quality is understood to exist but there aren't really rigid failure states: anyone who thinks that lorem ipsum and literature are interchangeable, or that there's nothing worth doing in graphic design once you've identified somewhere between 2 and 4 colors that the human eye can distinguish from one another is abjectly beneath human culture(and I don't mean that in the 'High Art' snob sense: don't even try to tell me that all shlocky summer blockbusters are equally entertaining; or that no billboards differ meaningfully; or that some social media shitposters aren't more fun to read than others); but it's not like the CMS will throw an error if you insert a regurgitated press release where journalism was supposed to go; or sack the writer who is actually passionate about the subject and have the intern plagiarize a viral listicle instead.

The whole enterprise is really a sordid revelation less of what 'AI' can do than of the degree to which people were really just hoping for an excuse to get away with less and worse; and of the ongoing trend of societies feeling relentlessly poorer and more fixated on scarcity even as their GDPs allegedly just keep going up and economic statistics assure us that productivity metrics look amazing.

Just tell me that it's not fucking bullshit that a generation ago any city of nontrivial size had several newspapers, all with enough staff to actually fill a 'newsroom' that was probably a literal place at the time; and even podunk towns often had one with a few plucky wearers of multiple hats; and now we've got bot slop. In inflation-adjusted dollars the GDP per capita has just slightly less than doubled since 1985; and journalists and editors are both relatively cheap for what they do and produce something that can be copied across a subscriber base of almost any size at close to zero marginal cost.

This is getting TL;DR; but fuck it, it's honestly profoundly depressing: we are all, constantly, being made to cosplay a vastly poorer society (except on the specific occasions when it's time to justify the order of things; in which case look at what big TVs you can buy!) despite the numbers allegedly saying that we are richer than ever. 'AI' is a new and exceptionally versatile tool for continuing this trend; but you see it everywhere; both in terms of what just gets done and in terms of the arguments that get made: why is it harder to get news made by journalists when the metro area being served is ~50% more populous and a trifle under twice as wealthy, per capita, as it was back in the day? What do you mean that's what has happened to housing affordability, and even the nominally-luxurious 'McMansions' are all plastic plumbing and sawdust-and-formaldehyde pseudowood in places they think it won't be noticed? What do you mean tenure-track faculty positions are being slashed in favor of adjuncts who could earn more as McDonalds shift managers; but somehow the degree they teach courses for still costs vastly more? I can understand that cutting-edge monoclonal recombinant antibodies or something are not going to be cheap; but how did we go from seeing a doctor to receive medical care to "ooh, are you sure you can't make do with someone cheaper at 'urgent care'?" when it's just going to be some sutures and antibiotics that have been off-patent for decades (and which have been offshored for savings); but I'm not 100% sure if it's just soft tissue damage or whether there's any tendon involvement; and ruling out embedded foreign objects would be nice?

It's really just dizzying how relentlessly we are expected to see downward substitution and 'austerity' as normal, outside of some specific cases involving transistors and corn syrup, despite the numbers theoretically being so favorable. It's almost like the correlation between productivity and income was severed decades ago and we're all just watching the punchline to Milton Friedman's joke land on us.

Comment 'Breakthrough'? (Score 1) 27

An xbox guy seems like a weird choice if you want 'breakthrough'. There is, absolutely, a lot of engineering that goes into a successful console; but it's heavily skewed toward the value engineering required to deliver consoles at prices below commodity PCs, and ideally console refreshes at lower prices or higher margins than launch consoles; with Microsoft perhaps the most orthodox and least successful at getting novelty to stick of any of the current players.

It's not like the original "we want a console, so we're going to get some aggressive quotes on PC parts and reuse NT and DirectX components" was a stupid plan or anything; it worked just fine and minimized risk and wheel-reinventing (though at the cost of being harder to cost-reduce); but it was an exceptionally orthodox take on the problem.

The 360 was similarly very sensible; Microsoft basically snagged the 'normal' half of the weirder and more ambitious "Cell" processor project for a PPC CPU to pair with a lightly customized ATI GPU design. But when it came to the only novel part of the setup, it seems fair to say that customers were deeply uninterested in how MS thought 'Kinect' should be totally revolutionary; MS was, until fairly late, almost weirdly hostile to the people who actually were excited about Kinect; and then Apple bought the guys behind that one to go do facial recognition in cellphones. Nice that NT is still capable of moving between architectures; but largely a success in the areas that attempted to be unobtrusive and a failure in the ones where it attempted to be novel; with some, but not enough, deliberate avoidance of novelty.

The Xbox One followed in a somewhat similar vein: an even more conservative CPU/GPU choice with a straight AMD x86; everyone was still indifferent or hostile to the Kinect, except the enthusiasts of it that MS was indifferent or hostile to; and the HDMI input and 'will totally be the center of your connected living room' thing landed with a thud, to the degree it was even clearly articulated, and was more or less rapidly forgotten.

I'd absolutely see the value of console-type engineering expertise if you are hoping to do consumer hardware, since it's not going to matter how cool it is if it costs too much; but the history of 'xbox' as a brand and series of products seems like the opposite of 'breakthrough'. Whenever it was focused on doing straight transfer of MS game-related platform to a console context things went just fine; whenever somebody tried something cute or novel things went poorly.

Comment Luckily perverse incentives do not exist! (Score 2) 61

It sure is a good thing that the only reason we overload people is because we just don't know how to accurately measure load; and that nobody would be primarily interested in knowing which of the human resources the objective science machine says could use a bit more pressure.

Also, starting next pay period we're baselining compensation to full-load employees; with time below 100% paid at a correspondingly lower percentage. Any questions?

Comment Re:Yay (Score 1) 91

I don't fundamentally disagree with you; I was just making a narrow reply to the "Hopefully what an update can do is restricted" comment, since the statement that it will support MSIs with custom actions tells us, even without further documentation, that there will not be such restrictions.

That's not great; but when the status quo is a small zoo of shit updaters (plus some slightly scary attempts to allow controlled amounts of escalation of privilege so the updaters work for people running as non-admin even for system installs; with periodic discoveries that there's a way to sneak arbitrary software in, in place of updates, or coax the updater into overwriting stuff for you) downloading and running things that have no limits on what they can do, a move to a hopefully saner mechanism that downloads and runs things without limits on what they can do isn't necessarily a move in the wrong direction.

Comment Re:Possibly positive (Score 1) 91

The stuff that specifically tries to avoid needing admin rights can be a massive pain in its own way: just dump a copy in the user's profile, no special privileges needed, great. Ok, now that shared computer has a dozen different versions; which ones are just from old profiles and which ones are potentially signs of a broken update mechanism and in need of remediation? Also, since the update only happens in the user's context it can't happen when they aren't logged in; which means that the user either gets a face full of updaters bogging things down on login or the updaters 'politely' wait a while and they spend some time, potentially hours, running unpatched versions of the per-profile stuff.

It's not even 3rd party pieces in some cases. For reasons that I still don't understand, but am bitter about, Microsoft decided that the correct "system wide" installer for Teams would be an MSI that caches a then-current version of Teams in Program Files and adds a hook that, at each sign-in, copies it into the user's profile if not already present; at which point that copy is responsible for updating itself (in addition to whatever WU was doing and what the Office click-to-run service did). Truly a product that got an installer worthy of it.
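
(For what it's worth, this is the kind of audit script you end up writing for any per-profile app on a shared machine; the app path and version file below are placeholders you'd swap for whatever the app in question actually uses:)

    # Sketch of a per-profile install audit for a shared Windows machine.
    # The app path and version file are hypothetical placeholders; adapt them
    # to whatever per-user-installed app you're actually chasing.
    import json
    from pathlib import Path

    PROFILE_ROOT = Path(r"C:\Users")
    APP_RELATIVE = Path("AppData/Local/ExampleVendor/ExampleApp")  # hypothetical
    VERSION_FILE = "version.json"                                  # hypothetical

    def installed_copies():
        """Yield (username, version or None) for each profile holding a copy."""
        for profile in PROFILE_ROOT.iterdir():
            app_dir = profile / APP_RELATIVE
            if not app_dir.is_dir():
                continue
            version = None
            meta = app_dir / VERSION_FILE
            if meta.is_file():
                try:
                    version = json.loads(meta.read_text()).get("version")
                except (OSError, json.JSONDecodeError):
                    pass
            yield profile.name, version

    if __name__ == "__main__":
        for user, version in installed_copies():
            print(f"{user}: {version or 'unknown (stale profile or broken updater?)'}")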

Comment Re:Yay (Score 1) 91

TFA mentions that "Win32 apps that include custom installation logic" are supported. Unless that is...selectively true...and being overblown as part of release hype; it essentially means that what updates can do won't be restricted. MSI custom actions are basically whatever executable you want, executed at the (normally high) privilege level of the MSI install.

There are probably some legitimate use cases that simply can't be handled within the more tightly specified MSI logic (and custom actions were absolutely a boon to swift-but-shallow adoption of MSI, by making it possible for vendors to 'switch' just by shoving their legacy installer into an MSI and running it as a custom action; boom, modernized!); but 'win32 with custom installation logic' is essentially synonymous with 'MSI that has a "run arbitrary binary with full privileges" step'.
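
If you're curious what a particular MSI would actually be allowed to run, something along these lines will dump its CustomAction table; just a sketch, leaning on the stdlib msilib module, which is Windows-only and has been dropped from recent Python releases:

    # Sketch: list the CustomAction table of an MSI so you can see what an
    # 'update' is really entitled to execute. msilib is Windows-only and gone
    # from newer Pythons, so treat this as illustrative rather than a tool.
    import sys
    import msilib

    def dump_custom_actions(msi_path):
        db = msilib.OpenDatabase(msi_path, msilib.MSIDBOPEN_READONLY)
        try:
            view = db.OpenView(
                "SELECT `Action`, `Type`, `Source`, `Target` FROM `CustomAction`")
        except msilib.MSIError:
            print("no CustomAction table; nothing arbitrary being run here")
            return
        view.Execute(None)
        while True:
            try:
                rec = view.Fetch()
            except msilib.MSIError:
                break
            if rec is None:
                break
            # Type's low bits say whether the action is a DLL, an EXE, or a script;
            # Source/Target say where the payload lives and what gets handed to it.
            print(rec.GetString(1), rec.GetInteger(2), rec.GetString(3), rec.GetString(4))

    if __name__ == "__main__":
        dump_custom_actions(sys.argv[1])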

Not that the garbage 3rd party updater running as a service, or as a scheduled task set to run with SYSTEM privileges, is architecturally any safer; of course.

Comment Re:Possibly positive (Score 2) 91

At this point it seems like there's almost nothing to say except that it depends on the details of the implementation.

A zillion ad-hoc update agents(more than a few of them clearly written in some haste with naïve assumptions about things like actually checking signatures properly; and basically all having their own distinct mechanisms for scheduling and scripting and the like) is fairly clearly the awful way to handle things, though an obvious organic development.

However, while the OS is sort of the logical candidate for providing that kind of useful shared feature, just as it does various libraries and abstraction layers, I'd be somewhat concerned both by Windows Update's past (of being perennially unreliable, with the "stop wuauserv, cryptSvc, bits, and msiserver; rename SoftwareDistribution and catroot2; then restart those services and hope you won't have to rip them out and re-register them" dance never really going away); and by its future (where WSUS and similar local management tools are on life support and the constant push is to Intune-driven update orchestration, available with appropriate licenses, for clients; and "move it to Azure/Azure Arc connected servers" on the server side; basically making all but fairly trivial single-machine update management a thing where you are increasingly strongly encouraged to hand Redmond your system config and pay them for the privilege). FFS, the 'PSWindowsUpdate' module is a 3rd party add-on, rather than the of-course-a-core-component-gets-a-PowerShell-interface default.
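
(For reference, the 'dance' in question is roughly the following; a sketch of the folklore procedure from an elevated prompt, not anything officially blessed:)

    # The Windows Update 'reset dance' described above, sketched with stdlib
    # subprocess calls. Run elevated; this is folklore, not an official repair tool.
    import subprocess
    from datetime import datetime
    from pathlib import Path

    SERVICES = ["wuauserv", "cryptSvc", "bits", "msiserver"]
    FOLDERS = [Path(r"C:\Windows\SoftwareDistribution"),
               Path(r"C:\Windows\System32\catroot2")]

    def reset_windows_update():
        stamp = datetime.now().strftime("%Y%m%d%H%M%S")
        for svc in SERVICES:
            subprocess.run(["net", "stop", svc], check=False)  # may already be stopped
        for folder in FOLDERS:
            if folder.exists():
                # Rename rather than delete, so there's something to dig through later;
                # this can still fail if something kept a handle open.
                folder.rename(folder.with_name(folder.name + ".old." + stamp))
        for svc in SERVICES:
            subprocess.run(["net", "start", svc], check=False)

    if __name__ == "__main__":
        reset_windows_update()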

The 'Windows Store' has not increased my affection for the idea. I can, to some extent, sympathize with Windows Update having bad days when it is in charge of things like doing 'upgrades' that are essentially in-place reinstalls of the OS with an attempt to migrate settings; my sympathy runs dry when updating the system pack-in calculator or lightweight paint app falls over and dies with the same cryptic hex codes and BITS faffing we've been suffering since XP. Why can Valve, the video game people dealing with whatever dreck publishers push out, do it reliably when Redmond's Much Enterprise, Serious Servicing Stack, Wow can't, for random little pack-in applications that require writing a handful of megabytes to my user profile?

Comment Article could use context. (Score 1) 337

I have no particular reason to be optimistic about the actual implementation; but TFA, and definitely the summary, seem to be doing their level best to imply that this could not be anything but a perfidious dumbing-down scheme for the benefit of 'those people' who we all know 'equity' is for.

The proposed system is to have a high stakes test graded on a scale that appears to be designed to get better resolution out of the 0-100 range (rather than the setup where the bottom 60 points are "here be losers" and 'A' starts at 93). I had a number of courses like that in college and none of them justified it on 'equity' grounds. The professor gave their little talk about how, if you really wanted to skip lecture, it was your tuition, but if you learned enough by yourself to pass the test they weren't going to fail you for not spending enough time listening to them; and they administered an exam that was intended to stretch the abilities of even an "A" student and then either scaled or curved it.
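
(For anyone who never ran into the practice: 'scaled or curved' just means something like the toy rescale below; my own illustration, not anything from TFA or the proposed policy:)

    # Toy illustration of curving a deliberately hard exam: linearly rescale the
    # raw scores so the class median lands on a chosen target. Not from TFA.
    def curve(raw_scores, target_median=85, cap=100):
        ordered = sorted(raw_scores)
        n = len(ordered)
        median = ordered[n // 2] if n % 2 else (ordered[n // 2 - 1] + ordered[n // 2]) / 2
        factor = target_median / median
        return [min(cap, round(score * factor)) for score in raw_scores]

    # A hard exam with a raw median of 62 still yields a sane grade distribution:
    print(curve([38, 55, 62, 71, 88]))  # -> [52, 75, 85, 97, 100]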

Obviously, you could also massively water down the graduation requirement with the same basic system(since now truancy and not doing assignments can be ignored; and you can make the exam easy enough that anything but a 95+ is basically trash); but it's overtly dishonest to pretend that you can tell, just from the proposed system, that it is a trivialization.

Personally; I am both sympathetic and concerned about the 'no attendance, no homework' concept. When I had the courses like that in college (when I was, naturally, somewhat more mature than in HS) the professors generally said something to the effect of "you don't lose points for not doing these assignments; but people who don't do these assignments have a strange pattern of not gaining points on the final..." and they were not kidding. There were some geniuses who really should have been in higher-level classes who skipped the assignments and did fine anyway; but they were outliers. Dropping the requirement for high schoolers will probably avoid force-failing some students with really dysfunctional home situations or jobs with lousy hours or the like; but I would not be at all surprised if there are students who are only attending because attendance counts; but who need the instructional time.
