Comment Re: Lol (Score 1) 44

I hope that you are incorrect, because they've been doing some pretty solid work in the area (albeit still at the stage where you are betting on them continuing to do driver support); but moves like "don't be a total dick about RAM allocations like Nvidia" definitely seem like they could end up on the chopping block if a myopic spreadsheet cruncher gets a look at them.

Comment The sort of 'progress' that makes you more nervous (Score 2) 42

"Some of our early testing with the components we've turned off in Windows, we get about 2GB of memory going back to the games while running in the full-screen experience." is one of those sentences where they guy specifically working on this project sounds like he has done his job; but it really makes you wonder what the hell MS is thinking with the standard setup. It's not like the win11 shell is especially compelling or feature rich; and games' expectations of the platform are weird and varied enough that this wouldn't work if it were some kind of 'disable legacy jank' mode.

Comment Re:Lol (Score 3, Insightful) 44

It seems like a potentially bigger threat to any adventures they want to try outside of their 'core' products.

If I'm just buying a CPU from them, that's a fairly low-risk bet. Very mature compiler target; more or less a known quantity once benchmarks are available, barring the discovery of some issue serious enough to be RMA material. Even if they decide to quit on that specific flavor of CPU, the ones I have and the remaining stock should continue to just work until nobody really cares anymore.

If it's something that requires a lot more ongoing investment, though, like targeting Intel for GPU compute or one of the fancy NICs with a lot of vendor-specific offload onboard, I'm going to have a bad day if my effort is only applicable for one generation because there's no follow-up product; and a really bad day if something goes from 'a little rough but being rapidly polished' to 'life support'.

Even back when they made money, Intel never had a great track record for some of that stuff; they've always got something goofy going on on the client side that they lose interest in relatively quickly; like that time when Optane cache was totally going to be awesome; or the more recent abandonment of 'Deep Link' technology that was supposed to do some sort of better cooperation between integrated and discrete GPUs; but that stuff is more on the fluff side so it hasn't really mattered.

Comment Re:Dumb laws (Score -1) 162

Nonsense. The very act of driving does not mean anything; you bought into this nonsense hook, line and sinker. Government shouldn't have any say over how we live our lives, how we drive on our streets, what apps we use, who we trade with.

The one single, maybe half-reasonable thing for government to do is protection against foreign attacks; that's it.

Comment Re:Dumb laws (Score -1) 162

You are absolutely correct. It is disgusting to see so many people supporting any government intervention in our lives, and this is just more of the same. People are supposed to be the automatons the government wants them to be so that they can be fined more and more often, and controlled harder and harsher for normal human behavior, which is under continuous attack by the goons who hold power and occupy government positions.

Comment Re:Nice work ruining it... (Score 1) 97

I hope I'm wrong; but my concern is that MS' decision here might end up being a worst-of-both-worlds outcome:

Devices that are restricted to type-C by mechanical constraints that require the smaller connector have a greater incentive to just skimp on ports; while devices big enough for type-As now have a greater incentive to retain mixed ports, because type-Cs now mandate further costs on top of the slightly more expensive connector. If you want to give someone a place to plug in a mouse (poster child of the 'even USB 1.1 was overqualified for this' school of peripherals), you'll either be keeping type A around or running DP, or DP and PCIe, to that port. Fantastic.

Comment Re:Nice work ruining it... (Score 1) 97

I specifically mentioned that case: "You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements"; and noted it as undesirable because it encourages the perpetuation of dongle hell. I'd certainly rather have a type A than no USB port (and, at least for now, I've still got enough type A devices that the port would be actively useful; but that may or may not be true forever; and is less likely to be true for 'want to pack efficiently for travel' cases than for 'at the desk that has my giant tech junk drawer' cases).

As for the controller chip; that's a matter of...mixed...truth with USB-C. The USB part of the port will run from the USB controller, or an internal hub; but any AUX behavior (like DP support) is related to the USB controller only in the sense that there's a standardized way for it to surrender most of the high-speed differential pairs for the use of the AUX signal. Actually running DP from the GPU to the port is a separate problem. For power delivery, I assume that at least some controllers will implement the negotiation for you (since it's mandatory even for devices that will neither request nor provide more than a relative pittance at 5V); but there is absolutely going to be a per-port cost difference, in terms of the support components and the size of traces, between a port that is expecting to provide an amp, maybe 2, of +5V to peripherals and a port that is expecting to take a hundred watts at 20V and feed it to the power input for the entire device.
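
(To make the "negotiation is cheap, the copper isn't" point concrete, here's a rough Python sketch of what a source actually advertises during that mandatory PD negotiation. The field layout is my reading of the spec's fixed-supply PDO format with all the flag bits left at zero, and the helper name is made up; treat it as illustrative, not anything a real controller ships.)

    # Illustrative only: pack USB PD "fixed supply" source capability words (PDOs).
    # Per the fixed-supply layout: bits 31-30 = 00 (fixed supply), bits 19-10 =
    # voltage in 50 mV units, bits 9-0 = max current in 10 mA units.
    # All the capability flag bits are left at zero for simplicity.
    def fixed_supply_pdo(volts: float, amps: float) -> int:
        voltage_50mv = round(volts * 1000 / 50)   # 50 mV units
        current_10ma = round(amps * 1000 / 10)    # 10 mA units
        return (voltage_50mv << 10) | current_10ma

    # A modest accessory port: a single 5 V / 1.5 A capability (7.5 W).
    cheap_port = [fixed_supply_pdo(5, 1.5)]

    # A 100 W charging-input port speaks the exact same protocol; the expensive
    # part is the VBUS path and support components, not the negotiation itself.
    big_port = [fixed_supply_pdo(5, 3), fixed_supply_pdo(9, 3),
                fixed_supply_pdo(15, 3), fixed_supply_pdo(20, 5)]

    print([f"0x{pdo:08X}" for pdo in big_port])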

Comment Nice work ruining it... (Score 5, Insightful) 97

This seems both loaded with perverse incentives and like it doesn't even necessarily solve the problem that it claims to solve.

Most obviously, MS is saying that if a port doesn't support a display and device charging it's forbidden. So it's mandatory for all type-C ports to include the expense of power delivery circuitry capable of handling your device's potential load, and either a dedicated video out or DP switching between type-C ports if there are more ports than there are heads on the GPU. You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements. Further, if a system supports 40Gbps USB4, all its ports are required to do so; including higher peripheral power limits, PCIe tunneling, and TB3 compatibility. You think it might be nice to have a port to plug flash drives into without allocating 4 PCIe lanes? Screw you, I guess.

Then there's what the alleged confusion reduction doesn't actually specify: USB3 systems are only required to support 'minimum 1' displays. They need to have the supporting circuitry to handle that one display being on any port; but just ignoring a second DP alt mode device that gets connected is fine; no further requirements. Data rates of 5, 10, or 20Gbps and accessory power supply of either greater than 4.5W or greater than 7.5W are also fine (except that 20Gbps ports must be greater than 7.5W); USB4 systems have higher minimum requirements, two 4K displays and 15W of power, but are similarly allowed to mingle 40 and 80Gbps; and it's entirely allowed for some systems to stop at 2 displays and some to support more; so long as the displays that are supported can be plugged in anywhere.
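
(For my own sanity, here's a toy Python checker built purely from the requirements as I've paraphrased them in the last two paragraphs, not from the actual WHCP text; the port fields and names are invented for the example and the display-count minimums are left out. It at least makes the shape of the rules, and the "cheap port is forbidden" consequence, mechanical.)

    # Toy model of the rules as paraphrased above -- NOT the actual WHCP language.
    # Fields and thresholds reflect my summary; display-count minimums are omitted.
    from dataclasses import dataclass

    @dataclass
    class Port:
        gbps: int           # 5, 10, 20 (USB3) or 40, 80 (USB4)
        usb4: bool
        charging: bool      # supports device charging
        display: bool       # supports DP alt mode out
        watts_out: float    # accessory power budget

    def port_problems(p: Port) -> list[str]:
        out = []
        if not (p.charging and p.display):
            out.append("every type-C port must support charging and a display")
        if p.usb4 and p.watts_out < 15:
            out.append("USB4 ports must supply at least 15 W")
        if not p.usb4:
            if p.gbps >= 20 and p.watts_out <= 7.5:
                out.append("20 Gbps ports must exceed 7.5 W")
            elif p.watts_out <= 4.5:
                out.append("USB3 ports must exceed 4.5 W")
        return out

    def system_problems(ports: list[Port]) -> list[str]:
        out = [f"port {i}: {m}" for i, p in enumerate(ports) for m in port_problems(p)]
        if any(p.usb4 and p.gbps >= 40 for p in ports) and not all(p.usb4 for p in ports):
            out.append("if any port is 40 Gbps USB4, every port must be USB4")
        return out

    # The machine I'd actually want -- one fast do-everything port plus one cheap
    # just-USB port -- fails on multiple counts, which is exactly the complaint:
    print(system_problems([
        Port(gbps=40, usb4=True, charging=True, display=True, watts_out=15),
        Port(gbps=5, usb4=False, charging=False, display=False, watts_out=4.5),
    ]))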

Obviously the tendency to do type-C ports that are just totally unlabeled, or labeled with a teeny cryptic symbol, was not unduly helpful; but this seems like taking what could have been a fairly simple distinction (like the one that existed all the way back in the FireWire/USB 1.1 days, or in Thunderbolt/USB systems, or slightly more informally on non-Intel systems without Thunderbolt), of "the fast port that does the things" and "the cheap port that is in ample supply"; and 'reducing confusion' by just banning the cheap port that is in ample supply (unless it's type A, for space consumption and to prevent connector standardization).

Are you really telling me that there wasn't something you could come up with to just tell the user which ones are power/video/PCIe and which ones are normal random accessory USB ports? I hope you like docking stations; because it seems like there will be a lot of those in our future.

Comment Not strictly a bet on the tech... (Score 1) 107

It seems mistaken to blithely assume that technology will obviously just progress harder until a solution is reached.

When you talk about simulating something you are expressing an opinion on how much power you'll have to throw at the problem; but, more fundamentally, you are expressing optimism about the existence of a model of the system that delivers useful savings over the actual system without too much violence to the outcome.

Sometimes this is true and you can achieve downright ludicrous savings by just introducing a few empirically derived coefficients in place of interactions you are not prepared to simulate and still get viable results. In other cases either the system of interest is less helpful or your needs for precision are higher and you find that not only are rough approximations wildly wrong; but the cost of each attempt to move the model closer to the system goes up, sometimes dramatically.

We have no particular mystical reason for assuming that the brain will be a worst-case scenario where a model of acceptable accuracy ends up just being a precise copy; but we also have no particularly strong reason for optimism about comparatively well-behaved activation functions clearly being good enough and there being no risk of having to do the computational chemistry of an entire synapse (or all of them).

There's a further complication if you are specifically catering to the 'apparently being smart enough to make lots of money on payment processing or banner ads or something doesn't keep you from feeling death breathing down your neck, does it?' audience: we know vastly less about simulating a particular person than the little we know about constructing things that have some properties that resemble humans in the aggregate under certain cases; and the people huffing Kurzweil and imagining digital immortality are probably going to want a particular person, not just a chatbot whose output is a solid statistical match for the sort of things they would have said.

Comment People misunderstand friction... (Score 1) 47

I suspect that the misunderstanding is an old one; but 'AI' tools really bring into stark relief how poor people are at distinguishing between genuine friction (inefficiency because parts of the system are rubbing against one another in undesired ways) and 'friction', the noble phenomenon that improves the signal-to-noise ratio by making noise just inconvenient enough that you usually only make it after you've already thought about it for a minute on your own.

It's the difference between being able to tap a colleague when you've been puzzling over a problem and need a fresh pair of eyes, and That One Guy whose first reflex in the event of the slightest sensation of uncertainty is to poke you over the cubicle divider to ask a trivial question. The former is how collaboration happens; the latter was never taught to self-soothe as an infant.

You see the same thing at work in the 'general'/'office productivity' pitches for 'AI' tools: the "hey Copilot, please make a slide deck about Project XYZ"/"turn these bullet points into an email that makes it sound like I worked real hard on the email". In an absolutely ideal world, it's theoretically a good thing if I don't have to spend time combing over all the Project XYZ points in order to fuck around in PowerPoint; but in the real world, having to sacrifice some amount of my time for each minute of an entire meeting's worth of people's time that I will sacrifice is a valuable alignment of incentives: if vaguely plausible faff is free and unlimited, it's only my good taste, or the patience of someone who outranks me enough to tell me that I'm done now, that keeps a meeting from expanding to fill the available space. If I have to do a little work to create it, my own desire to not munge at slide decks also protects you.

(The "AI" bros, of course, without the slightest hint of irony or self awareness, will, on the next breath, turn around and pitch a 'summarization' tool to go along with their 'generation' tool; so that I can inflate a modest supply of things actually worth talking about into a torrent of shit; then you can 'summarize' the torrent of shit back into something that hopefully matches the modest supply of things I actually needed to talk about; and we can play the most computationally expensive game of telephone in human history.)

Comment Eat shit because it's cheaper. (Score 3, Interesting) 167

What seems particularly depressing about these stories of 'replacement' is that they aren't really about replacements; they're about inferior substitutions people think they can get away with (and, unfortunately, they may be correct).

Even if 'AI' were, in fact, a human-or-better replacement for humans, there would obviously be a teensy little social problem implied by the relatively abrupt breakdown of the notion that people who possess useful skills and are willing to apply them diligently can be economic participants in ways that make their lives at least endurable; but it wouldn't necessarily be a problem for the optimistic theory that the incentives generally align to encourage quality. Sure, most of the people advancing that theory implicitly assumed that humans would be doing the better or more innovative work; but the thesis didn't require that.

What we are getting is worse. The disruption is being drawn out a bit, because 'AI' is not in fact generally fit for purpose; but the incentives have turned toward delivering shit. 'Creative' is an obvious target because that's the designation for a swath of jobs where quality is understood to exist but there aren't really rigid failure states: anyone who thinks that lorem ipsum and literature are interchangeable, or that there's nothing worth doing in graphic design once you've identified somewhere between 2 and 4 colors that the human eye can distinguish from one another, is abjectly beneath human culture (and I don't mean that in the 'High Art' snob sense: don't even try to tell me that all schlocky summer blockbusters are equally entertaining; or that no billboards differ meaningfully; or that some social media shitposters aren't more fun to read than others); but it's not like the CMS will throw an error if you insert a regurgitated press release where journalism was supposed to go, or if you sack the writer who is actually passionate about the subject and have the intern plagiarize a viral listicle instead.

The whole enterprise is really a sordid revelation less of what 'AI' can do than of the degree to which people were really just hoping for an excuse to get away with less and worse; and of the ongoing trend of societies feeling relentlessly poorer and more fixated on scarcity even as their GDPs allegedly just keep going up and economic statistics assure us that productivity metrics look amazing.

Just tell me that it's not fucking bullshit that a generation ago any city of nontrivial size had several newspapers, all with enough staff to actually fill a 'newsroom' that was probably a literal place at the time; and even podunk towns often had one with a few plucky wearers of multiple hats; and now we've got bot slop. In inflation-adjusted dollars, GDP per capita has slightly less than doubled since 1985; and journalists and editors are both relatively cheap for what they do and produce something that can be copied across a subscriber base of almost any size at close to zero marginal cost.

This is getting TL;DR; but fuck it, it's honestly profoundly depressing: we are all, constantly, being made to cosplay a vastly poorer society (except on the specific occasions when it's time to justify the order of things; in which case, look at what big TVs you can buy!) despite the numbers allegedly saying that we are richer than ever. 'AI' is a new and exceptionally versatile tool for continuing this trend; but you see it everywhere, both in terms of what just gets done and in terms of arguments that get made: why is it harder to get news made by journalists when the metro area being served is ~50% more populous and a trifle under twice as wealthy, per capita, as it was back in the day? What do you mean that's what has happened to housing affordability, and even the nominally-luxurious 'McMansions' are all plastic plumbing and sawdust-and-formaldehyde pseudowood in places they think it won't be noticed? What do you mean tenure-track faculty positions are being slashed in favor of adjuncts who could earn more as McDonald's shift managers, but somehow the degree they teach courses for still costs vastly more? I can understand that cutting-edge monoclonal recombinant antibodies or something are not going to be cheap; but how did we go from seeing a doctor to receive medical care to "ooh, are you sure you can't make do with someone cheaper at 'urgent care'?" when it's just going to be some sutures and antibiotics that have been off-patent for decades (and which have been offshored for savings); but I'm not 100% sure if it's just soft tissue damage or whether there's any tendon involvement; and ruling out embedded foreign objects would be nice?

It's really just dizzying how relentlessly we are expected to see downward substitution and 'austerity' as normal, outside of some specific cases involving transistors and corn syrup, despite the numbers theoretically being so favorable. It's almost like the correlation between productivity and income was severed decades ago and we're all just watching the punchline to Milton Friedman's joke land on us.
