
Comment Re:Nice work ruining it... (Score 1) 96

I hope I'm wrong; but my concern is that MS' decision here might end up being a worst-of-both-worlds outcome:

Devices that are restricted to type-C by mechanical constraints that require the smaller connector have a greater incentive to just skimp on ports; while devices big enough for type-A now have a greater incentive to retain mixed ports, because type-C ports now carry mandated extra costs on top of the slightly more expensive connector. If you want to give someone a place to plug in a mouse (poster child of the 'even USB 1.1 was overqualified for this' school of peripherals) you'll either be keeping type A around or running DP, or DP and PCIe, to that port. Fantastic.

Comment Re:Nice work ruining it... (Score 1) 96

I specifically mentioned that case: "You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements"; and noted it as undesirable because it encourages the perpetuation of dongle hell. I'd certainly rather have a type A than no USB port (and, at least for now, I've still got enough type A devices that the port would be actively useful; but that may or may not be true forever; and is less likely to be true for 'want to pack efficiently for travel' cases than for 'at the desk that has my giant tech junk drawer' cases).

As for the controller chip; that's a matter of... mixed... truth with USB-C. The USB part of the port will run from the USB controller, or an internal hub; but any alt-mode behavior (like DP support) is related to the USB controller only in the sense that there's a standardized way for it to surrender most of the high-speed differential pairs to the alternate-mode signal. Actually running DP from the GPU to the port is a separate problem. For power delivery, I assume that at least some controllers will implement the negotiation for you (since it's mandatory even for devices that will neither request nor provide more than a relative pittance at 5v); but there is absolutely going to be a per-port cost difference, in terms of the support components and size of traces, between a port that is expecting to provide an amp, maybe 2, of +5v to peripherals and a port that is expecting to take a hundred watts at 20v and feed it to the power input for the entire device.
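To put rough numbers on that last point, here's a minimal Python sketch; it's just power arithmetic, the figures are the ones from this comment rather than anything taken from the spec, and the function name is my own:

    # Current the port's traces and support parts must be sized to carry.
    def port_current_amps(watts: float, volts: float) -> float:
        return watts / volts

    print(port_current_amps(7.5, 5))   # 1.5 A -- mouse/flash-drive class port at +5 V
    print(port_current_amps(100, 20))  # 5.0 A -- 100 W Power Delivery input at 20 V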

Comment Nice work ruining it... (Score 5, Insightful) 96

This seems both loaded with perverse incentives and like it doesn't even necessarily solve the problem that it claims to solve.

Most obviously, MS is saying that if a type-C port doesn't support a display and device charging it's forbidden. So it's mandatory for all type-C ports to include the expense of power delivery circuitry capable of handling your device's potential load, and either a dedicated video out or DP switching between type-C ports if there are more ports than there are heads on the GPU. You want a cheap just-USB USB port? Either that's Type A so nobody can standardize on connectors; or it gets omitted to meet logo requirements. Further, if a system supports 40 Gbps USB4 all its ports are required to do so, including higher peripheral power limits, PCIe tunneling, and TB3 compatibility. You think it might be nice to have a port to plug flash drives into without allocating 4 PCIe lanes? Screw you, I guess.

Then there's what the alleged confusion reduction doesn't actually specify: USB3 systems are only required to support a 'minimum 1' display. They need to have the supporting circuitry to handle that one display being on any port; but just ignoring a second DP alt mode device connected is fine; no further requirements. Data rates of 5, 10, or 20 Gbps and accessory power of either greater than 4.5 W or greater than 7.5 W are also fine (except that 20 Gbps ports must be greater than 7.5 W). USB4 systems have higher minimum requirements, 2 4K displays and 15 W power, but are similarly allowed to mingle 40 and 80 Gbps; and it's entirely allowed for some systems to stop at 2 displays and some to support more, so long as the displays that are supported can be plugged in anywhere.
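For what it's worth, here are the tiers above as a rough Python-dict summary; the key names are my own shorthand and the values are just my paraphrase of the requirements as described, not official text:

    # My paraphrase of the per-tier minimums described above; names are not official.
    port_minimums = {
        "usb3_typec": {
            "allowed_data_rates_gbps": (5, 10, 20),  # may be mixed across ports
            "accessory_power_w": ">4.5 or >7.5 (20 Gbps ports must exceed 7.5)",
            "system_displays": 1,                    # at least one, on any port
            "device_charging": True,                 # every port must accept charging
        },
        "usb4": {
            "allowed_data_rates_gbps": (40, 80),     # may be mixed across ports
            "accessory_power_w": 15,
            "system_displays": 2,                    # two 4K displays
            "device_charging": True,
            "pcie_tunneling_and_tb3": True,          # required once a system does 40 Gbps USB4
        },
    }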

Obviously the tendency to do type-C ports that are just totally unlabeled, or labeled with a teeny cryptic symbol, was not exactly helpful; but this seems like taking what could have been a fairly simple distinction (like the one that existed all the way back in the firewire/USB 1.1 days, or in thunderbolt/USB systems, or slightly more informally on non-intel systems without thunderbolt), of "the fast port that does the things" and "the cheap port that is in ample supply"; and 'reducing confusion' by just banning the cheap port that is in ample supply (unless it's type A, for space consumption and to prevent connector standardization).

Are you really telling me that there wasn't something you could come up with to just tell the user which ones are power/video/PCIe and which ones are normal random accessory USB ports? I hope you like docking stations; because it seems like there will be a lot of those in our future.

Comment How else would Windows Hello work? (Score 2) 96

And does M$ think they can mandate what ports manufacturers put on their PCs?
I remember them saying that laptops had to have a camera.

This article claims that the camera requirement exists to support Windows Hello authentication. How would Microsoft's Windows Hello or Apple's Face ID work without a camera? Or what other means of quickly authenticating the user to the operating system and to the external passkey/password store would you recommend instead?

Comment It'd make low-end laptops more expensive (Score 1) 96

There are no downsides to this.

The only downside I can think of is that low-end Windows laptops could become a lot more expensive if they have to support display output and 40 Gbps on all ports. This could drive laptop makers toward an operating system with even more restricted functionality: ChromeOS.

Comment String length API unchanged? Facepalm. (Score 1) 97

unless there is a discovery in the calculation of the length of a string.

Incidentally, there was such a discovery. 'It's not wrong that "[facepalming man with brown skin emoji]".length = 7' by Henri Sivonen came out in September 2019. It explains the difference among code units, code points, and extended grapheme clusters, the difference among UTF-8, UTF-16, and UTF-32, the difference among JavaScript, Python 3, and Rust length semantics, and the difference among storage, display width, and arbitrary quotas that are roughly fair across languages.
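A quick way to see the same distinctions for yourself; a Python sketch of mine, with the third-party regex module standing in for extended-grapheme-cluster segmentation, and the exact numbers depending on which emoji sequence you pick:

    import regex  # pip install regex; used only to approximate grapheme cluster counting

    # Facepalming man with a skin-tone modifier: 5 code points joined into one glyph.
    s = "\U0001F926\U0001F3FE\u200D\u2642\uFE0F"

    print(len(s))                            # 5  -- code points (Python 3 semantics)
    print(len(s.encode("utf-16-le")) // 2)   # 7  -- UTF-16 code units (JavaScript .length)
    print(len(s.encode("utf-8")))            # 17 -- UTF-8 bytes (Rust str::len)
    print(len(regex.findall(r"\X", s)))      # 1  -- extended grapheme clusters (what a user sees)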

Comment Not strictly a bet on the tech... (Score 1) 101

It seems mistaken to just blithely assume that technology will obviously just progress harder until a solution is reached.

When you talk about simulating something you are expressing an opinion on how much power you'll have to throw at the problem; but, more fundamentally, you are expressing optimism about the existence of a model of the system that delivers useful savings over the actual system without too much violence to the outcome.

Sometimes this is true and you can achieve downright ludicrous savings by just introducing a few empirically derived coefficients in place of interactions you are not prepared to simulate and still get viable results. In other cases either the system of interest is less helpful or your needs for precision are higher and you find that not only are rough approximations wildly wrong; but the cost of each attempt to move the model closer to the system goes up, sometimes dramatically.
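(The classic example of that kind of shortcut, my illustration rather than anything brain-specific: a drag coefficient, one measured number standing in for an entire fluid-dynamics simulation.)

    # One fitted number (Cd) replaces the flow simulation you aren't prepared to run.
    def drag_force(rho: float, v: float, cd: float, area: float) -> float:
        """F = 0.5 * rho * v^2 * Cd * A"""
        return 0.5 * rho * v ** 2 * cd * area

    # Cd ~ 0.47 for a smooth sphere is measured, not derived from first principles.
    print(drag_force(rho=1.225, v=10.0, cd=0.47, area=0.05))  # ~1.44 N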

We have no particular mystical reason for assuming that the brain will be a worst-case scenario where a model of acceptable accuracy ends up just being a precise copy; but we also have no particularly strong reason for optimism about comparatively well-behaved activation functions clearly being good enough and there being no risk of having to do the computational chemistry of an entire synapse (or all of them).

There's a further complication: if you are specifically catering to the 'apparently being smart enough to make lots of money on payment processing or banner ads or something doesn't keep you from feeling death breathing down your neck, does it?' audience, we know vastly less about simulating a particular person than the little we know about constructing things that have some properties that resemble humans in the aggregate under certain cases; and the people huffing Kurzweil and imagining digital immortality are probably going to want a particular person, not just a chatbot whose output is a solid statistical match for the sort of things they would have said.

Comment Re: What about 'new' stuff (Score 3, Insightful) 115

Vibe coding is essentially cargo cult programming if you peek behind the curtain.

It is exactly that. You could call me an AI/LLM coding proponent, I guess; I use it daily, but that vibe coding shit is no different than the Ruby on Rails hype train, for example. I doubt it will have much impact outside hipster webdev "I made a twitter clone" trash.

Comment Re: What about 'new' stuff (Score 3, Interesting) 115

A programmer faced with a new language, OS, API or whatever has to sit down and learn it from documents, not existing examples. Without programmers creating stuff with the new thing there is nothing for the AIs to be trained on.

An LLM is actually great at tearing through stuff like that, and translating existing patterns and idioms into new languages, new settings, etc. Waaaaay faster than you would, and they're a great learning aid.

Creating new idioms, new design patterns, no, an LLM probably won't do that; but if a new language, OS, or API was intended to be used in some novel way, there would be examples of it.

I'm tempted to say new idioms don't come from a vacuum... but they do, when a clever monkey invents one. At the same time, nobody else knows what it means without examples. An LLM would learn the new design patterns of a new language the same way you would, from the same sources.

A new undocumented API without any examples, yeah, an LLM isn't going to be much use; it's like using a camera in the dark. There's only so much it or you can do with little information.

Comment People misunderstand friction... (Score 1) 47

I suspect that the misunderstanding is an old one; but 'AI' tools really bring into stark relief how poor people are at distinguishing between genuine friction (inefficiency because parts of the system are rubbing against one another in undesired ways) and 'friction' the noble phenomenon, which improves the signal-to-noise ratio by making noise just inconvenient enough that you usually only make it after you've already thought about the problem for a minute on your own.

It's the difference between being able to tap a colleague when you've been puzzling over a problem and need a fresh pair of eyes and That One Guy whose first reflex in the event of the slightest sensation of uncertainty is to poke you over the cubicle divider to ask a trivial question. The former is how collaboration happens; the latter was never taught to self-soothe as an infant.

You see the same thing at work in the 'general'/'office productivity' pitches for 'AI' tools: the "hey copilot; please make a slide deck about Project XYZ"/"turn these bullet points into an email that makes it sound like I worked real hard on the email". In an absolutely ideal world, it's theoretically a good thing if I don't have to spend time combing over all the Project XYZ points in order to fuck around in Powerpoint; but in the real world, having to sacrifice some amount of my time for each minute of an entire meeting's worth of other people's time that I will sacrifice is a valuable alignment of incentives: if vaguely plausible faff is free and unlimited, it's only my good taste, or the patience of someone who outranks me enough to tell me that I'm done now, that protects an entire meeting from the faff expanding to fill the available space. If I have to do a little work to create it, my own desire not to munge around in slide decks also protects you.

(The "AI" bros, of course, without the slightest hint of irony or self awareness, will, on the next breath, turn around and pitch a 'summarization' tool to go along with their 'generation' tool; so that I can inflate a modest supply of things actually worth talking about into a torrent of shit; then you can 'summarize' the torrent of shit back into something that hopefully matches the modest supply of things I actually needed to talk about; and we can play the most computationally expensive game of telephone in human history.)
