
Comment Nice work ruining it... (Score 5, Insightful) 76

This seems both loaded with perverse incentives and like it doesn't even necessarily solve the problem that it claims to solve.

Most obviously, MS is saying that any type-C port that doesn't support display output and device charging is forbidden. So it's mandatory for all type-C ports to include the expense of power delivery circuitry capable of handling your device's potential load, and either a dedicated video out per port or DP switching between type-C ports if there are more ports than there are heads on the GPU. You want a cheap just-USB USB port? Either it's Type A, so nobody can standardize on connectors; or it gets omitted to meet logo requirements. Further, if a system supports 40Gbps USB4, all its ports are required to do so; including the higher peripheral power limits, PCIe tunneling, and TB3 compatibility. You think it might be nice to have a port to plug flash drives into without allocating 4 PCIe lanes? Screw you, I guess.

Then there's what the alleged confusion reduction doesn't actually specify: USB3 systems are only required to support a 'minimum 1' display. They need the supporting circuitry to handle that one display being on any port; but just ignoring a second DP alt mode device being connected is fine; no further requirements. Data rates of 5, 10, or 20Gbps, and accessory power of either greater than 4.5W or 7.5W, are also fine (except that 20Gbps ports must be greater than 7.5W); USB4 systems have higher minimums; two 4K displays and 15W power; but are similarly allowed to mingle 40 and 80Gbps ports; and it's entirely allowed for some systems to stop at two displays and some to support more; so long as the displays that are supported can be plugged in anywhere.
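
To make the tiering concrete, here's a very rough sketch (Python; the field names and the exact >= treatment of the power floors are my own simplifications, not anything out of the actual logo-program text) of the requirements as I've just described them:

    # Rough encoding of the tiers as described above; names are invented for illustration.
    USB3_TIER = {"allowed_gbps": {5, 10, 20}, "min_accessory_w": 4.5, "min_displays": 1}
    USB4_TIER = {"allowed_gbps": {40, 80}, "min_accessory_w": 15.0, "min_displays": 2}

    def port_ok(port, tier):
        # Every type-C port has to do charging and be able to drive a display.
        if not (port["charging"] and port["dp_alt_mode"]):
            return False
        if port["gbps"] not in tier["allowed_gbps"]:
            return False
        # 20Gbps USB3 ports get held to the higher 7.5W accessory floor.
        floor = 7.5 if port["gbps"] == 20 else tier["min_accessory_w"]
        return port["accessory_w"] >= floor

    def system_ok(ports, supported_displays, usb4=False):
        tier = USB4_TIER if usb4 else USB3_TIER
        if supported_displays < tier["min_displays"]:
            return False
        # Note what is NOT checked: mixing 40 and 80Gbps ports, or anything beyond
        # the minimum display count -- exactly the gaps complained about above.
        return all(port_ok(p, tier) for p in ports)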

Obviously the tendency to ship type-C ports that are totally unlabeled, or marked with a teeny cryptic symbol, was not exactly helpful; but this seems like taking what could have been a fairly simple distinction (like the one that existed all the way back in the FireWire/USB 1.1 days, or on Thunderbolt/USB systems, or slightly more informally on non-Intel systems without Thunderbolt), between "the fast port that does the things" and "the cheap port that is in ample supply"; and 'reducing confusion' by just banning the cheap port that is in ample supply (unless it's Type A, which eats space and prevents connector standardization).

Are you really telling me that there wasn't something you could come up with to just tell the user which ones are power/video/PCIe and which ones are normal random accessory USB ports? I hope you like docking stations; because it seems like there will be a lot of those in our future.

Comment Re:If AI were an employee (Score 1) 21

Sadly, based on experience I think you are wrong. Employees who screw up are often not fired, or are replaced with employees just as bad.

There's a reason "you pay peanuts, you get monkeys" is a common saying: it's very common for employers to accept mediocre or even poor work if the employees doing it are cheap enough. I'm not anti-AI -- not even anti generative AI. I think with AI's ability to process and access huge volumes of data, it has tremendous potential in the right hands. But generative AI in particular has an irresistible appeal to a managerial culture that prefers mediocrity when it's cheap enough.

Instead of hiring someone with expensive thinking skills to use AI tools effectively and safely, you can just have your team of monkeys run an AI chat bot. Or you can fire the whole team and be the monkey yourself. The salary savings are concrete and immediate; the quality risks and costs seem more abstract because they haven't happened yet. Now as a manager it's your job to guide the company to a successful future, but remember you're probably mediocre at your job. Most people are.

According to economics, employers stop adding employees when the marginal productivity of the next employee drops below what that employee costs. What this means is that AI *should* create an enormous demand for people with advanced intellectual skills. But it won't, because managers don't behave like they do in neat abstract economic models. What it will do is eliminate a lot of jobs where management neither desires nor rewards performance, because they don't want anything a human mind can, at this point, uniquely provide.
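
(For reference, the textbook version of that condition, in standard notation rather than anything from TFA: keep hiring while the marginal revenue product of labour covers the wage, $MRP_L \ge w$, and stop once $MRP_L < w$.)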

Comment Re:If it makes you feel better (Score 1) 72

>Increases in automation are going to help identify the Wallys of the world and fire them.

That is what they said about computers, that is what they said about the printing press, that is what they said about factories and robots.

They say this to keep the working person's income down; they say this to suppress the employee.

I would tell you to mind your history, but a bot is only told what to do. It doesn't think for itself.

Comment Re:Kurzweils Singularity. (Score 5, Informative) 126

Life is WAY better after the industrial revolution than it was before it.

People have this fantasy image of what life used to be like, thinking of picturesque farms, craftsmen tinkering in workshops, clean air, etc. The Middle Ages were filth: you worked backbreaking labour for long hours of the day, commonly in highly risky environments; even the simplest necessities cost a large portion of your income; you lived in a hovel; and you died of preventable diseases at an average age of ~35 (a number admittedly dragged down by the fact that a quarter of children didn't even survive a single year).

If it takes people of similar social status to you weeks of labour to produce the fibre for a set of clothes, spin it into yarn, dye it, weave it, and sew it, then guess what? It takes those same weeks of your own labour, plus taxes and profit, to be able to afford that set of clothes (and you'd better believe the upper classes were squeezing every ounce of profit out of the lower class they could back then). Decreasing the amount of human labour needed to produce things is an immensely good thing. Furthermore, where did that freed-up labour go? Into science, into medicine, into the arts, etc., further improving people's quality of life.

And if your response is "But greater production is more polluting!" - I'm sorry, do you have any understanding of how *miserably* polluted cities in the middle ages were? Where coal smoke poured out with no pollution controls, sewage ran straight into rivers that people collected water from and bathed in, where people extensively used things like arsenic and mercury and lead and asbestos, etc etc? The freed-up labour brought about by the industrial revolution allowed us to *learn* and to *fix problems*.

Comment Re:No it isn't (Score 2) 126

Which of those things hit 800 million users in 17 months?
Which of those things hit such high annual recurring revenue rates so fast?
Which of those saw the cost of using the tech decline by 99% over two years?

Heck, most of them aren't even new technologies, just new products, often just the latest version of older, already-commonly-used products.

And re: that last one: it must be stressed that for the "cost of using the tech" per million tokens to decline 99% over two years, you're also talking about a similar order of reduction in power consumption per million tokens, since the two main costs are hardware and electricity.
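
Back-of-the-envelope version of that point, with made-up numbers purely to show the shape of the argument (the prices and the cost split below are assumptions, not anybody's real figures):

    # Toy numbers, purely illustrative.
    old_price_per_mtok = 60.00   # hypothetical $/1M tokens two years ago
    new_price_per_mtok = 0.60    # ~99% lower today
    electricity_share = 0.4      # assumed fraction of serving cost that is power

    # If the cost split stays roughly constant and margins haven't moved by orders
    # of magnitude, the electricity spent per million tokens has to shrink by about
    # the same factor as the price did.
    old_energy_cost = old_price_per_mtok * electricity_share
    new_energy_cost = new_price_per_mtok * electricity_share
    print(new_energy_cost / old_energy_cost)   # ~0.01, i.e. roughly 99% less energy per token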

Comment Re:Dig Baby Dig (Score 4, Informative) 126

You're looking at three months of very noisy data and drawing some pretty dramatic conclusions from said minimal data.

Winter demand is heavily dependent on weather. You're mainly seeing the impacts of weather on demand.

https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2F2024%25E2%2580%259325_North_American_winter

"The 2024–25 North American winter was considerably colder then the previous winter season, and much more wintry across the North American continent, signified by several rounds of bitterly cold temperatures occurring."

Comment Not strictly a bet on the tech... (Score 1) 91

It seems mistaken to blithely assume that technology will obviously just progress harder until a solution is reached.

When you talk about simulating something you are expressing an opinion on how much power you'll have to throw at the problem; but, more fundamentally, you are expressing optimism about the existence of a model of the system that delivers useful savings over the actual system without too much violence to the outcome.

Sometimes this is true and you can achieve downright ludicrous savings by just introducing a few empirically derived coefficients in place of interactions you are not prepared to simulate and still get viable results. In other cases either the system of interest is less helpful or your needs for precision are higher and you find that not only are rough approximations wildly wrong; but the cost of each attempt to move the model closer to the system goes up, sometimes dramatically.
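
A toy illustration of that 'empirical coefficient' shortcut (everything here is invented for the example): rather than simulating the airflow around a falling object, you fit one drag coefficient from a handful of measurements and reuse it wherever the full interaction would otherwise have to be computed.

    import numpy as np

    # Hypothetical measurements: speeds (m/s) and the drag forces (N) observed at them.
    v = np.array([5.0, 9.5, 13.0, 15.5])
    f = np.array([1.2, 4.4, 8.1, 11.6])

    # Assume F = k * v^2 and fit k by least squares: the whole boundary layer,
    # turbulence and all, collapsed into a single empirically derived number.
    k = np.sum(f * v**2) / np.sum(v**4)

    def drag(speed):
        return k * speed**2   # cheap surrogate for the simulation you skipped

    print(drag(12.0))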

We have no particular mystical reason for assuming that the brain will be a worst-case scenario where a model of acceptable accuracy ends up just being a precise copy; but we also have no particularly strong reason for optimism about comparatively well-behaved activation functions clearly being good enough and there being no risk of having to do the computational chemistry of an entire synapse (or all of them).

And if you are specifically catering to the 'apparently being smart enough to make lots of money on payment processing or banner ads or something doesn't keep you from feeling death breathing down your neck, does it?' audience, there's the further complication that we know vastly less about simulating a particular person than the little we know about constructing things that have some properties resembling humans in the aggregate under certain conditions; and the people huffing Kurzweil and imagining digital immortality are probably going to want a particular person, not just a chatbot whose output is a solid statistical match for the sort of things they would have said.

Comment People misunderstand friction... (Score 1) 47

I suspect that the misunderstanding is an old one; but 'AI' tools really bring into stark relief how poor people are at distinguishing between genuine friction (inefficiency because parts of the system are rubbing against one another in undesired ways) and 'friction', the noble phenomenon that improves the signal-to-noise ratio by making noise just inconvenient enough to generate that you usually only do it after you've already thought about the problem for a minute on your own.

It's the difference between being able to tap a colleague when you've been puzzling over a problem and need a fresh pair of eyes and That One Guy whose first reflex in the event of the slightest sensation of uncertainty is to poke you over the cubicle divider to ask a trivial question. The former is how collaboration happens; the latter was never taught to self-soothe as an infant.

You see the same thing at work in the 'general'/'office productivity' pitches for 'AI' tools: the "hey copilot, please make a slide deck about Project XYZ"/"turn these bullet points into an email that makes it sound like I worked real hard on the email" stuff. In an absolutely ideal world, it's theoretically a good thing if I don't have to spend time combing over all the Project XYZ points in order to fuck around in PowerPoint; but in the real world, having to sacrifice some amount of my own time for each minute of an entire meeting's worth of other people's time that I'm about to consume is a valuable alignment of incentives: if vaguely plausible faff is free and unlimited, it's only my good taste, or the patience of someone who outranks me enough to tell me that I'm done now, that keeps it from expanding to fill the available space. If I have to do a little work to create it, my own desire not to munge at slide decks also protects you.

(The "AI" bros, of course, without the slightest hint of irony or self awareness, will, on the next breath, turn around and pitch a 'summarization' tool to go along with their 'generation' tool; so that I can inflate a modest supply of things actually worth talking about into a torrent of shit; then you can 'summarize' the torrent of shit back into something that hopefully matches the modest supply of things I actually needed to talk about; and we can play the most computationally expensive game of telephone in human history.)

Comment Eat shit because it's cheaper. (Score 2) 158

What seems particularly depressing about these stories of 'replacement' is that they aren't really about replacements; they're about inferior substitutions people think they can get away with (and, unfortunately, they may be correct).

Even if 'AI' were, in fact, a human-or-better replacement for humans there would obviously be a teensy little social problem implied by the relatively abrupt breakdown of the notion that people who possess useful skills and are willing to apply them diligently can be economic participants in ways that make their lives at least endurable; but it wouldn't necessarily be a problem for the optimistic theory that the incentives generally align to encourage quality. Sure, most of the people theorizing that implicitly assumed that humans would be doing the better or more innovative work; but the thesis didn't require that.

What we are getting is worse. The disruption is being drawn out a bit, because 'AI' is not in fact generally fit for purpose; but the incentives have turned toward delivering shit. 'Creative' is an obvious target because that's the designation for a swath of jobs where quality is understood to exist but there aren't really rigid failure states: anyone who thinks that lorem ipsum and literature are interchangeable, or that there's nothing worth doing in graphic design once you've identified somewhere between 2 and 4 colors that the human eye can distinguish from one another, is abjectly beneath human culture (and I don't mean that in the 'High Art' snob sense: don't even try to tell me that all shlocky summer blockbusters are equally entertaining; or that no billboards differ meaningfully; or that some social media shitposters aren't more fun to read than others); but it's not like the CMS will throw an error if you insert a regurgitated press release where journalism was supposed to go; or sack the writer who is actually passionate about the subject and have the intern plagiarize a viral listicle instead.

The whole enterprise is really a sordid revelation less of what 'AI' can do than of the degree to which people were just hoping for an excuse to get away with less and worse; and of the ongoing trend of societies feeling relentlessly poorer and more fixated on scarcity even as their GDPs allegedly keep going up and economic statistics assure us that productivity metrics look amazing.

Just tell me that it's not fucking bullshit that a generation ago any city of nontrivial size had several newspapers, all with enough staff to actually fill a 'newsroom' that was probably a literal place at the time; and even podunk towns often had one with a few plucky wearers of multiple hats; and now we've got bot slop. In inflation-adjusted dollars the GDP per capita has just slightly less than doubled since 1985; and journalists and editors are both relatively cheap for what they do and produce something that can be copied across a subscriber base of almost any size at close to zero marginal cost.

This is getting TL;DR; but fuck it, it's honestly profoundly depressing: we are all, constantly, being made to cosplay a vastly poorer society (except on the specific occasions when it's time to justify the order of things; in which case look at what big TVs you can buy!) despite the numbers allegedly saying that we are richer than ever. 'AI' is a new and exceptionally versatile tool for continuing this trend; but you see it everywhere; both in terms of what just gets done and in terms of arguments that get made: why is it harder to get news made by journalists when the metro area being served is ~50% more populous and a trifle under twice as wealthy, per capita, as it was back in the day? What do you mean that's what has happened to housing affordability and even the nominally-luxurious 'McMansions' are all plastic plumbing and sawdust and formaldehyde pseudowood in places they think it won't be noticed? What do you mean tenure-track faculty positions are being slashed in favor of adjuncts who could earn more as McDonald's shift managers; but somehow the degree they teach courses for still costs vastly more? I can understand that cutting-edge monoclonal recombinant antibodies or something are not going to be cheap; but how did we go from seeing a doctor to receive medical care to "ooh, are you sure you can't make do with someone cheaper at 'urgent care'?" when it's just going to be some sutures and antibiotics that have been off-patent for decades (and which have been offshored for savings); but I'm not 100% sure if it's just soft tissue damage or whether there's any tendon involvement; and ruling out embedded foreign objects would be nice?

It's really just dizzying how relentlessly we are expected to see downward substitution and 'austerity' as normal, outside of some specific cases involving transistors and corn syrup, despite the numbers theoretically being so favorable. It's almost like the correlation between productivity and income was severed decades ago and we're all just watching the punchline to Milton Friedman's joke land on us.
