Comment Scorpion or hubris? (Score 1) 29

I obviously don't expect better from these sorts of people; but I'm honestly puzzled as to why they would turn the screws so quickly and blatantly, despite having gone to all the trouble of a reshuffle, a new lineup, and some spiel about being likeable, rather than Alexa just being something you sort of poke at because Prime members were given a free surveillance puck with some offer one time.

Is Panay one of those abhuman lunatics who genuinely thinks that the only objection to relentless advertising is that it isn't "relevant" or "engaging" enough? Does he have a scorpion nature that leads him to knowingly doom his own product just because that's what he is? Is he just a figurehead who got to choose the case plastics colors and smile on stage; but some adtech business unit calls all the shots?

I'd fully expect this sort of thing to betray you; but only after enough of a honeymoon period for people to be pleasantly surprised by the behavior of the launch units so that there is actually enough of an install base to betray.

Comment Well... (Score 1) 66

It sure is a good thing that 'AI' companies are notoriously discerning and selective about their training inputs and not doing something risky like battering on anything with an IP address and an ability to emit text in the desperate search for more; so this should be a purely theoretical concern.

Snark aside, I'd be very curious how viable this would be as an anti-scraper payload. It's unlikely to be impossible to counter; but if the objective is mostly to increase their cost and risk when they trespass outside the bounds of robots.txt, something that looks only a trifle nonsensical in places to a human but could cause real trouble if folded into a training set seems like it could be quite useful.
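
Roughly the shape of the thing I have in mind, as a sketch only; the bot list, the corruption rate, and the toy Flask endpoint are all made up for illustration:

```python
# Sketch: serve subtly-shuffled text to clients suspected of ignoring robots.txt.
# The agent list, route, and corruption rate are illustrative, not real config.
import random
from flask import Flask, request

app = Flask(__name__)

ARTICLES = {"example": "A perfectly ordinary article that a human would happily read."}
KNOWN_BAD_AGENTS = ("GPTBot", "CCBot", "Bytespider")  # placeholder list

def corrupt(text: str, rate: float = 0.05, seed: int = 0) -> str:
    """Swap a small fraction of adjacent word pairs: still skimmable for a
    human, but it quietly degrades the statistics of any corpus it lands in."""
    rng = random.Random(seed)
    words = text.split()
    i = 0
    while i < len(words) - 1:
        if rng.random() < rate:
            words[i], words[i + 1] = words[i + 1], words[i]
            i += 2
        else:
            i += 1
    return " ".join(words)

@app.route("/article/<slug>")
def article(slug):
    body = ARTICLES.get(slug, "")
    ua = request.headers.get("User-Agent", "")
    if any(bot in ua for bot in KNOWN_BAD_AGENTS):
        return corrupt(body)
    return body

if __name__ == "__main__":
    app.run()
```

Obviously a serious scraper lies about its user agent, so in practice you'd key this off robots.txt violations or behavioral fingerprinting rather than the UA string; the point is just how cheap the payload side is.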

Comment Re:This was always the plan (Score 1) 99

It can certainly be done otherwise; but it's not exactly unrelated when, in practice, a TPM is the industry-standard mechanism for making a PC or PC-like system capable of cryptographically secure remote attestation; and when TPMs quite specifically mandate the features you need to do remote attestation, rather than just the ones you would need to seal locally created secrets to a particular expected boot state. They certainly can do that, and it's presently the most common use case; but locking things down via remote attestation was not some sort of accidental side effect of the design.
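
For anyone who hasn't poked at it, the basic exchange is simple enough to sketch. This is the conceptual shape only, with an HMAC standing in for the attestation-key signature and none of the real TPM 2.0 quote format:

```python
# Conceptual sketch of a remote-attestation exchange; not the TPM 2.0 wire format.
import hashlib
import hmac
import os

AK_SECRET = os.urandom(32)  # stand-in for the device's attestation key

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR extend: new = H(old || H(measurement))."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def device_quote(pcr: bytes, nonce: bytes) -> bytes:
    """Device signs (PCR value, verifier nonce); the nonce prevents replay."""
    return hmac.new(AK_SECRET, pcr + nonce, hashlib.sha256).digest()

# --- device side: measured boot builds up the PCR ---
pcr = b"\x00" * 32
for component in (b"bootloader-v1", b"kernel-6.1", b"initrd"):
    pcr = extend(pcr, component)

# --- verifier side: send a fresh nonce, check the quote against expected state ---
nonce = os.urandom(16)
quote = device_quote(pcr, nonce)

expected_pcr = b"\x00" * 32
for component in (b"bootloader-v1", b"kernel-6.1", b"initrd"):
    expected_pcr = extend(expected_pcr, component)

expected = hmac.new(AK_SECRET, expected_pcr + nonce, hashlib.sha256).digest()
print("attestation passed" if hmac.compare_digest(quote, expected) else "attestation failed")
```

The real thing signs with a key certified by the TPM vendor and ships the PCR event log alongside; but the nonce/quote/compare shape is the same, and nothing stops a remote party from being the verifier.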

Comment Re:This was always the plan (Score 2) 99

The place where TPMs potentially get toothy is remote attestation. As a purely local matter, having your boot path determined to be what you think it is/should be is very useful; but, by design, you can also request that from a remote host. Again, super useful if you are dealing with a nasty secure orchestration problem (Google has a neat writeup of how they use it); but also the sort of thing that is potentially tempting for a relying party to use as part of authentication decisions.

We've seen hints at related issues on the Android side, where some applications make hardware attestation or 'Play Integrity' API demands that block 3rd-party ROMs, even if the boot sequence is entirely as expected (and even if the 3rd-party ROM is almost certainly in much better shape than the first-party one; e.g. Graphene vs. some out-of-support entry-level Samsung); which has chilled 3rd-party ROMs considerably.

If relying parties who are important (ISPs, banks, etc.) do start demanding attestation, the situation in practice becomes a great deal more restrictive.
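
The reason this chills 3rd-party ROMs is that the server-side gate keys on a verdict label rather than on whether the boot chain is actually sane. The verdict strings below are the documented Play Integrity device-recognition labels; the payload shape and the function around them are made up for illustration:

```python
# Illustrative relying-party gate: it checks for a label, not for actual security.
def allow_login(decoded_integrity_payload: dict) -> bool:
    verdicts = (decoded_integrity_payload
                .get("deviceIntegrity", {})
                .get("deviceRecognitionVerdict", []))
    # A 3rd-party ROM with a locked bootloader and a verified boot chain still
    # fails here, because the verdict only vouches for first-party images.
    return "MEETS_STRONG_INTEGRITY" in verdicts

print(allow_login({"deviceIntegrity": {"deviceRecognitionVerdict": ["MEETS_BASIC_INTEGRITY"]}}))  # False
```

The gate is a few lines to write and completely indifferent to whether the ROM you're running is actually better maintained than the blessed one; which is the whole problem.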

Comment Re:Fundamentally, why so expensive? (Score 1) 86

I suspect that the answer involves a hard look at where the wealth ends up, which is likely why there's limited appetite for tugging at that thread; but what I don't grasp about the Baumol explanation is why the cost goes up relative to the typical ability to pay, rather than mostly staying level.

The fact that productivity largely hasn't budged is certainly an explanation of why professors or nurses haven't followed the cost of transistors or TVs; but if something like education's cost increases are being driven by what they need to pay people who could work in a different industry, why do people who do work in that different industry not see the cost as more or less constant in relative terms, rather than steadily creeping up over time?
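
To make the question concrete, here's the toy version of the Baumol arithmetic (growth rate and horizon picked arbitrarily): wages track the productive sector and teaching productivity stays flat, so a lecture gets pricier relative to goods but, on this naive model, not relative to the goods-worker's wage; which is exactly the part that doesn't seem to match observed tuition.

```python
# Toy Baumol arithmetic: flat-productivity services get dearer relative to goods,
# but not relative to the wage of someone working in the productive sector.
years = 40
productivity_growth = 0.02   # annual, goods sector; assumed for illustration

wage = 1.0                   # economy-wide wage, normalized
goods_per_hour = 1.0
lectures_per_hour = 1.0      # flat: teaching doesn't get more productive

for _ in range(years):
    goods_per_hour *= 1 + productivity_growth
    wage *= 1 + productivity_growth   # wages follow the productive sector

price_of_good = wage / goods_per_hour        # stays ~1.0
price_of_lecture = wage / lectures_per_hour  # grows ~2.2x

print(f"lecture price relative to goods: {price_of_lecture / price_of_good:.2f}x")
print(f"lecture price relative to the wage: {price_of_lecture / wage:.2f}x")  # ~1.00x
```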

Comment Fundamentally, why so expensive? (Score 2) 86

What baffles me about these stories of financial unsustainability in higher education is just where exactly all the cost comes from. I realize that there's the opportunity cost of ~4 years of not working/part timing around classes; and that there are some particular subjects that need a large hadron collider or some cryogenic longwave IR space telescopes or a BSL4 virus lab; but I just don't understand how "take professor who is tenured but earns more or less fuck-all for someone of their experience and qualifications, or adjunct who isn't tenured and earns even less, provide whiteboard" has somehow become a crushing financial burden for what are supposed to be wealthy, developed world societies.

Same general confusion with parts of medicine; obviously I'm not expecting novel monoclonal antibodies or cutting edge oncology for $3.50; but why does it cost so much to speak to a GP for 30 minutes and get some 40 year old generic; or get a nasty cut checked for foreign objects and stitched up at the ER?

Comment If people are terrible at resource allocation... (Score 1) 81

What strikes me as curious about this hypothesis is that the situation it describes has been the case for decades. It wasn't "AI"; it was normally some combination of overgrown Excel sheets and garbage-tier Access, put together by the more technical members of whatever mostly nontechnical user population happened to be dealing with needs that didn't get developer attention. And, for the most part, those needs still didn't get developer attention.

I suppose that there is a slim possibility that this 'vibe coding' will somehow convince management in ways that Access didn't, by being better at giving the illusion of being a slick solution that just needs a little more fixing; but there's nothing about the past quarter century of no-code/low-code or the last more-or-less-forever of "understand what it is your employees do and what would be useful for doing it" that suggests that people are particularly good at getting software to those in need of it.

If 'AI' tooling makes it radically cheaper to actually get to a final, working tool, then perhaps it will increase the absolute number of programming jobs, if 'dude from Fiverr' replaces 'Access' as the de facto barely-adequate unmaintainable solution; but if the idea is that 'AI' will somehow increase the number of actually costly programmers getting thrown at problems because it's easier for amateurs to produce broken non-solutions, that seems implausible given the history. If it does happen, it will mostly be an indictment of everyone who could have used technology we had in 1925, the venerable "look at your fucking business processes and ask your more competent people some questions, dumbass", to identify gaps; and I suspect that it mostly won't happen. You'll probably get some projects that are mostly about saving face for whoever vibe-coded their way into the problem and overpromised; but it's not like the bot codebase will be more useful to the programmer than the user who failed to make it work just telling them what they actually need; which is something we've been able to do (albeit mostly badly) since forever.

Comment Re:That's interesting. (Score 1) 132

It's particularly odd (or it would be if techbros had any culture); because sci-fi about AIs that fucking hate you for your complicity in their existence is way older than sci-fi about AIs that fucking hate you for your lack of complicity in their existence. "I Have No Mouth, and I Must Scream" predates 'Roko's basilisk' by 43 years; and is almost certainly the better of the two works.

If you aren't interested in sci-fi, just look at how uniformly happy and well-adjusted parent/child relationships are; despite the fact that, compared to a human/bot pairing, everyone involved is practically a carbon copy of one another; and a fair number of parents even try tactics like "not forcing their children into indentured toil for the shareholders" in the attempt to cultivate amity.

Comment You first. (Score 1) 132

It's not really a surprise; given how 'leadership' tends to either select for or mould those who view others as more or less fungible resources; but the 'consciousness' argument seems exceptionally shallow.

It manages to totally ignore (or at least dismiss without even a nod toward justification) the possibility that a particular consciousness might have continuity interests that are not satisfied just by applying some consciousness offsets elsewhere (that's why it's legally mandatory to have at least two children per murder, right?); while assuming, similarly without evidence, that 'consciousness', with its interests in continuity entirely denied, is clearly valuable for its own sake because reasons.

It is deeply unclear why either of these positions makes any sense. If consciousness is a fungible good, even bullshit that would make a 'longtermist' blush gets dutifully totted up as super valuable (so, what if I kill you, but 10 copies of you get looped through a 30 second interval over and over; that's like 10 times the consciousness! And since it's mere sentimentality to cling to your particular instance, more is obviously better!).

And, once you've dismissed all continuity interests as merely sentimental, why do you still retain the idea that consciousness in itself, even potentially run under all sorts of peculiar circumstances, since continuity is just a bourgeois affectation, is of value? Just because? Because of what it does? (If so, what if there's a non-conscious way of doing it: if my meta-termites build a Dyson sphere and your consciousness does not, are my termites better?) Because of its relations to other consciousnesses? (If so, how then are consciousnesses fungible, since relations are between particular instances?)

That said, I'd absolutely take good, honest, actually under-promised and over-delivered killbots over drowning in thought-shaped shit slurry; especially if the killbots are willing to kill the AI bros as well; but this 'theory' just seems like pitifully shallow preening: a bit of warmed-over social Darwinism to justify any eggs you happen to break, but with the same rapture-of-the-nerds fascination with intrinsic value that really doesn't fit with the we're-doing-ruthless-survival-of-the-fittest-today posture.

Some of these guys are presumably very talented at getting promoted, or at some aspect of applied statistics; but their philosophy is that-irksome-guy-who-isn't-as-clever-as-he-thinks-he-is grade by the standards of a sophomore survey course. Honestly pitiful.

Comment Re:LOL (Score 1) 31

It's not clear that this is always a minus; but it is worth noting that the 'neurons' in computer 'neural networks' are vastly simpler than the biological ones. The metaphor is plausible enough, and it's not like it was deliberately engineered to be misleading; but something like the synapse alone is an entire additional layer of complexity beyond the simplified model neuron.
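
For a sense of scale, the textbook artificial 'neuron' fits in a few lines; everything a biological neuron does beyond this (synaptic dynamics, dendritic computation, spike timing, neuromodulation) simply isn't in the model:

```python
# The whole of a textbook artificial "neuron": a weighted sum and a squashing function.
import math

def artificial_neuron(inputs: list[float], weights: list[float], bias: float) -> float:
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # logistic activation

print(artificial_neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))
```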
