Comment Re:He's correct (Score 1) 155

Younger software engineers don't feel the need to optimize.

That's not really true, though. The optimization methods of younger software engineers are things like Microservices and NoSQL.

Of course, they don't profile to see whether their "optimizations" actually make anything faster, but when have programmers ever bothered to measure?
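
For what it's worth, the measurement doesn't take much. Here's a rough sketch using nothing but Python's standard library timeit; the two lookup functions are hypothetical stand-ins, not anything from the story, just the shape of a before/after comparison:

# Minimal sketch: time a baseline against a supposed optimization
# before declaring the optimization a win. Purely illustrative.
import timeit

PAIRS = [(f"key{i}", i) for i in range(10_000)]
INDEX = dict(PAIRS)  # the candidate "optimization": a dict index

def lookup_naive(key):
    # Baseline: linear scan over a list of (key, value) pairs.
    for k, v in PAIRS:
        if k == key:
            return v
    return None

def lookup_indexed(key):
    # Candidate optimization: constant-time dict lookup.
    return INDEX.get(key)

if __name__ == "__main__":
    for fn in ("lookup_naive", "lookup_indexed"):
        t = timeit.timeit(f'{fn}("key9999")', globals=globals(), number=1_000)
        print(f"{fn}: {t:.4f}s for 1000 calls")

If the numbers don't move, the "optimization" wasn't one.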

Comment Office politics (Score 4, Interesting) 7

initial talks between Annapurna co-founder Nafea Bshara and Amazon executive James Hamilton...Amazon CEO Andy Jassy, who led AWS when the acquisition occurred

All this name dropping makes it look like office politics leaking out into the public. The article also mentions "Matt Garman" for no reason whatsoever. If office politics are so bad that they leak out into the public, then Amazon is going to quickly become a very bad place to work. Unless you enjoy fighting office politics, in which case, go work there.

Incidentally, the rack wiring in the picture in TFA needs some work; I would never be satisfied with a datacenter that has such sloppy wiring.

Comment Re:I cannot see this stopping the AI spiders (Score 1) 209

You shouldn't have to resort to cleverness and effort to find this out. AI training bots should log the URLs they ingest, and anyone should be able to query those logs to see whether their site has been used to train the model. Given the vast sums companies are spending on training their models, the marginal effort of maintaining a public log wouldn't add any significant cost, other than the litigation costs they'll face when sites discover their terms of service have been violated.
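
To make that concrete, such a log could be as simple as an append-only record per fetched URL plus a lookup by domain. This is a purely hypothetical sketch; the file format and function names are mine, not any vendor's:

# Hypothetical ingestion log: one JSON Lines record per ingested URL,
# plus the query a site owner would run against a published copy of it.
import json
import time
from urllib.parse import urlparse

LOG_PATH = "ingestion_log.jsonl"

def log_ingested_url(url, model_version):
    # Append one record per ingested URL.
    record = {
        "url": url,
        "domain": urlparse(url).netloc,
        "model_version": model_version,
        "timestamp": int(time.time()),
    }
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def site_was_ingested(domain):
    # Check whether any logged URL belongs to the given domain.
    try:
        with open(LOG_PATH, encoding="utf-8") as f:
            return any(json.loads(line)["domain"] == domain for line in f)
    except FileNotFoundError:
        return False

# Example:
#   log_ingested_url("https://example.com/post/42", "model-2024-06")
#   print(site_was_ingested("example.com"))

Next to the cost of a training run, appending a line per URL is a rounding error.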

Comment Re:I cannot see this stopping the AI spiders (Score 1) 209

The whole "move fast, break things" ethos counts on creating a new status quo faster than regulatory bodies can respond. Tech startups rely on creating a fait accompli before government even notices the problem, but if they fail at that, a well-funded company has recourse to deceptive PR, then lobbying, then lawyers to gum up the works. In AI, companies are already racing each other as fast as obscene gobs of money can propel them forward; it wouldn't take much to slow down any public regulatory response so that it would have to be mounted against the winner of that race, a company that will be in a much more commanding position to fight back.

In the meantime, your hypothetical whistleblowing engineer is probably compensated to a substantial degree with stock options, and his prospects for continued employment after ratting out his company are bleak in an industry where everyone is doing the same thing.

I'm not saying it's impossible, but I'm a lot more pessimistic than you about it being *easy*. I suspect that enabling private actors to move against AI companies would be a lot faster. Since damages are hard to prove or quantify, simply creating statutory damages would allow intellectual property owners to take the initiative against infringing AI systems. It would help if there were transparency regulations that aided IP owners in detecting unauthorized training. Of course, the downside is the volume of litigation that would follow.

Comment Re:Not news (Score 1) 145

worse than the science popularizations have suggested so far - and they were already called alarmist, when in fact, they were understating the problem.

What are you talking about? The alarmists are saying the oceans will boil, not that the results will be at the upper end of the estimates.
