Comment Re: Typical AI issue (Score 1) 76

I think the issue is very simple: the Waymo car drives based on its internal mapping, the mapping says there is a stop light at a given intersection, and when Waymo encounters the missing stop light, it wasn't programmed to fall back and act as if it were a 4-way stop.

Various news sites also reported that some people observed Waymo cars treating dark lights as 4-way stops. So it probably isn't as simple as not being programmed to fall back, but rather some combination of factors, the nonfunctional lights among them, that together spooked the cars.

Either way, though, requiring human verification of an outage once per traffic light is probably the right thing to do, if only to ensure that the non-detection isn't a bug. Then multiply that by a lot of cars at a lot of lights.
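
To make that concrete, here's a minimal Python sketch of the fallback-with-verification policy I'm describing: treat a dark signal as an all-way stop, but only after a one-time human confirmation per intersection. Pure speculation on my part; every name in it is made up, and it's obviously nothing like Waymo's actual code.

# Hypothetical sketch of a dark-signal fallback policy; not Waymo's code.
# Assumes the map says a signal exists here and perception reports its state.

confirmed_outages: set[str] = set()  # intersections a human has verified dark

def handle_mapped_signal(intersection_id: str, perceived_state: str) -> str:
    """Return the planner action for a signal the map says should exist."""
    if perceived_state in ("red", "yellow", "green"):
        return f"obey_{perceived_state}"

    # Perception says the light is dark.
    if intersection_id in confirmed_outages:
        # A human already confirmed the outage, so fall back to
        # treating the intersection as an all-way stop.
        return "treat_as_all_way_stop"

    # First dark sighting here: stop and escalate, in case "dark" is
    # really a perception bug rather than a real outage.
    ask_remote_operator(intersection_id)
    return "stop_and_wait_for_confirmation"

def ask_remote_operator(intersection_id: str) -> None:
    # Placeholder: imagine an async call to fleet support. If the operator
    # confirms the outage, the intersection goes into confirmed_outages.
    print(f"Operator: is the signal at {intersection_id} really out?")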

Comment Re:Typical AI issue (Score 2) 76

Then how did Tesla work fine? Waymo actually uses less AI than Tesla.

Tesla FSD beta relies on a human operator in the car. I don't know what it does when a light is out; it either treats it as a red light or as a green light. If the former, it relies on the human driver to take over to get the car going again. If the latter, it relies on the human driver to avoid a fatal collision. Either way, it relies on a human driver in the car.

Tesla's robotaxis also have a human safety driver, and they still reportedly crash 12.5x more often than human drivers do. So my guess would be that they treat disabled lights like green lights and hope for the best. :-D

Comment Re:Typical AI issue (Score 1) 76

I'd say it's a safety feature. If the vehicle uses remote data to drive safely, the safe fallback is to stop when there is no remote data. They are just not fully autonomous. The big question is whether we want cars that act fully on their own.

Everybody here is assuming that the cellular network went down completely, and that the cars couldn't communicate. While possible, I would assume that Waymo uses multiple cellular providers to ensure reliable service, particularly given how spotty service on any individual provider can be in SF. If they don't, I bet they do next week. :-D

I'm also pretty sure they don't use remote data to drive safely at all. From the various articles I've read, all true safety-related data, including map data and driving-related models, should be stored in the vehicles, along with a list of areas where they aren't allowed to drive, etc. They do use the network to find out where to go for the next pickup, of course, so without a network, they are likely to all end up parked in random spots waiting for a fare.

The problem, I suspect, is that they are designed to fail safe. Specifically, when they encounter a situation that is substantially unexpected, they stop and reach out to operators to ask how to resolve the unexpected situation. Traffic lights being out en masse throughout a big chunk of SF is substantially unexpected, and I would assume that they don't let the cars handle that fully autonomously, just because that could also be caused by some major regression in their image recognition models, and they would *not* want the cars to start treating red lights as stop signs just because of a software bug, because that would be catastrophic.
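
As a toy illustration of that last point (again my speculation, with made-up names and thresholds): one dark-light report is plausibly a real outage, but a sudden fleet-wide spike of them looks exactly like a broken vision model, so either way it should end up in front of humans.

# Hypothetical fleet-side sanity check; made-up names, not Waymo's design.
from collections import deque
import time

WINDOW_SECONDS = 300   # consider the last five minutes of reports
SPIKE_THRESHOLD = 25   # this many dark-light reports at once smells like a bug

recent_dark_reports: deque[float] = deque()

def classify_dark_light_report(now: float) -> str:
    """Classify one vehicle's 'this signal is dark' report."""
    recent_dark_reports.append(now)
    # Drop reports that have aged out of the window.
    while recent_dark_reports and now - recent_dark_reports[0] > WINDOW_SECONDS:
        recent_dark_reports.popleft()

    if len(recent_dark_reports) >= SPIKE_THRESHOLD:
        # Could be a citywide power outage, could be a perception
        # regression. Don't let the cars improvise; halt and escalate.
        return "halt_and_escalate"
    return "allow_confirmed_local_fallback"

print(classify_dark_light_report(time.time()))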

That said, fail-safe stops are not a safety issue per se; they are a reliability issue. After all, a non-moving car is pretty much guaranteed not to kill someone. And if there's only one light out in some random place, it wouldn't be a big deal. A remote operator would tell it that yes, the light really is out, and it should treat it like a stop sign, and all is well.

Now imagine up to 800 vehicles all dialing in at once asking, "What the h*** is happening? There are no street lights, so everything looks different to my cameras, and the traffic light is dark!" That kind of problem is highly likely to overload the remote operators, because those sorts of assistance calls don't normally happen simultaneously across a large geographic area.
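
Back-of-envelope, with everything except the ~800-vehicle figure invented for illustration:

# Toy queueing arithmetic; only the ~800-vehicle count comes from the story.
vehicles = 800          # cars phoning home at roughly the same time
operators = 40          # assumed remote-assistance staff on shift
minutes_per_call = 2.0  # assumed time to review and confirm one scene

total_work = vehicles * minutes_per_call  # 1600 operator-minutes
worst_case_wait = total_work / operators  # last car in the queue
print(f"{total_work:.0f} operator-minutes of confirmations;")
print(f"the last vehicle waits roughly {worst_case_wait:.0f} minutes")

Even with those fairly generous assumptions, the last cars in the queue sit still for roughly forty minutes.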

The irony, of course, is that if my theory is correct, the way to fix it is to get Waymo into more cities so that they would have more spare human capacity to absorb the impact of unexpected surges in vehicles calling home.

Comment Re:No thank you. (Score 1) 51

In my mind you'd be buying a car without a battery and simultaneously subscribing to a battery service, though if you ever wanted to own a battery you could buy one. The battery would be delivered to the dealer (and/or the dealer would work with one or more services directly and keep some on site) before you picked up the vehicle, so to you it would be the same as if the car had come with one, and it would arrive charged.

Moving them around without a battery at scrapping time is not a detriment, as vehicles to be scrapped are usually moved around with a forklift anyway.

Comment Re:Cameras in your bathroom will also detect crime (Score 1) 56

Re: the divorces comment, that's a feature, not a bug, once you realize that what's being uncovered is 100% deeply dishonest behavior.

Personally, I don't know why DNA testing at birth isn't mandatory, even if it were only used to confirm paternity and then disposed of.

Comment Re:No thank you. (Score 1) 51

You could do battery swaps for NEVs in a scheme where you didn't own a battery at all and instead just subscribed to one. You could also do it for heavy diesel-truck equivalents, since big diesels typically have their fuel tanks hanging on the outside of the frame, where they're nice and accessible anyway. But it doesn't make any sense for the vehicles in between, i.e. the bulk of them...

Comment Re:I've been using KDE for two months (Score 3, Interesting) 37

MATE is outdated (but good for resource-constrained systems), and GNOME is dumbed down and hard to get good results from; you need a whole bunch of extensions just to get to where KDE already is. KDE was very bad in the past, but it's come quite a long way. GNOME was really quite good in the past, but it's gone the wrong way. I'm not against having a simple mode, but I don't want oversimplification to infest everything.

Comment Re:god damn it (Score 1) 273

For example, all of this Epstein nonsense, why the fuck wasn't this released when the Democrats were in power?

Because the USA doesn't have the concept of absolute power, Donnie Dipshit's pet Catholic Court notwithstanding, and those files were sealed by a judge at the time. There are a lot of fundamental ways in which the two parties are up to the same bullshit, but Democrats tend to obey court orders.

Comment Re:And? (Score 1) 273

A military with an obtuse and opaque budget is one thing

Corrupt, yes.

and in all reality, the military has a lot more reporting requirements than the NCAR.

Requirements, maybe. Meeting them, absolutely not. It isn't just that they report a lump sum spent on classified projects and therefore we can't have a breakdown; they're saying they can't figure out where an awful lot of money went at all.

Comment Re:"Look out, incoming pendulum!" (Score 1) 273

I think that this (electing a Trump) is what happens when the pendulum gets pushed too far

Obama was more like the Republicans than they think. For example, he was fully behind the MIC, blowing people up without due process, and so on. Obviously there is a big contrast too: we know he carried out a lot of drone strikes because his EO gave us information on how many strikes were conducted and where, and Trump was running about four times as many strikes per month when he rescinded that order, so that we wouldn't know how many he's done since.

Even the ACA was a Republican health care plan, spruced up a little bit but still writing profit for insurance companies into the law. So no, the pendulum just wasn't pushed that far at all.

How can we get to a ranked-choice system at a national level?

Revolution. The chances of us rewriting the constitution for that (which is what it would take) are roughly nil otherwise.

Comment Re:Vought's in the cabinet for one reason (Score 4, Informative) 273

Project 2025 is the result of a moral and ethical pendulum being brazenly shoved way too far to the left

To you, the centrist (pro-corporation, pro-authoritarianism, pro-incarceration, pro-MIC — based on voting records) policies of the Democrats are "too far to the left" when actual leftism includes far more liberal ideas. This is because you are too far to the right to even see the left from where you're standing.
