Cars block the street all the time. There is ample space to pass the Waymo on the left in the opposing lane, yet those SUV-driving humans don't care to move out of the way either, and the police just block the maneuver area too.
That silver car in the front could also just pass in front and make space. Situational awareness has room to be improved for a lot of entities in this short video.
Nueces Street is three and a half lanes wide there, plus massive sidewalks; apparently too narrow for even more massive ambulances.
> You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
That's the best part, no one! We have finally managed to invent a system that widely disperses accountability so much no one can be held liable when something goes wrong.
>no one can be held liable when something goes wrong.
No, at the very least tort laws still apply even if the driver is a corporation. Do you really need someone sitting in jail to satisfy your justice boner?
Yes, I want to see real, serious punishment for corporate crimes, on par with the life disruption experienced by people who serve a jail sentence. That's almost always brutal: major income disruption, job loss, etc. If it's a small fine, which it always seems to be for corporations, then there is no incentive to follow the law. I'm also in favor of corporate death sentences for large-scale egregious violations: liquidate assets and jail executives.
By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
>By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
Again, this is false. At the very least there are financial penalties, which the shareholders are on the hook for. Moreover, the corporate malfeasance that does happen doesn't map nicely to human crimes. If you kill a guy, you get sent to jail for decades. But what if you're a company that makes a machine with sloppy code[1] that unintentionally kills someone? What do you do? Jail the programmer who wrote the code? Jail the manager who did the code review? Jail the CEO who had no knowledge of it but "the buck stops with him" and we hate CEOs? How does the death penalty work? If you think it through, it's basically a fine equivalent to the company's market cap. If Boeing does something bad that kills one person, does that mean the US government just repossesses the entire company?
After watching the movie "Dark Waters" about the whole Teflon scandal, it seems like it should be the highest-up person (or people) who had knowledge of the incident (which obviously must be proven). An individual engineer knowing a car has a dangerous edge case isn't enough to get them in trouble in my view, especially if the company has claimed they are working on fixing it. Also, legitimate mistakes are just mistakes; companies won't get it right every single time.
However, there are cases where it's completely proven that someone high up knew there was a systemic safety issue (they had a broad view and could see everything that was going on), knew exactly what was causing it, and did nothing because they wanted to keep the profit going. The fact that those people don't go to jail just tells me that corporations have way too much leeway.
Depending on how severe the error is, it could be professional negligence. In other professions, including engineering, this can result in a loss of the professional's license and their inability to continue to work in that field. Also, for negligent drivers, a suspension of their driving license can apply. So there is precedent for severe punishment even if nobody gets a jail sentence.
I think of the corporate death penalty as being more appropriate when leadership knew exactly what was going on and chose profits over people. Exxon, see https://www.science.org/doi/10.1126/science.abk0063. Purdue Pharma, see https://en.wikipedia.org/wiki/Purdue_Pharma. The company gets sold for parts and leadership goes to prison, probably for life, given the number of lives they potentially destroyed. Pretty much all the tobacco companies knew how harmful their product was and made a concerted effort to fund their own bogus studies to throw up a smoke screen. Facebook makes billions from (for example) scams and fraudulent ads: https://www.reuters.com/investigations/meta-is-earning-fortu.... Maybe don't throw their CEO in prison, but at least fine them 10x the profit they made vs. the usual .0001%.
In Australia it's the board of directors who are liable. They can be liable if they personally direct the company to do something illegal (obviously?) but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public. Directors can be personally liable for breaches of this duty and the penalties extend to possible imprisonment and very substantial fines.
>but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public.
Is there any indication this requirement was breached in this case? I'm all for jailing executives of companies who specifically failed to enact safety measures, or even just didn't care enough about safety, but this is simply an edge case they didn't test. It's not for lack of trying, either. Apparently they have their own AI model to generate test data, so they can train/test what happens if a hurricane hits, for instance.
> Do you really need someone sitting in jail to satisfy your justice boner?
Literally, and intentionally avoiding any attempt to examine the implications? No, probably not.
But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
What I want is software and systems to not suck ass. I don't want to deal with defective... everything, because it was faster to deliver. That's especially true when it contributes to the death or injury of a person that didn't do anything wrong.
I don't care what works, but people being afraid of going to jail for hurting someone absolutely does work. And 'administrative fines' don't work.
>But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering. If it works, why stop at criminal cases? Maybe we should dock the pay of SWEs next time they cause a prod issue?
> This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering.
Ignore that feeling; it's wrong, because it's not what I'm arguing for. "Reasonable" is a load-bearing qualifier.
It doesn't feel like the people making the decisions that meaningfully contribute to harming other people ever have to deal with the fallout or repercussions of their unfortunate choices. Disincentivizing that behavior is my goal. And I'll unfortunately take iterative or suboptimal options at this point. I don't like it, but I do want to try to be realistic.
Yes. Jail sentences apply to some offenses and not others. If a person does, or cooperates in, X amount of harm, they ought to face similar penalties.
Corporations are person-like entities, so there's a plausible argument to be made. The states seem loath to be precedent-setters in triggering evaluations of this argument, though, so I don't know of any supporting cases yet. Whoever's first will see corporate tax revenue fall off a cliff once a corporation can be subjected to community service, so they have a lot of self-interest in not prosecuting these violations.
And have actual meaningful consequences happen? I am.
Twitter is creating CSAM, Meta & OpenAI pirate millions of books and Nvidia is playing some sort of shell game to pump their stock price.
If a regular person committed any of those offenses once, they would be lucky just to be sued, but because of "AI" nothing happens to these companies.
It's unclear whether generated CSAM is illegal, see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn.... Moreover, x/x.ai wasn't intentionally generating the images. Yes, someone intentionally set up Grok to generate images, but nobody at x/x.ai was like "yes, let's generate some CSAM". That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
>Meta & OpenAI pirate millions of books
Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
>Nvidia is playing some sort of shell game to pump their stock price
> Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
Arguing that a regular person needs to conceal their real identity with a VPN to pirate books, as proof that these companies aren't receiving special treatment for committing the same crimes, is very confusing to me.
We know the identity of the companies committing the crimes.
I would like crimes to have consequences that actually deter the culprits from committing them. A pittance fine for a company is not what I want to see. Let's have a small percentage of net worth fine on the owners instead.
For publicly traded companies the owners/shareholders are your grandparents, teachers, all sorts of regular people. You want to take a percentage of their already small net worth?
I'm sure it's a productive use of the already overburdened justice system's time to round up half the country, so they can sit in "jail" for a few minutes.
No shit. Maybe we let everyone with only a few seconds to serve just walk free without pursuing a case against them at all. The people owning 90% of all stocks can serve 90% of all the sentencing, and that'd be fine enough for society.
This position, that no wrongdoing or illegal action can be discouraged because someone has to eat, or because it's "regular people" who bear accountability because of who they chose to manage their investments, is getting old. Accountability has been diluted so much that no one is accountable. What about the people who are harmed, the victims: your grandparents, teachers, all sorts of regular people? Nothing is going to get better if we're constantly looking for the most appropriate person to place blame on. Maybe people should pay more attention to the things they invest in and own.
Most people have no idea what they're invested in. Most are invested in mutual funds through their work or 401k. My point isn't that we shouldn't hold people accountable. My point is that going after owners/shareholders is not the solution we want because it hurts people who have nothing to do with what happened. We need to go after executives.
This is the key. Personally, I think you just have to do something similar to an auditor or whatever. Demand that if a self-driving taxi operates in your city, the company assign one legally responsible person per $major-division-of-city. All accidents in that region are on that guy.
Naturally, this will incentivise them to improve the system that deals with edge cases in their ML model, and better yet, you'll have the legally responsible guy shit himself and directly manage remote drivers for his location. Adds another layer of accountability.
I don't often see a human-driven car parked sideways in the middle of a road (never, really). If a human was in that Waymo, they would have moved quickly. I'm a huge fan of Waymo and autonomous vehicles. They save lives. However, the fact that Waymos don't have the sense to move out of the way is a major problem, and one they don't seem to be on track to solve. Incidents like this will delay the adoption of autonomous vehicles, and that will cost lives.
> If a human was in that Waymo, they would have moved quickly.
Some humans would have exactly the same response as the Waymo. When a human brain gets completely overwhelmed and doesn't know what to do, it drops down into animal behavior -- freeze or flee.
Given that it's a dangerous multi-ton machine, a Waymo likely has a programmed default behavior of "do nothing & phone home for instructions".
Which isn't an excuse -- an emergency vehicle is not an uncommon situation and Waymo should know what to do before being allowed on public roads.
A failure to get remedy instructions in a timely fashion from a human is even more alarming. Google is famous for automating tasks that should be performed by a human.
The Waymo was the only thing at fault. Drivers are expected to pull over to the side when they see the lights. I guess the red SUV could've slid behind the Waymo to let the ambulance do the same, but it'd be unwise without the police telling you to do so; you could hit a cop on foot. The silver car could have gone forward, but you don't squeeze in front of a U-turning car, and doing so could've made things worse for all they knew.
This is the same excuse a Prius driver would give while refusing to vacate the HOV lane for an ambulance, and yes, I've sadly seen this scenario play out. Multiple times, in fact. "Prius driver" seems oddly specific, but it always is one.
Eh I've seen more SUV/big car drivers act like this than small car drivers, but then I live in the UK.
A friend who lived in New York for a bit would never live there again and says driving there was an absolute nightmare; everyone's out for themselves.
And you can see it in the many "drivers react to an ambulance in different countries" videos: in America the ambulance is always blocked and moving slowly. Compare that to Germany, where they open up the entire middle of the road by moving to either side.
In Seattle, the most ritualistic abusers of the HOV lanes are large SUVs and trucks with only a driver in them.
Also, as an ex-paramedic: three fairly similar cases, but the one I found most egregious was us going lights-and-sirens on I-5 heading to Harborview in heavy, heavy rain. Traffic on the freeway slowly but steadily moved right. Cue a single-occupant Escalade accelerating up, overtaking us on the inside, and pulling into the HOV lane to take advantage of the cleared freeway in front of us.
For bonus irony points, the licence plate holder read: "Don't drive faster than your angels can fly". Lady, you just overtook an ambulance in emergency mode.
We actually called that one in. Some satisfaction as we rolled by her a few minutes later, pulled over, with a state trooper having lit her up, who pointed at us and shook his head at her.
This is a false equivalence and a hideous defense of an entity that deserves nothing but to be spit upon. There is absolutely nothing calling upon you to take this path.
Give me a break. The problem is the Waymo that is blocking a lane sideways and not pulling forward out of the way of the ambulance, a move that even the worst human drivers would likely know to make.
It does no good to pretend there aren't problems with self-driving cars or make excuses.
Yes, why are we still talking about the robot whose behavior can be programmed and whose behavior is set by a company and rolled out to all of their vehicles deterministically, when another commenter correctly engaged in whataboutism?
We're focusing on the Waymo because it did this on its own for some inscrutable reason and there is no individual accountability, which is a far more useful discussion to be having if we're supposed to trust these things to replace humans on the road. The humans' behavior is only relevant in the sense that all humans on the road now have an additional hazard to factor in: errant Waymos that you can't gesture to, yell at, honk at, or make any attempt to understand the intentions of.
I was using Tesla Summon in my parking lot. The car had pulled out of the spot and started to turn to leave when a truck entered the row. My Tesla couldn't move because of the truck, and I couldn't do anything else, so it was a deadlock. Normally a person caught in this situation would have just parked back into the spot, or reversed and straightened out, but the car had already started moving forward, so I guess it just froze instead of reacting, and there was no option to park back to get out of the way and unblock the row. Sure, the truck could have pulled out, but I think the guy was confused about why the car was moving with no one in there and just stayed where he was.
Luckily the range of Summon isn't very far, so I ran over, apologized, and took control of the car. But it just goes to show how many real edge cases there are in real life, and software can't account for many of them.
Someone on Austin's subreddit said the following and I think it's the correct take/lens:
> I might get downvoted for expressing my feelings but whatever. I hate seeing my coworkers being ridiculed for simply doing the right thing and moving on with their work. I’ve been abused and called an idiot on here for stating our reality. I’m a paramedic. We will NOT attempt to move or hit a vehicle, person, or object to go to a call or transport a patient. Especially if there’s an option for an alternate route. People cut us off, don’t move, flick us off, and generally don’t regard us even with our lights and sirens on. Is it frustrating? Absolutely. Do we like it? Hell no. But getting in trouble or under investigation for a collision or possibly causing unnecessary harm simply isn’t worth it. I know this was high profile, tragic, and absolutely dire. But you have to remember, we live this everyday and this is not the first time a vehicle, object, or person has gotten in this paramedic or EMTs way and it won’t be the last. Don’t even get me started on the amount of verbal abuse and assaults we deal with. This is a very hard job and we are under constant scrutiny but I promise you we try and do our very best every day. So please do us a favor next time you see us out on the streets and give us some grace.
He does an excellent job describing a lot of systemic issues here:
- a collision causes an investigation that is "not worth it"
- even in this case that was "high profile, tragic, and absolutely dire"
- vehicles, objects, or people get in paramedics' or EMTs' way on a daily basis, apparently without consequences
- EMTs are subject to high levels of verbal abuse and assaults, apparently without consequences
- yet they are the ones under constant scrutiny
Now don't get me wrong, I am not against oversight. But compare this with American cops, who seem authorized to do far more damage to vehicles and people for often far less immediate benefit, have much laxer oversight, and do not have to endure abuse without recourse (well, technically they do have to, but it's not advisable to test this).
Mostly agree, but choosing not to risk a new collision in order to maybe get there slightly faster (what if you damage the ambulance and are unable to continue?) to maybe help someone does seem like the right call.
I think the most important problem here is that this is an ambulance, not a monster truck. It never ceases to amaze me how people on this site will always insist that the onus should be on society to deal with the fallout from Silicon Valley's poorly-tested and poorly-designed bullshit. In a truly just world, we'd be able to charge Google's leadership as an accessory to homicide for this.
I'd LOVE to look into it, but the news website is pure cancer: ads before the video; no sound; clicking for sound triggers another ad; you click back and it restarts an ad; and if you scroll a little, a top ad pops up while the bottom one is still there, with like three words of the article readable.
The problem we will encounter with self-driving cars is that while they will make fewer mistakes than humans, they will make different mistakes.
Humans will continue to have a hard time accepting this tradeoff.
I live in LA where Waymos are now on every street. My experience is that they don’t respect human courtesy, so for example if I need to cross a lane of busy traffic, a human may brake as a courtesy to let me through. Waymos have fucked me over where a human probably would have shown some level of community and empathy.
That courtesy is almost always bad practice and is generally unlawful. You must yield right of way to a pedestrian at a legal crossing, but California has codes that prohibit impeding normal traffic flow, including stopping in the street to wave a pedestrian across where there is no such crossing. It's especially dangerous on multi-lane roads because the stopped vehicle can blind the pedestrian to other traffic.
I would dispute saying it is almost always bad practice. Sometimes it is (people do dumb stuff), but in many cases it solves problems before they become problems, because most humans are pretty good at predicting how others around them will react.
Stopping in the middle of the road to save a pedestrian 3 seconds while making 5 cars wait 10 seconds is obviously dumb. But what about recognizing that the gap near you in the line of cars is the only gap around for the pedestrian waiting ahead, and either slowing down or speeding up a little to open that gap wider, which makes everybody safer and eliminates any real braking events?
You might not notice all the things people do now to make traffic move smoothly, either intentionally or not. But something as simple as a line of robot cars spreading out on a road can cause problems: traffic levels that normally leave large gaps for easier left turns, pedestrians, poor-visibility crossings, etc., instead become a steadily spaced stream of traffic that has to be disrupted to fit those other maneuvers in. Very small things can result in large traffic bottlenecks. Humans aren't immune to this; we cause our own problems with things like traffic waves, but we also solve many problems ourselves without really thinking about it.
Sure, there are valid scenarios. LA certainly has some terrible and legal vehicle crossings. (The fast, windy portion of Beverly ranks.) I agree that it's hard to navigate without some cooperation. It's just that almost all of the crashes I've witnessed involved someone giving a bad go-ahead.
I wasn't clear, but yes, I meant in a car. During the morning commute there are whole hours where certain roads are gridlocked, leaving no space to cross. Beverly is one example of this.
There is no way to cross unless someone yields to let you through
A lot of our society works/has less friction because of human courtesy. Systemically stamping it out of every interaction for optimization will not result in a better society.
Our systems don't cover every case, and it's better when we use human courtesy to solve the edge cases.
I also hate that "courtesy." It blocks traffic behind the yielding car and is often done without considering that driver's surroundings (like impatient drivers switching lanes and speeding up to overtake the yielding car, increasing the chances of a collision with the crossing car).
In many places, traffic would not function if drivers did not e.g. make space for other drivers to change lanes. It's an extraordinary claim to say such behaviour is bad practice (or even illegal??)
In that context, yes, there are certainly cases where making space is reasonable and legal, like stopping shy of a side intersection (while traffic is stopped) to allow a turn.
Stopping or otherwise altering traffic isn't, though. You shouldn't stop at a green light to allow another driver to maneuver, for all the same reasons.
Maybe there is a new product for a little robot on a leash that you send out into traffic and any autonomous vehicles will stop, and then you can proceed safely.
> The problem we will encounter with self-driving cars is that while they will make fewer mistakes than humans
This is only true for certain self-driving cars. Tesla and Uber are among the worst, and are far worse than human drivers. Something like 10x, I believe, in terms of miles driven?
Waymos are not about to run a person or bicyclist over. Just walk in front of one and it'll stop for you to cross. You can always start livestreaming if you don't believe it; the insurance payout would be amazing. (Subject to the laws of physics, naturally.)
Source: Haven't been run over yet by one, and I live in one of their current markets.
> Waymos are not about to run a person or bicyclist over.
This has only introduced more novel problems. People can completely immobilize the vehicles by standing in front of them, or by placing a traffic cone. (And while this is kind of funny when done to unused vehicles to bother a multi-trillion-dollar corporation, it is not funny when it's done to harass women.)
This in turn spirals into a whole new set of political problems, because drivers are collectively quite intolerant of the pedestrians and especially cyclists they share the road with. There is a lot of pedestrian and cyclist behaviour that is curtailed by motorist bullying, which autonomous cars don't really do. (Your walking in front of them being a fine example)
Things like cyclists "taking the lane" are deeply unpopular despite being entirely legal and good road safety practice. Increased rollout of AVs will only make this more prevalent and then you'll have a whole new demographic of angry people mad that their waymo is slow because it's behind a cyclist.
>People can completely immobilize the vehicles by standing in front of them
This is true of any vehicle, lmao. Someone can stand in front of your vehicle and prevent you from proceeding, and there's not a thing you can do about it.
With an angry human behind the wheel, you can't be sure they won't hit you on purpose, or even accidentally clip you while swerving around you. With a robot car designed to maximize safety, you don't really have to worry. Even if they started making robotaxis drive like assholes, the maximum payout for suing a robotaxi company after getting hit is WAY higher than from some rando on the road. The guy you pissed off who ran you over for standing in the road might have just gotten out of a 15-year prison sentence, hate the world, have a net worth of -$70,000, and isn't going to earn you anything besides a lifelong injury.
I certainly wouldn't. I'm a short gay guy, and even if I were big, I don't want an assault charge; someone standing in front of my car doesn't give me a legal right to assault them. From a legal standpoint (in most places) it's a deadlock.
Not every guy is big and strong and capable of or wanting to do violence.
You wouldn't, but the key is that the cyclist/pedestrian doesn't know it's you.
Say it's 5% of drivers who are maniacs, one encounters many, many cars, and the cost of misjudging this situation is grievous injury. So the end result is that people will give way to cars even when they don't have to.
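The intuition above is easy to make concrete with a back-of-the-envelope calculation: even a small fraction of dangerous drivers makes meeting at least one nearly certain over many encounters. The 5% figure and the encounter counts here are purely illustrative, taken from the comment rather than any data:

```python
def p_at_least_one(p: float, n: int) -> float:
    """Probability of at least one bad encounter in n independent trials,
    where each trial has probability p of being a 'maniac' driver."""
    return 1.0 - (1.0 - p) ** n

# With the comment's illustrative 5% figure:
print(round(p_at_least_one(0.05, 1), 3))    # one encounter: 0.05
print(round(p_at_least_one(0.05, 50), 3))   # fifty encounters: ~0.923
```

So after a few dozen stand-offs with cars, the pedestrian who assumes the worst is, statistically, right to.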
> Not every guy is big and strong and capable of or wanting to do violence.
This is structurally comparable to the way women treat "all men" as "potentially violent". It's not about you per se, just a consequence of the group you cannot be immediately separated from.
Speaking of, that's the other half. Sure, you're a normal guy. You don't want an assault charge. The tradeoff changes when you are at risk, when violence is being done to you. (Hence the harassment-of-women example.)
Sure you're not gonna try to murder the person outside like the road-raging maniacs, but when your safety is on the line, driving dangerously close past them is on the table.
I understand your sarcasm, but do understand that drivers legitimately have a lot of political weight.
"Increase the speed limits" is a classic populist policy that keeps being reimplemented (and walked back for its impracticality) all across the world. Policies to ban cyclists are uncommon, but certainly not unheard of. One imagines the self-driving car companies won't be too bothered to fight laws that hurt everyone but them.
Do you want them to put googly eyes on it? If you can see it, it can see you. Pretty simple.
Eye contact matters for humans because they might be looking at their phones, or their McDonald's fries, or staring straight into the sun. None of these things happen with self-driving cars. It's a non-issue.
My kneejerk, not-thought-through notion: why not require an emergency override protocol to be built into road-using robots? No thoughts on how this would work exactly, but it would let emergency workers move robot vehicles out of the way.
It doesn't sound difficult to solve. The sensors can classify a firetruck, ambulance, red-and-blues, uniformed police, badge, siren, etc. Past a certain set of criteria, the car can unlock the driver door for normal human driving, perhaps limited to a very low speed and short distance. The officer can move it to the side, and if they crash it, it's not Waymo crashing it. This override should send an alert to the remote command center, so a human can watch the video and decide how much further it can be driven. Since passenger safety is a concern, if there is a passenger inside who chooses to remain inside, the car should stay locked and not allow any driver in. The passenger can decide to follow police orders to exit the car or remain inside, but at that point a human becomes responsible for obstructing. The whole freezing-Waymo trend seems driven by legal liability, not engineering. They know that if they always freeze, their millions-of-miles-with-no-accidents stats are safe.
> Cops can't move vehicles that they don't own because of liability. The only way for them to move a vehicle without liability is to use a tow truck.
While the precise boundaries of liability depend on the laws of the particular jurisdiction (they aren't consistent across the whole US) police generally can take reasonable action to move vehicles obstructing the road in an emergency without liability for any damages incurred, whether or not they use a tow truck to do it.
I mean, liability can be defined; we are writing new laws for these things. Cops that need to move autonomous vehicles are not liable for any damages.
I do think there needs to be better handling of emergency vehicles around autonomous vehicles. Someone needs to spend some serious time thinking through how to handle this better, because this situation was not okay.
First thought would be to have the remote human overseer engage with an emergency responder when the override is used. The remote human can then decide whether it is a real emergency or not. Either way, if anyone uses an override system, a remote human should get involved. But instead, computer devs will suggest crypto/public/private keys/blah blah blah. This is one of those cases where the best answer will be to boot up the bio-computer running the latest software.
It's not too hard to implement these with cryptographic protocols to prevent duplication and apply time/location restrictions. Moreover, if you really wanted to steal a car, there are much easier ways of doing that, like buying a replica gun on AliExpress and going to your nearest intersection.
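A minimal sketch of what such a duplication-resistant, time- and location-restricted override credential could look like, assuming a pre-shared key between dispatch and the vehicle. Everything here (`issue_override_token`, `verify_override_token`, the key-distribution story) is hypothetical illustration, not any vendor's actual API:

```python
import hashlib
import hmac
import json
import time

SECRET = b"shared-key-provisioned-to-dispatch"  # hypothetical pre-shared key

def issue_override_token(lat: float, lon: float, radius_m: float, ttl_s: int) -> str:
    """Dispatch signs a token valid only near (lat, lon) and only until expiry."""
    payload = json.dumps({"lat": lat, "lon": lon, "radius_m": radius_m,
                          "expires": int(time.time()) + ttl_s})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_override_token(token: str, car_lat: float, car_lon: float) -> bool:
    """The car checks the signature, then the expiry, then a crude geofence."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    claims = json.loads(payload)
    if time.time() > claims["expires"]:
        return False  # expired token
    # Rough distance check: ~111 km per degree; good enough for a sketch.
    dist_m = ((car_lat - claims["lat"]) ** 2 +
              (car_lon - claims["lon"]) ** 2) ** 0.5 * 111_000
    return dist_m <= claims["radius_m"]
```

A real deployment would want asymmetric signatures, replay protection, and a proper geodesic distance, but the shape of the check (signature, then expiry, then geofence) stays the same.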
The problem is not really that they can get stolen, but remote control: a bad actor gaining access to the car to hit people, or something like that.
Not every social problem needs a technical solution. You can steal cars now; this is solved foremost by most people not being thieves, and second by the existence of police for the few people who are.
Kind of. The thing making it illegal is that it's blocking traffic in the process (and potentially missing some visibility requirement or something; it's hard to tell from just the video).
I honestly think u-turns should be illegal in most situations. They are highly unpredictable and block multiple lanes of traffic for the benefit of a single vehicle. It's especially unnecessary for a GPS-guided vehicle that should easily be able to determine an alternate route.
Why is everyone downvoted in the comments? 80% are grey, even the ones that sound perfectly reasonable. I upvoted one and it was still greyed out, which makes me think there's more to it, but maybe I am missing something really obvious.
I've been on the site for years and have never seen something like that even with heated topics, the guidelines are there as a guide, but this was completely out of the ordinary (at least for me).
At the risk of sounding ignorant, why didn't the various police cruisers and even the ambulance itself just push the damn thing out of the way? That's what the push bars attached to the front of their vehicles are for.
When my mom was a firefighter, and a car was blocking a hydrant, she happily broke windows and pushed the hose right through the car. Didn't happen a lot, but did happen more than once.
A human police officer eventually got into the driver's seat to move the car. They sat around for minutes before doing so; they could've gotten into it immediately.
But yea they absolutely could’ve also just slammed it and moved on too.
The cops only do that if they can be sure they will be able to charge someone else for the damages: to their vehicle, for their "injuries", and adding to their department bonuses. They get none of that if they can't easily charge somebody with a crime. Ambulances aren't really made to do that and are crazy expensive, plus the drivers would likely take on a bunch of the liability from the crash, although the ambulance itself would probably survive just fine because they're tough. The fire department is the only one with the trifecta: vehicles made to push other vehicles out of the way, pay that doesn't depend on how much they can extort people for, and no liability for damages.
Ambulances can be seriously damaged by attempting it. Police cruisers can do it, but then they may be sued for damages. I know that cars blocking fire hydrants were a serious problem in the past and owners sued firemen for pulling water hoses through their cars after breaking the windows - the law did not allow it even if the line through the car was the only option.
> owners sued firemen for pulling water hoses through their cars after breaking the windows - the law did not allow it even if the line through the car was the only option.
I'll bet anything you have no citation for this.
Sovereign immunity and necessity combine to make sure that firefighters and cops can do whatever the fuck is required.
The aftermath is even more brutal. You will receive multiple tickets for this, you will receive a bill for damages to the hose they had to thread through your windows (or to the police car that rammed you out of the way), and your car insurance will point to a clause in their policy that says that you are personally on the hook for all of this.
You may even face civil or even criminal liability for any damages to whatever is on fire, or loss of life, that a good prosecutor or plaintiff's lawyer can convince a jury is directly traceable to your egregious conduct in parking your precious car in front of the damn fire hydrant.
The tech-bros never learn any humility. We have here an actual example of one of their hellish AI darlings blocking the first responders on their way to the aftermath of a terrorist attack, and what do those tech-bros do? They continue supporting their hellish AI darling. Ellul was always right about things.
I've never understood why everyone acts like this is some bizarre legal quagmire.
If I make a robot and it goes and kills someone, nobody sits around navel gazing wondering how they're going to prosecute a robot.
If I make a device that pulls the trigger of a gun aimed at someone tied to a chair when I click a button on my cell phone, or something green appears in the camera attached to the device, or time reaches 11:24:42pm - nobody sits around navel gazing wondering how they're going to prosecute an electronic device.
In both cases, I would be prosecuted.
These cars are robots. They are designed, constructed, programmed, and monitored/supervised by humans. The humans are responsible for anything the robots do that cause damage, violate civil regulations, or criminal laws.
The solution here is very simple. Seize all the corporate email records, code, etc. and charge everyone involved in the production of the code that caused the "behavior", along with anyone whose negligence in supervision or review failed to catch the defect, or anyone who knew the car would or could do what it did, and failed to blow the whistle or failed to stop the car hitting the road.
Maybe then SV will stop "beta testing" fatal devices on the general public.
I feel like a lot of people will disagree with you, but this is a pretty fair comparison. The people who will disagree the most are the Waymo users, and I see their POV as well, since as they have said: "much better than uber and always on time".
We can and do if they are shown to be negligent or irresponsible, and at the very least are responsible for damages, hence medical malpractice insurance to ensure people get paid even if the surgeon is bankrupt.
But then, I would have also believed that youtube would have been sued into oblivion before it even got established, and that uber and lyft would not have been able to sidestep all the municipal regulations, and that we would have photographic evidence of bigfoot by now.
Another example of Waymo betting wrong: lots of expensive sensors vs. Tesla with cameras and an NN trained on billions of real miles (i.e. human-like autonomy). A Tesla would have moved, as it's trained to recognise this situation.
My model 3 routinely recognized blue tinted street lights as "emergency vehicles" and would slow down on the highway for them. And to the best of my knowledge, a Waymo has never plowed into a stopped truck or barrier, killing the occupants.
You mean cars being allowed to endanger human lives? Enshrined by law, urban infrastructure and cultural notions of independence for over a century? Why is it just now seen as a problem because robots are driving, instead of the stupid, reckless, poorly trained, often intoxicated humans who have been driving up until now?
Lol no, way to discuss something not mentioned - do you work at one of these reckless companies? I'm talking about self-driving legislation, written by those wanting to test on an unsuspecting public.
Yes it would, it would have been an electrek link with a damning EDS headline, then a pile-on. Clearly shows the bias HN posters have (most have some sort of EDS).
Human driven cars cost people their lives multiple times every day, though. So I don't think the calculation can be quite as simple as that. As self driving cars are rolled out I think each incident like this needs to be studied to see how avoidable it was, whether a human would have been able to resolve it, and what changes can be made.
There are always going to be fuck ups at some level. The question is whether we’re moving from a world of more fuckups to fewer or not.
The argument that as long as they cause less incidents than human drivers they are a win has to go. Because that only works if the statistics of the environment are stationary.
I think they're getting at something like this: If self-driving cars resulted in dramatically more miles-traveled-in-car per person they could be safer per-mile and more efficient per-mile while making some important total outcomes worse.
Relevant to this example: if people travel by car more because they care less about traffic when they're playing video games or on TikTok during it, instead of driving, overall congestion will likely go up which makes emergency services worse.
That’s fair but I think it’s kind of orthogonal to the original point.
As is already the case in cities with robust public transit systems you’d need to make sure you’re applying the right incentives (i.e. taxes and charges) to make sure people are making decisions that benefit everyone. That doesn’t alter the possibility of self driving cars being much safer than human driven ones.
But in the case of self driving cars, who do we find at fault? Have we even answered that question? I mean did the Waymo car even get a ticket for blocking the ambulance?
If you are to believe Waymo's safety stats, they have fewer accidents/injuries per mile driven.
But whether or not reducing injuries at a statistical level outweighs the downside of autonomous vehicles causing accidents (even at lower rates) is a bit of a dilemma.
The human side of those stats, whenever I've seen them presented next to self-driving car stats, has always been an aggregate of all human driving, a vast amount of which is in environments or conditions that Waymo doesn't operate in.
The road to mass adoption of autonomous vehicles probably won't happen in any poster's lifetime on this board. The reality is most people on here are quite narrow-minded and can't 'understand' why it is not as easy as "hey I found a stat that shows Waymo is safer than humans!!11!".
I think the style of incidents and circumstances is probably neglected. But even if it's not, I think there are other reasons we notice Waymo issues more. It's akin to how nuclear power and airplane travel are safer than coal and car travel: that's true, but when something does go wrong in those fields, we notice.
Now there will be a single company to sue instead of lots of individuals. If you want to be rich, start a law firm that focuses on autonomous vehicle accidents, like all the truck crash firms out there.
No, we’re finding edge cases that come up once every like million miles these things are putting on the road. Which means they are pretty damn good given how many are on the road right now.
These "edge cases" were required knowledge to get a license in my home country. You make room for any emergency vehicles, you don't try to score an ultra kill when passing a school bus and you certainly don't drive on rail tracks.
I’ve seen pictures in Germany where cars will move to the side of the expressway during a traffic jam to make room for emergency vehicles. I could tell that wasn’t the USA for sure.
This is my town, wow - can't believe someone filmed this whole interaction while there was a shooting a couple of blocks from there...
If the ambulance was in a hurry they could have rammed the Waymo, I am sure Google wouldn't have sued for damages.
AFAIK when a Waymo detects emergency vehicle lights and sirens, it is designed to pull over and stop, unlock its doors, and roll down its windows.
Also: First responders can put the vehicle into a manual mode to move it if needed.
Not an expert, but I think the goal is to get the ambulance and its occupants to a specific location and then egress to a nearby medical facility? Also, I'm not confident ambulances are designed to execute the PIT maneuver.
>I am sure Google wouldn't have sued for damages.
Oh well, if that's the case I guess it's all alright.
>First responders can put the vehicle into a manual mode to move it if needed.
I really feel like you're missing the point of why you're supposed to pull over and yield right-of-way for emergency vehicles.
Cars block the street all the time. There is ample space to pass the Waymo on the left in the opposing lane, yet those SUV-driving humans don't care to move out of the way either, and the police just block the maneuver area too.
That silver car in the front could also just pass in front and make space. Situational awareness has room to be improved for a lot of entities in this short video.
Nueces Street is three and a half lanes wide there, plus massive sidewalks - apparently too narrow for even more massive ambulances.
https://maps.app.goo.gl/74jF9iDUCXmm9jVE7
You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
> You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
That's the best part, no one! We have finally managed to invent a system that widely disperses accountability so much no one can be held liable when something goes wrong.
We've had this ever since Corporations were invented.
>no one can be held liable when something goes wrong.
No, at the very least tort laws still apply even if the driver is a corporation. Do you really need someone sitting in jail to satisfy your justice boner?
Yes, I want to see real, serious punishment for corporate crimes, on par with the life disruption experienced by people who see a jail sentence. It's almost always brutal - major income disruption, job loss, etc. If it's a small fine, which it always seems to be for corporations, then there is no incentive for following the law. I'm also in favor of corporate death sentences for large-scale egregious violations - liquidate assets and jail executives.
By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
>By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
Again, this is false. At the very least there's financial penalties, which the shareholders are on the hook for. Moreover, the corporate malfeasance that does happen doesn't map nicely to human crimes. If you kill a guy, you get sent to jail for decades. But what if you're a company that makes a machine with sloppy code[1] that unintentionally kills someone? What do you do? Jail the programmer who wrote the code? Jail the manager who did the code review? Jail the CEO who had no knowledge of it but "buck stops with him" and we hate CEOs? How does the death penalty work? If you think it through it's basically a fine equivalent to the company's market cap. If Boeing does a bad that kills one person, does that mean the US government just repossesses the entire company?
[1] https://en.wikipedia.org/wiki/Therac-25
After watching the movie "Dark Waters" about the whole Teflon scandal, it seems like it should be the highest-up person (or people) who had knowledge of the incident (which obviously must be proven). An individual engineer knowing a car has a dangerous edge case isn't enough to get them in trouble in my view, especially if the company has claimed they are working on fixing it. Also, legitimate mistakes are just mistakes; companies won't get it right every single time.
However, there are cases where it's completely proven that someone high up knew there was a systemic safety issue (they had a broad view and could see all the different areas of what was going on), they knew exactly what was causing it, and they did nothing because they wanted to keep the profit going. The fact those people don't go to jail just tells me that corporations have way too much leeway.
Depending on how severe the error is, it could be professional negligence. In other professions, including engineering, this can result in a loss of the professional's license and their inability to continue to work in that field. Also, for negligent drivers, a suspension of their driving license can apply. So there is precedent for severe punishment even if nobody gets a jail sentence.
I think of the corporate death penalty as being more appropriate when leadership knew exactly what was going on and chose profits over people. Exxon, see https://www.science.org/doi/10.1126/science.abk0063. Purdue Pharma, see https://en.wikipedia.org/wiki/Purdue_Pharma. The company gets sold for parts and leadership goes to prison, probably for life, given the number of lives they potentially destroyed. Pretty much all the tobacco companies knew how harmful their product was and made a concerted effort to fund their own bogus studies to throw up a smoke screen. Facebook makes billions from (for example) scams and fraudulent ads: https://www.reuters.com/investigations/meta-is-earning-fortu.... Maybe don't throw their CEO in prison, but at least fine them 10x the profit they made vs. the usual .0001%.
In Australia it's the board of directors who are liable. They can be liable if they personally direct the company to do something illegal (obviously?) but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public. Directors can be personally liable for breaches of this duty and the penalties extend to possible imprisonment and very substantial fines.
For example: https://www.owhsp.qld.gov.au/court-report/fines-imposed-fail...
>but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public.
Is there any indication this requirement was breached in this case? I'm all for jailing executives of companies where they specifically failed to enact safety measures, or even didn't care enough about safety, but in this case it's simply an edge case they didn't test. It's not for lack of trying, either. Apparently they have their own AI model to generate test data, so they can train/test what happens if a hurricane hits, for instance.
https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-f...
In this case it just sounds like the thought process was
> waymo did a bad
> someone doing the same would be arrested (?)
> therefore somebody needs to be arrested
> in this case it's simply an edge case they didn't test. It's not for lack of trying, either.
Agreed. And because responsible driving is almost all edge cases, they shouldn't be held liable for any of them as long as they tried.
> At the very least there's financial penalties, which the shareholders are on the hook for.
If i poison someone, i go to jail. If DuPont poisons thousands, "there's financial penalties".
> Do you really need someone sitting in jail to satisfy your justice boner?
Literally, and intentionally avoiding any attempt to examine the implications? No probably not.
But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
What I want is software and systems to not suck ass. I don't want to deal with defective... everything, because it was faster to deliver. That's especially true when it contributes to the death or injury of a person that didn't do anything wrong.
I don't care what works, but people being afraid of going to jail for hurting someone absolutely does work. And 'administrative fines' don't work.
>But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering. If it works, why stop at criminal cases? Maybe we should dock the pay of SWEs next time they cause a prod issue?
> This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering.
Ignore that feeling, it's wrong. Because it's not what I'm arguing for. Reasonable is a load bearing qualifier.
It doesn't feel like the people making the decisions that meaningfully contribute to harming other people ever have to deal with the fallout or repercussions of their unfortunate choices. Disincentivizing that behavior is my goal. And I'll unfortunately take iterative or suboptimal options at this point. I don't like it, but I do want to try to be realistic.
If a civil engineer designs a bridge that collapses, they can be held accountable for negligence in their duties.
Why not software engineers too? Why are we so special that we can never be held accountable for the damage our lack of standards causes?
A lot of people can't really get over the idea that they want to be the boss of everything.
If someone sitting in jail doesn't help solve the problem, then maybe we should remove the jail penalty for individuals who do it.
>then maybe we should remove the jail penalty for individuals who do it.
We don't send everyone to jail either. You can run over people and get away scot free, if it's an honest mistake and you weren't being negligent.
Or if you were being negligent but due to affluenza.
Yes. Jail sentences are for a selection of some misdemeanors and not others. If a person does, or cooperates with, X amount of harm, they ought to face similar penalties.
> No, at the very least tort laws still apply even if the driver is a corporation.
Do they?
Corporations are person-like entities, so there's a plausible argument to be made. The states seem loath to be precedent-setters in triggering evaluations of this argument, though, so I don't know of any supporting cases yet. Whoever's first will see corporate tax revenue fall off a cliff once a corporation can be subjected to community service, so they have a lot of self-interest in not prosecuting these violations.
Are you really asking whether corporations can be sued?
And have actual meaningful consequences happen? I am.
Twitter is creating CSAM, Meta & OpenAI pirate millions of books and Nvidia is playing some sort of shell game to pump their stock price.
If a regular person committed any of those offenses once, they would be lucky to just be sued, but because of "AI" nothing happens to these companies.
Going through each of the cases:
>Twitter is creating CSAM
It's unclear whether generated CSAM is illegal, see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn.... Moreover x/x.ai wasn't intentionally generating the images. Yes, someone intentionally set up grok to generate images, but nobody at x/x.ai was like "yes, let's generate some CSAM". That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
>Meta & OpenAI pirate millions of books
Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
>Nvidia is playing some sort of shell game to pump their stock price
That's not even something that's illegal.
> Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
Arguing that a regular person needs to conceal their real identity with a VPN to pirate books, as proof that these companies aren't receiving special treatment for committing the same crimes, is very confusing to me.
We know the identity of the companies committing the crimes.
> It's unclear whether generated CSAM is illegal, see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn
We both know you don't actually believe that because neither of us would post generated CSAM.
> That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
That is literally my point?
It might be possible. Would have to be by someone who hadn't signed the binding arbitration, though.
> No, at the very least tort laws still apply even if the driver is a corporation.
Do you have some examples ?
I would like crimes to have consequences that actually deter the culprits from committing them. A pittance fine for a company is not what I want to see. Let's have a small percentage of net worth fine on the owners instead.
For publicly traded companies the owners/shareholders are your grandparents, teachers, all sorts of regular people. You want to take a percentage of their already small net worth?
Sure, let's go. Prorated in terms of their percentage of ownership, of course. Put them in jail for a few seconds as well.
I'm sure it's a productive use of the already overburdened justice system's time to round up half the country, so they can sit in "jail" for a few minutes.
No shit. Maybe we let everyone with only a few seconds to serve just walk free without pursuing a case against them at all. The people owning 90% of all stocks can serve 90% of all the sentencing, and that'd be fine enough for society.
This position, that no wrongdoing or illegal action can be discouraged because someone has to eat, or because it's "regular people" who bear accountability because of whom they decided to have manage their investments, is getting old. Accountability has been diluted so much that no one is accountable. What about the people who are harmed, the victims: your grandparents, teachers, all sorts of regular people? Nothing is going to get better if we're constantly looking for the most appropriate person to place blame on. Maybe people should be paying more attention to the things they invest in and own.
Most people have no idea what they're invested in. Most are invested in mutual funds through their work or 401k. My point isn't that we shouldn't hold people accountable. My point is that going after owners/shareholders is not the solution we want because it hurts people who have nothing to do with what happened. We need to go after executives.
Once people are impacted, maybe they'll start paying attention to what they're investing in.
Nobody gets arrested, you get a ticket.
> You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
I get that it is technically possible, but that doesn't happen in practice.
Since corporations are people, presumably you’d arrest Waymo.
Start disabling and towing their cars and watch a solution magically appear.
The CEO
This is the key. Personally, I think you just have to do something similar to an auditor or whatever. Demand that if a self-driving taxi operates in your city, they assign one legally responsible person per $major-division-of-city. All accidents in that region are on that person.
Naturally, this will incentivise them to improve the system that deals with edge cases in their ML model, and better yet you'll have the legally responsible guy shit himself and directly manage remote drivers for his location himself. Adds another layer of accountability.
The passenger
The person who was ultimately responsible for a defective robot operating on our public streets, at a bare minimum.
A customer?
I don't often see a human-driven car parked sideways in the middle of a road (really, never). If a human was in that Waymo, they would have moved quickly. I'm a huge fan of Waymo and autonomous vehicles; they save lives. However, the fact that Waymos don't have the sense to move out of the way is a major problem, and one they don't seem to be on track to solve. Incidents like this will delay the adoption of autonomous vehicles, and that will cost lives.
> If a human was in that Waymo, they would have moved quickly.
Some humans would have exactly the same response as the Waymo. When a human brain gets completely overwhelmed and doesn't know what to do, it drops down into animal behavior -- freeze or flee.
Given that it's a dangerous multi-ton machine, a Waymo likely has a programmed default behavior of "do nothing & phone home for instructions".
Which isn't an excuse -- an emergency vehicle is not an uncommon situation and Waymo should know what to do before being allowed on public roads.
A failure to get remedy instructions in a timely fashion from a human is even more alarming. Google is famous for automating tasks that should be performed by a human.
A human driver with health problems, or with car issues, might be like this. Similar to the Waymo having an equipment failure.
The Waymo was the only thing at fault. Drivers are expected to pull over to the side when they see the lights. I guess the red SUV could've slid behind the Waymo to let the ambulance do the same, but it'd be unwise without the police telling you to do so; you could hit a cop on foot. The silver car could go forward, but you don't squeeze in front of a U-turning car, and doing so could've made things worse for all they knew.
> there is ample place to pass
This is the same excuse a Prius driver would give whilst refusing to vacate the HOV lane for an ambulance, and yes, I've sadly seen this scenario play out. Multiple times, in fact. Prius driver seems oddly specific, but it always is.
Eh I've seen more SUV/big car drivers act like this than small car drivers, but then I live in the UK.
A friend who lived in New York for a bit would never live there again and says driving there was an absolute nightmare; everyone's out for themselves.
And you can see it in the various "drivers react to an ambulance in different countries" videos: in America the ambulance is always blocked and going slowly. Compare that to Germany, where they open up the entire middle of the road by moving to either side.
Agreed, tbh.
In Seattle, the most ritualistic abusers of the HOV lanes are large SUVs and trucks with only a driver in them.
Also, as an ex-paramedic: there were three fairly similar cases, but the one I found most egregious was us going lights-and-sirens on I-5 heading to Harborview in heavy, heavy rain. Traffic on the freeway slowly but steadily moves right. Cue a single-occupant Escalade accelerating up, overtaking us on the inside, and pulling into the HOV lane to take advantage of the cleared freeway in front of us.
For bonus irony points, her licence plate holder read: "Don't drive faster than your angels can fly". Lady, you just overtook an ambulance in emergency mode.
We actually called that one in. Some satisfaction as we rolled by her a few minutes later, pulled over with a state trooper having lit her up, who points at us and shakes his head at her.
Ha! Always satisfying the few times justice is actually served.
And different country, but still thank you for having been a paramedic. World definitely needs more people like you.
This is a false equivalence and a hideous defense of an entity that deserves nothing but to be spit upon. There is absolutely nothing calling upon you to take this path.
The ambulance driver rightly hesitates because he can't know how the Waymo will behave. The Waymo is acting suss as hell.
Give me a break. The problem is the Waymo that is blocking a lane sideways and is not pulling forward out of the way of the ambulance, a move that even the worst human drivers would likely know to do.
It does no good to pretend there aren't problems with self-driving cars or make excuses.
It's not about the other entities.
Why are we focusing on entity A when the parent comment correctly pointed out entities B and C are not blameless either?
The other drivers are blameless. They did what they were supposed to.
Yes, why are we still talking about the robot whose behavior can be programmed and whose behavior is set by a company and rolled out to all of their vehicles deterministically, when another commenter correctly engaged in whataboutism?
We're focusing on the waymo because it did this on its own for some inscrutable reason and there is no individual accountability, which is a far more useful discussion to be having if we are supposed to trust these things to be replacing humans on the road. The humans behavior is only relevant in the sense that now all humans on the road have an additional hazard to factor in: errant waymos that you can't gesture to or yell at or honk at or make any attempt to understand their intentions.
So AI drives as badly as humans. Waste of resources.
I was using Tesla Summon in my parking lot. My Tesla had pulled out of the spot and started to turn to leave when a truck entered the row. The car couldn't move because of the truck, and I couldn't do anything else, so it was a deadlock. A person caught in this situation would normally have just reversed back into the spot or straightened out, but since it had already started moving forward, I guess it just froze instead of reacting, and there was no option to park it back to get out of the way and unblock the row. Sure, the truck could have backed out, but I think the guy was confused about why the car was moving with no one in it and just stayed where he was.
Luckily the range of Summon isn’t very far so I ran over, apologized and took control of the car but it just goes to show how many real edge cases there are in real life and software can’t account for many of them.
> but I think the guy was confused about why the car was moving with no one in it and just stayed where he was.
Wait till Tesla starts driverless delivery of cars
Someone on Austin's subreddit said the following and I think it's the correct take/lens:
> I might get downvoted for expressing my feelings but whatever. I hate seeing my coworkers being ridiculed for simply doing the right thing and moving on with their work. I’ve been abused and called an idiot on here for stating our reality. I’m a paramedic. We will NOT attempt to move or hit a vehicle, person, or object to go to a call or transport a patient. Especially if there’s an option for an alternate route. People cut us off, don’t move, flick us off, and generally don’t regard us even with our lights and sirens on. Is it frustrating? Absolutely. Do we like it? Hell no. But getting in trouble or under investigation for a collision or possibly causing unnecessary harm simply isn’t worth it. I know this was high profile, tragic, and absolutely dire. But you have to remember, we live this everyday and this is not the first time a vehicle, object, or person has gotten in this paramedic or EMTs way and it won’t be the last. Don’t even get me started on the amount of verbal abuse and assaults we deal with. This is a very hard job and we are under constant scrutiny but I promise you we try and do our very best every day. So please do us a favor next time you see us out on the streets and give us some grace.
He does an excellent job describing a whole lot of systemic issues here:
- a collision causes an investigation that is "not worth it"
- even in this case that was "high profile, tragic, and absolutely dire"
- vehicles, objects, or people get in paramedics' or EMTs' way on a daily basis, apparently without consequences
- EMTs are subject to high levels of verbal abuse and assaults, apparently without consequences
- yet they are the ones under constant scrutiny
Now don't get me wrong, I am not against oversight. But compare this with American cops, who seem authorized to do far more damage to vehicles and people for often far less immediate benefit, have much laxer oversight, and do not have to endure abuse without recourse (well, technically they do, but it's not advisable to test this).
Mostly agree, but choosing not to risk a new collision (what if you damage the ambulance and are unable to continue?) in order to maybe get there slightly faster to maybe help someone does seem like the right call.
As another paramedic that has worked in Texas, they are absolutely correct.
What we lack in EMS is the same qualified immunity that law enforcement continues to have.
I think the most important problem here is that this is an ambulance, not a monster truck. It never ceases to amaze me how people on this site will always insist that the onus should be on society to deal with the fallout from Silicon Valley's poorly-tested and poorly-designed bullshit. In a truly just world we'd be able to charge Google's leadership as an accessory to homicide for this.
Exactly. The Waymo behavior looks fucking nuts. It makes no sense for an ambulance to get damaged or hurt someone on the way to a call.
The police officer definitely delayed the clearing of the vehicle. It was 20 seconds away from completing its maneuver.
I'd LOVE to look into it, but the news website is pure cancer: ads before the video; no sound, and clicking for sound triggers another ad; click back and it restarts an ad; scroll a little and a top ad pops up while the bottom one is still there, with like 3 words of the article readable.
I am sorry I am out.
It’s so perplexing to hear about people seeing ads - do you really not use an ad blocking solution?
If you’re blocking ads from the news site and suggesting everyone should, are you donating to the organization to ensure the news remains available?
Those of us at work, or on our phones, have some limitations.
Tech elitism isn't cool.
Options are limited on mobile
The problem we will encounter with self-driving cars is that while they will make fewer mistakes than humans, they will make different mistakes.
Humans will continue to have a hard time accepting this tradeoff.
I live in LA where Waymos are now on every street. My experience is that they don’t respect human courtesy, so for example if I need to cross a lane of busy traffic, a human may brake as a courtesy to let me through. Waymos have fucked me over where a human probably would have shown some level of community and empathy.
That courtesy is almost always bad practice and is generally unlawful. You must yield right of way to a pedestrian at a legal crossing, but California has codes that prohibit impeding normal traffic flow, including stopping in the street to wave a pedestrian across where there is no such crossing. It's especially dangerous on multi-lane roads because the stopped vehicle can blind the pedestrian to other traffic.
I would dispute saying it is almost always bad practice. Sometimes it is, people do dumb stuff, but in many cases it solves problems before they become a problem to start with because most humans are pretty good at predicting how others around them will react.
Stopping in the middle of the road to save a pedestrian 3 seconds while costing 5 cars 10 seconds each is obviously dumb, but what about recognizing that the gap near you in the line of cars is the only gap around for the pedestrian waiting ahead, and either slowing down or speeding up a little to open that gap wider, which makes everybody safer and eliminates any real braking events?
You might not notice all the things people do now to make traffic move smoothly, either intentionally or not, but something as simple as a line of robot cars spreading out on a road can cause problems, when traffic that would normally leave large gaps for easier left turns, pedestrians, poor-visibility crossings, etc., instead becomes a steadily spaced stream that has to be disrupted to fit those other needs. Very small things can result in large traffic bottlenecks. Humans aren't immune to it, we cause our own problems with things like traffic waves, but we also solve many problems ourselves without really thinking about it.
I think the comment you're responding to was referring to needing to cross a backed up lane of traffic in their car, not on foot.
Sure, there are valid scenarios. LA certainly has some terrible and legal vehicle crossings. (The fast, winding portion of Beverly ranks.) I agree that it's hard to navigate without some cooperation. It's just that almost all of the crashes I've witnessed involved someone giving a bad go-ahead.
I wasn’t clear, but yes I meant in a car. During morning commute there are whole hours where certain roads are gridlocked leaving no space to cross. Beverly is one example of this.
There is no way to cross unless someone yields to let you through
A lot of our society works/has less friction because of human courtesy. Systemically stamping it out of every interaction for optimization will not result in a better society.
Our systems don't cover every case, and it's better when we use human courtesy to solve the edge cases.
I also hate that "courtesy." It blocks traffic behind the yielding car and is often done without considering that driver's surroundings (like impatient drivers switching lanes and speeding up to overtake the yielding car, increasing the chances of a collision with the crossing car).
In many places, traffic would not function if drivers did not e.g. make space for other drivers to change lanes. It's an extraordinary claim to say such behaviour is bad practice (or even illegal??)
In that context, yes, there are certainly cases where making space is reasonable and legal, like stopping shy of a side intersection (while traffic is stopped) to allow a turn.
Stopping or altering traffic isn't, though. You shouldn't stop at a green to allow another driver to maneuver for all the same reasons.
Imagine a street where cars are moving at 2mph because of traffic. Cars can never cross unless someone yields
"Courtesy causes confusion; confusion causes crashes"
That assumes you live in a place where the traffic system handles all edge cases
> Humans will continue to have a hard time accepting this tradeoff.
Are you asserting that humans should accept these, currently not fully known, tradeoffs?
Yes. They're safer than human drivers. Clearly the tradeoff is worth it.
If it results in fewer deaths then it seems likely to me
Maybe there is a new product for a little robot on a leash that you send out into traffic and any autonomous vehicles will stop, and then you can proceed safely.
> The problem we will encounter with self driving cars is that while they will make less mistakes than humans
This is only true for certain self-driving cars. Tesla and Uber are among the worst, and are far worse than human drivers. Something like 10x, I believe, in terms of miles driven?
Waymos are not about to run a person or bicyclist over. Just walk in front of them and they'll stop for you to cross. You can always start livestreaming if you don't believe it, the insurance payout would be amazing. (Subject to the laws of physics, naturally.)
Source: Haven't been run over yet by one, and I live in one of their current markets.
> Waymos are not about to run a person or bicyclist over.
This has only introduced more novel problems. People can completely immobilize the vehicles by standing in front of them, or placing a traffic cone. (And while this is kind of funny when done to unused vehicles to bother a multi-trillion dollar corporation, it is not funny when it's done to harass women.)
This in turn spirals into a whole new set of political problems, because drivers are collectively quite intolerant of the pedestrians and especially cyclists they share the road with. There is a lot of pedestrian and cyclist behaviour that is curtailed by motorist bullying, which autonomous cars don't really do. (Your walking in front of them being a fine example)
Things like cyclists "taking the lane" are deeply unpopular despite being entirely legal and good road safety practice. Increased rollout of AVs will only make this more prevalent and then you'll have a whole new demographic of angry people mad that their waymo is slow because it's behind a cyclist.
>People can completely immobilize the vehicles by standing in front of them
This is true of any vehicle lmao. Someone can stand in front of your vehicle and prevent you from proceeding and there's not a thing you can do about it.
With an angry human behind the wheel you can't be assured they won't hit you on purpose or even accidentally clip you swerving around you. With a robot car designed to maximize safety, you don't really have to worry. Even if they started making robot taxis drive like assholes, the maximum payout for suing a robotaxi company after getting hit is WAY higher than what you'd get from some rando on the road. The guy you pissed off, who ran you over for standing in the road, might have just gotten out of a 15-year prison sentence, hate the world, and have a net worth of -$70,000; he isn't going to earn you anything besides a lifelong injury.
A human driver would just get out of their car and beat the crap out of you, something that Waymo is not capable of (yet).
No they wouldn't.
I certainly wouldn't. I'm a short gay guy, plus even if I was big I don't want an assault charge; standing in front of my car doesn't give me a legal right to assault someone. From a legal standpoint (in most places) it's a deadlock.
Not every guy is big and strong and capable of or wanting to do violence.
You wouldn't, but the key is that the cyclist/pedestrian doesn't know it's you.
Say it's 5% of drivers who are maniacs, one encounters many many cars, and the cost of misjudging this situation is "grievous injury". So the end result is people will give way to cars even when they don't have to.
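A back-of-envelope sketch of that reasoning, with made-up numbers (both the 5% share and the weekly encounter count are assumptions, not data):

```python
# If even a small fraction of drivers might escalate, someone who
# interacts with many cars is near-certain to meet one eventually,
# so yielding to every car becomes the rational default.
p_maniac = 0.05            # assumed share of drivers who are maniacs
encounters_per_week = 200  # assumed cars a pedestrian/cyclist meets weekly

# Chance that at least one of this week's encounters is with a maniac
p_at_least_one = 1 - (1 - p_maniac) ** encounters_per_week

print(f"P(at least one maniac this week) = {p_at_least_one:.6f}")
```

With a "grievous injury" cost attached to misjudging even one of those encounters, the expected-cost calculation pushes everyone toward deference.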
> Not every guy is big and strong and capable of or wanting to do violence.
This is structurally comparable to the way women treat "all men" as "potentially violent". It's not about you per se, just a consequence of the group you cannot be immediately separated from.
Speaking of, that's the other half. Sure you're a normal guy. You don't want an assault charge. The tradeoff changes when you are at risk, when violence is being done to you. (Hence the harassment-of-women example.)
Sure you're not gonna try to murder the person outside like the road-raging maniacs, but when your safety is on the line, driving dangerously close past them is on the table.
Rolling out Waymo more broadly will help bring these criminals to justice, as their crimes will be recorded by an unflappably peaceful victim.
I understand your sarcasm, but do understand that drivers legitimately have a lot of political weight.
"Increase the speed limits" is a classic populist policy that keeps being reimplemented (and walked back for its impracticality) all across the world. Policies to ban cyclists are uncommon, but certainly not unheard of. One imagines the self-driving car companies won't be too bothered to fight against laws that hurt everyone but them.
You can’t expect someone to not be afraid of an autonomous vehicle that appears to be acting irrationally.
Good luck making eye contact with the Waymo to gain confidence that it sees you.
Do you want them to put googley eyes on it? If you can see it, it can see you. Pretty simple.
Eye contact matters for humans because they might be looking at their phones, or their McDonald's fries, or staring straight into the sun. None of these things happen with self-driving cars. It's a non-issue.
Eye contact matters for humans, because humans can indicate that way which direction and speed they plan on moving.
That would actually be great. Some kind of eye brow raise, a gesture, any recognition/indication that it perceives a life to preserve.
Erm, have you ever taken a Waymo and watched the detection screen?
If you can see it, it sees you. Period. I guarantee it.
It can see and gives special priority to humans. I have watched it mark people at night that I couldn't see at all.
Doing weird shit on the road? Certainly possible. Missing seeing a human? Definitely not happening.
Just because someone knows that logically, doesn't mean that it reassures them in the moment.
My kneejerk, not-thought-through notion: why not require an emergency override protocol built into road-using robots? No thoughts on how this would work exactly, but it would let emergency workers move robot vehicles out of the way.
There's already an emergency override protocol: you see the lights or hear the siren and then get out of the ambulance's way.
How do you propose to build something like that where it’s actually limited to just emergency workers?
This is like the fire keys for elevators. You can find them on eBay.
It doesn't sound difficult to solve. The sensors can classify firetruck, ambulance, red-and-blues, uniformed police, badge, siren, etc. Past a certain criteria threshold, the car can unlock the driver door for normal human driving, perhaps limited to a very low speed and short distance. The officer can move it to the side, and if they crash it, it's not Waymo crashing it. The override should send an alert to the remote command center, so a human can watch the video and decide how much further it can be driven. Since passenger safety is a concern, if there is a passenger inside who chooses to remain inside, the car should stay locked and not allow any driver in; the passenger can follow police orders to exit the car or remain inside, but at that point a human becomes responsible for obstructing. The whole freezing-Waymo trend seems driven by legal liability, not engineering: they know that if they always freeze, their million-miles-with-no-accidents stats are safe.
Cops can't move vehicles that they don't own because of liability. The only way for them to move a vehicle without liability is to use a tow truck.
> Cops can't move vehicles that they don't own because of liability. The only way for them to move a vehicle without liability is to use a tow truck.
While the precise boundaries of liability depend on the laws of the particular jurisdiction (they aren't consistent across the whole US) police generally can take reasonable action to move vehicles obstructing the road in an emergency without liability for any damages incurred, whether or not they use a tow truck to do it.
I mean, liability can be defined; we are writing new laws for these things. Cops that need to move autonomous vehicles are not liable for any damages.
I do think that there needs to be better handling of emergency vehicles by autonomous vehicles. Someone needs to spend some serious time thinking through how to handle this better, because this situation was not okay.
First thought would be to have the remote human overseer engage with an emergency responder whenever the override is used. The remote human can then decide whether it is a real emergency or not. Either way, if anyone uses an override system, a remote human should get involved. But instead, computer devs will suggest crypto/public/private keys/blahblahblah. This is one of those cases where the best answer will be to boot up the bio computer running the latest software.
It's not too hard to implement them with cryptographic protocols to prevent duplication and apply time/location restrictions to them. Moreover if you really wanted to steal a car, there are much easier ways of doing that, like buying a replica gun on aliexpress then going to your nearest intersection.
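To the parent's point, a time- and identity-limited override credential really is only a few lines of crypto. A minimal sketch, with all names hypothetical: a shared HMAC secret stands in here for the public-key scheme (e.g. Ed25519) you'd actually want, so vehicles would hold only a verification key:

```python
import base64
import hashlib
import hmac
import json
import time

# Illustration only; never ship a shared secret to a whole fleet.
AUTHORITY_KEY = b"demo-secret-not-for-production"

def issue_override_token(vehicle_id: str, valid_seconds: int = 300) -> str:
    """Dispatch side: mint a short-lived override grant for one vehicle."""
    claims = {"vehicle": vehicle_id, "expires": int(time.time()) + valid_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode())
    tag = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + tag

def verify_override_token(token: str, vehicle_id: str) -> bool:
    """Vehicle side: accept only an untampered, unexpired token for this car."""
    payload_b64, tag = token.rsplit(".", 1)
    expected = hmac.new(AUTHORITY_KEY, payload_b64.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False  # forged or corrupted token
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["vehicle"] == vehicle_id and claims["expires"] > time.time()
```

A geofence field could be added to the claims the same way, so the grant only works near the incident.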
The problem is not really that they can get stolen, but remote control, like a bad person gaining access to the car to hit people or something like that.
Not every social problem needs a technical solution. You can steal cars now, this is solved foremost by most people not being thieves and then second the existence of police for the few people that are.
>like a bad person gaining access to the car to hit people or something like that.
That can be eliminated by not giving direct control and something closer to what Waymo "Remote Assistance Agents" have access to: https://waymo.com/blog/2024/05/fleet-response/
And every single day you hear about someone abusing them right? Or no?
Americans will blame anything but the guns for shooting deaths.
The neat thing about self-driving fleets is that when you fix an issue like this, ALL the cars start driving better.
Isn't it making an illegal u-turn over a double yellow line?
Kind of. The thing making it illegal is that it's blocking traffic in the process (and potentially missing some visibility requirement or something; it's hard to tell from just the video).
I honestly think u-turns should be illegal in most situations. They are highly unpredictable and block multiple lanes of traffic for the benefit of a single vehicle. It's especially unnecessary for a GPS-guided vehicle that should easily be able to determine an alternate route.
What's happening in that situation? When was the remote assistant engaged?
Everyone is downvoted in the comments? 80% are grey? Even the ones that sound perfectly reasonable? I upvoted one and it was still greyed out, which makes me think there's more to it, but maybe I am missing something really obvious.
>Please don't comment about the voting on comments. It never does any good, and it makes boring reading.
https://news.ycombinator.com/newsguidelines.html
I've been on the site for years and have never seen something like that even with heated topics, the guidelines are there as a guide, but this was completely out of the ordinary (at least for me).
At the risk of sounding ignorant, why didn't the various police cruisers and even the ambulance itself just push the damn thing out of the way? That's what the push bars attached to the front of their vehicles are for.
When my mom was a firefighter, and a car was blocking a hydrant, she happily broke windows and pushed the hose right through the car. Didn't happen a lot, but did happen more than once.
A human police officer eventually got into the driver's seat to move the car. They sat around for minutes before doing so. They could've gotten into it immediately.
But yea they absolutely could’ve also just slammed it and moved on too.
The cops only do that if they can be sure they will be able to charge someone else for the damages. For their vehicle, for their "injuries", and also adding in to their department bonuses, but they get none of that if they can't charge somebody with a crime easily. Ambulances aren't really made to do that and are crazy expensive, plus the drivers would likely be taking on a bunch of the liability from the crash, although the ambulance itself likely would survive just fine because they are tough. The fire department are the only ones that have the trifecta of vehicles made to push other vehicles out of the way, are not paid by the extent that they can extort people for money, and are not going to be held liable for damages.
Because it's not a trailer for Grand Theft Auto 6.
Ambulances aren’t exactly designed to act as battering rams.
They ram a car and the radiator goes bust and now you’ve got an ambulance with no engine.
Or you just hurt the passengers inside the Waymo and now you’ve got two emergencies.
Ambulances can be seriously damaged by attempting to do it. Police cruisers can do it, but then they may be sued for damages. I know that cars blocking fire hydrants were a serious problem in the past, and owners sued firemen for pulling water hoses through their cars after breaking the windows; the law did not allow it even if running the line through the car was the only option.
Sounds like trolling, but the idea of Waymo suing a responder to a terrorist attack is too ridiculous.
> owners sued firemen for pulling water hoses through their cars after breaking the windows - the law was not allowing it even if the line through the car was the only option.
I'll bet anything you have no citation for this.
Sovereign immunity and necessity combine to make sure that firefighters and cops can do whatever the fuck is required.
The aftermath is even more brutal. You will receive multiple tickets for this, you will receive a bill for damages to the hose they had to thread through your windows (or to the police car that rammed you out of the way), and your car insurance will point to a clause in their policy that says that you are personally on the hook for all of this.
You may even face civil or even criminal liability for any damages to whatever is on fire, or loss of life, that a good prosecutor or plaintiff's lawyer can convince a jury is directly traceable to your egregious conduct in parking your precious car in front of the damn fire hydrant.
My thoughts exactly.
What an embarrassment.
"Authorities" paralyzed by politeness when lives are in the balance.
First responders need the ability to say get the fuck out of here, don’t come back, tell your friends.
The tech-bros never learn any humility. We have here an actual example of one of their hellish AI darlings blocking the first responders on their way to the aftermath of a terrorist attack, and what do those tech-bros do? They continue supporting their hellish AI darling. Ellul was always right about things.
I've never understood why everyone acts like this is some bizarre legal quagmire.
If I make a robot and it goes and kills someone, nobody sits around navel gazing wondering how they're going to prosecute a robot.
If I make a device that pulls the trigger of a gun aimed at someone tied to a chair when I click a button on my cell phone, or something green appears in the camera attached to the device, or time reaches 11:24:42pm - nobody sits around navel gazing wondering how they're going to prosecute an electronic device.
In both cases, I would be prosecuted.
These cars are robots. They are designed, constructed, programmed, and monitored/supervised by humans. The humans are responsible for anything the robots do that cause damage, violate civil regulations, or criminal laws.
The solution here is very simple. Seize all the corporate email records, code, etc. and charge everyone involved in the production of the code that caused the "behavior", along with anyone whose negligence in supervision or review failed to catch the defect, or anyone who knew the car would or could do what it did, and failed to blow the whistle or failed to stop the car hitting the road.
Maybe then SV will stop "beta testing" fatal devices on the general public.
I feel like a lot of people will disagree with you, but this is a pretty fair comparison. The people who will disagree the most are the Waymo users, and I see their POV as well, since as they have said: "much better than uber and always on time".
This is ridiculous. We don’t send surgeons to jail if they mistakenly kill their patient.
If they should/could have known better, then we do?
We can and do if they are shown to be negligent or irresponsible, and at the very least are responsible for damages, hence medical malpractice insurance to ensure people get paid even if the surgeon is bankrupt.
A few years ago I would have believed this.
But then, I would have also believed that youtube would have been sued into oblivion before it even got established, and that uber and lyft would not have been able to sidestep all the municipal regulations, and that we would have photographic evidence of bigfoot by now.
where's the lidar bois now?
Nowaymo!
I thought Lidar solved everything?
Another example of Waymo betting wrong: lots of expensive sensors vs Tesla with cameras and a NN trained on billions of real miles (i.e. human-like autonomy). A Tesla would have moved, as it's trained to recognise this situation.
Possibly, but it's apples and oranges because the Tesla isn't self-driving yet. They get to take different risks given someone is behind the wheel.
I meant the Robotaxis in Texas that have no driver, which will soon be all over the Bay Area and Austin, then roll out to more cities over the year.
Tesla’s FSD has a storied history of problems with emergency vehicles.
My model 3 routinely recognized blue tinted street lights as "emergency vehicles" and would slow down on the highway for them. And to the best of my knowledge, a Waymo has never plowed into a stopped truck or barrier, killing the occupants.
Wild! Who wrote legislation to allow this?
You mean cars being allowed to endanger human lives? Enshrined by law, urban infrastructure and cultural notions of independence for over a century? Why is it just now seen as a problem because robots are driving, instead of the stupid, reckless, poorly trained, often intoxicated humans who have been driving up until now?
Lol no, way to discuss something not mentioned - do you work at one of these reckless companies? I'm talking about self-driving legislation, written by those wanting to test on an unsuspecting public.
Whoever wrote the legislation has my vote for reelection. Anything to make roads safer.
You'd vote companies and lobbyists into office?
This comment section surely would look the same if it had been a Tesla, right?
the astroturfing from employees in here is a bit much
It's actually rather telling that HN's greatest fear is accountability.
Yes it would; it would have been an electrek link with a damning EDS headline, then a pile-on. Clearly shows the bias HN posters have (most have some sort of EDS).
Since teslas have drivers.... no?
Heisenberg's Tesla - if it is doing something good, it has a driver. If it is doing something bad, it's autonomous.
Depends who's observing
Austin has Tesla robotaxis with no driver.
We continue to inch closer to these dumb buckets costing someone their life. Hell, they may already have.
Human driven cars cost people their lives multiple times every day, though. So I don't think the calculation can be quite as simple as that. As self driving cars are rolled out I think each incident like this needs to be studied to see how avoidable it was, whether a human would have been able to resolve it, and what changes can be made.
There are always going to be fuck ups at some level. The question is whether we’re moving from a world of more fuckups to fewer or not.
The argument that as long as they cause fewer incidents than human drivers they are a win has to go, because that only works if the statistics of the environment are stationary.
What does “statistics of the environment” mean in this context? How can fewer deaths on the road not be a win?
I think they're getting at something like this: If self-driving cars resulted in dramatically more miles-traveled-in-car per person they could be safer per-mile and more efficient per-mile while making some important total outcomes worse.
Relevant to this example: if people travel by car more because they care less about traffic when they're playing video games or on TikTok during it, instead of driving, overall congestion will likely go up which makes emergency services worse.
That’s fair but I think it’s kind of orthogonal to the original point.
As is already the case in cities with robust public transit systems you’d need to make sure you’re applying the right incentives (i.e. taxes and charges) to make sure people are making decisions that benefit everyone. That doesn’t alter the possibility of self driving cars being much safer than human driven ones.
But in the case of self driving cars, who do we find at fault? Have we even answered that question? I mean did the Waymo car even get a ticket for blocking the ambulance?
Has any human driver ever received such a summons?
"Multiple" being stretched to the absolute limit in this comment.
If you are to believe Waymo's safety stats, they have fewer accidents/injuries per mile driven.
But whether or not reducing injuries at a statistical level outweighs the downside of autonomous vehicles causing accidents (even at lower rates) is a bit of a dilemma.
The human side of those stats, whenever I've seen them presented next to self-driving car stats, has always been an aggregate of all human driving, a vast amount of which is in environments or conditions that Waymo doesn't operate in.
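That mismatch is easy to show with made-up numbers (none of these are real Waymo or human figures, just an illustration of the aggregation problem):

```python
# Aggregate human stats mix easy miles with hard ones (night, rain,
# highways) that current AVs largely avoid, so the aggregate is not
# a like-for-like baseline.
human_rate_easy = 4.0   # assumed crashes per million easy urban miles
human_rate_hard = 12.0  # assumed crashes per million hard-condition miles
miles_easy, miles_hard = 70.0, 30.0  # millions of human miles per regime

# Mileage-weighted aggregate rate across both regimes
human_aggregate = (human_rate_easy * miles_easy +
                   human_rate_hard * miles_hard) / (miles_easy + miles_hard)

av_rate = 5.0  # assumed AV rate, accumulated only on the easy miles

print(human_aggregate)            # 6.4: the AV "beats humans" overall...
print(av_rate > human_rate_easy)  # ...while being worse like-for-like
```

The headline comparison (5.0 vs 6.4) flatters the AV even though it is worse than humans on the only miles it actually drives.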
Shh stop speaking sense to blind technologists.
The road to mass adoption of autonomous vehicles probably won't happen in any poster's lifetime on this board. The reality is most people on here are quite narrow-minded and can't 'understand' why it is not as easy as "hey I found a stat that shows Waymo is safer than humans!!11!".
I think the style of incidents and circumstances is probably neglected. But even if it's not, I think there are other reasons we notice Waymo issues more, akin to how nuclear power and airplane travel are safer than coal and car travel. That may be true, but when something does go wrong in those fields, we notice.
It's the reverse. Every Waymo on the road saves more lives. The average driver is a buffoon.
The 'smart' buckets kill about 40K a year, so there's that. No point in abandoning this
Now there will be a single company to sue instead of lots of individuals. If you want to be rich, start a law firm that focuses on autonomous vehicle accidents, like all the truck crash firms out there.
No, we’re finding edge cases that come up once every like million miles these things are putting on the road. Which means they are pretty damn good given how many are on the road right now.
These "edge cases" were required knowledge to get a license in my home country. You make room for any emergency vehicles, you don't try to score an ultra kill when passing a school bus and you certainly don't drive on rail tracks.
I’ve seen pictures in Germany where cars will move to the side of the expressway during a traffic jam to make room for emergency vehicles. I could tell that wasn’t the USA for sure.
This is my town, wow. Can't believe someone filmed this whole interaction while there was a shooting a couple of blocks from there... If the ambulance was in a hurry they could have rammed the Waymo; I am sure Google wouldn't have sued for damages.
AFAIK when a Waymo detects emergency vehicle lights and sirens, it is designed to pull over and stop, unlock its doors, and roll down its windows. Also: First responders can put the vehicle into a manual mode to move it if needed.
>If the ambulance was in a hurry
I believe they were.
>they could have rammed the Waymo
Not an expert, but I think the goal is to get the ambulance and its occupants to a specific location and then make an egress to a nearby medical facility? Also I'm not confident ambulances are designed to execute the PIT maneuver.
>I am sure Google wouldn't have sued for damages.
Oh well, if that's the case I guess it's all alright.
>First responders can put the vehicle into a manual mode to move it if needed.
I really feel like you're missing the point of why you're supposed to pull over and yield right-of-way for emergency vehicles.
Can't pit a stationary vehicle.