• samus7070@programming.dev

    The real crime is marketing the driver assist capability under the name autopilot when it is anything but that.

    • TheGrandNagus@lemmy.world

      Oh no, it’s even worse than that.

      It’s the CEO and other staff repeatedly speaking of the system as if it’s basically fully capable, and a driver is only required for legal reasons. They even said the car could drive from one side of the US to the other without driver interaction (only to never actually do that, of course).

      It’s the company never correcting people when they call it a self-driving system.

      It’s the company saying they’re ready for autonomous taxis, and that owners’ cars will make money for them while they aren’t driving them.

      It’s calling their software subscription “Full Self Driving”.

      It’s honestly staggering to me that they’re able to get away with this shit.

      • meleecrits@lemmy.world

        I love my Model 3, but everything you said is spot on. Autopilot is a great driver assist, but it is nowhere near autonomous driving. I was using it on the highway and was passing a truck on the left. The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes. Fortunately, I was able to figure out what went wrong and quickly accelerated myself so as to not become a hazard to the cars behind me.

        Using Autopilot as anything more than a nice dynamic cruise control setting is putting your life, and other lives, in danger.

        • Neato@kbin.social

          Holy shit. If my car did that even once, I’d be a nervous wreck just thinking about using it again.

          • Wrench@lemmy.world

            I give Teslas more room because I have been brake-checked by them on empty roads before. These ghost-braking problems are prevalent.

          • snooggums@kbin.social

            I have had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not like slamming on the brakes, but firm enough to confuse the hell out of me.

            Every time it was confusing and now I just don’t use it if the road is anything but open and clear.

            • buran@lemmy.world

              Honda’s sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar check to see if the obstruction is real.

              My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing systems is a different one (different vehicle make, so it’s entirely possible).
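
              Purely as an illustration of the kind of radar cross-check being described above (a hypothetical sketch, not Honda’s or any vendor’s actual logic; all names and thresholds here are invented):

              from dataclasses import dataclass
              from typing import Optional

              @dataclass
              class Detection:
                  distance_m: float   # estimated distance to the detected object, in meters
                  confidence: float   # detector confidence, 0.0 to 1.0

              def should_brake(camera: Optional[Detection], radar: Optional[Detection]) -> bool:
                  # Treat a camera detection as a real obstacle only if radar agrees.
                  # A bridge shadow can fool the camera but returns no radar echo,
                  # so a cross-check like this would suppress the phantom braking event.
                  if camera is None or camera.confidence < 0.5:
                      return False
                  if radar is None:
                      return False  # nothing on radar: likely a shadow, not an obstacle
                  return abs(camera.distance_m - radar.distance_m) < 5.0  # ranges roughly agree

              # Example: camera "sees" a shadow at 30 m, radar sees nothing -> no braking.
              print(should_brake(Detection(30.0, 0.9), None))  # False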

          • KptnAutismus@lemmy.world

            I barely trust the lane-keeping assistant in my friend’s car. Imagine going 70+ km/h and the car suddenly deciding to jerk the steering to the left or right because you weren’t exactly in the middle of your lane.

            Fuck modern assistants, IMO. I can use the steering wheel just fine, and people have been able to for a hundred years.

            • Pennomi@lemmy.world

              Considering that driving is (statistically) the most dangerous thing the average person does, I wouldn’t really say that people use the steering wheel just fine.

              It’s just that computers are currently worse at it than humans.

              • KptnAutismus@lemmy.world

                Agreed. If “autopilot” becomes a better driver than the average person, then it has a right to exist.

              • FaceDeer@kbin.social

                I wouldn’t even say that without seeing statistics to back it up. The news doesn’t cover routine traffic accidents, but one Tesla screws up one thing and that story is front page. Don’t rely on anecdotes and emotions.

            • merc@sh.itjust.works

              I can use the steering wheel just fine, and people have been able to for a hundred years.

              People have been bad at it for a hundred years. I’m not saying that people should necessarily be using auto-steering that keeps them in the middle of their lanes, but they should at least be using systems that beep at them when they stray out of their lane.

              The bar for self-driving technology isn’t some amazing perfect computer that never makes a mistake. It’s the average driver. The average driver is bad.

              • KptnAutismus@lemmy.world

                We can do two things (these are not mutually exclusive):

                - Take further control away from drivers and make them dependent on a computer, which can always misunderstand a situation and still leave the driver responsible for it.

                - Educate drivers properly, at least in the US. Americans have historically been bad at driving and have also been known to be undereducated.

                • merc@sh.itjust.works

                  I’m all for more driver education, and for stricter licensing requirements like they have in Europe. Having said that, eventually computers are going to have to take over.

                  It’s pretty absurd that we’re handing control over multi-ton devices traveling at tens of meters per second to fallible, bored, easily distracted humans. The safer cars get, the safer drivers feel. The safer drivers feel, the less they feel they need to concentrate on driving.

                  Safe driving just will never be a skill that humans will be good at. The tasks that humans are good at that require concentration are tasks that are challenging and remain challenging. Think playing a sport where there’s always action and you have to react. Humans are bad at tasks that are mostly routine and boring, but if your concentration lapses you can cause a catastrophe. Those are the kinds of tasks where people get bored so they start glancing away, reading a book or looking at a smartphone, or whatever. For driving to be engaging, it has to be non-boring, which means non-safe. The safer it gets, the more boring it gets, so people stop paying the required attention. There’s just no winning.

          • burliman@lemm.ee

            That’s the bar automated driving is held to. It messes up once, you never trust it again, and the news spins the failure far and wide.

            Your uncle doing the same thing just triggers you to yell at him, the guy behind flips him off, he apologizes, you’re nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some may blame his entire class of drivers. But we would not say that no human should drive again until the problem is fixed, like we do with automated cars.

            I do get the difference between those, and I do think they should keep making automated drivers better, but we can at least agree on this premise: automated cars are held to a seriously unreasonable bar. Maybe that’s fair, and we will never accept anything but perfect, but then we may never have automated cars. And as someone who drives alongside humans every day, that makes me very sad.

            • maynarkh@feddit.nl

              There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks shit, he or his insurance pays. Autopilot doesn’t.

              By your analogy, it’s like putting a ton of learner drivers on the road with unqualified instructors, and not telling the instructors that they’re supposed to be instructors, but rather that they’re just taking a taxi. Except it’s somehow still their responsibility. And of course pocketing both the instruction and taxi fees.

              The bar is not incredibly high for self driving cars to be accepted. The only thing is that they should take the blame if they mess up, like all other drivers.

              • burliman@lemm.ee

                Yeah, for sure. Like I said, I get the difference. But ultimately we are talking about injury prevention. If automated cars prevented all but one of the deaths per mile that human drivers would have caused, we would still think they are terrible, even though they saved lives.

                And even if they only caused one death per year we’d hear about it and we might still think they are terrible.

            • Neato@kbin.social

              The difference is that Tesla said it was autopilot when it’s really not. It’s also clearly not ready for primetime. And auto regulators have pretty strict requirements about reliability and safety.

              While it’s true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we’ve vastly lowered the safety of cars. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.

              I’m 100% for autonomous cars taking over entirely. But Tesla isn’t really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already have them partially liable: this case caused a recall of this feature.

              • Staiden@lemmy.dbzer0.com

                But the vaporware salesman said fully automatic driving was 1 year away! In 2018, 2019, 2020, 2021… He should be held responsible. The guy once said that to further technology some people will die and that’s just the price we pay. It was in a comment about going to Mars, but we should take that into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I’d be held responsible, so why does he get away with it?

            • SlopppyEngineer@discuss.tchncs.de

              Except Tesla’s uncle has brain damage, doesn’t really learn from the situation so he’ll do it again, and has clones of himself driving thousands of other cars.

        • Damage@slrpnk.net

          Something like that happened to me while using adaptive cruise control on a rental Jeep Renegade; it slammed the brakes twice on the highway for no clear reason. I deactivated it before it tried a third time.

        • LordKitsuna@lemmy.world

          The auto cruise on the Priuses at work does this a lot. If the freeway curves to the left or something, it will panic and think I’m about to hit the cars in the lane next to me that are also going through the curve.

        • NeoNachtwaechter@lemmy.world

          The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes.

          Even dynamic cruise control must never make such dangerous mistakes!

          You should demand that they fix this under warranty, and they should prove that it will never happen again.

          • LordKitsuna@lemmy.world

            Almost all of them do it. The one freshest in my mind is the Prius, because my work uses them as base cars so I drive them a lot. If the highway curves kind of hard to either the left or the right, sometimes it will panic and think you’re about to hit the car in the lane next to you, because they’re technically in front of you, and so it will try to brake.

            Thankfully there is an option to turn off the automatic braking; it will just start screaming instead.

    • Neato@kbin.social

      Tesla should be paying wrongful death suits every time Autopilot kills someone. Their excuses don’t excuse the blatant marketing that leads people to believe it’s a self-driving car.

      • 800XL@lemmy.world

        But you see that wasn’t the vehicle’s fault. It’s been programmed perfectly. What happened was the fault of the pedestrians and driver for not properly predicting what the car would do.

        maybe /s maybe not.

        • Goferking0@ttrpg.network

          No, you see, the issue is that the autopilot stopped right before the accident, so obviously it was entirely the driver’s fault. Please don’t check how much time there was between it stopping and the accident.

    • raptir@lemdro.id

      Do we need to go through what autopilot in a plane or boat actually does again?

        • CmdrShepard@lemmy.one

          To probably 90% of people “autopilot” means “drive automatically”.

          Based on what?

          • poopkins@lemmy.world

            Tesla markets this feature as “Full Self-Driving Capability.” Maybe I’m poorly informed, but to me that means that the car is fully capable of driving itself without human interaction.

            • CmdrShepard@lemmy.one

              FSD is an entirely separate thing. Autopilot is just an LKAS system, or adaptive cruise control.

              • poopkins@lemmy.world

                Aha, today I learned that Autopilot is just lane-keeping and adaptive cruise control. I feel that it must be a common misunderstanding to confuse the terms “Autopilot” and “Fully Self-Driving” in the vernacular.

                Many other manufacturers refer to lane-keeping systems as “driver assistance,” and I believe Tesla is intentionally misleading consumers with the impression that their system is more capable and allows the driver to pay less attention.

                • RushingSquirrel@lemm.ee

                  Until you drive it. You know the capabilities, you know when you can and cannot activate it, you know how often it tells you to look at the road, and if you don’t prove you’ve got your hands on the wheel, it disables itself for the rest of the drive (you need to park to reactivate it). No Tesla driver thinks Autopilot is more than a lane- and distance-keeping assist.

                  Autopilot is a marketing name, that’s it.

      • dexa_scantron@lemmy.world

        If we do, then they shouldn’t have picked a name that most people think does something it doesn’t.

        • RushingSquirrel@lemm.ee

          When you drive a Tesla, it’s pretty clear what Autopilot is. The name is a marketing term; you can’t engage it everywhere and anytime, you’ve got to keep your hands on the wheel or it disables itself, it won’t stop at stop signs and red lights, won’t do lane changes, etc.

      • fiah@discuss.tchncs.de

        Do we need to go through the differences in training, aptitude and intelligence between pilots, captains and your neighbor Greg again? Marketing it as “autopilot” to anyone who can sign a car loan is reckless. It has killed people and will continue to kill people until they stop.

        • CmdrShepard@lemmy.one

          Yep, just like “cruise control” made tons of people drive their car into the ocean thinking they could sail it to popular island destinations.

    • Fox@pawb.social

      It’s a common misunderstanding that an autopilot system in an airplane does everything or even a lot of things. The most basic ones keep the wings level and nothing else. Of course Tesla is probably counting on that misconception to sell this feature, but actual pilots using any kind of autopilot are still on the hook to pay attention 100% of the time.

      • The Menemen!@lemmy.world

        In an airplane that is fine as pilots are specifically trained on the planes they fly (at least in theory). No one gets a special course in how to drive a specific (non industrial) car…

  • PatFusty@lemm.ee

    The guy was going through a suburb at 75 mph, blowing through stop lights. Of course he has to pay; I’m surprised he’s not getting jail time. This has nothing to do with the car, that’s just gross negligence.

    • eltrain123@lemmy.world

      My Tesla doesn’t let me use Autopilot or FSD if I set it more than 5% over the posted speed limit. How is this guy going 75 in the burbs?

      • BossDj@lemm.ee

        I wonder if Tesla had this section of road mapped as a freeway. Especially since it rolled through a red.

        “Suburb” in LA is a very loose term

        I would think that the guy is just lying, but Tesla would call that out REAL quick.

  • qooqie@lemmy.world

    Wow, the value of a life, I guess. I don’t really know what could come close to the value of a life, but this doesn’t seem like it.

    • burliman@lemm.ee

      What would be the value of a life, then? I’ll save you the answer: no matter how big a number you say, someone else will say a bigger one. Until it becomes priceless, which is the answer.

      However, death, including accidental death, isn’t always avoidable. And when we pin the fault on someone, we cannot expect to say “priceless” is what they owe the victim’s family. So we assign an amount of money or time that hurts, and call it good.

      Doesn’t mean life is worth that. And saying so doesn’t help anyone.

      • a4ng3l@lemmy.world

        Sure, but even looking only at the financial contribution of one person to a family, it dwarfs the comical 23k here. And that’s not even looking at the emotional side of things. 23k is straight-up insulting imho.

          • AFaithfulNihilist@lemmy.world

            Tesla should be out millions for this. The autopilot feature is a gimmick and not at all transparent. They’re beta testing on the public and people are dying because of it. This is a corporate decision that needs to have corporate consequences over and above legal ones. People shouldn’t just be getting minor fines, they should be going to prison and losing absolutely everything.

            • MNByChoice@midwest.social

              Consistent and clear pattern of lying to everyone, minimizing, and shifting blame with clear motivation of personal profits.

              Fully agree on prison time.

        • Tarquinn2049@lemmy.world

          That’s life insurance’s job; this would be on top of life insurance, and is more about where the money comes from than where it is going.

          • onion@feddit.de

            You look at different jobs, how high the risk of dying is and how much they pay, and work out from that how much more pay people demand for, say, a 1% risk increase. Then you scale that up to 100% risk.

            So if you were to work an average job that no one has ever survived, and you died on the day you retired, you would’ve earned those 12 mil.
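
            To make that arithmetic concrete (the $120,000 premium below is a made-up illustrative figure, chosen only so the result lands on the 12 mil mentioned above): if workers demand an extra $120,000 per year to take a job with a 1% higher annual risk of death, scaling that 1% up to 100% gives

            \[ \text{VSL} = \frac{\$120{,}000}{0.01} = \$12{,}000{,}000 \]

            which is what economists call the value of a statistical life.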

      • TenderfootGungi@lemmy.world

        True. But what if Tesla had to pay a billion for producing software that runs people over? They probably would not have beta software on the road.

    • Neato@kbin.social

      That was the penalty for the felony charge against the driver of the car that ran off the highway onto a surface street. It’s almost certain that the driver’s insurance also paid out its maximum.

      In addition, Tesla is recalling all those cars to change the system that pretends to ensure a driver using autopilot is actually paying attention.

      And a civil suit will likely follow from the 2 victims’ families.

    • dylanmorgan@slrpnk.net

      That is just the fine, the families are suing the driver and Tesla. Here’s hoping the Tesla suit gives them the real prize: the death of a company.

      (I know it won’t happen but a guy can dream.)

    • ladicius@lemmy.world

      Germany is the same. Small fine, three months without a license; that’s it for killing a human being.

      • Thorny_Insight@lemm.ee

        If we’re talking about an honest accident then how long do you think the jail term should be?

        • sbv@sh.itjust.works

          “honest accident” is the crux of the question. If the driver was doing everything perfectly and some other party was entirely responsible for the accident, not much (maybe none?).

          But, at least in my corner of Canada, most drivers are not behaving responsibly or adhering to the law. Speeding, following too closely, illegally passing, and using phones while driving are common. If a driver kills someone while doing something overtly dangerous, they deserve jail time.

          • Dark Arc@social.packetloss.gg

            I hate that speed(ing) always gets lumped in whenever “dangerous behavior” comes up. Going faster than an arbitrary road sign says you should isn’t inherently dangerous.

            • Going faster than the arbitrary road sign can be dangerous.
            • Going the speed the arbitrary road sign says can be dangerous.
            • Going slower than the arbitrary road sign can be dangerous.

            It’s about the conditions of the road, paying attention, signaling to other drivers what you’re trying to do, and being prepared for people and animals to do something dumb.

            • following too closely
            • using phones while driving

            These things are on a whole other level than speeding or “illegally” passing. But the person who can’t keep their car centered in the lane, wrecks every other winter, doesn’t use their turn signal, doesn’t notice an ambulance right behind them, and drives too close to the car in front of them will say “I’m a GOOD driver because I don’t speed. Shame on all these bad drivers that pass me!”

            Even worse, some of those “GOOD” non-speeding drivers will try to “police the roads” and prevent people from getting around them, which has literally resulted in completely pointless deaths during emergencies.

            • sbv@sh.itjust.works

              Googling around, it looks like there’s a strong relationship between increased speed and accident severity. The reasons cited are increased kinetic energy of the vehicle, decreased effectiveness of the built-in safety equipment, and a higher risk of rollover. It’s particularly dangerous for pedestrians and cyclists.

              Increased speed also increases the risk of an accident, since it reduces the amount of time drivers have to react and increases the vehicle’s stopping distance.
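
              As a rough worked example of the kinetic-energy point (the two speeds are just an illustrative pair, not figures from the source above): kinetic energy grows with the square of speed, so comparing 85 mph to 65 mph gives

              \[ \frac{E_{85}}{E_{65}} = \left(\frac{85}{65}\right)^{2} \approx 1.7 \]

              i.e. roughly 70% more energy to dissipate in a crash, and braking distance grows with the square of speed in the same way.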


              (Interestingly, speeding decreases fuel efficiency, but that isn’t what the original post was about)

              • Dark Arc@social.packetloss.gg

                Yes, but in Texas you’ll find the interstate speed limit is 85 and in Ohio it was 65, now it’s 70.

                There’s not some fundamentally crazy difference between Ohio and Texas roads that results in Texas accepting 20mph higher speed limits than the rate Ohio was using within the last 10 years or so.

                There is a history of speed limits for fuel efficiency and scarcity https://en.wikipedia.org/wiki/National_Maximum_Speed_Law. Which is more evidence that speed limits do not inherently correspond to safety. Driving the limit, over the limit, or under the limit doesn’t “magically” make you safer.

                Sure, if you’re in an accident two objects traveling faster are going to cause more damage to each other than two objects traveling slower. However, getting into that accident in the first place has little to do with speed and a lot to do with situational awareness and giving yourself enough time to stop in case of emergency (i.e. how close you’re following the car in front of you, how fast you’re passing kids on the sidewalk, etc).

                You’ll note speed is always blamed for increasing severity (i.e., it’s a factor in the severity of the accident). Unfortunately the links don’t work anymore to get to the underlying source but https://sites.psu.edu/siowfa15/2015/09/18/is-driving-faster-safer/ states “A study conducted by the Florida Department of Transportation says that accidents that were caused by speeding is actually 2.2%.” The Autobahn is another great example. Speed is very rarely cited as the primary cause … because speed isn’t really the issue.

                We should be focusing on issues that actually cause accidents like tailgating, blocking the left lane, failing to signal, etc.

        • Pyr_Pressure@lemmy.ca

          If it were an honest accident, then nothing. If it were due to neglect or a lack of due diligence, then maybe a few months of weekend jail or a month of full-time jail.

          • MisterFrog@lemmy.world

            For killing someone? Causing someone’s death due to negligence is only worth a month of jail to you?

        • helenslunch@feddit.nl

          In the US that’s called “involuntary manslaughter” and presumably the sentences are shorter, but also still exist so people can’t throw up their hands and go “whoops! That was totes an accident” and get off scot-free.

        • grue@lemmy.world

          If we’re talking about an honest accident

          There is no such thing.

      • Pechente@feddit.de

        Holy shit, really? Never looked into it but judging by how people drive here (lots of people on their phones while driving, missing red lights all the time) it certainly doesn’t seem like there are severe consequences for any wrongdoings.

  • Alchemy@lemmy.world

    Anyone else tired of beta testing Tesla’s garbage just by being outside on the roads near these vehicles?

    • RushingSquirrel@lemm.ee

      Human beings controlling cars are extremely dangerous: drunk drivers, racing, going through red lights and stop signs, speeding, not paying attention, etc. No need for autopilot for the streets to be dangerous for pedestrians. Autopilot keeps the car in its lane, which already avoids many of the accidents that are 100% human-caused.

      And again, the driver is responsible for keeping their eyes on the road, even when using cruise control or any sort of driving assistance.

  • Commiunism@lemmy.wtf

    There’s this saying about how if something is punishable by a fine, then it’s only illegal for poor people.

    I don’t even have to finish this, do I?

  • werefreeatlast@lemmy.world

    Your honor, I actually didn’t whack anyone with this self-actuating axe. I bought it and told it to go chop wood. The people just happened to be too close to the axe. Yeah, I was holding the axe, but I wasn’t actually putting any pressure on it. The tail was wagging the dog, in other words.

    Ok, so $10,000.00. Fine? Oh alright, I guess that’ll teach me not to buy autonomous axes.

      • aeharding@lemmy.world

        Part of the reason you don’t lose your license for killing someone with a car in the US is that losing it is much more of a ‘punishment’ because of how car-dependent the US is.

        Also, keep in mind a lot of trips in the US are 3 miles or less, and most people drive them, despite wanting alternatives to driving.

        If someone is trying to get from A to B on a 2-mile trip and the government basically mandates that people drive it, can you really blame them if they end up killing someone accidentally? What if they accidentally kill themselves smashing into a tree? You might assign some of the blame to their driving, but would that solve anything in the long term? A large part of the blame should be assigned to this insane transportation system we’ve built, where everyone needs to drive 2 miles to pick up a bag of milk.

        TL;DR: prevention, not blame, will reduce traffic violence.

          • my_hat_stinks@programming.dev

            You’re already paying for a car because your infrastructure demands one. I’m not in the US and get by just fine without one. Saves me a shitload of money too.

            Cycling is unsafe because your infrastructure was built exclusively for cars. Your infrastructure is built exclusively for cars because most people have to use cars because your infrastructure was built exclusively for cars. It’s circular reasoning.

            Both of those issues are caused by infrastructure and would be solved by building slightly less stupid infrastructure.

            • vaultdweller013@sh.itjust.works

              Also, there are edge cases like mine where walking and biking are vastly slower than driving even though the city is somewhat walkable, because I live on one of the highest-elevation roads in my city. It’s a 15 min walk or a 5 min bike to places; it’s a 1 hr walk back and a 30 min bike back.