Stevie Behind the Wheel

I recently test drove a new car with an autopilot system. Turn on the cruise control and the car not only maintains whatever speed you set, it also steers itself through curves and keeps the car from wandering out of its lane.

You are supposed to keep your hands on the wheel, of course. But who wouldn’t be tempted to take them off?

Just to see?

Well, one thing I saw was that the system works (in a clumsy, overcautious old person sort of way) until it doesn’t see. A klaxon sounds; a flashing yellow display appears advising that the autopilot system has disabled itself because it lost track of the road.

Because the sensors – the electronic eyes which scan the road, so as to keep the vehicle in its lane and (hopefully) out of the ditch – became suddenly glaucomic for one reason or another. It happens in bright daylight – glare, perhaps? – and also when it’s rainy (fog, one assumes).

Ice buildup is a problem in the winter, if you can’t park overnight indoors. And even if you can, once you’re out on the road, if the air temperature is well below freezing, ice can and will form on the car’s exposed exterior surfaces – including the surfaces of the sensors which are critical to the operation of these self-driving technologies.

But I’ve also had a test car literally come to a dead stop in the middle of the road, for reasons known only to the code. It happened on a bright, clear day. I almost had dashboard for lunch – I never buckle up for saaaaaaaaaaafety – because the sudden halt was not expected.

I am able to type all of this because it was my good luck that day no Kenworth was behind me when the car I was in decided to brake violently and stop in the middle of the road – my attempts to countermand this by mashing the accelerator pedal being as futile as Justin Bieber punching a brick wall.

Point being, these systems are fallible.

As are we, of course. But there is a big difference between running off the road because you weren’t paying attention and running off the road because the car wasn’t – or no longer could.

Particularly when the car is being counted on to not run off the road. And when you – the erstwhile driver – have been encouraged (whether tacitly or overtly) to pay less or no attention to the car’s progress.

This is the never-mentioned Catch-22 of these automated driving systems. On the one hand, people are still expected to keep their hands on the wheel – and their eyes on the road.

On the other hand, technology is tempting them not to.

Otherwise, what is the point? If you have to pay attention – or are expected to – then the automated driving tech is useless at worst and a distraction at best. It is also an expense, but never mind that.

A part-time driver is like a part-time pregnancy.

You either is – or you isn’t.

If you isn’t, then the tech had better work –  your life literally depends on it. If it cannot be depended on to work – all the time – then you had better be ready to intervene, assuming your life has value to you.

This is like not having your cake – or eating it, either.

It places people in an impossible position. Who’s ultimately responsible for controlling the vehicle – and for the consequences, if the vehicle becomes uncontrollable? People are being encouraged to hand off more and more control over the car to technology, but the tech doesn’t get the ticket – or the felony indictment/lawsuit – when the car runs someone over.

Now factor in the absence of interventionary controls – a steering wheel, a brake pedal. This is actively being considered, both by the car industry and Uncle (see here), on the usual you-have-to-break-eggs-to-make-an-omelette basis.

On the upside, we’ll know who – or rather, what – to blame, post mortem.

But that is cold comfort (literally) when your mortal remains are lying on a slab awaiting the embalming machine because there was nothing you could do when the tech hiccuped – and the car took a nosedive off the road, carrying you along for the ride.

. . .

Got a question about cars – or anything else? Click on the “ask Eric” link and send ’em in!

If you like what you’ve found here please consider supporting EPautos. 

We depend on you to keep the wheels turning! 

Our donate button is here.

 If you prefer not to use PayPal, our mailing address is:

EPautos
721 Hummingbird Lane SE
Copper Hill, VA 24079

PS: Get an EPautos magnet (pictured below) in return for a $20 or more one-time donation or a $5 or more monthly recurring donation. (Please be sure to tell us you want a sticker – and also, provide an address, so we know where to mail the thing!)

My latest eBook is also available for your favorite price – free! Click here.

47 COMMENTS

  1. I work for a developer of these technologies who sells them to OEMs. We’re not very well known, and we like it that way, but we’re working on the next-generation systems which do not need human attention and, in the areas where they have sufficient data, can drive completely autonomously (this is called Level 4 autonomous driving), so I’m pretty familiar with various self driving systems.

    What you see in cars today are systems made either by Mobileye out of Israel (most brands – Subaru, GM particularly) or home-brewed stuff (Tesla). These are systems very limited by their sensors. Cameras are terrible compared to the human eye, and really bad in the dark, with glare, or in rain. Any time contrast is very high, cameras do poorly, since they’re not designed for this. Most cars use off-the-shelf cameras, which are designed for different scenarios than what you need in a self driving car. Camera-based systems are limited by these cameras, and they also have no understanding of the world – they only recognize patterns and lines, and not so well at that.

    The next generation of systems, knowing that computer-based vision does not work, use robotics sensors like lidar, radar, sonar, even ground-penetrating radar, to compare what they sense against an internal map. These things will absolutely work very safely, but there is a HUGE amount of work to be done in building those maps and getting all the sensors calibrated against each other. Even these super sensors have limits: lidar sucks in rain or even high humidity, radar just sucks always, sonar picks up a lot of noise – but taken together, they’re pretty good. Anyone who is telling you that a car will be safely self driving in the next couple of years is lying; it’s impossible. Anyone who tells you they can do it with cameras alone is lying or delusional.
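
    To make the “taken together, they’re pretty good” idea concrete, here is a minimal sketch of confidence-weighted sensor fusion – purely illustrative, with invented function names, numbers and thresholds, not any vendor’s actual code. Each sensor reports a distance plus a confidence that drops in the conditions it handles badly; if the combined confidence is too low, the right answer is to disengage rather than guess.

      # Each reading is (distance_m, confidence 0..1) from lidar/radar/sonar/camera.
      def fuse_readings(readings):
          """Combine weak sensors into one estimate, or None if data is too poor."""
          total_conf = sum(conf for _, conf in readings)
          if total_conf < 0.5:      # illustrative threshold: too little good data
              return None           # hand control back to the driver rather than guess
          return sum(dist * conf for dist, conf in readings) / total_conf

      # Camera nearly blinded by glare, lidar degraded by rain, radar noisy:
      print(fuse_readings([(42.0, 0.1), (40.5, 0.4), (55.0, 0.2)]))

    A real stack fuses full point clouds and object lists rather than single numbers, but the disengage-when-confidence-drops behavior is the same general idea.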

    We have self driving cars today in limited use cases – trucks on fixed routes, quarry trucks, airport parking shuttles, and stuff like that – which solve a much easier problem. This technology is coming, and it will be very safe; however, we have a lot of work to do to make it safe, particularly if we don’t want to kill people the way Boeing did.

    Seeing how this industry works, though, I am worried. We have frequent meetings with government regulatory agencies, who are salivating at the notion of forbidding human driving. They sell this for environmental reasons, claiming self driving cars will save energy (bullshit); it’s all about control in the guise of environmentalism and safety.

    • The GPS maps don’t even know where the roads are out where I live. If a self-driving car followed GPS then you would end up driving down in the creek or maybe off a cliff. If a driver even follows GPS directions then he/she ends up going up/down some goat trail on private property and risks getting stuck or maybe shot by the landowner.

      I don’t expect sensors to EVER determine that “this side of the road is too muddy and you will get stuck if you don’t go around on the opposite side.”

      Not to mention vehicles like farm & ranch, construction, wildland fire, etc which are regularly driven where there is No Road At All.

      • BTW, aircraft software development has very onerous process controls, with a representative of the FAA monitoring every step. And still sometimes it isn’t perfect.

        I somehow doubt that auto software development has any sort of process controls or oversight whatsoever.

        • Software development has controls. But the software world didn’t develop in order to do the things it is being asked to do today. Mechanical engineering did. The software world is missing key things that mechanical engineers learned and implemented decades ago.

          • Most software is just thrown together to see if it works, and then patched and re-patched to fix it. There are process controls in aviation (DO-178x) but I’m not sure about anything else, maybe medicine?

        • Um…

          https://www.cbsnews.com/news/faa-boeing-relationship-under-scrutiny-after-deadly-crash-2019-03-19/
          ——-
          The two fatal crashes involving the Max, just five months apart, have raised questions about the relationship between the FAA and Boeing.

          “We are bending over too much to the corporate interests and not enough to the public interest in the areas of safety,” said Rep. Steve Cohen.

          Cohen wants hearings on a process he worries has gotten too cozy. After 9/11, Congress approved a system that allows manufacturers like Boeing to largely self-certify aircraft, including their safety systems.

          • I’m pretty sure the DER (now called something else) practice goes back before 9-11-2001. It wasn’t too long before then that I started to get into certified stuff. But yeah, it does seem weird that the company hires/pays the “watchdog” – but otherwise, can you imagine the cost to the FAA (your tax money)? In my experience, the DERs have always been pretty much sticklers for doing it right, which can sometimes be a “challenge” from the development side, but I always appreciated their oversight and review. The worst possible thing would be for something that I had worked on to be involved in a crash and/or fatalities.

            I would sure like to see the safety analysis and meeting notes where the MCAS was approved. I find it hard to believe that no one raised any issues, but then I only barely understand the automatic trim system and this modification because I worked in totally different systems, mostly maintenance and test. There may have been obvious “mitigations” in place in the existing system. I have to wonder if the previous trim system was so reliable that pilots forgot the emergency procedures. Anyway, you would think that the day after the Lion Air crash that every 737 pilot in the world would be scrambling to make sure this couldn’t happen to them.

      • Sounds like where you live is going to be returned to nature and made off limits to humans. It makes sense when merging the agendas together.

        In the cities streets usually stay in the exact same place for decades or centuries. The exceptions are usually rather small in number and occur slowly over time.

        New Urbanists spread the lie that wide streets were created for automobiles. Often they are stupid enough to pick a famous street in a pre-automobile city. I then find a 19th-century photograph. Guess what? The public way is just as wide today as it was then. Now, in some instances there are small changes in design, but nothing earth-shattering. Dramatic changes for automobiles are rather small in number considering the street miles in a big city.

        • 1) Guess that’s why they want to take away guns, eh?

          2) Wide streets were initially built at least in some cities so that a team and wagon could turn around.

        • Today, streets are strictly considered “transportation infrastructure” – mainly the domain of fast-moving motorized vehicles.

          100 years ago, streets were more so just common areas, also used for transport. They were also used by peddlers, who’d set up their wagons on the sides; pedestrians; cyclists; horses and horse-drawn vehicles; the moving of livestock; garbage collection (shopkeepers would just sweep their sidewalks into the gutter…), etc.

    • How will it be “very safe” when conditions in the real world change immediately, in real time, but the system’s maps cannot be updated in real time? Even under the best possible scenario, any deviations would have to first be reported, then confirmed, then programmed, then transmitted, then incorporated by refreshing the end user’s data.

      How many people crash, get run over, or die in the meantime?

      Opp, don’t you find it very, very conflicting to be devoting your time and effort to something which is straight out of 1984, and which you know is ultimately dependent upon government tyranny, and which fosters even more control, surveillance, and loss of true autonomy/choice?

      • It’s easy to criticize systems when you don’t have the full picture.

        First, every self driving car that is designed not to kill people must respond to conditions immediately and be able to perceive its world at least as well as a human. This should not be up for debate. Now, the problem is that computer-based perception of the world has not been solved in a nice, general, self-contained way. So, we build fancy 3D laser maps and radar maps which give the real-time perception systems on the car a heads-up, but they don’t represent final truth. The map says, “there should be road here, an overpass here, a tree there, and careful, there’s a pothole”. The car compares what it perceives with what it knows via the map, and notices any discrepancies. It then has to react immediately, and safely, to those discrepancies: lane closed? Creature on the road? Oncoming car? React. The problem is that people aren’t yet very good at building systems that react to every possible situation. Once you aren’t a Silicon Valley hype factory, you have real automotive-industry safety standards to deal with, which are very onerous, but necessary. The real self driving systems on public roads are a ways off, and it is absolutely impossible to mandate these systems without drastically limiting where people can go – and for the moment, the regulators seem unwilling to do that (and forever, I hope).
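
        As a toy illustration of that compare-the-map-to-what-you-perceive loop (a rough sketch with made-up feature names and reactions, not the commenter’s actual system), the core logic looks something like this:

          # The HD map promises certain features; perception either confirms
          # them or it doesn't, and the planner reacts to the discrepancies.
          EXPECTED = {"lane_left": True, "lane_right": True, "overpass_ahead": True}

          def discrepancies(perceived):
              """Features the map promised but the sensors cannot confirm."""
              return [f for f, exp in EXPECTED.items() if exp and not perceived.get(f, False)]

          def react(perceived):
              missing = discrepancies(perceived)
              if not missing:
                  return "continue"
              if "lane_right" in missing:
                  return "slow down, assume the lane is closed"
              return "alert the driver, degrade to manual"

          # Sensors cannot find the right lane line the map says should be there:
          print(react({"lane_left": True, "lane_right": False, "overpass_ahead": True}))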

        I don’t feel conflicted working on what I do, because as a senior engineer I have the ability to shape this stuff away from the 1984 scenario. I can implement a specification in a customer-friendly way or a government-friendly way, skewing the general behavior one way or the other, and I know which way I lean – which should be pretty clear from where I’m posting this comment. The moment I must choose between doing something which violates freedom of travel or quitting, I will quit without a second thought. Right now, the systems I’m building are designed to look out for your interests as the owner and passenger of the car. The government may be able to send a kill signal to the car, but no more, which is no different than today’s OnStar-equipped cars. One of the primary requirements for safety is never to put in something which can be used to subvert it, so we are resisting any form of external manipulation of the system.

        Self driving cars will come. I would like to have one where I can turn this stuff on when I’m tired or drunk, and drive myself when I want to. I’m a car guy, I wrench on engines and have project cars.

        • One question rarely asked,

          Why do we need self driving cars? Seriously, what is the good we gain for what looks like a lot of problems?

          • Right now, we have self driving vehicles in quarries, mines, parking lots, places where humans get bored and also cost too much to hire. So, there is massive demand from various companies for off-road use – there’s money in it, and companies will chase a profit.

            Even though I work in the industry, I don’t see that much customer demand for it in private road cars. It’s an expensive technology, likely unaffordable for consumers for quite a while. So, the rich want to be chauffeured so they can do other stuff with that time, Google and others want to build robo-taxis to make a buck without the cost of human drivers, and idiot regulators think they can stop road fatalities and are driving demand via regulation.

            That being said, every automaker is pouring billions into self driving cars. They must know something I don’t.

            • Hi OP,

              In re this:

              “…every automaker is pouring billions into self driving cars. They must know something I don’t.”

              I can speak to this a little, knowing people in the business as well as having been on the periphery myself for some time now. There is a realization that we are approaching a regulatory choke point which will – in the near future – make cars unaffordable as general consumer products for a significant portion of the population. I refer specifically to the prospect of a 50 MPG CAFE standard as well as carbon dioxide “emissions” standards. Car loans are already at or very near their maximum economically viable duration. They can’t be extended much farther because of depreciation, a problem which is particularly severe for EVs, due to battery considerations. Since people increasingly cannot afford to buy (that is, to own) cars, the only option – other than making them more affordable to own, which isn’t on the table – is to rent them.

              Automated cars – rented by the trip. “Transportation as a service.” Ideally, a subscription you pay – indefinitely. Like your electric bill.

              That’s ultimately what’s driving this. It also serves the purpose of increasing government-corporate control over the population, which is at least as important a motivation.

              Consider that there is almost no organic/grassroots demand for either EVs or automated cars, especially if these were offered on the market at their true cost and people were free to avoid them.

          • We “need” self-driving cars like we “need” EVs….. We don’t! Just more needless complexity, expense and CONTROL to muck everything up.

            But since the majority can no longer think… no one bothers to ask.

            • Morning, Nunz!

              I think in both cases – EVs and automated cars – part of the appeal is the now-general fascination with electronic “tech.” You probably remember when the Japanese were mocked for being obsessed with blinking lights, computers – and so on. That affliction has spread. Especially among the under 35 crowd, which isn’t surprising given they grew up immersed in it. But the fascination with tech isn’t tempered by practical or economic considerations. It’s like a kid imagining a flying jet car….neato!

              • Eric,

                “It’s like a kid imagining a flying jet car….neato!”

                When I was a kid I imagined flying a rocket ship to the moon.

                In reality all I got was a compressed Rosie the robot (Roomba) and FaceTime. The only reason those are allowed is because they have cameras to keep tabs on me.

                Imagination and things like flying cars were banned before inception.

              • that is SO true, Eric!

                I’ve made the analogy of the NYC subway before on here:

                People should be up in arms about modern subway cars with all the computers and blinking lights, which cost over $2M each and aren’t as reliable or durable as cars from 75 years ago… but instead, they justify it by saying “We (It’s always “we”!) deserve a world-class modern subway!”

                Hmmm… I guess reliability, efficiency and durability aren’t world-class; it doesn’t matter if you can’t get where you’re going, just so long as the trains are shiny and have pre-recorded voices instead of a conductor announcing the stops… woo-hoo, look at “us”!!

            • It’s the ultimate answer to a question no one asked. No one, not the road-ruining bicyclists (who are too confusing for these things to deal with), not advocates for the elderly, not even the worst slowpoking safety mom ever even seriously considered the idea of a car that drives itself. It’s pure top-down bovine excrement.

        • Self-driving cars may be coming, but it ain’t gonna be pretty. They may, at best, work under limited, controlled scenarios – like you mention, airports and quarries and such – which are limited environments where things can be easily overseen and anticipated and controlled – but real traffic in the real world is a different story.

          Over the last decade, we’ve seen catastrophic failures of such systems in ships, trains and planes – which are child’s play compared to random traffic where inches count.

          And I suppose we’ll get the usual responses when this crud comes to cars: “Well, just wait till the new version comes out! It’ll work better”.

          And Opp, while you don’t have to rationalize for me or anyone else, I do think that believing your position as a senior engineer somehow allows you to implement a kinder, gentler system is much like thinking that being a healthcare administrator will somehow help combat Obozocare.

          You’re just doing what they demand of you- and the better you do it, the more you are helping them.

          • Sort of like the “Operative” in Serenity the big damn movie.

            “We’re making Better Worlds – all of them.”

          • ‘And I suppose we’ll get the usual responses when this crud comes to cars: “Well, just wait till the new version comes out! It’ll work better”.’

            Just like airbags.

            Of course Takata……

          • Nunzio,

            “Over the last decade, we’ve seen catastrophic failures of such system in ships, trains and planes- which are child’s play compared to random traffic where inches count.”

            Inches are starting to count a lot more than they used to when it comes to planes.

            Back in the day, when pilots used VORs to navigate, you would occasionally hear of a near miss. The Victor Airways had some tolerances. You and I could be on a “collision” course and still be miles apart laterally and thousands of feet vertically when we crossed paths.

            Today, with the glass cockpits, GPS, autopilots, and Minimum Equipment Lists, it only takes a little bit of inattentiveness by two or three people and it’s raining people and aluminum.

            My guess is that we are just a few mandates away from such an event. That should fit in nicely with the Green New Deal and the elimination of air travel for the masses.

            Get ready for the blue glove treatment at your local friendly TSA checkpoint.

            • Tuan, maybe it’s a Darwin thing: Those who would tolerate the improprieties of the TSA are more “expendable” 🙂

              I’ll bet those Blue Angels who fly those wing-to-wing formations at the air shows an old friend of mine frequents aren’t on autopilot!

  2. The GM tech center here in Austin loves to hire foreign software developers out of the local OPT diploma mills otherwise known as CS Masters programs. And I’m talking about the *big* school here, not just the understudy state university in San Marcos.

    Having been in classes with these folks, I can say it doesn’t exactly inspire confidence that, less than two years after what is for most of them their first programming class, they’re out developing autonomous vehicle systems.

    Did I mention the cheating? No worries.

    • Oh yeah, I have worked with those, uh – “types.” They are very clever and can write code like crazy while having absolutely no idea what they are doing. Imagine trying to supervise them from 12,000 miles away.

      I’m now retired but not living like a king in Patagonia.

  3. I think these engineers have solved a spherical-chickens-in-a-vacuum problem, and now want to program all the real chicken farms in the world. They have forgotten (or never learned) that in complex systems, things go wrong the same way they go right (i.e., they are inherently NOT safe). The only reason things go right so many times is because there is an intelligent person constantly making approximate adjustments to the system that are just accurate enough to keep the system from catastrophically failing. When the approximate adjustments are not accurate enough, you get a failure (an accident). The process of making approximate adjustments is the same for both failures and successes (no accident). You never know if an approximate adjustment is accurate enough for success (or failure) until after the fact (i.e., an “error” doesn’t become an “error” until after it has already occurred).

    Perhaps somebody can program a computer to drive a car as well as a human, but can that same computer do so when immersed in a human society? Humans are not spherical chickens and society is not a vacuum.
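
    To put the “approximate adjustments” point above in concrete terms, here is a crude simulation – illustrative only, with invented noise figures and lane width. The correction process is identical in every trial; whether a given run stays in the lane or goes off the road depends only on how the disturbances happen to stack up, and you only find out after the fact.

      import random

      def run(trials=5, steps=200, lane_half_width=1.2):
          for t in range(trials):
              offset = 0.0                    # lateral position, lane center = 0
              crashed = False
              for _ in range(steps):
                  offset += random.gauss(0, 0.5)                   # disturbances: wind, crown, ruts
                  offset -= 0.5 * offset + random.gauss(0, 0.25)   # approximate, imperfect correction
                  if abs(offset) > lane_half_width:
                      crashed = True
                      break
              print(f"trial {t}: {'ran off the road' if crashed else 'stayed in lane'}")

      run()   # outcomes vary run to run; the correction process never changes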

  4. Q: How is Stevie Wonder’s and Helen Keller’s driving similar?
    A: They both drive with one hand on the wheel and one on the road…

  5. Eric, the same thing is happening in airliner cockpits now that aircraft are so automated. There’s a theory that this contributed to the crash of Air France 447 almost 10 years ago. Because modern airliners have become so automated, the pilots’ manual (i.e. hands-on) skills have deteriorated. We’re seeing the same thing happen with our cars.

    That reminds me of the scene from Demolition Man where Stallone’s character sees the Olds 442 in a museum. He says, “THAT’S my style!”; because he grew up without the tech, he’s right at home. Sandra Bullock’s character, because she grew up with automated cars, has NO CLUE WHATSOEVER about driving a car with all manual controls; without all the tech, she’s lost. That’s where we’re heading. People pan Demolition Man for being just another action flick, but, for those who are observant, it gives us a look at our likely future.

    • Yeah, like the virtual-cybersex scene in that movie. Only men now are probably going to want that, because women are already becoming a danger to live with, let alone touch (and God forbid, have sex with). Getting arrested for “sexual assault” is more likely now than any VD or AIDS – feminazis at their best!

  6. Went to the site and left my comment, but all the other comments approved of the change. I’m sure those nice folks at the DOT have been PAID well to approve this. Asking for public opinion is simply a formality.

    Total tech-junkie digital insanity… What else is there to say, other than that this will not end well?

  7. The Jeep likes to nag me to keep my hands on the wheel while on a very long straight section of road.

    One thing that this will lead to is manufacturers not worrying about how well the vehicle tracks. After all, the software will keep it in line, right? But like the 737 Max, it will work just fine until it doesn’t.
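
    A toy version of “the software will keep it in line” – invented numbers, purely illustrative – is lane-keeping assist quietly compensating for a car that mechanically pulls to one side, right up until the pull exceeds the assist’s limited steering authority:

      def lane_keep(pull_deg, max_correction_deg=3.0):
          """Net steering error left after the assist applies its (limited) correction."""
          correction = max(-max_correction_deg, min(max_correction_deg, pull_deg))
          return pull_deg - correction

      for pull in (1.0, 2.5, 4.0, 6.0):       # worsening alignment, degrees of pull
          print(pull, "->", lane_keep(pull))  # fine, fine, then it isn't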

  8. Years ago I heard on the radio – I think it was Ralph Emery – an interview with Ronnie Milsap. He and his band were out goofing around, so they put Ronnie behind the wheel and one of the band members behind him with hands on the singer’s shoulders, directing him to speed up, slow down, turn left/right.

    Well of course they got pulled over for failing to signal or something, and Ronnie managed to bluff his way through it by saying he must have left his driver’s license back at the hotel – LOL

  9. My carpool buddy’s ’16 Subaru’s emergency automatic braking disables itself when it is foggy here in Houston, or in heavy rain. Joy…

    • Subaru uses cameras that look through the windshield rather than radar behind the grille to trigger their saaaaaaaaafety “features.” Probably the system disengages whenever the cameras can’t see far enough ahead. If you put a covering over the camera lenses it might well keep that stuff turned off.
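
      If that guess is right, the gate is probably something as simple as this sketch (made-up figures, not Subaru’s actual logic): estimate how far ahead the cameras can usefully see, and switch the feature off whenever that is less than the distance needed to stop from the current speed.

        def should_disengage(visible_range_m, speed_mps, decel_mps2=6.0, margin_m=10.0):
            """True if the cameras cannot see far enough to brake from the current speed."""
            stopping_distance = speed_mps ** 2 / (2 * decel_mps2)
            return visible_range_m < stopping_distance + margin_m

        # Houston fog: ~40 m of visibility at highway speed (30 m/s, about 67 mph)
        print(should_disengage(visible_range_m=40.0, speed_mps=30.0))   # True -> feature shuts off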
