The Reminger Report: Emerging Technologies

Autonomous Vehicle Technology Explained

July 15, 2021 | Reminger Co., LPA | Season 1, Episode 13

Today, Zachary is joined by Stuart Sherry, an electrical engineer and senior staff consultant with Engineering Systems, Inc.

Zach and Stuart discuss the various types of technologies utilized in autonomous vehicles, including radar, LiDAR, and ultrasound. Stuart reviews "sensor fusion," the ability to bring together inputs from multiple technologies to form a single model or image of the environment around a vehicle.

Visit our website for information about our legal services related to emerging technologies.

ZBP - Zachary B. Pyers, Esq.

SS - Stuart Sherry, Engineering Systems, Inc.

ZBP

            Welcome to this special edition of the Reminger Report Podcast on Emerging Technologies.  Today we are fortunate to have a special guest, Stuart Sherry, who is a senior staff consultant with ESI, and he is going to be talking with us today about some of the sensors and technology that are utilized in autonomous vehicles.  Stuart, if you would, just kind of introduce yourself to our audience and give us a little bit of background on how you ended up in this industry.

 

SS

            Thanks, Zach.  I’m glad to be here, and I appreciate the invitation.  I’m Stuart Sherry.  I have a Master’s Degree in Electronic Engineering.  I am a licensed professional engineer in the State of Michigan, and I have spent about 20 years in the automotive industry, mostly with suppliers developing electronic products.  I ran an engineering team doing hardware/software development, project management, the whole industrialization of various different electronic technologies, including some autonomous driving technologies.  I decided roughly 18 months ago that I wanted to use that background and get into forensic engineering, and I joined ESI in January 2020 with really a focus on autonomous driving technology, vehicle electronics and the litigation that may surround that in the future.

 

ZBP

            Now I know you’ve had some of these discussions with us off the record, but we talked about autonomous driving and how some of the technology that enables it has been around for a while.  Why is autonomous driving only now becoming such a relevant topic for us, considering that the technology has been around for a while?

 

SS

            It’s a good question.  It’s funny.  I think back to, I don’t know, 10, 15 years ago, when the show MythBusters used to be very popular, and I used to watch Adam and Jamie mock up these cars with all these actuators and sensors and do remote control of vehicles, and I used to think, man, one day we’re going to have cars like that.  Honestly, I would say it comes down to the scalability and cost of that technology.  The things that they did back then were big, bulky, expensive, one-off type devices, and nowadays you can get radars that are smaller than a postage stamp, in the millions of units, for $10.00, $15.00.

 

ZBP

            Now one of the technologies you just mentioned was radar, and the cost obviously of the radar has been dropping.  As we talk about all the various technologies that these vehicles use, as we move into higher levels of autonomy, can you outline the differences between these technologies and help our audience to understand what these technologies are?  I’ve heard the terms “LiDAR,” “radar,” “ultrasound,” “infrared.”  Obviously a lot of people are familiar with backup cameras, and I’ll tell you that my own vehicle I know has parking sensors on it, but if you asked me, I don’t know that I could tell you what type of technology those sensors use to actually start beeping when I’m about ready to hit something, so if you could, can you explain that to our audience?

 

SS

            Yeah, absolutely.  They’re really classified into two different groups of technologies.  One would be basically the combination of radar, LiDAR and ultrasound, and the basic principle is similar with all of those: they send out some sort of beam or signal and then they monitor how long that signal is out there.  It bounces off an object and comes back, and they can measure that time.  The difference would be, with radar, you’re talking about electromagnetic waves, which would be things we’re all familiar with, like the radio waves you listen to on the radio, or your cell phone, which uses radio waves to communicate.  So a radar is sending out these beams of electromagnetic waves, bouncing them off something, measuring the time it takes to come back.  LiDAR is the same principle but it uses light.  That’s where the “L” comes from for LiDAR; radar came first, LiDAR came second, and they use the same sort of terminology.  In that case, they’re sending out beams of light.  These are usually small focused beams, and they kind of move the beams around, up and down, left and right, and then they send that out and do the same thing - measure the time for it to come back.  Ultrasound, I think, is one that most of us are familiar with because that’s what you hear about on submarines, so it’s a sound wave that’s sent out, and it’s the same sort of principle: it hits an object and bounces back.

The difference between those three, and why there are three of them, is usually the resolution.  Ultrasound will probably tell you, like you just mentioned, Zach, that there’s something you’re going to back into - a parking garage, a garage door, a car or some big bulky object.  It’s not very fine-tuned.  It is very cheap; that’s why ultrasound sensors have been around longer than most of these technologies, but you can’t really tell whether it’s a person, a dog, a pole, etc.  When you get into radar and LiDAR, you have a lot higher resolution where you can actually start to detect what something is, especially with the LiDAR that you see nowadays.  That’s what everyone sees on that Google Street View car that drives around with the big LiDAR spinning on the top.  That’s been scaled down also to much lower cost, but it’s sending out beams of light, and it’s taking a picture with distance to the objects that it hits and printing sort of a 3D map of what’s around the LiDAR machine.  One of the things about LiDAR is, it is line of sight, so it can’t see through things.  It can’t see through walls.  It can’t see through material.  It does have some issues sometimes with glass and reflection, whereas radar oftentimes can see through things.  For example, if you think about snow and rain, that can sometimes be trouble for LiDAR, where it sort of blinds the optical beam, whereas radar has a much easier time with that.  So when I look at radar, LiDAR and ultrasound, I really look at those as kind of one basic technology with three different applications or three different resolutions.

You asked about infrared and camera.  Those are a very similar technology - it’s basically a picture.  I think we’re all familiar with cameras; infrared is almost the same except that it operates at a lower part of the spectrum, in the invisible rather than the visible range, so you can take a picture of things in infrared instead of just visible light.  The camera and infrared hardware themselves are pretty basic.  We’ve had cell phone cameras for a long time.  I remember we used to all talk about which megapixel size camera is on your latest phone, so that’s been around a long time.  The real improvement, and why you see it nowadays, is the electronics and technology that processes those pictures - the ability to detect in those pictures what this thing is.  Is it a bicycle?  Is it a motorcycle, etc.?  So that’s really the difference in all those different technologies you see on your vehicle, and that’s where all of those have their own application areas that are now enabling the autonomous driving features that we’ve been talking about.
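
The common thread Stuart describes for radar, LiDAR and ultrasound is time-of-flight ranging: a pulse goes out, bounces off an object, and the measured round-trip time is converted into a distance using the speed at which that pulse travels.  Below is a minimal sketch of that arithmetic in Python, illustrating the principle only, not any particular sensor’s firmware; the example echo times are invented.

    # Time-of-flight ranging: distance = (propagation speed * round-trip time) / 2.
    # Dividing by 2 accounts for the pulse travelling out to the object and back.

    SPEED_OF_LIGHT_M_S = 299_792_458   # radar and LiDAR pulses travel at the speed of light
    SPEED_OF_SOUND_M_S = 343.0         # ultrasound in air, roughly, at room temperature

    def distance_from_round_trip(round_trip_s: float, propagation_speed_m_s: float) -> float:
        """Convert a measured round-trip time (seconds) into a one-way distance (meters)."""
        return propagation_speed_m_s * round_trip_s / 2.0

    # A radar or LiDAR echo returning after 200 nanoseconds: the object is about 30 m away.
    print(distance_from_round_trip(200e-9, SPEED_OF_LIGHT_M_S))   # ~29.98

    # An ultrasonic parking-sensor echo returning after 6 milliseconds: about 1 m away.
    print(distance_from_round_trip(6e-3, SPEED_OF_SOUND_M_S))     # ~1.03

The same two lines of arithmetic serve all three sensor types; what differs, as Stuart notes, is the propagation speed and the resolution of what comes back.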

 

ZBP

            Now I’ve heard the term “sensor fusion” used and discussed in some of these topics.  Could you explain to our listeners what the term “sensor fusion” means and how it correlates and applies to these autonomous vehicle technologies?

 

SS

            Absolutely.  In the auto industry, probably like other industries, everyone likes their buzzwords and acronyms and things, so sensor fusion is essentially a way of saying, like I was mentioning earlier, we have a lot of these maybe dumb sensors around the car.  There’s a camera on each corner of your car, a LiDAR on the top, a radar on the side.  By themselves, those individual systems don’t do much.  They will tell you the distance to an object.  What the object is, I don’t know.  How far away the object is, I can tell you.  How big it is, I can maybe give you some information, but it doesn’t have the smarts to figure out what it is.  Sensor fusion is a way of taking all that input from a radar, a LiDAR, an ultrasound, a camera, an infrared, etc. into some sort of smart system, a smart computer, a high-capacity ECU - electronic control unit - that is basically there only to process all of those inputs and start to figure out, what am I looking at?  So my radar says I have this blob over here.  My camera, when I do my processing, says it looks kind of like it might be another car, and then my LiDAR has confirmed the distance to it, and therefore my computer is going to basically say, hey, I think there’s a car over here with this sort of probability.  That’s what sensor fusion does.  It basically takes those different inputs and processes them so that they can kind of confirm each other - or not confirm each other and say, hey, that’s not what I’m seeing - and it makes a decision about what this actual thing around the car is, and that’s really the idea behind sensor fusion.
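
To make the confirm-or-not-confirm idea concrete, here is a deliberately toy sketch in Python of one possible fusion step: each sensor contributes a detection with its own label, range and confidence, and the fused result is the most-supported label with an averaged distance.  The data structure and weighting are invented for illustration and are far simpler than a production ECU’s algorithm.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        sensor: str        # e.g. "radar", "lidar", "camera"
        label: str         # what this sensor thinks the object is
        distance_m: float  # estimated distance to the object
        confidence: float  # 0.0 to 1.0

    def fuse(detections: list[Detection]) -> dict:
        """Toy sensor fusion: pick the most-supported label, average the agreeing distances."""
        support: dict[str, float] = {}
        for d in detections:
            support[d.label] = support.get(d.label, 0.0) + d.confidence
        best = max(support, key=support.get)
        agreeing = [d for d in detections if d.label == best]
        avg_distance = sum(d.distance_m for d in agreeing) / len(agreeing)
        return {
            "label": best,
            "distance_m": round(avg_distance, 1),
            "probability": round(support[best] / sum(support.values()), 2),
        }

    # Radar sees a blob at 42 m, the camera thinks it looks like a car,
    # and LiDAR confirms the range - the fused output is "vehicle" with high probability.
    print(fuse([
        Detection("radar",  "vehicle", 42.0, 0.6),
        Detection("camera", "vehicle", 40.0, 0.8),
        Detection("lidar",  "vehicle", 41.5, 0.9),
    ]))

If one input disagrees - say the camera labels the object a sign - its confidence simply counts against the winning hypothesis instead of for it, which is the "not confirm with each other" case Stuart describes.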

 

ZBP

            It sounds to me, and correct me if I’m wrong, but as I’m trying to understand this, obviously from a non-engineer’s perspective, it helps to combine the different technologies so that the information is usable for the vehicles to make decisions.  Is that fair?

 

SS

            Yeah, that’s fair.  I mean, the way I look at it, honestly, it’s a little bit like how we as humans operate.  We have our five senses, so maybe you hear a hissing sound and then you smell a gas smell, and the ears and nose have confirmed to you, hey, there’s a gas leak - I need to get out of here.  With one or the other alone, you may not really realize it.  If it’s just hearing, maybe it’s just an air leak of some sort.  That’s really what sensor fusion does.  It takes in all these different sorts of senses that have different resolutions and different capabilities, and it filters them and makes decisions about them to come to a conclusion about what’s going on in the environment.

 

ZBP

            I think that’s a really good analogy.  I think that’s probably one of the best explanations I’ve heard - relating it to the five senses that we as humans understand - so I think that’s a good one.  I know that I had originally told you that I didn’t want to talk about the varying levels of autonomous vehicles because it’s a topic that we have already covered in some of our earlier episodes, but it occurs to me now that it probably would be helpful, if it’s not too much to ask, as you talk about sensor fusion and these varying technologies.  We’ve talked previously in other episodes about the 0 through 5 range of autonomous vehicle levels.  I do actually think it would be helpful if you could explain to us where each of these technologies falls, especially as we get into the 2 through 5 range, when we start to actually see some of these higher-level-functioning, autonomous-esque features.  Is that something you could at least help us understand, where they fit into the picture?

 

SS

            Yeah, absolutely.  So as you mentioned, there is the SAE definition of autonomous driving levels - 0 through 5 - from no automation to full automation, and the areas that I think we’re talking about, like you said, are 2 through 5, where 2 is a sort of partial automation.  An example I use for Level 2 would be something like a car that has adaptive cruise control.  We’re all familiar with cruise control.  It’s been around a long time.  You set it and it goes, and it’ll crash into the car in front of you.  Adaptive cruise control uses, for example, radar technology so it can monitor the distance to the car in front of you, and it will provide feedback to the electronic control unit in the car saying, hey, I know you’re set at 65 miles an hour but you’re gaining on the car in front of you, and it will tell the car to slow down a little bit.  That’s an example of where technology like radar is already used today in the automation levels that are out there.  When you get to Level 4 and Level 5, high and full automation, in my opinion you’re talking about every technology I just mentioned being available and used.  There are differences of opinion.  Some companies have made pretty blanket statements that they will get away with only cameras.  They will do away with radars; they will do away with LiDARs, and the basis behind that is basically saying, if a human brain can use two eyes and no other senses to drive the car, we believe we can do it with a computer.  I think that remains to be seen, and that’s why you see so much development right now in all of these technologies, because nobody can really say for sure, I can do it with one technology or I can do it with two technologies.
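
Stuart’s Level 2 example boils down to a feedback loop: the radar reports the gap to the car ahead, and the controller trims the commanded speed when that gap gets too small.  Here is a simplified sketch of that loop in Python; the desired gap, gains and example numbers are invented for illustration and are not taken from any real vehicle.

    def adaptive_cruise_speed(set_speed_mph: float,
                              gap_m: float,
                              closing_speed_m_s: float,
                              desired_gap_m: float = 40.0) -> float:
        """Hold the driver's set speed unless the radar-measured gap to the car ahead
        is below the desired following distance or is shrinking; then back off."""
        if gap_m >= desired_gap_m and closing_speed_m_s <= 0.0:
            return set_speed_mph  # clear road ahead: behave like ordinary cruise control
        # Illustrative proportional back-off: slow more as the gap shrinks or closes faster.
        shortfall_m = max(0.0, desired_gap_m - gap_m)
        reduction_mph = 0.5 * shortfall_m + 2.0 * max(0.0, closing_speed_m_s)
        return max(0.0, set_speed_mph - reduction_mph)

    # Set at 65 mph, but gaining on a car 30 m ahead at 3 m/s: command roughly 54 mph.
    print(adaptive_cruise_speed(65.0, 30.0, 3.0))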