Budget Self-Driving Car - Computerphile
- Added 4 Oct 2023
- Can you replicate millions of dollars of tech with a webcam and an arduino? Not really, but you can get pretty close! Dr Alex Turner took to the motorways of Britain to prove a point!
A few extra bits from our conversation: • EXTRA BITS: Self Steer...
Alex's Code: apt503.github.io/
/ computerphile
/ computer_phile
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: bit.ly/nottscomputer
Computerphile is a sister project to Brady Haran's Numberphile. More at www.bradyharan.com
Thank you to Jane Street for their support of this channel. Learn more: www.janestreet.com
1:25 _"I thought this was going to be more straightforward than it was. It turned out to be a little bit tricky."_ Don't worry. Every autonomous vehicle company seems to have had that same realization.
Great quote. Imho it's quintessential computerphile logic, not just automated driving.
@@blitzwing1 I feel like they jumped into attempting "Full Self Driving" too quickly. They should've solved the smaller problems first. Like assisted driving safety features. Or autonomously navigating car-parks. Then built upon those.
An important problem with this general approach: Every frame it's seen is from a situation where the car is positioned sensibly on the road, being driven well. If you let it drive a real car it might do ok for a short while, but if it ever makes a mistake, the car will then be in a slightly unusual position, which is a situation it hasn't seen before, so it's more likely to make another mistake, which puts it in an even more unusual position, and so on, so errors compound and the system quickly gets into an out-of-distribution state, e.g. upside down in a hedge
You can alleviate this by augmenting your data: flipping, translating, and rotating the image during training, so the network learns to correct for deviations.
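A minimal sketch of that augmentation idea in NumPy. The helper names and the `correction_per_px` constant are illustrative assumptions, not anything from the video:

```python
import numpy as np

def augment_flip(frame, angle):
    # Mirror the frame left-right; a left curve becomes a right curve,
    # so the corrected steering label is the negated angle.
    return frame[:, ::-1, :], -angle

def augment_shift(frame, angle, shift_px, correction_per_px=0.004):
    # Translate the frame sideways and nudge the label so the network
    # learns to "steer back" toward the lane centre. correction_per_px
    # is an illustrative tuning constant.
    shifted = np.roll(frame, shift_px, axis=1)
    if shift_px > 0:
        shifted[:, :shift_px, :] = 0   # blank the wrapped-around columns
    elif shift_px < 0:
        shifted[:, shift_px:, :] = 0
    return shifted, angle - shift_px * correction_per_px
```

The flip alone doubles the dataset for free and removes any left/right bias in the recorded roads.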
"It should robustly keep you on the road and robustly drive you into a hedge whenever it wants to"
I appreciate this kind of realism about AI 😂
If you drop the Arduino for steering and use it just for acceleration and braking (accelerometer), a cheap $10 OBD-II cable should let you read the steering-wheel angle sensor that's already on the car.
An OBD-II-to-USB cable?
@@RootSystemHash Yeah, there are different types; I myself nowadays have a Bluetooth-based one, but it works the same. You can also get pedal input data for your throttle pedal, but unfortunately not anything about the brake pedal (on most cars).
Probably you want two, so you can control for when the car is at an angle.
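For the OBD-II route above, the standardized Mode 01 PIDs are easy to decode; here is a sketch (hypothetical helper names). Note that steering-wheel angle is manufacturer-specific CAN data, not a standard PID, so reading it depends on the particular car:

```python
def parse_obd_speed(reply: bytes) -> int:
    # Standard OBD-II Mode 01, PID 0x0D (vehicle speed).
    # A positive reply looks like 41 0D A, where byte A is speed in km/h.
    assert reply[0] == 0x41 and reply[1] == 0x0D
    return reply[2]

def parse_obd_throttle(reply: bytes) -> float:
    # Mode 01, PID 0x11 (throttle position): A * 100 / 255 percent.
    assert reply[0] == 0x41 and reply[1] == 0x11
    return reply[2] * 100.0 / 255.0
```

This matches the commenter's point: throttle position is a standard PID, while brake-pedal data usually isn't exposed.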
If you use a piece of white tape on the steering wheel, and move the camera behind you so it can see the steering wheel and road you can save a bit more money 😆
In the same way, with some wires, springs, tape and wheels/pulleys, you could film the gas and brake pedal positions at the same time.
With a large enough Rube Goldberg machine he could roll, push, and bounce all the way home from his mum's house!
This plus some ultrasonic proximity data and you are golden.
"Following the car in front of you seems like a reasonable thing to do"
AI sees oncoming car on an otherwise empty road: "Weee, here we go!" 😜
I wonder if using something like Optical Flow to create single images from videos with visual motion vectors and feed them to the network would work
There are some cases where optical flow can't estimate a motion vector. Imagine a pixel in a homogeneous background: there's no way to tell where a particular pixel moved, since they all look the same in the neighbourhood. I'm thinking clear sky.
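A toy illustration of that ambiguity, using brute-force block matching rather than a real optical-flow library (function names are made up for this sketch):

```python
import numpy as np

def best_offset(patch, search):
    # Brute-force block matching: find where `patch` sits inside `search`
    # by minimizing the sum of squared differences. On a homogeneous
    # region (e.g. clear sky) every offset scores the same, so the
    # "motion vector" is meaningless -- exactly the ambiguity above.
    ph, pw = patch.shape
    best, best_dydx = None, (0, 0)
    for dy in range(search.shape[0] - ph + 1):
        for dx in range(search.shape[1] - pw + 1):
            cand = search[dy:dy + ph, dx:dx + pw]
            score = float(((cand - patch) ** 2).sum())
            if best is None or score < best:
                best, best_dydx = score, (dy, dx)
    return best_dydx
```

With any texture the offset is found unambiguously; with a constant image every candidate ties and the function just returns the first one.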
Funny thing... the library and model you use, I used the same for my college project, for the same reasons: quick, easy to train... I used it for classification though, and then applied effects to the image.
Comma ai
it would be cool to also take the velocity at each point, and integrate the data to generate a map of the path taken by the car assuming the output was actually followed which could then be overlaid with the true gps track.
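A hedged sketch of that integration using a simple kinematic bicycle model (the wheelbase, timestep, and function name are illustrative assumptions):

```python
import math

def dead_reckon(speeds, steer_angles, dt=0.1, wheelbase=2.7):
    # Integrate speed (m/s) and steering angle (rad) into an x/y track
    # that could be overlaid on the true GPS trace.
    x = y = heading = 0.0
    path = [(x, y)]
    for v, delta in zip(speeds, steer_angles):
        heading += v / wheelbase * math.tan(delta) * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        path.append((x, y))
    return path
```

Feeding the network's *predicted* angles through this and comparing the track against GPS would show how quickly small steering errors compound, which ties back to the out-of-distribution point above.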
What about geohot's comma 3?
Works great, have one myself. Very advanced.
I didn't mean to a db, prof explains some core concepts here.
@@MrNerdHair Where are you from? I wonder if it works in India as well. It could do great on some highways.
It's such an interesting topic! It would be great to see the progress.
also i was wondering if you could just flatten the distribution curve, just what i needed to know for my project!
Hey, can you make a video on how computer standards are decided and what goes into making something into a standard, and role of ISO and similar organization. Thank You.
They should do an interview with someone at comma-ai. Talk about their self driving device/ai model.
Here's something to put your mind to. A motion sensor retrofit for antilock brakes. Most vehicles with antilock brakes have no ability to detect when the vehicle is still moving once all four wheels have locked up. The vehicle then becomes a hockey puck crashing into everything as it slides down a steep Seattle street. The proper driver response in such a situation is to take their foot off the brake pedal to see if the rolling tires can get enough traction to regain directional control.
ABS works on *speed differential* between wheels. If some wheels stop rolling while others haven't, ABS pulses the brakes. More sophisticated ABS monitors the rate of rotation speed change so it will pulse when some wheels are close to locking up.
But without acceleration and gyroscope sensors, once all the wheels are locked, ABS assumes the vehicle is stopped.
A motion sensor retrofit could 'wedge' between the wheel speed sensors and the ABS control. Under normal operations the motion sensing would simply pass the wheel speed sensor pulses through. The motion sensor system would only engage when it detects that no pulses are coming from all the wheel speed sensors *and* its solid state accelerometer detects the vehicle is in motion.
In that state the motion sensor system would generate fake wheel speed sensor data to trick the ABS into restarting brake pulsing.
I'd start by only faking rear wheel data. Rear wheel lockup is the #1 problem causing loss of control and spinning around. Tricking ABS into pulsing the rear brakes would restore rear wheel traction by allowing them to rotate. The resulting wheel speed differential would cause ABS to engage its normal response to front wheel lockup.
The motion sensing system should only need to very briefly generate fake wheel speed signals to cause ABS to 'wake up' and resume its brake pulsing. In extreme conditions, such as fresh snow in Seattle where nobody gets snow tires, the system might have to repeatedly engage in response to ABS repeatedly locking all wheels while the vehicle is still imitating a hockey puck.
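The wedge's trigger condition described above can be sketched as a pure function (the threshold is illustrative, not a calibrated value):

```python
def should_fake_pulses(wheel_pulse_hz, accel_magnitude_g,
                       moving_threshold_g=0.05):
    # Only inject synthetic wheel-speed pulses when every wheel sensor
    # reads zero (all wheels locked) AND the accelerometer still reports
    # motion -- otherwise pass the real sensor signals through untouched.
    all_locked = all(hz == 0 for hz in wheel_pulse_hz)
    still_moving = accel_magnitude_g > moving_threshold_g
    return all_locked and still_moving
```

The actual retrofit would then synthesize rear-wheel pulses first, as the comment suggests, to make ABS release the rear brakes.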
Love the idea of using an existing / pre-trained object recognition net and just removing the classification layer. Since those aren't designed to recognize lines, the idea of adding a Hough transform or other line detection routine via more classical vision systems, and feeding that into the training data too, seems smart. As @brantwedel mentioned, since the actual used resolution is small, moving the camera back and watching the wheel isn't such a bad idea... but you must be very careful not to obscure its vision, and you have to pick out the white tape. It might be better to mount a second Arduino on the dash, and subtract its angle from the wheel angle to avoid tilting-road issues. Instead of labeling with just the wheel angle, multiply by 100 and then add 10 times a single-digit reading of the acceleration (measured via the dash-mounted Arduino) and -1 times the deceleration, aka acceleration to the rear. Finally, apply the blue box to the physical wheel via a servo-driven roller wheel which is easily overridden.
i wonder how much better it would be to use recurrent nets that can do predictions and comparisons off memory
Once we eventually are able to understand what AIs have truly learned it’s going to be interesting to see if there are any cases where we learn that there are ways to do things that are perfectly valid that we didn’t think of ourselves. Like the AI learned to drive or do X in a way that’s totally different than the way humans do it but works fine and we would never have realized.
I don't think we're ever going to be able to understand the internal workings of neural networks, unfortunately. We already don't, and we just keep making larger and more complex models. It's a similar problem to trying to understand the exact decision process of a person by examining the firing of their individual neurons. I am really excited for AI-driven physics research. Really hoping the robots figure out cold fusion or something and then turn to us and say something like, "How did you idiots miss this? It's incredibly simple, you could have had limitless energy for like 50 years already."
@@evanbarnes9984 Yeah, it could be that we never figure it out, though that's pretty worrisome long-run for alignment issues, I feel. Like you, I'm extremely excited about the potential of what AI might be able to do for us in the STEM fields as a research tool! It does concern me a fair bit, though, that we are integrating LLM AIs en masse into our social culture. I know the world doesn't work that way, but I'd say that's the last place, not the first, you'd want to integrate such a poorly understood, vastly disruptive technology. Though I also feel there's very much an AI bubble going on right now, where both many of the big positives AND negatives people are high on are going to evaporate sooner or later. Some of it is very gimmicky and we don't all see it yet, I think.
comma ai already exists u guys
The numbers just go up and down
I have done pretty much the same in a game. Just to test the waters on how good my ANN could handle the street at different amounts of data and input given.
Found it easier to buy a cheap sim racing gear and just capture the data it can provide.
Worked pretty well for me, I wouldn't trust it with my life but hey at least it didn't go offroad every turn ^^
There is of course all the data a low end smartphone can collect. Acceleration, direction and velocity for one.
As an aside, I'm actually quite surprised all the new war drones don't seem to have a moderately priced smartphone doing all the nav as well as image terrain matching and target selection.
Can’t those same smartphone components be used for other purposes e.g. combat drones?
4:25 Doesn't squaring numbers between -1 and 1 push them closer to zero?
I assume he meant the reverse square, i.e. 1-(1-x)^2, or square-rooting x (depending on how exactly you want the distribution curve to look)
the angle range is between -1 and 1, but the weight graph, where it's been limited to is probably between 1 and some arbitrary upper limit.
I think he's just replacing x with something like x^2.
He's simply rebalancing the regression values to keep the model from only "learning" the dominant outcomes.
One way to do that, for a plot of x vs frequency (the number of times angle x was the label), is to take √x of the original x values.
This somewhat equalizes the SMALL frequency of values near x = ±1 and the LARGE frequency of values near x = 0, giving a more uniform distribution, exactly for the reason you stated.
The more values near x = 0.5, the smoother the curve.
It does, and for instance 0.5^2 is 0.25 (so we just divide by 2) while 0.01*0.01 is 0.0001 (divide by 100), so squaring flattens the curve overall, as desired, because the closer to 0, the stronger the flattening effect.
Maybe he means that he squares the output of the neural net before comparing it to the (unchanged) angle sensor data. And similarly, you'd then also square it when you want to generate actual steering inputs from the neural net output.
That's fully equivalent to taking the square root of the angle sensor data, which would seem to make training the neural network easier. (You'd still square the output of the neural network to generate steering inputs.)
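Another concrete way to "flatten the distribution curve" is inverse-frequency sample weighting rather than transforming the labels; a sketch (the binning scheme is an assumption, the video doesn't specify one):

```python
import numpy as np

def balance_weights(angles, n_bins=21):
    # Per-sample training weights inversely proportional to how common
    # each steering angle is, so near-zero (straight-ahead) frames stop
    # dominating the loss.
    counts, edges = np.histogram(angles, bins=n_bins, range=(-1.0, 1.0))
    bin_idx = np.clip(np.digitize(angles, edges) - 1, 0, n_bins - 1)
    weights = 1.0 / np.maximum(counts[bin_idx], 1)
    return weights / weights.mean()   # normalize so the mean weight is 1
```

Rare sharp-turn frames then pull much harder on the gradient than the thousands of motorway-straight frames.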
Look up Comma AI
I'm guessing a huge issue is going to be 'compensation steering': how the computer steers when the car is actually changing direction all the time as the suspension reacts to bumps, accelerations, etc. Even the EBD and ESP systems in mass-produced homologated cars are, in some models, not tuned all that well, and put the car on the edge of catastrophe instead of saving it. I mean, even if the system correctly recognizes where to turn, actually making the car do it properly is going to be a whole different story.
Tire flex introduces a constant and ever changing degree of uncertainty in steering response to steering input. Humans (mostly) effortlessly compensate for that without having to think about it. Turn the steering wheel, car doesn't turn quick enough, the human turns the steering wheel some more. How much more? Enough to make the car move the desired amount. How is the amount of extra calculated? It's not, it's simply *done*. The driver doesn't do a bunch of complex equations, the steering is just turned the right amount.
Some people do take some time to learn that most vehicles don't respond instantly and precisely to their control inputs. Some people can get in the driver's seat of any vehicle and almost instantly adapt to its individual quirks. Some people learn to drive one vehicle, yet will take several minutes to get used to a different one.
@11:04 Just thought of this: for v2, you could add a second level attached to the dash. Calibrate them and then subtract them to get the angle of rotation relative to the car, not the earth.
Shouldn't you be able to test it a bit more quantitatively by comparing the values the ML algorithm gives to the real arduino data for a series of images that were not used for training? The difference would be zero in the optimal case.
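That held-out evaluation is straightforward to sketch (hypothetical helper; zero error would mean the model perfectly reproduces the Arduino labels):

```python
import numpy as np

def evaluate(pred_angles, true_angles):
    # Compare model output against the Arduino ground truth on frames
    # that were held out of training.
    pred = np.asarray(pred_angles, dtype=float)
    true = np.asarray(true_angles, dtype=float)
    err = pred - true
    return {"mae": float(np.abs(err).mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}
```

RMSE penalizes occasional large mistakes (the ones that put you in a hedge) more than MAE does, so reporting both is informative.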
The importance of correctly mounting a dashcam so it's not over-exposing cannot be overstated. Great demonstration though, despite the worst-case training data! 👍😎🇦🇺
Only people who don't know AI would trust an AI to drive.
It's all fun and games until it suddenly decides a white van is the horizon.
So interesting! Do you think that's a viable approach vs what other companies are working on?
There is a digital bus in the car, why on earth do you use a webcam to capture the steering angle?
Sounds like a fun thing to try out in a game with an open city like GTA or Midtown Madness :P
There are actually quite a few videos out there of people training self-driving AI on racing games like TrackMania, where they can design custom tracks with very recognizable graphics for the NN so it doesn't have to deal with the noise of other cars, signs, buildings, etc. (Though I guess some could consider it cheaty to remove said noise)
*sentdex has entered the chat.*
Just try it out on a real car, you bunch of chickens. Live fast die young.
This is like sentdex's pygta5 project. I love it.
You can buy an arduino can bus shield for like 30 bucks or so, that's still in budget.
You could just teach it to say 'It's a lovely day for a walk' and 'Don't forget your brolly'.
Create one for an RC car.
Cool, now all you're missing is 99.9% of driving. :p
Ah so this is a DIY Comma, neat.
Fit the Arduino on a pendulum and you get an accelerometer. Do that twice, with each pendulum free only in perpendicular planes, and you also get the steering.
It made him drive on the wrong side of the road 😂
Seriously though, use an identical Arduino on the console or whatever, referenced to car instead of wheel. It would be much more useful to train on the difference than absolute orientation.
I would not trust this at all lol
Tesla should learn from this video
People that want self driving cars should just take the bus.
Love from India 🇮🇳🇮🇳🕉️🕉️🕉️🕉️☸️☸️☸️☸️❤️❤️❤️
There is a real world application for this.
Clearing minefields is proving to be a real problem in the War in Ukraine.
One way to help do that is to take old junk cars and trucks and make them self driving and use them to plow through mine fields.
One might have them drive in reverse and fill the trunks or truck beds with sandbags to make them more survivable against gunfire.
Also, one could weld a steel plate to protect the gas tank.
If someone were to develop a kit then others could manufacture them and then put them to use.
BTW, I have a slogan for this. Turn the battlefield into a junk yard not a graveyard. For even when these cars are knocked out, they still can provide cover for troops moving up behind them, with those trunk/truck beds filled with sandbags and all, makes them a self driving field emplacement as well as a mine clearer.
And having self driving capability would make them more jam resistant over simply making them (only) remote controlled.
And after we’ve cleared the minefields, ban mines and make their use a crime against humanity.
Could be interesting, but triangulation with radio or GPS would probably be more practical. Using neural networks would be pretty overkill to sweep a vehicle over a set area or to move a vehicle towards a destination.
Another issue that favors position information would be identifying markers. Roads have clear markers that help immensely with training whereas minefields would be much more complicated environments.
Bro, car is driving the wrong way 😂
Nice lol
borderline safe... so either you are safe or everyone else is safe. That's a really cool term
Never been first anywhere else in my life.
Damn
Ok loser
Your first two sentences are the whole video; you should have stopped there and said "and that's the problem: it won't be safe, but I'll be convinced that it is safe enough, and it will kill someone".
Cameras aren’t enough for safe self-driving, and it’s criminal that we allow them on the road at all.
Elon taking pure notes.
Silly person, we know you wouldn't trust it but Elon would.
NOT COMPUTER RELATED
Yes it is. An Arduino is a kind of tiny computer. Also, AI is a sub-field of computer science.