Subaru XV Crosstrek Forums


Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
It amazes me how many videos people post of driving on public roads with other cars, not holding the steering wheel, to test driver assist...

From a quick search it looks like he's posted loads of these, on freeways, through intersections, etc. I wonder if he has a permit to do that?
 

Registered · 13 Posts
Discussion Starter #3 (Edited)
It amazes me how many videos people post of driving on public roads with other cars, not holding the steering wheel, to test driver assist...

From a quick search it looks like he's posted loads of these, on freeways, through intersections, etc. I wonder if he has a permit to do that?
This is not driver assist. Please Google Comma AI Openpilot. This is not LKAS in the Crosstrek. It is a Level 2 autonomous system. There are 25 million miles of logged data with this system.

Comma AI makes this product. No permit needed.

Tesla is also running a Level 2 system in their cars.

 

Registered · 13 Posts
Discussion Starter #4
This is not driver assist. Please Google Comma AI Openpilot. This is not LKAS in the Crosstrek. It is a Level 2 autonomous system. There are 25 million miles of logged data with this system.

Comma AI makes this product. No permit needed.

Tesla is also running a Level 2 system in their cars.

[attached image]
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
But isn't the guy supposed to have his hands on the wheel, just in case? I'm pretty sure Tesla requires that, at least of consumers. Some cities and states have issued permits to manufacturers to test driverless cars. California has one, but I'm pretty sure there has to be a human driver in control of the vehicle, and I doubt that a consumer could just buy a kit and legally drive around while playing a video game or whatever.
 

Registered · 13 Posts
Discussion Starter #6
But isn't the guy supposed to have his hands on the wheel, just in case? I'm pretty sure Tesla requires that, at least of consumers. Some cities and states have issued permits to manufacturers to test driverless cars. California has one, but I'm pretty sure there has to be a human driver in control of the vehicle, and I doubt that a consumer could just buy a kit and legally drive around while playing a video game or whatever.
Yes. There are multiple safety features. If you look away for too long the system will disengage. You have to pay attention while driving. Tesla requires a hand to be touching the wheel.
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
I'm reading up on it and I see that in their FAQ but I'm still curious if it's legal everywhere to, say, drive without your hands on the wheel. I must admit, it's a little concerning that there could be even more yahoos on the road with driver assist tech, thinking that they have a fully autonomous, self-driving car... :rolleyes:

Do I have to pay attention?
Yes, the driver must always be able to immediately retake manual control of the vehicle, by stepping on either pedal or by pressing the cancel button. When openpilot is engaged, a driver monitoring system actively tracks driver awareness to help prevent distractions. The openpilot system disengages if you are distracted. Drivers must keep their eyes on the road at all times and be ready to take control of the car.
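
Roughly speaking, that driver-monitoring behavior boils down to an attention timeout. Here's a rough Python sketch of the idea, purely for illustration: the threshold, the names, and the eyes_on_road signal are all made up, not openpilot's actual code.

Code:
import time

ATTENTION_TIMEOUT_S = 6.0  # hypothetical threshold, not openpilot's real value

class DriverMonitor:
    """Toy attention check: disengage assistance if the driver looks away too long."""

    def __init__(self):
        self.engaged = True
        self.last_attentive = time.monotonic()

    def update(self, eyes_on_road: bool, pedal_pressed: bool, cancel_pressed: bool) -> bool:
        now = time.monotonic()
        # The driver can always retake control immediately via a pedal or the cancel button.
        if pedal_pressed or cancel_pressed:
            self.engaged = False
        elif eyes_on_road:
            self.last_attentive = now
        elif now - self.last_attentive > ATTENTION_TIMEOUT_S:
            # Sustained distraction: stop assisting and hand control back to the driver.
            self.engaged = False
        return self.engaged

The point is just that the steering assist quits on its own if the camera decides you've stopped watching the road, and a pedal or the cancel button always wins immediately.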
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
So, that's a statistical sample of exactly ONE, and the fact that you didn't get into an accident because of it doesn't mean you never will... 😸

This is pretty frightening:

The Eon, Panda, and Giraffe can all be purchased online for around $1,000. (The Panda is even available on Amazon for $99.) But the software that enables the semi-autonomous driving is free to download. Hotz says this allows him to sidestep the regulatory issue, though it’s unclear whether NHTSA would agree. “We aren’t selling any products that control a car,” he says. “We are giving away free software, and software is speech.” (A spokesperson for NHTSA did not respond to a request for comment.)

From: George Hotz is on a hacker crusade against the ‘scam’ of self-driving cars
 

Registered · 854 Posts
I ran this system in my prior car for six months with no issues.
How did you find the Level 2 autonomous experience, @Subifreak?
I've come to really like the driver assist features of my 2018 and feel it has made me a slightly better, more aware driver. I really love driving and still don't really understand the push towards high/full-level autonomous driving. What sorts of driving conditions have you used it in? Has it helped to avoid any dangerous situations? What have the benefits been in your experience?
 

Registered · 2021 Crosstrek Limited - Pure Red · 11 Posts
Seems like an easy way to get into big trouble if you're in an accident and they see you had some sort of 3rd party software controlling your car.
 

Registered · 13 Posts
Discussion Starter #12
Seems like an easy way to get into big trouble if you're in an accident and they see you had some sort of 3rd party software controlling your car.
The EyeSight controls the car. If you rear-end a car, it's the driver's fault.

 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
The EyeSight controls the car. If you rear-end a car, it's the driver's fault.

From what I read about it, since your first post, the lane keeping is controlled by OpenPilot, not emergency braking. So a more likely scenario would be that the driver is not paying attention and the lane keeping fails or misinterprets something, resulting in the car leaving its lane and colliding head-on with a vehicle coming the other way. EyeSight's emergency braking wouldn't be able to prevent that, as it only works up to about 30 mph.

There would most certainly be an investigation into what happened and the other driver's attorneys would have a field day with it, as @ITGuy1024 pointed out. From the Verge article I posted, it's pretty clear that the developer is intentionally sidestepping the laws and regulations. Seems highly irresponsible to me and not something that should be promoted on a forum like this, IMO...
 

Registered · 854 Posts
From the Verge article I posted, it's pretty clear that the developer is intentionally sidestepping the laws and regulations. Seems highly irresponsible to me and not something that should be promoted on a forum like this, IMO...
Hmm, "speech" and self-interest in the guise of altruism?
Why such a strong push towards fully autonomous vehicles if the primary motivation is driverless Ubers with safety taking the back seat?
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
Hmm, "speech" and self-interest in the guise of altruism?
Why such a strong push towards fully autonomous vehicles if the primary motivation is driverless Ubers with safety taking the back seat?
All of this has been of interest to me for some time and we even went to a Stanford University Alumni workshop on it last year. It was less to do with the technological developments and more to do with the legal ramifications (e.g. who is to blame if your autonomous car kills someone while you're playing video games in the back seat, or whether it will be OK to get drunk and have your autonomous car drive you home from the pub).

I believe autonomous cars will eventually be safer than human drivers (if they aren't already); I just have issues with the way this OpenPilot guy is going about it. Seems reckless to me.
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
p.s. Also, what was fascinating was a panel discussion about whether the car's programming should protect the occupants of the car at all costs or make decisions based on the best outcome overall for society. What if the options in a certain scenario are to protect the occupant and kill 10 pedestrians or kill the occupant and save 10 pedestrians (i.e. the "Trolley Problem")? Who gets to determine how that decision-making is programmed?
 

Registered · Vancouver, BC, Canada CGK 2018 Ltd EyeSight · 2,475 Posts
p.s. Also, what was fascinating was a panel discussion about whether the car's programming should protect the occupants of the car at all costs or make decisions based on the best outcome overall for society. What if the options in a certain scenario are to protect the occupant and kill 10 pedestrians or kill the occupant and save 10 pedestrians (i.e. the "Trolley Problem")? Who gets to determine how that decision-making is programmed?
Yup, I have seen the Tesla proponents argue that on the EV forums when Elon Musk was releasing self-driving software that was not fully tested. Their argument was that it was speeding up the development (sort of like Agile) and roll-out of self-driving, and that it was worth killing some early users along the way to save more in the future.

I found that a little disturbing, however, as I know most of the self-driving features they were promoting were not safety oriented. Most new cars already have fully tested (or at least their best efforts) driver assist features like collision avoidance and blind spot lane change warnings. Those are the ones that save lives, not just letting you drive hands-off to your destination, and eventually totally driverless. We are a long way off from the latter, including the current state of the Tesla self-driving software. I just don't see the justification for sacrificing lives, given where we are at this point in time.

And for those who are not familiar with the Tesla lingo, Autopilot is like our driver assist features in the Crosstrek, albeit with more bugs (phantom braking, not recognizing some obstacles ahead, and others). That is different from their Full Self-Driving software, which is what we are talking about here. BTW, my son has a Tesla, so I am very familiar with that car. And yes, he has had some potentially dangerous surprises with his.
 

Registered · 854 Posts
p.s. Also, what was fascinating was a panel discussion about whether the car's programming should protect the occupants of the car at all costs or make decisions based on the best outcome overall for society. What if the options in a certain scenario are to protect the occupant and kill 10 pedestrians or kill the occupant and save 10 pedestrians (i.e. the "Trolley Problem")? Who gets to determine how that decision-making is programmed?
Yes, we have a lot of trucks using the highway. Who decides whether a single occupant is more important than the truck driver and the value of his load? What if the truck driver has overridden his speed limiter? So many questions.
 

Prefers non-orange cars · '18 and '19 Crosstrek Limiteds · 9,441 Posts
Yep, we're still at the "driver assist" level which some idiots think means they can have a nap or play a video game while their Tesla does all of the driving for them. It didn't work out so well for a couple of notable ones that made the national headlines. I don't have a problem with them winning Darwin Awards, I'm just concerned that they may crash into us!
 

Administrator · 2021 Crosstrek Limited, Pure Red · 3,674 Posts
p.s. Also, what was fascinating was a panel discussion about whether the car's programming should protect the occupants of the car at all costs or make decisions based on the best outcome overall for society. What if the options in a certain scenario are to protect the occupant and kill 10 pedestrians or kill the occupant and save 10 pedestrians (i.e. the "Trolley Problem")? Who gets to determine how that decision-making is programmed?
I know this is way too long to embed, but it's relevant, and I love an excuse to quote SMBC.
 