I would trust it if I was in the driving seat “in case”. I certainly wouldn’t go into the back seat and just trust it.
Tesla seem to be wriggling, saying something like... "we never said it could drive itself".
https://www.bbc.co.uk/news/technology-56799749
No, absolutely not - ever.
When you look long into an abyss, the abyss looks long into you...
I'd trust them for most things; if anything, they can be safer than a human and eventually will be far safer than us.
But not yet...and I don't imagine for a very long while given the absolutely ridiculous challenges of going fully autonomous.
I think I'd treat a Tesla, if I owned one, as having very advanced driver aids only. I can't imagine doing what that article stated someone did. I imagine that plenty of people have done and indeed still do what they did; it's just that they've been lucky so far.
I would not trust anything Tesla. Deadly junk.
I'm not sure there is any wriggling... surely the "passengers" are at fault for not having anyone in the driver's seat? Autopilot does not mean autonomous. When a pilot engages autopilot, they don't kick back and sit in first class; neither should these idiots have vacated the operator's seat.
Would I trust a Tesla now? Yes. I'm quite happy to trust my cruise control so why not trust Tesla's autopilot as long as I am alert and monitoring it. Would I eventually trust autonomous cars? Yes. Quite happy to trust an aircraft landing itself in zero visibility on autoland so when autonomous vehicles are developed with accepted levels of safety, I would have no problem.
At this point, it will only be as good as the code. Would I trust a programmer who isn't in the car with me? Give it, say, fifty years for the tech to be sorted out, the ethics to be sorted out and the unexpected consequences to be sorted out, and I'm good to go. However, I'll always be intrigued by the question 'who is to blame?' in the event of an accident caused by a self-driving car. One day we'll have truly autonomous vehicles, but not yet, and we'll have to learn to trust them in an ad hoc manner when they arrive.
Certainly when they are safely available.
Driving lost its fun to me years ago and I'd love to be chauffeured around.
Cheers,
Neil.
They will never in a million years work on the roads where I live, so I give it no thought really.
I'd be quite happy to take a Johnny Cab around town at 20mph but as for 70mph on a motorway... no frickin way.
I would be happy cruising in an autonomous car if everyone was driving one, perhaps on a motorway, as long as I could take back control if I needed to. But not on the crazy roads in the hilly Pennines, where you need to know the potential hazards and passing conventions from experience.
Sent from my iPhone using TZ-UK mobile app
Like this well-known news story from five years ago?
https://www.tweaktown.com/news/53332...ife/index.html
The tech has moved on a lot in the interim.
As others have mentioned, this is an advanced driver aid, not an excuse to go for a nap in the back seat on the motorway.
So clever my foot fell off.
No. Software and hardware have a habit of going wrong.
Started out with nothing. Still have most of it left.
I don’t even let my car park itself; I can’t even remember how to operate that feature. It’s quicker to park it myself.
Sent from my iPad using TZ-UK mobile app
[Justin Case] I would not trust anything Tesla [Fine that's your opinion but no expansion as to why you've formed that opinion and on what evidence]. Deadly junk [Utter crap! I'd wager very few individuals outside the USA have been quite so stupid as to be in the rear seats, go to sleep etc.]
Last edited by Skier; 20th April 2021 at 20:28.
No chance. We have managed just fine up to now driving ourselves.
So you're comparing cyanide, a deadly poison, to one of the world's largest and most successful car manufacturers, who have achieved 5-star safety ratings across their entire model range.
In fact the Model 3, their best-selling car, achieves the highest possible award of Top Safety Pick+ in the US national rankings.
https://www.iihs.org/ratings/top-safety-picks
Great comparison, cyanide and Tesla.
Last edited by Vanguard; 20th April 2021 at 12:59.
Probably not but if I was no longer able to drive myself due to medical reasons, poor eyesight for example I'd imagine I'd be more open to the idea.
I barely trust the wife in the driver's seat, let alone a program written by someone I don’t know.
In a word. No.
I'd rather trust a blind man to cut my hair. Or what's left of it.
Sent from my SM-N976B using Tapatalk
I would trust it, like I trust a Rolex AD to tell me the truth!
I still get freaked out by cruise control. Then again, I ‘trust’ taxi drivers, so why not?
Not in a million years. When was complicated tech ever infallible??!
I would, but only when there is better infrastructure and more autonomous vehicles on the road, with some agreed common standards between brands for vehicle-to-vehicle communications.
We’re at such an embryonic stage at the moment: we have essentially cruise control on steroids, plus some technical demos. I can imagine a future where cars know where each other are, there is a mechanism for reporting obstructions, street lights and smart motorways can share information, and so on. It’s a long way off though; if there is one thing proven in technology history, it’s that agreeing standards is a long and painful process with a lot of false starts along the way.
I wouldn’t trust a car not equipped with full self driving to .. eh .. fully drive itself as per the article.
It’s a tricky one. When this works, it will save hundreds of thousands from dying and millions from suffering life-changing injuries every year.
I don’t like how Elon goes about this but autonomous driving is something we should embrace and encourage.
It will never be infallible, but it won't be fiddling with the radio, infoscreen, or heating controls. It won't be making a telephone call, arguing with the wife, or ogling a hottie. It won't be hungover, ill, tired, or distracted. It won't be picking its nose or scratching its nether regions. It won't be having a heart attack, stroke or seizure. It will always have the potential to go wrong, just not nearly as often as we do.
Having said all that, we are not there yet. As RobM has said, we need the correct infrastructure and common systems that communicate with each other, and there is a long way to go. I wouldn't be remotely comfortable at this point sitting back and letting the car do everything, but it's on its way for sure.
Last edited by Ruggertech; 24th April 2021 at 09:26.
I wouldn’t trust it. I own and drive “auto steer” tractors. The technology has been available for many years and the steering precision can be down to 1.5cm, BUT you need to be in the seat ready to take over. The signal can drop (passing under a tree) and sometimes the machine just releases the auto-steer function. Would I trust it in a car at 70mph? Not on your life. And that comes from someone who has been using the technology for almost 10 years now.
Sent from my iPhone using Tapatalk
Yes, once the technology had been proven safe over billions of hours of actual use.
When I say safe, I mean the chance of dying in a car accident in an autonomous car is no greater than dying in a passenger plane.
One day I'm sure people will marvel that anyone was allowed to drive something as dangerous as a car manually, but I've worked in IT for the last 40 years and I've yet to be convinced that computers are reliable enough to trust my life to entirely (although I know there are plenty of times where I do just that!).
M
Breitling Cosmonaute 809 - What's not to like?
You do that every time you drive your car, if it's in any way modern. The engine, brakes, steering, handbrake, lights, etc. are all controlled by computers. Using your brake pedal is no different to using a mouse and keyboard on your computer; it's just an instruction to a computer, which in turn operates something mechanical.
Have a watch of any of those shit-driving-caught-on-camera YouTube channels and you'll be desperate for the day when self-driving cars are the norm.
Yes, I would. Not just yet though.
I think there will come a time when it will be statistically safer to let the car drive - even if you're a good driver.
This is a small step on the way to general AI and I'm far more concerned what that will mean.
Given we currently have anaesthetic machines and intensive care ventilators, that have software bugs serious enough that they spontaneously reboot, in some cases stopping therapy, I have no trust in even more complex systems such as autonomous cars.
I feel we are at least 10 years away. There are no doubt some benefits, but sometimes it’s just so nice to manually shift gears and have a proper old-school drive in an involving, characterful car; cars like that are sadly dying out.
Sent from my iPhone using Tapatalk
As a software engineer (retired), I don't think I would ever trust an autonomous car absolutely, but I do believe they will probably get to the point where they are safer than most drivers and hence safer overall. However, if one of our nearest and dearest is killed or maimed by a software 'bug', it seems a very different proposition to human error. It's probably analogous to the risks associated with vaccines (and I've very happily had my covid vaccine, accepting the risk/reward balance), but somehow it's harder for me to accept self-driving cars. The use of self-learning neural network AI, where no one actually understands why the AI made a particular decision, is troubling to me.
This is an interesting (and worrying) article ..
https://www.nature.com/articles/d41586-019-03013-5
This is fun too...
https://www.lexalytics.com/lexablog/...-ai-fails-2020
An interesting case was an AI system trained to recognise cancerous slides, which ended up deciding that any image with a ruler in it was cancerous. This came about because almost all of the cancerous slides it was trained on happened to contain a ruler, as the ruler was there to measure the already-identified cancerous cells.
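The ruler story is a classic case of shortcut learning: the model latches onto a feature that merely correlates with the label in the training data. Here's a minimal toy sketch of how that happens (entirely hypothetical data and feature names, not from the actual study): a naive learner that picks the single most predictive binary feature will choose the artefact over the medically meaningful one, and then misclassifies a benign slide that happens to have a ruler in shot.

```python
def best_single_feature(samples, labels):
    """Pick the binary feature that best predicts the label on its own."""
    features = samples[0].keys()

    def accuracy(feature):
        return sum(int(s[feature]) == y for s, y in zip(samples, labels)) / len(labels)

    return max(features, key=accuracy)

# Training slides: every cancerous slide happens to include a ruler,
# because pathologists measured the tumours they had already found.
train = [
    {"irregular_cells": True,  "has_ruler": True},   # cancerous
    {"irregular_cells": False, "has_ruler": True},   # cancerous
    {"irregular_cells": False, "has_ruler": False},  # benign
    {"irregular_cells": True,  "has_ruler": False},  # benign (real feature is noisy)
]
labels = [1, 1, 0, 0]  # 1 = cancerous

chosen = best_single_feature(train, labels)
print(chosen)  # -> has_ruler: the artefact predicts the training labels perfectly

# A benign slide photographed next to a ruler is now flagged as cancerous.
test_slide = {"irregular_cells": False, "has_ruler": True}
print(test_slide[chosen])  # -> True (misclassified)
```

The fix in practice is curating the training data (or masking the artefact) so the spurious correlation disappears; the point of the anecdote is that the model has no way to know which correlation is the "right" one.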