

Huh, my EV also has neutral and park. It can be very useful to let the wheels freewheel or to lock them. However, there's no PRNDL shifter, which hurts usability.
Yeah, actually a low mode would be useful for exactly this, in exactly the same use cases as for ICE cars. Whatever my regen is configured as, if I'm heading down a steep icy hill, I could use an easy way to maximize regen. Heck, maybe my battery isn't warmed up yet so regen isn't effective, but I could really use it for that hill even if the battery can't absorb the energy.
Probably not the same animals that need to be controlled, but boar is delicious!
The given reason is simply that it returns control to the driver when it can't figure out what to do, and all evidence is consistent with that. All self-driving cars have some variation of this. However, yes, it's suspicious when it disengages right when you need it most. I also don't know of data showing whether this is a pattern or just a feature of certain well-publicized cases.
Even in those false positives, it's entirely consistent with the AI being confused, especially since many of these scenarios get addressed by software updates. I'm not trying to deny it, just saying the evidence is not as clear as people here claim.
I don't see how this test demonstrates anything is better. It's a gimmick designed around a specific sensor to get an intended result. Show me a realistic test if they want to be useful, or go full Looney Tunes if they want to be entertaining.
Huh, I thought the exact opposite. The clues were small. While they were sufficient for a focused driver at slow speeds, it also looked like something that would fool a human at typical speeds and attention spans.
Painting exactly like the road is a gimmick that really doesn’t demonstrate anything.
Personally, I wish they'd gone full Looney Tunes to better entertain us and to demonstrate that even significant clues may not be enough.
It’s a cost-benefit calculation.
So the question is whether they can achieve self-driving without it: humans rely on vision alone, so maybe an AI can too. I'm just happy someone is taking a different approach rather than following the pack: we're more likely to get something that works.
Right, you may see a couch and be curious about a couch, and there’s no moral framework against finding out how those soft fluffy cushions feel, intimately. It’s just instinct. But no, JD Vance has not acted on this instinct
PSA: check your private relay settings if you haven’t in a while
Mine was on but set to “maintain general area” so local content works.
Is this one of these “both sides the same” fallacies?
I don't recall the Biden administration directly intervening in the attack on protesters, but I did just read about Trump doing much worse to violate the rights of protesters and free thought.
Yep. You too might be a criminal, gangster, drug dealing, illegal immigrant that should be deported. If China won’t take you back, we can dump you somewhere else.
I don’t see them coming for all of us. It’s just not practical to arrest everyone who complains.
It all comes down to other characteristics:
I tried that once but it wasn’t for me. If it’s been cold out and is suddenly warm, I’ll bask in the heat or open windows.
Actually, I'm a bit annoyed that my car does that. I set the car thermostat to heat to a specific temperature, but then it also cools to that temperature. I haven't figured out if there's a way to set it to heat (or cool) only.
We always include the dog for Christmas. In my lifelong total of four dogs, they all were uncertain the first year, a little better the next year, and after three were all set to jump in and unwrap their presents. While they have no concept of Christmas and probably no explicit memories of previous Christmases, they do seem to learn year over year
For me the biggest downside was really poor road maintenance: lines worn off, long cracks that could be interpreted as lines, offset intersections where you can't go straight across and there are no lines... or, at night, not enough cleared space, so the side camera decides it's obscured.
There's one really narrow, windy road near me that too many humans have trouble with. I really wanted to see what it would do, but decided there wasn't enough room for me to take over if I needed to.
That's the real argument no one seems to make. However, I'll still do my best not to buy Oreos, since that's the only way around this issue.
I'm a bit disappointed they painted it identical to the actual road. Probably a lot of humans would get fooled by that one. We should send a challenge back: how Looney Tunes can you get? Will something more cartoonish fool it? Will a different landscape fool it? How about drawing an oncoming train?
Good question. I don't know if they'll succeed, but they have a point: humans do it with just vision, so why can't an AI do at least as well? We'll see. I'm happy someone is trying a different approach. Maybe lidar is necessary, but until someone succeeds we won't know the best approach, so let's be happy there's at least one competing attempt.
I gave it a try once and it was pretty amazing, but clearly not ready. Tesla is fantastic at "normal" driving, but the trial gave me a real appreciation of how driving is all edge cases. At this point I'm no longer confident that anyone will solve the problem adequately for general use.
Plus there will be accidents. No matter how optimistic you may be, it will never be perfect. Are they ready for the liability and reputation hit? Can any company survive that, even if they are demonstrably better than human?
I’m just trying to understand how irrational or aggressive Lemmy is towards women.
While I also don’t know:
When mine is configured for aggressive regen, it slows the vehicle to a stop. You couldn't push the car, for example.