
Would you trust a robotaxi to take your child to school?

Autonomous cars promise fewer accidents and safer roads. Yet the idea of stepping into a car with no driver is deeply unsettling. At the heart of that discomfort isn’t only safety – it’s a question of control, accountability and trust. And in South Africa, where many families already rely on taxis and shuttles to get children safely to school, that question hits closer to home.


two children in back of robotaxi

Holding on for dear life and fear of letting go

We trust algorithms with our money, our news, and our navigation, but not with our nearest and dearest. While robotaxis are already up and running in parts of the world, many of us shudder at the thought of stepping inside one.

Remember that TV ad from a few years back, where a businessman in the back of an autonomous vehicle looks speculatively at the human-free, hands-free steering wheel and asks, ‘What could go wrong?’ The point is, we have an innate resistance to this kind of automation.

Not because the data says robotaxis are dangerous – in fact, in many cases the opposite is true. But because they demand something of us that makes us feel psychologically uncomfortable: giving up control.

It’s the same feeling you get when you’re teaching your 18-year-old to drive. Not holding the wheel in your own hands feels like losing control completely. It’s a deeply disturbing feeling, one we have to work hard to override.

When letting go is already part of the daily school run

In South Africa, thousands of parents already surrender control every single morning. Not to a robotaxi, but to an Uber or Bolt driver, a lift club or a school shuttle service.

For households with one car or none at all, it’s the only practical way to get children to and from school. And while these systems rely on human drivers, the underlying act is the same: handing over control and trusting that someone else will get your child where they need to go, safely.

Of course, that trust comes with questions. What every parent should be asking of their lift service is whether they have:

  • a valid licence
  • the necessary professional driving permit
  • the correct operating permits for passenger transport
  • insurance for passengers and third parties
  • an app that can track each trip in real time

These checks won’t remove uncertainty entirely, but they will help parents navigate the same core issue that sits at the heart of the robotaxi debate: how much control we’re willing to give up in exchange for convenience, mobility and safety.

And that brings us back to automation.

It’s more about agency and autonomy than safety

The truth is, driving is one of the few everyday acts where we feel fully in charge. We grip the wheel, our foot hovers over the brake pedal, we’re hyper-aware of what’s happening around us. We tell ourselves, ‘I’ve got this. I can do it on autopilot if I need to.’

Even though we’re hurtling along at 120 km per hour in what is basically a projectile, even when we speed a little or brake late, we still feel like we’re figuratively ‘in the driver’s seat’. We’re in control and in command of the situation.

Robotaxis remove that illusion. Even if statistics eventually prove robotaxis are safer than human drivers – fewer distractions, zero fatigue, no drunk driving, no road rage – they take away the one thing we cling to in uncertainty: our sense of agency, our sense of autonomy and independent action.

In contrast, riding in a robotaxi makes you realise that the only thing you can reach for is the handbrake – which you hope to heaven you won’t have to use.

But there’s more to it than that. It’s also about us versus the machine.

Why machine mistakes feel much worse than human ones

There’s a very human paradox at work here: we forgive human drivers for being tired, distracted or emotional. Life can do that to you, we reason, so we relate to them, we understand their flaws, we are them.

But machines feel different. Psychologists call it algorithm aversion: we judge machine errors far more harshly than human errors. One robotaxi accident will dominate global headlines for days, while thousands of human fender-benders barely register in the news feed.

It’s not that machines are made to be perfect; it’s just that we expect them to be. And when they fail, we feel robbed of a sense of accountability and action. Because there’s no one to exchange a look with at the four-way stop, no one to exchange numbers with, or have a good rant at.

The machine represents the absence of anything human – and that hurts.

What happens when things get messy?

Autonomous vehicles are programmed on predictable patterns, but our roads are anything but predictable.

Picture a service delivery protest that spills into peak-hour traffic. Or a pothole that appears overnight. What happens when an unexpected power outage knocks out a traffic light at a major intersection, or a child chases a ball into the road, or heavy rain blinds a car’s sensors?

These are what engineers call ‘edge cases’, and they demand split-second decision-making. Human drivers rely on reflexes, instinct and context in these situations. Machines rely on data and programming. Both can fail, and they fail differently. But the risk is always there, whether human or machine.

So, when things do get risky, who is liable in a robotaxi accident?

Does removing the driver remove the responsibility?

In a traditional accident, responsibility typically sits with the driver, the other driver or, occasionally, poor road conditions. But when the driver disappears, liability could sit with any of several parties:

  • The car manufacturer
  • The software developer
  • The fleet operator
  • The data provider

When cars drive themselves, disasters don’t miraculously disappear. Instead, responsibility shifts from human behaviour to tech systems. Yesterday’s risk was thoroughly human; tomorrow’s risk is all about the systems. That can feel frustratingly remote, but it still means someone needs to take responsibility.

Insurance will need to meet the autonomous moment

Instead of focusing purely on human speed, distraction and recklessness, insurance now needs to focus on software integrity, sensor failures, cyber interference, update errors and shared responsibility.

Risk hasn’t disappeared. It’s just distributed differently. And insurance will need to track that difference, absorbing uncertainty, assigning accountability and getting people and cars moving again when systems fail.

So, would you trust a robotaxi with your kids?

This is the question that cuts to the heart of things. Many people say they’d try a robotaxi alone, use one for short trips, or test it in low-risk situations. But we all hesitate when it comes to letting our loved ones step into one.

That hesitation is all too human. Trust doesn’t come from safety alone. It’s built on what feels familiar to us, on what we know. And we humans are always suspicious of anything new, anything different, anything unfamiliar.

Think of seatbelts – it’s hard to believe because they’re so embedded in our sense of driving safety, but they were once vehemently resisted. When airbags hit the scene, people weren’t too happy about the idea of them going off in their faces. And before ABS (anti-lock braking systems) became a safety standard, it was met with outright scepticism.

Now these are all standard safety features. We’ve embraced them because we’ve built up a sense of familiarity with them, so much so that we don’t give them a second thought.

What autonomous autos mean for South Africa

Even if fully autonomous robotaxis are years away from running on our roads, automation is already here. Our cars already brake automatically, correct lane drift, monitor blind spots and adjust speed. Without really noticing, we’ve been gradually giving up control and sharing it with our cars’ systems.

But South Africa adds its own complexity to the mix. We have:

  • substantial infrastructure gaps
  • law enforcement inconsistencies
  • weather extremes
  • road surface degradation

This makes our roads harder for autonomous vehicles to navigate, because their programming relies on clarity and consistency. Which makes one thing certain: the transition won’t just be about the tech itself. It will also require changes to regulations, investment in infrastructure, and a cultural mind shift.

Miway’s take on autonomous taxis

Miway understands that driving today already sits somewhere between human, machine and environment. Insurance can no longer only look at who last touched the wheel. It has to understand the systems that sit behind it.

Whether it’s cover that keeps pace with evolving tech, support when things go wrong at the roadside, or helpful services like Wedrive, which gets you home safely when you’re in no state to drive, Miway builds policies for a world where control can easily be compromised.

Because Miway understands that control isn’t about holding on to the wheel. It’s about being prepared when you let go. And that preparedness comes from confidence – not from gripping the wheel harder, but from knowing that whatever changes and challenges lie ahead, tech or human, you’re covered.
