Do you remember the first time you sat behind the wheel of a car? Did your heart rate climb as you entered traffic, and your mind race as you strived to apply what you had been taught? Now, imagine if instead of the teachings of parents, siblings, and teachers, you were suddenly infused with skills and knowledge from countless driving experiences: the learned abilities of drivers in connected cars in your city and around the world.

It may sound like the stuff of science fiction, yet AI software from Oxbotica makes it a reality for autonomous vehicles (AVs). Oxbotica harnesses the power of open, accessible AV data to enable its customers' driverless vehicles to travel safely, efficiently, and affordably.

In this episode of the "How AI Happens" podcast, powered by Sama, we talk to Ben Upcroft, Vice President of Technology at Oxbotica. We discuss universal autonomy, the roles of AI, and the Internet of Things (IoT). We also look at the many ways businesses and consumers are benefitting from unmanned transport, a sector expected to grow by 53.6% by 2030, bringing 3.2 million driverless vehicles onto the roads.

So what do we need to know in this new era of driverless cars?
In the earliest days of autonomous transportation, most industry pioneers positioned the technology as a way to get people and goods from A to B. A picture emerged of driverless cars and transport trucks navigating highways, rural roads, and city streets, with futuristic buses and cars equipped with cameras and sensors interacting with locally trained algorithms.

Ride-sharing, goods delivery, and shuttling people around airports and between venues remain lucrative opportunities for self-driving cars, trucks, and other vehicle fleets. But opportunities are also emerging in industries such as mining, oil and gas, and construction to address challenges like driver shortages and unsafe driving conditions.

Some of the world's most innovative automotive companies, like Tesla, GM, and Ford, have faced obstacles in bringing their own autonomous vehicles to market. Regulatory safety roadblocks, failed safety tests, and resistance to adopting AV technologies like LiDAR are among the factors that have slowed manufacturers in getting driverless vehicles onto the roads.

Oxbotica's open, scalable platform is vehicle agnostic. By training its driving algorithms in both real and virtual (meta) environments, it accelerates testing while reducing costs and harmful CO2 emissions.
Many organizations are seeking ways to reduce transportation costs and their environmental footprint while increasing safety. To empower more companies to attain these goals, more automotive manufacturers, fleets, and individual owners need to democratize their AI navigation data. That is the essence of Oxbotica's universal autonomy approach.

Consider how vehicles generally operate within set territories or routes. The navigation and environment data would be fairly consistent over time, but it wouldn't prepare that vehicle, or its control systems, to travel in other regions with different environmental conditions such as fog, dust, snow, and ice.

Here's a common AV scenario: two autonomous SUVs are assigned routes in London, England, and Denver, Colorado. The navigation data for each vehicle is stored within a proprietary data platform. Each driverless vehicle performs well within its own region based on the available inputs, but what if a vehicle left its usual environment and encountered conditions it wasn't trained for? It wouldn't be prepared to react to poor traction or visibility caused by icy roads or whiteouts.

Open AV data sharing is ideal for the greater good, yet it can also be exploited by cybercriminals to identify vehicle owners and mount other threats. This is why Oxbotica and many other organizations are working tirelessly on ways to secure and anonymize AV data.
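To make that last point concrete, here is a minimal sketch of what anonymizing telemetry before sharing could look like. Every name and field below is a hypothetical illustration, not Oxbotica's actual pipeline: the vehicle's identity is replaced with a salted hash and its position is coarsened before the record leaves the fleet.

```python
# Minimal sketch (hypothetical names): anonymizing AV telemetry before it is
# shared to an open data pool, so routes can inform other vehicles without
# identifying the owner. Illustrative only; not Oxbotica's actual pipeline.
import hashlib
from dataclasses import dataclass

@dataclass
class TelemetryRecord:
    vehicle_id: str        # owner-identifying ID; must not leave the fleet as-is
    lat: float
    lon: float
    road_condition: str    # e.g. "ice", "fog", "dry"
    speed_mps: float

def anonymize(record: TelemetryRecord, salt: str) -> dict:
    """Replace the vehicle ID with a salted hash and coarsen the position."""
    pseudo_id = hashlib.sha256((salt + record.vehicle_id).encode()).hexdigest()[:12]
    return {
        "pseudo_id": pseudo_id,
        # Round to roughly 1 km so individual routes can't be reconstructed.
        "lat": round(record.lat, 2),
        "lon": round(record.lon, 2),
        "road_condition": record.road_condition,
        "speed_mps": record.speed_mps,
    }

if __name__ == "__main__":
    rec = TelemetryRecord("OXB-LONDON-042", 51.7520, -1.2577, "ice", 8.3)
    print(anonymize(rec, salt="fleet-shared-secret"))
```

In a sketch like this, the salted hash lets records from the same vehicle be grouped without revealing who owns it, while the coarsened coordinates keep the shared data useful for learning about road conditions rather than tracking individuals.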
AV algorithms can't tell the difference between real-world and simulated training environments. Oxbotica's synthesized training environments can accelerate testing by 1,000% or more. They can also subject an AV's programming to edge cases that would be impractical or unsafe to reproduce in the real world.

Human drivers build confidence by working through varied driving experiences without incident. Similarly, virtual-world training environments reinforce that an AV can handle conditions such as fog, dust, snow, and ice, even if it has never physically encountered them.
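As a rough illustration of how a simulator can oversample those hard cases, the sketch below draws randomized scenarios weighted toward low visibility and low traction. The scenario fields, weather categories, and distributions are assumptions for illustration, not Oxbotica's simulation parameters.

```python
# Minimal sketch (hypothetical parameters): sampling randomized edge-case
# scenarios for a simulated training environment, so an AV policy sees fog,
# ice, and whiteouts far more often than real-world driving would provide.
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    weather: str
    visibility_m: float
    friction: float            # assumed tyre-road friction coefficient
    pedestrian_density: float  # pedestrians per metre of kerb (illustrative)

def sample_scenario(rng: random.Random) -> Scenario:
    weather = rng.choice(["clear", "fog", "snow", "whiteout", "dust"])
    # Bias visibility and friction toward the hard cases simulation is good at.
    base_visibility = {"clear": 2000, "fog": 80, "snow": 300, "whiteout": 20, "dust": 60}[weather]
    friction = 0.2 if weather in ("snow", "whiteout") else rng.uniform(0.6, 0.9)
    return Scenario(
        weather,
        base_visibility * rng.uniform(0.5, 1.5),
        friction,
        rng.uniform(0.0, 0.3),
    )

if __name__ == "__main__":
    rng = random.Random(42)
    for scenario in (sample_scenario(rng) for _ in range(5)):
        print(scenario)
```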
Oxbotica's vehicle-agnostic software empowers fleets to operate more safely and efficiently by providing both common and rare training scenarios that are hard to come by in publicly available training data.
AI and deep learning stacks can process data faster than humans and identify data relationships beyond human capacity. Companies like Oxbotica are finding innovative ways to extract and collect quality data, then transfer it to where it is needed, without creating excessive bias.

Unlike some AV manufacturers and technology companies, Oxbotica supports and simulates data gathered from sensors and systems like LiDAR, radar, global positioning systems (GPS), and inertial measurement units (IMUs). Its systems use deep learning to create ultra-realistic AV training environments without blurriness or pixelation. Higher-quality data drives down the volume that needs to be gathered to train AVs effectively.
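To picture what data gathered from LiDAR, radar, GPS, and IMUs can look like in practice, here is a minimal sketch that aligns those streams into one timestamped training frame by nearest-timestamp matching. The types and function names are hypothetical and are not Oxbotica's data format.

```python
# Minimal sketch (hypothetical types): aligning LiDAR, radar, GPS, and IMU
# readings into a single timestamped training frame. Illustrative only.
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

Reading = Tuple[float, object]   # (timestamp in seconds, sensor payload)

def nearest(readings: List[Reading], t: float) -> Reading:
    """Pick the reading whose timestamp is closest to t (readings sorted by time)."""
    i = bisect_left([r[0] for r in readings], t)
    candidates = readings[max(0, i - 1): i + 1]
    return min(candidates, key=lambda r: abs(r[0] - t))

@dataclass
class Frame:
    t: float
    lidar: object
    radar: object
    gps: object
    imu: object

def build_frame(t: float, lidar: List[Reading], radar: List[Reading],
                gps: List[Reading], imu: List[Reading]) -> Frame:
    """Assemble one fused frame around a chosen reference timestamp."""
    return Frame(t, nearest(lidar, t)[1], nearest(radar, t)[1],
                 nearest(gps, t)[1], nearest(imu, t)[1])

if __name__ == "__main__":
    lidar = [(0.00, "cloud0"), (0.10, "cloud1")]
    radar = [(0.02, "r0"), (0.09, "r1")]
    gps   = [(0.00, (51.75, -1.26))]
    imu   = [(0.01, "imu0"), (0.05, "imu1"), (0.11, "imu2")]
    print(build_frame(0.10, lidar, radar, gps, imu))
```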
As autonomous vehicle adoption grows, the most intelligent among them are gathering data from their computer vision cameras, LiDAR lasers, and arrays of onboard sensors. They are equipped to compare this information with historical data and identify new ways to help organizations reduce costs, increase efficiency, improve performance, and increase safety.

Success in the autonomous vehicle segment won't come from proprietary systems or siloed data repositories. Open, vehicle-agnostic cloud applications and safer systems are what will move self-navigating vehicles and fleets forward.

The actionable insights above should help you take the next step on your autonomous vehicle journey. In the meantime, stream the full episode below, or subscribe and listen along on Apple Podcasts, Spotify, or here.