15 December, 17:00, «03 Hall. Queen Erato»
Autonomous vehicles carry an array of sensors to detect obstacles in time, day or night. But perception isn’t only about cameras and lidars. We train neural networks, design data pipelines, annotate data (without draining the budget), and build the infrastructure (with microservices at its core, naturally).
Subject: autonomous light trucks with no driver’s cab that haul cargo 24/7
Setting: an enclosed area at a logistics hub
Our mission: to develop a perception system for such trucks
I’ll talk about how we solved the major challenges of building a perception system and learned to…
- detect people, cars, e-scooters, etc.
- recognize debris on the roadway and exclude it from the drivable area
- annotate training data for neural networks while reducing annotation costs
- and, finally, ensure that the algorithms and computations won’t fry the onboard computer.
Bonus: no bragging about theoretical concepts or how others do it. I’ll show how our autonomous vehicles see the world, drawing on our hands-on experience in the field, with real-life examples of how they operate at our clients’ sites.
The talk has been accepted into the conference program.