We tested our map-based autonomous navigation in three different environments


July 30, 2020  |  Development

Thanks to our algorithms, the Husky A200 robot from the Canadian company Clearpath can travel a specified route autonomously in different environments. For this it relies mainly on the RTV sensor Box developed by RoboTech Vision. The robot can thus move independently on roads, between trees, around buildings and in open space, recognize objects, and avoid static and dynamic obstacles.

RoboTech Vision aims to develop universal algorithms that can be easily applied to a variety of devices. The company therefore tests them on several platforms, e.g. on its own Androver II robot with a double Ackermann chassis or on the Husky A200 robot with a differential chassis. Each type of robot uses a different type of autonomous navigation depending on its hardware capabilities.

After nearly two years of developing autonomous navigation algorithms, RoboTech Vision has managed to adapt the Husky A200 to move autonomously in a variety of environments. Although sensors such as the IMU, GPS and wheel odometry also help the robot move autonomously, it is navigated mainly by the RTV sensor Box developed by RoboTech Vision.


How does it work?

“The development of the RTV sensor Box arose from our own need to own a universal device that simply moves from robot to robot and fulfills the same purpose – it is given its own eyes and mind,” says RoboTech Vision CEO Ing. Peter Pásztó, PhD. The sensor box combines a 360° camera used to segment the road, a forward-facing camera for object detection, and a 3D scanner that provides an idea of the size and distance of obstacles.
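As an illustration only, the following Python sketch shows one way these three data sources could be combined into a single "is the way ahead clear?" check. The function name, data formats and thresholds are assumptions made for the example; they are not the RTV sensor Box interface.

```python
import numpy as np

def path_is_clear(road_mask, detections, points, stop_distance=1.5):
    """Rough check that the corridor ahead of the robot is drivable.

    road_mask  -- boolean image from the 360° camera segmentation (True = road)
    detections -- forward-camera detections, e.g. {"label": "car", "distance": 4.2}
    points     -- N x 3 point cloud from the 3D scanner, robot frame (x forward, metres)
    """
    # 1) The segmented road should cover most of the image strip straight ahead.
    width = road_mask.shape[1]
    ahead = road_mask[:, width // 3: 2 * width // 3]
    if ahead.mean() < 0.5:
        return False
    # 2) No detected object may be closer than the stop distance.
    if any(d["distance"] < stop_distance for d in detections):
        return False
    # 3) No 3D points may lie in a narrow box directly in front of the robot.
    in_corridor = (points[:, 0] > 0.0) & (points[:, 0] < stop_distance) \
                  & (np.abs(points[:, 1]) < 0.4)
    return not bool(in_corridor.any())
```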

The space in which the robot should move autonomously must first be driven through by a teleoperator with a joystick. The RTV sensor Box records the data and creates a map by which the robot is then navigated. After that, all you have to do is mark a waypoint on the map. The planner chooses the shortest route and creates a trajectory, and the robot reaches the destination autonomously, even through difficult intersections.
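The planning step can be pictured as a shortest-path search over a graph of waypoints extracted from the recorded map. The sketch below uses plain Dijkstra and an invented waypoint graph purely for illustration; the actual RTV planner and its map format are not shown here.

```python
import heapq

def shortest_route(graph, start, goal):
    """graph: {node: [(neighbour, distance_in_metres), ...]} -> list of waypoints."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, dist in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + dist, neighbour, path + [neighbour]))
    return None  # goal not reachable from start

# Example: a tiny map with one intersection ("B").
campus = {
    "A": [("B", 10.0)],
    "B": [("A", 10.0), ("C", 7.5), ("D", 12.0)],
    "C": [("B", 7.5)],
    "D": [("B", 12.0)],
}
print(shortest_route(campus, "A", "C"))  # ['A', 'B', 'C']
```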

Thanks to the Speech text algorithm from RoboTech Vision, the robot can also be controlled via voice commands, for example from a smartphone. If the robot does not understand a command, the user is asked to repeat it. The robot is also safe for humans: even though a route is planned, it constantly monitors its surroundings with the RTV sensor Box, detects static and dynamic obstacles and replans the route so that it does not collide.
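The "repeat the command" behaviour can be pictured as a simple confirmation loop around the speech recognizer. In the sketch below, recognize() and ask() are hypothetical stand-ins for the smartphone speech interface and the robot's spoken reply, and the list of destinations is invented for the example.

```python
# Destinations the robot knows how to reach on its map (invented for the example).
KNOWN_GOALS = {"gate", "workshop", "charging station"}

def get_goal(recognize, ask):
    """Keep asking until the recognized command names a known destination."""
    while True:
        command = recognize()              # e.g. "go to the workshop"
        goal = next((g for g in KNOWN_GOALS if g in command.lower()), None)
        if goal is not None:
            return goal
        ask("Sorry, I did not understand. Please repeat the command.")
```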


Safety comes first

The robot tries to stay on the right side of the road at all times, which means it can also be used on streets. During the tests, it easily coped with an oncoming car, cyclist or runner, which it identified using the RTV sensor Box. The robot is also able to move around in open, monotonous spaces without significant environmental features that would make localization easier.
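One simple way to picture the keep-right behaviour is to steer toward a point offset toward the right edge of the segmented road. The sketch below does this on the bottom row of the road mask; the ratio and the overall approach are assumptions for illustration, not the RoboTech Vision implementation.

```python
import numpy as np

def right_side_target(road_mask, keep_right_ratio=0.75):
    """Return the image column the robot should steer toward, or None."""
    bottom = road_mask[-1, :]                  # road pixels closest to the robot
    columns = np.flatnonzero(bottom)
    if columns.size == 0:
        return None                            # no road visible -> stop or recover
    left, right = columns[0], columns[-1]
    # Aim at a point 75 % of the way from the left road edge to the right one.
    return int(left + keep_right_ratio * (right - left))
```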

The robot's autonomous navigation also includes a so-called recovery mode, which was developed on the Androver II platform and is used on the Husky A200 as well. This is reactive navigation that does not rely on a map of the area. The mode is activated when the robot is unable to localize itself correctly on the map; it is then navigated by the visual system until it is correctly localized again.
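The mode switch itself can be summarized as a small state machine: map-based navigation while localization is reliable, reactive visual navigation otherwise. The sketch below assumes a hypothetical localization_ok signal; the real recovery logic is RoboTech Vision's own and is not reproduced here.

```python
from enum import Enum, auto

class Mode(Enum):
    MAP_NAVIGATION = auto()    # follow the planned trajectory on the map
    RECOVERY = auto()          # reactive, map-less navigation from the cameras

def next_mode(mode, localization_ok):
    """Fall back to reactive navigation while the pose on the map is unreliable."""
    if mode is Mode.MAP_NAVIGATION and not localization_ok:
        return Mode.RECOVERY
    if mode is Mode.RECOVERY and localization_ok:
        return Mode.MAP_NAVIGATION
    return mode
```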

The autonomous navigation algorithm deployed on the Husky A200 is known as navigation using a map of the environment. According to the 451 Research Group, a robot running our algorithms meets the sixth degree of autonomy. “This is very often the difference. Some devices are described as autonomous, although they cannot make decisions at intersections or avoid obstacles,” adds the company's other CEO, Ing. Martin Smoľák.

Author of the post

Dominika Krajčovičová

Marketing manager

Related articles

We have developed autonomous map navigation and obstacle avoidance

We are completing autonomous navigation in the vineyard using a visual system

Husky A200 robot recognizes and autonomously follows objects
