Location

Grand Wailea, Hawaii

Event Website

https://hicss.hawaii.edu/

Start Date

January 8, 2019 12:00 AM

End Date

January 11, 2019 12:00 AM

Description

In this paper we present an intelligent system that helps autonomous vehicles operate in real cities under local traffic rules. We propose a 2D and 3D visual attention system capable of detecting the signs and barriers used when a road is blocked (roadwork, a traffic accident, etc.). To make this possible, we analyze the traffic cones and signs that usually alert drivers to this type of situation. The main objective is to enable autonomous vehicles to find an auxiliary route that has not been previously mapped. For this we use a Grid Point Cloud Map: the ORB-SLAM visual odometry system lets us place each stereo-frame point cloud at the pose where the images were collected, and by concatenating the point clouds generated by the stereo camera, each grid block captures the main characteristics of its region so that an auxiliary route can be mapped. In this type of situation the vision system must run in real time. The results are promising: we obtained 98.4% accuracy in the 2D classification task and 83% accuracy in the single-frame 3D detection task.
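
For illustration only, the sketch below shows one way the grid point-cloud accumulation described in the abstract could be organized: each stereo-frame point cloud is transformed into the world frame using the camera pose estimated by a visual odometry system such as ORB-SLAM, and the transformed points are binned into fixed-size grid cells. This is not the authors' implementation; the class, method, and parameter names (GridPointCloudMap, add_frame, cell_size, etc.) are assumptions made for the example.

    # Minimal sketch of grid point-cloud accumulation (assumptions noted above).
    import numpy as np
    from collections import defaultdict


    class GridPointCloudMap:
        def __init__(self, cell_size=0.5):
            self.cell_size = cell_size          # grid resolution in meters (assumed value)
            self.cells = defaultdict(list)      # (i, j) cell index -> list of 3D points

        def cell_of(self, point):
            # Index a point by its x/z ground-plane coordinates (camera-style axes assumed).
            return (int(np.floor(point[0] / self.cell_size)),
                    int(np.floor(point[2] / self.cell_size)))

        def add_frame(self, points_cam, T_world_cam):
            """Transform an N x 3 point cloud from the camera frame to the world frame
            using a 4 x 4 pose (as estimated by a SLAM/odometry system) and accumulate
            the points into grid cells."""
            pts_h = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])
            pts_world = (T_world_cam @ pts_h.T).T[:, :3]
            for p in pts_world:
                self.cells[self.cell_of(p)].append(p)

        def cell_height_stats(self):
            # Example per-cell summary (mean height); evidence of cones or traffic signs
            # could be attached to cells in the same way.
            return {ij: float(np.mean(np.asarray(pts)[:, 1]))
                    for ij, pts in self.cells.items()}


    if __name__ == "__main__":
        grid = GridPointCloudMap(cell_size=0.5)
        rng = np.random.default_rng(0)
        points = rng.uniform(-5.0, 5.0, size=(1000, 3))   # synthetic stereo point cloud
        pose = np.eye(4)                                  # identity pose for the demo
        grid.add_frame(points, pose)
        print(len(grid.cells), "occupied cells")

In a real pipeline, the pose would come from the odometry backend for every stereo frame, and each cell would also accumulate the 2D/3D detections (cones, signs) used to decide where an auxiliary route is feasible.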

Computer Vision System with 2D and 3D Data Fusion for Detection of Possible Auxiliaries Routes in Stretches of Interdicted Roads

https://aisel.aisnet.org/hicss-52/st/smart_app_development/2