
MAPPING MODULE

Members:

• Ahmet Alper Uzuntepe.

• Ahmet Cebeci.

• Burcu Sultan Orhan.

• Furkan Aydın.

• Hasan Mutlu.

• Şamil Berat Delioğulları.

• Sena Özbelen.

• Taha Kınalı.


Responsibilities:

• Implement the software needed to create maps of the environment from video footage and other sensor data coming from the robot.

• Define the minimum requirements for the video footage to be used by the mapping algorithms.

• Ensure the software can create maps, visualize them, and localize the robot within those maps in real time (a rough interface sketch follows this list).
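The snippet below is a minimal sketch, in Python, of how such a mapping interface could be organized. The class and method names (MappingModule, process_frame, get_map) are illustrative assumptions rather than a finalized design; the real implementation will wrap the chosen Visual SLAM back end.

    import numpy as np

    class MappingModule:
        """Builds a map from incoming frames and tracks the robot's pose."""

        def __init__(self):
            self.map_points = []           # accumulated sparse 3D landmarks
            self.current_pose = np.eye(4)  # 4x4 homogeneous pose of the robot

        def process_frame(self, frame, sensor_data=None):
            """Feed one video frame (plus optional sensor data) into the SLAM
            pipeline and return the updated pose estimate."""
            # Placeholder: the real version would run feature tracking and
            # pose/map optimization here.
            return self.current_pose

        def get_map(self):
            """Return the sparse 3D map built so far as an N x 3 array."""
            return np.asarray(self.map_points, dtype=float).reshape(-1, 3)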

Module Interactions:

• Control Station Module: Work together to build an interface that can display the generated maps in the control station (a rough message-format sketch follows below).

• Embedded Programming Module: Learn what kinds of data the robot's sensors can provide.
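As an assumption about what that map-display interface might look like, the sketch below packages the current sparse map and robot pose into a JSON message; the actual format and field names will be agreed with the Control Station team.

    import json

    def pack_map_update(map_points, pose, frame_id):
        """Serialize sparse map points (a list of [x, y, z] triples) and a
        4x4 pose matrix into a JSON message for the control station."""
        return json.dumps({
            "frame_id": frame_id,
            "pose": [list(row) for row in pose],          # 4x4, row-major
            "map_points": [list(p) for p in map_points],  # N x 3
        })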


Technologies to Be Used:

• OpenCV: One of the most widely used computer vision libraries. Since our maps will be created with Visual SLAM algorithms, OpenCV is going to be needed (see the sketch after this list).

• ORB-SLAM: A Visual SLAM algorithm that computes the camera trajectory and reconstructs a sparse 3D map of the scene in real time.

• Either C++ or Python may be used, depending on which proves easier and more productive for the team.
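To illustrate how OpenCV fits in, here is a minimal Python sketch of the per-frame work a Visual SLAM front end does: read a frame, convert it to grayscale, and extract ORB features (the feature type ORB-SLAM tracks). The video path "footage.mp4" is only a placeholder.

    import cv2

    cap = cv2.VideoCapture("footage.mp4")  # placeholder path to robot footage
    orb = cv2.ORB_create(nfeatures=1000)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        keypoints, descriptors = orb.detectAndCompute(gray, None)
        # A full pipeline would match these descriptors against the previous
        # frame / map to estimate camera motion and extend the map.
        print(f"{len(keypoints)} ORB features in this frame")

    cap.release()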
