JD Cloud, the cloud computing arm of Chinese e-commerce company JD.com, announced on December 21 that the joint research center on future intelligent vision technology, established by JD.com and Chongqing University of Posts and Telecommunications, has put forward a fusion perception model based on vision and lidar. The model is being adapted to the Martian environment and will in the future be used by rovers to perceive and understand their surroundings on the planet.
At the same time, the joint research center will conduct R&D on the problems of high data-labeling costs and insufficient labeled data for the Martian environment. Papers have already been submitted to CVPR, a top international conference in the field of computer vision, and key technologies have been patented.
According to a person in charge at the JD Explore Academy, algorithms based purely on visual perception or on lidar alone struggle to handle perception tasks in the Martian environment. The joint research center's proposed vision-lidar fusion model fully exploits the unique information of each modality on the one hand, and lets the two modalities complement each other through fusion on the other, providing support for the future development of Mars exploration in China.
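The article does not publish the model's architecture, but the described idea of keeping each modality's own information while combining the two can be illustrated with a minimal late-fusion sketch. All function names, feature shapes, and the simple pooling operations below are illustrative assumptions, not JD's actual method:

```python
import numpy as np

# Hypothetical sketch only: the real encoders would be learned networks,
# not simple pooling operations.

def camera_branch(image_feat: np.ndarray) -> np.ndarray:
    """Stand-in for a vision encoder: pool image patch features to one vector."""
    return image_feat.mean(axis=0)

def lidar_branch(point_feat: np.ndarray) -> np.ndarray:
    """Stand-in for a lidar encoder: pool point-cloud features to one vector."""
    return point_feat.max(axis=0)

def fuse(cam_vec: np.ndarray, lidar_vec: np.ndarray) -> np.ndarray:
    """Concatenation keeps each modality's own information intact while a
    downstream perception head can draw on both (late fusion)."""
    return np.concatenate([cam_vec, lidar_vec])

rng = np.random.default_rng(0)
image_feat = rng.normal(size=(64, 16))   # e.g. 64 image patches, 16-dim each
point_feat = rng.normal(size=(128, 16))  # e.g. 128 lidar points, 16-dim each

fused = fuse(camera_branch(image_feat), lidar_branch(point_feat))
print(fused.shape)  # (32,)
```

In practice, fusion models also add cross-modal interaction layers so the two streams can correct each other, which is one way lidar range data can compensate for poor visual conditions on terrain like Mars.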
The joint research center was officially unveiled on November 7. The center will combine JD.com's industry advantages with the university's talent to accelerate research on cutting-edge technologies and promote scientific and technological innovation. It is now carrying out R&D on key future technologies such as “Mars unmanned rovers and land-air freight logistics vehicles”.
In August this year, the “Chongli” rover developed by Chongqing University of Posts and Telecommunications and JD Explore Academy was unveiled at the 2022 Smart China Expo.
The “Chongli” rover is equipped with self-developed visual perception and cognition modules, which enable autonomous collection of visual data, target detection, safety evaluation, travel decision-making and map construction.
The vehicle adopts six-wheel drive, which improves its ability to maneuver over soft, sandy terrain. Its sensing suite includes lidars, binocular cameras, navigation cameras, obstacle avoidance cameras and other sensors. The software platform is based on the ROS system and integrates a basic navigation map module. The rover can climb slopes of up to 30 degrees, reach a top speed of 12 km/h, and cover a cruising range of more than 20 km.