NCKU aims to develop artificial intelligence across multiple fields.
Autonomous driving at NCKU
Autonomous vehicles are expected to play an important role in future mobility services and smart cities. To pave the way for research on mobility transformation, NCKU has set up an autonomous driving platform to facilitate a fusion of multidisciplinary research for greater insight.
In early 2018, the NCKU autonomous vehicle made its debut by demonstrating Society of Automotive Engineers (SAE) Level 4 capability on NCKU’s campus. In 2019, NCKU opened an official Ministry of Science and Technology autonomous vehicle test site called the Taiwan CAR (connected, autonomous road-test) lab. Since then, the NCKU Taiwan CAR lab has been synonymous with advanced autonomous driving technologies and with promoting services augmented by autonomous vehicles. Here are the current CAR lab research achievements:
- For drive-by-wire vehicle development, the vehicle is equipped with perception sensors, including lidar, radar and video cameras, whose outputs are fused using advanced deep-learning techniques.
- Innovative localisation algorithms have been developed and tested to ensure accuracy and integrity in extremely challenging environments.
- The autonomous vehicle is connected for seamless integration of cloud-based and edge-computed data, resulting in safer vehicle operation.
- The system has been developed to exploit high-definition maps for lane-level path planning and vehicle guidance.
- A mobile app was developed to facilitate mobility as a service.
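One common way to combine lidar and camera perception, as in the sensor-fusion work above, is late fusion: detections from each sensor are matched by position and their confidences combined. The sketch below is purely illustrative; the gating threshold, weighting and data are hypothetical, not the CAR lab's actual pipeline.

```python
# Illustrative late-fusion sketch: merge lidar and camera detections of the
# same object by distance gating, then combine their confidence scores.
# Detections are (x, y, confidence) tuples in a common ground frame.

import math

def fuse_detections(lidar_dets, camera_dets, gate=1.0):
    """Pair detections whose (x, y) positions lie within `gate` metres and
    average their confidences; unmatched detections pass through unchanged."""
    fused, used = [], set()
    for lx, ly, lconf in lidar_dets:
        best, best_d = None, gate
        for j, (cx, cy, cconf) in enumerate(camera_dets):
            d = math.hypot(lx - cx, ly - cy)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:
            cx, cy, cconf = camera_dets[best]
            used.add(best)
            # Average position and confidence for the fused track.
            fused.append(((lx + cx) / 2, (ly + cy) / 2, (lconf + cconf) / 2))
        else:
            fused.append((lx, ly, lconf))
    # Camera-only detections that no lidar return matched.
    fused += [c for j, c in enumerate(camera_dets) if j not in used]
    return fused

tracks = fuse_detections([(10.0, 2.0, 0.9)],
                         [(10.2, 2.1, 0.7), (30.0, 5.0, 0.6)])
```

In practice the matching would run in 3D with calibrated extrinsics, and a deep network would replace the hand-tuned gate, but the gating-then-combining structure is the same.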
The NCKU CAR lab research team has also released an open-source simulator for the development and verification of autonomous driving software. Additionally, the NCKU autonomous vehicle has served as an example case for the Ministry of the Interior high-definition map project and the National Development Council operational control centre project. Furthermore, collaborations with local industries have been conducted to enhance perception, localisation and control techniques, increasing situational awareness and safety. Finally, the team is actively collaborating with international partners to provide a “unique academic research niche” for technology development and personnel training.
AI robots at NCKU
The Advanced Intelligent Robot and System laboratory (aiRobots lab) at NCKU focuses on the design and implementation of two types of intelligent robots: the humanoid robot and the service robot. Here is the current status of each robot project:
Humanoid robots: The lab has designed and implemented child-sized, teen-sized and adult-sized humanoid robots to compete in two well-known international robot competitions: RoboCup and the Federation of International Robot-soccer Association RoboWorld Cup. Over the past decades, these robots have also won the all-round champion award multiple times in the HuroCup league.
Currently, the lab is developing a toddler-sized robot with the gross-motor and fine-motor skills of a two-year-old. The dimensions of the torso, legs and arms are learned and optimised using the artificial bee colony algorithm. Q-learning is applied to resolve the transitions between different types of motions. To improve the gait pattern, a double deep Q-learning network is combined with inertial measurement units and force sensors to establish the robot’s ability to walk on uneven terrain in real time. Finally, an unsupervised learning architecture and self-exploration capability enable the robot to interact with the environment in order to identify specific shape categories from different toys.
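The Q-learning approach to motion transitions mentioned above can be illustrated with a tabular sketch. The toy MDP here, with four motion states and a reward for reaching a target motion, is entirely invented for illustration; the lab's actual state and reward design is not described in this article.

```python
# Tabular Q-learning sketch: learn which transition ("stay", "up", "down")
# to take from each motion state to reach the target motion ("walking").
# States, actions and rewards are hypothetical.

import random

STATES = ["lying", "sitting", "standing", "walking"]
ACTIONS = ["stay", "up", "down"]
GOAL = "walking"

def step(state, action):
    """Deterministic toy transition model with a goal-shaped reward."""
    i = STATES.index(state)
    if action == "up":
        i = min(i + 1, len(STATES) - 1)
    elif action == "down":
        i = max(i - 1, 0)
    nxt = STATES[i]
    reward = 1.0 if nxt == GOAL else -0.1
    return nxt, reward

def train(episodes=1000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        for _ in range(10):
            # Epsilon-greedy action selection.
            a = rng.choice(ACTIONS) if rng.random() < eps else \
                max(ACTIONS, key=lambda a: Q[(s, a)])
            nxt, r = step(s, a)
            # Standard Q-learning temporal-difference update.
            Q[(s, a)] += alpha * (r + gamma * max(Q[(nxt, b)] for b in ACTIONS)
                                  - Q[(s, a)])
            s = nxt
    return Q

Q = train()
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
```

The double deep Q-learning gait controller replaces this table with two neural networks fed by IMU and force-sensor readings, but the temporal-difference update is the same core idea.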
The lab is also developing a robotic hand to perform dexterous motions – for example, carrying heavy objects with two hands – in a low-cost and efficient manufacturing process. Furthermore, emotion recognition and expression schemes are applied to allow robots to interact with humans in a more natural and friendly way. The fine-motor skills of the toddler-sized robot will be enhanced by an autonomous and imitation learning scheme. This permits the robot to draw a line or circle, and play a shape-sorting cube game, similar to human toddlers.
Service robots: The lab designed and implemented home-service robots to compete in the @Home league of RoboCup. Over the past few decades, the service robots have achieved multiple second- and third-place finishes at the RoboCup Japan Open. Currently, the lab is developing artificial intelligence-based service robots to work as a mega-supermarket shopper and as a warehouse worker. The service robot can navigate in a complex environment, avoid obstacles and grasp desired objects. The robot is the height of an average adult and is equipped with a RealSense camera, a microphone array, a SICK laser rangefinder, 16 actuators on its arms and hands, two lead screws, speakers and a central processing unit. For real-time image processing, OpenCV and YOLO are utilised to establish the visual system, which can identify surroundings, the appearance and depth of objects, and personnel locations.
The robot is equipped with a speech recognition system to receive commands from people and communicate responses for natural interaction. The mobile platform is realised by a four-wheel independent steering and driving structure. To identify object coordinates, an RGBD (red, green, blue, depth) convolutional neural network is used to classify the orientation of the object.
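Once the visual system has detected an object and read its depth, the pixel location is typically back-projected into 3D camera coordinates with the pinhole model, giving the coordinates the manipulator needs. A minimal sketch follows; the camera intrinsics are made up and are not those of the lab's RealSense camera.

```python
# Back-project a detected object's pixel centre and depth reading into 3D
# camera-frame coordinates using the pinhole camera model.
# fx, fy: focal lengths in pixels; (cx, cy): principal point.

def pixel_to_camera(u, v, depth_m, fx, fy, cx, cy):
    """Return (X, Y, Z) in metres for pixel (u, v) at depth depth_m."""
    X = (u - cx) * depth_m / fx   # metres right of the optical axis
    Y = (v - cy) * depth_m / fy   # metres below the optical axis
    return (X, Y, depth_m)        # Z is the measured depth itself

# Hypothetical example: object at pixel (400, 300), depth 1.5 m, for a
# 640x480 image with fx = fy = 600 and principal point (320, 240).
X, Y, Z = pixel_to_camera(400, 300, 1.5, 600.0, 600.0, 320.0, 240.0)
# X = 0.2 m right, Y = 0.15 m down, Z = 1.5 m forward
```

A real pipeline would then transform this camera-frame point into the robot's base frame using the calibrated camera pose before handing it to the motion planner.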
A deep deterministic policy gradients-based motion planning and control system is used to train a seven-degree-of-freedom manipulator, allowing it to move in 3D space without colliding with itself or environmental obstacles.
To automate supermarket and/or warehouse logistics, the lab will construct an Internet of Things system to connect all of the robots. This system will use global optimisation, with each robot's task assigned or rearranged according to the distances between robots, the urgency of the task, what task each robot is currently performing, and other parameters. The designed and implemented service robots will play important roles in successful cyber-physical systems.
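Global task assignment of the kind described above can be sketched as minimising a cost that blends travel distance and task urgency over all robot-to-task mappings. The formulation below is a toy version with invented weights and coordinates; a real fleet would use the Hungarian algorithm or another polynomial-time solver rather than exhaustive search.

```python
# Toy global task assignment: choose the robot-to-task mapping that
# minimises total cost, where cost = weighted distance minus weighted
# urgency (urgent tasks are claimed sooner). All numbers are invented.

import itertools
import math

def cost(robot_pos, task, w_dist=1.0, w_urgency=2.0):
    (tx, ty), urgency = task
    dist = math.hypot(robot_pos[0] - tx, robot_pos[1] - ty)
    return w_dist * dist - w_urgency * urgency

def assign(robots, tasks):
    """Return a tuple giving the task index for each robot, minimising
    total cost by exhaustive search (fine for a handful of robots)."""
    best, best_total = None, float("inf")
    for perm in itertools.permutations(range(len(tasks)), len(robots)):
        total = sum(cost(robots[i], tasks[j]) for i, j in enumerate(perm))
        if total < best_total:
            best, best_total = perm, total
    return best

robots = [(0.0, 0.0), (10.0, 0.0)]               # robot positions
tasks = [((9.0, 1.0), 0.5), ((1.0, 1.0), 0.9)]   # ((x, y), urgency)
assignment = assign(robots, tasks)
# The nearby, more urgent task goes to robot 0: assignment == (1, 0)
```

The extra parameters the lab mentions, such as what task a robot is already performing, would simply enter the cost function as additional weighted terms.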