Driven by the rapid development of deep learning, intelligent vision tasks such as visual recognition, object detection, tracking, and segmentation are being embedded into diverse intelligent systems, including smartphones, self-driving cars, and smart factories. Although they reach near-human-level performance, state-of-the-art methods are developed on pre-collected training and testing datasets, and they seldom achieve the same performance in the real world because complex and dynamic degradations cannot be covered by such datasets, however large they are. Moreover, with limited datasets, the robustness and security of these vision methods cannot be evaluated and studied extensively.
Motivated by these challenges, Dr Guo Qing's research goal is to develop automatic techniques and practical tools for actively searching for vulnerabilities and evaluating the robustness of deep neural networks (DNNs) across diverse computer vision tasks, and for automatically enhancing or repairing those DNNs. Specifically, Dr Guo's group studies DNN vulnerability from the perspective of adversarial attacks, an effective and fundamental way to reveal the weaknesses of DNNs. Nevertheless, most existing works focus on additive-perturbation-based attacks, neglecting natural and important degradations, e.g., motion blur, light variation, weather variation, and vignetting, which are widespread in daily life. In this talk, Dr Guo will share a series of natural-degradation-aware adversarial attacks, including adversarial blur, rain, vignetting, and exposure attacks, which cover a wide range of vision-based intelligent tasks, e.g., visual recognition, visual object tracking, and object detection. In addition, he will discuss benchmark construction for robustness evaluation. On the side of DNN enhancement and repair, Dr Guo will introduce his group's data restoration and data augmentation techniques for improving the robustness of image-based DNNs.
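To make the distinction between the two attack families concrete, the sketch below contrasts a standard additive-perturbation attack (an FGSM-style step) with a degradation-parameterized attack that optimizes a single exposure (gamma) parameter instead of a per-pixel perturbation. This is only an illustrative sketch, not Dr Guo's specific methods; `model`, `x`, and `y` are assumed to be a PyTorch classifier, a batch of images in [0, 1], and their labels.

```python
import torch
import torch.nn.functional as F

def additive_attack(model, x, y, eps=8 / 255):
    """Additive-perturbation attack: add a bounded per-pixel perturbation (FGSM-style)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    # Step in the direction that increases the loss, clipped to the valid pixel range.
    return torch.clamp(x_adv + eps * x_adv.grad.sign(), 0.0, 1.0).detach()

def exposure_attack(model, x, y, steps=10, lr=0.05):
    """Degradation-aware attack (hypothetical example): optimize one exposure (gamma)
    parameter per image so the degraded image stays natural-looking yet misleads the model."""
    gamma = torch.ones(x.size(0), 1, 1, 1, requires_grad=True)
    opt = torch.optim.Adam([gamma], lr=lr)
    x_safe = torch.clamp(x, 1e-6, 1.0)          # avoid 0 ** negative exponent
    for _ in range(steps):
        x_deg = x_safe ** gamma                  # gamma-style exposure change
        loss = -F.cross_entropy(model(x_deg), y) # maximize the classification loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.clamp(x_safe ** gamma.detach(), 0.0, 1.0)
```

The key design difference is the search space: the additive attack perturbs every pixel within an imperceptibility bound, while the degradation-aware attack searches a low-dimensional, physically interpretable parameter space (here, exposure), which is why such attacks tend to remain visually natural.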