[5A2] Flexible component identification for automated scanning and real-time defect detection
A Hifi¹, V Tunukovic¹, S McKnight¹, E Mohseni¹, R Vithanage¹, G Pierce¹, T O'Hare², J O'Brien-O'Reilly², G Munro², M Grosser³, C MacLeod¹ and A Gachagan¹
¹University of Strathclyde, UK
²Spirit AeroSystems, UK
³Spirit AeroSystems, USA
Carbon fibre-reinforced plastics (CFRPs) account for about 50% of the total material weight in flagship Airbus and Boeing aircraft models. The post-manufacture quality of CFRP components is predominantly ensured through phased array ultrasonic testing (PAUT). Delivering PAUT via robotic systems has improved accuracy, consistency and scanning times, resulting in higher data throughput. However, the resulting data must still be processed manually, creating a bottleneck in the quality control pipeline.
To automate the inspection and data interpretation processes, this work proposes an integrated system comprising flexible robotic path planning, robotic and ultrasonic control and a machine learning (ML) processing and interpretation framework. The robotic set-up is based on a KUKA LBR iiwa collaborative robot, equipped with a Zivid 3D camera that captures a region of interest. Component identification is performed using proprietary Python modules, followed by a bespoke path planning module. Motion commands are then sent to the robot, which relies on real-time force feedback to maintain the stable contact force required for PAUT inspection. Real-time data processing generates a comprehensive view encompassing robotic information, a live ultrasonic B-scan and an automatically generated C-scan. Data visualisation is supplemented with several ML models, enabling analysis of individual B-scans, detection of defective areas in amplitude C-scans and full volumetric processing.
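As an illustration of the path planning step, the sketch below generates a serpentine raster path over the bounding box of an identified component's point cloud. It is a minimal example, not the proprietary modules described above: the function name, the near-flat-panel assumption and all parameter values are hypothetical.

```python
import numpy as np

def plan_raster_path(xyz, step=5.0, margin=10.0):
    """Hypothetical serpentine raster path over a planar component region.

    xyz    : (N, 3) point cloud of the identified component, in mm.
    step   : raster pitch between adjacent scan lines, in mm.
    margin : inset from the component boundary, in mm.
    Returns an (M, 3) array of probe waypoints at the mean component height.
    """
    # Axis-aligned bounds of the identified region, inset by the margin.
    (x_min, y_min), (x_max, y_max) = xyz[:, :2].min(0), xyz[:, :2].max(0)
    x_min, x_max = x_min + margin, x_max - margin
    y_min, y_max = y_min + margin, y_max - margin
    z = xyz[:, 2].mean()  # sketch assumes a near-flat panel

    waypoints = []
    for i, y in enumerate(np.arange(y_min, y_max + step, step)):
        xs = (x_min, x_max) if i % 2 == 0 else (x_max, x_min)  # alternate direction
        waypoints += [(xs[0], y, z), (xs[1], y, z)]
    return np.asarray(waypoints)
```

In practice each waypoint would be refined against the camera-derived surface normal and tracked under the force-feedback controller rather than followed open loop.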
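The automatic C-scan generation can likewise be illustrated with a common gating approach, in which each C-scan pixel is the peak rectified amplitude within a depth gate of the A-scan at that position. This is a generic sketch under that assumption; the exact processing chain used in this work is not specified here.

```python
import numpy as np

def gated_amplitude_cscan(ascans, gate_start, gate_end):
    """Peak-amplitude C-scan from a grid of A-scans (a common gating scheme,
    not necessarily the processing used in this work).

    ascans     : (ny, nx, nt) array of A-scan time traces on the scan grid.
    gate_start : first time sample of the depth gate.
    gate_end   : last time sample of the depth gate (exclusive).
    Returns an (ny, nx) amplitude C-scan.
    """
    gated = np.abs(ascans[:, :, gate_start:gate_end])  # rectify within the gate
    return gated.max(axis=-1)                          # peak amplitude per position
```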