Interview: AI for industrial inspection 04 April 2019

Artificial Intelligence for Industrial Inspection was the title, and subject, of a January conference organised by the Bristol Robotics Lab and the Centre for Modelling & Simulation. The latter's AI domain specialist, Kiran Krishnamurthy, explained more following the event.

What challenges are posed by manual inspection?
Manual inspection verifies whether a product conforms to specified requirements, but it poses several challenges. The sheer volume of inspections demands a large and costly workforce, and because the activity is essential but not revenue-generating, it represents a drain on resources. The repetitive nature of the work also makes inspections susceptible to human error, and safety is a further challenge when operating in confined spaces.

What is artificial intelligence (AI), and how does it work?
In its simplest form, AI is a computer system able to perform tasks that usually require human intelligence – for example, visual perception, decision-making, and reacting to real-life or ad hoc scenarios.

The goal of any AI system is to predict an outcome based on supplied inputs, and it must first learn how to make those predictions. This learning is achieved largely through three kinds of techniques: supervised learning, where the system learns the relationship between a set of inputs and their corresponding outcomes; unsupervised learning, where the system groups the supplied data according to similarities in certain characteristics; and reinforcement learning, where the system learns to predict through repeated trial-and-error exploration.
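To make the supervised-learning idea concrete, here is a minimal sketch (not from the interview): the system is given inputs paired with known outcomes, and predicts the outcome for a new input. A one-nearest-neighbour rule stands in for the learned model, and the inspection feature names and values are entirely hypothetical.

```python
def predict_1nn(training_data, new_input):
    """Return the outcome label of the training example closest to new_input."""
    def distance(a, b):
        # squared Euclidean distance between two feature tuples
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(training_data, key=lambda example: distance(example[0], new_input))
    return nearest[1]

# Hypothetical inspection features: (scratch_length_mm, dent_depth_mm) -> outcome
training_data = [
    ((0.1, 0.0), "pass"),
    ((0.2, 0.1), "pass"),
    ((4.0, 1.2), "defect"),
    ((3.5, 0.9), "defect"),
]

# A new part measuring close to the known defective examples
print(predict_1nn(training_data, (3.8, 1.0)))  # prints "defect"
```

A real inspection system would learn from thousands of labelled images rather than two hand-picked measurements, but the principle is the same: labelled examples in, predicted outcomes out.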

What inspection processes can AI help with?
In general, anything involving visual inspection is low-hanging fruit from an AI perspective – from the manual inspection of large volumes of small components to the remote inspection of critical assets and infrastructure such as oil rigs and wind turbines. There are also examples where AI pairs with other technologies in the inspection process, such as drones on construction sites and in major floods or earthquakes. Satellite images are used to monitor potential geopolitical threats, as well as conditions at sea and weather patterns. Data-rich Internet of Things sensors in robotics are another prime candidate for AI solutions: the sensor feedback can be harvested with AI to improve robotic automation tasks.

Where do you see AI and industrial inspection in the future?
[Research company] Gartner has identified five stages of AI maturity (www.is.gd/mepana). The levels vary from raising awareness of the benefits of using AI (stage one) to an organisation becoming fully transformed to use AI (stage five). I’d say we are currently sitting between stages one and two (experimentation), and in my optimistic view, we’ll be at stage four (pervasive use) in the next 10 years. That might be too optimistic though, so perhaps we’ll be at stage three (operational).

CFMS has been developing three demonstrators. Tell me more…
Each one aims to disseminate emerging AI technologies across the aerospace industry to automate inspection tasks.

The first relates to the manual inspection of an aircraft wing to detect imperfections, such as foreign object damage and cracks, where the typical challenge is accessibility due to the wing's size. The demonstrator was developed in collaboration with the National Composites Centre using its camera drone. The drone captured the wing's condition on video, which was used to train the AI model to detect defects and features such as bolts. This is equally applicable to inspection activities on wind farms, oil rigs and remote construction sites.

In the second demonstrator, a phone captured footage of the inside of the wing box – a confined space. The footage was again used to train the AI model to detect defects. This is equally applicable to inspection activities in underground utility infrastructure or underwater pipelines.

The third demonstrator relates to the inspection stage of a composite manufacturing process involving an automated fibre placement machine. Monitoring and documenting are currently manual steps, which leads to high costs and variation in the documentation. Fixed camcorder footage was used to train the AI model to detect defects such as tow gaps, and features of untrimmed composite components such as 'bat ears'.

Adam Offord

