IBM, Tokyo Electron eye AI in chip manufacturing

Article by: Vivek Nanda

With the broader aim to increase yields and reduce costs, IBM Research and Tokyo Electron are collaborating to inject cognitive computing into chip-making equipment.

In a recent blog post by Mukesh Khare, VP, Semiconductor Research, IBM Research, the company laments that computer chips can fail in countless ways and from countless sources during the manufacturing process. Despite the rich sensor data available during processing, foundries may still face low wafer yields as well as performance and reliability issues, which drive up costs.

The post announces that IBM and Tokyo Electron Ltd (TEL) have signed a joint development programme to add cognitive capabilities to TEL’s chip manufacturing equipment to track down anomalies so that the equipment keeps running smoothly.

This will give you an idea of how much data IBM's AI will need to decipher: a typical semiconductor fab generates over 1.5TB of data per hour and over 5 million transactions per day, from over 50,000 sensors. And that does not include the more than 400TB of storage needed to support real-time applications, like a fab’s process control loops that keep each complex action consistent.

IBM Research is developing advanced machine learning “pipelines” to construct prediction models from big data: structured data as well as unstructured operational data from tools, logs and on-wafer measurements. With TEL, IBM wants to apply anomaly detection to all that sensor data from manufacturing equipment and wafer measurements, and to identify hidden patterns that could serve as early-warning signs of abnormal behaviour.
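
Neither company has published details of these pipelines, but the general shape of such a system can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration using scikit-learn's IsolationForest on invented tool-sensor features (chamber pressure, RF power, gas flow); it is not IBM's actual pipeline.

```python
# Minimal, hypothetical sketch (not IBM's pipeline): unsupervised anomaly
# detection on structured tool-sensor data with scikit-learn.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Invented per-run sensor summaries for "healthy" historical runs.
rng = np.random.default_rng(0)
healthy = pd.DataFrame({
    "chamber_pressure": rng.normal(20.0, 0.3, 500),
    "rf_power":         rng.normal(1500, 10.0, 500),
    "gas_flow":         rng.normal(45.0, 0.5, 500),
})

# Fit on healthy history, then flag new runs whose sensors deviate jointly.
model = make_pipeline(StandardScaler(),
                      IsolationForest(contamination=0.01, random_state=0))
model.fit(healthy)

new_runs = healthy.tail(5).copy()
new_runs.iloc[-1, new_runs.columns.get_loc("rf_power")] += 80  # inject a drift
print(model.predict(new_runs))  # -1 marks a suspected anomaly, +1 looks normal
```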

Detecting anomalies at nanometre scale

IBM believes cognitive anomaly warnings would be far more sophisticated and sensitive than typical fault detection methods, such as simple alarms on individual sensors. They would enable tighter control of tool performance and feedback loops that correct or reset the equipment. IBM also wants to use cognition to help identify and control for subtle equipment and process variations caused by process history, wafer history or other changes to the fab facilities. When you are dealing with nanometre-scale manufacturing, tolerances may be limited to less than 1nm of variation, something beyond the resolution of today's metrology techniques.
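
The gap between single-sensor alarms and a joint, model-based check can be seen in a toy example. The alarm bands, readings and the Mahalanobis-distance test below are all invented for illustration; the point is only that a reading can sit inside every individual alarm band and still be anomalous when the sensors are considered together.

```python
# Toy illustration (invented numbers): a reading can stay inside every
# per-sensor alarm band yet be anomalous when sensors are viewed jointly.
import numpy as np

# Alarm bands a conventional per-sensor fault-detection setup might use.
bands = {"chamber_pressure": (19.0, 21.0), "rf_power": (1470, 1530)}
reading = {"chamber_pressure": 20.9, "rf_power": 1472}   # each value is in-band

simple_alarm = any(not (lo <= reading[k] <= hi) for k, (lo, hi) in bands.items())
print("per-sensor alarm:", simple_alarm)                 # False: nothing trips

# Historically, pressure and power move together, so a high-pressure /
# low-power combination is unusual even though each sensor is in-band.
history = np.random.default_rng(1).multivariate_normal(
    mean=[20.0, 1500.0], cov=[[0.09, 2.7], [2.7, 100.0]], size=1000)
mu, cov = history.mean(axis=0), np.cov(history, rowvar=False)
x = np.array([reading["chamber_pressure"], reading["rf_power"]])
d2 = (x - mu) @ np.linalg.inv(cov) @ (x - mu)            # squared Mahalanobis distance
print("joint anomaly:", d2 > 9.21)                       # chi-square(2) cutoff, p = 0.01
```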

Data analysis at the edge

The joint research with TEL will first focus on understanding the ontology of tool behaviours, using structured data from sensors on TEL’s etching equipment, which is used for EUV-patterned interconnects. This data will be coupled with IBM’s wafer measurements, alongside unstructured data such as maintenance logs and images, to automatically learn patterns that are precursors of abnormality. These failure-mode patterns will then be deployed on TEL’s etching equipment to identify abnormalities in real time and allow mitigation during operation.
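
As a rough sketch of how unstructured text such as maintenance logs might be folded in alongside structured sensor features, one option is to vectorise the log text and concatenate it with scaled sensor readings before fitting an anomaly detector. The columns, log entries and choice of model below are illustrative assumptions, not the disclosed IBM/TEL approach.

```python
# Hypothetical sketch: combine structured sensor features with unstructured
# maintenance-log text before fitting an anomaly detector.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import IsolationForest
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

runs = pd.DataFrame({
    "chamber_pressure": [20.1, 19.9, 20.0, 22.4],
    "rf_power":         [1502, 1498, 1501, 1450],
    "maintenance_log":  ["routine clean completed",
                         "routine clean completed",
                         "no action",
                         "focus ring replaced after arcing event"],
})

features = ColumnTransformer([
    ("sensors", StandardScaler(), ["chamber_pressure", "rf_power"]),
    ("logs", TfidfVectorizer(), "maintenance_log"),   # a single text column
])
detector = Pipeline([
    ("features", features),
    ("forest", IsolationForest(contamination=0.25, random_state=0)),
])
detector.fit(runs)
print(detector.predict(runs))   # -1 should single out the run with drifted sensors and an unusual log
```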

TEL is likely to incorporate the AI into streaming analytics on its equipment, at the “edge,” according to Khare. TEL could also use a hybrid cloud option, with streaming analytics at the edge and asset fleet analytics on a secure cloud. This would help TEL demonstrate to its customers a path to anomaly-free chip manufacturing. Khare, for his part, sees the research efforts helping to refine IBM's Watson IoT for Manufacturing product.
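
What streaming analytics at the edge could look like is, again, purely illustrative: the sketch below assumes a model trained offline on healthy runs that then scores each incoming reading on the tool itself, keeping only a short rolling window of scores locally rather than shipping raw data to the cloud.

```python
# Purely illustrative: score each incoming sensor reading at the "edge",
# keeping only a short rolling window locally instead of streaming raw data out.
from collections import deque

import numpy as np
from sklearn.ensemble import IsolationForest

# Model assumed to be trained offline on healthy (pressure, rf_power) samples.
rng = np.random.default_rng(2)
healthy = rng.normal(loc=[20.0, 1500.0], scale=[0.3, 10.0], size=(500, 2))
model = IsolationForest(random_state=0).fit(healthy)

recent_scores = deque(maxlen=60)          # e.g. the last 60 readings, for local trending

def on_reading(pressure, rf_power):
    """Called for each new sample streamed from the tool."""
    score = model.decision_function([[pressure, rf_power]])[0]
    recent_scores.append(score)
    # Negative scores are anomalous for IsolationForest's decision_function.
    return "raise local alert / trigger feedback loop" if score < 0 else "ok"

print(on_reading(20.1, 1503))             # typical reading, expected "ok"
print(on_reading(23.0, 1400))             # large drift, expected alert
```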
