Extreme ultraviolet (EUV) lithography has long been touted as the answer to continued process shrinks and keeping pace with Moore’s law. Manufacturers such as Intel and AMD are finding it increasingly difficult to make technological leaps in fabrication, slowly inching from 14nm to 12nm processes. EUV has been seen as the solution, with IBM even revealing its first 5nm EUV chip last June.
However, serious chip defects have been uncovered in EUV-fabricated silicon, potentially forcing manufacturers to abandon the process entirely. Researchers in California have detected random defects appearing in silicon manufactured using extreme ultraviolet lithography at the 5nm node.
Photolithography is the process of transferring circuit patterns onto a silicon wafer. The wafer is first coated with a light-sensitive material called photoresist. It is then exposed to high-intensity ultraviolet light shone through a mask: the areas blocked by the mask retain the photoresist layer, while the parts exposed to the light burn away. Finally, acid is applied to the wafer, etching away the areas that aren’t coated with photoresist. While this is a simplified description, it captures the general idea of chip fabrication.
Unfortunately, the researchers have discovered that EUV photolithography with a 250W EUV light source isn’t working as intended. Some of the photoresist material isn’t being cleared away properly, leaving gaps, flaws, and tears in the circuit that render the manufactured chips practically useless. The defects are so tiny that they can take days even to locate, let alone fix.
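The lithography steps and the resist-clearing failure described above can be caricatured in a toy grid model. This is purely illustrative: the grid size, the cross-shaped mask pattern, and the defect probability are invented stand-ins, not real process parameters.

```python
# Toy model of photolithography on a boolean grid (True = material present).
# All sizes, patterns, and probabilities here are hypothetical.
import random

SIZE = 8
DEFECT_RATE = 0.02  # chance an exposed cell fails to clear (made-up figure)
rng = random.Random(42)

# 1. Coat the wafer: photoresist everywhere.
resist = [[True] * SIZE for _ in range(SIZE)]

# 2. The mask blocks light over the cells we want to keep
#    (here, a simple cross-shaped "circuit").
mask = [[r == SIZE // 2 or c == SIZE // 2 for c in range(SIZE)]
        for r in range(SIZE)]

# 3. UV exposure: unmasked resist should burn away, but occasionally it
#    fails to clear -- the kind of random defect the researchers reported.
defects = 0
for r in range(SIZE):
    for c in range(SIZE):
        if not mask[r][c]:
            if rng.random() < DEFECT_RATE:
                defects += 1          # resist stuck: a flaw in the circuit
            else:
                resist[r][c] = False  # normal exposure

# 4. Acid etch: material survives only where resist remains, so any stuck
#    resist becomes a stray feature that breaks the intended pattern.
wafer = [[resist[r][c] for c in range(SIZE)] for r in range(SIZE)]

# The chip matches the mask pattern only if no defects occurred.
chip_good = (wafer == mask)
```

With `DEFECT_RATE` set to zero the wafer reproduces the mask exactly; any nonzero rate leaves stray resist, which is why such flaws are scattered unpredictably across otherwise identical chips.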
To complicate matters further, there doesn’t appear to be a solution to this problem yet. Researchers have attempted a number of techniques, but none have worked so far.
Looking past the upcoming crop of 10nm processors, major players such as GlobalFoundries, Samsung, and TSMC all intend to use 250W EUV for 7nm chip fabrication, intended for use by AMD and Intel. Their future plans could be hugely affected by this investment in a flawed technology with no obvious fix yet. The hope is that manufacturers can come to better understand how and why these defects occur, and figure out the steps necessary to solve the issue. Failing that, alternative technologies will need to be pursued.
Could these defects become a serious issue for the advancement of CPU technology? Let us know what your thoughts are!