Everyday vision comes with an extraordinary set of abilities – seeing color, detecting motion, identifying shapes and gauging distance and speed. According to Discover Magazine, neurons devoted to visual processing take up about 30 percent of the cortex in the human brain (that is compared to 8 percent for touch and 3 percent for hearing). For most of us, it is the primary sense that drives our perception of the world.
The same could be true for our devices. As the number of connected devices rises, the focus shifts toward machine learning and artificial intelligence to make sense of all the data being collected. Today, we depend increasingly on cameras and projection systems, which are embedded in many of our compact but complex devices, and their use keeps growing. New usage models and applications call for technology products that can "see," comprehend and analyze the physical world – so optics manufacturing has its work cut out for it.
Why is that important?
Just as our eyes are connected to the brain, cameras and projection systems enable devices to understand their surroundings, the tasks at hand and the paths to accomplish those tasks. The intelligence from these components is invaluable and will only become more critical to the success of our connected devices. They must support the image quality, speed and analysis time required to recognize and respond to the environment.
Automotive manufacturing is an excellent example.
By combining world class automotive-grade optics capabilities with computer vision and deep learning software, we can develop systems that assess the presence of an active and aware driver. This can improve human-machine interface (HMI) to make the driver experience better and increase overall safety. This is not decades from now – it is happening now. In fact, Jabil and eyeSight Technologies recently announced such a partnership.
In this example, the goal is to embed a scalable driver-monitoring system that can be integrated into existing cockpit electronics.
Considering that approximately 20 percent of fatal road accidents involve driver fatigue, such a system could make the road significantly safer for everyone. Think of the information and benefits you could gain from such a system. Automotive is just one of many areas where our lives can be improved.
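To make the fatigue-detection idea concrete, here is a minimal sketch of one metric often cited in driver-monitoring research: PERCLOS, the percentage of time the eyelids are mostly closed. The per-frame openness scores, thresholds and cutoffs below are illustrative assumptions, not details from the article or from any vendor's system.

```python
# Hedged sketch: PERCLOS (PERcentage of eyelid CLOSure) is one fatigue
# indicator used in driver-monitoring research. All numeric thresholds
# here are invented for illustration.

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which the eye is mostly closed.

    eye_openness: per-frame openness scores in [0, 1], where 1.0 is
    fully open (e.g., derived from an eye-aspect-ratio estimator).
    """
    if not eye_openness:
        raise ValueError("need at least one frame")
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

def is_drowsy(eye_openness, perclos_limit=0.15):
    """Flag drowsiness when eyes are closed in more than perclos_limit
    of the observed frames (illustrative cutoff)."""
    return perclos(eye_openness) > perclos_limit

# Alert driver: eyes open in almost every frame (one normal blink).
alert = [0.9, 0.85, 0.95, 0.1, 0.9, 0.92, 0.88, 0.9, 0.93, 0.91]
# Fatigued driver: long eyelid closures dominate the window.
tired = [0.9, 0.1, 0.05, 0.08, 0.9, 0.1, 0.07, 0.9, 0.05, 0.1]

print(is_drowsy(alert))  # False
print(is_drowsy(tired))  # True
```

In a real system the openness scores would come from a camera-based eye tracker running over a sliding time window, and a PERCLOS alarm would typically be combined with other cues such as head pose and gaze direction.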
So how do we get there?
As our needs evolve, so will the imaging systems. In this endeavor toward improved optical performance, one manufacturing process in particular will play a critical role: active alignment.
In a research paper published by Moor Insights & Strategy, active alignment is defined as "the process of affixing imaging technology components, typically a lens and a sensor, while continuously measuring image quality throughout the alignment process to achieve the highest accuracy of images captured."
Download the full research report to understand why active alignment is a good match for the Internet of Things and how optical device OEMs can utilize the technology.