Deep learning networks are gradually expanding into robotic arms, sensor-network gateways, and controllers, bringing a higher degree of intelligence to every system connected to them and paving the way for future factory automation. But these changes will not happen overnight.
Deep neural networks (DNNs) are spreading to the factory floor.
For some early adopters, neural networks are the new intelligence embedded behind the eyes of computer-vision cameras. Ultimately, these networks will transform industrial automation by working their way into robotic arms, sensor-network gateways, and controllers. But such change will not happen soon.
Rob High, chief technology officer at IBM Watson, said: “We are still in the early stages of an advanced era of machine learning, and the next generation of algorithms may take decades to mature, but I think we will see tremendous progress in the next few years.”
Neural networks will appear in more and more multi-core x86 gateways and controllers running Linux in factory environments. High added that emerging 5G cellular networks will one day let neural networks access remote data at any time.
Car and aircraft manufacturers and healthcare providers are among the earliest movers, especially with smart cameras. Canon has embedded Nvidia’s Jetson module in its industrial cameras to run deep learning. Industrial-camera supplier Cognex Corp. is ramping up production across its product range. Chinese startup Horizon Robotics has shipped security-surveillance cameras with its deep-learning inference accelerators embedded.
Deepu Talla, general manager of Nvidia’s autonomous machines division, said: “All the early adopters are deploying deep learning for visual perception, and others have begun to take notice. Implementing perception is no longer difficult; researchers have solved that problem.”
However, “the biggest problems now are using artificial intelligence (AI) to interact with humans and performing more sophisticated actuation tasks; these are issues that require long-term research. In areas such as drone and robot navigation, we have advanced further, to the prototype stage.”
Talla calls robotics “the intersection of computing and AI,” but many industrial uses of deep learning may be less glamorous, and they will arrive sooner.
Doug Olsen, CEO of robotics-component supplier Harmonic Drive LLC, said factory robots have not yet begun using AI. “Machines embedded in the factory can be used to predict failures and collect daily usage data to determine when a system needs preventive maintenance. That is where AI will be established first. So don’t expect to see smart robotic arms in the short term.”
Several large chipmakers share this view. Three years ago, Renesas Electronics began experimenting with microcontrollers (MCUs) that support AI at end nodes, detecting faults in its semiconductor fabs and predicting when production systems need maintenance.
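The end-node idea can be illustrated in spirit with a minimal predictive-maintenance sketch: flag sensor readings that drift far from a rolling baseline. This is not taken from any vendor’s firmware; the function name, window size, and z-score threshold are all illustrative assumptions.

```python
# Minimal sketch of end-node fault detection: flag readings (e.g. vibration)
# that deviate strongly from a rolling baseline. Window and threshold are
# illustrative, not from any vendor's MCU firmware.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=10, z_threshold=3.0):
    """Return indices of readings far outside the recent baseline."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                anomalies.append(i)  # candidate for preventive maintenance
        history.append(value)
    return anomalies

# Steady vibration around 1.0, then a spike that would trigger a check.
readings = [1.0, 1.02, 0.98, 1.01, 0.99, 1.0, 1.03, 0.97, 1.0, 1.01, 5.0]
print(detect_anomalies(readings))  # → [10]
```

In practice an MCU would run a trained model rather than a fixed threshold, but the pattern is the same: watch the data stream locally and report only the exceptions.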
In October of this year, the Japanese chip giant introduced its first MCU with a dynamically reconfigurable processor block for real-time image processing. The company aims to follow with controllers that support real-time recognition in 2020 and incremental learning in 2022.
Competitor STMicroelectronics (ST) is taking a similar approach with its STM32 chips. In February of this year, the company announced a deep-learning SoC and an accelerator under development, aimed in part at detecting faults in factories.
Intelligent robots will eventually arrive. Startup covariant.ai, for example, is working toward that goal with reinforcement learning. “Equipping robots with the ability to see and to act on what they see will be one of the biggest contributions of deep learning in the next few years,” said Pieter Abbeel of UC Berkeley, who founded covariant.ai and runs a robotics lab at the university.
UC Berkeley professor Pieter Abbeel founded the startup covariant.ai to help factories train deep-learning systems to handle multiple tasks.
Abbeel has shown how robots can learn tasks in simulation using neural-network techniques, but the technology is still in its early stages. “In fact, part of the reason we founded covariant.ai is our sense that the industrial AI field is, happily, not yet crowded,” he said.
Looking for data and experts
Abbeel said deep learning must overcome three major obstacles on its way into smart factories: “You have to bring together correctly annotated data, the expertise to know how to handle it, and the budget to pay for the computing.”
Industrial companies planning to hire will face a shortage of AI talent. “About 30 graduates a year come out of UC Berkeley with a Ph.D. in this field,” Abbeel said. “But many of them end up staying in research, at places such as Google DeepMind, so not many make it into industry.”
Yet industry needs experts to collect and label data sets and to build the models that train on them; traditionally, data science has not been an internal priority for industrial manufacturers. Colin Parris, vice president of software research at GE, said that is slowly changing as computer models are used in more and more areas.
Parris said: “Industry has long used models and simulation to ensure safety and control cost, but now those models must be kept ‘live’ in the field (continuously updated). For example, an airline must simulate the impact on its entire fleet of flying from the heat of the Middle East through the cold of Scandinavia.”
These models, the so-called digital twins, are becoming tools for predicting changes in behavior, which “requires collecting more information and using it in different ways,” Parris said. “The problem is that every vendor wants to sell its own set of modeling tools, so the industry ends up with data-interoperability issues.”
Deep learning, for example, has driven the rapid rise of software frameworks such as TensorFlow, Caffe, and Chainer, each with its own strengths and weaknesses. Developers see a need to integrate these frameworks to support more sophisticated AI abstractions, an area researchers are only beginning to study.
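This interoperability pressure is why framework-neutral model descriptions have emerged (ONNX is the real-world example). The toy format below is invented for illustration only: the idea is that a model stored as plain data, independent of any one framework, can be executed by any runtime that understands the format.

```python
# Toy sketch of a framework-neutral model: layers and weights stored as
# plain data, runnable without TensorFlow, Caffe, or Chainer. The format
# here is invented for illustration; real projects use formats like ONNX.

def relu(v):
    return [max(0.0, x) for x in v]

def dense(v, weights, bias):
    # weights: list of rows, one row per output unit
    return [sum(w * x for w, x in zip(row, v)) + b
            for row, b in zip(weights, bias)]

# A two-layer network described purely as data.
model = [
    {"op": "dense", "weights": [[1.0, -1.0], [0.5, 0.5]], "bias": [0.0, 1.0]},
    {"op": "relu"},
]

def run(model, v):
    for layer in model:
        if layer["op"] == "dense":
            v = dense(v, layer["weights"], layer["bias"])
        elif layer["op"] == "relu":
            v = relu(v)
    return v

print(run(model, [2.0, 1.0]))  # → [1.0, 2.5]
```

A model trained in one framework and exported to such a shared format can then be deployed on a factory controller without shipping the training framework along with it.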
Others point out that neural networks are non-deterministic. That raises two problems: their results are useful but hard to explain, and they usually perform well yet never reach perfect accuracy. Abbeel said: “Even if a research paper reports success rates between 80% and 99%, you still have to watch out for the failure cases, which can be costly.”
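One common way to live with imperfect accuracy, sketched below with an illustrative 0.9 threshold (this is a generic pattern, not anything attributed to Abbeel or covariant.ai), is to act only on high-confidence predictions and route the rest to a human inspector.

```python
# Sketch: mitigate imperfect model accuracy by acting only on confident
# predictions. The threshold is illustrative; in practice it is tuned to
# the cost of a wrong decision.

def triage(probabilities, threshold=0.9):
    """probabilities: class-probability list, e.g. a model's softmax output."""
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    if probabilities[best] >= threshold:
        return ("accept", best)       # confident: act automatically
    return ("review", best)           # uncertain: defer to a human

print(triage([0.02, 0.95, 0.03]))  # → ('accept', 1)
print(triage([0.40, 0.35, 0.25]))  # → ('review', 0)
```

Raising the threshold trades automation rate for fewer costly mistakes, which is exactly the trade-off a factory operator has to price.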
Like the algorithms, the hardware is still evolving. “The computations in deep learning call for changing how transistors are laid out, for features such as matrix multiplication,” said Abbeel, an early investor in Graphcore, one of nearly 50 AI-accelerator startups. “Those needs aren’t well matched by today’s CPUs and GPUs; they’re pretty good, but not optimal.”
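The matrix multiplication Abbeel mentions is easy to see in miniature: a dense neural-network layer is essentially one matrix-vector product, so that operation dominates the compute and is what accelerator designers optimize for. A toy pure-Python sketch with made-up numbers:

```python
# Why AI chips optimize matrix multiplication: a dense layer is one
# matrix-vector product, repeated for every layer and every input.
# Weights and inputs below are toy values.

def matvec(matrix, vector):
    """One dense layer (before the activation): y = W @ x."""
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

weights = [[1.0, 2.0],
           [3.0, -1.0]]
inputs = [1.0, 1.0]
print(matvec(weights, inputs))  # → [3.0, 2.0]
```

A real network multiplies matrices with thousands of rows millions of times per second, which is why laying out transistors specifically for this pattern pays off.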
From jet turbines to Christmas presents
Machine learning is already having an impact on industrial applications, from painting jet engines to packing Christmas gifts.
Parris said GE uses robots with computer vision to apply luminescent paint to jet-engine turbines and check them for cracks. “This takes people out of an unpleasant environment and gives us command of the information,” he said.
Nvidia’s chips power automated optical inspection at components maker Musashi Seimitsu Industry. They are also inside a safety camera from Komatsu Ltd. that keeps excavators from colliding with workers on construction sites.
In the U.S., John Deere uses Nvidia’s Jetson development kits to help tractors distinguish crops from weeds so that fertilizer is not wasted on the weeds. The kits also power Agrobot’s automated harvesting systems.
Several startups are also tackling the tricky task of helping robots navigate autonomously through warehouses and even cities. Fellow Robots’ smart retail robots cruise the aisles and track inventory in stores such as Lowe’s and BevMo. Robby Technologies is an early provider of home-delivery services in the San Francisco area, while commercial drones from Aerialtronics use computer vision for law enforcement, search and rescue, and construction.
Rodney Brooks, co-founder, chairman, and CEO of Rethink Robotics and known as the father of the Roomba robot vacuum, said the large warehouse-logistics operations run by companies such as Alibaba and Amazon will become an important driving force for intelligent robots. “Amazon Robotics employs 700 people in Boston, but the robots can’t perform pick-and-pack tasks, so every Christmas the company has to add people,” Brooks noted at an event last year.
“These warehouse logistics centers especially need picking and packing, because every package is unique,” he said. “Expect this to drive robotics development in ways never seen before, and in turn to change factory automation in ways we cannot predict.” He added, “The huge demand will bring many people into direct contact with robots.”
Jonathan Taylor, vice president of manufacturing at Samsung Austin Semiconductor, a subsidiary of Samsung Electronics, said Samsung has built a team of data scientists at its $17 billion fab. He did not say whether the scientists use deep learning, but the facility has announced an agreement with AT&T to test 5G networks.
In the future, the fab expects to use 5G to reach more remote spots on its campus, connecting vibration sensors to utility systems such as pumps. It could also use the wireless network to link 4K security cameras, and in an emergency, a neural-network-based face-recognition application could confirm the whereabouts of its 3,000 employees.
Taylor said the facility already has “lots of wired connections, so we want to reach areas we couldn’t connect to in the past.”
As factories become more interconnected, they provide a broad foundation for AI. According to a 2017 survey by market watcher ON World, one-third of Industrial IoT (IIoT) networks now connect more than a thousand nodes, twice the share found in its 2014 survey. Still, only 12% of the latest respondents have deployed many wireless nodes at a single site.
These ever-expanding networks are paving the way for deep learning. Over time, deep-learning networks are expected to spread through them, bringing a higher degree of intelligence to every system connected to them.