Robots and Intralogistics: What’s New With Artificial Intelligence
The advent of artificial intelligence (AI) opens up possibilities that were hard to imagine until recently, but it also raises doubts and ethical dilemmas. In intralogistics, and especially in the development of self-guided vehicles such as those launched by MiR, AI is combined with motion control and robotics to optimize the navigation of mobile robots, which become able to decide on the route to take virtually in real time. Davide Boaglio, head of MiR’s Italian subsidiary, explains how.
by Davide Boaglio
The combination of robotics, automation and the latest technologies is giving rise to new industrial business models and radically changing established patterns and workflows. In the midst of all this, something new may act as the glue that implements and optimizes these processes: artificial intelligence (AI).
AI will have a strong impact on the technology landscape by the end of the decade. It has already been present for some time in many areas, automation and robotics included. In the automotive industry, for instance, the development of self-driving cars requires something that can evaluate and decide better than even the most sophisticated conventional software. AI is also opening new paths in the field of safety, where it finds wide use: how many anti-intrusion or fire-fighting systems are now governed by a system that sees, thinks, evaluates and acts autonomously? It is even widely used in finance, for the analysis of money flows, the study of clients’ behavioural patterns, and real-time forecasts and advice through robo-advisors, just to name a few examples.
The latest evolution of robotics has given us a word for the union of robot and collaboration: the cobot, a collaborative robot that works closely with people. Thanks to AI, cobots observe and learn from human gestures, building up an archive of learned processes: this is machine learning. Robots thus work alongside their human colleagues to create highly productive work environments by automating production or even material handling.
Man will always be at the centre of everything
Like all technological innovations, AI is met with a certain distrust and concern about its possible future uses, especially in the world of work, where the prevailing fear is that – just like robots – it might “take the work away from people”.
Every new technology must be used ethically and responsibly, to help people and improve their lives. In the specific case of logistics, AI is only one piece of a much larger mosaic: through AI, robotics and automation will replace humans in their most basic tasks, while at the same time allowing workers to perform tasks of greater value.
At MiR we have combined motion control, robotics and artificial intelligence to create a product for logistics built around safety, reliability and full autonomy.
Autonomous mobile robots (AMRs) can lift and transport different types of loads, relieving staff of heavy, monotonous and repetitive transport tasks. However, the robot alone is no longer enough: even with the most sophisticated software, it cannot adapt its reaction to what it perceives, but will always respond in the same, predictable way.
In complex and highly dynamic environments, where the robot shares the floor with automated guided vehicles (AGVs) that cannot deviate from their fixed path and with personnel-driven forklifts, its manoeuvrability may be limited. The safety mechanisms of AGVs are generally limited to forced stops when an obstacle is encountered, and the same can happen with AMRs.
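To make this baseline concrete, here is a minimal Python sketch, entirely hypothetical (it is not MiR’s or any AGV vendor’s control software), of a vehicle bound to a fixed route whose only safety reaction is a forced stop:

```python
import time
from dataclasses import dataclass


@dataclass
class Waypoint:
    """A point on the vehicle's predefined route (hypothetical structure)."""
    x: float
    y: float


def follow_fixed_path(path, obstacle_ahead):
    """Drive a fixed route; the only safety reaction is to halt until clear."""
    for wp in path:
        # No rerouting is possible: the vehicle simply waits for the segment to clear.
        while obstacle_ahead(wp):
            time.sleep(0.5)
        print(f"Moving to ({wp.x}, {wp.y})")


# Example run with a sensor stub that always reports a clear path.
route = [Waypoint(0.0, 0.0), Waypoint(2.0, 0.0), Waypoint(2.0, 3.0)]
follow_fixed_path(route, obstacle_ahead=lambda wp: False)
```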
Fixed cameras act as a “third eye”
With the introduction of AI into the programming of our AMRs, each device gains many more possible reactions, because it can evaluate the situation as it arises. On a set path, over a time frame of eight hours, the robot may encounter many variables: intersections with other devices, material temporarily left on the path line, people in motion and so on. Normally the robot is forced to try to get around the obstacle, or to stop or reverse and wait for conditions to clear; a robot equipped with AI can instead decide in real time, according to the situation, whether to deviate from the path, recalculate it completely, or wait briefly and then resume the movement.
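The decision logic described above can be sketched roughly as follows; this is an illustrative example with assumed inputs (obstacle_is_moving, expected_clear_seconds, detour_available), not MiR’s actual implementation:

```python
from enum import Enum, auto


class Reaction(Enum):
    DETOUR = auto()   # locally deviate around the obstacle
    REPLAN = auto()   # recalculate the route completely
    WAIT = auto()     # pause briefly, then resume the movement


def choose_reaction(obstacle_is_moving, expected_clear_seconds, detour_available):
    """Pick one of the three reactions from a simple evaluation of the situation."""
    if obstacle_is_moving and expected_clear_seconds < 5.0:
        return Reaction.WAIT      # e.g. a person walking across the path
    if detour_available:
        return Reaction.DETOUR    # e.g. material temporarily left on the path line
    return Reaction.REPLAN        # e.g. a blocked aisle: compute a new route


# Example: a pallet blocks the aisle and a side corridor is free.
print(choose_reaction(obstacle_is_moving=False,
                      expected_clear_seconds=float("inf"),
                      detour_available=True))   # Reaction.DETOUR
```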
Thanks to the AI functionality built into the software, at MiR we have developed a system that uses not only the sensors, cameras and laser scanners on board the robots, but also fixed cameras placed at strategic spots. Interacting with the AMR, the MiR AI Camera units act as a “third eye”: from a fixed perspective they communicate all the variables along the path, providing the robot in advance with the data it needs to anticipate obstacles and decide on any manoeuvre outside the routine ones. Intersections with blind spots, approaching people and other situations can be handled without problems, maximizing the level of safety and optimizing route planning.
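As a rough illustration of how advance notice from a fixed camera could feed an approaching robot’s decision, here is a hypothetical sketch; the CameraObservation fields and zone identifiers are assumptions for this example, not the MiR AI Camera’s real data format:

```python
from dataclasses import dataclass


@dataclass
class CameraObservation:
    zone_id: str            # the intersection or corridor the camera watches
    blocked: bool           # something is currently occupying the zone
    traffic_incoming: bool  # e.g. a forklift or another robot is approaching


def plan_approach(next_zone, observations):
    """Decide how to approach the next zone using the fixed camera's view."""
    obs = observations.get(next_zone)
    if obs is None:
        return "proceed at normal speed (no camera coverage)"
    if obs.blocked:
        return "reroute before reaching the zone"
    if obs.traffic_incoming:
        return "slow down and yield at the intersection"
    return "proceed at normal speed"


# Example: the camera at blind intersection 'X1' reports an approaching forklift.
feed = {"X1": CameraObservation(zone_id="X1", blocked=False, traffic_incoming=True)}
print(plan_approach("X1", feed))   # slow down and yield at the intersection
```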
Turning workplaces into dynamic and data-driven environments
Talking about cameras, we cannot help but think of the GDPR and privacy policies on images and videos. MiR cameras comply with all privacy regulations, as no actual video footage is stored and no images are captured. What the camera sees is reduced to shapes, sizes and colours, classified into specific categories, such as fixed or moving objects, and used only for the decisions the robot has to make in order to continue its journey. The acquired data can never violate privacy regulations, since to the human eye it is information that could not be used for any hypothetical identification.
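The idea of reducing frames to abstract, non-identifying detections can be sketched like this; the AnonymousDetection structure and its fields are assumptions made for illustration, not MiR’s data model:

```python
from dataclasses import dataclass
from typing import Literal


@dataclass
class AnonymousDetection:
    shape: Literal["box", "cylinder", "upright"]   # coarse silhouette class only
    size_m: float                                  # approximate largest dimension
    colour: str                                    # dominant colour only
    category: Literal["static", "moving"]          # what the robot actually uses


def classify(raw_frame):
    """Placeholder for on-camera processing: the frame is discarded immediately
    and only abstract, non-identifying detections are passed on."""
    del raw_frame  # nothing image-like leaves this function
    return [
        AnonymousDetection(shape="upright", size_m=1.8, colour="blue", category="moving"),
        AnonymousDetection(shape="box", size_m=1.2, colour="brown", category="static"),
    ]


# Example: only categories and coarse attributes are available downstream.
for det in classify(raw_frame=object()):
    print(det.category, det.shape, det.size_m)
```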
Autonomous mobile robots incorporating AI capabilities will help turn workplaces into dynamic, data-driven environments. Path scans and variables acquired through each robot’s own sensors, or from remote sensors, will be shared in real time among the robots in the fleet.
Thanks to this data-sharing model, each robot effectively has access to the sensors of every other robot and fixed camera, giving it a much more detailed view of the environment. This allows the fleet to make route decisions and to learn of obstacles in advance, enabling more efficient route planning.
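A minimal sketch of such a data-sharing model might look like the following; the SharedObstacleMap class, zone names and planning rule are hypothetical, intended only to illustrate how reports from several robots and fixed cameras could be merged for route planning:

```python
from collections import defaultdict


class SharedObstacleMap:
    """Fleet-wide store of obstacle reports, keyed by zone (hypothetical)."""

    def __init__(self):
        self._reports = defaultdict(set)   # zone_id -> set of reporting sources

    def report(self, source_id, zone_id):
        """A robot or fixed camera reports an obstacle in a zone."""
        self._reports[zone_id].add(source_id)

    def blocked_zones(self):
        """Zones that any member of the fleet currently considers blocked."""
        return {zone for zone, sources in self._reports.items() if sources}


def plan_route(candidate_routes, shared):
    """Prefer the first candidate route that avoids every reported blocked zone."""
    blocked = shared.blocked_zones()
    for name, zones in candidate_routes.items():
        if not blocked.intersection(zones):
            return name
    return "wait"   # every candidate crosses a blocked zone


# Example: a report from robot_2 steers robot_1 onto the corridor route.
shared = SharedObstacleMap()
shared.report(source_id="robot_2", zone_id="aisle_3")
routes = {"via_aisle_3": ["aisle_3", "dock"], "via_corridor": ["corridor_b", "dock"]}
print(plan_route(routes, shared))   # via_corridor
```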