How the MIRAI vision system works
Published 1.11.22

Automation gets complex when conditions at a workstation are constantly changing and unpredictable — when there’s variance. Standard automation solutions can’t deal with variance, or if they can, they’re prohibitively expensive to implement. But MIRAI, our AI-powered vision system, makes it possible to automate tasks otherwise considered too complex to solve. Once equipped with MIRAI, a robot can look out onto its workspace and react to unpredictable situations. The MIRAI system enables robots to perform tasks that would be too hard, if not impossible, to engineer manually.

Traditional industrial robots are pre-programmed to perform a task, over and over, from a bolted position on a factory floor. While inflexible, these machines are unflagging and precise. Picture an automobile manufacturing plant with robots arranged around an assembly line where car parts are produced. Because it’s possible to calculate where and when each part will be along the line, engineers can place each robot in a specified position and program it to repeatedly execute a movement. Welding, painting cars, transferring parts — these are predetermined movements for robots in industrial settings. The process breaks down, though, if a part doesn’t turn up at the right place at the right time on the line.

Thanks to the MIRAI system, a robot gains the ability to open its eyes, perceive its work environment while performing a task, and adapt in real time to unforeseen situations. This post explains what MIRAI is and how it works.

MIRAI consists of hardware and software.
Clockwise: tablet and training app; controller; camera parts; camera mount.

What MIRAI is

Made by Micropsi Industries, a robotics software company, MIRAI is an AI-powered vision system that’s attached to and augments a robot. Our customers purchase or already own a robot from a third party of their choice, for instance FANUC, KUKA, or Universal Robots.

A robot guided by MIRAI can handle variance in position, shape, color, lighting, and background. Compared with standard computer-vision systems, MIRAI offers higher robustness, precision, and flexibility thanks to its real-time control. No CAD data or measurements are needed. MIRAI can also deal with reflections, specular highlights, and transparency.

When you want to use the MIRAI system to solve an automation task, imagine the task divided into sections. The robot’s native controller steers during the sections with predetermined movements, while the MIRAI controller steers during the complex sections with variance (e.g., fine-positioning a plug a few millimeters above a socket that may vary in position and shape). During the task, the native robot program transfers control of the robot to MIRAI to handle the complex section. Once MIRAI has completed its section of the task, it hands control back to the native program.
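As a rough illustration of this handoff (not actual robot-program code; every function name below is a hypothetical stand-in), the structure of such a task might look like this:

```python
# Illustrative sketch only: every function here is a hypothetical stand-in,
# not a real robot or MIRAI API.

def native_move(waypoint: str) -> None:
    # Predetermined movement, steered by the robot's native controller.
    print(f"Native controller: predetermined move to {waypoint}")

def mirai_skill(name: str) -> None:
    # Complex section with variance, steered by the MIRAI controller.
    print(f"MIRAI controller: executing skill '{name}'")

def run_task() -> None:
    native_move("approach position")      # predetermined section
    mirai_skill("fine-position plug")     # control handed to MIRAI
    native_move("insert and retract")     # control handed back to the native program

if __name__ == "__main__":
    run_task()
```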

MIRAI consists of both hardware and software. Among other things, the MIRAI kit comes with a controller, a camera, and a mount for securing the camera to the robot. The kit also includes a tablet with our preinstalled training app. You use the training app to show the MIRAI-equipped robot what you want it to do. Without training, it can’t magically perform the task you have in mind. This is where artificial intelligence comes in.

The AI in MIRAI

A subset of artificial intelligence, machine learning refers to a field of study within applied mathematics that, using statistical inference, supplies computers with the ability to learn without being explicitly programmed. There are assorted kinds of machine learning. At Micropsi Industries, we do imitation learning. This is a variant of supervised learning, the goal of which is to get a computer to generate a correct answer to a problem by showing it examples of correct answers to similar problems. You would apply a supervised-learning algorithm if you wanted to, say, predict annual income based on the number of years of higher education someone has. The algorithm would do this by analyzing a raft of training data on education and income, discerning a pattern in that data, and then, from this pattern, predicting a person’s annual income when later fed only the education data. In imitation learning, the aim is to predict a correct answer in the form of behavior. The machine learns to make the most appropriate decision based on training data collected from human demonstrations.
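For a concrete feel of that education-and-income example, here is a minimal supervised-learning sketch using scikit-learn; the training numbers are invented purely for illustration:

```python
# Minimal supervised-learning sketch for the education-vs-income example.
# The training data below is invented purely for illustration.
from sklearn.linear_model import LinearRegression

# Training data: years of higher education -> annual income (in thousands).
years_of_education = [[0], [2], [3], [4], [5], [6], [8]]
annual_income = [30, 38, 45, 52, 58, 65, 80]

# The algorithm discerns a pattern in the examples...
model = LinearRegression()
model.fit(years_of_education, annual_income)

# ...and then predicts income when later fed only the education data.
print(model.predict([[7]]))  # predicted income for 7 years of education
```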

With the MIRAI system, a robot’s behavior is acquired from what’s observed by the camera during human demonstrations. A behavior here could be plugging a cable into a server-rack hole or sniffing a solder joint on the back of a refrigerator for leaks. During demonstrations, you take the MIRAI-equipped robot by its wrist and show it the action you want it to perform and the variance it may come across.

As you do, the camera records the scene. The camera-recorded images are converted into data and transported, via the MIRAI controller, to Micropsi Industries’ secure computing cloud. There, we run the data through a learning algorithm to prepare, or train, a policy: a mathematical model that guides the robot when it performs the action you want it to execute. At Micropsi Industries, we refer to this trained mathematical model as a “skill.” It instructs the robot how to behave in real time when it encounters variance while doing your desired action, in a completely self-contained manner, with no further need for a network connection. Skills are like laws that tell the robot how to behave in any situation it encounters; each law is generated and fine-tuned based on the movements you show the system during training. Typically, you do a few training sessions to make sure a skill is robust, meaning it can handle any situation that comes up.
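Conceptually, this kind of training resembles behavioral cloning: a model is fitted to map camera observations to the motion commands a human demonstrated. The sketch below uses synthetic stand-in data and scikit-learn; it is not Micropsi Industries’ actual training pipeline:

```python
# Heavily simplified imitation-learning (behavioral cloning) sketch.
# Synthetic data stands in for camera images and demonstrated motions;
# this is not the actual MIRAI training pipeline.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Pretend each observation is a small flattened camera image (64 values)
# and each label is the motion recorded while the human guided the wrist
# (e.g., a 3-D velocity command).
observations = rng.random((500, 64))          # recorded during demonstrations
demonstrated_motions = rng.random((500, 3))   # what the hand-guiding produced

# Fit a policy that maps observations to motion commands.
policy = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
policy.fit(observations, demonstrated_motions)

# At run time, the policy predicts the next motion from a new camera frame.
new_frame = rng.random((1, 64))
print(policy.predict(new_frame))  # e.g., [[dx, dy, dz]]
```

The fitted policy plays the role of the skill here: given a new camera frame, it outputs the next motion command.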

Training through hand-guided movement
An engineer trains the MIRAI system to perform a cable-plugging task.

MIRAI-skill execution

Once the skill is robust, the training phase is complete. You can now use the MIRAI skill as part of the overall task. Again, the robot’s native controller steers during sections of the task with predetermined movements, while the MIRAI controller steers during the sections with variance. If you have, say, a UR robot, this connection is configured in UR Polyscope.

When MIRAI is in control and executes the skill, the camera takes a picture; the resulting image data is run through the policy, which gives the robot an instruction for how to act. This loop fires 15 times per second until the entire movement is complete, at which point MIRAI hands back control to the native robot controller. 
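A rough sketch of that loop, with hypothetical callables standing in for the camera, the trained policy, and the robot interface:

```python
# Rough sketch of the 15 Hz skill-execution loop.
# capture_image, policy, send_robot_command, and movement_complete are
# hypothetical stand-ins, not a real MIRAI API.
import time

CONTROL_RATE_HZ = 15
CYCLE_TIME = 1.0 / CONTROL_RATE_HZ

def execute_skill(capture_image, policy, send_robot_command, movement_complete):
    # Runs until the movement is complete, then returns so the native
    # robot controller can take over again.
    while not movement_complete():
        start = time.monotonic()
        image = capture_image()          # camera takes a picture
        command = policy(image)          # policy turns image data into an instruction
        send_robot_command(command)      # robot acts on the instruction
        # Pace the loop at roughly 15 iterations per second.
        time.sleep(max(0.0, CYCLE_TIME - (time.monotonic() - start)))

if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs on its own.
    steps = iter(range(30))  # pretend the movement takes 30 cycles (about 2 seconds)
    execute_skill(
        capture_image=lambda: "image data",
        policy=lambda image: "move a tiny step toward the target",
        send_robot_command=print,
        movement_complete=lambda: next(steps, None) is None,
    )
```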

Among other applications, robots with MIRAI can be used for assembly, end-of-line testing, picking, screwdriving, cable insertion, and rack hanging.

If you'd like to discuss your automation challenges with an expert, get in touch.

Using MIRAI, we have solved a challenge we could not solve with standard automation technologies. Accuracy and performance KPIs have been reached, improving productivity and quality.

Javier Chasco Echeverria
Engineer for Industry 4.0
Bosch Siemens Hausgeräte (BSH)
