Fusion Robot: The Latest Revolution In The Field Of Collaborative Robots

Collaborative robots, or cobots for short, are in growing demand across the globe. According to the World Robotics Report 2017, robot sales rose 16% to 294,312 units in 2016, a new peak for the fourth year in a row. The cobot market is now estimated to reach the $95 billion mark by 2024, and the report states that collaborative robots, IoT, and machine learning/AI will lead robotics in the coming years.

Collaborative robotics has been receiving continued mainstream media coverage, bringing this new area of robotics to a wider public audience. Robots are clearly on the rise in every environment, because the ability of machines to work and coordinate with humans accurately and safely is fueling demand for collaborative robots. Conventional collaborative robots, however, lack additional capabilities because of limited use of sensors and their rigidity when performing industrial operations.

A robotics team led by Yamen Saraiji from Keio University, Japan, has developed a new kind of collaborative robot designed to remotely inhabit someone else's body in order to assist them with manipulation tasks. It is called Fusion: Full Body Surrogacy for Collaborative Communication. Fusion is a pair of robotic arms that is worn like a backpack and controlled by a remote operator. Its main parts are a 3-axis head with an integrated stereo camera, a portable backpack holding the battery and control units, and two anthropomorphic arms with attachable hands. All of these components are integrated into the backpack, which forms a wearable, mobile robotic platform. This article deals with this revolutionary collaborative robot: the Fusion robot.

Fusion consists of an operator and a surrogate who are spatially separated. The operator wears an Oculus Rift CV1 virtual reality headset, which gives them access to the surrogate's body. The surrogate wears a backpack that carries a three-axis robotic head with stereo vision and binaural audio, and two anthropomorphic robotic arms with removable hands. The system is mobile, allowing the surrogate to move and walk freely while wearing the backpack, which enables outdoor applications.
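
To make the operator/surrogate split above concrete, here is a small illustrative Python sketch that records the hardware described in this section as a configuration object. The class and field names are assumptions for illustration, not part of the actual Fusion software.

# Hypothetical summary of the hardware split described above; names and
# defaults are illustrative only.
from dataclasses import dataclass, field

@dataclass
class OperatorStation:
    headset: str = "Oculus Rift CV1"   # stereo view plus head tracking
    controllers: int = 2                # one handheld controller per arm

@dataclass
class SurrogateBackpack:
    head_axes: int = 3                  # pan, tilt, roll with stereo camera
    arms: int = 2                       # anthropomorphic, removable hands
    mobile: bool = True                 # wearer can walk freely outdoors

@dataclass
class FusionSystem:
    operator: OperatorStation = field(default_factory=OperatorStation)
    surrogate: SurrogateBackpack = field(default_factory=SurrogateBackpack)

print(FusionSystem())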

  1. Remote operation 

Remote operation of Fusion is done through telepresence. Telepresence is a sophisticated form of remote-controlled robotics in which a human operator has the sense of being at the place where the actual operation is going on. The telepresence system uses a virtual reality (VR) headset and headphones that reproduce the audio-visual experience of the actual site, and it incorporates binocular machine vision, which gives the operator a sense of depth.
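
As a rough, minimal sketch of the stereo telepresence idea (not the actual Fusion software, which has not been published), the following Python snippet grabs frames from two cameras with OpenCV and composes the side-by-side stereo pair a VR headset would render, one half per eye. The camera indices are assumptions.

# Minimal sketch of a stereo "telepresence" feed: grab frames from two
# cameras and compose a side-by-side pair, one half per eye.
# Camera indices (0 and 1) are assumptions for illustration only.
import cv2
import numpy as np

left_cam = cv2.VideoCapture(0)
right_cam = cv2.VideoCapture(1)

while True:
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        break
    # Side-by-side layout: a VR runtime would warp each half for its lens.
    stereo_pair = np.hstack([left, right])
    cv2.imshow("stereo feed", stereo_pair)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

left_cam.release()
right_cam.release()
cv2.destroyAllWindows()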

Effective communication is a key factor in collaborative work, where more than one person shares skills and actions. One can teach these skills over a video call, but that limits the actions because it does not share the point of view of the person doing the work, which hurts productivity and slows the process down. It is very difficult to cooperate from different points of view when the participants are physically separated. Fusion lets two individuals, the surrogate and the operator, share the same point of view simultaneously. The operator's motions are transmitted to the two robotic arms mounted on the surrogate's body. These arms can work independently, or they can be attached to the surrogate's arms for collaborative scenarios. Fusion realizes bodily driven communication and action by embodying one person in another's body, so the operator can communicate and collaborate remotely using both verbal and physical actions.
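
A hedged sketch of what transmitting the operator's motions could look like in practice: the operator-side PC packs tracked head and hand poses into a small message and streams it to the surrogate's backpack PC. The message fields, address, and port below are assumptions for illustration, not the real Fusion protocol.

# Hypothetical sketch of streaming operator poses to the surrogate-side PC
# over UDP as JSON. The fields, address, and port are assumptions.
import json
import socket
import time

SURROGATE_ADDR = ("192.168.0.42", 9000)  # assumed address of the backpack PC

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(head_rpy, left_hand_xyz, right_hand_xyz):
    """Send one snapshot of the operator's tracked pose."""
    msg = {
        "t": time.time(),
        "head_rpy": head_rpy,        # roll, pitch, yaw in degrees
        "left_hand": left_hand_xyz,  # metres in the operator's frame
        "right_hand": right_hand_xyz,
    }
    sock.sendto(json.dumps(msg).encode("utf-8"), SURROGATE_ADDR)

# Example: stream a fixed pose at roughly 60 Hz for one second.
for _ in range(60):
    send_pose([0.0, -10.0, 15.0], [0.3, 0.2, 0.4], [0.3, -0.2, 0.4])
    time.sleep(1 / 60)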

  2. Three levels of communication: Directed, Enforced and Induced 
    • Directed Actions: Directed actions use body-language-based communication, sharing intentions through the motions and gestures made with the surrogate body itself. The operator's motions are reproduced by the Fusion arms, which move independently of the surrogate's own arms; for instance, the operator can direct a Fusion arm to gesture toward an object or lift a glass of water from a table. During directed actions the surrogate effectively has four hands.
    • Enforced Actions: When directed actions fail to communicate the required motions and postures to the surrogate, those postures can be enforced. The robotic hands are removed and replaced with straps so that the Fusion arms attach to the surrogate's wrists, letting the operator physically drive the surrogate's arms with force feedback and help them acquire new skills when they cannot perform a task on their own. Imagine a trainer entering your body to teach you new skills.
    • Induced Actions: Fusion can steer the entire motion of the surrogate's body by inducing movement in a new direction. This is achieved by pulling the surrogate's hands beyond their comfortable reach, so that the rest of the body follows. The surrogate does not have to initiate any movement; the induced force is enough to guide them. A small sketch of how these three modes might be switched in software follows this list.
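
The sketch below is purely illustrative: it assumes a hypothetical controller that dispatches an operator command according to whichever of the three modes is active. The mode names mirror the list above, but the function and messages are invented for the example.

# Illustrative (not from the Fusion codebase): a dispatcher that routes an
# operator command according to the current collaboration mode.
from enum import Enum, auto

class Mode(Enum):
    DIRECTED = auto()   # arms mirror the operator, independent of the wearer
    ENFORCED = auto()   # arms strapped to the wearer's wrists, apply force
    INDUCED = auto()    # arms pull beyond reach to steer the whole body

def handle_command(mode, target_pose):
    """Route a desired arm pose according to the active mode."""
    if mode is Mode.DIRECTED:
        return f"move free arms to {target_pose}"
    if mode is Mode.ENFORCED:
        return f"drive strapped wrists toward {target_pose} with force feedback"
    if mode is Mode.INDUCED:
        return f"pull wearer's hands past {target_pose} to induce body motion"
    raise ValueError(mode)

print(handle_command(Mode.DIRECTED, (0.3, 0.1, 0.4)))
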
  3. Independent control system 

The arms are controlled by a PC, which streams data wirelessly between Fusion's arms and the person controlling them in VR. The PC is connected to a microcontroller, which holds the posture data of the robotic arms. Each robotic arm has seven joints and connects to the head, which carries two cameras showing a live feed from the surrogate to the remote operator in VR. Motion tracking sensors track the operator's movements: when the operator moves their head in VR, the sensors capture that motion and the control system moves the robotic head in response. The camera head can turn left and right, tilt up and down, and pivot from side to side. The system weighs nearly 21 pounds, and its battery lasts about an hour and a half. Buttons on the Oculus Rift controllers move the three fingers of each robotic hand simultaneously.
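
As a minimal sketch of the head-tracking loop described above, the snippet maps the operator's tracked head angles onto clamped pan/tilt/roll targets for the 3-axis head, and maps three controller buttons onto the three fingers of a hand. The joint limits and function names are assumptions, not Fusion's actual control code.

# Sketch of the head-tracking loop: map tracked head orientation onto the
# robot's 3-axis (pan/tilt/roll) head, and controller buttons onto fingers.
# Joint limits and names are assumptions for illustration.

def clamp(value, low, high):
    return max(low, min(high, value))

def head_command(operator_yaw, operator_pitch, operator_roll):
    """Turn tracked head angles (degrees) into clamped joint targets."""
    return {
        "pan":  clamp(operator_yaw,   -90.0, 90.0),  # turn left/right
        "tilt": clamp(operator_pitch, -45.0, 45.0),  # look up/down
        "roll": clamp(operator_roll,  -30.0, 30.0),  # pivot side to side
    }

def finger_command(trigger_pressed, grip_pressed, thumb_pressed):
    """Map three controller buttons onto the three fingers of one hand."""
    return {"index": trigger_pressed, "middle": grip_pressed, "thumb": thumb_pressed}

# Example: the operator looks slightly up and to the left while gripping.
print(head_command(-20.0, 10.0, 0.0))
print(finger_command(True, True, False))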

With such distinct and useful features, collaborative robots provide interactive two-way audio and video communication. For people with special needs, quality of life suffers when a person cannot perform their everyday activities; for them, a Fusion robot using induced actions, with some modifications for the type of disability, can be a great help. The coming generation of robots will blend the physical and digital worlds. These cobots are not meant to replace human labor; rather, they help people achieve higher output with improved efficiency. They have the potential to ease human life while ensuring precision and accuracy, thereby enhancing quality of life.

By Mayur Shewale (Assistant Writer)
