Handing a towel or item of clothing to a person, working hand in hand with a worker during assembly, or operating in environments with limited accessibility: the BionicMobileAssistant is well suited to a wide range of assistance tasks. It consists of a mobile robot, an electric robot arm and the BionicSoftHand 2.0.
A robot moves autonomously through a factory hall, recognises on its own where the various workstations are, adaptively grips a tool and works collaboratively at a worker's side: this is what an assembly process using the BionicMobileAssistant could look like in the future.
Agile below, adaptable above
The lower part of the assistance system is the ballbot subsystem: a mobile robot that balances on a ball and can therefore manoeuvre in any direction. The ballbot's movements are planned and coordinated by planning and control algorithms running on a powerful computer housed in its body.
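The balancing principle can be illustrated with a minimal feedback loop: the controller measures the body's tilt and commands the ball drive so that the ball rolls back underneath the centre of mass. The single-axis model and the gains below are illustrative assumptions, not Festo's actual planning and control algorithms.

```python
# Minimal sketch of one axis of a ball-balancing controller.
# Gains kp/kd and the PD structure are illustrative assumptions.

def balance_torque(tilt, tilt_rate, kp=60.0, kd=8.0):
    """PD feedback: a drive torque that pushes the body tilt back to zero."""
    return -(kp * tilt + kd * tilt_rate)

# Example: body leaning 0.05 rad forward and tipping further at 0.2 rad/s
tau = balance_torque(0.05, 0.2)
print(round(tau, 2))  # -4.6  (torque opposing the fall)
```

A real ballbot runs two such axes plus yaw, with the gains derived from a dynamic model rather than tuned by hand.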
On top of the mobile platform sits the electric robot arm DynaArm, which enables fast, dynamic movements. Thanks to model-based force control and algorithms that compensate for dynamic effects, the arm responds well to external influences and can therefore interact very sensitively with its environment. It is controlled by the ballbot via an EtherCAT communication bus.
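Model-based force control of this kind typically adds a model term, such as gravity compensation, to a gentle feedback term: because the model cancels most of the load, the feedback gains can stay low and the arm remains compliant to external pushes. The single-joint sketch below is a generic illustration under assumed link parameters, not the DynaArm's actual controller.

```python
import math

# Illustrative link parameters: mass m at lever arm l; kp/kd are assumed gains.
def joint_torque(q, qd, q_des, m=2.0, l=0.4, g=9.81, kp=40.0, kd=5.0):
    """Gravity-compensated PD control for a single rotary joint."""
    gravity = m * g * l * math.cos(q)       # model-based compensation term
    feedback = kp * (q_des - q) - kd * qd   # soft position feedback
    return gravity + feedback

# Holding the link level (q = 0) with no error: only the gravity term remains
print(round(joint_torque(0.0, 0.0, 0.0), 3))  # 7.848
```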
Adaptive gripping is made possible by the pneumatic robot hand BionicSoftHand 2.0, a further development of the BionicSoftHand from 2019. Its fingers consist of flexible bellows structures with air chambers, covered by a textile knit that is firm yet pliable. This makes the hand light, flexible, adaptable and sensitive, while still able to exert considerable force. A glove with tactile force sensors on the fingertips, the palm and the back of the hand gives it the necessary sense of touch.
The BionicSoftHand 2.0 senses how hard the object being gripped is and how well it sits in the hand, and adjusts its gripping force accordingly.
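A hedged sketch of how such an adjustment could work: estimate stiffness from how much the object yields under a pressure increase, and scale the grip force down for soft or poorly seated objects. The sensor model, units and force range are illustrative assumptions, not Festo's control logic.

```python
# Illustrative grip-force selection from tactile readings (assumed units).

def estimate_stiffness(pressure_delta, indentation_delta):
    """Stiffer objects deform less for the same pressure increase."""
    return pressure_delta / max(indentation_delta, 1e-6)

def target_grip_force(stiffness, fit_ratio, f_min=1.0, f_max=20.0):
    """Pick a grip force between f_min and f_max.
    fit_ratio: fraction of tactile pads reporting contact (0..1)."""
    softness = 1.0 / (1.0 + stiffness)      # 0 = rigid, -> 1 = very soft
    scale = (1.0 - softness) * fit_ratio    # gentler for soft or loose grips
    return f_min + (f_max - f_min) * scale

rigid = target_grip_force(estimate_stiffness(10.0, 0.1), fit_ratio=1.0)
soft = target_grip_force(estimate_stiffness(1.0, 1.0), fit_ratio=0.6)
print(rigid > soft)  # the rigid, well-seated object is gripped more firmly
```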
Keeping an eye on the gripping object and the surroundings
With the help of a depth camera, the BionicSoftHand 2.0 can also recognise the gripping object visually, even when it is partially covered. The image information is processed by a neural network that was trained in advance using data augmentation: by slightly modifying a small set of source images – for example with different backgrounds, lighting conditions or viewing angles – and duplicating them, the system obtains a comprehensive data set with which it can work independently.
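This kind of augmentation can be sketched in a few lines: start from one source image and generate many variants with altered lighting and viewpoint. The specific transformations and parameter ranges below are illustrative assumptions, not the actual training pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image, rng):
    """Return one randomised variant of an (H, W, 3) uint8 image."""
    out = image.astype(np.float32)
    out *= rng.uniform(0.6, 1.4)        # simulate a lighting change
    if rng.random() < 0.5:
        out = out[:, ::-1]              # horizontal flip ~ mirrored viewpoint
    out = np.rot90(out, rng.integers(0, 4))  # coarse viewing-angle change
    return np.clip(out, 0, 255).astype(np.uint8)

# One synthetic "source image" expanded into 100 training variants
base = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
dataset = [augment(base, rng) for _ in range(100)]
print(len(dataset), dataset[0].shape)  # 100 (64, 64, 3)
```

In practice each variant would also get a new background composited in, as the text describes; that step needs segmentation masks and is omitted here.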
Thanks to two additional cameras, the BionicMobileAssistant keeps its surroundings in view at all times and orientates itself independently in the room: one camera searches for predefined fixed points in the environment so that the robot can position itself autonomously, while the second uses the ceiling structure to estimate its own motion. The system carries its entire power supply on board, which allows it to carry out various tasks at different locations – in keeping with the constantly changing nature of production.
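Positioning from predefined fixed points can be illustrated with a classic trilateration: given the known coordinates of several landmarks and the measured distances to them, the robot's position follows from a small least-squares problem. The landmark layout and solver below are a generic sketch, not the BionicMobileAssistant's actual localisation software.

```python
import numpy as np

def locate(landmarks, distances):
    """Estimate a 2-D position from known landmarks and measured ranges.
    Subtracting the first circle equation from the others linearises the
    problem: 2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + xi^2 - x0^2 + yi^2 - y0^2."""
    p = np.asarray(landmarks, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (d[0]**2 - d[1:]**2
         + p[1:, 0]**2 - p[0, 0]**2
         + p[1:, 1]**2 - p[0, 1]**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three hypothetical fixed points; ranges measured from the true spot (3, 4)
marks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true = np.array([3.0, 4.0])
dists = [float(np.linalg.norm(true - np.array(m))) for m in marks]
print(np.round(locate(marks, dists), 2))  # [3. 4.]
```

With more than three landmarks the same least-squares call averages out measurement noise, which is why visual localisation systems track many fixed points at once.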
The robot moves fully autonomously: the battery for the arm and the mobile robot is housed in the body, while the compressed-air cartridge for the pneumatic hand is installed in the upper arm.