Robotic Development - Hardware, Air Muscle Pneumatic Actuation

Engineered Arts are at the forefront of robotic hardware design. Our parallel electro-pneumatic actuation makes our robots fast and reliable: low-geared electric drives and fluidically tuned pneumatic muscles produce natural, expressive limb and body movements. The combination allows a broader, faster, more precise and more natural-seeming range of movement than is possible with either actuator variety acting alone. Specialised muscle-mimicking materials and custom valve designs give an unprecedented degree of control over the non-linear, force-driven elements of the system. This focus on hardware ensures rapid and reliable robot development.

Pneumatics

The pneumatic fluidic muscle actuators used on RoboThespian are capable of generating high tensile forces with a stick-slip-free movement pattern, enabling both slow and fast movements to be executed with equal adroitness. The non-linear nature of the actuation curves produces highly natural-looking motion. When combined with antagonistic muscle pairings and high-level control algorithms, a sophisticated biomimetic robotic system can be readily achieved.

Because the pneumatic actuators have very low impedance, they can easily be back-driven. Beyond the safety benefits, this means the robot is able to 'relax', something not easily achieved with electrically actuated robots. In this relaxed state, the robot can be guided and manipulated with little to no resistance.
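
To make the antagonistic principle concrete, the sketch below models a muscle pair with a deliberately simplified linear pressure-to-force law. The parameters, pulley geometry and pressure values are illustrative assumptions only; the real muscles are strongly non-linear, and the actual control layer is not shown.

    # Illustrative model of an antagonistic pneumatic-muscle pair (not the real
    # RoboThespian muscle law). Assumes force proportional to pressure, falling
    # off linearly with contraction; all parameters are hypothetical.

    def muscle_force(pressure_kpa, contraction, area_mm2=300.0, k=1.0):
        """Approximate tensile force (N) of one muscle at a contraction ratio."""
        return max(0.0, pressure_kpa * 1e3 * area_mm2 * 1e-6 * (1.0 - k * contraction))

    def joint_torque(p_flexor_kpa, p_extensor_kpa, angle_rad,
                     radius_m=0.02, stroke_m=0.04):
        """Net torque from two muscles pulling on opposite sides of a pulley."""
        c = radius_m * angle_rad / stroke_m    # contraction is swapped between the pair
        f_flex = muscle_force(p_flexor_kpa, 0.5 + c)
        f_ext = muscle_force(p_extensor_kpa, 0.5 - c)
        return radius_m * (f_flex - f_ext)

    print(joint_torque(300, 300, 0.0))  # equal pressures: no net torque, but co-contraction sets stiffness
    print(joint_torque(400, 200, 0.0))  # pressure differential drives the joint (0.6 N*m here)
    print(joint_torque(0, 0, 0.0))      # both vented: the joint relaxes and is freely back-drivable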

Control


Thanks to the parallel actuation systems, much of RoboThespian's hardware is under MIMO (multiple-input, multiple-output) control, offering both force and position feedback. We provide multiple integrated, user-friendly interfaces for programming our robots, from a novice-level drag-and-drop GUI through to a comprehensive Python scripting layer that gives simple, direct control over all inputs and outputs. All interfaces are browser-based for easy access, requiring no software downloads or installation. You can access the robot directly or over the internet – all our robots are web-connected devices – making robot development extremely accessible.
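
As a flavour of what the scripting layer enables, here is a minimal Python sketch of commanding a web-connected robot over HTTP. The host name and endpoint paths are placeholder assumptions for illustration, not the actual Tritium interface.

    # Hypothetical sketch of driving a web-connected robot over HTTP. The host
    # name and endpoint paths are placeholders, not the actual Tritium API.
    import requests

    ROBOT = "http://robothespian.local"          # placeholder robot address

    def set_joint(name, position, duration_s=1.0):
        """Send a position demand to one joint (illustrative endpoint)."""
        requests.post(f"{ROBOT}/api/joints/{name}",
                      json={"position": position, "duration": duration_s},
                      timeout=2.0)

    def read_sensor(name):
        """Read back a named sensor value (illustrative endpoint)."""
        return requests.get(f"{ROBOT}/api/sensors/{name}", timeout=2.0).json()["value"]

    set_joint("head_yaw", 0.3)                   # turn the head
    print(read_sensor("head_yaw_position"))      # position feedback from the encoder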

The position encoders used on the robots are based on the Austria Microsystems AS5145 rotary encoder. Visual and depth information comes from standard high-resolution RGB cameras in conjunction with IR depth sensors. Microphone inputs support speech recognition and voice control, and force and pressure sensors embedded in our compliant hardware elements allow you to create force-controlled interactive behaviours.
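
The AS5145 reports a 12-bit absolute angle over a synchronous serial (SSI) link, which can often be clocked in with a standard SPI peripheral. The sketch below is a rough illustration under that assumption; the wiring, SPI mode and bit alignment are hypothetical and board-dependent.

    # Sketch: reading a 12-bit absolute angle from an AS5145-style encoder over
    # SSI, clocked in with a Linux SPI peripheral. Wiring, SPI mode and the bit
    # alignment below are assumptions and will vary from board to board.
    import spidev

    spi = spidev.SpiDev()
    spi.open(0, 0)                     # bus 0, chip-select 0 (hypothetical wiring)
    spi.max_speed_hz = 500_000
    spi.mode = 0b10                    # assumed clock polarity for SSI framing

    def read_angle_deg():
        raw = spi.readbytes(3)         # 18-bit frame: 12 data bits + 6 status bits
        word = (raw[0] << 16) | (raw[1] << 8) | raw[2]
        counts = (word >> 12) & 0xFFF  # assumed left-aligned 12-bit data field
        return counts * 360.0 / 4096.0 # 4096 counts per revolution

    print(f"joint angle: {read_angle_deg():.2f} deg")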

Integration

Engineered Arts designs everything down to the electronic board level; we can work with you to integrate the hardware and software you desire, or you can do it yourself via our easy-to-access API. Developing software for the RoboThespian platform has never been easier.


Social Interaction

A key part of our robot development is social interaction. Robots and humans sharing a space must interact on many levels, both physical and cognitive. Naturalistic movement, responsive faces and sympathetic personalities make our robots the ideal platform with which to advance the science of human-robot interaction.

The robots at Engineered Arts ship with numerous advanced software features. Visual sensors, IR depth information, embedded tracking software and pose recognition give our products fast, flexible responsiveness. In addition, the innate mechanical compliance of their actuators ensures a safe environment for physical collaboration.

Human-Robot Interaction

Our projective head hardware can realise a huge range of facial features. From realistic to cartoonish, the robot can be an eerily lifelike simulacrum or an exaggerated emotive caricature. The embedded SHORE facial recognition software allows real-time recognition and reproduction of facial expressions. We also offer a mechanical head design for a more retro robot aesthetic; this head emotes only with the eyes and the voice, but the moving mouth engenders a high degree of realism in spoken interactions.
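
The sketch below shows one way expression mirroring could be wired up. The face-score dictionary and the set_face_params() call are hypothetical stand-ins, not SHORE's actual interface.

    # Sketch: mirroring a detected facial expression on the projected face.
    # The face dict and set_face_params() are hypothetical stand-ins for the
    # SHORE output and the robot's face-rendering interface.

    EXPRESSION_TO_FACE = {
        # dominant expression -> (mouth_curve, brow_raise) for the projected face
        "happy":     (0.8,  0.2),
        "sad":       (-0.6, -0.3),
        "surprised": (0.2,  0.9),
        "angry":     (-0.4, -0.8),
    }

    def mirror_expression(face, set_face_params):
        """Reproduce the strongest detected expression on the robot's face."""
        scores = face["expressions"]          # e.g. {"happy": 0.91, "sad": 0.02}
        dominant = max(scores, key=scores.get)
        mouth, brow = EXPRESSION_TO_FACE.get(dominant, (0.0, 0.0))
        strength = scores[dominant]           # scale response by detection confidence
        set_face_params(mouth_curve=mouth * strength, brow_raise=brow * strength)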

Our robots come with a wide range of naturalistic passive behaviours already embedded. Turning the head toward new people or objects entering the field of view, tracking faces and making subtle expression changes are just a few of the ways in which RoboThespian and SociBot engage with their environment as autonomous actors rather than static machines.
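
Behaviours of this kind can be surprisingly simple. The sketch below turns the head toward any newly tracked face; get_tracked_faces() and set_joint() are hypothetical hooks standing in for the tracking and motion layers.

    # Sketch: a passive attention behaviour that turns the head toward whichever
    # face has just entered the field of view. get_tracked_faces() and set_joint()
    # are hypothetical hooks into the tracking and motion layers.
    import math
    import time

    def attention_loop(get_tracked_faces, set_joint, fov_deg=60.0):
        known_ids = set()
        while True:
            faces = get_tracked_faces()            # e.g. [{"id": 3, "x": 0.7}, ...]
            for face in faces:
                if face["id"] not in known_ids:
                    # x runs 0..1 across the image; map it to a yaw demand
                    yaw = math.radians((face["x"] - 0.5) * fov_deg)
                    set_joint("head_yaw", yaw)
            known_ids = {f["id"] for f in faces}   # forget faces that have left
            time.sleep(0.1)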

The open API provided to all academic customers makes it simple to advance existing features or to develop whole new modes of interaction, as demonstrated by the work of the University of Kaiserslautern.

Whether you wish to use our robots as development hardware or as a teaching platform to explore the many interconnected facets of robotics, we have the software structure you require. Content and programming interfaces can be hosted locally on the robot or on a cloud-based server accessible from anywhere. Because our robots are web-connected devices, remote robot development is straightforward, and with the requisite APIs you can even add off-robot computation for processor-intensive applications. For public-facing installations, we offer a dedicated Kiosk which allows general access to basic functionality, while more advanced users have access to the entire robot development environment.
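
One possible shape for that off-robot loop, with a placeholder server URL and hypothetical camera and command hooks: the robot posts a frame, the server does the heavy inference, and the reply is applied locally.

    # Sketch: off-robot computation. The robot posts a camera frame to an external
    # server and applies whatever command comes back. The URL and the two hooks
    # (grab_frame, apply_command) are placeholders for illustration.
    import requests

    def offload_step(grab_frame, apply_command,
                     server="http://gpu-server.example:8000/infer"):
        frame_jpeg = grab_frame()               # JPEG bytes from the robot camera
        reply = requests.post(server, data=frame_jpeg,
                              headers={"Content-Type": "image/jpeg"},
                              timeout=1.0)
        apply_command(reply.json())             # e.g. {"head_yaw": 0.2}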


Standard software deployments include SHORE, the Fraunhofer IIS facial tracking and processing suite; position and pose tracking of up to 12 people, using the onboard IR depth sensor in conjunction with the OpenNI framework; and web connectivity to allow remote access and performance assessment. Unique QR codes can embed specific information about a location or individual, enabling easy visual access to a distributed knowledge framework. Open-loop, timeline-based sequences can be created via our comprehensive Virtual Robot platform, or on a touchscreen with a child-friendly, easy-to-understand GUI that allows quick creation of motion and speech routines. For more comprehensive control, a browser-based IDE incorporates content management, real-time sensor feeds, and a Python-based scripting layer.
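
Under the hood, an open-loop timeline is just a list of time-stamped keyframes. The sketch below plays back a made-up sequence of (time, joint, position) tuples; set_joint() stands in for the robot's actual motion interface.

    # Sketch: an open-loop, timeline-based sequence - time-stamped keyframes
    # played back in real time. set_joint() stands in for the motion interface;
    # the joint names and values are made up for illustration.
    import time

    WAVE = [
        (0.0, "arm_lift",   1.2),
        (0.5, "elbow_bend", 0.8),
        (1.0, "elbow_bend", 0.2),
        (1.5, "elbow_bend", 0.8),
        (2.5, "arm_lift",   0.0),
    ]

    def play(timeline, set_joint):
        start = time.monotonic()
        for t, joint, position in sorted(timeline):
            time.sleep(max(0.0, t - (time.monotonic() - start)))
            set_joint(joint, position)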


Our underlying Application Programming Interface (API) was developed in-house. Currently, we use a system called Tritium, named for the three interconnected hubs which handle all software and hardware on the robots. IOServe handles direct hardware interfacing, while the Control Function manager deals with higher-level control and behaviours. Most information is routed through Comms, a communication manager which interfaces between these two and also handles third-party software I/O.
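
The toy mock below conveys that three-hub topology in a few lines of Python; it is an illustration of the routing pattern only, not the real Tritium code.

    # Toy mock of the three-hub topology described above (not the real Tritium
    # code): IOServe touches hardware, ControlFunctions holds behaviours, and
    # Comms routes messages between them and any third-party software.

    class IOServe:
        def __init__(self):
            self.channels = {}
        def write(self, channel, value):
            self.channels[channel] = value     # stand-in for real hardware output
        def read(self, channel):
            return self.channels.get(channel, 0.0)

    class ControlFunctions:
        def __init__(self, comms):
            self.comms = comms
        def wave(self):
            # Behaviours are expressed as routed messages, not direct I/O calls.
            self.comms.route({"target": "io", "channel": "arm_lift", "value": 1.2})

    class Comms:
        """Central router: hub traffic and third-party I/O all pass through here."""
        def __init__(self):
            self.io = IOServe()
            self.control = ControlFunctions(self)
        def route(self, message):
            if message["target"] == "io":
                self.io.write(message["channel"], message["value"])

    comms = Comms()
    comms.control.wave()
    print(comms.io.read("arm_lift"))           # 1.2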

Q: Can you provide customised hardware/software for my project?

A: Yes. We are constantly developing new hardware and software, with a particular focus on force control, biarticular linkages and human-robot interaction, and we can work with you to meet custom requirements.

Q: Do you build robots in-house?

A: Yes. We have highly skilled engineers working in a well-equipped facility with modern CNC machines.

Q: Do your robots run ROS?

A: No. We have our own fast software stack, optimised for dynamic humanoids and interaction; however, you can add ROS connectivity via Python.
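
For example, a minimal bridge might republish joint positions as ROS 1 joint states. The rospy side below is standard; the get_joint_positions() hook into our Python layer is hypothetical.

    #!/usr/bin/env python
    # Sketch: publishing robot state into ROS 1. get_joint_positions() is a
    # hypothetical hook into the robot's own Python API; the ROS side is
    # standard rospy.
    import rospy
    from sensor_msgs.msg import JointState

    def bridge(get_joint_positions):
        rospy.init_node("robothespian_bridge")
        pub = rospy.Publisher("/joint_states", JointState, queue_size=1)
        rate = rospy.Rate(20)                    # publish at 20 Hz
        while not rospy.is_shutdown():
            positions = get_joint_positions()    # e.g. {"head_yaw": 0.3, ...}
            msg = JointState()
            msg.header.stamp = rospy.Time.now()
            msg.name = list(positions.keys())
            msg.position = list(positions.values())
            pub.publish(msg)
            rate.sleep()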


Q: Do you have a robot capable of grasping and manipulation?

A: Yes, RT4 has these capabilities.

Q: Can we customise firmware and embedded controllers?

A: Yes, please ask about access to our code repositories.

Q: Do you offer internships/work placements?

A: Yes, take a look at the jobs page.