Software like nothing else
Unique robots need unique software. All of our lifelike characters run on our state-of-the-art software framework: Tritium.
Developed in-house and refined over 12 years, Tritium keeps our robots moving in innovative and breathtaking ways. Built with flexibility in mind, it can operate almost any hardware component from almost any hardware platform. You can be sure every aspect of RoboThespian, Mesmer, Quinn and our custom commissions will run reliably and safely in Tritium’s hands.
Change a robot’s behaviour on the go from anywhere. Tritium is designed to work inside a web browser, from any internet-connected device. With one simple login, you’re free to maintain, update and adjust the robot with ease – even while it’s impressing a crowd.
- Add, chop and change content as you go
- Talk to an audience remotely through the robot
- Get remote support and diagnostics from Engineered Arts
You don’t have to be a coder to customise our robots’ abilities to your heart’s content. We designed Tritium so that it can read almost any programming language and work with almost any software. So you can get creative without fear of compatibility issues.
- Interact with our robots using almost any coding language
- Use plug-in software from other suppliers
- Run software locally or via a cloud-based service
Robots need to sort through data from sensors, encoders, motors, network traffic, code, video streams, microphone inputs, physical conditions – and act on it in time. It’s a complicated business, but Tritium is smart enough to keep our robots responsive under all kinds of conditions.
- Smart buffering system lets our robots make quick decisions safely
- Adapts to sudden changes in the environment
- Keeps human-robot interactions safe and entertaining
Some software can get confused by customisable inputs and conflicting instructions, but Tritium keeps our robots clear-headed about what they’re meant to do. If ever our robots are asked to do two different things at the same time, Tritium resolves the conflict and prioritises actions safely.
- Sifts through multiple software nodes, device requests and demands
- Prevents unpredictable behaviour
- Robots will keep performing in a reliable and expected way
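As an illustration, conflict resolution of this kind can be thought of as priority-based arbitration. The sketch below is hypothetical – the source names, priority values and class are invented for this example, not Tritium's actual internals – but it shows the principle: when requests conflict, the highest-priority (safety-critical) action wins.

```python
import heapq

# Hypothetical sketch of priority-based command arbitration.
# Lower number = higher priority; safety actions always win.
class CommandArbiter:
    PRIORITIES = {"safety_stop": 0, "operator": 1, "show_content": 2}

    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker keeps same-priority requests in order

    def request(self, source, action):
        priority = self.PRIORITIES.get(source, 99)
        heapq.heappush(self._queue, (priority, self._counter, source, action))
        self._counter += 1

    def resolve(self):
        """Return the single (source, action) pair that should run now."""
        if not self._queue:
            return None
        _, _, source, action = heapq.heappop(self._queue)
        self._queue.clear()  # conflicting lower-priority requests are dropped
        return source, action

arbiter = CommandArbiter()
arbiter.request("show_content", "wave_at_crowd")
arbiter.request("safety_stop", "freeze_all_motors")
print(arbiter.resolve())  # the safety request wins
```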
Now you can be the robot from anywhere in the world. Using inbuilt cameras and microphones, you can control its gaze, hold a natural conversation, and trigger content on the fly. Automated features like face tracking keep interaction compelling and believable. TinMan offers unrivalled levels of engagement and creates lasting positive memories of your business or attraction.
- TinMan hands you real-time control of any of our robots
- Our powerful software runs entirely in the browser, so there’s nothing to install
- Simple, intuitive operation enables truly breathtaking interaction with no distractions
For users wishing to delve more deeply into the possibilities afforded by our robots, we provide an integrated development environment where you can use Python to create your own control functions and subroutines for the robot. Use the robot’s many sensors to register data from the environment, and control the hardware to make the robot truly responsive.
- Create your own advanced robot behaviours
- Write and instantly test your code on the robot
- Access the development environment in your browser from anywhere in the world
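A custom behaviour might look something like the following sketch. The `Robot` class and `move_head` method here are stand-ins invented for illustration – consult the development environment's own documentation for the real hardware interface – but the mapping from a camera observation to a motor command is the general shape such a behaviour takes.

```python
# Hypothetical sketch of a custom behaviour: turn the head towards
# a detected face. The Robot class below is a stub standing in for
# whatever hardware interface the real environment exposes.

class Robot:
    """Stub for the robot's head hardware."""
    def __init__(self):
        self.head_pan = 0.0   # degrees, positive = robot's left
        self.head_tilt = 0.0  # degrees, positive = up

    def move_head(self, pan, tilt):
        # Clamp to a plausible mechanical range before commanding.
        self.head_pan = max(-90.0, min(90.0, pan))
        self.head_tilt = max(-45.0, min(45.0, tilt))

def track_face(robot, face_x, face_y, fov_h=60.0, fov_v=40.0):
    """Map a face position in the camera image (0..1 on each axis,
    0.5/0.5 = dead centre) to a head pan/tilt adjustment."""
    pan_offset = (face_x - 0.5) * fov_h
    tilt_offset = (0.5 - face_y) * fov_v  # image y grows downwards
    robot.move_head(robot.head_pan + pan_offset,
                    robot.head_tilt + tilt_offset)

robot = Robot()
track_face(robot, face_x=0.75, face_y=0.5)  # face right of centre
print(robot.head_pan, robot.head_tilt)      # 15.0 0.0
```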
Almost everything our robots do can be controlled remotely over the web, thanks to a RESTful API. You can use our software and interface to do this, or you can build your own interface with custom commands, readouts and UI – all remotely.
- Link robots with external equipment and synchronize with your show
- Interface with remote sensors to trigger actions and reactions
- Remotely ask the robot what it is doing and what it is thinking
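In rough terms, driving a robot over a RESTful API means composing HTTP requests against resource URLs. The sketch below only builds the URL and JSON payload rather than sending anything; the host name and endpoint paths are invented for illustration, so check the actual API documentation for the real routes.

```python
# Hypothetical sketch of what a REST call to a robot might look like.
# The host name and endpoint paths are illustrative only.
import json
from urllib.parse import urlunsplit, urlencode

ROBOT_HOST = "robothespian.example.local"

def build_request(resource, params=None, body=None):
    """Compose the URL and JSON payload for a robot API call."""
    query = urlencode(params or {})
    url = urlunsplit(("https", ROBOT_HOST, f"/api/{resource}", query, ""))
    payload = json.dumps(body) if body is not None else None
    return url, payload

# Ask the robot what it is doing right now (a GET with no body)...
url, _ = build_request("status/current_action")
print(url)

# ...or trigger a performance in response to an external sensor.
url, payload = build_request("performances/play",
                             body={"name": "welcome_sequence"})
print(url)
print(payload)
```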
Device UI and Sensors
All our robots are provided with a wide array of sensors – cameras, microphones and position encoders – plus smart electronics with thousands of parameters, ensuring they are responsive and interactive machines. On our sensor interfaces this data is available in real time in your browser: you can compare inputs and outputs, observe response times, and use this information when creating your own control functions.
- Visual feeds are available from the high-resolution head cameras and overlaid with other sensor data
- Real-time virtual oscilloscopes allow quick development, debugging and issue diagnosis
- Remote access means hardware support is easy, even if the robot is on another continent
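Comparing inputs and outputs to observe response times could be done along these lines. The sample data format below is invented for illustration – a real sensor feed supplies its own structure – but pairing timestamped commands with the encoder movements that follow them is the general idea.

```python
# Hypothetical sketch: measure response time by pairing timestamped
# command samples with the first encoder movement on the same joint.
# The data format here is invented for illustration.

def response_times(commands, encoder_moves):
    """For each (timestamp_s, joint) command, find the first encoder
    movement on that joint at or after the command and return the
    latency in milliseconds, keyed by joint name."""
    latencies = {}
    for cmd_t, joint in commands:
        for enc_t, enc_joint in encoder_moves:
            if enc_joint == joint and enc_t >= cmd_t:
                latencies[joint] = round((enc_t - cmd_t) * 1000.0, 1)
                break
    return latencies

commands = [(0.000, "head_pan"), (0.100, "arm_lift")]
encoder_moves = [(0.032, "head_pan"), (0.145, "arm_lift")]
print(response_times(commands, encoder_moves))
# {'head_pan': 32.0, 'arm_lift': 45.0}
```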
Start creating your own performances
Get in touch to see how our characters can captivate a crowd in your exhibition or event space.