

Jan 21 16 2:42 PM


Software developers may want to work on the firmware of the Ada hand, but may not want to purchase a whole hand, or indeed any of the custom components for the Ada hand. Without the physical hardware, it can be hard to test software changes to see if they work, or to debug code as you go.

The goal is to find a way for a software developer to work on the Ada robotic hand firmware without actually owning an Ada hand, while still being able to debug and test their code.

1. Must be able to fully test firmware code as if it were running on Ada
  • See problem section
2. Must not cost more than £50/$70
  • Designed to be a cheap way for someone to experiment with firmware
3. Must utilize commonly available equipment where possible.
  • Must make life easier for a typical programmer
4. The solution must be licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
  • Since the original Ada hand is released under this license, any derivatives must be distributed under the same license.

To design and program a piece of standalone simulation software to run on a desktop computer, which will graphically show the Ada hand and animate it, simulating the movements that the motors would be performing if it were a real Ada hand. Since the firmware can run on a standard Arduino microcontroller (a very affordable and readily available component), programmers wouldn't need the specialist PCB or mechanical components of the hand to begin working on the firmware. The simulation software would communicate with the Ada hand firmware via a USB serial connection, receiving information about where the fingers should move to and returning information about where they have moved to in the simulator.

1. The simulator should graphically show the Ada robotic hand using a 3D model.
2. The simulator should run on Windows, OS X and Linux machines.
3. The simulator should utilize a USB serial connection to transfer data.
4. The simulator should receive target motor speed commands from the microcontroller.
5. The simulator should send the current position of each motor back to the microcontroller.
6. The simulator should not interfere with the standard serial communication of the microcontroller, and the user should still be able to send and receive the standard serial commands of the firmware. 

1. Create a custom "fingerlib.h" for the simulator called "fingersimlib.h" to run on the microcontroller alongside the firmware which forwards target motor speed commands to the USB serial port.
2. Format these additional commands with a descriptor, so that the simulator can interpret them correctly and strip them from the main body of the serial data, leaving the normal serial data untouched so that normal communication between the computer and microcontroller via USB serial can continue.
3. It may be necessary to include a high-level serial communication text box within the simulator, so that the simulator can add and remove its own data from the serial stream while leaving the standard serial commands intact.
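As a rough sketch of how the descriptor idea in (2) might work on the simulator side, the simulator could treat any serial line carrying a known prefix as its own and pass everything else through untouched. The `@SIM` prefix and field layout below are made up for illustration; they are not part of the actual fingerlib.h protocol, and this is Python rather than the firmware's C++:

```python
# Hypothetical framing: simulator packets are lines prefixed with "@SIM";
# everything else is normal firmware serial traffic, passed through untouched.
SIM_PREFIX = "@SIM"

def split_serial(raw: str):
    """Split a chunk of serial text into (sim_commands, passthrough_text)."""
    sim_commands = []
    passthrough = []
    for line in raw.splitlines():
        if line.startswith(SIM_PREFIX):
            # e.g. "@SIM M2 S150" -> motor 2, target speed 150 (made-up fields)
            sim_commands.append(line[len(SIM_PREFIX):].split())
        else:
            passthrough.append(line)
    return sim_commands, "\n".join(passthrough)
```

With this kind of scheme, the normal serial monitor sees only the passthrough text, so the standard firmware commands keep working unchanged.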

Last Edited By: JoelGibbard Jan 21 16 5:31 PM. Edited 7 times

#1

Jan 29 16 12:09 PM

This is a bit big for me to look at atm, but I have a few ideas...

Native route:
Unity3d is cross platform and can load .obj (not .stl) files. It is free to use as long as it displays a nag screen, and has scripting via C# and JavaScript and possibly interop with other languages.

Browser based:
There are HTML-based 3D STL viewers, and you can interact with serial from a browser, as demonstrated by browser-based utilities for configuring drone firmware. Essentially you could use the Chrome browser as your virtual machine and get cross-platform support that way. Not sure if the graphics performance would be great, but it might work.

Real hardware based:
Set up some hands in the office, surround them with security-type cams and implement an HTTP protocol for controlling the hands remotely. Would require a decent low-latency connection (at both ends) and a system to book hacking time with the hands. If you record the video streams you can compare software versions as you go.

UPDATE: Unity can use .fbx files, which can be exported from Blender. Rendering in a browser using WebGL should be plenty fast enough too.
UPDATE 2: Unity can use .blend files directly.

Last Edited By: jaundice Jan 30 16 11:34 PM. Edited 6 times.


#2

Jan 31 16 2:03 PM

I spent yesterday evening playing with unity3d and the Ada blender files and it looks like there is a lot of potential.

Unity has a decent-looking rigid body physics model, and includes hinge joints (i.e. joints constrained to an axis). Those joints also have properties for resistance and springiness, which could match the Ada model very well, and could even provide a way of modelling the properties of new materials before printing. There are quite a lot of examples of modelled cables around, again with appropriate physics, so I am thinking it may not be too hard to anchor the end of a cable to a fingertip and mimic the pull through the channels.

On older versions of Unity there was an issue with using serial ports on OSX. There is a very high chance that this is solved, as the underlying Mono framework recently jumped two major versions. If that issue remains, we could always connect the Arduino serial to Unity via an HTTP or TCP proxy layer, which may be useful in itself.

Along with Windows, OSX and Linux, Unity also supports Android, iOS and WebGL, so it would be quite conceivable to have an iOS or Android version communicating with the AVR via Bluetooth serial.

The thing that concerns me is the proprietary license, but I cannot find any other platform that offers such a leg up.

Last Edited By: jaundice Jan 31 16 5:19 PM. Edited 2 times.


#3

Feb 1 16 1:20 PM


I am definitely up for Unity; I have used it a little in the past and scripting using C# is quite powerful indeed. Please let us know if their licensing model is ok with you!
Very good analysis Jaundice!


#4

Feb 1 16 7:48 PM

Which Version You Can Use - Unity Personal Revenue Restrictions

Except for a thirty (30) day trial period, Unity Personal (including the iOS and Android platform deployment options) may not be used by:
  1. a Commercial Entity that has either: (a) reached annual gross revenues in excess of US$100,000, or (b) raised funds (including but not limited to crowdfunding) in excess of US$100,000, in each case during the most recently completed fiscal year;
  2. a Non-Commercial Entity with a total annual budget in excess of US$100,000 (for the entire Non-Commercial Entity (not just a department)) for the most recently completed fiscal year; or
  3. an individual (not acting on behalf of a Legal Entity) or a Sole Proprietor that has reached annual gross revenues in excess of US$100,000 from its use of the Unity Software during the most recently completed fiscal year, which does not include any income earned by that individual which is unrelated to its use of the Unity Software.

That's from the Unity website. So Unity is free for personal use, so long as that use does not conflict with the above. Since we anticipate achieving (1.) at some point this year, Open Bionics wouldn't be able to use Unity under the personal license. However, that doesn't stop our developer community using it to develop something. If we then wanted to use it as an organisation, we'd have to pay the $75/month license for the professional edition, which isn't actually that bad.

I have no objections, unless I'm missing something.....?

Joel (Open Bionics CEO)


#5

Feb 1 16 8:22 PM

Thanks Joel, you never know, they may even sponsor the project. Some companies like JetBrains (manufacturer of ReSharper and tools for various languages) provide free licences to open source developers so it may be worth a shot.

I am increasingly liking the idea of having a proxy layer in between the AVR and the "Animation Layer".
The brains of the application would live within the proxy and calculate the motor positions etc.
The proxy would have pluggable input adapters: initially one capable of reading serial (either USB or Bluetooth; they present themselves the same). Others can be added later, e.g. remote HTTP.
The animation layer would request data from the proxy as part of the game loop.
We can start with Unity and add others later if needed.

AVR sends data to proxy
proxy returns motor position to AVR
rinse, repeat

Animation layer requests actuator offsets from proxy, proxy returns them.
Animation layer updates the UI.
rinse, repeat

The proxy can be a self-hosted WebAPI app running on .NET/Mono.

Still just thoughts for now...
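A very rough sketch of the pluggable-adapter idea, with all class and method names invented for illustration (and in Python for brevity, rather than the .NET/Mono the proxy would actually run on):

```python
# Sketch of the proxy idea: pluggable input adapters feed motor state,
# and clients (e.g. the animation layer) poll the proxy from their game loop.
# All names here are illustrative, not from any actual project code.

class InputAdapter:
    """Interface for input sources (serial, remote HTTP, mock...)."""
    def read_positions(self):
        raise NotImplementedError

class MockSerialAdapter(InputAdapter):
    """Stands in for the AVR: returns a fixed set of motor positions."""
    def __init__(self, positions):
        self._positions = positions

    def read_positions(self):
        return dict(self._positions)

class Proxy:
    def __init__(self, adapter):
        self._adapter = adapter
        self._state = {}

    def update(self):
        # Called each tick: pull from the input adapter, cache the result.
        self._state = self._adapter.read_positions()

    def get_actuator_offsets(self):
        # Clients request the latest cached state as part of their game loop.
        return dict(self._state)
```

Swapping `MockSerialAdapter` for a real serial-reading adapter would leave the proxy and its clients unchanged, which is the point of the adapter layer.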


#6

Feb 1 16 8:29 PM

If there is anyone with better Blender skills than me (read: noob ;)) who could add finger bones to the .blend files (just one hand would do for now), that would help. I started, but it was taking forever and the result was not very well aligned :)
The fingers should be 2 bones each, with the joints at the knuckle; the thumb may need 3 bones.

Last Edited By: jaundice Feb 1 16 8:31 PM. Edited 1 time.


#7

Feb 1 16 8:36 PM

sassobasso wrote:

I am definitely up for Unity; I have used it a little in the past and scripting using C# is quite powerful indeed. Please let us know if their licensing model is ok with you!
Very good analysis Jaundice!

Thanks @sassobasso :)


#8

Feb 2 16 9:02 AM

jaundice wrote:
If there is anyone with better Blender skills than me (read: noob ;)) who could add finger bones to the .blend files (just one hand would do for now), that would help. I started, but it was taking forever and the result was not very well aligned :)
The fingers should be 2 bones each, with the joints at the knuckle; the thumb may need 3 bones.

Hi Jaundice,

Here's a simply rigged hand (I didn't bother putting any inverse kinematics on it). I left the thumb with two bones, one for each segment, but let me know if it would make it easier for you to have it rigged a different way.

All the best, 



#11

Feb 3 16 9:36 AM

I started doing some coding yesterday evening, so far just laying out a structure and a bit of work on the serial input.
I had some other thoughts on use cases for the proxy layer:

If you take inputs from one AVR device into the proxy, ~unlimited clients can connect to the proxy to either get the raw input, or some transformation thereof. This could be useful in a teaching lab where multiple team members could see the animation on their own machines, or even a whole class. It could also be useful for QA where multiple physical hands receive the same input from the proxy and can be calibrated.

The input doesn't need to be from an AVR, and the output could be to an AVR or multiple AVRs. For instance, the input could be from a MIDI file and the output could be finger positions sent to one or more AVRs.

The clients, be they animation, physical hands or something else, can either poll the proxy for updates or have the updates pushed to them, e.g. by UDP network broadcast.
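A rough sketch of the UDP push option (the port number and JSON message shape are arbitrary choices for illustration, and this is Python rather than the .NET the proxy would actually use):

```python
import json
import socket

# Arbitrary port for illustration; any free UDP port would do.
UPDATE_PORT = 9999

def make_broadcast_socket():
    """UDP socket with broadcast enabled, for one-to-many pushes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    return sock

def push_update(sock, positions, addr=("255.255.255.255", UPDATE_PORT)):
    """Send the current motor positions as one JSON datagram.

    Every listener bound to the port receives the same update, which is
    how one AVR input could drive a whole classroom of animation clients.
    """
    sock.sendto(json.dumps(positions).encode("utf-8"), addr)
```

Clients would just bind a UDP socket to the same port and decode each datagram as JSON; no per-client connection state is needed on the proxy side.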

Last Edited By: jaundice Feb 3 16 9:45 AM. Edited 2 times.


#12

Feb 11 16 10:33 PM

While waiting for some barrel connectors so I can run my hand off LiPos, I have been doing a bit of work on this. I have the plumbing for the proxy layer done, with a dummy input to mock serial input, which basically mimics 5 motors extending and contracting on a timer. This is delivered to clients via HTTP as a JSON-serialized protocol, and tests ok using a browser. Alongside, I have started working on the Unity front end and have the RH Ada model in a scene, with a camera that can pan around the hand with key or mouse input. I am new to Unity so the harder bits may take a while, though Unity seems pretty well designed and straightforward to use so far. I'll try and post some code soon.
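For anyone curious, that kind of dummy input can be sketched roughly like this (the triangle wave, value range and key names are arbitrary stand-ins, and this is Python rather than the actual proxy code):

```python
import json

def mock_motor_positions(t, n_motors=5, period=4.0, max_pos=100):
    """Synthetic positions for n motors: a triangle wave from 0 up to
    max_pos and back, mimicking motors extending and contracting on a
    timer. max_pos=100 is an arbitrary full-extension value.
    """
    phase = (t % period) / period        # 0..1 over each cycle
    tri = 1.0 - abs(2.0 * phase - 1.0)   # 0 -> 1 -> 0 triangle wave
    pos = int(tri * max_pos)
    return {f"motor{i}": pos for i in range(n_motors)}

def to_json(positions):
    """What the proxy would deliver to its HTTP clients."""
    return json.dumps(positions)
```

Serving `to_json(mock_motor_positions(time.time()))` from an HTTP endpoint gives clients something live to animate before any real serial input exists.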


#13

Feb 13 16 10:56 PM

Screenshot from Unity, motor positions (text down the LHS) are being read from the proxy, which is just generating a synthetic linear series. The fingers are not attached to the motor position data yet. The fingers are hinged and swinging under physics.

Last Edited By: jaundice Feb 13 16 10:59 PM. Edited 1 time.


#14

Feb 14 16 11:24 AM

@JonathanRaines, is there any chance you could weave some more Blender magic into the rigged model? Perhaps start with just the index finger and create a boned tendon. I guess it needs to follow the path of the channels and have enough segments and bones to allow it to flex fairly tightly, but I am not sure really how many that would be. Unity will reload changes to the model, so hopefully we can iterate a few times and then knock off the other fingers. Ta, jd


#15

Feb 15 16 5:34 PM

Hi Jaundice,

Would you just be wanting the tendon as a cosmetic piece? I'm liking the attention to detail! I'll try and squeeze it in this week, but depending on how difficult it is to do, it may take a bit longer.

All the best, 


#16

Feb 15 16 6:16 PM

Hi Jonathan, no, it won't be cosmetic; I will be anchoring it to the fingertip and moving the "free" end back and forward as if moved by the motor. Then hopefully, by setting the damping/spring on the hinges, we can use the game physics model to mimic the real physics.
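As a rough sketch of the physics idea: a single hinge with a spring back to its rest angle, a damper, and a constant tendon torque settles at an angle of pull/k. All the constants below are made up for illustration, not tuned to the real hand or to Unity's joint parameters:

```python
def simulate_hinge(pull, steps=5000, dt=0.001, k=20.0, c=2.0, rest=0.0):
    """Integrate one hinge joint with semi-implicit Euler.

    A spring (stiffness k) pulls the finger segment back to its rest
    angle, a damper (coefficient c) resists motion, and `pull` is a
    constant tendon torque. Unit moment of inertia is assumed; all
    constants are illustrative, not tuned to Ada.
    """
    theta, omega = rest, 0.0
    for _ in range(steps):
        torque = -k * (theta - rest) - c * omega + pull
        omega += torque * dt   # update angular velocity first...
        theta += omega * dt    # ...then position (semi-implicit Euler)
    return theta
```

This is essentially what a game engine's spring/damper hinge does internally each frame; tweaking k and c against the real hand's behaviour is how the model could be made to mimic the physical tendons.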
Cheers jd


#17

Feb 24 16 10:40 PM

Hi guys, sorry I have been a bit quiet here for a few days... I have just done an initial commit of the proxy and Unity3d code. Unfortunately I haven't had a chance to verify that it is all there, or that it doesn't contain build artifacts... It is still early days with both parts of the project :) As a heads up, the proxy solution references a library called Unity which is not Unity3d, but is actually a dependency injection container.
In terms of building and running it, I am using Visual Studio 2015 (you can get the Community edition for free), which is Windows only. For Linux and OSX you should be able to use MonoDevelop, though I have not tested this yet. You will also need to install Unity3d, which requires signing up for an account. If you use Visual Studio, you will probably also want to install the Unity3d extensions (VS Menus > Tools > Extensions > search for it).
I recently moved the directories about, which may require an update to the Unity3d asset paths, though so far it seems ok.

Hopefully I will get a chance to make sure everything is there and working in the next few days..


#18

Jul 21 16 6:43 PM

Hi there,

Thanks, but please consider not using Unity! Nothing against it, but I don't think it fits with the openness of the full project. I'm not familiar with the game engine in Blender, but surely it fits better with the open-source philosophy.

If you don't need any soft-material simulation (which I don't think is necessary for Ada), I offer another option: Gazebo. It works with ODE as the default physics engine, but Bullet (the default in Blender) and others are also available effortlessly.

If you like it, moreover, the exact same Gazebo model can run in a 3D-enabled web browser (so you don't even need to install Gazebo on your PC, which makes it platform-independent) and be controlled from your local PC, while anyone can watch the simulation. This is a service provided by TheConstructSim, which is free for short-time simulations, so I'd think of it as a showroom, not for developing/testing.

This is an example of a model I implemented for Gazebo being shown in TheConstruct website:

Let me know what you think.

