In 2014, a team of researchers from the University of Toronto’s Critical Making Lab, along with CBM Canada, launched the 3D PrintAbility project with the goal of producing lower-limb prosthetic sockets for children in the developing world using inexpensive filament-based 3D printers. However, they had a problem – before a socket could be printed, it had to be designed, based on a digital 3D scan of the patient.
Most Computer-Aided Design (CAD) tools were not designed to work with 3D scans, and the few that were tended to be difficult to use and very expensive. PrintAbility needed a design tool suitable for prosthetists who did not have the time to become CAD experts. Their search for a solution led them to Meshmixer, a new (and free!) 3D design tool being developed by the Research Group at CAD software giant Autodesk.
With a design workflow based on Meshmixer, PrintAbility was successfully field tested at the CoRSU Rehabilitation Hospital in Uganda in 2015. Following that test, Grand Challenges Canada, Autodesk Foundation, and Google.org provided funding to scale the project, resulting in the formation of Nia Technologies, a non-profit which has spent the last two years further developing the process through deployments at sites in Tanzania and Cambodia. To date, over 200 children have been sent home with Nia 3D-printed sockets.
But one major complication remains: designing the sockets. Meshmixer was the best option available, but it is still a complex, general-purpose CAD tool that does hundreds of things unrelated to prosthetic design. Most prosthetists have no experience with this kind of software, and could easily get lost in all that extra functionality.
In 2017 Nia launched a project to develop the next iteration of their Orthogen prosthetic design tool. The software consultancy Gradientspace, started by the founder of Meshmixer, was already developing a new framework for mesh-based 3D CAD. Nia’s project was a perfect fit. And so, in an ongoing collaboration, these teams are developing a Unity-based solution to the prosthetic design problem.
One amazing thing about Unity is that switching between desktop and VR builds of your application is literally as easy as checking a box. Now, normally if you build for a mouse or touch-based UI, switching to VR means rewriting large parts of the app. However, Gradientspace’s open-source frame3Sharp application framework, which provides a set of 3D design tools and other infrastructure necessary for CAD interfaces, has been designed to seamlessly support desktop, tablet, and VR modes. Since Nia was already using frame3Sharp to build Orthogen on the desktop, it was a small leap to “turn on VR” and see what we thought.
And what we thought was: this is awesome.
One of the biggest practical hurdles in making prosthetic design a digital process has been that prosthetists do so much work with their hands. Most prosthetics, even in the developed world, are created using plaster casts. A prosthetic isn’t completely form-fitting – it is the reshaping of the initial positive cast of the wearer’s leg that determines how comfortable the final prosthetic will be. And so an experienced prosthetist usually also ends up becoming a highly skilled sculptor. There is also a very technical side, with measurements and analysis, which we can make easier on the computer. But a mouse cursor or fingertip won’t ever compare to what a person can do with their hands.
With a VR system like the HTC Vive, we go from 2D x/y input to full 6 degrees of freedom (DoF). It’s not exactly like using your hands (although things are getting closer), but it is a giant leap over a mouse cursor. Perhaps surprisingly, one of the most complicated tasks in CAD workflows based on 3D scanners is just getting the scan into a sensible orientation. Traditionally this has meant using finicky little x/y/z 3D widgets, which is difficult even for experts. In Orthogen we put a lot of effort into making this easier, but in OrthoVR the prosthetist can literally just reach out and grab the virtual leg, then position it however they want by twisting their hand.
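Under the hood, this “grab” interaction boils down to a small bit of rigid-transform math. The sketch below is a plain-Python illustration, not actual frame3Sharp code (the real framework is C#/Unity, and all names here are made up for clarity): at grab time we record the object’s pose in the controller’s local frame, and every frame afterward we re-express that stored pose in world space, so the object follows the hand rigidly.

```python
import math

# A pose is a (position, rotation) pair; rotation is a unit quaternion (w, x, y, z).

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    """Conjugate; equals the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q * v * q^-1)."""
    p = (0.0, *v)
    w, x, y, z = quat_mul(quat_mul(q, p), quat_conj(q))
    return (x, y, z)

def grab(ctrl_pos, ctrl_rot, obj_pos, obj_rot):
    """On grab: store the object's pose relative to the controller."""
    inv = quat_conj(ctrl_rot)
    local_pos = rotate(inv, tuple(o - c for o, c in zip(obj_pos, ctrl_pos)))
    local_rot = quat_mul(inv, obj_rot)
    return local_pos, local_rot

def follow(ctrl_pos, ctrl_rot, local_pos, local_rot):
    """Each frame: re-express the stored local pose in world space."""
    new_pos = tuple(c + d for c, d in zip(ctrl_pos, rotate(ctrl_rot, local_pos)))
    new_rot = quat_mul(ctrl_rot, local_rot)
    return new_pos, new_rot
```

For example, if the leg is grabbed one unit in front of the controller, then the controller moves up one unit and twists 90° about its vertical axis, `follow` places the leg one unit up and swung around to the side – exactly what your hand would do with a physical object.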
This is the sort of situation where the VR interface is undeniably better than the non-VR alternative. We wouldn’t want to fill out patient information forms in VR, but unconstrained 3D manipulation tasks? VR is better. Order-of-magnitude better.
And not only do we have full 6-DoF input, we have two independent controllers. In OrthoVR, the prosthetist uses “sculpting”-style tools to shape the 3D curves that define the shape of the socket. The frame3Sharp framework supports simultaneous tool use, which means the prosthetist can “hold” the scan in one hand, and reposition it, while they are sculpting with the other. This is called “bimanual input” and is another fantastic capability of state-of-the-art VR systems.
OrthoVR is still a work-in-progress. Although you can design a socket in OrthoVR, you can’t yet do the measurements and dimensioning that are necessary to design a good socket. Our end goal is to build out a tool that fits into Nia’s Orthogen workflow as an alternative design interface to the desktop and mobile versions.
The Vive Customer Experience team has been instrumental in getting us to this point. Through their VR for Impact program, HTC has provided funding and equipment for OrthoVR development, and we will soon be ready to begin field testing at the Nia clinic in Tanzania.