The BioDigital Human
From 2009 to 2019 I worked remotely from Berlin, Germany, with the team at BioDigital Systems in Manhattan, New York, developing
and maintaining the BioDigital Human anatomy visualization platform and its public developer API.
We developed the Human on SceneJS, an open source WebGL library I created for developing 3D graphics applications
in Web browsers without using plugins. One of the first WebGL engines, SceneJS evolved alongside the WebGL specification, before
we eventually made a private fork, adapted specifically for the Human.
In this article, I’m going to describe SceneJS’ journey from weekend side project to the engine behind BioDigital’s
Web-based anatomy visualization platform. Ten years down the track, the platform has over four million
registered users and continues to develop, with a growing library of models of anatomy and physical conditions.
Contents
- 2005: SceneJS Origins
- 2006: Experiments with Canvas3D
- 2008: SceneJS Open Sourced
- 2009: SceneJS Powering the BioDigital Human
- 2015: SceneJS Private Fork
- SceneJS Presentations
- Next Steps: xeogl
- Acknowledgements
2005: SceneJS Origins
I started SceneJS as a weekend experiment, somewhere around late 2005. Back then, JavaScript wasn’t so fast, and friends
like @ohunt were busy writing raytracers in JavaScript that took forever,
as a kind of twisted browser-cooking exercise.
The first version of SceneJS even rendered wireframes as DIV elements, plotted
using Bresenham’s line algorithm, so I wasn’t expecting
that to be particularly interactive.
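For the curious, the trick was roughly this (a reconstruction, not the original code): rasterize each edge with Bresenham’s algorithm and drop a 1×1 DIV per pixel.

```javascript
// Plot a line as absolutely-positioned 1x1 DIV "pixels", using Bresenham's
// line algorithm (illustrative sketch, not the original SceneJS code)
function drawLine(x0, y0, x1, y1) {
    var dx = Math.abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    var dy = -Math.abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    var err = dx + dy;
    while (true) {
        plotPixel(x0, y0);
        if (x0 === x1 && y0 === y1) break;
        var e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }   // step in x
        if (e2 <= dx) { err += dx; y0 += sy; }   // step in y
    }
}

function plotPixel(x, y) {
    var div = document.createElement("div");
    div.style.cssText =
        "position:absolute;width:1px;height:1px;background:#000;" +
        "left:" + x + "px;top:" + y + "px";
    document.body.appendChild(div);
}
```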
That early version was also written in completely functional-style JavaScript, and triggered a ton of garbage collection and scope traversal. I was
inspired at the time by Lisp and Clojure, and perhaps took my fascination with terse scene definitions a little too far!
2006: Experiments with Canvas3D
Web-based 3D without plugins started to look viable in 2006, however, with the Canvas3D experiments begun by Vladimir Vukićević at Mozilla. By the end of
2007, both Mozilla and Opera had made their own separate implementations. Suddenly, interactive 3D in the browser didn’t
seem so crazy, so I switched SceneJS over to Canvas3D.
Following tradition, my first SceneJS demo
was, of course, a Gouraud-shaded Utah Teapot.
I loved the idea of a 3D world defined declaratively, as pure data. At this point, I was inspired by the likes of VRML, which I’d used as a student to visualize data, and by the terse, declarative syntax of JavaFX.
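To give a flavour of what that looks like, here’s a small scene in the style of a later SceneJS release: a teapot wrapped in material and rotation nodes, the whole thing just a JSON object handed to the engine (the exact node names varied between versions).

```javascript
// A declarative SceneJS-style scene graph: pure data, no imperative setup
var scene = SceneJS.createScene({
    nodes: [{
        type: "material",
        color: { r: 0.3, g: 0.3, b: 1.0 },
        nodes: [{
            type: "rotate",
            y: 1,
            angle: 30,
            nodes: [{
                type: "geometry/teapot"   // plugin-provided teapot primitive
            }]
        }]
    }]
});
```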
2008: SceneJS Open Sourced
My day job back in 2008 (in a cubicle, maintaining a Java-based spam-scrubbing platform) just wasn’t firing my creative circuits. I needed to get back in
touch with the creative culture that drew me into programming in the first place: 3D graphics, SIGGRAPH journals, cyberpunk
science fiction - all that good stuff.
So I quit my job, put SceneJS on GitHub, and devoted my time to getting
back into 3D programming, using WebGL.
2009: SceneJS Powering the BioDigital Human
A little while later, I signed up with BioDigital Systems and we began developing the BioDigital Human on SceneJS.
Human Content Pipeline Origins
Brandon Smith (AKA “The Wizard”) began by exporting one of
BioDigital’s models of the human skeletal system to COLLADA, which I then imported into SceneJS using an experimental open
source SceneJS asset server I’d been working on.
The 206 bones within that model rendered at a promising ~20FPS, so we took a gamble on WebGL, and the BioDigital flagship app was born.
Our biggest challenge was getting the platform to work reliably across the various operating systems, browsers and GPUs,
and so the next few years involved navigating patchy GPU support and a lot of “Aw Snap”. We owe a lot to
the ANGLE developers, whose work enables full hardware acceleration on
Windows without relying on OpenGL graphics drivers.
Over the next couple of years I rewrote SceneJS twice, and we managed to double that performance. Olli Etuaho from NVIDIA even helped
out with optimizations for mobile GPUs, and later, after we’d made a private fork (described below), we got it rendering
at ~60FPS for most of the full anatomy model.
Human Developer API Origins
Before Human, I’d also done some open source experiments with controlling SceneJS via a JSON-RPC message protocol, and we used those
to get started with our developer API.
Some of those experiments were:
- xeoEngine-experiment, which let you control SceneJS via JSON-RPC,
- actorjs, which let you define and update actor components via JSON-RPC, and
- scenejs-grid, which applied the actor and JSON-RPC concepts to SceneJS.
Those are now archived projects, but were useful for determining the best way to control a Human within an
IFRAME embedded in a 3rd-party container page.
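The common idea in all three was something like the sketch below (my own illustration, not the exact protocol): the container page drives the engine inside the IFRAME by posting JSON-RPC-style messages, and a dispatcher on the other side routes them into the scene graph.

```javascript
// Container page: post a JSON-RPC-style call into the embedded IFRAME
var frame = document.getElementById("engine-frame");

function call(method, params) {
    frame.contentWindow.postMessage(
        JSON.stringify({ method: method, params: params }), "*");
}

call("camera.setEye", { x: 0, y: 0, z: -15 });   // illustrative method name

// Inside the IFRAME: route each incoming message to the engine
window.addEventListener("message", function (event) {
    var msg = JSON.parse(event.data);
    dispatch(msg.method, msg.params);   // hypothetical dispatcher into SceneJS
});
```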
One of my inspirations for JSON-RPC was the messaging system that Paul Brunt had built into his WebGL-based GLGE engine.
Using the Developer API
To use the API, start by embedding the Human Widget in your page. In the example below, we’ll use the cochlear implant model:
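The embed is an IFRAME pointing at the Human widget, roughly like the snippet below. The id and the widget URL here are placeholders; the actual URL for the cochlear implant model comes from the content library at developer.biodigital.com.

```html
<!-- Placeholder id and src: copy the real widget URL for the cochlear
     implant model from the Human content library -->
<iframe id="embedded-human"
        src="https://human.biodigital.com/widget/?m=COCHLEAR_IMPLANT_MODEL_ID"
        width="800"
        height="600">
</iframe>
```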
Next, we’ll create an instance of the Human API:
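A minimal sketch, assuming the Human API script has been loaded from developer.biodigital.com and using the IFRAME id from the snippet above:

```javascript
// "embedded-human" is the id of the IFRAME that hosts the widget
var human = new HumanAPI("embedded-human");
```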
Through the API, we can now make the widget do things like setting the position of the camera:
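For example, something along these lines; the message name and parameter shape are from memory, so check the current tutorials at developer.biodigital.com for the exact calls:

```javascript
// Fly the camera to a new position, looking at the model's origin
// (illustrative message name and parameters)
human.send("camera.set", {
    position: { x: 0, y: 0, z: -10 },
    target:   { x: 0, y: 0, z: 0 },
    animate:  true
});
```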
For more info on what’s possible with the API, sign up with Human and check out the tutorials at developer.biodigital.com.
API Demo: Lockheed Martin ICE STORM Integration
We used the API for various presentations. For one, we interfaced the Human with the Lockheed Martin ICE STORM ICU Simulator, so that changes to the patient’s heartbeat and respiration within the simulator were rendered as morph animations within the Human.
Smile Train Virtual Surgery Simulator
The Human is a platform on which we can build applications. One of the most rewarding of those was the WebGL-based Smile Train
Surgical Cleft Repair Simulator.
We based the Smile Train simulator on Aaron Oliker’s
earlier C++ version, which he implemented on OpenSceneGraph.
That slick, Darth Vader-approved UI you’re seeing in Human and Smile Train is the work of BioDigital front-end engineers Kathia Yau and Avinash Chan.
2015: SceneJS Private Fork
In 2015, with the company expanding, Tarek Sherif took over my role as 3D programming lead, since it made
sense for that job to be performed on-site by a non-virtual person who could chase people around the office, instead of
typing emails all night in the wrong time zone.
Throwing git pull requests back and forth, we then added many more features to Human and SceneJS, including a streaming asset
server, physically-based rendering (PBR), geometry and texture compression, particle systems and an improved post-effects pipeline.
For the post-effects support, Tarek built an extensible plugin-based architecture based on his own open source WebGL engine, PicoGL.
SceneJS Presentations
Along the way, I got to write about SceneJS and present it to fellow graphics nerds:
- Wrote a chapter about SceneJS in OpenGL Insights 2012, which you can now download for free.
- Talked about SceneJS at the 2015 Berlin WebGL Meetup - here are the slides from that talk, with a few embedded live demos.
Next Steps: xeogl
I’m going to keep making more of these WebGL engines, because there’s never a one-size-fits-all solution (and, well, it is a
bit of a creative compulsion).
The public fork of SceneJS is now archived and no longer under development. However, if you’re looking for a production-proven
WebGL-based 3D engine which is currently used in several commercial IFC and CAD viewers, you might find my latest engine useful: http://xeogl.org.
Acknowledgements
- Vladimir Vukićević for kicking WebGL off with his Canvas3D experiments,
- the Khronos WebGL Working Group for overseeing the development of the WebGL specification,
- the ANGLE developers for making WebGL work on DirectX,
- COLLADA™ for the file format that got us started with the Human (and taught me a lot about what goes into a 3D engine),
- Paul Brunt for his pioneering open source WebGL-based GLGE engine, which was like a living textbook on graphics algorithms on JavaScript,
- the Zygote Body, which was originally created as Google Body by Won Chun as his 20% experiment at Google,
- the SceneJS community for a crash course on what a 3D engine is, and
- the model creation team at BioDigital for creating all the cool content that makes the platform shine.
I won’t try to mention the whole BioDigital team here because I might miss someone, but I’ll give a shout-out to our interns, such
as Jacqueline Chu and Shuai Shao (AKA ShrekShao),
who came in fresh from academia and added many valuable rendering features.