DIPY Horizon: fast, modular, unified and adaptive visualization
Eleftherios Garyfallidis1, Marc-Alexandre Côté2, Bramsh Qamar Chandio1, Shreyas Fadnavis1, Javier Guaje1, Ranveer Aggarwal3, Etienne St-Onge4, Karandeep Singh Juneja5, Serge Koudoro1, and David Reagan6

1Intelligent Systems Engineering, Indiana University, Bloomington, IN, United States, 2Microsoft Research, Montreal, QC, Canada, 3Microsoft, Hyderabad, India, 4Université de Sherbrooke, Sherbrooke, QC, Canada, 5Indian Institute of Technology, Hyderabad, India, 6Pervasive Technology Institute, Indiana University, Bloomington, IN, United States


DIPY Horizon is a fast, modular, unified, and adaptive visualization system that resembles a high-end game engine and works on the web and across desktop operating systems. Horizon is suitable for users, programmers, and clinical applications.


Medical imaging has grown into a fascinating field with beautiful visualization tools that let us see images of the brain and the body that were impossible to obtain in the past. Although the field has grown quickly, the tools we currently have are limited to very specific applications. To change this paradigm, we propose a visualization engine and application, inspired by high-end game engines, that allows the user to interact with multiple objects (streamlines, surfaces, volumes, glyphs, or assemblies of those) and user interface components (including 3D orbital menus) using the same API. Furthermore, this API allows the user to a) directly program shaders on different visual actors, b) interact with and pick any object, c) build scripted animations with timelines, d) work seamlessly with dipy, nibabel, matplotlib, numpy, scipy, vtk, tensorflow, and other libraries of the pythonic ecosystem, e) switch between web, desktop, and VR visualization with no changes in the code, f) visualize data from multiple subjects at the same time in grid or layered views, g) load data from the command line, a python script, the web, or directly from the window, and h) interact across clusters, bases (spherical harmonics), and final data.
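As a rough illustration of the "same API for every object" idea, the toy scene below shows actors of different kinds sharing one add/pick interface. This is a plain-numpy sketch, not Horizon's actual API; every class and method name here is invented for the example.

```python
import numpy as np

class Actor:
    """Toy visual object: holds geometry and an optional shader callback."""
    def __init__(self, name, vertices, shader=None):
        self.name = name
        self.vertices = np.asarray(vertices, dtype=np.float32)
        self.shader = shader  # a callable standing in for a GLSL program

class Scene:
    """Toy scene: a flat list of actors sharing one add/pick API."""
    def __init__(self):
        self.actors = []

    def add(self, actor):
        self.actors.append(actor)

    def pick(self, point, radius=1.0):
        """Return names of actors with any vertex within `radius` of `point`."""
        point = np.asarray(point, dtype=np.float32)
        hits = []
        for a in self.actors:
            d = np.linalg.norm(a.vertices - point, axis=1)
            if np.any(d <= radius):
                hits.append(a.name)
        return hits

scene = Scene()
scene.add(Actor("streamline", [[0, 0, 0], [1, 0, 0], [2, 0, 0]]))
scene.add(Actor("glyph", [[5, 5, 5]]))
print(scene.pick([1.1, 0, 0]))  # → ['streamline']
```

The point of the design is that streamlines, surfaces, volumes, and glyphs are all just actors, so picking and shader programming work identically for each of them.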


DIPY Horizon is built using GLSL2 (OpenGL Shading Language) for low-level GPU operations and the Python VTK3 library (8.1+) for scene representation and high-level operations. The proposed framework can access any DIPY1 method, including segmentation, registration, and reconstruction, that could be useful for visualization purposes. For instance, assume we want to visualize orientation density functions (ODFs). These are heavy in memory, as their size is IxJxKxS, where I, J, K are the dimensions of the diffusion data and S is the number of vertices in the sphere. The ODF array can easily exceed 10GB in size since S is usually large (>500). Knowing this, rather than loading the ODFs directly into memory (CPU/GPU RAM), Horizon uses their spherical harmonic representation, which fits comfortably in memory, and projects the spherical harmonics directly using vertex, geometry, and fragment shaders. This technique radically reduces memory needs by offloading the generation of sphere geometry to the GPU. For web visualization, we can use a client/server system inspired by ParaViewWeb. For WebGL applications, we can extract the scene to a vtk.js canvas. Nonetheless, if GPU servers are available, the first option is recommended because only the visual output, not the data, needs to be streamed to the client.
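The memory savings can be checked with back-of-the-envelope arithmetic. The grid and sphere sizes below are hypothetical, not taken from the abstract's data; 45 is the number of even-order spherical harmonic coefficients up to lmax = 8.

```python
import numpy as np

# Hypothetical diffusion grid and sphere sizes (illustrative only).
I, J, K = 128, 128, 60                     # spatial dimensions
S = 724                                    # vertices on the sampling sphere
lmax = 8
n_coeffs = (lmax + 1) * (lmax + 2) // 2    # 45 even-order SH coefficients

bytes_per = 4  # float32
odf_gb = I * J * K * S * bytes_per / 1e9         # dense ODF amplitudes
sh_gb = I * J * K * n_coeffs * bytes_per / 1e9   # SH coefficients only
print(f"dense ODFs: {odf_gb:.2f} GB, SH coefficients: {sh_gb:.2f} GB, "
      f"ratio ~{S / n_coeffs:.0f}x")
```

The ratio S / n_coeffs is what the GPU projection buys: the sphere geometry and the per-vertex amplitudes are regenerated in the shaders instead of being stored.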

Additionally, Horizon provides support for virtual reality. Building on VTK's support for VR rendering through the OpenVR SDK or the Oculus Windows SDK, Horizon offers the possibility to visualize your python application in a VR headset (HTC Vive / Oculus Rift / Google Cardboard) without any code modification. Our API also provides an easy way to switch interactively between a VR headset and a desktop application. Furthermore, to simplify installation, we separated the core visualization API into a library called FURY5, while the application and higher-level API are provided together with DIPY through the command `dipy_horizon`.


One of the first experiments was to load a large tractogram into Horizon, cluster it with QuickBundles4 on loading, and then interact directly with the centroids and clusters (see Fig. 2). The second experiment was to load large slices of FOD fields and slice through them (see Fig. 3). The third experiment was to test Horizon on large displays and virtual reality devices using the integrated interfaces (see Fig. 4). Here we show nested interfaces: when an object is selected, a menu appears around it, and that 3D object can in turn carry a menu of its own (see Fig. 4). Finally, we show an example of a cortical surface shader that can change the size of its outline in real time, allowing for beautiful glass-like effects (see Fig. 1).
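The clustering step of the first experiment can be sketched in a few lines. The code below is a simplified, pure-numpy variant in the spirit of QuickBundles4 (equal-length streamlines, greedy one-pass assignment to centroids), not DIPY's optimized implementation.

```python
import numpy as np

def mdf(s1, s2):
    """Minimum average direct-flip distance between equal-length streamlines."""
    direct = np.mean(np.linalg.norm(s1 - s2, axis=1))
    flipped = np.mean(np.linalg.norm(s1 - s2[::-1], axis=1))
    return min(direct, flipped)

def quickbundles(streamlines, threshold):
    """Greedy one-pass clustering in the spirit of QuickBundles."""
    centroids, clusters = [], []
    for i, s in enumerate(streamlines):
        dists = [mdf(s, c) for c in centroids]
        if dists and min(dists) < threshold:
            j = int(np.argmin(dists))
            clusters[j].append(i)
            n = len(clusters[j])
            centroids[j] = centroids[j] * (n - 1) / n + s / n  # running mean
        else:
            centroids.append(s.astype(float))
            clusters.append([i])
    return centroids, clusters

# Three straight lines: two nearly parallel, one far away.
t = np.linspace(0, 1, 10)[:, None] * [1, 0, 0]
lines = [t, t + [0, 0.1, 0], t + [0, 5, 0]]
centroids, clusters = quickbundles(lines, threshold=1.0)
print(len(clusters))  # → 2
```

Because each incoming streamline is only compared against the current centroids, the cost stays low even for full-brain tractograms, which is what makes clustering-on-load practical.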


DIPY Horizon is a high-end visualization tool designed for massive visualizations and animations of medical data. It can be useful to both clinicians and researchers. It also has the advantage of being open source and built with an easy-to-learn language such as Python, and it is therefore easy to extend to individual labs' needs. Furthermore, it provides utilities for advanced visualization experts to directly program the GPU and generate specialized effects without having to program the entire engine mechanics or worry about compatibility across platforms (different operating systems, or web vs. desktop implementations).


We presented DIPY Horizon, a highly flexible medical visualization system that we hope will change the way we interact with medical data. We have tested the system on everything from large wall displays to small mobile systems, web clients, and VR, all using the same code. We hope this ease of use will benefit many.


We would like to acknowledge NIH grant R01EB027585 and the NSF Nanobio node for their support.


[1] Garyfallidis et al. DIPY, a library for the analysis of diffusion MRI data. Frontiers in Neuroinformatics, 8:8, 2014.

[2] Rost et al. OpenGL Shading Language, Khronos Group 2009.

[3] Schroeder et al. The visualization toolkit: an object-oriented approach to 3D graphics, 2004.

[4] Garyfallidis et al., Recognition of white matter bundles using local and global streamline-based registration and clustering, Neuroimage 2017.

[5] FURY is available at https://fury.gl


Fig. 1: This figure illustrates the use of GPU shaders for visualization. This example highlights the folds of the brain surface by comparing the normal vector of each face with the direction of the camera. Only fragments that are approximately orthogonal to the camera are rendered, and the thickness of the outlines can be changed in real time (see left to right).
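The outline effect in the caption above reduces to a per-fragment dot-product test. The numpy sketch below mimics on the CPU what such a fragment shader would compute; the function and parameter names are illustrative, with `thickness` playing the role of the real-time outline control.

```python
import numpy as np

def silhouette_mask(normals, view_dir, thickness=0.3):
    """Keep fragments whose normal is nearly orthogonal to the view direction.

    `thickness` is the |dot| cutoff: raising it widens the rendered outline,
    which is the quantity the shader changes in real time.
    """
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    view_dir = np.asarray(view_dir, float)
    view_dir = view_dir / np.linalg.norm(view_dir)
    return np.abs(normals @ view_dir) < thickness

normals = np.array([[0.0, 0.0, 1.0],    # facing the camera: discarded
                    [1.0, 0.0, 0.0],    # orthogonal to the view: kept
                    [0.0, 1.0, 0.1]])   # nearly orthogonal: kept
print(silhouette_mask(normals, view_dir=[0, 0, 1]))
```

Discarding everything except the near-orthogonal fragments is what produces the glass-like look: only the silhouettes of the cortical folds survive.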

Fig. 2: Horizon allows the user to interact directly with labeled tractograms. For example, a user starts by loading a full-brain tractogram (A), which gets automatically clustered into bundles (centroids shown in B). Then, the user can expand any selected centroid (shown in C) or decide to hide the other clusters/centroids (shown in D). That process (cluster/select/hide) can be repeated multiple times to remove subclusters (see red arrow) and obtain refined results, as shown in E.

Fig. 3: Horizon uses geometry shaders to optimize FOD and ODF visualization. Moreover, if connected to a tracking algorithm, it can show the tracking process in real time, displaying the neighboring peaks as the tracker progresses. The engine can automatically zoom and fly along to show the best viewing angle, as shown from A to C.
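The per-vertex work such a geometry shader performs amounts to evaluating a spherical harmonic series to set the radius of each emitted sphere vertex. The sketch below does this on the CPU with numpy for the two simplest even-order, m = 0 terms; it is a toy stand-in for illustration, not Horizon's shader code.

```python
import numpy as np

# Real, even-order SH basis terms (m = 0 only), written out analytically,
# much as a shader would hard-code them.
def Y00(polar):
    return np.full_like(polar, 0.5 * np.sqrt(1.0 / np.pi))

def Y20(polar):
    return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(polar) ** 2 - 1.0)

def odf_radius(c00, c20, polar):
    """Radius of the deformed glyph sphere at polar angle `polar`."""
    return c00 * Y00(polar) + c20 * Y20(polar)

polar = np.linspace(0, np.pi, 9)
r = odf_radius(1.0, 0.5, polar)  # a glyph elongated along the z axis
print(r[0] > r[4])               # pole radius exceeds equator radius
```

Since only the SH coefficients travel to the GPU, each glyph's geometry is regenerated per frame from a handful of numbers, which is why dense FOD fields can be sliced through interactively.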

Fig. 4: This figure demonstrates two advanced interfaces. On the left is an orbital menu which can be attached to any 3D object. On the right is a virtual reality application which uses VTK’s OpenVR integration to render stereo image pairs on a mobile phone for use in a VR headset.

Proc. Intl. Soc. Mag. Reson. Med. 27 (2019)