This article describes a method to visualize three-dimensional fluid flow data in virtual reality. The detailed protocol, together with the shared data and scripts, demonstrates the method on a sample data set from water tunnel experiments, but it can also be applied to computational simulation results or to 3D data from other fields.
The last decade has seen a rise in both the technological capacity for data generation and the consumer availability of immersive visualization equipment, like Virtual Reality (VR). This paper outlines a method for visualizing the simulated behavior of fluids within an immersive and interactive virtual environment using an HTC Vive. This method integrates complex three-dimensional data sets as digital models within the video game engine, Unity, and allows for user interaction with these data sets using a VR headset and controllers. Custom scripts and a unique workflow have been developed to facilitate the export of processed data from Matlab and the programming of the controllers. The authors discuss the limitations of this particular protocol in its current manifestation, but also the potential for extending the process from this example to study other kinds of 3D data not limited to fluid dynamics, or for using different VR headsets or hardware.
Virtual reality (VR) is a tool that has seen increasing popularity as it provides a new platform for collaboration, education, and research. According to Zhang et al., “VR is an immersive, interactive computer-simulated environment in which the users can interact with the virtual representations of the real world through various input/output devices and sensory channels1.” Fluid dynamics is a field of physics and engineering that attempts to describe the motion of liquids or gases due to pressure gradients or applied forces. Because the study of fluid dynamics is driven by the analysis of large data sets, a significant part of its research process hinges on visualization. This visualization is a difficult task due to the potential size of the data sets and the four-dimensional (4D) character (3D in space + time) of the data. This complexity can hinder commonly used methods of visualization, which are typically constrained to flat screens, limited in scale, and primarily allow the viewing of data from a removed perspective2. To overcome some of the challenges that impede effective visualization, this paper describes a method for processing 3D fluid flow data sets for import and interaction in virtual reality. Especially in the case of fluid dynamics, VR may give the user a more intuitive understanding of data patterns and processes that could otherwise be difficult to detect when exploring the 3D data via a 2D screen. Further, VR can serve as a novel educational tool for students because it provides an opportunity to reinforce ideas that they are learning in a traditional classroom setting in a new format3,4.
To date, implementation of VR for fluid dynamics research has relied on performing computational fluid dynamics (CFD) with in-house or off-the-shelf software packages and using various techniques to import these simulations and results into VR environments. For example, Kuester et al. created simulation code that continually injected particles into a flow field5. From there, they calculated the velocity of the particles, which allowed them to visualize the particles in a VR toolkit called VirtualExplorer. The Matar research group at Imperial College London6 has developed a broader range of tools to interrogate additional quantities, such as pressure and density in the flow field, and they incorporate audio feedback as well. Still, their data are largely referenced to particle positions or trajectories. Use of VR in fluid dynamics tends to follow this pattern of calculating particle streaks or streamlines in the highly resolved CFD of flow over immersed objects, e.g., a car or a cylinder, to reveal the structure and organization of the fluid motion7,8,9.
As an alternative to the particle-based modeling techniques, many 3D fluid dynamics results are modeled as 3D isosurfaces. These models are digitally heavier than point-based systems and require significant time and a number of steps to extract from the conventional software tools. The advantage of integrating these volumetric visualizations into VR is that they enable users to virtually place themselves “inside” the measured or simulated domain. Isosurfaces in volumetric data are commonly used in computational and, increasingly, experimental results. With the increasing prevalence of tomographic and holographic particle image velocimetry experiments that acquire volumetric data sets in a single run, methods to visualize and interrogate these fields will only grow in popularity.
Some of the previous implementations of VR with fluid dynamic data have relied on software packages written in-house by the researchers, which can be limited in terms of customization and user interaction8,10, and these packages have not seen widespread adoption in other research groups since their creation. In this paper, we describe a method and share software that imports 3D experimental data generated by interpolating multiple planes of stereoscopic particle image velocimetry (PIV) measurements. Its conversion into digital objects in VR uses three popular software packages: Matlab, Blender, and Unity. A flowchart of how the data objects are processed through these software packages is shown in Figure 1.
Unity, a video game engine that is free for personal use, offers interesting opportunities to visualize, and potentially to gamify, fluid dynamic data sets, with an abundance of resources that allow for representation of objects and customization of features that other packages lack. The method begins in Matlab, processing data from the standard PLOT3D format11, computing three quantities (x-velocity u, y-velocity v, and z-vorticity ωz), and then outputting the data in the .obj file format, along with an associated .mtl file for each .obj file that contains the color information for the isosurfaces. These .obj files are then rendered and converted to the .fbx file format in Blender. This step can seem superfluous, but it avoids difficulties that the authors encountered by ensuring that the .mtl files correctly align with their associated .obj files and that the colors of the isosurfaces are coordinated properly. These .fbx files are then imported into Unity, where the data is treated as an object that can be manipulated and toggled. Finally, the VR user interface has been designed to allow the user to move the data in space and time, control visibility settings, and even create drawings of their own in and around the data without removing the VR headset.
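As a concrete illustration of the .obj/.mtl handoff described above, the following Python sketch writes one minimal triangulated isosurface as a Wavefront .obj that references a companion .mtl file holding the surface color. The file names, geometry, and color value are illustrative assumptions, not the exact output of the authors' shared Matlab scripts.

```python
# Minimal sketch of the .obj/.mtl pairing described above: an .obj holds the
# surface geometry and points to an .mtl that carries its color. Names and
# values are illustrative, not the authors' actual output.

def write_obj_mtl(obj_path, mtl_path, vertices, faces, rgb):
    mtl_name = "isosurface_color"
    with open(mtl_path, "w") as m:
        m.write(f"newmtl {mtl_name}\n")
        # Kd is the diffuse color; downstream tools read it when importing the .obj
        m.write("Kd {:.3f} {:.3f} {:.3f}\n".format(*rgb))
    with open(obj_path, "w") as o:
        o.write(f"mtllib {mtl_path}\n")
        for x, y, z in vertices:
            o.write(f"v {x} {y} {z}\n")
        o.write(f"usemtl {mtl_name}\n")
        for a, b, c in faces:
            # .obj face indices are 1-based
            o.write(f"f {a + 1} {b + 1} {c + 1}\n")

# One triangle, colored blue (e.g., negative spanwise vorticity)
write_obj_mtl("Q_t001.obj", "Q_t001.mtl",
              vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)],
              faces=[(0, 1, 2)],
              rgb=(0.0, 0.0, 1.0))
```

Keeping each .obj next to a matching .mtl is precisely the pairing that the Blender pass then verifies before exporting to .fbx.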
Figure 1: Process flow chart.
Data set
The data set used to demonstrate the process of 3D data visualization in virtual reality is from water tunnel experiments that measured the wake of a simplified fish caudal fin geometry and kinematics12. This work entailed experiments using equipment and methods similar to those described by King et al.13, in which a generic caudal fin was modeled as a rigid, trapezoidal panel that pitched about its leading edge. Stereoscopic particle image velocimetry was used to measure the three-component velocity fields in the wakes produced by the fin model, and these experiments were performed in a recirculating water tunnel located at the Syracuse Center of Excellence for Environmental and Energy Systems. Additional details about the experimental data acquisition can be found in Brooks & Green12 or King et al.13.
The 3D surfaces used to visualize the structure of the fluid motion are first rendered in Matlab. The structure and organization of the wake is visualized using isosurfaces of the Q-criterion. The Q-criterion, also referred to as Q, is a scalar calculated at every point in 3D space from the three-component velocity fields, and is commonly used for vortex identification. These surfaces are then colored using spanwise vorticity (ωz) to highlight and distinguish the large-scale structures of interest, mainly oriented in the spanwise (z) direction. Other quantities of interest include the streamwise (x-direction) velocity (u) and the transverse (y-direction) velocity (v). Visualization of isosurfaces of these quantities can often give context as to how the fluid has been accelerated by the motion of the fin model.
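The Q-criterion calculation described above can be sketched as follows. This is the standard definition, Q = ½(‖Ω‖² − ‖S‖²), where S and Ω are the symmetric and antisymmetric parts of the velocity-gradient tensor; it is evaluated here at a single point from an assumed gradient tensor rather than across the authors' PLOT3D fields.

```python
# Sketch of the Q-criterion described above, evaluated at one point from the
# local velocity-gradient tensor J, where J[i][j] = du_i/dx_j. In the actual
# protocol this scalar is computed at every grid point; the tensors below are
# illustrative, not the authors' data.

def q_criterion(J):
    """Q = 0.5 * (||Omega||_F^2 - ||S||_F^2): positive where rotation
    dominates strain, hence its use for vortex identification."""
    q = 0.0
    for i in range(3):
        for j in range(3):
            S = 0.5 * (J[i][j] + J[j][i])   # strain-rate tensor (symmetric part)
            W = 0.5 * (J[i][j] - J[j][i])   # rotation tensor (antisymmetric part)
            q += 0.5 * (W * W - S * S)
    return q

# Solid-body rotation about z (u = -y, v = x): rotation dominates, Q > 0
J_rot = [[0.0, -1.0, 0.0],
         [1.0,  0.0, 0.0],
         [0.0,  0.0, 0.0]]
# Pure shear (u = y): rotation and strain balance, Q = 0
J_shear = [[0.0, 1.0, 0.0],
           [0.0, 0.0, 0.0],
           [0.0, 0.0, 0.0]]
print(q_criterion(J_rot), q_criterion(J_shear))  # prints: 1.0 0.0
```

Surfaces of constant positive Q extracted from such a field are the vortex tubes shown in Figure 2A.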
Example isosurfaces for each of these quantities, taken at the same instance of time, are shown in Figure 2. Isosurfaces of Q are shown in Figure 2A. The trapezoidal fin model is shown in black. The isosurface level was chosen to be 1% of the maximum value, which is commonly used in 3D vortex visualizations to reveal structures while avoiding noise that can be prevalent at lower levels. Blue coloration indicates negative rotation of the vortex tube, or clockwise rotation (when viewed from above).
Figure 2: Example images from visualization software FieldView. (A) Q-criterion isosurfaces, shaded from red to blue by spanwise (z) vorticity; (B) x-direction velocity (u) isosurfaces; and (C) y-direction velocity (v) isosurfaces.
Red indicates positive rotation of that vortex tube, or counter-clockwise rotation. Green indicates rotation in a direction other than along the z-axis. In Figure 2B, blue isosurfaces indicate streamwise velocity that is 10% lower than the freestream value, and orange indicates streamwise velocity that is 10% higher than the freestream value. In Figure 2C, red isosurfaces indicate transverse, or cross-stream, velocity with a magnitude of 10% of the freestream value and negative (out of the page), and green indicates transverse velocity with a magnitude of 10% of the freestream value and positive (into the page). These velocity isosurfaces have been used to show how the vortex ring structure is consistent with the periodic acceleration and deceleration of the fluid due to the motion of the fin model.
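The isosurface levels quoted above (1% of the maximum Q for vortex identification, ±10% of the freestream speed for the velocity surfaces) can be expressed as a small helper. The sample inputs are illustrative values, not the experimental ones.

```python
# Sketch of the isosurface-level choices described above. The numeric inputs
# (q_max, u_freestream) are illustrative, not the measured values.

def isosurface_levels(q_max, u_freestream):
    q_level = 0.01 * q_max              # 1% of max Q, as in Figure 2A
    u_levels = (0.9 * u_freestream,     # 10% below freestream (decelerated, blue)
                1.1 * u_freestream)     # 10% above freestream (accelerated, orange)
    v_levels = (-0.1 * u_freestream,    # transverse, out of the page (red)
                 0.1 * u_freestream)    # transverse, into the page (green)
    return q_level, u_levels, v_levels

print(isosurface_levels(q_max=500.0, u_freestream=0.2))
```

Tying the levels to the maximum Q and the freestream speed, rather than to fixed numbers, lets the same thresholds carry over when the data set or flow speed changes.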
In the following protocol, we detail the steps for converting these isosurface visualizations into surface objects in 3D space that can be processed and imported into Unity for visualization and interaction in virtual reality.
1. Processing data files and exporting objects from Matlab
NOTE: The following steps will detail how to open the shared Matlab processing codes, which open PLOT3D data files and generate the isosurfaces as described. The necessary .obj and .mtl files are then also constructed and exported from Matlab.
2. Rendering surfaces in Blender
NOTE: In this section, we detail the steps for importing the output object data from Matlab into Blender, where the objects are scaled and then exported in a file format that is compatible with Unity. The shared scripts execute these tasks.
3. Configuring Unity, the SteamVR Plugin, and VRTK
NOTE: These steps will ensure that the necessary software is present and configured correctly to import the data and generate the related games. Please note the software versions recommended in these steps, as we cannot guarantee that the shared scripts will work with updated versions of these particular applications.
4. Creating the data visualization game in Unity
NOTE: After successfully configuring Unity in step 3, these steps will use the provided scripts to build the game that can then be saved as a standalone executable launch file.
Using the method described above, the results are created from the example dataset, which was split into 24 time steps and saved in the PLOT3D standard data format. Since 3 quantities were saved, 72 .obj files were created in Matlab and converted into 72 .fbx files in Blender and imported into Unity. The Matlab code can be adjusted according to the number of time steps of the user’s data by changing the number of times that Matlab iterates through the main for loop, which can be found at the very beginning of the s...
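The batch bookkeeping described above (24 time steps × 3 quantities = 72 files per stage of the pipeline) can be sketched as a simple loop. The naming convention below is an illustrative assumption; the shared Matlab scripts define the actual file names.

```python
# Sketch of the file-count arithmetic described above: one output file per
# quantity per time step, so 24 x 3 = 72 files at each pipeline stage.
# The naming scheme here is hypothetical.

N_TIMESTEPS = 24
QUANTITIES = ["Q", "u", "v"]  # Q-criterion, streamwise and transverse velocity

filenames = [f"{q}_t{t:03d}.obj"
             for t in range(1, N_TIMESTEPS + 1)
             for q in QUANTITIES]
print(len(filenames), filenames[0], filenames[-1])  # prints: 72 Q_t001.obj v_t024.obj
```

Adjusting N_TIMESTEPS here plays the same role as changing the iteration count of the main for loop in the Matlab code when a user's data has a different number of time steps.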
In this article, we described a method for representing fluid dynamics data in VR. This method is designed to start with data in the format of a PLOT3D structured grid, and therefore can be used to represent analytic, computational, or experimental data. The protocol describes the necessary steps to implement the current method, which creates a VR “scene” that allows the user to move and scale the data objects with the use of hand movements, to cycle through time steps and quantities, and to create drawings a...
The authors have nothing to disclose.
This work was supported by the Office of Naval Research under ONR Award No. N00014-17-1-2759. The authors also wish to thank the Syracuse Center of Excellence for Environmental and Energy Systems for providing funds used towards the purchase of lasers and related equipment as well as the HTC Vive hardware. Some images were created using FieldView as provided by Intelligent Light through its University Partners Program. Lastly, we would like to thank Dr. Minghao Rostami of the Math Department at Syracuse University, for her help in testing and refining the written protocol.
Name | Company | Catalog Number | Comments
Base station | HTC and Valve Corporation | | Used to track the user's position in the play area.
Base station | HTC and Valve Corporation | | Used to track the user's position in the play area.
Link box | HTC and Valve Corporation | | Used to connect the headset with the computer being used.
Vive controller | HTC and Valve Corporation | | One of the controllers that can be used.
Vive controller | HTC and Valve Corporation | | One of the controllers that can be used.
Vive headset | HTC and Valve Corporation | | Headset the user wears.
Copyright © 2025 MyJoVE Corporation. All rights reserved