By: Roger Hoang & Michael Penick

Wildfires pose a considerable challenge to researchers. Understanding their destructive behavior may be key to mitigating the damage they cause; however, the costs and risks associated with purposefully setting an area ablaze make doing so impractical. Additionally, as controlling the spread of an unexpected wildfire is a primary concern, experimenting with new suppression techniques at such events is a risky proposition. In response to these problems, researchers have developed mathematical models using data collected from unplanned fires. In the past, visualizations of these fire models have been bound to the realm of two-dimensional images. Unfortunately, it is difficult to identify critical factors that dictate wildfire behavior in two dimensions; slope, for example, is difficult to determine from a two-dimensional image.

VFire is a virtual reality wildfire visualization tool that addresses this problem. Researchers will be able to use VFire to view a simulation from multiple perspectives, and to verify and refine fire models. Additionally, VFire will be used to provide training scenarios and to inform land managers of the benefits of preventative measures. Our efforts have been focused on increasing the amount of visualized data, enhancing the visual presentation, and improving user interactivity. We utilize several features of modern graphics hardware, including programmable shaders, to speed up not only the rendering but also the simulation updating. Finally, VFire is designed to be implemented in a multi-display environment such as our four-screen CAVE (Cave Automatic Virtual Environment).

Due to the size of the terrain needed for wildfire analysis, VFire uses a level-of-detail (LOD) rendering system. Processed segments of terrain are loaded as needed; to further facilitate rendering, the terrain geometry is stored in vertex buffer objects directly in video memory, with vertex morphing done in a vertex shader. Terrain scorching effects are also implemented.
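The vertex morphing used for LOD transitions can be sketched as follows: each fine-LOD vertex also carries the height it would have in the coarser LOD, and a morph factor derived from camera distance blends between the two so LOD changes do not pop. This is a minimal C++ sketch of that blend; the structure, names, and thresholds are illustrative assumptions, not VFire's actual shader code.

```cpp
#include <algorithm>
#include <cmath>

// Geomorphing sketch: a vertex stores both its fine-LOD height and the
// height the coarser LOD would assign it. A camera-distance-based morph
// factor interpolates between them. (Hypothetical names and values; the
// real system performs this blend per-vertex in a vertex shader.)
struct MorphVertex {
    float x, z;          // horizontal position on the terrain
    float fineHeight;    // height at the current (finer) LOD
    float coarseHeight;  // height the coarser LOD would use
};

// Map camera distance into [0, 1]: 0 = fully fine, 1 = fully coarse.
float morphFactor(float camDist, float morphStart, float morphEnd) {
    float t = (camDist - morphStart) / (morphEnd - morphStart);
    return std::clamp(t, 0.0f, 1.0f);
}

// The shader-side blend: linearly interpolate the two heights.
float morphedHeight(const MorphVertex& v, float camDist,
                    float morphStart, float morphEnd) {
    float t = morphFactor(camDist, morphStart, morphEnd);
    return v.fineHeight + t * (v.coarseHeight - v.fineHeight);
}
```

Storing both heights in the vertex buffer object keeps the morph entirely on the GPU, so no terrain geometry needs to be re-uploaded as the camera moves.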
Fire and smoke are implemented using particle systems. All particles are rendered as cylindrical billboards, the tops of which are skewed according to spread direction and rate data obtained from FARSITE and retrieved in a vertex shader.

VFire currently allows the user to interact with the visualization through the use of a tracked wand. The user may fly about the virtual world or constrain movement to follow the terrain; additionally, the user may drop teleportation markers at any time. Lighting may also be modified at any time. Time in a simulation may be freely controlled, allowing the user to hasten, slow, stop, or reverse the simulation.

Improving the accuracy of the visualization will continue to be the main objective. We plan to modify SpeedGrass in a way similar to SpeedTreeRT to allow its use in a CAVE. We are developing ways to incorporate more data into the visualization in order to represent variables such as fireline intensity and wind. A physically based smoke model will be integral for atmospheric analysis. In order to more closely replicate the real world, we are also working on the ability to extract vegetation and objects from satellite images in conjunction with other pieces of data using image processing techniques. For increased interactivity, we plan to integrate FARSITE directly into VFire in order to allow users to experiment with various parameters and increase VFire's utility as a training tool.
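The billboard skew mentioned earlier, where flame particles lean in the direction the fire front is moving, can be sketched as a simple offset applied to a quad's top vertices. This C++ fragment is an illustrative assumption about the form of that computation; the function and parameter names are hypothetical, and in VFire the equivalent work is done per-vertex in a vertex shader using FARSITE data.

```cpp
// Cylindrical-billboard skew sketch: the top edge of a flame quad is
// displaced in the fire's spread direction, scaled by the spread rate,
// while the bottom vertices stay anchored to the ground. (Illustrative
// only; not VFire's actual shader code.)
struct Vec2 { float x, y; };

// spreadDir: unit vector in the ground plane from the fire model.
// spreadRate: rate of spread from FARSITE.
// skewScale: hypothetical tuning constant converting rate to a
//            world-space lean distance.
Vec2 topVertexOffset(Vec2 spreadDir, float spreadRate, float skewScale) {
    return { spreadDir.x * spreadRate * skewScale,
             spreadDir.y * spreadRate * skewScale };
}
```

Skewing only the top of the billboard keeps the particle rooted at its simulation cell while visually conveying both the direction and the relative speed of the advancing fire front.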