How can we provide users with an accurate impression of the terrain and sky when they experience in-flight simulation, or a detailed view of various decorative materials such as wood or metal when they explore virtual rooms? Assistant Professor of Computer Science Shuang Zhao has received a National Science Foundation (NSF) award of $500,000 to address these questions.
Zhao’s grant, “Predictive Material Appearance Modeling at Multiple Scales,” aims to overcome the current obstacle of trying to work across multiple scales when building highly immersive virtual realities. “The goal is to develop new computational tools capable of predictively reproducing material appearance at greatly varying physical scales,” he explains. “Successful development of these tools will bring high-fidelity materials with fine-grained details — fabrics or animal fur, for example — into computer-simulated virtual realities.”
To maximize the impact of the work, Zhao will release the entire software architecture as open source, making it readily available to designers, retailers, developers, educators, artists and students. Furthermore, he will present the findings at workshops and high-profile conference tutorials and will leverage the new appearance-modeling techniques to develop pedagogical tools (using VR, for example) “for outreach to high-school students (especially minorities) to foster interest in STEM.”
— Shani Murray