New technology at Tufts University's Center for Scientific Visualization is enabling researchers to translate the most abstract, complex scientific concepts into clearer, more precise 3-dimensional images than conventional visualization systems can create.
Funded by a $350,000 grant from the National Science Foundation, Tufts' new 14-foot by 8-foot visualization display offers a combination of advanced features found nowhere else in New England and in only a few other installations in the country. Its application will further Tufts' research and educational programs in diverse disciplines, from mathematics and physics to human factors engineering, and even drama and dance.
Brain's Untapped Capacity for Visuals
"Users will be able to manipulate, simulate, touch and literally immerse themselves in data in a way they never have been able to before," said Amelia Tynan, vice president and chief information officer and co-principal investigator on the grant.
Visualization is built on the age-old premise -- borne out by modern cognitive science -- that pictures say as much as, or even more than, words.
The human brain has a powerful, often underutilized capacity to process visuals, noted Robert Jacob, computer science professor and co-principal investigator on the project. A large portion of the brain processes visuals, and visualization technology puts that ability to work. "The brain absorbs a lot more information when it's presented in pictures rather than in stacks of data from a computer," Jacob said. This, he said, enables researchers and students to recognize things more quickly and also develop insights about what's going on with the data.
Unusual Combination of Technologies
While visualization is widely used in science, Tufts' "VisWall" offers unusually robust capabilities by combining advanced features not typically found together.
Housed at Tufts' School of Engineering but available to the entire university, the seamless wall features a high-resolution display system that uses rear projection to enhance the amount of visible detail. Most visualization systems use several projectors at once or multiple, tiled screens to display images. Tufts' system uses just a single screen with close to 9 megapixels of resolution (4,096 x 2,169 pixels) and two projectors (with overlapping fields of projection) to create high-resolution images and animation.
By using a single screen and two projectors, Tufts is able to produce ultra-high-resolution images -- including 3-D images -- that appear smooth and seamless. Images projected at a higher resolution reveal fine details that would be imperceptible on a screen with fewer pixels or on tiled images. The VisWall's projectors are equipped with Infitec filters to minimize ghosting, in which one image appears to include elements of another. Ghosting is a common drawback of conventional polarized filters.
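For a sense of scale, the quoted figures work out as follows; the comparison to a conventional 1080p projector is an illustrative baseline, not a detail from the article.

```python
# Quick arithmetic check of the resolution figures quoted above.
width_px, height_px = 4096, 2169
viswall_pixels = width_px * height_px   # roughly 8.9 million pixels
hd_pixels = 1920 * 1080                 # a conventional 1080p display (illustrative baseline)

print(f"VisWall: {viswall_pixels / 1e6:.1f} megapixels")
print(f"About {viswall_pixels / hd_pixels:.1f}x the pixels of a single 1080p display")
```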
In addition, the Tufts system can combine the sense of touch with that of sight through haptic devices that convey varying levels of resistance when the user touches graphical objects on the display wall. This also allows Tufts researchers to create virtual environments, such as a model of the human body for surgical simulations, that can be physically manipulated and transformed.
Order in Chaos
Tufts faculty have already discovered applications of the new technology. Mathematics Professor Boris Hasselblatt made a surprising find while viewing a mathematical model of butterfly populations as they fluctuated through successive generations. The model, used for research in dynamical systems theory, is based on a simple formula and is well-known to anyone familiar with chaos theory.
Visualizing the large population dataset with the 14-foot-wide, high-resolution graphical display enabled Hasselblatt to detect anomalies impossible to perceive with conventional displays: subtle traces of curving lines that he said indicated irregularities in the population's variations. The lines extended over different areas of the model and then converged at one distinct point.
Hasselblatt has looked at smaller images of this classic model many times during the last 20 years but had never recognized this convergence. He has not yet determined the implications of this discovery, but he said the pattern reflects order in what mathematicians have always thought to be a progression of chaotic cycles. "The pattern is so subtle that it's imperceptible but in this rendition the resolution is fine enough that I can easily see it," he said.
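The article does not name the formula behind the model. The classic single-formula population model in chaos theory is the logistic map, so the sketch below assumes that model purely for illustration of how one generation's population determines the next.

```python
# Minimal sketch (assumption): the logistic map x_{n+1} = r * x_n * (1 - x_n),
# the best-known simple formula for population fluctuations in chaos theory.
# The article does not confirm that this is the exact model Hasselblatt viewed.
def logistic_orbit(r, x0, n_generations):
    """Return successive population fractions under the logistic map."""
    xs = [x0]
    for _ in range(n_generations):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# In the chaotic regime (r near 4), fine structure in the orbit is easy to miss
# at low resolution -- the kind of subtle detail a 9-megapixel display can expose.
orbit = logistic_orbit(r=3.9, x0=0.2, n_generations=1000)
print(orbit[:5])
```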
Bruce Boghosian, chairman of the mathematics department at Tufts and principal investigator on the NSF grant, said that the VisWall will benefit his study of fluid dynamics. Visualization capabilities can help him and his fellow researchers better understand fluid flow.
"You can go right up to streamlines in a fluid or dig into a reservoir and see which way it's flowing," said Boghosian. "That's the direction we would like to move in. You can imagine all kinds of other uses for something like that."
Virtual Surgery
The VisWall will also aid Mechanical Engineering Assistant Professor Caroline Cao. Her goal is to develop more robust laparoscopic surgical training systems in which 3-D computer simulations enable surgeons in training to feel as well as see.
She and her team, including senior Kyle Maxwell, have already developed software that enables users to remove a "tumor" during a simulated procedure. With the haptic device, these virtual surgeons receive force feedback when touching a hard surface, such as a tumor or bone, or a soft, deformable surface, such as tissue. The reaction is determined by the parameters provided by the model, which is based on real material properties.
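The article does not describe the force model itself. A common approach in haptic rendering is a penalty (spring) force whose stiffness comes from the material being touched, so a rigid tumor or bone pushes back harder than soft, deformable tissue; the sketch below assumes that approach, with made-up stiffness values.

```python
# Minimal sketch (assumption): penalty-based force feedback, not Cao's actual software.
MATERIAL_STIFFNESS_N_PER_M = {  # illustrative values, not from the article
    "bone": 5000.0,
    "tumor": 2000.0,
    "tissue": 300.0,
}

def feedback_force(material: str, penetration_depth_m: float) -> float:
    """Force (newtons) sent to the haptic device when the tool presses into a surface."""
    k = MATERIAL_STIFFNESS_N_PER_M[material]
    return k * max(penetration_depth_m, 0.0)

print(feedback_force("bone", 0.002))    # stiff response from a hard surface
print(feedback_force("tissue", 0.002))  # compliant response from soft tissue
```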
Cao, who is director of the human factors program in the School of Engineering, said she wants to develop more anatomical features in the models. She also hopes to develop software that will simulate more complicated virtual procedures like heart surgery and colonoscopy. The VisWall's size, resolution and 3-D capability will greatly help in her work.
"Imagine the difference between simulating a virtual environment on a computer screen and one on a visualization wall -- the difference is tremendous," she said. "That's what large-scale visualization gives us, a capacity to create a richer immersion experience."
From Particle Physics to the "Lord of the Rings"
Similar benefits could be gained by physicist Austin Napier. His work in high energy physics relies on the ability to process huge streams of data from organizations like Switzerland's CERN, the world's largest particle physics laboratory. Tufts' VisWall will enable him to visualize on a single display what would otherwise require multiple computers.
Tynan said she expects the VisWall to become a resource for the broad range of academic disciplines at Tufts. She envisions scientists and engineers collaborating with faculty from the arts or humanities.
Boghosian brings up the example of the character Gollum in the "Lord of the Rings." Actor Andy Serkis' movements were tracked and translated to the digital rendering of the creature in the film. Similar technology is now available through the VisWall, which goes beyond traditional 3-D rendering to create a true virtual reality environment.
"Imagine taking the ability to do something like that and applying it to drama and dance," Boghosian mused. "Imagine taking the ability to do something like that and trying to use it for facial recognition or occupational therapy or many other fields. We haven't really even begun to explore those kinds of things yet."
Source: Tufts University