University of Tübingen
Department of Computer Science
D-72076 Tübingen
The size of datasets in scientific computing is increasing rapidly. This growth is driven by the boost in processing power over the past years, which in turn has been invested in larger and more accurate models. A similar trend has enabled significant improvements in medical scanners: in daily practice, modern scanners generate more than 1000 slices at a resolution of 512 × 512. In computer-aided engineering, too, typical models easily contain several million polygons. Unfortunately, data complexity is growing faster than the rendering performance of modern computer systems. This is due not only to the slower growth of graphics performance in the graphics subsystems, but in particular to the significantly slower growth of the memory bandwidth available for transferring geometry and image data from main memory to the graphics accelerator.
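To make the scale of such a scanner dataset concrete, a back-of-the-envelope calculation can be sketched as follows; the 2-byte voxel depth is an assumption (a common choice for CT intensity data), not stated in the text:

```python
# Rough size of a CT dataset with the dimensions quoted above.
# Assumption: 16-bit (2-byte) voxels, a typical depth for scanner data.
slices = 1000
width = height = 512
bytes_per_voxel = 2

voxels = slices * width * height
size_bytes = voxels * bytes_per_voxel
size_mib = size_bytes / 2**20  # mebibytes

print(f"{voxels:,} voxels -> {size_mib:.0f} MiB")
```

Half a gigabyte for a single scan already strains the memory bandwidth between main memory and the graphics accelerator, which is precisely the bottleneck discussed above.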
In this talk, I will focus on large-model visualization techniques that address this growing divide between data complexity and rendering performance. I will also discuss several application fields that have to deal with large models.
Recommended reading and other resources: