
SciDAC Accelerator Modeling: Visualization of Large Particle Data Sets


Introduction

Figure 1. Beam-beam collision simulation.

Simulation studies of beam dynamics using large numbers of particles produce extremely large six-dimensional phase space data sets. It is desirable to store such enormous datasets efficiently and also to facilitate the sharing of data between different groups that work on particle-based accelerator simulations. A standard data format also makes it easy to develop data analysis and visualization tools that do not need to be adapted to each group's particular way of writing the data.

This work presents a simple HDF5 (Hierarchical Data Format) file schema, as well as an API that simplifies reading and writing files with the HDF5 library. We also describe ongoing work to build a set of visualization and analysis tools based on this schema. See [1] for more details about the HDF5 format.

A portable high performance parallel data interface for particle simulations

Motivation

The motivation for this work is to produce a file format that is suitable for large-scale particle simulations. The requirements are the following: it must be machine independent, self-describing, easily extensible, language independent, and efficient in both serial and parallel use, and it must produce files that can be seamlessly shared by different programs.

H5Part API

The proposed file storage format uses HDF5 for the low-level file storage and a simple API that provides a high-level interface. For a detailed description of HDF5, its benefits, and its file organization, see [2]. We adopted HDF5 for our file storage because it meets all of the requirements listed above.

In order to store particle data in the HDF5 file format, we formalized the hierarchical arrangement of the datasets and the naming conventions for groups, datasets, and attributes. The H5Part API encodes these conventions in order to provide easy reading and writing of files from C, C++, and Fortran. It also makes it easy to write adaptors for visualization and analysis tools. All of the standard HDF5 utilities still work on these files.
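For orientation, the sketch below shows roughly how a file with two steps and six particle fields is laid out. The per-step group naming (Step#0, Step#1, ...) follows the H5Part convention; the field names are simply the ones used in this work, and any number of steps and fields may be stored.

    particles.h5
    |-- Step#0                          (one group per stored time step)
    |   |-- x  px  y  py  z  pz         (one dataset per particle field)
    |-- Step#1
    |   |-- x  px  y  py  z  pz
    |-- ...
    (attributes such as units, simulation parameters, or code revision
     information can be attached to the file, to a step group, or to an
     individual field dataset)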

Figure 2. A common self-describing data format makes it easier to share visualization and analysis tools. In this figure, PartView and AVS/Express read particle data on any platform.


The file format supports the storage of multiple steps of datasets that contain multiple fields. The fields correspond to the different properties of a particle, for instance X, PX, Y, PY, Z, PZ, the spatial coordinates and phase-space vector components. The fields can be either integer or real data types. For the moment, to simplify the requirements for the file readers, we use 64-bit integers and double-precision floats; HDF5 is capable of automatically down-converting to 32-bit data types upon request. Finally, the file, the individual steps, and the individual data fields can carry attributes, for instance units, simulation parameters, or code revision information.
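As a concrete illustration, the following minimal C sketch writes one step of particle data through the H5Part API. The call names are those of the H5Part C interface described in [2]; exact signatures may vary slightly between H5Part versions, and the field names are just the ones used above.

    /* Minimal sketch: write one step of particle data with the H5Part C API [2].
       Call names follow the H5Part C interface; field names match the
       conventions described above. */
    #include <stdlib.h>
    #include <H5Part.h>

    int main(void)
    {
        const h5part_int64_t n = 1024;          /* number of particles to write */
        double *x  = malloc(n * sizeof(double));
        double *px = malloc(n * sizeof(double));
        /* ... fill x, px (and y, py, z, pz) from the simulation ... */

        H5PartFile *file = H5PartOpenFile("particles.h5", H5PART_WRITE);

        H5PartSetStep(file, 0);                 /* select time step 0 */
        H5PartSetNumParticles(file, n);         /* particles written by this task */
        H5PartWriteDataFloat64(file, "x",  x);  /* one dataset per field */
        H5PartWriteDataFloat64(file, "px", px);
        /* ... likewise for y, py, z and pz ... */

        H5PartCloseFile(file);
        free(x);
        free(px);
        return 0;
    }

In the parallel case the file is opened with an MPI communicator instead (H5PartOpenFileParallel) and each task writes only the particles it owns, so that the API assembles a single shared file.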

H5Part is currently used in mad9p (Methodical Accelerator Design, version 9, parallel) [3] through the C++ API, and we are working on integrating it into BeamBeam3D [4] through the Fortran API.

Performance
Preliminary performance measurements, looking at global (GD) and local (LD) data rates, suggest that our HDF5 writing performs very well even with respect to raw MPI-IO, as shown in Table 1.
Table 1: Write performance (global and local data rates)
Mode                       GD (MB/s)   LD (MB/s)
MPI-IO (one file)              241         3.7
One file per processor        1288        20
H5Part/pHDF5 (one file)        773        12

PartView

PartView [5] is a tool designed to provide a straightforward interface that covers the basic 3D analysis and inspection tasks accelerator modeling researchers need in order to understand their datasets. This design follows recommendations from our users, who indicated a preference for a simpler user interface customized to their problem over more powerful but more complex general-purpose visualization tools. See below for a more flexible but more complex application developed using AVS/Express.

As the particle simulations scale up rapidly and may soon exceed the ability of serial tools like Express, there is a need for lightweight front-end applications that can tap into remote I/O and rendering services located at the site where the datasets are produced. The same architecture can be used to connect to running simulations and inspect the data as it is written.

In response to these needs we created PartView. This tool allows the user to project the six-dimensional data onto 3D space by selecting which dimensions are represented as particle coordinates and which ones as particle attributes. It also allows browsing of a time series. Figure 3 shows the PartView GUI with the colormap editor (solid and Gaussian modes).
Figure 3. PartView application.
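The projection step itself is simple: three user-selected fields become the 3D coordinates and another becomes a per-particle scalar. The C sketch below shows one way this can be done with the H5Part read calls; the particular field choices (x, y, pz as coordinates, px as the color scalar) are arbitrary examples standing in for whatever the user picks in the GUI.

    /* Sketch of the projection step using the H5Part C read API [2]:
       read three user-selected fields as 3D coordinates and one more as a
       scalar attribute (e.g. for color mapping). Field choices are examples. */
    #include <stdlib.h>
    #include <H5Part.h>

    int main(void)
    {
        H5PartFile *file = H5PartOpenFile("particles.h5", H5PART_READ);

        H5PartSetStep(file, 0);                          /* step to browse */
        h5part_int64_t n = H5PartGetNumParticles(file);  /* particles in this step */

        double *cx    = malloc(n * sizeof(double));
        double *cy    = malloc(n * sizeof(double));
        double *cz    = malloc(n * sizeof(double));
        double *color = malloc(n * sizeof(double));

        H5PartReadDataFloat64(file, "x",  cx);    /* coordinate 1 */
        H5PartReadDataFloat64(file, "y",  cy);    /* coordinate 2 */
        H5PartReadDataFloat64(file, "pz", cz);    /* coordinate 3 (a momentum) */
        H5PartReadDataFloat64(file, "px", color); /* scalar mapped to color */

        /* ... hand (cx, cy, cz, color) to the OpenGL point renderer,
           then advance H5PartSetStep to browse the time series ... */

        H5PartCloseFile(file);
        free(cx); free(cy); free(cz); free(color);
        return 0;
    }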

PartView was constructed using FLTK [6], OpenGL [7], HDF5 [1], H5Part [2], and libssh [8].

3D Visualization using AVS/Express

Once the data has been inspected and in some cases subsampled, AVS/Express offers a good environment for building visualization applications that require a level of complexity that would take too long to implement using a graphics API such as OpenGL. Express provides an extensive library of visualization, interactivity, GUI, and I/O modules that allows for fast prototyping of visualization applications. It also provides C, C++, and Fortran APIs for writing custom modules.

We wrote a data reader for Express using the H5Part API. Using the dataset names, it is easy to choose which of the datasets are used as coordinates and which ones as particle attributes. Figure 4 shows the Express H5Reader application. Figure 5 shows the time evolution of a proton beam projected onto the X-Z plane using this application.
Figure 4. AVS/Express H5Reader application.
Figure 5. Time evolution (MPEG (~300M)) of a proton beam simulation using the mad9p code.

Discussion and Next Steps

As computer simulations become indispensable in the design and testing of particle accelerators, the high dimensionality of the problem and the size of the simulations require the design of clever visualizations to help understand the data. Our primary focus is to increase the capability of PartView, which is targeted at simulations that generate 1 billion particles or more.

The file format will be extended in the near future to integrate fast bitmap indexing technology [9] in order to extract subsets of the data using compound query expressions such as (velocity > 1e06) AND (0.4 < phase < 1.0).
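For illustration only, the C sketch below shows the kind of compound range query such an index would serve, evaluated here by a plain linear scan over two hypothetical per-particle fields named velocity and phase; the bitmap index of [9] would return the qualifying particle indices without scanning every element.

    /* Illustration: the compound range query the planned bitmap indexing [9]
       is meant to accelerate, answered here by a brute-force scan over two
       hypothetical fields. Returns the number of hits; their indices go in hits[]. */
    #include <stddef.h>

    static size_t select_particles(const double *velocity, const double *phase,
                                   size_t n, size_t *hits)
    {
        size_t count = 0;
        for (size_t i = 0; i < n; i++) {
            if (velocity[i] > 1e6 && phase[i] > 0.4 && phase[i] < 1.0)
                hits[count++] = i;   /* particle i satisfies both conditions */
        }
        return count;
    }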

References

[1] HDF5 Home page, http://www.ncsa.uiuc.edu/HDF5
[2] H5Part: A Portable High Performance Parallel Data Interface for Particle Simulations, Andreas Adelmann (PSI, Villigen), Robert Douglas Ryne, John M. Shalf, Cristina Siegerist (LBNL, Berkeley, California), Particle Accelerator Conference, Knoxville, Tennessee, 2005.
[3] 3D Simulations of Space Charge Effects in Particle Beams, Andreas Adelmann, Diss., Mathematische Wissenschaften, ETH Zürich, Nr. 14545, 2002.
[4] J. Qiang, M. A. Furman, R. D. Ryne, A parallel particle-in-cell model for beam-beam interaction in high energy ring colliders, J. Computational Physics 198, p. 278, 2004.
[5] From Visualization to data mining with large data sets, Andreas Adelmann (PSI, Villigen), Robert Douglas Ryne, John M. Shalf, Cristina Siegerist (LBNL, Berkeley, California), Particle Accelerator Conference, Knoxville, Tennessee, 2005.
[6] Fast Light Toolkit, http://www.fltk.org
[7] OpenGL, http://www.opengl.org
[8] libssh, http://osx.freshmeat.net/projects/libssh/
[9] K. Stockinger, J. Shalf, W. Bethel, K. Wu, "DEX: Increasing the Capability of Scientific Data Analysis Pipelines by Using Efficient Bitmap Indices to Accelerate Scientific Visualization", Scientific and Statistical Database Management Conference (SSDBM), 2005.