This is an archival copy of the Visualization Group's web page from 1998 to 2017. For current information, please visit our group's new web page.

Visualization of Particle-in-Cell Simulation of Laser Wakefield Particle Acceleration

Introduction

Plasmas are not subject to the electrical breakdown that limits conventional particle accelerators, and Laser Wakefield Accelerators (LWFAs) have demonstrated accelerating gradients thousands of times those obtained in conventional machines, using the electric field of a plasma wave (the wakefield) driven by the radiation pressure of an intense laser. Plasma-based accelerators hence offer a path to more compact machines for high energy physics, and to high-current, ultrashort electron bunches that may revolutionize the use of accelerators in radiation sources, chemistry, and biology. The plasma interaction in this regime is fully nonlinear, and particle distribution effects are important, making simulation essential but extremely challenging.

Recent experiments at LBNL demonstrated for the first time the production of high-quality electron beams in a high-gradient laser wakefield accelerator (high-quality beams were also reported at the same time at RAL/Imperial and LOA). This was achieved at LBNL by extending the interaction distance using a pre-formed plasma density structure, or channel, to guide the drive laser pulse over many diffraction ranges [1]. Initial experiments produced beams with several billion electrons above 80 MeV with percent-level energy spread and low divergence [1], and experiments this year have extended the bunch energy to 1 GeV [2]. Such beams allow laser-plasma accelerators to be considered seriously as alternatives to conventional accelerators for a wide variety of applications that demand high-quality electron bunches, making simulations to understand their behavior imperative.

Incite7 has enabled three-dimensional simulations of a few select cases, clarifying the mechanisms of beam formation and evolution as well as laser driver behavior, and has begun to identify potential optimizations to improve performance. The large-scale three-dimensional simulations show important differences in trapping and beam evolution compared to previous two-dimensional simulations. Detailed two-dimensional runs are also being done to carefully understand convergence and parameter optimization.

First Light

As a first step we wrote an AVS/Express reader for the HDF5 field and particle data in the 2D case; the simulation scientists routinely use IDL for their own work. Our plan is to extend the Express reader to 3D once the data format is specified, visualize the fields using volume rendering, and select particles to follow as the simulation progresses. The 2D grid that we used as a test case has 1800x300 grid points.
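
To make the data path concrete, here is a minimal Python sketch of reading one time step of this kind of HDF5 output with h5py. The file name and dataset paths are hypothetical placeholders, since the actual layout is defined by the simulation code rather than documented here.

```python
import h5py
import numpy as np

# Sketch of reading one time step of the 2D HDF5 output.
# "timestep_0100.h5" and the dataset paths below are hypothetical
# placeholders for the simulation's actual file layout.
with h5py.File("timestep_0100.h5", "r") as f:
    ex = f["fields/ElecFieldX"][...]   # 2D field on the 1800x300 grid
    px = f["particles/px"][...]        # particle momentum components
    py = f["particles/py"][...]
    pz = f["particles/pz"][...]

# Magnitude of the momentum, used below as a particle-selection threshold.
p_mag = np.sqrt(px**2 + py**2 + pz**2)
print(ex.shape, p_mag.max())
```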

Figure 1. First images of the YeeElecField: Electric Field X, Electric Field Y, and Electric Field Z.

Figure 2. Electric field in the X direction and filtered particles (using the magnitude of the momentum as the threshold)

3D Visualization

Using AVS/Express and VisIt

The three-dimensional tests using AVS/Express showed that the datasets are too big for volume rendering on a "normal" AMD-64 workstation with 8 GB of memory. Figure 3 shows a slice through the electric field grid (1500x300x300) and the electrons thresholded and colored by the magnitude of the momentum. Figure 4 shows a volume rendering of the electric field, with particles rendered as points, using VisIt.
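
The selection behind both figures is simple: keep only particles whose momentum magnitude exceeds a cutoff, and slice the field grid along an axis. A minimal numpy sketch, using synthetic stand-ins for the real grid and particle arrays (all names and the cutoff here are illustrative assumptions):

```python
import numpy as np

# Synthetic stand-ins for the real (1500x300x300) field grid and the
# particle arrays; in practice these come from the HDF5 reader above.
rng = np.random.default_rng(0)
ex3d = rng.standard_normal((150, 30, 30))     # scaled-down field grid
pos = rng.uniform(0.0, 1.0, size=(10000, 3))  # particle positions
mom = rng.standard_normal((10000, 3))         # particle momenta

# Threshold particles by the magnitude of the momentum; keeping the
# top 1% is an arbitrary illustrative cutoff.
p_mag = np.linalg.norm(mom, axis=1)
fast_pos = pos[p_mag > np.percentile(p_mag, 99)]

# Axis-aligned slice through the middle of the grid, as in Figure 3.
slice_xy = ex3d[:, :, ex3d.shape[2] // 2]
print(fast_pos.shape, slice_xy.shape)
```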

Figure 3. Visualization using AVS/Express: slice (left), slice and isosurface (right) through the electric field and electrons thresholded and colored by the magnitude of the momentum

Figure 4. Visualization using VisIt: volume rendering of the electric field and electrons rendered as particles (thresholded by the magnitude of the momentum).
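
For reference, a sketch of how a plot like Figure 4 can be set up from VisIt's Python command line; the database and variable names are hypothetical placeholders, and the threshold operator's bounds would still need to be configured through its attributes.

```python
# Sketch of building a Figure 4-style plot in VisIt's Python CLI
# (run inside "visit -cli"). "wakefield.visit", "ex", and "p_mag"
# are hypothetical database/variable names.
OpenDatabase("wakefield.visit")

AddPlot("Volume", "ex")            # volume rendering of the field
AddPlot("Pseudocolor", "p_mag")    # particles drawn as colored points
AddOperator("Threshold")           # keep only the energetic particles
DrawPlots()
SaveWindow()                       # write the current frame to an image
```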

Partitioning the data into domains to run VisIt in parallel

The next step was to partition the data for parallel processing. VisIt requires a header file giving the number of blocks that make up each time step. In this case we decided that 16 blocks per time step was a reasonable number both for our 8-processor dual-core Opteron machine and for our 32-processor Altix. Figure 5 shows the decomposition of the domain in the case of 4 processors.
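
VisIt's usual form of this header is a ".visit" master file whose "!NBLOCKS" directive declares the blocks-per-time-step count, followed by the block files grouped by time step. A minimal sketch of generating one (the file-naming pattern is a hypothetical placeholder for the actual partitioned output):

```python
# Sketch of writing a ".visit" master file for multi-block,
# time-varying data. "!NBLOCKS" tells VisIt how many blocks make up
# each time step; the naming pattern below is a hypothetical
# placeholder for the actual partitioned files.
n_blocks = 16
n_steps = 100

with open("wakefield.visit", "w") as f:
    f.write(f"!NBLOCKS {n_blocks}\n")
    for step in range(n_steps):
        for block in range(n_blocks):
            f.write(f"step{step:04d}_block{block:02d}.h5\n")
```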

Figure 5. Decomposition and visualization of the density field domain using 4 processors; the slice plane through the grid is colored by processor id

Figure 6 shows a sequence of images with a plane showing the density field and the thresholded particles colored by the magnitude of the momentum. The rendering was done using VisIt on a dual-processor, dual-core AMD-64 Linux box with 8 GB of memory.

Figure 6. Sequence of images with a plane showing the density field and the thresholded particles colored by the magnitude of the momentum.

Figure 7 shows images of a volume rendering of the density field. The rendering was done on an Altix using 16 of the 32 available processors.

Figure 7. Volume rendered density field.

References

[1] C.G.R. Geddes, Cs. Toth, J. van Tilborg, E. Esarey, C.B. Schroeder, D. Bruhwiler, C. Nieter, J. Cary, and W.P. Leemans, "High-quality electron beams from a laser wakefield accelerator using plasma-channel guiding," Nature 431, 538-541 (Sept. 30, 2004). LBNL-55732.

[2] W.P. Leemans, B. Nagler, A.J. Gonsalves, Cs. Toth, K. Nakamura, C.G.R. Geddes, E. Esarey, C.B. Schroeder, and S.M. Hooker, "GeV electron beams from a centimetre-scale accelerator," Nature Physics 2, 696-699 (2006).