
Paraview: Neutron Stars

Maria Okounkova edited this page Jul 18, 2022 · 1 revision

Guide: Visualizing Neutron Stars

One of the most difficult visualization tasks is displaying volume data. The vtk files can be tens of gigabytes and a few hundred timesteps can take days to render. Fortunately, the undergraduates have thoroughly fleshed out much of this process.

Generating Your Stills

At this point you should have a directory containing a .pvd file and the actual vtk data for your neutron star (and optionally another one for a black hole).

There are two volume data visualization scripts:

  • SliceImage.py makes a top-down image/movie of a z=0 cut
  • VolumeImage.py makes a fully-detailed volume image/movie (Warning: This can take several hours of tweaking parameters and 24 hours of submit time)

In this guide, you can substitute VolumeImage.py for SliceImage.py unless otherwise noted.

Setting Up The Scripts

  1. Copy SpEC/Support/Visualization/ParaviewPythonScripting/BHNSScripts/SliceImage.py to the directory where you wish to dump the images
  2. Set the initial variables
    1. Set the 'NSDirectory' and 'BHDirectory' variables to the full paths to their respective .pvd files. Example: <code> NSDirectory="/panfs/ds06/sxs/alexs/CurranDebug/TimeParallelFolder/VtkDataAlex.pvd" </code>
    2. Set the 'Data' variable to the name of the scalar to visualize
    3. Set the 'BH' variable to 1 if you have and wish to visualize black hole data, otherwise leave it as 0
    4. Specify Starttime (the first frame you wish to render)
    5. Set IsParallel to 1 if you wish to run the script using NodeDistributor.cpp
    6. If IsParallel is set to 0, you must also specify Endtime (the last frame you wish to render)
    7. Set the 'Imagename' variable to change the prefix of the rendered images (MyName0000.jpg, MyName0001.jpg, etc)
    8. Optional changes for both scripts:
      • Change the 'Width' and 'Height' variables to set the resolution of the rendered images
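For orientation, the user-editable block at the top of the script might look something like this. This is a hypothetical sketch: the variable names come from the steps above, but the values (paths, scalar name, resolution) are illustrative placeholders, not defaults from SpEC.

```python
# Hypothetical sketch of the user-editable variables at the top of
# SliceImage.py / VolumeImage.py. Names follow the guide; values are
# illustrative placeholders only.
NSDirectory = "/path/to/NS/VtkData.pvd"  # full path to the neutron-star .pvd file
BHDirectory = "/path/to/BH/VtkData.pvd"  # full path to the black-hole .pvd file
Data = "Rho0Phys"      # name of the scalar to visualize
BH = 0                 # 1 to also visualize black-hole data, 0 otherwise
Starttime = 0          # first frame to render
IsParallel = 0         # 1 when driving the script with NodeDistributor.cpp
Endtime = 100          # last frame to render (used only when IsParallel == 0)
Imagename = "MyName"   # prefix of the rendered images
Width = 1280           # horizontal resolution of the rendered images
Height = 720           # vertical resolution of the rendered images

# The rendered frames are then named MyName0000.jpg, MyName0001.jpg, ...
print("%s%04d.jpg" % (Imagename, Starttime))
```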

Required For VolumeImage.py

  - If rendering a scalar other than Rho0Phys, change 'min' and 'max' to correspond to the min and max of the data over all timesteps. Don't worry about being exact. Try to overestimate min and underestimate max by an order of magnitude.
  - Optional changes
    * Set the 'Mesh' variable to 1 if you wish to display the extent of the domain as a grey surface below the neutron star, otherwise, leave it as 0
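The min/max advice above can be made concrete with a small helper. This is a hypothetical illustration, not part of the SpEC scripts, and it assumes a positive-valued scalar such as a density: starting from the true data range over all timesteps, it raises 'min' and lowers 'max' by roughly an order of magnitude.

```python
# Hypothetical helper (not part of the SpEC scripts): given the true
# data range over all timesteps, overestimate the minimum and
# underestimate the maximum by about an order of magnitude, as the
# guide suggests. Assumes a positive-valued scalar such as a density.
def conservative_color_range(data_min, data_max):
    return data_min * 10.0, data_max / 10.0

# e.g. a density spanning 1e-12 .. 1e-3 would get min=1e-11, max=1e-4
lo, hi = conservative_color_range(1e-12, 1e-3)
```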

Rendering Your First Images

These will only take two CPUs, so you can use a head node, a compute node, or submit a job.

Option 1: For Only One Image

If you're only interested in making a single image of the (n-1)-th timestep, rather than a movie or several different timesteps, set IsParallel to 1 and Starttime to (n-1), then

  1. Run <code> pvpython --use-offscreen-rendering < SliceImage.py n </code>

Option 2: To Make a Movie

If you're interested in making a movie from your images or many images, set IsParallel to 0, Starttime to your first desired frame number, and Endtime to your last desired frame number. Then run

  • Run <code> pvpython --use-offscreen-rendering < SliceImage.py </code>
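The two modes can be summarized by how they pick frames. The following is a hypothetical sketch of that logic (the real scripts' internals may differ): with IsParallel set to 1, a single frame index is supplied on the command line (by you or by NodeDistributor), while with IsParallel set to 0 the script loops from Starttime through Endtime.

```python
# Hypothetical sketch of the frame-selection logic described above;
# the actual SliceImage.py / VolumeImage.py internals may differ.
def frames_to_render(IsParallel, Starttime, Endtime, argv):
    if IsParallel:
        # one frame per invocation; the index is passed as the last
        # command-line argument (e.g. "pvpython ... < SliceImage.py n")
        return [int(argv[-1])]
    # otherwise render every frame from Starttime through Endtime
    return list(range(Starttime, Endtime + 1))
```

For example, frames_to_render(0, 3, 6, []) yields frames 3 through 6, while frames_to_render(1, 0, 0, ["SliceImage.py", "7"]) yields just frame 7.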

Tweaking Your Parameters

At this point, it is very unlikely that you are happy with the image(s) that were produced. This is because some of the parameters were chosen for a specific run, and cannot be automatically set. Thus, we have to modify some of the parameters by hand and reproduce the images.

  1. If the center of the neutron star is not red/visible enough, decrease the 'max' variable. If it is too red/visible, increase it.
  2. Change the 'Separation' variable to zoom in or out (larger is more zoomed out)
  3. Optional tweaks for both scripts:
    • Set the 'Log' variable to 1 to use a log scale for color (I find that this doesn't typically look good).

Required For VolumeImage.py

  1. Change the 'CamUpAngle' variable to view the neutron star from a different angle (90 is top-down, 0 is in the z=0 plane)
  2. If there appears to be a lot of blue data that should be considered vacuum, increase 'min' by maybe one order of magnitude. By increasing the lower threshold, the images will take significantly less time to render
  3. Change the 'Transparency' variable to make the data appear more or less 'wispy'. Lower values are more opaque.
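The CamUpAngle convention above (90 is top-down, 0 is in the z=0 plane) corresponds to moving the camera along an arc in a vertical plane. The sketch below is a hypothetical geometric illustration of that convention, not code from the SpEC scripts; the camera distance R is arbitrary.

```python
import math

# Hypothetical illustration of the CamUpAngle convention (not taken
# from the SpEC scripts): the camera sits in the x-z plane at distance
# R from the origin, looking back at the origin.
def camera_position(CamUpAngle, R=100.0):
    a = math.radians(CamUpAngle)
    # CamUpAngle = 90 -> (0, 0, R): looking straight down the z-axis
    # CamUpAngle = 0  -> (R, 0, 0): looking along the z=0 plane
    return (R * math.cos(a), 0.0, R * math.sin(a))
```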

Perfecting The Images

Finishing Tweaking

After tweaking the variables as described above, redo the rendering steps listed above. Then repeat the tweaking and rendering steps as many times as necessary to produce images you're happy with.

Making Many Images

At this point, if you were only interested in making one or two nice images, you're done! Otherwise, you'll want to visualize all of the timesteps, not just the first or last.

If you're running VolumeImage.py, you'll need to submit a job request (or leave your computer running overnight). For SliceImage.py, it is better to grab a compute node (the total process should only take 10 minutes), although a job request will also work.

Either way, the command to run is <code> mpirun -npernode 12 NodeDistributor "pvpython --use-offscreen-rendering < SliceImage.py" </code>

This command should work as-is on a compute node. If your machine has more or fewer than 12 CPUs per node, make sure to change the -npernode option appropriately.
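Adapting the command to a machine with a different CPU count is mechanical; the hypothetical helper below simply interpolates the -npernode value into the command line shown above.

```python
# Hypothetical helper (for illustration only) that builds the
# NodeDistributor command line for a machine with a given number of
# CPUs per node; 12 matches the example above.
def render_command(script, cpus_per_node=12):
    return ('mpirun -npernode %d NodeDistributor '
            '"pvpython --use-offscreen-rendering < %s"' % (cpus_per_node, script))

print(render_command("SliceImage.py"))
```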

Submitting a Job

In order to run this command as a job request, you will need to write a submission script and submit it with <code>qsub NAMEOFSCRIPT.pbs</code>

Here is one example of a submission script. Please read the comment about how many cpus to grab per node and the -npernode option. Two example submission scripts are located at SpEC/Support/Visualization/ParaviewPythonScripting/BHNSScripts/SubmissionExamples

```csh
#!/bin/csh -f
# Request 5 nodes for 24 hours
#PBS -l nodes=5
#PBS -l walltime=24:00:00

cd /panfs/ds06/sxs/alexs/GeoffMovie

# Typically a high-resolution volume image will take more than 2GB of RAM,
# and RAM is shared between MPI processes. Thus, for zwicky, I like to use
# only 8 CPUs per compute node (which has 12). This ratio appears to work
# well, so modify as necessary for your computer.
mpirun -npernode 8 NodeDistributor "pvpython --use-offscreen-rendering < VolumeImage.py"
```

Note: We have found that the rendering is extremely inefficient, and sometimes breaks, if we use more than 4 or 6 CPUs per compute node on Zwicky for high-resolution volume images with a lot going on in the image.

Creating Your Movie

Now you should have multiple image files (myimage0001.jpg, myimage0002.jpg, etc.). This last step will process those images into a video file. Using ffmpeg, these images can be assembled into a movie with any framerate you like. Even better, you can use x264 to take total control over the encoding, ensuring optimal filesize and compatibility. Here is the procedure (it assumes a modern Linux multimedia stack):

In a terminal, cd into the directory containing your frames and execute the following commands to create an MP4 video that is both small in size and compatible with mobile devices:

```shell
ffmpeg -i 'myframe.%04d.jpg' -pix_fmt yuv420p -f yuv4mpegpipe - | x264 --crf 26 --preset veryslow --tune animation --fps 25 --keyint 75 --profile baseline --level 3.0 --vbv-maxrate 2500 --vbv-bufsize 2500 --ref 5 --output mymovie.264 --demuxer y4m -
MP4Box -add mymovie.264:fps=25 mymovie.mp4
```

  * The input filename (<code>myframe.%04d.jpg</code>) uses <code>printf</code> format strings to indicate how your sequence of frames is numbered.  This example is appropriate for use with ParaView.
  * Replace <code>--fps 25</code> with the framerate you would like your movie to have (if you have very few frames, you can choose a small value here for a slideshow-like effect).
  * The <code>--keyint</code> parameter controls the precision of seeking when playing back the video (at the expense of filesize).  A value of triple the framerate allows the user to seek to within 3s, for example.
  * The quality is controlled by <code>--crf 26</code>.  Lower numbers are higher quality.
  * For compatibility with mobile devices like smartphones (including the original iPhone):
    * You must not alter the options beginning with <code>--profile baseline</code>
    * Your resolution and frame rate must not exceed the limits of H.264 level 3.  If you stay within 720x480@30fps, you'll be fine.  This is best done when rendering the animation frames, but can also be done by adding something like <code>-s 640x360 -sws_flags spline</code> to the <code>ffmpeg</code> line.
  * Muxing with MP4Box produces a file compatible with streaming.  (To remux an existing MP4 file, use <code>MP4Box -inter 500 movie.mp4</code>.)
  * If you have a recent version of <code>x264</code>, see the Encoding Recipes section in the Appendix of the CombiningStills page for more advanced options.
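The keyint/fps arithmetic above is simple enough to write down explicitly; this is an illustrative calculation, not a tool from the guide.

```python
# Illustrative arithmetic behind the --keyint advice: a keyframe
# interval of triple the framerate gives roughly 3-second seeking.
def keyint_for_seek(fps, seek_seconds=3):
    return fps * seek_seconds

# With --fps 25 this reproduces the --keyint 75 used in the command
# above, and seeking lands within 75 / 25 = 3 seconds.
```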