Commit

update docs (#107)
* update doc tool versions + fix #105

* fix deprecation warn

* add copy-paste button

* remove unit cookbook

* fix #100 by limiting copy-paste selection

* fix #103

* fix #102 by adding info box

* fix #108
cbyrohl authored Dec 10, 2023
1 parent 256b14a commit 8f9a557
Showing 7 changed files with 292 additions and 505 deletions.
179 changes: 0 additions & 179 deletions docs/notebooks/cookbook/units.ipynb

This file was deleted.

4 changes: 4 additions & 0 deletions docs/stylesheets/code_select.css
@@ -0,0 +1,4 @@
.language-pycon .gp, .language-pycon .go { /* Generic.Prompt, Generic.Output */
user-select: none;
}
.language-pycon .md-clipboard { display: none; }
29 changes: 17 additions & 12 deletions docs/tutorial/observations.md
@@ -2,7 +2,7 @@

This package is designed to aid in the efficient analysis of large datasets, such as GAIA DR3.

!!! note "Tutorial dataset"
!!! info "Tutorial dataset"
In the following, we will subset from the [GAIA data release 3](https://www.cosmos.esa.int/web/gaia/dr3). The reduced dataset contains 100000 randomly selected entries only. The reduced dataset can be downloaded [here](https://heibox.uni-heidelberg.de/f/3b05069b1b524c0fa57e/?dl=1).
Check [Supported Datasets](../supported_data.md) for an incomplete list of supported datasets
and requirements for support of new datasets.
@@ -19,8 +19,7 @@ It uses the [dask](https://dask.org/) library to perform computations, which has

Here, we choose the [GAIA data release 3](https://www.cosmos.esa.int/web/gaia/dr3) as an example.
The dataset is obtained in HDF5 format as used at ITA Heidelberg. We intentionally select a small subset of the data to work with.
Choosing a subset means that the data size in the snapshot is small and easy to work with.
We demonstrate how to work with larger data sets at a later stage.
Choosing a subset means that the data size is small and easy to work with. We demonstrate how to work with larger data sets at a later stage.

First, we load the dataset using the convenience function `load()` that will determine the appropriate dataset class for us:
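The call itself sits in a collapsed part of this diff. As a minimal sketch of what it boils down to (the file name below is a placeholder, not taken from the commit):

```python
from scida import load

# load() inspects the file and determines the appropriate dataset class;
# "gaia_dr3_subset.hdf5" is a placeholder for the downloaded tutorial subset.
ds = load("gaia_dr3_subset.hdf5")
```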

@@ -110,10 +109,10 @@ and dimensionality checks are performed. Importantly, the unit calculation is do
to directly see the resulting units and any dimensionality mismatches.


## Analyzing snapshot data
### Computing a simple statistic on (all) particles
## Analyzing the data
### Computing a simple statistic on (all) objects

The fields in our snapshot object behave similar to actual numpy arrays.
The fields in our data object behave similar to actual numpy arrays.

As a first simple example, let's calculate the mean declination of the stars. Just as in numpy we can write
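The snippet that follows is collapsed in this diff. A rough sketch of the computation it describes, assuming the declination column is named "dec" (an assumption, not taken from the commit):

```python
from scida import load

ds = load("gaia_dr3_subset.hdf5")  # placeholder path, as above
dec = ds.data["dec"]               # lazy, dask-backed field; "dec" is an assumed column name
mean_dec = dec.mean()              # still lazy; only a task graph is built here
print(mean_dec.compute())          # compute() triggers the actual evaluation
```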

@@ -148,26 +147,32 @@ We discuss more advanced and interactive visualization methods [here](../visuali
>>> x = ds.data["l"]
>>> y = ds.data["b"]
>>> nbins = (360, 180)
>>> xbins = np.linspace(0.0, 360.0, nbins[0] + 1)
>>> ybins = np.linspace(-90, 90.0, nbins[1] + 1)
>>> extent = [0.0, 360.0, -90.0, 90.0]
>>> xbins = np.linspace(*extent[:2], nbins[0] + 1)
>>> ybins = np.linspace(*extent[-2:], nbins[1] + 1)
>>> hist, xbins, ybins = da.histogram2d(x, y, bins=[xbins, ybins])
>>> im2d = hist.compute() #(1)!
>>> import matplotlib.pyplot as plt
>>> from matplotlib.colors import LogNorm
>>> plt.imshow(im2d.T, norm=LogNorm(), extent=[0.0, 360.0, -90.0, 90.0])
>>> plt.xlabel("l (deg)")
>>> plt.ylabel("b (deg)")
>>> plt.imshow(im2d.T, origin="lower", norm=LogNorm(), extent=extent, interpolation="none")
>>> plt.xlabel("l [deg]")
>>> plt.ylabel("b [deg]")
>>> plt.show()
```

1. The *compute()* on `im2d` results in a two-dimensional array which we can display.


![2D histogram example](../images/simple_hist2d_obs.png)

!!! info

Above image shows the histogram obtained for the full data set.


## FITS files

Observations are often stored in [FITS](https://en.wikipedia.org/wiki/FITS) files. Support in scida work-in-progress
Observations are often stored in [FITS](https://en.wikipedia.org/wiki/FITS) files. Support in scida is work-in-progress
and requires the [astropy](https://www.astropy.org/) package.

Here we show use of the SDSS DR16:
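The example itself is collapsed below. As a purely hypothetical sketch (the file name is a placeholder, and FITS handling is still work-in-progress), loading such a catalog would again go through `load()`:

```python
from scida import load

# Hypothetical: "specObj-dr16.fits" stands in for an SDSS DR16 catalog file;
# reading FITS requires astropy to be installed.
ds = load("specObj-dr16.fits")
print(list(ds.data.keys()))  # inspect which columns were mapped to fields
```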
6 changes: 3 additions & 3 deletions docs/tutorial/simulations.md
@@ -1,6 +1,6 @@
## Getting started

!!! note "Tutorial dataset"
!!! info "Tutorial dataset"
In the following, we will use a small test dataset from the [TNG50](https://www.tng-project.org/) simulation.
This is a cosmological galaxy formation simulation. This dataset is still a gigabyte in size and can be downloaded [here](https://heibox.uni-heidelberg.de/f/dc65a8c75220477eb62d/).
Note that analysis is not limited to simulations, but also observational data.
@@ -208,6 +208,6 @@ We discuss more advanced and interactive visualization methods [here](../visuali
![2D histogram example](../images/simple_hist2d.png)

## Catalogs
Many cosmological simulations have a catalog of halos, subhalos, galaxies, etc.

For AREPO/Gadget based simulations, we support use of this information. Find more information on how to use catalogs [here](../halocatalogs.md).
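As a hedged aside, catalog fields would be accessed like any other field; the container and field names below ("Group", "GroupMass") are assumptions, see the linked halo-catalog page for the actual interface:

```python
from scida import load

# Placeholder path; assumes a snapshot whose group catalog was found alongside it.
ds = load("TNG50-4_snapshot")
group_mass = ds.data["Group"]["GroupMass"]  # assumed field names for an AREPO group catalog
print(group_mass.sum().compute())           # evaluated lazily via dask, like any other field
```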
17 changes: 10 additions & 7 deletions mkdocs.yml
@@ -4,6 +4,7 @@ theme:
name: material
features:
- content.code.annotate
- content.code.copy
- navigation.sections
- navigation.indexes

@@ -29,8 +30,6 @@ nav:
- 'Advanced Topics':
- 'Arepo Simulations':
- 'Halo Catalogs': halocatalogs.md
# - 'Cookbook':
# - 'Units': notebooks/cookbook/units.ipynb
- 'Configuration': configuration.md
- 'FAQ': faq.md
- api_docs.md
@@ -49,6 +48,9 @@ markdown_extensions:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- pymdownx.highlight:
use_pygments: true
pygments_lang_class: true
- admonition
- pymdownx.details
- pymdownx.superfences
@@ -57,8 +59,8 @@ markdown_extensions:
- attr_list
- footnotes
- pymdownx.emoji:
emoji_index: !!python/name:materialx.emoji.twemoji
emoji_generator: !!python/name:materialx.emoji.to_svg
emoji_index: !!python/name:material.extensions.emoji.twemoji
emoji_generator: !!python/name:material.extensions.emoji.to_svg
- pymdownx.arithmatex:
generic: true
- md_in_html
@@ -87,16 +89,16 @@ plugins:
width: 12.172vw
- mkdocs-jupyter:
execute: !ENV [JUPYTER_EXECUTE, true]
execute_ignore: "notebooks/static/*.ipynb" # waiting for list support (https://github.com/danielfrg/mkdocs-jupyter/issues/119)
execute_ignore:
- "notebooks/static/*.ipynb"
kernel_name: scida
- mkdocstrings:
default_handler: python
handlers:
python:
options:
show_signature_annotations: true
show_source: true
show_submodules: true
docstring_style: numpy
watch:
- src/scida

@@ -108,3 +110,4 @@

extra_css:
- stylesheets/gridview.css
- stylesheets/code_select.css