LST1Analysis MVA
Location of LST1 data at PIC
- Raw data http://data.lstcam.pic.es/tape/
- DL1 data
Directory: /pnfs/pic.es/data/cta/LST/DL1
Web: http://data.lstcam.pic.es/DL1/
- DL2 data
Directory: /pnfs/pic.es/data/cta/LST/DL2
Web: http://data.lstcam.pic.es/DL2/
Use your CTA observatory LDAP account (e.g. monica.vazquez)
Location of LST1 data at La Palma
- Directory: /fefs/onsite/data
Use cp01/cp02 to copy data out; never use the login machine for transfers
- Latest IRF
/fefs/aswg/data/mc/IRF/20200629_prod5_trans_80/zenith_20deg/south_pointing/20210923_v0.7.5_prod5_trans_80_dynamic_cleaning/off0.4deg/irf_20210923_v075_prod5_trans_80_dynamic_cleaning_gamma_point-like_off04deg.fits.gz
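To check what such an IRF file contains, it can be opened with astropy; a minimal sketch (any of the IRF files above can be substituted):

from astropy.io import fits

# List the IRF HDUs (effective area, energy dispersion, ...) stored in the point-like IRF quoted above
irf_path = "/fefs/aswg/data/mc/IRF/20200629_prod5_trans_80/zenith_20deg/south_pointing/20210923_v0.7.5_prod5_trans_80_dynamic_cleaning/off0.4deg/irf_20210923_v075_prod5_trans_80_dynamic_cleaning_gamma_point-like_off04deg.fits.gz"
with fits.open(irf_path) as irf:
    irf.info()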
LST1 School 2022
https://www.lst1.iac.es/wiki/index.php/LST1School2021_MVA
LST1 DL2/DL3 Data
- Crab DL3
Location: /fefs/aswg/workspace/daniel.morcuende/data/real/DL3/20201120/v0.7.3/tailcut84/
- dl3_LST-1.RunXXX.fits.gz
- hdu-index.fits.gz
- obs-index.fits.gz
- Crab DL2 with dynamic cleaning & MC (tuned NSB & PSF - ICRC version)
Location: /fefs/aswg/workspace/abelardo.moralejo/Crab_tuned_MC_20210924/20201120/merged/DL2/
- dl2_LST-1.Run029*h5
- dl2_merged.h5
- Check DL2 versions
cd /fefs/aswg/data/real/DL2/20220206/v0.8.4/tailcut84/log
cat sequencer_v0.8.4.cfg | grep models
rf_models: /fefs/aswg/workspace/abelardo.moralejo/Crab_tuned_MC_20211231/MC/DL1b_tuned/training/models
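The Crab DL3 directory listed above can be opened directly with gammapy (v0.18.x, installation further down); the hdu-index/obs-index files are what let the DataStore discover the runs. A minimal sketch:

from gammapy.data import DataStore

# Point the DataStore at the DL3 directory; it reads hdu-index.fits.gz and obs-index.fits.gz
data_store = DataStore.from_dir("/fefs/aswg/workspace/daniel.morcuende/data/real/DL3/20201120/v0.7.3/tailcut84/")
print(data_store.obs_table)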
DL2toDL3 conversion
General talk on the LST1 DL3 Tools by Chaitanya Priyadarshi: DL3_Tools_and_Spectral_analysis_of_Crab_data_with_gammapy.pdf
DL2toDL3 Tools
Tools are located in lstchain/tools:
- lstchain_create_irf_files: create IRFs from the MC files
  - The IRFs can be generated using only the gamma MC file, to produce Effective Area and Energy Dispersion (and PSF for Full Enclosure IRFs, i.e. without any theta cuts).
  - Proton and electron MC are used to create the Background IRF.
  - For creating point-like IRFs, you have to pass the --point-like flag on the command line, as by default the Tool generates a Full Enclosure IRF.
- lstchain_create_dl3_file: create DL3 files
- lstchain_create_dl3_index_files: generate HDU and OBS Index files of the DL3 files created.
The DL3 files are indexed following the GADF specification, using ctapipe tools.
Command options can be checked with: lstchain_create_irf_files --help-all
DL2 Cuts
Fixed cut variables & default values (see here)
- fixed_gh_cut [0.6]
- fixed_theta_cut [0.2]
- allowed_tels [1]
Cut Methods:
- --fixed-gh-cut 0.9
- --config /path/to/config.json
Example json config files for the DL3 tools are here
The DL3 JSON config file (dl3_tool_config.json) is a subset of the IRF config file (irf_tool_config.json).
One can use the IRF config file for both tools to ensure that the same filter cuts are used for both MC and real data.
NOTE: the fixed_theta_cut option is not available in lstchain_create_dl3_file, so it is better to use the JSON option to be consistent with the IRF generation
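Before running the tools, one can simply load and print the JSON config to check which cuts it defines; a minimal sketch (the file name is the example config mentioned above, and the key layout depends on the lstchain version):

import json

# Print the cuts defined in the example IRF/DL3 config; inspect the keys rather than assuming their layout
with open("irf_tool_config.json") as f:
    print(json.dumps(json.load(f), indent=2))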
Example DL2toDL3 commands
lstchain_create_irf_files -g /path/to/DL2_MC_gamma_file.h5 -o /path/to/irf.fits.gz --point-like --fixed-gh-cut 0.9 --fixed-theta-cut 0.2 --irf-obs-time 50
(--point-like only for point-like IRFs)

lstchain_create_dl3_file -d /path/to/DL2_data_file.h5 -o /path/to/DL3/file/ --input-irf /path/to/irf.fits.gz --source-name Crab --source-ra 83.633deg --source-dec 22.01deg --fixed-gh-cut 0.9

lstchain_create_dl3_index_files -d /path/to/DL3/files/ -o /path/to/DL3/index/files -p dl3*[run_1-run_n]*.fits.gz
Example DL2toDL3 commands - MVA specific
lstchain_create_irf_files -g /Users/monicava/LST1_DATA/Abelardo/dl2_merged.h5 -o /Users/monicava/LST1_DATA/Abelardo/irf.gh01.fits.gz --point-like --config irf_tool_config_gh01.json --irf-obs-time 50

lstchain_create_dl3_file -d /Users/monicava/LST1_DATA/Abelardo/dl2_LST-1.Run02967.h5 -o /Users/monicava/LST1_DATA/Abelardo/DL3_gh01 --input-irf /Users/monicava/LST1_DATA/Abelardo/irf.gh01.fits.gz --source-name Crab --source-ra 83.633deg --source-dec 22.01deg --config irf_tool_config_gh01.json

lstchain_create_dl3_index_files -d /Users/monicava/LST1_DATA/Abelardo/DL3_gh01
Run summary info/data-check
- Run-summary web in GAE
LST1 account
- Seiya's Crab Run list: 2021LST_RUN_CATALOG
- Data check: DATACHECK
LST1 data formats
File content can be obtained as follows:
from lstchain.io import get_dataset_keys
from lstchain.io.io import dl1_params_lstcam_key, dl2_params_lstcam_key, dl1_images_lstcam_key
import pandas as pd

dl2_file = "/fefs/aswg/data/real/DL2/20210903/v0.7.3/tailcut84/dl2_LST-1.Run05998.0000.h5"
get_dataset_keys(dl2_file)
Out[]:
['configuration/instrument/subarray/layout',
 'configuration/instrument/subarray/layout.__table_column_meta__',
 'configuration/instrument/telescope/camera/geometry_LSTCam',
 'configuration/instrument/telescope/camera/geometry_LSTCam.__table_column_meta__',
 'configuration/instrument/telescope/camera/readout_LSTCam',
 'configuration/instrument/telescope/camera/readout_LSTCam.__table_column_meta__',
 'configuration/instrument/telescope/optics',
 'configuration/instrument/telescope/optics.__table_column_meta__',
 'dl1/event/telescope/monitoring/calibration',
 'dl1/event/telescope/monitoring/flatfield',
 'dl1/event/telescope/monitoring/pedestal',
 'dl2/event/telescope/parameters/LST_LSTCam']
dl1_file = "/fefs/aswg/data/real/DL1/20210903/v0.7.3/tailcut84/dl1_LST-1.Run05998.0000.h5"
get_dataset_keys(dl1_file)
Out[]:
['configuration/instrument/subarray/layout',
 'configuration/instrument/subarray/layout.__table_column_meta__',
 'configuration/instrument/telescope/camera/geometry_LSTCam',
 'configuration/instrument/telescope/camera/geometry_LSTCam.__table_column_meta__',
 'configuration/instrument/telescope/camera/readout_LSTCam',
 'configuration/instrument/telescope/camera/readout_LSTCam.__table_column_meta__',
 'configuration/instrument/telescope/optics',
 'configuration/instrument/telescope/optics.__table_column_meta__',
 'dl1/event/telescope/image/LST_LSTCam',
 'dl1/event/telescope/monitoring/calibration',
 'dl1/event/telescope/monitoring/flatfield',
 'dl1/event/telescope/monitoring/pedestal',
 'dl1/event/telescope/parameters/LST_LSTCam']
Data is in the tables:
- 'dl1/event/telescope/image/LST_LSTCam' (camera calibrated images)
- 'dl1/event/telescope/parameters/LST_LSTCam' (Hillas parameters, timestamps, pointing, etc)
In the case of DL2, only the parameters table 'dl2/event/telescope/parameters/LST_LSTCam' is available; it is a copy of the DL1 parameters table with the reconstructed parameters added.
To access the parameter tables in DL1 and DL2:
dl2_dataframe = pd.read_hdf(dl2_file, key=dl2_params_lstcam_key)
dl2_dataframe
Out[]:
       obs_id  event_id      intensity  log_intensity         x         y         r       phi    length  ...  reco_energy  reco_disp_dx  reco_disp_dy  reco_src_x  reco_src_y  reco_alt   reco_az  reco_type  gammaness
0        5998         1    5529.521235       3.742688 -0.032178 -0.020007  0.037890 -2.585328  1.168153  ...    12.028581     -1.395368     -0.322723   -1.427546   -0.342730  1.084904  0.228667        101   0.025000
1        5998         4      61.694788       1.790248  0.093973  0.365420  0.377309  1.319085  0.045765  ...     0.021868      0.033724      0.019880    0.127698    0.385300  1.140388  0.287830        101   0.056333
2        5998         5     131.579536       2.119188  0.242024  0.147060  0.283200  0.546006  0.123974  ...     0.114741      0.246228     -0.281001    0.488253   -0.133941  1.153445  0.243046        101   0.188333
3        5998         6      37.598133       1.575166  0.405366 -0.546641  0.680542 -0.932721  0.072361  ...     0.062491      0.011559     -0.049816    0.416924   -0.596457  1.150419  0.202634        101   0.290833
4        5998         7      75.891508       1.880193 -0.129859 -0.524562  0.540397 -1.813474  0.078325  ...     0.033458      0.010284      0.027173   -0.119575   -0.497389  1.131426  0.213075        101   0.165000
...       ...       ...            ...            ...       ...       ...       ...       ...       ...  ...          ...           ...           ...         ...         ...       ...       ...        ...        ...
28271    5998     52989      89.536526       1.952000 -0.775412  0.466715  0.905034  2.599783  0.061538  ...     0.043203     -0.004094      0.009934   -0.779506    0.476649  1.108024  0.292202        101   0.165000
28272    5998     52990     113.433699       2.054742 -0.015224 -0.558476  0.558683 -1.598049  0.072026  ...     0.034507     -0.150657      0.088593   -0.165881   -0.469883  1.129934  0.214743        101   0.091000
28273    5998     52991    3053.237697       3.484761  0.621141  0.404424  0.741198  0.577148  0.319016  ...     0.218918      0.064154     -0.829353    0.685294   -0.424929  1.160371  0.216046        101   0.010000
28274    5998     52992  132592.328588       5.122518  0.000555  0.000571  0.000796  0.799391  1.163453  ...    30.287504     -0.286239      0.399654   -0.285684    0.400225  1.125740  0.287283        101   0.178036
28275    5998     52997      74.290200       1.870932 -0.595926  0.781502  0.982788  2.222275  0.097937  ...     0.292780      0.711934      0.310702    0.116007    1.092203  1.138651  0.347324        101   0.442667

[28276 rows x 51 columns]
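The gammaness column shown above is the score that the fixed_gh_cut of the DL3 tools acts on; an equivalent selection can be sketched directly on the dataframe (0.9 is just the example value used in the commands above):

# Keep only gamma-like events, mimicking a fixed gammaness cut
gamma_like = dl2_dataframe.query("gammaness > 0.9")
print(f"{len(gamma_like)} of {len(dl2_dataframe)} events survive the cut")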
Accessing images is more complicated and requires the astropy Table module:
from astropy.table import Table
images = Table.read(dl1_file, path=dl1_images_lstcam_key)['image']
images
Out[]: <Column name='image' dtype='float32' shape=(1855,) length=53000>
  8.701056 .. 7.1084614
  3.7857776 .. 1.2068052
  3.8285189 .. 0.9846253
  4.5408783 .. 5.1921587
  4.754586 .. 1.026284
  0.7226335 .. 0.09590531
  3.4738295 .. 2.0677526
  2.8597107 .. -0.43177217
  -0.20343336 .. 3.8313065
  1.3372487 .. 1.5678478
  3.2301373 .. 2.428795
  3.4438455 .. 4.2895527
  3.871757 .. 1.4984165
  -0.5311186 .. 1.8316865
  2.717239 .. 4.3192797
  1.3362556 .. 0.48472026
  0.52366954 .. 0.26254028
  1.7769248 .. -0.06901258
  2.6032615 .. 3.2203112
  1.7484305 .. 1.8331614
  4.7545853 .. 1.7067102
  2.2185874 .. 1.2590411
  -0.43089166 .. 0.29031274
  2.7479038 .. 1.5956203
  ...
For each of the 53000 events, a calibrated image of the camera is obtained (an array of 1855 pixel values).
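One of these images can be visualised with ctapipe's camera display; a sketch assuming ctapipe and matplotlib are available in the same environment (both are dependencies of lstchain):

import matplotlib.pyplot as plt
from ctapipe.instrument import CameraGeometry
from ctapipe.visualization import CameraDisplay

# LSTCam geometry (1855 pixels) matches the image arrays read above
geom = CameraGeometry.from_name("LSTCam")
disp = CameraDisplay(geom, image=images[0])  # first event in the DL1 file
disp.add_colorbar()
plt.show()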
The rest of the tables in the files are metadata (info about the camera, optics, calibration, etc)
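These metadata tables can be read with the same Table.read pattern used for the images, passing any of the keys listed above; for example the telescope optics description (a sketch):

# Read the optics configuration table from the DL1 file
optics = Table.read(dl1_file, path='configuration/instrument/telescope/optics')
print(optics)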
Software tool installation
LSTCHAIN
Download the LST analysis software cta-lstchain (a wrapper that uses centrally supported ctapipe routines)
git clone https://github.com/cta-observatory/cta-lstchain.git
cd cta-lstchain
LSTCHAIN_VER=0.7.2
wget https://raw.githubusercontent.com/cta-observatory/cta-lstchain/v$LSTCHAIN_VER/environment.yml
conda env create -n lst -f environment.yml
conda activate lst
pip install lstchain==$LSTCHAIN_VER
rm environment.yml

In case of problems with conda env try mamba:
conda install -c conda-forge -n base mamba
mamba env create -f environment.yml
Files needed to analyze golden Run 442 (20190527):
- Pedestal run: pedestal_file_run446_0000.fits
- Low level DRS4 calibration: calibration.hdf5
Run the jupyter notebook directly
cd cta-lstchain/notebooks
jupyter notebook
-> select Analyze_real_muon_data.ipynb
Convert example jupyter notebook to python
cd cta-lstchain/notebooks
jupyter nbconvert --to python Analyze_real_muon_data.ipynb
Fix the paths of the data, pedestal and calibration files and run.
Lines to fix:
source = event_source(input_url="../data/LST-1.4.Run00442.0001.fits.fz", max_events=None)
pedestal_path="../pedestal/pedestal_file_run446_0000.fits",
with HDF5TableReader('../calibration/calibration.hdf5') as h5_table:

python Analyze_real_muon_data.py
Gammapy
curl -O https://gammapy.org/download/install/gammapy-0.18.2-environment.yml
conda env create -f gammapy-0.18.2-environment.yml
conda activate gammapy-0.18.2
gammapy info

##### Installing local copy
git clone -b v0.18.2 https://github.com/gammapy/gammapy
cd gammapy
pip install -e .
Installation in Ubuntu
conda create -n gammapyenv python=3.7
conda activate gammapyenv
conda install gammapy=0.18.2 sherpa=4.12.0
conda install jupyter ipython jupyterlab pandas healpy iminuit naima emcee corner parfice
gammapy info
Ctools
Installation MAC
conda config --append channels conda-forge
conda config --append channels cta-observatory
conda create -n ctathai python=3.8
conda activate ctathai
conda install ctools
conda install matplotlib
conda install jupyterlab
export CALDB=/path/where/you/installed/IRF/caldb
export HESSDATA=/path/where/you/installed/HESS/data
Installation UBUNTU
conda create -n ctathai python=3.7
conda activate ctathai
conda install -c cta-observatory ctools=1.7.4
conda install matplotlib
conda install jupyterlab
export CALDB=/path/where/you/installed/IRF/caldb
export HESSDATA=/path/where/you/installed/HESS/data
Commands:
ctobssim edisp=yes
fv events.fits
ctskymap
ds9 skymap.fits
ctbin
fv cntcube.fits
ctexpcube
ctpsfcube
ctbkgcube
ctlike
csresmap
csresspec components=yes
$CTOOLS/share/examples/python/show_residuals.py resspec.fits
Fermipy
conda config --append channels fermi
conda create -n fermiset -c conda-forge -c fermi fermitools=2.0.8 python=3.7 clhep=2.4.4.1
conda activate fermiset
conda install fermipy
CTApipe
git clone https://github.com/cta-observatory/ctapipe.git
cd ctapipe
CTAPIPE_VER=0.10.5
wget https://raw.githubusercontent.com/cta-observatory/ctapipe/v$CTAPIPE_VER/environment.yml
conda env create -n cta -f environment.yml
conda activate cta
conda install -c conda-forge ctapipe=$CTAPIPE_VER
Ubuntu 20.04 issues
- Mars/root does not work, so gcc needs to be downgraded to version 5: gcc downgrade
- System Program Problem Detected at reboot:
sudo gedit /etc/default/apport
# set enabled=0
- Problem installing ctools with conda, modify ~/.condarc
channel_priority: flexible
Extra
Conda useful commands
conda remove --name gammapy_default --all
conda list -n gammapy-0.18.2
MAGIC DL3 data
https://github.com/open-gamma-ray-astro/joint-crab/tree/master/data/magic
https://gitlab.pic.es/magic_dl3/magic_dl3_sw_school_2021/-/tree/master/data/magic