KPF Calibrations and Master Files

Calibrations

Main Spectrometer

With rare exceptions, calibration spectra are taken on a daily basis with KPF to characterize the instrument. The calibration set has evolved with time but is now mostly fixed (the table below is current as of October 2023). This set of calibrations is designed to be sufficient for regular DRP processing without 'manual' calibrations taken by KPF observers.

Type     Object name                Exp. time (sec)   Num/day   Comment
-------  -------------------------  ----------------  --------  -------------------------------
Dark     autocal-dark               1200              5
Bias     autocal-bias               0                 22
Flat     autocal-flat-all           30                100
LFC      autocal-lfc-all-morn       60                5         morning sequence
         autocal-lfc-all-eve        60                5         evening sequence
ThAr     autocal-thar-all-morn      10                9         morning sequence
         autocal-thar-all-eve       10                10        evening sequence
Etalon   autocal-etalon-all-morn    60                10        morning sequence
         autocal-etalon-all-eve     60                10        evening sequence
         autocal-etalon-all-night   60                varies    30x per qtr. night when off-sky
         slewcal                    60                varies    ~once per hour when on-sky
UNe      autocal-une-all-morn       5                 5         not used
         autocal-une-all-eve        5                 5         not used

Ca H & K Spectrometer

The Ca H & K spectrometer shares several calibration exposures with the main spectrometer. The full set is listed below.

Type   Object name       Exp. time (sec)   Num/day   Comment
-----  ----------------  ----------------  --------  -------------------------------------------------------
Dark   autocal-dark      1200              5         the same exposures as for the main spectrometer (above)
Bias   autocal-bias      0                 22        the same exposures as for the main spectrometer (above)
ThAr   autocal-thar-hk   60                3         light detected in two bluest orders only

Exposure Meter

Processing data from KPF’s Exposure Meter (EM) is handled in real time with a separate pipeline. For documentation purposes, the table below lists the EM calibrations. Note that, because of coatings on optics in the Fiber Injection Unit that are specific to the calibration light path, most calibrations of that type do not deliver measurable flux to the Ca H&K Spectrometer.

Type   Object name   Exp. time (sec)   Num/day   Comment
-----  ------------  ----------------  --------  ----------------------------------------------------------------
Bias   bias          0.12              50        taken daily, then stacked, and used by the ExpMeter DRP that night

Master Files

For the main spectrometer, master files for bias, dark, flat, and the wavelength calibration sources listed above are created each day from the calibrations taken during that UT date. The co-addition process applies iterative outlier rejection per pixel.

How To Run The Full Master Pipeline

The full Master Pipeline generates co-added 2D stacks for the master bias, master dark, master flat, and master arclamps, as well as L1 and L2 master files, including master WLS files. All of these products are registered in the CalFiles table of the operations database at the very end of the pipeline.

The following instructions cover how to run the full Master Pipeline for a given observation date; assume, as an example, that the date is 20230601. The effect is to overwrite the old *master* files in the canonical location of the master files (/data/kpf/masters/20230601/*master*) with *master* files generated by the current software version. At the end, the CalFiles table in the operations database is purged of all records for the observation date of interest, and then all master files in the canonical location are re-registered as new records in the CalFiles table. It is essential that the MD5 checksum of each master file matches its database record; otherwise the KPF DRP software, which utilizes master files, will fail when connected to the operations database (when not connected, it does not check the MD5 checksums).
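The MD5 consistency check described above can be reproduced by hand with md5sum. The sketch below uses a temporary file as a stand-in for a master file and our own variable names; in practice the expected checksum would come from the CalFiles record and the file would live under /data/kpf/masters/<yyyymmdd>/.

```shell
# Sketch of checking that a master file's on-disk MD5 checksum matches the
# checksum stored for it at registration time. A temporary file stands in
# for a real master file here.
masterfile=$(mktemp)
printf 'example master file contents\n' > "$masterfile"

# Checksum as it would be stored in the CalFiles record at registration time.
expected=$(md5sum "$masterfile" | awk '{print $1}')

# Checksum recomputed from the file on disk.
actual=$(md5sum "$masterfile" | awk '{print $1}')

if [ "$expected" = "$actual" ]; then
    echo "checksum OK: $masterfile"
else
    echo "checksum MISMATCH: $masterfile" >&2
fi
rm -f "$masterfile"
```

A mismatch would indicate that a master file was modified or replaced after registration, which is exactly the condition that makes the DRP fail when connected to the database.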

The basic steps are as follows.

  1. Set up a private sandbox:

    mkdir -p /data/user/rlaher/sbx_test
    mkdir -p /data/user/rlaher/sbx_test/reference_fits
    cd /data/user/rlaher/sbx_test/reference_fits
    cp -p /data/kpf/reference_fits/* .
    mkdir -p /data/user/rlaher/sbx_test/masters/masters
    cd /data/user/rlaher/sbx_test/masters/masters
    cp -p /data/kpf/L0/20221029/KP.20221029.21537.28.fits .
    cp -p /data/kpf/masters/kpfMaster_HKOrderBounds20230818.csv  .
    cp -p /data/kpf/masters/kpfMaster_HKwave20230818_sci.csv  .
    cp -p /data/kpf/masters/kpfMaster_HKwave20230818_sky.csv .
    cp -pr /data/kpf/masters/stellarmasks .
    
  2. Set up the environment to point to the sandbox location (this must be done via the ~/.bash_profile file, since that file is sourced by the bash scripts):

    vi ~/.bash_profile
    export KPFCRONJOB_SBX=/data/user/rlaher/sbx_test
    source ~/.bash_profile
    printenv | grep SBX
    
  3. Set other required environment variables:

    vi ~/.bash_profile
    export KPFCRONJOB_DOCKER_NAME_L0=russkpfmastersdrpl0
    export KPFCRONJOB_DOCKER_NAME_L1=russkpfmastersdrpl1
    export KPFCRONJOB_DOCKER_NAME_WLS=russkpfmasterswlsauto
    export KPFCRONJOB_DOCKER_NAME_DBSCRIPT=russkpfmastersregisterindb
    export KPFPIPE_L0_BASE_DIR=/data/kpf/L0
    export KPFPIPE_TEST_DATA=/KPF-Pipeline-TestData
    export KPFPIPE_MASTERS_BASE_DIR=/data/kpf/masters
    export KPFCRONJOB_CODE=/data/user/rlaher/git/KPF-Pipeline
    export KPFCRONJOB_LOGS=/data/user/rlaher/git/KPF-Pipeline
    export KPFPIPE_PORT=6107
    export KPFDBUSER=kpfporuss
    export KPFDBNAME=kpfopsdb
    export KPFDB=/data/user/rlaher/kpfdb
    
  4. Make a jobs subdirectory under the KPF-Pipeline git repo directory:

    cd ~/git/KPF-Pipeline
    mkdir jobs
    
  5. Run the full Master Pipeline for the current date:

    cd ~/git/KPF-Pipeline/cronjobs
    ./runDailyPipelines.sh
    
  6. Alternatively, this can be automated via a cronjob that runs daily at 5:15 p.m.:

    15 17 * * * /data/user/rlaher/git/KPF-Pipeline/cronjobs/runDailyPipelines.sh >& /data/user/rlaher/git/KPF-Pipeline/jobs/runDailyPipelines_$(date +\%Y\%m\%d).out
    

To rerun the full Master Pipeline for a prior observation date such as 20230601 (assuming steps 1 and 2 above have been done), copy the run script, edit it to use the desired observation date, ensure the correct configuration file is specified (rather than the default kpf_masters_drp.cfg), and then execute the modified run script:

cp ~/git/KPF-Pipeline/cronjobs/runDailyPipelines.sh ~/git/KPF-Pipeline/cronjobs/runDailyPipelines_20230601.sh
vi ~/git/KPF-Pipeline/cronjobs/runDailyPipelines_20230601.sh (replace with desired observation date)
export KPFCRONJOB_CONFIG_L0=/code/KPF-Pipeline/configs/kpf_masters_drp_before20230623.cfg
~/git/KPF-Pipeline/cronjobs/runDailyPipelines_20230601.sh
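As a non-interactive alternative to the vi edit above, the date substitution can be done with sed. This assumes the copied run script contains the observation date as a literal string (inspect it first to confirm); the sketch below operates on a temporary stand-in file rather than the real script, whose path would be ~/git/KPF-Pipeline/cronjobs/runDailyPipelines_<yyyymmdd>.sh.

```shell
# Replace the observation date in a copied run script with sed (GNU sed -i).
# A temporary stand-in file is used here; the dates are example values.
script=$(mktemp)
printf 'OBSDATE=20230601\n' > "$script"    # stand-in for the copied script
sed -i 's/20230601/20230415/g' "$script"   # swap in the desired date
newcontent=$(cat "$script")
echo "$newcontent"                         # OBSDATE=20230415
rm -f "$script"
```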

If the default configuration file (kpf_masters_drp.cfg) is desired, there is no need to set the KPFCRONJOB_CONFIG_L0 environment variable. The available configuration files, listed below, contain different settings for smoothlamppattern_path. The smoothlamppattern_path files are located in /data/reference_fits inside the docker container (which is mapped to /data/user/rlaher/sbx_test/reference_fits outside the container).

Configuration file                           Observation dates   smoothlamppattern_path (/data/reference_fits)
-------------------------------------------  ------------------  --------------------------------------------------
kpf_masters_drp_before20230623.cfg           <=20230623          kpf_20230619_smooth_lamp_made20230817_float32.fits
kpf_masters_drp_from20230624to20230730.cfg   20230624-20230730   kpf_20230628_smooth_lamp_made20230803_float32.fits
kpf_masters_drp.cfg (default)                >=20230731          kpf_20230804_smooth_lamp_made20230808_float32.fits
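The date ranges above can be expressed as a small helper for choosing which configuration file to assign to KPFCRONJOB_CONFIG_L0. The function below is our own illustration, not part of the pipeline; it uses the container-side config paths from the example above, and it assumes the boundary dates implied by the file names (before20230623 covering dates through 20230623, from20230624to20230730 inclusive of both endpoints).

```shell
# Illustrative helper: pick the masters config for a given observation date
# (yyyymmdd as an integer), following the table of date ranges above.
select_masters_config() {
    obsdate=$1
    if [ "$obsdate" -le 20230623 ]; then
        echo "/code/KPF-Pipeline/configs/kpf_masters_drp_before20230623.cfg"
    elif [ "$obsdate" -le 20230730 ]; then
        echo "/code/KPF-Pipeline/configs/kpf_masters_drp_from20230624to20230730.cfg"
    else
        echo "/code/KPF-Pipeline/configs/kpf_masters_drp.cfg"
    fi
}

# Set the environment variable for the rerun described above.
export KPFCRONJOB_CONFIG_L0=$(select_masters_config 20230601)
echo "$KPFCRONJOB_CONFIG_L0"
```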

How To Run Master WLS Pipeline

The following instructions cover how to run the Master WLS Pipeline for a given observation date; assume, as an example, that the date is 20230601. The effect is to remove all old *master_WLS* files in the canonical location of the master files (/data/kpf/masters/20230601/*master_WLS*) and replace them with *master_WLS* files generated by the current software version. At the end, the CalFiles table in the operations database is purged of all records for the observation date of interest, and then all master files in the canonical location are re-registered as new records in the CalFiles table. It is essential that the MD5 checksum of each master file matches its database record; otherwise the KPF DRP software, which utilizes master files, will fail when connected to the operations database (when not connected, it does not check the MD5 checksums).

The basic steps are as follows.

  1. Set up a private sandbox:

    mkdir -p /data/user/rlaher/sbx_test
    mkdir -p /data/user/rlaher/sbx_test/reference_fits
    cd /data/user/rlaher/sbx_test/reference_fits
    cp -p /data/kpf/reference_fits/* .
    
  2. Set up the environment to point to the sandbox location (this must be done via the ~/.bash_profile file, since that file is sourced by the bash scripts):

    vi ~/.bash_profile
    export KPFCRONJOB_SBX=/data/user/rlaher/sbx_test
    source ~/.bash_profile
    printenv | grep SBX
    
  3. Set other required environment variables:

    vi ~/.bash_profile
    export KPFCRONJOB_DOCKER_NAME_WLS=russkpfmasterswlsauto
    export KPFCRONJOB_DOCKER_NAME_DBSCRIPT=russkpfmastersregisterindb
    export KPFPIPE_L0_BASE_DIR=/data/kpf/L0
    export KPFPIPE_TEST_DATA=/KPF-Pipeline-TestData
    export KPFPIPE_MASTERS_BASE_DIR=/data/kpf/masters
    export KPFCRONJOB_CODE=/data/user/rlaher/git/KPF-Pipeline
    export KPFCRONJOB_LOGS=/data/user/rlaher/git/KPF-Pipeline
    export KPFPIPE_PORT=6107
    export KPFDBUSER=kpfporuss
    export KPFDBNAME=kpfopsdb
    export KPFDB=/data/user/rlaher/kpfdb
    
  4. Make a jobs subdirectory under the KPF-Pipeline git repo directory:

    cd ~/git/KPF-Pipeline
    mkdir jobs
    
  5. Generate a run script for the observation date(s) of interest; the input parameters are the start and end dates. This example generates a script called runWLSPipelineFrom20230601To20230601.sh:

    cd ~/git/KPF-Pipeline/cronjobs
    perl generateWLSScriptBetweenTwoDates.pl 20230601 20230601
    
  6. Run the Master WLS Pipeline:

    cd ~/git/KPF-Pipeline/cronjobs
    ./runWLSPipelineFrom20230601To20230601.sh
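For a multi-day range, the same two steps apply, and the generated script name follows the From<start>To<end> pattern shown in step 5. A small sketch of deriving that name for a range (the perl invocation and run are shown as comments, since they must be executed inside the cronjobs directory):

```shell
# Derive the name of the generated WLS run script for a date range,
# following the naming pattern of generateWLSScriptBetweenTwoDates.pl.
start=20230601
end=20230605
script="runWLSPipelineFrom${start}To${end}.sh"
echo "$script"   # runWLSPipelineFrom20230601To20230605.sh
# Then, in ~/git/KPF-Pipeline/cronjobs:
#   perl generateWLSScriptBetweenTwoDates.pl "$start" "$end"
#   ./"$script"
```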