====== PyWiFeS Documentation ======

This page is a starting point for running the actual reduction code.  Eventually it will also host documentation for individual function syntax.

If you are interested in a high-level overview of the techniques and algorithms employed in PyWiFeS, you should head over to the [[data_reduction| main data reduction page]].

==== Data Reduction Overview ====
Data reduction is generally run in two steps:
  - **Gather Metadata** - in the 'scripts/' directory of your PyWiFeS installation directory you will find the script 'generate_metadata_script.py'.  Use this to organize your metadata in the following steps:
    - Run './generate_metadata_script.py /home/mjc/wifes/20120101/raw_data/' from the command line, with the argument being the directory where your raw data is located.  This will automatically figure out what type of data each file is and arrange it in the preferred metadata format.
    - __//Check the metadata scripts//__ 'save_red_metadata.py' and 'save_blue_metadata.py' which were automatically generated by the previous step.  Make sure that the data are arranged correctly!  In particular, make sure standard star observations were correctly identified.  If changes need to be made, consult the 'Metadata Format' section below for help.
    - Run 'python save_red_metadata.py' (and similarly for blue) from the command line.  This will save your metadata in a Python dictionary which gets pickled into a file, e.g. 'wifesR_20120101_metadata.pkl'.
  - **Reduce the Data** - again in the 'scripts/' directory you will find 'reduce_red_data.py' (and similarly for blue).  It is //strongly// recommended that the first time you use the reduction script you proceed one step at a time (see the 'Reduction Script' section below).  This script is run from the command line with the pickled metadata file passed as an argument: './reduce_red_data.py wifesR_20120101_metadata.pkl'
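The hand-off between the two steps is just a pickled dictionary.  As a rough sketch of that round trip (the filenames and keys here follow the examples on this page; this is not the scripts' actual internals):

<code python>
import pickle

# Illustrative metadata dictionary, shaped like the examples in the
# 'Metadata Format' section below (filenames are placeholders)
metadata = {
    'bias': ['r0001.fits', 'r0002.fits', 'r0003.fits'],
    'sci': [{'sci': ['r0005.fits'], 'arc': [],
             'bias': ['r0006.fits'], 'sky': ['r0007.fits']}],
}

# save_red_metadata.py pickles the dictionary to a file...
with open('wifesR_20120101_metadata.pkl', 'wb') as f:
    pickle.dump(metadata, f)

# ...and reduce_red_data.py loads it back at the start of the reduction
with open('wifesR_20120101_metadata.pkl', 'rb') as f:
    loaded = pickle.load(f)

print(loaded['bias'])  # ['r0001.fits', 'r0002.fits', 'r0003.fits']
</code>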
  
==== Metadata Format ====
The observation metadata is stored as a [[http://docs.python.org/2/library/stdtypes.html#mapping-types-dict|Python dictionary]].  Dictionaries are very nice objects because they let you access stored values (or other objects) by name (called a "key"), rather than by index.
  
Master calibration files for the night are stored as lists under appropriately named keys in the metadata dictionary.  For example, if you are using the Python interpreter and load the metadata pickle file into an object called 'metadata', you can see the biases for the night like this:
  
<code>
  >> print metadata['bias']
  ['r0001.fits', 'r0002.fits', 'r0003.fits']
</code>
  
Science and standard star observations (identified by the header keyword 'IMAGETYP' being 'OBJECT') are stored under the "metadata['sci']" and "metadata['std']" keys, but are stored differently than calibration files.  Instead, "metadata['sci']" is a list of Python //dictionaries// rather than a list of filenames.

This format was chosen so that individual science observations could have particular calibration files associated with them, e.g. a bias taken just before or after the observation which you want to use for bias subtraction.  This format is also critical for associating sky frames with the science frames they are intended to be subtracted from.

Let's again suppose you're inspecting the metadata in the Python interpreter; then each science observation dictionary would look like this:

<code>
  >> print metadata['sci'][0]
  {'sci' : ['r0005.fits'],
   'arc' : [],
   'bias' : ['r0006.fits'],
   'sky' : ['r0007.fits']}
</code>

Here the 'sci' file is the main science image.  Note that this calibration and sky frame association is the main reason you should check the auto-generated metadata scripts, which by necessity cannot know these associations!  You as the observer must tell the script which calibrations or sky frames go with specific observations.

The standard star metadata is very similar, but each dictionary also has the additional key 'type', which tells the reduction script whether this star is a flux standard or telluric standard (or both).

<code>
  >> print metadata['std'][0]
  {'sci' : ['r0008.fits'],
   'arc' : [],
   'type' : ['flux', 'telluric']}
</code>

Also important to note here is that an object frame only gets automatically classified as a standard star if the 'OBJECT' header field is filled with a name from the standard star list defined in 'wifes_calib.py' in the 'src/' directory.

A schematic overview of the PyWiFeS metadata hierarchy is illustrated in the figure below.  Python dictionaries are coloured green, while Python lists are coloured red.

{{ :wiki:metadata_wifes.png?600 |}}

==== Reduction Script ====
The data reduction script itself is a complex and lengthy piece of code that strategically calls functions from the **pywifes** module to reduce the data according to its layout in the metadata dictionary.  Savvy users of Python who want to tweak details of the reduction should feel free to do so, and ultimately we will allow you to post those on our [[Forum]].  For most users the reduction script itself can be run after completing the following two simple steps.
  
First, open the 'reduce_red_data.py' script in a text editor of [[http://www.gnu.org/software/emacs/|your choice]].  (Note: you will probably want to make a copy of this script each time you do a reduction; I typically have a 'redux/' subdirectory for each night of data.)  Then set the variable 'data_dir' to the location on your filesystem where the raw data is stored, and the 'out_dir' variable to the directory where the processed data should go.  My general practice is to have these be subdirectories called 'raw_data/' and 'proc_data/' respectively.
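A minimal sketch of that configuration, assuming the 'raw_data/' and 'proc_data/' layout suggested above (the night directory path is just an example):

<code python>
import os

# Hypothetical night directory; adjust to wherever your data actually live
night_dir = '/home/mjc/wifes/20120101'

data_dir = os.path.join(night_dir, 'raw_data/')   # raw frames live here
out_dir = os.path.join(night_dir, 'proc_data/')   # processed data goes here

print(data_dir)  # /home/mjc/wifes/20120101/raw_data/
</code>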
  
Second, turn on the processing steps that you wish to run.  This is done in the section labeled 'USER REDUCTION DESIGN IS SET HERE'.  The processing steps are defined there in a list of Python dictionaries.  To run a particular step, set its 'run' key to True.  //It is strongly recommended to run one step at a time your first time through,// so that you can examine the output and understand what each step is doing.
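The step list might look something like the following.  This is an illustrative slice only; the step names and keys ('step', 'run', 'suffix', 'args') follow the examples on this page, and the exact defaults live in the script itself:

<code python>
# Illustrative excerpt of the 'USER REDUCTION DESIGN IS SET HERE' section
proc_steps = [
    {'step': 'subtract_overscan', 'run': True, 'suffix': '01', 'args': {}},
    {'step': 'cube_gen', 'run': False, 'suffix': '02',
     'args': {'wmin_set': 3500, 'wmax_set': 5700}},
]

# Only steps whose 'run' key is True get executed
for step in proc_steps:
    if step['run']:
        print(step['step'])  # subtract_overscan
</code>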
  
The dictionary for each step has a 'suffix' key which defines how the output of that step will be saved: for example, when the suffix is '01' the first step will operate on 'r0001.fits' and save its output as 'r0001.p01.fits'.  This is so the next step knows where to look for the output of the previous step (e.g. step 2 looks for 'p01.fits' files and saves its output as 'p02.fits' files).
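The filename chaining can be sketched with a simplified helper (not the script's actual code, just the naming convention described above):

<code python>
def step_output_name(filename, suffix):
    """Map a step's input filename to its output filename,
    e.g. 'r0001.fits' with suffix '01' -> 'r0001.p01.fits'."""
    base = filename.rsplit('.fits', 1)[0]
    # Strip any previous '.pNN' step suffix before appending the new one
    if '.p' in base:
        base = base.rsplit('.p', 1)[0]
    return '%s.p%s.fits' % (base, suffix)

print(step_output_name('r0001.fits', '01'))      # r0001.p01.fits
print(step_output_name('r0001.p01.fits', '02'))  # r0001.p02.fits
</code>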
  
The 'step' keyword in each step dictionary is used to determine which function to call to execute that step.  By default these functions are always named 'run_step' (e.g. the 'subtract_overscan' step calls the 'run_subtract_overscan' function) and are defined in the reduction script.  These functions open the metadata and call the appropriate **pywifes** functions to operate on your data.
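The name-based dispatch can be illustrated like this.  These are toy stand-ins only; the real run_* functions in the reduction script operate on your data via **pywifes**:

<code python>
# Toy stand-ins for the reduction script's run_* functions
def run_subtract_overscan(metadata):
    return 'overscan subtracted'

def run_cube_gen(metadata):
    return 'cube generated'

def execute_step(step_name, metadata):
    # Look up 'run_' + <step name>, following the convention described above
    func = globals()['run_' + step_name]
    return func(metadata)

print(execute_step('subtract_overscan', {}))  # overscan subtracted
</code>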
  
You can pass additional arguments to the main **pywifes** function being called by filling the 'args' key for a given processing step.  For example, in the 'cube_gen' step I typically pass the wavelength bounds I desire as 'args':{'wmin_set':3500, 'wmax_set':5700}, and those keyword arguments are passed to pywifes.generate_wifes_cube when it is called in the 'run_cube_gen' routine.
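Under the hood this 'args' forwarding is just Python keyword-argument unpacking.  A schematic version (the generate_wifes_cube stand-in below only mimics the keyword names mentioned above; the real function lives in **pywifes**):

<code python>
# Stand-in for pywifes.generate_wifes_cube, keeping only the kwargs of interest
def generate_wifes_cube(in_file, out_file, wmin_set=None, wmax_set=None):
    return (wmin_set, wmax_set)

def run_cube_gen(in_file, out_file, **args):
    # The step's 'args' dictionary is unpacked into keyword arguments
    return generate_wifes_cube(in_file, out_file, **args)

step_args = {'wmin_set': 3500, 'wmax_set': 5700}
print(run_cube_gen('in.fits', 'out.fits', **step_args))  # (3500, 5700)
</code>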
  
The reduction steps used by default in the scripts are described in great detail on the [[default_reduction| default data reduction steps]] page.