
====== PyWiFeS Documentation ======
  
This page is a starting point for running the actual reduction code.  Eventually it will also host documentation for individual function syntax.

If you are interested in a high-level overview of the techniques and algorithms employed in PyWiFeS, you should head over to the [[data_reduction| main data reduction page]].

==== Data Reduction Overview ====
Data reduction is generally run in two steps:
  - **Gather Metadata** - in the 'scripts/' directory of your PyWiFeS installation directory you will find the script 'generate_metadata_script.py'.  Use this to organize your metadata in the following steps:
Also important to note here is that an object frame only gets automatically classified as a standard star if the 'OBJECT' header field is filled with a name from the standard star list defined in 'wifes_calib.py' in the 'src/' directory.
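As a hedged sketch of the classification rule just described (the star names below are illustrative placeholders, not the actual contents of 'wifes_calib.py'), the check amounts to a simple membership test on the 'OBJECT' header value:

```python
# Illustrative standard star list -- the real one lives in wifes_calib.py.
stdstar_list = ["HD 44007", "EG 131", "Feige 110"]

def classify_frame(object_header_value):
    """Return 'std' if the OBJECT header matches a known standard, else 'object'."""
    if object_header_value in stdstar_list:
        return "std"
    return "object"

print(classify_frame("Feige 110"))  # -> std
print(classify_frame("NGC 300"))    # -> object
```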
  
A schematic overview of the PyWiFeS metadata hierarchy is illustrated in the figure below.  Python dictionaries are coloured green, while Python lists are coloured red.
  
{{ :wiki:metadata_wifes.png?600 |}}
  
==== Reduction Script ====
The data reduction script itself is a complex and lengthy piece of code that strategically calls functions from the **pywifes** module to reduce the data according to its layout in the metadata dictionary.  Savvy users of Python who want to tweak details of the reduction should feel free to do so, and ultimately we will allow you to post those on our [[Forum]].  For most users, the reduction script itself can be run after completing the following two simple steps.
  
First, open the 'reduce_red_data.py' script in a text editor of [[http://www.gnu.org/software/emacs/|your choice]].  (Note: you will probably want to make a copy of this script each time you do a reduction.  I typically have a 'redux/' subdirectory for each night of data.)  Then set the variable 'data_dir' to the location on your filesystem where the raw data is stored, and the 'out_dir' variable to the directory where the processed data should go.  My general practice is to have these be subdirectories called 'raw_data/' and 'proc_data/' respectively.
  
Second, turn on the processing steps that you wish to run.  This is done in the section labeled 'USER REDUCTION DESIGN IS SET HERE'.  The processing steps are defined there in a list of Python dictionaries.  To run a particular step, set its 'run' key to True.  //It is strongly recommended to run one step at a time your first time through//, so that you can examine the output and understand what each step is doing.
  
The dictionary for each step has a 'suffix' key which defines how the output of that step will be stored: for example, with a suffix of '01' the first step will operate on 'r0001.fits' and save its output as 'r0001.p01.fits'.  This is so the next step knows where to look for the output of the previous step (e.g. step 2 looks for 'p01.fits' files and saves its output as 'p02.fits' files).
  
The 'step' keyword in each step dictionary is used to determine which function to call to execute that step.  By default these functions are always named 'run_<step>' (e.g. the 'subtract_overscan' step calls the 'run_subtract_overscan' function) and are defined in the reduction script.  These functions open the metadata and call the appropriate **pywifes** functions to operate on your data.
  
You can pass additional arguments to the main **pywifes** function being called by filling the 'args' key for a given processing step.  For example, in the 'cube_gen' step I typically pass the wavelength bounds I desire as 'args':{'wmin_set':3500, 'wmax_set':5700}, and those keyword arguments are passed to pywifes.generate_wifes_cube when it is called in the 'run_cube_gen' routine.
  
The reduction steps used by default in the scripts are described in great detail on the [[default_reduction| default data reduction steps]] page.