Introduction
============

The UVES pipeline is a collection of data reduction procedures ("Recipes")
designed to apply the appropriate data reduction steps to the different types
of UVES (and FLAMES-UVES) raw data frames, e.g. to create master calibration
frames out of a set of calibration exposures, to reduce scientific frames,
etc. The description of which reduction recipe has to be applied, which input
and reference data have to be used, and to which location the reduction
products have to be written is contained in an ASCII file, the so-called
Reduction Block ("RB"). The operational versions of the pipeline on Paranal
and at the ESO headquarters in Garching include software components which
automatically create and execute these Reduction Blocks. Although these
components are not available in the exportable pipeline, astronomers at their
home institutes can nevertheless make use of the data reduction capabilities
of the UVES pipeline and reduce their data on their own. To do so, the
Reduction Blocks have to be created manually with a text editor. A dedicated
MIDAS context is available for the execution of the Reduction Blocks.

The UVES pipeline distribution (version 2.2.0) on this CD-ROM contains, apart
from the data reduction procedures themselves, a number of Reduction Block
examples including all required input and reference frames. It also contains
a User Manual which describes in detail how to use the UVES echelle package.
This release also contains an updated version of the software to reduce
FLAMES-UVES combo data, and it supports both the new and the old UVES FITS
data format.

Installation Prerequisites
==========================

 - Hardware: an HP/HP-UX or SUN/Solaris workstation, or a PC running Linux
   (kernel 2.x), with 512 MB of main memory and more than 2 GB of free disk
   space. Under Linux the software shows better performance.

 - ESO-MIDAS version 04SEP, patch level pl1.0 or later. The MIDAS system
   variables $MIDASHOME and $MIDVERS have to be set correctly (usually
   MIDASHOME=/midas and MIDVERS=04SEPpl1.0).

 - Take care to do the installation in a location such that the total
   absolute path of the reference files used by the provided Reduction Block
   examples is shorter than 80 characters. In practice this means installing
   the CD-ROM release under a path of at most about 30 characters, for
   example /raid1/home/amodigli/tmp/cdrom. A quick check is sketched below.
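The following bash sketch performs such a check before unpacking; the path
/your/install/path is a placeholder for the directory you plan to use, and
the MIDAS variables are those mentioned above:

 # MIDAS variables must be set, and the installation path should stay below
 # roughly 30 characters so that the reference file paths remain < 80 chars.
 echo "MIDASHOME=$MIDASHOME  MIDVERS=$MIDVERS"
 echo -n /your/install/path | wc -c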
Installation
============

 1. Unpack the archive file 'fluves_cdrom_2.2.0.tar.gz' using the command

     % zcat fluves_cdrom_2.2.0.tar.gz | tar xvf -

    This creates a directory 'fluves' which contains the following
    subdirectories:

    - calib    : calibration and reference frames needed for some of the
                 Recipes.
    - doc      : contains the UVES pipeline User Manual.
    - ex       : contains examples of Reduction Blocks for UVES (subdirectory
                 'rb'), the required raw and calibration data (directory
                 'cal'), and a directory ('pro') to which the products will
                 be written. For reasons of limited space, some examples also
                 refer to data used by the tutorial in the tutorial/demo
                 directory. A small procedure, demo_rb.prg, to execute all
                 the RB examples in cascade is also present.
    - pipeline : contains the data reduction procedures. The directory 'uves'
                 is a symbolic link to the directory 'uves-2.2.0'; under it
                 there are the basic directories context/, exec/ and proc/,
                 and some files: the Makefile, setup, ReadMe, ReleaseNotes,
                 Disclaimer.txt and HowToInstallForUser.txt. The uves/
                 subdirectory is the actual directory with the uves-echelle
                 pipeline procedures and the configuration files for the
                 individual recipes (directories 'uves/uves/calibDB/ech/rec'
                 and 'uves/uves/calibDB/rul'), while the flames/ subdirectory
                 contains the software supporting FLAMES-UVES data reduction.
    - tutorial : contains in its subdirectory demo/ all the data necessary to
                 run the UVES tutorial (TUTORI/UVES) or the FLAMES-UVES
                 tutorial (TUTORI/FLAMES), and a directory test/ in which to
                 conveniently run the tutorials (please clean it up after
                 each test and before a new one).

 2. Execute the setup procedure setup.sh:

     % cd fluves
     % ./setup.sh

    This compiles the data reduction package, modifies the Reduction Block
    examples to be consistent with the installation directory, and creates a
    MIDAS startup procedure ('pipe.prg') in the ex/rb/ directory which
    defines the keywords required to execute UVES or FLAMES-UVES Reduction
    Blocks.

 3. If you use a tcsh or bash shell, source the proper file (.tcsh_uves_env
    or .bash_uves_env) as indicated by the installation procedure. This file
    does the following (the settings below are given for a tcsh environment;
    a bash sketch is given at the end of this step):

    A. Defines a shell variable $PIPE_HOME as described by the output of the
       ./setup.sh procedure. This is a prerequisite for executing Reduction
       Blocks.

        % setenv PIPE_HOME /installation_directory/fluves/pipeline/

       for example

        % setenv PIPE_HOME ${MIDASHOME}/${MIDVERS}/fluves/pipeline/

    B. Defines a shell variable $UVES_HOME as described by the output of the
       ./setup.sh procedure. This is a prerequisite for running the UVES
       tutorial. For practical reasons the tutorial is supposed to be run
       from the dedicated $UVES_HOME/tutorial/test/ directory; please
       remember to clean it up after use and before a new test, to save
       precious disk space.

        % setenv UVES_HOME /installation_directory/fluves/

       for example

        % setenv UVES_HOME ${MIDASHOME}/${MIDVERS}/fluves/

    C. Includes in your PATH the directories $PIPE_HOME/uves/uves/scripts and
       $PIPE_HOME/uves/flames/scripts, as specified by the setup.sh
       installation script.

        % setenv PATH ${PATH}:${PIPE_HOME}/uves/uves/scripts:${PIPE_HOME}/uves/flames/scripts

    D. Defines a useful alias "flmidas" as

        alias flmidas 'inmidas -j "@d pipeline.start; @d pipeline.control D; SET/CONT flames $PIPE_HOME/uves/context; mode(3) = 0"'

       This is useful to automatically start a MIDAS session loading the UVES
       and the FLAMES-UVES contexts. For example, define the alias like:

        % alias flmidas='inmidas -j "@d pipeline.start; @d pipeline.control D O; set/context flames ${MIDASHOME}/${MIDVERS}/fluves/pipeline/uves/context; mode(3) = 0"'

    It is therefore recommended to put the definitions listed in the
    above-mentioned file into your shell startup file (.profile or .cshrc).
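For bash users, the corresponding definitions are written by setup.sh into
.bash_uves_env. The sketch below shows the equivalent bash lines; the file
generated for your installation is authoritative, and /installation_directory
is again a placeholder:

 export PIPE_HOME=/installation_directory/fluves/pipeline/
 export UVES_HOME=/installation_directory/fluves/
 export PATH=${PATH}:${PIPE_HOME}/uves/uves/scripts:${PIPE_HOME}/uves/flames/scripts
 alias flmidas='inmidas -j "@d pipeline.start; @d pipeline.control D O; set/context flames ${PIPE_HOME}/uves/context; mode(3) = 0"'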
Please note the following: if you decide to move the distribution to another
location, the whole installation procedure described above has to be
repeated, otherwise the pipeline will not work properly.

Usage
=====

To execute a Reduction Block, run a MIDAS session and execute the procedure
'pipe.prg' contained in the directory 'ex/rb'. This procedure does the
following:

 - Defines the MIDAS global keywords CALIBDB_RUL and CALIBDB_REC, which point
   to the location of the pipeline configuration files.
 - Enables the context RBS.
 - Starts the pipeline.
 - Enables the option to overwrite data products.
 - Loads the UVES context.

The procedure 'pipe.prg' can be executed at MIDAS start-up using the UNIX
command

 % inmidas -P -j "@@ pipe.prg"

Available MIDAS commands are:

 - EXECUTE/RB               : executes the specified Reduction Block file.
 - START/PIPELINE Display   : enables graphical output.
 - START/PIPELINE NoDisplay : disables graphical output.

UVES Tutorial (to be executed in the directory ${UVES_HOME}/tutorial/test):

 % inmidas -P
 MIDAS> TUTORI/UVES

FLAMES and UVES Context:

 % inmidas -P
 MIDAS> SET/CONTEXT flames $PIPE_HOME/uves/context/

where $PIPE_HOME indicates the actual value of that environment variable for
your installation, for example:

 MIDAS> SET/CONTEXT flames /midas/04SEPpl1.0/fluves/pipeline/uves/context/

FLAMES-UVES Tutorial (to be executed in the directory
${UVES_HOME}/tutorial/test):

 % inmidas -P
 MIDAS> TUTORI/FLAMES

UVES Context:

 % inmidas -P
 MIDAS> SET/CONTEXT flames $PIPE_HOME/uves/context/

where $PIPE_HOME indicates the actual value of that environment variable for
your installation, for example:

 MIDAS> SET/CONTEXT flames /midas/04SEPpl1.0/fluves/pipeline/uves/context/

How to build Reduction Blocks
=============================

In order to build Reduction Blocks to reduce your own data, please use the
examples contained in 'ex/rb' as templates. The structure of a Reduction
Block is always the same. It contains:

 - The Recipe name: please choose one of the available recipes as defined in
   the directory 'pipeline/uves/uves/calibDB/rul'. The available recipes are
   also listed in the User Manual.

 - The instrument name: 'uves'.

 - The product file name prefix (absolute). The individual product file names
   are derived from this prefix by adding a running index and the suffix
   '.fits' for FITS images or '.tfits' for FITS tables. Please ensure that
   the prefix points to an existing and writable directory.

 - A set of input frames, surrounded by a pair of curly brackets. Apart from
   the absolute file name of the input frame, each line also contains the
   frame category, which is defined in the User Manual. It is essential to
   attach the correct category to each frame.

 - A set of reference or calibration frames, surrounded by a pair of curly
   brackets. Apart from the absolute file name of the reference frame, each
   line also contains the frame category, which is defined in the User
   Manual. It is essential to attach the correct category to each frame.
   Which reference files are required for each Recipe is defined in the
   related configuration file in the directory
   'pipeline/uves/uves/calibDB/rul' and is also described in the User Manual.

 - Each Recipe is controlled by a number of parameters, which are defined in
   the related configuration file in the directory
   'pipeline/uves/uves/calibDB/rul'. The parameter values specified there can
   optionally be overridden by adding new values at the end of the Reduction
   Block, in the order in which the parameters are defined in the
   configuration file.

An example of each available recipe can be found in the ex/rb/ directory.
These RBs should be considered templates for building new ones; a schematic
sketch of the layout is given below.
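Purely as an illustration of the layout just described, the skeleton of a
Reduction Block might look like the sketch below. The recipe name, the file
names and the chosen frame categories are placeholders and are NOT taken from
a working template; copy an actual example from ex/rb/ to get the exact
syntax expected by EXECUTE/RB. The first three lines give the recipe name,
the instrument name and the absolute product file name prefix; the first
curly-bracket block lists the input raw frames with their categories, the
second one the reference frames:

 uves_cal_mflat
 uves
 /data/uves/pro/mflat_blue_
 {
 /data/uves/raw/flat_blue_01.fits    FLAT_BLUE
 /data/uves/raw/flat_blue_02.fits    FLAT_BLUE
 }
 {
 /data/uves/cal/mbsbe1x1.fits        MASTER_BIAS_BLUE
 /data/uves/cal/ord346d1be1x1.tfits  ORDER_TABLE_BLUE
 /data/uves/cal/drs346d1be1x1.tfits  DRS_SETUP_BLUE
 }

A file like this (saved, for example, as mflat_blue.rb) would then be run
from a MIDAS session started with pipe.prg:

 % inmidas -P -j "@@ pipe.prg"
 MIDAS> EXECUTE/RB mflat_blue.rb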
Reference frames
================

As mentioned above, one or more calibration or reference frames have to be
given for each Recipe. The calibration frames needed for your reduction are
usually master calibration frames included on the CD-ROM with your Service
Mode data. The CD-ROM may also contain raw calibration frames which can be
used to create master frames.

The reference frames needed for the different Recipes are included on this
CD-ROM in the directory 'calib/'. We have limited this list to the frames
which you absolutely need before doing pipeline processing. They include a
table containing the atmospheric extinction coefficients, a table containing
the fluxes for a limited list of standard stars, and a table containing a
list of reference ThAr lines.

DRS setup tables are empty tables which control the data reduction process
through their descriptors; all global keywords are stored in these
descriptors. In principle, DRS tables are classified saved sessions (see
SAVE/ECHELLE). This guarantees a standardized behaviour of the UVES pipeline.
DRS tables may be created using SAVE/DRS.

For space reasons we have NOT included all the possible reference frames, for
example those present in the calibration database on Paranal or in Garching.
Users are expected to have a complete set of calibration and science frames
and, using the UVES MIDAS context, to build up all the frames they may
actually need. To do so, they are invited to read the UVES context cookbook
included in the documentation. Once the reference frames (DRS setup tables,
master biases, darks, flats, etc.) have been created, one can go on with the
data reduction either by simply using the UVES MIDAS context, or by building
new RBs from the data just created in the interactive MIDAS session and then
running them with the EXECUTE/RB command. An illustrative sketch of the
DRS-table step is given below.
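The details of this interactive preparation are given in the UVES context
cookbook. Purely as an illustration, and assuming that SAVE/DRS takes the
name of the table to be written (the table name below is hypothetical, and
the exact parameters are described in the cookbook), saving the current
session as a DRS setup table might look like:

 % inmidas -P
 MIDAS> SET/CONTEXT flames $PIPE_HOME/uves/context/
 MIDAS> ...                        (interactive reduction of a complete set
                                    of calibration frames, see the cookbook)
 MIDAS> SAVE/DRS drs580d1re1x1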
The included calibration frames are listed below with their frame category
and relevant header keywords. Please refer to the User Manual or to the
Recipe definitions in 'pipeline/uves/uves/calibDB/rul' for a description of
which reference frames are required for the different Recipes.

- UVES: directory uves/ex/cal (calibration frames used by the RB examples)

  -------------------------------------------------------------
  frame name                          DO_CLASSIFICATION
  -------------------------------------------------------------
  bkg346d1be1x1.tfits                 BACKGR_TABLE_BLUE
  drs346d1be1x1.tfits                 DRS_SETUP_BLUE
  gue346d1be1x1.tfits                 LINE_TABLE_BLUE
  lin346d1be1x1low.tfits              LINE_TABLE_BLUE1
  lin346d1be1x1med.tfits              LINE_TABLE_BLUE2
  lin346d1be1x1upp.tfits              LINE_TABLE_BLUE3
  mff346d1be1x1s10.fits               MASTER_FLAT_BLUE
  mbsbe1x1.fits                       MASTER_BIAS_BLUE
  ord346d1be1x1.tfits                 ORDER_TABLE_BLUE

- UVES: directory uves/calib (calibration frames of general utility)

  -------------------------------------------------------------
  frame name                          DO_CLASSIFICATION
  -------------------------------------------------------------
  atmoexan.tfits                      EXTCOEFF_TABLE
  flxstd.tfits                        FLUX_STD_TABLE
  thargood_2.tfits                    LINE_REFER_TABLE
  uves_flxstd.tfits                   FLUX_STD_TABLE
  mfc346d0be1x1.fits                  MASTER_FORM_BLUE
  mfc437d0be1x1.fits                  MASTER_FORM_BLUE
  mfc520d0re1x1.fits                  MASTER_FORM_REDL
  mfc520d0rm1x1.fits                  MASTER_FORM_REDU
  mfc580d0re1x1.fits                  MASTER_FORM_REDL
  mfc580d0rm1x1.fits                  MASTER_FORM_REDU
  mfc860d0re1x1.fits                  MASTER_FORM_REDL
  mfc860d0rm1x1.fits                  MASTER_FORM_REDU

- FLAMES-UVES: directory uves/ex/cal

  -------------------------------------------------------------
  frame name                          DO_CLASSIFICATION
  -------------------------------------------------------------
  fibreff_580o1re1x1_data01.fits      FIB_FF_DT1_REDL
  fibreff_580o1re1x1_data02.fits      FIB_FF_DT2_REDL
  fibreff_580o1re1x1_badpixel01.fits  FIB_FF_BP1_REDL
  fibreff_580o1re1x1_badpixel02.fits  FIB_FF_BP2_REDL
  fibreff_580o1re1x1_sigma01.fits     FIB_FF_SG1_REDL
  fibreff_580o1re1x1_sigma02.fits     FIB_FF_SG2_REDL
  fibreff_580o1re1x1_common.fits      FIB_FF_COM_REDL
  fibreff_580o1re1x1_nsigma.fits      FIB_FF_NSG_REDL
  fibreff_580o1re1x1_norm.fits        FIB_FF_NOR_REDL
  fibreff_580o1rm1x1_data01.fits      FIB_FF_DT1_REDU
  fibreff_580o1rm1x1_data02.fits      FIB_FF_DT2_REDU
  fibreff_580o1rm1x1_badpixel01.fits  FIB_FF_BP1_REDU
  fibreff_580o1rm1x1_badpixel02.fits  FIB_FF_BP2_REDU
  fibreff_580o1rm1x1_sigma01.fits     FIB_FF_SG1_REDU
  fibreff_580o1rm1x1_sigma02.fits     FIB_FF_SG2_REDU
  fibreff_580o1rm1x1_common.fits      FIB_FF_COM_REDU
  fibreff_580o1rm1x1_nsigma.fits      FIB_FF_NSG_REDU
  fibreff_580o1rm1x1_norm.fits        FIB_FF_NOR_REDU
  gor580o1rm1x1.tfits                 FIB_ORD_GUE_REDU
  gor580o1re1x1.tfits                 FIB_ORD_GUE_REDL
  gue580o1rm1x1.tfits                 FIB_LIN_GUE_REDU
  gue580o1re1x1.tfits                 FIB_LIN_GUE_REDL
  lin580o1rm1x1.tfits                 FIB_LINE_TABLE_REDU
  lin580o1re1x1.tfits                 FIB_LINE_TABLE_REDL
  slitff_580o1re1x1_data01.fits       SLIT_FF_DT1_REDL
  slitff_580o1re1x1_data02.fits       SLIT_FF_DT2_REDL
  slitff_580o1re1x1_badpixel01.fits   SLIT_FF_BP1_REDL
  slitff_580o1re1x1_badpixel02.fits   SLIT_FF_BP2_REDL
  slitff_580o1re1x1_sigma01.fits      SLIT_FF_SG1_REDL
  slitff_580o1re1x1_sigma02.fits      SLIT_FF_COM_REDL
  slitff_580o1re1x1_bound01.fits      SLIT_FF_BN1_REDL
  slitff_580o1re1x1_bound02.fits      SLIT_FF_BN2_REDL
  slitff_580o1re1x1_common.fits       SLIT_FF_COM_REDL
  slitff_580o1re1x1_norm.fits         SLIT_FF_NOR_REDL
  orf580o1rm1x1.tfits                 FIB_ORDEF_TABLE_REDU
  orf580o1re1x1.tfits                 FIB_ORDEF_TABLE_REDL
  slitff_580o1rm1x1_data01.fits       SLIT_FF_DT1_REDU
  slitff_580o1rm1x1_badpixel01.fits   SLIT_FF_BP1_REDU
  slitff_580o1rm1x1_sigma01.fits      SLIT_FF_SG1_REDU
  slitff_580o1rm1x1_bound01.fits      SLIT_FF_BN1_REDL
  slitff_580o1rm1x1_common.fits       SLIT_FF_COM_REDU
  slitff_580o1rm1x1_norm.fits         SLIT_FF_NOR_REDU

Known Reduction problems:
------------------------

The UVES pipeline is a project under continuous development. Here we list the
known problems and limitations.

Problems:
========

o Known installation problems. To allow a proper installation of the
  pipeline:

  1. Check that you have a MIDAS release 04SEPpl1.0 or higher.
  2. Check that 04SEPpl1.0/local/default.mk defines MIDASHOME as the
     directory where 04SEPpl1.0 is located.
  3. Check that the binary 'ar' (to create, modify, and extract from
     archives) is included in your local path (on Solaris it is usually
     found under /usr/ccs/bin).

  A quick shell check of these points is sketched below.
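The sketch below (bash; the paths depend on your MIDAS installation) checks
these three points:

 ls -d ${MIDASHOME}/${MIDVERS}                             # 04SEPpl1.0 or later installed?
 grep MIDASHOME ${MIDASHOME}/${MIDVERS}/local/default.mk   # MIDASHOME defined correctly?
 which ar || ls -l /usr/ccs/bin/ar                         # 'ar' available in the PATH?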
Limitations:
============

o Physical model (command PREDICT/UVES): after major earthquake events the
  instrument may suffer significant shifts of the spectral format which are
  not tolerated by the physical model. This shows up in the final check
  plots: the XDIF vs X and YDIF vs Y plots lose their characteristic
  clustering and show randomly scattered points instead. In this case it is
  necessary to apply offsets to the relevant parameters, as described in the
  high-level documentation.
  For observations taken in a non-standard configuration, with an X shift
  greater than 5 pixels, the physical model may still fail. Again,
  appropriate offsets should be applied to the parameters, in this case in
  particular to the X component of the XYtrans parameter (P4).
  In case of an aborted calibration exposure, which can be recognised by
  inspecting the frame and noting the absence of the order traces, this step
  may fail, since obviously no line can be identified if no order trace is
  present.

o The order position data reduction step may fail if no appropriate DRS setup
  table is found. This is typical when the pipeline runs in online mode,
  using as reference solutions the ones stored in a calibDB, because the
  calibDB contains DRS setup tables covering only the standard settings. In
  this case one should take a homogeneous and complete set of raw calibration
  frames, all obtained with the same instrument setting, and (given the ThAr
  line table, for example thargood_2.tfits) use the script uves_popul.sh to
  build a complete set of reference solutions, which will also include the
  proper DRS setup table.

o The wavelength calibration step may fail if no signal is present on the
  frame, which is usually the sign of an aborted calibration exposure.

o The master flat field creation step may fail in automatic (online) pipeline
  processing if the input raw frame section of the Reduction Block contains
  two files with inconsistent DO_CLASSIFICATION values, e.g. FLAT_RED and
  FLAT_BLUE, ARC_LAMP_BLUE and FLAT_BLUE, or ARC_LAMP_BLUE and FLAT_RED. This
  problem is usually due to an incorrect RB created by the Data Organizer,
  which can happen when an observation is aborted.

o The standard star data reduction to determine the instrument efficiency
  (recipe uves_cal_response) may fail if the observed standard star is not
  contained in the reference table of standard stars with known fluxes. In
  this case essential information is missing and the recipe cannot succeed.

o Optimal extraction (command REDUCE/UVES): although the extraction quality
  improved significantly with version 1.0.6, it is still limited, in
  particular for high S/N data (S/N ratio greater than 50). We also suggest
  to always check the extraction quality using the dedicated commands
  MPLOT/CHUN and PLOT/CHUN, verifying in particular how well the fit
  corresponding to the blue trace matches the dark (raw data) and magenta
  (after k-sigma clipping) points.
  In the optimal extraction the maximum allowed length of an input raw file
  name is 54 characters (which should anyway be enough). This is due to the
  use of a buffer limited to 96 characters to display information on the data
  reduction. It is in any case a good idea not to use overly long file names.

o The background extraction (occurring during master flat preparation,
  science extraction and efficiency determination) should be checked using
  the dedicated command MPLOT/BKGR.

Help
====

For questions, comments, problems, etc. related to the UVES pipeline, please
use the ESO-MIDAS Problem Report Form on the MIDAS Web page
http://www.eso.org/projects/esomidas/ (please choose the category 'Pipeline'
on the form). You can also contact midas@eso.org.