Difference: PacsCalibrationWeb (100 vs. 101)

Revision 101 - 2013-12-11 - KatrinaExter

Line: 1 to 1
 
META TOPICPARENT name="WebHome"

PACS instrument and calibration web pages

Introduction

Changed:
<
<
This page provides up-to-date information about using the PACS instrument: from preparing observations to reducing your data. We include information summaries, technical reports, information about and links to HIPE--the data processing environment for Herschel data--and the PACS data reduction guides, and cookbooks and scripts that you can use in HIPE when working with PACS data. This page also provides you with the latest calibration accuracies and known PACS calibration issues, and information about what future calibration and processing improvements can be expected.
>
>
This page provides up-to-date information about using the PACS instrument: from preparing observations to reducing your data. This includes information on the PACS instrument, PACS data, reducing PACS data in HIPE, and post-pipeline processing, as well as links to tutorials and scripts that you can run in HIPE. The calibration accuracies and technical information about the PACS spectrometer and photometer are also provided here, together with information about what future calibration and processing improvements can be expected.
 

Observing with PACS

Changed:
<
<
  • The PACS Observer's Manual HTML PDF (11 Mb), version 2.3, 8-June-2011 : the first thing to read before applying for time with PACS or before working on PACS data for the first time, as it tells you how the instrument works. This includes:
>
>
  • The PACS Observer's Manual HTML PDF (11 Mb), version 2.3, 8-June-2011 : the first thing to read before applying for time with PACS (or even before working on PACS data for the first time), as it tells you how the instrument works. This includes:
 
    • A description of the layout and the components of the PACS photometer and spectrometer
Changed:
<
<
    • A description of the scientific capabilities of the instrument: spectral response functions, sensitivity values, point spread functions, astrometric accuracy; these can also be found in the AOT release notes
    • A description of the standard observing templates that were used to set up PACS observations; here you can also learn the lingo that is used in the PACS data reduction guides (DRGs)
    • A brief description of PACS data products; though much more detail on this is provided in the PACS DRGs
>
>
    • A description of the scientific capabilities of the instrument: spectral response functions, sensitivity values, point spread functions, astrometric accuracy, flux calibration information
    • A description of the standard observing templates used to set up PACS observations; here you can also find the various acronyms that are used in the PACS data reduction guides
    • A brief description of PACS data products (although much more detail is provided in the appendices of the PACS data reduction guides)
 
Changed:
<
<
  • AOT Release Notes: dedicated release notes per AOT (the astronomer's observing template, i.e. planning your observing time).
>
>
  • AOT Release Notes: dedicated release notes per AOT (the astronomer's observing template, i.e. the planning of your observing time).
 
    • Information about how the various standard observing blocks work
Changed:
<
<
    • Summaries of transmission functions, sensitivity, etc. for use in your observing planning
    • We are now at the end of the mission. But these release notes can still be useful to read for a background understanding on how PACS data were gathered--this dictates what you will see as you look at your PACS data while pipeline processing them
    • You can also learn here the lingo that is used in the PACS DRGs when describing the data reduction pipeline scripts
>
>
    • Summaries of transmission functions, sensitivity, etc. for use in your observing planning (similar to what you will find in the Observer's Manual)
    • Here you can also find the various acronyms that are used in the PACS data reduction guides
    • We are now at the end of the mission. But these release notes can still be useful to read for a background understanding on how PACS data were gathered - this dictates what you will see as you look at your PACS data while pipeline processing them
 
Line: 28 to 28
 

Reducing PACS data

Brief explanation

A brief introduction to reducing PACS data in HIPE. You can consult the PACS Data Reduction Guides (photometry and spectroscopy) for more detail.
Changed:
<
<
  • The data you get from the the HSA will have been processed by the "SPG" (Standard Product Generator), which is another way of saying that it is processed with a tailored version of the latest pipeline scripts from the User Release. So, when HIPE User Release 11.0 is released, soon after all the Herschel data are processed with the SPG pipeline scripts of version 11.0, and so on for each User Release.
  • For PACS these SPG scripts are a copy of one flavour of interactive pipeline scripts. The SPG scripts include all the stable pipeline tasks with settings that correspond to the most common type of science target and observing plan (AOT). But some pipeline tasks can only be run via the interactive scripts. The Launch Pads (see below) of the data reduction guides brings you up-to-date on this matter.
  • This makes the SPG results a good starting point to look at your PACS data, but in most cases you can improve the results at least somewhat by reducing the data yourself.
  • For photometry and spectroscopy both there is more than one flavour of pipeline script, tailored to different types of science target or observing plan. These "interactive" pipeline scripts are provided in HIPE and explained in the data reduction guides.
>
>
  • PACS data are reduced with pipeline scripts, which are sets of command-line tasks that process the data from Level 0 (raw) to Level 2/2.5 (science-ready). There is more than one flavour of pipeline script, tailored to different types of science target, AOT, and observing plan (e.g. mapping or single pointing for spectroscopy). These 'interactive' pipeline scripts are provided in HIPE and explained in the data reduction guides.
  • The data you get from the HSA will have been processed by the 'SPG' (Standard Product Generator), meaning that they are processed with a tailored version of the latest pipeline scripts from the User Release. For example, when HIPE User Release 11.0 is released, soon after, all the Herschel data are processed with the SPG pipeline scripts of version 11.0, and so on for each User Release.
  • These SPG scripts are a copy of one flavour of the interactive pipeline scripts, differing only in the AOT type. The SPG scripts include all the stable pipeline tasks with settings that correspond to the most common type of science target for each AOT. But some pipeline tasks can only be run via the interactive scripts, and modifying the important parameter settings of pipeline tasks also requires you to re-run the pipeline. The Launch Pads (see below) include a guide to understanding the pipeline scripts and how to decide which to run.
  • The SPG results are a good starting point for looking at your PACS data, but in most cases you can improve the results at least somewhat by reducing the data yourself.
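As a minimal sketch of that starting point (HIPE jython, so it runs only inside HIPE; the obsid is a placeholder and the 'creator' meta keyword is quoted from memory, so verify both against your own data):

```python
# HIPE jython sketch: fetch an observation from the HSA and check which
# SPG version produced it, to help decide whether to re-reduce it yourself.
# The obsid below is a placeholder; replace it with your own observation's.
obs = getObservation(1342186862, useHsa=True)

# The observation context records its processing history in its meta data;
# the "creator" keyword normally holds the SPG version string (assumption).
print(obs.meta["creator"].value)
```

If the version printed is older than the current User Release, reprocessing with the interactive pipeline scripts is more likely to pay off.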
 

HIPE and data reduction documentation

Changed:
<
<
  • HIPE (Herschel Interactive Processing Environment) is the tool used to inspect, reduce, and analyse Herschel data. The latest User Release HCSS (Herschel common science system) version that you should use for reducing PACS data is HIPE v11.1 It can be downloaded from: http://herschel.esac.esa.int/HIPE_download.shtml. In the CIB (continuous integration build) this version corresponds to Track 11, build 3010. (The CIB is the continuously bug-fixed/upgraded/improved version of HIPE, which every X months becomes a stable User Release. The CIB has the latest software in it, but it will not be bug-free.)
>
>
  • HIPE (Herschel Interactive Processing Environment) is the tool used to inspect, reduce, and analyse Herschel data. The latest User Release HCSS (Herschel common science system) version that you should use for reducing PACS data is HIPE v11.1. It can be downloaded from: http://herschel.esac.esa.int/HIPE_download.shtml. In the CIB (continuous integration build) this version corresponds to Track 11, build 3010. (The CIB is the continuously bug-fixed/upgraded/improved version of HIPE, which every X months becomes a stable User Release. The CIB has the latest software in it, but it will not be bug-free.)
 
Changed:
<
<
  • Within HIPE you can access all the PACS data reduction documentation and the general HCSS and HIPE -user documentation for Track 11. The documentation provided via HIPE opens in a web browser, but for those of you who prefer PDF, we include the PACS Data Reduction Guides as PDF files here (note that within the standalone pdf versions, external links will not work):
>
>
  • Within HIPE you can access all the PACS data reduction documentation and the general HCSS and HIPE user documentation for Track 11. The documentation provided via HIPE opens in a web browser, but for those of you who prefer PDF, we include the PACS Data Reduction Guides as PDF files here (note that within the standalone PDF versions, external links will not work):
 

Changed:
<
<
  • The documentation-set provided via HIPE includes the following:
    • The PACS(and HIFI and SPIRE) DRGs. The main function of the "PDRGs" is to take you through reducing your data with the interactive pipelines. They also show you how to quick-look at the already-reduced data you get from the HSA, what to consider before and after reducing your data, and explain what is contained in the PACS data products you get from the HSA.
>
>
  • The full documentation-set provided via HIPE includes the following:
    • The PACS (and HIFI and SPIRE) DRGs. The main function of the PACS DRGs (photometry and spectroscopy) is to take you through reducing your data with the interactive pipelines, explaining the steps and the individual tasks in more detail and showing you how to inspect your results. This makes them rather long documents, and they should be read along with the pipeline scripts rather than on their own. They also show you how to take a quick look at the SPG products you get from the HSA, what to consider before and after reducing your data, and they explain what is contained in the PACS data products you get from the HSA.
 
    • A guide to using HIPE itself (i.e. HIPE as a GUI rather than a scientific tool).
Changed:
<
<
    • A Data Reduction Guide, which is about working with all Herschel (or any other) data in HIPE: the various data analysis tools and data viewers are explained here
    • A Scripting Guide: the language of HIPE is "HIPE's version of jython", and it is intended to be a full scripting environment in which you can manipulate data, do mathematics, and view data in various ways. The "SG" is a guide to scripting in HIPE, although you should be comfortable with scripting yourself (preferably with python, jython, or JAVA) before embarking on scripting in HIPE.
    • Reference manuals, for most of the tasks that you can find in HIPE, and to all the product classes that you can find in HIPE (these tell you e.g. how to manipulate spectra and images directly by querying on the product, rather than using a pre-provided task).
>
>
    • The Data Reduction Guide, which is about working with all Herschel (or any other) data in HIPE: the various data analysis tools and data viewers are explained here.
    • The Scripting Guide: the language of HIPE is 'HIPE's version of jython', and it is a full scripting environment in which you can manipulate data, do mathematics, and view data in various ways. The 'SG' is a guide to scripting in HIPE. It is not necessary to be comfortable with scripting already before embarking on scripting in HIPE, but it does help.
    • Reference manuals. For most of the tasks that you can run in HIPE the description of what they do and listings of all the parameters can be found in the 'User's Reference Manual'. To learn more about the various HIPE product classes you can read the JAVA docs (APIs) a.k.a. the 'Developer's Reference Manual'. These tell you e.g. how to manipulate spectra and images directly by querying on the product, rather than using a pre-provided task.
 
  • The what's new in HIPE 11 page lists the changes in HIPE version 11.1 with respect to the 10.x series and provides a detailed list of updated functionality and calibration aspects.
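To give a first taste of the SG's subject matter: jython shares Python's syntax, so plain language constructs work directly in the HIPE console (only generic constructs are shown below; HIPE's own numeric, plotting, and product classes come on top of these):

```python
# A taste of jython syntax as used in the HIPE console.  Only plain
# language constructs are shown; HIPE adds its own numeric, plotting,
# and product classes on top of these.

# Lists and loops: the PACS photometer bands, in microns.
bands = [70.0, 100.0, 160.0]
for b in bands:
    print("band: %i micron" % b)

# Functions work as in any Python dialect.
def to_metres(micron):
    """Convert a wavelength in microns to metres."""
    return micron * 1.0e-6
```

Anything more than a few lines is best kept in a script file, which you can open and run from the HIPE editor.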

Deleted:
<
<
 
Changed:
<
<

Cookbooks and interactive pipeline scripts

  • The pipeline scripts can be seen as cookbooks since they take you through each pipeline flavour, explaining briefly what each task does, commenting on the more crucial pipeline tasks, and showing you how to plot, image, visualise and inspect your data as you work through the pipeline. An example, public, observation is included with each so you can test it out before using it on your data. These data reduction scripts are available in HIPE under the menu: Pipeline --> PACS --> Photometer/Spectrometer.
  • The PDRGs explain in more detail what each pipeline task does and how to work your way through the more critical stages of the pipeline. The PDRGs also explain how to decide which pipeline flavour(s) to run on your data.
  • The PACS Launch Pad from June 2013 for photometry are provided here for photometry. This is taken from the first chapter of the PDRGs and is a useful quick-start guide to loading your data into HIPE, and then what to know and do before you begin reprocessing your data with one of the pipelines.
  • The PACS Launch Pad from July 2013 for spectroscopy (for Track 11) are provided here for spectroscopy. This is taken from the first chapter of the PDRGs and is a useful quick-start guide to loading your data into HIPE, and then what to know and do before you begin reprocessing your data with one of the pipelines. In addition, we take you through all the things you need to think about before reprocessing your PACS spectroscopy through the pipeline yourself:
>
>

Cookbooks and interactive pipeline scripts

  • The various pipeline scripts provided for PACS photometry and spectroscopy can be seen as cookbooks, since they take you through each pipeline, task by task (on the command-line), explaining briefly what each task does, commenting on the more crucial pipeline tasks, and showing you how to plot, image, visualise and inspect your data as you work through the pipeline. An example public observation is included with each so you can test it out before using it on your own data. These data reduction scripts are available in HIPE under the menu: Pipeline --> PACS --> Photometer/Spectrometer.

  • The PACS Launch Pad from June 2013 for photometry is provided here. The PACS Launch Pad from July 2013 for spectroscopy is provided here. These are taken from the first chapter of the respective PDRG and are a useful quick-start guide to loading your data into HIPE and looking at them, and then to what to know and do before you begin reprocessing your data with one of the pipelines. In addition, for spectroscopy we take you through all the things you need to think about before reprocessing your PACS data through the pipeline yourself:
 
    • why we recommend that you re-pipeline your data
    • what you need to pay attention to for different types of astronomical source
    • which post-pipeline processing tasks you can, or must, run
 
Changed:
<
<
Tutorials and scripts:
>
>

Tutorials and scripts

 
Added:
>
>
 
  • HIPE Academy on YouTube: here you can find recordings of various seminars and webinars that the HSC have given on working in HIPE, reducing Herschel data, using various tools to visualise and manipulate data in HIPE, and so on.
Changed:
<
<
  • In HIPE there is a Scripts menu in which you can find various "useful scripts" for working with PACS data in HIPE. For example, for spectroscopy there is a script showing how to fit the spectra in cubes and make integrated flux images from them; for photometry we show how to do point source aperture photometry. These are written as scripts which you can open in HIPE and run on a test dataset, and in most cases you can replace the test dataset with your own and take it from there. Please do note that these scripts do not explain how to use the GUI version of the tasks--for this you need to read the PDRGs or the general Data Reduction Guide.
>
>
  • In HIPE there is a Scripts menu in which you can find various "useful scripts" for working with PACS data in HIPE. For example, for spectroscopy there is a script showing how to fit the spectra in cubes and make integrated flux images from them; for photometry we show how to do point source aperture photometry. These are written as scripts which you can open in HIPE and run on a test dataset, and in most cases you can replace the test dataset with your own and take it from there. Please do note that these scripts do not explain how to use the GUI version of the tasks - for this you need to read the PDRGs or the general Data Reduction Guide.
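To give a flavour of what these useful scripts look like on the command-line, here is a minimal sketch of point-source aperture photometry on a PACS map (HIPE jython, so it runs only inside HIPE; the product name, task parameters, and coordinates are quoted from memory and are placeholders, so check them against the User's Reference Manual and your own observation before use):

```python
# HIPE jython sketch: point-source aperture photometry on a PACS map.
# "HPPPMAPB" is assumed to be the blue-band high-pass-filtered map in a
# Level-2 observation context; coordinates and radii below are placeholders.
myMap = obs.level2.refs["HPPPMAPB"].product

phot = annularSkyAperturePhotometry(image=myMap, fractional=1, \
           centerRA="5:35:17.3", centerDec="-5:23:28", \
           radiusArcsec=12.0, innerArcsec=35.0, outerArcsec=45.0)

# Inspect the result: a table of target, sky, and sky-subtracted fluxes.
print(phot)
```

The corresponding useful script in the Scripts menu shows the full version, including how to apply the aperture correction afterwards.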
 

PACS calibration file versions

 