5.4. Modifying the pipeline

You may wish to modify the pipeline in one of the following ways:

5.4.1. Using the interactive Level 2.5 pipeline

The default Level 2.5 pipeline (that is used to create products in the HSA) is observing mode dependent, and this is reflected in the Interactive Level 2.5 Pipeline GUI.

  • Point modes

    HTP are stitched, folded (if Frequency Switch), and converted to simpleSpectrum format. HRS spectra are stitched together using the fillGaps option of doStitch set to True so that any gaps between the subbands are set to NaN. The tasks listed in the Interactive Level 2.5 Pipeline GUI are doStitch, doFold (only for Frequency Switch observations), and convertSingleHifiSpectrum.

  • Mapping modes

    HTP are stitched, folded (if Frequency Switch), and gridded. HRS spectra are stitched together only if the subbands overlap in frequency (by calling doStitch with fillGaps=False) in order to avoid NaNs in the cube. The tasks listed in the Interactive Level 2.5 Pipeline GUI are doStitch, doFold (only for Frequency Switch observations), and doGridding. Note that in the Level 2.5 Interactive Pipeline, doGridding automatically applies the rotation angle used in the observation to the gridding. This contrasts with the pipeline used to populate the HSA, which creates non-rotated cubes and, in the case of maps carried out with a non-zero rotation angle, also a set of rotated cubes.

  • Spectral Scans

    The data are deconvolved using the doDeconvolution task, which is listed in the Level 2.5 Interactive Pipeline GUI.

  • All observing modes

    The mkRmsAlgo algorithm, which uses the mkRms task, is run on the Level 2.5 data. For each type of observing mode, the mkRmsAlgo and mkRms tasks are listed in the Interactive Level 2.5 Pipeline GUI.
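For orientation, the steps above can also be run by hand on the command line. The sketch below is illustrative only: it assumes an Observation Context obs is already loaded, the level and spectrometer keys are placeholders (browse the Observation Context tree for the combination you need), and the full parameter sets accepted by each task are described in Chapter 14, Chapter 15, and the Pipeline Specification Document.

    # Extract a Level 2 HTP (the "level2" and "WBS-H-USB" keys are placeholders)
    htp = obs.refs["level2"].product.refs["WBS-H-USB"].product

    # Point modes: stitch the subbands, setting any gaps to NaN
    htp = doStitch(htp=htp, fillGaps=True)
    # Frequency Switch observations only: fold the spectra
    # htp = doFold(htp=htp)

    # Mapping modes: stitch without filling gaps, then grid into a cube
    # htp = doStitch(htp=htp, fillGaps=False)
    # cube = doGridding(htp=htp)

    # Spectral Scans: deconvolve the whole Observation Context
    # result = doDeconvolution(obs=obs)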

The interactive Level 2.5 pipeline allows you to set up the tasks automatically included in it as you wish prior to running the pipeline. It also allows you to introduce other tasks into the pipeline, with the requirement that the output of each task can be passed as input to the next task in your customised Level 2.5 pipeline.

By default, the interactive Level 2.5 pipeline tab is inoperative when the pipeline GUI is first opened. To use it, check the box to the right of "Interactive Level 2.5 Pipeline" near the bottom of the hifiPipeline GUI, and open the tab by clicking on the arrow to the left. Tasks already included in the Level 2.5 pipeline can also be opened by clicking on the arrow to the left of the task name, and the task options can be modified. See Chapter 14 and Chapter 15 for information about setting up doDeconvolution and doGridding, respectively. Note that upon checking the "Interactive Level 2.5 Pipeline" box, these tasks will be initialised with their default settings: these may not be the same as in the standard pipeline. For example, doGridding will produce only cubes with the flyAngle applied when run this way, but in the standard pipeline, it would produce cubes with the flyAngle applied, and cubes to which no rotation is applied.

Other tasks that accept and pass HTPs can be added to the Level 2.5 pipeline. To find tasks that are applicable to HTPs, click on an HTP in the Variables view or in the Observation Context tree, and look under Applicable in the Tasks View. You can also pass tasks that loop through HTPs in an Observation Context, such as doDeconvolution, fitBaseline, fitHifiFringe and convertK2Jy. These can be found by clicking on an Observation Context in the Variables view or in the Observation Context tree before looking under Applicable in the Tasks View.

To pass a task to the Level 2.5 pipeline, drag it from the Tasks View to the green bullet in the tab, see Figure 5.4. The tasks can be re-ordered by dragging them up and down the list, and can be set up as desired after clicking on the arrow to the left of the task name. Note that adding tasks to the list will remove any changes you made to a panel already in the list, so it is best to add all the tasks you want to use first, and then set them up.

Figure 5.4. Using the Level 2.5 Interactive Pipeline GUI

As usual, the command to run the pipeline with your customised Level 2.5 pipeline will be echoed to the console and can be copied to insert into scripts. The output to the console will also include the commands to customise the Levels 1 and 2 pipelines, unless you unchecked that option in the GUI. If you did not modify the Levels 1 and 2 pipelines then you can safely remove these by deleting params = { ... }.

Writing your own task to provide to the Level 2.5 pipeline

You can provide your own task to run in the Level 2.5 pipeline. The example below shows the framework for doing so:

from herschel.ia.task import Task
from herschel.ia.task import *
from herschel.ia.gui.kernel import ParameterValidatorAdapter, ParameterValidationException
from herschel.ia.all import *

class HtpPrimeInputValidator(ParameterValidatorAdapter):
    def validate(self, val):
        if not isinstance(val, HifiTimelineProduct):
            msg = "The prime input is a %s. It must be a %s." % (val.__class__, HifiTimelineProduct)
            raise ParameterValidationException(msg)

class HifiPipelineTemplateTask(Task):
    """
    @jhelp A task to demonstrate how to write a task that is compatible with
    the HIFI interactive pipeline.

    @jcategory HIFI/Analysis

    @jalias hifiPipelineTemplate

    @jparameter htp, IO, HifiTimelineProduct, MANDATORY, None
    The HifiTimelineProduct to be processed

    @jparameter otherTaskParameter, INPUT, String, OPTIONAL, 'empty'
    Example of a non-prime task parameter

    @jexample How to register this task in HIPE and use it
    # Import your task if it is contained in the software build
    from herschel.hifi.scripts.users.share.HifiPipelineTemplateTask import *
    # Register your task in HIPE, and move task to the interactive pipeline (GUI)
    hifiPipelineTemplate = HifiPipelineTemplateTask()
    # To execute your task on the command-line
    output = hifiPipelineTemplate(htp=htp, otherTaskParameter='myInput')

    @jhistory 2012-02-13 KE Initial version
    """
    def __init__(self):
        Task.__init__(self, "hifiPipelineTemplate")
        self.setDescription("A task to demonstrate how to write a task that is " +
        "compatible with the HIFI interactive pipeline")
        # Prime input/output parameter: the HifiTimelineProduct (htp)
        p = TaskParameter()
        p.name         = 'htp'
        p.type         = TaskParameter.IO
        p.valueType    = HifiTimelineProduct
        p.nullAllowed  = 0
        p.defaultValue = None
        p.description  = 'The HifiTimelineProduct to be processed'
        p.mandatory    = 1
        p.parameterValidator = HtpPrimeInputValidator()
        self.addTaskParameter(p)
        # Non-prime task parameter
        p = TaskParameter()
        p.name         = 'otherTaskParameter'
        p.type         = TaskParameter.IN
        p.valueType    = String
        p.nullAllowed  = 0
        p.defaultValue = 'empty'
        p.description  = 'example of a non-prime task parameter'
        p.mandatory    = 0
        self.addTaskParameter(p)
    def execute(self):
        print self.getName() + " is being executed."
        htpCopy = self.htp
        print htpCopy
        #self.htp = HifiTimelineProduct()
        #self.htp.setDescription("This is a test output")

hifiPipelineTemplate = HifiPipelineTemplateTask()
#output = hifiPipelineTemplate(htp=htp, otherTaskParameter='myInput')
#print output

5.4.2. Customising the Level 1 and 2 pipelines

The Customize Pipeline section of the hifiPipeline GUI can be used to change the defaults used in the pipeline algorithms, and to omit steps from the Level 1 and 2 pipelines.

  • Unhide the Customize Pipeline section by clicking on the arrow; all of the steps in the Levels 1 and 2 pipelines are then displayed.

  • If an Observation Context is loaded into the hifiPipeline GUI then doPipelineConfiguration will read the observing mode from the instMode (not the obsMode) metadata item, and the steps that are not applicable for that observing mode will be greyed out.

    To find the value of the instMode metadata, you can either directly look in the Observation Context metadata, or use the following in the command line:

    obsmode = obs.meta.get("instMode").value
  • By unchecking the tick box by each pipeline step name, you can omit that step when you run the pipeline.

  • By unhiding the panels in the GUI, you can see the parameters used in each step with the default settings for that observing mode, which you can modify as you prefer. Tooltips found by hovering over parameter names give indications of the options available to you, while more information is available in Chapter 4. The pipeline steps are described in detail in the Pipeline Specification Document.

  • The command to run the pipeline as you have configured it is echoed to the console, allowing you to transfer your customised pipeline command to a script. This is a very simple way to put a customised pipeline into a script, but be aware that it will only work for other observations of the same observing mode: the configuration depends on the observing mode, and a different configuration must be created for a different mode.

  • You can also customise the pipeline directly from the command line (without copying the echo to console) using doPipelineConfiguration, although the format is somewhat different from that echoed to the console. The echo to the console from the pipeline includes all the default settings for the observing mode, but when using doPipelineConfiguration, it is only necessary to include the settings for the tasks that you wish to modify. In the example below, the doAvg step is omitted:

    # This passes the instMode metadata value to the pipeline configuration
    config = doPipelineConfiguration(obs)
    # Now change the pipeline configuration to skip doAvg
    config.setParameter('doAvg','ignore', True)
    obs_1 = hifiPipeline(obs=obs, params=config)

    And in this example, doFilterLoads is added to the configuration:

    # This passes the instMode metadata value to the pipeline configuration
    config = doPipelineConfiguration(obs)
    # Now change the pipeline configuration to include doFilterLoads
    config.setParameter('doFilterLoads', 'ignore', False)
    obs_2 = hifiPipeline(obs=obs, params=config)

5.4.3. Editing the pipeline algorithms

The scripts for the algorithms of each stage of the pipeline and the algorithm for calculating the rms noise, mkRmsAlgo, can be found in the Pipeline → HIFI menu in the HIPE toolbar. These scripts are kept automatically up to date, and they have been made as clear and well-commented as possible so that you can modify them. However, it is strongly recommended that you look at the Pipeline Specification Document to understand what the pipeline does, and what alternatives are available for each pipeline step.

  • Edit the pipeline algorithm script (optional), and save it on disk. You can then pass the modified algorithm to the pipeline by specifying the location of the saved script in the appropriate algo field of the GUI. You can browse for your saved script by clicking on the folder icon.

  • You can also pass the path to the saved algorithm script in the command line, e.g.,

    obs = hifiPipeline(obs=obs, level2Algo="/home/me/MyScripts/MyLevel2PipelineAlgo.py")
  • Alternatively, you can make use of the fact that the pipeline algorithm scripts define a function (def TaskName(parameters):), and pass the function name to the pipeline. To do this you must first compile your new pipeline algorithm by running the script with the double arrows (>>) in the HIPE toolbar. In the example below, the function has been called MyLevel1Algo:

    # Include your own algorithm for the Level 1 pipeline, for all spectrometers, from Level 0 to 1:
    obs = hifiPipeline(obs=obs, fromLevel=0, upToLevel=1, level1Algo=MyLevel1Algo)
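A minimal sketch of what such a function might look like is given below. The function name, its parameter list, and its body are placeholders: the actual signature, and the calibration and parameter handling the function must support, should be copied from the corresponding algorithm script in the Pipeline → HIFI menu.

    # Hypothetical skeleton for a customised Level 1 algorithm.
    # Copy the real signature and body from the script supplied in the
    # Pipeline -> HIFI menu; only the overall structure is illustrated here.
    def MyLevel1Algo(htp, cal, params):
        # ... selected Level 1 pipeline steps, e.g. a modified
        # OFF/reference subtraction, go here ...
        return htp

Once compiled with the double arrows (>>), the function name can be passed to the pipeline via the level1Algo parameter as shown above.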

5.4.4. Running the Pipeline step by step

  • If you choose to modify or customise the pipeline, it can be helpful to see directly and quickly what changes will be made to the data. Running the pipeline, or one part of it, step by step allows you to inspect the results of each step and change the default parameters of the pipeline. You will find extended information on every step the pipeline performs in Chapter 4. If you wish to create your own algorithm (which must be written in jython) for a part of the pipeline, then this will likely be your first step.

  • It is not expected that there will be much need to customise the spectrometer pipelines (up to Level 0.5), and indeed there are only a few steps of the spectrometer pipelines that have some options. It is more likely that you may wish to play with how OFF and reference spectra are subtracted in the Level 1 pipeline, although it is expected that the default settings should work well.

  • To step through the pipeline, you must work directly on the HifiTimelineProduct (HTP) of the appropriate level - the dataset containing all the spectra, including calibration spectra, made during an observation for a given spectrometer. So the first thing you must do is extract the HTP you want to work on from your Observation Context:

    • Drag an HTP from the Observation Context tree in either the Context Viewer or Observation Viewer into the Variables view, and rename it if you desire by right clicking on the new variable and selecting "rename".

    • In the command line, the formalism to extract an HTP is

      htp = obs.refs["level2"].product.refs["HRS-V-USB"].product

      "level2" and "HRS-V-USB" should be replaced by the level and backend combination desired.

  • When you select an HTP in the Variables view in HIPE, you will notice many tasks with names like doWbsDark and mkFreqGrid. These are the names of the steps in the HIFI pipeline; mk... signifies a step where a calibration product is made, do... a step where a calibration is applied. You can step through the pipeline using these tasks; you will need to refer to the HIFI Pipeline Specification Document for the order in which the steps should be applied. Alternatively, you can use and modify the scripts that are supplied with the software from the Pipeline menu in HIPE, as described above. To learn about navigating and running scripts, see the HIPE Owner's Guide.

  • For information on the steps of each level of the pipeline (their names, the order to run them in, and what options you can change) see the HIFI Pipeline Specification Document.
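Putting these pieces together, stepping through part of the pipeline on the command line looks schematically like the sketch below. The level and spectrometer keys are placeholders (browse the Observation Context tree for the combination you need), and the exact call form, order, and parameters of each step must be taken from the HIFI Pipeline Specification Document.

    # Extract the HTP for the desired level and spectrometer
    # ("level0" and "WBS-H" are placeholders)
    htp = obs.refs["level0"].product.refs["WBS-H"].product

    # Apply individual pipeline steps in the order given in the
    # Pipeline Specification Document (call form illustrative only):
    htp = doWbsDark(htp=htp)

    # Inspect the result after each step before continuing
    print htp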