1.16. Exchanging data with FITS files

1.16.1. Saving a product to a FITS file

You can save any kind of Herschel data to FITS files, as long as it is of type Product or a dataset such as a TableDataset. All the raw and reduced data coming from the Herschel Science Archive are either products or datasets. Note that you cannot save arrays such as Double1d (for example, single columns extracted from a table dataset). In that case, see the end of this section for how to wrap arrays into datasets.

To save a product or dataset as FITS file, follow these steps:

  1. Select the product or dataset in the Variables view.

  2. Right click on the variable name and choose Send to → FITS file.

    The simpleFitsWriter task dialogue window opens in the Editor view, as shown in Figure 1.29.

  3. Write the name of the FITS file in the file field. Alternatively, click the folder icon to browse to a different directory.

  4. Optionally, tick the Ask before overwriting checkbox to be warned if you are about to overwrite an existing file.

  5. Optionally, choose a compression method from the compression drop-down list. You can choose between ZIP and GZIP.

  6. Press Accept to save the product or dataset to file.


Figure 1.29. FITS save task dialogue window.


[Note] Note
  • You are responsible for adding the .fits extension to the file name, plus any additional extension, such as .gz, if you choose a compression method. If you fail to do so, other applications such as ds9 may not handle the file correctly.

  • Unless you choose a different directory, FITS files are saved in the directory HIPE was started from. To locate this directory, issue these commands in the Console view:

    import os
    print os.getcwd()

    Example 1.28. How to get the current working directory for the Jython interpreter.


From the command line

You can write a product or dataset to a FITS file with the simpleFitsWriter task. Follow the link to access the corresponding entry in the User's Reference Manual.

myProduct = Product()  # Empty data product
simpleFitsWriter(myProduct, "myProduct.fits")

Example 1.29. Creating a new empty data product and writing it to disk as a FITS file.


Files are saved in the directory from which you started HIPE, unless you provide a different path with the file name.

The following commands create an image and save it as a multi-extension FITS file:

# Saving an image to disk
myImage = SimpleImage(description="An image", image=Double2d(50,100), \
                      error=Double2d(50,100), exposure=Double2d(50,100))
simpleFitsWriter(myImage, "myImage.fits")
# Reading back the file created
path = "myImage.fits"
myImage = importImage(filename = path)

Example 1.30. Creating a SimpleImage, saving it to disk as a FITS file, and reading it back afterwards.


[Warning] Warning

The above code generates a FITS file with the value 50 assigned to the NAXIS2 keyword and 100 assigned to NAXIS1. In other words, the image size is 50 pixels along the y axis and 100 pixels along the x axis. The coordinate values are displayed in this order (y, x) in the Image Viewer. For an explanation of why the y size is specified before the x size, see the Scripting Guide : Section 2.2.5, “Ordering of array elements”.
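The axis-ordering rule can be illustrated with a plain Python sketch (independent of HIPE, for illustration only): for a row-major two-dimensional array of shape (y, x), the fastest-varying axis becomes NAXIS1 and the slowest becomes NAXIS2.

```python
# Illustrative sketch: mapping a row-major 2-D array shape to FITS NAXIS
# keywords. The array shape is (rows, columns) = (y, x), while FITS counts
# axes starting from the fastest-varying one, so NAXIS1 = x size and
# NAXIS2 = y size.
def naxis_keywords(shape):
    """Return FITS NAXISn keywords for a row-major array shape (..., y, x)."""
    keywords = {"NAXIS": len(shape)}
    # Reverse the shape: the last (fastest-varying) Python axis is NAXIS1.
    for i, size in enumerate(reversed(shape), start=1):
        keywords["NAXIS%d" % i] = size
    return keywords

# A 50 (y) x 100 (x) image, like Double2d(50,100) in the example above:
print(naxis_keywords((50, 100)))  # NAXIS1=100, NAXIS2=50
```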

If you get a SignatureException error when trying to save a variable to FITS, it probably means that your variable is not a product or dataset, but a simple array, such as a Double1d. To save it to a FITS file you have to manually wrap a table dataset around it:

myArray = Double1d(10, 10.0)  # Array, cannot be saved to FITS
myTable = TableDataset()
myTable["myArray"] = Column(myArray)  # Putting array into dataset
simpleFitsWriter(myTable, "myTable.fits")

Example 1.31. Using a dataset as a wrapper to store an array in a FITS file.


1.16.2. Retrieving a Herschel product from a FITS file

To load a Herschel product stored in a FITS file (or any other standard FITS file), do either of the following:

  • Double click on the FITS file in the Navigator view.

  • Choose File → Open File, select the FITS file and click Open.

The tasks used by HIPE to load FITS files are fitsReader and simpleFitsReader. The fitsReader task (see Figure 1.30) tries to guess the file contents by looking at the XTENSION keyword, and puts the contents in a variable of the appropriate type.

[Note] Note

This procedure is also valid for high-level reduced data from ISO, XMM-Newton, ALMA and SOFIA. For files that are not correctly imported this way, please see Section 1.16.6.

If fitsReader does not recognise the file contents, it defaults to the simpleFitsReader task. This task is optimised to read data from FITS files as packaged by HIPE. If the file is not a HIPE FITS product, the contents are put in unformatted arrays. You can choose how to read the file or let the software choose.
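The kind of check fitsReader performs on the XTENSION keyword can be sketched in plain Python (a hedged, illustrative sketch, not HIPE's actual implementation). Each FITS header card is an 80-character record: an 8-character keyword, the "= " separator, then the value.

```python
# Hedged sketch (not HIPE code): reading the content type of a FITS
# extension from the XTENSION keyword of its first header card.
def guess_extension_type(card):
    """Return the XTENSION value ('IMAGE', 'BINTABLE', ...) or None."""
    if not card.startswith("XTENSION"):
        return None
    # The value is a quoted, blank-padded string after the "= " separator;
    # anything after a "/" is a comment.
    value = card[10:].split("/")[0].strip()
    return value.strip("'").strip()

card = "XTENSION= 'BINTABLE'           / binary table extension"
print(guess_extension_type(card))
```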

To run fitsReader or simpleFitsReader from HIPE, go to the Tasks view, select the All tasks folder and scroll down to fitsReader or simpleFitsReader. Double click on the task name to open its dialogue window. Insert the input file name and click the Accept button to run the task and read in the FITS file.


Figure 1.30. FITS read task dialogue window.


From the command line

You can read data from a FITS file into HIPE with the fitsReader task. Follow the link to access the corresponding entry in the User's Reference Manual.

myProduct = fitsReader("myProduct.fits")  # Load a product from FITS

Example 1.32. Load a product from a FITS file.


1.16.3. Translation of Herschel metadata to FITS keywords

Long, mixed-case parameter names, defined in the metadata of your product, are converted to a FITS compliant notation. This notation dictates that parameter names must be uppercase, with a maximum length of eight characters.
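The basic constraint can be sketched in plain Python (an illustrative sketch only, not HIPE's actual algorithm, which uses the lookup dictionaries described below and the HIERARCH convention for unmatched names):

```python
# Hedged sketch: forcing a mixed-case metadata name into the basic FITS
# keyword rules of at most eight characters, uppercase.
def to_fits_keyword(name):
    keyword = name.upper()
    if len(keyword) <= 8:
        return keyword
    # Names longer than eight characters need either a dictionary lookup
    # or the HIERARCH convention; here we simply truncate for illustration.
    return keyword[:8]

print(to_fits_keyword("softwareTaskName"))  # SOFTWARE
```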

HIPE uses the following lookup dictionaries to convert well-known FITS parameter names into a convenient and human-readable name:

  • Common keywords widely used within the astronomical community, which are taken from HEASARC.

  • Standard FITS keywords.

  • HCSS keywords containing keywords that are not defined in the above dictionaries.

  • Instrument-specific dictionaries:

    • The HIFI-specific dictionary is included in the HCSS dictionary and maintained as a part of it. See above.

    • The PACS-specific dictionary is located here.

For example, the following metadata is transformed into a known FITS keyword:

product.meta["softwareTaskName"]=StringParameter("FooBar")

Example 1.33. Setting a metadata property to a StringParameter value.


The result in the FITS product header is the following:

HIERARCH key.PROGRAM='softwareTaskName'
PROGRAM = 'FooBar '

A full demonstration is available in the following example. The script creates a product with several nested datasets, stores it into a FITS file, and then retrieves it again.

# First we will get some unit definitions for our example
from herschel.share.unit import *
from java.lang.Math import PI

# Construction of a product (only for demonstration purposes)
points = 50
x = Double1d.range(points)
x *= 2*PI/points
eV = Energy.ELECTRON_VOLTS
# Create an array dataset that will eventually be exported
s = ArrayDataset(data = x, description = "range of real values", \
unit = eV)
degK = Temperature.KELVIN
# Provide some metadata for it (header information)
s.meta["temperature"] = LongParameter(long=293,\
description="room temperature", unit = degK)

# We can store the array in a FITS file 
# after making it a Product
p = Product(description="FITS demonstration",creator="You")
# Add some meta data
p.meta["sampleKeyword"]=StringParameter("First FITS file")
p.meta["observationInstrumentMode"]=StringParameter("UnitTest")
# Add the array of data to the product
p["myArray"] = s
# Store in FITS file
fits = FitsArchive()
fits.save("sdemo.fits", p)

# And restore it 
scopy = fits.load("sdemo.fits")

# Create a TableDataset for export
t = TableDataset(description = "This is a table")
t["x"] = Column(x)
t["sin"] = Column(data=SIN(x),description="sin(x)")

# And a composite dataset with an array and a table in it
c = CompositeDataset(description="Composite with three datasets!")
c.meta["exposeTime"] = DoubleParameter(double=10,description="duration")
c["childArray"] = s
c["childTable"] = t
c["childNest"] = CompositeDataset("Empty child, just to prove nesting")

# And finally, a product that has the composite dataset,
# TableDataset and array dataset.
p = Product(description="FITS demonstration",creator="demo.py")
p.creator = "You?"
p.modelName = "demonstration"
p.meta["sampleKeyword"] = \
StringParameter("Example keyword not in FITS dictionaries")
p.meta["observationInstrumentMode"] = StringParameter("UnitTest")
p["myArray"] = s
p["myTable"] = t
p["myNest"] = c

# Save our product ...
fits.save("demo.fits",p)
# ... load it back into a new variable, n,...
n = fits.load("demo.fits")
# ... and show it!
print n
print n["myArray"]
print n["myNest"]
print n["myNest"]["childNest"]

# We can also get information on the metadata/keywords
print n.meta
# And look at a specific piece of metadata
print n.meta["startDate"]

Example 1.34. Create a FITS file from demonstration data and read it back.


1.16.4. Structure of Herschel products when saved as FITS

This section describes the structure of FITS files created from typical Herschel product types appearing in Level 2 data.

All FITS files described here, when produced from Herschel observation products, also have a History extension with three child extensions: HistoryScript, HistoryTasks and HistoryParameters. These are explained separately in Section 1.16.4.9.

How to export data products from HIPE to other astronomical software is described in Section 1.16.7. If you have successfully exported Herschel data to other software, you are encouraged to contribute information to this page. Click the icon in the toolbar of the HIPE Help System to get in touch with us.

1.16.4.1. General information

World Coordinate System.  WCS information is held in the main header of the FITS file and in the image extension, for those products that have an image dataset.

Measurement units.  Information on measurement units is held in the header of each FITS extension. Look for the QTTY____ and BUNIT keywords, unless stated otherwise in the following sections.

1.16.4.2. SimpleImage

A FITS file from a SimpleImage shows at least three image extensions called image, error and coverage. These have the same size and contain the flux, error and coverage information of the original image, respectively. Usually there is also a History extension and more could be created by the pipelines. To check which extensions are present in the product, you can use print mySimpleImage and check the contents of the datasets attribute.

If a WCS is present in the original image, the WCS keywords appear in the FITS file for each array-like extension, such as coverage and error.

For the most current information about the structure of products and their datasets, you can check the Product Definitions Document and, for PACS products, the PACS Products Explained document.


Figure 1.31.  Structure of a FITS file produced from a SimpleImage.


1.16.4.3. SpectralSimpleCube for PACS

A SpectralSimpleCube has two three-dimensional datasets, image and coverage, and one table dataset ImageIndex, with two columns relating each cube layer to its wavelength. The LayerCount column contains the layer index (starting from zero) and the DepthIndex column contains the corresponding wavelength.

These datasets are translated to two image and one binary extension in the FITS file, with the same names. The wavelength measurement unit is held in the header of the ImageIndex extension, under the TUNIT1 keyword.

If a WCS is present in the original image, this is kept in the FITS file.

[Tip] Tip

A PACS projected cube is a SpectralSimpleCube.


Figure 1.32.  Structure of a FITS file produced from a SpectralSimpleCube from a PACS observation. The two columns of the ImageIndex binary table extension are shown.


1.16.4.4. SpectralSimpleCube for SPIRE

The structure of this product, and corresponding FITS files, for SPIRE observations is mostly the same as for PACS observations, as described in Section 1.16.4.3. The only difference is the addition of two more three-dimensional datasets, error and flag, converted to two image extensions in the FITS file.


Figure 1.33.  Structure of a FITS file produced from a SpectralSimpleCube from a SPIRE observation. The two columns of the ImageIndex binary table extension are shown.


1.16.4.5. SpectralSimpleCube for HIFI

You can find SpectralSimpleCube objects in Level 2.5 HIFI data. These cubes are made of three three-dimensional datasets, called image, weight and flag. These are converted to three image extensions in the FITS file.

Unlike cubes from PACS and SPIRE observations, there is no ImageIndex dataset relating cube layers to their wavelength (frequency for HIFI). Instead, you can look at the image dataset metadata, where parameters crpix3, crval3, ctype3 and so on define the reference layer, unit and scale of the frequency axis.

These keywords are translated to the header of the image extension in the FITS file.
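Assuming the third axis is linear, the frequency of a given layer follows the standard WCS relation. A hedged sketch in plain Python (CDELT3 is assumed alongside the keywords named above; the numerical values are made up for illustration):

```python
# Hedged sketch: recovering the frequency of cube layer k from linear WCS
# keywords, where CRPIX3 is the reference pixel (1-based), CRVAL3 the
# frequency at the reference pixel, and CDELT3 the step per layer.
# CTYPE3 only names the axis.
def layer_frequency(k, crpix3, crval3, cdelt3):
    """Frequency of 1-based layer index k for a linear spectral axis."""
    return crval3 + (k - crpix3) * cdelt3

# Assumed example values: reference layer 1, 0.5 MHz step.
print(layer_frequency(3, 1.0, 556.9e9, 0.5e6))
```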


Figure 1.34.  Structure of a FITS file produced from a SpectralSimpleCube from a HIFI observation.


1.16.4.6. PacsRebinnedCube

A PacsRebinnedCube derives from a SpectralSimpleCube and adds more components to it. The exported FITS file is correspondingly more complicated.

Cube data is held in six three-dimensional image extensions, called image, ra, dec, stddev, exposure and flag. The ra and dec extensions hold the coordinates of each pixel in degrees (the measurement unit is shown in the extension header).

The ImageIndex extension relates each cube slice to its wavelength, in the same way as with a SpectralSimpleCube. The waveGrid extension contains the same wavelength information as the ImageIndex extension, but without the LayerCount column.

The contents of the qualityControl extension can be ignored.


Figure 1.35.  Structure of a FITS file produced from a PacsRebinnedCube. The two columns of the ImageIndex binary table extension are shown.


1.16.4.7. HifiTimelineProduct

The HifiTimelineProduct is a product context (a container with references to other products), which means that it cannot be saved as a FITS file from HIPE.

Inside a HifiTimelineProduct there are a summary table and one or more DatasetWrapper products (one per building block) containing a number of SpectrumDataset objects.

The summary table and each DatasetWrapper can be separately saved as FITS files, but note that these FITS files will not have the History extension.

The FITS file of a summary table has one binary extension called wrapped, which reproduces the original table.

The FITS file of a DatasetWrapper has one binary extension per spectrum. These are called 0001, 0002 and so on. These extensions contain the actual spectra. Each extension is a table with one row and as many columns as the parameters describing the spectrum. Each table cell may contain a single value (like longitude and obs time) or an array of values (like flux and lsbfrequency).


Figure 1.36.  Structure of a FITS file produced from a HifiTimelineProduct. This product cannot be saved directly as a FITS file, but the summary table and each DatasetWrapper can. The dashed gray lines show the contents of each FITS file.


1.16.4.8. SpectrometerPointSourceSpectrum

This SPIRE product has one extension called 0000, with two child extensions called SSWD4 and SLWC3. These correspond to the centre bolometers of the short and long wavelength spectrometer arrays, respectively. Each extension is a table with five columns: wave (wavelength), flux, error, mask (zero unless mask flags have been applied) and numScans. Each row is a data point of the spectrum.

The numScans column is not present for data processed with SPG versions prior to 9.1.0.


Figure 1.37.  Structure of a FITS file produced from a SpectrometerPointSourceSpectrum. The five table columns are shown for the SSWD4 extension. They are the same for the SLWC3 extension.


1.16.4.9. The History extension

The History extension is part of all the FITS files generated from HIPE products, including those described in the previous sections. It contains the following child extensions, all in binary table format:

  • HistoryScript. This table contains a Jython script with all the operations performed on the data that resulted in this data product. The table has a single column, and each row corresponds to a line of the script.

  • HistoryTasks. This table shows the names of all the tasks used in data processing, and the corresponding HIPE version and build number. The execution date and time are also shown in the ExecDate column. The format is FineTime, that is, the number of microseconds since 1st January 1958. To convert a value to a more convenient format, you can use a command like the following in the Console view of HIPE:

    print FineTime(1693341725238000)

    Example 1.35. Printing a FineTime formatted string to the console.


  • HistoryParameters. This table lists all the task parameters used during data processing, with their type, value and whether the value used was the default (column IsDefault). Note that, for parameters of type PRODUCT, the value is usually an expression like hash2798118624. This is a unique value identifying the particular data product that was used.

    With the TaskID column you can find the task a given parameter was used in, by comparing the value with those in the ID column of the HistoryTasks table.
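The FineTime epoch arithmetic can also be sketched in plain Python (a hedged sketch, not HIPE code; this naive conversion ignores leap seconds, so it can differ from UTC by tens of seconds, since FineTime is a TAI-based scale):

```python
from datetime import datetime, timedelta

# Hedged sketch: converting a FineTime value, microseconds since
# 1 January 1958, to a calendar date, ignoring leap seconds.
FINETIME_EPOCH = datetime(1958, 1, 1)

def finetime_to_datetime(microseconds):
    return FINETIME_EPOCH + timedelta(microseconds=microseconds)

print(finetime_to_datetime(1693341725238000))  # a date in 2011
```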

For more information about history in products, see the Scripting Guide: Section 2.8.7, “Product history”.


Figure 1.38.  Structure of the History extension of a FITS file created from a Herschel product. Column names for each of the three binary table extensions are shown.


1.16.5. Troubleshooting FITS import/export

For more information see the FITS IO general documentation.

Problems opening FITS files created by HIPE

If you export a FITS file from HIPE and modify it with an external program, HIPE may not be able to import it anymore. If this happens, follow these steps:

  1. Open the FITS file with a FITS editing program such as fv.

  2. Delete the HCSS____ keyword from the header of all extensions.

  3. Save the file.

HIPE should now be able to read the file.

FITS header character limit

A FITS header card is limited to 80 characters. StringParameters and FITS card descriptions longer than the allocated length are distributed over multiple lines. An & character at the end of a line means that the text continues on the next line. The keyword CONTINUE is used for the lines after the first one.
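The long-string convention can be sketched in plain Python (an illustrative sketch, not HIPE's writer; the keyword name and chunk size are arbitrary choices for the example):

```python
# Hedged sketch of the FITS long-string convention described above: a
# string value too long for one 80-character card is split into chunks,
# each ending in '&' except the last, with CONTINUE cards after the first.
def long_string_cards(keyword, value, chunk=60):
    chunks = [value[i:i + chunk] for i in range(0, len(value), chunk)]
    cards = []
    for i, part in enumerate(chunks):
        last = (i == len(chunks) - 1)
        text = "'%s%s'" % (part, "" if last else "&")
        if i == 0:
            # First card: 8-character keyword, then "= " and the value.
            cards.append(("%-8s= %s" % (keyword, text)).ljust(80))
        else:
            # Continuation cards use the CONTINUE keyword without "=".
            cards.append(("CONTINUE  %s" % text).ljust(80))
    return cards

for card in long_string_cards("COMMENT1", "x" * 130):
    print(card.rstrip())
```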

Opening multi-extension FITS files in DS9

Versions of DS9 earlier than 6 crash when FITS files with multiple extensions are opened as cubes. Version 6 or higher of DS9 does not crash on cubes, correctly opens only the relevant extensions, and also lets you open the different extensions in separate frames.

1.16.6. Importing a non-Herschel FITS file into HIPE

There are several options for importing data from a non-Herschel FITS file and making it fit into a Herschel product or dataset. These range from a relatively simple class for reading and writing FITS files (FitsArchive) to a HIPE task that tries to guess the most similar Herschel product and load the FITS file data into it (fitsReader). Some of these options have been described earlier in this section, but they are mentioned again here as part of a workflow that helps you select the most appropriate mechanism for importing an exotic FITS file.

  1. The FitsArchive class can only be used from scripts; you can find examples both in its URM entry and in Example 1.39 in this section (writing FITS only). The usual output of this class is simply a dataset that holds the data, encapsulated in a generic Product.

  2. The simpleFitsReader task has been explained before in Section 1.16.2. To use the task through the GUI, you can:

    • Double click on the FITS file in the Navigator view.

    • Choose File → Open File, select the FITS file and click Open.

    When using the GUI, if you select guess (see Figure 1.30) as the fitsType input parameter, the task that will be called internally is fitsReader instead. This option tries to guess the file contents by looking at the XTENSION keyword, and puts the contents in a variable of the appropriate type. If fitsReader does not recognise the file contents, it defaults to the simpleFitsReader task. This task is optimised to read data from FITS files as packaged by HIPE. If the file is not a HIPE FITS product, the usual output is a set of unformatted arrays. To run fitsReader or simpleFitsReader from HIPE, go to the Tasks view, select the All tasks folder and scroll down to fitsReader or simpleFitsReader. Double click on the task name to open its dialogue window. Insert the input file name and click the Accept button to run the task and read in the FITS file. Finally, if you want to script those tasks, either have a look at their URM entries (simpleFitsReader or fitsReader) or try this example copied from the URM:

    filepath = "path_to_file/filename"
    readertype = SimpleFitsReaderTask.ReaderType.STANDARD
    product=simpleFitsReader(file=filepath, reader=readertype)

    Example 1.36. Importing a non-Herschel FITS file with the simpleFitsReader task.


    Remember that the fitsReader task works in the same way but requires only one parameter, file, and therefore always guesses the file type.

  3. If you know the HIPE type that most closely matches the FITS file data, you can use any of the import * tasks. These are the current import tasks:

    The image importing tasks take a single parameter, a string containing the path to the file, and return the image as an output parameter. For example:

    filePath = "myNonHerschel.fits"
    myImage = importImage(filename = filePath)

    Example 1.37. Importing non-Herschel FITS files using specific image import tasks.


    The cube importing tasks have two parameters: an input/output parameter that takes a previously defined variable and sets its contents to the data in the FITS file, and a string containing the path to the file. For example:

    filePath = "myNonHerschel.fits"
    myCube = SpectralSimpleCube()
    myCube = importSpectralCube(spectralcube = myCube, filename = filePath)

    Example 1.38. Importing non-Herschel FITS files using specific spectral import tasks.


1.16.6.1. Using data from other missions and observatories

HIPE is able to load FITS files. Because FITS is an open format, many missions and observatories offer their products as FITS files. In particular, HIPE can read the FITS files produced by the Common Astronomy Software Applications (CASA) suite of tools. This software is used to process the raw data (unreadable by HIPE) obtained from the ALMA interferometer and to generate the final products (readable by HIPE in the form of standard FITS files).

1.16.7. Importing a Herschel FITS file into external applications

This section describes how to import FITS files of Herschel products into some popular data analysis applications.

1.16.7.1. IDL

Importing images. See the following code:

IDL> im = mrdfits('/path/image.fits',1)
% Compiled module: FXMOVE.
% Compiled module: MRD_HREAD.
% Compiled module: FXPAR.
% Compiled module: GETTOK.
% Compiled module: VALID_NUM.
% Compiled module: MRD_SKIP.
MRDFITS: Image array (2012,2009)  Type=Real*8
% Compiled module: SWAP_ENDIAN_INPLACE.
IDL> tv,im

Importing spectra. See the following code:

IDL> spec = mrdfits('/path/spectrum.fits',2)
% Compiled module: MATCH.
% Compiled module: MRD_STRUCT.
MRDFITS: Binary table.  4 columns by  2061 rows.
IDL> help,spec,/struc
** Structure <15e03af4>, 4 tags, length=28, data length=28, refs=1:
  WAVE            DOUBLE           31.200000
  FLUX            DOUBLE           8.2931329
  ERROR           DOUBLE           3.4131544
  MASK            LONG                 0
IDL> plot,spec.wave,spec.flux

Importing cubes. See the following code:

IDL> cube = mrdfits('/path/cube.fits',2)
MRDFITS: Image array (16,18,374)  Type=Real*8
IDL> help,cube
CUBE            DOUBLE    = Array[16, 18, 374]
IDL> plot,cube[8,8,*]

In the case of PACS projected cubes, the structure of the FITS file is that described in Section 1.16.4.3.

IDL> FITS_HELP,'path/cubeName.fits'
XTENSION  EXTNAME         EXTVER EXTLEVEL BITPIX GCOUNT  PCOUNT NAXIS  NAXIS*
0                                            32      0      0     0  
1 IMAGE    image                            -64      1      0     3  39 x 39 x 29
2 IMAGE    coverage                         -64      1      0     3  39 x 39 x 29
3 BINTABLE ImageIndex                         8      1      0     2  12 x 29
4 IMAGE    History                           32      1      0     0  
5 BINTABLE HistoryScript                      8      1      0     2  80 x 7
6 BINTABLE HistoryTasks                       8      1      0     2  35 x 1
7 BINTABLE HistoryParameters                  8      1      0     2  103 x 12
IDL> image = mrdfits('path/cubeName', 'image', hd) ; the header contains the image's WCS
IDL> imageIndex = mrdfits('path/cubeName','ImageIndex')
IDL> wave = imageIndex.depthindex  ; cube's wavescale

1.16.7.2. CLASS

You can read the FITS files produced with the hiClass task in HIPE on HIFI data with the following commands:

file out MyHIFISpectra.hifi mul
fits read MyHIFISpectra.fits
#
# Now you have a CLASS file named MyHIFISpectra.hifi (you can use whatever
# you want as an extension) you can access like you always do in CLASS:
#
file in MyHIFISpectra.hifi
find
get first
set unit f i
device image white
plot

For PACS data or any Spectrum1d product, run this script in HIPE:

'''
Spectrum1d to CLASS FITS conversion
Written by C. Borys April 15, 2010
cborys@ipac.caltech.edu
Inspired greatly by HICLASS, written originally by Bertrand Delforge 
and now maintained by Damien Rabois.
The core code was taken directly from that package.

NOTE: this code is specific for HIFI, and even then may lack
some of the keywords CLASS looks for.  The script is relatively
easy to tweak however.

Out of the box, this should work on the spectrum1d that
is output from HIFI's deconvolution task.  Indeed that was
the driver for this task in the first place.

'''

from herschel.ia.io.fits.dictionary import AbstractFitsDictionary
from herschel.share.fltdyn.time import FineTime
from java.util import Date
from herschel.share.unit import Frequency

# Define keyword dictionary
# The following class is stolen directly from HICLASS
class MyFitsDictionary(AbstractFitsDictionary):
    """Dictionary to use with FitsArchive to get proper keywords.

    Because HCSS can use metadata parameters with fancy names and FITS
    is stuck with keywords of 8 uppercase ASCII characters, a
    dictionary is needed to convert the metadata parameter names into
    FITS keywords.

    The present class defines a dictionary for which the HCSS name and
    the FITS name of a parameter are identical. This allows you to
    populate an HCSS dataset using the keywords you will want to see
    appear in the FITS file. And they will be used.

    When instantiating this dictionary, feed to the constructor a
    product created by HiClass. It will be scanned and all the
    metadata parameters found in its datasets will be added to the
    dictionary.

    Say you want to export a product p:
    >>> dico = MyFitsDictionary(p)
    >>> archive = FitsArchive()
    >>> archive.rules.append(dico)
    >>> archive.save(sFileName, p)

    """
    def __init__(self, p):
        """p: HiClass product."""
        AbstractFitsDictionary.__init__(self)
        self._addKeysForProduct(p)

    def _addKeysForProduct(self, prod):
        map(self._addKeysForDataset, map(prod.get, prod.keySet()))

    def _addKeysForDataset(self, ds):
        for meta_name in ds.meta.keySet():
            self.set(meta_name, meta_name)

# This routine checks for a meta data parameter and if it doesn't exist, 
# sets a default.
def checkForMeta(spectrum, metaName, metaDefault):
    if spectrum.meta.containsKey(metaName):
        mdata = spectrum.meta[metaName]
    else:
        class_name = metaDefault.__class__.__name__
        print metaName, "tag not found in spectrum. Setting to default"
        if class_name.endswith('Float'):
            mdata = DoubleParameter(Double(metaDefault))
        elif class_name.endswith('Double'):
            mdata = DoubleParameter(metaDefault)
        elif class_name.endswith('String'):
            mdata = StringParameter(metaDefault)
        elif class_name.endswith('Long'):
            mdata = LongParameter(metaDefault)
    return mdata

# the main routine:
def spectrum1dToClass(spectrum, fitsfn):
    # ensure that the spectrum is a 1d.
    class_name = spectrum.__class__.__name__
    if class_name.endswith('Spectrum1d'):
        print "Converting input spectrum"
    else:
        print "Input is not a Spectrum1d, exiting."
        return -1

    p = Product(description = 'Herschel HIFI', \
                instrument = 'HIFI', \
                creator = 'spectrum1dToClass')
    p.type = 'Class formatted fits file'

    sFlux = spectrum.getFlux()
    sWave = spectrum.getWave()*1e6  # converts to Hz, assumes data is in MHz
    n_channels = sFlux.length()

    # compute frequency parameters
    # this computes a scale, but needs a lot of error checking
    # -assumes data has no NaNs and is ordered, etc.
    # works fine for decon output but may crash on other types of 1d.
    sIndex = Double1d.range(len(sWave))
    fitter = Fitter(sIndex, PolynomialModel(1))  # Degree 1: y = ax+b.
    result = fitter.fit(sWave)
    freqSpacing = result[1]
    freqStart = result[0]
    # Irregularity. Not used for now.
    diff = (sIndex * freqSpacing + freqStart) - sWave
    irregularity = STDDEV(diff)

    blankingvalue = -1000

    # here is where we set all the metadata CLASS FITS needs.
    meta = MetaData()
    #
    # Axis dimensions.
    # -----------------
    meta['MAXIS' ] = LongParameter(4, "Number of axes")
    meta['MAXIS1'] = LongParameter(n_channels, "Max nb of channels in spectrum")
    meta['MAXIS2'] = LongParameter(1, "Position coordinate 1 scale")
    meta['MAXIS3'] = LongParameter(1, "Position coordinate 2 scale")
    meta['MAXIS4'] = LongParameter(1, "Stokes parameters")
    #
    # Axis 1: Frequency.
    # -------------------
    # CLASS understands FREQ, FREQUENCY, LAMBDA, WAVELENGTH.
    meta['CTYPE1'] = StringParameter("FREQ", "Frequency scale parameters")
    meta['CRVAL1'] = DoubleParameter(freqStart, \
                     "Frequency offset @ reference channel")
    meta['CDELT1'] = DoubleParameter(freqSpacing, \
                     "Freq step, fres, channel width.")
    meta['CRPIX1'] = LongParameter(0, "Number of the reference channel")
    #
    # Axis 2: Right ascension.
    # -------------------------
    # CLASS understands RA--, RA ; DEC-, DEC ; GLON ; GLAT ; TIME, UT.
    # For the projection system, CLASS understands
    #     : P_NONE      = 0 ! Unprojected data
    # -TAN: P_GNOMONIC  = 1 ! Radial Tangent plane
    # -SIN: P_ORTHO     = 2 ! Dixon Tangent plane
    # -ARC: P_AZIMUTHAL = 3 ! Schmidt Tangent plane
    # -STG: P_STEREO    = 4 ! Stereographic
    # lamb: P_LAMBERT   = 5 ! Lambert equal area
    # -ATF: P_AITOFF    = 6 ! Aitoff equal area
    # -GLS: P_RADIO     = 7 ! Classic Single dish radio mapping
    # Read "Representations of celestial coordinates in FITS",
    # Mark R. Calabretta and Eric W. Greisen (19 Jul 2002),
    # arXiv:astro-ph/0207413v1, http://arxiv.org/abs/astro-ph/0207413
    #
    # Be careful:
    # RA becomes  RA---GLS with three hyphens
    # Dec becomes DEC--GLS with two hyphens.
    # That's why we can also write 'RA--' and 'DEC-'
    # for 'RA' and 'DEC': it's just easier to add the
    # projection code after that.
    #
    proj = '-GLS'
    #
    meta['CTYPE2'] = StringParameter('RA--' + proj)
    meta['CRVAL2'] = checkForMeta(spectrum, "raNominal", 0.0)
    meta['CDELT2'] = DoubleParameter(0.0)
    meta['CRPIX2'] = DoubleParameter(0.0)
    #
    # Axis 3: Declination.
    # ---------------------
    meta['CTYPE3'] = StringParameter('DEC-' + proj)
    meta['CRVAL3'] = checkForMeta(spectrum, "decNominal", 0.0)
    meta['CDELT3'] = DoubleParameter(0.0)
    meta['CRPIX3'] = DoubleParameter(0.0)
    #
    # Axis 4: Stokes.
    # -----------------
    meta['CTYPE4'] = StringParameter('STOKES')
    meta['CRVAL4'] = DoubleParameter(1.0)
    meta['CDELT4'] = DoubleParameter(0.0)
    meta['CRPIX4'] = DoubleParameter(0.0)
    #
    # Misc. information.
    # -------------------
    meta['EQUINOX']  = checkForMeta(spectrum, "equinox", 0.0)
    meta['BLANK']    = LongParameter(blankingvalue, "Marker of invalid channels")
    meta['DATE-RED'] = DateParameter(FineTime(Date()), "Creation date of this file")
    meta['PVEL-LSR'] = DoubleParameter(0.0, "source velocity")
    meta['PVELTYPE'] = StringParameter('radio', 'source velocity type')
    meta['TELESCOP'] = StringParameter('Herschel-HIFI-WBS', 'source of data')
    meta['SCAN']     = LongParameter(1)
    meta['SUBSCAN']  = LongParameter(1)
    meta['OBJECT']   = checkForMeta(spectrum, "object", 'Unknown object')
    meta['MOLECULE'] = StringParameter('Unknown molecule', 'Molecule name')
    meta['LINE']     = StringParameter('Unknown line', 'Line name')
    meta['EXPOSURE'] = checkForMeta(spectrum, "exposure", 0.0)
    meta['TSYS']     = checkForMeta(spectrum, "Tsys", 0.0)
    meta['RESTFREQ'] = DoubleParameter(freqStart, '')
    meta['IMAGFREQ'] = DoubleParameter(freqStart, '')
    meta['BEAMEFF']  = checkForMeta(spectrum, "beff", 1.0)
    meta['PRESSURE'] = DoubleParameter(0.0, 'Atmospheric pressure')
    meta['TOUTSIDE'] = DoubleParameter(0.0, 'Atmospheric temperature')

    # convert the 1d flux into a 2d array for CLASS.
    sArray = Double2d(1, sFlux.length())
    sArray[0,:] = sFlux
    # format the data into a table dataset, and tag it with our metadata
    sData = TableDataset()
    sData["DATA"] = Column(data=sArray, description="The spectrum", unit=Frequency.HERTZ)
    sData.meta = meta
    # insert the data into our product, and convert metadata keywords
    # into FITS compliant text.
    p["data"] = sData
    keyDictionary = MyFitsDictionary(p)
    for meta_name in meta.keySet():
        keyDictionary.set(meta_name, meta_name)

    # save the output.
    fits = FitsArchive()
    fits.rules.append(keyDictionary)
    fits.save(fitsfn, p)

# example usage:
# spectrum1dToClass(mySpectrum1d,'myClassOutput.fits')

Example 1.39. Complete example to convert a Spectrum1d class to a CLASS FITS file.


1.16.7.3. SAOImage DS9

Choose File → Open to open FITS files of Herschel images and cubes.

Note also that you can exchange data between HIPE and SAOImage DS9 via the Virtual Observatory SAMP protocol. See Section 1.17 for more information.