###########################################################################
#
#  Script based on "SPIRE Large Map Mode User Reprocessing Script"
#  The original script is available under the menu
#  Pipelines -> SPIRE -> Photometer Large Map user pipeline
#
#  This script loads an observation from the archive, runs the default
#  Large Map user pipeline, with the option of detecting the cooler burp
#  and, if found, correcting for it.
#  Finally, the re-processed observation is saved into the HIPE database ("pool")
#
#  Luca Conversi - 25/06/2013
#
###########################################################################

# First, load an observation affected by the "cooler burp" from the archive
obs = getObservation(1342191175, useHsa=1)

# Additional Options
# (A) includeTurnaround: Include the scan line turnarounds in the processing and mapmaking
# (B) applyExtendedEmissionGains: Apply the relative gains of each bolometer for extended emission
# (C) useBaselineSubtraction: Subtract a baseline from each scan to avoid stripes
# (D) useDestriper: Determine and remove baselines to achieve an optimum fit between all timelines
#     If the map maker is 'naive', at least one of options (C) or (D) must be True.
# (E) coolerBurpDetection: Search for a cooler burp and correct it by
#     recalculating the TempDrift calibration table
includeTurnaround = False
applyExtendedEmissionGains = False
useBaselineSubtraction = True
useDestriper = False
coolerBurpDetection = True

###########################################################################
#*************************************************************************
## Re-define variables to be in line with the original script code
myObsid = obs.obsid
print
print "Processing observation %i (0x%X)"%(myObsid, myObsid)

#*************************************************************************
# Calibration Context and Calibration Files
# Read the latest calibration tree relevant to HCSS v11 from the local disc:
cal = spireCal(pool="spire_cal_11_0")
# IF THE ABOVE LINE FAILS WITH AN ERROR, run the following command ONCE only
# to load and save the calibration tree from the Archive (may take some time):
# (for more details, see the "Calibration" chapter in the SPIRE Data Reduction Guide)
#cal = spireCal(calTree="spire_cal_11_0", saveTree=True)

# Attach the updated calibration tree to the observation context
obs.calibration.update(cal)

# Extract the necessary Calibration Products from the Observation Context
bsmPos            = obs.calibration.phot.bsmPos
lpfPar            = obs.calibration.phot.lpfPar
detAngOff         = obs.calibration.phot.detAngOff
elecCross         = obs.calibration.phot.elecCross
chanTimeConst     = obs.calibration.phot.chanTimeConst
chanNum           = obs.calibration.phot.chanNum
fluxConvList      = obs.calibration.phot.fluxConvList
tempDriftCorrList = obs.calibration.phot.tempDriftCorrList
chanRelGains      = obs.calibration.phot.chanRelGain
chanNoiseList     = obs.calibration.phot.chanNoiseList
chanNoise = chanNoiseList.getProduct(obs.level1.getProduct(0).meta["biasMode"].value,\
    obs.level1.getProduct(0).startDate)

# Note: to read in a single calibration FITS file from some location use, e.g.,
# bsmPos = simpleFitsReader("/enter/path/here/"+"YourCalibrationFilename.fits")

# Extract the necessary Auxiliary Products from the Observation Context
hpp      = obs.auxiliary.pointing
siam     = obs.auxiliary.siam
timeCorr = obs.auxiliary.timeCorrelation

#*************************************************************************
#*************************************************************************
## Report how many scan lines there are to process
count = 1
bbids = obs.level0_5.getBbids(0xa103)
nlines = len(bbids)
print "Total number of scan lines: ", nlines
print

# Create a Level 1 context to collect the Level 1 products
level1 = Level1Context(myObsid)

# Add any SSO meta data for moving objects
if obs.meta.containsKey('naifId'):
    level1.meta['naifId'] = obs.meta["naifId"].copy()
level1.meta['aot'] = obs.meta["aot"].copy()
#
###########################################################################
###          Pipeline Level 0.5 to Level 1                              ###
###          Process all Building Blocks for this observation           ###
###########################################################################
# Loop over scan lines
for bbid in bbids:
    print "Starting BBID=0x%x: scan %i / %i"%(bbid, count, nlines)
    # Get the basic Level 0.5 data products (detector data and housekeeping data)
    pdt  = obs.level0_5.get(bbid).pdt
    nhkt = obs.level0_5.get(bbid).nhkt
    # Record the calibration tree version used by the pipeline
    pdt.calVersion = obs.calibration.version
    #
    # -----------------------------------------------------------
    # (1) Join all scan legs and turnarounds together
    bbCount   = bbid & 0xFFFF
    pdtLead   = None
    nhktLead  = None
    pdtTrail  = None
    nhktTrail = None
    if bbCount > 1:
        blockLead = obs.level0_5.get(0xaf000000L + bbCount - 1)
        pdtLead   = blockLead.pdt
        nhktLead  = blockLead.nhkt
        if pdtLead != None and pdtLead.sampleTime[-1] < pdt.sampleTime[0] - 3.0:
            pdtLead  = None
            nhktLead = None
    if bbid < MAX(Long1d(bbids)):
        blockTrail = obs.level0_5.get(0xaf000000L + bbCount)
        pdtTrail   = blockTrail.pdt
        nhktTrail  = blockTrail.nhkt
        if pdtTrail != None and pdtTrail.sampleTime[0] > pdt.sampleTime[-1] + 3.0:
            pdtTrail  = None
            nhktTrail = None
    pdt = joinPhotDetTimelines(pdt, pdtLead, pdtTrail)
    nhkt = joinNhkTimelines(nhkt, nhktLead, nhktTrail)
    #
    # -----------------------------------------------------------
    # (2) Convert the BSM timeline to angles on sky (constant for a scan map)
    bat = calcBsmAngles(nhkt, bsmPos=bsmPos)
    #
    # -----------------------------------------------------------
    # (3) Create the SPIRE Pointing Product for this observation
    spp = createSpirePointing(detAngOff=detAngOff, bat=bat, hpp=hpp, siam=siam)
    #
    # -----------------------------------------------------------
    # (4) Run the Electrical Crosstalk Correction on the timeline data
    pdt = elecCrossCorrection(pdt, elecCross=elecCross)
    #
    # -----------------------------------------------------------
    # (5) Detect jumps in the thermistor timelines that occasionally occur,
    # leading to map artefacts introduced in the temperature drift correction.
    # Also requires the Temperature Drift Correction Calibration File.
    tempDriftCorr = tempDriftCorrList.getProduct(pdt.meta["biasMode"].value, pdt.startDate)
    if pdt.meta["biasMode"].value == "nominal":
        pdt = signalJumpDetector(pdt, tempDriftCorr=tempDriftCorr, kappa=2.0, gamma=6.0,\
            gapWidth=1.0, windowWidth=40.0, filterType="DISCRETEDERIVATIVE", glitchinfo="NULL")
    #
    # -----------------------------------------------------------
    # (6) Run the concurrent deglitcher on the timeline data
    pdt = concurrentGlitchDeglitcher(pdt, chanNum=chanNum, kappa=2.0, size=15, correctGlitches=True)
    #
    # -----------------------------------------------------------
    # (7) Run the wavelet deglitcher on the timeline data
    pdt = waveletDeglitcher(pdt, scaleMin=1.0, scaleMax=8.0, scaleInterval=7, holderMin=-3.0,\
        holderMax=-0.3, correlationThreshold=0.3, optionReconstruction='linearAdaptive20',\
        reconPointsBefore=1, reconPointsAfter=3)
    #
    # Alternatively, run the sigma-kappa deglitcher:
    # the following task can be uncommented to try it, in which case the
    # wavelet deglitcher above should be commented out.
    # This module is still a prototype and should be used with caution.
    # pdt = sigmaKappaDeglitcher(pdt, \
    #     filterType="BOXCAR", boxFilterWidth=3, \
    #     boxFilterCascade=1, kappa=4.0, \
    #     detectGlitches=True, \
    #     largeGlitchMode='ADDITIVE', \
    #     largeGlDetectTc=4, \
    #     largeGlFlagTc=6, \
    #     detectLargeGl=True, \
    #     correctionMode='DIRECT', gamma=1.0, \
    #     randomSeed=1984574303L, \
    #     repairGlitches=True, \
    #     iterationNumber=1)
    #
    # -----------------------------------------------------------
    # (8) Apply the Low Pass Filter response correction
    pdt = lpfResponseCorrection(pdt, lpfPar=lpfPar)
    #
    # -----------------------------------------------------------
    # (9) Apply the flux conversion
    fluxConv = fluxConvList.getProduct(pdt.meta["biasMode"].value, pdt.startDate)
    pdt = photFluxConversion(pdt, fluxConv=fluxConv)
    #
    # -----------------------------------------------------------
    # (10) Check for a cooler burp
    coolerBurpFound = False
    if coolerBurpDetection:
        threshold = 3.e-7
        tmpTime = nhkt.getSampleTime()
        tmpSubk = nhkt.getSignalData("SUBKTEMP")
        if (tmpSubk[-1] - tmpSubk[0]) / (tmpTime[-1] - tmpTime[0]) > threshold:
            coolerBurpFound = True
            print "Cooler burp detected!"
    #
    # -----------------------------------------------------------
    # (11) Make the temperature drift correction
    pdt = temperatureDriftCorrection(pdt, tempDriftCorr=tempDriftCorr, coolerBurp=coolerBurpFound)
    #
    # -----------------------------------------------------------
    # (12) Apply the bolometer time response correction
    pdt = bolometerResponseCorrection(pdt, chanTimeConst=chanTimeConst)
    #
    # -----------------------------------------------------------
    # (13) Cut the timeline back into individual scan lines
    pdt = cutPhotDetTimelines(pdt, extend=includeTurnaround)
    #
    # -----------------------------------------------------------
    # (14) Add pointing timelines to the data
    psp = associateSkyPosition(pdt, spp=spp)
    #
    # -----------------------------------------------------------
    # (15) Apply the time correlation
    psp = timeCorrelation(psp, timeCorr)
    # -----------------------------------------------------------
    # Add the Photometer Scan Product to the Level 1 context
    level1.addProduct(psp)
    #
    print "Completed BBID=0x%x (scan %i/%i)"%(bbid, count, nlines)
    # Update the progress counter
    count = count + 1

print
print "Finished the Level 0.5 to Level 1 processing for OBSID= %i, (0x%x)"%(myObsid, myObsid)

# Update the Level 1 Context in the Observation Context
obs.level1 = level1
#
### Finished the Level 0.5 to Level 1 processing ###
###########################################################################

###########################################################################
###   Optionally apply Relative Bolometer Gains for extended emission   ###
###########################################################################
if applyExtendedEmissionGains:
    print
    print "Applying relative bolometer gains for better extended-emission maps"
    for i in range(level1.getCount()):
        level1.getRefs().set(i, ProductRef(applyRelativeGains(level1.getProduct(i), gains=chanRelGains)))
    print "Finished applying relative gains"
    print
### Finished applying relative gains ###
###########################################################################

###########################################################################
###                       Baseline Subtraction                          ###
###########################################################################
if useBaselineSubtraction:
    print
    print "Begin the Baseline Subtraction for OBSID= %i, (0x%x)"%(myObsid, myObsid)
    #
    # Using the Level 1 context, run baseline removal as an input to the map making
    scans = baselineRemovalMedian(level1)
    print "Finished the Baseline Subtraction for OBSID= %i, (0x%x)"%(myObsid, myObsid)
    print
else:
    scans = obs.level1
### Finished the Baseline Subtraction ###
###########################################################################

###########################################################################
###                            Destriping                               ###
###########################################################################
if useDestriper:
    print
    print "Destriper run for OBSID= %i, (0x%x)"%(myObsid, myObsid)
    arrays = ["PSW", "PMW", "PLW"]
    pixelSize = [6, 10, 14]   # Map pixel size in arcsec for PSW, PMW, PLW respectively
    maps = []
    #
    # Using the Level 1 context, run the destriper as an input to the map making
    for iArray in range(len(arrays)):
        scans, map, diag, p4, p5 = destriper(level1=scans,\
            pixelSize=pixelSize[iArray], offsetFunction='perScan',\
            array=arrays[iArray], polyDegree=0, kappa=5.0, iterThresh=1.0E-10,\
            l2DeglitchRepeat=100, l2DeglitchAlgorithm='sigmaKappa',\
            iterMax=100, l2IterMax=5, nThreads=2, withMedianCorrected=True,\
            brightSource=True, useSink=False, storeTod=False)
        #
        # Save the diagnostic product
        if obs.level2.refs['pdd'+arrays[iArray]] != None:
            obs.level2.refs.remove('pdd'+arrays[iArray])
        obs.level2.setProduct('psrc'+arrays[iArray]+'diag', diag)
        #
        # Keep the destriper maps for inspection
        maps.append(map)
    print "Finished the Destriper run for OBSID= %i, (0x%x)"%(myObsid, myObsid)
    print
### Finished Destriping ###
###########################################################################

###########################################################################
### Update Level 1 with the destriped/baseline-subtracted timeline data ###
###########################################################################
obs.level1 = scans
#
### Finished Update ###
###########################################################################
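# The baselineRemovalMedian task used above subtracts, per scan leg and per
# bolometer, the median signal level of that leg, which suppresses stripes
# caused by offset differences between scans. A minimal sketch in plain
# Python (not the HIPE task; the example timeline is hypothetical):

```python
from statistics import median

def remove_median_baseline(scan_signal):
    """Subtract the scan's median level from one bolometer timeline."""
    m = median(scan_signal)
    return [s - m for s in scan_signal]

# A flat timeline with one source-like spike keeps the spike, loses the offset:
print(remove_median_baseline([10.0, 10.0, 30.0, 10.0, 10.0]))  # [0.0, 0.0, 20.0, 0.0, 0.0]
```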
###########################################################################
###                             Mapmaking                               ###
###########################################################################
#
# Naive map maker
# For an alternative weighted error map use (requires the Channel Noise Table
# Calibration Product):
# naiveScanMapper(scans, array="PSW", method=WeightedVariance, chanNoise=chanNoise)
print 'Starting Naive Map maker'
mapPlw = naiveScanMapper(scans, array="PLW", method=UnweightedVariance)
mapPmw = naiveScanMapper(scans, array="PMW", method=UnweightedVariance)
mapPsw = naiveScanMapper(scans, array="PSW", method=UnweightedVariance)
#
# Update the Level 2 (point source maps) Context in the Observation Context
# Clean the context of pre-HIPE 10 products
if obs.level2.refs['PSW'] != None: obs.level2.refs.remove('PSW')
if obs.level2.refs['PMW'] != None: obs.level2.refs.remove('PMW')
if obs.level2.refs['PLW'] != None: obs.level2.refs.remove('PLW')
obs.level2.setProduct("psrcPSW", mapPsw)
obs.level2.setProduct("psrcPMW", mapPmw)
obs.level2.setProduct("psrcPLW", mapPlw)
#
print "Finished the map making for OBSID= %i, (0x%x)"%(myObsid, myObsid)
### Finished the Mapmaking ###
###########################################################################
#
print
print "Completed the processing of OBSID= %i, (0x%x)"%(myObsid, myObsid)
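# For reference, the "naive" map making done by naiveScanMapper above simply
# bins each timeline sample into the map pixel its sky position falls in and
# averages the samples per pixel. A minimal sketch in plain Python (not the
# HIPE task; integer pixel indices stand in for real projected sky positions):

```python
def naive_map(samples, nx, ny):
    """samples: iterable of (ix, iy, flux); returns an ny x nx map of per-pixel means."""
    total = [[0.0] * nx for _ in range(ny)]
    hits = [[0] * nx for _ in range(ny)]
    for ix, iy, flux in samples:
        total[iy][ix] += flux
        hits[iy][ix] += 1
    # Pixels with no hits get NaN, mirroring blank map regions
    return [[total[j][i] / hits[j][i] if hits[j][i] else float('nan')
             for i in range(nx)] for j in range(ny)]

m = naive_map([(0, 0, 1.0), (0, 0, 3.0), (1, 1, 5.0)], 2, 2)
print(m[0][0], m[1][1])  # 2.0 5.0
```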