LIGO-G030657-00-E
How to Develop a LIGO Search
Peter Shawhan (LIGO / Caltech)
NSF Review, November 18, 2003

Initial Exploration
I have an idea for a new search algorithm…
May use Matlab to refine it
- Quick development, built-in visualization, scripting
May want to look at some real data
- Tools for remote frame data retrieval: guild, getFrames (both require a valid LDAS username/password)
- Matlab function to read from a frame file: frextract

Implementing the Algorithm
I'll write code in C
Link to LAL
- Standard data structures, input/output conventions
- Fourier transforms and other signal-processing functions
- Error reporting and handling scheme
- Does impose a certain programming style
I will want to structure my search code intelligently
- Separate the algorithm itself from data input
- Put all algorithm parameters into a structure
May add enhancements to LAL package(s), or add a new one
Can use existing code in the CVS repository as examples

Testing and Tuning
Two approaches, depending on where I ultimately want to run the code:
Dynamic Shared Object (DSO) [LALWrapper]
- Designed to slot into LDAS
- Test / tune with the "standalone wrapper" running on an interactive machine
- Use LDAS to construct an input data file (ilwd format)
Self-contained C program [LALApps]
- Use LAL support functions to read data, calibration, etc. from disk
May write to a verbose log file for debugging, dump intermediate products and view them with Matlab, etc.
Running the Search on Lots of Data
First, I have to know what data to analyze…
- Web site with lists of "segments", with data-quality flags
- segwizard graphical interface, segments Tcl library
Run a DSO in LDAS
- Tool for remote job execution: ldasjob Tcl library
- Searches generally use a "loop script" written in Tcl
- Event candidates go into the LDAS database
- Other output can go to disk files
Run a self-contained C program on a Condor cluster
- Again, need a script to define all the jobs with appropriate parameters
- Write a script to construct a "directed acyclic graph" (DAG)
- Output goes to disk files (e.g. event candidates in LIGO_LW format, using LAL functions)
- Grid computing is a natural extension of this approach

Statistical Analysis
Coincidence and statistical analysis
- Tools to retrieve event candidates from the LDAS database: guild, getMeta
- In either environment, event candidates end up in LIGO_LW files
- Event lists can be read into Matlab (readMeta function), ROOT (eventTool extension), or a C program (metaio parsing library)
The details of the analysis depend on the search
- Coincidence requirements
- Vetoes, other cuts
- Background estimation
May do follow-up processing, e.g. cross-correlate raw data
- Full analysis "pipeline" might be quite complicated
- Goal is to automate as much as possible, focus on the high-level decisions