\documentclass[twoside,11pt]{article}
% Standard packages: accented characters, fonts, graphics
\usepackage{url}
\usepackage[latin1]{inputenc}
\usepackage[T1]{fontenc}
\usepackage[english]{babel}
\usepackage{graphicx}   % package needed for PDF output
\usepackage{palatino}
% Extended mathematical symbols
\usepackage{amssymb}
% Definitions for Sophya documents
\usepackage{defsophya}
% Index generation
\usepackage{makeidx}
\usepackage[ps2pdf,bookmarks,bookmarksnumbered,%
urlcolor=blue,citecolor=blue,linkcolor=blue,%
pagecolor=blue,%hyperindex,%
colorlinks=true,hyperfigures=true,hyperindex=true
]{hyperref}

\makeindex   % Index generation

\begin{document}

\begin{titlepage}
% The title page - top of the page with the title of the paper
\titrehp{Sophya \\ An overview }
% Authors list
\auteurs{
R. Ansari & ansari@lal.in2p3.fr \\
E. Aubourg & aubourg@hep.saclay.cea.fr \\
G. Le Meur & lemeur@lal.in2p3.fr \\
C. Magneville & cmv@hep.saclay.cea.fr \\
S. Henrot-Versille & versille@in2p3.fr
}
% \auteursall
% The title page - bottom of the page with the paper number
\vspace{1cm}
\begin{center}
{\bf \Large Sophya Version: 1.5 (V\_Dec2002) }
% Document revision 1.0
\end{center}
\titrebp{1}
\end{titlepage}

\tableofcontents
\newpage

\section{Introduction}
{\bf SOPHYA} ({\bf SO}ftware for {\bf PHY}sics {\bf A}nalysis) is a collection
of C++ classes designed for numerical and physics analysis software development.
Our goal is to provide easy-to-use, yet powerful classes which can be used by
scientists. We have decided to use existing numerical analysis libraries as much
as possible, encapsulating them whenever appropriate.
Although some of the modules in SOPHYA have been designed with the specific goal
of providing some of the tools for the Planck-HFI data processing software, most
of the packages presented here have a more general scope than the CMB analysis
and Planck mission problem.
\par \vspace*{2mm}
This document presents only a brief overview of the class library, mainly from
the user's point of view. A more complete description can be found in the
reference manual, available from the SOPHYA web site:
% {\bf http://www.sophya.org}.
\href{http://www.sophya.org}{http://www.sophya.org}.
\par \vspace*{2mm}
The source directory tree
\footnote{ CVS: cvsserver.lal.in2p3.fr:/exp/eros/CVSPlanck}
is organised into a number of modules.
\begin{itemize}
\item[] {\bf Mgr/} Scripts for code management, makefile generation
and software installation
\item[] {\bf BaseTools/} General architecture support classes such as
{\tt PPersist, NDataBlock}, a few utility classes such as the dynamic
variable list manager ({\tt DVList}), as well as the basic set of
exception classes used in SOPHYA.
\item[] {\bf TArray/} Template numerical arrays, vectors and matrices \\
({\tt TArray TMatrix TVector } \ldots)
\item[] {\bf NTools/} Some standard numerical analysis tools
(linear and non-linear parameter fitting, FFT, \ldots)
\item[] {\bf HiStats/} Histogramming and data set handling classes (tuples) \\
({\tt Histo Histo2D NTuple XNTuple} \ldots)
\item[] {\bf SkyMap/} Local and full sky maps, and a few geometry handling
utility classes. \\
({\tt PixelMap, LocalMap, SphericalMap, \ldots})
\item[] {\bf SUtils/} A few utility classes, such as the {\tt DataCards}
class, as well as string manipulation functions in C and C++.
\item[] {\bf SysTools/} Classes implementing an interface to various
OS specific services, such as shared object and dynamic link handling.
\end{itemize}
The modules listed below are more closely related to the CMB (Cosmic Microwave
Background) data analysis problem:
\begin{itemize}
\item[] {\bf SkyT/} Classes for spectral emission and detector frequency
response modelling \\
({\tt SpectralResponse, RadSpectra, BlackBody} \ldots)
\item[] {\bf Samba/} Spherical harmonic analysis, noise generators \ldots
\end{itemize}
The following modules contain the interface classes to external libraries:
\begin{itemize}
\item[] {\bf FitsIOServer/} Classes for handling file input-output in FITS
format using the cfitsio library.
\item[] {\bf LinAlg/} Interface with the LAPACK linear algebra package
\item[] {\bf IFFTW/} Interface with the FFTW package (libfftw.a)
\item[] {\bf XAstroPack/} Interface to some common astronomical computation
libraries. Presently, this module uses an external library extracted from the
{\bf Xephem} source code. The corresponding source code is also available from
the SOPHYA CVS repository, module {\bf XephemAstroLib}.
\end{itemize}
The following modules each contain a set of related programs using the SOPHYA
library:
\begin{itemize}
\item[] {\bf Tests/} Simple test programs
\item[] {\bf PrgUtil/} Various utility programs (runcxx, scanppf, scanfits, \ldots)
\item[] {\bf PrgMap/} Programs performing operations on sky maps: projections,
power spectrum in harmonic space, \ldots
\item[] {\bf PMixer/} skymixer and related programs
\end{itemize}
As a companion to SOPHYA, the {\bf (s)piapp} interactive data analysis program
is built on top of SOPHYA and the {\bf PI} GUI class library and application
framework. The {\bf PI} ({\bf P}eida {\bf I}nteractive) development started in
1995, in the EROS
\footnote{EROS: {\bf E}xp\'erience de {\bf R}echerche d'{\bf O}bjets {\bf S}ombres - http://eros.in2p3.fr}
microlensing search collaboration, with PEIDA++
\footnote{PEIDA++: The EROS data analysis class library - http://www.lal.in2p3.fr/recherche/eros/PeidaDoc/}.
The {\bf PI} documentation and the {\bf piapp} user's guide are available from
\href{http://www.sophya.org}{http://www.sophya.org}.
{\bf PI} is organized into the following modules:
\begin{itemize}
\item[] {\bf PI/} Portable GUI class library and application development
framework kernel.
\item[] {\bf PIGcont/} Contour-plot drawing classes.
\item[] {\bf PIext/} Specific drawers and adapters for SOPHYA objects, and the
{\bf piapp} interactive data analysis framework.
\item[] {\bf ProgPI/} Interactive analysis tool main program and pre-loaded
modules.
\end{itemize}
Modules containing examples and demo programs:
\begin{itemize}
\item[] {\bf Examples/} Sample SOPHYA code, example programs and makefiles
(auto\_makefile and ex\_makefile).
\item[] {\bf DemoPIApp/} Sample scripts and programs for the (s)piapp
interactive analysis tools.
\end{itemize}

\newpage
\section{Using Sophya}
Basic usage of Sophya classes is described in the following sections.
Complete Sophya documentation can be found at our web site
{\bf http://www.sophya.org}.

\subsection{Environment variables}
Two environment variables, {\bf SOPHYABASEREP} and {\bf SOPHYACXX}, are used to
define the path where the Sophya libraries and executables are installed.
{\bf SOPHYABASEREP} defines the base directory path and {\bf SOPHYACXX} the
name of the C++ compiler. The complete path is built using {\bf SOPHYABASEREP},
the operating system name (as obtained by the {\tt uname} command), and the
compiler name.
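For example, on a {\tt Linux} system using the GNU g++ compiler, these two
variables could be set as follows in a csh session (the installation directory
shown here is only an illustration, to be replaced by the actual Sophya
installation path):
\begin{verbatim}
csh> setenv SOPHYABASEREP /usr/local/Sophya
csh> setenv SOPHYACXX g++
\end{verbatim}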
The list below shows the complete paths for a {\tt Linux} system, using the
GNU g++ compiler:
\begin{itemize}
\item \$SOPHYABASEREP/Include : Include (.h) files
\item \$SOPHYABASEREP/Linux-g++/Libs : Path for the archive libraries (.a)
\item \$SOPHYABASEREP/Linux-g++/ShLibs : Shared library path (.so)
\item \$SOPHYABASEREP/Linux-g++/Exec : Executable file path
\end{itemize}
In order to use the shared libraries, the {\bf LD\_LIBRARY\_PATH} variable
should contain the Sophya shared library path
({\tt \$SOPHYABASEREP/Linux-g++/ShLibs} when using the g++ compiler on Linux).
For modules using external libraries, the {\bf EXTLIBDIR} environment variable
should contain the path to these libraries and the corresponding include files.
C-FitsIO and FFTW include files should be accessible through: \\
{\tt \$EXTLIBDIR/Include/FitsIO } \\
{\tt \$EXTLIBDIR/Include/FFTW } \\
The corresponding libraries are expected to be found in: \\
{\tt \$EXTLIBDIR/Linux-g++/Libs} \\

\subsection{User makefiles}
The file {\tt \$SOPHYABASEREP/Include/MakefileUser.h} defines the compilation
flags and the list of Sophya libraries. It should be included in the user's
makefile. The default compilation rules assume that the object (.o) and
executable files are put in the following directories: \\
{\tt \$HOME/`uname`-\$SOPHYACXX/Objs} \\
{\tt \$HOME/`uname`-\$SOPHYACXX/Exec}. \\
In the case of a {\tt Linux} system using {\tt g++} as the C++ compiler, these
two directories translate to
{\tt \$HOME/Linux-g++/Objs} and {\tt \$HOME/Linux-g++/Exec}.
The GNU make program should be used.
\par
The file {\tt Examples/auto\_makefile} defines the rules to compile a given
source program and link it against the Sophya libraries to produce an
executable. The example below shows the steps to compile a program named
{\tt trivial.cc}.
\begin{verbatim}
csh> cp Examples/auto_makefile makefile
csh> cp Examples/ex1.cc trivial.cc
csh> make trivial
\end{verbatim}
The {\tt make} command compiles the {\tt trivial.cc} file and links it against
the Sophya libraries. The file {\tt Examples/ex\_makefile} provides another
example makefile.

\subsection{The runcxx program}
\index{runcxx}
{\bf runcxx} is a simple program which can be used to compile, link and run
simple C++ programs. It handles the creation of a complete program file,
containing the basic set of C++ include files, the necessary include files for
the SOPHYA SysTools, TArray, HiStats and NTools modules, and the main program
with exception handling. Other Sophya modules can be included using the
{\tt -import} flag. Additional include files can be specified using the
{\tt -inc} flag.
\begin{verbatim}
csh> runcxx -h
SOPHYA Version 1.1 Revision 0 (V_Fev2001) -- Feb 28 2001 11:19:17 cxx
 runcxx : compiling and running of a piece of C++ code
 Usage: runcxx [-compopt CompileOptions] [-linkopt LinkOptions]
        [-tmpdir TmpDirectory] [-f C++CodeFileName]
        [-inc includefile] [-inc includefile ...]
        [-import modulename] [-import modulename ...]
        [-uarg UserArg1 UserArg2 ...]
 if no file name is specified, read from standard input
 modulenames: SkyMap, Samba, SkyT, FitsIOServer, LinAlg, IFFTW
\end{verbatim}
Most examples in this manual can be tested using runcxx. The example below
shows how to compile, link and run a sample piece of code.
\begin{verbatim}
// File example.icc
Matrix a(3,3);
a = IdentityMatrix(1.);
cout << a ;

// Executing this sample code
csh> runcxx -f example.icc
\end{verbatim}

\subsection{The scanppf program}
{\bf scanppf} is a simple SOPHYA application which can be used to check PPF
files and list their contents.
\begin{verbatim}
csh> scanppf -h
PIOPersist::Initialize() Starting Sophya Persistence management service
SOPHYA Version 1.4 Revision 0 (V_Nov2002) -- Nov 15 2002 10:32:12 cxx
 Usage: scanppf filename [s/n/a0/a1/a2/a3]
   s[=default} : Sequential reading of objects
   n : Object reading at NameTags
   a0...a3 : Tag List with PInPersist.AnalyseTags(0...3)
\end{verbatim}

\newpage
\section{Copy constructor and assignment operator}
In C++, objects can be copied by assignment or by initialisation.
Copying by initialisation corresponds to creating an object and initialising
its value through the copy constructor. The copy constructor has its first
argument as a reference, or const reference, to the object's class type. It can
have additional arguments, if default values are provided. Copying by
assignment applies to an existing object and is performed through the
assignment operator (=). The class declaration below shows a copy constructor
and an assignment operator for objects of the same type:
\begin{verbatim}
class MyObject {
public:
  MyObject();                                  // Default constructor
  MyObject(MyObject const & a);                // Copy constructor
  MyObject & operator = (MyObject const & a);  // Assignment operator
};
\end{verbatim}
Copy constructors play an important role, as they are called when class
objects are passed by value, returned by value, or thrown as an exception.
\begin{verbatim}
// A function declaration with an argument of type MyObject,
// passed by value, and returning a MyObject
MyObject f(MyObject x)
{
  MyObject r;
  ...
  return(r);   // Copy constructor is called here
}
// Calling the function :
MyObject a;
f(a);          // Copy constructor called for a
\end{verbatim}
It should be noted that the C++ syntax is ambiguous for the assignment
operator: {\tt MyObject x; x=y; } and {\tt MyObject x=y;} have different
meanings.
\begin{verbatim}
MyObject a;        // default constructor call
MyObject b(a);     // copy constructor call
MyObject bb = a;   // identical to bb(a) : copy constructor call
MyObject c;        // default constructor call
c = a;             // assignment operator call
\end{verbatim}
As a general rule in SOPHYA, objects which implement reference sharing on
their data members have a copy constructor which shares the data, while the
assignment operator copies or duplicates the data.

\newpage
\section{Module BaseTools}
{\bf BaseTools} contains utility classes such as {\tt DVList}, a hierarchy of
exception classes for Sophya, a template class {\tcls{NDataBlock}} for
handling reference counting on numerical arrays, as well as classes providing
the services for implementing simple serialisation.
\vspace*{5mm}

\subsection{SOPHYA persistence}
\index{PPersist} \index{PInPersist} \index{POutPersist}
\begin{figure}[hbt]
\dclsa{PPersist}
\dclsbb{PIOPersist}{PInPersist}
\dclsb{POutPersist}
\caption{partial class diagram for classes handling persistence in Sophya}
\end{figure}
A simple persistence mechanism is defined in SOPHYA. Its main features are:
\begin{itemize}
\item[] A portable file format, containing the description of the data
structures and object hierarchy: \\
{\bf PPF}, the {\bf P}ortable {\bf P}ersistence file {\bf F}ormat. \index{PPF}
\item[] Handling of read/write for multiply referenced objects.
\item[] All write operations are carried out using sequential access only.
This also holds for read operations, unless positional tags are used. SOPHYA
persistence services can thus be used to transfer objects through network
links.
\item[] The serialisation (reading/writing) of objects of a given class is
implemented through a handler object. The handler class inherits from the
{\tt PPersist} class.
\item[] A run-time registration mechanism is used in conjunction with RTTI
(Run Time Type Identification) for identifying handler classes when reading
{\bf PInPersist} streams, or for associating handlers with data objects
({\bf AnyDataObj}) for write operations.
\end{itemize}
A complete description of the SOPHYA persistence mechanism and guidelines for
writing delegate classes for handling object persistence is beyond the scope
of this document. The most useful methods for using Sophya persistence are
listed below:
\begin{itemize}
\item[] {\tt POutPersist::PutObject(AnyDataObj \& o)} \\
Writes the data object {\bf o} to the output stream.
\item[] {\tt POutPersist::PutObject(AnyDataObj \& o, string tagname)} \\
Writes the data object {\bf o} to the output stream, associated with an
identification tag {\bf tagname}.
\item[] {\tt PInPersist::GetObject(AnyDataObj \& o)} \\
Reads the next object in the stream into {\bf o}. An exception is generated
for incompatible object types.
\item[] {\tt PInPersist::GetObject(AnyDataObj \& o, string tagname)} \\
Reads the object associated with the tag {\bf tagname} into {\bf o}.
An exception is generated for incompatible object types.
\end{itemize}
The operators {\tt operator << (POutPersist ...) } and
{\tt operator >> (PInPersist ...) } are often overloaded to perform
{\tt PutObject()} and {\tt GetObject()} operations, as illustrated in the
example below:
\begin{verbatim}
// Creating and filling a histogram
Histo hw(0.,10.,100);
...
// Writing histogram to a PPF stream
POutPersist os("hw.ppf");
os << hw;
// Reading a histogram from a PPF stream
Histo hr;
PInPersist is("hr.ppf");
is >> hr;
\end{verbatim}
The {\bf scanppf} program, described earlier, can be used to list the contents
of a PPF file. \index{scanppf}

\subsection{\tcls{NDataBlock}}
\index{\tcls{NDataBlock}}
\begin{figure}[hbt]
\dclsbb{AnyDataObj}{\tcls{NDataBlock}}
\dclsbb{PPersist}{\tcls{FIO\_NDataBlock}}
\end{figure}
The {\bf \tcls{NDataBlock}} class is designed to handle reference counting and
sharing of memory blocks (contiguous arrays) for numerical data types.
Initialisation, resizing and basic arithmetic operations, as well as
persistence handling services, are provided. The persistence handler class
({\tt \tcls{FIO\_NDataBlock}}) ensures that a single copy of the data is
written for multiply referenced objects, and the data is shared among objects
when reading.
\par
The example below shows how NDataBlock objects are written to a PPF stream,
using the overloaded operator {\tt <<} :
\begin{verbatim}
#include "fiondblock.h"
// ...
POutPersist pos("aa.ppf");
NDataBlock<r_8> rdb(40);
rdb = 567.89;
pos << rdb;
// We can also use the PutObject method
NDataBlock<int_4> idb(20);
idb = 123;
pos.PutObject(idb);
\end{verbatim}
The following sample code shows how to read back the objects from the created
PPF file:
\begin{verbatim}
PInPersist pis("aa.ppf");
NDataBlock<r_8> rdb;
pis >> rdb;
cout << rdb;
NDataBlock<int_4> idb;
pis >> idb;
cout << idb;
\end{verbatim}

\subsection{Using DVList}
\index{DVList} \index{MuTyV}
\begin{figure}[hbt]
\dclsbb{AnyDataObj}{DVList}
\dclsbb{PPersist}{\tclsc{ObjFileIO}{DVList}}
\end{figure}
{\bf DVList} objects can be used to create and manage lists of values,
associated with names. A list of (MuTyV, name(string)) pairs is maintained by
DVList objects. {\bf MuTyV} is a simple class capable of holding string,
integer, float or complex values, providing easy conversion methods between
these types.
\begin{verbatim}
// Using MuTyV objects
MuTyV s("hello");       // string type value
MuTyV x;
x = "3.14159265";       // string type value, ASCII representation for Pi
double d = x;           // x converted to double = 3.14159265
x = 314;                // x contains the integer value 314
// Using DVList
DVList dvl;
dvl("Pi") = 3.14159265;          // float value, named Pi
dvl("Log2") = 0.30102999;        // float value, named Log2
dvl("FileName") = "myfile.fits"; // string value, named FileName
// Printing DVList object
cout << dvl;
\end{verbatim}

\newpage
\section{Module TArray}
\index{\tcls{TArray}}
The {\bf TArray} module contains template classes for handling standard
operations on numerical arrays. Using the class {\tt \tcls{TArray} }, it is
possible to create and manipulate numerical arrays with up to 5 dimensions and
various element types {\tt (int, float, double, complex, \ldots)}. The include
file {\tt array.h} declares all the classes and definitions in module TArray.
{\bf Array} is a typedef for arrays with double precision floating point
elements: \\
{\tt typedef TArray$<$r\_8$>$ Array ; }
\begin{figure}[hbt]
\dclsccc{AnyDataObj}{BaseArray}{\tcls{TArray}}
\dclsbb{PPersist}{\tcls{FIO\_TArray}}
\end{figure}

\subsection{Using arrays}
\index{Sequence} \index{RandomSequence} \index{RegularSequence}
\index{EnumeratedSequence}
The example below shows basic usage of arrays: creation, initialisation and
arithmetic operations. Different kinds of {\bf Sequence} objects can be used
for initialising arrays.
\begin{figure}[hbt]
\dclsbb{Sequence}{RandomSequence}
\dclsb{RegularSequence}
\dclsb{EnumeratedSequence}
\end{figure}
\index{\tcls{TArray}}
\begin{verbatim}
// Creating and initialising a 1-D array of integers
TArray<int> ia(5);
EnumeratedSequence es;
es = 24, 35, 46, 57, 68;
ia = es;
cout << "Array ia = " << ia;
// 2-D array of floats
TArray<r_4> b(6,4), c(6,4);
// Initializing b with a constant
b = 2.71828;
// Filling c with random numbers
c = RandomSequence();
// Arithmetic operations
TArray<r_4> d = b+0.3f*c;
cout << "Array d = " << d;
\end{verbatim}
The copy constructor shares the array data, while the assignment operator
copies the array elements, as illustrated in the following example:
\begin{verbatim}
TArray<int> a1(4,3);
a1 = RegularSequence(0,2);
// Arrays a2 and a1 share their data
TArray<int> a2(a1);
// a3 and a1 have the same size and identical elements
TArray<int> a3;
a3 = a1;
// Changing one of the a2 elements
a2(1,1,0) = 555;
// a1(1,1) is also changed to 555, but not a3(1,1)
cout << "Array a1 = " << a1;
cout << "Array a3 = " << a3;
\end{verbatim}

\subsection{Matrices and vectors}
\index{\tcls{TMatrix}} \index{\tcls{TVector}}
\begin{figure}[hbt]
\dclsccc{\tcls{TArray}}{\tcls{TMatrix}}{\tcls{TVector}}
\end{figure}
Vectors and matrices are 2-dimensional arrays. The array size along one
dimension is equal to 1 for vectors: column vectors have {\tt NCols() = 1} and
row vectors have {\tt NRows() = 1}. Mathematical expressions involving
matrices and vectors can easily be translated into C++ code using
{\tt TMatrix} and {\tt TVector} objects. {\bf Matrix} and {\bf Vector} are
typedefs for double precision floating point matrices and vectors. The
operator {\bf *} between matrices is redefined to perform matrix
multiplication. One can then write: \\
\begin{verbatim}
// We create a row vector
Vector v(1000, BaseArray::RowVector);
// Initialize values with a random sequence
v = RandomSequence();
// Compute the squared vector norm
double norm2 = (v*v.Transpose()).toScalar();
cout << "Norm^2(v) = " << norm2 << endl;
\end{verbatim}
This module contains basic array and matrix operations such as the Gauss
matrix inversion algorithm, which can be used to solve linear systems, as
illustrated by the example below:
\begin{verbatim}
#include "sopemtx.h"
// ...
// Creation of a random 5x5 matrix
Matrix A(5,5);
A = RandomSequence(RandomSequence::Flat);
Vector X0(5);
X0 = RandomSequence(RandomSequence::Gaussian);
// Computing B = A*X0
Vector B = A*X0;
// Solving the system A*X = B
Vector X;
LinSolve(A, B, X);
// Checking the result
Vector diff = X-X0;
cout << "X-X0= " << diff ;
double min,max;
diff.MinMax(min, max);
cout << " Min(X-X0) = " << min << " Max(X-X0) = " << max << endl;
\end{verbatim}

\subsection{Working with sub-arrays and Ranges}
\index{Range}
A powerful mechanism is included in the array classes for working with
sub-arrays. The class {\bf Range} can be used to specify a range of array
indices in any of the array dimensions. Any regularly spaced index range can
be specified, using the {\tt start} and {\tt end} index and an optional step
(or stride).
It is also possible to specify the {\tt start} index and the number of
elements:
\begin{center}
\begin{tabular}{ll}
\multicolumn{2}{c}{ {\bf Range} {\tt (start=0, end=0, size=1, step=1) } } \\[2mm]
\hline \\
{\bf Range} {\tt r(3,6); }      & index range 3,4,5,6 \\
{\bf Range} {\tt r(7,0,3); }    & index range 7,8,9 \\
{\bf Range} {\tt r(10,0,3,5); } & index range 10,15,20 \\
\end{tabular}
\end{center}
In the following example, a simple low-pass filter on a one dimensional
stream (Vector) is implemented using sub-arrays:
\begin{verbatim}
// Input Vector containing a noisy periodic signal
Vector in(1024), out(1024);
in = RandomSequence(RandomSequence::Gaussian, 0., 1.);
for(int kk=0; kk<in.Size(); kk++)
  in(kk) += 2.*sin(kk*0.05);
// Low-pass filter : each output sample is the mean of the
// 2*w+1 neighbouring input samples, selected through a Range
int w = 2;
for(int kk=w; kk<in.Size()-w; kk++)
  out(kk) = in(Range(kk-w, kk+w)).Sum()/(2.*w+1.);
\end{verbatim}
Sub-arrays keep a reference to the original array data. The sample code below
creates a matrix B, extracts a sub-matrix BS, and writes both objects to a PPF
stream, before reading them back:
\begin{verbatim}
{
// Creating a 3x4 matrix, filled with a regular sequence
Matrix B(3,4);
B = RegularSequence(10,5);
// BS is a 2x2 sub-matrix of B (rows 1-2, columns 2-3)
Matrix BS = B(Range(1,2), Range(2,3));
// Writing B and BS to a PPF file
POutPersist pos("aas.ppf");
pos << B;
pos << BS;
}
{
Matrix B, BS;
PInPersist pis("aas.ppf");
pis >> B >> BS;
// BS is a sub-array of B, modifying BS changes also B
BS(1,1) = 98765.;
cout << " B , BS after BS(1,1) = 98765. " << B << BS << endl;
}
\end{verbatim}
The execution of this sample code creates the file {\tt aas.ppf} and its
output is reproduced here. Notice that the array hierarchy is recovered:
BS is a sub-array of B, and modifying BS also changes the corresponding
element in B.
\begin{verbatim}
 B , BS after BS(1,1) = 98765.
--- TMatrix(NRows=3, NCols=4) ND=2 SizeX*Y*...= 4x3 ---
10 15 20 25
30 35 40 45
50 55 60 98765
--- TMatrix(NRows=2, NCols=2) ND=2 SizeX*Y*...= 2x2 ---
40 45
60 98765
\end{verbatim}
\centerline{\bf Warning: }
There is a drawback to this behaviour: only a single copy of a given array
object is written to a PPF stream, even if the array is modified (without
being resized) between successive write operations.
\begin{verbatim}
{
POutPersist pos("mca.ppf");
TArray<int> ia(5,3);
ia = 8;
pos << ia;
ia = 16;
pos << ia;
ia = 32;
pos << ia;
}
\end{verbatim}
Only a single copy of the data is effectively written to the output PPF file,
corresponding to the value 8 for the array elements. When we read the three
arrays from the file mca.ppf, the same array elements are obtained three times
(all elements equal to 8):
\begin{verbatim}
{
PInPersist pis("mca.ppf");
TArray<int> ib;
pis >> ib;
cout << " First array read from mca.ppf : " << ib;
pis >> ib;
cout << " Second array read from mca.ppf : " << ib;
pis >> ib;
cout << " Third array read from mca.ppf : " << ib;
}
\end{verbatim}

\subsubsection{ASCII streams}
The {\bf WriteASCII} method can be used to dump an array to an ASCII formatted
file, while the {\bf ReadASCII} method can be used to decode ASCII formatted
files. Spaces or tabs can be used as separators. Complex numbers should be
specified as a pair of comma separated real and imaginary parts, enclosed in
parentheses.
\begin{verbatim}
{
// Creating array A and writing it to an ASCII file (aaa.txt)
Array A(4,6);
A = RegularSequence(0.5, 0.2);
ofstream ofs("aaa.txt");
A.WriteASCII(ofs);
}
{
// Decoding the ASCII file aaa.txt
ifstream ifs("aaa.txt");
Array B;
sa_size_t nr, nc;
B.ReadASCII(ifs,nr,nc);
cout << " Array B; B.ReadASCII() from file " << endl;
cout << B ;
}
\end{verbatim}

\subsection{Complex arrays}
The {\bf TArray} module provides a few functions for manipulating arrays of
complex numbers (single and double precision). These functions are declared in
{\tt matharr.h}.
\begin{itemize}
\item[\bul] Creating a complex array through the specification of the real
and imaginary parts.
\item[\bul] Functions returning arrays corresponding to the real and imaginary
parts of a complex array: {\tt real(za) , imag(za) }
({\bf Warning:} Note that the present implementation does not provide shared
memory access to the real and imaginary parts.)
\item[\bul] Functions returning arrays corresponding to the modulus, phase,
and modulus squared of a complex array:
{\tt phase(za) , module(za) , module2(za) }
\end{itemize}
\begin{verbatim}
TVector<r_8> p_real(10, BaseArray::RowVector);
TVector<r_8> p_imag(10, BaseArray::RowVector);
p_real = RegularSequence(0., 0.5);
p_imag = RegularSequence(0., 0.25);
TVector< complex<r_8> > zvec = ComplexArray(p_real, p_imag);
cout << " :: zvec= " << zvec;
cout << " :: real(zvec) = " << real(zvec) ;
cout << " :::: imag(zvec) = " << imag(zvec) ;
cout << " :::: module2(zvec) = " << module2(zvec) ;
cout << " :::: module(zvec) = " << module(zvec) ;
cout << " :::: phase(zvec) = " << phase(zvec) ;
\end{verbatim}
The decoding of complex numbers from an ASCII formatted stream is illustrated
by the next example. As already mentioned, complex numbers should be specified
as a pair of comma separated real and imaginary parts, enclosed in
parentheses.
\begin{verbatim}
csh> cat zzz.txt
(1.,-1) (2., 2.5)  -3.  12.
-24. (-6.,7.) 14.2 (8.,64.)

// Decoding of complex numbers from an ASCII file
// Notice that the >> operator can be used instead of ReadASCII
TArray< complex<r_8> > Z;
ifstream ifs("zzz.txt");
ifs >> Z;
cout << " TArray< complex<r_8> > Z from file zzz.txt " << Z ;
\end{verbatim}

\subsection{Memory organisation}
{\tt \tcls{TArray} } can handle numerical arrays with various memory
organisations, as long as the spacing (step) along each axis is regular.
The five axes are labelled X, Y, Z, T, U. The examples below illustrate the
memory layout for a 2-dimensional array, with $N_x=4$ and $N_y=3$. The first
index runs along the X axis and the second index along the Y axis.
\begin{verbatim}
| (0,0)  (0,1)  (0,2)  (0,3) |
| (1,0)  (1,1)  (1,2)  (1,3) |
| (2,0)  (2,1)  (2,2)  (2,3) |
\end{verbatim}
In the first case, the array is completely packed
($Step_X=1, Step_Y=N_X=4$), with zero offset, while in the second case,
$Step_X=2, Step_Y=10, Offset=10$:
\begin{verbatim}
      |  0   1   2   3 |         | 10  12  14  16 |
Ex1   |  4   5   6   7 |   Ex2   | 20  22  24  26 |
      |  8   9  10  11 |         | 30  32  34  36 |
\end{verbatim}
For matrices and vectors, an optional argument ({\tt MemoryMapping}) can be
used to select the memory mapping; two basic schemes are available: \\
{\tt CMemoryMapping} and {\tt FortranMemoryMapping}. \\
When {\tt CMemoryMapping} is used, a given matrix line (row) is packed in
memory, while the columns are packed when {\tt FortranMemoryMapping} is used.
The first index when addressing the matrix elements (line number index) runs
along the Y-axis if {\tt CMemoryMapping} is used, and along the X-axis in the
case of {\tt FortranMemoryMapping}. Arithmetic operations between matrices
with different memory organisations are allowed, as long as the two matrices
have the same sizes (number of rows and columns).
The following code example and the corresponding output illustrate these two
memory mappings. The {\tt \tcls{TMatrix}::TransposeSelf() } method effectively
changes the matrix memory mapping, as does the
{\tt \tcls{TMatrix}::Transpose() } method when called without argument.
\begin{verbatim}
TArray<r_4> X(4,2);
X = RegularSequence(1,1);
cout << "Array X= " << X << endl;
TMatrix<r_4> X_C(X, true, BaseArray::CMemoryMapping);
cout << "Matrix X_C (CMemoryMapping) = " << X_C << endl;
TMatrix<r_4> X_F(X, true, BaseArray::FortranMemoryMapping);
cout << "Matrix X_F (FortranMemoryMapping) = " << X_F << endl;
\end{verbatim}
This code would produce the following output (X\_F = Transpose(X\_C)) :
\begin{verbatim}
Array X=
--- TArray ND=2 SizeX*Y*...= 4x2 ---
1, 2, 3, 4
5, 6, 7, 8

Matrix X_C (CMemoryMapping) =
--- TMatrix(NRows=2, NCols=4) ND=2 SizeX*Y*...= 4x2 ---
1, 2, 3, 4
5, 6, 7, 8

Matrix X_F (FortranMemoryMapping) =
--- TMatrix(NRows=4, NCols=2) ND=2 SizeX*Y*...= 4x2 ---
1, 5
2, 6
3, 7
4, 8
\end{verbatim}

\newpage
\section{Module HiStats}
\begin{figure}[hbt]
\dclsccc{AnyDataObj}{Histo}{HProf}
\dclsbb{AnyDataObj}{Histo2D}
\dclsbb{AnyDataObj}{NTuple}
\dclsb{XNTuple}
\caption{partial class diagram for histograms and ntuples}
\end{figure}
{\bf HiStats} contains classes for creating, filling, printing and performing
various operations on one or two dimensional histograms ({\tt Histo} and
{\tt Histo2D}) as well as profile histograms ({\tt HProf}). \\
This module also contains {\tt NTuple} and {\tt XNTuple}, which are more or
less equivalent to binary FITS tables.

\subsection{1D Histograms}
\index{Histo}
For 1D histograms, various numerical methods are provided, such as computing
means and sigmas, finding maxima, fitting, rebinning, integrating \dots \\
The example below shows how to create and fill a one dimensional histogram of
100 bins, from $-5.$ to $+5.$, with a Gaussian normal distribution, with
errors~:
\begin{verbatim}
#include "histos.h"
// ...
Histo H(-5.,5.,100);
H.Errors();
for(int i=0;i<25000;i++) {
  double x = NorRand();
  H.Add(x);
}
H.Print(80);
\end{verbatim}

\subsection{2D Histograms}
\index{Histo2D}
Most of these operations are also available for 2D histograms. 1D projections
or slices can be set~:
\begin{verbatim}
#include "histos2.h"
// ...
Histo2D H2(-1.,1.,100,0.,60.,50);
H2.SetProjX();          // create the 1D histo for the X projection
H2.SetBandX(25.,35.);   // create a 1D histo projection for 25.<y<35.
H2.SetBandX(35.,45.);   // create a 1D histo projection for 35.<y<45.
// ... filling of H2 ...
Histo *hbx1 = H2.HBandX(0);   // Get the first X band (25.<y<35.)
hbx1->Print(80);
Histo *hbx2 = H2.HBandX(1);   // Get the second X band (35.<y<45.)
hbx2->Print(80);
\end{verbatim}

\subsection{Profile Histograms}
\index{HProf}
Profile histograms ({\bf HProf}) contain the mean and the sigma of the
distribution of the values filled in each bin. The sigma can be changed to the
error on the mean. Once filled, a profile histogram behaves like a 1D
histogram, and most of the operations that can be performed on 1D histograms
may also be applied to profile histograms.

\subsection{Data tables (tuples)}
\index{NTuple}
NTuples are memory resident tables of 32-bit floating point values (float).
They are arranged in columns. Each line is often called an event. These
objects are frequently used to analyse data. Graphical tools (spiapp) can plot
one column against another, with various selection cuts. \\
Here is an example of creation and filling~:
\begin{verbatim}
#include "ntuple.h"
#include "srandgen.h"
// ...
char* nament[4] = {"i","x","y","ey"};
r_4 xnt[4];
NTuple NT(4,nament);
for(int i=0;i<5000;i++) {
  xnt[0] = i+1;
  xnt[1] = 5.*drandpm1();    // a random value between -5 and +5
  xnt[2] = 100.*exp(-0.5*xnt[1]*xnt[1]) + 1.;
  xnt[3] = sqrt(xnt[2]);
  xnt[2] += xnt[3] * NorRand();  // add a random gaussian error
  NT.Fill(xnt);
}
\end{verbatim}
XNTuples are more sophisticated NTuples: they accept various types of column
values (double, float, int, string, \ldots) and can handle very large data
sets, through swap space on disk.
\index{XNTuple}
The sample code below shows how to create an XNTuple object with four columns
(double, double, int, string). Several entries (lines) are then appended to
the table, which is saved to a PPF file.
\begin{verbatim}
#include "xntuple.h"
// ...
char * names[4] = {"X", "X2", "XInt","XStr"};
// XNTuple (Table) creation with 4 columns, of double (2),
// integer and string type
XNTuple xnt(2,0,1,1, names);
// Filling the NTuple
r_8 xd[2];
int_4 xi[2];
char xss[2][32];
char * xs[2] = {xss[0], xss[1]} ;
for(int i=0; i<50; i++) {
  xi[0] = i;
  xd[0] = i+0.5;
  xd[1] = xd[0]*xd[0];
  sprintf(xs[0],"X=%g", xd[0]);
  xnt.Fill(xd, NULL, xi, xs);
}
// Printing table info
cout << xnt ;
// Saving object into a PPF file
POutPersist po("xnt.ppf");
po << xnt ;
\end{verbatim}

\subsection{Writing, viewing \dots }
All these objects have been designed to be written to or read from a
persistent file. The following example shows how to write the previously
created objects into such a file~:
\begin{verbatim}
//-- Writing
{
char *fileout = "myfile.ppf";
string tag;
POutPersist outppf(fileout);
tag = "H";  outppf.PutObject(H,tag);
tag = "H2"; outppf.PutObject(H2,tag);
tag = "NT"; outppf.PutObject(NT,tag);
}  // the closing "}" destroys "outppf" and automatically closes the file!
\end{verbatim}
Sophya graphical tools (spiapp) can automatically display and operate on all
these objects.

\newpage
\section{Module NTools}
This module provides elementary tools for numerical integration, fitting,
sorting and ODE solving. FFTs are also provided (Mayer, FFTPack).

\subsection{Fitting}
\index{Fitting} \index{Minimisation}
Fitting is done with the two classes {\tt GeneralFit} and {\tt GeneralFitData},
and is based on the Levenberg-Marquardt method.
\index{GeneralFit} \index{GeneralFitData}
GeneralFitData is a class which provides a description of the data to be
fitted. GeneralFit is the fitter class. Parametrized functions can be given as
classes which inherit {\tt GeneralFunction}, or as simple C functions. Classes
of pre-defined functions are provided (see files fct1dfit.h and fct2dfit.h).
The user interface is very close to that of the CERN {\tt Minuit} fitter.
A number of objects (Histo, HProf \dots) are interfaced with GeneralFit and
can easily be fitted. \\
Here is a very simple example for fitting the previously created NTuple with
a Gaussian~:
\begin{verbatim}
#include "fct1dfit.h"
// ...

// Read from ppf file
NTuple nt;
{
PInPersist pis("myfile.ppf");
string tag = "NT";
pis.GetObject(nt,tag);
}

// Fill GeneralFitData
GeneralFitData mGdata(nt.NEntry());
for(int i=0; i<nt.NEntry(); i++) {
  // ... fill mGdata with the x, y and error values of entry i ...
}

// ... create the Gaussian fit function and the GeneralFit fitter (mFit),
//     connect the data and set the initial parameter values ...

// Perform the fit and check the result
int rcfit = mFit.Fit();
if(rcfit>0) {  // the fit has converged
  cout<<"Reduce_Chisquare = "<< ... <<endl;
  // ... print the fitted parameter values and their errors ...
}
\end{verbatim}
Polynomial fitting is also provided, through the {\tt Poly} class, as
illustrated below~:
\begin{verbatim}
// pol is a second degree polynomial (class Poly) with known
// coefficients, used to build a noisy parabolic data set
TVector<r_8> x(100);
TVector<r_8> y(100);
TVector<r_8> ey(100);
for(int i=0;i<100;i++) {
  x(i) = i;
  ey(i) = 10.;
  y(i) = pol((double) i) + ey(i)*NorRand();
  ey(i) *= ey(i);
}

TVector<r_8> errcoef;
Poly polfit;
polfit.Fit(x,y,ey,2,errcoef);
cout<<"Fit Result"<<endl;
// the fitted coefficients are in polfit, their errors in errcoef
\end{verbatim}

\subsection{Fourier transforms (FFT)}
\index{FFT} \index{FFTPackServer}
FFT servers provide Fourier transforms on vectors and arrays. The example
below uses the {\tt FFTPackServer} class to perform a forward transform:
\begin{verbatim}
TVector< complex<r_8> > in(32);
TVector< complex<r_8> > out;
in = RandomSequence();
FFTPackServer ffts;
ffts.setNormalize(true);   // To have normalized transforms
cout << " FFTServer info string= " << ffts.getInfo() << endl;
cout << "in= " << in << endl;
cout << " Calling ffts.FFTForward(in, out) : " << endl;
ffts.FFTForward(in, out);
cout << "out= " << out << endl;
\end{verbatim}

% \newpage
\section{Module SUtils}
Some utility classes and C/C++ string manipulation functions are gathered in
the {\bf SUtils} module.

\subsection{Using DataCards}
\index{DataCards}
The {\bf DataCards} class can be used to read parameters from a file. Each
line in the file starting with \@ defines a set of values associated with a
keyword.
In the example below, we read the parameters corresponding to the keyword
{\tt SIZE} from the file {\tt ex.d}. We suppose that {\tt ex.d} contains the
line: \\
{\tt @SIZE 400 250} \\
\begin{verbatim}
#include "datacards.h"
// ...
// Initialising DataCards object dc from file ex.d
DataCards dc( "ex.d" );
// Getting the first and second parameters for keyword SIZE
// We define a default value 100
int size_x = dc.IParam("SIZE", 0, 100);
int size_y = dc.IParam("SIZE", 1, 100);
cout << " size_x= " << size_x << " size_y= " << size_y << endl;
\end{verbatim}

\section{Module SysTools}
The {\bf SysTools} module contains classes implementing an interface to some
OS specific services. The classes {\bf ResourceUsage} \index{ResourceUsage}
and {\bf Timer} \index{Timer} provide access to information about resource
usage (memory, CPU, \ldots). The class {\bf Periodic} provides the services
needed to implement the execution of a periodic action. A basic interface to
POSIX threads \index{thread} is also provided through the \index{ZThread}
{\bf ZThread}, {\bf ZMutex} and {\bf ZSync} classes.

\subsection{Dynamic linker}
\index{PDynLinkMgr}
The class {\bf PDynLinkMgr} can be used for managing shared libraries at run
time. The example below shows the run time linking of a function: \\
{\tt extern "C" { void myfunc(); } } \\
\begin{verbatim}
#include "pdlmgr.h"
// ...
string soname = "mylib.so";
string funcname = "myfunc";
PDynLinkMgr dyl(soname);
DlFunction f = dyl.GetFunction(funcname);
if (f != NULL) {
  // Calling the function
  f();
}
\end{verbatim}

\subsection{CxxCompilerLinker class}
\index{CxxCompilerLinker}
This class provides the services to compile C++ code and build shared
libraries, using the same compiler and options which have been used to create
the SOPHYA shared library. The sample program below illustrates using this
class to build the shared library (myfunc.so) from the source file
myfunc.cc :
\begin{verbatim}
#include "cxxcmplnk.h"
// ...
string flnm = "myfunc.cc";
string oname, soname;
int rc;
CxxCompilerLinker cxx;
// The Compile method provides a default object file name
rc = cxx.Compile(flnm, oname);
if (rc != 0 ) {
  // Error when compiling ...
}
// The BuildSO method provides a default shared object file name
rc = cxx.BuildSO(oname, soname);
if (rc != 0 ) {
  // Error when creating shared object ...
}
\end{verbatim}

\newpage
\section{Module SkyMap}
\begin{figure}[hbt]
\dclsbb{AnyDataObj}{PixelMap}
\dclsccc{PixelMap}{SphericalMap}{SphereHEALPix}
\dclsc{SphereThetaPhi}
\dclsb{LocalMap}
\caption{partial class diagram for pixelization classes in Sophya}
\end{figure}
The {\bf SkyMap} module provides classes for creating, filling and reading
pixelized spherical maps and 2D local maps. The type of the values stored in
the pixels can be int, float, double, complex, etc., according to the
specialization of the template type.

\subsection{Spherical maps}
There are two kinds of spherical maps, according to the pixelization
algorithm. SphereHEALPix represents spheres pixelized following the HEALPix
algorithm (E. Hivon, K. Gorski)
\footnote{see the HEALPix Homepage: http://www.eso.org/kgorski/healpix/ },
while SphereThetaPhi represents spheres pixelized following an algorithm
developed at LAL-Orsay. The example below shows the creation and filling of a
SphereHEALPix with nside = 8 (i.e. 12*8*8 = 768 pixels):
\index{\tcls{SphereHEALPix}} \index{\tcls{SphereThetaPhi}}
\begin{verbatim}
#include "spherehealpix.h"
// ...
SphereHEALPix<r_8> sph(8);
for (int k=0; k< sph.NbPixels(); k++)
  sph(k) = (double)(10*k);
\end{verbatim}
SphereThetaPhi is used in a similar way, with an argument representing the
number of slices in theta (Euler angle) for a hemisphere.
\index{\tcls{SphereThetaPhi}}

\subsection{Local maps}
\index{\tcls{LocalMap}}
A local map is a 2 dimensional array, with i as column index and j as row
index. The map is supposed to lie on a plane tangent to the celestial sphere
at a point whose coordinates are (x0,y0) on the local map and (theta0, phi0)
on the sphere. The extent of the map is defined by the two angles covered,
respectively, by all the pixels in the x direction and all the pixels in the
y direction (SetSize()). The default value of (x0, y0) is the middle of the
map, the center of pixel (nx/2, ny/2). Internally, a map is first defined
within this reference plane and transported to the point (theta0, phi0) in
such a way that both axes are kept parallel to the meridians and parallels of
the sphere. The user can define a map with axes rotated with respect to these
reference axes (this rotation is characterized by the angle between the local
parallel and the desired x-axis -- method SetOrigin(...)).
The example below shows the creation and filling of a LocalMap with 4 columns
and 5 rows. The origin is set to its default value. The map covers a sphere
portion defined by two angles of 30 degrees (the methods \textit{SetOrigin()}
and \textit{SetSize()} must be called in order to completely define the map).
\begin{verbatim}
#include "localmap.h"
//..............
LocalMap<r_8> locmap(4,5);
for (int k=0; k<locmap.NbPixels(); k++)
  locmap(k) = (double)k;
locmap.SetOrigin();        // default origin
locmap.SetSize(30.,30.);   // angular sizes (degrees) in x and y
\end{verbatim}
Spherical and local maps can be saved to PPF files through their persistence
handler classes, as shown below:
\begin{verbatim}
POutPersist outppf("map.ppf");
FIO_SphereHEALPix<r_8> outsph(sph);
outsph.Write(outppf);
FIO_LocalMap<r_8> outloc(locmap);
outloc.Write(outppf);
// It is also possible to use the << operator
POutPersist os("sph.ppf");
os << outsph;
os << outloc;
\end{verbatim}
Sophya graphical tools (spiapp) can automatically display and operate on all
these objects.

\newpage
\section{Module Samba}
\index{Spherical Harmonics} \index{SphericalTransformServer}
This module provides several classes for spherical harmonic analysis. The main
class is \textit{SphericalTransformServer}. It contains methods for the
analysis and synthesis of spherical maps. The following example fills a vector
of Cl's, generates a spherical map from these Cl's, and analyses this map back
into Cl's:
\begin{verbatim}
#include "skymap.h"
#include "samba.h"
// ....................
// Generate input spectrum  a + b*l + c*gaussian(l, 50, 20)
int lmax = 92;
Vector clin(lmax);
double a = 1., b = 0.02, c = 2.;
for(int l=0; l<lmax; l++)
  clin(l) = a + b*l + c*exp(-0.5*(l-50.)*(l-50.)/(20.*20.));
// Compute the map from the input spectrum
SphericalTransformServer<r_8> ylmserver;
int m = 128;              // HEALPix pixelisation parameter
SphereHEALPix<r_8> map(m);
ylmserver.GenerateFromCl(map, m, clin, 0.);
// Compute power spectrum from map
Vector clout = ylmserver.DecomposeToCl(map, lmax, 0.);
\end{verbatim}

\newpage
\section{Module SkyT}
\index{RadSpectra} \index{SpectralResponse}
The SkyT module is composed of two types of classes:
\begin{itemize}
\item{} one which corresponds to an emission spectrum of radiation, which is
called RadSpectra;
\item{} one which corresponds to the spectral response of a given detector
(i.e.\ corresponding to a detector filter in a given frequency domain), which
is called SpectralResponse.
\end{itemize}
\begin{figure}[hbt]
\dclsbb{RadSpectra}{RadSpectraVec}
\dclsb{BlackBody}
\dclsccc{AnyDataObj}{SpectralResponse}{SpecRespVec}
\dclsc{GaussianFilter}
\caption{partial class diagram for the SkyT module}
\end{figure}
\begin{verbatim}
#include "skyt.h"
// ....
// Compute the flux from a blackbody at 2.73 K through a square filter
BlackBody myBB(2.73);
// We define a square filter from 100 to 200 GHz
SquareFilter mySF(100,200);
// Compute the filtered integrated flux :
double flux = myBB.filteredIntegratedFlux(mySF);
\end{verbatim}
A more detailed description of the SkyT module can be found in
{\it The SkyMixer (SkyT and PMixer modules) - Sophya Note No 2},
also available from the Sophya web site.

\newpage
\section{Module FitsIOServer}
\begin{figure}[hbt]
\dclsbb{FitsFile}{FitsInFile}
\dclsb{FitsOutFile}
\end{figure}
\index{FITS} \index{FitsInFile} \index{FitsOutFile}
This module provides classes for handling file input-output in FITS format,
using the cfitsio library. It works like the SOPHYA persistence (see module
BaseTools), using delegate objects, but its design is simpler. The following
example writes a matrix (see module TArray) and a spherical map (see module
SkyMap) to a FITS file, then reads them back from the FITS file and creates
new objects :
\begin{verbatim}
#include "spherehealpix.h"
#include "fitsspherehealpix.h"
#include "fitstarray.h"
#include "tmatrix.h"
//...........................
int m=...;
SphereHEALPix<r_8> sph(m);
................
int dim1=...;
int dim2=...;
TMatrix<r_8> mat(dim1,dim2);
............
FITS_SphereHEALPix<r_8> sph_temp(sph);
FITS_TArray<r_8> mat_temp(mat);
// writing
FitsOutFile os("myfile.fits");
sph_temp.Write(os);
mat_temp.Write(os);
// reading
FitsInFile is("myfile.fits");
sph_temp.Read(is);
mat_temp.Read(is);
SphereHEALPix<r_8> new_sph=(SphereHEALPix<r_8>)sph_temp;
TMatrix<r_8> new_mat=(TMatrix<r_8>)mat_temp;
................
\end{verbatim}
The operators {\tt operator << (FitsOutFile ...)} and
{\tt operator >> (FitsInFile ...)} are defined in order to facilitate FITS
file operations:
\begin{verbatim}
// Writing an array object to a FITS file
#include "fitstarray.h"
FitsOutFile fio("arr.fits");
Matrix m(20,30);
m = 12345.;
fio << m;
// .....
// Reading a binary table into a XNTuple
#include "fitsxntuple.h"
XNTuple xn;
FitsInFile fii("table.fits");
fii >> xn;
\end{verbatim}
The class {\bf FITS\_AutoReader} provides limited FITS file reading and
decoding capabilities. A partial class diagram of the FITS persistence
handling classes is shown below:
\begin{figure}[hbt]
\dclsbb{FitsIOhandler}{FITS\_TArray}
\dclsb{FITS\_NTuple}
% \dclsb{FITS\_XNTuple}
\dclsb{FITS\_SphereHEALPix}
% \dclsb{FITS\_LocalMap}
\end{figure}

\newpage
\section{LinAlg and IFFTW modules}
An interface to the LAPACK linear algebra library (available from
{\tt http://www.netlib.org}) is implemented by the {\bf LapackServer} class,
in module LinAlg. \index{LapackServer}
The sample code below shows how to use SVD (Singular Value Decomposition)
through LapackServer:
\begin{verbatim}
#include "intflapack.h"
// ...
// Use FortranMemoryMapping as default
BaseArray::SetDefaultMemoryMapping(BaseArray::FortranMemoryMapping);
// Create and fill the matrix A and its copy AA
int n = 20;
Matrix A(n , n), AA;
A = RandomSequence(RandomSequence::Gaussian, 0., 4.);
AA = A;    // AA is a copy of A
// Compute the SVD decomposition
Vector S;  // Vector of singular values
Matrix U, VT;
LapackServer lpks;
lpks.SVD(AA, S, U, VT);
// We create a diagonal matrix using S
Matrix SM(n, n);
for(int k=0; k