[Halld-offline] Offline Software Meeting Minutes, July 9, 2014
Mark M. Ito
marki at jlab.org
Wed Jul 9 18:26:17 EDT 2014
Folks,
Find the minutes below and at
https://halldweb1.jlab.org/wiki/index.php/GlueX_Offline_Meeting,_July_9,_2014#Minutes
.
-- Mark
____________________________________________
GlueX Offline Meeting, July 9, 2014
Minutes
Present:
* CMU: Paul Mattione, Curtis Meyer
* FSU: Aristeidis Tsaris
* IU: Kei Moriya, Matt Shepherd
* JLab: Alex Barnes, Mark Ito (chair), David Lawrence, Mike Staib,
Simon Taylor, Beni Zihlmann
* MIT: Justin Stevens
* NU: Sean Dobbs
* UConn: Richard Jones
Announcements
* [25]CCDB 1.02 has been released. This fixes a problem with the use
of non-default variations.
* [26]sim-recon-2014-06-30 has been released. Matt and Paul asked
about version compatibility for this version. On the farm machines
at JLab, the following versions were used to build this tag:
+ Xerces 3.1.1
+ JANA 0.7.1p3
+ ROOT 5.34.01
+ CERNLIB 2005
+ gcc/g++/gfortran: 4.4.6 20110731 (Red Hat 4.4.6-3)
+ HDDS 2.1
+ CCDB 1.02
This information is also contained in the release notes. Matt suggested
that in the future this information be posted on a web page, like the
corresponding information for [27]Data Challenge 2.
* The automatic tests, both the single-track and the b1pi tests, have
been failing recently. Mark and Simon are looking into this.
Review of minutes from June 25
We reviewed the [28]minutes.
EVIO Build
Mark has succeeded in building EVIO using the source code from the Data
Acquisition Group's web page. Its use still needs to be incorporated
into the build system.
BCAL Timing
No new news.
Data Challenge 3
We confirmed the sense of the last meeting: we will limit the goals to
analyzing data from tape at JLab and will not try to generate a large
data set suitable for studies. This reduces the amount of preparation
needed and allows work to start by the middle of August.
Generation and reconstruction of EVIO-formatted simulated data is very
close. David is working on this actively.
Tagger Reconstruction
Richard led us through his [29]proposal for introducing a random global
time offset to all events, consistent with the 500 MHz RF time
structure. This would simulate the real-life uncertainty due to
event-to-event trigger latency variations and the intrinsic jitter due
to having a fully pipelined data acquisition system driven by a 250 MHz
clock. At present, all events are analyzed as if the true RF bucket
were known a priori. In addition, the true beam photon energy is
assumed in the analysis.
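The essence of the proposal can be sketched as follows. This is an
illustrative snippet only, not GlueX code: it assumes a 2 ns bucket
spacing (500 MHz RF) and applies a single random, bucket-quantized
offset to every hit time in an event, standing in for the unknown
trigger latency. The function and parameter names are hypothetical.

```python
import random

RF_PERIOD_NS = 2.0  # 500 MHz RF -> 2 ns bucket spacing (assumed for illustration)

def smear_event_times(hit_times_ns, max_buckets=5, rng=random):
    """Shift all hit times in one event by a common offset that is an
    integer number of RF buckets, so the true bucket is no longer
    known a priori. max_buckets bounds the random shift."""
    offset = rng.randint(-max_buckets, max_buckets) * RF_PERIOD_NS
    return [t + offset for t in hit_times_ns]
```

Because the same offset is applied to every hit in the event, relative
timing within the event is preserved; only the absolute reference to
the RF bucket is randomized.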
Tagger hits, including out-of-time accidentals, are already present in
the simulated data as long as the electromagnetic background is turned
on. This change means that we would have to use detector information to
determine both the time and energy of the beam photon of interest, as
we will have to do for real data.
This change should not break the current reconstruction, in particular
since each charged track is reconstructed with its own independent
starting time. It will require changes to the analysis library, but
Paul already has a scheme implemented in his analysis library for
dealing with multiple photon tag candidates; it has just not been
enabled for GlueX analysis. The scheme includes a [30]parameter for
setting the time window to use for tag candidates.
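The time-window cut on tag candidates amounts to something like the
following sketch. This is not the actual analysis-library code; the
function name, the (time, energy) pair representation, and the default
window width are all assumptions made for illustration.

```python
def select_tag_candidates(tagger_hits, t_start_ns, window_ns=4.0):
    """Return the beam-photon tag candidates whose tagger time lies
    within +/- window_ns/2 of the event start time. tagger_hits is a
    list of (time_ns, energy_GeV) pairs; in-time hits survive the cut,
    out-of-time accidentals outside the window are rejected."""
    half = window_ns / 2.0
    return [(t, e) for (t, e) in tagger_hits if abs(t - t_start_ns) <= half]
```

In real data, several candidates can survive the cut, so downstream
analysis must still handle multiple beam-photon hypotheses per event.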
We endorsed the proposal. We thought that it should be made the default
scheme, but that it should be possible to disable it with an FFREAD
card in HDGeant. In addition, the time-smearing parameter should be
under user control.
Paul and Richard discussed division of labor. Richard will provide a
set of tagger hit objects and Paul will produce the DBeamPhoton object
needed for the analysis library. Richard will code up an example of how
the latter step might go.
Mark will propose a Subversion strategy for how to manage changes from
both Richard and Paul without impacting others during development.
Reinstating the Separation between Truth and Hit Information in HDDM
Richard described the current HDDM structure, in which Monte Carlo
truth information is recorded in parallel with the "hit" or "detected"
information produced by mcsmear. He is implementing this scheme for
some additional detectors, including the start counter and the tagger.
In the process, he is modernizing the HDDM parsing code to use the C++
API rather than the original C routines. This gives a major reduction
in the number of lines of code. Compression will also be applied to
the HDDM output.
References
25. https://mailman.jlab.org/pipermail/halld-offline/2014-July/001707.html
26. https://mailman.jlab.org/pipermail/halld-offline/2014-July/001710.html
27. https://halldweb1.jlab.org/data_challenge/02/conditions/data_challenge_2.html
28. https://halldweb1.jlab.org/wiki/index.php/GlueX_Offline_Meeting,_June_25,_2014#Minutes
29. https://mailman.jlab.org/pipermail/halld-offline/2014-July/001708.html
30. https://mailman.jlab.org/pipermail/halld-offline/2014-July/001709.html
--
Mark M. Ito, Jefferson Lab, marki at jlab.org, (757)269-5295