[Halld-offline] [EXTERNAL] Re: Software Meeting Minutes, September 17, 2019
Naomi Jarvis
nsj at cmu.edu
Wed Sep 18 18:04:29 EDT 2019
Regarding the infinite variety of use cases for CCDB variation
combinations, how about asking Dmitry to implement an easy import/export of
constants from a readable text file? So the user can go to ccdb, load
constants from calibtime a or variation a, dump it out, load
calibtime/variation b, dump it out, combine the two dumps as they see fit,
and then upload as variation c. This is really straightforward. Easier
than remembering the inheritance rules. Easier for debugging than running
repeated instances of ccdb ls table > textfile etc.
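A rough Python sketch of that combine step (the file names and the simple one-row-per-channel, whitespace-separated dump format are assumptions for illustration, not the real ccdb dump format):

```python
# Combine two CCDB text dumps offline, row by row, then write the result
# out for re-upload as a new variation. Format and names are assumed.

def read_dump(path):
    """Parse a constants dump into a list of rows (lists of strings)."""
    with open(path) as f:
        return [line.split() for line in f if line.strip()]

def combine(rows_a, rows_b, take_from_b):
    """Row-by-row merge: take row i from dump B if i is in take_from_b,
    otherwise keep the row from dump A."""
    return [b if i in take_from_b else a
            for i, (a, b) in enumerate(zip(rows_a, rows_b))]

def write_dump(path, rows):
    """Write merged rows back out in the same whitespace-separated form."""
    with open(path, "w") as f:
        f.writelines(" ".join(row) + "\n" for row in rows)
```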
Naomi.
On Wed, Sep 18, 2019 at 5:15 PM Mark Ito <marki at jlab.org> wrote:
> Please find the minutes here
> <https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_September_17,_2019#Minutes>
> and below.
>
> _________________________________________
>
> GlueX Software Meeting, September 17, 2019, Minutes
>
> Present:
>
> - CMU: Naomi Jarvis
> - FSU: Sean Dobbs
> - JLab: Alexander Austregesilo, Mark Ito (chair), David Lawrence,
> Simon Taylor, Beni Zihlmann
>
> There is a recording of this meeting <https://urldefense.proofpoint.com/v2/url?u=https-3A__bluejeans.com_s_97cGP_&d=DwIBaQ&c=CJqEzB1piLOyyvZjb8YUQw&r=Te_hCR4EUlJ6iCDYLJ8Viv2aDOR7D9ZZMoBAvf2H0M4&m=IYKLUuGUrJ7h-sLjihD8cXSGszVsLrdjfKNwhxEqvZk&s=nTojui-vwd5h81bpJECjyria9jHluIv6O8hK9OpmyQQ&e= > on
> the BlueJeans site. Use your JLab credentials to access it.
> Announcements
>
> 1. Collaboration Meeting
> <https://halldweb.jlab.org/wiki/index.php/GlueX-Collaboration-Oct-2019>:
> Sean has proposed a list of speakers for the Offline Session on Thursday.
> Alex will substitute for David and give a status of data processing.
> 2. New DB Servers -- HALLDDB-A and HALLDDB-B Online
> <https://mailman.jlab.org/pipermail/halld-offline/2019-September/003758.html>:
> the new servers were stood up to relieve stress on halldb.jlab.org
> (our main database server) from farm jobs. Testing is still in progress but
> users are welcome to try them out.
> 3. *No online compression this Fall*. David has discussed the issue
> with Graham and they agree that compression of raw data is not ready for
> the November run. In addition, the use of a ramdisk on the front end, improvements
> in the Data Transfer Node (for off-site transfers), and expansion of disk
> space at JLab all reduce the need for immediate relief on data size.
>
> Review of minutes from the last Software Meeting
>
> We went over the minutes from September 3
> <https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_September_3,_2019#Minutes>.
>
>
> David gave us an update on NERSC and PSC.
>
> - At NERSC, batch 3 of the Fall 2018 data reconstruction is finished.
> 80% of the output has been brought back to the Lab.
> - At the Pittsburgh Supercomputing Center (PSC) there is a steady rate
> of about 300 jobs a day, slower than NERSC, but with fewer job failures. It
> is not clear why the pace is so slow.
> - At NERSC, Perlmutter will be coming on line next year with an
> attendant large increase in computing capacity.
> - The XSEDE proposal at PSC has been approved with 5.9 million units.
> October 1 is the nominal start date. Note that our advance award was 850
> thousand units.
>
> Report from the last HDGeant4 Meeting
>
> We forgot to go over the minutes from the September 10 meeting
> <https://halldweb.jlab.org/wiki/index.php/HDGeant4_Meeting,_September_10,_2019#Minutes>.
> Maybe next time.
> Reconstruction Software for the upgraded Time-of-Flight
>
> Sean went through and made the needed changes. The DGeometry class was
> modified to load in the geometry information. The new DTOFGeometry class
> was changed to present the info in a more reasonable way. There were places
> where geometry parameters were hard-coded. These were changed to use the
> information from the CCDB-resident HDDS files. The process benefited from
> the structure where the DGeometry class parses the HDDS XML and the
> individual detector geometry classes turn that information into useful
> parametrizations.
>
> Right now hits are not showing up in the simulation (HDGeant4). Fixing
> this is the next task.
> Fixing Crashes When Running over Data with Multiple Runs
>
> Sean described his fix of a long-standing problem, first reported by Elton
> Smith, where the ReactionFilter crashes when run over data that contains
> multiple runs. This closes halld_recon issue #111
> <https://urldefense.proofpoint.com/v2/url?u=https-3A__github.com_JeffersonLab_halld-5Frecon_issues_111&d=DwIBaQ&c=CJqEzB1piLOyyvZjb8YUQw&r=Te_hCR4EUlJ6iCDYLJ8Viv2aDOR7D9ZZMoBAvf2H0M4&m=IYKLUuGUrJ7h-sLjihD8cXSGszVsLrdjfKNwhxEqvZk&s=JlmVIVFi35Zqqh8xVTmtH5aPLYcLWpmizdw7sIcbbNs&e= >. In particular,
> the DParticleID class assumed that the run number never changes. The
> necessary refresh of constants from the CCDB on run-number boundaries was
> thus never done.
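A minimal sketch of that bug class (not the actual DParticleID code): constants fetched once and cached go stale when the run number changes, and the fix is to key the cache on the run number.

```python
# Sketch only: `fetch` stands in for the CCDB lookup; the real fix lives
# in DParticleID and the JANA event loop, not in code like this.

class ConstantsCache:
    def __init__(self):
        self._run = None
        self._constants = None

    def get(self, run, fetch):
        """Return constants for `run`, refreshing whenever the run changes."""
        if run != self._run:          # with the bug, this check was missing
            self._constants = fetch(run)
            self._run = run
        return self._constants
```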
> Tagger Counter Energy Assignment Bug
>
> Beni brought to our attention an issue that was discussed at the last
> Beamline Meeting. Currently, tagger energies are set as a fraction of the
> endpoint energy. But since the electron beam energy can change from run to
> run, albeit by a small amount, the reported energy of a particular tagger
> counter will change as well, even though the tagged-electron energy bin is
> really determined by the strength of the tagger magnet field. Richard Jones is
> working on a proposal on how this should be fixed.
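With made-up numbers, the complaint looks like this: a counter defined as a fixed fraction of the endpoint reports a different photon energy whenever the beam energy drifts, even though the magnet field has not changed.

```python
# Illustration only: the fraction and endpoint values are invented, not
# taken from the actual tagger tables.

def reported_energy(fraction_of_endpoint, endpoint_gev):
    """Photon energy currently reported for a counter at a fixed
    fraction of the endpoint energy."""
    return fraction_of_endpoint * endpoint_gev

for endpoint in (11.600, 11.608):  # hypothetical run-to-run drift
    print(f"endpoint {endpoint:.3f} GeV -> "
          f"reported {reported_energy(0.5, endpoint):.4f} GeV")
```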
> Software Versions and Calibration Constant Compatibility
>
> Sean led us through an issue he described in an earlier email
> <https://mailman.jlab.org/pipermail/halld-offline/2019-September/003761.html>
> to the Offline List. The basic issue is that older versions of mcsmear are
> not compatible with recent constants used in smearing the FCAL. We
> discussed the issue and concluded that the problem was changing the meaning
> of columns in the table, rather than creating a new calibration type with
> the new interpretation. Because of this situation, the software has to
> know which interpretation is correct for a given set of constants, and old
> software versions are not instrumented to do so. Had the constants been put
> under a different type, the software would know which type it is using and
> do the right thing, and old software, knowing only about the old type,
> would do the right thing as well.
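The conclusion can be sketched as follows; the type and column names here are invented, not the real FCAL tables.

```python
# Hypothetical sketch: constants with a new meaning go under a NEW
# calibration type instead of redefining the columns of the old one,
# so each software version reads only the format it understands.

def fcal_smearing_width(ccdb):
    """Pick the interpretation that matches the constants present."""
    if "FCAL/smearing_v2" in ccdb:      # new software knows the new type
        return ccdb["FCAL/smearing_v2"]["floor_term"]
    return ccdb["FCAL/smearing"]["sigma"]  # old type keeps its old meaning
```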
>
> Sean is thinking about how we will address this going forward.
> CCDB Ancestry Control
>
> Mark presented a set of issues that arise with CCDB 2.0 (coming soon). See his
> slides
> <https://urldefense.proofpoint.com/v2/url?u=https-3A__docs.google.com_presentation_d_1P5mE3SApCmeWv4oNrW58tzeD6zj9neQlr5XORigYCXM_edit-3Fusp-3Dsharing&d=DwIBaQ&c=CJqEzB1piLOyyvZjb8YUQw&r=Te_hCR4EUlJ6iCDYLJ8Viv2aDOR7D9ZZMoBAvf2H0M4&m=IYKLUuGUrJ7h-sLjihD8cXSGszVsLrdjfKNwhxEqvZk&s=lT7k_ydrjM6cwgOxuCzJxCaz3XmkANVp84HXJQjfhKk&e= >
> for all of the dirty details.
>
> In CCDB 1.x we can "freeze" calibration constants in time by setting a
> "calib-time" for the system to use. All calibration changes made after that
> time will be ignored. Because of the hierarchical structure of calibration
> "variations" there is a valid use case where the user may want constants at
> the level of the named variation to float, but freeze the constants coming
> from variations higher in the hierarchy. This use case is not supported
> under CCDB 1.x, but is provided for in CCDB 2.0. The implementation
> provides a rich set of choices for freezing (or not freezing) variations in
> the hierarchy. Too rich in fact. The discussion was about how to limit the
> scope of what can be done so users are presented with an understandable,
> tractable set of options. There was a lot of discussion. See the recording
> if interested.
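A toy model of the behavior under discussion (the data layout and names are invented; this is not the CCDB API): each variation in the hierarchy can carry its own freeze time, and lookup walks from the named variation up through its ancestors, ignoring assignments made after the freeze that applies at each level.

```python
# Toy resolution through a variation hierarchy with per-variation freezes.
# `variations` maps each variation to its parent (None at the root);
# `assignments` is a time-ordered list of (table, variation, time, value);
# `freeze` maps a variation to its calib-time cutoff (absent = floating).

def resolve(table, variation, variations, assignments, freeze):
    """Return the newest visible assignment, walking up the hierarchy."""
    v = variation
    while v is not None:
        cutoff = freeze.get(v)  # None means not frozen: new changes float in
        candidates = [val for (tab, var, time, val) in assignments
                      if tab == table and var == v
                      and (cutoff is None or time <= cutoff)]
        if candidates:
            return candidates[-1]  # assignments assumed time-ordered
        v = variations[v]          # fall through to the parent variation
    raise KeyError(table)
```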
>
> No final decision was made, but at least by the end of the meeting
> everyone was aware of the nature of the problem.
> Retrieved from "
> https://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_September_17,_2019&oldid=94126
> "
>
> - This page was last modified on 18 September 2019, at 17:13.
>
> _______________________________________________
> Halld-offline mailing list
> Halld-offline at jlab.org
> https://mailman.jlab.org/mailman/listinfo/halld-offline