[Halld-offline] Offline Software Meeting Minutes, June 1, 2018
Mark Ito
marki at jlab.org
Fri Jun 1 17:44:00 EDT 2018
Folks,
Please find the minutes below and at
https://halldweb.jlab.org/wiki/index.php/GlueX_Offline_Meeting,_June_1,_2018#Minutes
-- Mark
________________
GlueX Offline Meeting, June 1, 2018
Minutes
Present:
* *FSU:* Sean Dobbs
* *JLab:* Amber Boehnlein, Thomas Britton, Hovanes Egiyan, Mark Ito
  (chair), David Lawrence, Simon Taylor
There is a recording of this meeting <https://bluejeans.com/s/pNrKQ/> on
the BlueJeans site. Use your JLab credentials to access it.
Announcements
1. *Simulation Launch*. Thomas has started a large-scale bggen
simulation run on the OSG. John Hardin's plug-ins are included.
2. New version file (version_2.35_jlab.xml) with a new sim-recon tag
for bggen simulation
<https://mailman.jlab.org/pipermail/halld-offline/2018-May/003211.html>.
This will be used for the simulation mentioned above. It was
re-tagged last night.
3. New release of build_scripts: version 1.31
<https://mailman.jlab.org/pipermail/halld-offline/2018-May/003201.html>.
The default source of geometry information is now the CCDB.
4. Scratch disk policy change
<https://mailman.jlab.org/pipermail/halld-offline/2018-May/003209.html>.
The lifetime of unread files has been increased from 14 days to 60 days.
5. *Software Review, Summer 2018*. No word yet.
Review of Minutes from the May 18 Meeting
We went over the minutes
<https://halldweb.jlab.org/wiki/index.php/GlueX_Offline_Meeting,_May_18,_2018#Minutes>.
* David reported that our allocation for NERSC for this year is 23
million core hours. Chris Larrieu of Scientific Computing is working
on the system for staging data to NERSC. For comparison, one
reconstruction pass (ver01) on 2017 data used 3.3 million core
hours. And Richard Jones reported getting one million core hours for
simulation on the OSG over a 10 day period recently.
Splitting up sim-recon into sim and recon
Sean went over a proposal on why and how to split sim-recon
<https://halldweb.jlab.org/wiki/index.php/Sdobbs_splitting_sim-recon>.
He and Mark had discussed the issues last week. The major points in the
proposal are:
* The directories targeted for relocation to the "sim" repository are
programs/simulation, plugins/Simulation, and libraries/AMPTOOLS_*.
* Although there are other parts of the current sim-recon that could
be split off into separate repositories, we would start with just
the "sim" split.
Sean has already done a proof-of-principle split and build.
We endorsed the proposal. Details to come.
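A history-preserving split of this kind can be sketched with git's subtree tooling. The script below works on a toy stand-in repository, not the real sim-recon; all paths and names are examples, and splitting out several directories at once (as the proposal requires) would call for git filter-branch or git-filter-repo rather than a single subtree split:

```shell
#!/bin/sh
# Sketch of a history-preserving repository split on a toy repo.
# The layout only mimics sim-recon; every name here is an example.
set -e
rm -rf /tmp/simrecon-demo && mkdir -p /tmp/simrecon-demo
cd /tmp/simrecon-demo
git init -q original && cd original
git config user.email demo@example.com
git config user.name demo
mkdir -p src/programs/Simulation src/libraries/TRACKING
echo "hdgeant" > src/programs/Simulation/hdgeant.cc
echo "tracker" > src/libraries/TRACKING/track.cc
git add -A && git commit -qm "initial import"
# Carve the simulation subtree, with its commit history, onto its own branch
git subtree split --prefix=src/programs/Simulation -b sim-split -q
# The new "sim" repository is then just a clone of that branch
cd .. && git clone -q -b sim-split original sim
ls sim   # hdgeant.cc
```

The split branch contains only the simulation files, with the tracking library left behind in the original repository.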
Transitioning to HDGeant4
We discussed the problem of continuing to rely on Geant3 while only a
very few projects use Geant4. Mark asked if we should be
taking active steps to encourage/support/cajole collaborators to start
using HDGeant4. He is concerned that we are in a chicken and egg
situation: no one will use it until it can be trusted and no one will
trust it until everyone is using it.
Amber is concerned that the effort on Geant4 may be ramping down and
told us that she has advocated for more funding to go to the Geant4
team. She is worried that the main user base in high-energy physics
(i.e.,
LHC experiments) may not be pushing development in a direction that
helps us.
Two ideas:
1. Recruit someone to devote a large fraction of their time to
tuning-up Geant4 to work well for GlueX, and perhaps for other Halls
as well.
2. Encourage/help people who have significant simulation tasks to
devote part of their effort to running HDGeant4 and running the same
post-detector-simulation software on the HDGeant4 generated sample.
Thomas agreed to do some simulation with HDGeant4 as part of the current
bggen campaign. MCwrapper already has an option to do this.
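The Geant version is selected in the MCwrapper configuration file. The fragment below is only illustrative; the key names are assumptions and should be checked against the MCwrapper documentation and example configs:

```text
# Illustrative MCwrapper settings (key names are assumptions; consult
# the MCwrapper example configuration for the real ones)
GENERATOR=bggen
GEANTVER=4        # run HDGeant4 instead of the Geant3-based hdgeant
```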
Request for University Compute Resources
We discussed Richard's recent email
<https://mailman.jlab.org/pipermail/halld-offline/2018-May/003204.html>
asking for collaborators to identify local resources that might be made
available for use by GlueX.
Sean explained that the idea hinges on leveraging tools developed by
OSG, in particular BOSCO <https://osg-bosco.github.io/docs/>, that allow
jobs submitted to the OSG to run on a variety of batch systems including
those not configured as OSG Compute Elements. This effort is enabled by
our deployment of Singularity containers and our development of
infrastructure to support their use. It could result in a substantial
increase in the amount of computing available to us at very little cost.
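In this scheme, a university cluster attached through BOSCO is reached via HTCondor's grid universe. A submit description along the following lines would route a job there; the login host, batch flavor, and executable name are all hypothetical:

```text
# HTCondor submit description for a BOSCO-attached cluster (sketch;
# the login host, batch system, and executable are hypothetical)
universe      = grid
grid_resource = batch slurm gluex@login.example.edu
executable    = run_reconstruction.sh
output        = job.$(Cluster).out
error         = job.$(Cluster).err
log           = job.log
queue
```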
Work on Geometry Classes
Sean reported that he is rewriting some of our geometry classes,
adding the ability to read alignment constants (geometry tweaks) in
from the CCDB for subsystems that would benefit from it.
Review of Recent Pull Requests
We looked at the list
<https://github.com/JeffersonLab/sim-recon/pulls?q=is%3Aclosed+is%3Apr>
without comment.
Review of Recent Discussion on the GlueX Software Help List
We pulled up the Help List
<https://groups.google.com/forum/#%21forum/gluex-software>.
* Sean asked about persistent classes in JANA and refreshing
calibration info on run number changes. David is working on it.
* David noted that the launch parameter page still needs updating.
Alex Austregesilo is aware of this.
--
Mark Ito, marki at jlab.org, (757)269-5295