<html>
  <head>

    <meta http-equiv="content-type" content="text/html; charset=UTF-8">
  </head>
  <body text="#000000" bgcolor="#FFFFFF">
    <p>Folks,</p>
    <p>Please find the minutes <a moz-do-not-send="true"
href="https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_December_11,_2018#Minutes">here</a>
      and below.</p>
    <p>  -- Mark</p>
    <p>____________________________________</p>
    <p>
    </p>
    <div id="globalWrapper">
      <div id="column-content">
        <div id="content" class="mw-body" role="main">
          <h2 id="firstHeading" class="firstHeading" lang="en"><span
              dir="auto">GlueX Software Meeting, December 11, 2018, </span><span
              class="mw-headline" id="Minutes">Minutes</span></h2>
          <div id="bodyContent" class="mw-body-content">
            <div id="mw-content-text" dir="ltr" class="mw-content-ltr"
              lang="en">
              <p>Present:
              </p>
              <ul>
                <li> <b> CMU: </b> Naomi Jarvis</li>
                <li> <b> FSU: </b> Sean Dobbs</li>
                <li> <b> JLab: </b> Alexander Austregesilo, Thomas
                  Britton, Mark Ito (chair), David Lawrence, Justin
                  Stevens, Simon Taylor, Beni Zihlmann</li>
              </ul>
              <p>There is a <a rel="nofollow" class="external text"
                  href="https://bluejeans.com/s/ksHJx/">recording of
                  this meeting</a> on the BlueJeans site. Use your JLab
                credentials to access it.
              </p>
              <h3><span class="mw-headline" id="Announcements">Announcements</span></h3>
              <p>Thomas announced the release of <a rel="nofollow"
                  class="external text"
href="https://github.com/JeffersonLab/gluex_MCwrapper/releases/tag/v2.0.4">MCwrapper
                  v3.0.4</a> a.k.a. <a rel="nofollow" class="external
                  text"
                  href="https://en.wikipedia.org/wiki/Kool_Moe_Dee">Kool
                  Moe Dee</a>. It includes updates to
              </p>
              <ul>
                <li> the <a rel="nofollow" class="external text"
                    href="https://halldweb.jlab.org/gluex_sim/Dashboard.html">Dashboards</a>,
                  which now keep track of jobs.</li>
                <li> the <a rel="nofollow" class="external text"
                    href="https://halldweb.jlab.org/gluex_sim/SubmitSim.html">web
                    form</a>, which can now run reaction filters.</li>
              </ul>
              <h3><span class="mw-headline"
                  id="Review_of_minutes_from_the_November_13_meeting">Review
                  of minutes from the November 13 meeting</span></h3>
              <p>We went over <a
href="https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_November_13,_2018#Minutes"
                  title="GlueX Software Meeting, November 13, 2018">the
                  minutes</a>.
              </p>
              <h4><span class="mw-headline"
                  id="Crashing_Monitor_Launches.3F">Crashing Monitor
                  Launches?</span></h4>
              <p>We had an extended discussion of the October report of a
                50% success rate for monitoring launches. We had attributed
                the failures to an as-yet-unidentified problem in the code
                and had been holding off on tagging and using new versions
                until it was found and fixed. The problem has been tricky
                to reproduce. Alex reported that a recent run with only the
                danarest and monitoring_histograms plug-ins included did
                not show the problem; the initial report came from a run
                using 50 plug-ins. We decided to wait no longer: a new
                version set will be released soon with the latest version
                of halld_recon. This will promote wider use of recent
                versions and may shed light on the problem, including
                whether it resides in the main reconstruction code.
              </p>
              <p>As for which version to use on the 2018 data, more
                testing will likely be necessary.
              <h4><span class="mw-headline" id="NERSC_Allocation">NERSC
                  Allocation</span></h4>
              <p>David reported that we received word back on our
                request for running time at NERSC for 2019. We were
                awarded 35 million units (a unit is roughly a core-hour)
                out of our request for 112 million units. Hall B
                received 30 M out of a 60 M request. The basis for the
                award is not known.
              </p>
              <p>David also mentioned that he is working on using cycles
                from supercomputer centers at Indiana and Carnegie
                Mellon.
              </p>
              <p>We may be starting a reconstruction launch soon.
              </p>
              <p>For NERSC, in 2018, we used 10 M units from our
                allocation of 50 M. That award was made for the entire
                Lab.
              </p>
              <p>David noted that if we have multiple projects going at
                the same time, we run the risk of stepping on ourselves
                as far as Lustre access is concerned.
              </p>
              <h3><span class="mw-headline"
                  id="Report_from_the_December_4_HDGeant4_Meeting">Report
                  from the December 4 HDGeant4 Meeting</span></h3>
              <p>We reviewed issues from the <a
href="https://halldweb.jlab.org/wiki/index.php/HDGeant4_Meeting,_December_4,_2018"
                  title="HDGeant4 Meeting, December 4, 2018">last
                  HDGeant4 meeting</a>.
              </p>
              <p>We discussed at length our approach to merging in
                changes from the DIRC-enabled branches of hdds,
                halld_recon, halld_sim, and hdgeant4. A merge of the
                changes for one repository requires that the changes to
                the other repositories be merged as well to get a
                working system.
              </p>
              <p>After that happens, using an older version of
                reconstruction with a modern, DIRC-enabled simulation
                becomes a problem: the DIRC-related constructs in
                halld_sim require a DIRC-enabled halld_recon, and the
                older halld_recon releases lack that support.
              </p>
              <p>There are two possible solutions that we discussed:
              </p>
              <ol>
                <li> Add preprocessor directives in halld_sim to exclude
                  DIRC-aware code when building against old halld_recon
                  releases (a sketch follows this list).</li>
                <li> Add patches to selected old releases to enable the
                  DIRC hits. These would only be needed to get halld_sim
                  to build; DIRC hits would not have to be generated.</li>
              </ol>
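              <p>To make option 1 concrete, here is a minimal sketch of how
                DIRC-aware code in halld_sim could be fenced off with a
                preprocessor guard. The macro name (GLUEX_HAVE_DIRC), the
                header name, and the function name are all hypothetical,
                chosen only for illustration; the real guard would have to
                be defined by the halld_sim build according to which
                halld_recon release it is compiled against.
              </p>
              <pre>
// Hypothetical guard: the halld_sim build would define GLUEX_HAVE_DIRC only
// when the halld_recon it is compiled against provides DIRC support.
#ifdef GLUEX_HAVE_DIRC
#include "DIRC/DircHit.h"   // placeholder name for a DIRC hit header
#endif

// Record DIRC hits when the build supports them; otherwise compile to a
// no-op, so the same halld_sim source still builds against an old
// halld_recon that has no DIRC classes.
void recordDircHits()
{
#ifdef GLUEX_HAVE_DIRC
    // ... fill DIRC hit structures here ...
#endif
}
              </pre>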
              <p>Both options have drawbacks; more discussion is needed.
                This issue will also come up again every time a new
                detector is added to the main development path.
              </p>
              <h3><span class="mw-headline"
                  id="Report_on_Computing_Review_on_November_27-28">Report
                  on Computing Review on November 27-28</span></h3>
              <p><a
href="https://halldweb.jlab.org/wiki/index.php/Software_and_Computing_Review_5"
                  title="Software and Computing Review 5">The review</a>
                was held two weeks ago. We went over highlights from the
                committee's preliminary report as presented at review
                close-out. Some quotes from that report:
              </p>
              <ul>
                <li> Steps have been taken to reduce data processing
                  burdens on analyzers through simple APIs/interfaces.
                  This is strongly commended.</li>
                <li> Overall, GlueX plans and actual developments are
                  excellent and appear to match what is needed to
                  produce timely and important science. </li>
                <li> NERSC allocations are now an important resource to
                  support ENP computing needs. This is a positive
                  development. It is important to continue investigating
                  additional offsite resources as part of future
                  planning. </li>
                <li> More efficient data transfer mechanisms for OSG
                  (e.g XrootD) would allow for running reconstruction at
                  these sites. </li>
              </ul>
              <p>and the two recommendations:
              </p>
              <ul>
                <li> Prepare to support increasing interest in machine
                  learning and modern data science tools, possibly in
                  collaboration with other labs to leverage existing
                  solutions.</li>
                <li> Consider increasing the central support for offsite
                  resource access, especially for OSG and data
                  transfers, leveraging work already done by GlueX and
                  CLAS12 and at other laboratories.</li>
              </ul>
              <p>So generally favorable stuff.
              </p>
              <h3><span class="mw-headline"
id="Review_of_recent_issues.2C_pull_requests.2C_and_discussion_on_the_help_list">Review
                  of recent issues, pull requests, and discussion on the
                  help list</span></h3>
              <p>The discussion:
              </p>
              <ul>
                <li> <a rel="nofollow" class="external text"
                    href="https://github.com/JeffersonLab/halld_recon/pull/65">halld_recon
                    pull request #65: Hdview2 primex</a>. This change
                  from David allows drawing of the CompCal; it can be
                  turned on and off in the GUI.</li>
                <li> <a rel="nofollow" class="external text"
                    href="https://github.com/JeffersonLab/halld_recon/pull/55">halld_recon
                    pull request #55: Tracking update oct18</a>. Several
                  small-ish changes to tracking from Simon, including
                  measures to preserve hits in the downstream FDC
                  layers.</li>
                <li> <a rel="nofollow" class="external text"
                    href="https://github.com/JeffersonLab/halld_sim/pull/21">halld_sim
                    pull request #21: Gen amp baryons</a>. Peter Pauli
                  has enabled baryon resonance production at the lower
                  vertex in certain cases.</li>
              </ul>
              <h3><span class="mw-headline"
                  id="Record_of_per_file_event_ranges">Record of per
                  file event ranges</span></h3>
              <p>Sean raised the idea of having a record of the first
                and last event present in each data file. This would
                allow us to know which file to interrogate for a
                particular event. David already has a program that will
                generate this information (along with a host of other
                items). He will look at running it online. The next
                question will be how to present the data to users.
              </p>
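              <p>As a hedged sketch of how such a record might be used once
                it exists: given a simple per-file table of (file name,
                first event, last event), finding the file that holds a
                given event number is a small range lookup. The structure
                and names below are illustrative only and are not David's
                actual program or its output format.
              </p>
              <pre>
#include &lt;cstdint&gt;
#include &lt;map&gt;
#include &lt;string&gt;

// Illustrative only: key = first event number in a file; the value holds
// the file name and the last event number present in that file.
struct FileRange { std::string file; uint64_t last_event; };

// Return the file whose range contains 'event', or "" if none does.
std::string findFile(const std::map&lt;uint64_t, FileRange&gt;&amp; index,
                     uint64_t event)
{
    auto it = index.upper_bound(event);   // first range starting after 'event'
    if (it == index.begin()) return "";
    --it;                                 // range starting at or before 'event'
    return (event &lt;= it-&gt;second.last_event) ? it-&gt;second.file : "";
}
              </pre>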
            </div>
            <div class="printfooter">
              Retrieved from "<a dir="ltr"
href="https://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_December_11,_2018&oldid=90619">https://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_December_11,_2018&oldid=90619</a>"</div>
          </div>
        </div>
      </div>
      <div id="footer" role="contentinfo">
        <ul id="f-list">
          <li id="lastmod"> This page was last modified on 14 December
            2018, at 16:55.</li>
        </ul>
      </div>
    </div>
  </body>
</html>