[Halld-online] Online meeting minutes
David Lawrence
davidl at jlab.org
Wed Jun 1 17:10:53 EDT 2016
Hi All,
Here is a link to the meeting page with minutes from today’s online meeting. They are also copied below.
https://halldweb.jlab.org/wiki/index.php/OWG_Meeting_1-Jun-2016
Regards,
-David
Minutes
Attendees: David L. (chair), Beni Z., Sean D., Simon T., Naomi J., Dave A., Carl T., Curtis M., Mark I.
Announcements
Power Outage
There was a power outage on the accelerator site yesterday evening for about 1.5 hours starting at around 19:30
Techs were notified (not Physicists)
Gluon computers did not turn off
Helium in supply tank was lost (but not in magnet)
Gas system survived on UPS
3rd RAID server will be added to large IT procurement at time of award
New Online Farm nodes:
Their exact nature (i.e. the type of CPU) is still up in the air; some benchmarking by IT staff is currently underway
gluon01: RHEL7 -> RHEL6: still not done. David will remind Paul so it stays on his radar
Whiteboard cameras: installed but still not accessible. Will need help from Hovanes
Missing RunLog.tar files
Near end of Spring run noticed numerous RunLog.tar files were missing from tape
Belief is that this is due to interference between two cron scripts
A fix has been implemented in the wrapper script run on the RAID servers that should resolve this issue
Won't really know until we go back into production data taking
E-log entry made to document changes
Gluon kernel versions
Gluons running RHEL6 have different kernel versions
A "yum update will need to be done on each node to bring them up to date and in alignment with one another
David will schedule this and send out an announcement since this will require a reboot of all computers
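The per-node update could be scripted along these lines; this is a hedged sketch, not the actual procedure. The `build_update_cmd` helper, the node name, and running the update over ssh as root are all assumptions for illustration.

```shell
#!/bin/sh
# Sketch: construct the command that would update and reboot one RHEL6
# gluon node. "gluon46" is an example name, not actual cluster inventory.
build_update_cmd() {
    # yum -y update brings all packages (including the kernel) in line;
    # the reboot is required for the new kernel to take effect.
    echo "ssh root@$1 'yum -y update && reboot'"
}

build_update_cmd gluon46
```

In practice one would loop this over the list of RHEL6 nodes during the announced maintenance window.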
Compiler upgrade (gcc 4.9.2)
Offline made gcc 4.8 or higher a requirement as of today
Following a suggestion by Sergey at the last meeting, a proposal was put forth to have some computers (ones used mainly for controls) default to gcc 4.4.7 while the rest default to gcc 4.9.2
Hovanes not at meeting to discuss so David will speak with him offline
Write-through cache for raw data
Suggestion was made just prior to Spring run to use write-through cache for transferring raw data to tape. This would leave most recent files on cache for offline monitoring
We deferred implementing this until this summer to avoid potential issues with data taking in the Spring
It turns out SciComp already has a system for moving files matching certain naming patterns from the staging disk to cache. They have now included an appropriate pattern for the first 5 raw data files from any future runs.
Copying to tape via staging disk remains as it has been.
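The pattern-based selection described above could look something like the following. This is a hypothetical illustration only: the actual SciComp pattern and the real raw-data file-name layout are not specified in the minutes, and the `hd_rawdata_RUN_FILE.evio` naming and the `matches_cache_pattern` helper are assumptions.

```shell
#!/bin/sh
# Sketch: a shell glob that would select the first 5 files (000-004)
# of any run, assuming names like hd_rawdata_011529_000.evio.
matches_cache_pattern() {
    case "$1" in
        hd_rawdata_*_00[0-4].evio) return 0 ;;  # first 5 files of a run
        *) return 1 ;;                          # everything else: tape only
    esac
}
```

Files matching the pattern would be copied from the staging disk to cache for offline monitoring; all files continue to tape as before.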
DAQ - CODA 3.0.6
Dave A. has installed CODA 3.0.6 on the gluon cluster and it is ready for testing
One key new feature is that it now provides an option for transferring data via direct socket connection rather than through ET
The new jcedit has an option to select, for each component, how it will receive its input
Dave A. noted that if the COOL configuration were changed to use direct sockets, that configuration would likely not work with CODA 3.0.5. We will need to remember either to change the configuration back or to keep a backup of the COOL configuration used with CODA 3.0.5
JinFlux support is now also built into CODA 3.0.6
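The backup suggested above could be as simple as snapshotting the configuration directory before switching transports. A minimal sketch, assuming the COOL configuration lives in a single directory; the `backup_cool` name, the suffix, and the directory layout are all illustrative, not the actual CODA/COOL setup.

```shell
#!/bin/sh
# Sketch: keep a CODA 3.0.5-compatible copy of the COOL configuration
# before editing it to use direct-socket transport.
backup_cool() {
    src="$1"                                  # path to the COOL config dir
    cp -a "$src" "${src}.coda3.0.5-backup"    # preserve an unmodified copy
}
```

Restoring compatibility with CODA 3.0.5 would then just mean pointing back at (or copying back) the saved directory.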
L3 Status
Biweekly meetings have started
David L. plans to resurrect system for playing back data from ROCs stored on a RAM disk
We can use real data this time instead of MC data
Beni will be providing a variable rate random trigger to the TS front panel for DAQ rate testing that can also be used for L3 testing.
ROL status (SYNC events)
Alex S. (not at meeting) has committed some code to the repository for handling data from f250 modules in sync events
Changes were made to current default parser, but will need to be ported to new parser (David will help with that)