[acg] IBC1H04CFLCW2

Kyle Hesse hesse at jlab.org
Wed Sep 13 10:19:43 EDT 2023


I took a quick look as I prepare to leave for a week.

The soft IOCs are not a problem. It is maxJuice restricting the current limit to 5 uA, based on the condition of the Moeller dipole and the duty factor for Hall A. maxJuice.xml should be investigated. Please see the strip chart and code below.
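As a starting point for that investigation, a minimal sketch of locating the 5 uA entry in maxJuice.xml. The file path and the XML element/attribute names here are assumptions for illustration only, not the real maxJuice.xml contents or location:

```shell
# Hypothetical sketch: find where maxJuice.xml pins the Hall A limit at 5 uA.
# The element shape below is invented for illustration; substitute the real
# deployed maxJuice.xml path on the ops file system.
cfg=/tmp/maxJuice_example.xml
cat > "$cfg" <<'EOF'
<maxJuice>
  <!-- hypothetical entry shape -->
  <hall name="A" limitPV="IBC1H04CFLCW2" maxCurrent="5"/>
</maxJuice>
EOF
grep -n 'IBC1H04CFLCW2' "$cfg"   # line number of the entry driving the limit
```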



[attachment: strip chart image]
________________________________
From: acg <acg-bounces at jlab.org> on behalf of Adam Carpenter via acg <acg at jlab.org>
Sent: Wednesday, September 13, 2023 10:08 AM
To: Theo Larrieu <theo at jlab.org>; acg at jlab.org <acg at jlab.org>; Anthony Cuffe <cuffe at jlab.org>
Subject: Re: [acg] IBC1H04CFLCW2

I don't think we want to have non-production soft IOCs running on opsbat9.  Should iocsoftkyle, iocsoftamanda, iocsofttest1, and iocsuetest be running here?  If not, can we turn them off and see if the problem resolves?
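A dry-run sketch of that cleanup, printing (not executing) the checks and stop commands for the four soft IOCs named above. The PIDs come from Anthony's listing in this thread; whether plain kill or a site soft-IOC stop script is the right procedure is an assumption:

```shell
# Dry run only: echo the commands that would verify and stop the four
# non-production soft IOCs on opsbat9. Nothing is actually signaled here.
for entry in iocsoftkyle:10662 iocsoftamanda:679093 iocsofttest1:3478041 iocsoftsuetest:3421485; do
    name=${entry%%:*}   # part before the colon: the IOC name
    pid=${entry##*:}    # part after the colon: its PID from the listing
    echo "check: ps -fp $pid    # confirm this is still $name"
    echo "stop:  kill $pid      # or the site's soft IOC stop script, if one exists"
done
```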


Adam Carpenter
Accelerator Operations Software Department
Thomas Jefferson National Accelerator Facility

________________________________
From: acg <acg-bounces at jlab.org> on behalf of Anthony Cuffe via acg <acg at jlab.org>
Sent: Wednesday, September 13, 2023 9:47 AM
To: Theo Larrieu <theo at jlab.org>; acg at jlab.org <acg at jlab.org>
Subject: Re: [acg] IBC1H04CFLCW2


Here are the items running as the alarms user on that system:



SoftIOCs:



iocsoftmag running on port 20000 of opsbat9 with pid 10317
iocsoftinjdec running on port 20001 of opsbat9 with pid 10332
iocsofteamomod running on port 20002 of opsbat9 with pid 10347
iocsoftwamomod running on port 20003 of opsbat9 with pid 10362
iocsoftfac running on port 20004 of opsbat9 with pid 10377
iocsoftinj1 running on port 20005 of opsbat9 with pid 10392
iocsoftfac1 running on port 20006 of opsbat9 with pid 10407
iocsoftfac2 running on port 20007 of opsbat9 with pid 10422
iocsoftaclrm running on port 20008 of opsbat9 with pid 10437
iocsoftinj4 running on port 20009 of opsbat9 with pid 10452
iocsoftinj3 running on port 20010 of opsbat9 with pid 10468
iocsoftheflow running on port 20011 of opsbat9 with pid 10483
iocsoftinj5 running on port 20015 of opsbat9 with pid 10512
iocsoftmccxpt running on port 20026 of opsbat9 with pid 10560
iocsoftCAsec running on port 20031 of opsbat9 with pid 10611
iocsoftkyle running on port 20033 of opsbat9 with pid 10662
iocsoftfsdtracker running on port 20038 of opsbat9 with pid 4024443
iocsoftfilter running on port 20039 of opsbat9 with pid 10830
iocsoftloadedq running on port 20050 of opsbat9 with pid 10895
iocsoftrfmotemp running on port 20051 of opsbat9 with pid 2910321
iocsoftrfsepiot running on port 20052 of opsbat9 with pid 11008
iocsoftmodanode running on port 20053 of opsbat9 with pid 11026
iocsoftrfcmtype running on port 20054 of opsbat9 with pid 11057
iocsoftrfgang running on port 20055 of opsbat9 with pid 3816988
iocsoftserptn05 running on port 20056 of opsbat9 with pid 3091536
iocsoftserptn11 running on port 20057 of opsbat9 with pid 11313
iocsoftserptn17 running on port 20058 of opsbat9 with pid 11424
iocsoftserpts05 running on port 20059 of opsbat9 with pid 11455
iocsoftserpts11 running on port 20060 of opsbat9 with pid 11481
iocsoftserpts17 running on port 20061 of opsbat9 with pid 11534
iocSnmp running on port 20062 of opsbat9 with pid 11624
iocsoftbpms running on port 20064 of opsbat9 with pid 11821
iocsofttest1 running on port 20065 of opsbat9 with pid 3478041
iocsoftihvin1 running on port 20067 of opsbat9 with pid 1911726
iocsoftihvnl1 running on port 20068 of opsbat9 with pid 1950599
iocsoftihvsl1 running on port 20069 of opsbat9 with pid 1977621
iocsoftihvwa1 running on port 20070 of opsbat9 with pid 2036277
iocsoftihvbs1 running on port 20071 of opsbat9 with pid 1844207
iocsoftsuetest running on port 20400 of opsbat9 with pid 3421485
iocsoftamanda running on port 20401 of opsbat9 with pid 679093



ForeverFile Processes:



#MEJ 05/24/23 : moved to opsbat9 (rhel9)
opsbat9 orbitLogger /cs/prohome/bin/run_csue_script orbitLogger  > /dev/null 2>&1
#MEJ 07/03/23 : moved to opsbat9 (rhel9)
opsbat9 FaultLogger /cs/prohome/bin/run_csue_script FaultLogger >> /cs/prohome/apps/f/FaultCounter/pro/fileio/errdiag/errorlog
#MEJ 07/18/23 : app is now C100Logger2 moved to opsbat9 (rhel9)
opsbat9 FCLoggerC100 /cs/prohome/bin/run_csue_script FCLoggerC100 > /dev/null 2>&1
opsbat9 maxServer /cs/prohome/bin/run_csue_script maxServer  > /dev/null 2>&1
#MEJ 05/24/23 : moved to opsbat9 (rhel9)
opsbat9 RATLogger /cs/prohome/bin/run_csue_script RATLogger >> /cs/prohome/apps/r/RFAnalyzer/pro/fileio/errdiag/errorlog
#MEJ 05/24/23 : moved to opsbat9 (rhel9)
#opsbat9 vipLogger /cs/prohome/bin/run_csue_script vipLogger > /dev/null 2>&1
#MEJ 05/24/23 : moved to opsbat9 (rhel9)
#opsbat9 vacLogger /cs/prohome/bin/run_csue_script vacLogger >> /cs/prohome/apps/v/vacuumViewer/pro/fileio/tmp/errorlog
#MEJ 05/24/23 : moved to opsbat9 (rhel9)
opsbat9 tunerExerciseServer /cs/prohome/bin/run_csue_script tunerExerciseServer  > /dev/null 2>&1
#MEJ 05/24/23 : moved to opsbat9 (rhel9)
opsbat9 tunerExerciseServerLERF /cs/prohome/bin/run_csue_script tunerExerciseServerLERF -ced=led > /dev/null 2>&1
opsbat9 BOOMServer /cs/prohome/bin/run_csue_script BOOMServer  >> /cs/prohome/apps/b/BOOM/pro/fileio/errdiag/BOOM.server.errorlog
#SDW 04/13/23: moved camonsave tasks to opsbat9
#opsbat9 SIGNALS /cs/prohome/bin/run_csue_script /cs/prohome/bin/CaMonSaveRtems /cs/op/iocs/iocsb4bcmtc/SIGNALS /cs/op/iocs/iocsb4bcmtc/startup.signals.init sb4bcmtc_opmode >> camonsaveRtems.log 2>&1 &
opsbat9 INSIGNALS /cs/prohome/bin/run_csue_script /cs/prohome/bin/CaMonSaveRtems /cs/op/iocs/iocinbcmtc/INSIGNALS /cs/op/iocs/iocinbcmtc/startup.signals.init inbcmtc_opmode >> camonsaveRtemsIN.log 2>&1 &
#opsbat9 NLSIGNALS /cs/prohome/bin/run_csue_script /cs/prohome/bin/CaMonSaveRtems /cs/op/iocs/iocnlbcmtc/NLSIGNALS /cs/op/iocs/iocnlbcmtc/startup.signals.init nlbcmtc_opmode >> camonsaveRtemsNL.log 2>&1 &
opsbat9 TSBSIGNALS /cs/prohome/bin/run_csue_script /cs/prohome/bin/CaMonSaveRtems /cs/op/iocs/ioctsbbcmtc/TSBSIGNALS /cs/op/iocs/ioctsbbcmtc/startup.signals.init tsbbcmtc_seq_opmode >> camonsaveRtemsHD.log 2>&1 &
opsbat9 HLBBCMSIGNALS /cs/prohome/bin/run_csue_script /cs/prohome/bin/CaMonSaveRtems /cs/op/iocs/iochlbbpmtc/HLBBCMSIGNALS /cs/op/iocs/iochlbbpmtc/startup.signals.init hlbbcmtc_seq_opmode >> camonsaveRtemsHB.log 2>&1 &
# CSH 2023/04/19 Moved to opsbat9 and rhel9  insertablesManager
opsbat9 insertableManager /cs/prohome/bin/run_csue_script /usr/csite6/op/prod_R3.14.12.3.J0/insertables/1-1/bin/rhel-9-x86_64/insertableManagerPro -vc OPS > /cs/op/iocs/DATA/insertables/insertables_`date +'%Y%m%d_%H%M%S'`.log 2>&1 &
# 1. Comment out opsbat9 line above
# 3. Delete insertableMangerApp running on opsbat9. ForeverFile will automatically start insertableManagerApp on opsbat0
opsbat9 fastFBfeedforward /cs/prohome/bin/run_csue_script /cs/prohome/bin/ff  fastFBfeedforward  > /dev/null 2>&1



From: acg <acg-bounces at jlab.org> On Behalf Of Theo Larrieu via acg
Sent: Wednesday, September 13, 2023 9:07 AM
To: acg at jlab.org
Subject: Re: [acg] IBC1H04CFLCW2





[attachment: image001.png (screenshot)]





From: acg <acg-bounces at jlab.org<mailto:acg-bounces at jlab.org>> On Behalf Of Christopher Slominski via acg
Sent: Wednesday, September 13, 2023 9:06 AM
To: acg at jlab.org<mailto:acg at jlab.org>
Subject: Re: [acg] IBC1H04CFLCW2



Theo looked at the Splunk log and sees that the alarms user on opsbat9 is setting it to 5; operators are setting it back to 12.



From: acg <acg-bounces at jlab.org<mailto:acg-bounces at jlab.org>> On Behalf Of Christopher Slominski via acg
Sent: Wednesday, September 13, 2023 8:11 AM
To: acg at jlab.org<mailto:acg at jlab.org>
Subject: [acg] IBC1H04CFLCW2



I am on call. This PV has been toggling back and forth between 5 and 12, causing ‘maxJuice’ problems. Who knows anything about this PV?
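One way to catch the toggling in the act is to watch the PV with the standard EPICS base client tools; a minimal sketch, assuming you run it on a machine with Channel Access to the controls network (elsewhere it only reports that the tools are missing):

```shell
# Hypothetical: timestamp each 5 <-> 12 transition of the PV and identify the
# host currently serving it. camonitor and cainfo are standard EPICS base
# tools, but they only return data where Channel Access is reachable.
PV=IBC1H04CFLCW2
if command -v camonitor >/dev/null 2>&1; then
    cainfo "$PV"                    # host/IOC currently serving the PV
    timeout 30 camonitor "$PV"      # timestamped value changes for 30 s
else
    echo "EPICS base tools not on PATH; run this from an ops machine"
fi
```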



[attachment: image002.png (strip chart)]


-------------- next part --------------
A non-text attachment was scrubbed...
Name: image001.png
Type: image/png
Size: 168529 bytes
Desc: image001.png
URL: <https://mailman.jlab.org/pipermail/acg/attachments/20230913/f3b072bc/attachment-0003.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image002.png
Type: image/png
Size: 28789 bytes
Desc: image002.png
URL: <https://mailman.jlab.org/pipermail/acg/attachments/20230913/f3b072bc/attachment-0004.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image.png
Type: image/png
Size: 631101 bytes
Desc: image.png
URL: <https://mailman.jlab.org/pipermail/acg/attachments/20230913/f3b072bc/attachment-0005.png>

