<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html;
      charset=windows-1252">
  </head>
  <body text="#000000" bgcolor="#FFFFFF">
    <p>I'm having trouble accessing the farm; I don't know why. I'll
      remove my files as soon as I can.</p>
    <p>Cheers</p>
    <p>L.<br>
    </p>
    <br>
    <div class="moz-cite-prefix">Il 01/06/2017 23:14, Solt, Matthew
      Reagan ha scritto:<br>
    </div>
    <blockquote type="cite"
      cite="mid:1496351695393.18442@slac.stanford.edu">
      <style type="text/css" style="display:none"><!--P{margin-top:0;margin-bottom:0;} --></style>
      <p>I reduced my work directory from 649G to 14G. I think I win ;)<br>
      </p>
      <p><br>
      </p>
      <p>Matt Solt<br>
      </p>
      <div style="color: rgb(33, 33, 33);">
        <hr tabindex="-1" style="display:inline-block; width:98%">
        <div id="divRplyFwdMsg" dir="ltr"><font style="font-size:11pt"
            face="Calibri, sans-serif" color="#000000"><b>From:</b>
            <a class="moz-txt-link-abbreviated" href="mailto:hps-software@slac.stanford.edu">hps-software@slac.stanford.edu</a>
            <a class="moz-txt-link-rfc2396E" href="mailto:hps-software@slac.stanford.edu"><hps-software@slac.stanford.edu></a> on behalf of Kyle
            McCarty <a class="moz-txt-link-rfc2396E" href="mailto:mccaky@gmail.com"><mccaky@gmail.com></a><br>
            <b>Sent:</b> Thursday, June 1, 2017 1:04 PM<br>
            <b>To:</b> Bradley T Yale<br>
            <b>Cc:</b> Nathan Baltzell; hps-software;
            <a class="moz-txt-link-abbreviated" href="mailto:hps-analysis@jlab.org">hps-analysis@jlab.org</a><br>
            <b>Subject:</b> Re: [Hps-analysis] Fwd: [Clas_offline] Fwd:
            ENP consumption of disk space under /work</font>
          <div> </div>
        </div>
        <div>
          <div dir="ltr">I have reduced my /work/ usage to 16 MB.</div>
          <div class="gmail_extra"><br>
            <div class="gmail_quote">On Thu, Jun 1, 2017 at 3:05 PM,
              Bradley T Yale <span dir="ltr">
                <<a href="mailto:btu29@wildcats.unh.edu"
                  target="_blank" moz-do-not-send="true">btu29@wildcats.unh.edu</a>></span>
              wrote:<br>
              <blockquote class="gmail_quote" style="margin:0 0 0 .8ex;
                border-left:1px #ccc solid; padding-left:1ex">
                <div>
                  <div dir="ltr">
                    <div
                      id="m_-8138436127875140186x_divtagdefaultwrapper"
                      dir="ltr" style="font-size:12pt; color:#000000;
                      font-family:Calibri,Arial,Helvetica,sans-serif">
                      <p>I deleted/moved 351 GB in mc_production.</p>
                      <p><br>
                      </p>
                      <p>The following items are owned by other users,
                        though, and ~6 GB more can be freed up if they
                        are no longer needed:</p>
                      <p><br>
                      </p>
                      <p>Luca:</p>
                      <p>762M: /work/hallb/hps/mc_production/Luca/lhe/Vegas_10_10_2016/</p>
                      <p>Holly:</p>
                      <p>3.1G: /work/hallb/hps/mc_production/MG5/dst/</p>
                      <p>2G: /work/hallb/hps/mc_production/postTriSummitFixes/tritrig/1pt05/NOSUMCUT/</p>
                      <p>25M: /work/hallb/hps/mc_production/logs/slic/ap/</p>
                      <p>16M: /work/hallb/hps/mc_production/logs/readout/beam-tri/1pt92/</p>
                      <p>Matt G:</p>
                      <p>35M: /work/hallb/hps/mc_production/dst/</p>
                      <p>11M: /work/hallb/hps/mc_production/logs/dqm/</p>
                      <p>There is also 2 GB of old 2.2 GeV A' MC, which
                        should no longer be relevant, but I didn't want
                        to do anything with it since it has mock-data
                        files in it:</p>
                      <p>/work/hallb/hps/mc_production/lhe/ap/2pt2/</p>
                    </div>
                    <hr style="display:inline-block; width:98%">
                    <div id="m_-8138436127875140186x_divRplyFwdMsg"
                      dir="ltr"><font style="font-size:11pt"
                        face="Calibri, sans-serif" color="#000000"><b>From:</b>
                        <a href="mailto:hps-software@SLAC.STANFORD.EDU"
                          target="_blank" moz-do-not-send="true">hps-software@SLAC.STANFORD.EDU</a>
                        <<a
                          href="mailto:hps-software@SLAC.STANFORD.EDU"
                          target="_blank" moz-do-not-send="true">hps-software@SLAC.STANFORD.<wbr>EDU</a>>
                        on behalf of Nathan Baltzell <<a
                          href="mailto:baltzell@jlab.org"
                          target="_blank" moz-do-not-send="true">baltzell@jlab.org</a>><br>
                        <b>Sent:</b> Thursday, June 1, 2017 1:23:19 PM<br>
                        <b>To:</b> HPS-SOFTWARE<br>
                        <b>Cc:</b> <a
                          href="mailto:hps-analysis@jlab.org"
                          target="_blank" moz-do-not-send="true">hps-analysis@jlab.org</a><br>
                        <b>Subject:</b> Re: [Hps-analysis] Fwd:
                        [Clas_offline] Fwd: ENP consumption of disk
                        space under /work</font>
                      <div> </div>
                    </div>
                  </div>
                  <font size="2"><span style="font-size:10pt">
                      <div class="m_-8138436127875140186PlainText">
                        <div>
                          <div class="h5">Here’s the most relevant usage<br>
                            <br>
                            649G    mrsolt/<br>
                            570G    sebouh/<br>
                            459G    mc_production/<br>
                            228G    holly<br>
                            159G    mccaky/<br>
                            78G     rafopar/<br>
                            45G     omoreno/<br>
                            44G     spaul<br>
                            39G     fxgirod<br>
                            34G     jeremym<br>
                            <br>
                            data/engrun2015:<br>
                            3.2T    tweakpass6<br>
                            50G     tweakpass6fail<br>
                            64G     tpass7<br>
                            2.4G    tpass7b<br>
                            39G     tpass7c<br>
                            6.5G    t_tweakpass_a<br>
                            373G    pass6/skim<br>
                            201G    pass6/dst<br>
                            <br>
                            data/physrun2016:<br>
                            3.5T    pass0<br>
                            690G    feeiter4<br>
                            94M     feeiter0<br>
                            327M    feeiter1<br>
                            339M    feeiter2<br>
                            338M    feeiter3<br>
                            15G     noPass<br>
                            24G     pass0_allign<br>
                            52G     pass0fail<br>
                            4.5G    tmp_test<br>
                            281G    tpass1<br>
                            11G     upass0<br>
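                            <p>[A listing like the one above is
                            typically produced with "du -sh * | sort -rh"
                            in the work area; below is a minimal Python
                            sketch of the same idea. The path is
                            illustrative.]</p>
                            <pre>#!/usr/bin/env python
# Sketch: per-subdirectory usage of a work area, largest first
# (roughly what "du -sh * | sort -rh" reports). AREA is illustrative.
import os

AREA = "/work/hallb/hps"  # illustrative path

def dir_size(path):
    """Total bytes in regular files under path (symlinks skipped)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if not os.path.islink(fp):
                total += os.path.getsize(fp)
    return total

sizes = {d: dir_size(os.path.join(AREA, d))
         for d in os.listdir(AREA)
         if os.path.isdir(os.path.join(AREA, d))}
for d in sorted(sizes, key=sizes.get, reverse=True):
    print("%6.1fG  %s/" % (sizes[d] / 1024.0**3, d))
</pre>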
                            <br>
                            On Jun 1, 2017, at 11:05, Stepan Stepanyan
                            <<a href="mailto:stepanya@jlab.org"
                              target="_blank" moz-do-not-send="true">stepanya@jlab.org</a>>
                            wrote:<br>
                            <br>
                            > FYI, we need to move files.<br>
                            > <br>
                            > Stepan<br>
                            > <br>
                            >> Begin forwarded message:<br>
                            >> <br>
                            >> From: Harut Avakian <<a
                              href="mailto:avakian@jlab.org"
                              target="_blank" moz-do-not-send="true">avakian@jlab.org</a>><br>
                            >> Subject: [Clas_offline] Fwd: ENP
                            consumption of disk space under /work<br>
                            >> Date: June 1, 2017 at 5:01:24 PM
                            GMT+2<br>
                            >> To: "<a
                              href="mailto:clas_offline@jlab.org"
                              target="_blank" moz-do-not-send="true">clas_offline@jlab.org</a>"
                            <<a href="mailto:clas_offline@jlab.org"
                              target="_blank" moz-do-not-send="true">clas_offline@jlab.org</a>><br>
                            >> <br>
                            >> <br>
                            >> <br>
                            >> <br>
                            >> Dear All,<br>
                            >> <br>
                            >> As you can see from the e-mail below,
                            keeping all of our work disk space requires
                            some additional funding.<br>
                            >> Option 3 will inevitably impact farm
                            operations, removing ~20% of the space from
                            Lustre.<br>
                            >> <br>
                            >> We can also choose something
                            between options 1) and 3).<br>
                            >> Please review the contents and move
                            at least 75% of what is in /work/clas to
                            either /cache or /volatile.
                            <br>
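                            <p>[A minimal sketch of the kind of move
                            being requested here, assuming a plain
                            filesystem move into /volatile is
                            acceptable; migrating into /cache may
                            instead go through the site's own tape
                            tools. The paths and the size threshold are
                            illustrative.]</p>
                            <pre>#!/usr/bin/env python
# Sketch: relocate large subdirectories of a /work area to /volatile.
# SRC_AREA, DEST_AREA and the 100 GB threshold are illustrative only.
import os
import shutil

SRC_AREA = "/work/hallb/clas"       # illustrative source
DEST_AREA = "/volatile/hallb/clas"  # illustrative destination
THRESHOLD = 100 * 1024**3           # 100 GB, illustrative

def dir_size(path):
    """Total bytes in regular files under path (symlinks skipped)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if not os.path.islink(fp):
                total += os.path.getsize(fp)
    return total

for entry in sorted(os.listdir(SRC_AREA)):
    src = os.path.join(SRC_AREA, entry)
    if os.path.isdir(src) and dir_size(src) > THRESHOLD:
        print("moving", src, "to", DEST_AREA)
        shutil.move(src, os.path.join(DEST_AREA, entry))
</pre>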
                            >> The current Hall-B usage includes:<br>
                            >> 550G    hallb/bonus<br>
                            >> 1.5T    hallb/clase1<br>
                            >> 3.6T    hallb/clase1-6<br>
                            >> 3.3T    hallb/clase1dvcs<br>
                            >> 2.8T    hallb/clase1dvcs2<br>
                            >> 987G    hallb/clase1f<br>
                            >> 1.8T    hallb/clase2<br>
                            >> 1.6G    hallb/clase5<br>
                            >> 413G    hallb/clase6<br>
                            >> 2.2T    hallb/claseg1<br>
                            >> 3.9T    hallb/claseg1dvcs<br>
                            >> 1.2T    hallb/claseg3<br>
                            >> 4.1T    hallb/claseg4<br>
                            >> 2.7T    hallb/claseg5<br>
                            >> 1.7T    hallb/claseg6<br>
                            >> 367G    hallb/clas-farm-output<br>
                            >> 734G    hallb/clasg10<br>
                            >> 601G    hallb/clasg11<br>
                            >> 8.1T    hallb/clasg12<br>
                            >> 2.4T    hallb/clasg13<br>
                            >> 2.4T    hallb/clasg14<br>
                            >> 28G    hallb/clasg3<br>
                            >> 5.8G    hallb/clasg7<br>
                            >> 269G    hallb/clasg8<br>
                            >> 1.2T    hallb/clasg9<br>
                            >> 1.3T    hallb/clashps<br>
                            >> 1.8T    hallb/clas-production<br>
                            >> 5.6T    hallb/clas-production2<br>
                            >> 1.4T    hallb/clas-production3<br>
                            >> 12T    hallb/hps<br>
                            >> 13T    hallb/prad<br>
                            >> <br>
                            >> Regards,<br>
                            >> <br>
                            >> Harut<br>
                            >> <br>
                            >> P.S. A few times we have had crashes,
                            and they may happen again in the future, so
                            keeping important files in /work is not
                            recommended.<br>
                            >> You can see the list of lost files
                            in /site/scicomp/lostfiles.txt and
                            /site/scicomp/lostfiles-jan-2017.txt<br>
                            >> <br>
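                            <p>[A minimal sketch of checking those
                            lists for your own files, assuming each is
                            plain text with one path per line; the
                            username is a placeholder.]</p>
                            <pre>#!/usr/bin/env python
# Sketch: search the lost-file lists for paths containing a username.
USER = "someuser"  # placeholder; substitute your own account name

for listfile in ("/site/scicomp/lostfiles.txt",
                 "/site/scicomp/lostfiles-jan-2017.txt"):
    try:
        with open(listfile) as f:
            for line in f:
                if USER in line:
                    print(line.rstrip())
    except OSError as err:
        print("could not read", listfile, ":", err)
</pre>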
                            >> <br>
                            >> <br>
                            >> -------- Forwarded Message --------<br>
                            >> Subject:     ENP consumption of
                            disk space under /work<br>
                            >> Date:        Wed, 31 May 2017
                            10:35:51 -0400<br>
                            >> From:        Chip Watson <<a
                              href="mailto:watson@jlab.org"
                              target="_blank" moz-do-not-send="true">watson@jlab.org</a>><br>
                            >> To:  Sandy Philpott <<a
                              href="mailto:philpott@jlab.org"
                              target="_blank" moz-do-not-send="true">philpott@jlab.org</a>>,
                            Graham Heyes <<a
                              href="mailto:heyes@jlab.org"
                              target="_blank" moz-do-not-send="true">heyes@jlab.org</a>>,
                            Ole Hansen <<a href="mailto:ole@jlab.org"
                              target="_blank" moz-do-not-send="true">ole@jlab.org</a>>,
                            Harut Avakian <<a
                              href="mailto:avakian@jlab.org"
                              target="_blank" moz-do-not-send="true">avakian@jlab.org</a>>,
                            Brad Sawatzky <<a
                              href="mailto:brads@jlab.org"
                              target="_blank" moz-do-not-send="true">brads@jlab.org</a>>,
                            Mark M. Ito <<a
                              href="mailto:marki@jlab.org"
                              target="_blank" moz-do-not-send="true">marki@jlab.org</a>><br>
                            >> <br>
                            >> All,<br>
                            >> <br>
                            >> As I have started on the
                            procurement of the new /work file server, I
                            <br>
                            >> have discovered that Physics' use
                            of /work has grown unrestrained over <br>
                            >> the last year or two.<br>
                            >> <br>
                            >> "Unrestrained" because there is no
                            way under Lustre to restrain it <br>
                            >> except via a very unfriendly Lustre
                            quota system.  As we leave some <br>
                            >> quota headroom to accommodate large
                            swings in usage for each hall for <br>
                            >> cache and volatile, then /work
                            continues to grow.<br>
                            >> <br>
                            >> Total /work has now reached 260 TB,
                            several times larger than I was <br>
                            >> anticipating. This constitutes
                            more than 25% of Physics' share of <br>
                            >> Lustre, compared to LQCD, which uses
                            less than 5% of its disk space on <br>
                            >> the un-managed /work.<br>
                            >> <br>
                            >> It would cost Physics an extra $25K
                            (total $35K - $40K) to treat the 260 <br>
                            >> TB as a requirement.<br>
                            >> <br>
                            >> There are 3 paths forward:<br>
                            >> <br>
                            >> (1) Physics cuts its use of /work
                            by a factor of 4-5.<br>
                            >> (2) Physics increases funding to
                            $40K<br>
                            >> (3) We pull a server out of Lustre,
                            decreasing Physics' share of the <br>
                            >> system, and use that as half of the
                            new active-active pair, beefing it <br>
                            >> up with SSDs and perhaps additional
                            memory; this would actually shrink <br>
                            >> Physics' near-term costs, but would
                            put higher pressure on the file system for <br>
                            >> the farm.<br>
                            >> <br>
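                            <p>[For scale: cutting the 260 TB currently
                            in /work by a factor of 4 to 5 would leave
                            roughly 52 to 65 TB, since 260/5 = 52 and
                            260/4 = 65.]</p>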
                            >> The decision is clearly Physics',
                            but I do need a VERY FAST response to <br>
                            >> this question, as I need to move
                            quickly now for LQCD's needs.<br>
                            >> <br>
                            >> Hall D + GlueX: 96 TB<br>
                            >> CLAS + CLAS12: 98 TB<br>
                            >> Hall C: 35 TB<br>
                            >> Hall A: <unknown, still
                            scanning><br>
                            >> <br>
                            >> Email, call (x7101), or drop by
                            today 1:30-3:00 p.m. for discussion.<br>
                            >> <br>
                            >> thanks,<br>
                            >> Chip<br>
                            >> <br>
                            >> <br>
                            >> _______________________________________________<br>
                            >> Clas_offline mailing list<br>
                            >> <a
                              href="mailto:Clas_offline@jlab.org"
                              target="_blank" moz-do-not-send="true">Clas_offline@jlab.org</a><br>
                            >> <a
                              href="https://mailman.jlab.org/mailman/listinfo/clas_offline"
                              target="_blank" moz-do-not-send="true">
                              https://mailman.jlab.org/mailman/listinfo/clas_offline</a><br>
                            > <br>
                            > _______________________________________________<br>
                            > Hps-analysis mailing list<br>
                            > <a href="mailto:Hps-analysis@jlab.org"
                              target="_blank" moz-do-not-send="true">Hps-analysis@jlab.org</a><br>
                            > <a
                              href="https://mailman.jlab.org/mailman/listinfo/hps-analysis"
                              target="_blank" moz-do-not-send="true">
                              https://mailman.jlab.org/mailman/listinfo/hps-analysis</a><br>
                            <br>
                          </div>
                        </div>
                      </div>
                    </span></font></div>
              </blockquote>
            </div>
            <br>
          </div>
          <br>
          <hr>
          <p align="left">Use REPLY-ALL to reply to list</p>
          <p align="center">To unsubscribe from the HPS-SOFTWARE list,
            click the following link:<br>
            <a
href="https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1"
              target="_blank">https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1</a>
          </p>
        </div>
      </div>
      <br>
      <br>
      <pre wrap="">_______________________________________________
Hps-analysis mailing list
<a class="moz-txt-link-abbreviated" href="mailto:Hps-analysis@jlab.org">Hps-analysis@jlab.org</a>
<a class="moz-txt-link-freetext" href="https://mailman.jlab.org/mailman/listinfo/hps-analysis">https://mailman.jlab.org/mailman/listinfo/hps-analysis</a>
</pre>
    </blockquote>
    <br>
  </body>
</html>