<div dir="ltr">I have reduced my /work/ usage to 16 MB.</div><div class="gmail_extra"><br><div class="gmail_quote">On Thu, Jun 1, 2017 at 3:05 PM, Bradley T Yale <span dir="ltr"><<a href="mailto:btu29@wildcats.unh.edu" target="_blank">btu29@wildcats.unh.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">





<div>


<div dir="ltr">
<div id="m_-8138436127875140186x_divtagdefaultwrapper" dir="ltr" style="font-size:12pt;color:#000000;font-family:Calibri,Arial,Helvetica,sans-serif">
I deleted/moved 351 GB in mc_production.

The following directories are owned by other users, though; ~6 GB more can be freed up if they are no longer needed:
Luca:
762M  /work/hallb/hps/mc_production/Luca/lhe/Vegas_10_10_2016/

Holly:
3.1G  /work/hallb/hps/mc_production/MG5/dst/
2G    /work/hallb/hps/mc_production/postTriSummitFixes/tritrig/1pt05/NOSUMCUT/
25M   /work/hallb/hps/mc_production/logs/slic/ap/
16M   /work/hallb/hps/mc_production/logs/readout/beam-tri/1pt92/

Matt G:
35M   /work/hallb/hps/mc_production/dst/
11M   /work/hallb/hps/mc_production/logs/dqm/
There is also 2 GB of old 2.2 GeV A' MC, which should no longer be relevant, but I did not touch it since it contains mock data files:
/work/hallb/hps/mc_production/lhe/ap/2pt2/
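For reference, here is a minimal sketch of one way to produce this kind of per-directory summary (du -sh from the shell does the same job); it assumes Python 3 and read access to the tree, and the default root path is only an example:

#!/usr/bin/env python3
# Hypothetical helper, not an official tool: print the total size of each
# top-level subdirectory under a root, largest first (roughly what
# "du -sh <root>/*" reports).  The default root is only an example.
import os
import sys

def tree_size(path):
    """Total size in bytes of all regular files under path."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path, onerror=lambda err: None):
        for name in filenames:
            try:
                total += os.lstat(os.path.join(dirpath, name)).st_size
            except OSError:
                pass  # file vanished or is unreadable; skip it
    return total

def human(nbytes):
    """Format a byte count roughly the way du -h does (2.0K, 762.0M, ...)."""
    size = float(nbytes)
    for unit in ("B", "K", "M", "G", "T"):
        if size < 1024 or unit == "T":
            return "%.1f%s" % (size, unit)
        size /= 1024.0

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "/work/hallb/hps/mc_production"
    subdirs = [d for d in os.listdir(root)
               if os.path.isdir(os.path.join(root, d))]
    sizes = [(tree_size(os.path.join(root, d)), d) for d in subdirs]
    for size, name in sorted(sizes, reverse=True):
        print("%-10s %s/" % (human(size), name))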
<hr style="display:inline-block;width:98%">
<div id="m_-8138436127875140186x_divRplyFwdMsg" dir="ltr"><font face="Calibri, sans-serif" color="#000000" style="font-size:11pt"><b>From:</b> <a href="mailto:hps-software@SLAC.STANFORD.EDU" target="_blank">hps-software@SLAC.STANFORD.EDU</a> <<a href="mailto:hps-software@SLAC.STANFORD.EDU" target="_blank">hps-software@SLAC.STANFORD.<wbr>EDU</a>> on behalf of Nathan Baltzell <<a href="mailto:baltzell@jlab.org" target="_blank">baltzell@jlab.org</a>><br>
<b>Sent:</b> Thursday, June 1, 2017 1:23:19 PM<br>
<b>To:</b> HPS-SOFTWARE<br>
<b>Cc:</b> <a href="mailto:hps-analysis@jlab.org" target="_blank">hps-analysis@jlab.org</a><br>
<b>Subject:</b> Re: [Hps-analysis] Fwd: [Clas_offline] Fwd: ENP consumption of disk space under /work</font>
<div> </div>
</div>
</div>
<font size="2"><span style="font-size:10pt">
<div class="m_-8138436127875140186PlainText"><div><div class="h5">Here’s the most relevant usage<br>
<br>
649G    mrsolt/<br>
570G    sebouh/<br>
459G    mc_production/<br>
228G    holly<br>
159G    mccaky/<br>
78G     rafopar/<br>
45G     omoreno/<br>
44G     spaul<br>
39G     fxgirod<br>
34G     jeremym<br>
<br>
data/engrun2015:<br>
3.2T    tweakpass6<br>
50G     tweakpass6fail<br>
64G     tpass7<br>
2.4G    tpass7b<br>
39G     tpass7c<br>
6.5G    t_tweakpass_a<br>
373G    pass6/skim<br>
201G    pass6/dst<br>
<br>
data/physrun2016:<br>
3.5T    pass0<br>
690G    feeiter4<br>
94M     feeiter0<br>
327M    feeiter1<br>
339M    feeiter2<br>
338M    feeiter3<br>
15G     noPass<br>
24G     pass0_allign<br>
52G     pass0fail<br>
4.5G    tmp_test<br>
281G    tpass1<br>
11G     upass0<br>
<br>
<br>
<br>
<br>
On Jun 1, 2017, at 11:05, Stepan Stepanyan <stepanya@jlab.org> wrote:

> FYI, we need to move files.
> 
> Stepan
> 
>> Begin forwarded message:
>> 
>> From: Harut Avakian <avakian@jlab.org>
>> Subject: [Clas_offline] Fwd: ENP consumption of disk space under /work
>> Date: June 1, 2017 at 5:01:24 PM GMT+2
>> To: "clas_offline@jlab.org" <clas_offline@jlab.org>
>> 
>> Dear All,
>> 
>> As you can see from the e-mail below, keeping all of our /work disk space requires some additional funding.
>> Option 3 would inevitably impact farm operations by removing ~20% of the space from Lustre.
>> 
>> We can also choose something in between options 1) and 3).
>> Please review the contents and move at least 75% of what is in /work/clas to either /cache or /volatile.
>> 
>> The current Hall-B usage includes:
>> 550G    hallb/bonus
>> 1.5T    hallb/clase1
>> 3.6T    hallb/clase1-6
>> 3.3T    hallb/clase1dvcs
>> 2.8T    hallb/clase1dvcs2
>> 987G    hallb/clase1f
>> 1.8T    hallb/clase2
>> 1.6G    hallb/clase5
>> 413G    hallb/clase6
>> 2.2T    hallb/claseg1
>> 3.9T    hallb/claseg1dvcs
>> 1.2T    hallb/claseg3
>> 4.1T    hallb/claseg4
>> 2.7T    hallb/claseg5
>> 1.7T    hallb/claseg6
>> 367G    hallb/clas-farm-output
>> 734G    hallb/clasg10
>> 601G    hallb/clasg11
>> 8.1T    hallb/clasg12
>> 2.4T    hallb/clasg13
>> 2.4T    hallb/clasg14
>> 28G     hallb/clasg3
>> 5.8G    hallb/clasg7
>> 269G    hallb/clasg8
>> 1.2T    hallb/clasg9
>> 1.3T    hallb/clashps
>> 1.8T    hallb/clas-production
>> 5.6T    hallb/clas-production2
>> 1.4T    hallb/clas-production3
>> 12T     hallb/hps
>> 13T     hallb/prad
>> 
>> Regards,
>> 
>> Harut
>> 
>> P.S. We have had crashes a few times and they may happen again, so keeping important files in /work is not recommended.
>> You can see the list of lost files in /site/scicomp/lostfiles.txt and /site/scicomp/lostfiles-jan-2017.txt
>> 
>> -------- Forwarded Message --------
>> Subject: ENP consumption of disk space under /work
>> Date:    Wed, 31 May 2017 10:35:51 -0400
>> From:    Chip Watson <watson@jlab.org>
>> To:      Sandy Philpott <philpott@jlab.org>, Graham Heyes <heyes@jlab.org>, Ole Hansen <ole@jlab.org>, Harut Avakian <avakian@jlab.org>, Brad Sawatzky <brads@jlab.org>, Mark M. Ito <marki@jlab.org>
>> 
>> All,
>> 
>> As I have started on the procurement of the new /work file server, I
>> have discovered that Physics' use of /work has grown unrestrained over
>> the last year or two.
>> 
>> "Unrestrained" because there is no way under Lustre to restrain it
>> except via a very unfriendly Lustre quota system. Since we leave some
>> quota headroom to accommodate large swings in each hall's usage of
>> cache and volatile, /work continues to grow.
>> 
>> Total /work has now reached 260 TB, several times larger than I was
>> anticipating. This constitutes more than 25% of Physics' share of
>> Lustre, compared to LQCD, which uses less than 5% of its disk space on
>> the un-managed /work.
>> 
>> It would cost Physics an extra $25K (total $35K - $40K) to treat the
>> 260 TB as a requirement.
>> 
>> There are 3 paths forward:
>> 
>> (1) Physics cuts its use of /work by a factor of 4-5.
>> (2) Physics increases funding to $40K.
>> (3) We pull a server out of Lustre, decreasing Physics' share of the
>> system, and use it as half of the new active-active pair, beefing it
>> up with SSDs and perhaps additional memory; this would actually shrink
>> Physics' near-term costs, but puts higher pressure on the file system
>> for the farm.
>> 
>> The decision is clearly Physics', but I do need a VERY FAST response
>> to this question, as I need to move quickly now for LQCD's needs.
>> 
>> Hall D + GlueX,   96 TB
>> CLAS + CLAS12,    98 TB
>> Hall C,           35 TB
>> Hall A,           <unknown, still scanning>
>> 
>> Email, call (x7101), or drop by today 1:30-3:00 p.m. for discussion.
>> 
>> thanks,
>> Chip
>> 
>> _______________________________________________
>> Clas_offline mailing list
>> Clas_offline@jlab.org
>> https://mailman.jlab.org/mailman/listinfo/clas_offline
> 
> _______________________________________________
> Hps-analysis mailing list
> Hps-analysis@jlab.org
> https://mailman.jlab.org/mailman/listinfo/hps-analysis
Use REPLY-ALL to reply to list

To unsubscribe from the HPS-SOFTWARE list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1