<div dir="ltr">The module command works fine on the ifarm under tcsh, but for some reason it's not properly configured under bash using the default environment. Whether they work on the non-interactive farm nodes is a different question...<div><br></div><div>---Sean</div></div><br><div class="gmail_quote"><div dir="ltr">On Thu, Jun 2, 2016 at 10:35 AM Mark Ito <<a href="mailto:marki@jlab.org">marki@jlab.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I think it is available for tcsh on the farm.<br>
<br>
For bash, Sean did some research and came up with the following recipe:<br>
<br>
export MODULESHOME=/usr/share/Modules<br>
source $MODULESHOME/init/bash<br>
module load gcc_4.9.2<br>
<br>
I just pushed (to a branch) new versions of<br>
$BUILD_SCRIPTS/gluex_env_jlab.(c)sh that execute these commands<br>
automatically if you are on the farm.<br>
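For reference, the recipe above and the manual PATH setup quoted later in this thread can be combined into one guarded bash function, which is roughly what such a script would need to do. This is only a sketch: the paths are the ones quoted in this thread, and the function name is made up.

```shell
# Sketch: initialize Environment Modules when its bash init script is
# readable, otherwise fall back to setting the gcc paths by hand.
# Paths are from this thread; the function name is hypothetical.
setup_gcc492() {
    MODULESHOME=${MODULESHOME:-/usr/share/Modules}
    if [ -r "$MODULESHOME/init/bash" ]; then
        . "$MODULESHOME/init/bash"
        module load gcc_4.9.2
    else
        GCC_HOME=/apps/gcc/4.9.2
        export PATH="${GCC_HOME}/bin:${PATH}"
        export LD_LIBRARY_PATH="${GCC_HOME}/lib64:${GCC_HOME}/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}"
    fi
}
```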
<br>
On 06/02/2016 11:16 AM, Sandy Philpott wrote:<br>
> Hmmmm... that should be available on the farm nodes as well - let me get that straight.<br>
><br>
> Sandy<br>
><br>
> ----- Original Message -----<br>
> From: "Justin Stevens" <<a href="mailto:jrsteven@jlab.org" target="_blank">jrsteven@jlab.org</a>><br>
> To: "Paul Mattione" <<a href="mailto:pmatt@jlab.org" target="_blank">pmatt@jlab.org</a>><br>
> Cc: "<a href="mailto:halld-physics@jlab.org" target="_blank">halld-physics@jlab.org</a> Physics" <<a href="mailto:halld-physics@jlab.org" target="_blank">halld-physics@jlab.org</a>>, "GlueX Offline Software Email List" <<a href="mailto:halld-offline@jlab.org" target="_blank">halld-offline@jlab.org</a>><br>
> Sent: Thursday, June 2, 2016 11:01:58 AM<br>
> Subject: Re: [Halld-offline] [Halld-physics] Updating software for gcc 4.9.2<br>
><br>
> FYI, the command to load gcc_4.9.2 with 'module'<br>
><br>
>> module load gcc_4.9.2<br>
> is not available on the farm nodes at JLab. Instead, you can manually set the PATH and LD_LIBRARY_PATH variables when setting up your environment, so that it works properly for batch jobs as well. For example:<br>
><br>
>> set GCC_HOME=/apps/gcc/4.9.2<br>
>> setenv PATH ${GCC_HOME}/bin:${PATH}<br>
>> setenv LD_LIBRARY_PATH ${GCC_HOME}/lib64:${GCC_HOME}/lib<br>
><br>
> -Justin<br>
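[Editor's note: the commands above use tcsh syntax. A bash equivalent, using the same paths from this thread, would be:]

```shell
# bash equivalents of the tcsh setenv commands quoted above
GCC_HOME=/apps/gcc/4.9.2
export PATH="${GCC_HOME}/bin:${PATH}"
export LD_LIBRARY_PATH="${GCC_HOME}/lib64:${GCC_HOME}/lib"
```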
><br>
> On Jun 1, 2016, at 3:26 PM, Paul Mattione wrote:<br>
><br>
>> Also, if you are working on the ifarm, before setting up your environment, you can run:<br>
>><br>
>> module load gcc_4.9.2<br>
>><br>
>> which will prepend the locations of the gcc binaries and libraries to your PATH and LD_LIBRARY_PATH. Then just source your latest environment that is built with gcc 4.9.2.<br>
>><br>
>> - Paul<br>
>><br>
>> On Jun 1, 2016, at 3:22 PM, Paul Mattione <<a href="mailto:pmatt@jlab.org" target="_blank">pmatt@jlab.org</a>> wrote:<br>
>><br>
>>> We’ve now updated our software to work with gcc 4.9.2 on the farm. To get your software to work at your home institutions with this version of gcc, you need to:<br>
>>><br>
>>> 1) Upgrade to JANA 0.7.5p2<br>
>>><br>
>>> 2) Update to the latest sim-recon<br>
>>><br>
>>> And then, if you are using analysis plugins generated by MakeReactionPlugin.pl:<br>
>>><br>
>>> 3) You need to update the SConstruct in each plugin. The easiest way to do this is just to create a dummy plugin:<br>
>>><br>
>>> MakeReactionPlugin.pl dummy<br>
>>><br>
>>> and then copy the SConstruct it generates into all of your plugin folders.<br>
>>><br>
>>> 4) In your analysis plugin processor brun(), delete all references to DEventWriterROOT. Creation of the trees is now done automatically.<br>
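[Editor's note: step 3 above amounts to copying one freshly generated SConstruct into every plugin directory. A small helper sketch, with a hypothetical function name and hypothetical plugin directory names:]

```shell
# Copy a freshly generated SConstruct (e.g. from the dummy plugin) into
# each plugin directory passed as an argument. Names are hypothetical.
copy_sconstruct() {
    src=$1; shift
    for d in "$@"; do
        cp "$src/SConstruct" "$d/"
    done
}
# usage (hypothetical plugin directories):
#   MakeReactionPlugin.pl dummy
#   copy_sconstruct dummy p2pi_plugin p3pi_plugin
```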
>>><br>
>>> - Paul<br>
>>><br>
>>><br>
>>> _______________________________________________<br>
>>> Halld-physics mailing list<br>
>>> <a href="mailto:Halld-physics@jlab.org" target="_blank">Halld-physics@jlab.org</a><br>
>>> <a href="https://mailman.jlab.org/mailman/listinfo/halld-physics" rel="noreferrer" target="_blank">https://mailman.jlab.org/mailman/listinfo/halld-physics</a><br>
>><br>
><br>
> _______________________________________________<br>
> Halld-offline mailing list<br>
> <a href="mailto:Halld-offline@jlab.org" target="_blank">Halld-offline@jlab.org</a><br>
> <a href="https://mailman.jlab.org/mailman/listinfo/halld-offline" rel="noreferrer" target="_blank">https://mailman.jlab.org/mailman/listinfo/halld-offline</a><br>
><br>
<br>
--<br>
<a href="mailto:marki@jlab.org" target="_blank">marki@jlab.org</a>, (757)269-5295<br>
<br>
</blockquote></div>