[Halld-offline] time for chat this afternoon

Richard Jones richard.t.jones at uconn.edu
Mon Aug 30 16:04:30 EDT 2010


  Jake,

A few changes have occurred since Blake's work was completed that have impacted your tests.

   1. Rearrangement of the GlueX software tree: /home/halld/src used to point to the software stack, but now the path should be /home/halld/sim-recon/src, as I understand it.  That is where I am accessing the GlueX software stack.  Note that this is not always the most recent build from Dave et al.  If you need a certain tagged release installed, please let me know.  Otherwise you can install it yourself under the $OSG_APP directory tree, as I think Blake explains in his howto.  (See the first sketch after this list for how this path change relates to the error in your note below.)
   2. srm tools are used to access the data grid.  The data grid is meant for storing large data files, such as simulation output and logs, not small files such as executable scripts or software build trees.  If you are having problems getting a reasonable response from my srm server, please forward me the details.  Here is what I am getting at this moment as I write: roughly 30 seconds of overhead to establish an srm session and do a simple task.  srm is not meant to compete with command-line tools for things like ls, but it has high transfer throughput for large files.
         1. time srmls srm://grinch.phys.uconn.edu
               0 //
                   512 //test/
                   512 //Gluex/
                   512 //dzero/
                   512 //prod/
                   512 //engage/
                   512 //radphi/
                   512 //data1/
            real    0m25.337s
            user    0m23.923s
            sys     0m1.113s

   3. I think that Blake was just mixing up which tools to use for which needs.  If you want to look at files on local disks on the server, a simple remote command like the following will do.  You can see that this one completed in under 3 seconds.  (The second sketch after this list applies the same idea to the setup.sh file from your error message.)
         1.   time globus-job-run gluskap.phys.uconn.edu /bin/ls -l .globus
            total 4
            drwx------ 5 gluexuser Gluex 4096 Jan 22  2010 job

            real    0m2.615s
            user    0m0.494s
            sys     0m0.289s
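
   Regarding point 1: the error in your note below comes from setup.sh still
   referring to the old /home/halld/src location.  Here is a minimal sketch of
   the kind of fix I have in mind, assuming the script only needs its
   osrelease.pl path updated -- the variable name BMS_OSNAME is my guess at
   what your setup.sh sets, so adapt it to whatever the script actually does:

      # old, broken path:
      #   /home/halld/src/BMS/osrelease.pl
      # new path under the rearranged tree:
      BMS_OSNAME=`/home/halld/sim-recon/src/BMS/osrelease.pl`
      export BMS_OSNAME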

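   As an illustration of point 3: something like the following (untested as I
   write this, and assuming that path is mounted on gluskap) would let you
   inspect the setup.sh that produced your error without any srmcp round trip:

      # view a small file on the server's local/NFS disk directly
      globus-job-run gluskap.phys.uconn.edu /bin/cat \
          /nfs/direct/app/Gluex/eta-pi0/setup.sh
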
-Richard Jones

On 8/30/2010 1:51 PM, Jake Bennett wrote:
> Hi, Richard.
>
> I don't think 5 pm this evening will work for me.  Do you have some time tomorrow?  I am generally free until about 6 pm on Tuesdays.
>
> Let me briefly tell you what I'm working on.  I used Blake's example code from the wiki to create a .sub script for generating some pythia events.  I get the following error:
>
> /nfs/direct/app/Gluex/eta-pi0/setup.sh: line 2: /home/halld/src/BMS/osrelease.pl: no such file or directory
>
> The setup.sh file is the one I need to edit.  Blake suggested that the only way to edit files is to use the srmcp command to copy the file to my home directory, edit it there, and then copy it back to the grid.  When I tried using the srm commands to access grendl I got errors, but not when I accessed grinch.  I think I am just not quite clear on the structure of the grid.  That is something I would like to discuss when we get the chance.
>
> Let me know if tomorrow will work.
>
> Thanks,
> Jake
>
> On Mon, Aug 30, 2010 at 12:26 PM, Richard Jones <richard.t.jones at uconn.edu> wrote:
>   Jake,
>
> I would be available to chat with you this afternoon at 5:00 our time (EST).  Will that work for you?
>
> -Richard J.
>
>
>
