[Halld-tagger] [EXTERNAL] first step toward a global filesystem for Gluex

Richard Jones richard.t.jones at uconn.edu
Tue Nov 16 10:13:37 EST 2021


Hello all,

A few weeks ago, the possibility was raised of a shared global filesystem
to provide easy access to shared GlueX data (e.g. REST files, analysis data
sets, skims of various kinds) from anywhere on or off site, without having
to wait for data to be staged from tape. As a first step, I have created a
namespace for these files under the osgstorage file catalog, which is
managed by OSG operations.

   - /cvmfs/gluex.osgstorage.org/gluex

The purpose of the second /gluex is to allow the various physics working
groups (e.g. cpp, primex) to have their own separate branches under /cvmfs/
gluex.osgstorage.org. The osgstorage.org service is built around a network
of shared caches across North America that automatically finds and serves
you the nearest copy of any file registered in the catalog. The data are
also cached on your local machine through the cvmfs caching mechanism, so
repeated access to the same files is fast.
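
Once the repository is mounted, the files look like ordinary read-only
POSIX paths. Here is a minimal Python sketch of that kind of access; the
file name under the gluex branch is hypothetical, so substitute whatever
dataset you are actually after:

    # Minimal sketch of POSIX access through the cvmfs mount.
    # The sample file path is hypothetical.
    import os

    base = "/cvmfs/gluex.osgstorage.org/gluex"

    # Listing a directory makes cvmfs fetch the catalog on demand.
    for entry in sorted(os.listdir(base)):
        print(entry)

    # Opening a file pulls the data from the nearest cache; later reads
    # of the same file are served from the local cvmfs cache.
    sample = os.path.join(base, "rest", "some_run.hddm")  # hypothetical
    if os.path.exists(sample):
        with open(sample, "rb") as f:
            header = f.read(64)
            print(len(header), "bytes read from", sample)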

Right now UConn is providing the "origin" service for the gluex namespace,
but hopefully JLab will also contribute to this in the near future. To
provide an osgstorage origin service, all you need to do is export your
files using a standard xrootd server. Just email
support.opensciencegrid.org, tell them what portion of the /gluex
namespace you want to occupy, and they will start automatically indexing
your files and adding them to the catalog.
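
If you do stand up an origin, a quick sanity check is to list it directly
with the xrootd Python bindings and confirm it is exporting what you
expect. This is only a sketch, and the hostname below is a placeholder for
your own server:

    # Sketch of a direct listing against an xrootd origin, assuming the
    # pyxrootd bindings are installed.  The hostname is a placeholder.
    from XRootD import client

    fs = client.FileSystem("root://your-origin.example.edu:1094")
    status, listing = fs.dirlist("/gluex")
    if not status.ok:
        raise RuntimeError(status.message)
    for entry in listing:
        print(entry.name)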

If you don't have any storage to contribute but would like to take
advantage of this shared GlueX storage, write to Mark Ito or to me and tell
us which datasets you would like to see published through the system. If
you don't have /cvmfs installed, you can still access any file in the
namespace using the stashcp command. If you log onto an ifarm machine at
the lab, you can poke around under /cvmfs/gluex.osgstorage.org/gluex and
see what is stored there at present, about 120 TB of various bits and
pieces. There is about 600 TB of additional space available, so there is
plenty of room for anything you would like to add.
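
For the stashcp route mentioned above, the basic usage is just
stashcp <source> <destination>. Here is a hedged Python sketch of fetching
one file that way; the source path is hypothetical, and the exact mapping
of the /cvmfs/gluex.osgstorage.org/gluex namespace into stashcp source
paths may differ, so check the stashcp documentation before relying on it:

    # Sketch of fetching a single file with stashcp when /cvmfs is not
    # mounted.  The source path is hypothetical; adjust it to the file
    # you actually want.
    import subprocess

    source = "/gluex/rest/some_run.hddm"   # hypothetical file
    destination = "./some_run.hddm"

    subprocess.run(["stashcp", source, destination], check=True)
    print("fetched", destination)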

-Richard Jones