[Moller_simulation] Free Microsoft Azure Resource
Wouter Deconinck
wdeconinck at wm.edu
Mon Jun 10 09:54:46 EDT 2019
Hi Michael and others,
We should strive to take advantage of every free allocation of computing
resources. I have no experience running Geant4 on Azure (I have run
Geant4 simulations on Google Cloud and Amazon). I know that Andrea Dotti,
formerly one of the lead developers of Geant4, spent a summer getting this
to work in a joint program with Azure, and afterwards did not recommend
that we use Azure (I forget whether for technical reasons or data egress
policy reasons)...
Our simulations are usually considered high-throughput computing (HTC),
not just (or not at all) high-performance computing (HPC), since we have
pretty large egress requirements (and, for analysis, also large ingress
requirements). I have been targeting the scalable HTC resources that the
lab provides through the Open Science Grid (OSG), currently used mainly
for Hall D experiments. That allows a similar number of cores*, but
indefinitely, with high-bandwidth ingress/egress, and without ingress or
egress costs. We are already working with Hall D to adapt their tools to
remoll (in particular MC Wrapper, a web-based job submission system). For
Azure we would need to develop tools from scratch, though it may be
possible to build those into MC Wrapper (which already supports multiple
batch systems).
In any case, the remoll Docker containers are what allow this to happen,
both on Azure and on OSG (through Singularity). Starting from those
containers, it should be possible to spin up Azure instances to run
simulations; the challenge will be to make this scale beyond a single
node.
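As a very rough sketch of what that could look like (the image name
jeffersonlab/remoll and the macro file macros/runexample.mac here are
assumptions; adjust both to the actual container and macro):

    # On a single cloud node with Docker installed (e.g. an Azure VM),
    # mounting a host directory for the output files:
    docker run --rm -v $PWD/output:/output \
        jeffersonlab/remoll remoll macros/runexample.mac

    # On OSG worker nodes the same image can run under Singularity,
    # pulled directly from Docker Hub:
    singularity exec docker://jeffersonlab/remoll \
        remoll macros/runexample.mac

Scaling beyond one node would then be a matter of wrapping commands like
these in whatever batch layer each platform provides (Azure's own batch
service on one side, HTCondor through OSG on the other).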
Cheers,
Wouter
* Back in 2012 GlueX ran a data challenge to test the limits of OSG and
used 2M core-hours in a 14-day period, equivalent to continuous use of
about 6,000 cores (2,000,000 core-hours / (14 days x 24 hours) = ~5,950
cores). That was a blip on the worldwide OSG capacity and is, I think,
representative of their current baseline use.
On Mon, Jun 10, 2019 at 12:22 AM Michael Gericke
<Michael.Gericke at umanitoba.ca> wrote:
> Hello,
>
> The UofM has been given at least $20k worth of free Microsoft Azure
> high-performance computing resources to test their services, and it
> seems that only a small number of groups, including ours, are really
> interested. That means we could have access to a large number of cores
> over a significant period of time (months). This would be useful if one
> is planning to run a very large number of simulation configurations, or
> a very high statistics simulation. For example, we could have something
> like 72 cores over a period of 3 months or more, plus associated
> storage, without a queue or any form of scheduling.
>
> I am currently trying to figure out what we might run to take advantage
> of this, so if anyone in this group thinks this could be useful for a
> large-scale MOLLER simulation job, please let me know as soon as
> possible.
>
> If you would like more information about it, here is a link:
>
> https://azure.microsoft.com/en-ca/pricing/calculator/
>
> Cheers,
>
> Michael
>
>
> --
> Michael Gericke (Ph.D., Professor)
>
> Physics and Astronomy
> University of Manitoba
> 30A Sifton Road, 213 Allen Bldg.
> Winnipeg, MB R3T 2N2, Canada
>
> Tel.: 204 474 6203
> Fax.: 204 474 7622
>
--
Wouter Deconinck (pronouns: he, him, his)
Assistant Professor of Physics, William & Mary
Office: Small Hall 343D, Phone: (757) 221-3539
Emails sent to this address are subject to requests for public review under
the Virginia Freedom of Information Act.