November 15, 2023

LArSoft Offline Leads 11/15/23 Meeting

Attendees: Herb Greenlee, Tom Junk, Will Foreman, Giuseppe Cerati, Erica Snider, Katherine Lato

At the meeting, we went over the draft of the 2024 work plan for LArSoft. 

The number one priority continues to be multi-threading and High Performance Computing (HPC), with several people working on this in 2024. Notably, SciSoft has effort available to make algorithms run on GPUs. The requirement is that the algorithms be relatively slow, reside in LArSoft, and be amenable to acceleration with a GPU. There was a follow-on question about GPU as a service. This is described at: SciSoft believes this is ready to run at scale, so we just need to work out the logistics of spinning up the GPU server somewhere.

Spack migration has a deadline of Q2 2024 in order to provide time for experiments to complete their migrations in advance of the June 2024 SL7 EOL. AL9 requires Spack, since there will be no UPS support. This work seems close, though we still do not have a detailed timeline to completion. Comment:  Experiments have changes to make, so they do not want to be pinched for time. Response:  Given that all experiment code has been migrated to cetmodules, the procedure for getting from there to a Spack build should be straightforward. We expect that any issues will be related to getting the experiment-specific product stacks under Spack. Q:  Containers should provide some cover? A:  Yes, they should provide a buffer to the June 2024 EOL. Generally, the process will be that AL9 and SL7 will co-exist for some time to allow experiments to migrate. Spack has to come first.

Support for a multi-experiment event display has been on the wish list for a while so that upper management understands that multiple experiments have requested it.

A new item in this year’s work plan is a set of updates to the LArSoft infrastructure. These include, but are not limited to:

  • Sampling frequencies vary across TPCs in protoDUNE, while LArSoft supports only a single value.
  • Support for non-planar cathode geometries to facilitate tracking across the cathode.
  • Support for TPC-dependent drift velocities and electron lifetimes. 

Appendix B is a short summary of our major observations from one-on-one meetings with each experiment in September and October of 2023. Common items include: 

  1. Event display that is useful in the current environment.
  2. HPC.
  3. Faster processing.
  4. Event generators.

Round Robin:

  • SBND:  Will Foreman
    • Working to integrate blip reconstruction into SBND code. Once completed, it will be migrated to LArSoft, but first they want to make sure it is working in SBND.
  • MicroBooNE:  all questions answered during work plan discussion.
  • SBN:  Giuseppe
    • SBND has an allocation to run on Polaris at Argonne and will be running simulation in a container. They first asked about Spack, but it was not available, so they opted for a container, which has been demonstrated to work.
  • DUNE:  will send comments later.

Please email Katherine Lato or Erica Snider for any corrections or additions to these notes.