Andrzej Szelc, Wes Ketchum, Thomas Junk, Herb Greenlee, Erica Snider, Katherine Lato
LArSoft – Erica and Katherine discussed the process for following up on issues: if something is a ‘real issue,’ we’d like the Offline Leads to create the Redmine issue so we know whom to follow up with. We then went through some of the current issues, but broke off to leave time for the round-robin reports.
DUNE – Tom Junk
- They’re dealing with a number of little things that are experiment-specific.
- DAQ people from ProtoDUNE will provide error codes as well as other information per tick, which amounts to a lot of error information. Erica suggests putting it in a parallel structure so it’s easier to navigate. Some experiment-specific code will have to know about it.
- DUNE is interested in when the LArG4 restructure is done.
- For best practices in DUNE, they are currently telling people to follow art’s and LArSoft’s. Note that LArSoft is happy to take suggestions for improving our recommended practices. This discussion arose from an issue for which art proposed a technical solution along with a justification based on best practices.
ICARUS – Daniele Gibin
LArIAT – Jonathan Asaadi
MicroBooNE – Wesley Robert Ketchum and Herb Greenlee
- Wes has spoken with Hans and has follow-up to do with him on integrating with the LArG4 restructuring work. Wes had put this on hold but will try to find the time for it.
- Both ICARUS and SBND state they would like separate storing of simulated energy depositions as an output of geant4, something that could be used in event mixing and potentially in translating events within and among detectors (for systematic studies). Other experiments are interested as well.
- There are basic modules for electron and photon (library-based) propagation, but it’s likely that more effort will be needed here (from other experiments or the project). For instance, NEST currently has hooks into the G4 geometry/materials, and this cannot be handled cleanly right now. Storing energy depositions separately would also double up data, since energy depositions are stored in a slightly different way as part of the sim channels. Design and infrastructure work may be necessary to evolve or consolidate how the information is stored and how things are properly backtracked. Overall, this deserves a solid review if any significant changes are planned, since we would prefer to make a change like this once and have it be stable for some time.
- MicroBooNE spokespeople are concerned about giving credit to authors of code who contribute to LArSoft, and to their experiments. During the discussion, we also mentioned including the institution. LArSoft has a place to give author credit for algorithms and services on larsoft.org, as well as in the code itself. This is not being done enough, so we suggest a multi-pronged education campaign to encourage people to sign their name on the code they write and to contribute to the http://larsoft.org/doc-algorithms/ page. This will be highlighted at a LArSoft Coordination Meeting with follow-up, but it needs to be done by all experiments, not just LArSoft. One reason people should want to do this: they may want a job reference some day, and people need to be able to tell what they did.
- Code should contain five pieces of information every time: 1) author name, 2) experiment, 3) institution, 4) date, and 5) good information about what the code does, e.g., “Written by author X from experiment Y at institution Z.” It’s important not to imply that the code is for a given experiment; some people will assume that means it doesn’t apply to them.
- LArSoft can and should supply templates, but how do we ensure that people use them? If you use the template generators, you get a header with the information already defined. We should update those templates to include experiment and institution, and there should be skeletons for generic files such as algorithms, Python macros, and tools, not just art-specific code; an effort to update them would be good. One problem is that most people develop on their laptops, so they may not use the template generators. Still, we can take this on as an issue: create skeletons for a documentation section at the top of header files for module code, algorithms, and services; post these to Redmine; and make them part of the skeleton generators (which may need a better name). Perhaps the first step is to discuss this at a Coordination Meeting and assemble a team from multiple experiments to work on it.
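- A possible skeleton for such a documentation section, covering the five items listed above, might look like the following (the class, file, and author names here are placeholders, not an agreed-upon format):

```cpp
////////////////////////////////////////////////////////////////////////
// Class:       MyExampleAlg          (placeholder name)
// File:        MyExampleAlg.h
//
// Description: One or two sentences of good information about what
//              this code does, written so that readers from any
//              experiment can tell whether it applies to them.
//
// Author:      A. Author
// Experiment:  <experiment>
// Institution: <institution>
// Date:        June 2017
////////////////////////////////////////////////////////////////////////

#ifndef MYEXAMPLEALG_H
#define MYEXAMPLEALG_H

namespace example {

  // Placeholder algorithm class; the real skeletons would carry the
  // usual LArSoft algorithm interface.
  class MyExampleAlg {
  public:
    MyExampleAlg() = default;
  };

} // namespace example

#endif // MYEXAMPLEALG_H
```

The same comment block could be prepended by the skeleton generators to modules, algorithms, and services alike.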
- Also talked about automatic ways to determine who is doing work, such as a commit counter, and then doing something with that information, like prompting people to add something to larsoft.org.
SBND – Andrzej Szelc
- Very interested in Hans’ work, especially wavelength dependence and other things that change the propagation times. The effect would show up in the photon library, but Hans hasn’t said he has functions that would replace the photon library. They’re interested in what is going on and in anything that will speed things up. They are using the index of something now; the timing simulation takes that into account.
- A large part of the simulation is in; thanks to Erica and Gianluca for all the help with that. They had been taking the standard GEANT4 projections for granted but forgot to include them, so they are building a branch to put that in. They also uncovered a bug in GEANT4 timing. There is code in LArG4 that is copied verbatim from GEANT4: the charge loss is calculated, and the first step is wrong, so the timing can be slightly off, which affects the distribution. They have a hacky fix that checks for this and corrects the velocity of the first step if need be. A systematic fix would be better, but GEANT4 says it works for them; something in GEANT4 prevents it from being a bug for them, but it is for us, and we are moving to GEANT4 photon propagation.
- When they tried to do a demo using the event display, the mouse function didn’t work.
Issues from previous Offline Leads meetings to follow up on, and new issues from the 6/14/17 meeting.
- From May 2017: How to handle different interaction time hypotheses for particles in an event.
- Background: In ProtoDUNE, there can be four or more measurements of the time of an interaction: associated beam arrival time information, cosmic-ray counter hits, photon-detector hits, and a diffusion-based measurement. Some of these may have mis-associations and multiple candidate times reconstructed. This is probably just an analysis issue of what we want to do with these candidate times. The obvious thing is to use them to calibrate timing offsets in the detector, but there may be other uses, such as improving the reconstruction of particles broken across different volumes.
- 6/14/17 update – The attendees discussed possible infrastructure work related to how to handle different interaction times, with some clarification of terms. All agreed that “t0” is not a good name, but that’s what we have for it. This value needs to be attached to different things, so the current practice is to create associations between t0 objects as needed, and let experiment analyses deal with the details. Tom (who brought this up last month) agreed that attempting to centralize any further handling of this information within LArSoft is not needed. This issue can be closed.
- From May 2017: Feedback at a recent SBN analysis meeting on the proposed restructuring of the G4 simulation step was that these are potentially very important changes. Is it possible to see all of this in place this summer for ICARUS large-scale processing and potentially the MicroBooNE processing campaign? Along with the work itself, how one updates/does the backtracking remains an unanswered question: not technically difficult, but it could imply significant changes in downstream code depending on how it is done. This may take another person, or a few more people, working on it to really see it through.
- The project and schedule are being tracked in issue 14454 with expected completion in the middle of July. https://cdcvs.fnal.gov/redmine/issues/14454
- After 6/14/17 consult with Hans on the current status – The work in Geant4 has been completed, and LArG4 now needs to be restructured and adapted. This portion of the work is being tracked in the above issue. The new version of LArG4 will use the artg4 toolkit and will expose the energy deposition after particle tracing through the LAr as a new interface layer. The energy deposition will be performed via step limiting, eliminating the current use of voxelization within a parallel geometry. Modeling of ionization and photon statistics will continue to be performed via an interchangeable unit that is independent of the rest of the code. Charge and photon transport will be factored into new modeling layers, with the latter optionally performed by Geant4. Charge transport modeling will deliver arrival times and positions at a predetermined plane (e.g., a readout plane, a grid plane, etc.). The current thinking is that the details of how charge transport occurs within the anode region will be handled separately. An alternative is to allow Geant4 to deliver charge to each of the anode planes. This scheme provides for charge deposition within the anode region, but it would complicate adapting the simulation to different readout schemes (e.g., 90-degree strips as in dual-phase detectors, or readout pixels as proposed for the DUNE near detector). The photon transport modeling will deliver arrival times at photo-detectors after all wavelength shifting. This scheme allows for complete Geant4 modeling of photon transport, including all wavelength shifters (which are defined in the detector geometries), or alternatively, parameterized transport outcomes. The choice between the two options would be made via fhicl options (either disabling Geant4 photon transport and enabling the factored transport model, or vice versa). A modified backtracking scheme comes along with all of this.
- From 4/19/17 meeting: DUNE wants something added to hit to store things for dual phase. Robert Sulej is working on this with a dual-phase person. The additional hit parameters are used only in the event display. Having them inside hit would make the code simpler, but keeping the parameters separate is reasonable as well, since they are specific to dual phase. In the code branch being developed, there is a working solution that keeps hit parameters in a separate collection, with no need for Assns.
- 5/25/17 update – need to follow up to see if keeping the hit parameters in a separate collection works.
- 6/14/17 update – Concluded that the current scheme of using additional data products for DUNE is adequate.
- From 4/19/17 meeting: Presentation from student on CNN and adversarial network? According to Robert Sulej, it is possible to imagine making such a machine-learning-based filter for the detector/E-field response simulation, but that is future work. They are working now on the idea of providing a data-driven training set for CNN model preparation. They need time to understand the results before they can tell what the limitations of the tool are and are not.
- 5/25/17 update – This has been on the schedule for a Coordination Meeting and has been postponed at the request of the authors.
- 6/14/17 update – No update
- From 3/22/17 meeting: SBND has been trying to include files inside GDML and has run into problems. Gianluca has been helping debug this. A new version of ROOT may be a solution.
- 4/18/17 update: The version of ROOT used by LArSoft has changed a few times over the last few weeks. Andrzej Szelc said they haven’t looked into this because they were focused on getting the basic geometry in. Once that is done, they will look at this and hope the new ROOT version may have fixed it. So no ticket has been written yet (to ROOT or LArSoft), as it is not yet known whether this is still an issue.
- 5/25/17 update – still waiting.
- 6/14/17 update – Once SBND gets the latest version of ROOT in, they’ll see if this fixes the issue.
- From 6/14/17 meeting: MicroBooNE asked how to give credit to authors, experiments, and institutions for LArSoft code written. We have updated the recommendation on documenting code at http://larsoft.org/important-concepts-in-larsoft/design/. We will also present a proposed documentation template for header files at a LArSoft Coordination Meeting. LArSoft has a place to give author credit for algorithms and services on larsoft.org, as well as in the code itself. This is not being done enough, so we suggest a multi-pronged education campaign (see the MicroBooNE report above) to encourage people to sign their name on the code they write and to contribute to the http://larsoft.org/doc-algorithms/ page. This needs to be done by all experiments, not just LArSoft.
Please let us know if there are any corrections or comments to these meeting notes.
Katherine & Erica