Accelerator: A device that accelerates charged particles (such as electrons, protons, and atomic nuclei) to high velocities, thus giving them high kinetic energies.
ADC: Analog-to-Digital Converter.
Alignment: The geodesy, surveying and alignment procedures that support the construction and installation of the experiment and achieve the required precise global and local positioning of beamline and experiment components.
Associated metadata: Ancillary information about the beam or non-beam data, including attributes such as storage location and calibration information.
Auxiliary Detector: A detector in the experiment other than the LArTPC itself – e.g. beam vetoes or cosmic ray taggers.
Background radiation: Radiation present in the environment from cosmic sources, naturally occurring radioactive materials, and global fallout.
Beam Instrumentation: Instrumentation, equipment and diagnostics systems for the beam and beamlines.
Beamline: The set of magnets, vacuum pipes and other elements that transport accelerated beams from an accelerator. A beamline is used, for instance, to direct accelerated protons onto a target for the purpose of generating neutrinos.
Calibrations: the process, or the data required, to remove instrumental effects from measurements.
Cloud computing: distributed computing technology with emphasis on network connectivity, virtualization and elasticity of resource allocation.
Content Management System (CMS): a computer program that allows content to be published, edited, modified and maintained from a central interface.
Continuous Integration: a software development practice that entails frequent updates of the code in the common repository, combined with running validation tests in an ongoing and coordinated manner.
Coulomb scattering: Elastic scattering of charged particles via the Coulomb interaction, the law of physics describing the electrostatic interaction between electrically charged particles.
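For reference, the classic non-relativistic Rutherford form of the Coulomb differential cross section for a particle of charge ze and kinetic energy E scattering off a nucleus of charge Ze (Gaussian units):

\[
\frac{d\sigma}{d\Omega} = \left(\frac{zZe^{2}}{4E}\right)^{2}\frac{1}{\sin^{4}(\theta/2)}
\]

The steep \(\sin^{-4}(\theta/2)\) dependence is why most Coulomb scatters occur at small angles.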
Cryogenics: The branches of physics and engineering that involve the study of very low temperatures, how to produce them, and how materials behave at those temperatures.
Dataset: a collection of files (potentially in differing formats) and corresponding metadata (uniform across the set) that forms a coherent unit of data used in a computation, and is accounted for as such.
Design constraints: Restrictions on how the product must be designed. For example, it might have to run on the handheld device given to major customers, or it might have to use the existing servers and desktop computers, or some other established hardware, software, or business practice.
Design requirements: define all of the components necessary to achieve the system requirements: “The alarm will be produced by part # 123-45-678.”
Detector: A particle detector is any device used to sense the passage of atomic or subatomic particles or to measure their properties.
Dual-phase Time Projection Chamber: A configuration of a LArTPC that contains a small region of gaseous argon above a larger region of liquid argon. Ionization electrons drift to the surface of the liquid and are extracted into the gas phase, where they are amplified and/or used to produce secondary scintillation photons that are detected.
Electron neutrino: The electron-flavor neutrino. Neutrinos are elementary particles that exist in three different types or flavors; they are uncharged, non-ionizing and only rarely interact with ordinary matter.
Electron volt: A unit of energy equal to the kinetic energy (or energy of motion) an electron gains when accelerated through a potential difference of 1 volt.
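Since the elementary charge is \(e = 1.602\,176\,634\times10^{-19}\) C (exact in the SI), a charge q accelerated through a potential difference V gains energy

\[
E = qV, \qquad 1\,\mathrm{eV} = e \times 1\,\mathrm{V} \approx 1.602\times10^{-19}\,\mathrm{J}.
\]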
Electronics/Readout: the circuits used to amplify or otherwise process raw signals from a detector, plus those that convert and present that data in a digital form for storage.
External Software: Software on which the Offline Software relies. Also referred to as externals or third party software.
Exotics: Particles hypothesized in some areas of modern physics, whose predicted properties are extremely unusual. As examples: a tachyon is a hypothetical particle that always travels faster than light; supersymmetry predicts a set of significantly heavier partners of the known particles (among the proposed candidates for dark matter).
Field cage: The structure that surrounds the entire detector volume between the APAs and the CPAs, and provides a uniform electric field within the detector volume.
File catalog: a general term for a system that performs a range of mapping functions, such as mapping a Logical File Name (LFN) to one or more physical locations of the file in distributed data management systems. This functionality is essential for locating physical copies of the data when needed, optimal matching of distributed data to available computing resources, storage accounting and various other aspects of distributed data management. A file catalog may also incorporate functionality related to metadata (see Metadata).
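A minimal sketch of the core LFN-to-replica mapping, with hypothetical file names and sites (not any real catalog's API):

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
    // LFN -> physical replicas; one logical name may map to many copies.
    std::multimap<std::string, std::string> catalog;
    catalog.emplace("/dune/raw/run001.root", "srm://site-a.example.org/store/run001.root");
    catalog.emplace("/dune/raw/run001.root", "srm://site-b.example.org/cache/run001.root");

    // Look up every physical replica of one logical file name.
    auto [first, last] = catalog.equal_range("/dune/raw/run001.root");
    for (auto it = first; it != last; ++it)
        std::cout << it->first << " -> " << it->second << '\n';
}
```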
Framework: an abstraction in which software providing generic functionality can be selectively changed by additional user-written code, thus providing application-specific software without modification of the underlying framework.
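A minimal C++ sketch of the idea (illustrative names only, not the art framework's actual API): the framework owns the event loop and calls user-written modules through a generic interface.

```cpp
#include <iostream>
#include <memory>
#include <vector>

struct Event { int id; };

class Module {                       // the framework's extension point
public:
    virtual ~Module() = default;
    virtual void process(const Event& e) = 0;
};

class PrintModule : public Module {  // user-written code plugged into the
public:                              // unchanged framework
    void process(const Event& e) override {
        std::cout << "processing event " << e.id << '\n';
    }
};

int main() {
    std::vector<std::unique_ptr<Module>> modules;
    modules.push_back(std::make_unique<PrintModule>());
    for (int id = 0; id < 3; ++id)   // framework-owned event loop
        for (auto& m : modules) m->process(Event{id});
}
```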
Functional requirements: describe what a product has to do or what processing actions it is to take.
Geometry description: a collection of information sufficient for creation of the geometry model of the detector, for the purposes of simulation, reconstruction, or both.
Geometry model: a collection of data structures plus the code to manipulate these structures, which together provide the functionality required by the application (simulation, etc.; cf. the geometry model in Geant4).
Graphics processing units (GPUs): processors originally developed for accelerating graphics rendering; they can dramatically speed up computational processes such as deep learning.
GPU as a Service (GPUaaS): Accelerator hardware such as graphics processing units is purpose-built for deep-learning-like applications, but integrating it into workflows has been a challenge. LArSoft tools provide a practical way to deploy GPUs “as a service.”
Grid computing: technology that allows a collection of computer resources from multiple locations to reach a common goal.
Half-life: The time during which half the (large number of) atoms of a particular radionuclide disintegrate.
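Quantitatively, for an initial population \(N_0\) of atoms and decay constant \(\lambda\):

\[
N(t) = N_0\,2^{-t/T_{1/2}} = N_0\,e^{-\lambda t}, \qquad T_{1/2} = \frac{\ln 2}{\lambda}.
\]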
Hit-finding and hit-object: The method and information that capture the charge-deposition data from a single particle on a single LArTPC wire.
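An illustrative sketch of the kind of attributes a hit-object carries (a simplified stand-in, not LArSoft's actual recob::Hit class):

```cpp
// One reconstructed pulse on one wire: where, when, and how much charge.
struct Hit {
    unsigned int channel;   // readout channel (wire) that saw the pulse
    float        peakTime;  // pulse peak time, in TDC ticks
    float        width;     // pulse width, in ticks
    float        integral;  // integrated charge, in ADC counts
};
```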
Horn: The neutrino beam horn is a horn-shaped part of the beam pipe that produces a funnel-like magnetic field, which directs dispersed charged pions and kaons straight ahead, thereby intensifying the resulting neutrino beam. By switching the direction of the magnetic field (by reversing the electric current in the horn), the oppositely charged particles are focused. The field direction thus determines whether the beam contains primarily neutrinos or anti-neutrinos.
Indirect dark matter detection: Detection of by-products of dark matter interactions rather than of the dark matter “particle” itself.
Issue Tracking System: a computer software package that manages and maintains lists of issues, as needed by an organization. An issue tracking system often also contains a knowledge base with information on each customer, resolutions to common problems, and other such data.
Kaon: A kaon (also called K-meson) is any one of a group of four mesons distinguished by the fact that they carry a quantum number called strangeness.
Kilowatt: A thousand watts.
LArSoft: A common toolkit for contributing and sharing physics code among liquid argon TPC experiments. It includes the build and release of core executables based on the art framework, and support for each experiment to build and release its own executables.
LArSoft Requirements: the record, kept in a common place, of the requirements for capabilities, performance, and solutions based on LArSoft software, categorized by near term (<1 year), medium term (<3 years) and long term (<5 years). As an example, a new requirement for the program might be the capability to apply multiple algorithms for any output result, including at the granularity of a single event.
LArSoft Steering Group: an experiment spokesperson-led initiative to drive, define and own LArSoft activities. The group is a collaboration within the context of the experiments, university physics and computing groups, the Fermilab Neutrino Platform and supporting computing organizations.
Lattice Optics: the optics associated with beams moving through an accelerator “lattice”, i.e. the arrangement of magnets used to focus, steer and stabilize the beam.
Liquid Argon Time-Projection Chamber (LAr-TPC): The type of neutrino detector planned for LBNF/DUNE. The detector consists of a chamber filled with liquid argon and a network of wire planes. The detection method is based on the collection, onto wire planes immersed in the fluid, of the ionization electrons that result from interactions between the neutrinos and the liquid argon.
Metadata: data that describes other data.
Merge: the collection of information output from sub-components of the processing into one entity or event.
Monitoring: in this context, a system that stores the state of individual jobs, groups of jobs, data transmission and other objects vital to Workload Management, while providing appropriate interfaces to end users and operators of a WMS.
Muon: The muon is a fundamental particle that is part of the Standard Model of particle physics. It is an unstable subatomic particle of the same class as an electron (a lepton), but with a mass around 200 times greater.
Muon neutrino: The muon-flavor neutrino. Neutrinos are elementary particles that exist in three different types or flavors; they are uncharged, non-ionizing and only rarely interact with ordinary matter.
Near detector: In neutrino experiments, scientists detect a small fraction of the generated neutrinos in a near detector as well as in much larger far detectors, looking for signals that the neutrinos are changing from one type to another on their trip. The same neutrino cannot be detected in both detectors, so statistics over many interactions are used to determine what is happening.
Nonfunctional requirements: the properties of a product, such as performance and usability. Do not be deterred by the unfortunate type name (it is the most common way of referring to these types of requirements); these requirements are as important as the functional requirements for a product’s success.
Objective: experiment objective (goal).
Offline Leads: The Software and Computing coordinators of the experiments that use LArSoft.
Offline Software: Simulation, reconstruction, analysis and production software which is written for the collaboration.
Photon Detector: a single response element for photons.
Pion: A pion (abbreviation for pi meson) is the collective name for three subatomic particles: π0, π+ and π−. Pions are the lightest mesons and play an important role in explaining low-energy properties of the strong nuclear force.
PM10: Particulate matter having a median aerodynamic diameter less than 10 micrometers.
Primary Beam: The primary proton beam line and its necessary supporting systems.
Processed Data: any data produced by production software (see Production Software) for the collaboration as a whole or to satisfy the needs of working groups. This includes simulations, data reduction/reconstruction, data skimming and inputs to final analyses. This type of data does not include data samples produced by individual users using software that has not been certified as production software.
Production Software: the suite of software run to produce an official collaboration result. One example of such software is software used to produce data appearing in a publication. Such software is subject to strict version control, QA and validation, and is utilized in a managed fashion.
Prompt radiation: radiation produced by an accelerated beam or through interaction of the beam with matter.
Proton: One of the basic particles that make up an atom. The proton is found in the nucleus and has a positive electrical charge equal in magnitude to the negative charge of an electron, and a mass similar to that of a neutron. A proton is the nucleus of a hydrogen atom.
Radiation: The emitted particles (alpha, beta, neutrons) or photons (X-rays, gamma rays) from the nuclei of unstable (radioactive) atoms as a result of radioactive decay. Some elements are naturally radioactive; others are induced to become radioactive by bombardment in a nuclear reactor or other particle accelerator.
Radioactive decay: The change of one radionuclide into a different radionuclide by the spontaneous emission of radiation such as alpha, beta, or gamma rays, or by electron capture. Each decay process has a definite half-life.
Random Number Generator (RNG) seed: The number (or vector) used to initialize software generation of a sequence of numbers or symbols that cannot be reasonably predicted better than by random chance.
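Recording the seed makes a simulation reproducible, as in this minimal C++ sketch (the seed value is arbitrary):

```cpp
#include <iostream>
#include <random>

int main() {
    std::mt19937 gen(20240101u);                    // fixed seed recorded with the job
    std::uniform_real_distribution<double> u(0.0, 1.0);
    for (int i = 0; i < 3; ++i)
        std::cout << u(gen) << '\n';                // identical sequence on every run
}
```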
Raw Data: Data saved by a detector (far, near or prototype detector) DAQ or monitoring systems, and information from FNAL beam monitoring.
Raw Event, Raw DAQ event: Information as acquired from the detector (before any processing/transformations are applied) that is treated as a set for the purposes of calculating physics quantities.
Re-constructable DAQ event: A DAQ event is not re-constructable if the information available does not allow physics quantities to be calculated due to noise, missing information or other problems.
Requirement: A statement of what each system needs to do to meet the (prioritized) objectives. Anything tagged as a requirement for a given subsystem or component is to be fulfilled by that same subsystem or component.
Revision control: (also known as version control and source control) is the management of changes to documents, computer programs, web sites, and other collections of information.
Risk: The product of the probability of occurrence of an event or activity and the impacts resulting from that event or activity.
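Schematically, with an illustrative (made-up) numerical example:

\[
\text{Risk} = P(\text{event}) \times \text{Impact}
\]

e.g. a 10% chance of an event that would cost 5 weeks of schedule carries a risk of 0.5 weeks.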
Scenario: Description of someone or something interacting with a system, explained as a series of steps.
Shielding: A protective barrier, usually a dense material that reduces the passage of radiation from radioactive materials to the surroundings by absorbing it.
Side-bands: Regions of the data outside the region of interest, used to extrapolate or interpolate into the region of interest.
Source: A radioactive material that produces radiation for experimental or industrial use.
Specification: A value or range of values for a design choice or parameter, with units as needed, that meet the associated requirement(s); the chosen parameter value will need to be consistent with it.
Steering Group: the spokespeople of the experiments that use LArSoft as well as representatives from Fermilab’s Scientific Computing and Neutrino Divisions.
Stopping particles: Particles that go no further in the detection system.
Supported Platform: A computing platform (OS, compiler and CPU architecture) on which the software is agreed to be supported.
System requirements: define what the system must do to satisfy the users: “The alarm will produce a sound between 125 and 155 dBA.”
Tau neutrino: The tau-flavor neutrino. Neutrinos are elementary particles that exist in three different types or flavors; they are uncharged, non-ionizing and only rarely interact with ordinary matter.
Test Harness: a system supporting the creation and execution of unit tests.
Unit Test: a method by which individual units of source code (sets of one or more computer program modules together with associated control data, usage procedures, and operating procedures) are tested to determine whether they are fit for use.
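A minimal sketch without any test framework, using plain assertions (the function and values are invented for illustration):

```cpp
#include <cassert>

// The unit under test: a trivial function.
int clampPositive(int x) { return x < 0 ? 0 : x; }

int main() {
    assert(clampPositive(5)  == 5);   // positive values pass through
    assert(clampPositive(-3) == 0);   // negatives are clamped to zero
    assert(clampPositive(0)  == 0);   // boundary case
    return 0;                         // reaching here means all checks passed
}
```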
User Friendly: The ability to make use of something with only a reasonably minimal amount of effort by the intended qualified user.
User requirements: define the results the users expect from the system: “The homeowner shall hear an alarm when smoke is detected.”
Validation: quality control which may include validation of physics functionality.
Validation criteria: a set of characteristics and/or parameters the objective or goal must comply with in order to be declared valid (i.e. passed validation).
Voxel: A software-configured and -defined 3D volume element of the detector, used to ease the treatment of very large volumes in simulation.
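A sketch of the idea: mapping a 3D point to integer voxel indices on a regular grid (the names and the default 1 cm pitch are assumptions for illustration):

```cpp
#include <array>
#include <cmath>

// Map a point (x, y, z), in cm, to the indices of the cubic voxel
// of side `pitch` that contains it.
std::array<int, 3> voxelIndex(double x, double y, double z,
                              double pitch = 1.0) {
    return { static_cast<int>(std::floor(x / pitch)),
             static_cast<int>(std::floor(y / pitch)),
             static_cast<int>(std::floor(z / pitch)) };
}
```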
Workload Management System (WMS): a system that enables automated placement of computational payload jobs submitted by its users on distributed resources, using the underlying Grid layer, and makes subsequent record keeping, accounting, elements of data management and general monitoring available to the user.