DUNE-DAQ
DUNE Trigger and Data Acquisition software
emulate_from_tpstream.cxx (and the corresponding application trgtools_emulate_from_tpstream) processes timeslice HDF5 files that contain TriggerPrimitives and creates a new HDF5 file that includes TriggerActivities and TriggerCandidates.
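For a quick sanity check of the produced file, its structure can be listed with h5py. This is a minimal, generic sketch (dump_structure is not part of trgtools); the actual group layout follows the DUNE-DAQ raw-data HDF5 format, and the printed paths will include the TriggerActivity and TriggerCandidate fragments alongside the original TriggerPrimitives.

```python
# Minimal sketch: list every group and dataset in the emulator's output HDF5.
import sys
import h5py

def dump_structure(path: str) -> None:
    with h5py.File(path, "r") as f:
        f.visit(print)  # prints the full path of every group and dataset

if __name__ == "__main__":
    dump_structure(sys.argv[1])
```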
The primary use of this is to test TA algorithms, TC algorithms, and their configurations, with output diagnostics available from ta_dump.py and tc_dump.py. Full TC displays, including all TAs and the TPs within those TAs, can be made with the plot_emulated_triggers.py script.
The application understands that there may be multiple sources of trigger primitives: different files might contain TPs from different sources (e.g. two APAs), or different files might contain TPs from the same sources but over different time periods (e.g. two consecutive files from the same APAs).
The application first checks whether TPs are available from all sources for the specified slices, and crops the requested timeslice ranges as appropriate.
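As an illustration of that cropping step, the sketch below intersects the per-source slice ranges with a requested range. The names and data layout here are hypothetical, not the application's actual C++.

```python
# Illustration only: crop a requested slice range to the slices for which
# every TP source actually has data. All names here are hypothetical.
def crop_requested_range(requested, per_source_ranges):
    """requested and each per-source range are (first_slice, last_slice), inclusive."""
    lo, hi = requested
    for src_lo, src_hi in per_source_ranges.values():
        lo = max(lo, src_lo)
        hi = min(hi, src_hi)
    if lo > hi:
        raise ValueError("No slice range common to all TP sources")
    return lo, hi

# e.g. two APAs covering slightly different slice ranges
print(crop_requested_range((0, 100), {"apa1": (0, 80), "apa2": (5, 120)}))  # -> (5, 80)
```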
WARNING: This application is different from process_tpstream and does not contain all the functionality yet. See the TODOs below for more info.
An example algo_config.json file, with an explanation, is provided HERE.
You can run the emulator with --parallel, which will process not only each TPWriter separately, but each TAMaker too. This is currently only worth trying with a slow algorithm, like DBSCAN; otherwise it can be marginally slower.
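Conceptually, the fan-out is one task per (source, TA algorithm) pair, along the lines of the Python sketch below. The real work happens in C++ inside the application; make_tas here is a placeholder rather than a real trgtools function.

```python
# Conceptual sketch of the --parallel fan-out: one task per (source, TAMaker).
from concurrent.futures import ProcessPoolExecutor

def make_tas(source_id, algo_name, tps):
    # Placeholder: run one TA algorithm over one source's TPs.
    return (source_id, algo_name, len(tps))

def run_parallel(tps_by_source, algo_names):
    with ProcessPoolExecutor() as pool:
        futures = [
            pool.submit(make_tas, src, algo, tps)
            for src, tps in tps_by_source.items()
            for algo in algo_names
        ]
        return [f.result() for f in futures]
```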
TODOs:
- ... convert-tplatencies does, and that needs to be corrected.
- The application currently matches SliceIDs across different files and converges on a SliceID "window" to get the TPs from. Although rare, it is possible to have matching SliceIDs with mismatching times; we need to go off actual TP times rather than SliceIDs (see the sketch below).
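The fix proposed in the last item amounts to selecting TPs by their actual timestamps rather than by SliceID. A hedged sketch of that selection, assuming each TP carries a time_start field:

```python
# Sketch of the proposed fix: pick TPs by timestamp window rather than by SliceID.
def tps_in_time_window(tps, t_begin, t_end):
    """Return the TPs whose start time falls inside [t_begin, t_end)."""
    return [tp for tp in tps if t_begin <= tp["time_start"] < t_end]
```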