[Vetroc_daq_sbs] Understanding the trigger timing

Benjamin Raydo braydo at jlab.org
Wed Oct 5 14:08:56 EDT 2016


Hi Evan,

Your understanding sounds basically right, but I'll add a few words that may (or may not) make things clearer:
-the PMT and reference channel signals go into the VETROC, and are stored in a pipeline
-the VETROC streams PMT and reference channel hits (with 32ns timing resolution) to the GTP through the back-plane, and the GTP makes a trigger decision
-the trigger signal is sent out from the front of the GTP, passes through the VETROC (for LVDS->ECL conversion), and then goes into the TI
-after the TI receives the trigger signal, it blocks further triggers (only if in "ROC LOCK" mode; otherwise triggers continue to be accepted up to the configured buffer limit) and reads out the VETROC through the back-plane

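To put rough numbers on that path, here is a minimal Python sketch of where a reference hit would land in the readout window. The lookback, GTP decision time, cable delay, and TI latency below are placeholder values made up for illustration, not anything measured on this setup, and the window convention (window opens a fixed lookback before the trigger reaches the TI) is an assumption:

# Back-of-envelope model of where a reference hit lands in the readout window,
# following the path described above. All numbers are illustrative placeholders.

LOOKBACK_NS = 2000  # assumed trigger lookback (window offset) -- placeholder value

def hit_time_in_window(gtp_decision_ns, cable_ns, ti_latency_ns):
    """Hit time relative to the start of the readout window, assuming the
    window opens LOOKBACK_NS before the trigger arrives at the TI."""
    total_latency_ns = gtp_decision_ns + cable_ns + ti_latency_ns
    return LOOKBACK_NS - total_latency_ns

# Example: removing 20ns of trigger cable moves the hit by exactly 20ns.
before = hit_time_in_window(gtp_decision_ns=500, cable_ns=60, ti_latency_ns=100)
after = hit_time_in_window(gtp_decision_ns=500, cable_ns=40, ti_latency_ns=100)
print(f"before: {before} ns, after: {after} ns, shift: {after - before} ns")

The only point is that the hit position in the window is the lookback minus the total hit-to-trigger latency, so trimming the trigger cable by some amount should move the hits by that same amount.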
So the VETROC has 1ns timing resolution for the TDC/readout, but for the trigger it's currently 32ns. This means your trigger signal will have this jitter as well, and therefore the TDC hits in the readout window will show a similar spread. Have you looked at the timing distributions of this reference channel? A histogram of the raw times should be roughly rectangular with a width of 32ns (at least), so by shortening the cable you should see that entire distribution move forward in the window by whatever time you saved.

So I agree you should see the reference signal move by the amount you save in cable delay for the trigger signal going into the TI. Would you be able to share the cable length changes and the timing distributions for your reference channel (or you could send me links to the EVIO files for both runs and I could figure that out)? Alternatively, if you've already checked and are sure there is an issue and none of the comments above help, then I would definitely want to see the before/after EVIO files (and then look at the setup to run a test if that still doesn't make sense).
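If it helps, here is a rough Python sketch of the kind of check I have in mind. It assumes the reference-channel raw TDC times (in ns) have already been decoded out of the EVIO files into plain-text files, one value per line - the file names are just placeholders, not actual run files:

# Compare the reference-channel raw-time distributions for the long- and
# short-cable runs. Input files are hypothetical decoded TDC time lists.
import numpy as np
import matplotlib.pyplot as plt

long_cable = np.loadtxt("ref_times_long_cable.txt")    # placeholder file name
short_cable = np.loadtxt("ref_times_short_cable.txt")  # placeholder file name

# With a 32ns trigger granularity, each distribution should be roughly
# rectangular with a width of at least 32ns.
for label, times in [("long cable", long_cable), ("short cable", short_cable)]:
    print(f"{label}: mean = {times.mean():.1f} ns, "
          f"width (max-min) = {times.max() - times.min():.1f} ns")

# The whole distribution should shift by roughly the cable delay removed.
print(f"mean shift = {short_cable.mean() - long_cable.mean():.1f} ns")

plt.hist(long_cable, bins=64, histtype="step", label="long cable")
plt.hist(short_cable, bins=64, histtype="step", label="short cable")
plt.xlabel("reference-channel raw time [ns]")
plt.ylabel("hits")
plt.legend()
plt.show()

If the mean shift comes out close to the cable delay you removed and each distribution is at least ~32ns wide, that would match the picture above.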

Ben

----- Original Message -----
From: "R. Evan McClellan" <randallm at jlab.org>
To: "Benjamin Raydo" <braydo at jlab.org>
Cc: "Vetroc_daq_sbs" <vetroc_daq_sbs at jlab.org>
Sent: Wednesday, October 5, 2016 1:36:34 PM
Subject: Understanding the trigger timing

Hey Ben,

We'd like to clarify our understanding of the behavior of the trigger signal in the current GRINCH DAQ prototype setup.

Here is my understanding:
-the PMT and reference channel signals go into the VETROC, and are stored in a pipeline
-the GTP reads the VETROC pipeline memory directly, through the back-plane, and makes a trigger decision
-the trigger signal is sent out from the front of the GTP, passes through the VETROC (for LVDS->ECL conversion), and then goes into the TI
-after the TI receives the trigger signal, it stops and reads out the VETROC through the back-plane

We did a quick test to check this understanding. We significantly reduced the length of the cable carrying the trigger signal from the GTP to the VETROC.
We expected to see the timing of the reference channel TDC hits shift (due to the TI stopping the VETROC TDC earlier). However, the timing remained exactly the same.

Where is the mistake in our understanding?

Thanks!
Evan


R. Evan McClellan, PhD
Hall A Postdoctoral Fellow
Jefferson Lab


