Currently we use, roughly, the following algorithm for measuring the latency (see StrikeGenerator and LatencyEstimator classes): periodically play a strike (beep) to the output device, record the signal from the input device, and pass it through a running maximum and a Schmitt trigger; when the trigger switches from off to on, we consider that the strike (beep) was received; now we just calculate the delay between the moment when the strike was played and the current time, i.e. the moment when the strike was recorded - and that is our latency.
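For illustration, here is a minimal self-contained sketch of this idea (not the actual signal-estimator code; the thresholds, timestamps, and the fake recorded signal are all assumptions):

```cpp
// Sketch only: detect the strike with a two-threshold (Schmitt) trigger and
// report the delay between the strike timestamp and the detection point.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;
    const double sample_rate = 48000.0;
    const double strike_time = 0.100;        // when the strike was played (s); assumed
    const double upper = 0.3, lower = 0.1;   // Schmitt trigger thresholds; assumed

    // Fake recorded signal: silence, then an attenuated 1 kHz strike starting at ~140 ms.
    std::vector<double> recorded(24000, 0.0);
    for (size_t i = 6720; i < 6720 + 480; i++) {
        recorded[i] = 0.5 * std::sin(2 * kPi * 1000.0 * i / sample_rate);
    }

    bool state = false;
    for (size_t i = 0; i < recorded.size(); i++) {
        const double level = std::fabs(recorded[i]); // stands in for the running maximum
        if (!state && level > upper) {
            state = true; // off -> on: the strike was received
            const double latency_ms = (i / sample_rate - strike_time) * 1000.0;
            std::printf("latency: %.1f ms\n", latency_ms);
            break;
        }
        if (state && level < lower) {
            state = false; // on -> off: wait for the next strike
        }
    }
    return 0;
}
```

The two thresholds give the trigger hysteresis, so small ripples around a single threshold don't produce repeated on/off flips.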
Here is an illustration (from our README): https://github.com/gavv/signal-estimator/blob/master/example/plot_edited.png
The actual implementation is a bit more complicated, but it is close enough to this. This algorithm has some advantages:
it's simple to implement;
it works well even if the input signal is very quiet compared to the original signal, which is the case when the loopback being measured includes "in-air" transmission, e.g. you just play the signal to speakers and record it from headphones.
(NB: I selected this algorithm after unsuccessfully trying out alsa_delay from the zita-alsa-pcmi package and test/latency.c from libasound).
The disadvantage of this algorithm is that it's very sensitive to environment noise. In fact, you need absolute silence in the room to make it work.
There is a tool for Android called OboeTester that solves a similar task to signal-estimator, but in a more sophisticated way. It is Android (Oboe) specific.
It uses, roughly, the following algorithm:
generate random pulses and encode them with a Manchester encoder
round the edges of the encoded result using a half cosine (src)
play the result (a vector of size N) to the output device
read the looped back signal (a vector of size N + K) from the input device
calculate the normalized cross-correlation for K pairs of vectors: (output[0:N], input[0:N]), (output[0:N], input[1:N+1]), ..., (output[0:N], input[K:N+K]); store the results into a vector of cross-correlation values (src)
find the index of the highest cross-correlation value
the index found defines the offset between the input and output signal, i.e. our latency
I hope my understanding of the algorithm is correct or at least close :)
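To make it concrete, here is a minimal self-contained sketch of that understanding (not OboeTester's code; the signal sizes, the simulated loopback, and the noise level are arbitrary assumptions):

```cpp
// Sketch only: Manchester-encode random bits, round the edges with a short
// half-cosine kernel, simulate a delayed noisy loopback, and recover the delay
// as the index of the highest normalized cross-correlation value.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;
    std::mt19937 rng(42);

    // 1. Random bits, Manchester-encoded: each bit becomes a high/low or low/high pair.
    const int num_bits = 64, half_bit = 32;                // assumed sizes
    std::uniform_int_distribution<int> coin(0, 1);
    std::vector<double> out;
    for (int b = 0; b < num_bits; b++) {
        const double first = coin(rng) ? 1.0 : -1.0;
        out.insert(out.end(), half_bit, first);
        out.insert(out.end(), half_bit, -first);
    }

    // 2. Round the sharp edges by smoothing with a short half-cosine kernel.
    const int ramp = 16;                                   // assumed ramp length
    std::vector<double> kernel(ramp);
    double ksum = 0;
    for (int k = 0; k < ramp; k++) {
        kernel[k] = std::sin(kPi * (k + 0.5) / ramp);      // half a cosine period
        ksum += kernel[k];
    }
    const size_t N = out.size();
    std::vector<double> shaped(N, 0.0);
    for (size_t i = 0; i < N; i++) {
        double acc = 0;
        for (int k = 0; k < ramp && (size_t)k <= i; k++) acc += out[i - k] * kernel[k];
        shaped[i] = acc / ksum;
    }

    // 3. Simulated loopback: the "recorded" signal is N + K samples long and contains
    //    an attenuated copy of the output delayed by true_delay samples, plus noise.
    const size_t K = 2000, true_delay = 1234;              // assumed
    std::normal_distribution<double> noise(0.0, 0.05);
    std::vector<double> in(N + K, 0.0);
    for (size_t i = 0; i < N + K; i++) in[i] = noise(rng);
    for (size_t i = 0; i < N; i++) in[i + true_delay] += 0.2 * shaped[i];

    // 4. Normalized cross-correlation of output[0:N] against input[k:k+N] for k = 0..K;
    //    the index of the highest value is the estimated delay.
    double out_energy = 0;
    for (size_t i = 0; i < N; i++) out_energy += shaped[i] * shaped[i];

    size_t best_k = 0;
    double best_corr = -1.0;
    for (size_t k = 0; k <= K; k++) {
        double dot = 0, in_energy = 0;
        for (size_t i = 0; i < N; i++) {
            dot += shaped[i] * in[k + i];
            in_energy += in[k + i] * in[k + i];
        }
        const double corr = dot / std::sqrt(out_energy * in_energy + 1e-12);
        if (corr > best_corr) {
            best_corr = corr;
            best_k = k;
        }
    }
    std::printf("estimated delay: %zu samples (true: %zu)\n", best_k, true_delay);
    return 0;
}
```

The random Manchester-encoded pattern has a sharp autocorrelation peak, which is why the maximum still stands out when the looped-back copy is attenuated and buried in noise.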
This algorithm is claimed to be much more robust to environment noise. It would be very interesting to try it out in signal-estimator, so that we can use it outside of Android.
To achieve this, we can rename the current LatencyEstimator to something like StrikeLatencyEstimator, and add a new implementation of IEstimator, say, ManchesterLatencyEstimator.
The random pulse generator, Manchester encoder, edge rounding, and cross-correlation computation should preferably be extracted into separate components and functions (like we do with the running maximum and Schmitt trigger, for example).
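To sketch the shape of that refactoring (purely hypothetical: I haven't checked what the actual IEstimator interface looks like, and all names and signatures below are assumptions):

```cpp
// Hypothetical sketch only: the real IEstimator interface and the component set
// in signal-estimator may look different.
#include <vector>

struct IEstimator {
    virtual ~IEstimator() = default;
    // assumed shape: feed matching frames of played (output) and recorded (input) samples
    virtual void process(const std::vector<float>& output_frame,
                         const std::vector<float>& input_frame) = 0;
};

class StrikeLatencyEstimator : public IEstimator {   // current LatencyEstimator, renamed
public:
    void process(const std::vector<float>& output_frame,
                 const std::vector<float>& input_frame) override {
        // existing running-maximum + Schmitt-trigger logic would live here
    }
};

class ManchesterLatencyEstimator : public IEstimator {
public:
    void process(const std::vector<float>& output_frame,
                 const std::vector<float>& input_frame) override {
        // would reuse the extracted Manchester encoder, edge rounding,
        // and cross-correlation components
    }
};
```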
BTW, OboeTester also has a GlitchAnalyzer, solving the same task as our LossEstimator, but I haven't looked at it closely so far.
I have implemented in signal-estimator something similar to what you describe above:
Generate a probe impulse with a Python script:
M-sequence, 2048 bins,
up-convert to 8 kHz (assuming a 48 kHz sample rate),
upsample 4 times,
fade in and fade out with a Hamming window,
store the constant array of samples into an .h file.
At run time, a generator plays these samples to the output device with some silence in between impulses,
and the estimator cross-correlates the input signal with the constant M-sequence, detects peaks above the noise floor, and prints the measured delay between the input and output signals.
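For illustration, here is a rough offline-generation sketch of such a probe (the description above uses a Python script; the LFSR taps, fade length, and exact order of operations here are my assumptions):

```cpp
// Sketch only: generate an M-sequence probe, upsample, up-convert to an 8 kHz
// carrier at 48 kHz, apply Hamming fades, and dump it as a constant array.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;

    // 1. M-sequence from an 11-bit maximal-length LFSR (taps 11,9).
    //    M-sequence length is 2^n - 1, so this gives 2047 chips (close to the 2048 above).
    std::vector<double> chips;
    uint16_t lfsr = 0x7FF;                        // any non-zero seed
    for (int i = 0; i < 2047; i++) {
        const int bit = ((lfsr >> 10) ^ (lfsr >> 8)) & 1;
        chips.push_back(bit ? 1.0 : -1.0);        // map bits to +/-1
        lfsr = (uint16_t)(((lfsr << 1) | bit) & 0x7FF);
    }

    // 2. Upsample 4x (zero-order hold) and up-convert to an 8 kHz carrier at 48 kHz.
    const double sample_rate = 48000.0, carrier = 8000.0;
    std::vector<double> probe;
    for (size_t i = 0; i < chips.size(); i++) {
        for (int r = 0; r < 4; r++) {
            const size_t n = probe.size();
            probe.push_back(chips[i] * std::sin(2 * kPi * carrier * n / sample_rate));
        }
    }

    // 3. Fade in / fade out with half of a Hamming window to avoid clicks.
    const size_t fade = 480;                      // 10 ms; assumed
    for (size_t i = 0; i < fade; i++) {
        const double w = 0.54 - 0.46 * std::cos(kPi * i / (fade - 1));  // rising half
        probe[i] *= w;
        probe[probe.size() - 1 - i] *= w;
    }

    // 4. Dump as a constant array suitable for a .h file.
    std::printf("static const float probe_samples[%zu] = {", probe.size());
    for (size_t i = 0; i < probe.size(); i++) {
        std::printf("%s%.6ff,", (i % 8 == 0) ? "\n    " : " ", probe[i]);
    }
    std::printf("\n};\n");
    return 0;
}
```

Detection would then be the same normalized cross-correlation peak search as in the earlier sketch, run against this known probe.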