
Try to port the algorithm based on Manchester coding and cross-correlation from OboeTester #3

Closed
gavv opened this issue May 25, 2020 · 4 comments
Labels: feature (New feature or request)

@gavv (Owner) commented May 25, 2020

Currently we use, roughly, the following algorithm for measuring latency (see the StrikeGenerator and LatencyEstimator classes; a sketch of the detection chain follows the list):

  • generate periodic strikes (loud beeps) and write them to the output device
  • read the looped back signal from the input device
  • calculate a running maximum over the signal (i.e. the maximum inside a sliding window)
  • pass the running maximum to a Schmitt trigger
  • when the trigger switches from off to on, we consider the strike (beep) received; the latency is then simply the delay between the moment the strike was played and the moment it was recorded
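For illustration, here is a minimal sketch of that detection chain (not the actual StrikeGenerator / LatencyEstimator code; class names, window size, and thresholds are made up):

```cpp
#include <cstddef>
#include <deque>

// Running maximum over a sliding window (monotonic deque, amortized O(1)).
class RunningMax {
public:
    explicit RunningMax(std::size_t window) : window_(window) {}

    float operator()(float x) {
        samples_.push_back(x);
        while (!maxima_.empty() && maxima_.back() < x) {
            maxima_.pop_back();
        }
        maxima_.push_back(x);
        if (samples_.size() > window_) {
            if (samples_.front() == maxima_.front()) {
                maxima_.pop_front();
            }
            samples_.pop_front();
        }
        return maxima_.front();
    }

private:
    const std::size_t window_;
    std::deque<float> samples_; // last window_ samples
    std::deque<float> maxima_;  // non-increasing candidates for the maximum
};

// Schmitt trigger: switches on above `hi`, off below `lo`; the hysteresis
// keeps one strike from producing more than one off->on transition.
class SchmittTrigger {
public:
    SchmittTrigger(float lo, float hi) : lo_(lo), hi_(hi) {}

    // Returns true only at the off->on transition.
    bool operator()(float x) {
        if (!on_ && x > hi_) {
            on_ = true;
            return true;
        }
        if (on_ && x < lo_) {
            on_ = false;
        }
        return false;
    }

private:
    const float lo_, hi_;
    bool on_ = false;
};

// Per recorded sample: if trigger(runMax(|sample|)) fires, then
// latency = current capture time - time the strike was written to output.
```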

Here is an illustration (from our README): https://github.com/gavv/signal-estimator/blob/master/example/plot_edited.png

The actual implementation is a bit more complicated, but close enough to the description above. This algorithm has some advantages:

  • it's simple to implement;
  • it works well even if the input signal is very quiet compared to the original signal, which is the case when the loopback being measured includes "in-air" transmission, e.g. when you play the signal through speakers and record it with a microphone.

(NB: I selected this algorithm after unsuccessfully trying out alsa_delay from the zita-alsa-pcmi package and test/latency.c from libasound.)

The disadvantage of this algorithm is that it's very sensitive to environmental noise: in practice, you need absolute silence in the room to make it work.

There is an Android tool called OboeTester that solves a similar task to signal-estimator, but in a more sophisticated way. It is Android (Oboe) specific.

It uses, roughly, the following algorithm (a sketch follows the list):

  • generate a random pulse (src)
  • pass it to a Manchester encoder (src)
  • round the edges of the encoded result using a half cosine (src)
  • play the result (a vector of size N) to the output device
  • read the looped back signal (a vector of size N + K) from the input device
  • calculate the normalized cross-correlation for K+1 pairs of vectors: output[0:N] with input[0:N], output[0:N] with input[1:N+1], ..., output[0:N] with input[K:N+K]; store the results into a vector of cross-correlation values (src)
  • find the index of the highest cross-correlation value
  • that index is the offset (in samples) between the output and input signals, i.e. our latency
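To make the steps concrete, here is a sketch of how I understand them (my reading, not OboeTester's actual code; all names and parameters below are invented):

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Random bit sequence for the probe (the "random pulse" above).
std::vector<int> randomBits(std::size_t count, unsigned seed) {
    std::mt19937 rng(seed);
    std::vector<int> bits(count);
    for (int& b : bits) {
        b = (int)(rng() & 1u);
    }
    return bits;
}

// Manchester encoding: every bit becomes a +1/-1 transition, so the
// signal has no DC component and a dense, well-defined structure.
std::vector<float> manchesterEncode(const std::vector<int>& bits, int halfBit) {
    std::vector<float> out;
    for (int bit : bits) {
        float level = bit ? 1.0f : -1.0f;
        out.insert(out.end(), halfBit, level);
        out.insert(out.end(), halfBit, -level);
    }
    return out;
}

// Replace each hard step with a half-cosine ramp to limit bandwidth.
std::vector<float> roundEdges(const std::vector<float>& in, int ramp) {
    const float pi = 3.14159265f;
    std::vector<float> out = in;
    for (std::size_t n = 1; n + ramp <= in.size(); n++) {
        if (in[n] == in[n - 1]) {
            continue;
        }
        for (int i = 0; i < ramp; i++) {
            float w = 0.5f * (1.0f - std::cos(pi * (i + 1) / (ramp + 1)));
            out[n + i] = in[n - 1] + (in[n] - in[n - 1]) * w;
        }
        n += ramp; // skip past the ramp we just wrote
    }
    return out;
}

// Normalized cross-correlation of the played signal against the recorded
// signal at the given lag; the result lies in [-1, 1].
float normXcorr(const std::vector<float>& played,
                const std::vector<float>& recorded, std::size_t lag) {
    double dot = 0, ep = 0, er = 0;
    for (std::size_t i = 0; i < played.size(); i++) {
        dot += played[i] * recorded[lag + i];
        ep += played[i] * played[i];
        er += recorded[lag + i] * recorded[lag + i];
    }
    return (float)(dot / (std::sqrt(ep * er) + 1e-12));
}

// Latency in samples = lag of the highest correlation peak.
std::size_t estimateLatency(const std::vector<float>& played,
                            const std::vector<float>& recorded) {
    std::size_t bestLag = 0;
    float best = -2.0f;
    for (std::size_t lag = 0; lag + played.size() <= recorded.size(); lag++) {
        float c = normXcorr(played, recorded, lag);
        if (c > best) {
            best = c;
            bestLag = lag;
        }
    }
    return bestLag;
}
```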

I hope my understanding of the algorithm is correct or at least close :)

This algorithm is claimed to be much more robust to environmental noise. It would be very interesting to try it out in signal-estimator, so that we can use it outside of Android.

To achieve this, we can rename the current LatencyEstimator to something like StrikeLatencyEstimator, and add a new implementation of IEstimator, say, ManchesterLatencyEstimator.

The random pulse generator, Manchester encoder, edge rounding, and cross-correlation computation should preferably be extracted into separate components and functions (as we already do with the running maximum and Schmitt trigger, for example); a possible split is sketched below.
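Purely to illustrate the intended granularity, the component boundaries could look roughly like this (hypothetical names and signatures, mirroring the sketch above):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical standalone components, by analogy with how the running
// maximum and Schmitt trigger are already separate pieces:
std::vector<int>   generateRandomBits(std::size_t count);
std::vector<float> manchesterEncode(const std::vector<int>& bits, int halfBit);
std::vector<float> roundEdges(const std::vector<float>& signal, int ramp);
std::vector<float> crossCorrelate(const std::vector<float>& played,
                                  const std::vector<float>& recorded,
                                  std::size_t maxLag);

// New IEstimator implementation (IEstimator is the project's existing
// interface) that wires the pieces above together and reports the lag
// of the correlation peak as the latency.
class ManchesterLatencyEstimator; // : public IEstimator
```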

BTW, OboeTester also has a GlitchAnalyzer, which solves the same task as our LossEstimator, but I haven't looked at it closely so far.

@gavv added the "feature" (New feature or request) and "help wanted" (Contributions are welcome) labels on May 25, 2020
@gavv removed the "hacktoberfest" label on Dec 3, 2020
@gavv removed the "help wanted" label on May 29, 2022
@baranovmv (Contributor) commented Jun 1, 2022

Hi @gavv

I have implemented in signal-estimator something similar to what you describe above (a sketch of the M-sequence generation follows the list):

  • Generate a probe impulse with a Python script:
    • M-sequence, 2048 bins,
    • up-convert to an 8 kHz carrier (assuming a 48 kHz sample rate),
    • upsample 4 times,
    • fade in and fade out with a Hamming window,
    • store the constant array of samples into a header (.h) file.
  • At run time, a generator plays these samples to the output device with some silence between impulses,
  • The estimator cross-correlates the input signal with the constant M-sequence, detects peaks above the noise floor, and prints the measured delay between the input and output signals.
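(The generation lives in a Python script per the description above; purely for illustration, and keeping to one language in this thread, here is a C++ sketch of the M-sequence part. The 11-bit register and taps are my assumption based on the "2048 bins" figure.)

```cpp
#include <cstdint>
#include <vector>

// M-sequence via an 11-bit Fibonacci LFSR with taps x^11 + x^9 + 1
// (a maximal polynomial, so the period is 2^11 - 1 = 2047 samples,
// which I take to be the "2048 bins" above). The flat autocorrelation
// of an M-sequence is what makes the cross-correlation peak sharp and
// easy to detect above the noise floor.
std::vector<float> generateMls() {
    std::uint16_t reg = 1; // any non-zero seed works
    std::vector<float> seq;
    seq.reserve(2047);
    for (int i = 0; i < 2047; i++) {
        seq.push_back((reg & 1u) ? 1.0f : -1.0f);
        std::uint16_t fb = ((reg >> 10) ^ (reg >> 8)) & 1u; // taps 11 and 9
        reg = (std::uint16_t)((reg >> 1) | (fb << 10));
    }
    return seq;
}
// The actual script would then modulate this onto the 8 kHz carrier,
// upsample by 4, apply the Hamming fade, and dump the samples to a .h file.
```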

@gavv (Owner, Author) commented Jun 1, 2022

Awesome

@gavv (Owner, Author) commented Feb 11, 2023

Landed!

@gavv closed this as completed on Feb 11, 2023