Remove upside-down tags #82

Open
ayberkozgur opened this issue Dec 11, 2014 · 7 comments

@ayberkozgur
Member

We should add an option to remove upside-down tags, which is probably the ultimate solution to the tag-flipping issue. This could be achieved in one of two ways:

  1. Discard tags whose z axes make a negative angle with the plane perpendicular to the camera y axis (in other words, the plane defined by the camera z and x axes).
  2. Discard tags whose z axes make a negative angle with the floor plane (or, e.g., less than -45 degrees). The latest sensor fusion code in the kalman branch provides very stable and reliable device orientation w.r.t. the world, and we can use this on mobile devices with IMUs. (A rough sketch of this angle test follows the list.)
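To make the test concrete, it would look roughly like this. This is only a sketch with illustrative names, not the actual Chilitags code; it assumes the tag pose comes from cv::solvePnP as an rvec/tvec pair, and that `up` is the reference direction expressed in the camera frame (-Y of the camera for option 1, or the gravity-derived floor normal for option 2):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <cmath>

// Returns true if the tag should be discarded.
// rvec        : the tag's rotation as estimated by cv::solvePnP (tag frame to camera frame)
// up          : reference "up" direction in the camera frame
// minAngleDeg : cut-off angle between the tag z axis and the plane orthogonal to "up"
bool isUpsideDown(const cv::Vec3d &rvec, const cv::Vec3d &up, double minAngleDeg = 0.0)
{
    cv::Mat rotation;
    cv::Rodrigues(rvec, rotation); // 3x3 CV_64F rotation matrix
    cv::Vec3d tagZ(rotation.at<double>(0, 2),
                   rotation.at<double>(1, 2),
                   rotation.at<double>(2, 2)); // tag z axis in camera frame

    // Signed angle between the (unit) tag z axis and the plane orthogonal to "up"
    double angleDeg = std::asin(tagZ.dot(up) / cv::norm(up)) * 180.0 / CV_PI;
    return angleDeg < minAngleDeg; // e.g. 0 for "negative angle", -45 for the relaxed variant
}
```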
@qbonnard
Member

That's good old #19, right?

@ayberkozgur
Member Author

Like a much more specific case of #19.

@ayberkozgur
Member Author

My initial tests with removing upside-down tags reveal that recognizing a multi-tag configuration as a single package via solvePnP is not a good idea: because some tags flip while others stay still, the whole configuration tends to wobble around without flipping completely (still need to confirm this), and it is not possible to screen the flipped tags out after they've been PnP'ed as a package.

OK, some remarks:

  1. Tracking needs a lot of work: the pose estimate resulting from the tracked corners is horrible; this was the reason I was seeing all that wobble.
  2. Culling upside-down tags is a good idea but doesn't completely cover flipping. You can, by bending a tag the right way, produce a flipped 3D pose estimate that is not completely upside-down.
  3. There seem to be no problems with PnP'ing multiple tags' corners for now (see the sketch below).
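For reference, what I mean by PnP'ing a configuration as a package is roughly the following. This is a sketch with illustrative names, not the actual Chilitags3D code; it assumes we know each tag's corners in the configuration's frame and have matched them to the detected image corners:

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <vector>

// objectCorners: 4 corners per detected tag, expressed in the configuration's frame
// imageCorners : the matching 4 corners per tag, in image coordinates
// rvec/tvec    : resulting pose of the whole configuration w.r.t. the camera
bool estimateConfigurationPose(const std::vector<cv::Point3f> &objectCorners,
                               const std::vector<cv::Point2f> &imageCorners,
                               const cv::Mat &cameraMatrix,
                               const cv::Mat &distCoeffs,
                               cv::Mat &rvec, cv::Mat &tvec)
{
    if (objectCorners.size() < 4 || objectCorners.size() != imageCorners.size())
        return false;
    // A single solvePnP call over all corners of all detected tags in the configuration
    return cv::solvePnP(objectCorners, imageCorners, cameraMatrix, distCoeffs, rvec, tvec);
}
```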

@ayberkozgur
Member Author

With the latest changes, I'm getting the best performance on mobile with upside-down culling + IMU + DETECT_ONLY. It's really not bad even in highly dynamic cases, as long as the tablet doesn't translate too much and mostly turns in place (any amount of rotation is tolerated really well as long as it stays below 250 deg/s, which is extremely fast). Tracking only degrades the quality.
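For context, the DETECT_ONLY part of that setup is essentially the following. This is only a sketch: it assumes the find() overload that takes a detection trigger, as in the Chilitags versions of this period, and the header path and enum placement may differ; the upside-down culling and IMU fusion are separate from this call and are not shown here:

```cpp
#include <chilitags/chilitags.hpp>
#include <opencv2/highgui/highgui.hpp>

int main()
{
    chilitags::Chilitags detector;
    cv::VideoCapture camera(0);
    cv::Mat frame;
    while (camera.read(frame)) {
        // Pure detection on every frame, no corner tracking
        auto tags = detector.find(frame, chilitags::Chilitags::DETECT_ONLY);
        // ... use tags (tag id -> corner quad) ...
    }
    return 0;
}
```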

@qbonnard
Member

Next one ;)
So I get the part where you discard the tags that are flipped relative to the camera, but not the part where you discard tags that are flipped relative to the floor. So if you put a tag on the ceiling, for example, you discard it too?

I agree on tracking needing a lot of work. I did very little experimentation with it, but trying to follow the corners rather than getting more feature points seemed a lot better. This was only "sample-based" experimentation, so to reliably improve this we should definitely get serious tracking tests. Tests, tests, tests. Also, not relying on corners would go in the direction of #58.

@ayberkozgur
Member Author

OK, I need to clarify more here: there is no screening of tags via their pose w.r.t. the camera; there is only screening w.r.t. the floor. The fact that the floor vector and the tag z axis being compared are both expressed in the camera frame was probably misleading. This whole thing has two (disadvantageous) implications besides preventing most flipping:

  1. We sacrifice the tags that are on the ceiling and on downward-facing surfaces when operating a mobile device. I can't think of many use cases like this. We can (should) give the option of disabling this, and set a slightly larger threshold than exactly 90 degrees (e.g. 100 degrees).
  2. This will not work on non-mobile platforms (i.e. without inertial sensors) unless the floor vector is specified explicitly; this can be done, for example, on lamps.

Also, there was the idea of discarding tags that are tilted more than, say, 90 degrees w.r.t. the camera z axis on one side, so that as long as the camera is looking forward, upside-down tags are discarded. On mobile this has the obvious disadvantage that, without inertial sensors, you don't know the orientation of the device, so you could be discarding tags that are right side up but tilted away from the camera. I think that a floor vector calculated via inertial sensors is a much better solution here. For non-mobile platforms, we can specify the floor vector manually as I said above.
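To be explicit about where the floor vector comes from: on mobile it is basically the negated gravity vector from sensor fusion, rotated into the camera frame; on non-mobile setups the same vector would simply be hard-coded from the known mounting orientation of the camera. A sketch, where the device-to-camera rotation and the gravity input are assumptions for illustration rather than an existing API:

```cpp
#include <opencv2/core/core.hpp>

// gravityInDevice: gravity vector reported by the device's sensor fusion, in the device frame
// deviceToCamera : known fixed rotation from the device frame to the camera frame
cv::Vec3d floorUpInCamera(const cv::Vec3d &gravityInDevice,
                          const cv::Matx33d &deviceToCamera)
{
    // Gravity points down, so the floor normal ("up") is its opposite,
    // rotated into the camera frame and normalized
    cv::Vec3d up = deviceToCamera * (gravityInDevice * -1.0);
    return up * (1.0 / cv::norm(up));
}
```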

About tracking: I would say again that non-video tests do not make much sense. Decoding a video and loading it frame by frame should be (very) easily doable with OpenCV, as far as I know.
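Something along these lines should be all that's needed on the OpenCV side (a sketch; the per-frame callback is a stand-in for whatever the test actually checks):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <functional>
#include <string>

// Feed a recorded video to a per-frame test callback, frame by frame
void runVideoTest(const std::string &videoPath,
                  const std::function<void(const cv::Mat &)> &processFrame)
{
    cv::VideoCapture video(videoPath);
    cv::Mat frame;
    while (video.read(frame)) {
        processFrame(frame); // e.g. run detection and compare against expected results
    }
}
```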

@qbonnard
Member

qbonnard commented Jan 5, 2015

So it's another kinda IMU-related feature, right? I guess we could now boast about supporting IMUs in the README.

About video tests, I'm definitely for feeding sequences of images coming from a video to Chilitags; I'm only talking about how to store them. Admittedly, it wouldn't be very complicated to use OpenCV's video reader, but it would be even simpler to just explode the video into a sequence of jpgs and feed those to Chilitags. It would probably cost some disk space because of the less efficient compression, but it would make it easier to deal with the images, e.g. to investigate a problem happening on a subsequence. Not a really big issue anyway.
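For comparison, exploding the video would be a small job with the same video reader (a sketch; the output naming is illustrative). OpenCV can also read such a numbered sequence back directly with cv::VideoCapture("frame_%04d.jpg"), so the test code stays the same for both storage options:

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <cstdio>
#include <string>

// Dump a video as a numbered jpg sequence, one file per frame
void explodeVideo(const std::string &videoPath)
{
    cv::VideoCapture video(videoPath);
    cv::Mat frame;
    char name[32];
    for (int i = 0; video.read(frame); ++i) {
        std::sprintf(name, "frame_%04d.jpg", i);
        cv::imwrite(name, frame);
    }
}
```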
