
How to Generate Event Data for Demos Using Traditional Frame-Captured Footage? #7

Open
FrescoDev opened this issue Jun 24, 2024 · 3 comments

Comments

@FrescoDev

Description
The model code assumes that event video data is included as part of the inputs. However, it's unclear how this was handled for demonstration purposes, especially since test datasets such as Vid4 and REDS appear to have been captured with traditional frame-based cameras.

Could you please provide details on the following:

Generation of Event Data: How was event data generated or simulated for the demonstration? Was there a specific method or tool used to convert traditional frame-based footage into event-based data?

Implementation Details: Any specific scripts or code examples used to achieve this conversion would be highly appreciated. Understanding the methodology would help in replicating the demo setup accurately.

Test Data Adaptation: If the event data was simulated, what adjustments or preprocessing steps were necessary to align this data with the model's requirements?

@DachunKai
Owner

Generation of Event Data:

Our event data is generated using the event simulator vid2e. After generating the events, we apply a temporal reversal to create a backward version, making them suitable for our bidirectional network. These events are then converted into event voxels, as implemented in our code.
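
To make those two steps concrete, here is a minimal, hypothetical NumPy sketch of temporally reversing an event stream and binning it into an event voxel grid. The function names, the polarity flip on reversal, and the bilinear-in-time weighting are illustrative assumptions, not the released code of this repository or the APIs of vid2e/event_utils.

```python
import numpy as np

def reverse_events(events, duration):
    """Temporally reverse an event stream.

    `events` is an (N, 4) array of (t, x, y, p) rows with timestamps in
    [0, duration] and polarity p in {-1, +1}. Reversal mirrors timestamps
    around the sequence duration and flips polarity, then re-sorts by time
    so the backward stream remains causally ordered (an assumed convention).
    """
    t, x, y, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    rev = np.stack([duration - t, x, y, -p], axis=1)
    return rev[np.argsort(rev[:, 0])]

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events into a (num_bins, H, W) voxel grid.

    Each event's polarity is distributed over the two nearest temporal bins
    with linear weighting in time, a common voxelization scheme.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]

    # Normalize timestamps to [0, num_bins - 1].
    t_min, t_max = t.min(), t.max()
    t_norm = (t - t_min) / max(t_max - t_min, 1e-9) * (num_bins - 1)

    left = np.floor(t_norm).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t_norm - left
    w_left = 1.0 - w_right

    np.add.at(voxel, (left, y, x), p * w_left)
    np.add.at(voxel, (right, y, x), p * w_right)
    return voxel

# Example usage (hypothetical shapes):
# voxel_fwd = events_to_voxel_grid(events, num_bins=5, height=64, width=64)
# voxel_bwd = events_to_voxel_grid(
#     reverse_events(events, duration=events[:, 0].max()), 5, 64, 64)
```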

Implementation Details:

The data preparation process involves several different code libraries. For event data processing, you can refer to the event_utils library. We will be organizing and sharing the detailed data preparation steps soon.

Test Data Adaptation:

The event simulator already provides a sufficiently realistic simulation of real events; you can refer to the ESIM paper for more details. In our implementation, we did not need any special adjustments to align the simulated data with the model.

Thank you for your interest in our work!

@FrescoDev
Author

That's all really clear and helpful, thank you for the insights!

@DachunKai
Owner

DachunKai commented Jun 28, 2024

@FrescoDev Hi, we have released our data preparation details at DataPreparation.md. Hope it helps you. Thanks!
