
Evaluation Question: 125 Video-prompt Pairs #27

Open
mikolez opened this issue Oct 12, 2023 · 2 comments

Comments


mikolez commented Oct 12, 2023

Dear authors,

First of all, very nice work and impressive results — congratulations! In the paper, you mention that you use 125 video-prompt pairs in total for the quantitative evaluations. Could you please specify the exact prompts and the sequences from the DAVIS dataset you used, so that I can replicate the results and evaluate in exactly the same setting as you?

Thank you!

@YBYBZhang (Owner) commented
Thank you for your interest in and appreciation of our work. The details of the evaluation dataset have been sent to you by email.
We will also include these details in the revised arXiv version.

@DavideA

DavideA commented Nov 25, 2023

Hi,
I am looking for the same information. Could I please get it by email as well?

Thanks!
