Hi, thanks for your interesting work!
I tested some custom data and got a score of about 35, but I'm not sure whether this is high or low.
As I understand it, the output score corresponds to the training dataset labels, right?
Hi! Thanks for your interest in our work and for testing with your custom data.
To address your question:
Our algorithm evaluates the difference between the input image and a high-quality reference image set. A higher score generally indicates a greater difference, meaning the image quality is lower. Image quality is usually relative, so to give you some context: in our experiments with the LIVE database, the quality score for the reference image "bikes.bmp" is 19.2337, while the score for a distorted image with Gaussian blur, "img10.bmp", is 42.7782. Based on this, if your test image has a quality score of 35, visually it should be of better quality than "img10.bmp" but not as good as "bikes.bmp".
It's also important to note that our algorithm is opinion-unaware. This means that during training, we only use a dataset of high-quality images without any opinion scores or labels to guide the model's learning.
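As a rough illustration of this opinion-unaware idea (a simplified sketch, not the actual implementation — the random "features" and the pooled-covariance distance below are stand-ins for whatever quality-aware features and divergence the method really uses): fit a Gaussian model to features from the high-quality reference set, fit another to the test image's features, and report their distance, so a larger score means the image deviates more from pristine statistics.

```python
import numpy as np

def fit_gaussian(features):
    """Fit a multivariate Gaussian (mean, covariance) to a
    feature matrix of shape (n_samples, n_dims)."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mu, cov

def quality_score(ref_feats, test_feats):
    """Distance between the reference and test Gaussians.
    Larger score = statistics further from the high-quality
    reference set = lower perceptual quality."""
    mu_r, cov_r = fit_gaussian(ref_feats)
    mu_t, cov_t = fit_gaussian(test_feats)
    diff = mu_r - mu_t
    pooled = (cov_r + cov_t) / 2.0
    # Mahalanobis-style distance under the pooled covariance;
    # pinv is used for numerical stability.
    return float(np.sqrt(diff @ np.linalg.pinv(pooled) @ diff))

# Synthetic stand-ins for image features (no real images needed here).
rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(500, 8))    # pristine reference set
good = rng.normal(0.05, 1.0, size=(200, 8))  # statistics close to reference
blurry = rng.normal(1.5, 2.0, size=(200, 8)) # statistics shifted by distortion

print(quality_score(ref, good))    # small: near the reference statistics
print(quality_score(ref, blurry))  # larger: further from the reference set
```

Note that nothing in this sketch uses human opinion scores — only the statistics of the high-quality reference set — which is what makes the score a relative measure rather than an absolute label.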
I hope this helps clarify things! If you have more questions or need further clarification, feel free to ask.