The CID2013 Camera Image Database

The CID2013 Camera Image Database consists of real images taken with consumer cameras and mobile phones. It was developed to give researchers a tool for targeting more commercially relevant distortions when developing objective image quality assessment algorithms.

The CID2013 database consists of 480 images captured with 79 imaging devices (mobile phones, digital still cameras, and DSLRs) across six image sets.

The images were evaluated by 188 observers using the Dynamic Reference (DR-ACR) method (explained below). A separate scale-realignment ACR data set, consisting of evaluations from 34 observers, is also included; it allows the data from the six image sets to be combined onto a common scale.
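The database documentation does not prescribe a specific realignment procedure in this summary, but one simple way to use the realignment data is to fit, for each image set, a linear mapping from the set-specific DR-ACR MOS values to the common realignment-ACR scale. The sketch below assumes NumPy and hypothetical variable names (mos_set, mos_realign); it is an illustration, not the authors' method.

    import numpy as np

    def fit_linear_realignment(mos_set, mos_realign):
        # Fit a linear mapping a*x + b from set-specific MOS values to the
        # common realignment-ACR scale, using the images that were rated in
        # both experiments (ordinary least squares).
        A = np.vstack([mos_set, np.ones_like(mos_set)]).T
        (a, b), *_ = np.linalg.lstsq(A, mos_realign, rcond=None)
        return a, b

    # Hypothetical usage for one image set:
    # a, b = fit_linear_realignment(mos_shared_subset, realign_shared_subset)
    # mos_on_common_scale = a * mos_full_set + b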

In other respects the DR-ACR method closely resembles the basic Absolute Category Rating (ACR) method (ITU-R BT.500-11), except that before every evaluation the observers saw a slideshow of all the other images in the test depicting the same scene (SEE VIDEO). Seeing the other images in the test setup as a reference made the observers more aware of the total variation in quality within a single image set. This improved their evaluations, because they did not need to reserve the far ends of the scale in case even better or worse images appeared later in the experiment.

The database contains consumer camera images and their subjective evaluations on mean opinion score (MOS), sharpness, graininess, lightness, and color saturation scales. It includes the complete raw data and background information from the naïve observers who evaluated the images. The subjects' vision was screened for near visual acuity, near contrast vision (near F.A.C.T.), and color vision (Farnsworth D-15) before participation, and they received movie tickets as a reward. To ease the use of the database, outlier removal has been performed on the mean opinion score (MOS) evaluations following the ITU-R BT.500-11 recommendation.
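For reference, the sketch below shows a simplified version of the BT.500-11 observer-screening procedure applied to a raw score matrix. The exact grouping used for CID2013 (per scene or per image set) is not restated here, so the scores array and the function name are assumptions; treat this as an illustration of the screening logic rather than the exact processing behind the published MOS values.

    import numpy as np

    def bt500_observer_screening(scores):
        # scores: 2-D array of raw opinion scores, shape (n_observers, n_images).
        # Returns a boolean mask of accepted observers and the MOS computed from them.
        mean = scores.mean(axis=0)                # per-image mean
        std = scores.std(axis=0, ddof=1)          # per-image standard deviation
        m2 = ((scores - mean) ** 2).mean(axis=0)
        m4 = ((scores - mean) ** 4).mean(axis=0)
        beta2 = m4 / np.maximum(m2 ** 2, 1e-12)   # kurtosis coefficient per image

        # Threshold: 2*std for roughly normal score distributions, sqrt(20)*std otherwise.
        thresh = np.where((beta2 >= 2) & (beta2 <= 4), 2.0 * std, np.sqrt(20.0) * std)

        P = (scores >= mean + thresh).sum(axis=1).astype(float)  # ratings above range
        Q = (scores <= mean - thresh).sum(axis=1).astype(float)  # ratings below range
        n_images = scores.shape[1]

        reject = ((P + Q) / n_images > 0.05) & (np.abs(P - Q) / np.maximum(P + Q, 1.0) < 0.3)
        keep = ~reject
        return keep, scores[keep].mean(axis=0)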

The images in CID2013 are intended to represent typical photographs that consumers might capture with their cameras. The photographed scenes were based partly on the Photospace approach described by I3A (CPIQ Initiative Phase 1 White Paper: Fundamentals and Review of Considered Test Methods, I3A, 2007).

The test environment

The room was covered with medium gray curtains to diffuse the ambient illumination. Fluorescent lights (5800 K) were positioned behind the monitors and reflected from the back wall, covered with a gray curtain, to create dim and uniform ambient illumination in the room. The light hitting the monitors measured below 20 lx. The subjects' viewing distance (approximately 80 cm) was controlled by a line hanging from the ceiling, and they were instructed to keep their foreheads steady next to the line. Because of the display size, images were scaled to 1600 x 1200 pixels using bicubic interpolation. The Eizo ColorEdge CG241W monitors (1920 x 1200 pixel resolution) were calibrated to sRGB with target values of 80 cd/m2, 6500 K, and gamma 2.2 using an EyeOne Pro calibrator (X-Rite).
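As an illustration of the scaling step, the snippet below resizes an image to 1600 x 1200 pixels with bicubic interpolation using the Pillow library; the file names are placeholders and the snippet is only a sketch of the operation described above.

    from PIL import Image

    # Scale an image to 1600 x 1200 pixels with bicubic interpolation
    # (file names are placeholders).
    img = Image.open("cid2013_example.jpg")
    img_resized = img.resize((1600, 1200), resample=Image.BICUBIC)
    img_resized.save("cid2013_example_1600x1200.png")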

All experiments were conducted with the VQone MATLAB toolbox, which is available to the research community here. More information about the monitors and the laboratory can be found here.

Download

The database is password protected. If you are interested in downloading the database, please contact Toni Virtanen (firstname.lastname ( at ) helsinki.fi). You may download the database from here.

If you use these images in your research, we kindly ask that you follow the copyright notice and cite the following paper:


Last modified: 2015/11/05