HDR+ Burst Photography Dataset - Details

Dataset accompanying the paper:

Burst photography for high dynamic range and low-light imaging on mobile cameras
Samuel W. Hasinoff, Dillon Sharlet, Ryan Geiss, Andrew Adams, Jonathan T. Barron, Florian Kainz, Jiawen Chen, and Marc Levoy
ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2016), 35(6), 12 pp.

For a quick overview of the dataset, see the galleries of our final results.

Filenames in these image galleries correspond to burst folder names in the dataset.

Overview

The dataset consists of 3640 bursts (28461 images in total), organized into subfolders, plus the results of our image processing pipeline. Each burst consists of the raw input frames (in DNG format) together with sidecar files containing metadata not present in the images. For results, we provide both the intermediate result of aligning and merging the frames (also in DNG format) and the final result of our pipeline (as a JPG).

The bursts in the dataset were captured using a variety of Android mobile cameras (Nexus 5/6/5X/6P, Pixel, Pixel XL) using the public Android Camera2 API. The bursts contain between 2 and 10 raw photos each, and are generally 12-13 Mpixels, depending on the type of camera used for capture. All photos in a burst are generally captured with the same exposure time and gain.

All the mobile cameras we used have a rolling shutter with a readout time of 1/30 second. As a result, the gap between successive frames should never be larger than 33 ms, unless a rare hiccup in the camera driver caused a frame to be dropped.

Our results

We generated our results using the same image processing pipeline used by the HDR+ system in the Google Camera app, available on select Android devices. We provide results for two snapshots of our pipeline. The 2016/10/14 results correspond to the pipeline described in the HDR+ paper. The 2017/10/23 results correspond to a more recent version of our pipeline, with algorithm refinements and updated tuning. All results were generated by reprocessing the saved input bursts offline, on desktop machines.

Getting the data

The dataset is hosted on Google Cloud. It is available both for browsing (Google account required) and for anonymous download.

To get started, we suggest using the subset of bursts (153 bursts, 37 GiB) that we've curated. All of these shots have reasonably good technical photographic quality, and they cover a diverse range of scenes. From there, feel free to move on to the full dataset (3640 bursts, 765 GiB).

To download the dataset, we recommend using the gsutil tool included in the Google Cloud SDK. For example, the command:

gsutil -m cp -r gs://hdrplusdata/20171106_subset .

will copy the curated subset of bursts and our corresponding results to the current folder.

Burst folder description

payload_N<frame>.dng

Input burst photos, as captured on device. For devices that include optically shielded pixels, the images have been cropped to the active pixel array. For the minority of shots captured with digital zoom, the raw input has been pre-cropped to the photographer-selected field of view. Location data (GPS tags) has been stripped for privacy. Note that for some historical bursts (roughly, those captured before 2015) the last frame was captured with a longer exposure; our results ignore these longer-exposure frames.
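
As an illustration only, here is a minimal sketch of loading the raw frames of a burst for experimentation. It assumes the third-party rawpy Python package (not part of the dataset), and the burst directory path is a placeholder.

    import glob
    import rawpy

    def load_burst(burst_dir):
        """Load all raw frames of a burst as a list of Bayer mosaics (uint16 numpy arrays)."""
        frames = []
        for path in sorted(glob.glob(burst_dir + "/payload_N*.dng")):
            with rawpy.imread(path) as raw:
                # raw_image_visible is the active pixel array, matching the crop described above.
                frames.append(raw.raw_image_visible.copy())
        return frames

    frames = load_burst("path/to/burst")  # placeholder path
    print(len(frames), frames[0].shape, frames[0].dtype)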

lens_shading_map_N<frame>.tiff

Low-resolution floating-point gain maps giving the coefficients used to correct the input for both color shading and vignetting, for each Bayer color channel. They are stored as 4-channel floating-point TIFF images. The channel order is [R, Gred, Gblue, B], where Gred is the green channel for the red-containing rows of the Bayer pattern, and Gblue is the green channel for the blue-containing rows.
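
As a sketch only of how these gain maps might be applied, the example below (using the numpy, tifffile, and scipy packages, none of which are prescribed by the dataset) bilinearly upsamples each gain channel to the size of the corresponding Bayer plane and multiplies it in. It assumes an RGGB mosaic layout; the actual CFA pattern should be read from the DNG metadata.

    import numpy as np
    import tifffile
    from scipy.ndimage import zoom

    def apply_lens_shading(bayer, shading_path):
        """Multiply a Bayer mosaic by the per-channel gains in a lens_shading_map TIFF."""
        gains = tifffile.imread(shading_path)       # shape (h, w, 4), channels [R, Gred, Gblue, B]
        out = bayer.astype(np.float32)
        # Assumed RGGB layout: (row offset, column offset) of each shading channel in the mosaic.
        offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]  # R, Gred, Gblue, B
        for c, (dy, dx) in enumerate(offsets):
            plane = out[dy::2, dx::2]
            # Bilinearly upsample the low-resolution gain map to the size of this Bayer plane.
            g = zoom(gains[..., c], (plane.shape[0] / gains.shape[0],
                                     plane.shape[1] / gains.shape[1]), order=1)
            plane *= g
        return out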

rgb2rgb.txt

Coefficients of the 3x3 matrix used to transform from the sensor RGB color space to the output linear sRGB color space, serialized in row-major order. Note that the white-balance gains applied to the Bayer raw color channels are not included in this matrix; they can be determined as the reciprocals of the coefficients in the AsShotNeutral DNG tag.
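
To make these steps concrete, here is a small sketch (numpy only) of applying white balance followed by this matrix to a demosaicked sensor-RGB image. The as_shot_neutral values are assumed to have been read from the DNG's AsShotNeutral tag by whatever DNG reader you use, and rgb2rgb.txt is assumed to contain nine whitespace-separated coefficients.

    import numpy as np

    def sensor_rgb_to_linear_srgb(rgb, rgb2rgb_path, as_shot_neutral):
        """Apply white balance and the 3x3 color matrix to an (H, W, 3) sensor-RGB image."""
        # White-balance gains are the reciprocals of the AsShotNeutral coefficients.
        wb_gains = 1.0 / np.asarray(as_shot_neutral, dtype=np.float32)
        balanced = rgb.astype(np.float32) * wb_gains
        # rgb2rgb.txt stores the 3x3 matrix coefficients in row-major order.
        ccm = np.loadtxt(rgb2rgb_path, dtype=np.float32).reshape(3, 3)
        # For each pixel, out = ccm @ [r, g, b].
        return np.einsum("ij,hwj->hwi", ccm, balanced)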

timing.txt

On-device timings for the align, merge, and finish stages respectively. These timings are included for completeness, but do not reflect the current performance of our system. In particular, these timings correspond to less optimized historical snapshots of our pipeline, include older devices that do not run our pipeline in production, and may be slowed down by the very fact of saving input (which can trigger thermal throttling). This file is omitted for shots where our pipeline was not run on device.

Result folder description

merged.dng

Intermediate raw result of aligning and merging the input burst. Compared to the input photos, the merged result has higher precision. This higher precision is reflected in the BlackLevel and WhiteLevel DNG tags.
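
As one possible way to read the merged result, the sketch below (again assuming the rawpy package) normalizes the mosaic to [0, 1] using the black and white levels reported by the file; applying a single averaged black level is a simplification made for brevity.

    import numpy as np
    import rawpy

    def load_merged(path):
        """Load merged.dng and normalize it to [0, 1] using its black and white levels."""
        with rawpy.imread(path) as raw:
            mosaic = raw.raw_image_visible.astype(np.float32)
            # The merged DNG's higher precision is visible in these levels, which span
            # a wider range than those of the input payload frames.
            black = float(np.mean(raw.black_level_per_channel))  # simplification: one black level
            white = float(raw.white_level)
        return np.clip((mosaic - black) / (white - black), 0.0, 1.0)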

final.jpg

Final output, from running the merged result through our finishing pipeline. The JPG is encoded at a quality level of 95.

reference_frame.txt

Index of the reference frame. Zero-based; corresponds to <frame> in the filenames of the associated input burst.
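
For example, a simple way to resolve this index to the corresponding input file, assuming the payload filenames sort in frame order:

    import glob

    def reference_frame_path(burst_dir, result_dir):
        """Return the payload DNG that served as the reference frame for a result."""
        with open(result_dir + "/reference_frame.txt") as f:
            index = int(f.read().strip())  # zero-based frame index
        frames = sorted(glob.glob(burst_dir + "/payload_N*.dng"))
        return frames[index]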

Using the dataset

The dataset is released under a Creative Commons license (CC-BY-SA). This license is broad and largely unencumbered; however, our main intention is that the dataset be used for scientific purposes. The subjects of the dataset include the authors' friends and family, so please keep usage in good taste.

Researchers who use the dataset should cite the associated paper:

@article{hasinoff2016burst,
    author  = {Samuel W. Hasinoff and Dillon Sharlet and Ryan Geiss and Andrew Adams and
               Jonathan T. Barron and Florian Kainz and Jiawen Chen and Marc Levoy},
    title   = {Burst photography for high dynamic range and low-light imaging on mobile cameras},
    journal = {ACM Transactions on Graphics (Proc. SIGGRAPH Asia)},
    volume  = {35},
    number  = {6},
    year    = {2016},
}

Release notes

Contact

Please direct questions to hdrplusdata@google.com.