We have developed interactive image editing tools using a new randomized algorithm for quickly finding approximate nearest neighbor matches between image patches. Previous research in graphics and vision has leveraged such nearest-neighbor searches to provide a variety of high-level digital image editing tools. However, the cost of computing a field of such matches for an entire image has eluded previous efforts to provide interactive performance. Our algorithm offers substantial performance improvements over the previous state of the art (20-100x), enabling its use in interactive editing tools. The key insights driving the algorithm are that some good patch matches can be found via random sampling, and that natural coherence in the imagery allows us to propagate such matches quickly to surrounding areas. We offer theoretical analysis of the convergence properties of the algorithm, as well as empirical and practical evidence for its high quality and performance. This one simple algorithm forms the basis for a variety of tools – image retargeting, completion and reshuffling – that can be used together in the context of a high-level image editing application. Finally, we propose additional intuitive constraints on the synthesis process that offer the user a level of control unavailable in previous methods.
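The two key insights above, random sampling and propagation through coherent regions, can be sketched in a few dozen lines. The following is a minimal, illustrative Python version of the nearest-neighbor-field search (random initialization, propagation from scanline neighbors, then random search at shrinking radii); all names, parameters, and details here are our own simplifications, not the paper's optimized implementation:

```python
import numpy as np

def patch_dist(A, B, ax, ay, bx, by, p):
    """Sum of squared differences between the p x p patches anchored at
    (ay, ax) in A and (by, bx) in B (top-left corners)."""
    d = A[ay:ay + p, ax:ax + p] - B[by:by + p, bx:bx + p]
    return float(np.sum(d * d))

def patchmatch(A, B, p=3, iters=4, rng=None):
    """Approximate nearest-neighbor field from patches of A to patches of B.
    Returns (nnf, dist) where nnf[y, x] = (by, bx) is the best-known match
    in B for the patch at (y, x) in A. A simplified sketch, not the
    authors' implementation."""
    rng = np.random.default_rng(0) if rng is None else rng
    Ah, Aw = A.shape[0] - p + 1, A.shape[1] - p + 1
    Bh, Bw = B.shape[0] - p + 1, B.shape[1] - p + 1
    # 1. Random initialization: each patch gets a uniform random candidate.
    nnf = np.stack([rng.integers(0, Bh, (Ah, Aw)),
                    rng.integers(0, Bw, (Ah, Aw))], axis=-1)
    dist = np.empty((Ah, Aw))
    for y in range(Ah):
        for x in range(Aw):
            dist[y, x] = patch_dist(A, B, x, y, nnf[y, x][1], nnf[y, x][0], p)
    for it in range(iters):
        # Alternate scan order so good matches propagate in both directions.
        step = 1 if it % 2 == 0 else -1
        ys = range(Ah) if step == 1 else range(Ah - 1, -1, -1)
        xs = range(Aw) if step == 1 else range(Aw - 1, -1, -1)
        for y in ys:
            for x in xs:
                by, bx = nnf[y, x]
                best = dist[y, x]
                # 2. Propagation: try the shifted matches of adjacent patches,
                # exploiting coherence in natural images.
                for dy, dx in ((step, 0), (0, step)):
                    ny, nx = y - dy, x - dx
                    if 0 <= ny < Ah and 0 <= nx < Aw:
                        cy = nnf[ny, nx][0] + dy
                        cx = nnf[ny, nx][1] + dx
                        if 0 <= cy < Bh and 0 <= cx < Bw:
                            d = patch_dist(A, B, x, y, cx, cy, p)
                            if d < best:
                                best, by, bx = d, cy, cx
                # 3. Random search: sample around the current best match at
                # exponentially decreasing radii.
                radius = max(Bh, Bw)
                while radius >= 1:
                    cy = int(np.clip(by + rng.integers(-radius, radius + 1), 0, Bh - 1))
                    cx = int(np.clip(bx + rng.integers(-radius, radius + 1), 0, Bw - 1))
                    d = patch_dist(A, B, x, y, cx, cy, p)
                    if d < best:
                        best, by, bx = d, cy, cx
                    radius //= 2
                nnf[y, x] = (by, bx)
                dist[y, x] = best
    return nnf, dist
```

In practice a handful of iterations suffices: matches found by chance in one region spread rapidly along the scan order, which is what makes the field computation fast enough for interactive use.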
PatchMatch was later generalized and shown to be useful for a variety of computer vision applications such as image denoising, object detection, and label transfer. The patch synthesis capabilities shown here were later enhanced in the Image Melding project. Some of our later works that use PatchMatch as a building block are listed below.
PatchMatch is part of the Content Aware Fill feature in Photoshop CS5 and the Content Aware Patch and Move tools in Photoshop CS6 (with further improvements in later versions). It was also used as a core algorithm in the Video Tapestries, Cosaliency, Regenerative Morphing, and Non-Rigid Dense Correspondence research projects.