NCJ Number
254595
Date Published
2020
Length
9 pages
Annotation
To obtain improved embeddings through easy positive triplet mining, this paper proposes an alternative, relaxed embedding strategy that requires the embedding function to map each training image only to its most similar examples from the same class, an approach the authors call "Easy Positive" mining.
Abstract
Deep metric learning seeks to define an embedding in which semantically similar images are mapped to nearby locations and semantically dissimilar images are mapped to distant locations. Substantial work has focused on loss functions and strategies that learn these embeddings by pushing images from the same class as close together in the embedding space as possible. The current project provides a collection of experiments and visualizations showing that this Easy Positive mining leads to embeddings that are more flexible and generalize better to new unseen data. This simple mining strategy yields recall performance that exceeds state-of-the-art approaches (including those with complicated loss functions and ensemble methods) on image retrieval datasets, including CUB, Stanford Online Products, In-Shop Clothes, and Hotels-50K. The code is available at: https://github.com/littleredxh/EasyPositiveHardNegative (publisher abstract modified)
Similar Publications
- Community Court Grows in Brooklyn: A Comprehensive Evaluation of the Red Hook Community Justice Center, Final Report
- Quantifying the strength of palmprint comparisons: Majority identifications with surprisingly low value
- Correctional Officer Fatalities in Line of Duty During 2005 to 2015: A Survival Analysis