The First International Workshop on

Large-Scale Instance Retrieval


In conjunction with ACM Multimedia 2017
October 23, 2017, Mountain View, CA, USA

Retrieval is a classic and important topic in ACM Multimedia. This workshop focuses on large-scale instance retrieval, a relatively new retrieval problem that is attracting growing attention. Instance retrieval is a core technology for many applications of interest to both the academic and industrial communities, e.g., person re-identification, product search, and vehicle identification. The workshop is therefore closely related to ACM Multimedia and of clear significance to its community.


Program Outline

The proposed length of this workshop is half a day. The rough outline and highlights are as follows:

Topics to be covered

As an emerging research topic attracting growing interest in both academia and industry, instance retrieval aims to identify re-appearing instances such as persons, vehicles, products, and landmarks in a large corpus of images and videos. It has the potential to open great opportunities for the challenging problem of multimedia content understanding, offering unprecedented possibilities for intelligent multimedia processing and analysis, as well as promising applications such as product, pedestrian, and vehicle search.

The proposed LSIR (Large-Scale Instance Retrieval) workshop aims to gather the latest research from around the world and to provide a communication platform for researchers working on this topic. Instance retrieval is not a traditional search or classification task. First, an LSIR system needs to locate instances before proceeding to the identification or retrieval step, so proper detection algorithms must be designed. Second, the visual appearance of an instance is easily affected by many factors, such as viewpoint changes and camera parameter differences, which makes discriminative and robust feature learning a key step. Third, to cope with large-scale data, scalable indexing or feature coding algorithms must be designed to ensure online efficiency. Aiming to seek novel solutions and possibilities, this workshop will hold in-depth discussions on these issues. Specifically, the covered topics include, but are not limited to:
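The indexing step above can be illustrated with a minimal, self-contained sketch (all names and the coarse quantization scheme are illustrative, not from any particular LSIR system): instance descriptors are quantized into coarse codes, an inverted index maps each code to its items, and a query is ranked only against candidates sharing its code, which is what makes the search scalable.

```python
import math
from collections import defaultdict

def coarse_code(vec, step):
    # Toy feature coding: quantize each dimension into a coarse cell.
    # Real systems use learned quantizers (e.g., k-means visual words).
    return tuple(int(v // step) for v in vec)

class ToyInstanceIndex:
    """Hypothetical inverted index over instance feature vectors."""

    def __init__(self, step=10.0):
        self.step = step
        # Inverted index: coarse code -> list of (item_id, vector).
        self.buckets = defaultdict(list)

    def add(self, item_id, vec):
        self.buckets[coarse_code(vec, self.step)].append((item_id, vec))

    def search(self, query, top_k=3):
        # Only candidates sharing the query's coarse code are ranked,
        # so online cost depends on bucket size, not corpus size.
        cands = self.buckets.get(coarse_code(query, self.step), [])
        cands.sort(key=lambda iv: math.dist(query, iv[1]))
        return [item_id for item_id, _ in cands[:top_k]]

# Usage: index a few 2-D descriptors, then query near one of them.
idx = ToyInstanceIndex(step=10.0)
idx.add("car_1", (1.0, 2.0))
idx.add("car_2", (1.5, 2.5))
idx.add("person_1", (55.0, 60.0))
print(idx.search((1.2, 2.1)))   # nearby cars, ranked by distance
```

In a practical system the toy quantizer would be replaced by a learned codebook and the in-bucket ranking by fast approximate distance computation, but the detect-embed-index structure is the same as described above.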

After the workshop, special issues on instance retrieval in journals such as IEEE T-MM and IEEE T-IP will be organized.


Proposers:

General Chairs:

Qi Tian (University of Texas at San Antonio), qitian@cs.utsa.edu
Wen Gao (Peking University), wgao@pku.edu.cn

Technical Program Chairs:

Shiliang Zhang (Peking University, China), slzhang.jdl@pku.edu.cn
Wengang Zhou (University of Science and Technology of China), zhwg@ustc.edu.cn
Bingbing Ni (Shanghai Jiao Tong University, China), nibingbing@sjtu.edu.cn

Anticipated Participants

Instance retrieval is an important topic for both academia and industry. It is also an interdisciplinary research topic spanning the fields of Computer Vision, Multimedia, and Machine Learning. For example, advances in object detection, distance metric learning, and deep learning have significantly boosted the performance of instance detection and recognition. Meanwhile, research efforts in instance retrieval have the potential to provide new possibilities for efficient object detection, zero-shot deep learning, scalable indexing and searching, etc. Moreover, LSIR will be a unique platform allowing world-renowned researchers to share their thoughts and work on instance retrieval. LSIR has broad coverage of related topics in Multimedia Content Analysis, Computer Vision, Multimedia, and Machine Learning, and is expected to attract submissions from researchers working on this topic as well as on related fields. The expected numbers of accepted oral and poster papers are 3 and 10, respectively.

Submission link

You can log in to MMWorkshop17 by clicking here.

Important Dates:

Paper submission deadline: July 31, 2017
Notification of acceptance: August 15, 2017
Camera-ready submission: August 31, 2017

Contact

For further information, please send an e-mail to Shiliang Zhang (slzhang.jdl@pku.edu.cn), Wengang Zhou (zhwg@ustc.edu.cn), or Bingbing Ni (nibingbing@sjtu.edu.cn).