Partially-Supervised Novel Object Captioning Using Context from Paired Data

Shashank Bujimalla (Intel),* Mahesh Subedar (Intel), Omesh Tickoo (Intel)
The 33rd British Machine Vision Conference


In this paper, we propose an approach to improve image captioning solutions for images with novel objects that have no caption labels in the training dataset. We refer to our approach as Partially-Supervised Novel Object Captioning (PS-NOC). PS-NOC is agnostic to model architecture and focuses primarily on the training approach, which uses existing fully paired image-caption data together with images that have only novel object detection labels (partially paired data). We create synthetic paired captioning data for novel objects by leveraging context from existing image-caption pairs. We then create pseudo-label captions for partially paired images with novel objects, and use this additional data to fine-tune the captioning model. We also propose a variant of Self-Critical Sequence Training (SCST) within PS-NOC, called SCST-F1, that directly optimizes the F1-score of novel objects. Using a popular captioning model (Up-Down) as the baseline, PS-NOC sets new state-of-the-art results on the held-out MS COCO out-of-domain test split, i.e., 85.9 F1-score and 103.8 CIDEr. This is an improvement of 85.9 and 34.1 points, respectively, compared to the baseline model that does not use partially paired data during training. We also perform detailed ablation studies to demonstrate the effectiveness of our approach.
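To make the SCST-F1 idea concrete: SCST trains with a REINFORCE-style objective where a sampled caption's reward is baselined by the reward of the greedy-decoded caption. Below is a minimal, hedged sketch of how a per-caption novel-object F1 reward could plug into that scheme. The helper names (`novel_object_f1`, `scst_f1_loss`) and the exact reward formulation are illustrative assumptions, not the paper's implementation; the paper evaluates F1 over the dataset per novel object, while this sketch uses a simpler per-caption proxy.

```python
def novel_object_f1(caption, objects_in_image, novel_vocab):
    """Per-caption F1 proxy over novel-object mentions (illustrative, not the
    paper's exact metric).

    caption          -- generated caption string
    objects_in_image -- set of novel-object words actually present in the image
                        (from detection labels)
    novel_vocab      -- set of all novel-object words the model could mention
    """
    words = set(caption.lower().split())
    mentioned = {obj for obj in novel_vocab if obj in words}
    tp = len(mentioned & objects_in_image)   # correctly mentioned novel objects
    fp = len(mentioned - objects_in_image)   # hallucinated novel objects
    fn = len(objects_in_image - mentioned)   # missed novel objects
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)


def scst_f1_loss(sampled_caption, greedy_caption, sampled_logprob,
                 objects_in_image, novel_vocab):
    """Self-critical loss with an F1 reward: the greedy caption's reward serves
    as the baseline, so the gradient pushes sampled captions that beat the
    greedy decode (hypothetical sketch)."""
    reward = (novel_object_f1(sampled_caption, objects_in_image, novel_vocab)
              - novel_object_f1(greedy_caption, objects_in_image, novel_vocab))
    # REINFORCE: minimize -reward * log p(sampled caption)
    return -reward * sampled_logprob
```

In practice the F1 reward would typically be mixed with a fluency-preserving reward such as CIDEr, since optimizing object mentions alone can degrade caption quality.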



@inproceedings{Bujimalla_2022_BMVC,
  author    = {Shashank Bujimalla and Mahesh Subedar and Omesh Tickoo},
  title     = {Partially-Supervised Novel Object Captioning Using Context from Paired Data},
  booktitle = {33rd British Machine Vision Conference 2022, {BMVC} 2022, London, UK, November 21-24, 2022},
  publisher = {{BMVA} Press},
  year      = {2022},
  url       = {}
}

Copyright © 2022 The British Machine Vision Association and Society for Pattern Recognition
The British Machine Vision Conference is organised by The British Machine Vision Association and Society for Pattern Recognition. The Association is a Company limited by guarantee, No.2543446, and a non-profit-making body, registered in England and Wales as Charity No.1002307 (Registered Office: Dept. of Computer Science, Durham University, South Road, Durham, DH1 3LE, UK).
