Design of benchmark imagery for validating facility annotation algorithms

Randy S. Roberts, Paul A. Pope, Raju R. Vatsavai, Ming Jiang, Lloyd F. Arrowood, Timothy G. Trucano, Shaun Gleason, Anil Cheriyadat, Alex Sorokine, Aggelos K. Katsaggelos, Thrasyvoulos N. Pappas, Lucinda R. Gaines, Lawrence K. Chilton

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The design of benchmark imagery for validating image annotation algorithms is considered. Emphasis is placed on imagery that contains industrial facilities, such as chemical refineries. An application-level facility ontology is used to define the salient objects in the benchmark imagery. Intrinsic and extrinsic scene factors important for comprehensive validation are listed, and variability in the benchmarks is discussed. Finally, the pros and cons of three forms of benchmark imagery (real, composite, and synthetic) are delineated.

Original language: English
Title of host publication: 2011 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2011 - Proceedings
Pages: 1453-1456
Number of pages: 4
DOIs
State: Published - 2011
Event: 2011 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2011 - Vancouver, BC, Canada
Duration: Jul 24, 2011 - Jul 29, 2011

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)

Conference

Conference: 2011 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2011
Country/Territory: Canada
City: Vancouver, BC
Period: 07/24/11 - 07/29/11

Keywords

  • Algorithm validation
  • Benchmark imagery
  • Benchmark variability
  • Ontology
  • Real annotated imagery
  • Validation using synthetic imagery
