
AutoSentinel-2/3 (2015-2016)


Knowledge-based mapping of Sentinel-2/3 images for operational product generation and content-based image retrieval

The primary objective of the AutoSentinel-2/3 exploratory project is to develop and test a novel remote sensing (RS) Earth observation (EO) image understanding system (EO-IUS) architecture and implementation suitable for automatic preliminary classification (pre-classification through color space discretization) of Sentinel-2/3 multi-spectral (MS) images at large spatial extent in operating mode. The proposed hybrid (combined deductive and inductive) feedback EO-IUS consists of the following three stages.

  • Stage 1. MS image pre-processing (e.g., radiometric calibration) and enhancement (e.g., topographic correction), provided with feedback loops from the categorical outputs of the pre-attentive vision Stage 2 and the attentive vision Stage 3. These feedback loops make inherently ill-posed image enhancement problems (e.g., topographic correction, atmospheric correction, image co-registration, image mosaicking, image compositing) better posed (conditioned) for numerical treatment. In practice, high-level qualitative (categorical) information generated at Stages 2 and 3 is employed to stratify (partition) the low-level quantitative variables (e.g., sensory data) available as input to Stage 1, in compliance with the divide-and-conquer problem-solving principle, regarded as common knowledge; a toy sketch of such category-driven stratification follows the stage list below.

  • Stage 2. Low-level preliminary classification (pre-classification), where the existing Satellite Image Automatic Mapper (SIAM) software product for prior knowledge-based continuous MS data space discretization is adopted. SIAM's sole data requirement is that the input MS image be radiometrically calibrated into a physical radiometric unit of measure, namely top-of-atmosphere reflectance or surface reflectance values, where the latter is a special case of the former under clear-sky conditions and flat terrain. The SIAM output products, automatically generated in near real-time from a single-date MS image, comprise:

          (i)   multi-level pre-classification maps,
          (ii)  multi-scale segmentation maps, and
          (iii) various “smart” (knowledge-driven) spectral indexes, masked (conditioned) by
                spectral categories: for example, a quantitative greenness index, computed
                as a scalar combination of two or three bands depending on the spectral
                resolution of the image at hand, is masked by a SIAM-based vegetation mask,
                in which all available spectral bands are considered simultaneously for
                spectral categorization purposes (see the sketches after this stage list).

  • Stage 3. High-level land surface continuous variable (e.g., leaf area index) estimation and categorical variable (e.g., land cover (LC) and LC change class) extraction from single-date and multi-temporal MS images. The extension of the SIAM expert system to MS color space discretization through time, called SIAM-through-time (SIAMT2), is under development; a schematic of bi-temporal change extraction is sketched below.
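To make the Stage 1 feedback loop concrete, the following minimal Python sketch applies a topographic C-correction fitted separately per stratum, where the strata come from a categorical map of the kind Stage 2 feeds back. The function name, array layout, and the choice of the classic C-correction are illustrative assumptions, not part of the actual AutoSentinel-2/3 implementation.

```python
import numpy as np

def stratified_c_correction(band, cos_i, cos_sz, strata):
    """Topographic C-correction fitted separately per categorical stratum.

    band   : 2-D array, calibrated reflectance of one spectral band
    cos_i  : 2-D array, cosine of the local solar incidence angle
    cos_sz : scalar, cosine of the solar zenith angle (flat-terrain reference)
    strata : 2-D integer array of spectral categories (the Stage 2 feedback)
    """
    corrected = band.astype(float).copy()
    for label in np.unique(strata):
        m = strata == label
        # Fit band = a + b * cos_i within this stratum only (divide and conquer).
        b, a = np.polyfit(cos_i[m], band[m], 1)
        if abs(b) < 1e-9:   # no illumination dependence in this stratum: leave as-is
            continue
        c = a / b           # stratum-specific C parameter
        corrected[m] = band[m] * (cos_sz + c) / (cos_i[m] + c)
    return corrected
```

Fitting the correction per category rather than globally is exactly the stratification idea: each stratum is more radiometrically homogeneous than the whole scene, so the regression behind the enhancement step is better conditioned.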
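The “smart” spectral index of item (iii) can likewise be illustrated with a short numpy sketch: a two-band greenness index (here NDVI, one possible instance of the scalar band combination mentioned above) is computed everywhere but reported only where a prior-knowledge spectral decision rule fires. The rule and its thresholds below merely stand in for the SIAM vegetation categories and are illustrative assumptions, not SIAM's actual decision rules.

```python
import numpy as np

def masked_greenness(red, nir, swir):
    """NDVI masked by a toy prior-knowledge vegetation rule.

    All inputs are 2-D arrays of top-of-atmosphere (or surface)
    reflectance in [0, 1], as required by SIAM-like pre-classifiers.
    """
    # Toy spectral decision rule standing in for the SIAM vegetation
    # categories; threshold values are illustrative assumptions only.
    vegetation = (nir > red) & (nir > swir) & (nir > 0.15)

    ndvi = (nir - red) / (nir + red + 1e-12)   # avoid division by zero
    return np.where(vegetation, ndvi, np.nan)  # NaN outside the vegetation mask
```

Note the division of labor the prose describes: the categorical mask uses all available bands jointly, while the quantitative index itself is a simple scalar combination of two bands.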
              
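For Stage 3, a bi-temporal LC-change layer can be derived from two single-date pre-classification maps by encoding every (category at time 1, category at time 2) pair as a unique change code. The sketch below is only a schematic of the “through time” idea under that assumption; it is not the SIAMT2 algorithm itself.

```python
import numpy as np

def bitemporal_change(labels_t1, labels_t2, n_categories):
    """From-to change codes from two single-date pre-classification maps.

    labels_t1, labels_t2 : 2-D integer maps with values in [0, n_categories)
    Returns the per-pixel change code (from * n_categories + to) and a
    boolean mask of changed pixels.
    """
    code = labels_t1 * n_categories + labels_t2   # unique (from, to) pair
    changed = labels_t1 != labels_t2
    return code, changed

def transition_matrix(code, n_categories):
    """Entry [i, j] counts the pixels moving from category i to category j."""
    counts = np.bincount(code.ravel(), minlength=n_categories ** 2)
    return counts.reshape(n_categories, n_categories)
```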

Through its experimental activities, AutoSentinel-2/3 will also conceptualize a novel generation of semantic querying systems for content-based image retrieval from multi-source EO “big data” repositories, where the information-as-image-interpretation, available to be queried together with the sensory data, is provided off-line by an automatic EO-IUS incorporating a SIAM/SIAMT2 expert system at Stage 2.
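One way to picture such semantic querying is a catalogue in which each archived scene is stored together with its per-category pixel fractions produced off-line by the Stage 2 pre-classifier, so that a query runs against these small semantic records instead of the full image rasters. The record layout, field names, and query function below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SceneRecord:
    scene_id: str
    acquired: str            # acquisition date, ISO 8601
    category_fraction: dict  # e.g., {"vegetation": 0.42, "snow": 0.10}

def query(catalogue, category, min_fraction):
    """Return scenes whose pre-classified content matches the semantic query."""
    return [s for s in catalogue
            if s.category_fraction.get(category, 0.0) >= min_fraction]

# Example: retrieve scenes that are at least 30% vegetation.
catalogue = [
    SceneRecord("S2A_0001", "2015-07-01", {"vegetation": 0.55, "water": 0.05}),
    SceneRecord("S2A_0002", "2015-12-01", {"snow": 0.60, "vegetation": 0.10}),
]
print([s.scene_id for s in query(catalogue, "vegetation", 0.30)])
```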

Overall, the present AutoSentinel-2/3 exploratory project is conceived as preparation for participation in the Horizon 2020 (SPACE, 2016/17) project call.


Instrument: FFG, ASAP (Austrian Space Applications Program), exploratory project

Project volume: 125,580 EUR

Contact person: [email protected]

Researchers involved: [email protected] (main project collaborator), [email protected], [email protected]