Input tensor (Inference Overlay)

An Input tensor is a multi-dimensional data array that serves as input for Neural Networks. Generally these input tensors are filled with (parts of) one or more images, of a given width and height and with one or more color channels. These images are processed by the neural network to classify the image or to detect features in it.
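
The Tygron documentation does not include code for this, but as a rough illustration, the sketch below (plain Python with NumPy; all names and dimensions are invented for the example) shows how such an n × c × height × width input tensor can be laid out and filled with image data.

  import numpy as np

  # Example dimensions (illustrative only): 1 image, 3 color channels, 256 x 256 pixels.
  n, c, height, width = 1, 3, 256, 256
  input_tensor = np.zeros((n, c, height, width), dtype=np.float32)

  # Fill one channel of one image with (part of) an image array.
  image_channel = np.random.rand(height, width).astype(np.float32)  # stand-in for real image data
  input_tensor[0, 0, :, :] = image_channel

  print(input_tensor.shape)  # (1, 3, 256, 256)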

In the Tygron Platform, Grid Overlays, often Satellite Overlays or WMS Overlays, can also serve as input for neural networks. How the input tensor of a neural network is filled by an Inference Overlay is configured using Tensor Links.

An input tensor link in the Inference Overlay Wizard.

A tensor link references:

  • The input tensor, identified by the n (image) and c (channel) tuple mentioned in the name of the tensor link, together with the width and height of the input tensor.
  • The prequel of the Inference Overlay that should be used to obtain values from.
  • The value type of the data of the prequel:
    • When it is a color, you can specify which color channel should be used: Red, Green, Blue or Alpha.
    • When it is a floating point value, simply specify DEFAULT.
  • Whether the value should be normalized. Color channel values are normalized over their 0 to 255 range; floating point values are normalized using the calculated minimum and maximum values of the specified prequel overlay. A simplified sketch of both cases follows this list.
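
As referenced in the list above, the following sketch (plain Python; the function name, channel identifiers and min/max scaling are assumptions for illustration, not Tygron internals) shows how a single prequel value might be turned into a normalized tensor value depending on the configured value type.

  def link_value(raw, value_type, normalize=True, grid_min=None, grid_max=None):
      """Illustrative only: convert one prequel value to a tensor input value."""
      if value_type in ("RED", "GREEN", "BLUE", "ALPHA"):
          # Color channel values range from 0 to 255.
          return raw / 255.0 if normalize else float(raw)
      # DEFAULT: a floating point value from the prequel grid, scaled using the
      # calculated min- and max-value of that grid.
      if normalize:
          return (raw - grid_min) / (grid_max - grid_min)
      return float(raw)

  print(link_value(128, "RED"))                                    # 0.50196...
  print(link_value(2.5, "DEFAULT", grid_min=0.0, grid_max=10.0))   # 0.25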

Since the input tensor has a limited width and height, the Inference Overlay model uses a sliding-window algorithm that moves an input window of this width and height over the prequel grid. Since features can be situated on the edges of a window, the step by which the window is moved can be configured using the Stride fraction model attribute.
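
The exact window placement used by the Tygron Platform is not spelled out here, but the sketch below (Python; the function and parameter names are invented for the example) illustrates the general idea of a window that is moved over the grid in steps determined by a stride fraction, so that windows overlap when the fraction is smaller than 1.

  def window_origins(grid_h, grid_w, win_h, win_w, stride_fraction=0.5):
      """Illustrative only: top-left corners of the windows moved over the grid."""
      step_y = max(1, int(win_h * stride_fraction))
      step_x = max(1, int(win_w * stride_fraction))
      origins = []
      for y in range(0, max(grid_h - win_h, 0) + 1, step_y):
          for x in range(0, max(grid_w - win_w, 0) + 1, step_x):
              origins.append((y, x))
      return origins

  # Example: a 1000 x 1000 grid with a 256 x 256 window and stride fraction 0.5
  # produces overlapping windows every 128 cells in each direction.
  print(len(window_origins(1000, 1000, 256, 256, 0.5)))  # 36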

Notes

  • The inference overlay wizard will report an issue when a prequel is referenced by a tensor link but is not yet specified for the Inference Overlay.