
General Image Acquisition

Camera Acquisition Thread

In Aurora Vision, image acquisition is performed in the background. As a result, a new image can be received from the camera as soon as it is available, regardless of what the vision program is currently doing.

When communication with the camera is started (for example with GigEVision_StartAcquisition, though this applies to other vendors' interfaces as well), Aurora Vision creates a dedicated background thread. This thread handles all communication with that particular camera and stores received images in a special queue. While not exactly the same, that queue is in principle similar to the queues available to the user in Aurora Vision. The size of the queue is determined by the filter (some camera interfaces do not support sizes larger than 1).

When the program executes a grabbing filter (e.g. GigEVision_GrabImage:Synchronous), that filter in fact takes an image from the queue maintained by the background acquisition thread, not directly from the camera.

If there are no images in the queue, the grabbing filter waits until a new image arrives, either for an indefinite amount of time (synchronous variant) or for a specified amount of time (asynchronous variant). This behavior is analogous to how the filters Queue_Pop and Queue_Pop_Timeout behave, respectively. A grabbed image is removed from the queue. If the queue is full when the camera sends a new image, the oldest image in the queue is replaced.
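The queue behavior described above can be sketched in plain Python. This is an illustrative model, not the Aurora Vision API: the class name FrameQueue and its methods are made up, but the semantics (drop the oldest frame on overflow, block indefinitely or with a timeout when empty) mirror the description.

```python
import threading
from collections import deque

class FrameQueue:
    """Hypothetical model of the per-camera acquisition queue."""

    def __init__(self, size):
        self._frames = deque()
        self._size = size
        self._cond = threading.Condition()

    def push(self, frame):
        # Called by the background acquisition thread. When the queue
        # is full, the oldest frame is replaced by the new one.
        with self._cond:
            if len(self._frames) == self._size:
                self._frames.popleft()
            self._frames.append(frame)
            self._cond.notify()

    def pop(self):
        # Synchronous variant (like Queue_Pop): wait indefinitely,
        # then remove and return the oldest frame.
        with self._cond:
            while not self._frames:
                self._cond.wait()
            return self._frames.popleft()

    def pop_timeout(self, timeout):
        # Timeout variant (like Queue_Pop_Timeout): wait at most
        # `timeout` seconds; return None if no frame arrived in time.
        with self._cond:
            if self._cond.wait_for(lambda: bool(self._frames), timeout):
                return self._frames.popleft()
            return None

q = FrameQueue(size=2)
for i in range(3):
    q.push("frame%d" % i)
print(q.pop())  # "frame1" -- "frame0" was the oldest and was replaced
```

Note that pop always returns the oldest surviving frame, which is exactly why queue sizing matters in the scenarios discussed below.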

Images remain in the queue until they are grabbed, replaced by a newer image, or their camera thread is closed.

What is important to remember is that if the queue size is larger than 1, the grabbing filter will return the oldest image available. Depending on the application structure this may or may not be the desired behavior.

  • If the application has to analyze every image, but the iteration time may be longer than the period between images, the queue should be large enough to store all images until the application can catch up. For example, if a camera is triggered multiple times in a burst, followed by a period of inactivity, the queue size should be equal to the number of triggers in one burst.
  • Conversely, if the application does not need to inspect every image (common in applications with a free-running camera), the queue size can usually be limited to one. This ensures the lowest possible lag between image acquisition and results.
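The effect of queue size on which frames survive a trigger burst can be illustrated with a short sketch. Plain Python deques stand in for the camera queue here (a deque with maxlen drops its oldest element on overflow, mimicking the replace-oldest behavior); the frame names are made up:

```python
from collections import deque

# A burst of five triggered frames arriving faster than they are processed.
burst = ["frame0", "frame1", "frame2", "frame3", "frame4"]

# Queue sized to the burst: every frame is kept for later analysis.
sized_for_burst = deque(maxlen=5)
sized_for_burst.extend(burst)

# Queue of size 1: only the most recent frame survives (lowest lag).
latest_only = deque(maxlen=1)
latest_only.extend(burst)

print(list(sized_for_burst))  # all five frames
print(list(latest_only))      # ['frame4']
```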

As mentioned before, the background camera thread handles all communication with its assigned camera. If no thread exists for the specified camera, one is created as soon as any camera filter is used. A camera can have only one thread assigned to it, so filters always use the existing thread if one is available.
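The one-thread-per-camera rule amounts to a lazily populated registry keyed by camera identifier. The sketch below is a hypothetical model of that idea, not Aurora Vision internals; all names are illustrative:

```python
import threading

_camera_threads = {}          # camera id -> (thread, stop event)
_registry_lock = threading.Lock()

def _acquisition_loop(camera_id, stop_event):
    # Placeholder for the real per-camera communication loop, which
    # would receive images and parameter requests until shut down.
    stop_event.wait()

def get_camera_thread(camera_id):
    """Return the existing thread for this camera, creating it lazily."""
    with _registry_lock:
        entry = _camera_threads.get(camera_id)
        if entry is None:
            stop_event = threading.Event()
            thread = threading.Thread(
                target=_acquisition_loop,
                args=(camera_id, stop_event),
                daemon=True)
            thread.start()
            entry = (thread, stop_event)
            _camera_threads[camera_id] = entry
        return entry[0]

# Every filter touching the same camera gets the same thread:
t1 = get_camera_thread("cam0")
t2 = get_camera_thread("cam0")
print(t1 is t2)  # True
```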

Camera threads also handle setting and reading parameters. Depending on the camera interface there may be optimizations, such as writing new parameter values only if the value has changed.
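The write-if-changed optimization mentioned above can be modeled with a small cache of last-written values. This is a hypothetical sketch (ParameterCache and send_to_camera are invented names standing in for real device I/O):

```python
class ParameterCache:
    """Skips camera writes when the parameter value has not changed."""

    def __init__(self, send_to_camera):
        self._send = send_to_camera   # callable performing the real write
        self._known = {}              # parameter name -> last written value

    def set_parameter(self, name, value):
        # Only forward the write when the value actually changes.
        if self._known.get(name) != value:
            self._send(name, value)
            self._known[name] = value

writes = []
cache = ParameterCache(lambda name, value: writes.append((name, value)))
cache.set_parameter("ExposureTime", 5000)
cache.set_parameter("ExposureTime", 5000)  # skipped: value unchanged
cache.set_parameter("ExposureTime", 8000)
print(writes)  # [('ExposureTime', 5000), ('ExposureTime', 8000)]
```

Such a cache is safe only while the camera thread is the sole writer of parameters, which matches the single-thread-per-camera design described earlier.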

It is important to remember that camera threads are closed when the task in which they were created ends. When a thread is closed, all of its data (including images in the queue) is removed.
