@finddavesteele
Last active September 6, 2023 17:41
Brinq Smart Camera Analytics for NAVQ+


Overview of Brinq Smart Camera Analytics for NAVQ+

The NAVQ+ Smart Camera platform includes AI-enabled vision analytics for boundary crossing, people counting, tracking, reidentification, and motionless and loitering behavior. These analytics provide powerful tools to help identify events that occur in live video. This tutorial explains a) the expected behavior of each analytic and b) how to configure them from first principles.

An additional getting-started tutorial and a video demo have been created to illustrate features and operation using a test source video; these should be reviewed first.

This is demo software and is limited in its features and capabilities. Additional analytics for people, vehicles, packages and inspection are available from Arcturus along with development services for specialized analytics and support. Refer to the contact section to reach out to us. We are here to help you deliver your ML and vision projects.

Object Detection

The demo uses a lightweight YOLOv5 model that performs efficiently on the i.MX 8MP NPU. The model is trained on the open-source COCO dataset, limited to the person class only in order to maximize accuracy. The model is designed to achieve real-time performance with good accuracy and sufficient generalization for most applications. Additional models are available from Arcturus, including a proprietary people detection model trained on a custom curated dataset sourced from multiple validated public sources. This model improves accuracy and generally performs better with no additional performance sacrifice; contact Arcturus for additional information.

Refer also to the Advanced section for additional settings.

Tracking and Reidentification

Core to many of the analytics is a tracking and reidentification capability. This capability acts as a fundamental primitive that underpins the analytics and enhances detection by assigning a unique identity to each person as they enter the field of view. As a person moves around the field of view, a motion-prediction tracking model is used to reassign the same identity to each bounding box, making it possible to reidentify the same person frame-over-frame. Motion prediction is a lightweight and highly effective method of tracking and reidentification; however, it is dependent on continuous detections. If detections are lost (beyond a tolerance period), a new ID will be assigned to the person when they reappear.
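As a toy illustration of this behavior (not the product's motion-prediction model), the sketch below matches each frame's detections to the nearest previously seen centroid and retires a track once its miss count exceeds a tolerance. All class names and thresholds here are hypothetical:

```python
# Toy ID tracker: nearest-centroid matching with a miss tolerance.
# Illustrates ID continuity and the "new ID after lost detections" behavior.
import math

class ToyTracker:
    def __init__(self, max_distance=50.0, max_misses=15):
        self.max_distance = max_distance  # pixels; match tolerance
        self.max_misses = max_misses      # frames a track may go undetected
        self.next_id = 0
        self.tracks = {}  # id -> {"centroid": (x, y), "misses": 0}

    def update(self, centroids):
        """Match this frame's detections to existing tracks; return ID list."""
        assigned = []
        unmatched = set(self.tracks)
        for c in centroids:
            best, best_d = None, self.max_distance
            for tid in unmatched:
                d = math.dist(c, self.tracks[tid]["centroid"])
                if d < best_d:
                    best, best_d = tid, d
            if best is None:  # no nearby track -> new identity
                best = self.next_id
                self.next_id += 1
            else:
                unmatched.discard(best)
            self.tracks[best] = {"centroid": c, "misses": 0}
            assigned.append(best)
        # Age out tracks that were not matched this frame
        for tid in unmatched:
            self.tracks[tid]["misses"] += 1
            if self.tracks[tid]["misses"] > self.max_misses:
                del self.tracks[tid]
        return assigned
```

With a small `max_misses`, a person who disappears briefly and reappears receives a new ID, matching the tolerance-period behavior described above.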

Additional tracking and reidentification methods are available from Arcturus which include the use of an additional visual appearance metric to reassociate identities based on appearance, should detections be lost. This makes it possible to assign the same identity to a person who has left and re-entered the field of view or who may have become occluded due to a person or object passing in front of them. This form of re-id is more computationally intensive, but more robust to dynamic scenes.
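A minimal sketch of how appearance-based re-association could work, assuming each lost track stores an appearance embedding vector. The Arcturus implementation is not public; the function names and similarity threshold below are illustrative:

```python
# Illustrative appearance re-association: compare a new detection's
# embedding against a gallery of lost-track embeddings using cosine
# similarity, and re-assign the best-matching identity if it is close enough.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def reassociate(embedding, gallery, min_similarity=0.7):
    """Return the stored track id whose embedding best matches, else None."""
    best_id, best_sim = None, min_similarity
    for tid, emb in gallery.items():
        sim = cosine_similarity(embedding, emb)
        if sim > best_sim:
            best_id, best_sim = tid, sim
    return best_id
```

This is why appearance-based re-id survives occlusion and re-entry: the identity is recovered from what the person looks like, not from where the motion model predicted they would be.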

Analytics

Boundary Crossing

The Boundary Crossing analytic supports three distinct features:

Boundary Crossing

  • Makes it possible to detect when a person has entered or exited a particular region or zone, much like the behavior of a trip wire. This analytic has many general-purpose uses, including:
    • Secure areas, where people should not be
    • Staffed areas, such as a security desk
    • Monitoring discrete entrances and exits to determine capacity

Intrusion

  • The intrusion detection feature makes it possible to trigger an event if a person suddenly appears in a zone without first crossing a boundary, for example:
    • A person suddenly appears by opening a door

People Counting

  • The people counting feature makes it possible to:
    • Determine how many people are in each zone
    • Determine the total number of unique people in the field of view
    • Determine people traffic per minute (e.g. impression counting or traffic analysis)
    • Trigger an event based on the sudden appearance of a crowd (e.g. a fire alarm that has caused an evacuation)
    • Trigger an event based on min/max capacity thresholds

Expected Behavior

Boundary Crossing

  • The boundary crossing analytic relies on the tracking and reidentification feature to assign each person in the field of view a unique identity number and track them frame-over-frame.
  • The boundary crossing feature uses foot-fall logic to determine an event trigger. This implementation is consistent with a "trip wire" use case example. Specifically, a boundary crossing event is triggered by the center of the base of the bounding box passing over the zone boundary threshold. Debounce on enter/leave events requires 15 successful frames of either entering or leaving before triggering the event and sending a notification. The analytic relies on configuring zones and schedules to define operation.
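The foot-fall trigger and debounce described above can be sketched as follows. Only the bottom-center trigger point and the 15-frame debounce are taken from the text; the point-in-polygon routine and class names are illustrative:

```python
# Sketch of foot-fall boundary crossing with a 15-frame debounce:
# the center of the base of the bounding box is tested against the zone
# polygon, and an enter/leave event fires only after 15 consecutive
# frames agree on the new state.

DEBOUNCE_FRAMES = 15  # per the documented enter/leave debounce

def foot_point(box):
    """Center of the base of a bounding box (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, y2)

def point_in_polygon(pt, poly):
    """Standard ray-casting test; poly is a list of (x, y) vertices."""
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

class ZoneDebounce:
    """Tracks one person against one zone and emits debounced events."""
    def __init__(self, poly):
        self.poly = poly
        self.state = False   # last confirmed inside/outside state
        self.count = 0       # consecutive frames in the opposite state

    def update(self, box):
        inside = point_in_polygon(foot_point(box), self.poly)
        if inside != self.state:
            self.count += 1
            if self.count >= DEBOUNCE_FRAMES:
                self.state = inside
                self.count = 0
                return "zone-enter" if inside else "zone-leave"
        else:
            self.count = 0
        return None
```

Using the foot point rather than the box center keeps the trigger anchored to where the person is standing, which is what makes the "trip wire" behavior intuitive.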

Intrusion

  • Intrusion will detect when a person suddenly appears in a zone but has not necessarily passed across a boundary. This feature uses the same debounce and detection conditions as the boundary crossing feature.

People Counting

  • The people counting feature provides a tally of the people in each zone. This feature displays the output of each zone in real-time as an overlay in the live video feed. This overlay is configured in the Advanced section of the analytics feature in the settings panel.

  • The people counting feature also feeds data to the Statistics view, including the People Traffic per Minute chart and additional min/max trigger threshold events that can be configured in the General tab of the settings panel (note this feature is not available in the demo code - contact Arcturus).

NOTE: In some cases where a person is partially occluded or close up, they may visually appear to be in a zone while the center point of their bounding box still resides outside the zone, and thus not trip a detection. Depending on the use case, the methodology can be changed to IoU or Centroid. Contact Arcturus for support.

Configuration

To configure the boundary crossing analytic, open the settings panel by clicking on the gear icon in the top right corner of the web dashboard. Then navigate to the BOUNDARY CROSSING tab.

Zones

To configure a zone, click on the ADD ZONE button to open the zone configuration panel.

Follow the steps below to create a zone:

  1. Name the zone
  2. Select the color for the zone from the color palette or using a hex value
  3. Click on the ADD POLYGON icon to add a polygon to the image field
  4. Move the polygon points to the desired location
  5. Select the desired event notifications (zone-enter or zone-leave)
  6. Click on the ADD ITEM button to create the zone

ProTip:

  • To select a polygon to edit, click on it
  • To move a selected polygon, left click and drag
  • To delete a selected polygon, press backspace
  • To move a point of the selected polygon, left click and drag the point
  • To add a point to a selected polygon, right click the desired location on the polygon frame
  • To delete a point from the polygon, right click on the point
  • To invert the selected region, check the Invert Regions check box

Schedules

A tool is available to build automation schedules that control when incident notifications are sent to the dashboard. To enable schedules, the "Send incident notifications at all times" check box must be deselected.


To set a schedule:

  1. Deselect the "Send incident notifications at all times" check box
  2. Click on the calendar schedule to open the clock / calendar selector
  3. Provide a name for the schedule
  4. Click on the Start Time to define the schedule start time
  5. Click on the End Time to define the schedule end time
  6. Click on the days of the week to add them to the selected days
  7. Click on ADD ITEM to create the schedule

Event Notifications

The following event notifications are provided:

Zone-Enter Event

  • Is triggered when an object appears in a zone.
  • Is triggered when the center point of the bottom of an object bounding box passes across the boundary of a configured zone.

Zone-Leave Event

  • Is triggered when an object no longer appears in a zone.
  • Is triggered when the center point of the bottom of an object bounding box passes across the boundary of a configured zone.

Occupancy Exceeds Event

  • Is triggered when the total number of people in the field of view exceeds the threshold value (feature not available in demo code - contact Arcturus)

Occupancy Drops Below Event

  • Is triggered when the total number of people in the field of view drops below the threshold value (feature not available in demo code - contact Arcturus)

Statistics

The STATISTICS view contains several charts to help visualize event activity. Time periods of charts can be adjusted by using the time range selector in the top right corner of the statistics view. The following statistics relate to the Boundary Crossing analytic:

  • Zone Incidents

    • Incidents over a selected time period
  • People Traffic Per Minute

    • People counting per minute over a selected time period (often referred to as impressions)
  • Incident Averages

    • A rolling average of events over a 7-day period
    • A rolling average of events over a 30-day period

Loitering

The loitering analytic makes it possible to determine when the same person has remained in a zone for an extended period of time. This is a useful analytic for:

  • Monitoring areas such as exit doors
  • Monitoring areas prone to suspicious activity

Expected Behavior

The loitering analytic relies on the tracking and reidentification feature to assign each person in the field of view a unique identity number and track them frame-over-frame.

Once a person is detected, a timer is started to determine how long the person has been in the field of view. A timer threshold value determines when the person is considered to be loitering and triggers an incident notification. The timer threshold value is configurable using the analytics settings. The analytic relies on configuring zones and schedules to define operation.
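The timer behavior above can be sketched as follows, assuming a per-identity timestamp of first appearance. The threshold is in milliseconds, matching the Loitering Threshold setting; class and event names are illustrative:

```python
# Sketch of a per-identity loitering timer: each tracked ID records when
# it was first seen, fires a loitering event once past the threshold,
# and fires a loitering-ends event when it leaves or is lost.

class LoiterWatch:
    def __init__(self, threshold_ms):
        self.threshold_ms = threshold_ms
        self.first_seen = {}   # track id -> timestamp (ms) of first sighting
        self.alarmed = set()   # ids that have already triggered an event

    def update(self, track_id, now_ms):
        """Return 'loitering-detected' once per id past the threshold."""
        start = self.first_seen.setdefault(track_id, now_ms)
        if track_id not in self.alarmed and now_ms - start >= self.threshold_ms:
            self.alarmed.add(track_id)
            return "loitering-detected"
        return None

    def leave(self, track_id):
        """Call when the id exits the zone or is no longer detected."""
        was_alarmed = track_id in self.alarmed
        self.first_seen.pop(track_id, None)
        self.alarmed.discard(track_id)
        return "loitering-ends" if was_alarmed else None
```

Because the timer is keyed on the track ID, a lost-and-reassigned identity restarts the clock, which is why reliable tracking underpins this analytic.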

Configuration

The analytic relies on configuring zones and schedules to define operation along with a loitering threshold value. These parameters are configured using the settings panel by clicking on the gear icon in the top right corner of the dashboard and selecting the LOITERING tab.

Zones

  • Follow the zone configuration instructions as described in the Boundary Crossing analytic to set loitering zones.

Schedules

  • Follow the schedule configuration instructions as described in the Boundary Crossing analytic to set the loitering schedule.

Advanced Settings

  • Loitering Threshold
    • The loitering threshold setting is a timer in ms (milliseconds) that determines the length of time a person must be tracked before they are considered to be loitering.

Event Notifications

The following event notifications are provided:

Loitering-detected Event

  • Is triggered when a person remains inside a loitering zone for longer than the threshold value.

Loitering-ends Event

  • Is triggered when a person exits a loitering zone or is no longer detected.

Statistics

The STATISTICS view contains a chart to help visualize event activity. Time periods of charts can be adjusted by using the time range selector in the top right corner of the statistics view.

Motionless

The motionless analytic makes it possible to determine when a person who has been moving has stopped moving. This is a useful analytic for:

  • Detecting if someone is trying to obfuscate themselves
  • Detecting if someone has fallen asleep

Expected Behavior

The motionless analytic relies on the tracking and reidentification feature to assign each person in the field of view a unique identity number and track them frame-over-frame. Once a person is detected, the motionless analytic stores the localization information. This initialized localization is calculated as a mean value derived from a buffer of multiple detections. Periodically, the current localization information is compared against the initialized localization value of the same ID to determine pixel-wise motion using Euclidean distance. A threshold value is used to determine the maximum amount of motion tolerated before an event is triggered.
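The mean-buffer comparison above can be sketched as follows; the buffer size and pixel threshold are placeholder values, not the product's defaults:

```python
# Sketch of the motionless check: a short buffer of recent centroids
# forms a mean reference point, and the current position is compared to
# it with Euclidean distance against a pixel threshold.
import math
from collections import deque

class MotionlessCheck:
    def __init__(self, buffer_size=10, max_distance_px=8.0):
        self.buffer = deque(maxlen=buffer_size)  # recent (x, y) centroids
        self.max_distance_px = max_distance_px

    def update(self, centroid):
        """Return True once the buffer is full and motion stays under threshold."""
        self.buffer.append(centroid)
        if len(self.buffer) < self.buffer.maxlen:
            return False  # still initializing the reference
        mx = sum(p[0] for p in self.buffer) / len(self.buffer)
        my = sum(p[1] for p in self.buffer) / len(self.buffer)
        return math.dist(centroid, (mx, my)) <= self.max_distance_px
```

The Maximum Buffer Size, Sample Size, and Maximum Euclidean Distance settings described under Advanced Settings correspond to the knobs this sketch collapses into `buffer_size` and `max_distance_px`.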

Configuration

The analytic relies on configuring zones and schedules to define operation, along with the motionless threshold values described under Advanced Settings. These parameters are configured using the settings panel by clicking on the gear icon in the top right corner of the dashboard and selecting the MOTIONLESS tab.

Zones

  • Follow the zone configuration instructions as described in the Boundary Crossing analytic to set motionless zones.

Schedules

  • Follow the schedule configuration instructions as described in the Boundary Crossing analytic to set the motionless schedule.

Advanced Settings

  • Initial Tracks

    • Defines the number of initial tracks stored until the analytic begins.
  • Maximum Buffer Size

    • Maximum buffer size of stored points to calculate mean.
  • Sample Size

    • Sample window of tracklets used to derive the direction vector.
  • Alarm Delay (off)

    • Minimum number of ms before a motionless event can be triggered again.
  • Maximum Misses

    • Maximum number of consecutive misses until an object is classified as moving.
  • Maximum Euclidean Distance

    • Maximum distance in pixels that an object can move to still be classified as motionless.
  • Alarm Delay

    • Minimum number of ms an object must remain motionless before triggering an event.

Event Notifications

Motionless Person

  • Is triggered when a person remains in a motionless state, inside a motionless zone, for longer than the threshold values.

End of Motionless Period

  • Is triggered when a person stops being motionless or is no longer detected in a motionless zone.

Stats

The STATISTICS view contains a chart to help visualize event activity. Time periods of charts can be adjusted by using the time range selector in the top right corner of the statistics view.

Other Settings

Video Source

The VIDEO SOURCE setting provides a method to change the input video source for the analytics system. Video sources can include local V4L2 camera sources (such as a MIPI-CSI or USB camera), an RTSP camera stream from a remote IP camera, or a local video file in a compatible video format. The Video Source can be configured using the settings panel by clicking on the gear icon in the top right corner of the dashboard and selecting the VIDEO SOURCE tab. The default video source for the NAVQ+ is the local camera. A second test source video file has been provided to help evaluate the analytics. A video demo has been created to illustrate features and operation using this test source video.

There are three parts to the video source:

  1. Active Video Sources
  2. Video Sources
  3. Local Video Sources

Active Video Sources

The Active Video Source drop-down enables the selection of the current video source from the list of video sources.

Video Sources

The Video Sources table displays a list of available video sources. To add a source, click on the ADD SOURCE button, complete the form, and submit by using the ADD ITEM button.

Web Camera Example

The local camera provided with the NAVQ+ is configured by default. It is not recommended that it be changed or removed. It can be accessed using:

v4l2src device=/dev/video3 ! video/x-raw,width=720,height=480 ! imxvideoconvert_g2d ! videoconvert ! videocrop top=-1 left=-1 bottom=-1 right=-1 ! video/x-raw,width=640,height=360 ! appsink

Local IP Camera Example

The address of the local camera stream will vary across manufacturers; refer to your camera manufacturer for more information. The example below is provided as a reference; it uses an Axis IP Camera located at IP address 192.168.1.16.

rtspsrc location=rtsp://192.168.1.16/axis-media/media.amp is-live=true latency=100 ! rtph264depay ! vpudec ! queue leaky=2 max-size-buffers=2 ! imxvideoconvert_g2d ! videoconvert ! videocrop top=-1 left=-1 bottom=-1 right=-1 ! video/x-raw,width=640,height=360 ! appsink
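An RTSP source of this shape can be sanity-checked from a shell on the NAVQ+ before it is added as a video source. This is a hedged sketch: the `appsink` element is swapped for `fakesink` so the pipeline runs without an application, and the IP address is the example value from above:

```shell
# Test the RTSP pipeline standalone with gst-launch-1.0; fakesink
# replaces appsink so no receiving application is required.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.16/axis-media/media.amp is-live=true latency=100 ! \
  rtph264depay ! vpudec ! queue leaky=2 max-size-buffers=2 ! \
  imxvideoconvert_g2d ! videoconvert ! fakesink
```

If this pipeline fails to preroll, the camera address or credentials are the most likely cause.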

Upload Local Video File

The Upload Local Video File tool uploads a local file to the NAVQ+ file:///src/streamproc/data/videos/ directory. To upload a file, click the UPLOAD LOCAL VIDEO FILE button and select the file you want to upload from your local machine. Selecting the file will prompt an Add Video Source dialog box which contains the explicit path of the video on the NAVQ+. Complete the Add Video Source dialog box information and submit by pressing ADD ITEM, then APPLY CHANGES. The software will automatically append additional parameters to the source information to maintain playback compatibility. A video file must be 640x360, encoded using H.264 or mJPEG, and saved as an .mp4 file.

Once a video is uploaded, it will need to be selected as the Active Video Source. Uploaded test videos will play back in a loop by default.
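A clip that does not already meet the 640x360 H.264 .mp4 requirement can be converted before uploading. This ffmpeg invocation is a suggestion, not from the product documentation; the filenames are placeholders:

```shell
# Scale to 640x360, encode as H.264, strip audio, and write an .mp4
# that matches the stated upload requirements.
ffmpeg -i input.mov -vf scale=640:360 -c:v libx264 -pix_fmt yuv420p -an output.mp4
```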

Advanced Settings

Minimum Confidence Threshold

A Minimum Confidence Threshold setting determines the minimum object detection confidence required before a detection is accepted. A detection with lower confidence than the threshold will not display a bounding box and will not be identified or tracked by the system. This setting applies globally to all detections.

The Minimum Confidence Threshold can be configured using the settings panel by clicking on the gear icon in the top right corner of the dashboard and selecting the ADVANCED tab.
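The effect of a global threshold can be illustrated with a small sketch; the detection structure and function name are hypothetical:

```python
# Illustrative global confidence filter: detections below the threshold
# are dropped before any tracking or zone logic runs.

MIN_CONFIDENCE = 0.5  # example value; set in the ADVANCED tab

def filter_detections(detections, threshold=MIN_CONFIDENCE):
    """Keep only detections at or above the confidence threshold."""
    return [d for d in detections if d["confidence"] >= threshold]
```

Raising the threshold reduces false positives at the cost of missing low-confidence (distant or partially occluded) people, so it should be tuned against the actual scene.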

Contact and Additional Information

Brinq Smart Camera Software Demo on NAVQ+

Copyright 2023 - Arcturus Networks Inc. All Rights Reserved.

Licensing and customization available from Arcturus

Demo License and Terms

Refer to the Brinq Smart Camera Software Demo On NAVQ+ Hardware - License Terms (EULA)

Additional Resources

The following resources and links are available to support NAVQ+ development:

Arcturus - Brinq Smart Camera Software

Emcraft Hardware

NXP Hardware
