1 Introduction

This is the user manual for the VCAedge video analytics plug-in.

This manual will describe how to license, enable and configure the features of our video analytics to detect events of interest and trigger actions to react to those events.

2 Getting Started

The VCAedge plug-in is a set of analytical tools that can be loaded onto supported cameras. It provides the means to perform advanced analytics, reduce false alerts and customize when events occur. To get started, you will need to add a license, after which you can enable the VCAedge engine and start using the features.

Before continuing, make sure you are familiar with the camera's interface and have the username and password available.

3 Enable/Disable

By default, the VCAedge plug-in is disabled. Select Activate to enable the plug-in.

3.1 General Setting

3.2 Tracker Engine

The tracker engine setting is enabled depending on the hardware platform and active license.

Note: The menu system reloads after each apply to reflect the available features; which features are available depends on the license that has been applied.

4 Rules

Rules are used to react to events within a scene and trigger actions. To manage the rules, navigate to the rules feature from the VCAedge menu.

The rules page displays a live view from the camera and allows you to add, modify or delete rules.

4.1 Show Annotation

Use this option to show analytic data on the camera view. Select Start to show the data and Stop to hide it.

Note: The burnt-in annotation feature must be enabled for this option to function. This does not affect the processing of analytics, but the annotation requires more resource from the camera and is not on by default.

4.2 Rule(s)

The table shows the rules that have been defined for the camera. Add creates additional rules, Modify changes the settings of a selected rule, and Delete removes the selected rule.

Note: Rules cannot be deleted if they are linked to an action or counter. Remove these links before attempting to delete.

4.3 Types of Rules Available

The types of rules available include:

4.4 How to Add a Rule

4.5 How to Modify a Rule

4.6 How to Delete a Rule

Note: Rules cannot be deleted if they are linked to an action or counter. Remove these links before attempting to delete.

4.7 Presence Polygon

The presence polygon rule triggers an event when an object is first detected in a particular zone.

Note: The presence polygon rule triggers in the same circumstances as the enter and appear rules; the most appropriate rule will depend on the scenario.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.7.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.7.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.7.3 Save

Click Save to save the current settings.

4.7.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.8 Presence Line

The presence line rule triggers an event when an object is first detected crossing a particular line.

Note: The presence line rule triggers in the same circumstances as the direction and counting line rules; the most appropriate rule will depend on the scenario.

The rule creates a line and overlays it on the live view; the line can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.8.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.8.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.8.3 Save

Click Save to save the current settings.

4.8.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.9 Enter

The enter rule triggers an event when an object crosses from outside a zone to inside a zone.

Note: The enter rule detects already-tracked objects crossing the zone border from outside to inside.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.9.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled      The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.9.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.9.3 Save

Click Save to save the current settings.

4.9.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.10 Exit

The exit rule triggers an event when an object crosses from inside a zone to outside a zone.

Note: The exit rule detects already-tracked objects crossing the zone border from inside to outside.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.10.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.10.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.10.3 Save

Click Save to save the current settings.

4.10.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.11 Appear

The appear rule triggers an event when an object starts to be tracked from within a zone.

Note: The appear rule detects objects that start being tracked from within a zone.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.11.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.11.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.11.3 Save

Click Save to save the current settings.

4.11.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.12 Disappear

The disappear rule triggers an event when an object stops being tracked within a zone.

Note: The disappear rule detects objects that stop being tracked from within a zone.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.12.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.12.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.12.3 Save

Click Save to save the current settings.

4.12.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.13 Stopped

The stopped rule triggers an event when an object has stopped in a particular zone for a pre-defined period of time.

Note: The stopped rule does not detect abandoned objects. It only detects objects which have moved at some point and then become stationary.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.13.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.13.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.13.3 Save

Click Save to save the current settings.

4.13.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.14 Dwell

The dwell rule triggers an event when an object is present in a particular zone for a predefined period of time.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.14.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.14.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.14.3 Save

Click Save to save the current settings.

4.14.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.15 Direction

The direction rule triggers an event when an object crosses the detection line in a particular direction and within the acceptance parameters.

The rule creates a line and overlays it on the live view; the line can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.15.1 Rule Properties

Note: You can also adjust these settings using the on-screen controls. Click and hold inside the dotted circles and drag to the desired angle.

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.15.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.15.3 Save

Click Save to save the current settings.

4.15.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.16 Removed

The removed rule triggers an event when the area within a zone has changed for the specified time.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.16.1 Rule Properties

4.16.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.16.3 Save

Click Save to save the current settings.

4.16.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.17 Abandoned

The abandoned rule triggers an event when an object is left in a zone for the specified time.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.17.1 Rule Properties

4.17.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.17.3 Save

Click Save to save the current settings.

4.17.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.18 Tailgating

The tailgating rule triggers an event when objects cross over a line within quick succession of each other.

The rule creates a line and overlays it on the live view; the line can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.18.1 Rule Properties

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.

4.18.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.18.3 Save

Click Save to save the current settings.

4.18.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.19 Counting Line

The counting line rule triggers an event when an object crosses the line in the direction indicated.

Note: The counting line differs from the direction rule in that each segment of the line can have a different direction defined.

The rule creates a line and overlays it on the live view; the line can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

The direction indicator shows the direction objects must take to trigger an event; segments can be configured to point in any direction required.

4.19.1 Rule Properties

Note: The direction that will be used is shown on screen as you select the options.

Note: The object filtering feature is not available when using the counting line rule.

4.19.2 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.19.3 Save

Click Save to save the current settings.

4.19.4 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.20 Logical Rule

Logical rules extend the standard rules by allowing multiple inputs to be combined using logical expressions, which helps to reduce false events.

The rule allows you to combine other rules into a logical expression using the AND operator and can be used to filter and reduce false events.

4.20.1 Rule Properties

4.20.2 AND operator

The AND operator combines two or more rules and only fires events if all the rules have triggered. By default, a new logical rule allows two rules to be combined, for example, trigger an event when the presence polygon AND the presence line rules are true.

4.20.3 WITHIN x seconds

This behaves like a PREVIOUS operator: it holds an event as true for the defined period of time. An object can trigger one rule and then trigger the next rule within x seconds to cause an event to occur. This is sometimes known as the double-knock rule.
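As a sketch of the double-knock logic described above (hypothetical class and rule names; this is not the plug-in's implementation), a logical AND with a time window fires only when every input rule has triggered within the window:

```python
class DoubleKnock:
    """Fires an event only when both input rules have triggered
    within `window` seconds of each other (AND + WITHIN x seconds)."""

    RULES = ("presence_polygon", "presence_line")  # hypothetical rule names

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.last = {}  # rule name -> time of its most recent trigger

    def trigger(self, rule_name, now):
        self.last[rule_name] = now
        times = [self.last.get(r) for r in self.RULES]
        # True only if every rule has triggered and each trigger is recent.
        return all(t is not None and now - t <= self.window for t in times)

dk = DoubleKnock(window_seconds=5)
dk.trigger("presence_polygon", now=0.0)        # first knock: no event yet
print(dk.trigger("presence_line", now=3.0))    # second knock within 5 s: True
```

An object must satisfy both rules inside the window; a single rule triggering, or two triggers spaced further apart than the window, produces no event.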

4.20.4 Object Filter

This allows the rule to be configured to trigger only on an object's classification (e.g. person, vehicle); any combination of the available options is possible.

Note: The object filters of the rules included in the logical rule are disabled.

Note: The available classifiers are different depending on the hardware platform and the installed license.

Tracker Engine     Calibration   Available classifiers of Object Filter
Object Tracker     disabled      Not available (Object Filter Off)
Object Tracker     enabled       The classifiers defined in the classification
DL Object Tracker  -             Any of the available classifiers
DL People Tracker  -             Any of the available classifiers

Note: Face detection is only available when the DL People Tracker is selected; it provides additional metadata to the channel.

4.20.5 Colour Filter

This provides the ability to detect objects based on their colour components, which are grouped into 10 colours.

Note: The colour filters of the rules included in the logical rule are disabled.

Note: If object colour is enabled in the BURNT-IN ANNOTATION menu, the top four colours that each make up more than 5% of the object are shown in the colour swatch attached to the object.
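To illustrate the swatch behaviour described in the note (a sketch only, not the plug-in's code): given an object's pixel coverage grouped into the 10 colours, keep the colours covering more than 5% of the object and report at most the top four:

```python
def swatch_colours(colour_fractions):
    """colour_fractions: dict of colour name -> fraction of object pixels.
    Returns up to four colours that each cover more than 5% of the object,
    ordered by coverage (largest first)."""
    above = [(c, f) for c, f in colour_fractions.items() if f > 0.05]
    above.sort(key=lambda cf: cf[1], reverse=True)
    return [c for c, _ in above[:4]]

print(swatch_colours({"red": 0.40, "black": 0.30, "white": 0.12,
                      "blue": 0.08, "grey": 0.06, "green": 0.04}))
# ['red', 'black', 'white', 'blue']
```

Here green is excluded by the 5% threshold and grey is dropped because only the top four colours are kept.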

4.20.6 Event Actions

The following settings apply to the selected notification methods.

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

4.20.7 Save

Click Save to save the current settings.

4.20.8 Cancel

Click Cancel to return to the rules screen without saving any changes.

4.21 Non-detect zone

The non-detect zone can be used to exclude areas of the scene from being analysed. This can be used to reduce false triggers that can be caused by moving foliage or busy scenes.

The rule creates a zone and overlays it on the live view; the zone can be reshaped as required. Selecting a grey node splits the segment to create a more complex shape; to remove a segment, select the minus sign next to a red node.

4.21.1 Rule Properties

4.21.2 Save

Click Save to save the current settings.

4.21.3 Cancel

Click Cancel to return to the rules screen without saving any changes.

5 Counters

Counters can be configured to count the number of times a rule is triggered, for example the number of people crossing a line.

The counters page displays a live view from the camera and allows you to add, modify or delete counters.

5.1 Show Burnt-in Annotation for setup

Use this option to show analytic data on the camera view. Select Start to show the data and Stop to hide it.

Note: The burnt-in annotation feature must be enabled for this option to function. This does not affect the processing of analytics, but the annotation requires more resource from the camera and is not on by default.

5.2 Counter(s)

The table shows the counters that have been defined for the camera. Add creates a counter, Modify changes the settings of a selected counter, and Delete removes the selected counter.

5.3 Counter

The counter feature creates a counter field and overlays it on the live view; the counter can be repositioned on the screen as required.

Note: A counter position can only be modified by selecting the counter and clicking modify.

A counter is designed to be utilised in the following way:

More than one counting line rule can be assigned to a counter input. This allows, for example, the occupancy of two counting lines to be reflected in a single counter or more than one entrance / exit gate to be assigned to a counter.

Note: Counters should not be used for occupancy and increment/decrement at the same time.
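The counter behaviour described above can be sketched as follows (hypothetical class; this mirrors the described increment/decrement and reset semantics, not the plug-in's implementation):

```python
class RuleCounter:
    """Counts rule triggers: counting lines assigned to the increment input
    raise the value, lines assigned to the decrement input lower it, and the
    Reset Counter action returns it to zero."""

    def __init__(self):
        self.value = 0

    def increment(self):
        # e.g. an object crosses an entrance counting line
        self.value += 1

    def decrement(self):
        # e.g. an object crosses an exit counting line
        self.value -= 1

    def reset(self):
        # the Reset Counter action
        self.value = 0

c = RuleCounter()
c.increment(); c.increment(); c.decrement()
print(c.value)  # 1
```

With more than one counting line assigned to the same input, each line's triggers feed the same value, which is how several entrance/exit gates can drive a single counter.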

5.3.1 Counter Property

Select Add input to show a list of available counting lines that can be added. Select the cross next to rules already added to remove them from the counter.

Note: Events created by a counter will not trigger the Deep-Learning Filter, even if it is enabled on the channel.

5.3.1.1 Reset Counter

Resets the counter value to zero.

5.3.2 Event Actions

Note: Remember to configure the TCP/HTTP notification actions for the action rules feature to function. Any combination of the available options is possible.

5.3.3 Save

Click Save to save the current settings.

5.3.4 Cancel

Click Cancel to return to the counters screen without saving any changes.

5.4 How to Create a Counter

5.5 How to Modify a Counter

5.6 How to Delete a Counter

6 Calibration

Camera calibration is required for object identification and classification to occur. If the height, tilt and vertical field of view are known, these can be entered in the appropriate fields. If, however, these parameters are not known, a grid can be overlaid to aid the process, and mimics are provided to let you check your settings.

6.1 Calibration

Enable Calibration: Used for turning the calibration feature on or off.

Height: Defines the height of the camera.

Tilt: Defines the tilt of the camera.

VFOV: Defines the vertical field of view of the camera.

Note: A correct value for the camera vertical field of view is important for accurate calibration and classification.

Unit: Used for changing the unit values between metric and imperial.

6.1.1 Advanced Calibration Parameters

Pan and Roll allow the ground plane to be panned and rolled without affecting the camera calibration parameters. This can be useful to visualize the calibration setup if the scene has pan or roll with respect to the camera.

Note: The pan and roll advanced parameters only affect the orientation of the 3D ground plane so that it can be more conveniently aligned with the video scene; they do not affect the calibration parameters.

6.2 Apply

Saves any changes that have been made.

6.3 3D Graphics Overlay

During the calibration process, the features in the video image need to be matched with a 3D graphics overlay. The 3D graphics overlay consists of a green grid that represents the ground plane. Placed on the ground plane are a number of 3D mimics (person-shaped figures) that represent the dimensions of a person with the current calibration parameters.

The mimics are used for verifying the changes you make to the calibration settings and represent a person 1.8 metres tall. The mimics can be moved around the scene to line up with people or objects of a known size to aid in the calibration process.

6.4 How to Calibrate a Camera

Position the mimics on top or near people within the camera scene.

Note: During the calibration process, as you change settings, you may need to reposition the mimics.

Entering the correct vertical field of view is important for accurate calibration. The following table shows pre-calculated vertical field-of-view values for different sensors. If the table does not contain the relevant parameters, use the mimics to adjust the settings.

CCD Size (in)  CCD Height (mm)  VFOV (degrees) at focal length (mm) 1 2 3 4 5 6 7 8
1/6" 1.73 82 47 32 24 20 16 14 12
1/4" 2.40 100 62 44 33 27 23 19 17
1/3.6" 3.00 113 74 53 41 33 28 24 21
1/3.2" 3.42 119 81 59 46 38 32 27 24
1/3" 3.60 122 84 62 48 40 33 29 25
1/2.7" 3.96 126 89 67 53 43 37 32 28
1/2" 4.80 135 100 77 62 51 44 38 33
1/1.8" 5.32 139 106 83 67 56 48 42 37
2/3" 6.60 118 95 79 67 58 50 45
1" 9.60 135 116 100 88 77 69 62
4/3" 13.50 132 119 107 97 88 80
CCD Size (in)  CCD Height (mm)  VFOV (degrees) at focal length (mm) 9 10 15 20 30 40 50
1/6" 1.73 11 10 7
1/4" 2.40 15 14 9 7
1/3.6" 3.00 19 12 11 9 6
1/3.2" 3.42 21 16 13 10 7
1/3" 3.60 23 20 14 10 7 5
1/2.7" 3.96 25 22 15 11 8 6
1/2" 4.80 30 27 18 14 9 7 5
1/1.8" 5.32 33 30 20 15 10 8 6
2/3" 6.60 40 37 25 19 13 9 8
1" 9.60 56 51 35 27 18 14 11
4/3" 13.50 74 68 48 37 25 19 15
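If a sensor is not listed, the vertical field of view can be computed from the sensor height and focal length using the standard pinhole-camera relation, which reproduces the table values (a quick sketch; assumes the thin-lens approximation and a known sensor height):

```python
import math

def vfov_degrees(sensor_height_mm, focal_length_mm):
    """Vertical field of view from sensor height and focal length:
    VFOV = 2 * atan(h / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_height_mm / (2 * focal_length_mm)))

# 1/3" sensor (3.60 mm high) with a 3 mm lens
print(round(vfov_degrees(3.60, 3.0)))  # 62, matching the table row for 1/3"
```

The same formula gives, for example, 62 degrees for a 1/2" sensor (4.80 mm) at 4 mm, again matching the table.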

If the camera height is known, it can be entered. If the height is not known, it is recommended to obtain an accurate measurement; otherwise, estimate it based on known object heights in the scene.

Note: A correct camera height measurement is required for accurate video analytics.

If the tilt angle of the camera is known, it can be entered. If the tilt angle is not known, estimate it and use the mimics as a guide to confirm and adjust as required.

The objective is to ensure the mimics match people or other objects of known size in the scene: both the height and angle of each mimic should match the objects, and adjusting the height, tilt and vertical field of view may be required to achieve this.

Click Apply to save your changes.

7 Classification

When the calibration features have been defined, detected objects are assessed and assigned to one of the classifiers listed in the classification section. The section is preprogrammed with the most commonly used classifiers, but these can be added to or deleted as the scenario requires.

Use the add, modify and delete options to change these settings.

Note: The calibration process must be completed before objects can be classified.

7.1 How to Add a New Classifier

Classification- New Classifier

Name: Defines the name of the new classifier.

Min. Area: Defines the minimum area for the new classifier.

Max. Area: Defines the maximum area for the new classifier.

Min. Speed: Defines the minimum speed for the new classifier.

Max. Speed: Defines the maximum speed for the new classifier.

Note: When creating or modifying classifiers, avoid overlapping parameters with other classifiers as this will cause the analytics engine to incorrectly identify objects.
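The note above can be expressed as a check: two classifiers conflict when both their area ranges and their speed ranges intersect, because an object falling in the shared region would match either classifier. A minimal sketch of such a check (Python for illustration; names, units and example values are hypothetical, not the product's API):

```python
def ranges_overlap(lo1, hi1, lo2, hi2):
    """True when the closed ranges [lo1, hi1] and [lo2, hi2] intersect."""
    return max(lo1, lo2) <= min(hi1, hi2)

def classifiers_conflict(a, b):
    """Two classifiers conflict when both their area and speed ranges
    intersect, so an object in the shared region is ambiguous."""
    return (ranges_overlap(a["min_area"], a["max_area"], b["min_area"], b["max_area"])
            and ranges_overlap(a["min_speed"], a["max_speed"], b["min_speed"], b["max_speed"]))

# Hypothetical classifiers: their area ranges do not intersect, so no conflict.
person  = {"min_area": 0.1, "max_area": 2.0,  "min_speed": 0, "max_speed": 10}
vehicle = {"min_area": 2.5, "max_area": 20.0, "min_speed": 0, "max_speed": 50}
print(classifiers_conflict(person, vehicle))  # False
```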

8 Burnt-in Annotation

The Burnt-in Annotation feature allows analytical data to be burnt in to the raw video stream of the camera. Annotations can include tracked objects, counters and system messages.

Note:
- To display object parameters such as speed, height, area and classification, the device must be calibrated.
- The stream intended for use with burnt-in annotation must have a resolution lower than 1920x1080; this is a limitation of the camera hardware.
- In order for the show annotation function of Rules and Counters to work, the burnt-in annotation feature must be on and configured.

8.1 Burnt-in Annotation

8.2 BIA Setup

8.3 How to Enable Burnt-in Annotation

Note: The stream's resolution must be lower than 1920x1080.

Note: The calibration process must be completed for information such as speed, height, area and classification to appear.

9 TCP Notification

The TCP notification sends data to a remote TCP server when triggered. The format is configurable with a mixture of plain text and tokens. Tokens are used to represent the event metadata that will be included when a rule is triggered.
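As an illustration of the receiving side, the sketch below accepts one connection and prints the rendered notification text (Python for illustration; the port is a hypothetical example, and the real server and message format are whatever you configure in the TCP settings):

```python
import socket

def receive_notification(port: int) -> str:
    """Accept one TCP connection (the camera connects when a rule triggers)
    and return the rendered notification message (plain text plus tokens)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            return conn.recv(4096).decode(errors="replace")

# Usage (blocks until the camera sends a notification):
# print(receive_notification(5000))
```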

9.1 General Setting

Note: Changing this setting will turn on/off the rules' ability to send notifications through this action. Rules can still show a link to this notification, but the notification will not occur when the rule triggers.

9.2 TCP Settings

9.3 Message

Note: Tokens are replaced with event-specific data at the time an event is generated and include information about the event that triggered the notification.

Note: See the section titled Tokens for full details about the token system and example templates.

9.4 How to Enable TCP Notification

Note: The available tokens can be selected from the drop-down menu below the query window.

10 HTTP Notification

The HTTP notification sends a HTTP request to a remote endpoint when triggered. The URL, HTTP header and message body are all configurable with a mixture of plain text and tokens. Tokens are used to represent the event metadata that will be included when a rule is triggered.
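As an illustration of a remote endpoint, the sketch below records each POSTed notification body (Python standard library for illustration; the path and port are hypothetical examples, and the actual body is whatever template you configure in the HTTP settings):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NotificationHandler(BaseHTTPRequestHandler):
    """Stores each (path, body) pair received from the camera."""
    received = []

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode(errors="replace")
        self.received.append((self.path, body))   # rendered text plus tokens
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass   # silence the default per-request logging

# Usage (blocks; point the camera's HTTP notification at http://<host>:8000/notify):
# HTTPServer(("0.0.0.0", 8000), NotificationHandler).serve_forever()
```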

10.1 General Setting

Note: Changing this setting will turn on/off the rules' ability to send notifications through this action. Rules can still show a link to this notification, but the notification will not occur when the rule triggers.

10.2 HTTP Settings

10.3 Query/Body

Specifies the body of the HTTP request. This can be a mixture of plain text and any supported tokens, which will be replaced with event-specific data at the time an event is generated.

Note: See the section titled Tokens for full details about the token system and example templates.

10.4 How to Enable HTTP Notification

Note: The available tokens can be selected from the drop-down menu below the query window.

11 Tamper

The Tamper feature is intended to detect camera tampering events such as bagging, defocusing and moving the camera. This is achieved by detecting large persistent changes in the image.
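The idea of "large persistent changes" can be illustrated with a simple sketch: raise an alarm only when a large fraction of pixels differs from a reference frame for several consecutive frames. This is illustrative only, with hypothetical thresholds; it is not the camera's actual algorithm:

```python
def is_tampered(frames, reference, area_threshold=0.7,
                persistence=5, diff_threshold=40):
    """Illustrative only: flag tampering when more than `area_threshold`
    of the pixels differ from `reference` by over `diff_threshold` grey
    levels in `persistence` consecutive frames. Frames are flat lists of
    grey-level values."""
    consecutive = 0
    for frame in frames:
        changed = sum(abs(p - r) > diff_threshold for p, r in zip(frame, reference))
        if changed / len(reference) > area_threshold:
            consecutive += 1
            if consecutive >= persistence:
                return True   # large change persisted: bagging, moving, etc.
        else:
            consecutive = 0   # transient change (e.g. a passing object)
    return False

# A "bagged" camera produces frames that differ everywhere from the reference.
print(is_tampered([[0] * 100] * 5, [100] * 100))  # True
```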

11.1 Tamper Detection

Note: This option will reduce sensitivity to genuine alarms and should be used with caution. Remember to click Apply for changes to take effect.

11.2 Event Action

11.3 Apply

Click Apply to save the current settings.

12 Advanced

The advanced section contains settings relating to how the analytics engine tracks objects.

Note: In most installations the default configuration will be suitable.

Note: The supported settings differ depending on the tracker engine.

12.1 Object Tracker

Note: Changing the detection point used by the system can affect the point at which objects will trigger an event.

12.2 Apply

Click Apply to save the current settings.

12.3 Scene Change

12.4 Apply

Click Apply to save the current settings.

12.5 Clean up

Click Clean up to perform a clean up process.

13 Face Detection

The facial detection feature recognises the face of the person who triggers the configured rule, but only if that individual is registered in a face profile. It is split into two sections: live feed and search.

Note: Face detection works in conjunction with a best shot process in order to provide the best image of the face for profile matching. It is advised to use the Presence-Polygon rule with this feature: it produces data for the period the object is present in the zone and allows the best shot process to obtain the best image of the face.

Refer to the section in rules titled Presence-Polygon for further details on configuring the Presence-Polygon rule.

13.1 Live Feed Page

The live feed updates every 5 seconds to show the most recently discovered faces. The discovered faces are accompanied by information such as age, gender, name and group. Names and groups are only visible for persons who have been registered in a face profile. Clicking the monitoring button on the search page takes you to the live feed page.

13.1.1 Show Burnt-in Annotation for setup

Use this option to show analytic data on the camera view; select Start to show the data and Stop to hide the data.

Note: The burnt-in annotation feature needs to be enabled for this option to function. This does not affect the processing of analytics, but the annotation requires more resource from the camera and is not on by default.

13.2 Search Page

Recognised faces are stored in the camera. This information can be searched using several filters.

13.2.1 Filter

The filter allows users to search for faces detected and stored in the camera database using the search criteria below.

Search: Perform a search against the defined criteria.

13.3 Face Profiles

The face profile is used to identify and classify detected faces; to use it, you need to create groups and profiles.

13.3.1 Groups

13.3.2 Profile Settings

13.4 How to configure Camera to use Face Detection

13.5 How to Create a Group

13.6 How to Modify a Group

13.7 How to Delete a Group

Note: It is not possible to delete any profile that belongs to the selected group.

13.8 How to Create a Profile

Note: The face image is available in PNG or JPG format with a minimum resolution of 150x150 pixels.
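A profile image can be checked against the 150x150 minimum before upload. The sketch below reads the width and height straight from a PNG's IHDR header (Python for illustration; JPG parsing is omitted, and this check is not part of the product):

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def png_dimensions(data: bytes):
    """Return (width, height) from a PNG byte stream. The IHDR chunk is
    always first: 4-byte length, b"IHDR", then width and height as
    big-endian 32-bit integers at offsets 16 and 20."""
    if not data.startswith(PNG_SIGNATURE):
        raise ValueError("not a PNG file")
    width, height = struct.unpack(">II", data[16:24])
    return width, height

def meets_minimum(data: bytes, minimum: int = 150) -> bool:
    """True when both dimensions meet the profile image minimum."""
    w, h = png_dimensions(data)
    return w >= minimum and h >= minimum
```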

13.9 How to Modify a Profile

13.10 How to Delete a Profile

14 License

To take advantage of the VCAedge video analytic features, a license is required.

In many cases, the VCA analytic features on a camera are pre-activated in the factory and further activation is only necessary to enable additional functionality. There are three different methods of activation: token, pre-activation and activation code.

An activation code is linked to the camera's hardware configuration and is not transferable.

To manage activation codes, navigate to the license feature from the VCA menu.

14.1 Activate License

Note: Please refer to your license distributor for additional licenses.

14.2 Activated License(s)

Note: Multiple licenses can be applied to a camera but only one license can be active at any one time.

14.3 Assign

Click Assign to assign the selected license for use with analytics.

Note: The menu system will reload to reflect any changes the license has applied. This may result in features not being available.

14.4 Delete

Click Delete to delete the selected license from the camera.

Note: Deleting the active license from the camera will remove all analytic features. Any configured rules will become available again when a new valid license is applied.

14.5 How to apply a Token

14.6 How to apply Activation Codes

14.7 How to Delete License(s)

14.8 More Information

For more information on the complete range of additional features available, please visit VCA Technology.

15 Tokens

Tokens are used within action events such as TCP and HTTP notifications and are automatically filled in with the metadata for the event. This allows the details of the event to be specified in the message that the action sends, e.g. the location of the object, the type of event, etc.

15.1 List of tokens

Below is a list of the available tokens along with a description of the data they will provide.

15.1.1 {{name}}

The name of the event

15.1.2 {{id}}

The unique id of the event

15.1.3 {{type}}

The type of the event. This is usually the type of rule that triggered the event

15.1.4 {{status}}

The status of the event

15.1.5 {{iso8601}}

The event time in the ISO 8601 format. An example of this would be:

Start time (ISO 8601 format): {{iso8601}}
Start time (ISO 8601 format): 2017-04-21T10:09:42+00:00

15.1.6 {{time}}

The event time in Unix time stamp format. An example of this would be:

time: {{time}}
time: 1582308244.376
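The relationship between the {{time}} and {{iso8601}} tokens is a standard Unix-to-ISO-8601 conversion, which can be reproduced as follows (Python for illustration, converting in UTC):

```python
from datetime import datetime, timezone

def unix_to_iso8601(ts: float) -> str:
    """Render a Unix timestamp (the {{time}} token) in the ISO 8601
    form used by the {{iso8601}} token, in UTC."""
    return datetime.fromtimestamp(int(ts), tz=timezone.utc).isoformat()

print(unix_to_iso8601(1582308244.376))  # 2020-02-21T18:04:04+00:00
```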

15.1.7 {{ip}}

The IP address of the device

15.1.8 {{host}}

The hostname of the device that generated the event

15.1.9 {{datetime}}

The event time in the format Day Mon D HH:MM:SS YYYY, e.g. 'Tue Jan 1 12:00:00 2019'. An example of this would be:

time: {{datetime}}
time: Tue Jan 1 12:00:00 2019

15.1.10 {{zoneid}}

The zone ID of the event

15.1.11 {{ruleid}}

The rule ID of the event

15.1.12 {{bb}}

The bounding box of the object

15.1.13 {{objectclass}}

The object class of the object triggering the rule

15.1.14 {{mac}}

The MAC address of the device

An example for using tokens is given below:

Event #{{id}}: {{name}}
Event type: {{type}}
Start time (ISO 8601 format): {{iso8601}}
time: {{time}}
Device: {{host}}: {{ip}}: {{mac}}
Object bounding box: {{bb}}
Classification: {{objectclass}}

This would produce the following text:

Event #350: My event name
Event type: presence
Start time (ISO 8601 format): 2017-04-21T10:09:42+00:00
time: 1492769382
Device: Camera: 10.0.5.2: AB:CD:EF:GH:01:02
Object bounding box: [45251:12069:14004:8563]
Classification: Vehicle
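The substitution shown above can be sketched as a simple template fill over the event metadata (Python for illustration only; the camera performs this internally, and the field values here are taken from the example):

```python
import re

def render(template: str, event: dict) -> str:
    """Replace each {{token}} with the matching event metadata,
    leaving unrecognised tokens untouched."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(event.get(m.group(1), m.group(0))),
                  template)

event = {
    "id": 350, "name": "My event name", "type": "presence",
    "iso8601": "2017-04-21T10:09:42+00:00", "time": 1492769382,
    "host": "Camera", "ip": "10.0.5.2", "mac": "AB:CD:EF:GH:01:02",
    "bb": "[45251:12069:14004:8563]", "objectclass": "Vehicle",
}

print(render("Event #{{id}}: {{name}}", event))  # Event #350: My event name
```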