Sentinel Release 4.17

 

Sentinel release 4.17 adds the ability to capture performance metrics for each monitor/test in the Sentinel database.

 


Sentinel Usage Metrics

A new TestPerformanceMetrics table, which captures performance metrics for each monitor/test, has been added to the Sentinel database.

Two new configuration settings have been added to the Sentinel Configuration file and need to be set (an illustrative example follows the list):

  • MonitorPerformanceMetricsEnabled: This needs to be set to True for the metrics to be collected.
  • MonitorPerformanceMetricsPurgeTimeoutHours: Specifies, in hours, how often all collected metrics are deleted from the table.
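
As an illustrative sketch only (the exact layout of the Sentinel Configuration file is not reproduced here, and 24 is just an example purge interval), the two settings might be given values along these lines:

    MonitorPerformanceMetricsEnabled = True
    MonitorPerformanceMetricsPurgeTimeoutHours = 24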

The table contains the following columns, which can be used to analyse performance issues (an example query follows the table):

Column                Description
TestId                The unique identifier of the test being analysed.
TestVersionId         The version of the test.
MonitorVersionId      The version of the monitor.
TestRunTime           Time, in milliseconds, taken to run the test.
DataFetchTime         Time, in milliseconds, taken to fetch the data.
EventWriteTime        Time, in milliseconds, taken for the event to be written to the database.
TotalRunTime          Total time, in milliseconds, taken for the monitor to run.
FetchSourceTime       Time, in milliseconds, taken for the data source to respond to the fetch request.
PreConditionRunTime   Time, in milliseconds, taken for the pre-condition to run.
EntityCount           The number of entities being processed.
TriggerTime           The timestamp at which the test was triggered.
ProcessingStartTime   The timestamp at which processing of the test started.
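
As a sketch of how these columns might be used to investigate slow monitors, the example below averages the timing columns per test and lists the slowest tests first. Only the table and column names come from this release; the use of Python with pyodbc, the connection details, and the TOP 10 cut-off are assumptions made purely for illustration.

    import pyodbc

    # Hypothetical connection string; replace the server and authentication
    # details with those of your own Sentinel database.
    CONNECTION_STRING = (
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=YOUR_SERVER;Database=Sentinel;Trusted_Connection=yes;"
    )

    # Average the timing columns (milliseconds) per test and return the
    # ten tests with the highest average total run time.
    QUERY = """
    SELECT TOP 10
        TestId,
        AVG(TotalRunTime)   AS AvgTotalRunTime,
        AVG(TestRunTime)    AS AvgTestRunTime,
        AVG(DataFetchTime)  AS AvgDataFetchTime,
        AVG(EventWriteTime) AS AvgEventWriteTime
    FROM TestPerformanceMetrics
    GROUP BY TestId
    ORDER BY AvgTotalRunTime DESC
    """

    connection = pyodbc.connect(CONNECTION_STRING)
    try:
        cursor = connection.cursor()
        for row in cursor.execute(QUERY):
            # Each row identifies a test and its averaged timings in milliseconds.
            print(row.TestId, row.AvgTotalRunTime, row.AvgTestRunTime,
                  row.AvgDataFetchTime, row.AvgEventWriteTime)
    finally:
        connection.close()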
