Scan performance options

Introduction

When scheduling a new scan, there are two options available: "Limit CPU priority" and "Limit data throughput rate".
This article explains what these options do.


Limit CPU priority

Low priority

This is the default CPU priority level for any given scan.

Enterprise Recon will instruct the operating system* to grant the scan CPU resources only when spare cycles are available.
As a result, when other applications begin using the CPU, the scan slows down because CPU cycles are taken away from the scanning process.

*(Unix) The nice level for the ER scan process is set to 15.

Normal priority

The scanning process will compete for CPU cycles on an equal basis with other applications.
Other applications will take precedence over the Agent scanning process only if they have a lower nice setting (on Unix*) or have been given a higher OS-level process priority on Windows.

At this level, disk read speed reaches its maximum capability when scanning the underlying file system.
A scan will continually open and close files as fast as it possibly can, without any delay, so disk speed normally becomes the primary limiting factor for scan performance.

*(Unix) The nice level for the ER scan process is set to 0.


Limit data throughput rate

This setting adjusts the maximum data throughput the application can use when searching each target.
In other words, this rate caps the amount of data passed through to the scanning engine per unit of time. The higher the rate, the more data can be scanned at a given time.

For a typical environment where scans do not noticeably affect workstation performance, we recommend leaving the data throughput rate unlimited.
Only if users notice a significant slowdown of their workstations while a scan is running would we suggest enabling the limit.

The default value in the ER Web console is 50 megabytes per second.
This value is chosen to keep reads from a modern non-RAID hard disk drive below its maximum read throughput, while still giving the application a reasonable scanning rate.
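A throughput cap like this can be pictured as a throttled read loop: data is read in chunks, and the reader sleeps whenever it gets ahead of its byte budget. The sketch below is purely illustrative (the function name and pacing scheme are assumptions, not ER internals):

```python
import io
import time

def throttled_read(stream, rate_limit_bytes, chunk_size=64 * 1024):
    """Yield chunks from stream, sleeping as needed so the average
    read rate stays at or below rate_limit_bytes per second."""
    start = time.monotonic()
    total = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)
        # Time by which `total` bytes *should* have taken at the cap.
        expected_elapsed = total / rate_limit_bytes
        actual_elapsed = time.monotonic() - start
        if expected_elapsed > actual_elapsed:
            # Ahead of budget: sleep until we are back on pace.
            time.sleep(expected_elapsed - actual_elapsed)
        yield chunk

# Example: reading 1 MB at a 2 MB/s cap takes roughly half a second,
# regardless of how fast the underlying "disk" (here, memory) is.
data = io.BytesIO(b"x" * (1024 * 1024))
total_read = sum(len(c) for c in throttled_read(data, 2 * 1024 * 1024))
```

With an unlimited rate the loop never sleeps and the disk itself becomes the bottleneck, which matches the recommendation above to leave the limit off unless workstations are visibly affected.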

Other factors can also affect scanning speed, including the size of the data, the type of data, and the computing resources available on each workstation running the scan.
Performance can further depend on whether scanning is performed by locally installed agents on each workstation or through proxy agents.


All information in this article is accurate and true as of the last edited date.
