
Clamscan high CPU usage – 3 steps to make it right

Mar 29, 2020

Clamscan high CPU usage is affecting my server performance and user experience. Is there something that we can do?

That was a recent request in our Server Management Services HelpDesk.

Clamscan is a tool that scans files and folders on the server for viruses. However, at times, the scan process consumes all of the server resources and makes the entire server slow.

Today, we’ll see how our Dedicated Engineers solved Clamscan high CPU usage for a customer in 3 simple steps.

 

How Clamscan helps in server security

Servers connected to the internet are always prone to virus attacks. That’s where Clamscan helps.

Clamscan scans files and folders for viruses. It also supports several automated actions on infected files, such as moving them to a separate directory or removing them.

For instance, to scan the home directory of a user on the server, we use

clamscan -r --move=/home/USER/infected /home/USER

This command moves any infected files to the /home/USER/infected directory.
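To delete infected files directly instead of quarantining them, clamscan also accepts the --remove option. A minimal sketch with the same example user (use it with care, since deletion is irreversible):

clamscan -r --remove /home/USER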

To check the entire /home directory, the command will be

clamscan -r /home

However, in all cases, our Dedicated Engineers recommend verifying the infected files before removing them.
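For example, a report-only scan that just lists infected files and writes them to a log for review, without modifying anything, could look like this (the log path is only an example):

clamscan -r -i --log=/root/clamscan-report.log /home/USER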

 

What causes Clamscan high CPU usage

Let’s now look at why CPU usage goes high during a scan.

On any server, website folders can easily become a source of infected files. Users update these folders regularly, so infected files from a user’s computer may end up on the server.

Thus, as part of the security measures, it’s common practice to scan all user directories on the server. The clamscan process has to read each file and check it for infected content by comparing it against a virus database. Clamscan then takes further action, such as removing the file or moving it to a separate folder.

When the folders grow large, the scan process needs a lot of server resources to complete all these tasks. This eventually causes high CPU usage. And the longer the scan takes to complete, the longer the server stays slow.
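To confirm that it is indeed the clamscan process driving the load, standard process tools are enough; for instance:

# List the top CPU consumers; a running clamscan usually shows near the top
ps -eo pid,pcpu,pmem,etime,cmd --sort=-pcpu | head -n 10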

 

How we fix Clamscan high CPU usage

So far, we’ve seen the importance of Clamscan and why it can cause high resource usage on the server.

Let’s take a look at how our Dedicated Engineers solved the Clamscan high CPU usage for the customer with these steps.

 

1. Initiate scan at an off-peak time

One of the best practices to reduce the impact of the scan is to run it during off-peak hours.

During peak hours, the websites receive a high number of visitors. When the scan runs at the same time, it can make all the websites slow. Therefore, it is ideal to run clamscan when server resource usage is low.

We checked the time at which clamscan was running on the server. It was scheduled to run every Monday at 9 am server time, when there were too many website requests on the server. This caused a spike in CPU usage.

To solve the problem, we rescheduled clamscan to run once a week, at the weekend during off-peak hours.

The frequency and timing of the clamscan depend a lot on the server’s content and type. When the content remains mostly static, our Support Engineers recommend scanning only once a month or so.
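For example, a weekly off-peak schedule can be set with a cron entry like the one below. This is only a sketch: the day, time, and paths are assumptions to be adjusted for the actual server.

# /etc/cron.d/clamscan-weekly - run the scan every Sunday at 3 am server time
0 3 * * 0 root clamscan -r --move=/home/USER/infected /home/USER > /var/log/clamscan-weekly.log 2>&1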

 

2. Restrict clamscan process CPU usage

Moving on, we restricted the maximum CPU that the clamscan process could use.

For this, we installed the cpulimit package. This package allows limiting the CPU usage of any server process.

We edited the clamscan command in the scan script as:

cpulimit -e clamscan -l 30

This limited the clamscan CPU usage to 30%.

On some servers, we also use the nice command to lower the priority of the clamscan process. However, nice only lowers the scheduling priority; it does not cap CPU usage.
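A minimal sketch of that approach, lowering both the CPU scheduling priority and the disk I/O priority of the scan (ionice -c3 sets the idle I/O class; whether it takes effect depends on the I/O scheduler in use):

nice -n 19 ionice -c3 clamscan -r --move=/home/USER/infected /home/USER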

From our server management experience, limiting the CPU usage of the clamscan process may not be effective in all cases. When its CPU usage is restricted, clamscan has to wait for other processes, which in effect increases the time the scan runs on the server.

 

3. Add adequate server resources

Furthermore, when the server has a lot of content to scan, clamscan inevitably becomes resource-intensive. The scan then needs adequate server memory to complete in a reasonable time.

For instance, the customer’s server had 847 GB of data in the /home directory, and the customer wanted to scan the entire directory. Details from the server appeared as:

[Screenshot: disk usage details from the server]
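The amount of data to be scanned can be checked beforehand with standard tools, for instance:

# Total size of /home and its largest subdirectories
du -sh /home
du -sh /home/* | sort -rh | head -n 10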

However, the server didn’t have the resources to process that much data effectively. Therefore, we suggested the customer add additional memory to the server.

Thus, with the above steps, we brought the CPU usage on the server back to normal.

 

[Struggling with clamscan high CPU usage? We can help you right away.]

 

Conclusion

In short, Clamscan high CPU usage is a common problem on servers. Today, we saw how our Dedicated Engineers fix it by scheduling the scan at off-peak times, restricting the CPU usage of the scan process, and adding enough server memory.
