Methodology used in testing proactive antivirus protection II

The test was performed on a specially prepared VMware GSX Server platform. A “clean” virtual machine running Microsoft Windows XP SP3 was cloned for each antivirus program.

The following antivirus products participated in the test:

  1. Agnitum Outpost Antivirus Pro 2009
  2. Avast! Professional Edition 4.8
  3. AVG Anti-Virus 8.0
  4. Avira AntiVir Premium 8.2
  5. BitDefender Antivirus 2009
  6. Dr.Web 5.0
  7. Eset Nod32 Anti-Virus 3.0
  8. F-Secure Anti-Virus 2009
  9. Kaspersky Anti-Virus 2009
  10. Panda Antivirus 2009
  11. Sophos Anti-Virus 7.6
  12. Symantec Anti-Virus 2009
  13. Trend Micro Internet Security 2009
  14. VBA32 Antivirus 3.12

All the recommended actions (system reboot, updates, etc.) were carried out during the installation of the antivirus programs. All protection components were activated manually if this did not happen automatically after installation.

After the test platform was set up, special conditions were created to check the effectiveness of the heuristic analyzers: the update function was switched off in all the antivirus programs, i.e., the information in the antivirus databases was frozen on the date the test was initiated.

In-the-wild samples of malicious programs were collected from corporate gateways and private collections starting two weeks after the antivirus databases were frozen. The uniqueness of the samples was confirmed by comparing their MD5 hashes with those of the Anti-Malware.ru collection accumulated in the six months before the test started. Thus, with a high degree of probability, the malicious programs selected for the test were unknown to the antivirus products at the moment their databases were frozen.
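
For illustration, here is a minimal Python sketch of this kind of uniqueness check, assuming the reference collection’s MD5 hashes have been exported to a plain-text file (the file and directory names are hypothetical):

```python
import hashlib
from pathlib import Path

# Hypothetical locations: exported MD5 hashes of the existing
# reference collection, and a directory of newly collected samples.
KNOWN_HASHES_FILE = Path("known_md5.txt")
CANDIDATES_DIR = Path("candidates")

def md5_of(path: Path) -> str:
    """Return the MD5 hex digest of a file, read in chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hashes of samples already present in the reference collection.
known = set(KNOWN_HASHES_FILE.read_text().split())

# Keep only samples whose MD5 is not already known.
unique = [p for p in CANDIDATES_DIR.iterdir()
          if p.is_file() and md5_of(p) not in known]
print(f"{len(unique)} previously unseen samples")
```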

Important! The two-week gap between the freezing of the antivirus databases and the start of malicious code collection was intentional: it minimized the chance of a sample already known to any of the products ending up in the test collection.

These steps reduced the effectiveness of the classical signature-based protection component to zero. Consequently, any detection of an unknown sample during a straightforward on-demand scan would have to come from the proactive heuristic component, which was exactly what we wanted to measure.

The on-demand scan was performed using optimal settings: heuristic analyzer turned on (highest level), scanning of all files, and detection of all types of malicious and potentially harmful programs.

Unlike the previous test, and in response to numerous requests, we also checked the false positive rate. To do this, a collection of clean files was assembled at the same time as the collection of malicious programs: we downloaded installation packages from download.com, extracted them, and kept only the .exe and .dll files (unique by MD5 hash), removing everything else. The resulting collection contained 15,121 clean files.
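
A minimal sketch of how this filtering step could be scripted in Python, assuming the installation packages have already been extracted into a directory (the path is hypothetical):

```python
import hashlib
from pathlib import Path

EXTRACTED_DIR = Path("extracted_packages")  # hypothetical directory
ALLOWED_EXT = {".exe", ".dll"}

seen = set()          # MD5 digests already accepted
clean_files = []      # the resulting clean collection

for path in EXTRACTED_DIR.rglob("*"):
    # Everything except .exe and .dll files is discarded.
    if not path.is_file() or path.suffix.lower() not in ALLOWED_EXT:
        continue
    digest = hashlib.md5(path.read_bytes()).hexdigest()
    if digest in seen:
        continue      # duplicate by MD5, skip it
    seen.add(digest)
    clean_files.append(path)

print(f"Clean collection size: {len(clean_files)}")
```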

To determine the false positive rate, an on-demand scan of the clean file collection was performed with every tested antivirus product, using the same databases and settings that had been used earlier to scan the collection of malicious programs.

Important! Detections of ‘unwanted programs’ (spyware, adware, remote administration tools, and so on) in the clean file collection were not counted as false positives, since the degree of danger posed by such programs is, as a rule, debatable and is determined by each vendor independently: what one vendor considers dangerous, another may consider safe.

As an addendum to the main test, all the antivirus programs were updated after its completion and a repeat scan of the collection was performed (one week after the main test finished). This made it possible to ascertain the effectiveness of each antivirus program’s classical signature-based method in addition to its heuristics.
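
The arithmetic behind the resulting figures is straightforward. The snippet below shows it with illustrative counts; apart from the clean collection size (15,121), these numbers are invented for the example and are not the test’s results:

```python
# Illustrative counts only, not actual test results.
total_malware = 5000          # samples in the malware collection
undetected_frozen = 2000      # left after the frozen-database scan (step 3)
undetected_updated = 150      # left after the repeat scan with updates (step 7)
total_clean = 15121           # files in the clean collection
false_alarms = 30             # clean files wrongly flagged

# Proactive (heuristic) detection with frozen databases.
heuristic_rate = 100 * (total_malware - undetected_frozen) / total_malware
# Overall detection once updates are applied (heuristics + signatures).
overall_rate = 100 * (total_malware - undetected_updated) / total_malware
# Share of clean files wrongly flagged.
fp_rate = 100 * false_alarms / total_clean

print(f"Heuristic detection:     {heuristic_rate:.1f}%")  # 60.0%
print(f"Detection after updates: {overall_rate:.1f}%")    # 97.0%
print(f"False positive rate:     {fp_rate:.2f}%")         # 0.20%
```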

Steps taken to set up the test environment:

  1. Installation of antivirus program on a clean machine;
  2. System reboot;
  3. Check of all program modules to ensure successful installation and operability;
  4. Update of antivirus program;
  5. System reboot;
  6. Update function switched off, Internet disconnected (antivirus database frozen);
  7. Virtual machine state saved;
  8. Virtual machine switched off for 6 weeks;
  9. Malicious programs and clean files collected for the test (beginning two weeks after step 6, i.e., after the databases were frozen).

Steps taken during testing:

  1. Virtual machine switched on;
  2. On-demand scan of the malware collection (with automatic deletion of detected objects);
  3. Undetected samples counted after the scan;
  4. On-demand scan of the clean file collection (false positives recorded);
  5. Antivirus programs updated;
  6. Repeat scan of the samples still undetected at step 3;
  7. Remaining undetected samples counted after the repeat scan.

Each antivirus program was allocated a dedicated clean virtual machine (setup step 1). A copy of the collection of malicious programs was made for each antivirus product (setup step 9).
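
Because detected objects were deleted automatically (testing step 2), counting the undetected samples (testing steps 3 and 7) reduces to counting the files left in the product’s copy of the collection. A minimal Python sketch, with a hypothetical directory name:

```python
from pathlib import Path

# Hypothetical path to one product's copy of the malware collection.
COLLECTION_COPY = Path("malware_collection_copy")

def count_remaining(collection: Path) -> int:
    """Count the files left in the collection directory.

    Since the on-demand scan deletes every detected object, any file
    still present after the scan was missed by the antivirus product.
    """
    return sum(1 for p in collection.rglob("*") if p.is_file())

print(f"Undetected samples: {count_remaining(COLLECTION_COPY)}")
```

Relying on deletion rather than on each product’s scan report keeps the count uniform across all fourteen products, since report formats differ from vendor to vendor.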