Methodology for Polymorphic Virus Detection Test (February 2008)

To test the antivirus products, the experts at Anti-Malware Test Lab selected malicious program samples meeting the following criteria:

  1. The sample makes use of polymorphism to hide its presence from antivirus software;
  2. The sample is recent (appearing/circulating in 2007);
  3. The sample is active on the Internet (an ‘in-the-wild’ sample).
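To make the first criterion concrete: a polymorphic virus re-encodes its body differently on every infection, so no fixed byte signature matches all copies. The following is a deliberately simplified, hypothetical sketch of the idea (a single-byte XOR transform; real engines also mutate the decoder stub itself), not the engine of any virus in this test:

```python
import os

PAYLOAD = b"example payload bytes"  # stand-in for the constant virus body


def polymorphic_encode(payload: bytes) -> bytes:
    """Encode the payload with a fresh random XOR key, so each
    'mutation' has a different byte pattern on disk."""
    key = os.urandom(1)[0] or 1  # avoid a zero key (identity transform)
    body = bytes(b ^ key for b in payload)
    return bytes([key]) + body   # prepend the key as a toy decoder stub


def decode(sample: bytes) -> bytes:
    """Recover the original payload from an encoded sample."""
    key, body = sample[0], sample[1:]
    return bytes(b ^ key for b in body)


# Every mutation decodes to identical behavior, yet the on-disk bytes
# differ between mutations, which defeats naive fixed-signature matching.
mutations = [polymorphic_encode(PAYLOAD) for _ in range(5)]
assert all(decode(m) == PAYLOAD for m in mutations)
```

Detecting such samples reliably requires the scanner to emulate or algorithmically unwrap the encoding, which is exactly what this test exercises.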

The selected malicious programs were divided into 11 families based on their functionality (the names of different mutations, as classified by some vendors, are given in brackets):

  1. Allaple.1 (Symantec: W32.Rahack.H, W32.Rahack.W; DrWeb: Trojan.Starman; Kaspersky Lab: Net-Worm.Win32.Allaple.a, Net-Worm.Win32.Allaple.d);
  2. Allaple.2 (Symantec: W32.Rahack.W; DrWeb: Trojan.Starman; Kaspersky Lab: Net-Worm.Win32.Allaple.b);
  3. Allaple.3 (Symantec: W32.Rahack.H; DrWeb: Trojan.Starman; Kaspersky Lab: Net-Worm.Win32.Allaple.d, Net-Worm.Win32.Allaple.e);
  4. Allaple.4 (Symantec: W32.Rahack.H, W32.Rahack.W; DrWeb: Trojan.Starman; Kaspersky Lab: Net-Worm.Win32.Allaple.e, Net-Worm.Win32.Allaple.d);
  5. Alman.1 (Symantec: W32.Almanahe.A!inf, W32.Almanahe.B!inf, W32.Fubalka.B …; DrWeb: Win32.Alman.2, Win32.Alman.3; Kaspersky Lab: Virus.Win32.Alman.a);
  6. Alman.2 (Symantec: W32.Almanahe.B!inf, W32.HLLW.Oror.B@mm, W32.Bustoy, W32.SillyDC …; DrWeb: Win32.Alman; Kaspersky Lab: Virus.Win32.Alman.b);
  7. Twido.1 (Avira: W32/Tvido; DrWeb: Win32.Dwee.2887, Win32.Dwee.3029; Kaspersky Lab: Virus.Win32.Tvido.a);
  8. Twido.2 (Avira: W32/Tvido.B; DrWeb: Win32.Dwee.2; Kaspersky Lab: Virus.Win32.Tvido.b);
  9. Virut.2 (Symantec: W32.Virut.B, W32.Virut!gen, W32.Ifbo.A, W32.Virut.H …; DrWeb: Win32.Virut.5; Kaspersky Lab: Virus.Win32.Virut.n, Backdoor.Win32.VanBot.bh, Backdoor.Win32.VanBot.ax …);
  10. Virut.3 (Symantec: W32.Virut.B, W32.Virut!gen, W32.Virut.R, Downloader …; DrWeb: Win32.Virut.5, Trojan.MulDrop.5684, Trojan.DownLoader.24029, Trojan.Fakealert.257 …; Kaspersky Lab: Virus.Win32.Virut.n, Virus.Win32.Virut.m, Virus.Win32.Virut.q, Trojan.Win32.Agent.bnj, Trojan.Win32.Agent.qt …);
  11. Virut.4 (Symantec: W32.Virut.U, Trojan Horse, W32.Virut.R, W32.Virut!gen, Downloader …; DrWeb: Win32.Virut.5; Kaspersky Lab: Virus.Win32.Virut.q, Trojan-Downloader.Win32.VB.awj, Trojan-PSW.Win32.OnLineGames.yn …).

The initial test sample was composed of malicious programs circulating on the Internet, as well as samples supplied by antivirus vendors that expressed a willingness to cooperate in the testing process.

Note! The initial selection, analysis and classification of the samples represent the main preparation stage of the test, which is also the most time-consuming.

Ensuring the precision of the detection algorithms requires as many mutations of each polymorphic virus as possible. The initial test samples were therefore multiplied into a broad test collection using the following methodology:

  1. A clean virtual machine running Microsoft Windows XP SP2 was cloned.
  2. The system was infected with one of the malware samples from the initial collection.
  3. File system changes (a list of infected files) were recorded using a special applet.
  4. The newly generated mutations were copied off the virtual machine.
  5. The system was rolled back to its initial state (step 1) and the infection stage was repeated.
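The "special applet" in step 3 is not described further, but the underlying technique is a before/after file-system diff: hash every file prior to infection, hash again afterwards, and report files that are new or changed. A minimal sketch of that idea (function names are our own, not the lab's tool):

```python
import hashlib
from pathlib import Path


def snapshot(root: Path) -> dict[str, str]:
    """Map every file under root to the SHA-256 digest of its contents."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }


def changed_files(before: dict[str, str], after: dict[str, str]) -> list[str]:
    """Files that appeared or whose contents changed between snapshots.
    On an infected system these are the freshly generated mutations."""
    return sorted(
        path for path, digest in after.items() if before.get(path) != digest
    )
```

Taking a snapshot before step 2 and another after it yields exactly the "list of infected files" the methodology refers to, ready to be copied off the virtual machine in step 4.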

As a result, a sample collection of 30,000 malicious programs was generated (from 500 to 8,000 in each family) that was used to test the ability of antivirus software to detect polymorphic viruses.

The test was performed on a specially prepared VMware GSX Server platform. A "clean" virtual machine running Microsoft Windows XP SP2 was cloned for each antivirus product.

The following antivirus programs were tested:

  1. Agnitum Outpost Security Suite Pro 2008 (VirusBuster)
  2. Avast Professional Edition 4.7
  3. AVG Anti-Virus Professional Edition 7.5
  4. Avira Antivir Personal Edition Classic 7.06
  5. BitDefender Anti-Virus 2008
  6. DrWeb 4.44
  7. Eset Nod32 Antivirus 3.0
  8. F-Secure Anti-Virus 2008
  9. Kaspersky Anti-Virus 7.0
  10. McAfee VirusScan 2008
  11. Microsoft Windows Live OneCare 2.0 Pre-Release
  12. Panda Antivirus 2008
  13. Sophos Anti-Virus 7.0
  14. Symantec Anti-Virus 2008
  15. Trend Micro Antivirus plus Antispyware 2008
  16. VBA32 Workstation 3.12.6

All of the actions recommended by the installation programs (e.g., system restart, updating) were performed. The default settings of each antivirus program were left unchanged after installation, with one exception: where necessary, the "scan all files" option was activated.

Testing stages:

  1. The virtual machine was switched on;
  2. An on-demand scan of the malware samples was run (with the product set to automatically delete/quarantine detected objects);
  3. Undetected malware samples were counted after the scan.

Each antivirus program was allocated a dedicated clean virtual machine (step 1). A copy of the sample collection of malicious programs was made for each antivirus product (step 2).
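Because each product was set to delete or quarantine everything it detected (step 2), the samples still present on disk after the scan are precisely the misses. The final count in step 3 therefore reduces to counting the surviving files, sketched here with hypothetical function names:

```python
from pathlib import Path


def count_undetected(sample_dir: Path) -> int:
    """Count files remaining after the scan. With detected objects
    auto-deleted/quarantined, every survivor is an undetected sample."""
    return sum(1 for p in sample_dir.rglob("*") if p.is_file())


def detection_rate(total_samples: int, undetected: int) -> float:
    """Detection rate as a percentage of the original sample collection."""
    return 100.0 * (total_samples - undetected) / total_samples
```

For example, if a product left 3,000 of the 30,000 samples on disk, its detection rate would be 100 * (30000 - 3000) / 30000 = 90%.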