Anti-Virus Comparative
On-demand Detection of
Malicious Software
includes false alarm and on-demand scanning speed test
Language: English
August 2011
Last Revision: 27th September 2011
www.av-comparatives.org
Table of Contents
Tested Products
Conditions for participation and test methodology
Tested product versions
Comments
Test results
Graph of missed samples
Summary results
False positive/alarm test
On-Demand Scanning speed test
Award levels reached in this test
Copyright and Disclaimer
Tested Products
avast! Free Antivirus 6.0
AVG Anti-Virus Free Edition 10.0
AVIRA AntiVir Personal 10.2
BitDefender Antivirus Plus 2012
eScan Anti-Virus 11.0
ESET NOD32 Antivirus 5.0
F-Secure Anti-Virus 2011
G DATA AntiVirus 2012
K7 TotalSecurity 11.1
Kaspersky Anti-Virus 2012
McAfee AntiVirus Plus 2011
Microsoft Security Essentials 2.1
Panda Cloud Antivirus 1.5
PC Tools Spyware Doctor with AV 8.0
Qihoo 360 Antivirus 2.0
Sophos Anti-Virus 9.7
Symantec Norton Anti-Virus 2012
Trend Micro Titanium AntiVirus+ 2012
Trustport Antivirus 2012
Webroot AntiVirus with Spy Sweeper 7.0
Conditions for participation and test methodology
The conditions for participation in our tests are listed in the methodology document at
http://www.av-comparatives.org/seiten/ergebnisse/methodology.pdf. Before proceeding with this
report, readers are advised to read the above-mentioned document first.
Participation is limited to a maximum of 20 well-known anti-virus products whose vendors
agreed to be tested and included in the public test series of 2011.
Tested Product Versions
The malware sets were frozen on the 1st August 2011. The system sets and the products were
updated and frozen on the 12th August 2011. The following 20 up-to-date products were included in
this public test:
avast! Free Antivirus 6.0.1203
AVG Anti-Virus Free Edition 10.0.1392
AVIRA AntiVir Personal 10.2.0.700
BitDefender Anti-Virus+ 15.0.27.319
eScan Anti-Virus 11.0.1139.998
ESET NOD32 Antivirus 5.0.90.0
F-Secure Anti-Virus 10.51.106
G DATA AntiVirus 22.0.2.32
K7 TotalSecurity 11.1.0050
Kaspersky Anti-Virus 12.0.0.374 (abc)
McAfee AntiVirus Plus 15.0.291
Microsoft Security Essentials 2.1.1116.0
Panda Cloud Antivirus Free 1.5.1
PC Tools Spyware Doctor with Antivirus 8.0.0.655
Qihoo 360 Antivirus 2.0.1.1332
Sophos Anti-Virus 9.7.4
Symantec Norton Anti-Virus 19.1.0.21
Trend Micro Titanium AntiVirus Plus 2012
Trustport Antivirus 10.0.0.4796
Webroot AntiVirus with Spy Sweeper 7.0.11.25
Please try the products on your own system before making a purchase decision based on these tests.
There are also other program features and important factors to consider, e.g. price, ease of
use/management, compatibility, graphical user interface, language, HIPS/behaviour-blocker
functions, URL filter/reputation services, support, etc. Some products may offer additional
features, e.g. to provide additional protection against malware during its execution (if it was not
detected beforehand on-access or on-demand).
Although extremely important, the detection rate of a product is only one aspect of a complete
anti-virus product. AV-Comparatives also provides a whole-product dynamic test, as well as other
test reports which cover different aspects/features of the products.
Avast, AVG, AVIRA and Panda chose to participate in the tests with the free versions of their products.
Information about additional third-party engines/signatures used inside the products:
eScan, F-Secure and Qihoo 360 are based on the BitDefender engine.
G DATA is based on the Avast and BitDefender engines.
PC Tools uses the signatures of Symantec.
Trustport is based on the AVG and BitDefender engines.
Webroot is based on the Sophos engine.
Comments
Nowadays, almost all products run with the highest protection settings by default (at least during
on-demand/scheduled scans); some, however, may automatically switch to the highest settings once
infections begin to be detected. Due to this, and in order to ensure comparable results, we tested
all products with the highest settings unless explicitly advised otherwise by the security vendors.
Vendors may advise this because they prefer the highest settings not to be used due to a high
number of false alarms, because the highest settings have a performance impact, or because they
plan to change/remove the setting in the near future. Below are some notes about the settings used
(scanning all files etc. is always enabled), i.e. where the settings are not set to the highest by
default:
Avast, AVIRA, Kaspersky, Symantec:
asked to be tested with heuristics set to high/advanced. For
this reason, we recommend that users also consider setting the heuristics to high/advanced.
F-Secure, Sophos:
asked to be tested and awarded based on their default settings (i.e. without using
their advanced heuristics/suspicious-detection settings).
AVG, AVIRA:
asked us not to count the informational warnings for packers as detections. Accordingly,
we did not count them as detections (neither on the malware set nor on the clean set).
AV-Comparatives prefers to test with default settings. As most products run with the highest
settings by default (or switch to the highest automatically when malware is found, making it
impossible to test against various malware with “default” settings), in order to get comparable
results we also set the few remaining products to the highest settings (or left them at lower
settings) in agreement with the respective vendors. We kindly ask vendors to provide stronger
settings by default, i.e. to set their default settings to the highest levels of detection; this
would make particular sense for scheduled scans or scans initiated by the user. We also kindly ask
vendors to remove paranoid settings from the user interface which are too high ever to be of any
benefit to normal users. As some vendors decided to take part in our tests using the stronger
settings, it appears that the better option would be to make the stronger settings the default, and
that is why we recommend that users consider using those settings too.
Several products make use of cloud technologies, which require an active internet connection. Our
test is performed using an active internet connection. Although we no longer show the baseline
detection rates without the cloud, and instead show only the results with the cloud active, users
should be aware that detection rates may in a few cases be lower if the scan is performed offline
(or when the cloud is unreachable without the user's knowledge). The cloud should be considered an
additional benefit/feature to increase detection rates (as well as response times and false-alarm
suppression), and not a full replacement for local offline detection. Vendors should make sure that
users are warned if connectivity to the cloud is lost, e.g. during a scan, as this may considerably
affect the protection provided and, for example, render the initiated scan useless. We have seen
that products which rely heavily on the cloud may perform better at detecting PE malware, while
scoring lower at detecting malware in non-PE formats, such as those found in the “other
malware/viruses” category.
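
By “PE malware”, the report means samples in the Windows Portable Executable format. As a minimal
illustration of that distinction (our own sketch, not part of the test methodology), the following
Python function classifies a file as PE or non-PE by checking the 'MZ' DOS header and the 'PE\0\0'
signature it points to; the function name and example usage are hypothetical.

import struct

def is_pe_file(path: str) -> bool:
    """Rough PE-format check: an 'MZ' DOS header whose 32-bit field at
    offset 0x3C (e_lfanew) points to the 'PE\\0\\0' signature."""
    try:
        with open(path, "rb") as f:
            dos_header = f.read(0x40)
            if len(dos_header) < 0x40 or dos_header[:2] != b"MZ":
                return False  # no DOS header, so not a PE file
            # e_lfanew: little-endian offset of the PE signature
            pe_offset = struct.unpack_from("<I", dos_header, 0x3C)[0]
            f.seek(pe_offset)
            return f.read(4) == b"PE\x00\x00"
    except OSError:
        return False

# Hypothetical usage: split a sample set into the two categories above.
# pe_samples    = [s for s in samples if is_pe_file(s)]
# other_samples = [s for s in samples if not is_pe_file(s)]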
Telemetry data was consulted in order to include prevalent malware samples which are/were hitting
users in the last six months. Due to the focus on prevalent/widespread and recent samples (the
majority is from the last three months), the size of the test set is much smaller than in previous
years.
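
A rough sketch of what such a prevalence/recency filter could look like is given below; the
telemetry record layout, the prevalence threshold and the sample hashes are purely illustrative
assumptions and do not reflect AV-Comparatives' actual tooling.

from datetime import date, timedelta

# Hypothetical telemetry records: (sample_hash, first_seen, reported_hits)
telemetry = [
    ("a1b2c3...", date(2011, 7, 20), 15430),
    ("d4e5f6...", date(2011, 2, 10), 87),
    ("0789ab...", date(2011, 6, 1), 9210),
]

FREEZE_DATE = date(2011, 8, 1)    # malware set frozen on 1st August 2011
WINDOW = timedelta(days=182)      # roughly the last six months
MIN_HITS = 1000                   # illustrative prevalence threshold

# Keep only samples that are both recent and widespread.
test_set = [
    h for (h, first_seen, hits) in telemetry
    if FREEZE_DATE - WINDOW <= first_seen <= FREEZE_DATE and hits >= MIN_HITS
]
print(len(test_set), "samples selected")  # -> 2 samples selected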