Beyond the Hype: An Evaluation of Commercially Available Machine Learning-based Malware Detectors

Robert A. Bridges, Sean Oesch, Michael D. Iannacone, Kelly M.T. Huffer, Brian Jewell, Jeff A. Nichols, Brian Weber, Miki E. Verma, Daniel Scofield, Craig Miles, Thomas Plummer, Mark Daniell, Anne M. Tall, Justin M. Beaver, Jared M. Smith

Research output: Contribution to journal › Article › peer-review


Abstract

There is a lack of scientific testing of commercially available malware detectors, especially those that boast accurate classification of never-before-seen (i.e., zero-day) files using machine learning (ML). Consequently, the efficacy of malware detectors is opaque, inhibiting end users from making informed decisions and researchers from targeting gaps in current detectors. In this article, we present a scientific evaluation of four prominent commercial malware detection tools to assist an organization with two primary questions: To what extent do ML-based tools accurately classify previously seen and never-before-seen files? Is purchasing a network-level malware detector worth the cost? To investigate, we tested each tool against 3,536 total files (2,554 or 72% malicious and 982 or 28% benign) of a variety of file types, including hundreds of malicious zero-days, polyglots, and APT-style files, delivered on multiple protocols. We present statistical results on detection time and accuracy, consider complementary analysis (using multiple tools together), and provide two novel applications of the recent cost-benefit evaluation procedure of Iannacone and Bridges. Although the ML-based tools are more effective at detecting zero-day files and executables, the signature-based tool may still be the better option overall. Both network-based tools provide substantial (simulated) savings when paired with either host tool, yet both show poor detection rates on protocols other than HTTP or SMTP. Our results show that all four tools have near-perfect precision but alarmingly low recall, especially on file types other than executables and office files: Thirty-seven percent of malware, including all polyglot files, went undetected. Priorities for researchers and takeaways for end users are given. Code for future use of the cost model is provided.
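To ground these figures, the minimal sketch below recomputes precision and recall from the corpus counts stated in the abstract (2,554 malicious and 982 benign files, with roughly 37% of malware missed). The false-positive count is a hypothetical stand-in for "near-perfect precision" and is not a number reported in the article.

    # Illustrative precision/recall arithmetic using the abstract's corpus counts.
    # The false-positive count below is assumed, not taken from the article.
    malicious_total = 2554                  # malicious files in the test corpus
    benign_total = 982                      # benign files in the test corpus

    miss_rate = 0.37                        # ~37% of malware went undetected
    true_positives = round(malicious_total * (1 - miss_rate))   # ~1,609 detected
    false_negatives = malicious_total - true_positives          # ~945 missed
    false_positives = 5                     # hypothetical: "near-perfect precision"

    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)

    print(f"precision = {precision:.3f}")   # ~0.997 with the assumed FP count
    print(f"recall    = {recall:.3f}")      # ~0.630, despite the high precision

With these assumed counts, precision stays near 1.0 while recall sits near 0.63, illustrating why a detector can look trustworthy on the alarms it raises yet still miss over a third of the malware.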

Original language: English
Article number: 27
Journal: Digital Threats: Research and Practice
Volume: 4
Issue number: 2
State: Published - Aug 10 2023

Funding

The research is based upon work supported by the Department of Defense (DOD), Naval Information Warfare Systems Command (NAVWAR), via the Department of Energy (DOE) under contract DE-AC05-00OR22725. The views and conclusions contained herein are those of the authors and should not be interpreted as representing the official policies or endorsements, either expressed or implied, of the DOD, NAVWAR, or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon.

Funders:

• U.S. Department of Defense
• U.S. Department of Energy (contract DE-AC05-00OR22725)
• Naval Information Warfare Systems Command

Keywords

• Malware detection
• cost benefit analysis
• dynamic analysis
• endpoint detection
• evaluation
• intrusion detection
• machine learning
• network detection
• static analysis
• test
