Testing SOAR tools in use

Robert A. Bridges, Ashley E. Rice, Sean Oesch, Jeffrey A. Nichols, Cory Watson, Kevin Spakes, Savannah Norem, Mike Huettel, Brian Jewell, Brian Weber, Connor Gannon, Olivia Bizovi, Samuel C. Hollifield, Samantha Erwin

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

Investigations within Security Operation Centers (SOCs) are tedious as they rely on manual efforts to query diverse data sources, overlay related logs, correlate the data into information, and then document results in a ticketing system. Security Orchestration, Automation, and Response (SOAR) tools are a relatively new technology that promise, with appropriate configuration, to collect, filter, and display needed diverse information; automate many of the common tasks that unnecessarily require SOC analysts’ time; facilitate SOC collaboration; and, in doing so, improve both efficiency and consistency of SOCs. There has been no prior research to test SOAR tools in practice; hence, understanding and evaluation of their effect is nascent and needed. In this paper, we design and administer the first hands-on user study of SOAR tools, involving 24 participants and six commercial SOAR tools. Our contributions include the experimental design, itemizing six defining characteristics of SOAR tools, and a methodology for testing them. We describe configuration of a cyber range test environment, including network, user, and threat emulation; a full SOC tool suite; and creation of artifacts allowing multiple representative investigation scenarios to permit testing. We present the first research results on SOAR tools. Concisely, our findings are that: per-SOC SOAR configuration is extremely important; SOAR tools increase efficiency and reduce context switching, although with potentially decreased ticketing accuracy/completeness; users' preference is slightly negatively correlated with their performance with the tool; internet dependence varies widely among SOAR tools; and a balance of automation with assistance for decision making is preferred by senior participants. We deliver a public user- and tool-anonymized and -obfuscated version of the data.

Original language: English
Article number: 103201
Journal: Computers and Security
Volume: 129
DOIs
State: Published - Jun 2023

Funding

Samantha (Sam) Erwin received a B.S. in 2011 from Murray State University and an M.S. (2013) and Ph.D. (2017) from Virginia Tech, all in Mathematics. Sam completed a postdoctoral research fellowship at North Carolina State University and previously worked at Oak Ridge National Laboratory. Sam is now a Data Scientist at Pacific Northwest National Laboratory focusing on threat assessment and anomaly detection.

Data and code from this work are available at https://github.com/bridgesra/soar_experiment_data_code . This manuscript has been co-authored by UT-Battelle LLC under contract DE-AC05-00OR22725 with the US Department of Energy (DOE). The US government retains, and the publisher, by accepting the article for publication, acknowledges that the US government retains, a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this manuscript, or allow others to do so, for US government purposes. DOE will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan ( http://energy.gov/downloads/doe-public-access-plan ).

The authors thank: Mike Karlbom for ongoing support and leadership through the AI ATAC Challenge series; Jonathan Hodapp for the bureaucracy; Jessica Briesacker for the counseling (legal and otherwise); Jaimee Janiga and Laurie Varma for assistance with the infographic; Laurie Varma for editorial review; and Mingyan Li for technical review and for translation of previous works not written in English. The research is based upon work supported by the US Department of Defense (DOD), Naval Information Warfare Systems Command (NAVWAR), via the US Department of Energy (DOE) under contract DE-AC05-00OR22725. The views and conclusions contained herein are those of the authors and should not be interpreted as representing the official policies or endorsements, either expressed or implied, of the DOD, DOE, NAVWAR, or the US Government. The US Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon.

Keywords

  • Cybersecurity technology
  • Security operation center (SOC)
  • Security orchestration automation and response (SOAR)
  • Test and evaluation
  • User study
