SAIBERSOC: A Methodology and Tool for Experimenting with Security Operation Centers

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

In this article, we introduce SAIBERSOC (Synthetic Attack Injection to Benchmark and Evaluate the Performance of Security Operation Centers), a tool and methodology enabling security researchers and operators to evaluate the performance of deployed and operational Security Operation Centers (SOC) — or any other security monitoring infrastructure. The methodology relies on the MITRE ATT&CK Framework to define a procedure to generate and automatically inject synthetic attacks in an operational SOC to evaluate any output metric of interest (e.g., detection accuracy, time-to-investigation). To evaluate the effectiveness of the proposed methodology, we devise an experiment with n=124 students playing the role of SOC analysts. The experiment relies on a real SOC infrastructure and assigns students to either a BADSOC or a GOODSOC experimental condition. Our results show that the proposed methodology is effective in identifying variations in SOC performance caused by (minimal) changes in SOC configuration. We release the SAIBERSOC tool implementation as free and open source software.
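The core idea described in the abstract — injecting synthetic, ATT&CK-labeled attacks into a monitored environment and scoring the SOC's output against the known ground truth — can be illustrated with a minimal sketch. All names below (`evaluate_soc`, the event dictionaries, the example technique IDs) are hypothetical illustrations, not the actual SAIBERSOC API:

```python
from datetime import datetime, timedelta

# Hypothetical sketch (not the SAIBERSOC implementation): each injected
# synthetic attack is labeled with a MITRE ATT&CK technique ID, so any
# analyst report can be matched back to ground truth by that ID.

def evaluate_soc(injected, reported):
    """Score SOC output against injected ground truth.

    injected: dict of attack id -> injection timestamp (ground truth)
    reported: dict of attack id -> analyst report timestamp (detections)
    Returns (detection accuracy, mean time-to-investigation in seconds).
    """
    detected = set(injected) & set(reported)
    accuracy = len(detected) / len(injected)
    delays = [(reported[a] - injected[a]).total_seconds() for a in detected]
    mean_tti = sum(delays) / len(delays) if delays else None
    return accuracy, mean_tti

# Example: two injected attacks, one detected 12 minutes after injection.
t0 = datetime(2022, 6, 1, 9, 0)
injected = {"T1046-scan": t0, "T1190-exploit": t0 + timedelta(minutes=5)}
reported = {"T1046-scan": t0 + timedelta(minutes=12)}

acc, tti = evaluate_soc(injected, reported)
print(acc, tti)  # 0.5 720.0
```

Because the injected attacks are synthetic and fully known, any output metric of interest (here detection accuracy and time-to-investigation) can be computed without manual labeling of the SOC's alert stream.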
Original language: English
Article number: 14
Number of pages: 29
Journal: Digital Threats: Research and Practice
Volume: 3
Issue number: 2
Publication status: Published - Jun 2022

Keywords

  • SOC
  • Cyber security operations center
  • evaluation
  • performance

