A Security ECONomics service platform for smart security investments and cyber insurance pricing in the beyonD 2020 netwOrking era


SECONDO Researchers Working on Twitter Crawler

Researchers Aristidis Farao and Evangelos Kotsifakos are working on Task 3.3, the intelligent Big Data Collection and Processing Module (BDCPM), which uses specialized crawlers to acquire risk-related data both from internal organizational sources, e.g. network infrastructure, Security Information and Event Management (SIEM) systems, log files, and user interactions, and from external sources, e.g. social media (with a focus on Twitter) and other internet-based sources, including the Darknet.
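
To give a flavour of the kind of crawler this task involves, the sketch below collects keyword-matched tweets through the Twitter API v2 recent-search endpoint using the tweepy library. It is a minimal, illustrative example rather than the project's actual BDCPM crawler: the bearer token is a placeholder, and the query keywords and the fields kept in each record are assumptions.

    import tweepy

    BEARER_TOKEN = "<your-bearer-token>"   # placeholder credential
    client = tweepy.Client(bearer_token=BEARER_TOKEN)

    # Illustrative query for risk-related chatter; keywords are assumptions.
    QUERY = '("data breach" OR ransomware OR phishing) lang:en -is:retweet'

    response = client.search_recent_tweets(
        query=QUERY,
        tweet_fields=["created_at", "public_metrics"],
        max_results=100,
    )

    for tweet in response.data or []:
        # Flatten each tweet into a record that a downstream processing
        # pipeline (here, simply printed) could store and analyse.
        record = {
            "id": tweet.id,
            "created_at": tweet.created_at.isoformat(),
            "text": tweet.text,
            "retweets": tweet.public_metrics["retweet_count"],
        }
        print(record)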

The data collected and processed by the BDCPM will be specified and quantified within a metamodel. The task will also bundle a set of algorithms to perform sophisticated analysis on top of the aggregated data whenever required during the course of the project. These algorithms fall into three categories: regression analysis; predictive and prescriptive analysis; and data mining and machine learning.
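
As a simple illustration of the first category, the sketch below fits an ordinary least-squares regression with scikit-learn. The feature and target values are invented placeholders, not project data, and the indicator names are assumptions.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Invented example: relate two aggregated indicators (e.g. weekly counts
    # of attack-related tweets and logged security events) to a risk score.
    X = np.array([[12, 3], [45, 9], [30, 5], [80, 20], [55, 11]])
    y = np.array([0.20, 0.50, 0.35, 0.90, 0.60])

    model = LinearRegression().fit(X, y)
    print("coefficients:", model.coef_, "intercept:", model.intercept_)

    # Predicted risk score for a new, unseen observation
    print("prediction:", model.predict(np.array([[60, 12]])))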