Researchers Are Working on Crawler Development for the BDCPM Module
SECONDO researchers are working on the design and development of specialized cyber crawlers that will collect data either from internal organizational sources, e.g. the network infrastructure, Security Information and Event Management (SIEM) systems, log files, and users' interactions, or from external sources, e.g. social media and other internet-based sources, including the Darknet.
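As a rough illustration of the kind of component involved (not the project's actual implementation), a crawler for an external internet-based source typically fetches a page, extracts its links, and enqueues unseen ones. The sketch below shows only the link-extraction step using Python's standard library; the HTML snippet and URLs are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute hyperlinks found in an HTML page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's base URL.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return the absolute URLs of all links found in `html`."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# A full crawler would fetch each page (e.g. with urllib.request),
# extract its links, and enqueue unseen ones, while respecting
# robots.txt and rate limits.
page = '<a href="/report">Report</a> <a href="https://example.org/feed">Feed</a>'
print(extract_links(page, "https://example.org"))
# -> ['https://example.org/report', 'https://example.org/feed']
```

Internal sources such as SIEM systems or log files would instead be read through their own APIs or file formats, but the collection loop follows the same fetch-parse-store pattern.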
Researchers are working towards the completion of Task 3.3, "Big data collection and processing", and the development of the cyber crawlers. During a virtual meeting, the members exchanged information in order to address the needs of the project.
The collected and processed data will be specified and quantified within a metamodel, which will also bundle a set of algorithms used to perform sophisticated analysis on top of the aggregated data whenever required during the course of the project.
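One way to picture such a metamodel (purely a sketch under assumed structure, not the project's specification) is a schema that annotates each collected record with its source, alongside a registry of pluggable analysis algorithms that operate on the aggregated data; every name below is hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class DataRecord:
    """One collected data item, annotated for the metamodel."""
    source: str            # e.g. "SIEM", "log-file", "social-media", "darknet"
    origin: str            # "internal" or "external"
    collected_at: datetime
    payload: dict

@dataclass
class Metamodel:
    """Holds quantified records and bundles analysis algorithms."""
    records: list = field(default_factory=list)
    algorithms: dict = field(default_factory=dict)

    def register(self, name: str, fn: Callable):
        self.algorithms[name] = fn

    def analyse(self, name: str):
        # Run a bundled algorithm over the aggregated records.
        return self.algorithms[name](self.records)

mm = Metamodel()
mm.records.append(DataRecord("SIEM", "internal",
                             datetime.now(timezone.utc), {"alert": "port-scan"}))
# Example bundled algorithm: count records per source type.
mm.register("count_by_source",
            lambda recs: {r.source: sum(1 for x in recs if x.source == r.source)
                          for r in recs})
print(mm.analyse("count_by_source"))   # {'SIEM': 1}
```

Keeping the algorithms in a registry like this lets new analyses be added over the project's course without changing the data schema itself.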