Crawler: A Key Component of the Project Has Been Completed
At the bi-weekly general call, SECONDO researchers announced the completion of the crawler, a significant component of the project. The crawler scans the Dark Web and Twitter for data that could endanger an organization.
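The article does not publish the crawler's implementation, so the following is only a minimal sketch of the general pattern it describes: poll a set of sources and keep posts that match threat-related keywords. The source URL, JSON layout, and keyword list are all assumptions made for illustration, not SECONDO's actual design.

```python
"""Hedged sketch of a keyword-based threat-data crawler."""
import requests

# Hypothetical placeholders standing in for Dark Web forums / Twitter feeds.
SOURCES = [
    "https://example.com/forum/recent.json",  # not a real feed
]
KEYWORDS = {"exploit", "ransomware", "leaked credentials", "zero-day"}


def fetch_posts(url: str) -> list[dict]:
    """Download recent posts from one source; assumes a JSON list of
    objects with a 'text' field (an assumption for this sketch)."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json()


def matches_threat_keywords(text: str) -> bool:
    """Flag a post if it mentions any monitored keyword."""
    lowered = text.lower()
    return any(kw in lowered for kw in KEYWORDS)


def crawl() -> list[dict]:
    """Collect posts, from all sources, that may indicate a threat."""
    hits = []
    for url in SOURCES:
        for post in fetch_posts(url):
            if matches_threat_keywords(post.get("text", "")):
                hits.append({"source": url, "text": post["text"]})
    return hits


if __name__ == "__main__":
    for hit in crawl():
        print(hit["source"], "->", hit["text"][:80])
```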
The researchers will specify and quantify the collected and processed data within a metamodel. This task bundles a set of algorithms that can perform sophisticated analyses on the aggregated data whenever required during the course of the project.
These algorithms have been classified into three categories: regression analysis; predictive and prescriptive analysis; and data mining and machine learning. A toy example of the first category follows.
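To make the first category concrete, here is a small sketch of regression analysis applied to aggregated crawler output, with a basic predictive step on top. The weekly hit counts below are invented for the example; the article does not describe SECONDO's actual datasets or models.

```python
"""Toy regression over hypothetical aggregated crawler data."""
import numpy as np

# Hypothetical weekly counts of keyword hits collected by the crawler.
weeks = np.arange(1, 9)
hits = np.array([12, 15, 14, 20, 23, 25, 31, 34])

# Fit a simple linear trend: hits ~ slope * week + intercept.
slope, intercept = np.polyfit(weeks, hits, deg=1)

# A basic predictive step: extrapolate the trend one week ahead.
next_week = 9
forecast = slope * next_week + intercept
print(f"trend: {slope:.2f} hits/week, "
      f"forecast for week {next_week}: {forecast:.1f}")
```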