This project aimed to fundamentally increase the performance of current state-of-the-art forensic methods and to reduce the manual work required of a forensic analyst by
- developing new methods to make greater use of parallelized data processing in the specific environment of digital forensics,
- identifying the best method(s) for excluding the potentially vast number of files and file system artefacts that are not relevant to a given case, and
- streamlining and improving methods proposed in the literature that, for various reasons, have not yet been integrated into existing processing steps despite the additional insights they offer.
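The second point, excluding case-irrelevant files, is commonly done by hashing every file and discarding those whose hashes appear in a reference set of known files, and this combines naturally with the first point, parallel processing. The sketch below is illustrative only: the function names, the MD5 choice, and the placeholder hash set (standing in for a reference collection such as the NSRL) are assumptions, not the project's actual tooling.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Placeholder known-file hash set; in practice this would be loaded from a
# reference collection such as the NSRL (assumption, not the project's tooling).
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",  # md5(b"hello"), placeholder entry
}

def md5_of_file(path):
    """Return (path, hex MD5) for one file, read in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

def filter_unknown(paths):
    """Hash files in parallel and yield only those NOT in the known-hash set."""
    with ThreadPoolExecutor() as pool:
        for path, digest in pool.map(md5_of_file, paths):
            if digest not in KNOWN_HASHES:
                yield path
```

A thread pool works well here because hashing is I/O-bound and `hashlib` releases the GIL for large buffers; a process pool would be the analogous choice for CPU-bound analysis steps.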
New prototypes and methods supporting the fully automated analysis of large amounts of data were published and have attracted the interest of analysts and forensic scientists worldwide. The lack of parallel processing in particular is currently a severe limitation when analysing hard disks and server systems, because even the simplest analyses can take many hours. This still holds for some commercial solutions, which do not yet allow analysis steps to run in parallel. As storage capacities keep growing and hard disks of more than 10 terabytes can be purchased directly by end customers, these techniques will only become more relevant in the future.
A particular highlight is the archive of BitTorrent files collected in the course of the project: with more than 3 million entries, it is currently (as of the end of the project in 2017) the world's largest publicly accessible archive of its kind. The files are not only useful in the specific context of the project; they can also serve many other research areas, such as the analysis of distributed systems, P2P file-sharing networks, or Internet architectures in general.
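For context, a .torrent file is a bencoded dictionary (per the public BitTorrent specification, BEP 3), so extracting metadata from millions of such files is largely mechanical. Below is a minimal bencode decoder written from that specification as a sketch; it is not taken from the project's code.

```python
def bdecode(data, i=0):
    """Decode one bencoded value in `data` starting at offset `i`.

    Returns (value, next_offset). Handles the four bencode types:
    integers i<n>e, byte strings <len>:<bytes>, lists l...e, dicts d...e.
    """
    c = data[i:i + 1]
    if c == b"i":                          # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                          # list: l<item><item>...e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            value, i = bdecode(data, i)
            items.append(value)
        return items, i + 1
    if c == b"d":                          # dict: d<key><value>...e
        i += 1
        result = {}
        while data[i:i + 1] != b"e":
            key, i = bdecode(data, i)
            value, i = bdecode(data, i)
            result[key] = value
        return result, i + 1
    colon = data.index(b":", i)            # byte string: <length>:<bytes>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length
```

Given the raw bytes of a .torrent file, `bdecode(data)[0][b"info"][b"name"]` would yield the advertised name of the shared content, which is the kind of field a large-scale archive analysis would aggregate.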
This project was funded by the FFG.