Releases · 0xCCF4/BackupDeduplicator
v0.3.0
What's Changed
- Bump clap from 4.5.3 to 4.5.4 by @dependabot in #2
- Bump serde_json from 1.0.114 to 1.0.115 by @dependabot in #1
- Feature v0.3 by @0xCCF4 in #4
New Contributors
- @dependabot made their first contribution in #2
Full Changelog: v0.2.0...v0.3.0
v0.2.0
- Full refactor of the source tree
Full Changelog: v0.1.0...v0.2.0
v0.1.0
Initial release. Basic duplicate discovery for plain files/folders.
Features
- Multi-threading: The tool can use multiple threads to speed up the hash calculation process.
- Pause and resume: The tool can be paused (killed) and resumed at any time. The current state is saved to disk and can be loaded later. This is useful for long analysis processes (large directories). See the first sketch after this list.
- Cache and resume: The tool can be run at a later point, reusing the cache from a previous run. This is useful for re-analyzing a directory after some changes have been made.
- Follow or not follow symlinks: The tool can be configured to follow symlinks or not.
- Hash collision robustness: The tool uses hashes to detect duplicates, so there is a probability of hash collisions. For the final duplicate detection, not only the hash but also the file size and file type are compared to reduce the probability of false positives. When choosing a weak hash function (producing many false duplicate candidates), the tool may run slower. See the second sketch after this list.
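The release notes do not show how the pause/resume state is stored; the following is a minimal sketch of the idea, assuming a JSON state file written with serde/serde_json (serde_json is a project dependency; serde's derive feature is assumed here). The `AnalysisState` type, its fields, and the `save_state`/`load_state` helpers are illustrative assumptions, not BackupDeduplicator's actual code.

```rust
// Minimal, illustrative sketch of pause/resume via a JSON state file.
// `AnalysisState` and its fields are assumptions for this example only;
// they are not BackupDeduplicator's actual types.
use serde::{Deserialize, Serialize};
use std::{fs, io, path::{Path, PathBuf}};

#[derive(Serialize, Deserialize, Default)]
struct AnalysisState {
    /// Directories that still have to be scanned.
    pending_dirs: Vec<PathBuf>,
    /// (path, hex hash) pairs already computed in this or a previous run.
    hashed_files: Vec<(PathBuf, String)>,
}

/// Write the current state to disk so the process can be killed safely.
fn save_state(state: &AnalysisState, path: &Path) -> io::Result<()> {
    let json = serde_json::to_string_pretty(state)?;
    fs::write(path, json)
}

/// Load a previously saved state, or start fresh if no state file exists.
fn load_state(path: &Path) -> io::Result<AnalysisState> {
    match fs::read_to_string(path) {
        Ok(json) => Ok(serde_json::from_str(&json)?),
        Err(e) if e.kind() == io::ErrorKind::NotFound => Ok(AnalysisState::default()),
        Err(e) => Err(e),
    }
}
```

In a design like this, periodically calling `save_state` (and on shutdown) is what makes killing the process safe; resuming is just `load_state` plus continuing with the pending work.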
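The collision-robustness point amounts to never trusting the hash alone: duplicate candidates are confirmed by hash, file size, and file type together. Below is a minimal Rust sketch of that grouping; `Entry`, `FileKind`, and `confirmed_duplicates` are assumed names for illustration, not the tool's actual data structures.

```rust
// Illustrative sketch of collision-robust duplicate grouping.
// `Entry` and `FileKind` are assumed types for this example only.
use std::{collections::HashMap, path::PathBuf};

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
enum FileKind {
    File,
    Directory,
    Symlink,
}

#[derive(Clone, Debug)]
struct Entry {
    path: PathBuf,
    hash: String, // hex digest of the content hash
    size: u64,
    kind: FileKind,
}

/// Group entries into duplicate sets. Keying on (hash, size, kind) rather
/// than the hash alone filters out collisions between different files.
fn confirmed_duplicates(entries: &[Entry]) -> Vec<Vec<Entry>> {
    let mut groups: HashMap<(String, u64, FileKind), Vec<Entry>> = HashMap::new();
    for e in entries {
        groups
            .entry((e.hash.clone(), e.size, e.kind.clone()))
            .or_default()
            .push(e.clone());
    }
    // Only buckets with more than one member are actual duplicates.
    groups.into_values().filter(|g| g.len() > 1).collect()
}
```

A weak hash simply makes the hash buckets larger before the size/type check splits them apart, which is why such a hash can slow the run down without producing false positives.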