Posted by: hsei
« on: December 29, 2017, 20:44:20 »
I tried to compare about 3000 files against a huge collection of 250 K songs, split into two groups, having first created a cache.dat of 4.5 GB.
With global optimization enabled, another data.dat of 20 GB was created (plus index.dat and links.dat). After 12 hours, with progress at almost 100%, it had found no duplicates and finally crashed, causing the disk (an SSD) holding the data to vanish from the file system.
The disk was only recognized again after a cold boot.
Running the same comparison without global optimization finished successfully in 4 hours and found about 200 duplicates.