Topic Summary

Posted by: hsei
« on: December 29, 2017, 20:44:20 »

I tried to compare about 3,000 files against a huge collection of 250K songs in two groups, having first created a cache.dat of 4.5 GB.
With global optimization, the program created another data.dat of 20 GB (plus index.dat and links.dat), found no duplicates in 12 hours at almost 100% completion, and finally crashed, removing the disk (an SSD) holding the data from the file system.
The disk was only recognized again after a cold boot.

The same comparison without global optimization finished successfully in 4 hours, finding about 200 duplicates.