[OSX-Users] Re: reconciling duplicated files/directories/backups
Dave Challis
dsc at ecs.soton.ac.uk
Wed Dec 14 15:35:18 GMT 2011
I've heard good things about http://minimalisticdev.com/singlemizer/ for
doing just that, though I've never tried it myself.
Otherwise there's a bunch of open-source command-line tools out there
which will probably compile on Macs (http://en.wikipedia.org/wiki/Fdupes
is the one I used a few years back).
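If fdupes won't build for you, the core idea (checksum every file, then group by hash) is a short pipeline anyway. A minimal sketch, assuming `md5sum` from GNU coreutils (on OS X, `md5 -r` prints the same "hash path" order); the mktemp files are just illustrative stand-ins for your real directories:

```shell
# Make a throwaway directory with two duplicates and one unique file,
# purely to demonstrate the pipeline.
dir=$(mktemp -d)
echo "same content" > "$dir/a.txt"
echo "same content" > "$dir/b.txt"
echo "different"    > "$dir/c.txt"

# Hash every file, then print any hash seen more than once,
# followed by the paths that share it.
dupes=$(find "$dir" -type f -exec md5sum {} + \
  | awk '{count[$1]++; files[$1] = files[$1] " " $2}
         END {for (h in count) if (count[h] > 1) print h files[h]}')
echo "$dupes"

rm -rf "$dir"
```

Pointing the `find` at several mount points at once covers the "across various hard disks" case, since grouping is by content hash rather than by directory.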
On 14/12/11 15:25, Marcus Cobden wrote:
> Ah right.
>
> This'd be too complex to do in a shell, but you could build a matrix of 'relevant' information about each file (basename, timestamps, md5, etc.) and then throw it through some kind of clustering algorithm.
>
> That might be overkill :P
>
> Also I can't see an easy way to incorporate directory structures into that, but that might not be needed.
>
> Marcus
>
> On 14 Dec 2011, at 13:12, Les A Carr wrote:
>
>> That's fine if you have two candidate duplicate directories, but I want to identify all the files across various hard disks that may be the same as any others!
>> --
>> les
>>
>>
>> On 14 Dec 2011, at 13:02, Marcus Cobden wrote:
>>
>>> There's always the Unix diff command
>>>
>>> diff -r dir1 dir2
>>>
>>> I prefer to add in the -u option, and also view a colorised version.
>>> Also I'm pretty sure you can get it to only tell you that files differ, and not how (that's the -q/--brief option).
>>>
>>> Finally, the timestamps on the files might be useful, so if you move/copy them around, take care to preserve them.
>>>
>>> Marcus
>>>
>>> On 14 Dec 2011, at 12:08, Les A Carr wrote:
>>>
>>>> I am writing some simple tools to help me reconcile multiple copies of files/directories from various attempts to back up or duplicate information across machines for safekeeping.
>>>>
>>>> Before I go too far down this road (shell scripts and md5 checksums!) does anyone know of any software that already does this?
>>>> --
>>>> Les
>>>
>>>
>>
>>
>
>
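For what it's worth, the per-file "matrix" Marcus suggests is easy to emit as tab-separated rows for whatever clustering you like. A minimal sketch, assuming GNU `stat` and `md5sum` (on OS X, `stat` takes -f format strings instead of -c, and `md5 -q` replaces `md5sum`); again the mktemp file is only a demo stand-in:

```shell
# One row per file: basename, mtime (epoch seconds), size, MD5.
dir=$(mktemp -d)
echo "hello" > "$dir/a.txt"

matrix=$(find "$dir" -type f | while read -r f; do
  hash=$(md5sum "$f" | awk '{print $1}')
  printf '%s\t%s\t%s\t%s\n' \
    "$(basename "$f")" "$(stat -c %Y "$f")" "$(stat -c %s "$f")" "$hash"
done)
echo "$matrix"

rm -rf "$dir"
```

Comparing size before hashing would also let you skip checksumming files that can't possibly be duplicates.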
--
Dave Challis
dsc at ecs.soton.ac.uk