Backup clever



#1 Wonko the Sane

Wonko the Sane

    The Finder

  • Advanced user
  • 16066 posts
  • Location:The Outside of the Asylum (gate is closed)
  •  
    Italy

Posted 19 June 2010 - 04:26 PM

BackupClever:
http://sourceforge.n...le=BackupClever

What is BackupClever?

BackupClever is a backup utility for Windows.

It combines the benefits of incremental backups (less space) with the freedom of full backups (all files accessible). After an initial full backup, only modified files are copied; all other files are represented by NTFS hard links to previous backups.

When a backup runs, all source files are compared with the files already present in previous backups. If there are changes, a new (time-stamped) backup directory is created: all modified source files are copied into it, and unmodified files are integrated as NTFS hard links to the existing files of previous backups. This yields a complete copy of all source files and folders in each backup set, while only reserving the space of one initial backup plus the differences between each subsequent backup on the backup media. Unused backup folders (including the first full backup) may be deleted safely at any moment without impacting the integrity of the other backups.


UNFORTUNATELY :unsure: .NET based.
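
For illustration only, here is a minimal sketch of the copy-or-link step described above, written in Java rather than .NET (all names below are made up for the example; this is not BackupClever's actual code):

import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;

// Hypothetical sketch of hard-link based backups; not BackupClever's code.
public class HardLinkBackup {

    // Copies changed files into a new time-stamped target directory and
    // hard-links unchanged ones to the previous backup, so the target
    // still looks like a complete full backup.
    static void backup(Path source, Path previous, Path target) throws IOException {
        Files.walkFileTree(source, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs)
                    throws IOException {
                Files.createDirectories(target.resolve(source.relativize(dir)));
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs)
                    throws IOException {
                Path rel = source.relativize(file);
                Path dst = target.resolve(rel);
                Path old = (previous == null) ? null : previous.resolve(rel);
                boolean unchanged = old != null && Files.exists(old)
                        && Files.size(old) == attrs.size()
                        && Files.getLastModifiedTime(old).equals(attrs.lastModifiedTime());
                if (unchanged) {
                    Files.createLink(dst, old);  // NTFS hard link: no extra space used
                } else {
                    // COPY_ATTRIBUTES keeps the timestamp, so the comparison
                    // above still works on the next run.
                    Files.copy(file, dst, StandardCopyOption.COPY_ATTRIBUTES);
                }
                return FileVisitResult.CONTINUE;
            }
        });
    }
}

On the next run the freshly written directory becomes "previous", so unchanged files keep collapsing onto the same on-disk data.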

:unsure:
Wonko

#2 Brito

Brito

    Platinum Member

  • .script developer
  • 10616 posts
  • Location:boot.wim
  • Interests:I'm just a quiet simple person with a very quiet simple life living one day at a time..
  •  
    European Union

Posted 19 June 2010 - 05:43 PM

eheh.. .NET and NTFS based.

Would be nice to see it in Java to also work on Linux and OSX.

I like the concept, but when a considerable number of files change per day, the system gets less and less optimized and you eventually need to do a full backup again.

When you do the full backup again, you need to accommodate the size used by these two backups. If you eliminate the first backup, then all backups made until the second full backup are rendered useless - unless, of course, the author has foreseen this case and the application converts the preceding incremental backups to be compared against the second full backup.

Good suggestion.

:unsure:

#3 sbaeder

sbaeder

    Gold Member

  • .script developer
  • 1338 posts
  • Location:usa - massachusettes
  •  
    United States

Posted 19 June 2010 - 06:49 PM

I like the concept, but when a considerable number of files change per day, the system gets less and less optimized and you eventually need to do a full backup again.

I think you missed the point of "HARD" links - this is a trick in the OS/file system where another name points back to the same bits on the disk - so there is no real "optimization" lost; you are just adding files to the disk for the new things. EVERY backup looks like (and points to) a full set of content on the disk.

You could do something like this on ext2/3 (not sure about ext4). On Linux, look at the "ln" command...all it does is make another directory entry pointing to the same blocks as the original name it is "linking" to...
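
The same trick is reachable from portable APIs as well; as a small hedged sketch (modern Java NIO, made-up file names), both names end up pointing at the same data, and deleting one name does not delete the bits:

import java.nio.file.*;

public class LinkDemo {
    public static void main(String[] args) throws Exception {
        Path original = Files.writeString(Path.of("original.txt"), "same bits on disk");
        // Equivalent to: ln original.txt link.txt
        Path link = Files.createLink(Path.of("link.txt"), original);
        // Both directory entries reference the same underlying file:
        System.out.println(Files.isSameFile(original, link)); // true
        System.out.println(Files.readString(link));           // "same bits on disk"
        // Deleting one name leaves the data reachable via the other:
        Files.delete(original);
        System.out.println(Files.readString(link));           // still readable
    }
}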

Yes, this is a pretty neat trick and it saves you backup time and also space, since you really only have to keep the incrementals for the days you want to keep. You just keep deleting the older days, but still have a "full" backup...

This is similar to the making of an ISO file, where you can tell it not to put duplicate copies of the same file onto the CD/DVD image...

Scott

#4 Brito

Brito

    Platinum Member

  • .script developer
  • 10616 posts
  • Location:boot.wim
  • Interests:I'm just a quiet simple person with a very quiet simple life living one day at a time..
  •  
    European Union

Posted 20 June 2010 - 08:11 AM

You could do something like this on ext2/3 (not sure about ext4). On Linux, look at the "ln" command

Yes, this is the same for HFS on OSX. Had he used Java instead of .NET, it would run across all these OSes.

I think you missed the point of "HARD" links - this is a trick in the OS/file system where another name points back to the same bits on the disk

The problem is that if a user deletes a file from the set of files being backed up, and we then delete the first full backup, the file won't magically reappear.

That's one of the reasons why I'm a bit suspicious about incremental backups.

:unsure:

#5 sbaeder

sbaeder

    Gold Member

  • .script developer
  • 1338 posts
  • Location:usa - massachusettes
  •  
    United States

Posted 20 June 2010 - 09:28 PM

The problem is that if a user deletes a file from the set of files being backed up, and we then delete the first full backup, the file won't magically reappear.

Yes, that is very true - this only keeps a snapshot of the current status, and IF you delete the older "versions", then you run the risk of not having a file to go back to.

I don't see this as an incremental vs. full issue. I see it more as an issue of how many backups you keep. In reality, each directory created by this system is a "full" backup, even if it only incrementally added new files to the disk that holds the backed-up files. Each directory looks like it has all the files, since it actually does have directory entries pointing to all the files.

BUT, if you delete an older directory to reclaim the space, then you lose any UNIQUE files in that backup. This is the same as with any other backup strategy: if the source file is deleted, and the backup that had that file is also deleted, then the file is gone.

This is just a sad fact of backup management :unsure:
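
To make that concrete, a small hedged sketch in Java (directory and file names are made up): a file that is hard-linked into a newer backup survives pruning the old directory, while a file unique to the pruned directory is gone for good.

import java.nio.file.*;

public class PruneDemo {
    public static void main(String[] args) throws Exception {
        Path day1 = Files.createDirectories(Path.of("backup-day1"));
        Path day2 = Files.createDirectories(Path.of("backup-day2"));

        // "report.txt" exists on day 1 and is unchanged on day 2 (hard-linked);
        // "temp.txt" exists only in the day-1 backup (unique to it).
        Files.writeString(day1.resolve("report.txt"), "kept");
        Files.createLink(day2.resolve("report.txt"), day1.resolve("report.txt"));
        Files.writeString(day1.resolve("temp.txt"), "unique");

        // Prune the day-1 backup to reclaim space.
        Files.delete(day1.resolve("report.txt"));
        Files.delete(day1.resolve("temp.txt"));
        Files.delete(day1);

        System.out.println(Files.readString(day2.resolve("report.txt"))); // "kept" - still linked
        System.out.println(Files.exists(day2.resolve("temp.txt")));       // false - gone for good
    }
}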



