
Is it safe to defrag an external HDD?



#1 Guest_AnonVendetta_*

  • Guests

Posted 09 January 2020 - 05:14 AM

Not much explanation needed; the question is exactly what the thread title says. I have a handful of large 5TB to 10TB external HDDs, all of which are either USB 3.0 or USB 3.1. Most are at or nearing capacity; I use them for archiving files that I either don't need to access often or don't need to store on my internal drives. The main issue is that both reading from and writing to these drives has become somewhat slow, much slower than USB 3.x speeds. Obviously writing will become slower since there is very little available space. And each of these drives has native NTFS compression enabled (not really necessary, but it helps to squeeze out all the capacity I can get), which does slightly decrease speed (not significantly, in my experience, but it depends on the drive). But when they were empty the read/write speeds were much faster.
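(Side note: to put numbers on "somewhat slow", a quick sequential-read benchmark can be scripted in a few lines. A minimal sketch, not from the original post - the file path is a hypothetical placeholder for any large file on the external drive:)

```python
import time

# Hypothetical path to a large file on the external drive under test.
TEST_FILE = r"E:\archive\bigfile.bin"
CHUNK = 64 * 1024 * 1024  # read in 64 MiB chunks

total = 0
start = time.perf_counter()
with open(TEST_FILE, "rb") as f:
    while block := f.read(CHUNK):
        total += len(block)
elapsed = time.perf_counter() - start

# Windows caches reads aggressively, so use a file larger than RAM
# (or a freshly mounted drive) for a realistic figure.
print(f"read {total / 2**30:.1f} GiB at {total / 2**20 / elapsed:.0f} MiB/s")
```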

 

This has led me to believe that defragging to consolidate fragmented files might help. Of course, this won't increase available space. And if a crash or some other error occurs during the defrag, that could corrupt data that was otherwise OK. So... is it safe? Or should I just leave it alone?



#2 Wonko the Sane


    The Finder

  • Advanced user
  • 16066 posts
  • Location:The Outside of the Asylum (gate is closed)
  • Italy

Posted 09 January 2020 - 11:23 AM

As a general rule of thumb, NTFS volumes (at least non-compressed ones) work just fine, fragmented or not, until they have less than 15% or 10% of free space available; then performance starts to degrade noticeably.

 

But this is "old school" advice, and the percentages - besides being, as said, a rule of thumb - date from when disks were much smaller than they are today.

 

Having 15% of (say) a 160 GB volume means having 24 GB free.

Having 10% of (say) a 500 GB volume means having 50 GB free.

 

On large (TB) disks, the percentages to have some minimal "slack" space obviously decrease.

 

Another - possibly more "modern" - rule of thumb might be "have at least 25 GB free and no less than 5 times the size of the biggest file on the filesystem".
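(As a concrete illustration of that rule of thumb - a minimal sketch, not part of the original advice; the F: drive letter is a hypothetical placeholder:)

```python
import os
import shutil

# Hypothetical mount point of the external volume.
VOLUME = "F:\\"

MIN_FREE = 25 * 2**30  # at least 25 GiB free ...
FACTOR = 5             # ... and no less than 5x the biggest file

# Find the biggest file on the volume (slow on a full multi-TB drive).
biggest = 0
for root, _dirs, files in os.walk(VOLUME):
    for name in files:
        try:
            biggest = max(biggest, os.path.getsize(os.path.join(root, name)))
        except OSError:
            pass  # skip unreadable entries

free = shutil.disk_usage(VOLUME).free
needed = max(MIN_FREE, FACTOR * biggest)
print(f"free {free / 2**30:.1f} GiB, wanted {needed / 2**30:.1f} GiB: "
      f"{'OK' if free >= needed else 'too full'}")
```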

 

Defragmenting has its own (very small) risks, just like any other disk activity of any kind, but they are risks worth taking for the increased ability to recover data from files stored in contiguous extents, should that ever be needed (again, at least on non-compressed drives).

 

Before doing an actual defragmentation of the whole filesystem with the defrag tool, in your case I would check the volumes with a more "granular" tool like WinContig:

https://wincontig.md...it/en/index.htm

which allows for a more targeted defragmentation.

 

Please note that if there is not enough free space, the defragmentation may not be performed at all, or may "take forever". So if you have volumes filled to the brim, it is advisable to temporarily remove enough files (copying them to other media) to allow for some "reasonable" working space.

 

:duff:

Wonko



#3 Rootman


    Frequent Member

  • Advanced user
  • 382 posts
  • Location:USA

Posted 09 January 2020 - 12:32 PM

Defragging drives that large will be excruciatingly slow, especially with so little free space, and if the space left on the drive happens to be smaller than some of the files on the disk, it may not be possible at all.

 

I'd test one drive first, then see if it is practical and if it actually helps with your write speed issue.

 

I have a bunch of smaller USB 3.x HDDs and have defragged them once in a while; I had a number of large files that benefited from not being fragmented. A few ISO files MUST be contiguous so that they boot using Grub4DOS, and I need to defrag first to make sure there is enough contiguous free space to make the ISOs one single piece. It made no difference at all to write speed, but I did no testing, just observation.



#4 Guest_AnonVendetta_*

  • Guests

Posted 09 January 2020 - 09:35 PM

OK, thanks for the advice. I'll try a test defrag with WinContig, and later with PerfectDisk. I only want to consolidate fragmented files, no need to do the whole drive.

But the one thing I'm still not clear on, and that has not been answered despite the topic's very specific title, is:

Is it safe to defrag an *EXTERNAL* HDD? Sure, it's just a drive, but since it is connected via USB it would seem to be riskier than an internal drive, which generally does not come loose from its socket/port easily. The quality and length of the cable would probably play a factor; one would not want to risk defragging with a shit cable. I personally only use fiber-braided cables (for their durability), no shorter than 3 ft and no longer than 6 ft (depending on how far I need to extend them).


Thanks!

#5 Rootman


    Frequent Member

  • Advanced user
  • 382 posts
  • Location:USA

Posted 09 January 2020 - 10:30 PM

Well, as I stated, I've got a bunch of USB HDDs that I have defragged, all without any issues. So, safe? I suppose, but as you say, the fact that the cable is exposed and can be yanked out is a possible issue. However, almost all defraggers do a safe defrag, and even if the drive is disconnected mid-run you should be OK.

 

It just depends on what risk you are willing to accept. I'm typing on my laptop, which has a good-quality USB cable attached to a powered USB hub; I have 3 multi-TB drives attached to it, mounted and encrypted with VeraCrypt. For years now I have not lost any data, despite there being so many potential single points of failure. I've defragged them a few times over the years, as well as run chkdsk on them. I feel safe, and so far so good. So it just depends on your comfort with potential disaster.



#6 Guest_AnonVendetta_*

  • Guests

Posted 09 January 2020 - 11:04 PM

@Rootman: I use SHA512 to check the integrity of all files, and par (parity) files to repair them in case of corruption. Or, if a file is really important, at least 2 backups in different places. I too use VeraCrypt on most of my external drive partitions, never with any serious issues.
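(For reference, this kind of integrity check is a few lines of Python - a minimal sketch of my own, with a hypothetical path:)

```python
import hashlib

def sha512_of(path, chunk=8 * 2**20):
    """Stream a file through SHA-512 without loading it all into RAM."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Hypothetical usage: record the digest before defragging, run the same
# line again afterwards, and compare the two values.
print(sha512_of(r"E:\archive\game.iso"))
```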

Does WinContig do a "safe" defrag? PerfectDisk probably does, given that it is one of the best-known and most reputable commercial defrag programs.

By "safe" I assume you mean that the defragged portion is temporarily cached in a delayed write buffer, and flushed to disk shortly thereafter. So the only point of actual data loss that I can see, would be the actual writing portion.

#7 Wonko the Sane


    The Finder

  • Advanced user
  • 16066 posts
  • Location:The Outside of the Asylum (gate is closed)
  • Italy

Posted 10 January 2020 - 09:41 AM

If I recall correctly, WinContig uses the MS-provided services under the hood (i.e. the same "engine" Windows defrag uses); it only adds more "granularity", just like Sysinternals Contig:

https://docs.microso...ownloads/contig
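(If you want to script that kind of spot check, Contig's analyze-only mode is easy to drive - a minimal sketch, assuming contig.exe has been downloaded and placed on the PATH; the target path is a hypothetical example:)

```python
import subprocess

# Hypothetical target file; "contig -a" analyzes fragmentation only,
# it does not move any data.
target = r"E:\archive\game.iso"

result = subprocess.run(
    ["contig.exe", "-a", target],  # assumes Sysinternals Contig is on PATH
    capture_output=True, text=True, check=True,
)
# Contig prints a human-readable report including the fragment count.
print(result.stdout)
```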

 

Anyway, WinContig has been around in a mature version since at least 2009 (the first releases were in 2007 or 2008), and I have never seen a report of actual issues (corruption of files or filesystem).

 

BTW, AFAIK most third-party software does not have any "fancy", "different" defragging engine; most if not all use the MS one, "piloting" it slightly differently than the built-in defrag does.

 

Maybe a line must be drawn (talking of risk/reliability due to cable contacts) between 2.5-inch USB disks (powered via the same USB cable) and 3.5-inch disks (powered separately via a power adapter) :dubbio:.

 

For the latter (externally powered), a mis-connection of the USB cable should be more or less a "data transmission error" that could either be automatically fixed by the internal caches and read/write implementation or, in the worst case, cause partial data loss/corruption of the specific file "under work". For the former (bus-powered), it would definitely be a "power failure", which is more serious, as it could lead to physical damage to the disk (a head crash).

 

On the other hand - still in theory - the externally powered cases have two separate cable/connections that may go wrong ...

 

:duff:

Wonko



#8 steve6375


    Platinum Member

  • Developer
  • 7566 posts
  • Location:UK
  • Interests:computers, programming (masm,vb6,C,vbs), photography,TV,films
  • United Kingdom

Posted 11 January 2020 - 10:21 AM

NTFS clusters are either in-use, dirty (used by a file/folder that was later deleted), or unused. A deleted file is still present on the disk; it is just marked as 'deleted'.

 

WinContig will defragment files, but it does not touch/move/tidy any unused cluster areas or deleted files.

 

So if, after WinContig, you then write another large file, that file will probably be slow to write and will be fragmented, because the free space on the drive is scattered all over the place in 'small' blocks.

 

I suggest you use Defraggler to look at the cluster map.

 

The NTFS file system will not overwrite deleted files unless there are no unused clusters left on the volume. I.e. when you write a file, it will look for unused clusters (even if they are small and scattered all over the volume) and write to those.

 

After a while, the disk will not contain any unused clusters (only in-use or dirty clusters), so the NTFS file system will then overwrite the dirty (deleted) clusters when adding a new file. (I think it looks for a contiguous block of dirty clusters as a first choice - e.g. if you are writing a 10GB file and there is a 10GB+ contiguous block belonging to a deleted file, it will use that.)

 

Defraggler also has an Action - Advanced - Defrag free space option. If you want to be able to write files to the USB drive faster, then you will need to use this.

 

As for - is it safe? - the NTFS file system is pretty repairable if anything does go wrong.



#9 Guest_AnonVendetta_*

  • Guests

Posted 06 February 2020 - 12:42 AM

I've recently encountered a weird issue, and it only seems to have started after I defragged the drive a few times. It is a Seagate 5TB external HDD that I use for storage of game data only (installers, archives, ISOs, etc). The entire drive has NTFS compression enabled from the root/top-level directory downwards, and the single NTFS volume on the drive is encrypted with VeraCrypt (encryption isn't really necessary, but I used to have other more sensitive files on it; decrypting the volume in place would take days, so it is faster to move the files elsewhere and format it).

Anyway, after copying several hundred GB of files to it this morning, it was supposedly down to 43GB free space... or so Windows Explorer said. But after dismounting the drive and later remounting it, Explorer says it has 100GB. What gives? Maybe the compression is throwing off the estimate? I really wish Windows would make up its mind. How do I determine the actual amount of free space remaining?

On compressed volumes in the past, I have often had this issue: say 100GB is shown as remaining, so I try to copy, say, 80GB of files. Explorer will often get most of the way through the copy, then say there is insufficient space. It is my understanding that Windows compresses each file in place as it is copied to the volume, but I am not sure exactly how it calculates free space for compressed volumes.

I plan to continue copying more files to it until it is down to 25GB free, then do a final defrag-only/consolidate-free-space run. After that I will not write to it anymore, only copy from it when needed. Write speeds have become quite slow, but read speeds seem normal for the most part.
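(On "How do I determine the actual amount of free space": the filesystem's own free-cluster count can be queried directly, independent of Explorer's copy-time estimates - a minimal sketch, with a hypothetical drive letter:)

```python
import shutil

# Hypothetical drive letter of the mounted volume.
usage = shutil.disk_usage("E:\\")

# These numbers come from the volume's free-cluster count, so they already
# reflect NTFS compression; what they cannot predict is how well the
# *next* batch of files will compress.
print(f"total {usage.total / 2**30:.1f} GiB, free {usage.free / 2**30:.1f} GiB")
```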

 

So far I have not encountered any issues while defragging the drive, and running chkdsk on the volume shows that all is fine. The drive also passes numerous health tests with several reputable utilities, despite me dropping it a number of times in the 18-ish months I have owned it.



#10 steve6375


    Platinum Member

  • Developer
  • 7566 posts
  • Location:UK
  • Interests:computers, programming (masm,vb6,C,vbs), photography,TV,films
  • United Kingdom

Posted 06 February 2020 - 09:40 AM

The remaining free space on a compressed drive is always going to be a 'guess'.

How can the OS know whether you are going to copy over 30GB of pre-compressed .zip files (which are essentially incompressible) or 30GB of plain text files (which are highly compressible)? All the OS can do is take an educated guess!
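(Related: you can see how well any single file actually compressed by comparing its logical size with its allocated size via the Win32 GetCompressedFileSizeW call - a minimal sketch, with a hypothetical path:)

```python
import ctypes
import os
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.GetCompressedFileSizeW.argtypes = [wintypes.LPCWSTR,
                                            ctypes.POINTER(wintypes.DWORD)]
kernel32.GetCompressedFileSizeW.restype = wintypes.DWORD

def size_on_disk(path):
    """Allocated size; for NTFS-compressed files this is the compressed
    size, which can be well below the logical size."""
    high = wintypes.DWORD(0)
    low = kernel32.GetCompressedFileSizeW(path, ctypes.byref(high))
    if low == 0xFFFFFFFF and ctypes.get_last_error():
        raise ctypes.WinError(ctypes.get_last_error())
    return (high.value << 32) | low

# Hypothetical example file on the compressed volume.
path = r"E:\archive\readme.txt"
logical = os.path.getsize(path)
print(f"logical {logical:,} B, on disk {size_on_disk(path):,} B")
```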





