
GZip + IMG compression problems


7 replies to this topic

#1 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 11 December 2011 - 04:50 PM

Hello.


I created a 3.8GB image (x.img), then downloaded gzip.exe and ran the command
gzip -1 d:x.img

The compressed image size is 514MB.


Next I ran this command while x.img was mounted on XP:
sdelete -c


and then updated x.img with img_xp_update.exe.


Now when I compress, the size is 3GB+ O_O


I defragmented, but the size still isn't going down!! What went wrong? I also tried 7-Zip, still 3GB! Why isn't the free space getting compressed?

#2 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 11 December 2011 - 05:41 PM

Is there a tool which can COMPACT files, resize the x.img partition (from Windows) down to the end of the used space, and then resize it again back out to the end of the available disk capacity?

Maybe such a tool could help gzip.exe compress only the data and not the empty space.

Edited by ignored, 11 December 2011 - 05:43 PM.


#3 Hima

Hima

    Member

  • Members
  • 52 posts
  • Location:cairo
  •  
    Egypt

Posted 12 December 2011 - 12:26 AM

Hi,

to get extra compression from gzip,

you can use this command:


gzip -9 file.img


9 is the maximum compression, and you can use the other levels too:
changing the number from 9 down to 1 gives less (but faster) compression,

and the normal (default) compression level is 6.
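
If you want to see the difference on your own data before committing to a level, here is a small sketch in plain Python (the x.img name comes from this thread; reading only the first 32 MiB is just to keep the test quick):

import gzip

# Compress a slice of the image at levels 1, 6 and 9 and compare sizes.
with open("x.img", "rb") as f:
    sample = f.read(32 * 1024 * 1024)

for level in (1, 6, 9):
    out = gzip.compress(sample, compresslevel=level)
    print(f"level {level}: {len(sample)} -> {len(out)} bytes")

Higher levels give a smaller .gz but take noticeably longer to run.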

regards

#4 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 17 December 2011 - 02:44 AM

I know -9 compresses best, but I'm looking for FAST compression, FAST decompression of data, and the !!FASTEST!! decompression of the empty space at boot time.



At first I got this result when I didn't touch the empty space, but when I ran "sdelete.exe -c" to clean the empty space inside the image, everything became slow.


Compression with -1, which runs at about 8MiB/s, became slow because the free space was compressing badly. So the .gz ends up large, resulting in the slowest boot.




I am going to create a brand new empty image, copy the files from the previous image into it, and try gzip -1 again. I think this will mostly work.



If I go for your "-9" option, the compression time for a 3.8GiB image will irritate me.
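
As a rough back-of-the-envelope with the ~8MiB/s figure above (plain Python, illustration only):

# 3.8 GiB image at the ~8 MiB/s measured for gzip -1
size_mib = 3.8 * 1024       # image size in MiB
rate_mib_s = 8              # measured gzip -1 throughput
print(size_mib / rate_mib_s / 60, "minutes")   # roughly 8 minutes at level 1

and -9 is typically several times slower than that, so the compression-time concern is a fair one.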

Edited by ignored, 17 December 2011 - 02:46 AM.


#5 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 17 December 2011 - 04:13 AM

Hi again :), I got 90% success with the new-image method. The compressed size of the 3.9GiB z.img.gz is 510MB again!


Inside the 3.9GiB image, only 1.09GiB is used.


The speed of mapping x.img.gz into memory is very slow on my SSD, approximately 30MiB/s, but once the data part has been mapped, the empty space goes at 300 or 400MiB/s, which is very good.




In the middle, however, it becomes slow for 6 to 7 seconds. To verify, I mounted x.img and analyzed it with UltraDefrag. There is free space in the middle; when I defragmented, the gzipped image went above 850MiB!!
UltraDefrag's strategy of moving files to the end of the disk and then moving the optimized files back to the beginning used up the free space! Once the free space has been written over, the gzip algorithm has a hard time!
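
That matches what a small sketch shows (plain Python, made-up sizes, not measured on this image): start with a zero-filled region, let a defrag pass write a few MiB of real data through the middle of it, and compare gzip -1 output sizes:

import gzip, os

size = 64 * 1024 * 1024
clean = bytes(size)                       # free space that was never written to
dirty = bytearray(size)
dirty[size // 2 : size // 2 + 8 * 2**20] = os.urandom(8 * 2**20)   # leftovers from moved files

for name, buf in (("clean free space", clean), ("after defrag writes", bytes(dirty))):
    print(f"{name}: {len(gzip.compress(buf, compresslevel=1))} bytes")

The untouched region compresses to almost nothing, while the partly overwritten one costs roughly the full size of whatever was written into it.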

Edited by ignored, 17 December 2011 - 04:26 AM.


#6 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 17 December 2011 - 05:24 AM

[PROBLEM SOLVED] USED CCLEANER --> TOOLS --> DRIVE WIPER

Now x.img.gz is 501MB with the "gzip -1 x.img" command line :)

http://www.filehippo...wnload_ccleaner
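
For reference, free-space wipers of this kind generally work by filling the volume's free space with a temporary file of zeros and then deleting it, so the previously "dirty" clusters end up zeroed and compress away. A minimal sketch in plain Python (X:\wipe.tmp is a made-up path on the mounted image volume, and real tools also flush to disk and handle the small leftover gaps):

import os

path = "X:\\wipe.tmp"           # made-up path on the mounted image volume
chunk = bytes(4 * 2**20)        # 4 MiB of zeros per write

try:
    with open(path, "wb") as f:
        while True:
            f.write(chunk)      # keep writing until the volume reports "disk full"
except OSError:
    pass                        # volume is full: the free clusters now hold zeros
finally:
    if os.path.exists(path):
        os.remove(path)         # give the space back; the zeros stay on disk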

#7 AceInfinity

AceInfinity

    Frequent Member

  • Team Reboot
  • 228 posts
  • Location:Canada
  • Interests:Windows Security, Programming, Customizing & Crash Dump Analysis.
  •  
    Canada

Posted 17 December 2011 - 06:47 AM

The problem here is that Windows doesn't accurately convey what "free space" really contains. What it reports as free space is simply space that can be reused for new, non-temporary data. For example, when you delete files, they aren't erased in their entirety: data from even those "permanently deleted" files is still left behind on the disk, which is why deleted files can be recovered at all. Programs like Recuva essentially use that leftover information to rebuild the files in their original form. As traces of a file get overwritten by newer incoming data, recovery becomes harder, because the information needed to reconstruct the file is only partially there and shrinks with every further write to the disk. That's just some background information for you.

Furthermore, the reason CCleaner solved this is that it has the options and functionality to clear out that leftover data (not all of it, though). So, for that reason, the free space you see in the drive's properties is never 100% "empty": the system reports it as free because it can be overwritten with new data at any time, not because nothing is currently stored there.
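
In gzip terms, a minimal sketch of what that leftover data costs (plain Python, synthetic buffers, not measured on the actual image): compare gzip level 1 on a zero-filled buffer, like truly clean free space, and on a random-filled buffer, like free space still holding arbitrary old data:

import gzip, os

size = 64 * 1024 * 1024                  # 64 MiB test buffers
zeros = bytes(size)                      # zeroed free space
junk = os.urandom(size)                  # free space full of old leftovers

for name, buf in (("zero-filled", zeros), ("random-filled", junk)):
    packed = gzip.compress(buf, compresslevel=1)
    print(f"{name}: {size} -> {len(packed)} bytes")

The zero-filled buffer shrinks to a few kilobytes, while the random-filled one barely shrinks at all, which is the same kind of jump seen earlier from 514MB to 3GB+.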

#8 ignored

ignored

    Newbie

  • Members
  • 26 posts
  •  
    US Virgin Islands

Posted 17 December 2011 - 02:20 PM

I'm unhappy with Sysinternals' "sdelete.exe -c". It was supposed to clean out the free space, but it creates a large file and fills the partition until the free space runs out. Some strategy? :P Also, UltraDefrag's latest RC doesn't have a free-space clearing option. I guess free-space clearing is not a popular feature, so I don't blame UltraDefrag.

Do any of you know the smallest tools:
- to clear free space
- to save a partition or disk to a .img from DOS
- to resize partitions with different formats (Linux, Windows)
- to defragment



