turbocharging .wims



#1 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 19 May 2010 - 10:25 PM

After running some tests with compressed NTFS images, WIM images, and NaughtyPE simply compressed with zip, rar and 7zip, I noticed how poor the compression of WIM images is compared to the other archive formats.

original NaughtyPE folder - 100%
wim - 54%
zip - 51%
rar - 46%
7zip - 39%

From what I've read, WIM uses the LZX compression algorithm. Would it be possible to switch it out for something that delivers more bang for the buck, like the algorithm used in 7zip?


:lol:

#2 paraglider


    Gold Member

  • .script developer
  • 1743 posts
  • Location:NC,USA
  • United States

Posted 19 May 2010 - 11:10 PM

Only if you rewrite the OS boot loader and the parts of the OS that understand the format of the WIM file.

#3 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 19 May 2010 - 11:26 PM

Actually, I thought only the WIM driver would need to get 'fixed' and forgot about the bootloader.
But since we now have a Win2k3 bootloader which, unlike the original, supports WIM booting, I don't think the patching is the problem.

The question is: is it even possible to use such a highly compressing algorithm in a WIM, or is there a reason WIMs don't perform better?

:lol:

#4 Galapo


    Platinum Member

  • .script developer
  • 3841 posts
  • Australia

Posted 19 May 2010 - 11:41 PM

Did you use maximum compression or fast compression? For me the WIM format compresses better than ZIP, more comparable with RAR.

Regards,
Galapo.

#5 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 20 May 2010 - 10:18 AM

Didn't know I could change the compression for WIM. Where do I do that in the script?

:cheers:

#6 Galapo


    Platinum Member

  • .script developer
  • 3841 posts
  • Australia

Posted 20 May 2010 - 10:39 AM

You haven't mentioned anything about a script, so I don't know what you mean. But there are details on WIM here: http://msdn.microsof...y/dd851934.aspx. The compression type is either 'NONE', 'XPRESS', or 'LZX'.
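
For illustration, this is roughly how the compression type gets picked when capturing an image with ImageX from the WAIK (just a sketch with made-up paths and image name, not something taken from whatever script you may be using):

    imagex /capture C:\LiveXP\Target X:\Images\boot.wim "LiveXP" /compress maximum

/compress maximum corresponds to LZX, /compress fast to XPRESS, and /compress none to no compression.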

Regards,
Galapo.

#7 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 20 May 2010 - 10:48 AM

You haven't mentioned anything about a script, so I don't know what you mean.

I used the wimboot script from LiveXP to create the WIM; the setting was maximum.
The setting for all the archivers was Best, but not solid.

:cheers:

#8 Galapo


    Platinum Member

  • .script developer
  • 3841 posts
  • Australia

Posted 20 May 2010 - 09:54 PM

I just did a test on the output target folder of a LiveXP build. The folder was 168MB and compressed as follows:

ZIP: 87MB
WIM: 84MB
RAR: 77MB
7z: 65MB

That's about what I would have expected. WIM compresses better than ZIP. WIM will also come down considerably if there are duplicate files.
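
To put made-up numbers on that last point: WIM stores each unique file's data only once, so if the same 10MB file ends up in three different folders of the build, roughly 20MB disappears from the total before the LZX compression even comes into play.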

Regards,
Galapo.

#9 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 20 May 2010 - 11:00 PM

Did a quick check, and the values for zip, rar and 7zip match mine, though WIM for some reason worked 4% better for you.
Still, 7zip compression would get it down by an additional 10%, giving it a compression ratio of about 2.5.
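(Rough check against Galapo's numbers above: 168MB / 65MB ≈ 2.6 for 7z, so roughly the 2.5 quoted, versus 168MB / 84MB = 2.0 for the current WIM.)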

That WIM saves a lot of space when files exist more than once unfortunately doesn't help with a bootable WIM like ours; in our case there are no duplicate files.


:cheers:

#10 Galapo


    Platinum Member

  • .script developer
  • 3841 posts
  • Australia

Posted 20 May 2010 - 11:17 PM

Under a PE there can indeed be duplicate files depending upon apps etc. I'll let you discover this for yourself.

Regarding using a new compression algorithm in WIM images, one important issue you'll need to confront is getting such an image mounted to the file system the way current WIM images are. Sure, compression with 7z will be smaller, but will there be a performance hit with read-write access vis-a-vis the current algorithm? With 7z I strongly suspect there would be, as it's designed for compression, not as a container for a booting PE.

Regards,
Galapo.

#11 Wonko the Sane


    The Finder

  • Advanced user
  • 16066 posts
  • Location:The Outside of the Asylum (gate is closed)
  • Italy

Posted 21 May 2010 - 07:05 AM

Surely a programmer would be able to use this:
http://www.pismotechnic.com/pfm/ap/
http://www.pismotechnic.com/download/
or a similar thingy.

As Galapo said, most compression algorithms have inefficient indexing even for reading, let alone for writing:
http://www.pismotechnic.com/cfs/

Why CFS instead of ZIP

The ZIP format is a staple format used in many applications. All modern operating systems have some form of integrated support for ZIP files. For many applications ZIP is certainly the right choice over CFS or ISO.

ZIP does have its limitations including:
Poor random access performance in large compressed files due to lack of compression indexing.
Poor compression performance in archives with many small files due to file based compression.
File based password protection and encryption.
Introduction of essentially proprietary compression algorithms, encryption, and other extensions.
ZIP is targeted by many firewall and e-mail filtering applications, making it increasingly difficult to use for file transfers.
Why CFS instead of TAR, RAR, 7Z, etc

Most archive formats do not consistently provide the compression indexing necessary to allow efficient random access. This makes archive formats inefficient or unusable in file system related applications.

The data format and compression algorithms used in the RAR format are not freely available for use in other applications.

The 7Z archive format is documented, but available implementations of the numerous necessary compression algorithms have restricted licensing.
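
To make the "compression indexing" point above concrete, here's a very rough sketch in Python (plain zlib, nothing to do with WIM's, CFS's or 7z's actual on-disk formats): the data is compressed in fixed-size chunks and an index of chunk offsets is kept, so a random read only has to decompress the chunk(s) it touches, whereas a solid stream would have to be decompressed from the start.

    import zlib

    CHUNK = 32 * 1024  # compress in 32 KiB chunks (size picked arbitrarily for the sketch)

    def compress_indexed(data):
        """Compress 'data' chunk by chunk, remembering where each compressed chunk lands."""
        blob = bytearray()
        index = []  # index[i] = (offset, length) of compressed chunk i inside blob
        for i in range(0, len(data), CHUNK):
            comp = zlib.compress(data[i:i + CHUNK], 9)
            index.append((len(blob), len(comp)))
            blob.extend(comp)
        return bytes(blob), index

    def read_at(blob, index, pos, size):
        """Random read: decompress only the chunk(s) covering [pos, pos + size)."""
        first, last = pos // CHUNK, (pos + size - 1) // CHUNK
        out = bytearray()
        for i in range(first, last + 1):
            off, length = index[i]
            out.extend(zlib.decompress(blob[off:off + length]))
        start = pos - first * CHUNK
        return bytes(out[start:start + size])

    # A solid archive would have to decompress everything up to 'pos';
    # with the index only one chunk gets touched here.
    data = bytes(range(256)) * 4096               # ~1 MiB of sample data
    blob, index = compress_indexed(data)
    assert read_at(blob, index, 500000, 16) == data[500000:500016]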


It's mostly a trade-off:
tight compression vs. accessibility/processor power

Try KGB to have an idea of what I mean:
http://kgbarchiver.net/

Some comparisons:
http://www.maximumco....com/index.html
Compare first programs in "tightness":
http://www.maximumco.../summary_mf.php
with those in "speed":
http://www.maximumco...summary_mf4.php

BTW, FREEARC is not too shabby:
http://freearc.org/

:cheers:
Wonko

#12 MedEvil


    Platinum Member

  • .script developer
  • 7771 posts

Posted 21 May 2010 - 10:17 AM

@Galapo
The way the WimBoot.script works, the WIM is protected, or made writable by means of FBWF.
Thus it wouldn't matter how long the compressing takes; it is only done during the build.
However, fast random read access would be required, and here it helps that the boot.wim is always loaded as a ramdisk, imo.
Solid archives - archives without random access - are a no-go for sure.

@Wonko
Lots of nice reading. Thank you.
Yep, KGB is a bitch; I got stuck with one of those archives once on an (at the time) outdated system. 14 hours of decompression! :cheers:

btw, I think I possibly found the reason for the poor performance of WIM compared to the other archivers: from what I've read, it seems that all the compressors which work better than WIM use more than one algorithm to compress files. I could not find any such description for WIM.
If this is right, improving it might not be possible.
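
For what it's worth, that matches how 7-Zip's 7z format works: it can chain a branch-converter filter (BCJ/BCJ2) in front of LZMA for executable-heavy content, while a WIM applies a single algorithm (XPRESS or LZX) to everything. A plain 7-Zip capture at maximum settings, just as a made-up comparison point and nothing the wimboot script actually does, would look something like:

    7z a -t7z -mx=9 NaughtyPE.7z C:\NaughtyPE\Target\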


:cheers:



