Compression Commands implemented.

Topics: Developer Forum
Jan 9, 2007 at 3:37 AM
OK guys, so the compression write-* cmdlets have entered beta/RC stage. I'd appreciate it if you could play around with them and let me know what breaks; I've spent the last two days straight on them and I'm feeling a bit wacky.

So, current issues:

  • write-tar requires -OutputPath even though it is not marked as mandatory.
  • write-bzip2 seems to stall at about 80% for each file it writes, for about a second, then continues fine. Buffering issue?
  • Ctrl+C needs to be trapped and pending output stream handles should be closed -- do we have base class support for dealing with Stopping / StopProcessing?

Apart from that, 'tis nice to do:

dir *.log | write-tar -outputpath my.tar | write-gzip


dir *.log | write-bzip2

etc. etc.

I've checked everything in, so feel free to fix bugs if you see some (I'm sure there are).

- Oisin
Jan 9, 2007 at 3:39 AM

gi dirname | write-tar -outputpath my.tar

This works great, but for some reason my WinRAR stalls for about 10 seconds before opening it, regardless of size. The tar appears to be perfectly valid, though.
Jan 9, 2007 at 10:51 AM
WRT Ctrl+C: you shouldn't release resources in StopProcessing; IDisposable.Dispose is the place to do this. I've implemented IDisposable in PscxCmdlet and WriterBase, and changed CloseCompressedOutputStream to only Finish() the stream where required.
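For reference, the shape of that pattern looks something like the sketch below. PscxCmdlet and WriterBase are the real types being discussed, but the field name and the rest of the body here are assumptions, not the checked-in code:

```csharp
// Hedged sketch: PowerShell calls Dispose() on cmdlets that implement
// IDisposable when the pipeline completes or is stopped, so cleanup
// belongs there rather than in StopProcessing.
using System;
using System.IO;
using System.Management.Automation;

public abstract class WriterBase : PSCmdlet, IDisposable
{
    private Stream _outputStream; // assumed field name

    protected override void StopProcessing()
    {
        // Intentionally no resource release here; the engine will
        // invoke Dispose() after stopping the pipeline.
    }

    public void Dispose()
    {
        // Safe to call more than once; closes any pending output handle.
        if (_outputStream != null)
        {
            _outputStream.Close();
            _outputStream = null;
        }
        GC.SuppressFinalize(this);
    }
}
```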

WRT Write-Tar's OutputPath: I've hidden the base WriterBase.OutputPath property with a new mandatory one. Maybe it could default to the parent directory name instead of being mandatory.
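Shadowing a base parameter with a mandatory one is a standard C# trick; a minimal sketch of the idea (the class name and the assumption that the base property is a plain string are mine):

```csharp
// Hedged sketch: hide the non-mandatory base OutputPath with a `new`
// property carrying a mandatory ParameterAttribute.
using System.Management.Automation;

public class WriteTarCommand : WriterBase
{
    [Parameter(Mandatory = true, Position = 0)]
    public new string OutputPath
    {
        get { return base.OutputPath; }
        set { base.OutputPath = value; }
    }
}
```

The engine reads the parameter metadata from the derived property, so the binder now enforces -OutputPath even though the base declaration doesn't.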

I've also renamed most of the fields to match our naming standard. (Trust me, I've had a hard time adjusting to it, too. ;-)
Jan 9, 2007 at 5:31 PM
OK, I've fixed up the last few remaining violations for member vars -- removed the m_ prefix and replaced it with _.

- Oisin
Jan 9, 2007 at 6:03 PM
OK. Do you have any ideas about the strange 1-second hangs? I can confirm them on my notebook (32-bit Vista). I think I'm experiencing them on 64-bit Longhorn Server too, but that machine is dual-core and a lot faster than the laptop, and there they occur only with larger files, somewhere around 50% of the file...
Jan 9, 2007 at 6:06 PM
Are you talking about bzip2 compression, or about opening a write-tar generated tar?
Jan 9, 2007 at 6:08 PM
I'm talking about bzip2. I don't have WinRAR, so I can't confirm the tar issue.
Jan 9, 2007 at 7:31 PM
I'll look into it later and post my report either this evening or tomorrow. My gut feeling is buffering and/or GC issues.

- Oisin
Jan 9, 2007 at 11:07 PM
Yeah, I guess it might be the garbage collector; maybe bzip2 puts more load on it than the other algorithms. I guess we could start by making the buffer static (lazily created).
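One possible shape for that lazily created static buffer, as a fragment inside the writer class (the field name and the 80 KB size are assumptions, not measured values):

```csharp
// Hedged sketch: allocate one shared buffer on first use instead of a
// fresh array per file, to take pressure off the garbage collector.
private static byte[] _buffer;

private static byte[] GetBuffer()
{
    if (_buffer == null)
    {
        _buffer = new byte[81920]; // allocated once, reused for every file
    }
    return _buffer;
}
```

Note the cmdlet pipeline is single-threaded per invocation, so a plain null check is probably enough here; if the buffer were ever shared across runspaces, it would need locking.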
Jan 10, 2007 at 5:09 AM
With Write-Zip, how do I get it to preserve the directory structure in this case:

gci -rec -fil *.txt | write-zip -output

Also when I tried this on my temp dir (rather large):

C:\Temp> gci -rec | write-zip -output

It got stuck in an infinite loop, compressing the same files over and over. I also got an error saying it couldn't access a file. You might want to check the file being compressed to make sure it isn't the same as the output path.
Jan 10, 2007 at 5:13 AM
One other minor problem: when I Ctrl+C'd the runaway write-zip, it kept a lock on the 641 MB zip file, so I couldn't delete it until I exited PowerShell. Does the EndProcessing method get called on a Ctrl+C abort? If not, I wonder if there is another way to hook Ctrl+C to clean up properly.
Jan 10, 2007 at 5:31 AM
Doh! Never mind on the Ctrl+C issue. I see that you are aware of that.
Jan 11, 2007 at 9:56 PM
Yeah, the piped-files issue brought along a whole new way of breaking things, especially concerning the "container" archivers, e.g. tar and zip.

Using write-zip . -output does the right thing, but piping is buggered. Suffice it to say, I'm on it.
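A minimal guard against re-compressing the archive currently being written, which would explain the infinite loop above when the zip lands inside the directory being scanned. This is my sketch of the fix, not the checked-in code; the method name is an assumption:

```csharp
// Hedged sketch: compare the pipeline input's full path against the
// archive's full path and skip the archive itself.
using System;
using System.IO;

protected bool IsOutputFile(FileInfo input)
{
    string output = Path.GetFullPath(OutputPath);
    return string.Equals(Path.GetFullPath(input.FullName), output,
        StringComparison.OrdinalIgnoreCase);
}

// In ProcessRecord: if (IsOutputFile(file)) { return; } // skip it
```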