comp.lang.ruby

zlib large files problem

greg

3/13/2007 6:42:00 PM

I am trying to decompress an 80MB file (it decompresses to 300MB), but I
keep getting a buffer error. I thought I had a workaround using
chunks, but it fails with 20MB left to go. Any help or suggestions are
greatly appreciated.

in `read': buffer error (Zlib::BufError)


class Zlib::GzipReader
  def each_chunk(chunk_size = 1024)
    yield read(chunk_size) until eof
  end
end

gz = Zlib::GzipReader.open(zip_file)

File.open(non_zip_file, 'wb') do |f|
  gz.each_chunk { |chunk| f.write(chunk) }
end
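For what it's worth, here is a more defensive version of the same chunked loop, a sketch using only the stdlib zlib, which guards against the nil that GzipReader#read returns once the stream is exhausted (the file names above are placeholders, so this round-trips an in-memory payload instead):

```ruby
require 'zlib'
require 'stringio'

class Zlib::GzipReader
  # Same idea as above, but stop on eof? and on a nil read rather than
  # yielding nil into the block.
  def each_chunk(chunk_size = 1024)
    until eof?
      chunk = read(chunk_size)
      break if chunk.nil?
      yield chunk
    end
  end
end

# Round-trip a small payload through gzip to show the loop terminates cleanly.
payload = "hello gzip " * 1000
gz_io = StringIO.new
Zlib::GzipWriter.wrap(gz_io) { |gz| gz.write(payload) }

out = ""
Zlib::GzipReader.new(StringIO.new(gz_io.string)).each_chunk { |c| out << c }
puts out == payload   # prints "true"
```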


7 Answers

David Mullet

3/13/2007 11:00:00 PM


On Mar 13, 2:42 pm, "greg" <eeg...@gmail.com> wrote:
> I am trying to decompress an 80MB file (decompresses to 300MB), but I
> keep getting a buffer error. [...]

I get good results with the rubyzip gem, extracting 300+mb files from
Zip archives...

require 'zip/zip'

zip = Zip::ZipInputStream::open(zipfile)
zip.get_next_entry
File.open(output_file, 'wb') do |f|
  f.write(zip.read)
end
zip.close

Hope that helps.

David

eden li

3/14/2007 1:57:00 AM


The default decompress method works for me...

$ irb -r zlib
irb(main):001:0> File.stat('threehundred.gz').size / (1024**2)
=> 81
irb(main):002:0> Zlib::GzipReader.open('threehundred.gz') { |r|
File.open('threehundred', 'wb') { |f| f.write r.read } }
=> 314835846
irb(main):003:0> File.stat('threehundred').size / (1024**2)
=> 300

What version of ruby are you using?

On Mar 14, 2:42 am, "greg" <eeg...@gmail.com> wrote:
> I am trying to decompress an 80MB file (decompresses to 300MB), but I
> keep getting a buffer error. [...]


greg

3/14/2007 2:13:00 AM


ruby -v
ruby 1.8.4 (2006-04-14) [i386-mswin32]

I should also say that the same code I am using works fine on a file
of 20MB that expands to 60MB.

Does the zip gem (or any other gem) handle .gz files?

Thanks,
Greg

On Mar 13, 7:56 pm, "eden li" <eden...@gmail.com> wrote:
> The default decompress method works for me...
> [...]
> What version of ruby are you using?


eden li

3/14/2007 2:16:00 PM


Strange... all I can think of is that the Ruby installer you used has
a foobar'ed zlib1.dll. Have you tried upgrading to 1.8.5? Also, what
version of zlib does your Ruby claim to have?

> ruby -r zlib -e 'puts Zlib::ZLIB_VERSION'
1.2.3
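A quick way to check whether the binding itself is broken is a deflate/inflate round-trip on a meaningful amount of data. This is just a sketch with an arbitrary ~1MB payload; bump the size up if you want to stress the DLL harder:

```ruby
require 'zlib'

# Report which zlib the binding was built against, then round-trip ~1MB of
# compressible data and verify it comes back byte-for-byte identical.
puts Zlib::ZLIB_VERSION

data = "ruby zlib check " * (1024 * 64)   # ~1 MB of repetitive text
compressed = Zlib::Deflate.deflate(data)
restored = Zlib::Inflate.inflate(compressed)
puts restored == data   # prints "true" when the binding is healthy
```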

On Mar 14, 10:13 am, "greg" <eeg...@gmail.com> wrote:
> ruby -v
> ruby 1.8.4 (2006-04-14) [i386-mswin32]
> [...]
> does the zip gem (or any other gem) handle .gz files?


greg

3/14/2007 3:32:00 PM


ruby -r zlib -e 'puts Zlib::ZLIB_VERSION'
1.2.3

On Mar 14, 8:15 am, "eden li" <eden...@gmail.com> wrote:
> strange... all I can think of is that the ruby installer you used has
> a foobar'ed zlib1.dll. Have you tried upgrading to 1.8.5? Also, what
> version of zlib does your ruby claim to have?
> [...]


Clifford Heath

3/14/2007 4:04:00 PM


greg wrote:
> ruby -r zlib -e 'puts Zlib::ZLIB_VERSION'
> 1.2.3

I recall having had that problem a year or two ago.
I think it was fixed by a version update, but I don't
know the version numbers any more. It was definitely
a bug in zlib though, and I definitely got around it
without having to patch the source.

Sorry I can't help more right now.

Clifford Heath.

greg

3/14/2007 8:41:00 PM


I am using the rubyzip library now; it works fine.

On Mar 14, 10:05 am, Clifford Heath <n...@spam.please.net> wrote:
> greg wrote:
> > ruby -r zlib -e 'puts Zlib::ZLIB_VERSION'
> > 1.2.3
>
> I recall having had that problem a year or two ago.
> I think it was fixed by a version update, but I don't
> know the version numbers any more. [...]