comp.lang.ruby

webrick fails with large files

Roger Pack

2/27/2007 9:36:00 PM

So, apparently WEBrick does the following when serving pages: it loads
the page into a string object, then sends that string to the requestor.
Unfortunately this means that if a very large file is being served,
that buffer string will get very large; with a 700MB file it will fail
because Ruby runs out of memory. This also defeats the 'streaming'
aspect of some Ruby functions. For example, the Rails send_file has an
option to send a file 'a chunk at a time' to the client--but these
chunks are all concatenated within WEBrick and then sent, so it doesn't
actually stream out. I therefore see this as a bug in WEBrick and was
wondering what others thought.
Cheers!
-Roger

--
Posted via http://www.ruby-....

4 Answers

Eric Hodel

2/27/2007 11:27:00 PM


On Feb 27, 2007, at 13:36, Roger Pack wrote:

> So...apparently webrick does the following for serving pages:
> loads the page into a string object
> then sends that string to the requestor.
> Unfortunately this means that if it is a very large file being served
> that buffer string will get very large.

Don't dynamically build and send large strings. Use an IO instead.
WEBrick knows the difference and responds accordingly.

Roger Pack

2/27/2007 11:34:00 PM


Eric Hodel wrote:
> On Feb 27, 2007, at 13:36, Roger Pack wrote:
>
>> So...apparently webrick does the following for serving pages:
>> loads the page into a string object
>> then sends that string to the requestor.
>> Unfortunately this means that if it is a very large file being served
>> that that buffer string will get very large.
>
> Don't dynamically build and send large strings. Use an IO instead.
> WEBrick knows the difference and responds accordingly.

I believe Rails must be using strings, then--any ideas for that case?
Thanks!


Mat Schaffer

2/28/2007 3:53:00 AM


On Feb 27, 2007, at 4:36 PM, Roger Pack wrote:
> So...apparently webrick does the following for serving pages:
> loads the page into a string object
> then sends that string to the requestor.
> [...]

I doubt WEBrick was ever really intended for that sort of work. You
could try Mongrel, although it may yield the same result. Rock-solid
Ruby deployment is still something of a work in progress, I feel.
-Mat

Brian Candler

2/28/2007 8:45:00 AM


On Wed, Feb 28, 2007 at 12:53:18PM +0900, Mat Schaffer wrote:
> On Feb 27, 2007, at 4:36 PM, Roger Pack wrote:
> >So...apparently webrick does the following for serving pages:
> >loads the page into a string object
> >then sends that string to the requestor.
> >[...]
>
> I doubt Webrick was ever really intended for that sort of work. You
> could try Mongrel, although it may yield the same result. Rock solid
> ruby deployment is still something of a work in progress, I feel.

Or run your code as a fastcgi under Apache.

This doesn't stop you from trying to read a 700MB file into a string, of
course, but it does give you the option to simply open the file, read it
a chunk at a time, and squirt it to STDOUT.
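
A sketch of that loop in plain Ruby (the content type and chunk size
here are just placeholders): emit the HTTP headers yourself, then copy
the file out one chunk at a time, so only a single chunk is ever held
in memory regardless of file size.

```ruby
CHUNK_SIZE = 64 * 1024  # 64 KB per read; tune as needed

def stream_file(path, out = $stdout)
  # CGI-style response: headers first, blank line, then the body.
  out.write("Content-Type: application/octet-stream\r\n")
  out.write("Content-Length: #{File.size(path)}\r\n\r\n")
  File.open(path, 'rb') do |f|
    # IO#read(n) returns at most n bytes, and nil at EOF.
    while (chunk = f.read(CHUNK_SIZE))
      out.write(chunk)
    end
  end
end
```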

I believe Rails will run happily under fastcgi. You'll just need to tell it
not to render anything if you're generating your own HTTP headers and body
directly.

Fastcgi also has the advantage of automatically making your program
thread-safe, since each instance is a separate process. The downside is
that if you're handling (say) five concurrent client requests, you'll
have five Ruby+Rails processes spawned from scratch, each with its own
memory footprint.

Regards,

Brian.