Brian Candler
2/28/2007 8:45:00 AM
On Wed, Feb 28, 2007 at 12:53:18PM +0900, Mat Schaffer wrote:
> On Feb 27, 2007, at 4:36 PM, Roger Pack wrote:
> >So...apparently webrick does the following for serving pages:
> >loads the page into a string object
> >then sends that string to the requestor.
> >Unfortunately this means that if a very large file is being served,
> >the buffer string will get very large. E.g. with a 700MB file it
> >will fail because Ruby runs out of memory. This also defeats the
> >'streaming' aspect of some Ruby functions: for example, the RoR
> >send_file has an option to send a file 'a chunk at a time' to the
> >client--however these chunks are all conglomerated within WEBrick
> >and then sent, so it doesn't stream out as intended. I therefore
> >see this as a bug in WEBrick and was wondering what others thought.
> >Cheers!
> >-Roger
>
> I doubt WEBrick was ever really intended for that sort of work. You
> could try Mongrel, although it may yield the same result. Rock-solid
> Ruby deployment is still something of a work in progress, I feel.
Or run your code as a FastCGI under Apache.
This doesn't stop you from trying to read a 700MB file into a string, of
course, but it does give you the option to simply open the file, read it
a chunk at a time, and squirt each chunk to STDOUT.
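Something like the following minimal sketch (not WEBrick's or Rails' actual code; the chunk size, content type, and method name are my own invention) -- memory stays flat no matter how big the file is, because only one chunk is ever held in a string at a time:

```ruby
CHUNK_SIZE = 64 * 1024  # 64KB per read; any modest size will do

# Stream a file to +out+ (STDOUT under CGI/FastCGI) in fixed-size
# chunks, emitting the HTTP headers ourselves first.
def stream_file(path, out = $stdout)
  out.write("Content-Type: application/octet-stream\r\n")
  out.write("Content-Length: #{File.size(path)}\r\n\r\n")
  File.open(path, "rb") do |f|
    # f.read returns nil at EOF, ending the loop
    while (chunk = f.read(CHUNK_SIZE))
      out.write(chunk)
    end
  end
end
```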
I believe Rails will run happily under FastCGI. You'll just need to tell
it not to render anything if you're generating your own HTTP headers and
body directly.
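If you stay within Rails, the controller action could be as simple as the sketch below (the path and buffer size are made up; :stream and :buffer_size are send_file's options for chunked output). Under FastCGI each chunk is written out as it is read, rather than being buffered whole as under WEBrick:

```ruby
class DownloadsController < ApplicationController
  def show
    # send_file reads and writes the file a chunk at a time;
    # :stream => true is the default, shown here for clarity.
    send_file "/tmp/big.iso",            # hypothetical path
              :type        => "application/octet-stream",
              :stream      => true,
              :buffer_size => 64 * 1024  # bytes per chunk
  end
end
```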
FastCGI also has the advantage of making your program effectively
thread-safe, since each instance is a separate process. The downside is
that if you're handling (say) five concurrent client requests, you'll
have five Ruby+Rails processes spawned from scratch, each with its own
memory footprint.
Regards,
Brian.