comp.lang.ruby

Ruby performance

Keith Sader

3/22/2006 1:52:00 PM

I'm considering using Ruby to re-write an extract, transform, and
load process for an online database. This will replace an existing VB
system that does most of the T of the ETL in T-SQL (don't ask).

My replacement choices come down to .NET (C# or VB), Perl, and Ruby.
Since we can have up to 3 million updates to the database during the
day, performance is an issue. Does Ruby perform as well at large text
file transformations as Perl? Does C#, for that matter?

At this point my gut feeling is to write the ETL in Ruby and port it
to Perl if performance becomes an issue.

Any thoughts?

thanks,
--
Keith Sader
ksader@gmail.com
http://www.saderfamily.org/roller/p...
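
(For concreteness, a minimal sketch of the kind of line-oriented transform
being described: read a large delimited extract, reshape each record, and
write a load-ready file. The file names, the pipe delimiter, and the
id/name/amount fields are made up for illustration.)

# Hypothetical line-by-line transform; only the shape of the job matters here.
File.open('load_ready.txt', 'w') do |out|
  File.foreach('extract.txt') do |line|
    id, name, amount = line.chomp.split('|')
    next unless id && name && amount            # skip malformed rows
    out.puts [id, name.strip.upcase, '%.2f' % amount.to_f].join("\t")
  end
end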


7 Answers

pat eyler

3/22/2006 2:06:00 PM


On 3/22/06, Keith Sader <ksader@gmail.com> wrote:
> I'm considering using Ruby to re-write an extract, transform, and
> load process for an online database. This will replace an existing VB
> system that does most of the T of the ETL in T-SQL(don't ask).
>
> My replacement choices come down to .Net(C# or VB), Perl, and Ruby.
> Since we can have up to 3 million updates to the database during the
> day, performance is an issue. Does Ruby perform as well at large text
> file transformations as Perl? Does C# for that matter?
>
> At this point my gut feeling is to write the ETL in Ruby and transform
> it to Perl if performance becomes an issue.

Write it in Ruby, then if you have performance issues -- profile,
benchmark, and optimize. If you really get stuck, RubyInline will
be your friend *far* more than Perl ever would be.
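
(A minimal RubyInline sketch of the kind of hot-spot rewrite meant here; the
sum_bytes function is made up, but builder.c with automatic conversion of
basic C types is how the library works.)

require 'rubygems'
require 'inline'   # RubyInline

class HotSpot
  inline do |builder|
    # The C function below becomes an ordinary instance method.
    builder.c <<-C
      long sum_bytes(char *s) {
        long total = 0;
        while (*s) total += (unsigned char) *s++;
        return total;
      }
    C
  end
end

puts HotSpot.new.sum_bytes("only the profiled hot spot needs this treatment")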




--
thanks,
-pate
-------------------------


Sascha Ebach

3/22/2006 2:09:00 PM


> Write it in Ruby, then if you have performance issues -- profile,
> benchmark, and optimize. If you really get stuck RubyInline will
> be your friend *far* more than Perl ever would.

Unless he uses Perl::Inline :)

(Not advocating for Perl here, just remembered where RubyInline comes from)

-Sascha Ebach


pat eyler

3/22/2006 2:19:00 PM


On 3/22/06, Sascha Ebach <se@digitale-wertschoepfung.de> wrote:
> > Write it in Ruby, then if you have performance issues -- profile,
> > benchmark, and optimize. If you really get stuck RubyInline will
> > be your friend *far* more than Perl ever would.
>
> Unless he uses Perl::Inline :)

You'd still have to use Perl ;^)

>
> (Not advocating for Perl here, just remembered where RubyInline comes from)
>

Well, zenspider's implementation is cool enough that it got Ingy to
look twice (and, iirc, to pull some ideas into the Perl version).

> -Sascha Ebach
>
>


--
thanks,
-pate
-------------------------


Robert Klemme

3/22/2006 2:24:00 PM


Keith Sader wrote:
> I'm considering using Ruby to re-write an extract, transform, and
> load process for an online database. This will replace an existing VB
> system that does most of the T of the ETL in T-SQL(don't ask).
>
> My replacement choices come down to .Net(C# or VB), Perl, and Ruby.
> Since we can have up to 3 million updates to the database during the
> day, performance is an issue. Does Ruby perform as well at large text
> file transformations as Perl? Does C# for that matter?
>
> At this point my gut feeling is to write the ETL in Ruby and transform
> it to Perl if performance becomes an issue.
>
> Any thoughts?

Usually client tool performance is overshadowed by DB performance and
network communication time. By its nature, the DB has to do far more complex
IO to store the data than the client, which just reads or writes a plain
text file sequentially. I'd go with Ruby.

Kind regards

robert
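
(One way to sanity-check Robert's point: time just the in-process transform
on synthetic rows, with no database in the loop, and compare the rate to
what the DB can absorb. The row format below is made up.)

require 'benchmark'

# 100,000 synthetic pipe-delimited rows; no database involved.
rows = Array.new(100_000) { |i| "#{i}|customer #{i}|#{i * 0.01}" }

secs = Benchmark.realtime do
  rows.each do |line|
    id, name, amount = line.split('|')
    [id, name.upcase, '%.2f' % amount.to_f].join("\t")
  end
end

puts "pure-Ruby transform: #{(rows.size / secs).round} rows/sec"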

Randy Kramer

3/22/2006 7:26:00 PM


Keith Sader wrote:
> Any thoughts?

Of course ;-)

3 million updates per day doesn't mean much to me. With a little arithmetic,
that looks like a sustained average load of roughly 35 TPS.
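
(Spelled out, assuming the updates are spread evenly over 24 hours:)

updates_per_day = 3_000_000
seconds_per_day = 24 * 60 * 60                  # 86,400
puts updates_per_day / seconds_per_day.to_f     # => ~34.7, call it 35 TPS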

Having said that, I don't know whether that's reasonably within Ruby's
capabilities or not -- I just think TPS (transactions per second) is a more
common metric for the performance of database-backed applications. (But I
could be wrong ;-)

As a Ruby newbie, unless it's grossly out of the ballpark, I'd probably try to
build it in Ruby first, then optimize, then consider upgrading hardware, and
then consider switching to another language.

Randy Kramer


Eric Kidd

3/22/2006 7:51:00 PM


On Mar 22, 2006, at 2:26 PM, Randy Kramer wrote:
> 3 million updates per day doesn't mean much to me. With a little
> arithmetic, that looks like a sustained average load of roughly 35 TPS.

By an interesting coincidence, Rails sites tend to support about 30
hits/second/server on decent hardware, assuming they have to go all
the way to the database and render views. With action caching (which
bypasses the database and view rendering, but still runs Ruby code),
I've seen benchmarks in the 500 hits/second range.

So Ruby might very well be a plausible solution, depending on a
number of factors. Given the sweet simplicity of ActiveRecord, you
could even spend a couple of days building a prototype and seeing how
fast it goes. :-)

Cheers,
Eric
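
(Along the lines Eric suggests, a standalone ActiveRecord timing sketch; the
connection settings and the staging_rows table with payload/loaded_at columns
are hypothetical and would have to exist already.)

require 'rubygems'
require 'active_record'
require 'benchmark'

# Hypothetical connection settings.
ActiveRecord::Base.establish_connection(
  :adapter  => 'postgresql',
  :database => 'etl_test',
  :username => 'etl',
  :password => 'secret'
)

# Assumes a staging_rows table with payload (text) and loaded_at (timestamp).
class StagingRow < ActiveRecord::Base
end

n = 10_000
secs = Benchmark.realtime do
  n.times do |i|
    StagingRow.create(:payload => "row #{i}", :loaded_at => Time.now)
  end
end
puts "#{(n / secs).round} inserts/sec"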




Reid Thompson

3/22/2006 9:58:00 PM


Eric Kidd wrote:
> On Mar 22, 2006, at 2:26 PM, Randy Kramer wrote:
>> 3 million updates per day doesn't mean much to me. With a little
>> arithmetic, that looks like a sustained average load of roughly 35 TPS.
>
> By an interesting coincidence, Rails sites tend to support about 30
> hits/second/server on decent hardware, assuming they have to go all
> the way to the database and render views. With action caching (which
> bypasses the database and view rendering, but still runs Ruby code),
> I've seen benchmarks in the 500 hits/second range.
>
> So Ruby might very well be a plausible solution, depending on a number
> of factors. Given the sweet simplicity of ActiveRecord, you could even
> spend a couple of days building a prototype and seeing how fast it
> goes. :-)
>
> Cheers,
> Eric
>
>
>
Simple test, run from within the RDE editor on Windows XP, with 119-odd
processes running (Windows, Cygwin, etc.) and 1 GB of RAM:

testog=# select version();
                                          version
------------------------------------------------------------------------------------------
 PostgreSQL 8.1.3 on i686-pc-mingw32, compiled by GCC gcc.exe (GCC) 3.4.2 (mingw-special)
(1 row)

10000 inserts in < 60 seconds, i.e. at least ~167 tps (the timestamps below
show 55 seconds elapsed, roughly 182 tps).

testog=# truncate ogcomment;
TRUNCATE TABLE
testog=# select * from ogcomment;
title | body | author | create_time | oid
-------+------+--------+-------------+-----
(0 rows)

I, [2006-03-22T16:44:48.183000 #3220] INFO -- : Og uses the Psql store.
Wed Mar 22 16:44:49 Eastern Standard Time 2006
Wed Mar 22 16:45:44 Eastern Standard Time 2006
D, [2006-03-22T16:44:49.464000 #3220] DEBUG -- : Table ogcomment already
exists
D, [2006-03-22T16:44:49.495000 #3220] DEBUG -- : PostgreSQL processing
foreign key constraints
D, [2006-03-22T16:44:49.495000 #3220] DEBUG -- : PostgreSQL finished
setting constraints. No action was taken in 0.00 seconds.
Completed(0)



testog=# select count(*) from ogcomment;
count
-------
10000
(1 row)

testog=# select min(create_time),max(create_time) from ogcomment;
min | max
---------------------+---------------------
2006-03-22 16:44:49 | 2006-03-22 16:45:44
(1 row)



require 'og'

# Og model: one table, four columns.
class Comment
  property :title, String
  property :body, String
  property :author, String
  property :create_time, Time
end

og_psql = {
  :destroy  => true,
  :store    => :psql,
  :user     => 'rthompso',
  :password => 'rthompso',
  :name     => 'testog'
}

Og.setup(og_psql)

# Warm-up object (never saved); the real work is the loop below.
c = Comment.new
c.title = 'Hello'
c.body = 'World'
c.create_time = Time.now
c.author = 'tml'

puts Time.now
# save 10,000 objects in the database
1.upto(10000) do |i|
  c = Comment.new
  c.title = 'Hello'
  c.body = 'World'
  c.create_time = Time.now
  c.author = 'tml'
  c.save
end
puts Time.now
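
(A small variation on the driver loop above -- same Comment model and
Og.setup assumed -- that computes the rate directly instead of leaving the
reader to subtract timestamps:)

require 'benchmark'

n = 10_000
secs = Benchmark.realtime do
  n.times do
    c = Comment.new
    c.title, c.body, c.author = 'Hello', 'World', 'tml'
    c.create_time = Time.now
    c.save
  end
end
puts "#{(n / secs).round} inserts/sec"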