Jeremy Hinegardner
8/14/2008 9:26:00 PM
On Fri, Aug 15, 2008 at 05:56:46AM +0900, ara.t.howard wrote:
>
> On Aug 14, 2008, at 1:10 PM, Martin DeMello wrote:
>
>> On Thu, Aug 14, 2008 at 12:06 PM, ara.t.howard <ara.t.howard@gmail.com>
>> wrote:
>>>
>>> buffer them and insert them in a transaction 1000 at a time. even with
>>> ruby this should be a piece of cake.
>>
>> Do any of the ruby db libraries offer support for doing this efficiently?
>>
>> martin
>
> pretty much all of them
>
[...]
> cfp:~/rails_root > ./script/runner a.rb
> using sqlite3
> elapsed: 0.222311019897461
> count: 10000
>
> using ar
> elapsed: 7.75591206550598
> count: 10000
>
> 0.2 seconds for 10000 records seems plenty fast to me. 7 seconds not so
> much.
If your standard of performance is 10,000 records inserted in a minute, any
database should be able to satisfy your requirements.
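The buffer-and-batch idea ara describes (collect rows, then insert 1000 at a time inside a transaction) can be sketched in plain Ruby. `FakeDb` below is a hypothetical stand-in for a real connection, not part of any gem; with a real driver each flush would be a `db.transaction { ... }` block.

```ruby
# Buffer rows and flush them in batches of 1000, one transaction per batch.
# FakeDb is a stand-in: `transaction` just yields, `insert` counts rows.
class FakeDb
  attr_reader :committed
  def initialize; @committed = 0; end

  def transaction
    yield self
  end

  def insert(row)
    @committed += 1
  end
end

BATCH_SIZE = 1000
db   = FakeDb.new
rows = Array.new(10_000) { rand.to_s }

rows.each_slice(BATCH_SIZE) do |batch|
  db.transaction do |t|
    batch.each { |row| t.insert(row) }
  end
end

puts db.committed  # => 10000
```

The point is that the transaction boundary amortizes the per-commit fsync cost across the whole batch, which is where the speedup in the numbers below comes from.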
And here's the Amalgalite version of ara's test: embedded SQLite in a Ruby
extension.
% cat am_inserts.rb
#!/usr/bin/env ruby
require 'rubygems'
require 'fileutils'
require 'amalgalite'

size     = Integer(ARGV.shift || 10_000)
messages = Array.new(size).map { rand.to_s }

Db = "speed-test.db"
FileUtils.rm_f Db

db = Amalgalite::Database.new( Db )
db.execute("CREATE TABLE messages(content)")

before = Time.now.to_f
db.transaction do |db_in_trans|
  messages.each do |m|
    # m is a numeric string, so the unquoted interpolation parses as a number
    db_in_trans.execute("insert into messages(content) values( #{m} )")
  end
end
after = Time.now.to_f

elapsed = after - before
mps = size / elapsed
puts "#{"%0.2f" % elapsed} seconds to insert #{size} records at #{"%0.2f" % mps} records per second"
% ruby am_inserts.rb
0.38 seconds to insert 10000 records at 25999.01 records per second
% ruby am_inserts.rb 100000
3.80 seconds to insert 100000 records at 26344.71 records per second
enjoy,
-jeremy
--
========================================================================
Jeremy Hinegardner jeremy@hinegardner.org