
comp.lang.ruby

Array flatten memory "leak"?

Ron M

7/19/2007 10:07:00 PM

After I use "flatten" on a big array, the garbage collector
seems to get pretty forgetful of what memory it's allowed
to free up.

In both of the following scripts, the first two lines
create an array of many arrays and then flatten it
back into a single flat array.

In both cases, the scripts use about 20MB of memory after
creating this array. After that, they repeatedly pack and
unpack an array. The script that used "flatten" keeps
growing quickly until it's using about 1/4 GB of memory.
The script that flattened the array by hand stays right
around 20MB.

What could it be about flatten that makes the
garbage collector fail to see that it could be
re-using the memory of the pack/unpack loop?


#########################################################################
# This unexpectedly grows to 200+MB.
#########################################################################
a = (1..200000).map{|x| [x.to_s]};a.length
b = a.flatten.map{|x| x.to_i} ; b.length
puts `grep VmSize /proc/#{$$}/status` ######## under 20 MB
(1..100).each{|x| b.pack("I*").unpack("I*");}
puts `grep VmSize /proc/#{$$}/status` ######## WHOA GREW TO 1/4 GB.


#########################################################################
# This stays at 20MB
#########################################################################
a = (1..200000).map{|x| [x.to_s]};a.length
b = a.map{|x| x[0].to_i} ; b.length
puts `grep VmSize /proc/#{$$}/status` ######## under 20 MB
(1..100).each{|x| b.pack("I*").unpack("I*");}
puts `grep VmSize /proc/#{$$}/status` ######## still 20 MB





4 Answers

ara.t.howard

7/20/2007 2:46:00 AM



On Jul 19, 2007, at 4:07 PM, Ron M wrote:

> #########################################################################
> # This unexpectedly grows to 200+MB.
not
> #########################################################################
> a = (1..200000).map{|x| [x.to_s]};a.length
> b = a.flatten.map{|x| x.to_i} ; b.length

a #=> big array
a.flatten #=> copy of said big array, flattened
a.flatten.map{|x| x.to_i} #=> copy of that flattened copy

so you take two copies to get one and neglect to call GC.start

> puts `grep VmSize /proc/#{$$}/status` ######## under 20 MB
> (1..100).each{|x| b.pack("I*").unpack("I*");}
> puts `grep VmSize /proc/#{$$}/status` ######## WHOA GREW TO 1/4 GB.
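
[A minimal sketch of the point above, assuming Ruby 1.8 on Linux where
/proc/<pid>/status exists; tmp is just an illustrative name. Splitting the
one-liner makes the intermediate copy from flatten visible, so it can be
dropped explicitly before asking the GC to run.]

a = (1..200000).map{|x| [x.to_s]}
tmp = a.flatten               # copy #1: a flattened copy of the big array
b = tmp.map{|x| x.to_i}       # copy #2: the array we actually want
tmp = nil                     # drop the only reference to the intermediate...
GC.start                      # ...and ask the collector to reclaim it
puts `grep VmSize /proc/#{$$}/status`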

a @ http://draw...
--
we can deny everything, except that we have the possibility of being
better. simply reflect on that.
h.h. the 14th dalai lama




Eric Mahurin

7/20/2007 2:49:00 AM


On 7/19/07, Ron M <rm_rails@cheapcomplexdevices.com> wrote:
> After I use "flatten" on a big array, the garbage collector
> seems to get pretty forgetful of what memory it's allowed
> to free up.
> [...]
> What could it be about flatten that makes the
> garbage collector fail to see that it could be
> re-using the memory of the pack/unpack loop?


I narrowed this down a bit by looking at the C code. The culprit in
Array#flatten seems to be rb_ary_splice, which also gets called when
assigning an array slice (one form of Array#[]=). The culprit in
Array#pack seems to be rb_str_buf_cat, which also gets called from
String#<< with a Fixnum argument. I also found that you can completely
remove all references to the array you flattened/sliced and the problem
still persists. A more fundamental test showing the problem would be this:

a = Array.new(200000) { |x| [x] }
a.size.times { |i| a[i,1] = a[i][0,1] }  # calls rb_ary_splice, triggering the problem
#a.size.times { |i| a[i] = a[i][0] }     # calls rb_ary_store, which seems OK
puts `grep VmSize /proc/#{$$}/status`
a = nil
GC.start
(1..200).each{ s="";200000.times{ s<<(?A) }}   # calls rb_str_buf_cat, which "leaks"
#(1..200).each{ s="";200000.times{ s<<"A" }}   # calls rb_str_append, which seems OK
puts `grep VmSize /proc/#{$$}/status`
GC.start
puts `grep VmSize /proc/#{$$}/status`

The amount of "leakage" seems to be about linear with respect to the
number of iterations in the last loop. Each string generated in the
loop should be 200K, but is not used again, so the memory increase due
to the last loop should be O(1) (around 200K independent of the number
of iterations). Also, the GC.start doesn't help.
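
[A small measurement sketch, assuming the same Ruby 1.8 / Linux setup as the
scripts above; vm_size_kb is a hypothetical helper wrapping the same grep.
It checks that the growth really is roughly linear in the iteration count by
running the String#<< ?A loop for increasing counts and printing VmSize
after each pass.]

def vm_size_kb
  `grep VmSize /proc/#{$$}/status`[/\d+/].to_i   # e.g. "VmSize:  20480 kB" -> 20480
end

[50, 100, 200].each do |n|
  n.times { s = ""; 200000.times { s << ?A } }   # the rb_str_buf_cat path
  GC.start
  puts "after #{n} more iterations: #{vm_size_kb} kB"
end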

If you use either of the alternatives, it seems to work.

This sure looks like a bug to me.

Eric

Ron M

7/22/2007 12:08:00 AM


ara.t.howard wrote:
> On Jul 19, 2007, at 4:07 PM, Ron M wrote:
>> #########################################################################
>> # This unexpectedly grows to 200+MB.
> not
really?

>> #########################################################################
>> a = (1..200000).map{|x| [x.to_s]};a.length
>> b = a.flatten.map{|x| x.to_i} ; b.length
> a #=> big array
> a.flatten #=> copy of said big array, flattened
> a.flatten.map{|x| x.to_i} #=> copy of that flattened copy
>
> so you take two copies to get one and neglect to call GC.start

But at that point the program is still very small (20MB).

But the strange (to me) thing is that it's the next loop -
which seems like it shouldn't be allocating more memory, since
nothing gets saved to variables - where the program quickly
grows to over 200MB.

>> puts `grep VmSize /proc/#{$$}/status` ######## under 20 MB
>> (1..100).each{|x| b.pack("I*").unpack("I*");}
>> puts `grep VmSize /proc/#{$$}/status` ######## WHOA GREW TO 1/4 GB.

But this loop growing memory seemingly without bound seems
somewhat unrelated to the flatten lines above, which only
used 20MB, doesn't it?

ara.t.howard

7/22/2007 3:15:00 AM



On Jul 21, 2007, at 6:08 PM, Ron M wrote:

> But the strange (to me) thing is that it's the next loop -
> which seems like it shouldn't be allocating more memory, since
> nothing gets saved to variables - where the program quickly
> grows to over 200MB

but assigning to variables isn't what determines whether memory is
used, it's a simple matter of object creation:

cfp:~ > cat a.rb
#! /usr/bin/env ruby
STDOUT.sync = true
require 'yaml'

def n_objects
  n = 0 and ObjectSpace.each_object{ n += 1 }
  n
end

y 'n_objects_0' => n_objects

65635.times{ "foo" }

y 'n_objects_1' => n_objects

GC.start

y 'n_objects_2' => n_objects



cfp:~ > ruby a.rb
---
n_objects_0: 2599
---
n_objects_1: 28143
---
n_objects_2: 1648



if it were, then n_objects_1 wouldn't show such a huge increase: we've
assigned no variables here. my point is that what matters is
object creation and when those objects get freed by the GC.
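
[A hedged sketch, assuming MRI 1.8, tying this back to the original
pack/unpack loop; live_strings is a hypothetical helper. Nothing is ever
assigned, yet every pass creates a fresh String and Array that sit on the
heap until the GC collects them.]

def live_strings
  n = 0
  ObjectSpace.each_object(String){ n += 1 }
  n
end

b = (1..200000).to_a
before = live_strings
10.times { b.pack("I*").unpack("I*") }   # no assignments at all...
puts live_strings - before               # ...but the throwaway Strings are typically still live here
GC.start
puts live_strings - before               # after an explicit GC run most of them are gone again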

regards.

a @ http://draw...
--
we can deny everything, except that we have the possibility of being
better. simply reflect on that.
h.h. the 14th dalai lama