color version - PET: Python Entre Todos - Python Argentina
If black is used space, and white is free space, you can see here how there's a lot of free space, but unusable for objects beyond some size, because it's not contiguous space.

Put simply, memory fragmentation happens when there's lots of free space, but it's not contiguous. Like in the above map, much of it is free space, but it's unusable for big objects because, in contrast with a file you can split into several blocks on disk, an object's memory region needs to be contiguous.
So, in contrast with fragmentation in a file system, memory fragmentation makes portions of memory unusable. If I needed memory for a big object, say a few megabytes, I would have to use the area towards the end of the map (that is, extend the virtual image of the process). This is effectively what malloc does when it's faced with this situation.
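The situation can be sketched with a toy model of the allocator's memory map (a hypothetical free-every-other-block pattern, not real malloc behavior): lots of total free space, yet no contiguous hole big enough for a large object.

```python
# Toy model: memory as 1000 fixed-size blocks, True = used, False = free.
memory = [True] * 1000
for i in range(0, 1000, 2):
    memory[i] = False  # free every other block, leaving isolated holes

total_free = memory.count(False)

# The largest run of contiguous free blocks bounds the biggest object
# we can place without extending the process image.
largest_run = run = 0
for used in memory:
    run = 0 if used else run + 1
    largest_run = max(largest_run, run)

print(total_free)   # 500: half the memory is free...
print(largest_run)  # 1: ...but no hole fits anything bigger than one block
```

Half the map is free, yet any request for more than one block has to go past the end of the map, exactly like malloc extending the heap.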
The immediate, visible effect is an inefficiency in the use of available memory. If my program needs 2GB of memory, in theory it could be reserving 4GB from the operating system (because it holds many small free pieces that it cannot reuse). With bad luck, this could make my system swap. With really bad luck, it could thrash, and die.

Let's see some code that fragments memory:
>>> l = []
>>> for i in xrange(1,100):
...     ll = [ " " * i * 16 for j in xrange(1000000 / i) ]
...     ll = ll[::2]
...     l.extend(ll)
...     print sum(map(len,l))
...
8000000
16000000
…
792005616
>>>
But there's something worse. Suppose I do:
>>> del l
>>> del ll
I get from top:
10467 claudiof 20 0 1532m 1.5g 1864 S 0 75.6 1:17.96 python
If I repeat the fragmentation example, I can confirm that those 1.5G are effectively free for Python:
10467 claudiof 20 0 1676m 1.6g 1864 S 0 82.8 2:33.39 python
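Instead of watching top, a process can also read its own resident set size from /proc (a sketch, assuming a Linux /proc filesystem; `rss_kb` is a name made up here):

```python
def rss_kb():
    # Parse the VmRSS field out of /proc/self/status (Linux-specific).
    # This is the same resident-set number that top shows in the RES column.
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # value is reported in kB
    return None

print(rss_kb())
```

Printing this before and after each allocation round gives the same picture as the top lines above, without leaving the interpreter.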
But if I try to free them (return them to the operating system), I can't.

WTF?

After all this, top says:
10467 claudiof 20 0 1676m 1.6g 1864 S 0 82.7 1:17.07 python

Meaning: even though, according to our calculations, the program should consume 800M of memory, it effectively consumes 1.6G. Double that.

Why?

Well, the example was specifically tailored to create 50% of unusable holes. The memory is, then, fragmented by 50%.

Enter Guppy

Guppy is a little red fish commonly seen in fish tanks everywhere. Those little fish there, those are called guppies.

Really.

It's also an extension library for Python that contains a module, heapy, which allows me to do memory diagnostics.

Really.
So, let's try to use it:
>>> from guppy import hpy
>>> hp = hpy()
>>> hp.heap()