heapq.py revision a0b3a00bc5f55cfbdc3d9b7925ee8a28fa2bdc55
1"""Heap queue algorithm (a.k.a. priority queue).
2
3Heaps are arrays for which a[k] <= a[2*k+1] and a[k] <= a[2*k+2] for
4all k, counting elements from 0.  For the sake of comparison,
5non-existing elements are considered to be infinite.  The interesting
6property of a heap is that a[0] is always its smallest element.
7
8Usage:
9
10heap = []            # creates an empty heap
11heappush(heap, item) # pushes a new item on the heap
12item = heappop(heap) # pops the smallest item from the heap
13item = heap[0]       # smallest item on the heap without popping it
14
15Our API differs from textbook heap algorithms as follows:
16
17- We use 0-based indexing.  This makes the relationship between the
18  index for a node and the indexes for its children slightly less
19  obvious, but is more suitable since Python uses 0-based indexing.
20
21- Our heappop() method returns the smallest item, not the largest.
22
23These two make it possible to view the heap as a regular Python list
24without surprises: heap[0] is the smallest item, and heap.sort()
25maintains the heap invariant!
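
For example (illustrative only; any ascending list is already a valid heap):

heap = [1, 3, 5, 7, 9, 2]
heap.sort()          # [1, 2, 3, 5, 7, 9] still satisfies the invariant
heappush(heap, 4)    # heap[0] == 1: still the smallest item
item = heappop(heap) # item == 1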
26"""
27
28# Code by Kevin O'Connor
29
30__about__ = """Heap queues
31
[explanation by François Pinard]

Heaps are arrays for which a[k] <= a[2*k+1] and a[k] <= a[2*k+2] for
all k, counting elements from 0.  For the sake of comparison,
non-existing elements are considered to be infinite.  The interesting
property of a heap is that a[0] is always its smallest element.

The strange invariant above is meant to be an efficient memory
representation for a tournament.  The numbers below are `k', not a[k]:

                                   0

                  1                                 2

          3               4                5               6

      7       8       9       10      11      12      13      14

    15 16   17 18   19 20   21 22   23 24   25 26   27 28   29 30


In the tree above, each cell `k' is topping `2*k+1' and `2*k+2'.  In
a usual binary tournament we see in sports, each cell is the winner
over the two cells it tops, and we can trace the winner down the tree
to see all opponents s/he had.  However, in many computer applications
of such tournaments, we do not need to trace the history of a winner.
To be more memory efficient, when a winner is promoted, we try to
replace it by something else at a lower level, and the rule becomes
that a cell and the two cells it tops contain three different items,
but the top cell "wins" over the two topped cells.

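In code, the invariant is easy to state.  The helper below is only an
illustration for this explanation (it is not part of the module):

def is_heap(a):
    # every cell must "win" over the (up to) two cells it tops
    return all(a[k] <= a[c]
               for k in range(len(a))
               for c in (2*k+1, 2*k+2) if c < len(a))

assert is_heap([0, 1, 2, 3, 5, 4, 6])
assert not is_heap([2, 1, 0])
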
If this heap invariant is protected at all times, index 0 is clearly
the overall winner.  The simplest algorithmic way to remove it and
find the "next" winner is to move some loser (let's say cell 30 in the
diagram above) into the 0 position, and then percolate this new 0 down
the tree, exchanging values, until the invariant is re-established.
This is clearly logarithmic on the total number of items in the tree.
By iterating over all items, you get an O(n ln n) sort.

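A rough sketch of that removal step, with invented names and simple
swaps (heappop() further below implements the same idea, but avoids
the redundant assignments):

def extract_winner(a):
    winner = a[0]
    a[0] = a[-1]            # move some loser into the 0 position
    del a[-1]
    k = 0
    while 2*k + 1 < len(a):
        c = 2*k + 1         # pick the smaller of the two topped cells
        if c + 1 < len(a) and a[c+1] < a[c]:
            c += 1
        if a[k] <= a[c]:    # invariant re-established
            break
        a[k], a[c] = a[c], a[k]
        k = c
    return winner

Calling this repeatedly yields the items in increasing order, which is
the O(n ln n) sort just described.
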
A nice feature of this sort is that you can efficiently insert new
items while the sort is going on, provided that the inserted items are
not "better" than the last 0'th element you extracted.  This is
especially useful in simulation contexts, where the tree holds all
incoming events, and the "win" condition means the smallest scheduled
time.  When an event schedules other events for execution, they are
scheduled into the future, so they can easily go into the heap.  So, a
heap is a good structure for implementing schedulers (this is what I
used for my MIDI sequencer :-).

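As a small, hedged sketch (the event names are made up), such a
scheduler keeps (time, event) pairs on a heap and always pops the
earliest one:

events = []
heappush(events, (10.0, "note-off"))
heappush(events, (2.5, "note-on"))
when, what = heappop(events)                # (2.5, "note-on") comes out first
heappush(events, (when + 5.0, "follow-up")) # new events go further in the future
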
Various structures for implementing schedulers have been extensively
studied, and heaps are good for this, as they are reasonably speedy,
the speed is almost constant, and the worst case is not much different
than the average case.  However, there are other representations which
are more efficient overall, yet the worst cases might be terrible.

Heaps are also very useful in big disk sorts.  You most probably all
know that a big sort implies producing "runs" (which are pre-sorted
sequences, whose size is usually related to the amount of CPU memory),
followed by merging passes for these runs, which merging is often
very cleverly organised[1].  It is very important that the initial
sort produces the longest runs possible.  Tournaments are a good way
to achieve that.  If, using all the memory available to hold a
tournament, you replace and percolate items that happen to fit the
current run, you'll produce runs which are twice the size of the
memory for random input, and much better for input fuzzily ordered.

Moreover, if you output the 0'th item on disk and get an input which
may not fit in the current tournament (because the value "wins" over
the last output value), it cannot fit in the heap, so the size of the
heap decreases.  The freed memory could be cleverly reused immediately
for progressively building a second heap, which grows at exactly the
same rate the first heap is melting.  When the first heap completely
vanishes, you switch heaps and start a new run.  Clever and quite
effective!

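A hedged sketch of that run-producing loop (the names and the "memory"
budget are invented for illustration, and the items for the next run
are simply collected and re-heapified at the run boundary instead of
being grown in place):

def make_runs(items, memory=8):
    items = iter(items)
    heap = []                        # the current tournament
    for item in items:               # fill the available memory
        heappush(heap, item)
        if len(heap) == memory:
            break
    while heap:
        run, pending = [], []
        while heap:
            smallest = heappop(heap)
            run.append(smallest)     # "output the 0'th item on disk"
            for item in items:       # read at most one replacement item
                if item < smallest:
                    pending.append(item)   # cannot fit in the current run
                else:
                    heappush(heap, item)
                break
        yield run                    # the first heap has vanished
        for item in pending:         # start the next run from the second heap
            heappush(heap, item)

Each yielded run is sorted, and for random input a run is typically
about twice as long as the memory budget.
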
In a word, heaps are useful memory structures to know.  I use them in
a few applications, and I think it is good to keep a `heap' module
around. :-)

--------------------
[1] The disk balancing algorithms which are current, nowadays, are
more annoying than clever, and this is a consequence of the seeking
capabilities of the disks.  On devices which cannot seek, like big
tape drives, the story was quite different, and one had to be very
clever to ensure (far in advance) that each tape movement will be the
most effective possible (that is, will best participate at
"progressing" the merge).  Some tapes were even able to read
backwards, and this was also used to avoid the rewinding time.
Believe me, real good tape sorts were quite spectacular to watch!
From all times, sorting has always been a Great Art! :-)
"""

def heappush(heap, item):
    """Push item onto heap, maintaining the heap invariant."""
    pos = len(heap)
    heap.append(None)
    # Bubble the new item up from the freshly added leaf: move each
    # larger parent down until the right spot for item is found.
    while pos:
        parentpos = (pos - 1) >> 1
        parent = heap[parentpos]
        if item >= parent:
            break
        heap[pos] = parent
        pos = parentpos
    heap[pos] = item

def heappop(heap):
    """Pop the smallest item off the heap, maintaining the heap invariant."""
    endpos = len(heap) - 1
    if endpos <= 0:
        # Zero or one items: pop() does the job (raising IndexError if empty).
        return heap.pop()
    returnitem = heap[0]
    # The last leaf takes the root's place: sift it down from position 0,
    # moving the smaller child up at each level until its spot is found.
    item = heap.pop()
    pos = 0
    while True:
        child2pos = (pos + 1) * 2
        child1pos = child2pos - 1
        if child2pos < endpos:
            # Both children exist.
            child1 = heap[child1pos]
            child2 = heap[child2pos]
            if item <= child1 and item <= child2:
                break
            if child1 < child2:
                heap[pos] = child1
                pos = child1pos
                continue
            heap[pos] = child2
            pos = child2pos
            continue
        if child1pos < endpos:
            # Only the left child exists.
            child1 = heap[child1pos]
            if child1 < item:
                heap[pos] = child1
                pos = child1pos
        break
    heap[pos] = item
    return returnitem

if __name__ == "__main__":
    # Simple sanity test
    heap = []
    data = [1, 3, 5, 7, 9, 2, 4, 6, 8, 0]
    for item in data:
        heappush(heap, item)
    sort = []
    while heap:
        sort.append(heappop(heap))
    print(sort)