Heap (data structure)
In computer science, a heap is a specialized tree-based data structure that satisfies the heap property: if P is a parent node of C, then the key (the value) of P is either greater than or equal to (in a max heap) or less than or equal to (in a min heap) the key of C.[1] The node at the "top" of the heap (with no parents) is called the root node.
The heap is one maximally efficient implementation of an abstract data type called a priority queue, and in fact priority queues are often referred to as "heaps", regardless of how they may be implemented. A common implementation of a heap is the binary heap, in which the tree is a binary tree (see figure). The heap data structure, specifically the binary heap, was introduced by J. W. J. Williams in 1964, as a data structure for the heapsort sorting algorithm.[2] Heaps are also crucial in several efficient graph algorithms such as Dijkstra's algorithm.
In a heap, the highest (or lowest) priority element is always stored at the root. A heap is not a sorted structure and can be regarded as partially ordered. As visible from the heap diagram, there is no particular relationship among nodes on any given level, even among the siblings. When a heap is a complete binary tree, it has the smallest possible height: a heap with N nodes and a branches for each node always has height log_a N. A heap is a useful data structure when it is necessary to repeatedly remove the object with the highest (or lowest) priority.
Note that, as shown in the graphic, there is no implied ordering between siblings or cousins and no implied sequence for an in-order traversal (as there would be in, e.g., a binary search tree). The heap relation mentioned above applies only between nodes and their parents, grandparents, etc. The maximum number of children each node can have depends on the type of heap, but in many types it is at most two, which is known as a binary heap.
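For example, the following short Python sketch checks the min-heap property over the implicit array layout described in the Implementation section below (the function name and sample values are only for illustration):

```python
def is_min_heap(a):
    """Return True if list `a` satisfies the min-heap property:
    every parent is less than or equal to each of its children."""
    n = len(a)
    for i in range(n):
        for child in (2 * i + 1, 2 * i + 2):   # zero-based children of node i
            if child < n and a[i] > a[child]:
                return False
    return True

# A valid min-heap: each parent <= its children, but siblings are unordered.
print(is_min_heap([1, 3, 2, 7, 4, 5, 9]))   # True
print(is_min_heap([1, 3, 2, 7, 4, 0, 9]))   # False (0 is below its parent 2)
```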
Contents
1 Operations
2 Implementation
3 Variants
4 Comparison of theoretic bounds for variants
5 Applications
6 Implementations
7 See also
8 References
9 External links
Operations
The common operations involving heaps are listed below (a short example mapping several of them onto a concrete library follows the list):
- Basic
find-max [or find-min]: find a maximum item of a max-heap, or a minimum item of a min-heap, respectively (a.k.a. peek)
insert: adding a new key to the heap (a.k.a., push[3])
extract-max [or extract-min]: returns the node of maximum value from a max heap [or minimum value from a min heap] after removing it from the heap (a.k.a., pop[4])
delete-max [or delete-min]: removing the root node of a max heap [or min heap], respectively
replace: pop the root and push a new key. This is more efficient than a pop followed by a push, since the heap only needs to be rebalanced once, not twice, and it is appropriate for fixed-size heaps.[5]
- Creation
create-heap: create an empty heap
heapify: create a heap out of given array of elements
merge (union): joining two heaps to form a valid new heap containing all the elements of both, preserving the original heaps.
meld: joining two heaps to form a valid new heap containing all the elements of both, destroying the original heaps.
- Inspection
size: return the number of items in the heap.
is-empty: return true if the heap is empty, false otherwise.
- Internal
increase-key or decrease-key: updating a key within a max- or min-heap, respectively
delete: delete an arbitrary node (followed by moving last node and sifting to maintain heap)
sift-up: move a node up in the tree, as long as needed; used to restore the heap condition after insertion. Called "sift" because the node moves up the tree until it reaches the correct level, as in a sieve.
sift-down: move a node down in the tree, similar to sift-up; used to restore heap condition after deletion or replacement.
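As a concrete illustration, here is how several of these operations map onto Python's standard heapq module, which implements a binary min-heap on top of a plain list (a sketch; not every operation above has a direct equivalent):

```python
import heapq

h = []                           # create-heap: an empty list is an empty heap
heapq.heappush(h, 5)             # insert (push)
heapq.heappush(h, 1)
heapq.heappush(h, 3)

print(h[0])                      # find-min (peek): 1
print(heapq.heappop(h))          # extract-min (pop): 1
print(heapq.heapreplace(h, 7))   # replace: pops 3, pushes 7, returns 3

data = [9, 4, 7, 1]
heapq.heapify(data)              # heapify: build a heap from an array in O(n)
print(data[0])                   # 1

# k-way merge of already-sorted iterables (returns a lazy iterator):
print(list(heapq.merge([1, 4], [2, 3])))   # [1, 2, 3, 4]
```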
Implementation
Heaps are usually implemented in an array (fixed size or dynamic array), and do not require pointers between elements. After an element is inserted into or deleted from a heap, the heap property may be violated and the heap must be balanced by internal operations.
Binary heaps may be represented in a very space-efficient way (as an implicit data structure) using an array alone. The first (or last) element will contain the root. The next two elements of the array contain its children. The next four contain the four children of the two child nodes, etc. Thus the children of the node at position n would be at positions 2n and 2n + 1 in a one-based array, or 2n + 1 and 2n + 2 in a zero-based array. This allows moving up or down the tree by doing simple index computations. Balancing a heap is done by sift-up or sift-down operations (swapping elements which are out of order). As we can build a heap from an array without requiring extra memory (for the nodes, for example), heapsort can be used to sort an array in-place.
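For instance, the index arithmetic for a zero-based array looks like this (a minimal sketch; the helper names are not standard):

```python
def parent(i):
    return (i - 1) // 2       # parent of the node at index i (for i > 0)

def left(i):
    return 2 * i + 1          # left child

def right(i):
    return 2 * i + 2          # right child

# The node stored at index 4 has its parent at index 1
# and its children (if present) at indexes 9 and 10.
assert parent(4) == 1 and left(4) == 9 and right(4) == 10
```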
Different types of heaps implement the operations in different ways, but notably, insertion is often done by adding the new element at the end of the heap in the first available free space. This will generally violate the heap property, and so the elements are then sifted up until the heap property has been reestablished. Similarly, deleting the root is done by removing the root and then putting the last element in the root and sifting down to rebalance. Thus replacing is done by deleting the root and putting the new element in the root and sifting down, avoiding a sifting up step compared to pop (sift down of last element) followed by push (sift up of new element).
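The following Python sketch of a binary min-heap follows exactly this procedure: insert appends at the end and sifts up, extract-min moves the last element to the root and sifts down, and replace overwrites the root and sifts down once. It is a simplified illustration rather than a production implementation; the class and method names are chosen for this example.

```python
class MinHeap:
    def __init__(self):
        self.a = []                      # implicit binary tree stored in an array

    def _sift_up(self, i):
        # Move element i toward the root while it is smaller than its parent.
        while i > 0 and self.a[i] < self.a[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.a[i], self.a[parent] = self.a[parent], self.a[i]
            i = parent

    def _sift_down(self, i):
        # Move element i toward the leaves while a child is smaller than it.
        n = len(self.a)
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and self.a[child] < self.a[smallest]:
                    smallest = child
            if smallest == i:
                return
            self.a[i], self.a[smallest] = self.a[smallest], self.a[i]
            i = smallest

    def insert(self, key):               # push: append at the end, then sift up
        self.a.append(key)
        self._sift_up(len(self.a) - 1)

    def extract_min(self):               # pop: move last element to root, sift down
        root = self.a[0]
        last = self.a.pop()
        if self.a:
            self.a[0] = last
            self._sift_down(0)
        return root

    def replace(self, key):              # pop root and push new key with one sift
        root = self.a[0]
        self.a[0] = key
        self._sift_down(0)
        return root
```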
Construction of a binary (or d-ary) heap out of a given array of elements may be performed in linear time using the classic Floyd algorithm, with the worst-case number of comparisons equal to 2N − 2s₂(N) − e₂(N) (for a binary heap), where s₂(N) is the sum of all digits of the binary representation of N and e₂(N) is the exponent of 2 in the prime factorization of N.[6] This is faster than a sequence of consecutive insertions into an originally empty heap, which is log-linear.[a]
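A minimal sketch of this bottom-up (Floyd) construction for a binary min-heap: every internal node, from the last one down to the root, is sifted down once, which yields the linear-time bound (the function names are illustrative):

```python
def heapify(a):
    """Build a binary min-heap in place, bottom-up (Floyd's method)."""
    n = len(a)

    def sift_down(i):
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and a[child] < a[smallest]:
                    smallest = child
            if smallest == i:
                return
            a[i], a[smallest] = a[smallest], a[i]
            i = smallest

    # Leaves are already trivial heaps; sift down each internal node,
    # starting with the last one (the parent of the final element).
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i)

data = [9, 4, 7, 1, 0, 8, 2]
heapify(data)
print(data[0])   # 0, the minimum, is now at the root
```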
Variants
- 2–3 heap
- B-heap
- Beap
- Binary heap
- Binomial heap
- Brodal queue
- d-ary heap
- Fibonacci heap
- Leaf heap
- Leftist heap
- Pairing heap
- Radix heap
- Randomized meldable heap
- Skew heap
- Soft heap
- Ternary heap
- Treap
- Weak heap
Comparison of theoretic bounds for variants
In the following time complexities,[7] O(f) is an asymptotic upper bound and Θ(f) is an asymptotically tight bound (see Big O notation). Function names assume a min-heap.
Operation | Binary[7] | Leftist | Binomial[7] | Fibonacci[7][8] | Pairing[9] | Brodal[10][b] | Rank-pairing[12] | Strict Fibonacci[13] | 2–3 heap |
---|---|---|---|---|---|---|---|---|---|
find-min | Θ(1) | Θ(1) | Θ(log n) | Θ(1) | Θ(1) | Θ(1) | Θ(1) | Θ(1) | ? |
delete-min | Θ(log n) | Θ(log n) | Θ(log n) | O(log n)[c] | O(log n)[c] | O(log n) | O(log n)[c] | O(log n) | O(log n)[c] |
insert | O(log n) | Θ(log n) | Θ(1)[c] | Θ(1) | Θ(1) | Θ(1) | Θ(1) | Θ(1) | O(log n)[c] |
decrease-key | Θ(log n) | Θ(n) | Θ(log n) | Θ(1)[c] | o(log n)[c][d] | Θ(1) | Θ(1)[c] | Θ(1) | Θ(1) |
merge | Θ(n) | Θ(log n) | O(log n)[e] | Θ(1) | Θ(1) | Θ(1) | Θ(1) | Θ(1) | ? |
^ Each insertion takes O(log k) in the existing size k of the heap, thus ∑_{k=1}^{n} O(log k). Since log(n/2) = (log n) − 1, a constant factor (half) of these insertions are within a constant factor of the maximum, so asymptotically we can assume k = n; formally the time is n·O(log n) − O(n) = O(n log n). This can also be readily seen from Stirling's approximation.
^ Brodal and Okasaki later describe a persistent variant with the same bounds except for decrease-key, which is not supported.
Heaps with n elements can be constructed bottom-up in O(n).[11]
^ a b c d e f g h i Amortized time.
^ Lower bound of Ω(log log n),[14] upper bound of O(2^{2√(log log n)}).[15]
^ n is the size of the larger heap.
Applications
The heap data structure has many applications.
Heapsort: One of the best sorting methods; it is in-place and has no quadratic worst-case scenarios.
Selection algorithms: A heap allows access to the min or max element in constant time, and other selections (such as median or kth-element) can be done in sub-linear time on data that is in a heap.[16]
Graph algorithms: By using heaps as internal traversal data structures, run time is reduced by a polynomial order. Examples of such problems are Prim's minimum-spanning-tree algorithm and Dijkstra's shortest-path algorithm.
Priority Queue: A priority queue is an abstract concept like "a list" or "a map"; just as a list can be implemented with a linked list or an array, a priority queue can be implemented with a heap or a variety of other methods.
K-way merge: A heap data structure is useful to merge many already-sorted input streams into a single sorted output stream. Examples of the need for merging include external sorting and streaming results from distributed data such as a log structured merge tree. The inner loop obtains the min element, replaces it with the next element from the corresponding input stream, and then does a sift-down heap operation. (Alternatively, the replace function.) Using the extract-min and insert functions of a priority queue is much less efficient. A sketch of this pattern appears after this list.
Order statistics: The Heap data structure can be used to efficiently find the kth smallest (or largest) element in an array.
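As a sketch of the k-way merge pattern described above, using Python's standard heapq module (its heapq.merge function wraps essentially this loop; the explicit version with heapreplace is shown for illustration):

```python
import heapq

def k_way_merge(sorted_lists):
    """Merge already-sorted lists into one sorted output stream."""
    # The heap holds one (value, list index, position) entry per input list.
    heap = [(lst[0], i, 0) for i, lst in enumerate(sorted_lists) if lst]
    heapq.heapify(heap)
    out = []
    while heap:
        value, i, pos = heap[0]           # find-min: smallest pending element
        out.append(value)
        if pos + 1 < len(sorted_lists[i]):
            # replace: put the next element of that stream at the root
            heapq.heapreplace(heap, (sorted_lists[i][pos + 1], i, pos + 1))
        else:
            heapq.heappop(heap)           # that stream is exhausted
    return out

print(k_way_merge([[1, 4, 7], [2, 5], [0, 3, 6]]))
# [0, 1, 2, 3, 4, 5, 6, 7]
```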
Implementations
- The C++ Standard Library provides the make_heap, push_heap and pop_heap algorithms for heaps (usually implemented as binary heaps), which operate on arbitrary random access iterators. It treats the iterators as a reference to an array, and uses the array-to-heap conversion. It also provides the container adaptor priority_queue, which wraps these facilities in a container-like class. However, there is no standard support for the replace, sift-up/sift-down, or decrease/increase-key operations.
- The Boost C++ libraries include a heaps library. Unlike the STL, it supports decrease and increase operations, and supports additional types of heap: specifically, it supports d-ary, binomial, Fibonacci, pairing and skew heaps.
- There is a generic heap implementation for C and C++ with D-ary heap and B-heap support. It provides an STL-like API.
- The standard library of the D programming language includes std.container.BinaryHeap, which is implemented in terms of D's ranges. Instances can be constructed from any random-access range. BinaryHeap exposes an input range interface that allows iteration with D's built-in foreach statements and integration with the range-based API of the std.algorithm package.
- The Java platform (since version 1.5) provides a binary heap implementation with the class java.util.PriorityQueue in the Java Collections Framework. This class implements a min-heap by default; to implement a max-heap, the programmer should supply a custom comparator. There is no support for the replace, sift-up/sift-down, or decrease/increase-key operations.
- Python has a heapq module that implements a priority queue using a binary heap. The library exposes a heapreplace function to support k-way merging.
- PHP has both max-heap (SplMaxHeap) and min-heap (SplMinHeap) as of version 5.3 in the Standard PHP Library.
- Perl has implementations of binary, binomial, and Fibonacci heaps in the Heap distribution available on CPAN.
- The Go language contains a heap package with heap algorithms that operate on an arbitrary type that satisfies a given interface. That package does not support the replace, sift-up/sift-down, or decrease/increase-key operations.
- Apple's Core Foundation library contains a CFBinaryHeap structure.
- Pharo has an implementation of a heap in the Collections-Sequenceable package along with a set of test cases. A heap is used in the implementation of the timer event loop.
- The Rust programming language has a binary max-heap implementation, BinaryHeap, in the collections module of its standard library.
See also
- Sorting algorithm
- Search data structure
- Stack (abstract data type)
- Queue (abstract data type)
- Tree (data structure)
- Treap, a form of binary search tree based on heap-ordered trees
References
^ Black, Paul E., ed. (2004-12-14). Entry for heap in Dictionary of Algorithms and Data Structures. Online version. U.S. National Institute of Standards and Technology. Retrieved 2017-10-08 from https://xlinux.nist.gov/dads/HTML/heap.html.
^ Williams, J. W. J. (1964), "Algorithm 232 - Heapsort", Communications of the ACM, 7 (6): 347–348, doi:10.1145/512274.512284.
^ The Python Standard Library, 8.4. heapq — Heap queue algorithm, heapq.heappush
^ The Python Standard Library, 8.4. heapq — Heap queue algorithm, heapq.heappop
^ The Python Standard Library, 8.4. heapq — Heap queue algorithm, heapq.heapreplace
^ Suchenek, Marek A. (2012), "Elementary Yet Precise Worst-Case Analysis of Floyd's Heap-Construction Program", Fundamenta Informaticae, IOS Press, 120 (1): 75–92, doi:10.3233/FI-2012-751.
^ a b c d Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L. (1990). Introduction to Algorithms (1st ed.). MIT Press and McGraw-Hill. ISBN 0-262-03141-8.
^ Fredman, Michael Lawrence; Tarjan, Robert E. (July 1987). "Fibonacci heaps and their uses in improved network optimization algorithms" (PDF). Journal of the Association for Computing Machinery. 34 (3): 596–615. doi:10.1145/28869.28874.
^ Iacono, John (2000), "Improved upper bounds for pairing heaps", Proc. 7th Scandinavian Workshop on Algorithm Theory (PDF), Lecture Notes in Computer Science, 1851, Springer-Verlag, pp. 63–77, arXiv:1110.4428, doi:10.1007/3-540-44985-X_5, ISBN 3-540-67690-2
^ Brodal, Gerth S. (1996), "Worst-Case Efficient Priority Queues" (PDF), Proc. 7th Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 52–58
^ Goodrich, Michael T.; Tamassia, Roberto (2004). "7.3.6. Bottom-Up Heap Construction". Data Structures and Algorithms in Java (3rd ed.). pp. 338–341. ISBN 0-471-46983-1.
^ Haeupler, Bernhard; Sen, Siddhartha; Tarjan, Robert E. (November 2011). "Rank-pairing heaps" (PDF). SIAM J. Computing: 1463–1485. doi:10.1137/100785351.
^ Brodal, G. S. L.; Lagogiannis, G.; Tarjan, R. E. (2012). Strict Fibonacci heaps (PDF). Proceedings of the 44th symposium on Theory of Computing - STOC '12. p. 1177. doi:10.1145/2213977.2214082. ISBN 9781450312455.
^ Fredman, Michael Lawrence (July 1999). "On the Efficiency of Pairing Heaps and Related Data Structures" (PDF). Journal of the Association for Computing Machinery. 46 (4): 473–501. doi:10.1145/320211.320214.
^ Pettie, Seth (2005). Towards a Final Analysis of Pairing Heaps (PDF). FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science. pp. 174–183. CiteSeerX 10.1.1.549.471. doi:10.1109/SFCS.2005.75. ISBN 0-7695-2468-0.
^ Frederickson, Greg N. (1993), "An Optimal Algorithm for Selection in a Min-Heap", Information and Computation (PDF), 104 (2), Academic Press, pp. 197–214, doi:10.1006/inco.1993.1030
External links
Wikimedia Commons has media related to Heaps.
The Wikibook Data Structures has a page on the topic of: Min and Max Heaps
Heap at Wolfram MathWorld
Explanation of how the basic heap algorithms work
Bentley, Jon Louis (2000). Programming Pearls (2nd ed.). Addison Wesley. pp. 147–162. ISBN 0201657880.