
Commit 2036bea

📚 docs(manual): Add Theory section.
1 parent 0bd1d85 · commit 2036bea

File tree: 2 files changed (+165 −1 lines)

.esdoc.json

Lines changed: 2 additions & 1 deletion

@@ -24,7 +24,8 @@
     "./doc/manual/overview.md",
     "./doc/manual/installation.md",
     "./doc/manual/usage.md",
-    "./doc/manual/example.md"
+    "./doc/manual/example.md",
+    "./doc/manual/theory.md"
   ]
  }
 }

doc/manual/theory.md

Lines changed: 163 additions & 0 deletions

# Theory

The Fibonacci heap guarantees that decrease-key operations can be executed in
amortized constant time.

> The Fibonacci heap is of interest only if the user needs the decrease-key
> operation heavily. Use another data structure with better constants otherwise.

Recall that for any sequence of operations, the sum of the real costs of the
operations is upper bounded by the sum of the amortized costs of the
operations (as long as our potential stays non-negative).

## Idea

A Fibonacci heap is a collection of Fibonacci trees. The minimum key is held by
the root of one of these trees. We implement this collection of trees as a
circular list of pointers to root nodes called the *root list*. We access this
list via a pointer to the root holding the smallest key. Maintaining this
pointer allows easy access to the minimum key held in the heap.
Certain nodes of the Fibonacci trees will be marked, for a purpose explained
later.
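
A minimal structural sketch may help picture this. The `FibNode`/`FibHeap`
names, the use of numeric keys, and the two list helpers below are assumptions
made for illustration only; they are not the API of this library.

```ts
// Hypothetical shapes for the sketches in this section (not this library's API).
// Siblings form a circular doubly linked list so that lists splice in O(1).
class FibNode {
  degree = 0;               // number of children
  marked = false;           // has this node lost a child since it was last linked?
  parent: FibNode | null = null;
  child: FibNode | null = null;  // pointer to any one of its children
  left: FibNode;            // previous sibling (circular)
  right: FibNode;           // next sibling (circular)
  constructor(public key: number) {
    this.left = this;       // a fresh node is a one-element circular list
    this.right = this;
  }
}

class FibHeap {
  min: FibNode | null = null;  // root holding the smallest key; entry to the root list
  n = 0;                       // number of stored elements
}

// Splice the circular list containing `b` into the one containing `a`,
// right after `a`. Constant time regardless of list sizes.
function concat(a: FibNode, b: FibNode): void {
  const aRight = a.right;
  const bLeft = b.left;
  a.right = b;
  b.left = a;
  bLeft.right = aRight;
  aRight.left = bLeft;
}

// Detach `node` from its sibling list; it becomes a one-element circular list.
function removeFromList(node: FibNode): void {
  node.left.right = node.right;
  node.right.left = node.left;
  node.left = node;
  node.right = node;
}
```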

We amortize the cost of each operation on a heap `H` with `n` elements
using the following potential `P(H) = R(H) + 2 M(H)` where `R(H)` is the size
of the root list of `H` and `M(H)` is the number of marked nodes of `H`.
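
The potential is a bookkeeping device for the analysis, not something an
implementation has to compute; as a sketch, with hypothetical names:

```ts
// Analysis only: the potential of a heap, given its root-list size and the
// number of marked nodes.
const potential = (rootListSize: number, markedCount: number): number =>
  rootListSize + 2 * markedCount;
```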

Let `D(H)` denote the maximum degree of a node in `H`; then the real cost,
potential change, and amortized cost of the heap operations are, respectively:

- MAKE-HEAP: `O(1)`, `0`, `O(1)`.
- INSERT: `O(1)`, `1`, `O(1)`.
- MELD: `O(1)`, `0`, `O(1)`.
- DECREASE-KEY: `O(c)`, `2 - c`, `O(1)`.
- DELETE-MIN: `O(R(H) + D(H))`, `O(1) - R(H)`, `O(D(H))`.

Here `c` in DECREASE-KEY can be as large as the height of the tallest tree in
our collection.

To obtain a good bound on the amortized cost of the DELETE-MIN operation, we
make sure the subtree rooted at any node `x` satisfies `size(x) >=
phi^degree(x)` (`phi` is the golden ratio `1.618...`), so that `D(H) = O(log
n)`.

### How to keep the degrees low

Whenever the decision is made to add a node to a parent as a child (link),
we guarantee that the child's degree is equal to the parent's degree.

If this child is the ith node to be added to the parent, its degree is
therefore i-1.

Whenever a child is removed from a parent (cut), one of two things happens:

- if the parent is marked, we cut it as well;
- if the parent is not marked, we mark it.

This guarantees that the degree of the ith child of a parent is at least i - 2:
the child had degree i - 1 when it was linked, and it can lose at most one of
its own children before being cut itself.
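
As an illustration of the link step, under the assumptions of the earlier
sketch (hypothetical `FibNode` shape, `concat` and `removeFromList` helpers):

```ts
// Sketch of linking: make `child` a child of `parent`. The analysis relies on
// this only ever being applied to two roots of equal degree.
function link(parent: FibNode, child: FibNode): void {
  removeFromList(child);             // detach child from the root list
  child.parent = parent;
  child.marked = false;              // a freshly linked child is unmarked
  if (parent.child === null) parent.child = child;
  else concat(parent.child, child);  // splice into parent's circular child list
  parent.degree += 1;
}
```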

It can then be proven that the size of the subtree rooted at any node `x` of
degree `k` satisfies `size(x) >= F_{k+2} >= phi^k`, where `F_i` is the ith
Fibonacci number.

*Hint: `F_{k+2} = 1 + F_0 + F_1 + ... + F_k`.*
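
For a quick numeric sanity check of the inequality `F_{k+2} >= phi^k` (purely
illustrative, same language as the other sketches):

```ts
// Print F_{k+2} next to phi^k for small k, using F_0 = 0 and F_1 = 1.
const phi = (1 + Math.sqrt(5)) / 2;
let [prev, curr] = [1, 1];            // F_1, F_2
for (let k = 0; k <= 10; k++) {
  console.log(k, curr, phi ** k, curr >= phi ** k);  // curr === F_{k+2}
  [prev, curr] = [curr, prev + curr];
}
```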

#### Problems

We may have to cut repeatedly if a chain of ancestors is marked when we cut a
node. We can amortize this cost because each cut ancestor gets unmarked: the
number of marked nodes drops proportionally to the number of cut nodes.

The number of root nodes may grow arbitrarily large through INSERT, MELD, and
DECREASE-KEY operations. This increases the real cost of the next DELETE-MIN
operation, but also the potential of the heap. The DELETE-MIN operation will
therefore include a restructuring procedure leveraging this high potential to
amortize its high cost.

## Heap Operations

This section details the implementation of standard heap operations in the
Fibonacci heap.

### MAKE-HEAP

Initialize an empty heap. The real cost is `O(1)` and the initial potential is
zero. The amortized cost is therefore `O(1)`.

### INSERT

Add the new node as a root node. Update the minimum with a single comparison.
The real cost of this operation is constant and the change of potential is one.
The amortized cost is therefore `O(1)`.
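
A sketch of INSERT, reusing the hypothetical `FibNode`/`FibHeap` classes and
the `concat` helper from the structure sketch above:

```ts
// Splice a fresh one-node list into the root list and compare once against
// the current minimum.
function insert(heap: FibHeap, key: number): FibNode {
  const node = new FibNode(key);
  if (heap.min === null) {
    heap.min = node;                               // node is the whole root list
  } else {
    concat(heap.min, node);                        // O(1) splice into the root list
    if (node.key < heap.min.key) heap.min = node;  // the single comparison
  }
  heap.n += 1;
  return node;
}
```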

### MELD

Concatenate the heaps' root lists in constant time. Update the minimum with a
single comparison. The real cost of this operation is constant and the change
of potential is zero. The amortized cost is therefore `O(1)`.

NB: INSERT is a MELD where one of the heaps contains a single node.
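
A sketch of MELD under the same assumptions (melding `b` into `a` and
returning `a`):

```ts
// Splice the two circular root lists together and keep the smaller minimum.
function meld(a: FibHeap, b: FibHeap): FibHeap {
  if (a.min === null) {
    a.min = b.min;
  } else if (b.min !== null) {
    concat(a.min, b.min);                          // O(1) concatenation
    if (b.min.key < a.min.key) a.min = b.min;      // the single comparison
  }
  a.n += b.n;
  return a;
}
```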

### DECREASE-KEY

We can update our structure after decreasing the key of a node as follows:

1. If the node is already the minimum, there is nothing to do.
2. Otherwise, if the node is not a root node and now has a smaller key than
   its parent, cut it and add it to the root list.
3. Finally, update the minimum of the root list if necessary.

Note that in step 2 we have to recursively cut ancestor nodes until an unmarked
ancestor is reached, to guarantee the small-degree property of the nodes.
Fortunately, as already noted, we can amortize this cost because each cut
ancestor gets unmarked, so the number of marked nodes drops proportionally to
the number of cut nodes. The exact computation gives, for `c` cut nodes, a real
cost of `O(c)`, a change of potential of `2 - c`, and an amortized cost of
`O(1)`.
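
A sketch of DECREASE-KEY with the cascading cut, reusing the hypothetical
shapes and helpers above:

```ts
// Lower the key of `node` to `key` (assumed to be <= its current key).
function decreaseKey(heap: FibHeap, node: FibNode, key: number): void {
  node.key = key;
  const parent = node.parent;
  if (parent !== null && node.key < parent.key) {
    cut(heap, node, parent);        // step 2: move node to the root list
    cascadingCut(heap, parent);     // keep cutting while ancestors are marked
  }
  if (heap.min !== null && node.key < heap.min.key) heap.min = node;  // step 3
}

// Detach `node` from `parent` and add it to the root list, unmarked.
function cut(heap: FibHeap, node: FibNode, parent: FibNode): void {
  if (parent.child === node) {
    parent.child = node.right === node ? null : node.right;
  }
  removeFromList(node);
  parent.degree -= 1;
  node.parent = null;
  node.marked = false;
  concat(heap.min!, node);          // the heap is non-empty here
}

// Mark an unmarked parent; cut a marked one and recurse on its parent.
function cascadingCut(heap: FibHeap, node: FibNode): void {
  const parent = node.parent;
  if (parent === null) return;      // root nodes are never marked
  if (!node.marked) {
    node.marked = true;             // first lost child: just mark
  } else {
    cut(heap, node, parent);        // second lost child: cut and keep climbing
    cascadingCut(heap, parent);
  }
}
```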

### DELETE-MIN

The minimum node is a root node. We can add all children of the deleted node as
root nodes. This increases the number of root nodes by the degree of the
deleted node. Since the degree of a node is at most `D(H)`, the number of root
nodes increases by at most `D(H)`.

The minimum needs to be updated: it could be any of the root nodes after the
deletion. Updating the minimum will therefore cost something proportional to
the number of root nodes after the addition of the children of the deleted
node, that is `R(H) + D(H)`.

To amortize this costly operation, we need to reduce the number of nodes in the
root list. We do so by making sure there is at most one node of each degree in
the root list. We call this procedure CONSOLIDATE. Once that procedure is
finished, there are at most `D(H) + 1` nodes left in the root list. The real
cost of the procedure is proportional to `R(H) + D(H)` (see [below](#CONSOLIDATE)),
the same as updating the minimum.

> There are at most `D(H) + 1` nodes left in the root list after this procedure
> is run because the list contains at most one node of each degree:
> one node of degree `0`, one node of degree `1`, ..., and one node of degree
> `D(H)`.

To sum up, the real cost of DELETE-MIN is `O(R(H) + D(H))`, the change of
potential is at most `O(1) - R(H)`, and the amortized cost is therefore
`O(D(H))`.
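
A sketch of DELETE-MIN under the same assumptions, where `consolidate` is the
procedure sketched in the CONSOLIDATE subsection below:

```ts
// Remove and return the minimum node (or null if the heap is empty).
function deleteMin(heap: FibHeap): FibNode | null {
  const min = heap.min;
  if (min === null) return null;
  // Promote all children of the minimum to the root list.
  if (min.child !== null) {
    let c = min.child;
    do { c.parent = null; c = c.right; } while (c !== min.child);
    concat(min, min.child);
    min.child = null;
  }
  const entry = min.right;          // some other node, unless min was alone
  removeFromList(min);
  heap.n -= 1;
  if (heap.n === 0) {
    heap.min = null;
  } else {
    heap.min = entry;               // temporary entry point into the root list
    consolidate(heap);              // merge equal-degree roots, find the new minimum
  }
  return min;
}
```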

#### CONSOLIDATE

This procedure evokes the construction of binomial trees: given a list of
trees, repeatedly merge trees whose roots have identical degrees until no two
trees have identical degrees. For further reference, let `N` be the size of the
list of trees to merge and let `N'` be the number of trees left after the
procedure.

Merging two trees corresponds to making one tree the child of the other. When a
tree is made the child of another, it is removed from the merge list.

Hence a tree can be made the child of another at most once and, when this
happens, the size of the merge list shrinks by one. Therefore the total number
of merge operations is linear in `N - N'`.

Each tree can be processed in constant time and hence the total cost of
CONSOLIDATE is proportional to `N`.
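
A sketch of CONSOLIDATE, reusing the hypothetical shapes, the list helpers, and
the `link` step from above (a degree-indexed array plays the role of the merge
buckets):

```ts
// Merge roots of equal degree until all surviving roots have distinct degrees,
// then rescan the survivors for the new minimum.
function consolidate(heap: FibHeap): void {
  if (heap.min === null) return;
  // Snapshot the current roots: the root list is mutated while we link.
  const roots: FibNode[] = [];
  let r = heap.min;
  do { roots.push(r); r = r.right; } while (r !== heap.min);
  // buckets[d] holds the unique already-processed root of degree d, if any.
  const buckets: Array<FibNode | undefined> = [];
  for (let node of roots) {
    while (buckets[node.degree] !== undefined) {
      let other = buckets[node.degree]!;
      buckets[node.degree] = undefined;
      if (other.key < node.key) [node, other] = [other, node];  // smaller key stays on top
      link(node, other);            // equal-degree link; node.degree grows by one
    }
    buckets[node.degree] = node;
  }
  // The surviving roots are exactly the nodes stored in the buckets.
  let min: FibNode | null = null;
  for (const node of buckets) {
    if (node !== undefined && (min === null || node.key < min.key)) min = node;
  }
  heap.min = min;
}
```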
