Why do we use amortized analysis for Fibonacci heaps?


In a Fibonacci heap, the analysis of every operation is amortized. Why can't we use an ordinary worst-case analysis, as in the case of a binomial heap?
In a binomial heap, each operation is guaranteed to run with a certain worst-case performance. An insertion will never take more than time O(log n), a merge will never take more than time O(log n + log m), etc. Therefore, when analyzing the efficiency of a binomial heap, it's common to use a more traditional algorithmic analysis.
Now, that said, there are several properties of binomial heaps that only become apparent when doing an amortized analysis. For example, what's the cost of doing n consecutive insertions into a binomial heap, assuming the heap is initially empty? You can show that, in this case, the amortized cost of an insertion is O(1), meaning that the total cost of doing n insertions is O(n). In that sense, using an amortized analysis on top of a traditional analysis reveals more insights about the data structure than might initially arise from a more conservative worst-case analysis.
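That O(1) amortized insertion cost can be verified with the aggregate method: a sequence of binomial heap insertions behaves exactly like incrementing a binary counter, where each carry corresponds to merging two trees of equal order. Below is a minimal sketch of that counter model (an analogy for counting merge work, not a real heap implementation):

```python
def count_merge_work(n):
    """Simulate n binomial-heap insertions as binary-counter increments.

    counter[k] == 1 means a tree of order k is present. An insertion may
    trigger a cascade of merges, one per carry bit. Returns the total
    merge work performed across all n insertions.
    """
    counter = []
    total_work = 0
    for _ in range(n):
        k = 0
        # Propagate carries: merging two order-k trees yields one order-(k+1) tree.
        while k < len(counter) and counter[k] == 1:
            counter[k] = 0
            total_work += 1
            k += 1
        if k == len(counter):
            counter.append(0)
        counter[k] = 1
        total_work += 1  # placing the new (or fully merged) tree is one unit
    return total_work

# Aggregate bound: total work for n insertions stays below 2n,
# so the amortized cost per insertion is O(1).
print(count_merge_work(1000))  # strictly less than 2000
```

Each 1-bit that gets cleared was set by some earlier insertion, so the total number of carries over n insertions is less than n, giving the 2n aggregate bound.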
In some sense, Fibonacci heaps are best analyzed in an amortized sense because even though the worst-case bounds on many of the operations really aren't that great (for example, a delete-min or decrease-key can take time Θ(n) in the worst case), across any series of operations the Fibonacci heap has excellent amortized performance. Even though an individual delete-min might take Θ(n) time, it's never possible for a series of m delete-mins to take more than O(m log n) time.
In another sense, though, Fibonacci heaps were specifically designed to be efficient in an amortized sense rather than a worst-case sense. They were initially invented to speed up Dijkstra's and Prim's algorithms, where all that mattered was the total cost of doing m decrease-keys and n deletes on an n-node heap, and since that was the design goal, the designers made no attempt to make the Fibonacci heap efficient in the worst case.
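To make that design goal concrete with a back-of-the-envelope operation count (the cost functions below are a simplified model I'm assuming, counting comparisons up to constant factors): Dijkstra's algorithm on a graph with n nodes and m edges performs up to m decrease-keys and n delete-mins, so the heap work is O(m log n) with a binary heap but O(m + n log n) with a Fibonacci heap.

```python
import math

def heap_cost_binary(n, m):
    # Binary heap: decrease-key and delete-min each cost O(log n).
    return (m + n) * math.log2(n)

def heap_cost_fibonacci(n, m):
    # Fibonacci heap: decrease-key is O(1) amortized,
    # delete-min is O(log n) amortized.
    return m + n * math.log2(n)

# On a dense graph, the O(1) amortized decrease-key dominates the savings:
n, m = 10**4, 10**7
print(heap_cost_binary(n, m) / heap_cost_fibonacci(n, m))  # well over 10x
```

On sparse graphs, where m is close to n, the two bounds coincide, which is consistent with the point above: the payoff is specifically in the decrease-key-heavy workloads the structure was designed for.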
