

Why do we use amortized analysis for Fibonacci heaps?


In a Fibonacci heap, the analysis of every operation is amortized. Why can't we use a normal worst-case analysis, as in the case of a binomial heap?
In a binomial heap, each operation has a guaranteed worst-case bound. An insertion will never take more than O(log n) time, a merge will never take more than O(log n + log m) time, etc. Therefore, when analyzing the efficiency of a binomial heap, it's common to use a more traditional worst-case analysis.
Now, that said, there are several properties of binomial heaps that only become apparent when doing an amortized analysis. For example, what's the cost of doing n consecutive insertions into a binomial heap, assuming the heap is initially empty? You can show that, in this case, the amortized cost of an insertion is O(1), meaning that the total cost of doing n insertions is O(n). In that sense, using an amortized analysis on top of a traditional analysis reveals more insights about the data structure than might initially arise from a more conservative worst-case analysis.
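To make that aggregate argument concrete: insertion into a binomial heap behaves exactly like incrementing a binary counter, where linking two trees of the same order corresponds to a carry. A minimal sketch (an illustrative counter, not a real binomial heap) that counts the total flips for n increments:

```python
def increment(counter):
    """Increment a binary counter stored as a list of bits (LSB first).
    Returns the number of bit flips performed -- the analogue of the
    tree links a binomial-heap insertion performs."""
    flips = 0
    i = 0
    while i < len(counter) and counter[i] == 1:
        counter[i] = 0   # a carry: like linking two trees of equal order
        flips += 1
        i += 1
    if i == len(counter):
        counter.append(1)
    else:
        counter[i] = 1
    flips += 1
    return flips

n = 1000
counter = []
total_flips = sum(increment(counter) for _ in range(n))
# Although a single increment can flip Theta(log n) bits, the total for
# n increments is below 2n, so the amortized cost per increment is O(1).
assert total_flips < 2 * n
```

The same accounting shows why n consecutive binomial-heap insertions cost O(n) in total even though a single insertion can trigger Θ(log n) links.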
In some sense, Fibonacci heaps are best analyzed in an amortized sense because even though the worst-case bounds on many of the operations really aren't that great (for example, a delete-min or decrease-key can take Θ(n) time in the worst case), across any series of operations the Fibonacci heap has excellent amortized performance. Even though an individual delete-min might take Θ(n) time, a series of m delete-mins never takes more than O(m log n) time in total.
In another sense, though, Fibonacci heaps were specifically designed to be efficient in an amortized sense rather than a worst-case sense. They were originally invented to speed up Dijkstra's and Prim's algorithms, where all that mattered was the total cost of doing m decrease-keys and n deletes on an n-node heap; since that was the design goal, the designers made no attempt to make the Fibonacci heap efficient in the worst case.
