comp.lang.c

Complexity of data structure

alok

4/24/2011 7:13:00 AM

Why is the calculation of the complexity of various algorithms (linear
search, bubble sort) confined to the number of comparisons, whereas
arithmetic and other operations are not considered, even though these
may also eat significant processing power?
5 Answers

Ian Collins

4/24/2011 7:19:00 AM


On 04/24/11 07:13 PM, C learner wrote:
> Why is the calculation of the complexity of various algorithms (linear
> search, bubble sort) confined to the number of comparisons, whereas
> arithmetic and other operations are not considered, even though these
> may also eat significant processing power?

Because the arithmetic operations are common to all algorithms.

--
Ian Collins

Eric Sosman

4/24/2011 11:53:00 AM


On 4/24/2011 3:13 AM, C learner wrote:
> Why is the calculation of the complexity of various algorithms (linear
> search, bubble sort) confined to the number of comparisons, whereas
> arithmetic and other operations are not considered, even though these
> may also eat significant processing power?

Your question isn't about the C programming language, nor about
any particular programming language, and probably belongs in a forum
like comp.programming.

... but for what it's worth, try the analysis yourself. Take
some simple algorithm, implement it, study the machine instructions
that it generates, and try to predict how much time it will take.
Don't forget to take account of pipeline parallelism, cache hits
and misses, translation look-aside buffer hits and misses, ... It
will be a difficult job, but perhaps you can get an answer after a
great deal of labor. And then you'll have an answer -- which will
go straight out the window as soon as you install a new compiler
version or change the compilation flags, or even add RAM. In other
words, all that enormous effort will produce, at best, an answer
that you can use only once and cannot transfer to the next machine.

--
Eric Sosman
esosman@ieee-dot-org.invalid

pete

4/24/2011 3:59:00 PM


Eric Sosman wrote:
>
> On 4/24/2011 3:13 AM, C learner wrote:
> > Why is the calculation of the complexity of various algorithms
> > (linear search, bubble sort) confined to the number of comparisons,
> > whereas arithmetic and other operations are not considered,
> > even though these may also eat significant processing power?
>
> Your question isn't about the C programming language, nor about
> any particular programming language, and probably belongs in a forum
> like comp.programming.
>
> ... but for what it's worth, try the analysis yourself. Take
> some simple algorithm, implement it, study the machine instructions
> that it generates, and try to predict how much time it will take.

Binary insertion sort uses O(n * log(n)) comparisons,
but O(n * n) data movement, and runs in O(n * n) time.

#include <stddef.h>

void
bisort(void *base, size_t nmemb, size_t size,
       int (*compar)(const void *, const void *))
{
    if (nmemb-- > 1) {
        size_t n;
        size_t low, middle, high;
        size_t bytes = 0;
        const size_t odd_size = size ^ (size - 1);
        unsigned char *const array = base;
        unsigned char *key = array;

        do {
            low = 0;
            key += size;
            bytes += size;
            high = bytes;
            /* binary search for the insertion point of *key */
            do {
                n = high - low;
                middle = ((n & odd_size ? n - size : n) >> 1) + low;
                if (0 > compar(key, middle + array)) {
                    high = middle;
                } else {
                    low = middle;
                }
            } while (n != size);
            base = array + high;
            /* rotate the key element down into its slot, one byte
             * at a time; this is the O(n * n) data movement */
            if (base != key) {
                unsigned char *end = key;
                unsigned char *after = end + size;

                do {
                    const unsigned char swap = *--end;

                    *end = *--after;
                    *after = swap;
                } while (end != base);
            }
        } while (--nmemb != 0);
    }
}

--
pete

luserXtrog

4/25/2011 4:13:00 AM


On Apr 24, 2:13 am, C learner <manglika...@gmail.com> wrote:
> Why is the calculation of the complexity of various algorithms (linear
> search, bubble sort) confined to the number of comparisons, whereas
> arithmetic and other operations are not considered, even though these
> may also eat significant processing power?

Vol. 1 of The Art of Computer Programming by Don Knuth would be a good
resource for learning how to do this analysis on an idealized machine
architecture. Used copies (2nd edition) are super cheap.

Gene

4/25/2011 5:17:00 AM


On Apr 24, 3:13 am, C learner <manglika...@gmail.com> wrote:
> Why is the calculation of the complexity of various algorithms (linear
> search, bubble sort) confined to the number of comparisons, whereas
> arithmetic and other operations are not considered, even though these
> may also eat significant processing power?

You probably aren't reading the right sources. Analyzing comparisons
only is a convenience for the analyst. It lets him/her avoid defining
a model of computation. The embedded assumption is that the rest of
the computation will have run time proportional to that number, but
this may not be true as some have already pointed out.

A good algorithms text will begin by defining a model of computation
and then go on to define run time as the number of execution steps in
that model. Usually it's a random-access machine (RAM) model. IIRC this
is defined early in Aho, Hopcroft and Ullman, for example.