There are a few points to keep in mind. First of all, as you resize, the hardware (at least on a typical computer) will also have an effect. In particular, when your data gets too large to fit in a particular cache level, you can expect a significant transition in run time, completely independent of the algorithm in question.
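To see what I mean, here is a minimal standalone sketch (my own, not part of the sorting test below): every run performs the same total number of array reads, so any change in the time per run comes from the memory hierarchy rather than from the algorithm. How pronounced the steps are depends on your hardware and compiler optimizations.

#include <vector>
#include <iostream>
#include <chrono>

int main() {
    const std::size_t total_reads = 1u << 27;            // identical work for every size
    for (std::size_t elems = 1u << 10; elems <= (1u << 24); elems *= 4) {
        std::vector<int> data(elems, 1);
        long long sum = 0;
        auto start = std::chrono::steady_clock::now();
        for (std::size_t done = 0; done < total_reads; done += elems)
            for (std::size_t i = 0; i < elems; ++i)       // sweep the working set repeatedly
                sum += data[i];
        auto stop = std::chrono::steady_clock::now();
        std::chrono::duration<double> sec = stop - start;
        std::cout << elems * sizeof(int) << " bytes\t"
                  << sec.count() << " s\t(checksum " << sum << ")\n";
    }
    return 0;
}

Watch for the time per pass jumping roughly where the working-set size (printed in bytes) outgrows each cache level.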
To get an idea of the complexity of the algorithm itself, you should (probably) start by comparing it with some algorithm of really obvious complexity, working on the same data size. One obvious possibility is the time it takes to fill your array with random numbers; at least assuming a fairly typical PRNG, that should certainly be linear.
Then run your algorithm and see how its time grows relative to the linear baseline at the same sizes. For example, you could use code like the following:
#include <vector>
#include <algorithm>
#include <iostream>
#include <iomanip>
#include <string>
#include <cstdlib>
#include <ctime>

// RAII timer: records clock() on construction and prints the elapsed
// seconds (followed by a delimiter) when it goes out of scope.
class timer {
    clock_t begin;
    std::ostream &os;
    std::string d;
public:
    timer(std::string const &delim = "\n", std::ostream &rep = std::cout)
        : begin(clock()), os(rep), d(delim) {}
    ~timer() { os << double(clock() - begin) / CLOCKS_PER_SEC << d; }
};

int main() {
    static const unsigned int meg = 1024 * 1024;
    std::cout << std::setw(10) << "Size" << "\tfill\tsort\n";
    for (unsigned size = 10000; size < 512 * meg; size *= 2) {
        std::vector<int> numbers(size);
        std::cout << std::setw(10) << size << "\t";
        {   // the "obviously linear" baseline: fill the array with random numbers
            timer fill_time("\t");
            std::fill_n(numbers.begin(), size, 0);
            for (unsigned i = 0; i < size; i++)
                numbers[i] = rand();
        }
        {   // the algorithm under test (std::sort here)
            timer sort_time;
            std::sort(numbers.begin(), numbers.end());
        }
    }
    return 0;
}
Running that and recording the fill time and the sort time, I get something like this:

Since the sizes grow exponentially (doubling each time), the linear algorithm's times also trace out a (roughly) exponential curve. The sorting time clearly grows (somewhat) faster.
Edit: in fairness, I should add that log(N) grows so slowly that for almost any practical amount of data it stays quite small. For most practical purposes, you can simply treat quicksort (for example) as linear in the size, just with a slightly larger constant factor than filling memory. Graphing the growth makes this more obvious:

If you look carefully, you can see the top line curving slightly upward, thanks to the log(N) factor. On the other hand, I'm not sure I would have noticed the curvature at all if I hadn't already known it should be there.
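For what it's worth, a quick back-of-the-envelope calculation shows why that curvature is so easy to miss. The little program below (my own sketch, not measured data) prints, for each doubling of N over the same range as above, the factor by which N grows versus the factor by which N*log2(N) grows; the latter stays only barely above 2.

#include <cmath>
#include <cstdio>

int main() {
    // Same size range as the timing loop above: 10000 doubling up to 512M.
    for (double n = 10000; n < 512.0 * 1024 * 1024; n *= 2) {
        double linear_factor = (2 * n) / n;                                    // always exactly 2
        double nlogn_factor  = (2 * n * std::log2(2 * n)) / (n * std::log2(n));
        std::printf("N = %12.0f    N: x%.3f    N*log2(N): x%.3f\n",
                    n, linear_factor, nlogn_factor);
    }
    return 0;
}

At N = 10000 the N*log2(N) growth factor is only about 2.15, and it shrinks toward 2 as N increases, which is exactly why the sort curve is nearly indistinguishable from a straight line at these sizes.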