I developed what turned out to be a linear search algorithm (I only learned the name later) and was not satisfied with its speed. So I searched for a faster function and found the map's own member function: map::find.
It was far faster than the linear algorithm I had been using.
std::map is designed to store its data sorted as it is inserted into the container; that is one of its main tasks. It is also the reason you must define an ordering (a strict weak ordering) for the data you put into a std::map.
This means that each insertion takes a little longer than insertion into other containers (insertion into a std::list, once you have the insertion point, is O(1), for example, as is appending to a std::vector or pushing onto either end of a std::deque). But lookup is guaranteed to use a binary search, or more precisely a walk down the red-black tree behind std::map (see the "Premature or Reasonable Optimization" section).
In another case, an STL search function was much faster than the linear function we had been using.
But how is this possible? If you use the binary search algorithm, you first need to sort the map, which will (hypothetically) take more time the larger your map is.
There is nothing hypothetical about this. Sorting your data takes time, and it takes more time the more data items you have.
std::find has to be able to handle unsorted data, so it is implemented as a linear search (compare std::binary_search / std::lower_bound, which require sorted input). But std::find is allowed to unroll loops, compare more than one element at a time (if the elements are small, and especially if they are primitive types that lend themselves to low-level bit twiddling), etc.
Also, how do you find out which algorithms underlie these core functions? Is there a list or some kind of database where you can look this up?
Personally, I learned a lot of algorithms by reading what was available in the STL and in the standard libraries of several other languages. The containers were the easiest place for me to start.
Max Lybbert