Recently I heard an interesting discussion of the famous Knuth quote on a podcast (I think it was Deep Fried Bytes), which I'll try to summarize:
Everyone knows the famous quote: premature optimization is the root of all evil.
However, that's only half of it. The full quote is:
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
Read that carefully: say about 97% of the time.
The flip side of this statement is that about 3% of the time, those "small" efficiencies are critical.
My monitor displays about 50 lines of code. By that statistic, at least one or two lines on every screen will contain something performance-sensitive! The common wisdom of "make it work now, optimize it later" doesn't seem like such a clever plan when you consider that every screenful of code may hide a performance problem.
IMHO, you should always keep performance in mind. You shouldn't spend a lot of effort on it, or sacrifice maintainability for it, until profiling or testing confirms a problem, but it should stay in the back of your mind.
Personally, I would apply this to generic library code as follows:
You almost certainly have some code where, as you wrote it, you thought "this will be slow" or "this is a dumb algorithm, but it doesn't matter right now, so I'll fix it later". Since you're writing a shared library, and you can't guarantee that method A will only ever be called with 5 elements, you should go back and clean all of that up.
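As a hypothetical illustration (the function names and scenario are mine, not from the discussion), here is what such a "dumb algorithm, fix it later" spot in a library might look like, together with its cleaned-up replacement:

```python
def has_duplicates_slow(items):
    """The 'it doesn't matter right now' version: O(n^2) pairwise comparisons.

    Fine for 5 elements; painful when a library consumer passes in millions.
    """
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates(items):
    """The cleaned-up version: a set gives O(n) expected time."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

The point is not that every loop must be optimal, but that in shared code you don't control the input size, so knowingly quadratic shortcuts are exactly the 3% worth fixing before release.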
Once you've cleaned up those spots, I wouldn't worry about what remains. Perhaps run a profiler over your unit tests to make sure nothing stupid has slipped in, but otherwise wait for feedback from your library's consumers.
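One way to sketch "run a profiler over your unit tests" in Python is to drive the suite under cProfile and print the hottest functions; the test case here is a placeholder of my own, not from the original text:

```python
import cProfile
import io
import pstats
import unittest


class TestLibrary(unittest.TestCase):
    """Placeholder suite standing in for your library's real unit tests."""

    def test_no_duplicates_in_range(self):
        # range() never repeats values, so the set keeps every element.
        self.assertEqual(len(set(range(1000))), 1000)


def profile_tests():
    """Run the suite under cProfile and return a report of the top functions."""
    profiler = cProfile.Profile()
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLibrary)
    profiler.enable()
    unittest.TextTestRunner(verbosity=0).run(suite)
    profiler.disable()
    out = io.StringIO()
    stats = pstats.Stats(profiler, stream=out).sort_stats("cumulative")
    stats.print_stats(10)  # show the 10 entries with the most cumulative time
    return out.getvalue()


if __name__ == "__main__":
    print(profile_tests())
```

Anything unexpectedly near the top of that report is a candidate for the "something stupid slipped in" check, without requiring a dedicated benchmark harness.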