A dictionary lookup is the fastest way to perform this kind of search. To compare the approaches, you usually look at the time complexity of each one.
For a dictionary lookup, the time complexity is constant time, or O(1). While O(1) usually just means some fixed number of steps regardless of input size, in this case the lookup is literally a single step.
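As a minimal sketch of what that looks like (the set contents and function name here are just made-up examples), note that Python dicts and sets are hash tables, so a membership test never iterates over the elements:

```python
# Hypothetical lookup table; Python dicts/sets are hash tables,
# so "x in allowed_ids" is O(1) on average regardless of its size.
allowed_ids = {"a1", "b2", "c3"}

def is_allowed(item_id: str) -> bool:
    # One hash computation and one bucket probe, no iteration.
    return item_id in allowed_ids

print(is_allowed("b2"))  # True
print(is_allowed("zz"))  # False
```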
The other methods require iteration (or, in the case of a tree, traversal), which is essentially the same kind of work. They range from having to look at every value, O(n), to only having to look at some of the values, O(log n).
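Here is a rough sketch of those two ends of the range (the data and target values are invented for the example): a plain loop may have to touch all n elements, while a binary search on sorted data only touches about log n of them:

```python
import bisect

values = list(range(0, 1_000_000, 3))  # hypothetical sorted data

def contains_linear(xs, target):
    # O(n): may have to inspect every element before answering.
    for x in xs:
        if x == target:
            return True
    return False

def contains_binary(sorted_xs, target):
    # O(log n): halves the candidate range at each step (input must be sorted).
    i = bisect.bisect_left(sorted_xs, target)
    return i < len(sorted_xs) and sorted_xs[i] == target

print(contains_linear(values, 999_999))  # worst case: scans every element
print(contains_binary(values, 999_999))  # roughly 19 comparisons
```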
Since n is the size of the set, the larger the set becomes, the larger the difference between the results will be, while the dictionary will consistently outperform the other options shown.
It is not possible to be faster than O(1). The only drawback of the approach you showed is that a growing set may require more memory; this is called the space complexity of the algorithm. In this case, since only one value is stored per element in the set, the space complexity is O(n), which is negligible.
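To make that trade-off concrete, here is a small sketch (the variable names are purely illustrative): building the hash-based structure costs O(n) extra memory, one entry per element, paid once, and in exchange every later lookup is O(1):

```python
raw_values = [5, 17, 42, 99]   # hypothetical source data

# O(n) extra memory, paid once up front.
lookup = set(raw_values)

def seen(v):
    # O(1) average-case membership test from here on.
    return v in lookup

print(seen(42), seen(7))  # True False
```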
When optimizing in general, it is important to consider what complexity the current solution has and how important it is to improve it. Improvements should aim for a different complexity class, for example going from O(n) to O(log n), or from O(log n) to O(1).
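If you want to see how much a jump in complexity class matters in practice, a quick (and admittedly unscientific) timing sketch like the one below makes the point; the exact numbers depend on your machine, but the gap grows with n:

```python
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(data_list)
missing = -1  # worst case for the linear scan

# O(n) membership test on a list vs. O(1) on a set, repeated 100 times each.
t_list = timeit.timeit(lambda: missing in data_list, number=100)
t_set = timeit.timeit(lambda: missing in data_set, number=100)
print(f"list, O(n): {t_list:.4f}s")
print(f"set,  O(1): {t_set:.4f}s")
```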

Image courtesy: http://bigocheatsheet.com/
Micro-optimizations are optimizations made within the same complexity class, and on their own they are often not worth pursuing.
Travis j