I would like to use simulated annealing to find a local minimum of a single-variable polynomial within a predefined interval. I would also like to try to find the global minimum of a quadratic function.
An algorithm like this is admittedly not the best way to solve such a problem, so this is for educational purposes only.
While the algorithm itself is fairly straightforward, I'm not sure how to efficiently select a neighbor in one-dimensional or n-dimensional space.
Suppose I am looking for a local minimum of the function 2 * x ^ 3 + x + 1 over the interval [-0.5, 30], and suppose the interval is discretized into tenths, i.e. {-0.5, -0.4, -0.3, ..., 29.9, 30.0}.
What I would like to achieve is a balance between random moves and the rate of convergence from the starting point toward points of lower energy.

If I simply pick a random number from the interval each time, there is no real random walk and the algorithm just jumps around. If, on the other hand, the next point is chosen by adding or subtracting 0.1 with equal probability, the algorithm can degenerate into an exhaustive search, depending on the starting point.
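To illustrate the middle ground between those two extremes, here is a minimal sketch of what I have in mind (the function names, cooling schedule, and parameter values are my own choices, not from any reference implementation): the neighbor is proposed with a Gaussian step whose width shrinks with temperature, snapped to the 0.1 grid and clipped to the interval. Since f'(x) = 6x^2 + 1 > 0 everywhere, the function is strictly increasing, so the minimum on [-0.5, 30] lies at the left endpoint x = -0.5, which gives an easy sanity check.

```python
import math
import random

def f(x):
    # Objective: 2x^3 + x + 1. Strictly increasing, so the minimum
    # on [-0.5, 30] is at the left endpoint, f(-0.5) = 0.25.
    return 2 * x**3 + x + 1

def neighbor(x, temp, lo=-0.5, hi=30.0, step=0.1):
    """Propose a neighbor: a Gaussian step whose width scales with
    temperature, snapped to the 0.1 grid and clipped to [lo, hi].
    At high temperature this behaves like a near-random jump; at low
    temperature it degenerates to small +/-0.1 moves."""
    sigma = max(step, temp * (hi - lo) * 0.1)
    x_new = x + random.gauss(0.0, sigma)
    x_new = round(x_new / step) * step        # snap to tenths
    return min(hi, max(lo, x_new))

def anneal(x0, t0=1.0, t_min=1e-3, alpha=0.95, iters_per_temp=50):
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    temp = t0
    while temp > t_min:
        for _ in range(iters_per_temp):
            x_new = neighbor(x, temp)
            f_new = f(x_new)
            # Metropolis rule: always accept improvements, sometimes
            # accept uphill moves, more readily at high temperature.
            if f_new < fx or random.random() < math.exp((fx - f_new) / temp):
                x, fx = x_new, f_new
                if fx < best_f:
                    best_x, best_f = x, fx
        temp *= alpha
    return best_x, best_f

random.seed(0)
bx, bf = anneal(x0=15.0)
print(bx, bf)
```

The point of tying the step width to the temperature is exactly the trade-off described above: early on the proposals are wide enough to escape the starting region, and as the system cools they shrink toward the fixed-step local search.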
How can I effectively balance neighbor selection for simulated annealing in one-dimensional and n-dimensional space?