Need help understanding Python function

I am trying to teach myself Python by working on some of the problems I am facing, and I need help understanding how to pass functions.

Let's say I'm trying to predict tomorrow's temperature based on today's and yesterday's temperatures, and I wrote the following function:

    def predict_temp(temp_today, temp_yest, k1, k2):
        return k1*temp_today + k2*temp_yest

And I also wrote an error function to compare the list of predicted temperatures with actual temperatures and return the average absolute error:

    def mean_abs_error(predictions, expected):
        return sum(abs(x - y) for (x, y) in zip(predictions, expected)) / float(len(predictions))

Now, if I have a list of daytime temperatures for a certain interval in the past, I can see how my prediction function would do with specific parameters k1 and k2 as follows:

    >>> past_temps = [41, 35, 37, 42, 48, 30, 39, 42, 33]
    >>> pred_temps = [predict_temp(past_temps[i-1], past_temps[i-2], 0.5, 0.5) for i in xrange(2, len(past_temps))]
    >>> print pred_temps
    [38.0, 36.0, 39.5, 45.0, 39.0, 34.5, 40.5]
    >>> print mean_abs_error(pred_temps, past_temps[2:])
    6.5

But how can I write a function that minimizes the parameters k1 and k2 of my predict_temp function, taking into account the error function and my past_temps data?

In particular, I would like to write a minimize(*args) function that takes a prediction function, an error function, and some training data, and uses a search/optimization method (for example, gradient descent) to estimate and return the values of k1 and k2 that minimize my error on that data.

I am not asking how to implement the optimization method. Suppose I can do that. Rather, I just want to know how to pass my prediction and error functions (and my data) to my minimization function, and how to tell my minimization function which parameters it should optimize (k1 and k2), so that it can automatically try many different settings of k1 and k2, each time applying my prediction function with those parameters to the data and computing the error (as I did manually for k1 = 0.5 and k2 = 0.5 above), and then return the best result.

I would like to be able to pass these functions around so that I can easily swap in different prediction and error functions (differing in more than just parameter settings). Each prediction function may have a different number of free parameters.

My minimization function should look something like this, but I don't know how to do it:

    def minimize(prediction_function, which_args_to_optimize, error_function, data):
        # 1: guess initial parameters
        # 2: apply prediction function with current parameters to data to compute predictions
        # 3: use error function to compute error between predictions and data
        # 4: if stopping criterion is met, return parameters
        # 5: update parameters
        # 6: GOTO 2
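A minimal, runnable sketch of the skeleton above, using a crude grid search in place of gradient descent (the search method, the parameter grid, and the signature are illustrative assumptions, not part of the question):

```python
import itertools

def predict_temp(temp_today, temp_yest, k1, k2):
    return k1 * temp_today + k2 * temp_yest

def mean_abs_error(predictions, expected):
    return sum(abs(x - y) for x, y in zip(predictions, expected)) / float(len(predictions))

def minimize(prediction_function, error_function, data, param_grid):
    """Try every combination of parameter values in param_grid and
    return (best_error, best_params)."""
    best = None
    for params in itertools.product(*param_grid):
        # Apply the prediction function with the current parameters.
        preds = [prediction_function(data[i - 1], data[i - 2], *params)
                 for i in range(2, len(data))]
        err = error_function(preds, data[2:])
        if best is None or err < best[0]:
            best = (err, params)
    return best

past_temps = [41, 35, 37, 42, 48, 30, 39, 42, 33]
grid = [[i / 10.0 for i in range(11)]] * 2  # candidate values for k1 and k2
print(minimize(predict_temp, mean_abs_error, past_temps, grid))
```

Because the prediction function arrives as an ordinary argument, swapping in a different one is just a matter of passing a different name.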

Edit: Is it really that simple?? That's no fun. I'm going back to Java.

On a more serious note, I think I was also hung up on how to use different prediction functions with different numbers of tunable parameters. If I just pack all the free parameters into a single tuple, I can hide the shape of the function so that it can be easily passed around and used.
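To illustrate the tuple-packing idea (the rewritten signatures below are one possible restructuring, not code from the question):

```python
def predict_linear(params, temp_today, temp_yest):
    # All free parameters arrive as one tuple, so the caller never
    # needs to know how many there are.
    k1, k2 = params
    return k1 * temp_today + k2 * temp_yest

def predict_const(params, temp_today, temp_yest):
    # A different prediction function with only one free parameter.
    (c,) = params
    return c

# Both can now be driven by the same generic loop:
for fn, params in [(predict_linear, (0.5, 0.5)), (predict_const, (40.0,))]:
    print(fn(params, 41, 35))
```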

3 answers

Here is an example of how to pass a function to another function: apply_func_to takes a function f and a number num as parameters and returns f(num).

    def my_func(x):
        return x*x

    def apply_func_to(f, num):
        return f(num)

    >>> apply_func_to(my_func, 2)
    4

If you want to be clever, you can use lambda (anonymous) functions too. They let you pass functions on the fly without having to define them separately.

    >>> apply_func_to(lambda x: x*x, 3)
    9
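A related trick, not mentioned in the original answer, is functools.partial from the standard library: it fixes some arguments of a function and returns a new callable, which is handy when an optimizer expects a function of fewer arguments:

```python
from functools import partial

def predict_temp(temp_today, temp_yest, k1, k2):
    return k1 * temp_today + k2 * temp_yest

# Fix k1 and k2, leaving a function of just the two temperatures.
pred = partial(predict_temp, k1=0.5, k2=0.5)
print(pred(41, 35))  # same as predict_temp(41, 35, 0.5, 0.5)
```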

Hope this helps.


Passing functions around in Python is easy: you just use the function's name as a variable that holds the function itself.

    def predict(...):
        ...

    minimize(predict, ..., mean_abs_error, ...)

As for the rest of the question: I would suggest looking at the way SciPy implements this as a model. Basically, it has a leastsq function that minimizes the sum of squared residuals (I assume you know what least-squares minimization is ;-). What you pass to leastsq is a function that computes the residuals, initial guesses for the parameters, and an arbitrary extra tuple of arguments that is passed through to your residual function, which can include your data:

    # params will be an array of your k's, i.e. [k1, k2]
    def residuals(params, measurements, times):
        return predict(params, times) - measurements

    leastsq(residuals, initial_parameters, args=(measurements, times))

Note that SciPy doesn't actually care how you compute the residuals. The measurements array is simply passed through unchanged to your residuals function.
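The pass-through pattern leastsq uses can be sketched in plain Python (this toy optimizer and all its names are illustrative assumptions, not SciPy's actual implementation):

```python
def toy_leastsq(residual_fn, x0, args=(), step=0.01, iters=2000):
    """Minimize sum(residual_fn(params, *args)**2) by naive coordinate
    descent. The extra data in `args` is forwarded untouched to the
    residual function, mimicking scipy.optimize.leastsq's interface."""
    params = list(x0)

    def cost(p):
        return sum(r * r for r in residual_fn(p, *args))

    for _ in range(iters):
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                if cost(trial) < cost(params):
                    params = trial
    return params

# Fit y = k * x to some data by forwarding (xs, ys) through args.
def residuals(params, xs, ys):
    k = params[0]
    return [k * x - y for x, y in zip(xs, ys)]

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]
print(toy_leastsq(residuals, [0.0], args=(xs, ys)))  # k should approach 2.0
```

The point is purely the plumbing: the optimizer never inspects xs or ys, it just hands them back to your function on every call.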

I can dig up an example I wrote recently if you want more detail, or of course you can find examples online, but in my experience they aren't entirely clear. The particular bit of code I wrote would fit your scenario well.


As David and Il-Bhima point out, functions can be passed to other functions just like any other type of object. When you pass a function, you simply refer to it by name as you normally would. People sometimes describe this ability by saying that functions are first-class in Python. At a slightly higher level of detail, you should think of functions in Python as one kind of callable object. Another important kind of callable object in Python is a class object; in that case, calling the class object creates an instance of it. This concept is discussed in detail here.
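For illustration, an instance of a class can itself be made callable by defining __call__ (this small example is mine, not from the linked discussion):

```python
class LinearPredictor(object):
    """A prediction 'function' that carries its own parameters."""

    def __init__(self, k1, k2):
        self.k1 = k1
        self.k2 = k2

    def __call__(self, temp_today, temp_yest):
        return self.k1 * temp_today + self.k2 * temp_yest

# The instance is used exactly like a plain function, so it can be
# passed to any code that expects one:
pred = LinearPredictor(0.5, 0.5)
print(pred(41, 35))
```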

In general, you probably want to use Python's positional and/or keyword argument capabilities, as described here. These will let you write a general minimizer that can minimize prediction functions taking different sets of parameters. I wrote up an example; it's more complicated than I would like (it uses generators!), but it works for prediction functions with arbitrary parameters. I gloss over a few details, but this should get you started:

    def predict(data, k1=None, k2=None):
        """Make the prediction."""
        pass

    def expected(data):
        """Expected results from data."""
        pass

    def mean_abs_err(pred, exp):
        """Compute mean absolute error."""
        pass

    def gen_args(pred_args, args_to_opt):
        """Update prediction function parameters.

        pred_args : a dict to update
        args_to_opt : a dict of arguments/iterables to apply to pred_args

        This is a generator that updates a number of variables over a
        given numerical range. Equivalent to itertools.product.
        """
        base_args = pred_args.copy()  # don't modify input
        argnames = list(args_to_opt.keys())
        argvals = list(args_to_opt.values())
        result = [[]]
        # Generate the cartesian product of all argument values
        for argv in argvals:
            result = [x + [y] for x in result for y in argv]
        for prod in result:
            base_args.update(zip(argnames, prod))
            yield base_args

    def minimize(pred_fn, pred_args, args_to_opt, err_fn, data):
        """Minimize pred_fn(data) over a set of parameters.

        pred_fn : function used to make predictions
        pred_args : dict of keyword arguments to pass to pred_fn
        args_to_opt : a dict of arguments/iterables to apply to pred_args
        err_fn : function used to compute error
        data : data to use in the optimization

        Returns a tuple (error, parameters) of the best set of input parameters.
        """
        results = []
        for new_args in gen_args(pred_args, args_to_opt):
            pred = pred_fn(data, **new_args)  # Unpack dictionary
            err = err_fn(pred, expected(data))
            results.append((err, new_args.copy()))  # copy: the generator reuses its dict
        return min(results, key=lambda r: r[0])

    const_args = {'k1': 1}
    opt_args = {'k2': range(10)}
    data = []  # Whatever data you like.
    minimize(predict, const_args, opt_args, mean_abs_err, data)
