Python regression with matrices

I have about 100 7x7 matrices of dependent variables (so 49 dependent variables). My independent variable is time. I am working on a physics project in which I need to obtain a matrix function (each element of the matrix is a function of time) by solving an ODE. I used the numpy ODE solver, which gives me numerical values of my matrix function evaluated at different times. Now, given these matrices and the times, I want to find a time-dependent expression for each matrix element, so that I end up with a time-dependent matrix. I have heard that I need to find the hat matrix; I think the fitted values will be my 7x7 matrices, and the predictors will be the array of times. How can I find this hat matrix in Python?

I initially thought of doing polynomial regression in scikit-learn with a LinearRegression model. Will that work? Is there a way to do this in statsmodels, or better yet, in scipy or numpy?

Basically I want to go from:

[image]

to

[image]

Obviously I will use more data points than this, but that is the general idea. So I will have a one-dimensional X (an array of different times) and a multi-dimensional Y (the matrices evaluated at those times).

In the example above, t1 would be an entry in the array X, and the first matrix would be the corresponding entry in Y.
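A minimal sketch of the overall workflow being asked about (all names and the degree-3 choice here are illustrative assumptions, not from the question): stack the 100 solver outputs into shape (100, 7, 7), flatten each matrix to 49 columns, and fit a polynomial in t to every entry at once, since `np.polyfit` accepts a 2-D y.

```python
import numpy as np

# Illustrative data: 100 sample times and stand-in 7x7 solver output
times = np.linspace(0.0, 1.0, 100)          # shape (100,)
Y = np.random.rand(100, 7, 7)               # one 7x7 matrix per time

# Flatten each matrix to a row: one regression target per matrix entry
Y_flat = Y.reshape(100, 49)                 # shape (100, 49)

# Fit, say, a cubic in t to all 49 entries in a single call;
# polyfit treats each column of a 2-D y as a separate dataset
coeffs = np.polyfit(times, Y_flat, deg=3)   # shape (4, 49)

def matrix_at(t):
    """Evaluate the fitted matrix function at a scalar time t."""
    vals = np.polyval(coeffs, t)            # one value per entry, shape (49,)
    return vals.reshape(7, 7)

M = matrix_at(0.5)                          # a 7x7 matrix
```

This gives the "time-dependent matrix" directly: `coeffs` holds one polynomial per matrix element, and `matrix_at(t)` reassembles the 7x7 result at any time.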

1 answer

The task you describe is classical linear regression, which you can do in several ways:

With plain numpy, by solving the normal equations directly (X is the design matrix):

import numpy as np

# Ordinary least squares via the normal equations: b = (X^T X)^{-1} X^T Y
a = np.linalg.inv(np.dot(X.T, X))   # (X^T X)^{-1}
c = np.dot(X.T, Y)                  # X^T Y
b = np.dot(a, c)                    # regression coefficients
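Since the question specifically asks about the hat matrix: it is H = X (XᵀX)⁻¹ Xᵀ, the projection that maps the observed responses onto the fitted values. A short sketch built on the same normal-equations pieces (the quadratic design matrix here is an illustrative assumption):

```python
import numpy as np

# Illustrative design matrix: columns 1, t, t^2 for 100 sample times
t = np.linspace(0.0, 1.0, 100)
X = np.vander(t, 3, increasing=True)    # shape (100, 3)

# Hat matrix H = X (X^T X)^{-1} X^T
H = X @ np.linalg.inv(X.T @ X) @ X.T    # shape (100, 100)

# H is a projection: idempotent and symmetric, and its trace
# equals the number of fitted parameters (here 3)
assert np.allclose(H @ H, H)
assert np.isclose(np.trace(H), 3.0)
```

Fitted values are then `H @ y` for any response vector `y` sampled at those times.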

Or with numpy's polyfit (here fitting a degree-1 polynomial):

np.polyfit(X,Y,1)
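One detail worth noting for this use case (the example data below is made up for illustration): `np.polyfit` accepts a 2-D Y with one column per regression target, so all 49 matrix entries can be fit in a single call.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100)

# Two example targets with known coefficients; in the question's setting
# this would be the (100, 49) array of flattened matrix entries
Y = np.column_stack([2.0 * t + 1.0, -t + 0.5])

# One column of coefficients per target: row 0 holds slopes, row 1 intercepts
coeffs = np.polyfit(t, Y, 1)    # shape (2, 2)
```

On this exact linear data the recovered columns are [2.0, 1.0] and [-1.0, 0.5].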

With scipy (note: scipy.linalg.solve only handles square systems, so for an overdetermined regression problem use lstsq; for simple 1-D regression there is also linregress):

scipy.linalg.lstsq(X, Y)

scipy.stats.linregress(X, Y)
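A brief sketch of both scipy routes on made-up data (the design matrix and coefficients are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import lstsq
from scipy.stats import linregress

t = np.linspace(0.0, 1.0, 100)
X = np.column_stack([np.ones_like(t), t])   # design matrix [1, t]
y = 3.0 * t + 2.0                           # known intercept 2, slope 3

# Least-squares solution of the overdetermined system X b = y
b, residues, rank, sv = lstsq(X, y)

# linregress handles the simple 1-D case directly
res = linregress(t, y)
```

Here `b` recovers [2.0, 3.0], and `res.intercept` / `res.slope` match.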

