ADF test in statsmodels in Python

I am trying to run an augmented Dickey-Fuller test in statsmodels in Python, but something seems to be going wrong.

This is the code I'm trying to run:

    import numpy as np
    import statsmodels.tsa.stattools as ts

    x = np.array([1, 2, 3, 4, 3, 4, 2, 3])
    result = ts.adfuller(x)

I get the following error:

    Traceback (most recent call last):
      File "C:\Users\Akavall\Desktop\Python\Stats_models\stats_models_test.py", line 12, in <module>
        result = ts.adfuller(x)
      File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\stattools.py", line 201, in adfuller
        xdall = lagmat(xdiff[:,None], maxlag, trim='both', original='in')
      File "C:\Python27\lib\site-packages\statsmodels-0.4.1-py2.7-win32.egg\statsmodels\tsa\tsatools.py", line 305, in lagmat
        raise ValueError("maxlag should be < nobs")
    ValueError: maxlag should be < nobs

My NumPy version: 1.6.1. My statsmodels version: 0.4.1. I am on Windows.

I am looking at the documentation here, but cannot understand what I am doing wrong. What am I missing?

Thanks in advance.

1 answer

Got it. By default, maxlag is set to None, and for a series this short the default lag length that statsmodels computes is not smaller than the number of observations, which is exactly what the error complains about. Passing an explicit small integer works:

    import numpy as np
    import statsmodels.tsa.stattools as ts

    x = np.array([1, 2, 3, 4, 3, 4, 2, 3])
    result = ts.adfuller(x, 1)  # maxlag is now set to 1

Output:

    >>> result
    (-2.6825663173365015, 0.077103947319183241, 0, 7,
     {'5%': -3.4775828571428571, '1%': -4.9386902332361515, '10%': -2.8438679591836733},
     15.971188911270618)
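
For what it's worth, the returned tuple can be unpacked by position. A minimal sketch, using the same short series as above (the fields, in order, are: test statistic, p-value, number of lags used, number of observations, dict of critical values, and the information criterion of the best lag choice):

    import numpy as np
    import statsmodels.tsa.stattools as ts

    x = np.array([1, 2, 3, 4, 3, 4, 2, 3])

    # Unpack the six values returned by adfuller (with the default autolag).
    adf_stat, pvalue, usedlag, nobs, crit_values, icbest = ts.adfuller(x, 1)

    # The null hypothesis (a unit root is present) is rejected when the
    # test statistic is more negative than a critical value.
    print("ADF statistic: %f" % adf_stat)
    print("p-value: %f" % pvalue)
    for level in sorted(crit_values):
        print("Critical value (%s): %f" % (level, crit_values[level]))

Here the statistic (-2.68) is not more negative than even the 10% critical value (-2.84), so with this tiny sample you cannot reject the unit-root null.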
