Imported modules become None when replacing the current module in sys.modules using a class object

There is an unpopular but "supported" Python hack (see Guido's comment: https://mail.python.org/pipermail/python-ideas/2012-May/014969.html ) that lets you use __getattr__ for module attributes by replacing the module in sys.modules with a class instance:

```python
import os, sys

class MyClass(object):
    def check_os(self):
        print(os)

sys.modules[__name__] = MyClass()
```
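The payoff of the hack is that the instance can intercept attribute lookups, which plain modules could not do before Python 3.7 added module-level __getattr__. A minimal, self-contained sketch (the module name `demo` and the attribute logic are invented for illustration):

```python
import sys

class LazyModule(object):
    """Stands in for a module; computes attributes on demand."""
    def __getattr__(self, name):
        # Called only for attributes not found the normal way, so
        # ordinary methods and attributes still resolve as usual.
        return "computed %s" % name

# Install the instance under a module name: any later "import demo"
# simply returns this object from the sys.modules cache.
sys.modules['demo'] = LazyModule()

import demo
print(demo.anything)  # -> computed anything
```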

Upon import, the imported module becomes an instance of the class:

```
>>> import myModule
>>> myModule
<myModule.MyClass object at 0xf76def2c>
```

However, in Python 2.7, every module imported by the replaced module is None by the time the method runs:

```
>>> myModule.check_os()
None
```

In Python 3.4, everything works:

```
>>> myModule.check_os()
<module 'os' from '/python/3.4.1/lib/python3.4/os.py'>
```

This looks related to imported modules becoming None at interpreter shutdown, but does anyone know why it happens internally?

It turns out that if you keep a reference to the original module (so that it is not completely discarded in Python 2), everything keeps working:

```python
sys.modules[__name__ + '_bak'] = sys.modules[__name__]
```
1 answer

The problem you're running into is that in Python versions before 3.4, when a module object is destroyed (and yours is, because you replace it in sys.modules with a class instance and no other references to it remain), the entries in that module's __dict__ are forcibly set to None.

The workaround, if you need to support Pythons before 3.4, is to put the import statement inside the class that will replace the module:

```python
class MyClass(object):
    import os  # bound as a class attribute

    def check_os(self):
        print(self.os)
```
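A quick check of the workaround (the module name `mymod` is invented for the demo): because the method reaches the module through the class attribute rather than the module's globals, it keeps working after the original module object is gone.

```python
import sys

class MyClass(object):
    # The import statement binds the os module as a class attribute,
    # so the method does not depend on the module's globals.
    import os

    def check_os(self):
        print(self.os)

sys.modules['mymod'] = MyClass()

import mymod
mymod.check_os()  # prints the os module object
```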

For more information, see this answer about interpreter shutdown.

