Are there any security risks in this scenario:
eval(repr(unsanitized_user_input), {"__builtins__": None}, {"True":True, "False":False})
where unsanitized_user_input is a str object. The string is user-generated and could be malicious. Assuming our web framework didn't let us down, it is a real, honest-to-goodness instance of the built-in str type (not a subclass).
If this is dangerous, is there anything we can do to the input to make it safe?
We definitely don't want to execute anything contained in the string.
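For concreteness, here is a minimal runnable sketch of the exact call in question; the sample input value is made up for illustration:

# Sketch of the call under discussion; the input value is only a
# stand-in for real user-generated data.
unsanitized_user_input = "a user-supplied string with 'quotes'"

result = eval(repr(unsanitized_user_input),
              {"__builtins__": None},          # block access to builtins
              {"True": True, "False": False})  # Py2: True/False are builtins, not keywords

# The round trip should reproduce the original string exactly.
assert result == unsanitized_user_input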
The wider context, which (I believe) is not essential to the question, is that we have thousands of these:
repr([unsanitized_user_input_1, unsanitized_user_input_2, unsanitized_user_input_3, unsanitized_user_input_4, ...])
in some cases nested:
repr([[unsanitized_user_input_1, unsanitized_user_input_2], [unsanitized_user_input_3, unsanitized_user_input_4], ...])
which are themselves converted to strings with repr(), put into persistent storage and, ultimately, read back into memory using eval.
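A sketch of that round trip; the function names and the storage step are illustrative, not from our actual code:

def serialize(records):
    # records: a (possibly nested) list of user-supplied str objects
    return repr(records)

def deserialize(blob):
    # Same restricted environment as above: no builtins; True/False are
    # provided by name because repr() of Py2 bools can emit them.
    return eval(blob, {"__builtins__": None},
                {"True": True, "False": False})

records = [["input 1", "input 2"], ["input 3", "input 4"]]
blob = serialize(records)             # blob goes to persistent storage ...
assert deserialize(blob) == records   # ... and is later read back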
In our tests, eval deserialized the strings from persistent storage much faster than pickle and simplejson. The interpreter is Python 2.5, so json and ast are not available. No C modules are allowed, which also rules out cPickle.
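The speed claim can be checked with a rough micro-benchmark along these lines (Python 2.5 syntax; the data shape and sizes are invented for illustration):

import time
import pickle

# Build a nested list of strings roughly like our real records.
data = [["user input %d" % i for i in range(20)] for j in range(500)]
blob_repr = repr(data)
blob_pickle = pickle.dumps(data)

start = time.time()
eval(blob_repr, {"__builtins__": None}, {})
print "eval:   %.4f s" % (time.time() - start)

start = time.time()
pickle.loads(blob_pickle)
print "pickle: %.4f s" % (time.time() - start)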