So, I'm trying to implement the longest common subsequence (LCS) in Python, and I'm trying an alternative to my previous solution: using a dictionary instead of a 2-D matrix to memoize the results.
    def lcs(s1, s2):
        cache = {}
        if len(s1) == 0 or len(s2) == 0:
            return 0
        if (s1, s2) in cache:
            return cache[s1, s2]
        else:
            if s1[-1] == s2[-1]:
                cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1])
            else:
                cache[s1, s2] = max(lcs(s1[:-1], s2), lcs(s1, s2[:-1]))
        print cache
It returns
    TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'
which I understand, because I am not returning anything, but I can't do something like this:

    return cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1])
And I'm trying to implement it without using any decorators.
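For context, here is a minimal sketch of the kind of structure I have in mind: assign into the cache first and then return the stored value. The explicit `cache` parameter is just an assumption of this sketch (one way to make the cache persist across recursive calls without decorators; a module-level dict would be another), not part of my original code:

    def lcs(s1, s2, cache=None):
        # create the cache once and reuse it across recursive calls
        if cache is None:
            cache = {}
        if len(s1) == 0 or len(s2) == 0:
            return 0
        if (s1, s2) not in cache:
            if s1[-1] == s2[-1]:
                cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1], cache)
            else:
                cache[s1, s2] = max(lcs(s1[:-1], s2, cache),
                                    lcs(s1, s2[:-1], cache))
        # assign first, then return the stored value (instead of the
        # invalid `return cache[s1, s2] = ...` shown above)
        return cache[s1, s2]

    lcs("ABCBDAB", "BDCABA")  # expected result: 4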