Running python manage.py test gives "maximum recursion depth exceeded" error


So I have a Django project that was built on 1.6.5, and I'm now migrating it to 1.9.5. I successfully migrated it to 1.7.0 and then to 1.8.0. While going from 1.8.0 to 1.9.0, I had to replace SortedDict with collections.OrderedDict. Now I'm encountering this error when I run python manage.py test:

    File "forum/models/base.py", line 134, in iterator
    key_list = [v[0] for v in self.values_list(*values_list)]
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 258, in __iter__
    self._fetch_all()
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 1074, in _fetch_all
    self._result_cache = list(self.iterator())
  File "forum/models/base.py", line 134, in iterator
    key_list = [v[0] for v in self.values_list(*values_list)]
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 258, in __iter__
    self._fetch_all()
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 1074, in _fetch_all
    self._result_cache = list(self.iterator())
  File "forum/models/base.py", line 134, in iterator
    key_list = [v[0] for v in self.values_list(*values_list)]
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 725, in values_list
    clone = self._values(*fields)
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 671, in _values
    clone = self._clone()
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/query.py", line 1059, in _clone
    query = self.query.clone()
  File "venv/mystuff/lib/python2.7/site-packages/django/db/models/sql/query.py", line 298, in clone
    obj._annotations = self._annotations.copy() if self._annotations is not None else None
  File "/opt/python/python-2.7/lib64/python2.7/collections.py", line 194, in copy
    return self.__class__(self)
  File "/opt/python/python-2.7/lib64/python2.7/collections.py", line 57, in __init__
    self.__update(*args, **kwds)
  File "venv/mystuff/lib64/python2.7/abc.py", line 151, in __subclasscheck__
    if subclass in cls._abc_cache:
  File "venv/mystuff/lib64/python2.7/_weakrefset.py", line 72, in __contains__
    wr = ref(item)
RuntimeError: maximum recursion depth exceeded

Answers to similar questions suggest upgrading Python to 2.7.5, but I'm already running 2.7.11.

EDIT: forum/models/base.py

def iterator(self):
    cache_key = self.model._generate_cache_key("QUERY:%s" % self._get_query_hash())
    on_cache_query_attr = self.model.value_to_list_on_cache_query()

    to_return = None
    to_cache = {}

    with_aggregates = len(self.query.aggregates) > 0
    key_list = self._fetch_from_query_cache(cache_key)

    if key_list is None:
        if not with_aggregates:
            values_list = [on_cache_query_attr]

            if len(self.query.extra):
                values_list += self.query.extra.keys()

            key_list = [v[0] for v in self.values_list(*values_list)] #Line 134
            to_cache[cache_key] = (datetime.datetime.now(), key_list)
        else:
            to_return = list(super(CachedQuerySet, self).iterator())
            to_cache[cache_key] = (datetime.datetime.now(), [
                (row.__dict__[on_cache_query_attr], dict([(k, row.__dict__[k]) for k in self.query.aggregates.keys()]))
                for row in to_return])
    elif with_aggregates:
        tmp = key_list
        key_list = [k[0] for k in tmp]
        with_aggregates = [k[1] for k in tmp]
        del tmp

    if (not to_return) and key_list:
        row_keys = [self.model.infer_cache_key({on_cache_query_attr: attr}) for attr in key_list]
        cached = cache.get_many(row_keys)

        to_return = [
            (ck in cached) and self.obj_from_datadict(cached[ck]) or ToFetch(force_unicode(key_list[i])) for i, ck in enumerate(row_keys)
        ]

        if len(cached) != len(row_keys):
            to_fetch = [unicode(tr) for tr in to_return if isinstance(tr, ToFetch)]

            fetched = dict([(force_unicode(r.__dict__[on_cache_query_attr]), r) for r in
                          models.query.QuerySet(self.model).filter(**{"%s__in" % on_cache_query_attr: to_fetch})])

            to_return = [(isinstance(tr, ToFetch) and fetched[unicode(tr)] or tr) for tr in to_return]
            to_cache.update(dict([(self.model.infer_cache_key({on_cache_query_attr: attr}), r._as_dict()) for attr, r in fetched.items()]))

        if with_aggregates:
            for i, r in enumerate(to_return):
                r.__dict__.update(with_aggregates[i])


    if len(to_cache):
        cache.set_many(to_cache, 60 * 60)

    if to_return:
        for row in to_return:
            if hasattr(row, 'leaf'):
                row = row.leaf

            row.reset_original_state()
            yield row
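For context, the method above leans on Django's cache API: cache.get_many() returns only the keys it actually found, so the misses can then be fetched from the database in a single query. A dict-backed stand-in sketching that fill-the-gaps pattern (DictCache and the key names are illustrative, not Django's implementation):

```python
class DictCache(object):
    """Hypothetical in-memory stand-in for django.core.cache's API."""
    def __init__(self):
        self._store = {}

    def get_many(self, keys):
        # Like Django's cache.get_many(): only keys actually present
        # appear in the returned dict.
        return dict((k, self._store[k]) for k in keys if k in self._store)

    def set_many(self, mapping, timeout=None):
        self._store.update(mapping)

cache = DictCache()
cache.set_many({'forum.question:1': {'pk': 1}})

row_keys = ['forum.question:1', 'forum.question:2']
cached = cache.get_many(row_keys)                     # hits only
to_fetch = [k for k in row_keys if k not in cached]   # misses -> one DB query
```

The real method does the same thing with ToFetch placeholders so the cached and fetched rows come back in the original order.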

django/db/models/query.py:

def _fetch_all(self):
    if self._result_cache is None:
        self._result_cache = list(self.iterator()) #Line 1074
    if self._prefetch_related_lookups and not self._prefetch_done:
        self._prefetch_related_objects()

django/db/models/query.py:

def __iter__(self):
    """
    The queryset iterator protocol uses three nested iterators in the
    default case:
        1. sql.compiler:execute_sql()
           - Returns 100 rows at time (constants.GET_ITERATOR_CHUNK_SIZE)
             using cursor.fetchmany(). This part is responsible for
             doing some column masking, and returning the rows in chunks.
        2. sql/compiler.results_iter()
           - Returns one row at time. At this point the rows are still just
             tuples. In some cases the return values are converted to
             Python values at this location.
        3. self.iterator()
           - Responsible for turning the rows into model objects.
    """
    self._fetch_all() #Line 258
    return iter(self._result_cache)
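Those two Django methods implement the usual lazy-evaluation contract: iterator() produces rows, _fetch_all() memoizes them in _result_cache, and __iter__() serves from that cache so the query runs at most once. A minimal sketch of that contract (not Django code; names are illustrative):

```python
class LazyQuerySet(object):
    """Toy model of QuerySet's result caching."""
    def __init__(self, produce_rows):
        self._produce_rows = produce_rows
        self._result_cache = None
        self.db_hits = 0  # counts how often iterator() actually ran

    def iterator(self):
        self.db_hits += 1
        return iter(self._produce_rows())

    def _fetch_all(self):
        if self._result_cache is None:
            self._result_cache = list(self.iterator())

    def __iter__(self):
        self._fetch_all()
        return iter(self._result_cache)

qs = LazyQuerySet(lambda: [1, 2, 3])
list(qs)
list(qs)  # second pass served from _result_cache; db_hits stays 1
```

This is also why the traceback alternates between __iter__, _fetch_all and the custom iterator(): every evaluation of a queryset funnels through this trio.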

UPDATE: My Django log shows this:

/forum/settings/base.py TIME: 2017-06-27 06:49:53,410 MSG: base.py:value:65 Error retrieving setting from database (FORM_EMPTY_QUESTION_BODY): maximum recursion depth exceeded in cmp
/forum/settings/base.py TIME: 2017-06-27 06:49:53,444 MSG: base.py:value:65 Error retrieving setting from database (FORM_MIN_NUMBER_OF_TAGS): maximum recursion depth exceeded
Answer:

So you call values_list() from your iterator, which clones the QuerySet; iterating that clone calls your iterator again, which clones the QuerySet again...
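The cycle can be reproduced, and broken, without Django at all. In 1.9, values_list() no longer returns a separate ValuesListQuerySet; the clone keeps your subclass, so iterating it re-enters your overridden iterator(). A toy sketch (class names are illustrative, not Django's implementation); the fix is to route the inner evaluation through a queryset that does not use the overridden iterator:

```python
class QuerySet(object):
    """Toy queryset: values_list() clones self, and iterating the
    clone calls its iterator()."""
    def __init__(self, data):
        self._data = list(data)

    def iterator(self):
        return iter(self._data)

    def __iter__(self):
        return self.iterator()

    def values_list(self):
        clone = self.__class__(self._data)  # like Django 1.9: clone keeps your subclass
        return [(row,) for row in clone]    # iterating -> clone.iterator()

class BrokenCachedQuerySet(QuerySet):
    def iterator(self):
        key_list = [v[0] for v in self.values_list()]  # re-enters this iterator()
        return iter(key_list)

class FixedCachedQuerySet(QuerySet):
    def iterator(self):
        plain = QuerySet(self._data)  # plain queryset -> base iterator, no re-entry
        key_list = [v[0] for v in plain.values_list()]
        return iter(key_list)
```

In your real code that would mean building the key list off a plain models.query.QuerySet(self.model), the same way your cache-miss branch already does when it fetches rows.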

The API changed there in this commit, which should give you ample information to reimplement your QuerySet.

On a different note, it looks like Django has implemented query caching itself, so before refactoring you might check whether your CachedQuerySet has become obsolete.