mpacts.core.units.compat.lrucache

LRU (least recently used) cache backport.

From https://code.activestate.com/recipes/578078-py26-and-py30-backport-of-python-33s-lru-cache/

copyright: 2004, Raymond Hettinger
license: MIT License

In order to use this module, import it like this:

import mpacts.core.units.compat.lrucache
# or assign it to a shorter name
import mpacts.core.units.compat.lrucache as lru

_CacheInfo

mpacts.core.units.compat.lrucache._CacheInfo

alias of the CacheInfo named tuple (hits, misses, maxsize, currsize) returned by f.cache_info()

_HashedSeq

class mpacts.core.units.compat.lrucache._HashedSeq(tup, hash=hash)

Bases: list

A list subclass wrapping the key tuple; its hash is computed once at construction and reused, so lru_cache does not re-hash the key on every lookup.

hashvalue: the precomputed hash of the wrapped key tuple
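
For illustration only (a minimal sketch; _HashedSeq is an internal helper and is not normally constructed by user code), the precomputed hash can be observed directly:

import mpacts.core.units.compat.lrucache as lru

key = lru._HashedSeq(('spam', 1, 2.0))
print(hash(key) == key.hashvalue)   # True: hash() reuses the stored value
print(key == ['spam', 1, 2.0])      # True: it still behaves as a list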
mpacts.core.units.compat.lrucache._make_key(args, kwds, typed, kwd_mark=(object(),), fasttypes={int, str, frozenset, type(None)}, sorted=sorted, tuple=tuple, type=type, len=len)

Make a cache key from optionally typed positional and keyword arguments
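A minimal sketch of how the typed flag changes the key (_make_key is an internal helper used by lru_cache):

import mpacts.core.units.compat.lrucache as lru

# typed=False: f('x', 3) and f('x', 3.0) produce equal cache keys, since 3 == 3.0
print(lru._make_key(('x', 3), {}, False) == lru._make_key(('x', 3.0), {}, False))   # True

# typed=True: the argument types are appended to the key, so the calls are cached separately
print(lru._make_key(('x', 3), {}, True) == lru._make_key(('x', 3.0), {}, True))     # False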

mpacts.core.units.compat.lrucache.lru_cache(maxsize=100, typed=False)

Least-recently-used cache decorator.

If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.

If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.

Arguments to the cached function must be hashable.

View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info(). Clear the cache and statistics with f.cache_clear(). Access the underlying function with f.__wrapped__.
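A minimal usage sketch (the Fibonacci function below is only an illustration):

import mpacts.core.units.compat.lrucache as lru

@lru.lru_cache(maxsize=32)
def fib(n):
    # naive recursion; the cache makes repeated subproblems cheap
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(20))            # 6765
print(fib.cache_info())   # CacheInfo(hits=18, misses=21, maxsize=32, currsize=21)
fib.cache_clear()         # reset the cache and the statistics
print(fib.__wrapped__(5)) # call the undecorated function directly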

See: http://en.wikipedia.org/wiki/Cache_algorithms#Least_Recently_Used