@vlr/cache

Usage (no npm install needed):

<script type="module">
  import vlrCache from 'https://cdn.skypack.dev/@vlr/cache';
</script>


This is an implementation of a cache that allows using a flat object as a key.
It also offers roughly a 6x performance improvement over the "js-cache" npm package. To handle item expiration, js-cache starts a new timer for each item, which is expensive.
This library uses only one timer at a time. Items are placed into a timebox; when the timer fires, it evicts all items accumulated in that slot from the cache. The timer is throttled with a configurable precision.
For example, with the default precision, if 4 items are put in the cache with timeouts of 200, 400, 600 and 800 milliseconds, the first 2 items will be evicted after 500ms and the last 2 after 1 second.
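The timebox idea above can be sketched as follows. This is an illustrative model, not the library's actual internals: items are rounded up to the next tick boundary so that everything expiring within the same precision window shares one timer slot.

```typescript
// Hypothetical sketch of the timebox slotting described above.
const timeoutPrecision = 500; // ms between eviction ticks (the default)

// Round an item's expiration up to the next tick boundary, so all items
// expiring within the same 500ms window land in one timer slot.
function slotFor(now: number, timeout: number): number {
  return Math.ceil((now + timeout) / timeoutPrecision) * timeoutPrecision;
}

// Timeouts of 200/400ms land in the first slot, 600/800ms in the second,
// matching the example above.
const now = 0;
console.log(slotFor(now, 200)); // 500
console.log(slotFor(now, 400)); // 500
console.log(slotFor(now, 600)); // 1000
console.log(slotFor(now, 800)); // 1000
```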

CacheSettings

Each component in this library can be configured with the following settings:

  1. timeoutPrecision - the time between expirations, in milliseconds. An item that does not fit into the current time slot will be expired in the next one. Default value is 500.
  2. timeboxSize - the number of time slots. If an item's expiration falls beyond the timebox span, it is put into the stack-tree, a slower mechanism for handling expirations that is only about 2-2.5 times faster than "js-cache".
  3. defaultTimeout - if no timeout is specified in the set method, it falls back to this value. Default value is 500ms.
  4. disableTickOptimization - there is an optimization implemented: the current time is taken only once per tick of the JS event loop, and all subsequent items placed into the cache during that tick calculate their timeout from the same time, even though the current time may have changed if the tick was long. Set this flag to disable that optimization.
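A settings object might look like this. The interface shape below is a sketch inferred from the list above; field names and optionality are assumptions, not the library's published typings.

```typescript
// Illustrative shape of the settings described above.
interface CacheSettings {
  timeoutPrecision?: number;         // ms between eviction ticks, default 500
  timeboxSize?: number;              // number of time slots in the timebox
  defaultTimeout?: number;           // fallback for set() without a timeout, default 500
  disableTickOptimization?: boolean; // take current time on every call instead of once per tick
}

const settings: CacheSettings = {
  timeoutPrecision: 250,
  timeboxSize: 32,
  defaultTimeout: 1000,
};
console.log(settings.defaultTimeout); // 1000
```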

CacheProvider

CacheProvider exposes an interface similar to common caching implementations.

const cache = new CacheProvider(settings);
const key = 1, value = 2;
cache.set(key, value, 100); // puts the item into the cache and expires it after 100ms
cache.set(key, value); // overwrites the item; it expires after defaultTimeout (500ms by default)

const has = cache.has(1); // tells whether there is a value for that key in the cache
const result = cache.get(1); // gets an item from the cache by the specified key
cache.delete(1); // forces deletion of the item from the cache

Cache

This is a pass-through cache implementation. So far only a synchronous interface is implemented.

const cache = new Cache(settings);
// this is an example of flat object that can be used as a key
const key = { a: "flat", b: "object", c: "key" }; 
const value = cache.getSync(key, (k) => calculateTheValueOffKey(k));

memoize

This is an implementation of the memoization pattern. It serializes the array of arguments and uses the resulting string as a cache key, so the next time the function is called with the same parameters, the value is taken from the cache unless it has expired.

const calculateItems = memoize(function(arg1: string, arg2: number): number {
  // heavy calculations go here
  return arg1.length * arg2;
}, settings);

memoized

This has the same functionality as memoize, but is more restricted: instead of serializing the arguments array, it builds a flat object from the parameters.
It also has the alias 'memoizeStrict'.
There are 2 restrictions:

  1. Can't have a variable number of parameters, i.e. no optional or default parameters are allowed.
  2. Object parameters should never be mutated by any other code, since reference comparison is run against the parameters.

So when large immutable objects are used as function parameters, the performance is significantly better. A drawback of this approach is small objects that have the same content but different references: memoizeStrict is ineffective in that case, so fall back to regular memoize.
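The reference-comparison behaviour above can be illustrated with a sketch. This is not the library's code: `memoizedSketch` is a hypothetical two-parameter version that keys one nested map level per parameter, which is what makes equal-content-but-different-reference objects a cache miss.

```typescript
// Sketch of the reference-comparison idea behind memoized/memoizeStrict.
function memoizedSketch<R>(fn: (a: object, b: number) => R): (a: object, b: number) => R {
  // one nested map level per parameter, keyed by reference (objects) or value (primitives)
  const byA = new Map<object, Map<number, R>>();
  return (a, b) => {
    let byB = byA.get(a);
    if (!byB) { byB = new Map(); byA.set(a, byB); }
    if (!byB.has(b)) byB.set(b, fn(a, b));
    return byB.get(b)!;
  };
}

const config = { depth: 3 };
let calls = 0;
const search = memoizedSketch((_cfg, n) => { calls++; return n; });
search(config, 1);
search(config, 1);       // same reference: cache hit
search({ depth: 3 }, 1); // equal content, different reference: cache miss
console.log(calls); // 2
```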

Plans for future versions

  1. getAsync method for Cache: along with the cached value, it will cache the promise, so 2 consumers asking for the same item spawn only one promise
  2. getArrayAsync method: provide a list of keys and a method that gets a list of values by a list of keys. The cache will find cached items and pending promises, run a new promise for the remaining keys, and then concatenate the results
  3. memoizeAsync