aboutlo / async-memo-ize

License: MIT
🛠 Memoize utility for async/await syntax and promises. It supports an in-memory cache or Redis.

Programming Languages

JavaScript
184084 projects - #8 most used programming language
CSS
56736 projects

Projects that are alternatives of or similar to async-memo-ize

bash-cache
Transparent caching layer for bash functions; particularly useful for functions invoked as part of your prompt.
Stars: ✭ 45 (+181.25%)
Mutual labels:  memoization, cache
Cachier
Persistent, stale-free, local and cross-machine caching for Python functions.
Stars: ✭ 359 (+2143.75%)
Mutual labels:  memoization, cache
tacky
Primitive Object Memoization for Ruby
Stars: ✭ 14 (-12.5%)
Mutual labels:  memoization, cache
Cached
Rust cache structures and easy function memoization
Stars: ✭ 530 (+3212.5%)
Mutual labels:  memoization, cache
cacheme-go
🚀 Schema-based, typed Redis caching/memoize framework for Go
Stars: ✭ 19 (+18.75%)
Mutual labels:  memoization, cache
Python Memoization
A powerful caching library for Python, with TTL support and multiple algorithm options.
Stars: ✭ 109 (+581.25%)
Mutual labels:  memoization, cache
KJNetworkPlugin
🎡 A lightweight but powerful network library. Network plugin, supporting batch and chain operations. (Plugin-based network architecture)
Stars: ✭ 43 (+168.75%)
Mutual labels:  cache
cache
PSR-16 compatible cache library
Stars: ✭ 30 (+87.5%)
Mutual labels:  cache
bazel-cache
Minimal cloud-oriented Bazel gRPC cache
Stars: ✭ 33 (+106.25%)
Mutual labels:  cache
memoize
Caching library for asynchronous Python applications.
Stars: ✭ 53 (+231.25%)
Mutual labels:  cache
punic
Punic is a remote cache CLI built for Carthage and Apple .xcframework
Stars: ✭ 25 (+56.25%)
Mutual labels:  cache
qcache
In-memory cache server with query capabilities
Stars: ✭ 36 (+125%)
Mutual labels:  cache
infinitree
Scalable and encrypted embedded database with 3-tier caching
Stars: ✭ 80 (+400%)
Mutual labels:  cache
varnish-cache-reaper
Simple Python/Twisted HTTP daemon forwarding PURGE and BAN requests to multiple Varnish (or other proxy) instances
Stars: ✭ 12 (-25%)
Mutual labels:  cache
microstream
High-performance Java-native persistence. Store and load any Java object graph or subgraphs partially, relieved of heavyweight JPA. Microsecond response time, ultra-high throughput, minimal latency. Create ultra-fast in-memory database applications & microservices.
Stars: ✭ 283 (+1668.75%)
Mutual labels:  cache
tiny-cache
Cache WordPress post content, template parts, translations and nav menu output in a persistent object cache
Stars: ✭ 26 (+62.5%)
Mutual labels:  cache
cache-command
Manages object and transient caches.
Stars: ✭ 12 (-25%)
Mutual labels:  cache
hk-cache-manager
Simple wrapper for Redis cache with StackExchange.Redis, aimed at AspNetCore
Stars: ✭ 17 (+6.25%)
Mutual labels:  cache
ultrafetch
Node-based fetch backed with an RFC-7234 compliant filesystem cache.
Stars: ✭ 30 (+87.5%)
Mutual labels:  cache
fastapi-cache
fastapi-cache is a tool to cache FastAPI responses and function results, with support for Redis and Memcached backends.
Stars: ✭ 375 (+2243.75%)
Mutual labels:  cache

Async Memo-ize


"In computing, memoization or memoisation is an optimization technique used primarily to speed up computer programs by storing the results of expensive function calls and returning the cached result when the same inputs occur again" (Wikipedia)

This library makes async functions, a.k.a. Promises, first-class citizens of memoization.

Use cases covered:

  1. An expensive function call (e.g. API calls, CPU-intensive calculations, etc.)
  2. Multiple simultaneous calls are queued so that the expensive function is invoked only once.
  3. Multiple NodeJS instances sharing a centralized cache (e.g. Redis)

Note: synchronous functions can be memoized too.
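The call-queuing behaviour can be sketched roughly as follows. This is an illustration of the idea only, not the library's actual implementation; `dedupe` and `inFlight` are hypothetical names:

```javascript
// Sketch of call de-duplication: while a computation for a given key is
// in flight, later callers receive the same pending Promise, so the
// expensive function runs only once per key at a time.
const inFlight = new Map()

function dedupe(fn, keyFor = (...args) => JSON.stringify(args)) {
  return (...args) => {
    const key = keyFor(...args)
    if (!inFlight.has(key)) {
      // Remove the entry once the promise settles, success or failure.
      const p = fn(...args).finally(() => inFlight.delete(key))
      inFlight.set(key, p)
    }
    return inFlight.get(key)
  }
}

// Three simultaneous calls, one actual invocation.
let calls = 0
const slow = dedupe(async () => { calls += 1; return 42 })
Promise.all([slow(), slow(), slow()]).then(results => {
  console.log(calls, results) // 1 [ 42, 42, 42 ]
})
```

A full memoization layer would additionally consult the cache before invoking `fn` and store the resolved value afterwards.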

Real project use case

A NodeJS cluster computes a calculation every day for each user. The calculation is incremental, using data from the last 90 days. With this approach the work can be distributed across all available nodes, and the results are shared among them via a distributed cache (e.g. Redis), so data from previous days doesn't have to be crunched again and again.

Install

npm install async-memo-ize

or

yarn add async-memo-ize

Usage

Named functions

import memoize from 'async-memo-ize'
import sleep from 'sleep-promise';

const whatsTheAnswerToLifeTheUniverseAndEverything = async () => {
  await sleep(2000);
  return 42
}
const memoized = memoize(whatsTheAnswerToLifeTheUniverseAndEverything)

const answer = await memoized()      // resolves after 2 seconds
const quickAnswer = await memoized() // resolves in a few ms (cached)

Anonymous functions

import memoize from 'async-memo-ize'
import sleep from 'sleep-promise';

const whatsTheAnswerToLifeTheUniverseAndEverything = memoize(async () => {
  await sleep(2000);
  return 42
}, {id: 'whatsTheAnswerToLifeTheUniverseAndEverything'})

const answer = await whatsTheAnswerToLifeTheUniverseAndEverything()      // resolves after 2 seconds
const quickAnswer = await whatsTheAnswerToLifeTheUniverseAndEverything() // resolves in a few ms (cached)

To memoize an anonymous function, you have to pass a unique id. The id is used to generate the cache key, and it is required to share the same cache entry across multiple memoized functions. Named functions don't need it because the library relies on fn.name as the id.

Cache

In Memory

An async cache based on LRUMap is provided.

Usage

import memoize, {LocalCache} from 'async-memo-ize'

const fn = async () => Promise.resolve(42)
const memoized = memoize(fn, new LocalCache())

const answer = await memoized() // resolves in a few ms

You can provide your own implementation matching the interface below:

class LocalCache {

  async has(key) {
    ...
  }

  async get(key) {
    ...
  }

  async set(key, value) {
    ...
  }

  async del(key) {
    ...
  }

  async entries() {
    ...
  }

  async size() {
    ...
  }
}
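As an illustration, a minimal Map-backed cache satisfying this interface might look like the following. `MapCache` is a hypothetical name, not part of the library, and it performs no eviction (the bundled LocalCache adds LRU behaviour on top):

```javascript
// Minimal Map-backed implementation of the async cache interface.
// All methods are async so callers can swap in a remote backend
// (e.g. Redis) without changing their code.
class MapCache {
  constructor() {
    this.map = new Map()
  }
  async has(key) { return this.map.has(key) }
  async get(key) { return this.map.get(key) }
  async set(key, value) { this.map.set(key, value) }
  async del(key) { this.map.delete(key) }
  async entries() { return [...this.map.entries()] }
  async size() { return this.map.size }
}
```

An instance could then be passed to memoize in place of LocalCache, e.g. `memoize(fn, new MapCache())`.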

Redis

If you want to delegate and share the cache between NodeJS instances, you can use RedisCache.

yarn add async-memo-ize-plugin-redis-cache

Usage

import memoize from 'async-memo-ize'
import RedisCache from 'async-memo-ize-plugin-redis-cache'

const fn = async () => 42
const memoized = memoize(fn, new RedisCache())

const answer = await memoized()

Notice

The key serialized to Redis is based on the function's name and its arguments.

Given:

const doSomething = async (a, b) => a+b

The key generated:

["doSomething",1,5]

This means multiple NodeJS instances can share the computed value whenever the function name and the arguments match. If you prefer to use an anonymous function, you must pass an id option.
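The key scheme described above can be sketched like this. `cacheKey` is an illustrative helper, and the library's actual serialization may differ in detail:

```javascript
// Sketch of the key scheme: the function name (or an explicit id for
// anonymous functions) plus the call arguments, serialized as a JSON array.
const doSomething = async (a, b) => a + b

function cacheKey(fn, args, id) {
  return JSON.stringify([id || fn.name, ...args])
}

console.log(cacheKey(doSomething, [1, 5])) // ["doSomething",1,5]
```

Because the key is derived only from the name/id and the arguments, any instance that evaluates the same call lands on the same Redis entry.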

Test

Prerequisites

docker run -d -p 6379:6379 redis:alpine  

Run

yarn test

Release

lerna publish

TODO

  • Calculate at runtime a safe default for SimpleCache max
  • Decide whether to implement .entries() and .size() on RedisCache
  • Evaluate creating an ES5-compatible version

Reminder for SimpleCache max

--max_old_space_size
echo "console.log(process.argv.splice(2))" > index.js
node index.js --max_old_space_size --expose_gc