
Object::Cache

Easy caching of Ruby objects, using Redis as a backend store.

Installation

Add this line to your application's Gemfile:

gem 'object-cache'

And then execute:

bundle

Or install it yourself as:

gem install object-cache

Quick Start

# require the proper libraries in your project
require 'redis'
require 'object/cache'

# set the backend to a new Redis instance
Cache.backend = Redis.new

# wrap your object in a `Cache.new` block to store the object on first usage,
# and retrieve it again on subsequent usages
Cache.new { 'hello world' }

# add the core extension for easier access
require 'object/cache/core_extension'
cache { 'hello world' }

Usage

Using Object::Cache, you can cache Ruby objects that are expensive to initialize, and replay the recorded object on subsequent requests.

For example, database query results can be cached, or HTTP requests to other services within your infrastructure.

Caching an object is as easy as wrapping that object in a Cache.new block:

Cache.new { 'hello world' }

Here, the object is of type String, but it can be any type of object that can be marshalled using the Ruby Marshal library.

marshaling data

You can only marshal data, not code, so code objects such as Procs cannot be cached. You can still wrap one in a Cache.new block, and the block will return the Proc as expected, but no caching occurs, so there is little point in doing so.
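You can see this limitation with the Marshal library itself: plain data round-trips fine, while dumping a Proc raises a TypeError:

```ruby
# Plain data objects round-trip through Marshal without issue:
data = Marshal.load(Marshal.dump({ user: 'jane', scores: [1, 2, 3] }))
# data => { user: 'jane', scores: [1, 2, 3] }

# Code objects such as Procs cannot be marshalled:
begin
  Marshal.dump(proc { 42 })
rescue TypeError => e
  e.message # => "no _dump_data is defined for class Proc"
end
```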

ttl

By default, a cached object has a ttl (time to live) of one week. This means that every request after the first one reads the value from the cached object. After one week, the cached value becomes stale, and the first request after that will again store the (possibly changed) object in the cache store.
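Conceptually, the ttl behaviour works like the toy in-memory sketch below (for illustration only; the gem itself delegates expiry to Redis):

```ruby
# Toy expiring store illustrating ttl semantics (not the gem's code):
class ExpiringStore
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @entries = {}
  end

  def fetch(key, ttl:)
    entry = @entries[key]
    return entry.value if entry && Time.now < entry.expires_at

    value = yield # recompute the (possibly changed) object
    @entries[key] = Entry.new(value, Time.now + ttl)
    value
  end
end

store = ExpiringStore.new
store.fetch('greeting', ttl: 60) { 'hello' } # computes and caches 'hello'
store.fetch('greeting', ttl: 60) { 'other' } # => 'hello' (cache hit)
```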

You can globally set the default ttl to a different value:

Cache.default_ttl = 120

You can easily modify the ttl per cached object, using the keyword argument by that same name:

Cache.new(ttl: 60) { 'remember me for 60 seconds!' }

Or, if you want the cached object to never go stale, disable the TTL entirely:

Cache.new(ttl: nil) { 'I am forever in your cache!' }
Cache.new(ttl: 0) { 'me too!' }

Note that it is best never to leave a value in the backend forever. Since this library derives keys from file names and line numbers, a change in your code might mean a new cache object is created after a deployment, leaving the old cache object orphaned, polluting your storage forever.

namespaced keys

When storing the key/value object into Redis, the key name is based on the file name and line number where the cache was initiated. This allows you to cache objects without specifying any namespacing yourself.

If, however, you are storing an object that changes based on input, you need to add a unique namespace to the cache, to make sure the correct object is returned from the cache:

Cache.new(email) { User.find(email: email) }

In the above case, we use the customer's email address to namespace the returned object in the cache store. The provided namespace argument is still combined with the file name and line number of the cache request, so you can re-use that same email namespace in different locations without worrying about naming collisions.

key prefixes

By default, the eventual key ending up in Redis is a 6-character long digest, based on the file name, line number, and optional key passed into the Cache object:

Cache.new { 'hello world' }
Cache.backend.keys # => ["22abcc"]

This makes working with keys quick and easy, without worrying about conflicting keys.
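To illustrate, such a digest could be derived by hashing the call-site details and truncating the result (a hypothetical sketch; the gem's actual key-derivation scheme may differ):

```ruby
require 'digest'

# Hypothetical key derivation: hash the call site (plus an optional
# namespace), then keep the first six characters:
def cache_key(file, line, namespace = nil)
  Digest::SHA1.hexdigest([file, line, namespace].compact.join(':'))[0, 6]
end

cache_key('app/models/user.rb', 42)                     # six hex characters
cache_key('app/models/user.rb', 42, 'jane@example.com') # a different digest
```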

However, this does make it more difficult to selectively delete keys from the backend, if you want to purge the cache of specific keys, before their TTL expires.

To support this use-case, you can use the key_prefix attribute:

Cache.new(key_prefix: 'hello') { 'hello world' }
Cache.backend.keys # => ["hello_22abcc"]

This allows you to selectively purge keys from Redis:

keys = Cache.backend.keys('hello_*')
Cache.backend.del(keys)

You can also use the special value :method_name to dynamically set the key prefix based on where the cached object was created:

Cache.new(key_prefix: :method_name) { 'hello world' }
Cache.backend.keys # => ["test_key_prefix_method_name_22abcc"]

Or, use :class_name to group keys in the same class together:

Cache.new(key_prefix: :class_name) { 'hello world' }
Cache.backend.keys # => ["CacheTest_22abcc"]

You can also define these options globally:

Cache.default_key_prefix = :method_name

redis replicas

Earlier, we used the following setup to connect Object::Cache to a Redis backend:

Cache.backend = Redis.new

The Ruby Redis library has built-in primary/replica support via Redis Sentinel.

If, however, you have your own setup and want writes and reads separated between different Redis instances, you can pass a hash to the backend config, with a primary and a replicas key:

Cache.backend = { primary: Redis.new, replicas: [Redis.new, Redis.new] }

When writing the initial object to the backend, the primary Redis is used. On subsequent requests, a random replica is used to retrieve the stored value.

Note that the above example only works if the replicas actually receive the data written to the primary instance.
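The read/write split can be pictured with simple in-memory stand-ins (a sketch for illustration; FakeStore is hypothetical and does not replicate, unlike real Redis instances):

```ruby
# Minimal stand-in for a Redis connection, for illustration only:
class FakeStore
  def initialize
    @data = {}
  end

  def set(key, value)
    @data[key] = value
  end

  def get(key)
    @data[key]
  end
end

primary  = FakeStore.new
replicas = [FakeStore.new, FakeStore.new]

# Writes go to the primary instance:
primary.set('22abcc', 'hello world')

# Reads pick a random replica; since these fakes receive no replicated
# data from the primary, the read misses:
replicas.sample.get('22abcc') # => nil
```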

core extension

Finally, you can use the cache method for convenient access to the cache object:

require 'object/cache/core_extension'

# these are the same:
cache('hello', ttl: 60) { 'hello world' }
Cache.new('hello', ttl: 60) { 'hello world' }

That's it!

License

The gem is available as open source under the terms of the MIT License.
