Redis Pub/Sub: Keep Your In-Memory Cache Fresh

Redis has a very nice feature called Pub/Sub. You can publish an event when a key's value changes, and other applications can subscribe to that event. You can use this feature to keep your in-memory cache always fresh.

Problem

An in-memory cache is the fastest cache and is critical for the performance of your application. The problem with an in-memory cache is that there is no easy way to clear it when the data changes in the original source (i.e. the database).

Let me explain the idea with an example. Suppose your application displays a list of categories with counts (the number of books in each category) on the website so users can browse books by category. Since categories don't change very often and you don't add books every minute, your team decided to keep this data in an in-memory cache for 30 minutes. That works fine, but when you add a new book to a category, the count doesn't update on the website immediately because the data is cached in memory for 30 minutes. The update can take up to 30 minutes to show up in the browser. And because your application is deployed on a web farm, each instance has its own in-memory copy of this data.

Solution

To solve the problem you move your caching to a centralized cache store like Redis or Memcached. That means all instances of your application share the same cache storage. Now when a book is added to a category, you can simply remove the cache key from Redis/Memcached and all instances will display the updated data immediately.

Centralized caching solves the problem, but this kind of cache is slower than an in-memory cache because of the network latency involved. What if we could use both the in-memory cache and the centralized cache to boost the performance of our application? For the categories menu, the application would first check the in-memory cache, then fall back to Redis/Memcached, and finally fall back to the database. To achieve this we need a way to update the in-memory cache on all instances whenever a value changes. We can do that with Redis pub/sub: every instance of our application listens to Redis for published change events and, based on the information about the changed key, removes the correct cache key from its in-memory cache. That way your in-memory cache always stays in sync with the updated data.
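To make the read path concrete, here is a minimal sketch of the two-level lookup. ICacheStore is the in-memory cache wrapper used later in this post; the IDatabase, ICategoryRepository and method names here are hypothetical placeholders, not code from the actual application.

CategoryMenuProvider (sketch)

public class CategoryMenuProvider
{
    private readonly ICacheStore _memoryCache;          // per-instance in-memory cache wrapper
    private readonly IDatabase _redis;                   // StackExchange.Redis database
    private readonly ICategoryRepository _repository;    // original data source

    public CategoryMenuProvider(ICacheStore memoryCache, IDatabase redis, ICategoryRepository repository)
    {
        _memoryCache = memoryCache;
        _redis = redis;
        _repository = repository;
    }

    public async Task<string> GetCategoryMenuJsonAsync()
    {
        const string key = "categories:menu";

        // 1. Fastest: per-instance in-memory cache.
        var cached = _memoryCache.Get<string>(key);
        if (cached != null)
            return cached;

        // 2. Shared: centralized Redis cache.
        var fromRedis = await _redis.StringGetAsync(key);
        if (fromRedis.HasValue)
        {
            _memoryCache.Set(key, (string)fromRedis, TimeSpan.FromMinutes(30));
            return (string)fromRedis;
        }

        // 3. Slowest: the original source (database).
        var fromDb = await _repository.GetCategoryMenuJsonAsync();
        await _redis.StringSetAsync(key, fromDb, TimeSpan.FromMinutes(30));
        _memoryCache.Set(key, fromDb, TimeSpan.FromMinutes(30));
        return fromDb;
    }
}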

Let's see some code

I am using the StackExchange.Redis library here. One thing to keep in mind while using this library: make sure you create the connection multiplexer only once in the application life cycle.

RedisConnectionFactory

public class RedisConnectionFactory : IRedisConnectionFactory, IDisposable
{
    private readonly Lazy<IConnectionMultiplexer> _connection;

    // sample connection string. Make sure you put abortConnect=false. 
    // <your server>:<port>,KeepAlive=180,name=BookWorm.Web,Password=<your password>,abortConnect=false
    public RedisConnectionFactory(string connectionString)
    {
        _connection = new Lazy<IConnectionMultiplexer>(() => ConnectionMultiplexer.Connect(connectionString));
    }

    public IConnectionMultiplexer Get()
    {
        return _connection.Value;
    }

    public void Dispose()
    {
        // Only dispose the multiplexer if the lazy connection was actually created
        if (_connection.IsValueCreated)
        {
            _connection.Value.Dispose();
        }
    }
}
}
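If you use a DI container, registering the factory as a singleton is the simplest way to guarantee the single-multiplexer rule. The original post does not show its container setup, so this is just a sketch assuming Microsoft.Extensions.DependencyInjection; adapt it to whatever container you actually use.

// Register the factory once for the whole application so the
// ConnectionMultiplexer inside it is only ever created once.
// The connection string below is a placeholder.
services.AddSingleton<IRedisConnectionFactory>(
    _ => new RedisConnectionFactory("<your server>:<port>,abortConnect=false"));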

Here is the code for the web application. On application startup you subscribe to the Redis change events and raise an application event. The event handler then removes the cache entry, so the next request always gets fresh data and the cache is rebuilt from the updated data. This works across all instances of your application deployed on multiple servers.

SetupEventSubscriptionTask

public class SetupEventSubscriptionTask : IStartUpTask
{
    private readonly IRedisConnectionFactory _connectionFactory;
    private readonly IEventDispatcher _dispatcher;

    public SetupEventSubscriptionTask(IRedisConnectionFactory connectionFactory, IEventDispatcher dispatcher)
    {
        _connectionFactory = connectionFactory;
        _dispatcher = dispatcher;
    }

    public void Run()
    {
        var con = _connectionFactory.Get();
            
        var channel = new RedisChannel("event:book:changed:*", RedisChannel.PatternMode.Pattern);

        var subscriber = con.GetSubscriber();

        subscriber.Subscribe(channel,
                    (publishedChannel, value) =>
                    {
                        _dispatcher.Dispatch<CacheSourceChanged>(new CacheSourceChanged
                        {
                            Name = publishedChannel.ToString(), // name of the channel the event arrived on
                            Value = value // value that was published; in this case the id of the book
                        });
                    });
    }
}
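The CacheSourceChanged event dispatched above is just a small DTO. Its exact shape is not shown in the original code, but based on how it is used it is assumed to look roughly like this.

CacheSourceChanged (assumed shape)

public class CacheSourceChanged
{
    public string Name { get; set; }   // the Redis channel the event arrived on, e.g. "event:book:changed:<id>"
    public string Value { get; set; }  // the published value; in this example the id of the book
}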

BookChangedEventHandler

public class BookChangedEventHandler : IEventHandler<CacheSourceChanged>
{
    // InMemory cache wrapper instance
    private readonly ICacheStore _cacheStore;

    public BookChangedEventHandler(ICacheStore cacheStore)
    {
        _cacheStore = cacheStore;
    }

    public void Handle(CacheSourceChanged eEvent)
    {
        // Remove cached data from memory so the category menu is reloaded from the original source next time
        _cacheStore.Remove("categories:menu");
    }
}
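ICacheStore is a thin wrapper around the per-instance in-memory cache. Its implementation is not part of the original post; a minimal sketch built on System.Runtime.Caching.MemoryCache (the Get/Set signatures are assumptions) could look like this.

MemoryCacheStore (sketch)

public class MemoryCacheStore : ICacheStore
{
    private readonly MemoryCache _cache = MemoryCache.Default;

    public T Get<T>(string key) where T : class
    {
        // Returns null when the key is missing or expired
        return _cache.Get(key) as T;
    }

    public void Set<T>(string key, T value, TimeSpan expiration) where T : class
    {
        _cache.Set(key, value, DateTimeOffset.UtcNow.Add(expiration));
    }

    public void Remove(string key)
    {
        _cache.Remove(key);
    }
}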

Sample code for publishing the event

Here is some sample code that demonstrates how you can publish an event to Redis when a book is added from the admin site.

CreateBookCommandHandler

public class CreateBookCommandHandler : IAsyncCommandHandler<CreateBookCommand,CommandReply<Guid>>
{
	private readonly IBookRepository _repo;
	private readonly IMapper _mapper;
	private readonly IEventDispatcher _eventDispatcher;

	public CreateBookCommandHandler(IBookRepository repo, IMapper mapper, IEventDispatcher eventDispatcher)
	{
		_repo = repo;
		_eventDispatcher = eventDispatcher;
		_mapper = mapper;
	}

	public async Task<CommandReply<Guid>> HandleAsync(CreateBookCommand cmd)
	{
		var id = await _repo.InsertAsync(_mapper.Map<CreateBookInput>(cmd));
		
		await _eventDispatcher.DispatchAsync<BookCreatedEvent>(
			new BookCreatedEvent
			{
				BookId = id	
			}
		);
		
		return new CommandReply<Guid>{ Data = id };
	}
}

BookCreatedEventHandler

public class BookCreatedEventHandler : IAsyncEventHandler<BookCreatedEvent>
{
	private readonly IRedisConnectionFactory _redisConnectionFactory;
	
	public BookCreatedEventHandler(IRedisConnectionFactory redisConnectionFactory)
	{
		_redisConnectionFactory = redisConnectionFactory;
	}
	
	public Task HandleAsync(BookCreatedEvent eEvent)
	{
		var con = _redisConnectionFactory.Get();

		// Publish to a literal channel; patterns are only used on the subscriber side
		var channel = new RedisChannel($"event:book:changed:{eEvent.BookId}", RedisChannel.PatternMode.Literal);

		var subscriber = con.GetSubscriber();

		return subscriber.PublishAsync(channel, eEvent.BookId.ToString(), CommandFlags.FireAndForget);
	}
}

I hope the code samples make it clear how you can keep your in-memory cache data always fresh using the Redis pub/sub technique. Whenever an admin changes any book, the cached version of the categories menu is removed from every instance of your public-facing website. Redis pub/sub is not limited to this; you can use it for lots of other cool things, e.g. broadcasting updated data to all browsers over websockets.

Happy coding 🙂


