Cache Strategies with Best Practices

Note: Presentation times are in Coordinated Universal Time (UTC).

Wednesday, October 13, 2021 - 04:00-04:30

Tao Cai, LinkedIn


Problem statement
Retrieving remote data is expensive in many data-intensive, latency-sensitive online services. Caching is a common practice for speeding up such services, but building an efficient cache is challenging, and a bad cache implementation can make the system worse.

This is a technical deep-dive into a set of cache strategies, illustrated with real examples. We will describe cache item strategies, multiple cache TTL strategies, and cache warm-up strategies. We'll explain how they significantly improve system performance and increase availability while maintaining cache efficiency. We will also share our practices in adopting these strategies.
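The talk's own implementations are covered in the presentation; as a generic illustration of the kind of TTL strategy mentioned above, one common technique is adding random jitter to each entry's TTL so that entries loaded together do not all expire and refetch at the same instant. A minimal sketch (the class and parameter names here are illustrative assumptions, not LinkedIn's actual code):

```python
import random
import time

class JitterTTLCache:
    """Illustrative in-memory cache whose entries expire after a base
    TTL plus a random jitter, spreading out expirations so a burst of
    keys cached together does not trigger a synchronized refetch."""

    def __init__(self, base_ttl=60.0, jitter=15.0):
        self.base_ttl = base_ttl  # seconds an entry normally lives
        self.jitter = jitter      # extra random lifetime in [0, jitter)
        self._store = {}          # key -> (value, expires_at)

    def set(self, key, value):
        ttl = self.base_ttl + random.uniform(0, self.jitter)
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key, loader):
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]       # fresh hit: serve from cache
        value = loader(key)       # miss or expired: fetch from the remote source
        self.set(key, value)
        return value
```

For example, with `base_ttl=60` and `jitter=15`, a thousand keys cached in the same second expire spread across a 15-second window instead of all at once.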

Tao Cai, LinkedIn

Tao Cai is a Staff Software Engineer on the Ads Serving Infra team at LinkedIn and a former Site Reliability Engineer on the Ads team. He focuses on improving the ads system's scalability, reliability, and latency.

SREcon21 Open Access Sponsored by Indeed

@conference{276727,
  author = {Tao Cai},
  title = {Cache Strategies with Best Practices},
  year = {2021},
  publisher = {USENIX Association},
  month = oct,
}
