Store frequently used data in fast memory
Caching stores frequently accessed data in fast storage (memory) instead of slow storage (database/disk), like keeping common items on your desk instead of in the filing cabinet. Every time you fetch a user profile from the database, it takes ~50ms. With caching, the first request takes 50ms and the next 1,000 requests take <1ms each: a huge performance win. Common tools: Redis, Memcached, CDN caching. The hard part is cache invalidation (keeping cached data fresh).
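A minimal cache-aside sketch in Python, assuming a plain in-memory dict stands in for the cache (Redis or Memcached would play the same role) and a hypothetical fetch_profile_from_db represents the slow ~50ms database call:

```python
import time

_cache = {}  # stand-in for Redis/Memcached

def fetch_profile_from_db(user_id):
    # Hypothetical slow path: simulate the ~50ms database round trip.
    time.sleep(0.05)
    return {"id": user_id, "name": f"user-{user_id}"}

def get_profile(user_id):
    # Cache-aside: check the cache first, fall back to the database on a miss.
    if user_id in _cache:
        return _cache[user_id]                 # hit: sub-millisecond
    profile = fetch_profile_from_db(user_id)   # miss: ~50ms
    _cache[user_id] = profile                  # store for the next request
    return profile
```

The first call pays the database cost; every later call for the same user_id is served from memory.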
Cache when you have expensive operations (database queries, API calls, computations) that are called frequently with the same inputs. Great for user sessions, product catalogs, homepage data, and API responses. Don't cache real-time data (stock prices), user-specific data that changes often, or data where staleness is unacceptable. Start caching when pages load slowly despite database optimization.
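One common way to bound staleness is a time-to-live (TTL) plus explicit invalidation on writes. A rough sketch under the same in-memory-dict assumption (in Redis, SETEX and DEL serve the same roles); the function names here are illustrative, not from a specific library:

```python
import time

_cache = {}  # key -> (value, stored_at)

def cache_get(key, ttl_seconds=60):
    entry = _cache.get(key)
    if entry is None:
        return None
    value, stored_at = entry
    if time.time() - stored_at > ttl_seconds:
        del _cache[key]   # expired: treat as a miss so callers refetch fresh data
        return None
    return value

def cache_set(key, value):
    _cache[key] = (value, time.time())

def invalidate(key):
    # Call this when the underlying data changes (e.g. after an UPDATE).
    _cache.pop(key, None)
```

A short TTL limits how stale data can get even if you forget an invalidation path; explicit invalidation on writes keeps hot keys fresh without waiting for expiry.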