In applications such as customer relationship management (CRM), many systems and users reference the same commonly used data. Because the database is a shared resource and the same data is read repeatedly, this creates contention. Accessing data from the database tier is also slower than accessing it in the application server tier, and the problem is aggravated when the application or its customizations are not implemented to minimize database calls. The database becomes a bottleneck under the flood of requests. The result is user frustration, costly hardware additions, and a business that slows down and misses its targets.
Anil Jacob walks you through how to protect shared resources such as databases and other information stores, which perform complex work, are costly, and are critical to application health. Protecting them enables applications to scale with increasing demand, reduces dependency on hardware and the costs that come with it, and keeps businesses and users happy.
When data doesn’t change often but is read frequently between changes, an application cache can serve it in place of the database. Whenever data is created or updated in the database, the change can trigger a write to the cache. The application code uses the cache as its data source, retrieving cached data efficiently by key. The cache acts as a layer in front of the database, shielding it from the huge number of user and application requests.
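As a rough illustration of this pattern, the sketch below shows a write-through cache layer in front of a database, keyed by record ID. It is a minimal example only, assuming an in-memory map as the cache; the class names (CaseCache, CaseDatabase, CaseRecord) are illustrative placeholders, not Salesforce's implementation.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal write-through cache in front of a database.
// CaseRecord, CaseDatabase, and the method names are illustrative placeholders.
public class CaseCache {
    private final Map<String, CaseRecord> cache = new ConcurrentHashMap<>();
    private final CaseDatabase db;

    public CaseCache(CaseDatabase db) {
        this.db = db;
    }

    // Reads are served from the cache; the database is only hit on a miss.
    public CaseRecord get(String caseId) {
        return cache.computeIfAbsent(caseId, db::load);
    }

    // Writes go to the database first, then refresh the cache (write-through),
    // so subsequent reads see the latest value without hitting the database.
    public void update(CaseRecord record) {
        db.save(record);
        cache.put(record.id(), record);
    }
}

// Placeholder types so the sketch is self-contained; a real system would use its own models.
record CaseRecord(String id, String status) {}

interface CaseDatabase {
    CaseRecord load(String caseId);
    void save(CaseRecord record);
}
```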
Caching needs to be selective. Data that’s updated by more than one thread can run into race conditions, so the application should be tolerant of such situations. Cache invalidation also needs to be taken into account. The application can handle a cache miss by falling back to the database, and the retrieved data is then cached for subsequent requests.
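One simple way to combine invalidation with the cache-miss fallback described above is a time-to-live (TTL) policy: entries expire after a fixed duration, and the next read reloads them from the database. The sketch below is an assumption-laden illustration of that idea, not the approach presented in the session; the generic TtlCache type and its loader function are hypothetical.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Cache-aside with a simple time-to-live (TTL): entries expire after a fixed
// duration, after which the next read falls back to the loader (the database)
// and re-populates the cache. Names and the TTL policy are illustrative.
public class TtlCache<K, V> {
    private record Entry<V>(V value, Instant expiresAt) {}

    private final Map<K, Entry<V>> entries = new ConcurrentHashMap<>();
    private final Function<K, V> loader;   // e.g. a database read
    private final Duration ttl;

    public TtlCache(Function<K, V> loader, Duration ttl) {
        this.loader = loader;
        this.ttl = ttl;
    }

    public V get(K key) {
        Entry<V> entry = entries.compute(key, (k, existing) -> {
            // Reuse the cached value while it is fresh; otherwise reload from the database.
            if (existing != null && Instant.now().isBefore(existing.expiresAt())) {
                return existing;
            }
            return new Entry<>(loader.apply(k), Instant.now().plus(ttl));
        });
        return entry.value();
    }

    // Explicit invalidation for writers that know the data has changed.
    public void invalidate(K key) {
        entries.remove(key);
    }
}
```

Using compute rather than a separate check-then-load keeps the expiry check and the reload atomic per key, which reduces (though does not eliminate) the race-condition concerns noted above when multiple threads update the same entry.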
The concept of an application-layer cache is not new; however, selecting which contents to place in the application (or platform) cache is a trade-off between performance efficiency and data consistency. Salesforce has custom applications, developed by its customers and partners, that check for the latest case or chat data at high frequency to provide faster customer service. By implementing an application-layer cache, Salesforce has reduced the impact of these use cases on the database by ~5 times.
Application caching has traditionally been used to cache static data and improve response times for the end user. Scale is an increasing challenge as applications request data from shared resources ever more frequently. Implementing a caching layer in front of critical resources helps applications scale better, reduces cost, and improves user experience.
This session is sponsored by Salesforce.
Anil Jacob is a lead software engineer on the frontier scale team at Salesforce, where he works on large and complex customer implementations and related scale challenges. Previously, he was at Intuit, BEA WebLogic, and Wells Fargo. Anil’s interests are application scale, user experience, UX performance, and application development.