Caching – pronounced ‘cashing’ – is the process of storing data in a cache: a temporary storage area that provides faster access to data, with the goal of improving application and system performance.
A cache is a small amount of faster, more expensive memory used to speed up access to recently or frequently used data. Cached data is stored temporarily in accessible storage media that is local to the cache client and separate from the main storage. Caches are commonly used by the central processing unit (CPU), applications, web browsers and operating systems. Caching exists because bulk or main storage cannot keep up with the demands of clients: a cache decreases data access times, reduces latency and improves input/output (I/O). Because almost all application workloads depend on I/O operations, caching improves application performance.
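The cache-in-front-of-main-storage pattern described above can be sketched in a few lines. This is a minimal illustration, not a production cache: the `SimpleCache` class, the `slow_square` backend, and the hit/miss counters are all hypothetical names invented for this example, standing in for a fast local store fronting a slower backend such as disk or a remote service.

```python
class SimpleCache:
    """A minimal in-memory cache: a small, fast store in front of a slower backend."""

    def __init__(self, backend):
        self._backend = backend   # slower lookup, standing in for bulk/main storage
        self._store = {}          # the cache itself: local, fast, temporary
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._store:
            # Cache hit: served locally, without touching main storage.
            self.hits += 1
            return self._store[key]
        # Cache miss: fetch from the backend, then keep a local copy for next time.
        self.misses += 1
        value = self._backend(key)
        self._store[key] = value
        return value


# Hypothetical "expensive" backend (imagine disk I/O or a heavy computation).
def slow_square(n):
    return n * n

cache = SimpleCache(slow_square)
first = cache.get(4)    # miss: computed by the backend and stored
second = cache.get(4)   # hit: answered directly from the cache
```

After the second lookup, the cache has recorded one miss and one hit; every further request for the same key is served at memory speed, which is exactly the latency and I/O reduction described above.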