How does cache memory affect the system?
In computer science, a cache (pronounced "kash") is a collection of data duplicating original values stored elsewhere or computed earlier, where the original data are expensive (usually in terms of access time) to fetch or compute relative to reading the cache. Once the data are stored in the cache, future accesses can use the cached copy rather than refetching or recomputing the original data, so the average access time is lower.
Caches have proven extremely effective in many areas of computing because access patterns in typical computer applications exhibit locality of reference. There are several kinds of locality; the two most important are temporal locality (the same data are accessed repeatedly within a short span of time) and spatial locality (data stored near each other tend to be accessed close together in time).
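To make the effect of locality concrete, here is a minimal sketch of a direct-mapped cache simulator. The block size, cache size, and the two access patterns are illustrative assumptions, not a model of any real CPU; the point is only that an access pattern with spatial locality hits the cache far more often than one with a large stride.

```python
# Minimal direct-mapped cache simulator (illustrative assumptions:
# 16-byte blocks, 64 lines; not modeled on any real hardware).
BLOCK_SIZE = 16   # bytes per cache block
NUM_LINES = 64    # number of lines in the cache

def hit_rate(addresses):
    """Return the fraction of accesses that hit the simulated cache."""
    tags = [None] * NUM_LINES
    hits = 0
    for addr in addresses:
        block = addr // BLOCK_SIZE     # which memory block this byte is in
        line = block % NUM_LINES       # direct mapping: block -> one line
        tag = block // NUM_LINES       # tag distinguishes blocks sharing a line
        if tags[line] == tag:
            hits += 1                  # data already cached: a hit
        else:
            tags[line] = tag           # miss: fetch the block into the cache
    return hits / len(addresses)

sequential = list(range(4096))             # neighbouring bytes: spatial locality
strided = [i * 1024 for i in range(4096)]  # 1 KiB stride: poor locality

print(round(hit_rate(sequential), 3))  # most accesses reuse a fetched block
print(round(hit_rate(strided), 3))     # every access misses
```

With sequential byte addresses, each 16-byte block is fetched once and then reused for the next 15 accesses, so the hit rate approaches 15/16; the strided pattern touches a new block on every access and never hits.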
A CPU cache is a cache used by the central processing unit of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. As long as most memory accesses are to cached memory locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
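The claim that average latency stays close to the cache latency can be quantified with the standard average memory access time (AMAT) formula: AMAT = hit time + miss rate × miss penalty. The latency figures below are illustrative assumptions, not measurements of real hardware.

```python
# Average memory access time: hit_time + miss_rate * miss_penalty.
# The 1 ns cache and 100 ns main-memory latencies are assumed for illustration.
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    return hit_time_ns + miss_rate * miss_penalty_ns

# A low miss rate keeps the average close to the cache latency;
# a high miss rate drags it toward main-memory latency.
print(amat(1.0, 0.05, 100.0))  # 6.0 ns average
print(amat(1.0, 0.50, 100.0))  # 51.0 ns average
```

This is why cache effectiveness hinges on the miss rate: even with main memory 100 times slower than the cache, a 5% miss rate keeps the average access only a few times the cache latency.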