Tuesday, September 7, 2010

What is cache memory?

   A cache is a memory device that improves the performance of the processor by transparently storing data so that future requests for that data can be served faster. The data stored in a cache might be values that have been computed earlier, or duplicates of original values that are stored elsewhere.

   An access to the cache results in either a cache hit or a cache miss. A cache hit means the requested data is present in the cache; a cache miss means it is not found there. On a hit, the processor takes the data directly from the cache for processing. On a miss, the data is fetched from its original memory location. Cache memories are volatile and small in storage size. Because the storage size is small, address decoding takes less time, and hence caches are faster than the normal physical memories (RAMs) in computers.

  As I said, the data is stored transparently in the cache. This means that the user requesting the data need not know whether it is stored in the cache or in system memory; the processor handles that. The word cache comes from the French word for "to conceal".

A simple cache entry contains three fields:
1) An index, which is local to the cache.
2) A tag, which relates the index back to main memory. This lets the processor know the location in main memory where an exact copy of the data is stored.
3) The data, which is the actual data needed by the processor.

   When the processor needs data from memory, it first checks the cache. It looks at the tag fields in the cache to see whether the requested data is already there. If a matching tag is found, the corresponding data is taken from the cache. Otherwise a cache miss is signalled and main memory is accessed. The cache is then updated with this recent memory access; this is called a cache update on a cache miss.
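
Below is a minimal sketch in C of the lookup just described, assuming a direct-mapped cache of 256 single-word lines and 32-bit addresses. The names, sizes and the main_memory_read() helper are illustrative assumptions, not taken from any real processor.

#include <stdint.h>
#include <stdbool.h>

#define NUM_LINES 256

struct cache_line {
    bool     valid;   /* does this line hold a copy of main memory data? */
    uint32_t tag;     /* upper address bits: where the copy lives in main memory */
    uint32_t data;    /* the cached data word itself */
};

static struct cache_line cache[NUM_LINES];

/* Stand-in for the (much slower) main memory access. */
extern uint32_t main_memory_read(uint32_t address);

uint32_t cache_read(uint32_t address)
{
    uint32_t word  = address / sizeof(uint32_t);
    uint32_t index = word % NUM_LINES;   /* index: local to the cache */
    uint32_t tag   = word / NUM_LINES;   /* tag: rest of the address  */
    struct cache_line *line = &cache[index];

    if (line->valid && line->tag == tag)
        return line->data;               /* cache hit: serve from the cache */

    /* Cache miss: fetch from main memory, then update the cache
       with this recent access ("cache update on cache miss"). */
    uint32_t value = main_memory_read(address);
    line->valid = true;
    line->tag   = tag;
    line->data  = value;
    return value;
}

Note how the three fields from the list above (index, tag, data) appear directly in the structure and in the lookup.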

   During a cache update, if the cache is full, an existing entry has to be evicted. Which entry to evict is decided by a cache replacement algorithm. Some algorithms are:
1) LRU - the least recently used entry is replaced (a small sketch follows this list).
2) MRU - the most recently used entry is replaced.
3) Random replacement - simple; used in ARM processors.
4) Belady's algorithm - discards the entry that will not be needed for the longest time in the future. It is not perfectly implementable in practice, since it requires knowing future accesses.
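
Here is a small sketch in C of LRU replacement for a single fully associative set of 4 ways. The timestamp approach and all names here are illustrative assumptions; real hardware usually approximates LRU with cheaper bookkeeping.

#include <stdint.h>
#include <stdbool.h>

#define NUM_WAYS 4

struct way {
    bool     valid;
    uint32_t tag;
    uint32_t data;
    uint64_t last_used;            /* counter value of the most recent access */
};

static struct way set[NUM_WAYS];
static uint64_t access_counter;    /* incremented on every cache access */

/* Record that way i was just used (on a hit or a fill). */
void touch_way(int i)
{
    set[i].last_used = ++access_counter;
}

/* Pick the way to evict: a free (invalid) way if one exists,
   otherwise the way with the oldest last_used value. */
int choose_victim_lru(void)
{
    int victim = 0;
    for (int i = 0; i < NUM_WAYS; i++) {
        if (!set[i].valid)
            return i;                                /* free slot, no eviction needed */
        if (set[i].last_used < set[victim].last_used)
            victim = i;                              /* older than current candidate */
    }
    return victim;
}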

The average memory access time of a cache-enabled system can be calculated from the hit and miss ratios of the cache.
Average memory access time = (Time_cache * Hit_ratio) + ((Time_cache + Time_mm) * Miss_ratio)
where,
Time_cache and Time_mm are the times needed to access a location in the cache and in main memory respectively, and Hit_ratio and Miss_ratio are the hit and miss probabilities.
For an example on this, see the 7th problem on this post.
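As a quick illustration with assumed numbers: if Time_cache = 10 ns, Time_mm = 100 ns and the hit ratio is 0.9 (so the miss ratio is 0.1), then the average memory access time = (10 * 0.9) + ((10 + 100) * 0.1) = 9 + 11 = 20 ns, much closer to the cache speed than to the main memory speed.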

Note: In the next post I will add more details on caches. Subscribe to the blog feeds for regular updates.
