
Cache Locality
Cache locality is a principle in computing describing how the data a program uses tends to be accessed repeatedly or at nearby memory locations. It takes two main forms: temporal locality (recently accessed data is likely to be reused soon) and spatial locality (data stored near recently accessed data is likely to be used next). By keeping such data in fast storage close to the processor (the cache), a system can serve most accesses without the longer trip to slower main memory, improving speed and efficiency. Programs that arrange their data access patterns to exploit locality therefore tend to run noticeably faster.
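
As a rough illustration of spatial locality, consider the C sketch below (the matrix size N, the function names, and the row-major layout assumption are illustrative, not taken from the text above). Summing a large 2D array row by row walks through consecutive addresses, so each cache line fetched from memory is fully used; summing it column by column jumps far apart on every step and wastes most of each fetched line.

#include <stdio.h>
#include <stdlib.h>

#define N 4096  /* hypothetical size, chosen so the matrix exceeds typical cache capacity */

/* Row-by-row traversal: consecutive iterations touch adjacent addresses
 * (C arrays are row-major), so each cache line is fully used before it
 * is evicted -- good spatial locality. */
static long long sum_row_major(const int *m)
{
    long long total = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            total += m[i * N + j];
    return total;
}

/* Column-by-column traversal: consecutive iterations are N ints apart,
 * so most of each fetched cache line goes unused -- poor spatial
 * locality, and typically much slower on real hardware. */
static long long sum_col_major(const int *m)
{
    long long total = 0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            total += m[i * N + j];
    return total;
}

int main(void)
{
    int *m = malloc((size_t)N * N * sizeof *m);
    if (!m)
        return 1;
    for (size_t k = 0; k < (size_t)N * N; k++)
        m[k] = (int)(k & 0xFF);

    printf("row-major sum:    %lld\n", sum_row_major(m));
    printf("column-major sum: %lld\n", sum_col_major(m));

    free(m);
    return 0;
}

Both functions compute the same sum; timing them (for example with clock() or an external timer) would typically show the row-major version running several times faster, though the exact gap depends on the cache sizes and memory system of the machine.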