

opt_lru_cache_perf_poc's Introduction

Cache Replacement Algorithm Analysis

This repository contains implementations and comparisons of different cache replacement algorithms, including Least Recently Used (LRU), First In First Out (FIFO), and Enhanced Not Recently Used (ENRU).

Motivating Article

Gawanmeh, A., Abed, S., AlAwadh, S. A. A., Mansoor, W., Atalla, S., Alomari, A., & Yocam, E. (2023). Enhanced Not Recently Used Algorithm for Cache Memory Systems in Mobile Computing. Proceedings of the 2023 International Conference on Advances in Computing Research (ACR’23), 433–442. https://doi.org/10.1007/978-3-031-33743-7_35

Algorithm Types

Least Recently Used (LRU)

The Least Recently Used (LRU) algorithm is a popular cache replacement algorithm that keeps recently accessed items in the cache. It works by evicting the least recently used item when the cache reaches its capacity. LRU is widely used in various caching systems due to its simplicity and effectiveness in many scenarios.
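
For illustration, here is a minimal LRU sketch in Python (the repository's notebooks are Python-based). This is not the repository's exact LRUCache; the get/put interface mirrors the one described in the Usage section below, and the -1 return value on a miss is an assumption.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: evicts the least recently used key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key):
        if key not in self.store:
            return -1                        # assumed miss value
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict the least recently used entry
```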

First In First Out (FIFO)

The First In First Out (FIFO) algorithm is a simple cache replacement algorithm that evicts items in the order they were inserted into the cache. FIFO does not consider the access patterns of the items and can potentially evict frequently accessed items. It is used in situations where access patterns are unknown or unpredictable.
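
A comparable FIFO sketch, again illustrative rather than the repository's exact FIFOCache: eviction follows insertion order, and lookups never reorder entries.

```python
from collections import deque

class FIFOCache:
    """Minimal FIFO cache sketch: evicts the oldest-inserted key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.order = deque()  # insertion order, oldest key on the left

    def get(self, key):
        return self.store.get(key, -1)         # lookups do not affect eviction order

    def put(self, key, value):
        if key not in self.store:
            if len(self.store) >= self.capacity:
                oldest = self.order.popleft()  # evict the first-inserted key
                del self.store[oldest]
            self.order.append(key)
        self.store[key] = value
```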

Enhanced Not Recently Used (ENRU)

The Enhanced Not Recently Used (ENRU) algorithm is a customized cache replacement algorithm that combines the benefits of LRU and FIFO. It divides the cache into three layers: top, middle, and bottom. The algorithm works as follows:

  1. New items are inserted into the first location of the bottom layer.
  2. When the cache reaches its capacity, items are replaced from the top layer according to the reference-bit/dirty-bit priority described below.
  3. Items are moved between layers based on their access patterns.

ENRU associates two bits with each item in the top layer: a reference bit (R) and a dirty bit (D). Replacement priority is determined by the combination of these bits, from first evicted to last: not referenced and not dirty, not referenced and dirty, referenced and not dirty, and referenced and dirty.
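
The sketch below illustrates an ENRU-style cache built only from the description above. The layer sizes, the one-layer-up promotion rule, the demotion from an overfull top layer, and the fallback eviction when the top layer is empty are all assumptions; the paper's algorithm and the repository's ENRUCache may differ in these details.

```python
class ENRUCache:
    """ENRU-style sketch: three layers (top, middle, bottom) plus a reference
    bit (R) and a dirty bit (D) used to pick replacement victims in the top layer."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.layer_size = max(1, capacity // 3)  # assumed top-layer size (~1/3 of capacity)
        self.top, self.middle, self.bottom = [], [], []
        self.data = {}   # key -> value
        self.bits = {}   # key -> [R, D]

    def _priority(self, key):
        # Eviction order: (not R, not D) < (not R, D) < (R, not D) < (R, D)
        r, d = self.bits[key]
        return (r, d)

    def _evict_one(self):
        # Replace from the top layer when possible; otherwise (an assumption)
        # fall back to the oldest entry of the deepest non-empty layer.
        if self.top:
            victim = min(self.top, key=self._priority)
            self.top.remove(victim)
        else:
            layer = self.middle if self.middle else self.bottom
            victim = layer.pop()
        del self.data[victim]
        del self.bits[victim]

    def _promote(self, key):
        # Assumed rule: an accessed item moves up one layer; an overfull top
        # layer demotes its lowest-priority item back to the middle layer.
        if key in self.bottom:
            self.bottom.remove(key)
            self.middle.insert(0, key)
        elif key in self.middle:
            self.middle.remove(key)
            self.top.insert(0, key)
            if len(self.top) > self.layer_size:
                demoted = min(self.top, key=self._priority)
                self.top.remove(demoted)
                self.middle.insert(0, demoted)

    def get(self, key):
        if key not in self.data:
            return -1                      # assumed miss value
        self.bits[key][0] = True           # set the reference bit
        self._promote(key)
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data[key] = value
            self.bits[key][1] = True       # set the dirty bit on modification
            self._promote(key)
            return
        if len(self.data) >= self.capacity:
            self._evict_one()              # replacement starts from the top layer
        self.bottom.insert(0, key)         # new items enter the first slot of the bottom layer
        self.data[key] = value
        self.bits[key] = [False, False]
```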

Performance Comparison

The performance of each algorithm is evaluated using different access patterns: default and conditional. The default access pattern represents a scenario with a low skew factor, while the conditional access pattern represents a scenario with a high skew factor.

| Algorithm                         | Performance (Default) | Performance (Conditional) |
|-----------------------------------|-----------------------|---------------------------|
| Least Recently Used (LRU)         | 0.682                 | 0.975                     |
| First In First Out (FIFO)         | 0.511                 | 0.487                     |
| Enhanced Not Recently Used (ENRU) | 0.639                 | 0.992                     |

The results show that ENRU achieves the highest hit ratio (0.992) under the conditional (high-skew) access pattern with limited cache capacity, outperforming both LRU and FIFO, while LRU remains slightly ahead under the default (low-skew) pattern. ENRU gains its advantage by adapting to the access pattern and prioritizing replacement of items based on their reference and dirty bits.

Usage

To use the cache replacement algorithms in your project, you can simply import the corresponding classes (LRUCache, FIFOCache, ENRUCache) and create instances of the desired cache with a specified capacity. The get and put methods can be used to interact with the cache.
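
A hypothetical usage example, shown here against the LRUCache sketch above; replace the class with the repository's own implementation as needed (the commented import uses a placeholder module name, and the -1 miss value is an assumption):

```python
# from cache_algorithms import LRUCache, FIFOCache, ENRUCache  # placeholder module name

cache = LRUCache(capacity=3)

cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)

print(cache.get("a"))   # hit: returns 1 and marks "a" as most recently used
cache.put("d", 4)       # over capacity: the least recently used key ("b") is evicted
print(cache.get("b"))   # miss: returns the cache's miss value (assumed -1 here)
```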

Conclusion

The Enhanced Not Recently Used (ENRU) algorithm provides a powerful cache replacement strategy that combines the strengths of LRU and FIFO. By dividing the cache into layers and using reference and dirty bits, ENRU optimizes cache performance in scenarios with conditional access patterns and limited cache capacity. It outperforms traditional algorithms like LRU and FIFO in such scenarios, making it a valuable choice for caching systems with specific requirements.

Feel free to explore the code and adapt it to your needs. Contributions and feedback are welcome!

Disclaimer: This repository is intended for educational and research purposes.
