public class LRUCache<K,V>
A 2-level LRU cache implementation.
- Both levels are implemented as global LRU lists; all cache instances share these global lists.
- The specified percentage defines the ratio of level-2 entries to the total number of entries.
- Each new entry is placed into level 1.
- If a lookup succeeds and the entry is in level 1, it is promoted to level 2.
- If level 2 exceeds its allowed count, the least recently used level-2 entry is downgraded to level 1.
- All entries in level 1 are soft references and can be cleared by the GC. This reduces the total count; further usage will re-establish the specified percentage.
- All entries in level 2 are hard references, so frequently used entries are rarely discarded.
- There is no configuration other than the ratio for level 2.
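The promotion and demotion rules above can be sketched as follows. This is a minimal illustration only, not the actual implementation: it uses per-instance rather than global LRU lists, and the class name, constructor signature, and `level2Ratio` parameter are all assumptions for the example.

```java
import java.lang.ref.SoftReference;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a two-level LRU: level 1 holds soft references (the GC may
// clear them), level 2 holds hard references. Names are illustrative.
public class TwoLevelLruSketch<K, V> {
    private final int capacity;
    private final int level2Max;

    // accessOrder=true makes LinkedHashMap iterate least-recently-used first.
    private final LinkedHashMap<K, SoftReference<V>> level1 =
            new LinkedHashMap<>(16, 0.75f, true);
    private final LinkedHashMap<K, V> level2 =
            new LinkedHashMap<>(16, 0.75f, true);

    public TwoLevelLruSketch(int capacity, double level2Ratio) {
        this.capacity = capacity;
        this.level2Max = (int) (capacity * level2Ratio);
    }

    public synchronized void put(K key, V value) {
        // Each new entry starts in level 1, as a soft reference.
        level2.remove(key);
        level1.put(key, new SoftReference<>(value));
        evictLevel1();
    }

    public synchronized V get(K key) {
        V value = level2.get(key);
        if (value != null) {
            return value; // already in level 2; access order is updated
        }
        SoftReference<V> ref = level1.remove(key);
        if (ref == null) {
            return null; // never cached, or evicted
        }
        value = ref.get();
        if (value == null) {
            return null; // the GC cleared the soft reference
        }
        // Successful level-1 lookup: promote the entry to level 2.
        level2.put(key, value);
        if (level2.size() > level2Max) {
            // Level 2 over its allowed count: demote its LRU entry to level 1.
            Map.Entry<K, V> eldest = level2.entrySet().iterator().next();
            level2.remove(eldest.getKey());
            level1.put(eldest.getKey(), new SoftReference<>(eldest.getValue()));
            evictLevel1();
        }
        return value;
    }

    private void evictLevel1() {
        while (level1.size() + level2.size() > capacity && !level1.isEmpty()) {
            K eldest = level1.keySet().iterator().next();
            level1.remove(eldest);
        }
    }
}
```

Using `LinkedHashMap` in access order keeps both the LRU bookkeeping and the promotion logic short; the real class differs in that its lists are shared globally across instances.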
So the contract is: if you put something into the cache, it may still be there when you look for it - or it may not. This is a cache, not a HashMap!
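In practice this contract means every lookup must be prepared for a miss, even for a key that was just stored. The demo below illustrates the pattern with a plain `SoftReference` map standing in for the cache; the class and method names are made up for the example and are not the real API.

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Demonstrates the cache contract: a get() may return null even after a
// put(), so callers always fall back to recomputing the value.
public class CacheContractDemo {
    private static final Map<String, SoftReference<String>> cache = new HashMap<>();

    static String lookup(String key) {
        SoftReference<String> ref = cache.get(key);
        // null if never cached, evicted, or cleared by the GC
        return ref == null ? null : ref.get();
    }

    static String loadOrCompute(String key) {
        String value = lookup(key);
        if (value == null) {                  // treat every miss as recoverable
            value = "computed:" + key;        // stand-in for the expensive load
            cache.put(key, new SoftReference<>(value));
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(loadOrCompute("page-42"));
        System.out.println(loadOrCompute("page-42"));
    }
}
```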
Nested Class Summary
Nested classes/interfaces inherited from interface java.util.Map
Provides a snapshot of the keys used in the cache. The HashSet is a snapshot that was valid at some point, but may not be accurate now. There is no guarantee that any value is still present in the cache, or that nothing else has been added since.