Design and implement a data structure for a cache.
- get(key) - Return the value of the key if the key exists in the cache, otherwise return -1.
- put(key, value, weight) - Set or insert the value. When the cache reaches its capacity, it should invalidate the least-scored key. The score is calculated as weight / ln(current_time - last_access_time).
Your data structure should be optimized for the computational complexity of get(key), i.e. the average-case complexity of get(key) should be O(1).
In your (pseudo-)code, you may assume common data structures such as arrays, various kinds of lists, and hash tables are available.
Please explain the computational complexity of get(key) and put(...) in Big-O notation.
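Since the score formula drives every eviction decision, a small sketch helps pin down its edge cases. This is a minimal Python sketch; the guard for elapsed times of at most one unit (where ln is zero or negative) is an assumption, since the problem statement leaves that case unspecified.

```python
import math

def score(weight, last_access_time, current_time):
    """Eviction score: weight / ln(elapsed time since last access)."""
    elapsed = current_time - last_access_time
    if elapsed <= 1:
        # ln(elapsed) would be <= 0 or undefined here; treating a
        # just-accessed item as maximally valuable is an assumption.
        return float('inf')
    return weight / math.log(elapsed)
```

Note that heavier items and recently touched items both score higher, so the lowest-scored key is the lightest, stalest one.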
__hashMap
key | record (weight, value, lastAccess)
---|---
key1 | weight1, value1, lastAccess1
key2 | weight2, value2, lastAccess2
key3 | weight3, value3, lastAccess3
... | ...
When get(), retrieve the record using the key and update its lastAccess. Time complexity: O(1) on average.
When put(), add the record first. If the capacity is exceeded, remove the record whose key is obtained by sorting the __itemArray below.
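The get() path above can be sketched as follows. This is a hedged illustration: cache_map is an illustrative stand-in for __hashMap, with each record stored as (weight, value, lastAccess).

```python
import time

def get(cache_map, key):
    """Return the cached value for key, or -1 if absent; refreshes lastAccess."""
    entry = cache_map.get(key)  # O(1) average hash lookup
    if entry is None:
        return -1
    weight, value, _ = entry
    cache_map[key] = (weight, value, time.time())  # refresh last access time
    return value
```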
__itemArray
0 | 1 | 2 | ...
---|---|---|---
key1 | key2 | key3 | ...
When get(), nothing needs to be done with the array.
When put(), append the new key first. If the capacity is exceeded:
- Sort the array in descending order of score (looking up weight and lastAccess from __hashMap).
- Drop the last item (the lowest-scored key) from the array.
- Use the dropped key to remove the corresponding record from __hashMap.
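The put() steps above can be sketched as one function. As before, cache_map and item_array are illustrative stand-ins for __hashMap and __itemArray, and the handling of elapsed times of at most one unit (where ln is non-positive) is an assumption.

```python
import math
import time

def put(cache_map, item_array, capacity, key, value, weight):
    """Insert or update a record, evicting the lowest-scored key when over capacity."""
    now = time.time()
    if key not in cache_map:
        item_array.append(key)
    cache_map[key] = [weight, value, now]  # record: weight, value, lastAccess
    if len(item_array) > capacity:
        def score(k):
            w, _, last = cache_map[k]
            elapsed = now - last
            # assumed guard: ln(elapsed) <= 0 for elapsed <= 1
            return float('inf') if elapsed <= 1 else w / math.log(elapsed)
        item_array.sort(key=score, reverse=True)  # O(n log n)
        victim = item_array.pop()                 # lowest-scored key
        del cache_map[victim]
```

Re-sorting at eviction time is what makes put O(n log n); a heap keyed on score would not help directly, because scores change continuously with time.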
get(key): O(1) average - a single hash-table lookup plus a timestamp update.
put(...): O(n log n) - dominated by sorting the n cached keys when an eviction is needed; O(1) average otherwise.
If interested, please see the Python implementation in this README.