• lfu-cache (requires O(1) operations, so it is quite hard)


    https://leetcode.com/problems/lfu-cache/

    Quite hard; I followed the reference below:

    https://discuss.leetcode.com/topic/69137/java-o-1-accept-solution-using-hashmap-doublelinkedlist-and-linkedhashset

    The core idea is described in this comment from that thread:

    Your idea is brilliant... Especially storing all keys with the same count in one node:
    if one of the keys in that node gets hit once more, it is moved into the (count+1) node
    if that node exists, or it is wrapped into a newly created (count+1) node. All your
    operations are guaranteed O(1), no doubt. There is no way to complete it bug-free within
    half an hour, so in a real interview I might as well explain the idea and how we should
    implement each operation in each scenario, instead of actually trying to complete the
    whole program... Anyway, thank you so much for your time and explanation.

    Also note that the solution relies on a property of LinkedHashSet: although it is a Set, iteration follows insertion order, which is what lets the head bucket break frequency ties in LRU order (a short sketch follows).
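
    A quick standalone sketch of that property (the class name LinkedHashSetOrderDemo is only for illustration, not part of the solution): iteration follows insertion order, re-inserting an existing element does not move it, and the first element returned by the iterator is the oldest one, which is exactly how the head bucket's LRU key is picked below.

    import java.util.LinkedHashSet;

    public class LinkedHashSetOrderDemo {
        public static void main(String[] args) {
            LinkedHashSet<Integer> keys = new LinkedHashSet<>();
            keys.add(3);
            keys.add(1);
            keys.add(2);
            // Iteration follows insertion order, not natural ordering
            System.out.println(keys);                   // [3, 1, 2]
            // The oldest key is simply the first one the iterator returns
            System.out.println(keys.iterator().next()); // 3
            // Re-adding an existing element does NOT move it to the end
            keys.add(3);
            System.out.println(keys);                   // still [3, 1, 2]
            // To refresh an element's position, remove it and add it again
            keys.remove(3);
            keys.add(3);
            System.out.println(keys);                   // [1, 2, 3]
        }
    }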

    import java.util.HashMap;
    import java.util.LinkedHashSet;

    public class LFUCache {
        // Head of a doubly linked list of frequency buckets, ordered by increasing count
        private Node head = null;
        private int cap = 0;
        // key -> value
        private HashMap<Integer, Integer> valueHash = null;
        // key -> the frequency bucket that currently holds the key
        private HashMap<Integer, Node> nodeHash = null;
        
        public LFUCache(int capacity) {
            this.cap = capacity;
            valueHash = new HashMap<Integer, Integer>();
            nodeHash = new HashMap<Integer, Node>();
        }
        
        // O(1) get: return the value and bump the key's access count
        public int get(int key) {
            if (valueHash.containsKey(key)) {
                increaseCount(key);
                return valueHash.get(key);
            }
            return -1;
        }
        
        // O(1) set: insert or update; when full, evict the least frequently used
        // key, breaking ties by least recently used
        public void set(int key, int value) {
            if (cap == 0) return;
            if (valueHash.containsKey(key)) {
                valueHash.put(key, value);
                Node node = nodeHash.get(key);
                // Re-insert to refresh recency inside the bucket; increaseCount
                // below moves the key to the (count+1) bucket anyway
                node.keys.remove(key);
                node.keys.add(key);
            } else {
                if (valueHash.size() < cap) {
                    valueHash.put(key, value);
                } else {
                    removeOld();
                    valueHash.put(key, value);
                }
                addToHead(key);
            }
            increaseCount(key);
        }
        
        // Place a brand-new key into the count-0 bucket at the head of the list
        private void addToHead(int key) {
            if (head == null) {
                head = new Node(0);
                head.keys.add(key);
            } else if (head.count > 0) {
                Node node = new Node(0);
                node.keys.add(key);
                node.next = head;
                head.prev = node;
                head = node;
            } else {
                head.keys.add(key);
            }
            nodeHash.put(key, head);      
        }
        
        // Move a key from its current bucket (count) into the (count+1) bucket,
        // creating that bucket if it does not exist yet
        private void increaseCount(int key) {
            Node node = nodeHash.get(key);
            node.keys.remove(key);
            
            if (node.next == null) {
                node.next = new Node(node.count+1);
                node.next.prev = node;
                node.next.keys.add(key);
            } else if (node.next.count == node.count+1) {
                node.next.keys.add(key);
            } else {
                Node tmp = new Node(node.count+1);
                tmp.keys.add(key);
                tmp.prev = node;
                tmp.next = node.next;
                node.next.prev = tmp;
                node.next = tmp;
            }
    
            nodeHash.put(key, node.next);
            if (node.keys.size() == 0) remove(node);
        }
        
        // Evict the least frequently used key; among keys with the same count,
        // evict the least recently used one (the first key in the head bucket)
        private void removeOld() {
            if (head == null) return;
            int old = 0;
            for (int n: head.keys) {
                old = n;
                break;
            }
            head.keys.remove(old);
            if (head.keys.size() == 0) remove(head);
            nodeHash.remove(old);
            valueHash.remove(old);
        }
        
        // Unlink an empty frequency bucket from the doubly linked list
        private void remove(Node node) {
            if (node.prev == null) {
                head = node.next;
            } else {
                node.prev.next = node.next;
            } 
            if (node.next != null) {
                node.next.prev = node.prev;
            }
        }
        
        // A frequency bucket: all keys that share the same access count, kept in
        // insertion order so the first key is the LRU one within the bucket
        class Node {
            public int count = 0;
            public LinkedHashSet<Integer> keys = null;
            public Node prev = null, next = null;
            
            public Node(int count) {
                this.count = count;
                keys = new LinkedHashSet<Integer>();
                prev = next = null;
            }
        }
    }
    /**
     * Your LFUCache object will be instantiated and called as such:
     * LFUCache obj = new LFUCache(capacity);
     * int param_1 = obj.get(key);
     * obj.set(key,value);
     */
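
    As a quick sanity check, here is a hypothetical driver (the class name LFUCacheDemo is made up for illustration; it is not part of the LeetCode submission and would live in its own file next to LFUCache) that walks through the example from the problem statement:

    public class LFUCacheDemo {
        public static void main(String[] args) {
            LFUCache cache = new LFUCache(2);
            cache.set(1, 1);
            cache.set(2, 2);
            System.out.println(cache.get(1)); // 1   (key 1's count is now 2)
            cache.set(3, 3);                  // evicts key 2, the only key with count 1
            System.out.println(cache.get(2)); // -1  (key 2 was evicted)
            System.out.println(cache.get(3)); // 3   (key 3's count is now 2)
            cache.set(4, 4);                  // keys 1 and 3 both have count 2; key 1 is older, so it is evicted
            System.out.println(cache.get(1)); // -1
            System.out.println(cache.get(3)); // 3
            System.out.println(cache.get(4)); // 4
        }
    }
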
  • Original article: https://www.cnblogs.com/charlesblc/p/6096144.html