

    Reviewer #1: To build a shared learning model that trains parameters at the distributed edge while protecting data privacy, this paper designs a novel distributed hierarchical tensor depth optimization algorithm. The model parameters, which live in a high-dimensional tensor space, are compressed into a union of low-dimensional subspaces to reduce the bandwidth consumption and storage requirements of federated learning. Experimental results show that the proposed algorithm reduces the communication bandwidth load and the energy consumption at the edge. The article is original and the idea is distinctive, but the following areas need improvement:
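The bandwidth saving the reviewer summarizes rests on a standard observation: parameters confined to a low-dimensional subspace can be transmitted as small factor matrices rather than as the full tensor. A minimal sketch in plain Python (a generic rank-r matrix factorization for illustration, not the paper's specific algorithm):

```python
# Generic illustration: communication cost of sending a dense
# d1 x d2 parameter matrix versus its rank-r factors U (d1 x r)
# and V (r x d2). This is NOT the manuscript's algorithm, only
# the standard low-rank counting argument behind it.

def full_cost(d1, d2):
    """Floats transmitted for the dense matrix."""
    return d1 * d2

def factored_cost(d1, d2, r):
    """Floats transmitted for the two rank-r factors."""
    return d1 * r + r * d2

d1, d2, r = 1024, 1024, 16
print(full_cost(d1, d2))         # 1048576 floats
print(factored_cost(d1, d2, r))  # 32768 floats, a 32x reduction
```

The reduction factor d1*d2 / (r*(d1+d2)) grows as the rank r shrinks relative to the matrix dimensions, which is why subspace compression pays off most for large over-parameterized layers.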

    1. The Introduction lacks hierarchy and logical flow. It should make the relationships between paragraphs explicit, refer to recent literature, and express some terms more accurately.
    2. A related-work section should be added that discusses the contributions of this paper and its differences from previous works.
    3. Some terminology in the paper is used incorrectly. For example, a pooling layer has no weights; this should be corrected.
    4. The reference format is inconsistent and should be revised. If there are too few recent references, more should be added. Finally, the language and grammar need thorough revision: there are grammar and spelling errors throughout.
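The reviewer's third point is a standard fact worth spelling out: a pooling layer aggregates its inputs with a fixed rule (max or average) and therefore carries no trainable weights, only hyperparameters such as the window size. A minimal pure-Python illustration of 1-D max pooling (a hypothetical helper, not code from the manuscript):

```python
def max_pool_1d(xs, window):
    """1-D max pooling with non-overlapping windows.

    The layer applies a fixed aggregation rule (max) and has no
    learned parameters; `window` is a hyperparameter, not a weight.
    """
    return [max(xs[i:i + window])
            for i in range(0, len(xs) - window + 1, window)]

print(max_pool_1d([1, 3, 2, 5, 4, 4], 2))  # [3, 5, 4]
```

Because nothing in the pooling rule is updated during training, phrases like "the weights of the pooling layer" conflate pooling with convolutional or fully connected layers, which do hold trainable parameters.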

  • Original source: https://www.cnblogs.com/devilmaycry812839668/p/16294632.html