• https://blog.csdn.net/FnqTyr45/article/details/110675336


    Reviewer #1: To build a shared learning model that trains parameters at the distributed edge while protecting data privacy, this paper designs a novel distributed hierarchical tensor depth-optimization algorithm. The model parameters in a high-dimensional tensor space are compressed into a union of low-dimensional subspaces to reduce the bandwidth consumption and storage requirements of federated learning. Experimental results show that the proposed algorithm reduces the communication-bandwidth load and the energy consumption at the edge. The article is original and presents a distinctive idea, but the following areas need improvement:
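    The compression idea summarized above can be sketched in a few lines. The paper's actual hierarchical tensor decomposition is not reproduced here, so a plain truncated SVD serves as a hypothetical stand-in to show how projecting parameters onto a low-dimensional subspace shrinks what a federated client must transmit:

    ```python
    import numpy as np

    # Hypothetical illustration only: a truncated SVD stands in for the paper's
    # hierarchical tensor decomposition. The point is the same -- parameters
    # restricted to a low-dimensional subspace need far fewer values on the wire.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 512))  # a dense weight matrix a client would upload

    rank = 16  # assumed subspace dimension
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U_r, s_r, Vt_r = U[:, :rank], s[:rank], Vt[:rank, :]

    full_params = W.size
    compressed_params = U_r.size + s_r.size + Vt_r.size
    print(full_params, compressed_params)  # 131072 12304 -> ~10x fewer values sent

    # The receiver reconstructs an approximation of the update:
    W_approx = (U_r * s_r) @ Vt_r
    ```

    A real scheme would also have to manage the approximation error this introduces into training, which is part of what the reviewed paper evaluates.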

    1. The Introduction is not sufficiently hierarchical or logical. It should make the relationships between paragraphs direct, refer to recent literature, and express some terms accurately.
    2. A related-work section that discusses the contributions of, and differences from, previous works should be included.
    3. Some terminology in the paper is incorrect. For example, a pooling layer has no weights; this should be corrected.
    4. The reference format is inaccurate and should be revised. If recent references are too few, more recent references should be added. Finally, emphasis should be placed on strengthening the language and grammar; there are grammar and spelling errors throughout.
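    The distinction behind point 3 can be made concrete with a minimal numpy sketch (the reviewed paper's network is not reproduced here): max pooling is a fixed, parameter-free operation, so there is nothing in it for training to update, unlike a convolution kernel.

    ```python
    import numpy as np

    def max_pool_2x2(x):
        """2x2 max pooling with stride 2: a fixed operation with no learnable
        weights -- nothing here is updated during training."""
        h, w = x.shape
        return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

    x = np.arange(16, dtype=float).reshape(4, 4)
    print(max_pool_2x2(x))
    # [[ 5.  7.]
    #  [13. 15.]]

    # By contrast, a convolutional layer carries a weight tensor that *is* trained:
    kernel = np.random.default_rng(0).standard_normal((3, 3))  # 9 learnable weights
    ```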

  • Original article: https://www.cnblogs.com/devilmaycry812839668/p/16294632.html