• Computing the loss function for an RNN


    Two figures for understanding the RNN structure (images not included):


    Remember that our target at every time step is to predict the next character in the sequence, so our labels should look just like our inputs but offset by one character. Let's look at corresponding inputs and outputs to make sure everything lines up as expected.

    print(textify(train_data[10, :, 3]))
    print(textify(train_label[10, :, 3]))
    

    te, but the twisted crystalline bars lay unfinished upon the
    ben
    e, but the twisted crystalline bars lay unfinished upon the
    benc
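
    The inputs and labels above differ only by a one-character shift. A minimal, self-contained sketch of how such a shift can be produced (illustrative names only, not the tutorial's exact preprocessing code):

    text = "e, but the twisted crystalline bars lay unfinished upon the bench"
    seq_len = 8

    # Inputs are windows of seq_len characters; labels are the same windows
    # shifted forward by one character, so each input character is paired
    # with the character the model should predict next.
    inputs = [text[i : i + seq_len] for i in range(0, len(text) - seq_len, seq_len)]
    labels = [text[i + 1 : i + 1 + seq_len] for i in range(0, len(text) - seq_len, seq_len)]

    print(inputs[0])   # e, but t
    print(labels[0])   # , but th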

    Averaging the loss over the sequence

    def average_ce_loss(outputs, labels):
        # Average the per-time-step cross-entropy loss over the whole sequence.
        assert len(outputs) == len(labels)
        total_loss = 0.
        for (output, label) in zip(outputs, labels):
            total_loss = total_loss + cross_entropy(output, label)
        return total_loss / len(outputs)
    
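    The cross_entropy helper used above is defined earlier in the tutorial. A minimal sketch consistent with this usage, assuming softmax outputs and one-hot labels stored as MXNet NDArrays of shape (batch, vocab_size) (an assumption, not the tutorial's verbatim code):

    from mxnet import nd

    def cross_entropy(yhat, y):
        # yhat: predicted probabilities after softmax, shape (batch, vocab_size)
        # y:    one-hot labels with the same shape (assumed)
        # Sum the negative log-likelihood over the vocabulary dimension,
        # then average over the batch.
        return -nd.mean(nd.sum(y * nd.log(yhat), axis=1))

    Dividing by len(outputs) in average_ce_loss normalizes by the number of time steps, so the loss stays comparable across sequences of different lengths.
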
  • Original post: https://www.cnblogs.com/yaos/p/14207563.html