seq2seq attention in train with "multi_label_flag=False" #135

@Tian14267

Description

Hello, when I run "a1_seq2seq_attention_train.py" with "multi_label_flag=False", I get the error: "You must feed a value for placeholder tensor 'decoder_input' with dtype int32 and shape [?,6]".
According to lines 103-104 of your code, when FLAGS.multi_label_flag=False, decoder_input should not be required.
[screenshot of the traceback attached]
What is going on here?
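The error suggests that some op in the graph still depends on the 'decoder_input' placeholder even when multi_label_flag=False, so the session cannot run without feeding it. A common workaround in this situation is to always feed a dummy int32 batch matching the placeholder shape. Below is a minimal sketch of that idea; the helper name `build_feed_dict` and the placeholder keys are hypothetical illustrations, not the repository's actual API:

```python
import numpy as np

def build_feed_dict(batch_x, batch_y, multi_label_flag, decoder_sent_length=6):
    """Hypothetical helper: build a feed dict that always includes a dummy
    decoder_input, so the placeholder is fed even when multi_label_flag
    is False (workaround for "You must feed a value for placeholder
    tensor 'decoder_input'"). Keys here stand in for the real
    placeholder tensors."""
    feed = {"input_x": batch_x, "input_y": batch_y}
    # Dummy int32 values matching the placeholder shape [?, 6];
    # they are ignored by the loss when multi_label_flag is False.
    feed["decoder_input"] = np.zeros(
        (len(batch_x), decoder_sent_length), dtype=np.int32)
    return feed
```

If the flag is genuinely meant to make decoder_input optional, the cleaner fix would be to guard the ops that consume the placeholder with the same `if not FLAGS.multi_label_flag:` condition used at lines 103-104, so the graph never references it in that mode.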
