
Details on word2vec model #10


Description

@PhilKuhnke

Dear Kyubyong,
great work - thank you very much for providing these word vectors!
One question: which model did you use to train your word vectors with word2vec - skip-gram or CBOW? Is it the standard model as reported in Mikolov et al. (2013), or a modified variant?
Also, which parameters did you use to train the model for each language? Did you always use the default parameters in make_wordvectors.sh?
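For context, here is a minimal sketch of the distinction I mean, assuming the training goes through gensim's Word2Vec (I haven't checked make_wordvectors.py in detail, so the parameter values below are just placeholders, not your settings):

```python
# Hypothetical sketch: how gensim's Word2Vec exposes the skip-gram vs. CBOW
# choice and the common training parameters. Not the repository's actual script.
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences, just to make the snippet runnable.
sentences = [
    ["this", "is", "a", "tokenized", "sentence"],
    ["word", "vectors", "are", "trained", "from", "such", "sentences"],
]

model = Word2Vec(
    sentences=sentences,
    vector_size=100,  # dimensionality of the word vectors (gensim >= 4.0 name)
    window=5,         # context window size
    min_count=1,      # ignore words rarer than this
    sg=1,             # 1 = skip-gram, 0 = CBOW (the gensim default)
    negative=5,       # number of negative samples
    workers=4,
)

print(model.wv.most_similar("word", topn=3))
```

Knowing which of these settings (especially sg, vector_size, window, and min_count) you used per language would help a lot.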
