Thanks a lot, Felipe!

Cheers!

On Wed, 22 Aug 2018 at 01:17, Felipe Bravo <felipebravom@gmail.com> wrote:
Hi,
Yes, you can get a document-level representation from pre-trained embeddings using the AffectiveTweets package (https://github.com/felipebravom/AffectiveTweets), or you can even train your own embeddings using the deeplearning package (https://deeplearning.cms.waikato.ac.nz/).
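
A minimal sketch of that pipeline in Weka's Java API, assuming AffectiveTweets is installed via the package manager: each document is mapped to a dense vector built from pre-trained word embeddings and the result is fed to an SVM (Weka's SMO). The filter's fully qualified class name, its default options, and the file name "tweets.arff" are assumptions to verify against the AffectiveTweets documentation, not something stated in this thread.

import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.classifiers.meta.FilteredClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
// Assumed class name; check the AffectiveTweets package documentation.
import weka.filters.unsupervised.attribute.TweetToEmbeddingsFeatureVector;

import java.util.Random;

public class EmbeddingsWithSVM {
    public static void main(String[] args) throws Exception {
        // Load a dataset whose text is stored in a string attribute and whose
        // last attribute is the class label (hypothetical file name).
        Instances data = DataSource.read("tweets.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Filter that turns each document into a dense vector derived from
        // pre-trained word embeddings (configure the embedding file and the
        // aggregation scheme via the filter's options as needed).
        TweetToEmbeddingsFeatureVector embeddings = new TweetToEmbeddingsFeatureVector();

        // FilteredClassifier applies the filter before training and prediction,
        // so the SVM only ever sees the dense embedding features.
        FilteredClassifier classifier = new FilteredClassifier();
        classifier.setFilter(embeddings);
        classifier.setClassifier(new SMO()); // SMO = Weka's SVM implementation

        // 10-fold cross-validation: the dense representation works with SVMs
        // just as it would with any other Weka classifier.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(classifier, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}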
Cheers,
Felipe

On Wed, Aug 22, 2018 at 11:03 AM Jonnathan Carvalho <joncarv@gmail.com> wrote:
Hi, All!

I’m trying to figure out what word embeddings are...

Is it possible to use the dense feature representation generated by this technique with learning algorithms such as SVM, or only with neural networks?

Does Weka support word embeddings?

Thanks!
Cheers!
_______________________________________________
Wekalist mailing list
Send posts to: Wekalist@list.waikato.ac.nz
List info and subscription status: https://list.waikato.ac.nz/mailman/listinfo/wekalist
List etiquette: http://www.cs.waikato.ac.nz/~ml/weka/mailinglist_etiquette.html


--
Cheers,
Felipe


--
Jonnathan Carvalho
Instituto Federal de Educação, Ciência e Tecnologia Fluminense (RJ)