Bachelor of Science (B.S.) in Computer Science and University Honors
Natural language processing (Computer science), Machine learning, Computational linguistics
CBOW and Skip-gram are two NLP techniques for producing word embedding models that are both accurate and performant. They were introduced in the seminal paper by T. Mikolov et al. and have since gained optimizations such as negative sampling and subsampling. This paper implements a fully optimized version of these models using PyTorch and evaluates them on a toy sentiment/subject analysis task. It is weakly observed that corpus type skews the resulting word embeddings: fictional corpora are better suited to sentiment analysis and non-fictional corpora to subject analysis.
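To make the abstract concrete, the core of Skip-gram training is the generation of (center, context) pairs from a sliding window, plus negative sampling from the unigram distribution raised to the 3/4 power. The thesis implements the full models in PyTorch; the following is only an illustrative, dependency-free sketch of those two ingredients (the corpus, window size, and function names here are assumptions for the example, not the thesis's actual code):

```python
import random
from collections import Counter

def skipgram_pairs(tokens, window=2):
    # Yield (center, context) training pairs for Skip-gram.
    # CBOW reverses the direction: it predicts the center word
    # from its surrounding context words instead.
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def negative_sampler(tokens, power=0.75):
    # Sample "noise" words from the unigram distribution raised to
    # 0.75, as in Mikolov et al.: frequent words are damped and
    # rare words boosted relative to raw counts.
    counts = Counter(tokens)
    words = list(counts)
    weights = [counts[w] ** power for w in words]
    def sample(k, exclude):
        out = []
        while len(out) < k:
            w = random.choices(words, weights=weights, k=1)[0]
            if w != exclude:
                out.append(w)
        return out
    return sample

# Toy corpus purely for illustration.
corpus = "the cat sat on the mat the dog sat on the rug".split()
pairs = skipgram_pairs(corpus, window=2)
sample = negative_sampler(corpus)
center, context = pairs[0]
negatives = sample(5, exclude=context)
```

In a full implementation, each positive pair is trained against its negative samples with a logistic loss over the dot products of embedding vectors; subsampling would additionally discard very frequent tokens before pair generation.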
In Copyright. URI: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
Menon, Tejas, "Empirical Analysis of CBOW and Skip Gram NLP Models" (2020). University Honors Theses. Paper 934.