I went through 50 NLP interview questions and below is a list of questions I got wrong and learned / consolidated from 🙂
What techniques can be used for noun phrase detection, verb phrase detection, subject detection, and object detection in NLP?
Dependency Parsing and Constituency Parsing. Dependency parsing assigns a syntactic structure to a sentence by establishing head-dependent relationships between individual words (for example, linking a verb to its subject and object). Constituency parsing breaks a sentence into nested sub-phrases (constituents): non-terminal nodes are phrase types (such as noun phrase or verb phrase) and terminal nodes are the words of the sentence.
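As a minimal illustration, here is a hand-written toy dependency parse standing in for the output of a real parser such as spaCy (the words, indices, and relations below are made up for the example): subject and object detection amounts to reading off nsubj and dobj relations, and noun phrase detection groups a noun with its dependents.

```python
# Toy dependency parse of "The cat chased the mouse".
# Each entry: (word, head_index, relation). In practice a parser
# such as spaCy produces these (word, head, dep) triples.
parse = [
    ("The",    1, "det"),    # 0
    ("cat",    2, "nsubj"),  # 1 -> subject of "chased"
    ("chased", 2, "ROOT"),   # 2
    ("the",    4, "det"),    # 3
    ("mouse",  2, "dobj"),   # 4 -> object of "chased"
]

def find_by_relation(parse, rel):
    """Return words whose dependency relation matches rel."""
    return [word for word, _, r in parse if r == rel]

subjects = find_by_relation(parse, "nsubj")  # subject detection
objects = find_by_relation(parse, "dobj")    # object detection

def noun_phrase(parse, head_idx):
    """Collect a noun together with its determiner/modifier dependents."""
    words = [w for i, (w, h, r) in enumerate(parse)
             if i == head_idx or (h == head_idx and r in ("det", "amod"))]
    return " ".join(words)

print(subjects)               # ['cat']
print(objects)                # ['mouse']
print(noun_phrase(parse, 1))  # 'The cat'
```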
Will dissimilarity between words, expressed using cosine similarity, have values significantly higher than 0.5?
No. Cosine similarity is a similarity measure: similar word vectors score close to 1, while dissimilar ones score close to 0 (or negative), so dissimilar words do not score significantly higher than 0.5.
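A quick numeric check makes this concrete. The toy 2-d "embeddings" below are hand-picked for illustration: vectors pointing in similar directions give a cosine near 1, while near-orthogonal (dissimilar) vectors give a cosine near 0.

```python
import math

def cosine_similarity(a, b):
    """cos(a, b) = (a . b) / (|a| * |b|)"""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy 2-d "embeddings": similar directions vs. near-orthogonal ones.
similar_a, similar_b = [1.0, 0.9], [0.9, 1.0]
dissimilar_a, dissimilar_b = [1.0, 0.1], [0.1, 1.0]

print(round(cosine_similarity(similar_a, similar_b), 3))       # ~0.994
print(round(cosine_similarity(dissimilar_a, dissimilar_b), 3)) # ~0.198
```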
Transformer architecture was first introduced with which model?
OpenAI’s GPT was the first of these language models to use the Transformer; ULMFiT was an LSTM-based language modelling architecture. (Strictly speaking, the Transformer architecture itself was first introduced in the 2017 paper “Attention Is All You Need”, as an encoder-decoder model for machine translation.)
Which model trains two independent LSTM language models, left-to-right and right-to-left, and shallowly concatenates them?
ELMo. It concatenates the outputs of the forward and backward LSTM language models to produce contextualised word embeddings.
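A toy sketch of the “shallow concatenation” idea, with made-up 3-d hidden states (real ELMo combines multi-layer biLM outputs, but the joining step is this simple):

```python
# Hypothetical per-token hidden states from two independently trained
# LSTMs: one reading left-to-right, one right-to-left (toy 3-d vectors).
forward_states = {"bank": [0.2, 0.7, 0.1]}   # context from the left
backward_states = {"bank": [0.5, 0.3, 0.9]}  # context from the right

def contextual_embedding(token):
    """ELMo-style shallow concatenation: the two directions are trained
    separately and simply joined, not fused inside one network."""
    return forward_states[token] + backward_states[token]

emb = contextual_embedding("bank")
print(len(emb))  # 6: dimensionality doubles on concatenation
```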
Which model uses a unidirectional language model for producing word embeddings?
OpenAI’s GPT. ELMo is bidirectional, and Word2Vec is not a language model in this sense but a static (non-contextual) word embedding.
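The static/contextual distinction is easy to demonstrate. In a Word2Vec-style lookup table (the toy vectors below are invented), a word gets one fixed vector per type, regardless of the sentence it appears in:

```python
# Toy static embedding table (Word2Vec-style): one fixed vector per word.
static_embeddings = {
    "bank": [0.1, 0.5],
    "river": [0.2, 0.4],
    "money": [0.8, 0.3],
}

def embed(sentence):
    """Look up each known word's vector; context plays no role."""
    return [static_embeddings[w] for w in sentence if w in static_embeddings]

# "bank" receives the identical vector in both sentences, even though it
# means a riverbank in one and a financial institution in the other.
v1 = embed(["river", "bank"])[1]
v2 = embed(["money", "bank"])[1]
print(v1 == v2)  # True: static embeddings are context-independent
```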
Permutation language modelling is a feature of?
XLNet. This is the main difference between XLNet and BERT: in permutation language modelling, tokens are predicted in a random factorisation order rather than strictly left-to-right, so the prediction order can run in any direction while each token still conditions on the tokens predicted before it in that order.
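A minimal sketch of the factorisation-order idea (toy code: real XLNet implements this with attention masks inside the Transformer rather than by literally reordering tokens):

```python
import random

random.seed(0)
tokens = ["New", "York", "is", "a", "city"]

# Sample a random factorisation order over token positions.
order = list(range(len(tokens)))
random.shuffle(order)

# Each position is predicted conditioned on the positions that came
# earlier in the sampled order, not earlier in the sentence.
for step, pos in enumerate(order):
    context = [tokens[p] for p in order[:step]]
    print(f"predict {tokens[pos]!r} given {context}")
```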
What’s the difference between NLTK and SpaCy?
SpaCy only contains the best algorithm for each problem whereas NLTK has a collection of algorithms to choose from
NLTK supports more languages
SpaCy supports word vectors
SpaCy is an object-oriented library whereas NLTK is a string processing library
What is pragmatic ambiguity in NLP?
Pragmatic ambiguity arises when a sentence has multiple interpretations depending on the context in which it is used: the words themselves may be clear, but the speaker’s intended meaning is not. For example, “Do you know what time it is?” can be a genuine request for the time or a complaint about lateness.