Revision as of 21:42, 13 November 2019
Natural language processing (NLP)
Classical NLP
Classical NLP consists of building a pipeline of processors that create annotations from text files.
Below is an example of a few processors:
- Tokenization
  - Converts a paragraph of text or a file into an array of words.
- Part-of-speech annotation
- Named entity recognition
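The pipeline idea above can be sketched in a few lines of Python. The processors below are toy stand-ins for real annotators (the lexicon and the capitalization rule are invented for illustration); the point is the shape of the pipeline, where each processor adds its own annotation layer to a shared document:

```python
import re

def tokenize(doc):
    # Convert the raw text into an array of words and punctuation.
    doc["tokens"] = re.findall(r"\w+|[^\w\s]", doc["text"])
    return doc

def pos_tag(doc):
    # Toy part-of-speech annotation via a tiny lookup table (not a real tagger).
    lexicon = {"Alice": "NNP", "visited": "VBD", "Paris": "NNP"}
    doc["pos"] = [lexicon.get(t, "UNK") for t in doc["tokens"]]
    return doc

def ner(doc):
    # Toy named-entity recognition: collect the proper nouns found by the tagger.
    doc["entities"] = [t for t, p in zip(doc["tokens"], doc["pos"]) if p == "NNP"]
    return doc

def run_pipeline(text, processors):
    # Each processor reads the annotations added so far and appends its own.
    doc = {"text": text}
    for processor in processors:
        doc = processor(doc)
    return doc

doc = run_pipeline("Alice visited Paris.", [tokenize, pos_tag, ner])
```

Note how NER depends on the part-of-speech layer: ordering the processors is part of designing the pipeline.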
Machine Learning
Datasets and Challenges
SQuAD
Link
The Stanford Question Answering Dataset, a reading-comprehension benchmark. There are two versions of this dataset: 1.1, and 2.0, which adds unanswerable questions.
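A rough sketch of how a SQuAD file is laid out and read. The sample record below is invented, but it follows the published JSON structure: articles contain paragraphs, each with a context passage and a list of question/answer pairs, and 2.0 adds an "is_impossible" flag for unanswerable questions:

```python
import json

# Invented sample in the SQuAD 2.0 layout; answer_start is a character
# offset into the context string.
sample = json.loads("""
{
  "version": "v2.0",
  "data": [{
    "title": "Example",
    "paragraphs": [{
      "context": "The Transformer was introduced by Google in 2017.",
      "qas": [{
        "id": "q1",
        "question": "Who introduced the Transformer?",
        "answers": [{"text": "Google", "answer_start": 34}],
        "is_impossible": false
      }]
    }]
  }]
}
""")

def iter_examples(dataset):
    # Flatten the nested structure into (context, question, answers) triples.
    for article in dataset["data"]:
        for paragraph in article["paragraphs"]:
            for qa in paragraph["qas"]:
                yield paragraph["context"], qa["question"], qa["answers"]

examples = list(iter_examples(sample))
```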
Transformer
Attention is all you need paper
A neural network architecture introduced by Google.
It currently achieves state-of-the-art results on NLP tasks and has largely replaced RNNs for those tasks.
- Guides and explanations
  - The Annotated Transformer: https://nlp.seas.harvard.edu/2018/04/03/attention.html
  - YouTube video: https://www.youtube.com/watch?v=iDulhoQ2pro
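The core operation of the Transformer, scaled dot-product attention from "Attention Is All You Need", can be sketched with NumPy. The shapes and random inputs below are purely illustrative:

```python
import numpy as np

# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row is a distribution over keys
    return weights @ V, weights          # weighted average of the values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))  # 3 query positions, dimension d_k = 4
K = rng.standard_normal((5, 4))  # 5 key positions
V = rng.standard_normal((5, 4))  # one value vector per key
out, weights = attention(Q, K, V)
```

Because every query attends to every key in one matrix product, the whole sequence is processed in parallel, which is a key advantage over the sequential computation of RNNs.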
Google BERT
Github Link
Paper
Blog Post
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A pretrained NLP neural network that can be fine-tuned for downstream tasks.
Note that the code is written in TensorFlow 1.
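A sketch of how BERT packs a sentence pair into its input format: `[CLS]` tokens-A `[SEP]` tokens-B `[SEP]`, with segment ids distinguishing the two sentences. The whitespace tokenizer below is a simplified stand-in for BERT's real WordPiece tokenizer:

```python
def make_bert_input(sentence_a, sentence_b):
    # Stand-in tokenizer: lowercase + whitespace split (BERT uses WordPiece).
    tokens_a = sentence_a.lower().split()
    tokens_b = sentence_b.lower().split()
    # Special tokens: [CLS] starts the sequence, [SEP] ends each sentence.
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    # Segment 0 covers [CLS], sentence A, and its [SEP];
    # segment 1 covers sentence B and the final [SEP].
    segment_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    return tokens, segment_ids

tokens, segments = make_bert_input("the cat sat", "it was tired")
```

The hidden state above `[CLS]` is what BERT's classification head reads, and the segment ids let the model tell the two sentences apart during pretraining tasks like next-sentence prediction.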