
Executing NLP

On the right is sentiment analysis using BERT, a Transformer model which we will encounter in a future post.
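
The demo itself is not reproduced here, but as a rough illustration, the sketch below runs sentiment analysis with a BERT-family model through the Hugging Face transformers pipeline. The checkpoint name is an assumption (a distilled BERT fine-tuned on SST-2), not necessarily the model behind the demo.

```python
from transformers import pipeline

# Load a BERT-family model fine-tuned for sentiment analysis.
# The checkpoint below is an assumption; any compatible model works.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("NLP with Transformers is surprisingly approachable."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```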

Below we use n-grams to generate Shakespeare. This approach preceded the use of LSTMs (Long Short-Term Memory networks) for generating artificial Shakespeare. Currently, Transformers are the preferred model for both NLP and vision.
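
For readers who want the gist of the n-gram approach, here is a minimal word-level sketch: count which words follow each (n-1)-word prefix, then generate text by random walk over those counts. The corpus path is a placeholder, and this is an illustrative simplification rather than the exact code used for the demo.

```python
import random
from collections import defaultdict

def train_ngrams(text, n=3):
    """Map each (n-1)-word prefix to the words observed after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - n + 1):
        prefix = tuple(words[i:i + n - 1])
        model[prefix].append(words[i + n - 1])
    return model

def generate(model, length=30):
    """Random walk over the n-gram table, starting from a random prefix."""
    prefix = random.choice(list(model.keys()))
    out = list(prefix)
    for _ in range(length):
        choices = model.get(tuple(out[-len(prefix):]))
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

# "shakespeare.txt" is a placeholder for any plain-text Shakespeare corpus.
with open("shakespeare.txt", encoding="utf-8") as f:
    model = train_ngrams(f.read(), n=3)
print(generate(model))
```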

What follows is an introduction to the mechanics of Natural Language Processing.

Forward and back propagation. Back propagation is the bane of data scientists!
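
To make both passes concrete, here is a minimal NumPy sketch on a toy XOR problem (the network size, data, and learning rate are illustrative assumptions, not taken from the original post): the forward pass turns inputs into predictions, and the backward pass applies the chain rule to push the error back into weight updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn XOR with one hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward propagation: inputs flow through the network to a prediction.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Back propagation: the chain rule walks the error backwards,
    # giving the gradient of the loss with respect to each weight.
    d_out = (out - y) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```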
