Hands-on Question Answering Systems with BERT: Applications in Neural Networks and Natural Language Processing

Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you thr...


Bibliographic Details
Main Authors: Sabharwal, Navin; Agrawal, Amit
Format: eBook
Language: English
Published: Berkeley, CA: Apress, an imprint of Springer Nature, 2021
Edition: 1
ISBN: 1484266633, 9781484266632, 9781484266649, 1484266641
Table of Contents:
  • Front Matter: Title Page; Table of Contents; About the Authors; About the Technical Reviewer; Acknowledgments; Introduction
  • Chapter 1. Introduction to Natural Language Processing: Natural Language Processing; Sentence Segmentation; Tokenization; Parts of Speech Tagging; Stemming and Lemmatization; Identification of Stop Words; Phrase Extraction; Named Entity Recognition; Coreference Resolution; Bag of Words; Conclusion
  • Chapter 2. Neural Networks for Natural Language Processing: What Is a Neural Network?; Building Blocks of Neural Networks (Neuron, Input Layer, Hidden Layers, Output Layer, Activation Function); Neural Network Training; Types of Neural Networks (Feed-Forward Neural Networks, Convolutional Neural Networks, Recurrent Neural Networks, Long Short-Term Memory); Encoders and Decoders (The Encoder-Decoder Architecture, Encoder Part of the Model, Decoder Part of the Model, Bidirectional Encoders and Decoders); Transformer Models (Model Architecture); Attention Models (Why Is Attention Required?, How Attention Works, Types of Attention Models: Global, Local, Hard and Soft, and Self-Attention); Conclusion
  • Chapter 3. Introduction to Word Embeddings: One-Hot Representation; Count Vector; TF-IDF Vectorization; What Is Word Embedding?; Different Methods of Word Embedding (Word2vec: Continuous Bag of Words and Skip Gram Model; GloVe); Sentence Embeddings; ELMo; Universal Sentence Encoder; Bidirectional Encoder Representations from Transformers (BERT Base Model, BERT Large Model); Conclusion
  • Chapter 4. BERT Algorithms Explained: How Does BERT Work?; Text Processing; Masked Language Modeling; Next Sentence Prediction; Text Classification Using BERT; Benchmarks for BERT Model (GLUE Benchmark, SQuAD Dataset, IMDB Reviews Dataset, RACE Benchmark); Types of BERT-Based Models (ALBERT; RoBERTa; DistilBERT: Distillation Loss, Cosine Embedding Loss, Masked Language Modeling Loss, Architectural Modifications; StructBERT: Structural Pretraining and Pretraining Objectives; BERTjoint for Natural Questions); Conclusion
  • Chapter 5. BERT Model Applications: Question Answering System: Types of QA Systems; Question Answering System Design Using BERT (For Windows Server: Creation of REST API; For Linux Server: Creation of REST API); Open-Domain Question Answering System (Model Architecture, DeepPavlov QA System); Conclusion
  • Chapter 6. BERT Model Applications: Other Tasks: Sentiment Analysis; Named Entity Recognition; Text Classification; Text Summarization; Conclusion
  • Chapter 7. Future of BERT Models: Future Capabilities (Abstractive Summarization, Natural Language Generation, Machine Translation); Conclusion
  • Index