Hands-on Question Answering Systems with BERT - Applications in Neural Networks and Natural Language Processing
Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, and lemmatization, and bag of words.
Saved in:
| Main Authors: | Sabharwal, Navin; Agrawal, Amit |
|---|---|
| Format: | eBook |
| Language: | English |
| Published: | Berkeley, CA : Apress, an imprint of Springer Nature, 2021 |
| Edition: | 1 |
| Subjects: | |
| ISBN: | 1484266633, 9781484266632, 9781484266649, 1484266641 |
| Online Access: | Get full text |
| Abstract | Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, and lemmatization, and bag of words. Next, you'll look at neural networks for NLP starting with its variants such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you'll cover word embedding and their types along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You'll see different BERT variations followed by a hands-on example of a question answering system. |
|---|---|
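The abstract mentions tokenization and the bag-of-words representation among the NLP basics the book covers. As a minimal standard-library sketch (not code from the book; the `tokenize` and `bag_of_words` names are illustrative), the idea can be shown in a few lines:

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def bag_of_words(docs):
    """Map each document to a token-count vector over a shared vocabulary."""
    tokenized = [tokenize(d) for d in docs]
    vocab = sorted({tok for doc in tokenized for tok in doc})
    vectors = [[Counter(doc)[w] for w in vocab] for doc in tokenized]
    return vocab, vectors

docs = ["BERT answers questions.", "Questions need answers."]
vocab, vectors = bag_of_words(docs)
print(vocab)    # ['answers', 'bert', 'need', 'questions']
print(vectors)  # [[1, 1, 0, 1], [1, 0, 1, 1]]
```

Each document becomes a count vector over the shared vocabulary; word order is discarded, which is exactly the limitation that motivates the sequence models (RNNs, transformers) covered later in the book.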
| AbstractList | Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems by using natural language processing (NLP) and deep learning. The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming, and lemmatization, and bag of words. Next, you'll look at neural networks for NLP starting with its variants such as recurrent neural networks, encoders and decoders, bi-directional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types along with the basics of BERT. After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language models and next sentence prediction. You'll see different BERT variations followed by a hands-on example of a question answering system. Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT. What You Will Learn: Examine the fundamentals of word embeddings; Apply neural networks and BERT for various NLP tasks; Develop a question-answering system from scratch; Train question-answering systems for your own data. Who This Book Is For: AI and machine learning developers and natural language processing developers. |
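The abstract singles out masked language modeling as one of the BERT training objectives covered in depth. A toy standard-library sketch of how such training pairs can be produced (the `mask_tokens` helper is hypothetical and simplified; BERT's actual procedure also replaces some selected tokens with random words or leaves them unchanged):

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly replace tokens with [MASK]. Returns the corrupted
    sequence plus (position, original token) labels that a masked
    language model would be trained to recover."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            corrupted.append(MASK)
            labels.append((i, tok))
        else:
            corrupted.append(tok)
    return corrupted, labels

tokens = "the model predicts the masked word".split()
corrupted, labels = mask_tokens(tokens, mask_prob=0.3, seed=1)
print(corrupted, labels)
```

The model sees only the corrupted sequence and is scored on predicting the original tokens at the masked positions, which is what forces it to use context from both directions.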
| Author | Sabharwal, Navin; Agrawal, Amit |
| Author_xml | – sequence: 1 fullname: Sabharwal, Navin – sequence: 2 fullname: Agrawal, Amit |
| BackLink | https://cir.nii.ac.jp/crid/1130293545512597380$$DView record in CiNii |
| ContentType | eBook Book |
| Copyright | 2021 Navin Sabharwal, Amit Agrawal 2021 |
| Copyright_xml | – notice: 2021 – notice: Navin Sabharwal, Amit Agrawal 2021 |
| DBID | RYH YSPEL OHILO OODEK |
| DEWEY | 629.8/95632 |
| DOI | 10.1007/978-1-4842-6664-9 |
| DatabaseName | CiNii Complete Perlego O'Reilly Online Learning: Corporate Edition O'Reilly Online Learning: Academic/Public Library Edition |
| DatabaseTitleList | |
| DeliveryMethod | fulltext_linktorsrc |
| Discipline | Engineering Computer Science Government |
| DocumentTitleAlternate | Hands-on question answering systems with Bidirectional Encoder Representations from Transformers |
| EISBN | 9781523150762 1523150769 9781484266649 1484266641 |
| Edition | 1st ed. |
| ExternalDocumentID | bks000155214 9781484266649 498208 EBC6455456 4513737 BC16731546 book_kpHQASBER2 |
| Genre | Electronic books |
| IEDL.DBID | K-E |
| ISBN | 1484266633 9781484266632 9781484266649 1484266641 |
| IsPeerReviewed | false |
| IsScholarly | false |
| LCCallNum | J223.M53 .S23 2021 |
| LCCallNum_Ident | Q325.5-.7 |
| Language | English |
| LinkModel | DirectLink |
| Notes | Includes index |
| OCLC | 1231610934 |
| PQID | EBC6455456 |
| PageCount | 192 |
| ParticipantIDs | skillsoft_books24x7_bks000155214 askewsholts_vlebooks_9781484266649 springer_books_10_1007_978_1_4842_6664_9 safari_books_v2_9781484266649 proquest_ebookcentral_EBC6455456 perlego_books_4513737 nii_cinii_1130293545512597380 knovel_primary_book_kpHQASBER2 |
| PublicationCentury | 2000 |
| PublicationDate | 2021 c2021 2021-01-12T00:00:00 20210113 2021-01-12 2021. |
| PublicationDateYYYYMMDD | 2021-01-01 2021-01-12 2021-01-13 |
| PublicationDate_xml | – year: 2021 text: 2021 |
| PublicationDecade | 2020 |
| PublicationPlace | Berkeley, CA |
| PublicationPlace_xml | – name: New York – name: Berkeley, CA – name: Place of publication not identified |
| PublicationYear | 2021 |
| Publisher | Apress, an imprint of Springer Nature Apress Apress L. P |
| Publisher_xml | – name: Apress, an imprint of Springer Nature – name: Apress – name: Apress L. P |
| SSID | ssj0002428132 |
| SourceID | skillsoft askewsholts springer safari proquest perlego nii knovel |
| SourceType | Aggregation Database Publisher |
| SubjectTerms | Automatic control Computer Science Data processing General References Machine Learning MATHEMATICS Neural networks (Computer science) Pattern recognition systems Professional and Applied Computing Professional Computing Software Engineering |
| SubjectTermsDisplay | Automatic control -- Data processing. Electronic books. Neural networks (Computer science) Pattern recognition systems -- Data processing. |
| TableOfContents | Title Page
Introduction
Table of Contents
1. Introduction to Natural Language Processing
2. Neural Networks for Natural Language Processing
3. Introduction to Word Embeddings
4. BERT Algorithms Explained
5. BERT Model Applications: Question Answering System
6. BERT Model Applications: Other Tasks
7. Future of BERT Models
Index
Intro -- Table of Contents -- About the Authors -- About the Technical Reviewer -- Acknowledgments -- Introduction -- Chapter 1: Introduction to Natural Language Processing -- Natural Language Processing -- Sentence Segmentation -- Tokenization -- Parts of Speech Tagging -- Stemming and Lemmatization -- Identification of Stop Words -- Phrase Extraction -- Named Entity Recognition -- Coreference Resolution -- Bag of Words -- Conclusion -- Chapter 2: Neural Networks for Natural Language Processing -- What Is a Neural Network? -- Building Blocks of Neural Networks -- Neuron -- Input Layer -- Hidden Layers -- Output Layer -- Activation Function -- Neural Network Training -- Types of Neural Networks -- Feed-Forward Neural Networks -- Convolutional Neural Networks -- Recurrent Neural Networks -- Long Short-Term Memory -- Encoders and Decoders -- The Encoder-Decoder Architecture -- Encoder Part of the Model -- Decoder Part of the Model -- Bidirectional Encoders and Decoders -- Transformer Models -- Model Architecture -- Attention Models -- Why Is Attention Required? -- How Attention Works -- Types of Attention Models -- Global Attention Model -- Local Attention Model -- Hard and Soft Attention Model -- Self-Attention Model -- Conclusion -- Chapter 3: Introduction to Word Embeddings -- One-Hot Representation -- Count Vector -- TF-IDF Vectorization -- What Is Word Embedding? -- Different Methods of Word Embedding -- Word2vec -- Continuous Bag of Words -- Skip Gram Model -- GloVe -- Sentence Embeddings -- ELMo -- Universal Sentence Encoder -- Bidirectional Encoder Representations from Transformers -- BERT Base Model -- BERT Large Model -- Conclusion -- Chapter 4: BERT Algorithms Explained -- How Does BERT Work?
-- Text Processing -- Masked Language Modeling -- Next Sentence Prediction -- Text Classification Using BERT -- Benchmarks for BERT Model GLUE Benchmark -- SQuAD Dataset -- IMDB Reviews Dataset -- RACE Benchmark -- Types of BERT-Based Models -- ALBERT -- RoBERTa -- DistilBERT -- Distillation Loss -- Cosine Embedding Loss -- Masked Language Modeling Loss -- Architectural Modifications -- StructBERT -- Structural Pretraining in StructBERT -- Pretraining Objectives -- BERTjoint for Natural Questions -- Conclusion -- Chapter 5: BERT Model Applications: Question Answering System -- Types of QA Systems -- Question Answering System Design Using BERT -- For Windows Server -- Creation of REST API -- For Linux Server -- Creation of REST API -- Open-Domain Question Answering System -- Model Architecture -- DeepPavlov QA System -- Conclusion -- Chapter 6: BERT Model Applications: Other Tasks -- Sentiment Analysis -- Named Entity Recognition -- Text Classification -- Text Summarization -- Conclusion -- Chapter 7: Future of BERT Models -- Future Capabilities -- Abstractive Summarization -- Natural Language Generation -- Machine Translation -- Conclusion -- Index |
| Title | Hands-on Question Answering Systems with BERT - Applications in Neural Networks and Natural Language Processing |
| URI | https://app.knovel.com/hotlink/toc/id:kpHQASBER2/hands-question-answering/hands-question-answering?kpromoter=Summon https://cir.nii.ac.jp/crid/1130293545512597380 https://www.perlego.com/book/4513737/handson-question-answering-systems-with-bert-applications-in-neural-networks-and-natural-language-processing-pdf https://ebookcentral.proquest.com/lib/[SITE_ID]/detail.action?docID=6455456 https://learning.oreilly.com/library/view/~/9781484266649/?ar http://link.springer.com/10.1007/978-1-4842-6664-9 https://www.vlebooks.com/vleweb/product/openreader?id=none&isbn=9781484266649 |
| hasFullText | 1 |
| inHoldings | 1 |
| isFullTextHit | |
| isPrint | |
| linkProvider | Knovel |

