Book Information
· Title: Introduction to Transformers for NLP: With the Hugging Face Library and Models to Solve Problems (Paperback)
· Category: Foreign Books > Computers > Artificial Intelligence (AI)
· ISBN: 9781484288436
· Pages: 165
· Publication Date: 2022-10-21
Table of Contents
Chapter 1: Introduction to Language Models
Chapter Goal: History and introduction to language models
Sub-topics:
· What is a language model
· Evolution of language models, from n-grams to today's Transformer-based models
· High-level introduction to Google BERT
Chapter 2: Transformers
Chapter Goal: Introduction to Transformers and their architecture
Sub-topics:
· Introduction to Transformers
· Deep dive into the Transformer architecture and how attention plays a key role (see the sketch below)
· How Transformers handle tasks like sentiment analysis, Q&A, sentence masking, etc.
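As a taste of the attention mechanism this chapter dissects, here is a minimal sketch of scaled dot-product attention in Python with NumPy. The function name, shapes, and toy data are illustrative assumptions, not code from the book.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (sequence_length, d_k) query, key, and value matrices
    d_k = Q.shape[-1]
    # Compare every query with every key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the values
    return weights @ V

# Toy self-attention example: 3 tokens with 4-dimensional embeddings
x = np.random.rand(3, 4)
print(scaled_dot_product_attention(x, x, x).shape)  # (3, 4)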
Chapter 3: Intro to the Hugging Face Library
Chapter Goal: An introduction to the Hugging Face libraries and how they are used to accomplish NLP tasks
Sub-topics:
· What Hugging Face is, and how it emerged as a relevant library for various datasets and models related to NLP
· Creating simple Hugging Face applications for NLP tasks like sentiment analysis, sentence masking, etc. (see the example below)
· Playing around with the different models available in the ecosystem
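The pipeline API from the Hugging Face transformers library is the usual entry point for such applications. A minimal sketch (the library downloads a default pretrained model for each task on first use; the input sentences are made up for illustration):

from transformers import pipeline

# Sentiment analysis with the default pretrained model
classifier = pipeline("sentiment-analysis")
print(classifier("This book makes Transformers easy to follow."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Sentence masking (fill in the blank)
fill_mask = pipeline("fill-mask")
print(fill_mask("Transformers are a <mask> architecture for NLP."))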
Chapter 4: Code Generator
Chapter Goal: Cover an example of a code generator built on the Transformer architecture.
Sub-topics:
· Creating a simple code generator where the user input is natural-language text, such as "sort a given array of numbers"
· The generator takes the user's text and generates Python code or a YAML (Yet Another Markup Language) file, for example for Kubernetes, as sketched below
· Deploying the model on the cloud as a service on Kubernetes
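To illustrate the idea, a text-to-code generator can be sketched with a pretrained code model from the Hugging Face Hub. The model name below (Salesforce/codegen-350M-mono, a small Python code model) and the prompt are assumptions for illustration, not the book's own example.

from transformers import pipeline

# A small pretrained Python code model from the Hugging Face Hub
# (model choice is illustrative; any causal code model works similarly)
generator = pipeline("text-generation", model="Salesforce/codegen-350M-mono")

# Natural-language request expressed as a code comment prompt
prompt = "# Python function to sort a given array of numbers\ndef sort_numbers(arr):"
result = generator(prompt, max_new_tokens=64)
print(result[0]["generated_text"])

Deploying such a service on Kubernetes then amounts to wrapping the generator in a web server and packaging it as a container image.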
Chapter 5: Transformer Based Applications
Chapter Goal: Summary of the topics around Transformers, Hugging Face libraries, and their usage.
Sub-topics:
· Summary of Transformer-based applications and language models
· Summary of the Hugging Face libraries and how they are relevant in NLP