Transformers for Natural Language Processing and Computer Vision – Third Edition: Explore Generative AI and Large Language Models with Hugging Face, ChatGPT, GPT-4V, and DALL-E 3



Unleash the full potential of transformers with this comprehensive guide covering architecture, capabilities, risks, and practical implementations on OpenAI, Google Vertex AI, and Hugging Face

Purchase of the print or Kindle book includes a free eBook in PDF format

Key Features

- Master NLP and vision transformers, from the architecture to fine-tuning and implementation
- Learn how to apply Retrieval Augmented Generation (RAG) with LLMs using customized texts and embeddings
- Mitigate LLM risks, such as hallucinations, using moderation models and knowledge bases

Book Description

Transformers for Natural Language Processing and Computer Vision, Third Edition, explores Large Language Model (LLM) architectures, applications, and various platforms (Hugging Face, OpenAI, and Google Vertex AI) used for Natural Language Processing (NLP) and Computer Vision (CV).

The book guides you through different transformer architectures to the latest Foundation Models and Generative AI. You'll pretrain and fine-tune LLMs and work through different use cases, from summarization to implementing question-answering systems with embedding-based search techniques. You will also learn about the risks of LLMs, from hallucinations and memorization to privacy, and how to mitigate such risks using moderation models with rule and knowledge bases. You'll implement Retrieval Augmented Generation (RAG) with LLMs to improve the accuracy of your models and gain better control over LLM outputs.
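As a loose illustration of the embedding-based search idea mentioned above, the sketch below ranks candidate passages by cosine similarity to a query vector. The toy passages and three-dimensional vectors are invented for the example; in a real question-answering system the embeddings would come from an embedding model, not be hand-written.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings (in practice these come from an embedding model).
passages = {
    "Transformers use self-attention.": [0.9, 0.1, 0.0],
    "RAG retrieves documents before generation.": [0.1, 0.9, 0.2],
    "Tokenizers split text into subwords.": [0.0, 0.2, 0.9],
}
query_vec = [0.2, 0.8, 0.1]  # pretend embedding of a question about RAG

# Rank passages by similarity to the query; the top hit feeds the LLM.
ranked = sorted(passages, key=lambda p: cosine(passages[p], query_vec),
                reverse=True)
print(ranked[0])  # the passage most similar to the query
```

The same ranking step scales to thousands of passages; only the source of the vectors changes.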

Dive into generative vision transformers and multimodal model architectures and build applications, such as image and video-to-text classifiers. Go further by combining different models and platforms and learning about AI agent replication.

This book provides you with an understanding of transformer architectures, pretraining, fine-tuning, LLM use cases, and best practices.

What you'll learn

- Learn how to pretrain and fine-tune LLMs
- Learn how to work with multiple platforms, such as Hugging Face, OpenAI, and Google Vertex AI
- Learn about different tokenizers and the best practices for preprocessing language data
- Implement Retrieval Augmented Generation and rules bases to mitigate hallucinations
- Visualize transformer model activity for deeper insights using BertViz, LIME, and SHAP
- Create and implement cross-platform chained models, such as HuggingGPT
- Go in-depth into vision transformers with CLIP, DALL-E 2, DALL-E 3, and GPT-4V

Who this book is for

This book is ideal for NLP and CV engineers, software developers, data scientists, machine learning engineers, and technical leaders looking to advance their LLM and generative AI skills or explore the latest trends in the field.

Knowledge of Python and machine learning concepts is required to fully understand the use cases and code examples. However, with examples using LLM user interfaces, prompt engineering, and no-code model building, this book is great for anyone curious about the AI revolution.

Table of Contents

1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Emergent vs Downstream Tasks: The Unseen Depths of Transformers
4. Advancements in Translations with Google Trax, Google Translate, and Gemini
5. Diving into Fine-Tuning through BERT
6. Pretraining a Transformer from Scratch through RoBERTa
7. The Generative AI Revolution with ChatGPT
8. Fine-Tuning OpenAI GPT Models
9. Shattering the Black Box with Interpretable Tools
10. Investigating the Role of Tokenizers in Shaping Transformer Models

(N.B. Please use the Look Inside option to see further chapters)


From the Publisher


What's new in this third edition of Transformers for Natural Language Processing and Computer Vision?

There's a lot of new content in this edition, including the implementation of Retrieval Augmented Generation (RAG) with Large Language Models (LLMs) for question-answering tasks and reducing the risk of model hallucinations, giving you more control over your models. I showcase Google Vertex AI, PaLM 2, Llama 2, and HuggingGPT and discuss syntax-free semantic role labeling in the book, as well as dive deeper into the different types of tokenizers.

The biggest change from the previous edition is the addition of several computer vision multimodal model chapters. Transformers such as OpenAI GPT-4V are multimodal. As such, I found it essential to include Computer Vision. The latest vision transformers have taken us into a world of creativity with models such as DALL-E 3 and Stable Diffusion.


How does the book prepare readers for a career working with Generative AI and Large Language Models?

From page 1, I take the reader into pragmatic approaches to Generative AI and LLMs. The goal is to understand the architecture, potential, and limits of several platforms, such as Hugging Face, Google Vertex AI, and OpenAI, before making a decision. It's also essential to understand the risks of using LLMs. There is a chance of losing control of superhuman AI. This book addresses several ways of mitigating risks, such as RAG, embedding-search, knowledge bases, and moderation models.

Moreover, working in this fast-paced, ever-evolving arena means finding the best process, model, and platform for your situation. The best way is to follow the right order with a baseline set of tasks. First, try a standard model and go as far as possible with prompt design. Then, try prompt engineering by controlling the input with augmented (RAG) inputs (documents, web scraping, and knowledge bases). The next step would be to fine-tune (or build) a model and explore it with prompt engineering and RAG.
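The "augmented inputs" step of that process can be sketched as follows. This is a minimal illustration, not code from the book: the keyword-overlap retriever stands in for a real embedding search, and the knowledge-base sentences and prompt template are invented for the example.

```python
def retrieve(question, knowledge_base, top_k=2):
    # Naive keyword-overlap scoring, standing in for embedding-based search.
    q_words = set(question.lower().split())
    def score(doc):
        return len(q_words & set(doc.lower().split()))
    return sorted(knowledge_base, key=score, reverse=True)[:top_k]

def build_rag_prompt(question, knowledge_base):
    # Augment the model input with retrieved context to ground the answer.
    context = "\n".join(retrieve(question, knowledge_base))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Invented knowledge base for the sketch.
kb = [
    "The third edition adds RAG and computer vision chapters.",
    "Packt publishes technical books.",
    "Moderation models help filter unsafe outputs.",
]
prompt = build_rag_prompt("What does the third edition add?", kb)
print(prompt)
```

The resulting prompt would then be sent to whichever model and platform the earlier baseline step selected; only the prompt-construction stage changes between plain prompt design and RAG.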



Compare with the Second Edition

Customer Reviews: 5.0 out of 5 stars, 9 ratings (Third Edition); 4.2 out of 5 stars, 117 ratings (Second Edition)

Price: $41.80 (Third Edition); $44.99 (Second Edition)

Primary Platforms and Libraries: OpenAI, Hugging Face, and Google Vertex AI (Third Edition); OpenAI, Hugging Face, and AllenNLP (Second Edition)

Highlights: RAG with GPT-4, embeddings-based search, and testing Stable Diffusion with divergent association tasks (Third Edition); analyzing fake news, computer vision transformers, and Industry 4.0 (Second Edition)

New Topics: RAG, Llama 2, PaLM 2, and deeper coverage of computer vision, tokenization, embeddings, and prompts (Third Edition); fine-tuning and tasks using the OpenAI API, with insights into computer vision and prompt engineering (Second Edition)

Publisher: Packt Publishing; 3rd ed. edition (February 29, 2024)
Language: English
Paperback: 728 pages
ISBN-10: 1805128728
ISBN-13: 978-1805128724
Item Weight: 2.75 pounds
Dimensions: 9.25 x 7.52 x 1.45 inches

