AI LLM NLP PyTorch GPT iThome


Download AI LLM NLP PyTorch GPT iThome PDF/ePub or read online books in Mobi eBooks. Click the Download or Read Online button to get the AI LLM NLP PyTorch GPT iThome book now. This website allows unlimited access to, at the time of writing, more than 1.5 million titles, including hundreds of thousands of titles in various foreign languages.


全面掌握生成式AI與LLM開發實務:NLP×PyTorch×GPT輕鬆打造專屬的大型語言模型(iThome鐵人賽系列書) (Mastering Generative AI and LLM Development in Practice: Easily Build Your Own Large Language Model with NLP × PyTorch × GPT; an iThome Ironman Contest series book)



Author: 黃朝隆

Language: zh-TW

Publisher: 博碩文化

Release Date: 2024-10-18







Master cutting-edge AI technology and become a leader in the AI era! From theory to practice, an essential guide to natural language processing. Master AI × NLP and quickly stand out!

【About This Book】
♔ Deep learning essentials: understand AI and NLP theory, from beginner to expert
♔ Hands-on case studies: rich code examples that build practical skills
♔ Model optimization tips: master the latest AI techniques to improve model performance
♔ Comprehensive practical training: both beginners and professionals can refine their AI project code

This book is adapted from "Become an NLP Master in 30 Days: Mastering the Key Tools and Techniques", a merit-award series in the AI & Data category of the 15th iThome Ironman Contest. From foundational theory to practical application, it traces the development of natural language processing and its related techniques, explains step by step the mathematics behind AI, such as linear algebra, matrix multiplication, and probability, and applies these theories within deep learning models.

The book also covers the practical steps for building, training, and optimizing NLP models, and introduces today's most popular model architectures, such as Transformer, BERT, GPT, and LLaMA, guiding readers through fine-tuning these models in real applications for the best results. For readers hoping to enter AI competitions or sharpen their programming skills, the book provides plenty of examples and code to help them understand and master these techniques.

【Features】
✪ Understand how artificial intelligence actually works and how computers understand text data.
✪ A complete introduction to the key developments and recent advances in natural language processing, so you can get up to speed quickly.
✪ A thorough understanding of large language models and their evaluation metrics.
✪ Model optimization techniques that can earn you a strong ranking in competitions.
✪ Develop an engineer's coding style and cultivate a self-learning mindset.

【Target Readers】
✪ Beginners and enthusiasts passionate about artificial intelligence.
✪ Programmers and developers who want to learn quickly and accumulate AI project experience.
✪ Learners strong in mathematical theory but with limited programming experience who want to enter the AI field.
✪ Contestants preparing for AI competitions.
✪ Technical professionals who want to follow the latest developments in NLP.

【Endorsements】
"For many people, AI technology is at once familiar and remote, and its rapid development fuels information anxiety. Against this backdrop, I recommend this book on generative AI and large language model (LLM) development practice because it offers a learning resource of great depth and breadth. Whether you are a newcomer to artificial intelligence or a professional seeking a deeper understanding of generative AI, you will gain both inspiration and practical experience from it. The book's strength lies in its progressively layered structure, which weaves complex theory together with hands-on practice, letting readers gradually build a solid body of AI knowledge." ──── 李俊宏, Professor, Department of Electrical Engineering (Communications Group), National Kaohsiung University of Science and Technology

"This book does not simply teach how to use particular tools; through detailed examples and step-by-step instructions, it helps readers understand the logic behind them. It offers abundant hands-on guidance, helping readers not only build models but also dig into the details of model optimization, which is especially important for anyone learning to tune hyperparameters to improve model performance. It is a book for readers at any stage: whether you are a student just starting out or a developer with some experience, you will find room to grow here." ──── 吳宇祈, Master's student, Department of Electrical Engineering, National Cheng Kung University
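The description above mentions applying linear algebra, matrix multiplication, and probability inside deep learning models such as the Transformer. As a rough illustration of what that means (a plain-Python sketch, not code from the book, which works in PyTorch), the Transformer's core scaled dot-product attention combines exactly those three ingredients:

```python
import math

def matmul(a, b):
    # Plain-Python matrix multiplication: (n x k) @ (k x m) -> (n x m).
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def softmax(row):
    # Turn raw scores into a probability distribution (numerically stable).
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    d_k = len(k[0])
    k_t = [list(col) for col in zip(*k)]  # transpose K
    scores = matmul(q, k_t)
    weights = [softmax([s / math.sqrt(d_k) for s in row]) for row in scores]
    return matmul(weights, v)
```

Each output row is a probability-weighted mixture of the rows of V, so every value stays between the smallest and largest entries of the corresponding V column.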

Building Evolutionary Architectures



Author: Neal Ford

Language: en

Publisher: "O'Reilly Media, Inc."

Release Date: 2017-09-18







The software development ecosystem is in constant flux, delivering a steady stream of new tools, frameworks, techniques, and paradigms. Over the past few years, incremental developments in core engineering practices for software development have created the foundations for rethinking how architecture changes over time, along with ways to protect important architectural characteristics as it evolves. This practical guide ties those parts together with a new way to think about architecture and time.
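One mechanism this book is known for using to "protect important architectural characteristics" is the automated architectural fitness function: a test that fails when a structural rule is broken. As a minimal sketch (the layer names and the `layering_violations` helper are invented here for illustration, not taken from the book), such a check might look like:

```python
def layering_violations(imports, forbidden):
    """Find dependency-rule violations.

    imports:   {module_name: [names of modules it imports]}
    forbidden: {layer_prefix: prefix that layer must not import}
    Returns a list of (module, bad_import) pairs.
    """
    bad = []
    for module, deps in imports.items():
        for layer, banned in forbidden.items():
            if module.startswith(layer):
                # Flag every import that reaches into the banned layer.
                bad.extend((module, d) for d in deps if d.startswith(banned))
    return bad
```

Run as part of the build (for example, asserting the result is empty in a unit test), this turns the rule "domain code must not depend on infrastructure code" into an executable guardrail that keeps holding as the architecture evolves.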

Transformers for Natural Language Processing



Author: Denis Rothman

Language: en

Publisher: Packt Publishing Ltd

Release Date: 2021-01-29







Publisher's Note: A new edition of this book is out now that includes working with GPT-3 and comparing the results with other models. It covers even more use cases, such as causal language analysis and computer vision tasks, as well as an introduction to OpenAI's Codex.

Key Features
- Build and implement state-of-the-art language models, such as the original Transformer, BERT, T5, and GPT-2, using concepts that outperform classical deep learning models
- Go through hands-on applications in Python using Google Colaboratory notebooks, with nothing to install on a local machine
- Test transformer models on advanced use cases

Book Description
The transformer architecture has proved to be revolutionary, outperforming the classical RNN and CNN models in use today. With an apply-as-you-learn approach, Transformers for Natural Language Processing investigates in detail deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. The book takes you through NLP with Python and examines various eminent models and datasets within the transformer architecture created by pioneers such as Google, Facebook, Microsoft, OpenAI, and Hugging Face. The book trains you in three stages. The first stage introduces you to transformer architectures, starting with the original Transformer, before moving on to RoBERTa, BERT, and DistilBERT models. You will discover training methods for smaller transformers that can outperform GPT-3 in some cases. In the second stage, you will apply transformers to Natural Language Understanding (NLU) and Natural Language Generation (NLG). Finally, the third stage will help you grasp advanced language understanding techniques such as optimizing social network datasets and fake news identification.

By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models by tech giants to various datasets.

What You Will Learn
- Use the latest pretrained transformer models
- Grasp the workings of the original Transformer, GPT-2, BERT, T5, and other transformer models
- Create language understanding Python programs using concepts that outperform classical deep learning models
- Use a variety of NLP platforms, including Hugging Face, Trax, and AllenNLP
- Apply Python, TensorFlow, and Keras programs to sentiment analysis, text summarization, speech recognition, machine translation, and more
- Measure the productivity of key transformers to define their scope, potential, and limits in production

Who This Book Is For
Since the book does not teach basic programming, you must be familiar with neural networks, Python, PyTorch, and TensorFlow in order to learn their implementation with transformers. Readers who can benefit the most from this book include experienced deep learning and NLP practitioners, and data analysts and data scientists who want to process the increasing amounts of language-driven data.