Get Hands-On With Generative AI On Coursera
Written by Sue Gee   
Friday, 07 July 2023

Learn the fundamentals of how generative AI works, and how to deploy it in real-world applications with a new short course on the Coursera platform from DeepLearning.ai and Amazon Web Services.


Opinion is divided about the impact of generative AI: some AI researchers fear it could have apocalyptic consequences, while the majority of developers who have had a chance to try it for themselves are favorably impressed by its ability to improve productivity.

Generative AI with Large Language Models provides the opportunity to make a practical exploration of the topic in a short amount of time. Structured over a three-week timetable, it is expected to take 16 hours and, as it is self-paced, it can be completed in a long weekend. 

Disclosure: When you make a purchase having followed a link from this article, we may earn an affiliate commission.

In his introductory video, Andrew Ng, Co-Founder of Coursera and Founder of DeepLearning.ai, explains: 

Large language models or LLMs are a very exciting technology. But despite all the buzz and hype, one of the things that is still underestimated by many people is their power as a developer tool. Specifically, there are many machine learning and AI applications that used to take me many months to build that you can now build in days or maybe even a small number of weeks. 

 

Ng's promise is that the course takes a deep technical dive into how LLMs work including going through many of the technical details, like model training, instruction tuning, fine-tuning and the generative AI project life cycle framework to help you plan and execute your projects.

Providing more information about the course, Antje Barth, one of its four AWS instructors says:

With this course on generative AI with large language models, we've created a series of lessons meant for AI enthusiasts, engineers, or data scientists looking to learn the technical foundations of how LLMs work, as well as the best practices behind training, tuning, and deploying them. 

In terms of prerequisites, we assume you are already familiar with Python programming and at least basic data science and machine learning concepts. If you have some experience with either Python or TensorFlow, that should be enough. 
 
Outlining the course schedule, Mike Chambers tells learners that in Week 1 they will examine the transformer architecture that powers large language models, explore how these models are trained, and understand the compute resources required to develop these powerful LLMs. They will also learn about a technique called in-context learning - how to guide the model's output at inference time with prompt engineering - and how to tune the most important generation parameters of LLMs to shape the model's output. Week 2 is where they explore options for adapting pre-trained models to specific tasks and datasets via a process called instruction fine-tuning, and in Week 3 learners will see how to align the output of language models with human values in order to increase helpfulness and decrease potential harm and toxicity. 
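To give a flavour of what "tuning generation parameters" means in practice, here is a minimal sketch of two of the most common ones, temperature and top-k, applied to a toy set of logits. This is an illustration of the general technique, not code from the course; the function name and toy values are our own.

```python
import math
import random

def sample_token(logits, temperature=1.0, top_k=None, rng=random):
    """Sample a token index from raw logits, illustrating two common
    generation parameters: temperature and top-k."""
    # Temperature rescales the logits: <1 sharpens, >1 flattens the distribution.
    scaled = [l / temperature for l in logits]
    # Top-k keeps only the k highest-scoring tokens before sampling.
    if top_k is not None:
        cutoff = sorted(scaled, reverse=True)[top_k - 1]
        scaled = [s if s >= cutoff else float("-inf") for s in scaled]
    # Softmax over the (possibly truncated) logits.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token index according to the resulting probabilities.
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]
```

With a very low temperature this approaches greedy decoding (always the highest-logit token), while higher temperatures make the output more varied - the trade-off the course's first lab asks learners to explore.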
 
Each week includes a hands-on lab which is where you can try out these techniques for yourself in an AWS environment that includes all the resources you need for working with large models. In the first, learners construct and compare different prompts and inputs for a given generative task, in this case, dialogue summarization. They also explore different inference parameters and sampling strategies to gain insight into how to improve the responses from the generative model. The second lab uses the popular open-source LLM from Hugging Face and introduces both full fine-tuning and parameter-efficient fine-tuning, PEFT for short, showing how PEFT makes your workflow much more efficient. In the third lab, you get hands-on with RLHF - reinforcement learning from human feedback - by building a reward model classifier to label model responses as either toxic or non-toxic. 
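To see why PEFT can be so much more efficient than full fine-tuning, consider LoRA, one popular PEFT method (not necessarily the exact one used in the lab). Instead of updating every weight, it trains two small low-rank matrices whose product is added to the frozen weights. The layer sizes below are toy values chosen for illustration:

```python
import numpy as np

def lora_update(W, A, B, alpha=16, r=4):
    """Apply a LoRA-style low-rank update to a frozen weight matrix W.

    Rather than fine-tuning all of W (d_out x d_in), LoRA trains two small
    matrices B (d_out x r) and A (r x d_in) and adds their product,
    scaled by alpha / r, to the frozen pre-trained weights.
    """
    return W + (alpha / r) * (B @ A)

# Toy sizes: a 512x512 layer adapted with rank r=4.
d_out, d_in, r = 512, 512, 4
rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))   # frozen pre-trained weights
B = np.zeros((d_out, r))                 # B starts at zero, as in LoRA
A = rng.standard_normal((r, d_in))

W_adapted = lora_update(W, A, B, alpha=16, r=r)

# Trainable parameters drop from d_out*d_in to r*(d_out + d_in).
full_params = d_out * d_in          # 262,144
lora_params = r * (d_out + d_in)    # 4,096
```

Because B is initialised to zero, the adapted layer starts out identical to the pre-trained one, and only the small A and B matrices need gradients - here roughly 1.5% of the full parameter count.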

More Information 

Generative AI with Large Language Models 

Related Articles

Machine Learning At All Levels On Coursera

Follow Google's Generative AI Learning Path

The Hugging Face NLP Course

Pain And Panic Over Rogue AI - We Need a Balance

Hinton Explains His New Fear of AI

Can DeepMind's Alpha Code Outperform Human Coders?

 

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or Linkedin.

 



Last Updated ( Friday, 14 July 2023 )