Transformer Models and BERT Model

Course Features

Duration

8 hours

Delivery Method

Online

Available on

Limited Access

Accessibility

Mobile, Desktop, Laptop

Language

English

Subtitles

English

Level

Beginner

Teaching Type

Self Paced

Video Content

8 hours

Course Description

This course introduces the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You will learn about the major components of the Transformer architecture, including the self-attention mechanism, and how they are used to build BERT. You will also learn about the tasks BERT can be applied to, including text classification, question answering, and natural language inference.
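To give a concrete sense of the self-attention mechanism mentioned above, here is a minimal NumPy sketch of scaled dot-product attention; the function name, shapes, and toy input are illustrative and are not taken from the course materials.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V -- the core of Transformer self-attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # token-to-token similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax -> attention weights
    return weights @ V                              # weighted sum of value vectors

# Toy example: a "sentence" of 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)         # self-attention: Q = K = V = x
print(out.shape)                                    # (3, 4)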

Course Overview

International Faculty

Post Course Interactions

Instructor-Moderated Discussions

Skills You Will Gain

What You Will Learn

Understand the main components of the Transformer architecture.

Learn how a BERT model is built using Transformers.

Use BERT to solve different natural language processing (NLP) tasks (see the sketch after this list).
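As a rough illustration of the last outcome, the sketch below assumes the Hugging Face transformers library (with a backend such as PyTorch) is installed and uses its pipeline API with the default sentiment-analysis checkpoint; this is not the course's own code or model, just one common way to run a BERT-style classifier on a text classification task.

from transformers import pipeline

# Assumption: the Hugging Face `transformers` library is installed. The default
# sentiment-analysis checkpoint is a DistilBERT model fine-tuned on SST-2,
# used here only to illustrate BERT-style text classification.
classifier = pipeline("sentiment-analysis")

print(classifier("Transformers make sequence modeling much easier."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]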

Target Students

This course is intended for anyone interested in learning about text classification, question answering, and natural language inference.