SWE015 Introduction to Large Language Models | Istinye University | Computer Engineering (English)

Course Introduction and Application Information

Course Code: SWE015
Course Name: Introduction to Large Language Models
Semester: Fall
Course Credits: 5 ECTS
Language of instruction: English
Course Condition:
Does the Course Require Work Experience?: No
Type of course: Departmental Elective
Course Level: Bachelor's Degree (TR-NQF-HE: Level 6, QF-EHEA: First Cycle, EQF-LLL: Level 6)
Mode of Delivery: E-Learning
Course Coordinator: Assist. Prof. Dr. MUHAMMED DAVUD
Course Lecturer(s): Assist. Prof. Dr. Alper Öner, Res. Assist. Yazım Beril Uluer
Course Assistants:

Course Objective and Content

Course Objectives: This course aims to equip students specializing in natural language processing (NLP) with the knowledge and skills needed to create, train, and develop large language models. It provides in-depth coverage of understanding complex language structures, performing language-based tasks, and using language models effectively in real-world applications.
Course Content: Docker, TensorFlow, Neural Networks, Convolutional Neural Networks, Introduction to Natural Language Processing, Transformers (BERT & NER), LLMs, LLM Fine-tuning, MLLMs, Computer Vision and Natural Language Processing Applications

Learning Outcomes

The students who have succeeded in this course will be able to:
1) Design a well-defined problem formulation for a basic LLM problem.
2) Develop optimized LLMs.
3) Apply software tools to solve LLM problems.
4) Solve basic image processing and language problems with transformer methods.
5) Carry out a transformer-based project as a team.

Course Flow Plan

Week Subject Related Preparation
1) Docker – Git
2) Neural Networks – TensorFlow
3) Convolutional Neural Network
4) Deep Learning for Computer Vision
5) Introduction to Natural Language Processing
6) Deep Learning for Natural Language Processing
7) Transformers: BERT & NER
8) Midterm Exam
9) Multilingual Named Entity Recognition with Transformers
10) Making Transformers Efficient in Production
11) Large Language Models (LLM)
12) Open Source Large Language Models (LLM)
13) Foundation Models
14) LLM Applications

Sources

Course Notes / Textbooks: Natural Language Processing with Transformers, Lewis Tunstall, Leandro von Werra, and Thomas Wolf, O'Reilly, 2022.
References: Deep Learning with Python, François Chollet, Manning, 2018.

Course - Program Learning Outcome Relationship

Course Learning Outcomes: 1, 2, 3, 4, 5
Program Outcomes:

Course - Learning Outcome Relationship

Contribution scale: No Effect, 1 (Lowest), 2 (Average), 3 (Highest)

Program Outcomes | Level of Contribution

Assessment & Grading

Semester Requirements      Number of Activities    Weight
Homework Assignments       2                       20%
Project                    2                       40%
Final                      1                       40%
Total                                              100%

PERCENTAGE OF SEMESTER WORK: 60%
PERCENTAGE OF FINAL WORK: 40%
TOTAL: 100%

Workload and ECTS Credit Calculation

Activities                 Number of Activities    Workload (hours)
Course Hours               14                      42
Presentations / Seminar    3                       25
Project                    3                       28
Homework Assignments       4                       20
Final                      2                       20
Total Workload                                     135