<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://eduwiki.innopolis.university/index.php?action=history&amp;feed=atom&amp;title=BSc%3A_Introduction_To_Machine_Learning.previous_version</id>
	<title>BSc: Introduction To Machine Learning.previous version - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://eduwiki.innopolis.university/index.php?action=history&amp;feed=atom&amp;title=BSc%3A_Introduction_To_Machine_Learning.previous_version"/>
	<link rel="alternate" type="text/html" href="https://eduwiki.innopolis.university/index.php?title=BSc:_Introduction_To_Machine_Learning.previous_version&amp;action=history"/>
	<updated>2026-05-07T17:31:34Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.36.1</generator>
	<entry>
		<id>https://eduwiki.innopolis.university/index.php?title=BSc:_Introduction_To_Machine_Learning.previous_version&amp;diff=6981&amp;oldid=prev</id>
		<title>M.petrishchev: Created page with &quot;= Introduction to Machine Learning =  * &lt;span&gt;'''Course name:'''&lt;/span&gt; Introduction to Machine Learning * &lt;span&gt;'''Course number:'''&lt;/span&gt; R-01  == Course characteristics ==...&quot;</title>
		<link rel="alternate" type="text/html" href="https://eduwiki.innopolis.university/index.php?title=BSc:_Introduction_To_Machine_Learning.previous_version&amp;diff=6981&amp;oldid=prev"/>
		<updated>2022-06-28T10:53:50Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;= Introduction to Machine Learning =  * &amp;lt;span&amp;gt;&amp;#039;&amp;#039;&amp;#039;Course name:&amp;#039;&amp;#039;&amp;#039;&amp;lt;/span&amp;gt; Introduction to Machine Learning * &amp;lt;span&amp;gt;&amp;#039;&amp;#039;&amp;#039;Course number:&amp;#039;&amp;#039;&amp;#039;&amp;lt;/span&amp;gt; R-01  == Course characteristics ==...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;= Introduction to Machine Learning =&lt;br /&gt;
&lt;br /&gt;
* &amp;lt;span&amp;gt;'''Course name:'''&amp;lt;/span&amp;gt; Introduction to Machine Learning&lt;br /&gt;
* &amp;lt;span&amp;gt;'''Course number:'''&amp;lt;/span&amp;gt; R-01&lt;br /&gt;
&lt;br /&gt;
== Course characteristics ==&lt;br /&gt;
&lt;br /&gt;
=== Key concepts of the class ===&lt;br /&gt;
&lt;br /&gt;
* Machine learning paradigms&lt;br /&gt;
* Machine learning approaches and algorithms&lt;br /&gt;
&lt;br /&gt;
=== What is the purpose of this course? ===&lt;br /&gt;
&lt;br /&gt;
There is a growing business need for individuals skilled in artificial intelligence, data analytics, and machine learning. The purpose of this course is therefore to provide students with an intensive treatment of a cross-section of the key elements of machine learning, with an emphasis on implementing them in modern programming environments and using them to solve real-world data science problems.&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
Students will benefit from prior familiarity with certain topics in mathematics and programming.&lt;br /&gt;
&lt;br /&gt;
Maths: &lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Analytic_Geometry_And_Linear_Algebra_I1 CSE202] — Analytical Geometry and Linear Algebra I&lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Analytic_Geometry_And_Linear_Algebra_II CSE204] — Analytical Geometry and Linear Algebra II&lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Mathematical_Analysis_I CSE201] — Mathematical Analysis I&lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Mathematical_Analysis_II CSE203] — Mathematical Analysis II&lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Probability_And_Statistics CSE206] — Probability And Statistics&lt;br /&gt;
&lt;br /&gt;
Programming: &lt;br /&gt;
* [https://eduwiki.innopolis.university/index.php/BSc:_Data_Structures_Algorithms CSE117] — Data Structures and Algorithms: python, numpy, basic object-oriented concepts, memory management.&lt;br /&gt;
&lt;br /&gt;
For a more concrete identification of subtopics, please see chapters 2, 3, and 4 of (1), which lists and describes the mathematical subtopics most important for machine learning students. In addition, students are strongly advised to gain a basic understanding of descriptive statistics and data distributions, statistical hypothesis testing, and data sampling, resampling, and experimental design techniques.&lt;br /&gt;
&lt;br /&gt;
[https://www.deeplearningbook.org/ (1) Ian Goodfellow, Yoshua Bengio, &amp;amp; Aaron Courville (2016). Deep Learning. MIT Press].&lt;br /&gt;
&lt;br /&gt;
== Course Objectives Based on Bloom’s Taxonomy ==&lt;br /&gt;
&lt;br /&gt;
=== What should a student remember at the end of the course? ===&lt;br /&gt;
&lt;br /&gt;
By the end of the course, the students should be able to recognize and define&lt;br /&gt;
&lt;br /&gt;
* Different learning paradigms&lt;br /&gt;
* A wide variety of learning approaches and algorithms&lt;br /&gt;
* Various learning settings&lt;br /&gt;
* Performance metrics&lt;br /&gt;
* Popular machine learning software tools&lt;br /&gt;
&lt;br /&gt;
=== What should a student be able to understand at the end of the course? ===&lt;br /&gt;
&lt;br /&gt;
By the end of the course, the students should be able to describe and explain (with examples)&lt;br /&gt;
&lt;br /&gt;
* Difference between different learning paradigms&lt;br /&gt;
* Difference between classification and regression&lt;br /&gt;
* Concept of learning theory (bias/variance tradeoffs and large margins etc.)&lt;br /&gt;
* Kernel methods&lt;br /&gt;
* Regularization&lt;br /&gt;
* Ensemble Learning&lt;br /&gt;
* Neural or Deep Learning&lt;br /&gt;
&lt;br /&gt;
=== What should a student be able to apply at the end of the course? ===&lt;br /&gt;
&lt;br /&gt;
By the end of the course, the students should be able to apply&lt;br /&gt;
&lt;br /&gt;
* Classification approaches to solve supervised learning problems&lt;br /&gt;
* Clustering approaches to solve unsupervised learning problems&lt;br /&gt;
* Ensemble learning to improve a model’s performance&lt;br /&gt;
* Regularization to improve a model’s generalization&lt;br /&gt;
* Deep learning algorithms to solve real-world problems&lt;br /&gt;
&lt;br /&gt;
=== Course evaluation ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Course grade breakdown&lt;br /&gt;
! '''Component'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Default points'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Proposed points'''&lt;br /&gt;
|-&lt;br /&gt;
| Labs/seminar classes&lt;br /&gt;
| 20&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Interim performance assessment&lt;br /&gt;
| 30&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 40&lt;br /&gt;
|-&lt;br /&gt;
| Exams&lt;br /&gt;
| 50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 60&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
If necessary, please indicate freely your course’s features in terms of students’ performance assessment: None&lt;br /&gt;
&lt;br /&gt;
=== Grades range ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Course grading range&lt;br /&gt;
! '''Grade'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Default range'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Proposed range'''&lt;br /&gt;
|-&lt;br /&gt;
| A. Excellent&lt;br /&gt;
| 90-100&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|&lt;br /&gt;
|-&lt;br /&gt;
| B. Good&lt;br /&gt;
| 75-89&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|&lt;br /&gt;
|-&lt;br /&gt;
| C. Satisfactory&lt;br /&gt;
| 60-74&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|&lt;br /&gt;
|-&lt;br /&gt;
| D. Poor&lt;br /&gt;
| 0-59&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
If necessary, please indicate freely your course’s grading features: The semester starts with the default range as proposed in the Table [[#tab:MLCourseGradingRange|[tab:MLCourseGradingRange]]], but it may change slightly (usually reduced) depending on how the semester progresses.&lt;br /&gt;
&lt;br /&gt;
=== Resources and reference material ===&lt;br /&gt;
&lt;br /&gt;
* T. Hastie, R. Tibshirani, D. Witten and G. James. ''&amp;lt;span&amp;gt;An Introduction to Statistical Learning. Springer 2013.&amp;lt;/span&amp;gt;''&lt;br /&gt;
* T. Hastie, R. Tibshirani, and J. Friedman. ''&amp;lt;span&amp;gt;The Elements of Statistical Learning. Springer 2011.&amp;lt;/span&amp;gt;''&lt;br /&gt;
* Tom M Mitchel. &amp;lt;span&amp;gt;''Machine Learning, McGraw Hill''&amp;lt;/span&amp;gt;&lt;br /&gt;
* Christopher M. Bishop. &amp;lt;span&amp;gt;''Pattern Recognition and Machine Learning, Springer''&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Course Sections ==&lt;br /&gt;
&lt;br /&gt;
The main sections of the course and the approximate distribution of teaching hours between them are as follows:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Course Sections&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Section'''&lt;br /&gt;
! '''Section Title'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Teaching Hours'''&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
| Supervised Learning&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 24&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 2&lt;br /&gt;
| Decision Trees and Ensemble Learning&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 8&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 3&lt;br /&gt;
| Unsupervised Learning&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 8&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 4&lt;br /&gt;
| Deep Learning&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 12&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Section 1 ===&lt;br /&gt;
&lt;br /&gt;
=== Section title: ===&lt;br /&gt;
&lt;br /&gt;
Supervised Learning&lt;br /&gt;
&lt;br /&gt;
=== Topics covered in this section: ===&lt;br /&gt;
&lt;br /&gt;
* Introduction to Machine Learning&lt;br /&gt;
* Derivatives and Cost Function&lt;br /&gt;
* Data Pre-processing&lt;br /&gt;
* Linear Regression&lt;br /&gt;
* Multiple Linear Regression&lt;br /&gt;
* Gradient Descent&lt;br /&gt;
* Polynomial Regression&lt;br /&gt;
* Bias-variance Tradeoff&lt;br /&gt;
* Difference between classification and regression&lt;br /&gt;
* Logistic Regression&lt;br /&gt;
* Naive Bayes&lt;br /&gt;
* KNN&lt;br /&gt;
* Confusion Matrix&lt;br /&gt;
* Performance Metrics&lt;br /&gt;
* Regularization&lt;br /&gt;
* Hyperplane Based Classification&lt;br /&gt;
* Perceptron Learning Algorithm&lt;br /&gt;
* Max-Margin Classification&lt;br /&gt;
* Support Vector Machines&lt;br /&gt;
* Slack Variables&lt;br /&gt;
* Lagrangian Support Vector Machines&lt;br /&gt;
* Kernel Trick&lt;br /&gt;
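&lt;br /&gt;
Several of the topics above (cost functions, linear regression, gradient descent) can be sketched in a few lines of numpy. The following is a minimal, hypothetical illustration (not part of the official course material) that fits a simple linear model to toy data by gradient descent on the mean squared error:&lt;br /&gt;

```python
import numpy as np

# Toy data (hypothetical): points lying exactly on y = 2x + 1.
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * X + 1.0

w, b = 0.0, 0.0  # parameters of the model y_hat = w*x + b
lr = 0.05        # learning rate

for _ in range(2000):
    err = w * X + b - y
    # Gradients of the cost (1/n) * sum((w*x + b - y)^2)
    w -= lr * 2.0 * np.mean(err * X)
    b -= lr * 2.0 * np.mean(err)

print(round(w, 3), round(b, 3))  # approaches w = 2, b = 1
```

The same loop generalizes to multiple linear regression by replacing `w * X` with a matrix-vector product.&lt;br /&gt;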
&lt;br /&gt;
=== What forms of evaluation were used to test students’ performance in this section? ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Forms of evaluation&lt;br /&gt;
! '''Form'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Yes/No'''&lt;br /&gt;
|-&lt;br /&gt;
| Development of individual parts of software product code&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Homework and group projects&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Midterm evaluation&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Testing (written or computer based)&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Reports&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Essays&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Oral polls&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Discussions&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Typical questions for ongoing performance evaluation within this section ===&lt;br /&gt;
&lt;br /&gt;
# Is it true that in simple linear regression &amp;lt;math display=&amp;quot;inline&amp;quot;&amp;gt;R^2&amp;lt;/math&amp;gt; and the squared correlation between X and Y are identical?&lt;br /&gt;
# What are the two assumptions that the Linear regression model makes about the '''Error Terms'''?&lt;br /&gt;
# Fit a regression model to a given data problem, and support your choice of the model.&lt;br /&gt;
# In a list of given tasks, choose which are regression and which are classification tasks.&lt;br /&gt;
# In a given graphical model of binary random variables, how many parameters are needed to define the Conditional Probability Distributions for this Bayes Net?&lt;br /&gt;
# Write the mathematical form of the minimization objective of Rosenblatt’s perceptron learning algorithm for a two-dimensional case.&lt;br /&gt;
# What is the perceptron learning algorithm?&lt;br /&gt;
# What is a max-margin classifier?&lt;br /&gt;
# Explain the role of slack variables in SVM.&lt;br /&gt;
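&lt;br /&gt;
Question 6 above refers to Rosenblatt’s update rule. As a sketch (toy, hypothetical data; labels in {-1, +1}), the algorithm adds y&lt;sub&gt;i&lt;/sub&gt;x&lt;sub&gt;i&lt;/sub&gt; to the weights whenever a point is misclassified, and stops once the training data is separated:&lt;br /&gt;

```python
import numpy as np

# Toy linearly separable data (hypothetical); labels must be in {-1, +1}.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])

w, b = np.zeros(2), 0.0
for _ in range(100):                    # epochs
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:      # misclassified (or on the boundary)
            w, b = w + yi * xi, b + yi  # Rosenblatt's update
            mistakes += 1
    if mistakes == 0:                   # converged: every point is correct
        break

print(all(yi * (w @ xi + b) > 0 for xi, yi in zip(X, y)))  # True
```

Convergence is guaranteed only for linearly separable data, which is one motivation for the max-margin and slack-variable formulations that follow.&lt;br /&gt;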
&lt;br /&gt;
=== Typical questions for seminar classes (labs) within this section ===&lt;br /&gt;
&lt;br /&gt;
# Implement various regression models to solve different regression problems.&lt;br /&gt;
# Describe the differences between different types of regression models, their pros and cons, etc.&lt;br /&gt;
# Implement various classification models to solve different classification problems.&lt;br /&gt;
# Describe the difference between logistic regression and Naive Bayes.&lt;br /&gt;
# Implement the perceptron learning algorithm, SVMs, and their variants to solve different classification problems.&lt;br /&gt;
# Solve a given optimization problem using the Lagrange multiplier method.&lt;br /&gt;
&lt;br /&gt;
=== Test questions for final assessment in this section ===&lt;br /&gt;
&lt;br /&gt;
# What does it mean for the standard least squares coefficient estimates of linear regression to be ''scale equivariant''?&lt;br /&gt;
# Given a fitted regression model to a dataset, interpret its coefficients.&lt;br /&gt;
# Explain which regression model would be a better fit to model the relationship between response and predictor in a given data.&lt;br /&gt;
# If the number of training examples goes to infinity, how will it affect the bias and variance of a classification model?&lt;br /&gt;
# Given a two-dimensional classification problem, determine whether a linear boundary can be estimated using logistic regression and regularization.&lt;br /&gt;
# Explain which classification model would be a better fit for a given classification problem.&lt;br /&gt;
# Consider the leave-one-out CV error of a standard two-class SVM. Argue that, under a given value of the slack variable, a given mathematical statement is either correct or incorrect.&lt;br /&gt;
# How does the choice of the slack variable affect the bias-variance tradeoff in SVM?&lt;br /&gt;
# Explain which kernel would be a better fit to be used in SVM for a given dataset.&lt;br /&gt;
&lt;br /&gt;
=== Section 2 ===&lt;br /&gt;
&lt;br /&gt;
=== Section title: ===&lt;br /&gt;
&lt;br /&gt;
Decision Trees and Ensemble Methods&lt;br /&gt;
&lt;br /&gt;
=== Topics covered in this section: ===&lt;br /&gt;
&lt;br /&gt;
* Decision Trees&lt;br /&gt;
* Bagging&lt;br /&gt;
* Boosting&lt;br /&gt;
* Random Forest&lt;br /&gt;
* Adaboost&lt;br /&gt;
&lt;br /&gt;
=== What forms of evaluation were used to test students’ performance in this section? ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Forms of evaluation&lt;br /&gt;
! '''Form'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Yes/No'''&lt;br /&gt;
|-&lt;br /&gt;
| Development of individual parts of software product code&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Homework and group projects&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Midterm evaluation&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Testing (written or computer based)&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Reports&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Essays&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Oral polls&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Discussions&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Typical questions for ongoing performance evaluation within this section ===&lt;br /&gt;
&lt;br /&gt;
# What are pros and cons of decision trees over other classification models?&lt;br /&gt;
# Explain how tree-pruning works.&lt;br /&gt;
# What is the purpose of ensemble learning?&lt;br /&gt;
# What is a bootstrap, and what is its role in Ensemble learning?&lt;br /&gt;
&lt;br /&gt;
=== Typical questions for seminar classes (labs) within this section ===&lt;br /&gt;
&lt;br /&gt;
# Implement different variants of decision trees to solve different classification problems.&lt;br /&gt;
# Solve a given classification problem using an ensemble classifier.&lt;br /&gt;
# Implement Adaboost for a given problem.&lt;br /&gt;
&lt;br /&gt;
=== Test questions for final assessment in this section ===&lt;br /&gt;
&lt;br /&gt;
# When a decision tree is grown to full depth, how does it affect the tree’s bias and variance, and its response to noisy data?&lt;br /&gt;
# Argue if an ensemble model would be a better choice for a given classification problem or not.&lt;br /&gt;
# Given a particular iteration of boosting and other important information, calculate the weights of the Adaboost classifier.&lt;br /&gt;
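&lt;br /&gt;
For the Adaboost question above, the weight of each weak learner follows directly from its weighted error rate. A small sketch of the standard formula (the 30% error value is a made-up example):&lt;br /&gt;

```python
import math

def adaboost_alpha(err):
    """Weight of a weak learner with weighted error rate `err` (0 < err < 1)."""
    return 0.5 * math.log((1.0 - err) / err)

# A learner with 30% weighted error gets a positive weight ...
print(round(adaboost_alpha(0.3), 4))  # ~0.4236
# ... while a learner no better than chance gets weight 0.
print(adaboost_alpha(0.5))            # 0.0
```

In the full algorithm, sample weights are then multiplied by exp(-alpha y&lt;sub&gt;i&lt;/sub&gt;h(x&lt;sub&gt;i&lt;/sub&gt;)) and renormalized before the next weak learner is trained.&lt;br /&gt;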
&lt;br /&gt;
=== Section 3 ===&lt;br /&gt;
&lt;br /&gt;
=== Section title: ===&lt;br /&gt;
&lt;br /&gt;
Unsupervised Learning&lt;br /&gt;
&lt;br /&gt;
=== Topics covered in this section: ===&lt;br /&gt;
&lt;br /&gt;
* K-means Clustering&lt;br /&gt;
* K-means++&lt;br /&gt;
* Hierarchical Clustering&lt;br /&gt;
* DBSCAN&lt;br /&gt;
* Mean-shift&lt;br /&gt;
&lt;br /&gt;
=== What forms of evaluation were used to test students’ performance in this section? ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Forms of evaluation&lt;br /&gt;
! '''Form'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Yes/No'''&lt;br /&gt;
|-&lt;br /&gt;
| Development of individual parts of software product code&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Homework and group projects&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Midterm evaluation&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Testing (written or computer based)&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Reports&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Essays&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Oral polls&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Discussions&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Typical questions for ongoing performance evaluation within this section ===&lt;br /&gt;
&lt;br /&gt;
# Which implicit or explicit objective function does K-means implement?&lt;br /&gt;
# Explain the difference between k-means and k-means++.&lt;br /&gt;
# What is single-linkage, and what are its pros and cons?&lt;br /&gt;
# Explain how DBSCAN works.&lt;br /&gt;
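&lt;br /&gt;
The first question above, on the objective K-means implicitly minimizes, can be made concrete: Lloyd’s algorithm monotonically decreases the within-cluster sum of squared distances (often called inertia). A toy sketch with hypothetical data:&lt;br /&gt;

```python
import numpy as np

# Hypothetical toy data: two well-separated groups of two points each.
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
centers = X[[0, 2]].copy()  # naive initialization (k-means++ would spread these)

for _ in range(10):
    # Assignment step: each point joins its nearest center.
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each center moves to the mean of its assigned points.
    centers = np.array([X[labels == k].mean(axis=0) for k in range(2)])

# The quantity K-means implicitly minimizes: within-cluster sum of squares.
inertia = sum(np.sum((X[labels == k] - centers[k]) ** 2) for k in range(2))
print(labels.tolist(), round(inertia, 2))  # [0, 0, 1, 1] 1.0
```

K-means++ differs only in the initialization line: it picks initial centers with probability proportional to the squared distance from the centers chosen so far.&lt;br /&gt;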
&lt;br /&gt;
=== Typical questions for seminar classes (labs) within this section ===&lt;br /&gt;
&lt;br /&gt;
# Implement different clustering algorithms to solve different clustering problems.&lt;br /&gt;
# Implement Mean-shift for video tracking.&lt;br /&gt;
&lt;br /&gt;
=== Test questions for final assessment in this section ===&lt;br /&gt;
&lt;br /&gt;
# K-Means does not explicitly use a fitness function. What are the characteristics of the solutions that K-Means finds? Which fitness function does it implicitly minimize?&lt;br /&gt;
# Suppose we clustered a set of N data points using two different specified clustering algorithms. In both cases we obtained 5 clusters and in both cases the centers of the clusters are exactly the same. Can 3 points that are assigned to different clusters in one method be assigned to the same cluster in the other method?&lt;br /&gt;
# What are the characteristics of noise points in DBSCAN?&lt;br /&gt;
&lt;br /&gt;
=== Section 4 ===&lt;br /&gt;
&lt;br /&gt;
=== Section title: ===&lt;br /&gt;
&lt;br /&gt;
Deep Learning&lt;br /&gt;
&lt;br /&gt;
=== Topics covered in this section: ===&lt;br /&gt;
&lt;br /&gt;
* Artificial Neural Networks&lt;br /&gt;
* Back-propagation&lt;br /&gt;
* Convolutional Neural Networks&lt;br /&gt;
* Autoencoder&lt;br /&gt;
* Variational Autoencoder&lt;br /&gt;
* Generative Adversarial Networks&lt;br /&gt;
&lt;br /&gt;
=== What forms of evaluation were used to test students’ performance in this section? ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|+ Forms of evaluation&lt;br /&gt;
! '''Form'''&lt;br /&gt;
!align=&amp;quot;center&amp;quot;| '''Yes/No'''&lt;br /&gt;
|-&lt;br /&gt;
| Development of individual parts of software product code&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Homework and group projects&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Midterm evaluation&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Testing (written or computer based)&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|-&lt;br /&gt;
| Reports&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Essays&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Oral polls&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 0&lt;br /&gt;
|-&lt;br /&gt;
| Discussions&lt;br /&gt;
|align=&amp;quot;center&amp;quot;| 1&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Typical questions for ongoing performance evaluation within this section ===&lt;br /&gt;
&lt;br /&gt;
# What is a fully connected feed-forward ANN?&lt;br /&gt;
# Explain different hyperparameters of CNNs.&lt;br /&gt;
# Calculate KL-divergence between two probability distributions.&lt;br /&gt;
# What is a generative model and how is it different from a discriminative model?&lt;br /&gt;
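&lt;br /&gt;
For the KL-divergence question above, the discrete case is a one-line formula, D(P || Q) = &amp;#931;&lt;sub&gt;i&lt;/sub&gt; p&lt;sub&gt;i&lt;/sub&gt; log(p&lt;sub&gt;i&lt;/sub&gt;/q&lt;sub&gt;i&lt;/sub&gt;). A small sketch with made-up distributions:&lt;br /&gt;

```python
import numpy as np

def kl_divergence(p, q):
    """D(P || Q) for discrete distributions with full support (natural log)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

p, q = [0.5, 0.5], [0.9, 0.1]
print(round(kl_divergence(p, q), 4))  # 0.5108
print(kl_divergence(p, p))            # 0.0; KL is zero iff P equals Q
```

Note that KL divergence is not symmetric: D(P || Q) generally differs from D(Q || P), which is why the direction matters in the VAE objective.&lt;br /&gt;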
&lt;br /&gt;
=== Typical questions for seminar classes (labs) within this section ===&lt;br /&gt;
&lt;br /&gt;
# Implement different types of ANNs to solve different classification problems.&lt;br /&gt;
# Calculate KL-divergence between two probability distributions.&lt;br /&gt;
# Implement different generative models for different problems.&lt;br /&gt;
&lt;br /&gt;
=== Test questions for final assessment in this section ===&lt;br /&gt;
&lt;br /&gt;
# Explain what ReLU is, what its different variants are, and what their pros and cons are.&lt;br /&gt;
# Calculate the number of parameters to be learned during training in a CNN, given all important information.&lt;br /&gt;
# Explain how a VAE can be used as a generative model.&lt;br /&gt;
&lt;br /&gt;
== Exams and retake planning ==&lt;br /&gt;
&lt;br /&gt;
=== Exam ===&lt;br /&gt;
&lt;br /&gt;
Exams will be paper-based and will be conducted in the form of problem solving, where the problems will be similar to those mentioned above and will be based on the content taught in lecture slides, lecture discussions (including whiteboard materials), lab materials, reading materials (including the textbooks), etc. Students will be given 1-3 hours to complete the exam.&lt;br /&gt;
&lt;br /&gt;
=== Retake 1 ===&lt;br /&gt;
&lt;br /&gt;
The first retake will be conducted in the same form as the final exam. The weight of the retake exam will be 5% larger than the passing threshold of the course.&lt;br /&gt;
&lt;br /&gt;
=== Retake 2 ===&lt;br /&gt;
&lt;br /&gt;
The second retake will be conducted in the same form as the final exam. The weight of the retake exam will be 5% larger than the passing threshold of the course.&lt;/div&gt;</summary>
		<author><name>M.petrishchev</name></author>
	</entry>
</feed>