= Advanced Information Retrieval =
* '''Course name''': Advanced Information Retrieval
* '''Code discipline''': CSE334
* '''Subject area''': Computer systems organization; Information systems; Real-time systems; Information retrieval; World Wide Web

== Short Description ==
This course covers the following concepts: Data indexing; Recommendations; Relevance and ranking.
== Prerequisites ==
=== Prerequisite subjects ===
* CSE101 — Introduction to Programming I
* CSE102 — Introduction to Programming II
* CSE202 — Analytical Geometry and Linear Algebra I
* CSE204 — Analytical Geometry and Linear Algebra II: matrix multiplication, matrix decomposition (SVD, ALS) and approximation (matrix norm), sparse matrices, stability of solutions (decompositions), vector spaces, metric spaces, manifolds, eigenvectors and eigenvalues.
* CSE113 — Philosophy I (Discrete Math and Logic): graphs, trees, binary trees, balanced trees, metric (proximity) graphs, diameter, clique, path, shortest path.
* CSE206 — Probability and Statistics: probability, likelihood, conditional probability, the Bayes rule, stochastic matrices and their properties.

=== Prerequisite topics ===
* Programming: Python, the numpy library, and basic internet programming skills (HTTP requests, networking).
== Course Topics ==

{| class="wikitable"
|+ Course Sections and Topics
|-
! Section !! Topics within the section
|-
| Information retrieval basics ||
# Introduction to IR, major concepts.
# Crawling and the Web.
# Quality assessment.
|-
| Text processing and indexing ||
# Building an inverted index for text documents. The Boolean retrieval model.
# Language, tokenization, stemming, searching, scoring.
# Spellchecking and wildcard search.
# Suggest and query expansion.
# Language modelling. Topic modelling.
|-
| Vector model and vector indexing ||
# Vector model.
# Machine learning for vector embedding.
# Vector-based index structures.
|-
| Advanced topics. Media processing ||
# Image and video processing, understanding and indexing.
# Content-based image retrieval.
# Audio retrieval.
# Relevance feedback.
|}
== Intended Learning Outcomes (ILOs) ==

=== What is the main purpose of this course? ===
The course is designed to prepare students to understand the background theories of information retrieval systems and to introduce different kinds of such systems. The course focuses on how information retrieval systems are implemented, evaluated, and analyzed. Throughout the course, students take part in discussions, readings, and assignments to gain experience with real-world systems. The technologies and algorithms covered in this class include machine learning, data mining, natural language processing, and data indexing, among others.

=== ILOs defined at three levels ===
==== Level 1: What concepts should a student know/remember/explain? ====
By the end of the course, the students should know:
* Terms and definitions used in the area of information retrieval,
* The essential parts of search engines and recommender systems,
* Quality metrics of information retrieval systems,
* Contemporary approaches to semantic data analysis,
* Indexing strategies.
==== Level 2: What basic practical skills should a student be able to perform? ====
By the end of the course, the students should be able to explain:
* The background theories behind information retrieval systems,
* How to design a recommender system from scratch,
* How to evaluate the quality of a particular information retrieval system,
* The core ideas of system implementation and maintenance,
* How to identify and fix information retrieval system problems.
==== Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios? ====
By the end of the course, the students should be able to:
* Build a recommender service from scratch,
* Implement a proper index for an unstructured dataset,
* Plan quality measures for a new recommender service,
* Run initial data analysis and problem evaluation for a business task related to information retrieval.

== Grading ==

=== Course grading range ===
{| class="wikitable"
|-
! Grade !! Range !! Description of performance
|-
| A. Excellent || 84-100 || -
|-
| B. Good || 72-83 || -
|-
| C. Satisfactory || 60-71 || -
|-
| D. Poor || 0-59 || -
|}
=== Course activities and grading breakdown ===
{| class="wikitable"
|-
! Activity Type !! Percentage of the overall course grade
|-
| Assignments || 60
|-
| Quizzes || 40
|-
| Exams || 0
|}
=== Recommendations for students on how to succeed in the course ===
The simplest way to succeed is to participate in the labs and pass the coding assignments in a timely manner; this guarantees up to 60% of the grade. Participation in lecture quizzes accounts for the remaining 40%.
== Resources, literature and reference materials ==

=== Open access resources ===
* Manning, Raghavan, Schütze, An Introduction to Information Retrieval, 2008, Cambridge University Press
* Baeza-Yates, Ribeiro-Neto, Modern Information Retrieval, 2011, Addison-Wesley
* Buttcher, Clarke, Cormack, Information Retrieval: Implementing and Evaluating Search Engines, 2010, MIT Press
* Bird, Klein, Loper, Natural Language Processing with Python: Analyzing Text with the Natural Language Toolkit, [https://www.nltk.org/book/ available online]
* [https://github.com/IUCVLab/information-retrieval Course repository on GitHub]

=== Closed access resources ===
=== Software and tools used within the course ===

= Teaching Methodology: Methods, techniques, & activities =
== Activities and Teaching Methods ==
{| class="wikitable"
|+ Activities within each section (1 = used in the section, 0 = not used)
|-
! Learning Activities !! Section 1 !! Section 2 !! Section 3 !! Section 4
|-
| Development of individual parts of software product code || 1 || 1 || 1 || 1
|-
| Homework and group projects || 1 || 1 || 1 || 1
|-
| Testing (written or computer based) || 1 || 1 || 1 || 1
|}
== Formative Assessment and Course Activities ==

=== Ongoing performance assessment ===

==== Section 1 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Enumerate limitations for web crawling. || 1
|-
| Question || Propose a strategy for A/B testing. || 1
|-
| Question || Propose a recommender quality metric. || 1
|-
| Question || Implement the DCG metric. || 1
|-
| Question || Discuss a relevance metric. || 1
|-
| Question || Crawl a website with respect to robots.txt. || 1
|-
| Question || What is a typical IR system architecture? || 0
|-
| Question || Show how to parse a dynamic web page. || 0
|-
| Question || Provide a framework to accept/reject A/B testing results. || 0
|-
| Question || Compute DCG for an example query for a random search engine. || 0
|-
| Question || Implement a metric for a recommender system. || 0
|-
| Question || Implement pFound. || 0
|}
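Several questions above ask students to implement ranking-quality metrics such as DCG. The following is an illustrative sketch only, not part of the official course materials (the function names are our own):

```python
import math

def dcg(relevances):
    # Discounted Cumulative Gain: sum of rel_i / log2(i + 1) over ranks i = 1..n,
    # so relevant results found lower in the ranking contribute less
    return sum(rel / math.log2(i + 1) for i, rel in enumerate(relevances, start=1))

def ndcg(relevances):
    # normalize by the DCG of the ideal (descending) ordering of the same grades
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0
```

A perfectly ordered result list gives nDCG of 1.0; swapping a highly relevant result down the list lowers it.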
==== Section 2 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Build an inverted index for a text. || 1
|-
| Question || Tokenize a text. || 1
|-
| Question || Implement a simple spellchecker. || 1
|-
| Question || Implement wildcard search. || 1
|-
| Question || Build an inverted index for a set of web pages. || 0
|-
| Question || Build a distribution of stems/lexemes for a text. || 0
|-
| Question || Choose and implement a case-insensitive index for a given text collection. || 0
|-
| Question || Choose and implement a semantic vector-based index for a given text collection. || 0
|}
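The Section 2 questions revolve around the inverted index and Boolean retrieval. A minimal illustrative sketch (whitespace tokenization and lowercasing are simplifying assumptions, and the names are our own, not course-mandated):

```python
from collections import defaultdict

def build_inverted_index(docs):
    # map each token to the sorted list of document ids containing it
    # (the "postings list" of Boolean retrieval)
    postings = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for token in text.lower().split():
            postings[token].add(doc_id)
    return {token: sorted(ids) for token, ids in postings.items()}

def boolean_and(index, query):
    # AND query: intersect the postings lists of all query terms
    sets = [set(index.get(term, ())) for term in query.lower().split()]
    return sorted(set.intersection(*sets)) if sets else []
```

Real systems keep postings sorted on disk and intersect them with merge-style algorithms rather than Python sets, but the data flow is the same.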
==== Section 3 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Embed a text with an ML model. || 1
|-
| Question || Build a term-document matrix. || 1
|-
| Question || Build a semantic index for a dataset using Annoy. || 1
|-
| Question || Build a kd-tree index for a given dataset. || 1
|-
| Question || Why do kd-trees work badly in a 100-dimensional environment? || 1
|-
| Question || What is the difference between a metric space and a vector space? || 1
|-
| Question || Choose and implement a persistent index for a given text collection. || 0
|-
| Question || Visualize a dataset for text classification. || 0
|-
| Question || Build an (H)NSW index for a dataset. || 0
|-
| Question || Compare HNSW to the Annoy index. || 0
|-
| Question || What metric space index structures do you know? || 0
|}
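The Section 3 questions contrast exact and approximate vector search. For intuition, the baseline that structures like kd-trees, Annoy, and HNSW try to beat is an exhaustive scan over all vectors; a minimal illustrative sketch (function names are our own):

```python
import math

def cosine(u, v):
    # cosine similarity between two equal-length vectors
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def nearest(query, vectors):
    # exhaustive nearest-neighbour scan, O(n) per query; vector index
    # structures exist precisely to avoid this scan at scale
    return max(range(len(vectors)), key=lambda i: cosine(query, vectors[i]))
```

In high dimensions, space-partitioning trees degrade toward this exhaustive scan anyway (the "curse of dimensionality" behind the kd-tree question above), which is why approximate graph-based indexes such as HNSW are preferred for embeddings.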
==== Section 4 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Extract semantic information from images. || 1
|-
| Question || Build an image hash. || 1
|-
| Question || Build a spectral representation of a song. || 1
|-
| Question || What is relevance feedback? || 1
|-
| Question || Build a "search by color" feature. || 0
|-
| Question || Extract scenes from a video. || 0
|-
| Question || Write a voice-controlled search. || 0
|-
| Question || Implement semantic search within an unlabelled image dataset. || 0
|}
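One graded question above asks to build an image hash. A minimal illustrative sketch of a difference hash over an already-downscaled grayscale grid (a real pipeline would first resize the image, e.g. to 9x8 pixels; the names here are our own):

```python
def dhash_bits(pixels):
    # difference hash on a grayscale grid: one bit per horizontally
    # adjacent pixel pair, set when brightness increases left-to-right
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(bits_a, bits_b):
    # number of differing bits; a small distance suggests visually similar images
    return sum(a != b for a, b in zip(bits_a, bits_b))
```

Such perceptual hashes support near-duplicate image search: index the bit strings and retrieve images within a small Hamming distance of the query hash.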
=== Final assessment ===
'''Section 1'''
# Implement a text crawler for a news site.
# What is SBS (side-by-side) evaluation and how is it used in search engines?
# Compare pFound with CTR and with DCG.
# Explain how A/B testing works.
# Describe the PageRank algorithm.

'''Section 2'''
# Explain how (and why) KD-trees work.
# What are the weak places of an inverted index?
# Compare different text vectorization approaches.
# Compare tolerant retrieval to spellchecking.

'''Section 3'''
# Compare an inverted index to HNSW in terms of speed and memory consumption.
# Choose the best index for a given dataset.
# Implement range search in a KD-tree.

'''Section 4'''
# What are the approaches to image understanding?
# How to cluster a video into scenes and shots?
# How does speech-to-text technology work?
# How to build audio fingerprints?
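The final-assessment questions mention pFound, a cascade-model ranking metric. One common formulation, shown as an illustrative sketch under the assumption of a constant abandonment probability at each step (function and parameter names are our own):

```python
def pfound(p_rel, p_break=0.15):
    # cascade user model: the user scans results top-down, stops after
    # finding a relevant result, or abandons with probability p_break
    # at each position; p_rel[i] is the relevance probability of result i
    p_look = 1.0   # probability the user looks at the current position
    total = 0.0    # accumulated probability of finding a relevant result
    for p in p_rel:
        total += p_look * p
        p_look *= (1 - p) * (1 - p_break)
    return total
```

Unlike DCG's fixed logarithmic discount, the discount here is driven by the relevance of everything ranked above, which is why the course asks to compare pFound with DCG and CTR.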
+ | |||
+ | === The retake exam === |
||
+ | '''Section 1''' |
||
+ | # Solve a complex coding problem similar to one of the homework or lab. |
||
+ | '''Section 2''' |
||
+ | # Solve a complex coding problem similar to one of the homework or lab. |
||
+ | '''Section 3''' |
||
+ | # Solve a complex coding problem similar to one of the homework or lab. |
||
+ | '''Section 4''' |
||
+ | # Solve a complex coding problem similar to one of the homework or lab. |
Latest revision as of 17:35, 19 December 2022
Advanced Information Retrieval
- Course name: Advanced Information Retrieval
- Code discipline: CSE334
- Subject area: Computer systems organization; Information systems; Real-time systems; Information retrieval; World Wide Web
Short Description
This course covers the following concepts: Data indexing; Recommendations; Relevance and ranking.
Prerequisites
Prerequisite subjects
- CSE101 — Introduction to Programming I
- CSE102 — Introduction to Programming II
- CSE202 — Analytical Geometry and Linear Algebra I
- CSE204 — Analytic Geometry And Linear Algebra II: matrix multiplication, matrix decomposition (SVD, ALS) and approximation (matrix norm), sparse matrix, stability of solution (decomposition), vector spaces, metric spaces, manifold, eigenvector and eigenvalue.
- CSE113 — Philosophy I - (Discrete Math and Logic): graphs, trees, binary trees, balanced trees, metric (proximity) graphs, diameter, clique, path, shortest path.
- CSE206 — Probability And Statistics: probability, likelihood, conditional probability, Bayesian rule, stochastic matrix and properties.
Prerequisite topics
Course Topics
Section | Topics within the section |
---|---|
Information retrieval basics |
|
Text processing and indexing |
|
Vector model and vector indexing |
|
Advanced topics. Media processing |
|
Intended Learning Outcomes (ILOs)
What is the main purpose of this course?
The course is designed to prepare students to understand background theories of information retrieval systems and introduce different information retrieval systems. The course will focus on the evaluation and analysis of such systems as well as how they are implemented. Throughout the course, students will be involved in discussions, readings, and assignments to experience real world systems. The technologies and algorithms covered in this class include machine learning, data mining, natural language processing, data indexing, and so on.
ILOs defined at three levels
Level 1: What concepts should a student know/remember/explain?
By the end of the course, the students should be able to ...
- Terms and definitions used in area of information retrieval,
- Search engine and recommender system essential parts,
- Quality metrics of information retrieval systems,
- Contemporary approaches to semantic data analysis,
- Indexing strategies.
Level 2: What basic practical skills should a student be able to perform?
By the end of the course, the students should be able to ...
- Understand background theories behind information retrieval systems,
- How to design a recommender system from scratch,
- How to evaluate quality of a particular information retrieval system,
- Core ideas and system implementation and maintenance,
- How to identify and fix information retrieval system problems.
==== Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios? ====

By the end of the course, the students should be able to:

* Build a recommender service from scratch,
* Implement a proper index for an unstructured dataset,
* Plan quality measures for a new recommender service,
* Run initial data analysis and problem evaluation for a business task related to information retrieval.
== Grading ==

=== Course grading range ===

{| class="wikitable"
! Grade !! Range !! Description of performance
|-
| A. Excellent || 84–100 || -
|-
| B. Good || 72–83 || -
|-
| C. Satisfactory || 60–71 || -
|-
| D. Poor || 0–59 || -
|}
=== Course activities and grading breakdown ===

{| class="wikitable"
! Activity Type !! Percentage of the overall course grade
|-
| Assignments || 60
|-
| Quizzes || 40
|-
| Exams || 0
|}
=== Recommendations for students on how to succeed in the course ===

The simplest way to succeed is to participate in the labs and pass the coding assignments in a timely manner; this guarantees up to 60% of the grade. Participation in lecture quizzes accounts for the rest and differentiates the final grade.
== Resources, literature and reference materials ==

=== Open access resources ===

* Manning, Raghavan, Schütze. ''Introduction to Information Retrieval''. Cambridge University Press, 2008.
* Baeza-Yates, Ribeiro-Neto. ''Modern Information Retrieval''. Addison-Wesley, 2011.
* Büttcher, Clarke, Cormack. ''Information Retrieval: Implementing and Evaluating Search Engines''. MIT Press, 2010.
* Course repository on GitHub.
=== Closed access resources ===

=== Software and tools used within the course ===

== Teaching Methodology: Methods, techniques, & activities ==

=== Activities and Teaching Methods ===
{| class="wikitable"
! Learning Activities !! Section 1 !! Section 2 !! Section 3 !! Section 4
|-
| Development of individual parts of software product code || 1 || 1 || 1 || 1
|-
| Homework and group projects || 1 || 1 || 1 || 1
|-
| Testing (written or computer based) || 1 || 1 || 1 || 1
|}
== Formative Assessment and Course Activities ==

=== Ongoing performance assessment ===

==== Section 1 ====
{| class="wikitable"
! Activity Type !! Content !! Is Graded?
|-
| Question || Enumerate limitations of web crawling. || Yes
|-
| Question || Propose a strategy for A/B testing. || Yes
|-
| Question || Propose a recommender quality metric. || Yes
|-
| Question || Implement the DCG metric. || Yes
|-
| Question || Discuss a relevance metric. || Yes
|-
| Question || Crawl a website with respect to robots.txt. || Yes
|-
| Question || What is a typical IR system architecture? || No
|-
| Question || Show how to parse a dynamic web page. || No
|-
| Question || Provide a framework to accept/reject A/B testing results. || No
|-
| Question || Compute DCG for an example query against a random search engine. || No
|-
| Question || Implement a metric for a recommender system. || No
|-
| Question || Implement pFound. || No
|}
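For the DCG exercise above, a minimal sketch might look as follows. It assumes graded relevance labels and the common <code>log2(rank + 1)</code> discount with 1-based ranks; other discount variants exist, and this is an illustration rather than the reference solution used in the course:

```python
import math

def dcg(relevances):
    """Discounted Cumulative Gain for a ranked list of relevance grades."""
    # enumerate() is 0-based, so rank + 2 gives the 1-based log2(rank + 1) discount
    return sum(rel / math.log2(rank + 2)
               for rank, rel in enumerate(relevances))

def ndcg(relevances):
    """DCG normalized by the ideal (descending-sorted) ranking."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0
```

For example, <code>ndcg([3, 2, 1])</code> is 1.0 because the list is already ideally ordered, while any swap lowers the score.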
==== Section 2 ====

{| class="wikitable"
! Activity Type !! Content !! Is Graded?
|-
| Question || Build an inverted index for a text. || Yes
|-
| Question || Tokenize a text. || Yes
|-
| Question || Implement a simple spellchecker. || Yes
|-
| Question || Implement wildcard search. || Yes
|-
| Question || Build an inverted index for a set of web pages. || No
|-
| Question || Build a distribution of stems/lexemes for a text. || No
|-
| Question || Choose and implement a case-insensitive index for a given text collection. || No
|-
| Question || Choose and implement a semantic vector-based index for a given text collection. || No
|}
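The tokenization and inverted-index exercises above can be sketched in a few lines. The regex tokenizer here is a deliberately simple stand-in for the stemming/lemmatization covered in this section, and the boolean AND query is only one of the operations a real index supports:

```python
import re
from collections import defaultdict

def tokenize(text):
    """Lowercase word tokenizer (no stemming or stop-word removal)."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_inverted_index(docs):
    """Map each term to the sorted list of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in tokenize(text):
            index[term].add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}

def query_and(index, *terms):
    """Boolean AND query: ids of documents containing all of the terms."""
    postings = [set(index.get(t, [])) for t in terms]
    return sorted(set.intersection(*postings)) if postings else []
```

For instance, over the collection <code>["the cat sat", "the dog sat", "a cat ran"]</code>, the posting list for "cat" is <code>[0, 2]</code> and <code>query_and(index, "cat", "sat")</code> returns <code>[0]</code>.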
==== Section 3 ====

{| class="wikitable"
! Activity Type !! Content !! Is Graded?
|-
| Question || Embed a text with an ML model. || Yes
|-
| Question || Build a term-document matrix. || Yes
|-
| Question || Build a semantic index for a dataset using Annoy. || Yes
|-
| Question || Build a kd-tree index for a given dataset. || Yes
|-
| Question || Why do kd-trees work badly in a 100-dimensional space? || Yes
|-
| Question || What is the difference between a metric space and a vector space? || Yes
|-
| Question || Choose and implement a persistent index for a given text collection. || No
|-
| Question || Visualize a dataset for text classification. || No
|-
| Question || Build an (H)NSW index for a dataset. || No
|-
| Question || Compare an HNSW index to an Annoy index. || No
|-
| Question || What metric space index structures do you know? || No
|}
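A useful baseline for the vector-index exercises above is exact brute-force search by cosine similarity: it is the ground truth that approximate structures such as Annoy and HNSW trade away for speed on large collections. A minimal sketch (plain Python lists standing in for real embeddings):

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def knn(query, vectors, k=3):
    """Exact k-nearest-neighbour search: score every vector, keep the top k.
    Annoy/HNSW approximate exactly this result in sublinear time."""
    scored = sorted(range(len(vectors)),
                    key=lambda i: cosine(query, vectors[i]),
                    reverse=True)
    return scored[:k]
```

Comparing the output of an approximate index against this scan (recall@k) is one standard way to evaluate it.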
==== Section 4 ====

{| class="wikitable"
! Activity Type !! Content !! Is Graded?
|-
| Question || Extract semantic information from images. || Yes
|-
| Question || Build an image hash. || Yes
|-
| Question || Build a spectral representation of a song. || Yes
|-
| Question || What is relevance feedback? || Yes
|-
| Question || Build a "search by color" feature. || No
|-
| Question || Extract scenes from a video. || No
|-
| Question || Write a voice-controlled search. || No
|-
| Question || Implement semantic search within an unlabelled image dataset. || No
|}
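One common instance of the image-hash exercise is the perceptual "average hash": threshold each pixel against the mean brightness and compare hashes by Hamming distance. The sketch below assumes the usual resize-to-tiny-grayscale step has already produced a 2-D list of intensities:

```python
def average_hash(pixels):
    """Average hash of a small grayscale image (2-D list of intensities).
    Bit is 1 where the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits; small distances suggest near-duplicate images."""
    return sum(a != b for a, b in zip(h1, h2))
```

Unlike a cryptographic hash, small visual changes flip only a few bits, which is what makes near-duplicate search by Hamming distance possible.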
=== Final assessment ===

==== Section 1 ====

* Implement a text crawler for a news site.
* What is SBS (side-by-side) evaluation and how is it used in search engines?
* Compare pFound with CTR and with DCG.
* Explain how A/B testing works.
* Describe the PageRank algorithm.
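The PageRank question above can be illustrated by a short power-iteration sketch. This is a didactic version (fixed iteration count, dangling mass spread uniformly), not production crawler-scale code:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank on an adjacency dict {node: [outlinks]}."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}          # uniform start
    for _ in range(iterations):
        new = {u: (1.0 - damping) / n for u in nodes}  # teleportation term
        for u in nodes:
            out = links[u]
            if out:
                share = damping * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:  # dangling node: spread its rank over all nodes
                for v in nodes:
                    new[v] += damping * rank[u] / n
        rank = new
    return rank
```

Rank mass is conserved at every step, so the scores always sum to 1 and can be read as a stationary distribution of the random-surfer Markov chain.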
==== Section 2 ====

* Explain how (and why) kd-trees work.
* What are the weak points of an inverted index?
* Compare different text vectorization approaches.
* Compare tolerant retrieval to spellchecking.
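A core building block behind the spellchecking questions in this section is edit distance: a spellchecker typically ranks dictionary words by their Levenshtein distance to the misspelled query term. A minimal dynamic-programming sketch (two-row variant):

```python
def levenshtein(a, b):
    """Levenshtein edit distance between strings a and b,
    keeping only the previous DP row to save memory."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # delete from a
                           cur[j - 1] + 1,               # insert into a
                           prev[j - 1] + (ca != cb)))    # substitute (free on match)
        prev = cur
    return prev[-1]
```

The classic example: <code>levenshtein("kitten", "sitting")</code> is 3 (two substitutions and one insertion).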
==== Section 3 ====

* Compare an inverted index to HNSW in terms of speed and memory consumption.
* Choose the best index for a given dataset.
* Implement range search in a kd-tree.
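The range-search task above can be sketched as follows for the 2-D case. Assumptions: points are tuples, the tree is a plain dict, and the query is an axis-aligned rectangle given by its lower and upper corners; a full solution would generalize the dimensionality:

```python
def build_kdtree(points, depth=0):
    """Recursively split 2-D points on alternating axes at the median."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def range_search(node, lo, hi, out=None):
    """Collect points p with lo[d] <= p[d] <= hi[d] on every axis d,
    pruning subtrees that cannot intersect the query rectangle."""
    if out is None:
        out = []
    if node is None:
        return out
    p, axis = node["point"], node["axis"]
    if all(lo[d] <= p[d] <= hi[d] for d in range(2)):
        out.append(p)
    if lo[axis] <= p[axis]:      # box may extend into the left subtree
        range_search(node["left"], lo, hi, out)
    if p[axis] <= hi[axis]:      # box may extend into the right subtree
        range_search(node["right"], lo, hi, out)
    return out
```

The pruning conditions are what make the structure useful: a subtree is visited only when the query rectangle crosses its side of the splitting plane.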
==== Section 4 ====

* What are the approaches to image understanding?
* How can a video be clustered into scenes and shots?
* How does speech-to-text technology work?
* How are audio fingerprints built?
=== The retake exam ===

==== Section 1 ====

* Solve a complex coding problem similar to one of the homework or lab tasks.

==== Section 2 ====

* Solve a complex coding problem similar to one of the homework or lab tasks.

==== Section 3 ====

* Solve a complex coding problem similar to one of the homework or lab tasks.

==== Section 4 ====

* Solve a complex coding problem similar to one of the homework or lab tasks.