= Sensing, Perception & Actuation =

* '''Course name''': Sensing, Perception & Actuation
* '''Code discipline''': R-01
* '''Subject area''': Visual Sensors, Data Analysis, Error Analysis, Theory of Measurements, Machine Vision, Inertial Sensors, Internal Sensors, Filtering, Sensor Fusion, Image Processing, Point Cloud Processing

== Short Description ==
This course covers the following concepts: physical principles of sensors and their limitations; measurements, sensor calibration, data and error analysis; development of algorithms for image processing, feature extraction, and object recognition; 3D point cloud processing and scene reconstruction; and the linear Kalman filter and sensor fusion.

== Prerequisites ==

=== Prerequisite subjects ===
* CSE402 — Physics I (Mechanics)
* CSE410 — Physics II (Electrical Engineering)
* CSE201 — Mathematical Analysis I
* CSE203 — Mathematical Analysis II
* CSE202 — Analytical Geometry and Linear Algebra I
* CSE204 — Analytical Geometry and Linear Algebra II
* CSE206 — Probability and Statistics

=== Prerequisite topics ===

== Course Topics ==
{| class="wikitable"
|+ Course Sections and Topics
|-
! Section !! Topics within the section
|-
| Intro to Sensors. Data and Error Analysis ||
# Sensors’ classification and applications
# Sensor characteristics: dynamic range, accuracy, signal-to-noise ratio
# Error analysis: systematic vs. statistical errors, accuracy and precision, the Central Limit Theorem, the 3-sigma rule, outliers
# Sensor calibration. Direct and indirect measurements
# Introduction to data analysis: linear regression, least-squares fitting, curve fitting, and smoothing
|-
| Perception ||
# Image sensors: camera matrix, characteristics, color filters
# Pinhole camera model, lenses, and distortions
# Camera calibration, intrinsic and extrinsic matrices
# Video cameras: CCTV, IR and thermal imaging cameras, fisheye cameras
# Stereo vision: stereo systems and stereo geometry
# Image rectification, disparity maps, 3D point clouds from stereo
# Point cloud processing, 3D reconstruction, Structure-from-Motion (SfM)
# LIDAR: laser rangefinders, laser-camera systems, airborne LIDAR
# Depth, ToF, and RGB-D cameras, MS Kinect: characteristics and calibration
# SONAR. Piezocrystals. The Doppler effect. Doppler radar
# Acoustic sensor systems. Sound spectrograms
# mm-wave RADAR, long-range RADAR, short-range RADAR
|-
| Sensor Fusion and Filtering ||
# Filtering
# Linear Kalman Filter (KF)
# Sensor fusion
# Linear Kalman Filter vs. Extended KF
# MoCap systems
# Multicamera systems
|-
| Actuators and Passive Sensors: GPS, IMU and Inertial Sensors ||
# GPS and differential GPS (dGPS)
# Inertial sensors: IMUs, accelerometers, gyroscopes
# Magnetometers
# Internal sensors: position, velocity, torque, and force sensors; encoders
# MEMS for robot applications
# Actuators
# Smart and intelligent sensors
|}

== Intended Learning Outcomes (ILOs) ==

=== What is the main purpose of this course? ===
One of the most important tasks of an autonomous system of any kind is to acquire knowledge about its environment. This is done by taking measurements with various sensors and then extracting meaningful information from those measurements. In this course we present the most common sensors used in mobile robots and autonomous systems, and then discuss strategies for extracting information from them.

=== ILOs defined at three levels ===

==== Level 1: What concepts should a student know/remember/explain? ====
By the end of the course, the students should be:
* familiar with the physical and sensing principles of cameras, stereo vision, LIDAR, SONAR, Time-of-Flight cameras, GPS, actuators, and inertial and internal sensors
* acquainted with measurements and error analysis, data analysis, and sensor calibration
* familiar with the triangulation principle and the basics of image and point cloud processing methods

==== Level 2: What basic practical skills should a student be able to perform? ====
By the end of the course, the students should be able to:
* Understand how to remove systematic errors and how to decrease statistical errors
* Recover depth information from stereo vision, structured-light, and ToF cameras
* Explain how linear regression and least-squares fitting help minimize measurement errors (see the sketch after this list)
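
The following is a minimal sketch of that last point, assuming Python with NumPy (the course does not prescribe a toolchain): a line is fitted to noisy measurements by least squares, and the zero-mean noise largely averages out of the fitted parameters.

<syntaxhighlight lang="python">
import numpy as np

# Synthetic measurements of a linear process y = 2t + 1 with zero-mean Gaussian noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 100)
y_meas = 2.0 * t + 1.0 + rng.normal(0.0, 0.5, size=t.shape)

# Least squares: minimize ||A x - y||^2 over x = [slope, intercept]
A = np.column_stack([t, np.ones_like(t)])
(slope, intercept), *_ = np.linalg.lstsq(A, y_meas, rcond=None)

print(f"slope = {slope:.3f} (true 2.0), intercept = {intercept:.3f} (true 1.0)")
</syntaxhighlight>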

==== Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios? ====
By the end of the course, the students should be able to:
* Calibrate sensors to remove systematic errors
* Extract meaningful information from sensor data (features, objects, depth, and accuracy information)
* Detect objects in 2D images
* Recover a scene from a 3D point cloud
* Filter noisy data
* Match models to datasets
* Fuse sensor data and apply Kalman filtering
* Apply GPS, cameras, LIDAR, RADAR, SONAR, IMUs, and stereo cameras to mobile robot localization

== Grading ==

=== Course grading range ===
{| class="wikitable"
|-
! Grade !! Range !! Description of performance
|-
| A. Excellent || 85-100 || -
|-
| B. Good || 70-84 || -
|-
| C. Satisfactory || 55-69 || -
|-
| D. Poor || 0-54 || -
|}

=== Course activities and grading breakdown ===
{| class="wikitable"
|-
! Activity Type !! Percentage of the overall course grade
|-
| Labs/seminar classes || 10
|-
| Home assignments || 40
|-
| Interim performance assessment || 15
|-
| Quizzes || 20
|-
| Exams || 15
|}

=== Recommendations for students on how to succeed in the course ===

== Resources, literature and reference materials ==

=== Open access resources ===
* Roland Siegwart, Illah R. Nourbakhsh, and Davide Scaramuzza. Introduction to Autonomous Mobile Robots, MIT Press, 2011
* M. Chli, D. Scaramuzza, R. Siegwart, et al. Autonomous Mobile Robots, ETH Zurich, 2017, http://www.asl.ethz.ch/education/lectures/autonomous_mobile_robots.html
* Jacob Fraden. Handbook of Modern Sensors: Physics, Designs, and Applications, Springer, 2010
* Alonzo Kelly. Mobile Robotics: Mathematics, Models, and Methods, Cambridge University Press, 2013
* Berthold K. P. Horn. Robot Vision, MIT Press / McGraw-Hill, 1986
* H. Choset, K. M. Lynch, et al. Principles of Robot Motion: Theory, Algorithms, and Implementations, MIT Press, 2005
* Gregory Dudek and Michael Jenkin. Computational Principles of Mobile Robotics, 2nd ed., Cambridge University Press, 2010

=== Closed access resources ===

=== Software and tools used within the course ===

= Teaching Methodology: Methods, techniques, & activities =

== Activities and Teaching Methods ==
{| class="wikitable"
|+ Activities within each section (1 = used, 0 = not used)
|-
! Learning Activities !! Section 1 !! Section 2 !! Section 3 !! Section 4
|-
| Development of individual parts of software product code || 1 || 1 || 1 || 1
|-
| Homework assignments || 1 || 0 || 0 || 0
|-
| Midterm evaluation || 1 || 1 || 1 || 1
|-
| Testing (written or computer based) || 1 || 1 || 1 || 1
|-
| Discussions || 1 || 1 || 1 || 1
|-
| Homework and group projects || 0 || 1 || 1 || 1
|}

== Formative Assessment and Course Activities ==

=== Ongoing performance assessment ===

==== Section 1 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || A UAV flies through strong wind and begins to oscillate. The pitch angle change was measured with the gyro for a few seconds. Exclude rare random deviations (outliers) from consideration and estimate the true value of the pitch angle. Calculate the confidence interval of the error for a 99.9% confidence level. Assume that the gyro errors are normally distributed, except for the rare random deviations. A proper regression technique should be applied to fit the data while excluding the rare random deviations. || 1
|-
| Question || A UAV flies through strong wind and begins to oscillate. The roll angle change was measured with the gyro for a few seconds. Exclude rare random deviations (outliers) from consideration and estimate the true value of the roll angle. Calculate the confidence interval of the error for a 95% confidence level. Assume that the gyro errors are normally distributed, except for the rare random deviations. A proper regression technique should be applied to fit the data while excluding the rare random deviations. || 1
|-
| Question || The human CoM (center of mass) during walking has been measured with Kinect sensors. Estimate the true value of the x-component of the acceleration; for example, you can use a moving-average filter. Calculate the confidence interval of the error for a 95% confidence level. Assume that the errors are normally distributed, except for the rare random deviations. A proper regression technique should be applied to fit the data while excluding the rare random deviations. || 1
|-
| Question || A UAV performs a loop-the-loop. The pitch angle change was measured with the gyro during 1 second. Exclude rare random deviations (outliers) from consideration and estimate the true value of the pitch angle. Calculate the confidence interval of the error for a 95% confidence level. Assume that the gyro errors are normally distributed, except for the rare random deviations. A proper regression technique should be applied to fit the data while excluding the rare random deviations. || 1
|-
| Question || You are given a dataset (select the dataset that corresponds to your ID) containing data points in <math>\mathbb{R}^3</math>. Your task is to estimate whether they represent a plane, a line, or something else. You must use RANSAC for this task. Explain how you selected your minimal sample set, the number of iterations, and the threshold level. An analytical derivation as well as a graphical interpretation is preferred. || 1
|-
| Question || Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on a crane and went for lunch. The robot was swinging by inertia for a few minutes. The engineers came back after lunch and read the data from the accelerometer. Help them understand what they measured. Exclude rare random deviations (outliers) from consideration and estimate the true value of the y-component of the acceleration. Calculate the confidence interval of the error for a 99.9% confidence level. Assume that the accelerometer errors are normally distributed, except for the rare random deviations. A proper regression technique should be applied to fit the data while excluding the rare random deviations. || 1
|-
| Question || Introduction to Confidence Intervals (CI) || 0
|-
| Question || Introduction to Linear Regression || 0
|-
| Question || Introduction to Logistic Regression || 0
|-
| Question || Introduction to RANSAC || 0
|-
| Question || Introduction to Maximum Likelihood Estimation (MLE) || 0
|}
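
Most of the graded questions above share one recipe: reject outliers, estimate the true value, and attach a confidence interval. The following is a minimal sketch of that recipe on a synthetic gyro signal, assuming Python with NumPy and SciPy (the course does not mandate a particular toolchain); the 3-sigma rule around a median/MAD center is just one simple outlier test.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Synthetic gyro readings: a constant true angle, Gaussian noise, and a few outliers
rng = np.random.default_rng(1)
angle = 5.0 + rng.normal(0.0, 0.2, size=500)                  # degrees
angle[rng.choice(angle.size, size=5, replace=False)] += 8.0   # rare random deviations

# 3-sigma rule around a robust center: median with a MAD-based sigma estimate
center = np.median(angle)
sigma = 1.4826 * np.median(np.abs(angle - center))
inliers = angle[np.abs(angle - center) < 3.0 * sigma]

# Estimate of the true value and a 99.9% confidence interval (Student's t)
mean = inliers.mean()
lo, hi = stats.t.interval(0.999, df=inliers.size - 1,
                          loc=mean, scale=stats.sem(inliers))
print(f"estimate {mean:.3f} deg, 99.9% CI [{lo:.3f}, {hi:.3f}]")
</syntaxhighlight>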

==== Section 2 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Place a 3D object such as a cube or a cylinder (or anything you like) in an appropriate position with respect to a Kinect. Use a Kinect 2 to get the depth map. Associate the depth map with the RGB information in order to isolate the object, which is to be extracted from the ground plane and other background. You may have to use RANSAC or other algorithms to extract the object and find its center point in 3D space; it may be expressed relative to the Kinect or to any known position in the world. || 1
|-
| Question || Calibrate a camera (your phone or computer camera should be used) with the chessboard pattern. It makes sense to switch off the camera's autofocus mode, if it is enabled. The number of images should be at least 30 (with different chessboard positions). Obtain the intrinsic and extrinsic parameters and store them. Then take a photo of some object (for example, a cup) with the calibrated camera, and estimate the height and width of the selected object using both a ruler and the image from the calibrated camera. Calculate the distance between the camera image plane and the selected object. || 1
|-
| Question || The provided dataset contains left and right images in two different folders with the same names. For this task, select the image pair that corresponds to your ID (0000[Id].png). Use the eight-point algorithm to find the fundamental matrix. For the initial keypoint detection (a minimum of 8 corresponding points), you can either do it manually or use any keypoint detection technique. The next step is to estimate the disparity map for the selected image pair. You may assume a stereo baseline of 10 cm and a focal length of 2.8 mm for both the left and right cameras. If you need any additional assumptions, please state them in the report. || 1
|-
| Question || Build the hardware, and program and implement suitable code, for a simple color sensor suitable for an application of interest to you. An example of the principle you might employ is a circuit whose central component is a phototransistor; it detects light – with varying sensitivity – across the full visible spectrum, a little into the ultraviolet, and a little into the infrared. The basic idea of the color meter is that LEDs are turned on and off in sequence, and the correspondingly detected signals are recorded. When all the LEDs are off, the ambient (background) level is recorded. In this way a "signature" of any particular color patch placed in a location that is illuminated by the LEDs and seen by the phototransistor is generated. You could then, for example, compare the signature obtained from an "unknown" item with the signatures of various items previously stored in a "library", and hence identify (with some quantifiable degree of certainty) the "unknown" item. Even if you do use this principle, you don't have to use exactly these components. In your Arduino kit you probably have a variety of LEDs, probably including one "tri-color" LED, and a phototransistor that you can use. || 1
|-
| Question || CCD vs CMOS technologies || 0
|-
| Question || Bayer mosaic filter vs Foveon color capture filter || 0
|-
| Question || Pinhole camera model || 0
|-
| Question || Stereo vision, disparity maps, and stereo image rectification || 0
|-
| Question || Multiple-camera vision || 0
|-
| Question || Structure-from-Motion || 0
|-
| Question || ToF and multi-frequency phase-shift LIDAR technologies || 0
|-
| Question || SONAR sensing, SONAR transducers and transmitters || 0
|-
| Question || Distributed acoustic sensing (DAS) and applications || 0
|-
| Question || Doppler mm-wave RADAR || 0
|}
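
For the calibration task above, the standard workflow is chessboard corner detection over many views followed by a single optimization that returns the intrinsic matrix, distortion coefficients, and per-view extrinsics. Below is a condensed sketch, assuming Python with OpenCV; the folder name, pattern size, and square size are placeholders to be replaced with your own setup.

<syntaxhighlight lang="python">
import glob
import cv2
import numpy as np

pattern = (9, 6)   # inner-corner count of the printed chessboard (placeholder)
square = 0.025     # chessboard square size in meters (placeholder)

# 3D corner coordinates in the board frame, scaled by the square size
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):   # at least 30 views recommended
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix K, distortion coefficients, and per-view extrinsics
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("K =\n", K)
</syntaxhighlight>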

==== Section 3 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || A UAV flies through strong wind and begins to oscillate. The pitch angle change was measured with the gyro for a few seconds. Estimate the proper trajectory of the pitch angle by using a Kalman filter, considering that the gyro readings are normally distributed. || 1
|-
| Question || A UAV flies through strong wind and begins to oscillate. The roll angle change was measured with the gyro for a few seconds. Estimate the proper trajectory of the roll angle by using a Kalman filter, considering that the gyro readings are normally distributed. || 1
|-
| Question || A UAV performs a loop-the-loop. The pitch angle change was measured with the gyro for a few seconds. Estimate the proper trajectory of the pitch angle by using a Kalman filter, considering that the gyro readings are normally distributed. || 1
|-
| Question || The human CoM (center of mass) during walking has been measured with Kinect sensors. Estimate the proper trajectory of the human CoM movement by using a Kalman filter, considering that the readings are normally distributed. || 1
|-
| Question || Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on a crane and went for lunch. The robot was swinging by inertia for a few minutes. The engineers came back after lunch and read the data from the accelerometer. Help them understand what they measured. Estimate the proper trajectory of the accelerometer readings by using a Kalman filter, considering that the readings are normally distributed. || 1
|-
| Question || Expected value and variance of a random variable. The Gaussian distribution || 0
|-
| Question || One-dimensional Kalman filter || 0
|-
| Question || The main stages of the Linear Kalman Filter (KF): prediction and update steps; KF initialization || 0
|-
| Question || Linear Kalman Filter vs Extended KF || 0
|-
| Question || KF, EKF, UKF, and Particle Filter: main features and differences || 0
|-
| Question || Motion Capture (MoCap) systems || 0
|-
| Question || Multicamera systems || 0
|}
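
As a companion to the questions above, here is a minimal one-dimensional linear Kalman filter, assuming Python with NumPy, a static state (a constant angle), and made-up noise variances; it shows the prediction and update steps in their simplest form.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
true_angle = 5.0
z = true_angle + rng.normal(0.0, 0.5, size=200)   # noisy gyro-like measurements

# 1D KF with a static model: x_k = x_{k-1} (F = 1), z_k = x_k + v (H = 1)
Q, R = 1e-5, 0.5 ** 2   # process and measurement noise variances (assumed)
x, P = 0.0, 1.0         # initial state estimate and its variance

for zk in z:
    # Prediction step
    x_pred, P_pred = x, P + Q
    # Update step
    K = P_pred / (P_pred + R)           # Kalman gain
    x = x_pred + K * (zk - x_pred)      # innovation-weighted correction
    P = (1.0 - K) * P_pred

print(f"filtered estimate {x:.3f} (true {true_angle})")
</syntaxhighlight>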

==== Section 4 ====
{| class="wikitable"
|-
! Activity Type !! Content !! Is Graded?
|-
| Question || Take your smartphone and run for at least 100 meters at a constant speed. Your task is to estimate the trajectory you ran, using your phone to obtain sensor readings such as accelerometer, gyroscope, and GPS data. You should use a multidimensional Kalman filter with sensor fusion to solve this task. In the report, clearly explain all the assumptions you made. || 1
|-
| Question || Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on a crane and went for lunch. The robot was swinging by inertia for a few minutes. The engineers came back after lunch and read the data from the accelerometer. Help them understand what they measured. Estimate the proper trajectory of the accelerometer readings by using a Kalman filter, considering that the readings are normally distributed. || 1
|-
| Question || GPS vs differential GPS (dGPS) || 0
|-
| Question || 9-DOF IMUs || 0
|-
| Question || Optical vs mechanical gyroscopes || 0
|-
| Question || Laser vs fiber-optic gyroscopes || 0
|-
| Question || Absolute vs incremental encoders || 0
|-
| Question || MEMS for robot applications || 0
|-
| Question || What are Smart and Intelligent Sensors? || 0
|}
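
Below is a sketch of the kind of fusion the first task asks for, reduced to one-dimensional motion with synthetic data (Python with NumPy assumed; a real solution must handle 2D/3D motion and real sensor logs). The accelerometer drives the prediction step as a control input, and GPS position fixes drive the update step.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)
dt, n = 0.1, 300
true_vel = np.full(n, 3.0)                      # constant 3 m/s run
true_pos = np.cumsum(true_vel) * dt

acc_meas = rng.normal(0.0, 0.3, n)              # accelerometer (true accel is zero)
gps_meas = true_pos + rng.normal(0.0, 2.0, n)   # noisy GPS position fixes

# State x = [position, velocity]; constant-acceleration kinematics over dt
F = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt ** 2], [dt]])           # how acceleration enters the state
H = np.array([[1.0, 0.0]])                      # GPS observes position only
Q = 0.3 ** 2 * (B @ B.T)                        # process noise from accel uncertainty
R = np.array([[2.0 ** 2]])                      # GPS measurement variance

x = np.array([[0.0], [3.0]])                    # assumed initial position and speed
P = np.eye(2) * 10.0
for k in range(n):
    # Predict with the accelerometer as the control input
    x = F @ x + B * acc_meas[k]
    P = F @ P @ F.T + Q
    # Update with the GPS fix
    y = gps_meas[k] - H @ x                     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"final position estimate {x[0, 0]:.1f} m (true {true_pos[-1]:.1f} m)")
</syntaxhighlight>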

=== Final assessment ===
'''Section 1'''
# Why do mobile robots need sensors?
# Why do mobile robots need actuators?
# What are the risks of not calibrating the robot’s sensors?
# Why do we need to calibrate sensors?
# What’s the difference between external and internal robot sensors?
# What is the difference between statistical and systematic errors?
# What is the difference between accuracy and precision?
# Which type of error influences accuracy, and which type influences precision?
# Can we eliminate stochastic errors?
# Can we eliminate systematic errors?
# What is the difference between direct and indirect measurements?
# How can we determine the uncertainty of indirect measurements?
# Which type of error can be introduced into measurements if you did not perform sensor calibration?
# Sensor data contain rare random deviations (outliers), which are characterized a priori as unreliable values with an unknown distribution law. Which algorithm can we apply to eliminate these unreliable values?
# It is known that the instrumental error of the HOKUYO URG-04LX-UG01 laser rangefinder (LRF) is about ±3% of the measured value. Consider a quadcopter with the LRF that flies in a corridor, making stops every 1 m to map the environment with the rangefinder. It is known that, due to the quadcopter’s fluctuations, the total LRF measurement error increases to up to ±10% of the measured value. How long does the quadcopter need to stay at each stop to map the environment with minimal error, if the LRF performs 10 scans/sec?
# What is an outlier?
# What is the main idea of RANSAC?
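
One possible way to reason about the last quantitative problem, as a sketch rather than an official solution: assume the instrumental ±3% is systematic, the extra flight-induced error is statistical and independent between scans, and averaging <math>n</math> scans shrinks only the statistical part. Then

<math>
\sigma_\mathrm{stat} = \sqrt{10^2 - 3^2}\,\% \approx 9.5\,\%,
\qquad
\frac{\sigma_\mathrm{stat}}{\sqrt{n}} \le 3\,\% \;\Rightarrow\; n \ge \left(\frac{9.5}{3}\right)^2 \approx 10,
</math>

i.e. about 10-11 scans, or roughly one second of hovering at 10 scans/sec, after which the systematic instrumental error dominates and further averaging does not help.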

'''Section 2'''
# What’s the difference between CCD and CMOS?
# Which advantages of CMOS technology do you know?
# What is the disparity map in the stereo-vision reconstruction process?
# What is the result of stereo-vision scene reconstruction?
# What’s the difference between Time-of-Flight and structured-light sensors?
# Give a short description of the depth measurement techniques used in stereo vision, ToF cameras, Kinect, and LIDAR.
# What is the motivation to use 3D sensors?
# What is mm-wave RADAR?
# How does mm-wave RADAR work?
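
As a reminder for the disparity questions above: in a rectified stereo pair with baseline <math>b</math>, focal length <math>f</math>, and pixel disparity <math>d</math>, the depth of a point is

<math>
Z = \frac{f\,b}{d},
</math>

so nearby points produce large disparities, and depth resolution degrades quadratically with distance.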

'''Section 3'''
# What’s the difference between least-squares fitting and Kalman filtering?
# What’s the difference between conventional and recursive estimation of the mean?
# What’s the difference between sensor fusion with a Kalman Filter and with a Particle Filter?
# Can we process non-Gaussian noise with a Linear Kalman Filter?

'''Section 4'''
# What can a GPS module do?
# Is a GPS module typically able to work indoors?
# What is the motivation for Smart sensors?
# What are the advantages of Smart sensors?

=== The retake exam ===
'''Section 1'''

'''Section 2'''

'''Section 3'''

'''Section 4'''
Latest revision as of 11:45, 29 August 2022
Sensing, Perception & Actuation
- Course name: Sensing, Perception & Actuation
- Code discipline: R-01
- Subject area: Visual Sensors, Data Analysis, Error Analysis, Theory of Measurements, Machine Vision, Inertial Sensors, Internal Sensors, Filtering, Sensor Fusion, Image Processing, Point Cloud Processing
Short Description
This course covers the following concepts: Physical principles of sensors and their limitations; Measurements, Sensor Calibration, Data and Error analysis; Development of algorithms for image processing, feature extraction and object recognition; 3D Point cloud processing and scene reconstruction; Linear Kalman Filter and Sensor Fusion.
Prerequisites
Prerequisite subjects
- CSE402 — Physics I (Mechanics)
- CSE410 — Physics II - Electrical Engineering
- CSE201 — Mathematical Analysis I
- CSE203 — Mathematical Analysis II
- CSE202 — Analytical Geometry and Linear Algebra I
- CSE204 — Analytic Geometry And Linear Algebra II
- CSE206 — Probability And Statistics
Prerequisite topics
Course Topics
Section | Topics within the section |
---|---|
Intro to Sensors. Data and Error Analysis |
|
Perception |
|
Sensor Fusion and Filtering |
|
Actuators and Passive Sensors: GPS, IMU and Inertial Sensors |
|
Intended Learning Outcomes (ILOs)
What is the main purpose of this course?
One of the most important tasks of an autonomous system of any kind is to acquire knowledge about its environment. This is done by taking measurements using various sensors and then extracting meaningful information from those measurements. In this course we present the most common sensors used in mobile robots and autonomous systems and then discuss strategies for extracting information from the sensors.
ILOs defined at three levels
Level 1: What concepts should a student know/remember/explain?
By the end of the course, the students should be able to ...
- familiar with physical and sensing principles for Camera, Stereo vision, LIDAR, SONAR, Time-of-Flight camera, GPS, actuators, inertial and internal sensors
- acquainted with measurements and error analysis, data analysis, and sensor calibration
- familiar with triangulation principle, basics of image and point cloud processing methods
Level 2: What basic practical skills should a student be able to perform?
By the end of the course, the students should be able to ...
- Understand how to remove systematic error and how to decrease statistical error
- Recover depth information from stereo vision, structure-from-light and TOF cameras
- Explain how Linear Regression and Least-Squares Fitting allow to minimize measurement errors
Level 3: What complex comprehensive skills should a student be able to apply in real-life scenarios?
By the end of the course, the students should be able to ...
- Calibrate sensors to remove systematic errors
- Extract meaningful information from sensor’s data (features, objects, depth and accuracy information)
- Detect objects from 2D images
- Recover scene from 3D Point Cloud
- Filter noisy data
- Match models to datasets
- Fuse sensor’s data and apply Kalman Filtering
- Apply GPS, camera, LIDAR, RADAR, SONAR, IMU, Stereo camera for a mobile robot localization
Grading
Course grading range
Grade | Range | Description of performance |
---|---|---|
A. Excellent | 85-100 | - |
B. Good | 70-84 | - |
C. Satisfactory | 55-69 | - |
D. Poor | 0-54 | - |
Course activities and grading breakdown
Activity Type | Percentage of the overall course grade |
---|---|
Labs/seminar classes | 10 |
Home assignments | 40 |
Interim performance assessment | 15 |
Quizzes | 20 |
Exams | 15 |
Recommendations for students on how to succeed in the course
Resources, literature and reference materials
Open access resources
- Roland Siegwart, Illah R. Nourbakhsh, and Davide Scaramuzza. Introduction to Autonomous Mobile Robots, MIT press, 2011.
- M. Chli, D. Scaramuzza, R. Siegwart, et al. Autonomous Mobile Robots, ETH Zurich, 2017, http://www.asl.ethz.ch/education/lectures/autonomous_mobile_robots.html
- Jacob Fraden. Handbook of modern sensors: physics, designs, and applications. Springer, 2010
- Alonzo Kelly. Mobile Robotics: Mathematics, Models, and Methods. Cambridge University Press, 2013
- Horn, Berthold K. P. Robot Vision. Cambridge, MA: MIT Press /McGraw-Hill, March 1986
- H. Choset, K. M. Lynch, et. al. “Principles of Robot Motion: Theory, Algorithms, and Implementations”, MIT press, 2005
- Gregory Dudek and Michael Jenkin. Computational Principles of Mobile Robotics, 2nd ed., Cambridge University Press, 2010
Closed access resources
Software and tools used within the course
Teaching Methodology: Methods, techniques, & activities
Activities and Teaching Methods
Learning Activities | Section 1 | Section 2 | Section 3 | Section 4 |
---|---|---|---|---|
Development of individual parts of software product code | 1 | 1 | 1 | 1 |
Homework assignments | 1 | 0 | 0 | 0 |
Midterm evaluation | 1 | 1 | 1 | 1 |
Testing (written or computer based) | 1 | 1 | 1 | 1 |
Discussions | 1 | 1 | 1 | 1 |
Homework and group projects | 0 | 1 | 1 | 1 |
Formative Assessment and Course Activities
Ongoing performance assessment
Section 1
Activity Type | Content | Is Graded? |
---|---|---|
Question | UAV flies through the strong wind and begins to oscillate. The pitch angle change was measured with the gyro during a few seconds. Exclude from the consideration rare random deviation (outliers) and estimate the true value of the roll angle. Calculate the confidence interval of error for 99.9% confidence level. Consider that the gyro errors are normally distributed, except for the rare random deviations. Proper regression technique should be applied so as to fit the data while excluding rare random deviations. | 1 |
Question | UAV flies through the strong wind and begins to oscillate. The roll angle change was measured with the gyro during a few seconds. Exclude from the consideration rare random deviation (outliers) and estimate the true value of the roll angle. Calculate the confidence interval of error for 95% confidence level. Consider that the gyro errors are normally distributed, except for the rare random deviations. Proper regression technique should be applied so as to fit the data while excluding rare random deviations. | 1 |
Question | The human CoM (center of mass) during the walking has been measured with Kinect sensors. Estimate the true value of the x-component of acceleration. For example, you can use a moving average filter. Calculate the confidence interval of error for 95% confidence level. Consider that the errors are normally distributed, except for the rare random deviations. Proper regression technique should be applied so as to fit the data while excluding rare random deviations. | 1 |
Question | UAV performs loop-the-loop. The pitch angle change was measured with the gyro during 1 seconds. Exclude from the consideration rare random deviation (outliers) and estimate the true value of the roll angle. Calculate the confidence interval of error for 95% confidence level. Consider that the gyro errors are normally distributed, except for the rare random deviations. Proper regression technique should be applied so as to fit the data while excluding rare random deviations. | 1 |
Question | You are given a dataset (select the dataset with corresponds to your ID), includes some data points in R . Your task is to estimate whether it represents a plane, line or something else. You must use the RANSAC for this task. Explain the way you selected your minimal sample set, number of iteration and the threshold level? It would be better to provide an analytical solution derivation as well as graphical interpretation. | 1 |
Question | Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on the crane and went for a lunch. Robot was swinging by inertia for a few minutes. Engineers came back after lunch and read the data from accelerometer. Help them to understand what they measured. Exclude from the consideration rare random deviation (outliers) and estimate the true value of the y-component of acceleration. Calculate the confidence interval of error for 99.9% confidence level. Consider that the accelerometer errors are normally distributed, except the rare random deviations. Proper regression technique should be applied so as to fit the data while excluding rare random deviations. | 1 |
Question | Introduction to Confidence Interval (CI) | 0 |
Question | Introduction to Linear Regression | 0 |
Question | Introduction to Logistic Regression | 0 |
Question | Introduction to RANSAC | 0 |
Question | Introduction to Maximum Likelihood Estimation (MLE) | 0 |
Section 2
Activity Type | Content | Is Graded? |
---|---|---|
Question | Place a 3d object such as a cube or a cylinder or something you like in an appropriate way with respect to Kinect. Then you are going to use Kinect 2 in order to get the depth map. Associate depth map with RGB information in order to isolate the object, which is to be extracted from the ground plane and other background. You may have to use RANSAC or some other algorithms to extract the object and find the center point of the object in the 3D space. It may be relative to Kinect or any known position in the world. | 1 |
Question | Calibrate a camera (your phone or computer camera should be utilized) using the chessboard pattern. It’s logically to switch off the auto focus mode of the camera, if it is enabled. The number of images should be at least 30 (with different chessboard positions). Obtain the intrinsic and extrinsic parameters. Once you have calibrated your camera, store intrinsic and extrinsic parameters. Then take a photo of some object (for example, a cup) using the calibrated camera, estimate the height and width of the selected object using both a ruler and an image from the calibrated camera. Calculate the distance between the camera image plane and the selected object. | 1 |
Question | The provided dataset contains left and right images in two different folders with the same name. For this task please select image pair corresponds to your id (0000[Id].png). You need to use 8 point algorithm in order to find the fundamental matrix. For the initial key points detection (minimum 8 corresponding points) you can either do it manually or use any key points detection technique. Next step is to estimate the disparity map for the selected image pair. You may assume the baseline of the stereo camera as the 10cm and focal length of both the left and right side cameras as 2.8mm. If you need any additional assumptions, please elaborate them in the report. | 1 |
Question | Build the hardware, and program and implement suitable code, for a simple color sensor suitable for an application of interest to you. An example of the principle you might employ is a hardware where the component in the center is a photo transistor; it detects light – with varying sensitivity – across the full visible spectrum, a little into the ultraviolet, and into the infrared to a little. Basic idea of the color meter is that LEDs are turned on and off in sequence, and the correspondingly detected signals are recorded. When all the LEDs are off the ambient (background) is recorded. In this way a ”signature” of any particular color patch placed in a location that is illuminated by the LEDs and seen by the phototransistor is generated. You could then, for example, compare the signature that you obtain from an ”unknown” item with the signatures of various items that you previously stored in a ”library”, hence identify (with some quantifiable degree of certainty) the ”unknown” item. Even If you do use this principle, you don’t have to use exactly these components. In your Arduino kit you probably have a variety of LEDs, probably including one ”tri-color” LED, and a phototransistor that you can use. | 1 |
Question | CCD vs CMOS technologies | 0 |
Question | Bayer mosaic filter vs Faveon capture color filter | 0 |
Question | Pinhole camera model | 0 |
Question | Stereo vision, Disparity map and Stereo image rectification | 0 |
Question | Multiple Camera Vision | 0 |
Question | Structure-from-Motion | 0 |
Question | ToF and multi-frequency phase-shift LIDAR technologies | 0 |
Question | SONAR sensing, SONAR transducer and transmitter | 0 |
Question | Distributed acoustic sensing (DAS) and applications | 0 |
Question | Doppler mm-wave RADAR | 0 |
Section 3
Activity Type | Content | Is Graded? |
---|---|---|
Question | A UAV flies through strong wind and begins to oscillate. The pitch-angle change was measured with a gyro over a few seconds. Estimate the proper trajectory of the pitch angle, assuming the gyro readings are normally distributed, by using a Kalman filter. (A minimal 1-D filter sketch follows this table.) | 1
Question | A UAV flies through strong wind and begins to oscillate. The roll-angle change was measured with a gyro over a few seconds. Estimate the proper trajectory of the roll angle, assuming the gyro readings are normally distributed, by using a Kalman filter. | 1
Question | A UAV performs a loop-the-loop. The pitch-angle change was measured with a gyro over a few seconds. Estimate the proper trajectory of the pitch angle, assuming the gyro readings are normally distributed, by using a Kalman filter. | 1
Question | The human CoM (center of mass) during walking has been measured with Kinect sensors. Estimate the proper trajectory of the CoM movement, assuming the readings are normally distributed, by using a Kalman filter. | 1
Question | Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on a crane and went for lunch; the robot kept swinging by inertia for a few minutes. After lunch the engineers read the data from the accelerometer. Help them understand what they measured: estimate the proper trajectory of the accelerometer reading, assuming the readings are normally distributed, by using a Kalman filter. | 1
Question | Expected Value and Variance of a Random Variable. Gaussian Distribution. | 0 |
Question | One Dimensional Kalman Filter | 0 |
Question | The main stages of Linear Kalman Filter (KF): Prediction and Update steps. KF initialization | 0 |
Question | Linear Kalman Filter vs Extended KF | 0 |
Question | KF, EKF, UKF and Particle Filter - main features and differences | 0 |
Question | Motion Capture (MoCap) system | 0 |
Question | Multicamera system | 0 |
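The graded tasks in this section all reduce to filtering a noisy scalar signal. A minimal 1-D Kalman filter sketch is given below; the random-walk process model and the noise variances q and r are assumptions that you would need to justify (or replace) in your own report.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.5, x0=0.0, p0=1.0):
    """Filter a scalar signal; q = process noise variance, r = measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Prediction step (random-walk model: the angle is assumed locally constant).
        p = p + q
        # Update step.
        k = p / (p + r)            # Kalman gain, always between 0 and 1
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: noisy oscillating pitch readings sampled at 100 Hz (synthetic data).
t = np.arange(0.0, 3.0, 0.01)
z = 5.0 * np.sin(2.0 * np.pi * 1.5 * t) + np.random.normal(0.0, 0.7, t.size)
pitch_estimate = kalman_1d(z)
```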
Section 4
Activity Type | Content | Is Graded? |
---|---|---|
Question | Take your smartphone and run for at least 100 meters at a constant speed. Your task is to estimate the trajectory along which you ran, using the phone’s sensor readings (accelerometer, gyroscope, GPS, and so on). You should use a multidimensional Kalman filter with sensor fusion to solve this task. In the report, clearly explain all the assumptions you made. (A minimal fusion sketch follows this table.) | 1
Question | Inside the right hand of the AR-601 robot there is an accelerometer. Engineers fixed the robot on a crane and went for lunch; the robot kept swinging by inertia for a few minutes. After lunch the engineers read the data from the accelerometer. Help them understand what they measured: estimate the proper trajectory of the accelerometer reading, assuming the readings are normally distributed, by using a Kalman filter. | 1
Question | GPS vs differential GPS (dGPS) | 0 |
Question | 9DOF IMU | 0 |
Question | Optical vs Mechanical Gyroscope | 0 |
Question | Laser vs Fiber Gyroscope | 0 |
Question | Absolute vs Incremental Encoders | 0 |
Question | MEMS for robot applications | 0 |
Question | What are Smart and Intelligent Sensors? | 0
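For the sensor-fusion tasks in this section, a minimal multidimensional sketch is given below: a 1-D constant-velocity state [position, velocity], with the accelerometer used as a control input in the prediction step and the GPS position fused in the update step. The sample period and all noise covariances are assumed values.

```python
import numpy as np

dt = 0.1                                 # sample period in seconds (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (constant-velocity model)
B = np.array([[0.5 * dt**2], [dt]])      # control model for the acceleration input
H = np.array([[1.0, 0.0]])               # GPS observes position only
Q = 0.05 * np.eye(2)                     # process noise covariance (assumed)
R = np.array([[4.0]])                    # GPS noise covariance in m^2 (assumed)

def fuse_step(x, P, accel, gps_pos):
    """One predict/update cycle: x is the 2x1 state, P its 2x2 covariance."""
    # Prediction: propagate the state using the accelerometer reading.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the GPS position measurement.
    y = np.array([[gps_pos]]) - H @ x    # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros((2, 1)), np.eye(2)       # initial state and covariance
x, P = fuse_step(x, P, accel=0.3, gps_pos=0.5)
```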
Final assessment
Section 1
- Why do mobile robots need sensors?
- Why do mobile robots need actuators?
- What are the risks of not calibrating the robot sensors?
- Why do we need to calibrate sensors?
- What’s the difference between external and internal robot sensors?
- What is the difference between Statistical and Systematic errors?
- What is the difference between accuracy and precision?
- Which type of error influences accuracy, and which type influences precision?
- Can we eliminate stochastic errors?
- Can we eliminate systematic errors?
- What is the difference between direct and indirect measurements?
- How can we determine the uncertainty of indirect measurements?
- Which type of error can be introduced into measurements if you do not perform sensor calibration?
- Sensor data contain rare random deviations (outliers), characterized a priori as unreliable values with an unknown distribution law. Which algorithm can we apply to eliminate these unreliable values?
- It is known that the instrumental error of the HOKUYO URG-04LX-UG01 laser rangefinder (LRF) is about ±3% of the measured value. Consider a quadcopter with the LRF that flies along a corridor, stopping every 1 m to map the environment with the rangefinder. Due to the quadcopter’s fluctuations, the total LRF measurement error increases up to ±10% of the measured value. How long does the quadcopter need to stay at each stop to map the environment with minimal error, if the LRF performs 10 scans/s?
- Why do we use the regression analysis?
- What do we mean when we speak about regression data?
- Can we use linear regression for variables with strong nonlinear relationships?
- Can a least-squares fit be applied for polynomials?
- Can we apply regression analysis to a noisy wave described by a sine function?
- What is an outlier?
- What is the main idea of RANSAC? (A worked sketch follows this list.)
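As referenced above, here is an illustrative sketch of least-squares line fitting and a bare-bones RANSAC loop on synthetic data with injected outliers. The inlier threshold and the iteration count are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)
y[::10] += 15.0                           # inject a few outliers

# Ordinary least squares: solve A @ [slope, intercept] ~= y.
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# Minimal RANSAC: sample 2 points, fit a line, count inliers, keep the best model.
best_inliers, best_model = 0, (slope, intercept)
for _ in range(200):
    i, j = rng.choice(x.size, size=2, replace=False)
    if x[i] == x[j]:
        continue
    m = (y[j] - y[i]) / (x[j] - x[i])
    b = y[i] - m * x[i]
    inliers = int(np.sum(np.abs(y - (m * x + b)) < 1.0))   # 1.0 = inlier threshold
    if inliers > best_inliers:
        best_inliers, best_model = inliers, (m, b)
```

Unlike the plain least-squares fit, the RANSAC estimate is driven only by the inlier consensus, so the injected outliers barely affect it.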
Section 2
- What’s the difference between CCD and CMOS?
- Which advantages of CMOS technology do you know?
- What’s the difference between the Bayer mosaic filter and the Foveon capture filter?
- Why does the Bayer mosaic array have twice as many green color filters as red and blue ones?
- Why do we need to calibrate a camera?
- Which objects or structures can we use for camera calibration?
- What’s the difference between intrinsic and extrinsic parameters?
- What’s the difference between radial and tangential distortions?
- Can we calibrate a camera with autofocusing?
- What is the pinhole camera model?
- Does the pinhole camera model contain a lens?
- What is the name of the calibration parameters that transform the 3D world coordinate system into the 3D camera coordinate system?
- What is the name of the calibration parameters that transform the 3D camera coordinates into 2D image coordinates?
- What is the optical center of a camera?
- Which technology is widely used for producing a megapixel camera? CMOS or CCD?
- What is the name of the matrix that contains the focal length and the skew coefficient? (A projection sketch using this matrix is given after this section’s questions.)
- What are the applications of stereo vision?
- What is stereo vision?
- Why do we calibrate the stereo camera system?
- What is the output of stereo vision?
- How do we get 3D reconstruction from binocular stereo?
- Which visual cues (patterns) can provide 3D information from a 2D image?
- What is triangulation?
- What is Stereo Image Rectification?
- Why do we take multiple pairs of chessboard images from different angles during calibration?
- What is the purpose of using an asymmetric chessboard as a calibration pattern?
- Can we store the calibration images in a format with lossy compression?
- What is the connection between anaglyph image and stereo image rectification?
- What is the disparity map in stereo vision reconstruction process?
- What is the result of Stereo Vision Scene Reconstruction?
- What’s the difference between Time-of-flight & Structured-light sensors?
- Give a short description of depth measurement techniques for Stereo Vision, ToF cameras, Kinect, and LIDAR.
- What is the motivation to use 3D sensors?
- Which 3D sensing applications do you know?
- What are the disadvantages of Multiple Camera Vision?
- What is a ToF camera?
- What’s the difference between a ToF camera and a scanning LIDAR?
- What is the difference between ToF imaging with Pulsed Modulation and with Continuous Wave Modulation?
- What are flying pixels?
- What is ToF depth inhomogeneity?
- What are the sources of systematic error for depth sensors?
- Describe the main principle of Structured Light Imaging (e.g. for Kinect).
- What’s the difference between LIDAR and Kinect?
- What are the basic camera elements?
- What is the focal length?
- What are the Field of View and Angle of View?
- How do the Angle of View and the Focal Length change when you zoom in?
- What is the Depth of Field (DoF)?
- What is Macro Photography (Close-up shooting) mode?
- What is a Fish eye camera?
- What is the configuration concept of a Fish eye camera?
- What are the main characteristics of an IR / thermal imaging camera?
- What is LIDAR?
- What does a conventional LIDAR system involve?
- What is the difference between incoherent and coherent LIDAR detection schemes?
- How does LIDAR work?
- Which optical components can a LIDAR include?
- What is the advantage of low-energy micropulse LIDAR?
- What is the difference between ToF and multi-frequency phase-shift LIDAR technologies?
- Which achievements can we expect from the LIDAR industry in the near future?
- Give examples of LIDAR applications in surveillance and security.
- What is the motivation to use LIDARs in Airborne Laser Scanning?
- What is SONAR?
- How does SONAR work?
- What are the main SONAR characteristics?
- How can the features of objects or surfaces influence SONAR sensing?
- How does SONAR sensing change as the operating frequency increases?
- What is the shape of the directional diagram of a typical SONAR?
- What are SONAR applications?
- What is the typical material for the modern SONAR transducers?
- How does a piezo crystal work as a SONAR transducer and transmitter?
- What is the piezoelectric effect? (direct and inverse)
- How do Level Sensors work in tanks or cisterns?
- How do Echo Sounders work?
- What is the Doppler effect?
- What is Distributed acoustic sensing (DAS)?
- How can Distributed acoustic sensing (DAS) sense the acoustic signal?
- What is Rayleigh scattering?
- How does Doppler radar work?
- Give examples of Distributed acoustic sensing (DAS) applications.
- What is the mm-wave RADAR?
- How does mm-wave RADAR work?
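As referenced in the calibration questions above, here is a short illustrative sketch of the pinhole projection chain: the extrinsic parameters [R | t] map world points into camera coordinates, and the intrinsic (calibration) matrix K maps camera coordinates into pixels. All numbers below are made up.

```python
import numpy as np

K = np.array([[800.0,   0.0, 320.0],    # fx, skew, cx
              [  0.0, 800.0, 240.0],    #      fy,  cy
              [  0.0,   0.0,   1.0]])   # intrinsic (calibration) matrix
R = np.eye(3)                           # extrinsic rotation (camera aligned with world)
t = np.array([0.0, 0.0, 0.0])           # extrinsic translation

X_world = np.array([0.2, -0.1, 2.0])    # a 3D point 2 m in front of the camera
X_cam = R @ X_world + t                 # world -> camera coordinates (extrinsics)
uvw = K @ X_cam                         # camera -> homogeneous image coordinates
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2] # perspective divide -> pixel coordinates
```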
Section 3
- What’s the difference between least-squares fitting and Kalman filtration?
- What’s the difference between Conventional and Recursive Estimating the Mean?
- What’s the difference between filtration and smoothing?
- Does averaging deal with smoothing or filtration?
- What are the advantages of Kalman filter implementation?
- What can be the applications of Kalman filter?
- What are the maximal and minimal values of the Kalman gain (Kalman coefficient), and what do they mean?
- What is the difference between Kalman Filter (KF) and Extended Kalman Filter (EKF)?
- The observations are processed using a Kalman Filter. Imagine that the state has 5 elements and there is 1 observation data type. What is the size of the estimation error covariance matrix?
- Describe the two main steps of the Kalman filter. (The standard equations are given after this section’s questions.)
- With what type of random variable distribution does the Kalman filter work?
- What is sensor fusion?
- What is the advantage of sensor fusion?
- Give an example of obtaining qualitatively new information from Sensor Fusion.
- Give examples of Sensor fusion applications.
- Why do we build multi-sensor systems instead of using a single sensor?
- What is the motivation to use Sensor Fusion in Robotics?
- Why don’t we simply average sensor data instead of fusing it?
- How can we combine (process) sensor data to get the best estimate of the measured value?
- Can we fuse sensors of different types, with different measurement techniques and accuracies?
- Why do we represent sensor models in terms of probability distributions?
- What is the mean of a random variable? What is the variance of a random variable?
- What is the Probability Density Function (PDF)? What is the Multi-modal Probability Density Function?
- What are the advantages of using Gaussian PDF?
- What does the central limit theorem state?
- Let’s merge two Gaussian noise models. Is the standard deviation of the merged model bigger than that of the initial noise models?
- In sensor fusion, in which case can data averaging be effective?
- What’s the difference between the sensor fusion by Kalman Filter and Particle Filter?
- Can we process non-Gaussian noise with a Linear Kalman Filter?
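As referenced above, these are the standard Linear Kalman Filter equations in common textbook notation (F = state transition, B = control model, H = observation model, Q and R = process and measurement noise covariances):

```latex
% Prediction step
\hat{x}_{k \mid k-1} = F_k \, \hat{x}_{k-1 \mid k-1} + B_k u_k , \qquad
P_{k \mid k-1} = F_k P_{k-1 \mid k-1} F_k^{\top} + Q_k

% Update step
K_k = P_{k \mid k-1} H_k^{\top} \left( H_k P_{k \mid k-1} H_k^{\top} + R_k \right)^{-1}
\hat{x}_{k \mid k} = \hat{x}_{k \mid k-1} + K_k \left( z_k - H_k \, \hat{x}_{k \mid k-1} \right) , \qquad
P_{k \mid k} = \left( I - K_k H_k \right) P_{k \mid k-1}
```

Note that for an n-element state the estimation error covariance P is n × n (for example, 5 × 5 for a 5-element state), independent of how many observation types there are.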
Section 4
- What can GPS module do?
- Is a GPS module typically able to work indoors?
- What does the altimeter module include?
- What can an altimeter module do?
- What is inertia? What are inertial sensors?
- Which types of inertial sensors do you know?
- Which two basic classes of rotation-sensing gyros do you know?
- What is the difference between Rate gyros and Rate integrating gyros?
- What is IMU? What can it do?
- What does accelerometer measure?
- Can accelerometers measure gravity?
- What’s the difference between accelerometer and gyro?
- What can a 3D gyro measure?
- Why is a built-in temperature sensor used in modern gyros and accelerometers?
- Why are embedded power-down or sleep functions used in modern gyros?
- What is a magnetometer?
- What can a magnetometer (compass module) do?
- How does a MEMS accelerometer work?
- Which material is used for a MEMS accelerometer’s spring and proof mass?
- What is the Coriolis force?
- Give examples of the Coriolis force appearance.
- Does a MEMS gyroscope contain rotating mechanical parts?
- How does a MEMS gyro work?
- Which optical gyros do you know?
- Which components of a Laser Gyro do you know?
- How does a laser gyro work?
- What is the Sagnac effect?
- What are the advantages and disadvantages of a Laser Gyro?
- What is the difference between Ring-Laser Gyro (RLG) and Fibre optic gyro (FOG)?
- Which robot position sensors do you know?
- Which sensors can be used to measure mechanical quantities?
- How do linear potentiometers work?
- How do rotary potentiometers work?
- What are the strain gauges?
- What is the difference between strain gauges and tensometers?
- How does a differential pressure sensor work?
- How do Force-Sensing Resistors (FSR) work?
- What are Force-Sensing Resistors (FSR)?
- How do Capacitive sensors work?
- What is the difference in position (displacement) measurement between conventional and differential capacitive sensors?
- Compare capacitive and inductive position sensors.
- What is the working principle of an inductive position sensor?
- What is piezoelectricity?
- How does sensitivity depend on frequency in a typical piezoelectric sensor?
- What does an optical encoder measure?
- What are the main components of an optical encoder?
- Can incremental encoders retain the angular position when the power is switched off?
- What is the difference between incremental and absolute optical encoders?
- What is the Gray Code? (A conversion sketch is given after this section’s questions.)
- Does it make sense to apply the Gray code to incremental encoders?
- What is the Resolver?
- What is the difference between Resolvers and Optical Encoders?
- What is MEMS?
- What is the motivation to use MEMS?
- Which materials can be used in MEMS fabrication?
- Which basic MEMS fabrication techniques do you know?
- What is the correct order of the fabrication process steps (etching, patterning, deposition)?
- What is the patterning?
- What is the etching?
- Which types of micromachining do you know?
- Which actuators do you know?
- Give examples of MEMS applications.
- What are the typical MEMS dimensions?
- What are Smart sensors?
- What is the motivation for Smart sensors?
- What are the advantages of Smart sensors?
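For the Gray code questions above, an illustrative conversion sketch: absolute encoders use the Gray code so that exactly one bit changes between adjacent angular positions, which prevents large read errors at code transitions.

```python
def binary_to_gray(n: int) -> int:
    """Standard reflected binary (Gray) code of n."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray code by cumulative XOR of the shifted value."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Adjacent encoder positions differ in exactly one bit: 3 -> 0b010, 4 -> 0b110.
assert binary_to_gray(3) == 0b010 and binary_to_gray(4) == 0b110
assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(256))
```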
The retake exam
Section 1
Section 2
Section 3
Section 4