
Machine Learning and Deep Learning

Curriculum

Certificate students will select three classes of interest from the following courses offered in our Professional Master’s Program. Depending on the yearly course offerings and availability, you can complete the certificate in one to three quarters. 

Autumn 2023

This course will cover topics related to control (Proportional-Integral-Derivative and Model Predictive Control applied to trajectory following), state estimation (particle filters, motion models, sensor models), planning (A*, Rapidly-exploring Random Trees), and learning. Each assignment will involve student teams implementing the algorithms learned in lecture on 1/10th-scale rally cars. Concepts from the assignments will culminate in a partially open-ended final project with a final demo on the rally cars. The course will involve programming in a Linux and Python environment, along with ROS for interfacing with the robot.
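To give a flavor of the control portion, below is a minimal sketch of a discrete PID controller of the kind used for trajectory following; the gains, time step, and setpoint are illustrative and not taken from the course materials.

# Minimal discrete PID controller (illustrative gains and time step,
# not the course's actual implementation).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: steer toward a reference heading of 0.5 rad.
controller = PID(kp=1.0, ki=0.1, kd=0.05, dt=0.05)
steering_command = controller.update(setpoint=0.5, measurement=0.3)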

Computer vision has made tremendous progress over the past decade in solving problems such as image classification, object detection, semantic segmentation, and 3D reconstruction. A major paradigm shift occurred in 2012, when the technique of deep learning began to replace hand-crafted features with automatically learned ones. The purpose of this class is to introduce students to both classic techniques (pre-2012) as well as modern ones (since 2012), along with fundamental concepts that underlie both.

This course introduces theoretical formulations and practical applications of deep learning for big visual data. It will cover conventional unsupervised and supervised machine learning, followed by neural-network-based deep learning and important related issues such as reinforcement learning, few-shot learning, domain adaptation, open-set and long-tailed data learning, and active learning. It will cover hidden Markov models and recurrent neural networks for temporal visual data. Finally, it explores deep learning techniques with applications to image/radar/lidar object detection and recognition, as well as video object segmentation and tracking.
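One of the listed topics is recurrent neural networks for temporal visual data; the following is a minimal NumPy sketch of a single Elman-style recurrent cell stepping over a sequence of per-frame features. The dimensions, weights, and inputs are made up for illustration.

import numpy as np

# A single recurrent (Elman) cell stepping over a sequence of frame features.
# Dimensions and random weights are illustrative only.
rng = np.random.default_rng(0)
feature_dim, hidden_dim, num_frames = 8, 16, 10

W_xh = rng.normal(scale=0.1, size=(hidden_dim, feature_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

frames = rng.normal(size=(num_frames, feature_dim))  # stand-in for per-frame features
h = np.zeros(hidden_dim)
for x in frames:
    h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # hidden state summarizes the sequence so far
print(h.shape)  # (16,)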

Winter 2024

Many security applications, such as credit card fraud, malware, and spam detection, involve large amounts of data about both the system and adversarial actions. This course will study the use of machine learning for detecting and mitigating cyber threats arising in commercial applications. The ability to identify which machine learning algorithms are useful for specific security applications helps us improve defenses against attacks and anticipate the attack variants that may arise in the future. Classes will consist of lectures followed by hands-on Python labs in which students first learn about a cyber threat, how to extract essential features, and how to preprocess the data, and then identify a suitable suite of ML algorithms to detect and mitigate that threat.
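The lab workflow described above (learn about a threat, extract features, preprocess the data, pick an ML algorithm) can be sketched in a few lines; the toy messages and the scikit-learn model choice below are illustrative, not the course's actual labs.

# Toy spam-detection pipeline: text feature extraction + a classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = ["win a free prize now", "meeting at 3pm tomorrow",
            "claim your free reward", "project report attached"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = legitimate

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)
print(model.predict(["free prize waiting for you"]))  # expected: [1]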

Large language models (LLMs) have become one of the most popular and exciting themes in machine learning in the past few years with the advent of generative AI and systems like ChatGPT. Applications range from sentiment analysis to question answering to text summarization and more. LLMs are revolutionizing the way we interact with technology and our world, with virtual assistants being one example. In the first half of the course, we will begin with a recap of deep learning and natural language processing before moving into large language models and embeddings. In the second half, we will build on these LLM foundations and discuss generative AI applications, models, tools, and techniques, including the GPT-3.5 (ChatGPT) and DALL·E 2 APIs. Both parts of the course will involve weekly programming assignments and a mini project to showcase what you have learned on your own analytics/AI webpage.
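Embeddings are a core ingredient in LLM applications such as question answering; below is a minimal sketch of comparing two embeddings with cosine similarity. The vectors are placeholders; in practice they would come from an embedding model or API.

import numpy as np

# Cosine similarity between two text embeddings (vectors here are placeholders).
def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

emb_question = np.array([0.1, 0.7, 0.2])
emb_passage = np.array([0.05, 0.65, 0.3])
print(cosine_similarity(emb_question, emb_passage))  # near 1.0 means semantically close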

Spring 2024

This class is intended for engineers and scientists within and outside Electrical Engineering. What is it useful for? Some examples, from basic to advanced: How many engineers and scientists know that linear interpolation is a poor way to increase sample rate? And why are cubic splines also, under most conditions, a bad choice? How can you do a better job of interpolation? What is an optimal interpolator, and in what metric space is it optimal? What are the best signal representations for machine learning systems that generalize well, that is, do well outside their training data? How do linear time-invariant systems generalize? Why are the Fourier transform and frequency important? How does the concept of frequency have principled depth that goes well beyond simply decomposing arbitrary signals into oscillating components? Yet what are the limitations of Fourier transforms, and how does modern machine learning potentially get around these limitations?
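The point about linear interpolation can be seen in a few lines of NumPy/SciPy: upsample a sinusoid by linear interpolation and by bandlimited (FFT-based) resampling and compare the errors. The signal and upsampling factor are illustrative.

import numpy as np
from scipy.signal import resample

n, factor = 64, 4
t = np.arange(n)
x = np.sin(2 * np.pi * 0.1875 * t)        # exactly 12 cycles in the window

t_fine = np.arange(n * factor) / factor
x_linear = np.interp(t_fine, t, x)        # linear interpolation
x_sinc = resample(x, n * factor)          # bandlimited (FFT-based) interpolation

x_true = np.sin(2 * np.pi * 0.1875 * t_fine)
print(np.max(np.abs(x_linear - x_true)))  # much larger error
print(np.max(np.abs(x_sinc - x_true)))    # near machine precision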

TinyML is an emerging area in which large, powerful ML models are converted into executables for embedded systems that are battery operated and far less capable than smartphones (e.g., microcontrollers). TinyML involves the real-time processing of time-series data coming directly from sensors and has applications in agriculture, health, retail, the energy industry, and more. Students will learn how to deploy TinyML models on power- and performance-constrained devices to solve real-world problems, how to implement machine learning algorithms, how to use Python libraries such as NumPy and Pandas, and how to use the C language for deploying TinyML on embedded systems.
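A central step in shrinking a model for a microcontroller is quantization; the sketch below shows the basic idea of post-training int8 weight quantization with NumPy. It is a generic illustration, not a specific TinyML toolchain.

import numpy as np

# Post-training int8 quantization of a weight matrix (generic sketch).
weights = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)

scale = np.abs(weights).max() / 127.0           # map the float range onto int8
q_weights = np.round(weights / scale).astype(np.int8)
dequantized = q_weights.astype(np.float32) * scale

print(np.max(np.abs(weights - dequantized)))    # small quantization error
print(q_weights.nbytes, "bytes vs", weights.nbytes, "bytes")  # 4x smaller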

This course explores a variety of topics related to privacy-preserving machine learning (PPML), focusing on both theoretical and applied aspects. The course begins by considering statistical and information-theoretic notions of privacy. It then considers privacy attacks against machine learning models and examines topics focused on preventing and mitigating such attacks, including secure multi-party computation (MPC), differential privacy (DP), federated learning, robust federated learning, and split learning.
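As a concrete example of one listed topic, the following is a minimal sketch of the Laplace mechanism from differential privacy: release a mean with noise scaled to the query's sensitivity and the privacy budget. The dataset, bounds, and epsilon are illustrative.

import numpy as np

# Differential privacy via the Laplace mechanism (illustrative data and epsilon).
rng = np.random.default_rng(0)
ages = rng.integers(18, 90, size=1000)           # synthetic, bounded data

epsilon = 0.5
sensitivity = (90 - 18) / len(ages)              # max change in the mean from one record
noisy_mean = ages.mean() + rng.laplace(scale=sensitivity / epsilon)

print(ages.mean(), noisy_mean)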


Autumn 2022

This course will cover topics related to control (Proportional-Integral-Derivative and Model Predictive Control applied to trajectory following), state estimation (particle filters, motion models, sensor models), planning (A*, Rapidly-exploring Random Trees), and learning. Each assignment will involve student teams implementing the algorithms learned in lecture on 1/10th-scale rally cars. Concepts from the assignments will culminate in a partially open-ended final project with a final demo on the rally cars. The course will involve programming in a Linux and Python environment, along with ROS for interfacing with the robot.

  • Computer Vision (Mohan)

The field of computer vision has made significant advances in the past decade with the advent of deep learning. Problems previously considered intractable not only have solutions in computer vision but are also being deployed in the real world (e.g., self-driving cars and terrain navigation). The course will introduce applications and methods side by side. We will start with basic concepts in human and computer vision, learn the building blocks for vision, and proceed to machine learning methods for CV. Toward the end of the course, we will also look at state-of-the-art deep learning methods for different vision problems, including medical image detection (MRI analytics), automated captioning of images, image segmentation, handwriting recognition, and more. The course will combine conceptual and hands-on programming assignments, with a focus on learning how to think in vision, developing intuition, and gaining hands-on experience in this space.
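One of the basic building blocks referred to above is convolving an image with a small kernel; a minimal NumPy sketch is below. The tiny "image" and Sobel-style kernel are illustrative.

import numpy as np

# Slide a small kernel over an image and sum the elementwise products
# (illustrative image and kernel; real code would use an optimized library).
def convolve2d(image, kernel):
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.zeros((8, 8))
image[:, 4:] = 1.0                               # a vertical edge
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
print(convolve2d(image, sobel_x))                # strong response along the edge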

This course introduces theoretical formulations and practical applications of deep learning for big visual data. It will cover conventional unsupervised and supervised machine learning, followed by neural-network-based deep learning and important related issues such as reinforcement learning, few-shot learning, domain adaptation, open-set and long-tailed data learning, and active learning. It will cover hidden Markov models and recurrent neural networks for temporal visual data. Finally, it explores deep learning techniques with applications to image/radar/lidar object detection and recognition, as well as video object segmentation and tracking.

Winter 2023

Many security applications, such as credit card fraud, malware, and spam detection, involve large amounts of data about both the system and adversarial actions. This course will study the use of machine learning for detecting and mitigating cyber threats arising in commercial applications. The ability to identify which machine learning algorithms are useful for specific security applications helps us improve defenses against attacks and anticipate the attack variants that may arise in the future. Classes will consist of lectures followed by hands-on Python labs in which students first learn about a cyber threat, how to extract essential features, and how to preprocess the data, and then identify a suitable suite of ML algorithms to detect and mitigate that threat.

  • Advanced Introduction to Machine Learning (Mohan)

In this course, you’ll get a broad overview of many different machine learning methods. We’ll cover linear and logistic regression, k‐nearest neighbors, feature selection and engineering, cross-validation, decision trees and random forests, generative vs. discriminative models, information retrieval, matrix factorization, and machine teaching, among many others. Along the way, we’ll apply these methods to problems in computational biology, recommendation systems, anomaly/fraud detection, computer vision, and natural language processing. The course will include a generous amount of programming to keep what we learn grounded in data and to gain real insights!
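To illustrate the kind of exercise described, here is a minimal scikit-learn sketch of cross-validated logistic regression on a built-in dataset; the dataset and model settings are my own choices, not the course's.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 5-fold cross-validation of a logistic regression classifier.
X, y = load_breast_cancer(return_X_y=True)
scores = cross_val_score(LogisticRegression(max_iter=5000), X, y, cv=5)
print(scores.mean(), scores.std())   # average accuracy across folds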

Spring 2023

This course covers data science applications for energy systems operations and control. Sensors and monitoring systems are producing an ever-increasing amount of data about energy systems, from battery packs, to industrial and commercial buildings, to the bulk transmission grid. In this class we will explore how to use these data to enable cleaner, more sustainable, and more equitable energy systems. We focus on the management and analytics of multi-domain, multi-resolution data, especially on how to integrate data science tools with physical operations.
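A basic step when combining multi-resolution energy data is aligning measurements to a common time grid; the pandas sketch below aggregates synthetic one-minute meter readings to hourly values. The data and names are made up.

import numpy as np
import pandas as pd

# Aggregate synthetic 1-minute load readings (kW) to hourly averages.
index = pd.date_range("2024-01-01", periods=24 * 60, freq="min")
load_kw = pd.Series(5 + np.random.default_rng(0).normal(0, 0.5, len(index)), index=index)

hourly_kw = load_kw.resample("h").mean()   # hourly average load
print(hourly_kw.head())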

This project-intensive, hands-on course focuses on how to implement and accelerate deep learning on power-constrained IoT and mobile devices. Lectures and programming assignments cover a range of topics in deep learning, including feature extraction, convolutional, recurrent, and spiking neural networks, deep learning hardware accelerators, and deep learning programming and code optimization. Programming assignments and projects cover practical applications of embedded artificial intelligence in vision, natural language processing, and sequential data modeling for smart healthcare and smart society.
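Spiking neural networks are one of the listed topics; below is a minimal leaky integrate-and-fire neuron in NumPy. The time step, threshold, and input current are illustrative.

import numpy as np

# Leaky integrate-and-fire neuron driven by a step input (illustrative constants).
dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0
current = np.concatenate([np.zeros(20), 0.08 * np.ones(80)])

v, spikes = 0.0, []
for t, i_in in enumerate(current):
    v += dt / tau * (-v + i_in * tau)   # membrane potential leaks toward i_in * tau
    if v >= v_thresh:
        spikes.append(t)                # emit a spike and reset
        v = v_reset
print(spikes)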

TinyML is an emerging area in which large, powerful ML models are converted into executables for embedded systems that are battery operated and far less capable than smartphones (e.g., microcontrollers). TinyML involves the real-time processing of time-series data coming directly from sensors and has applications in agriculture, health, retail, the energy industry, and more. Students will learn how to deploy TinyML models on power- and performance-constrained devices to solve real-world problems, how to implement machine learning algorithms, how to use Python libraries such as NumPy and Pandas, and how to use the C language for deploying TinyML on embedded systems.

This course explores a variety of topics related to privacy-preserving machine learning (PPML), focusing on both theoretical and applied aspects. The course begins by considering statistical and information-theoretic notions of privacy. It then considers privacy attacks against machine learning models and examines topics focused on preventing and mitigating such attacks, including secure multi-party computation (MPC), differential privacy (DP), federated learning, robust federated learning, and split learning.

Questions? Email pmp@ece.uw.edu