Instructor: Dr. Liping Liu, 360 CBA Building, +5947, liping@uakron.edu
Credits: 3 hours
Textbooks:
- Manel Martínez-Ramón, Meenu Ajith, and Aswathy Rajendra Kurup, Deep Learning: A Practical Introduction, John Wiley & Sons, 2024.
- Liping Liu, Lecture Notes on Machine Learning Topics
Time and Location: Tuesdays and Thursdays: 3:30-4:45 PM; January 10-May 15, 2026. Regular Classroom: CBA 176 (Computer Lab).
Office Hours: 1:30-3:30 PM, Tuesdays and Thursdays (no appointment necessary).
Course Description: This course studies machine learning through neural network models. It covers multilayer perceptrons for prediction and classification, convolutional neural networks for image recognition, recurrent neural networks for language processing, deep belief networks, and attention and transformer architectures for large language models. The course uses Python programming for projects. Prerequisite: ISM 420
Philosophy: Deep learning is currently the most popular approach to AI and has gained wide media coverage through large language models such as ChatGPT. This course explores the models and algorithms behind the approach and their applications to business analytics.
Course Objectives: Upon satisfactory completion of this course, a student should be able to
- Reinforce Python programming skills and become familiar with deep learning packages
- Understand the concepts and use cases of machine learning
- Understand neural network models and the alignment between network architectures and learning tasks
- Know how to apply TensorFlow and/or PyTorch for network construction and training, and how to prepare business data, including time series, images, videos, and unstructured text, for training
- Know how to validate and evaluate trained models and make recommendations
Weekly Schedule:
- Week 1: Introduction to machine learning (prediction, classification, clustering, supervised learning, unsupervised learning, reinforcement learning, training, validation, and testing), basic neural networks (artificial neurons, activation functions for regression and classification, feed-forward networks)
- Week 2: Python Review: conditional statements, loops, functions, objects, and classes
- Week 3: Python for Data Science: NumPy for manipulating and reshaping arrays, SciPy for numerical optimization and image processing, Scikit-learn for data preprocessing, feature selection, and model selection, Pandas for manipulating time series and data frames, Matplotlib and Seaborn for 2D and 3D data visualization, and the Natural Language Toolkit (NLTK) for processing natural language.
- Week 4: Python for Deep Learning: TensorFlow, Keras, and PyTorch packages for specifying and training neural networks for regression and for binary and multiclass classification (a minimal sketch appears after this schedule).
- Week 5: Training Tune-up: learning rate, minibatch gradient descent, data normalization, overfitting, regularization, dropout, early stopping, data augmentation, and optimizers (Momentum Optimization, Nesterov Accelerated Gradient, Adam, AdaGrad, Adamax, RMSProp).
- Week 6: Multilayer Perceptron: Python coding for feedforward neural networks, Python coding for backpropagation, and mathematics of batch gradient descent for minimizing SSE and cross-entropy using ReLU, sigmoid, step, and linear activation functions (see the gradient-descent sketch after this schedule).
- Week 7: Exam I
- Week 8: Convolutional Neural Networks: concepts and Python coding for convolution and pooling operations, padding, strides, convnet architectures (AlexNet, VGG, ResNet, Inception, Xception, MobileNet, EfficientNet, DenseNet), transforming image matrices, transfer learning with CNNs, and using Keras and TensorFlow to create convnets for image recognition (see the convnet sketch after this schedule)
- Week 9: Recurrent Neural Networks I: sequential data, RNN units, RNN architectures (1:m, m:1, 1:1, and m:m) for business applications, backpropagation through time, deep RNNs, bidirectional RNNs, and multivariate time series prediction using RNNs
- Week 10: Recurrent Neural Networks II: vanishing and exploding gradients in sequential models, Gated Recurrent Units, Long Short-Term Memory models, internal states, LSTM gates, word encoding and embedding, input shape conversion, output of LSTM layers, and using Keras and TensorFlow to design RNN and LSTM models for sentiment analysis (see the LSTM sketch after this schedule)
- Week 11: Recurrent Neural Networks III: encoders and decoders, the RepeatVector and TimeDistributed Dense layers, the Keras Functional API, multistep and multivariate predictions and generations, and text generation with LSTM and Keras
- Week 12: Recurrent Neural Networks IV: the Nadaraya–Watson attention mechanism, the Bahdanau attention mechanism, attention pooling, self-attention and training, multi-head attention, machine translation with encoder–decoder attention, transformers, text generation using a decoder-only transformer architecture, and time-series forecasting using an encoder-only transformer architecture (see the attention sketch after this schedule).
- Week 13: Recurrent Neural Networks V: transformers for large language models (BERT, ChatGPT) and vision transformers
- Week 14: Deep Belief Networks: deep unsupervised learning, Boltzmann machines, Bayesian rules for deep learning, Markov Chain Monte Carlo simulation, variational inference, and Bayesian backpropagation.
- Week 15: Final Exam (May 5-9, 2026)
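For a taste of the Week 4 workflow, below is a minimal sketch of specifying and training a small feedforward network with TensorFlow's Keras API. The synthetic data, layer sizes, and hyperparameters are invented for illustration and are not the course's official examples.

```python
# Minimal Keras sketch: a small feedforward network for multiclass
# classification, trained on synthetic stand-in data.
import numpy as np
import tensorflow as tf

X = np.random.rand(200, 4).astype("float32")   # 200 samples, 4 features
y = np.random.randint(0, 3, size=200)          # 3 invented classes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # multiclass output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```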
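The Week 6 mathematics can be previewed with plain NumPy. The sketch below runs batch gradient descent for a single sigmoid unit minimizing cross-entropy, the simplest case of the backpropagation chain rule; the data and learning rate are invented.

```python
# Batch gradient descent for logistic regression (one sigmoid unit)
# minimizing mean cross-entropy on synthetic data.
import numpy as np

X = np.random.rand(100, 3)                  # 100 samples, 3 features
y = np.random.randint(0, 2, size=(100, 1))  # binary labels
w = np.zeros((3, 1))                        # weights
b = 0.0                                     # bias
lr = 0.1                                    # learning rate

for _ in range(50):
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))            # sigmoid activation
    grad_z = (p - y) / len(X)               # d(mean cross-entropy)/dz
    w -= lr * (X.T @ grad_z)                # chain rule back to the weights
    b -= lr * grad_z.sum()                  # and to the bias
```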
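For Week 8, a convnet in Keras stacks convolution and pooling layers before a dense classifier head. The image shapes and data below are synthetic placeholders, not a course dataset.

```python
# Minimal convnet sketch: convolution, pooling, padding, and a dense head.
import numpy as np
import tensorflow as tf

X = np.random.rand(64, 28, 28, 1).astype("float32")  # 64 fake grayscale images
y = np.random.randint(0, 10, size=64)                # 10 fake class labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=16)
```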
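For Week 10, a sentiment model typically runs integer-encoded text through an Embedding layer into an LSTM. The vocabulary size, sequence length, and labels below are invented for illustration.

```python
# Minimal LSTM sketch for binary sentiment classification on fake token data.
import numpy as np
import tensorflow as tf

X = np.random.randint(1, 5000, size=(64, 100))  # 64 fake reviews, 100 tokens each
y = np.random.randint(0, 2, size=64)            # fake positive/negative labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(100,)),
    tf.keras.layers.Embedding(input_dim=5000, output_dim=32),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=16)
```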
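Finally, the core of the Week 12 material, scaled dot-product attention, fits in a few lines of NumPy; the matrix sizes here are arbitrary.

```python
# Scaled dot-product attention for a single head: softmax(Q K^T / sqrt(d)) V.
import numpy as np

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                               # similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)              # row-wise softmax
    return weights @ V                                          # weighted values

Q = np.random.rand(4, 8)  # 4 query positions, dimension 8
K = np.random.rand(6, 8)  # 6 key positions
V = np.random.rand(6, 8)
print(attention(Q, K, V).shape)  # (4, 8)
```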
Exams: This course will have two major exams as scheduled above. Each exam includes both multiple-choice and hands-on problems.
Assignments: Homework is assigned once a week for 12 weeks; each assignment consists of conceptual questions and hands-on projects classified into three grading categories: correctness, closeness, and completeness. Correctness problems are graded by ecourse.org, and closeness questions are graded and/or commented on by the instructor. Students earn points automatically for each completeness question once it is deemed complete. Assignments are due at the beginning of class meetings on Mondays (except for holidays). No late homework will be graded. Please show your work in a neat and orderly fashion. Write or type your work on one side of the page, on every other line. Use standard-size paper (8 1/2'' by 11''). Do not use spiral notebook paper.
Attendance: Attendance is a must and counts for 10% of your final grade. Attendance will be managed by ecourse.org. The formula for computing your attendance grade is non-linear: the first absence costs one point, the second two points, the third three points, and the fourth four points, so four absences exhaust the full 10 attendance points (1 + 2 + 3 + 4 = 10; an illustrative sketch follows the guidelines below). If you miss the equivalent of three weeks of classes, you fail the course automatically. Under special circumstances, you may take some classes online under the following guidelines:
- You must obtain permission from the instructor at least one day ahead of each online session
- Follow the lectures or recordings, perform all in-class hands-on exercises, and take notes. Within one day of the class, submit your notes and the finished exercises to ecourse.org as proof of attendance.
- All weekly assignments are due at the same time as in-person classes. All exams must be onsite.
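To make the non-linear attendance formula concrete, here is a small illustrative sketch of the cumulative deduction (my own illustration, not ecourse.org's actual code):

```python
def attendance_deduction(absences: int) -> int:
    """Cumulative penalty: the k-th absence costs k points, capped at 10."""
    return min(sum(range(1, absences + 1)), 10)

# Two absences cost 1 + 2 = 3 points; four absences cost 1 + 2 + 3 + 4 = 10,
# i.e., the entire attendance grade.
print(attendance_deduction(2))  # 3
print(attendance_deduction(4))  # 10
```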
Quizzes: I will use quizzes regularly to check your completion of, or preparation for, assignments.
Makeup: Each student with an appropriate excuse may have at most one chance to make up a homework or quiz. Note that this special favor is a privilege, not a right. All makeups must be completed within one week of the due date and before the answer key is released.
Grades: Your final grade will be calculated by the following formula:
35% (HW) + 55% (Tests) + 10% (Attendance)
A = 93-100%; A– = 90-92%; B+ = 87-89%; B = 83-86%; B– = 80-82%; C+ = 77-79%; C = 73-76%; C– = 70-72%; D = 60-69%; F = 59% and below
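To illustrate the weighting with invented scores:

```python
def final_grade(hw: float, tests: float, attendance: float) -> float:
    """Weighted final percentage: 35% homework, 55% tests, 10% attendance."""
    return 0.35 * hw + 0.55 * tests + 0.10 * attendance

# Invented example: HW 90, tests 85, attendance 100
# 0.35*90 + 0.55*85 + 0.10*100 = 31.5 + 46.75 + 10.0 = 88.25 -> B+
print(final_grade(90, 85, 100))  # 88.25
```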
Misconduct: Academic misconduct by a student shall include, but is not limited to: disruption of classes, giving or receiving unauthorized aid on exams or in the preparation of assignments, unauthorized removal of materials from the library, or knowingly misrepresenting the source of any academic work. Academic misconduct by an instructor shall include, but is not limited to: grading student work by criteria other than academic performance, or repeated and willful neglect in the discharge of duly assigned academic duties. Confirmed violations may result in grade penalties beyond the official university sanctions, such as increased scrutiny of future submissions, reduced benefit from curving, if any, and/or a reduction of the overall grade.
On Collaboration: All for-credit assignments, except for those designated as group projects, must be done independently, and collaboration in providing or asking for answers to those assignments constitutes cheating.
On AI Tools: In this class, I allow students to use AI tools to support their learning. However, submitting AI-generated work for credit is a violation of the academic code. If a submitted work is suspected to be AI-generated, the student will be asked to reproduce it in front of the instructor.
Looking for additional help? Students looking for additional assistance outside of the classroom are advised to consider working with a peer tutor through Knack. The University of Akron CBA has partnered with Knack to provide students with access to verified peer tutors who have previously aced this course. To view available tutors, visit uakron.joinknack.com and sign in with your student account. At the same time, if you are doing well in this class, please go to uakron.joinknack.com where you can create a verified tutoring profile and begin helping other students.