Spring 2025
Topics in Machine Learning (기계학습특론) - Graduate Course
Instructor: Dr. Muhammad Syafrudin
Course information
The objectives of this course are to explore different topics in machine learning, such as Regression, Deep Learning, Convolutional Neural Networks and Self-Attention, Transformers, Generative Models, Explainable AI, and Meta Learning. Through this class, students will gain new insights into recent developments in machine learning, allowing them to deepen their knowledge in areas of interest. Additionally, the course aims to foster research exchange and facilitate discussions among students.
Lecture
Tue, 12:00–15:00 KST at 대양AI센터 735호
Office hours
Appointment by email (Office at AI Center 502호)
Prerequisites
Prior knowledge of Calculus, Linear Algebra, and Probability will be advantageous
Notes
Course materials can be downloaded from ecampus. Please be aware that we will not publicly release the homework assignments this year.
Please note that:
- All lectures are in English.
- Weekly lecture topics may be adjusted or changed without prior notice, depending on the level of understanding shown in class.
** Since most of the course material is based on research papers, active class participation is crucial this semester. There are a few things you need to do before and during class:
1. Presentation: Your presentation will consist of a 25-30 minute talk followed by a 10-15 minute Q&A session.
2. Once you have selected your paper, inform the class by posting a message on ecampus stating "OOO presents OOO paper." Each student should choose their paper at least two weeks before their scheduled presentation.
3. For the paper you have selected, prepare your presentation. In addition, you are required to write a homework assignment called a "paper critique" for ALL the other papers selected by your classmates: a 1-2 page summary of each paper chosen by your peers. Each paper critique should include the following:
- 3 main points of the reading: What is the paper about? What problem does it address? What solution does it propose?
- 3 strengths of the reading: What is the main novelty of the paper? What is its impact? If you were to write such a paper, what aspects would you consider?
- 3 potential improvements for the reading: Are there any weaknesses in the paper? How can it be extended and improved?
- 3 questions you have about the paper.
4. You must submit a hard copy of the paper critique before the class. Additionally, you need to upload your presentation file and paper critique on ecampus prior to your presentation.
5. Each presentation will be followed by a 10-15 minute Q&A session, during which all students should actively engage by asking questions or discussing the paper. This is mandatory for all presentations.
6. If you have a suggestion for a paper that is related to our class, you can propose it with prior approval from me.
Schedule
| 주차(Week) | 강의내용(Class Topic & Contents) | 강의활동유형(Class Type) |
|---|---|---|
| 1 | Course introduction (3/4) | Lecture and discussion |
| 2 | Intro to ML - regression (3/11) | Lecture and discussion |
| 3 | Deep Learning (3/18) | Lecture and discussion |
| 4 | Convolutional Neural Networks & Self-Attention (3/25) | Lecture and discussion |
| 5 | Transformer (4/1) | Lecture and discussion |
| 6 | Generative Model (4/8) | Lecture and discussion |
| 7 | Paper presentation - round one: 3 students (4/15) | Student presentation and discussion |
| 8 | Midterm exam (4/22) | 시험 (Exam) |
| 9 | Paper presentation - round one: 3 students (4/29) | Student presentation and discussion |
| 10 | No class -- Public holiday (5/6) | No class |
| 11 | Paper presentation - round one: 3 students (5/13) | Student presentation and discussion |
| 12 | Explainable AI (5/20) | Lecture and discussion |
| 13 | Meta Learning / Guest Lecture (Tentative) (5/27) | Lecture and discussion |
| 14 | Paper presentation - round two: 4 students (6/3) | Student presentation and discussion |
| 15 | Paper presentation - round two: 4 students (6/10) | Student presentation and discussion |
| 16 | Paper presentation - round two: 1 student (6/17) | Student presentation and discussion |
Grading
The final grade will be calculated using the following weights:
| Component | Final Grade Weight |
|---|---|
| Attendance | 20% |
| Midterm exam | 30% |
| Paper Critiques and Presentations | 50% |
| Total | 100% |
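As a quick illustration of how the weights above combine, here is a minimal Python sketch. The component names and example scores are hypothetical; the official grade is computed by the instructor.

```python
# Grade weights from the table above (components scored on a 0-100 scale).
WEIGHTS = {
    "attendance": 0.20,
    "midterm": 0.30,
    "critiques_presentations": 0.50,
}

def final_grade(scores):
    """Return the weighted sum of component scores."""
    return sum(WEIGHTS[component] * scores[component] for component in WEIGHTS)

# Example with made-up scores:
print(final_grade({"attendance": 100, "midterm": 85, "critiques_presentations": 90}))
```

For instance, with the scores above the weighted total works out to 20 + 25.5 + 45 = 90.5.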
Assignment
Each student may give one or two presentations; in addition, paper critiques of the other students' presentations will be assigned and evaluated.
Submitting an assignment
Instructions for turning in assignments will be posted when the semester starts (in ecampus).
Getting help
For questions about homework, course content, or installation, first try to troubleshoot the problem yourself. If you still need help, post the question on ecampus or in the group chat, where your peers may be able to answer. Note that questions on ecampus are visible to everyone. For private matters, send an email to the helpline: udin [at] sju [dot] ac [dot] kr.
Course Policies
Collaboration policy
We encourage you to talk about and discuss the assignments with your fellow students (and on ecampus), but you are not allowed to look at any other student's assignment or code outside of your pair. Discussion is encouraged; copying is not allowed.
Late day policy
Homework is due before each class. Late submissions are not allowed.
Communication to students
Class announcements will be made through ecampus, where all homework, quizzes, and feedback forms will also be posted. Important note: make sure your settings allow you to receive emails from ecampus.
Academic honesty
We place strong emphasis on academic honesty. As a student, your best guideline is to be reasonable and fair. We encourage teamwork on problem sets, but you should not split up the homework; work on all the problems together.
Presented Papers (Selected by students)
- Wan, A., Chang, Q., AL-Bukhaiti, K., & He, J. (2023). Short-term power load forecasting for combined heat and power using CNN-LSTM enhanced by attention mechanism. Energy, 282, 128274. https://doi.org/10.1016/j.energy.2023.128274
- Mutsaddi, A., & Choudhary, A. (2025). Enhancing Plagiarism Detection in Marathi with a Weighted Ensemble of TF-IDF and BERT Embeddings for Low-Resource Language Processing (Version 1). arXiv. https://doi.org/10.48550/ARXIV.2501.05260
- Ardianto Nugroho, V., & Moo Lee, B. (2025). GPS-Aided Deep Learning for Beam Prediction and Tracking in UAV mmWave Communication. IEEE Access, 13, 117065–117077. https://doi.org/10.1109/ACCESS.2025.3586594
- Prabhu, H., Valadi, J., & Arjunan, P. (2024). Generative Adversarial Network with Soft-Dynamic Time Warping and Parallel Reconstruction for Energy Time Series Anomaly Detection (Version 1). arXiv. https://doi.org/10.48550/ARXIV.2402.14384
- Han, H., Lee, K., & Soylu, F. (2020). Applying the Deep Learning Method for Simulating Outcomes of Educational Interventions. SN Computer Science, 1(2), 70. https://doi.org/10.1007/s42979-020-0075-z
- Wang, G., Liu, D., & Cui, L. (2024). Auto-Embedding Transformer for Interpretable Few-Shot Fault Diagnosis of Rolling Bearings. IEEE Transactions on Reliability, 73(2), 1270–1279. https://doi.org/10.1109/TR.2023.3328597
- Li, X., Liang, S., Lei, Y., Li, C., Hou, Y., Zheng, D., & Ma, T. (2024). CausalMed: Causality-Based Personalized Medication Recommendation Centered on Patient Health State. Proceedings of the 33rd ACM International Conference on Information and Knowledge Management, 1276–1285. https://doi.org/10.1145/3627673.3679542
- Nguyen, Q. M., Nguyen, L. M., & Das, S. (2023). Correlated Attention in Transformers for Multivariate Time Series (Version 1). arXiv. https://doi.org/10.48550/ARXIV.2311.11959
- Ester, M., Kriegel, H.-P., Sander, J., & Xu, X. (1996). A density-based algorithm for discovering clusters in large spatial databases with noise. Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD'96), 226–231. AAAI Press.
- Zhang, X., Li, Y., Wang, J., Sun, B., Ma, W., Sun, P., & Zhang, M. (2024). Large Language Models as Evaluators for Recommendation Explanations. 18th ACM Conference on Recommender Systems, 33–42. https://doi.org/10.1145/3640457.3688075
- Ahmed, I., Ahmad, M., Chehri, A., & Jeon, G. (2023). A Smart-Anomaly-Detection System for Industrial Machines Based on Feature Autoencoder and Deep Learning. Micromachines, 14(1), 154. https://doi.org/10.3390/mi14010154
- Ali, R., Hussain, J., & Lee, S. W. (2023). Multilayer perceptron-based self-care early prediction of children with disabilities. DIGITAL HEALTH, 9, 20552076231184054. https://doi.org/10.1177/20552076231184054
- An, T. T., & Lee, B. M. (2023). Robust Automatic Modulation Classification in Low Signal to Noise Ratio. IEEE Access, 11, 7860–7872. https://doi.org/10.1109/ACCESS.2023.3238995