Research Interests
Computation theory (automata, formal grammars, the Chomsky hierarchy, Parikh matrices, etc.)
Computational linguistics & NLP (grammar-based NLP, few-shot learning, data augmentation)
Education
Ph.D. in Computer Science, Yonsei University, Seoul, Korea (2019–Present)
Research: formal grammar-based NLP, Parikh matrices, computational linguistics
Bachelor's degree in Computer Science, Yonsei University (2015–2019)
Experience
Theory of Computation Lab, Yonsei University
Research Intern (2017–2018): Worked on automata, grammar classification, and NLP tasks
Graduate Researcher (2019–Present): Leading projects on grammar-based NLP, formal language theory, and Parikh matrices
Projects
EEG Rule Extraction & Data Similarity Approximation (2019–2020): Proposed the project and led the work on data-similarity approximation algorithms
AI Programming Platform & Malware Pattern Extraction: Focused on NLP for code and malware pattern detection
5th AI Grand Challenge (2022): Math problem solving with deep learning models
Financial Data AI Trend Analysis (2023–2024): Industry-academic project with Bankware Global
Publications
Characterizations of M-equivalence and weak M-relation
Joonghyuk Hahn, Hyunjoon Cheon, Yo-Sub Han. International Journal of Foundations of Computer Science (IJFCS), to appear.
Formalizes M-equivalence and weak M-relation over Parikh matrices, extending theoretical foundation for language classification.
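For context, the Parikh matrix of a word over the ordered alphabet {a < b} records letter counts and scattered-subword counts, and distinct words can share a matrix, which is exactly M-equivalence. A standard illustration (a textbook example, not taken from the paper itself):

```latex
% Parikh matrix over the ordered alphabet {a < b}:
% |w|_a and |w|_b count letters; |w|_{ab} counts scattered occurrences of ab.
\[
\Psi(w) =
\begin{pmatrix}
1 & |w|_a & |w|_{ab} \\
0 & 1     & |w|_b    \\
0 & 0     & 1
\end{pmatrix},
\qquad
\Psi(abba) = \Psi(baab) =
\begin{pmatrix}
1 & 2 & 2 \\
0 & 1 & 2 \\
0 & 0 & 1
\end{pmatrix},
\]
% so abba and baab are M-equivalent although they differ as words.
```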
TCProF: Time-Complexity Prediction SSL Framework
Joonghyuk Hahn, Hyeseon Ahn, Jungin Kim, Soohan Lim, Yo-Sub Han. NAACL 2025.
A semi-supervised framework for predicting code time complexity using symbolic analysis in low-resource settings.
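To give a flavor of the symbolic component such a framework might pair with a learned model, the sketch below estimates a coarse complexity class from loop-nesting depth in a Python AST. This is a toy heuristic for illustration only, not the TCProF method; `rough_complexity` and its depth-to-label mapping are assumptions.

```python
import ast

def max_loop_depth(node: ast.AST, depth: int = 0) -> int:
    """Return the maximum nesting depth of for/while loops under `node`."""
    deepest = depth
    for child in ast.iter_child_nodes(node):
        child_depth = depth + 1 if isinstance(child, (ast.For, ast.While)) else depth
        deepest = max(deepest, max_loop_depth(child, child_depth))
    return deepest

def rough_complexity(source: str) -> str:
    """Map loop-nesting depth to a coarse big-O label (toy heuristic, not TCProF)."""
    d = max_loop_depth(ast.parse(source))
    if d == 0:
        return "O(1)"
    return "O(n)" if d == 1 else f"O(n^{d})"

# Two nested loops over n -> "O(n^2)"
print(rough_complexity("for i in range(n):\n    for j in range(n):\n        s = s + 1"))
```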
Advanced Code Time Complexity Prediction Approach Using Contrastive Learning
Shinwoo Park, Joonghyuk Hahn, Elizabeth Orwig, Sang-Ki Ko, Yo-Sub Han. Engineering Applications of Artificial Intelligence, 2025.
Proposes a contrastive learning method that clusters code solutions to the same programming problem, leveraging problem descriptions and reference codes as anchors.
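A minimal sketch of what anchor-based contrastive training can look like, assuming an InfoNCE-style objective with the problem-description embedding as the anchor; the function below is illustrative, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def anchor_contrastive_loss(anchor, pos, neg, tau=0.07):
    """InfoNCE-style loss with a problem-description embedding as anchor:
    embeddings of codes solving the problem (pos) are pulled toward it,
    codes for other problems (neg) are pushed away.
    Shapes: anchor (d,), pos (P, d), neg (N, d)."""
    a = F.normalize(anchor, dim=0)
    p = F.normalize(pos, dim=1) @ a / tau   # anchor-positive similarities, (P,)
    n = F.normalize(neg, dim=1) @ a / tau   # anchor-negative similarities, (N,)
    denom = torch.logsumexp(torch.cat([p, n]), dim=0)
    return (denom - p).mean()               # mean -log softmax over positives

# Toy usage with random embeddings (d = 128)
loss = anchor_contrastive_loss(torch.randn(128), torch.randn(4, 128), torch.randn(16, 128))
```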
On the Decidability of Infix Inclusion Problem
Hyunjoon Cheon, Joonghyuk Hahn, Yo-Sub Han. Theory of Computing Systems, Vol. 68, 2024.
Establishes new decidability results in formal language theory regarding infix inclusion problems.
Universal Rewriting Rules for the Parikh Matrix Injectivity Problem
Ingyu Baek, Joonghyuk Hahn, Yo-Sub Han, Kai Salomaa. DLT 2024, LNCS 14791.
Derives universal rewriting rules that address the Parikh matrix injectivity problem.
SuperST: Superficial Self-Training for Few-Shot Text Classification
Ju-Hyoung Lee, Joonghyuk Hahn, Hyeon-Tae Seo, Jiho Park, Yo-Sub Han. LREC-COLING 2024.
A noise-aware self-training method for few-shot text classification.
ATHENA: Mathematical Reasoning with Thought Expansion
J.B. Kim, Hazel Kim, Joonghyuk Hahn, Yo-Sub Han. EMNLP 2023.
Enhances mathematical reasoning through a structured thought expansion mechanism.
GDA: Grammar-based Data Augmentation for Text Classification using Slot Information
Joonghyuk Hahn, Hyunjoon Cheon, Elizabeth G. Orwig, Su-Hyeon Kim, Sang-Ki Ko, Yo-Sub Han. Findings of EMNLP 2023.
Introduces grammar-based augmentation techniques for slot-annotated data.
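The core idea can be illustrated with a toy slot grammar: templates with slot nonterminals are re-grounded with alternative slot values to synthesize new labeled utterances. The templates and slot values below are invented for illustration and do not come from the paper.

```python
import itertools

# Hypothetical slot grammar: templates expand slot nonterminals into values.
templates = ["book a flight from <src> to <dst>",
             "find flights to <dst> leaving <src>"]
slots = {"<src>": ["Seoul", "Busan"], "<dst>": ["Tokyo", "Osaka"]}

def augment(templates, slots):
    """Yield every grounding of every template (the augmented utterances)."""
    for t in templates:
        names = [s for s in slots if s in t]
        for values in itertools.product(*(slots[s] for s in names)):
            out = t
            for name, value in zip(names, values):
                out = out.replace(name, value)
            yield out

for sentence in augment(templates, slots):
    print(sentence)   # 8 augmented utterances from 2 templates
```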
M-equivalence of Parikh Matrix over a Ternary Alphabet
Joonghyuk Hahn, Hyunjoon Cheon, Yo-Sub Han. CIAA 2023, LNCS 14151.
Analyzes M-equivalence under ternary alphabet constraints.
Boosting Code Summarization by Embedding Code Structures
Jikyeong Son, Joonghyuk Hahn, Hyeon-Tae Seo, Yo-Sub Han. COLING 2022.
Uses graph-structured embeddings to enhance code summarization.
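As a minimal stand-in for the structural signal such a model consumes, one can extract parent-child edges from a program's AST; a graph encoder would then embed these edges alongside the tokens. This sketch is illustrative, not the paper's pipeline.

```python
import ast

def ast_edges(source: str):
    """Yield (parent, child) node-type edges of the program's AST."""
    tree = ast.parse(source)
    for parent in ast.walk(tree):
        for child in ast.iter_child_nodes(parent):
            yield type(parent).__name__, type(child).__name__

# e.g. [('Module', 'FunctionDef'), ('FunctionDef', 'arguments'), ...]
print(list(ast_edges("def add(a, b):\n    return a + b")))
```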
On the Decidability of Infix Inclusion Problem
Hyunjoon Cheon, Joonghyuk Hahn, Yo-Sub Han. DLT 2022, LNCS 13257.
Further explores infix inclusion properties in formal languages.
Self-Training using Rules of Grammar for Few-Shot NLU
Joonghyuk Hahn, Hyunjoon Cheon, Kyuyeol Han, Cheongjae Lee, Junseok Kim, Yo-Sub Han. Findings of EMNLP 2021.
Applies grammar-based pseudo-labeling to few-shot natural language understanding.
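A generic self-training loop with a grammar filter conveys the idea: confident pseudo-labels are kept only when a rule check agrees. Here `train`, `model.predict`, and `grammar_accepts` are hypothetical placeholders, so this is a sketch of the pattern rather than the paper's system.

```python
def self_train(model, labeled, unlabeled, rounds=3, threshold=0.9):
    """Iteratively pseudo-label data, keeping only confident predictions
    that the grammar also accepts (all helpers are hypothetical)."""
    for _ in range(rounds):
        model = train(model, labeled)          # hypothetical training routine
        confident, remaining = [], []
        for x in unlabeled:
            label, conf = model.predict(x)     # pseudo-label with confidence
            if conf >= threshold and grammar_accepts(x, label):
                confident.append((x, label))   # rule-consistent pseudo-label
            else:
                remaining.append(x)
        labeled, unlabeled = labeled + confident, remaining
    return model
```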
Most Pseudo-copy Languages Are Not Context-free
Hyunjoon Cheon, Joonghyuk Hahn, Yo-Sub Han, Sang-Ki Ko. COCOON 2021.
Proves limitations of context-free grammars regarding pseudo-copy languages.
Teaching & Mentoring
TA for Automata & Formal Languages (2019–2022)
Mentor for undergraduate research interns in formal grammar and computational linguistics (2021–2023)
Professional Activities
Reviewer / program committee member for ACL Rolling Review, EMNLP, and other venues (since 2022)
Skills
Programming: C/C++, Python, Java
Tools/Frameworks: LaTeX, Git, PyTorch, TensorFlow
Advisor
Prof. Yo-Sub Han
Department of Computer Science, Yonsei University
Email: emmous@yonsei.ac.kr