I’m a PhD student in the Department of Linguistics at Georgetown University, advised by Ethan Wilcox. I am most interested in natural language understanding and in how humans and computational systems interact with meaning and form when they process language and perform linguistic tasks.
Before coming to Georgetown, I was an NLP engineer at NCSOFT, where I worked on NLU and information extraction.
Outside of classes and assistantships, I like to participate in intramural sports as both a player and a coach, and to attend free food events around campus.
In Fall 2025, I will serve as a Teaching Assistant for LING1000 at Georgetown University and supervise CSC494 at the University of Toronto.
You can reach me at jm3743@georgetown.edu.
Publications
Junghyun Min, Xiulin Yang, and Shira Wein. When does meaning backfire? Investigating the role of AMRs in NLI. *SEM 2025.
Lauren Levine, Junghyun Min, and Amir Zeldes. Building UD Cairo for Old English in the Classroom. UDW at SyntaxFest 2025.
Junghyun Min, Minho Lee, Woochul Lee, and Yeonsoo Lee. Punctuation restoration improves structure understanding without supervision. RepL4NLP at NAACL 2025.
Junghyun Min, R. Thomas McCoy, Dipanjan Das, Emily Pitler, and Tal Linzen. Syntactic data augmentation increases robustness to inference heuristics. ACL 2020.
R. Thomas McCoy, Junghyun Min, and Tal Linzen. BERTs of a feather do not generalize together: Large variability in generalization across models with similar test set performance. BlackboxNLP at EMNLP 2020.
Education and work experience
| Ph.D. student, Linguistics, Georgetown University. Advised by Ethan Wilcox | 2024 - |
|---|---|
| Visiting Researcher, Computer Science, University of Toronto | 2025 |
| NLP Engineer, NCSOFT | 2021 - 2024 |
| M.A., Cognitive Science, Johns Hopkins University. Advised by Tal Linzen | 2019 - 2020 |
| Data Analyst, Harford Community College | 2018 - 2019 |
| B.S. Physics, B.A. Mathematics, Johns Hopkins University | 2014 - 2017 |