David Chiang 蔣偉

Associate Professor, Computer Science and Engineering
Natural Language Processing Group

My research is in natural language processing, the subfield of computer science that aims to enable computers to understand and produce human language. I focus mainly on language translation and am also interested in syntactic parsing and other areas.

Teaching

Recent and selected publications

Lena Strobl, William Merrill, Gail Weiss, David Chiang, and Dana Angluin. Transformers as recognizers of formal languages: a survey on expressivity. Transactions of the Association for Computational Linguistics, 2024. To appear. PDF BibTeX
Dana Angluin, David Chiang, and Andy Yang. Masked hard-attention transformers and Boolean RASP recognize exactly the star-free languages. 2023. arXiv:2310.13897. PDF BibTeX
Stephen Bothwell, Justin DeBenedetto, Theresa Crnkovich, Hildegund Müller, and David Chiang. Introducing rhetorical parallelism detection: a new task with datasets, metrics, and baselines. In Proc. EMNLP, 5007–5039. 2023. doi:10.18653/v1/2023.emnlp-main.305. PDF BibTeX
Alexandra Butoi, Tim Vieira, Ryan Cotterell, and David Chiang. Efficient algorithms for recognizing weighted tree-adjoining languages. In Proc. EMNLP. 2023. PDF BibTeX
Aarohi Srivastava and David Chiang. BERTwich: extending BERT's capabilities to model dialectal and noisy text. In Findings of ACL: EMNLP. 2023. PDF BibTeX
Brian DuSell and David Chiang. Stack attention: improving the ability of transformers to model hierarchical patterns. In Proc. ICLR. 2024. PDF BibTeX
Chihiro Taguchi, Yusuke Sakai, Parisa Haghani, and David Chiang. Universal automatic phonetic transcription into the International Phonetic Alphabet. In Proc. INTERSPEECH. 2023. doi:10.21437/Interspeech.2023-2584. PDF BibTeX
Alexandra Butoi, Ryan Cotterell, and David Chiang. Convergence and diversity in the control hierarchy. In Proc. ACL. 2023. PDF BibTeX
David Chiang, Peter Cholak, and Anand Pillay. Tighter bounds on the expressivity of transformer encoders. In Proc. ICML, 5544–5562. 2023. PDF BibTeX
Aarohi Srivastava and David Chiang. Fine-tuning BERT with character-level noise for zero-shot transfer to dialects and closely-related languages. In Proc. Workshop on NLP for Similar Languages, Varieties and Dialects. 2023. PDF BibTeX
Patrick Soga and David Chiang. Bridging graph position encodings for transformers with weighted graph-walking automata. Transactions on Machine Learning Research, 2023. PDF BibTeX
Brian DuSell and David Chiang. The surprising computational power of nondeterministic stack RNNs. In Proc. ICLR. 2023. PDF BibTeX
David Chiang, Colin McDonald, and Chung-chieh Shan. Exact recursive probabilistic programming. PACMPL, 2023. doi:10.1145/3586050. PDF BibTeX
Chihiro Taguchi and David Chiang. Introducing morphology in Universal Dependencies Japanese. In Proc. Workshop on Universal Dependencies, 65–72. 2023. PDF BibTeX
David Chiang, Alexander M. Rush, and Boaz Barak. Named tensor notation. Transactions on Machine Learning Research, 2023. PDF BibTeX
Darcey Riley and David Chiang. A continuum of generation tasks for investigating length bias and degenerate repetition. In Proc. BlackboxNLP. 2022. PDF BibTeX
Alexandra Butoi, Brian DuSell, Tim Vieira, Ryan Cotterell, and David Chiang. Algorithms for weighted pushdown automata. In Proc. EMNLP. 2022. PDF BibTeX
David Chiang and Peter Cholak. Overcoming a theoretical limitation of self-attention. In Proc. ACL. 2022. PDF BibTeX
Brian DuSell and David Chiang. Learning hierarchical structures with differentiable nondeterministic stacks. In Proc. ICLR. 2022. PDF BibTeX

full list