Everything you want to know about Raphael Shu
Contents
My Name and Status
Raphael Shu / 朱 中元
PhD Candidate, Nakayama Lab, The University of Tokyo
Research Interests
- Natural Language Processing
- Natural Language Understanding
- Machine Translation
- Deep Learning
- Learning Discrete Latent Structures
- Generative Models
Cool Stuff
- Deep Learning Monitor (deeplearn.org)
- tree2code
- Learning discrete codes for syntax
- lanmt
- Non-autoregressive neural machine translation with latent variables and refinement
- NLPer Blog
- Visualization of optimization methods (Chrome-only)
- Visualization of Monte Carlo methods (Chrome-only)
My Papers
- Raphael Shu, Jason Lee, Hideki Nakayama, Kyunghyun Cho, “Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference using a Delta Posterior”, arXiv abs/1908.07181
- Raphael Shu, Hideki Nakayama and Kyunghyun Cho, “Generating Diverse Translations with Sentence Codes”, ACL 2019
- Jiali Yao, Raphael Shu, Xinjian Li, Katsutoshi Ohtsuki, Hideki Nakayama, “Enabling Real-time Neural IME with Incremental Vocabulary Selection”, NAACL 2019
- [PDF] to be added
- Raphael Shu, Hideki Nakayama, “Generating Syntactically Diverse Translations with Syntactic Codes”, 言語処理学会 (Annual Meeting of the Association for Natural Language Processing) 2019
- Best Paper / 最優秀賞
- Official Report
- Raphael Shu, Hideki Nakayama, “Improving Beam Search by Removing Monotonic Constraint for Neural Machine Translation”, ACL 2018
- Raphael Shu, Hideki Nakayama, “Compressing Word Embeddings via Deep Code Learning” (深層コード学習による単語分散表現の圧縮), 言語処理学会 (Annual Meeting of the Association for Natural Language Processing) 2018
- Raphael Shu, Hideki Nakayama, “Compressing Word Embeddings via Deep Compositional Code Learning”, ICLR 2018
- Raphael Shu, Hideki Nakayama, “An Empirical Study of Adequate Vision Span for Attention-Based Neural Machine Translation”, ACL Workshop on Neural Machine Translation, 2017
- Outstanding Paper Award
- http://aclweb.org/anthology/W/W17/W17-3201.pdf
- Raphael Shu, Hideki Nakayama, “Reducing the Computational Cost of Context-Aware Attention Mechanisms” (文脈を考慮したアテンションメカニズムの計算量の削減), 人工知能大会 (Annual Conference of the Japanese Society for Artificial Intelligence) 2017
- Student Paper Award / 学生奨励賞
- Evaluating Neural Machine Translation in English-Japanese Task
- Second place in human evaluation, third place in JPO accuracy test
- Weblio Pre-reordering Statistical Machine Translation System
- A pre-reordering method using head-restructured CFG parse trees for SMT
- TSUKU Statistical Machine Translation System for the NTCIR-10 PatentMT Task
- Master’s thesis
My Codes
- tree2code: Learning Discrete Syntactic Codes for Structural Diverse Translation
- nmtlab: A simplified framework for neural machine translation
- neuralcompressor
- Compress word embeddings
- https://github.com/zomux/neuralcompressor
- deepy
- Highly extensible deep learning framework based on Theano
- https://github.com/zomux/deepy
- neuralmt
- A neural machine translation modeling toolkit
- https://github.com/zomux/neuralmt
- deepy-draw
- An implementation of DRAW with deepy framework
- https://github.com/zomux/deepy-draw/
- JRNNLM
- An RNNLM implementation in Java
- https://github.com/zomux/jrnnlm
My Experiences
Year | Place | What I did |
---|---|---|
2016 - 2019 | The University of Tokyo, PhD Program | Neural Machine Translation / Multimodal Deep Learning / Discrete Structural Learning |
2014 - 2016 | Weblio, Inc., Research & Development | Machine Translation / Conversational Models / Data Science |
2012 - 2014 | University of Tsukuba, Master Program | Machine Translation |
How to find me
Email: raphael [at] uaca [dot] com