Kenji Kawaguchi

Ph.D. student
Massachusetts Institute of Technology (MIT)
Computer Science and Artificial Intelligence Laboratory (CSAIL)
Learning and Intelligent Systems Group (LIS)


Recent Publications

Refereed Conference Papers

Kenji Kawaguchi. Deep Learning without Poor Local Minima. In Advances in Neural Information Processing Systems (NIPS), 2016.
[pdf] [BibTeX] [Spotlight Video] [Talk] Selected for NIPS oral presentation (top ~2% of submissions)

Kenji Kawaguchi*, Bo Xie*, and Le Song. Deep Semi-Random Features for Nonlinear Function Approximation. In Proceedings of the 32nd AAAI Conference on Artificial Intelligence (AAAI), 2018. (* equal contribution)
[pdf] [BibTeX]

Kenji Kawaguchi. Bounded Optimal Exploration in MDP. In Proceedings of the 30th AAAI Conference on Artificial Intelligence (AAAI), 2016.
[pdf] [BibTeX]

Kenji Kawaguchi, Leslie Pack Kaelbling and Tomás Lozano-Pérez. Bayesian Optimization with Exponential Convergence. In Advances in Neural Information Processing Systems (NIPS), 2015.
[pdf] [BibTeX] [Code]

Refereed Journal Articles

Kenji Kawaguchi, Yu Maruyama and Xiaoyu Zheng. Global Continuous Optimization with Error Bound and Fast Convergence. Journal of Artificial Intelligence Research (JAIR), 56: 153-195, 2016.
[pdf] [BibTeX]

Xiaoyu Zheng, Hiroto Itoh, Kenji Kawaguchi, Hitoshi Tamaki and Yu Maruyama. Application of Bayesian nonparametric models to the uncertainty and sensitivity analysis of source term in a BWR severe accident. Reliability Engineering & System Safety, 138: 253-262, 2015.
[pdf] [BibTeX]

Jun Ishikawa, Kenji Kawaguchi and Yu Maruyama. Analysis for iodine release from unit 3 of Fukushima Dai-ichi nuclear power plant with consideration of water phase iodine chemistry. Journal of Nuclear Science and Technology, 52(3):308-315, 2015.
[pdf] [BibTeX]

Preprints

Kenji Kawaguchi and Yoshua Bengio. Generalization in Machine Learning via Analytical Learning Theory. arXiv preprint arXiv:1802.07426, 2018.
[pdf] [BibTeX]

Kenji Kawaguchi, Leslie Pack Kaelbling and Yoshua Bengio. Generalization in Deep Learning. arXiv preprint arXiv:1710.05468, 2017.
[pdf] [BibTeX] [Code]

Tomaso Poggio, Kenji Kawaguchi, Qianli Liao, Brando Miranda, Lorenzo Rosasco, Xavier Boix, Jack Hidary and Hrushikesh Mhaskar. Theory of Deep Learning III: explaining the non-overfitting puzzle. arXiv preprint arXiv:1801.00173, 2018.
[pdf] [BibTeX]

Technical Reports

Qianli Liao, Kenji Kawaguchi and Tomaso Poggio. Streaming Normalization: Towards Simpler and More Biologically-plausible Normalizations for Online and Recurrent Learning. Massachusetts Institute of Technology CBMM Memo No. 57, 2016.
[pdf] [BibTeX]


Code

Bayesian Optimization with Exponential Convergence

Generalization in Deep Learning


Education

Massachusetts Institute of Technology             2014/09 - Present
Ph.D. student, Electrical Engineering and Computer Science
Advisors: Leslie Pack Kaelbling and Tomás Lozano-Pérez.
GPA: 5.0/5.0 (highest internal grades: A+ in all classes except one in which A was the highest grade offered)

Massachusetts Institute of Technology             2016/02
M.S., Electrical Engineering and Computer Science
Advisors: Leslie Pack Kaelbling and Tomás Lozano-Pérez.
Thesis: Towards Practical Theory: Bayesian Optimization and Optimal Exploration
GPA: 5.0/5.0 (highest internal grades: A+ in all classes except one in which A was the highest grade offered)


Awards

Funai Overseas Scholarship
Funai Foundation for Information Technology (FFIT)
April 2014 - August 2016

Nakajima Foundation Fellowship
December 2013

Additional Honors & Awards