1 Introduction
1.1 Deep Learning Background
1.2 Structure-Sensitive Neural Networks
1.3 The Proposed Tree-Based Convolutional Neural Networks
1.4 Overview of the Book
2 Preliminaries and Related Work
2.1 General Neural Networks
2.1.1 Neurons and Multi-Layer Perceptrons
2.1.2 Training of Neural Networks: Backpropagation
2.1.3 Pros and Cons of Multi-Layer Perceptrons
2.1.4 Pretraining of Neural Networks
2.2 Neural Networks Applied in Natural Language Processing
2.2.1 The Characteristics of Natural Language
2.2.2 Language Models
2.2.3 Word Embeddings
2.3 Existing Structure-Sensitive Neural Networks
2.3.1 Convolutional Neural Networks
2.3.2 Recurrent Neural Networks
2.3.3 Recursive Neural Networks
2.4 Summary and Discussions
3 General Concepts of Tree-Based Convolutional Neural Networks (TBCNNs)
3.1 Idea and Formulation
3.2 Applications of TBCNNs
3.3 Issues in Designing TBCNNs
4 TBCNN for Programs' Abstract Syntax Trees (ASTs)
4.1 Background of Program Analysis
4.2 Proposed Model
4.2.1 Overview
4.2.2 Representation Learning of AST Nodes
4.2.3 Encoding Layer
4.2.4 AST-Based Convolutional Layer
4.2.5 Dynamic Pooling
4.2.6 Continuous Binary Tree
4.3 Experiments
4.3.1 Unsupervised Representation Learning
4.3.2 Program Classification
4.3.3 Detecting Bubble Sort
4.3.4 Model Analysis
4.4 Summary and Discussions
5 TBCNN for Constituency Trees in Natural Language Processing
5.1 Background of Sentence Modeling and Constituency Trees
5.2 Proposed Model
5.2.1 Constituency Trees as Input
5.2.2 Recursively Representing Intermediate Layers
5.2.3 Constituency Tree-Based Convolutional Layer
5.2.4 Dynamic Pooling Layer
5.3
About the Authors:
Lili Mou is currently a research scientist at AdeptMind Research. He received his BS and PhD degrees from the School of EECS, Peking University, in 2012 and 2017, respectively. He then worked as a postdoctoral fellow at the University of Waterloo. His current research interests include deep learning applied to natural language processing and to programming language processing. His work has been published at leading conferences and in respected journals, including AAAI, ACL, CIKM, COLING, EMNLP, ICML, IJCAI, INTERSPEECH, LREC, and TACL. He has served as a primary reviewer/PC member for top venues including AAAI, ACL, COLING, IJCNLP, and NAACL-HLT. Lili received the "Outstanding PhD Thesis Award" from Peking University and the "Top-10 Student Scholars Prize" from the School of EECS, Peking University, for his research achievements.
Zhi Jin is a professor of Computer Science at Peking University. In addition, she is deputy director of the Key Laboratory of High Confidence Software Technologies (Ministry of Education) at Peking University and director of the CCF Technical Committee of Software Engineering. Her research work is primarily concerned with knowledge engineering and requirements engineering, focusing on knowledge/requirements elicitation, conceptual modeling, and analysis. Recently, she has begun focusing more on modeling adaptive software systems. She is/was the principal investigator of over ten national competitive grants, including serving as chief scientist of a national basic research (973) project for the Ministry of Science and Technology of China and as project leader of three key projects for the National Natural Science Foundation of China. She was the General Chair of RE2016, Program Co-Chair of COMPSAC2011, and General Co-Chair and Program Co-Chair of KSEM2010 and KSEM2009. She is executive editor-in-chief of the Chinese Journal of Software and serves on the editorial boards of REJ and IJSEKE. She won an Outstanding Youth Fund award from the National Natural Science Foundation of China in 2006 and was named a Distinguished Young Scholar of the Chinese Academy of Sciences in 2001. She received the Zhong Chuang Software Talent Award in 1998 and the First Prize in Science and Technology Outstanding Achievement: Science and Technology Progress Award (Ministry of Education, China) in 2013. She is the author or co-author of three books and more than 120 journal and conference publications.