Naganand Yadati

Postdoctoral Research Fellow

School of Computing

National University of Singapore

Advisor: Prof. Arnab Bhattacharyya








Bachelor of Engineering (B.E.)

Rashtreeya Vidyalaya College of Engineering, Bangalore

2010-2014




Deep Learning with Emphasis on Graph Neural Networks

Understanding structured data across varied domains, from social networks and biological systems to knowledge graphs, is becoming increasingly critical.

Traditional deep learning models (e.g., convolutions, transformers) are well suited to regularly ordered data such as images and text, but often fall short on the irregular, variable-sized structure of graph data.

This challenge has led to the development of Graph Neural Networks (GNNs), a class of deep learning models tailored to graph data, emphasising the crucial interdependencies between nodes.
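The interdependencies a GNN exploits can be illustrated with a minimal sketch (hypothetical toy example, not taken from the publications below): the core operation of a GNN layer is neighbourhood aggregation, where each node updates its representation from its neighbours' features.

```python
# Illustrative sketch: one round of neighbourhood aggregation, the core
# operation of a GNN layer. The graph, feature values, and function name
# are hypothetical examples, not code from the work described above.
def aggregate(neighbours, features):
    """For each node, average its neighbours' feature vectors."""
    out = {}
    for node, nbrs in neighbours.items():
        dim = len(features[node])
        summed = [0.0] * dim
        for n in nbrs:
            for i, v in enumerate(features[n]):
                summed[i] += v
        # Isolated nodes keep their own features unchanged.
        out[node] = [s / len(nbrs) for s in summed] if nbrs else features[node]
    return out

# Toy graph: a triangle of nodes a, b, c with 2-dimensional features.
neighbours = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
features = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
print(aggregate(neighbours, features))
```

Stacking such layers (with learned transformations and non-linearities in between) lets information propagate along the graph structure, which is what distinguishes GNNs from models built for grids and sequences.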


Learning on Rich Structures (e.g., causal graphs)

Motivated by the intricate web of multi-faceted relationships found in many datasets, this research focuses on machine learning over richly structured data such as causal graphs and hypergraphs.

Causal graphs encapsulate cause-and-effect dynamics and are pivotal in fields that require understanding the underlying mechanisms of observed phenomena, such as epidemiology and economics.

Hypergraphs take this a step further by capturing higher-order relationships, where edges can connect more than two nodes, facilitating a multidimensional analysis of interactions.
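The distinction can be made concrete with a small sketch (a hypothetical example, not from the publications below): a hypergraph stores each edge as a set of nodes of any size, whereas an ordinary graph edge connects exactly two nodes.

```python
# Illustrative sketch: a hypergraph represented as a list of hyperedges,
# each a set of nodes of arbitrary size. The node names are hypothetical.
hyperedges = [
    {"gene_a", "gene_b", "gene_c"},   # one interaction among three nodes
    {"gene_c", "gene_d"},             # an ordinary pairwise edge
]

def incident_edges(node, hyperedges):
    """Return the hyperedges that contain the given node."""
    return [e for e in hyperedges if node in e]

print(incident_edges("gene_c", hyperedges))  # both hyperedges contain gene_c
```

Because a single hyperedge can relate three or more nodes at once, group interactions (such as the three-way one above) are represented directly rather than being approximated by a clique of pairwise edges.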



Google Scholar





Key Publications




HyperGCN: A New Method for Training Graph Convolutional Networks on Hypergraphs

In Proceedings of NeurIPS'19 | code | slides

An innovative and effective extension of graph convolutional networks to hypergraphs, validated by extensive experiments on real-world datasets.


Neural Message Passing for Multi-Relational Ordered and Recursive Hypergraphs

In Proceedings of NeurIPS'20 | code

Frameworks that extend Message Passing Neural Networks to handle multi-relational, ordered, and recursive hypergraph structures in real-world learning tasks.


A Convex Formulation for Graph Convolutional Training: Two Layer Case

In Proceedings of ICDM'22 | code

A convex approach to training two-layer ReLU-based Graph Neural Networks that guarantees global optimality, in a field where theoretical understanding of optimisation has been limited.


GAINER: Graph Machine Learning with Node-specific Radius for Classification of Texts

To appear in Proceedings of EACL'24

Introduces node-specific message-passing radii in graph machine learning for NLP, enhancing model flexibility; validated on several NLP tasks.


EMNLP Tutorial on Graph-based Deep Learning in Natural Language Processing

In Proceedings of EMNLP'19 | video

A survey of Graph Neural Network models in NLP, covering a broad range of tasks such as relation extraction and question answering.