Keynote 1

Sparse Adjacency Matrices at the Core of Graph Databases:
GraphBLAS, the Engine Behind the RedisGraph Property Graph Database

Roi Lipman (RedisLabs, USA)

In the last couple of years, we have seen a rise in the number of new graph database vendors. What used to be a field with just a handful of key players has transitioned into a vibrant arena full of innovation and competition, where performance matters most. Engineers and researchers are constantly trying to improve and come up with new techniques to perform graph traversals and other types of graph analysis. With the recent release of GraphBLAS (an open effort to define standard building blocks for graph algorithms in the language of linear algebra), we are able to define graphs using sparse adjacency matrices and evaluate queries by means of linear algebra operations. RedisGraph, a property graph database, is the first to do this. Transitioning away from a vertex-centric point of view proved challenging but well worth it, as it allows us to incorporate years of research and development into the graph DB world. Projects such as LAGraph (a collection of algorithms that use the GraphBLAS) can be incorporated and exposed easily to end users, and soon enough graph queries will be running on GPUs. GraphBLAS has indeed brought about a revolution in this field. In this talk, I will present RedisGraph and the way it uses GraphBLAS to answer graph queries formulated in the Cypher query language (one of the most popular graph query languages). I'll touch on the pros and cons of using sparse adjacency matrices at the core of this DB, and on a few of the classical graph algorithms (implemented with linear algebra operations) incorporated in RedisGraph.
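As a toy illustration of the linear-algebraic view, the sketch below expresses a breadth-first traversal as repeated sparse matrix-vector products with the adjacency matrix. It uses scipy.sparse purely as a stand-in; RedisGraph itself builds on SuiteSparse:GraphBLAS, and the example graph, function name, and masking logic here are illustrative assumptions rather than its actual API.

# A minimal sketch of GraphBLAS-style traversal: BFS as repeated
# sparse matrix-vector products, with a mask excluding visited vertices.
# scipy.sparse stands in for a real GraphBLAS implementation.
import numpy as np
from scipy.sparse import csr_matrix

def bfs_levels(adj: csr_matrix, source: int) -> np.ndarray:
    """Return the BFS level of each vertex reachable from `source` (-1 if unreachable)."""
    n = adj.shape[0]
    levels = np.full(n, -1, dtype=int)
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        # One "hop": push the frontier along all edges via the adjacency
        # matrix, then mask out vertices that already have a level.
        frontier = (adj.T @ frontier).astype(bool) & (levels == -1)
        level += 1
    return levels

# Example (hypothetical data): a directed path 0 -> 1 -> 2 plus an isolated vertex 3.
rows, cols = [0, 1], [1, 2]
A = csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(4, 4))
print(bfs_levels(A, 0))   # [0 1 2 -1]

Each matrix-vector product visits only the stored non-zeros, which is why a sparse adjacency representation keeps the per-level cost proportional to the edges actually touched.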


Roi Lipman is the creator of the only linear-algebra-based graph database, at RedisLabs, where he has led the development of RedisGraph since 2017. His key interests include GraphBLAS, database systems, high-performance computing, and parallel and distributed algorithms.


Keynote 2

Label Propagation and Graph Neural Networks

Austin Benson (Cornell University, USA)

Semi-supervised learning on graphs is a widely applicable problem in network science and machine learning. Two standard algorithms -- label propagation and graph neural networks -- both operate by repeatedly passing information along edges, the former by passing labels and the latter by passing node features, modulated by neural networks. These two types of algorithms have largely developed separately, and there is little understanding of their relationship or of how the approaches can be meaningfully combined. In this talk, I will present some probabilistic models that unify these algorithms, showing how label propagation and graph neural network ideas are naturally connected and how this leads to algorithms that can use both effectively. The talk will also discuss computational and machine learning tradeoffs between complex, highly expressive models that are expensive to train and difficult to implement, and simpler, less expressive models that run faster, are easy to implement, and offer more opportunities for parallelism.
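To make the first of the two algorithm families concrete, the sketch below implements a basic label propagation scheme in NumPy: seed labels are repeatedly averaged over neighbors, with the seeds clamped after each pass. The graph, seed assignments, and iteration count are illustrative assumptions, not material from the talk; a graph neural network would instead propagate feature vectors through learned transformations.

# A minimal sketch of label propagation for semi-supervised node classification.
import numpy as np

def label_propagation(adj, seed_labels, num_iters=50):
    """adj: dense 0/1 adjacency matrix; seed_labels: dict {vertex: class id}."""
    n = adj.shape[0]
    num_classes = max(seed_labels.values()) + 1
    # One label distribution per vertex; unlabeled vertices start uniform.
    Y = np.full((n, num_classes), 1.0 / num_classes)
    for v, c in seed_labels.items():
        Y[v] = np.eye(num_classes)[c]
    # Row-normalized adjacency spreads each vertex's label mass to its neighbors.
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1)
    for _ in range(num_iters):
        Y = P @ Y
        for v, c in seed_labels.items():       # clamp the known seed labels
            Y[v] = np.eye(num_classes)[c]
    return Y.argmax(axis=1)

# Hypothetical example: two triangles joined by one edge, one seed per triangle.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print(label_propagation(A, {0: 0, 5: 1}))  # expected: [0 0 0 1 1 1]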


Austin Benson is an Assistant Professor of Computer Science and a Field Member of Applied Mathematics at Cornell University. His research develops numerical methods and algorithmic frameworks that enable new, better, and bigger analyses of complex network data. Austin's research has appeared in Science, the Proceedings of the National Academy of Sciences, and SIAM Review, and has been recognized with a KDD best paper award and the Gene Golub Doctoral Dissertation Award. Before joining Cornell, he received his PhD in computational and mathematical engineering from Stanford University.
