Description
This lecture introduces the principles and methods of graph-based pattern recognition, a field concerned with the analysis and learning of structured and relational data. Many real-world domains, such as molecules, social networks, transportation systems, documents, or images, are naturally represented as graphs, where entities are modeled as nodes and their relationships as edges. Compared to fixed-length feature vectors, graph representations offer high expressive power, but they also pose significant computational and methodological challenges. The lecture addresses these challenges and presents a comprehensive overview of algorithms and learning techniques for graph-structured data.
The course begins with the foundations of structural pattern recognition and basic concepts of graph theory. Students learn how graphs are formally defined, how structural information and attributes are represented, and how different learning tasks arise at the graph, subgraph, node, and edge levels. The complementary roles of statistical and structural representations are discussed to provide a broader perspective on modern machine learning for relational data.
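The formal ingredients just listed, nodes, edges, and attributes, can be made concrete in a few lines of Python. The class below is an illustrative sketch only (the names are ours, not the lecture's notation); it stores node attributes and an undirected adjacency structure:

```python
class Graph:
    """Minimal labeled, attributed, undirected graph (illustrative sketch)."""

    def __init__(self):
        self.node_labels = {}   # node id -> label/attribute
        self.adj = {}           # node id -> set of neighbour ids

    def add_node(self, v, label=None):
        self.node_labels[v] = label
        self.adj.setdefault(v, set())

    def add_edge(self, u, v):
        # Undirected edge between two existing nodes.
        self.adj[u].add(v)
        self.adj[v].add(u)


# Example: a tiny molecule-like graph C-C-O.
g = Graph()
g.add_node(0, "C"); g.add_node(1, "C"); g.add_node(2, "O")
g.add_edge(0, 1); g.add_edge(1, 2)
```

The same structure supports the learning tasks mentioned above: labels on nodes enable node-level prediction, while the graph as a whole can be the unit of classification.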
A central part of the lecture focuses on graph comparison and similarity. Exact graph matching methods are introduced, including graph isomorphism, subgraph isomorphism, and maximum common subgraph techniques, together with algorithmic strategies such as the Weisfeiler–Lehman refinement. Since exact matching is computationally demanding, the course also covers inexact and error-tolerant approaches, most notably Graph Edit Distance and its efficient approximations.
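The Weisfeiler–Lehman idea can be sketched compactly: each node's label is repeatedly replaced by a compressed signature of its own label plus the multiset of its neighbours' labels. The following is a simplified one-dimensional version (a demo, not a full WL kernel: real implementations share the compression palette across graphs):

```python
from collections import Counter

def wl_refine(adj, labels, iterations=2):
    """1-dimensional Weisfeiler-Lehman colour refinement (sketch).

    adj:    dict node -> iterable of neighbours
    labels: dict node -> initial integer label
    Returns a Counter (multiset) of node labels after refinement.
    """
    labels = dict(labels)
    for _ in range(iterations):
        # New signature = own label + sorted multiset of neighbour labels.
        sigs = {v: (labels[v], tuple(sorted(labels[u] for u in adj[v])))
                for v in adj}
        # Compress signatures to small integers (deterministic palette).
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        labels = {v: palette[sigs[v]] for v in adj}
    return Counter(labels.values())


# Path and star on 4 nodes: same node and edge counts, different structure.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
init = {v: 0 for v in range(4)}
```

Comparing the resulting label histograms distinguishes the two graphs; isomorphic graphs always produce identical histograms, which is what makes the refinement useful as a fast necessary test for isomorphism.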
Beyond matching-based approaches, alternative paradigms for processing graphs are presented. These include spectral and continuous matching methods, as well as techniques for transforming graphs into vector spaces through graph embedding. Students learn how structural information can be captured by topological, spectral, or dissimilarity-based embeddings, enabling the use of standard machine learning methods.
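A topological embedding in its simplest form maps each graph to a fixed-length vector of structural statistics, after which any vector-based learner applies. The sketch below (our own minimal feature choice, not the lecture's) uses node count, edge count, and degree statistics:

```python
def topological_embedding(adj):
    """Map a graph to a fixed-length feature vector (sketch):
    [node count, edge count, min degree, max degree, mean degree]."""
    degs = [len(nbrs) for nbrs in adj.values()]
    n = len(adj)
    m = sum(degs) // 2          # undirected: each edge counted twice
    return [n, m, min(degs), max(degs), sum(degs) / n]


# A triangle and a 3-node path have the same size but different vectors.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
path3 = {0: {1}, 1: {0, 2}, 2: {1}}
```

Spectral embeddings would instead use eigenvalues of the adjacency or Laplacian matrix, and dissimilarity embeddings would use distances to a set of prototype graphs; the interface is the same, graph in, vector out.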
The lecture further introduces graph kernels, which allow similarity-based learning in implicit high-dimensional feature spaces. Important kernel families such as walk, path, and subtree kernels are discussed together with their computational properties. In addition, node representation learning methods are covered, including similarity-based measures, spectral embeddings, and random-walk-based approaches such as Node2Vec.
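The random-walk component of these node representation methods can be sketched directly. The walks below are uniform, as in DeepWalk; Node2Vec follows the same recipe but biases the transition probabilities with its return and in-out parameters p and q, which are omitted here:

```python
import random

def random_walks(adj, walk_length=5, walks_per_node=2, seed=0):
    """Uniform random walks over a graph (simplified sketch).

    adj: dict node -> set of neighbours. Returns a list of walks,
    each a list of nodes, starting walks_per_node times from every node.
    """
    rng = random.Random(seed)
    walks = []
    for start in sorted(adj):
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_length and adj[walk[-1]]:
                # Step to a uniformly chosen neighbour of the current node.
                walk.append(rng.choice(sorted(adj[walk[-1]])))
            walks.append(walk)
    return walks


# Walks over a 4-cycle; in Node2Vec these sequences would then be fed
# to a skip-gram model to produce the actual node vectors.
cycle4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
walks = random_walks(cycle4)
```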
Another important topic is the analysis of large networks through clustering and community detection. Methods such as spectral clustering, graph partitioning, and modularity optimization (e.g., the Louvain method) are presented and compared.
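The quantity that Louvain and related methods optimize, Newman's modularity Q, is easy to evaluate directly. The sketch below only scores a given partition (the hard part, searching over partitions, is what Louvain's greedy moves approximate):

```python
def modularity(adj, comm):
    """Newman modularity Q of a partition (sketch).

    adj:  dict node -> set of neighbours (undirected, no self-loops)
    comm: dict node -> community id
    Q = (1/2m) * sum_{i,j} [A_ij - k_i*k_j/(2m)] * delta(comm_i, comm_j)
    """
    two_m = sum(len(nbrs) for nbrs in adj.values())   # = 2 * |E|
    q = 0.0
    for i in adj:
        for j in adj:
            if comm[i] != comm[j]:
                continue
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - len(adj[i]) * len(adj[j]) / two_m
    return q / two_m


# Two triangles joined by a single bridge edge (2-3): the obvious
# two-community split should score a clearly positive modularity.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
partition = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
```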
The final part of the lecture focuses on modern deep learning methods for graphs. After a brief review of neural network fundamentals, the principles of Graph Neural Networks are introduced, including message passing and neighborhood aggregation. Representative architectures such as Graph Convolutional Networks, GraphSAGE, Graph Attention Networks, and Graph Isomorphism Networks are discussed for node- and graph-level prediction tasks.
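The message-passing principle shared by these architectures can be reduced to a few lines: every node aggregates feature vectors from its neighbourhood. The sketch below uses a plain mean over a node and its neighbours; real layers (GCN, GraphSAGE, GAT) wrap this aggregation in learned weight matrices, attention coefficients, and nonlinearities:

```python
def message_passing_step(adj, h):
    """One round of neighbourhood aggregation (mean-pool sketch).

    adj: dict node -> set of neighbours
    h:   dict node -> feature vector (list of floats)
    Returns new features: mean over each node and its neighbours.
    """
    out = {}
    for v in adj:
        msgs = [h[u] for u in adj[v]] + [h[v]]   # neighbours + self
        dim = len(h[v])
        out[v] = [sum(m[d] for m in msgs) / len(msgs) for d in range(dim)]
    return out


# Path 0-1-2 with a one-hot "signal" on node 0; one step spreads it.
adj = {0: {1}, 1: {0, 2}, 2: {1}}
h = {0: [1.0], 1: [0.0], 2: [0.0]}
h1 = message_passing_step(adj, h)
```

Stacking k such rounds lets information propagate across k hops, which is why deeper GNNs see larger neighbourhoods.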
In addition, the lecture addresses structural pattern discovery through frequent subgraph mining and motif detection, which enable the identification of recurring and functionally relevant patterns in graph data.
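Full frequent-subgraph miners such as gSpan are considerably more involved, but the core task, detecting occurrences of a fixed motif, can be illustrated with the simplest nontrivial motif, the triangle:

```python
def count_triangles(adj):
    """Count triangle motifs by brute force (sketch): an unordered
    triple u < v < w counts once when edges u-v, v-w, u-w all exist."""
    count = 0
    for u in sorted(adj):
        for v in adj[u]:
            if v <= u:
                continue
            for w in adj[v]:
                if w > v and w in adj[u]:
                    count += 1
    return count


# K4, the complete graph on 4 nodes, contains exactly 4 triangles.
k4 = {v: {u for u in range(4) if u != v} for v in range(4)}
```

Comparing such motif counts against those of randomized graphs is the usual way to decide whether a recurring pattern is statistically, and possibly functionally, significant.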
Overall, the course provides a coherent view of classical and modern approaches to graph-based pattern recognition.