Title: Learning and Inference with Hybrid Models (three examples)
Date: Tuesday 6th of June at 16:00
Abstract: Hybrid models result from the integration of two types of models: deep neural networks and logical models. The former can process high-throughput data in continuous spaces; the latter can express knowledge about abstract properties of, and relations between, the observed data. There is no unique way to integrate these two classes of models. A first method exploits logical knowledge to impose additional supervision during the training of a (set of) neural model(s) that predicts abstract properties and relations. This idea is implemented in Logic Tensor Networks (LTN) [1]. A second method uses the background knowledge to correct/revise the predictions made by a (set of) neural model(s); this method has been implemented in Iterative Local Refinement (ILR) [2]. A third method defines a hybrid model as the composition of a (set of) deep learning model(s), which abstracts the continuous perceptions into a finite, abstract representation (the symbols), and a discrete model that computes a finite function on the set of abstract symbols. This method is described in Deep Symbolic Learning [3]. In the seminar, I will briefly introduce the hybrid models described above and the mechanisms used for training and inference.
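To make the first mechanism concrete, here is a minimal PyTorch sketch of the "logic as extra supervision" idea behind LTN [1]: two predicates are realised as small neural networks, and a universally quantified rule, interpreted with a Łukasiewicz fuzzy implication, becomes a differentiable loss term added to the ordinary data loss. All names here (Predicate, lukasiewicz_implies, the rule A(x) -> B(x)) are illustrative assumptions, not the actual LTN API, which provides a full differentiable grounding of first-order logic.

```python
import torch
import torch.nn as nn

class Predicate(nn.Module):
    """Tiny network returning a truth degree in [0, 1] for one predicate."""
    def __init__(self, in_dim: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 16), nn.ReLU(),
            nn.Linear(16, 1), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

# Two predicates A(x) and B(x), each realised by a neural network.
A, B = Predicate(), Predicate()

def lukasiewicz_implies(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Łukasiewicz fuzzy implication: I(a, b) = min(1, 1 - a + b)."""
    return torch.clamp(1.0 - a + b, max=1.0)

# Toy data: labels are available for A only; B is shaped by the rule alone.
x = torch.randn(32, 4)
labels_a = torch.randint(0, 2, (32,)).float()

opt = torch.optim.Adam(list(A.parameters()) + list(B.parameters()), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    data_loss = nn.functional.binary_cross_entropy(A(x), labels_a)
    # Background knowledge "forall x: A(x) -> B(x)" becomes a penalty:
    # one minus the mean truth degree of the implication over the batch.
    rule_loss = 1.0 - lukasiewicz_implies(A(x), B(x)).mean()
    (data_loss + rule_loss).backward()
    opt.step()
```

By contrast, the second mechanism (ILR [2]) would leave training unchanged and instead apply the same rule at inference time, iteratively revising the networks' predictions toward satisfying the background knowledge.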
[1] Badreddine, S., Garcez, A. D. A., Serafini, L., & Spranger, M. (2022). Logic tensor networks. Artificial Intelligence, 303, 103649.
[2] Daniele, A., van Krieken, E., Serafini, L., & van Harmelen, F. (2023). Refining neural network predictions using background knowledge. Machine Learning, 1-39.
[3] Daniele, A., Campari, T., Malhotra, S., et al. (2022). Deep Symbolic Learning: Discovering Symbols and Rules from Perceptions. arXiv preprint arXiv:2208.11561. (To appear in IJCAI 2023.)
Bio: Luciano Serafini is a researcher in Artificial Intelligence. He currently coordinates the Data and Knowledge Management Research Unit at Fondazione Bruno Kessler. He graduated in information science with a thesis on logic for knowledge representation. In 1990 he joined Fondazione Bruno Kessler (formerly IRST) as a researcher in logic for knowledge representation and reasoning. He proposed a formalism called Multi-Context (MC) Systems for the representation of modular, interconnected, context-dependent knowledge; MC Systems have had a great influence on the Semantic Web and information integration areas. Between 2000 and 2013 he worked on semantic matching of heterogeneous schemas, and he was the first to suggest encoding this problem as propositional satisfiability. After 2010 he started his research on integrating machine learning and logical reasoning, and in 2016 he invented Logic Tensor Networks, one of the first neuro-symbolic architectures. In recent years he has also been interested in integrating learning, acting, and planning. He has been a EurAI Fellow since 2020 and teaches regular courses on knowledge representation and learning at the University of Padova.