November 26, 2025
Europe/Bratislava timezone

Optimizing hyperparameters of a novel bio-inspired neural network via interpretable meta-modeling

Nov 26, 2025, 3:25 PM
1m
Poster session + coffee: presentations of scientific results of FMFI UK (Staff, Informatics)

Description

Artificial neural networks are currently in their prime, mainly due to the rise of deep learning. Despite their inspiration in brain processes, learning in such systems, based on error backpropagation (BP), is only loosely related to actual neural mechanisms. Learning in the brain is local and makes use of the bidirectional flow of information. Our Universal Bidirectional Activation-based Learning (UBAL) model extends existing work on bio-inspired alternatives to BP based on the so-called contrastive Hebbian-anti-Hebbian learning principle. Introducing more brain-related properties, it implements two mutually dependent yet separate weight matrices for the two activation-propagation directions and a so-called echo component that allows the network to learn from its own internal representations. At its core, UBAL performs heteroassociative learning via a novel learning rule whose special hyperparameters enable the model to master qualitatively different tasks, such as auto-encoding, denoising, and classification, with a single rule. With a well-chosen hyperparameter setup, UBAL achieves performance comparable to a standard multi-layer perceptron as well as to related biologically motivated state-of-the-art models. Due to its heteroassociative nature, it can generate images of the learned classes as an emergent phenomenon, without being explicitly trained to do so; this ability also depends on the setup of the new hyperparameters. Since finding an optimal setup of the beta and gamma hyperparameters is vital yet difficult, we have focused on interpretable and effective methods for determining their values. Here, we present an analytical method that reveals the individual effects of these hyperparameters on the model's performance by meta-modeling that performance as a function of the hyperparameter configuration. Namely, we train a multi-layer perceptron to predict UBAL's performance from a given hyperparameter setup and apply the SHAP method to reveal the feature importance of the novel hyperparameters. The proposed method helps us further explore their function and relative importance in a particular machine learning task.
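In concrete terms, the meta-modeling pipeline can be sketched as follows. The snippet below is a minimal illustration of the general pattern (fit an MLP surrogate on logged hyperparameter configurations, then explain it with SHAP), not the authors' actual code: the results file ubal_results.csv, the hyperparameter column names (beta_1, beta_2, gamma_f, gamma_b), and the accuracy metric are assumed placeholders.

import pandas as pd
import shap
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical log of UBAL runs: one row per evaluated hyperparameter configuration.
# The file name, column names, and the accuracy metric are assumptions for illustration.
data = pd.read_csv("ubal_results.csv")
hyperparams = ["beta_1", "beta_2", "gamma_f", "gamma_b"]  # assumed hyperparameter columns
X = data[hyperparams].to_numpy()
y = data["accuracy"].to_numpy()

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Meta-model: an MLP that predicts UBAL performance from a hyperparameter setup.
meta_model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
meta_model.fit(X_train, y_train)
print("meta-model R^2 on held-out configurations:", meta_model.score(X_test, y_test))

# SHAP quantifies how much each hyperparameter pushes the predicted performance
# up or down, i.e. its relative importance for the given task.
background = shap.sample(X_train, 100)  # background data for the explainer
explainer = shap.KernelExplainer(meta_model.predict, background)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test, feature_names=hyperparams)

The summary plot ranks the hyperparameters by their average impact on the predicted performance, which is the kind of interpretable importance signal the abstract refers to.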

Department of the Faculty: Department of Applied Informatics
Print poster: I will not require printing of the poster

Authors

Kristina Malinovska (Department of Applied Informatics, FMPI CU), Miroslav Cibula

Presentation materials

There are no materials yet.