Hector Garcia Rodriguez

I am a research engineer at Huawei Technologies, where I work on efficient deep learning.

I graduated with an MSc in Machine Learning from University College London, where I was advised by Timoleon Moraitis and Pontus Stenetorp on my dissertation, "A new recurrent unit with synaptic short-term plasticity". I was named to the Dean's List (top 5%) and awarded a Distinction. Previously, I interned as a Software Development Engineer at Amazon Web Services, and obtained a BSc in Theoretical Physics from University College London with First Class Honours.

LinkedIn  /  Twitter  /  Github

profile photo
Research

I'm interested in multimodal deep learning: improving efficiency with adaptable networks, and using more contextualised representations for sequential decision-making tasks.

Hebbian Deep Learning Without Feedback
Adrien Journé, Hector Garcia Rodriguez, Qinghai Guo, Timoleon Moraitis
To appear in ICLR, 2023
arXiv

We train deep ConvNets with multilayer SoftHebb, a Hebbian soft winner-take-all algorithm. With an added linear probe, it achieves competitive image-classification accuracy on CIFAR-10, STL-10 and ImageNet compared with other biologically plausible baselines. SoftHebb improves the biological compatibility, parallelisability and performance of state-of-the-art bio-plausible learning.
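The soft winner-take-all idea can be sketched in a few lines of NumPy. This is an illustrative simplification rather than the paper's exact procedure: the function name, learning rate, temperature and the precise form of the normalising term are my assumptions.

```python
import numpy as np

def softhebb_update(W, x, lr=0.01, temperature=1.0):
    """One illustrative SoftHebb-style step for a single layer.

    W: (n_neurons, n_inputs) weights; x: (n_inputs,) input sample.
    Neurons compete through a softmax over their activations (soft
    winner-take-all), and each neuron's weights move toward the input
    in proportion to how strongly it "won" -- no backpropagated error.
    """
    u = W @ x                                   # pre-activations
    e = np.exp((u - u.max()) / temperature)     # numerically stable softmax
    y = e / e.sum()                             # soft winner-take-all output
    # Hebbian update with a subtractive term that keeps weights bounded
    # (a common simplified variant; the paper's exact rule differs)
    W = W + lr * y[:, None] * (x[None, :] - u[:, None] * W)
    return W, y
```

Because the update depends only on the layer's own inputs and outputs, layers can be trained without feedback signals from above, which is what makes the approach local and parallelisable.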

Short-Term Plasticity Neurons Learning to Learn and Forget
Hector Garcia Rodriguez, Qinghai Guo, Timoleon Moraitis
ICML, 2022
arXiv / talk / code

STPN is a recurrent neural network, inspired by computational neuroscience, that improves supervised and reinforcement learning by meta-learning to adapt its weights to the recent context. STPN also shows higher energy efficiency in a simulated neuromorphic implementation, thanks to its optimised explicit forgetting mechanism.
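The short-term plasticity mechanism can be sketched as a "fast weight" matrix that decays each step and is written to by recent co-activity. This is a minimal illustration under my own simplifying assumptions: here plasticity acts only on the recurrent weights, and the decay and Hebbian gains are fixed scalars, whereas in the paper they are learned per synapse.

```python
import numpy as np

def stpn_step(x, h, W_in, W_rec, F, lam=0.9, gamma=0.1):
    """One recurrent step of a short-term-plasticity unit (sketch).

    x: (n_in,) input; h: (n_hidden,) previous hidden state.
    W_in, W_rec: slow weights learned by gradient descent.
    F: fast (short-term) component of the recurrent weights.
    lam: decay of the fast weights (explicit forgetting).
    gamma: gain of the Hebbian write to the fast weights.
    """
    G = W_rec + F                        # effective recurrent weights:
    h_new = np.tanh(W_in @ x + G @ h)    # slow + short-term components
    # fast weights: decay the old trace, imprint current pre/post activity
    F_new = lam * F + gamma * np.outer(h_new, h)
    return h_new, F_new
```

The decay factor is the "explicit forgetting" the blurb refers to: stale context fades from the fast weights automatically, which also lets a neuromorphic implementation avoid storing and reprocessing old activity.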


