[2 Accepted Papers] [ICML 2023] “On the forward-invariance of neural ODEs” and “Dataset distillation with convexified implicit gradients” are now accepted to ICML 2023!
[TEDx Talk] “AI that understands what it does!” at TEDxMIT, April 22nd, 2023.
[Invited Talk] “Generalist AI Systems in Finance” at the New York AI in Finance Summit, April 20th, 2023.
[TEDx Talk] “What is a Generalist Artificial Intelligence?” at TEDxBoston, March 6th, 2023.
[Accepted Paper] [L4DC 2023] “Learning Stability Attention in Vision-based End-to-end Driving Policies” is now accepted at L4DC 2023.
[Accepted Paper] [ICLR 2023] “Liquid Structural State-Space Models” is now accepted to ICLR 2023. [link]
[Accepted Paper] [IEEE T-RO 2023] “BarrierNet: Differentiable Control Barrier Functions for Learning of Safe Robot Control” is now accepted to the IEEE Transactions on Robotics.
[Invited Lecture at MIT] I gave a lecture on “The Modern Era of Statistics” at the MIT Introduction to Deep Learning course on Jan 12th, 2023. [link]
[Accepted Paper] [ICRA 2023] “Infrastructure-based End-to-End Learning and Prevention of Driver Failure” is now accepted to ICRA 2023. [link]
[TEDx Talk] Liquid Neural Networks at TEDxMIT, December 4th, 2022. [link]
[Keynote Talk] “Generalist AI Models” at Vanguard’s 5th Artificial Intelligence and Machine Learning Summit (Oct 16th, 2022)
[New PrePrint] Achieving state-of-the-art performance in sequence modeling with Liquid State-Space Models (Liquid-S4) [link]
[New Patent] Sparse Closed-form Neural Algorithms for Out-of-Distribution Generalization on Edge Robots [US Provisional Patent Case No. 63/415,382] (October 12th, 2022)
[New PrePrint] Interpreting Neural Policies with Disentangled Tree Representations [link]
[New Software and PrePrint] PyHopper: HyperParameter Optimization [link]
[New PrePrint] On the Forward Invariance of Neural ODEs [link]
[2 Accepted Papers] [NeurIPS 2022] Two papers accepted to NeurIPS 2022. Both explore neural network properties in the infinite-width limit, with my brilliant Ph.D. student Noel Loo at MIT.
[New Patent] Systems and Methods for Efficient Dataset Distillation Using Non-Deterministic Feature Approximation [US Provisional Patent Case No. 63/390,952] (July 20th, 2022)
[New PrePrint] “Are All Vision Models Created Equal? A Study of the Open-Loop to Closed-Loop Causality Gap” (2022) [link]
[Invited Talk] “Achieving Causality and Out-of-distribution Robustness via Liquid Neural Networks,” Centre for Autonomous and Cyber-Physical Systems at Cranfield University, UK (Oct 7th, 2022)
[Invited Talk] “Liquid Neural Networks” Stanford Intelligent Systems Laboratory: SISL, Stanford University (July 18th, 2022)
[Invited Talk] “Liquid Neural Networks” at the Vectors of Cognitive AI Workshop, Intel AI Labs, CA (May 17th, 2022)
[Keynote Talk] “Liquid Neural Networks” at the Council of Scientific Society Presidents (CSSP), Spring Leadership Workshop, Role of Artificial Intelligence on Science and Quality of Life [link] (May 2nd, 2022)
[Accepted Paper] [ICRA 2022] Latent Imagination Improves Real-World Deep Reinforcement Learning
[Award] Hyperion Research 2022 HPC Innovation Excellence Award for the invention of Liquid Machine Learning [link]
[Accepted Paper] [AAAI 2022] Gotube: Guarantee the Safety of Continuous-depth Neural Models
[Seminar Talk] “Liquid Neural Networks” at the MIT Center for Brains, Minds and Machines (CBMM) (Oct 5th, 2021) [link]
[Keynote Talk] “Liquid Neural Networks for Autonomous Driving” at the IJCAI 2021 Artificial Intelligence for Autonomous Driving Workshop, August 20th, 2021. [link]
[Accepted Paper] [ICML 2021] Our paper On-Off Center-Surround Receptive Fields for Robust Image Classification got accepted for publication at the 38th International Conference on Machine Learning (ICML), 2021. [link]
[Recent Invited Talks]
“Liquid Time-Constant Networks”,
Synthesis of Models and Systems Seminar at the Simons Institute, UC Berkeley, CA, March 22nd, 2021 [link]
“Understanding Liquid Time-Constant Networks”,
MIT Lincoln Laboratory Machine Learning Special Interest Group (LL-MLSIG) Seminar Series, March 25th, 2021
“Liquid Neural Networks”,
MIT Open Learning, MIT Horizon, Cambridge, MA, April 8th, 2021
“What Is a Liquid Time-Constant Network?”,
Northeastern University, Boston, MA, March 14th, 2021 [link]
[New Preprint] A new preprint of our work on comparing model-based to model-free agents in autonomous racing environments is out! [link]
[Accepted Paper] [ICRA 2021] Our paper “Adversarial training is not ready for robot learning” has been accepted for publication at the IEEE International Conference on Robotics and Automation (ICRA) 2021. [link]
[Press: MIT News] Article about our research: “Liquid” machine-learning system adapts to changing conditions. The new type of neural network could aid decision-making in autonomous driving and medical diagnosis. (Jan 28th, 2021) [link]
[2 Accepted Papers] [AAAI 2021] Our papers “Liquid time-constant networks” and “On the verification of Neural ODEs” have been accepted for publication at the 35th AAAI Conference on Artificial Intelligence. [link]
[Position Update] I joined the Distributed Robotics Lab (DRL) of CSAIL MIT as a postdoctoral associate. [link]
[Accepted Paper] [Nature Machine Intelligence] “Neural Circuit Policies Enabling Auditable Autonomy” got accepted for publication in Nature Machine Intelligence. [link]
[Accepted Paper] [ICML 2020] “A Natural Lottery Ticket Winner: Reinforcement Learning with Ordinary Neural Circuits” got accepted to the 2020 International Conference on Machine Learning (ICML) [link]
[Accepted Paper] [Autonomous Robots 2020] “Plug-and-play supervisory control using muscle and brain signals for real-time gesture and error detection” got accepted to the journal Autonomous Robots, August 2020. [link]
[Ph.D. dissertation] Check out my Ph.D. dissertation here: [link]
[Ph.D. studies – Done!] I completed my Ph.D. degree with honors on May 5th, 2020.
[Accepted Paper] [ICRA 2020] We introduced a new regularization scheme to obtain state-stable recurrent neural networks in control environments. The paper will be presented at ICRA 2020 (May 29th – June 4th) in Paris, France.
[TED Talk] Watch my latest TEDxCluj talk entitled “A journey inside a neural network” [link]
[Accepted Paper] [ICRA 2019] We proposed a new brain-inspired neural network design methodology for interpretable and noise-robust robotic control. [link]
[Accepted Paper] [IJCNN 2019] We proposed a new method to interpret LSTM networks. [link]
[TED Talk] My TEDxVienna talk entitled “Simple Artificial Brains to Govern Complex Tasks” is officially released by TEDx. Watch it [here].
[Interview] Read my interview with Vera Steiner at TEDxVienna here.
[Accepted Paper] [AAAI-IAAI 2019] “A Machine Learning Suite for Machine’s Health Monitoring” is accepted for oral presentation. [link]
[Accepted Papers] Two journal papers were published in Philosophical Transactions of the Royal Society B: Biological Sciences. [Publications]
I participated in the Deep Learning Indaba 2017 in Johannesburg, South Africa.
I am currently an AI Scientist at CSAIL MIT. Prior to that, I was jointly a Principal AI and Machine Learning Scientist at the Vanguard Group and a Research Affiliate at CSAIL MIT (11/2021–7/2023). Before that, I worked with Daniela Rus as a Postdoctoral Associate at CSAIL MIT (10/2020–12/2021). I completed my Ph.D. studies (with distinction) in Computer Science at TU Wien, Austria (May 2020). My Ph.D. dissertation on Liquid Neural Networks was co-advised by Prof. Radu Grosu (TU Wien) and Prof. Daniela Rus (MIT).
My research focuses on flexible decision-making algorithms.