As a mathematician, I embody a modern-day peripatetic, having traversed the academic world extensively. I have been a Marie Curie Fellow twice at Imperial College London, and once at Yıldız Technical University and Koç University (Istanbul, Turkey). My early career included roles as a Temporary Assistant Professor at the University of Paris-Sud/Orsay, an INRIA Research Fellow, and a Research Assistant Professor at UC Davis (CA), followed by positions at the Mathematical Sciences Research Institute in Berkeley, Duke University, and the Fields Institute in Toronto. Most recently, I served as a Visiting Professor at Johns Hopkins University. Currently, I am a Senior Scientist in Caltech's Department of Computing and Mathematical Sciences, an Affiliate Fellow of the Data Science Institute at Imperial College London, and an External Researcher at the Alan Turing Institute (London, UK).
At the Alan Turing Institute, I co-lead, with Prof. Robert MacKay, the Research Interest Group (RIG) on "Machine Learning and Dynamical Systems"; click here for more details and here for a short video. Among the RIG's activities is the seminar series on "Machine Learning and Dynamical Systems"; click here for the YouTube channel and here to receive updates about the RIG. The seminar schedule can be found here. I am also a co-organiser of the One World Seminar Series on the Mathematics of Machine Learning; click here for more details.
Throughout my research career, I have sought to answer the pivotal question: How can complex systems be effectively analyzed? My investigations have branched into three key approaches:
Dynamical Systems Theory (DST): This approach allows complex systems to be analyzed when the model is known, offering nontrivial tools for the analysis of dynamical systems. It has the status of a mature theory, but it is currently limited to low-dimensional systems and some classes of infinite-dimensional dynamical systems.
Machine Learning (ML): ML is concerned with designing algorithms that accomplish tasks and improve as they process more data. It is particularly useful for analyzing high-dimensional complex systems whose model is unknown. However, it still lacks a theoretical framework and clear methodologies, so it is often unclear why certain algorithms work and what their domain of applicability is.
Algorithmic Information Theory (AIT): AIT provides a framework for understanding concepts such as complexity, induction, simplicity, randomness, and information content. It is a robust theoretical approach, but it faces the practical challenge that its central quantities, such as Kolmogorov complexity, are uncomputable in general and can only be approximated.
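To make AIT's computability challenge concrete, here is a minimal, standard illustration (not a method from my own work): the Kolmogorov complexity of a string is uncomputable, but the length of its output under a real-world compressor such as zlib gives a computable upper bound, an idea underlying compression-based similarity measures like the normalized compression distance.

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    """Length of the zlib-compressed string: a computable upper bound
    (up to an additive constant) on the Kolmogorov complexity of s."""
    return len(zlib.compress(s, 9))

# A highly regular string compresses far better than a random-looking one,
# reflecting its much lower algorithmic information content.
regular = b"ab" * 500          # 1000 bytes, but almost no information
random_like = os.urandom(1000)  # 1000 bytes, essentially incompressible
print(compressed_length(regular) < compressed_length(random_like))  # True
```

The gap between the two compressed lengths is the practical stand-in for the (uncomputable) gap in Kolmogorov complexity.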
My previous research interests focused on the "Dynamical Theory of Control", that is, Control Theory viewed from the perspective of dynamical systems theory. The goal is to integrate concepts and ideas from dynamical systems theory and control theory into a framework that allows both theories to be developed jointly, with an emphasis on the analysis and control of systems with bifurcations.
My current research interests lie at the intersections of Machine Learning, Dynamical Systems, and Algorithmic Information Theory, with a view to developing a theoretical framework for Machine Learning along the following directions (please refer to the short video here for an explanation):
Machine Learning for Dynamical Systems: This involves analyzing dynamical systems from observed data rather than through analytical study, with the aim of extending classical theory and developing a qualitative theory in reproducing kernel Hilbert spaces.
Dynamical Systems for Machine Learning: Here, I analyze ML algorithms with tools from dynamical systems theory by viewing the algorithms themselves as dynamical systems, aiming to understand their potential and limitations and to establish a solid theoretical foundation.
Machine Learning for Algorithmic Information Theory: This direction explores using ML to approach problems in AIT, including applications of Solomonoff induction and algorithmic probability, and developing ML algorithms for better compression and prediction to approximate Kolmogorov Complexity.
Algorithmic Information Theory for Machine Learning: This involves reformulating and analyzing ML algorithms with AIT tools to understand their potential, limits, and domain of applicability. This helps explain why certain ML methods can a priori be expected to work well, or not, for specific problems.
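The "Machine Learning for Dynamical Systems" direction above can be illustrated with a toy sketch (a generic kernel ridge regression example, not my specific method): a surrogate of a chaotic map is learned purely from observed state pairs, with the surrogate living in the reproducing kernel Hilbert space of a Gaussian kernel.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=20.0):
    """Gaussian (RBF) kernel matrix; its RKHS is where the surrogate lives."""
    return np.exp(-gamma * (X[:, None] - Y[None, :]) ** 2)

# Toy dynamical system: the chaotic logistic map x_{n+1} = 3.9 x_n (1 - x_n).
f = lambda x: 3.9 * x * (1.0 - x)

# Observed state pairs (x_n, x_{n+1}) sampled across the state space.
X = np.linspace(0.0, 1.0, 100)
Y = f(X)

# Kernel ridge regression: fit the one-step map from the data alone.
lam = 1e-6
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), Y)

def predict(x):
    """Data-driven surrogate of the dynamics, evaluated at new states."""
    return rbf_kernel(np.atleast_1d(x), X) @ alpha

# At an unseen state the surrogate closely tracks the true dynamics.
err = abs(predict(0.37)[0] - f(0.37))
```

Iterating `predict` then yields a data-driven simulation of the system, the starting point for asking qualitative questions (attractors, stability) about the learned model rather than the unknown true one.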
Publications: Most of my papers are listed on ResearchGate, ORCID, and Google Scholar.
Email: [email protected]
Recorded Talks:
Machine Learning and Dynamical Systems meet in Reproducing Kernel Hilbert Spaces, 17 June 2022, Center for Stochastic Dynamics Seminar, IIT. video, slides.
Machine Learning and Dynamical Systems meet in Reproducing Kernel Hilbert Spaces, The Third Symposium on Machine Learning and Dynamical Systems, September 26 - 30, 2022, The Fields Institute for Research in Mathematical Sciences
Some Talks:
Balanced Reduction of Nonlinear Control Systems in Reproducing Kernel Hilbert Spaces (Imperial College London, 03/2011; LDSG meeting, 05/2011)
On Control and Random Dynamical Systems in Reproducing Kernel Hilbert Spaces (Parameter Estimation for Dynamical Systems, Eurandom, 06/2012; UMD College Park, 11/2012; Queen Mary University of London, 01/2013; METU, University of Augsburg, Bogazici University, Yildiz Technical University, Bilkent University, Koc University - 2013/2014 )
Embedology for Control and Random Dynamical Systems in Reproducing Kernel Hilbert Spaces (University of Oxford, 02/2015; Fields Institute, 03/2015; University of Toronto, 03/2015; University of Ottawa, 03/2015; Concordia University, 03/2015; McMaster University (Hamilton, ON), 04/2015; University of Waterloo (Waterloo, ON), 04/2015)
Kernel Methods for the Model Reduction of Nonlinear Control Systems, The Workshop on Data-Driven Model Order Reduction and Machine Learning (MORML 2016), March 30 - April 1, 2016, University of Stuttgart.
Kernel Methods for Seizure Detection, 1st CRITICS Workshop on Critical Transitions in Complex Systems: Mathematical theory and applications, Kulhuse, Denmark, 09/2016
Kernel Methods for Dynamical Systems (University of Stuttgart, 07/2017)
Kernel Methods and the Maximum Mean Discrepancy for Some Systems with Critical Transitions, 4th CRITICS Workshop and Winter School on “Critical Transitions in Complex Systems: Mathematical theory and applications”, Wöltingerode, Germany, March 5-16, 2018
Kernel Methods for Dynamical Systems, University of Exeter, July 2018.
Kernel Methods for Center Manifold Approximation, ICOSAHOM 2018, Imperial College London, July 2018
Kernel Methods for Dynamical Systems, Symposium on Machine Learning and Dynamical Systems, Imperial College London, Feb. 2019.
Kernel Methods for Dynamical Systems, V Workshop on Dynamical Systems and Brain-inspired Information Processing, University of Konstanz, Jul. 2019
Machine Learning and Dynamical Systems meet in Reproducing Kernel Hilbert Spaces, IPAM, Oct. 2019, https://www.youtube.com/watch?v=WxwH7Lk1gWc
Machine Learning and Dynamical Systems meet in Reproducing Kernel Hilbert Spaces, multiple online seminars in 2021, https://www.youtube.com/watch?v=Hkknq9iDOJA
Academic Service: I am the Organizer of the "Machine Learning and Dynamical Systems" seminar series. I am also a Co-Organizer of the One World Seminar Series on the Mathematics of Machine Learning. I was the lead organizer of the following events:
Symposium on Machine Learning and Dynamical Systems, Imperial College London, Feb. 11-13, 2019
Workshop on Machine Learning and Data Assimilation for Dynamical Systems, June 12-14, 2019
Symposium on Algorithmic Information Theory and Machine Learning, Alan Turing Institute, London, UK, July 4-5, 2022
First Workshop on Computational and Mathematical Medicine, April 20th, 2017
Second Symposium on Computational and Mathematical Medicine, March 29th, 2018
Third Symposium on Computational and Mathematical Medicine, Jan. 10th, 2019