This book presents the first unified, practical framework for continuous-time series analysis using state-of-the-art neural architectures. Moving beyond traditional discrete-time methods, it directly addresses real-world challenges such as irregular sampling, asynchronous observations, and hidden system dynamics through Neural ODEs, SDEs, and CDEs.
Covering both foundational and advanced models — RNNs, Transformers, graph networks, and emerging quantum-hybrid approaches — the book bridges classical time-series theory with modern deep learning. It emphasizes probabilistic forecasting, uncertainty quantification, and cutting-edge generative techniques, including diffusion models and VAEs, equipping readers with tools for robust, interpretable predictions.
Recent Trends in Modelling the Continuous Time Series using Deep Learning tackles core issues such as long-range dependencies, multivariate interactions, dimensionality reduction, and spatiotemporal coherence, while providing structured evaluation frameworks and benchmarking protocols tailored to continuous-time settings.
Through rich case studies in healthcare (EHR analytics, wearable monitoring), finance (volatility forecasting, high-frequency trading), and IoT systems (sensor fusion, predictive maintenance), the book demonstrates how continuous-time models enable personalized insights, constraint-aware learning, and more reliable decision-making. Designed for researchers, engineers, and practitioners, this book is a definitive resource for applying continuous-time neural methods to complex, real-world environments.
Table of Contents:
Chapter 1: Introduction to Continuous Time Series
Chapter 2: Different Neural Network Models for Time Series Processing
Chapter 3: Emerging Trends and Open Challenges in Time Series Modelling
Chapter 4: Neural Network Techniques in Continuous Time Series Prediction
Chapter 5: Neural Network for Time Series Modelling
Chapter 6: Probabilistic and Generative Approaches to Continuous Time Series Forecasting
Chapter 7: Quantum-Hybrid Neural Networks for Continuous Time Series
Chapter 8: Model Evaluation and Benchmarking
Chapter 9: Applications of Continuous Time Series Analysis
About the Authors:
Dr. Mansura Habiba is a seasoned technology leader with over 15 years of experience in AI, machine learning, and cloud architecture. As Principal Platform Architect at IBM, she has led major AI and high-performance computing (HPC) initiatives, delivering advanced solutions for global banks, automotive leaders, and major energy enterprises. Her expertise includes designing scalable AI systems, optimizing cloud infrastructure, and driving digital transformation across IBM Cloud, AWS, and Azure.
She is the author of multiple books and numerous peer-reviewed publications that advance the AI and cloud computing community. Driven by a vision to accelerate AI adoption across industries, Dr. Habiba focuses on enabling organizations to harness AI for efficiency, innovation, and sustainable growth. Her commitment to ethical AI, data privacy, and strategic leadership makes her a trusted advisor in today’s rapidly evolving digital landscape.
Professor Dr. Barak Pearlmutter is a distinguished researcher and professor in machine learning, neural computation, and theoretical neuroscience. He is a faculty member in the Department of Computer Science at Maynooth University in Ireland, where he conducts pioneering research in automatic differentiation, deep learning, and the brain’s computational processes.
He is well known for his contributions to automatic differentiation (AD), a foundational technique that enables the efficient computation of the derivatives needed to train machine learning models. His work has significantly influenced computational methods in both theoretical neuroscience and artificial intelligence, and he has also made important contributions to the study of dynamical systems and the development of sophisticated neural computation models.
An advocate for interdisciplinary research, Professor Pearlmutter frequently collaborates with neuroscientists, physicists, and engineers to bridge the gap between biological and artificial neural systems. His numerous publications and presentations in top-tier journals and conferences have established him as a leading figure in computational science.
Dr. Mehrdad Maleki is an AI and machine learning expert with a strong mathematical background and a PhD in theoretical computer science. He specializes in developing deep learning models, large language models, and AI-driven solutions for pattern prediction across industries. He has a deep understanding of complex systems and excels at tackling challenging problems with a diverse set of advanced tools and techniques. In addition to his AI expertise, Dr. Maleki has a solid foundation in cryptography and quantum computing and holds several patents in these fields. His work combines cutting-edge AI methods with secure, scalable solutions across a range of domains.