A bridge is a cross-disciplinary event in the AAAI program that brings together different research communities to explore common goals and spark new collaborations. Unlike workshops, bridges focus on education, outreach, and community-building, helping emerging fields grow and connect.
Motivation
Learning from data streams is inherently challenging due to their dynamic and non-stationary nature. A key issue is concept drift, where the underlying data distribution evolves over time. In such scenarios, traditional offline training on historical data is insufficient, as previously learned models may quickly become outdated. Instead, models must learn continuously, detect changes as they occur, and adapt to new distributions. This raises the further challenge of catastrophic forgetting, where acquiring new knowledge may erase past information. Moreover, data in streaming contexts often exhibits temporal dependencies, which classical machine learning approaches tend to overlook.
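The adaptation loop sketched above (monitor the stream, detect a change in distribution, then update the model) can be illustrated with a toy window-based drift detector. This is a minimal sketch for intuition only: the class name and parameters are invented for this example, and real streaming detectors such as ADWIN or DDM rely on principled statistical tests rather than a fixed mean-difference threshold.

```python
import random
from collections import deque

class WindowDriftDetector:
    """Toy drift detector: flags drift when the mean of the most
    recent window diverges from the mean of an older reference window."""

    def __init__(self, window=50, threshold=1.0):
        self.recent = deque(maxlen=window)      # newest observations
        self.reference = deque(maxlen=window)   # older observations
        self.threshold = threshold

    def update(self, x):
        # Before the oldest recent value is evicted, move it to the reference.
        if len(self.recent) == self.recent.maxlen:
            self.reference.append(self.recent[0])
        self.recent.append(x)
        if len(self.reference) < self.reference.maxlen:
            return False  # not enough history yet
        ref_mean = sum(self.reference) / len(self.reference)
        cur_mean = sum(self.recent) / len(self.recent)
        return abs(cur_mean - ref_mean) > self.threshold

random.seed(0)
detector = WindowDriftDetector(window=50, threshold=1.0)

# Synthetic stream: mean 0.0 for 200 steps, then an abrupt drift to mean 2.0.
drift_step = None
for t in range(400):
    x = random.gauss(0.0 if t < 200 else 2.0, 1.0)
    if detector.update(x) and drift_step is None:
        drift_step = t
print(drift_step)
```

In a streaming pipeline, a detection like this would trigger model adaptation (e.g., retraining on recent data), which is exactly where the tension with catastrophic forgetting arises: discarding the old window also discards the knowledge it encoded.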
Existing research areas address these issues only in isolation. Continual Learning (CL) [link] focuses on long-term retention and mitigating forgetting, often without strong real-time constraints. Streaming Machine Learning (SML) [link] emphasizes rapid, efficient adaptation to high-frequency streams but typically neglects temporal dependence and forgetting. Meanwhile, Time Series Analysis (TSA) [link] directly models temporal structure, yet its application to streaming scenarios remains limited.
The vision of Streaming Continual Learning (SCL) [link1, link2] is to bring these threads together, combining the adaptability of SML, the memory-preserving properties of CL, and the temporal modeling strength of TSA, towards robust, real-world learning from non-stationary data streams.
The Bridge
This bridge welcomes researchers at any level working on learning protocols and models for non-stationary environments where CL, SML, and TSA ideas intersect. Participants must register for the bridge through the AAAI website (further details will be available starting October 10). We strongly encourage participants to contribute their ideas and work, but the bridge welcomes everyone interested in joining the discussion.
The program will include a poster session with published works and new ideas, along with tutorials, invited talks, and interactive roundtables with experts from diverse fields. Participants will be able to share their work, learn about the key ideas behind CL, SML, and TSA, discuss their differences, and explore open challenges. This will help spark collaborations in which each community can contribute to the progress of the others.
Beyond the event, the bridge seeks to build a lasting community around SCL, connecting researchers across SML, CL, and TSA and highlighting promising avenues where they can converge as complementary aspects of the same challenge.
We will encourage participants to reason about key open questions, such as:
Can we design learning models that quickly adapt to new information (in the spirit of SML) without forgetting previous knowledge (in the spirit of CL)?
What does avoiding forgetting mean in the case of real drift (i.e., when the new classification problem changes the decision boundary in a portion of the previously observed feature space)?
Is the loss of plasticity commonly encountered in CL also present in SML? If not, how can we leverage insights from SML to mitigate this adverse phenomenon?
Can we separate the concerns of continual knowledge representation and rapid task adaptation by combining CL and SML techniques?
Can we leverage the temporal dependencies usually present in a data stream to improve the learning experience?
Submission deadline for contributions (via OpenReview): 31 October 2025