Every tick is a learning opportunity. Every pattern has a rhythm.
The time-series continual learning engine for the [&] stack. Temporal anomaly detection, digital twin synchronization, and evolving pattern recognition.
Every event has a timestamp. TickTickClock gives AI agents temporal awareness that learns and evolves.
Adaptive state-space anomaly detection inspired by the Mamba architecture (Gu & Dao, 2023). Selective state-space models with adaptive gating dynamically modulate hidden-state updates based on contextual cues — capturing both short-range and long-range temporal dependencies while remaining computationally efficient on streaming sensor data.
Mamba SSM · Sparse Attention

Epoch-aware delta-CRDT replication inspired by GeoCoCo (2025). Conflict-free replicated data types with commutative, associative, and idempotent merge functions ensure convergence under arbitrary message reordering, delivery delays, and network partitions. Local state mirrors global state deterministically — no coordination overhead.
Delta-CRDTs · Event Sourcing

Continual Compositionality & Orchestration (CCO) — the research direction identified by the continual learning community as most promising for 2026. Multi-timescale memory modules that update at different frequencies, inspired by Google's Nested Learning and Titans architectures. Fast memory promotes to slow memory. No catastrophic forgetting. No retraining.
Nested Learning · Titans · CCO

TickTickClock implements methods drawn from the latest advances in time-series ML, distributed systems, and continual learning.
Linear-time sequence modeling with input-dependent state transitions. Unlike Transformers, SSMs scale linearly with sequence length while capturing long-range dependencies — critical for high-frequency sensor streams where Transformer attention becomes prohibitively expensive.
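The core idea behind input-dependent state transitions can be sketched in a few lines. The scalar toy below (function name and gate parameters are illustrative, not TickTickClock's API) shows the mechanism: the transition coefficient is computed from the input itself, so the model decides per tick how much history to keep, with one O(1) update per sample rather than attention over the whole sequence.

```python
import math

def selective_ssm(xs, w_gate=1.0, b_gate=0.0):
    """Scalar sketch of a selective state-space recurrence.

    The decay coefficient a_t is computed from the input (the
    'selective' gate), so the state forgets faster or slower
    depending on what it sees. Illustrative only."""
    h, out = 0.0, []
    for x in xs:
        # input-dependent gate in (0, 1): sigmoid of a linear function of x
        a = 1.0 / (1.0 + math.exp(-(w_gate * x + b_gate)))
        h = a * h + (1.0 - a) * x  # convex blend: one O(1) step per tick
        out.append(h)
    return out
```

Because each step touches only the current input and a fixed-size state, cost grows linearly with stream length, which is the property the paragraph above contrasts with quadratic attention.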
Conflict-free replicated data types that transmit only state deltas, not full state. Merge operations satisfy ACI properties (Associative, Commutative, Idempotent), guaranteeing eventual consistency regardless of message ordering, duplication, or partition events.
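The ACI properties are easiest to see with the simplest delta-CRDT, a grow-only counter keyed by replica id. This is an illustrative sketch, not TickTickClock's replication code: merge is pointwise `max`, which is associative, commutative, and idempotent, so replicas converge regardless of delta ordering or duplication.

```python
def merge(a: dict, b: dict) -> dict:
    """Join of two G-Counter states: pointwise max per replica id.
    max is associative, commutative, and idempotent (ACI)."""
    return {k: max(a.get(k, 0), b.get(k, 0)) for k in a.keys() | b.keys()}

def value(state: dict) -> int:
    return sum(state.values())

# Each replica increments its own slot and ships only the changed
# entry (the delta), not the whole map.
a = {"node-a": 3}
b = {"node-b": 5}
delta = {"node-a": 4}           # node-a's next increment, sent as a delta

s1 = merge(merge(a, b), delta)
s2 = merge(delta, merge(b, a))  # different merge order
s3 = merge(s1, delta)           # re-delivering the same delta: no change
assert s1 == s2 == s3 and value(s1) == 9
```

Shipping `delta` instead of the full map is what makes this a *delta*-CRDT: the ACI merge means duplicated or reordered deltas are harmless, so no coordination is needed.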
Inspired by Google's Nested Learning (NeurIPS 2025) and the Titans architecture — a continuum memory system where fast-updating modules handle immediate context and slow-updating modules store long-term knowledge. Memory consolidation happens at inference time, not during retraining.
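A toy version of the fast/slow split can be written as two exponential averages updating at different frequencies. All names and constants below are illustrative stand-ins, not the Nested Learning or Titans implementations: a fast tier tracks every tick, and a slow tier absorbs the fast tier only periodically, which is the consolidation step the paragraph describes.

```python
class MultiTimescaleMemory:
    """Toy two-tier memory: a fast EWMA updates every tick; a slow
    EWMA absorbs ('consolidates') the fast tier every `period` ticks.
    A sketch of the multi-timescale idea, not a real memory system."""

    def __init__(self, fast_alpha=0.5, slow_alpha=0.05, period=10):
        self.fast = 0.0        # immediate context, high update rate
        self.slow = 0.0        # long-term knowledge, low update rate
        self.fast_alpha = fast_alpha
        self.slow_alpha = slow_alpha
        self.period = period
        self.t = 0

    def observe(self, x: float):
        self.t += 1
        # fast tier: every tick
        self.fast += self.fast_alpha * (x - self.fast)
        # slow tier: periodic consolidation from fast memory
        if self.t % self.period == 0:
            self.slow += self.slow_alpha * (self.fast - self.slow)
```

Because the slow tier only ever moves a small step toward the fast tier, a burst of unusual inputs perturbs immediate context without overwriting long-term knowledge, a simple picture of how such designs resist catastrophic forgetting.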
Every TickTickClock endpoint is an MCP server. Anthropic's Model Context Protocol standardizes how AI agents connect to tools — meaning any MCP-compatible agent can query temporal state, subscribe to anomaly streams, or request predictions without custom integration.
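At the wire level, an MCP tool invocation is a JSON-RPC 2.0 request using the protocol's `tools/call` method. The envelope below follows the MCP specification, but the tool name and arguments are hypothetical examples, not TickTickClock's actual schema.

```python
import json

# JSON-RPC 2.0 envelope and "tools/call" method per the MCP spec;
# the tool name and arguments are illustrative assumptions only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_temporal_state",  # hypothetical tool name
        "arguments": {"stream": "machine-7/vibration", "window": "24h"},
    },
}
print(json.dumps(request, indent=2))
```

Any MCP-compatible agent that can emit this shape can talk to such an endpoint, which is what removes the need for custom per-agent integration.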
TickTickClock is the Intelligence layer — temporal awareness where GeoFleetic provides spatial. Together: when and where.
Vibration, temperature, and pressure streams processed through Mamba-SSM with adaptive gating. Learns per-machine temporal baselines that evolve — detects degradation signatures weeks before failure using association discrepancy between expected and observed patterns.
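The "expected vs. observed" scoring idea can be illustrated with a much simpler evolving baseline. The Welford-based z-score scorer below is a hedged stand-in, not the Mamba-SSM detector described above: it learns a per-stream running mean and variance and flags samples that deviate sharply from that baseline.

```python
class BaselineAnomalyScorer:
    """Minimal stand-in for expected-vs-observed scoring: keeps a
    running mean/variance (Welford's algorithm) and flags samples
    far from the learned baseline. Illustrative sketch only."""

    def __init__(self, threshold=4.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.threshold = threshold  # z-score cutoff, assumed value

    def update(self, x: float) -> bool:
        anomalous = False
        if self.n > 1:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) / std > self.threshold
        # fold the observation into the evolving baseline (Welford update)
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
        return anomalous
```

Because every sample also updates the baseline, the notion of "normal" drifts with the machine, a small-scale analogue of per-machine baselines that evolve over time.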
Heart rate at 3am differs from 3pm during exercise. Multi-timescale memory builds per-patient baselines that consolidate over days and weeks. Distribution-shift-aware models handle non-stationary physiology without false positive storms during routine changes.
Paired with GeoFleetic for time-aware route optimization. Delta-CRDT synchronization keeps fleet state consistent across vehicles. Temporal patterns (time-of-day, day-of-week, seasonal) compound with spatial data for demand prediction that sharpens every delivery cycle.
MCP-connected temporal context injection. Agents query TickTickClock for "what happened when" — retrieving sequenced event histories, cadence patterns, and temporal anomalies. Enables agents to reason about time without embedding temporal logic into each model.
Part of the [&] Ampersand Box Design portfolio — the infrastructure that makes AI learn, adapt, and decide. Elixir/OTP. Edge-native. MCP-first.