Synthetic Network Metric Generation via Conditional DDPM with Categorical and Continuous Log-Metric Conditions

Open Access Conference Proceedings Article
Authors: Sangwon Oh, Hoyong Ryu, Jaehyung Park, Jinsul Kim

Abstract: Modern network systems increasingly rely on synthetic data for tasks such as anomaly detection, performance analysis, and digital-twin-based evaluation. However, most existing generators focus solely on metric time series and overlook the contextual information embedded in operational logs. As a result, they fail to reproduce the joint behavior that emerges when metric fluctuations are closely linked to event-driven operational states. To address this limitation, we develop a conditional denoising diffusion probabilistic model (DDPM) that generates metric sequences using both categorical and continuous conditions derived from metrics and logs. These heterogeneous conditions are transformed into a unified vector and injected into the diffusion process, enabling the model to capture dependencies between system events and metric dynamics. Experiments on real network traces demonstrate that our conditional diffusion models, built on U-Net, CSDI, and SSSD architectures, substantially outperform unconditional diffusion baselines and achieve strong fidelity and downstream utility. These findings indicate that context-aware diffusion modeling provides a robust foundation for synthetic metric generation in AIOps and digital-twin environments where access to real operational data is limited.
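
To illustrate the conditioning scheme described in the abstract, the following minimal sketch (not the authors' implementation) shows how a categorical event condition and continuous log-derived metrics could be embedded, concatenated into a single conditioning vector, and injected into an epsilon-prediction network trained with the standard DDPM objective. All names (ConditionEncoder, MetricDenoiser, num_event_types, cond_dim, the MLP denoiser itself) are hypothetical placeholders for the paper's U-Net/CSDI/SSSD backbones.

    # Illustrative sketch only, assuming a PyTorch environment; simplified stand-in
    # for the conditional DDPM described in the abstract.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ConditionEncoder(nn.Module):
        """Fuse a categorical event condition and continuous log-metric features
        into one unified conditioning vector (hypothetical design)."""
        def __init__(self, num_event_types, cont_dim, cond_dim):
            super().__init__()
            self.event_emb = nn.Embedding(num_event_types, cond_dim)
            self.cont_proj = nn.Linear(cont_dim, cond_dim)

        def forward(self, event_id, cont_cond):
            # event_id: (B,) integer event category; cont_cond: (B, cont_dim) continuous stats
            return torch.cat([self.event_emb(event_id), self.cont_proj(cont_cond)], dim=-1)

    class MetricDenoiser(nn.Module):
        """Minimal epsilon-predictor over metric sequences, conditioned by concatenation."""
        def __init__(self, seq_len, cond_dim, hidden=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(seq_len + 2 * cond_dim + 1, hidden), nn.SiLU(),
                nn.Linear(hidden, hidden), nn.SiLU(),
                nn.Linear(hidden, seq_len),
            )

        def forward(self, x_t, t, cond):
            # x_t: (B, seq_len) noised metrics; t: (B,) diffusion step; cond: (B, 2*cond_dim)
            t_feat = t.float().unsqueeze(-1) / 1000.0           # crude timestep feature
            return self.net(torch.cat([x_t, cond, t_feat], dim=-1))

    def ddpm_training_step(x0, event_id, cont_cond, encoder, denoiser, alphas_cumprod):
        """One DDPM objective step: predict the noise added at a random timestep."""
        B = x0.size(0)
        t = torch.randint(0, alphas_cumprod.size(0), (B,), device=x0.device)
        noise = torch.randn_like(x0)
        a_bar = alphas_cumprod[t].unsqueeze(-1)                 # (B, 1)
        x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise    # forward diffusion q(x_t | x_0)
        cond = encoder(event_id, cont_cond)                     # unified condition vector
        eps_hat = denoiser(x_t, t, cond)
        return F.mse_loss(eps_hat, noise)

    # Toy usage with random data in place of real network traces
    if __name__ == "__main__":
        T, seq_len, cont_dim, cond_dim = 1000, 64, 4, 32
        betas = torch.linspace(1e-4, 0.02, T)
        alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)
        enc = ConditionEncoder(num_event_types=8, cont_dim=cont_dim, cond_dim=cond_dim)
        den = MetricDenoiser(seq_len=seq_len, cond_dim=cond_dim)
        x0 = torch.randn(16, seq_len)
        ev = torch.randint(0, 8, (16,))
        cc = torch.randn(16, cont_dim)
        print(float(ddpm_training_step(x0, ev, cc, enc, den, alphas_cumprod)))

In the paper's setting, the concatenation-based conditioning above would be replaced by injecting the unified condition vector into a sequence model (U-Net, CSDI, or SSSD), but the training objective and the fusion of categorical and continuous conditions follow the same pattern.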

Keywords: Computer Network, Deep Learning, Synthetic Generation Model, Diffusion Model

DOI: 10.54941/ahfe1007079
