KLiNQ: Knowledge Distillation-Assisted Lightweight Neural Network for Qubit Readout on FPGA

Bibliographic Details
Published in: 2025 62nd ACM/IEEE Design Automation Conference (DAC), pp. 1-7
Main Authors: Guo, Xiaorang, Bunarjyan, Tigran, Liu, Dai, Lienhard, Benjamin, Schulz, Martin
Format: Conference Proceeding
Language: English
Published: IEEE, 22.06.2025
Description
Summary: Superconducting qubits are among the most promising candidates for building quantum information processors. Yet, they are often limited by slow and error-prone qubit readout, a critical factor in achieving high-fidelity operations. While current methods, including deep neural networks, enhance readout accuracy, they typically lack support for the mid-circuit measurements essential for quantum error correction, and they usually rely on large, resource-intensive network models. This paper presents KLiNQ, a novel qubit readout architecture leveraging lightweight neural networks optimized via knowledge distillation. Our approach achieves around a 99% reduction in model size compared to the baseline while maintaining a qubit-state discrimination accuracy of 91%. KLiNQ facilitates rapid, independent qubit-state readouts that enable mid-circuit measurements by assigning a dedicated, compact neural network to each qubit. Implemented on a Xilinx UltraScale+ FPGA, our design can perform the discrimination within 32 ns. The results demonstrate that compressed neural networks can maintain high-fidelity independent readout while enabling efficient hardware implementation, advancing practical quantum computing.
DOI:10.1109/DAC63849.2025.11132854
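
The abstract does not spell out the distillation objective used to shrink the per-qubit readout networks. As a rough illustration only, the sketch below shows a standard Hinton-style knowledge-distillation loss in PyTorch; the function name, temperature `T`, and mixing weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style knowledge-distillation loss (illustrative sketch,
    not KLiNQ's exact training recipe).

    Blends a soft-target term (the teacher's temperature-softened class
    probabilities) with the ordinary hard-label cross-entropy on the
    ground-truth qubit-state labels.
    """
    # Soft targets: KL divergence between tempered student and teacher
    # distributions, scaled by T^2 to keep gradient magnitudes comparable
    # across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the measured labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this setup, a large, accurate teacher network is trained first, and each compact per-qubit student is then fit against the blended loss; the soft term transfers the teacher's decision boundaries so that the much smaller student can be deployed on the FPGA.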