APA (7th ed.) Citation

Li, Z., Lyu, D., Wang, G., Chen, Y., Chen, L., Li, W., Jiang, J., Sun, Y., & He, G. (2025, June 22). KVO-LLM: Boosting Long-Context Generation Throughput for Batched LLM Inference. 2025 62nd ACM/IEEE Design Automation Conference (DAC), 1-7. https://doi.org/10.1109/DAC63849.2025.11132542

Chicago Style (17th ed.) Citation

Li, Zhenyu, Dongxu Lyu, Gang Wang, Yuzhou Chen, Liyan Chen, Wenjie Li, Jianfei Jiang, Yanan Sun, and Guanghui He. "KVO-LLM: Boosting Long-Context Generation Throughput for Batched LLM Inference." In 2025 62nd ACM/IEEE Design Automation Conference (DAC), 1-7, June 22, 2025. https://doi.org/10.1109/DAC63849.2025.11132542.

MLA (9th ed.) Citation

Li, Zhenyu, et al. "KVO-LLM: Boosting Long-Context Generation Throughput for Batched LLM Inference." 2025 62nd ACM/IEEE Design Automation Conference (DAC), 22 June 2025, pp. 1-7, https://doi.org/10.1109/DAC63849.2025.11132542.
