| Publication | Cited by | Year |
| --- | --- | --- |
| Nyströmformer: A Nyström-based algorithm for approximating self-attention. Y Xiong, Z Zeng, R Chakraborty, M Tan, G Fung, Y Li, V Singh. Proceedings of the AAAI Conference on Artificial Intelligence 35 (16), 14138 …, 2021 | 468 | 2021 |
| You only sample (almost) once: Linear cost self-attention via Bernoulli sampling. Z Zeng, Y Xiong, S Ravi, S Acharya, GM Fung, V Singh. International Conference on Machine Learning, 12321-12332, 2021 | 23 | 2021 |
| Large-field-of-view visualization utilizing multiple miniaturized cameras for laparoscopic surgery. JJ Kim, A Watras, H Liu, Z Zeng, JA Greenberg, CP Heise, YH Hu, H Jiang. Micromachines 9 (9), 431, 2018 | 21 | 2018 |
| Multi resolution analysis (MRA) for approximate self-attention. Z Zeng, S Pal, J Kline, GM Fung, V Singh. International Conference on Machine Learning, 25955-25972, 2022 | 9 | 2022 |
| VCC: scaling transformers to 128K tokens or more by prioritizing important tokens. Z Zeng, C Hawkins, M Hong, A Zhang, N Pappas, V Singh, S Zheng. Advances in Neural Information Processing Systems 36, 2024 | 7 | 2024 |
| LookupFFN: making transformers compute-lite for CPU inference. Z Zeng, M Davies, P Pulijala, K Sankaralingam, V Singh. International Conference on Machine Learning, 40707-40718, 2023 | 5 | 2023 |
| FrameQuant: Flexible Low-Bit Quantization for Transformers. H Adepu, Z Zeng, L Zhang, V Singh. arXiv preprint arXiv:2403.06082, 2024 | 4 | 2024 |
| Parallax mitigation for real-time close field video stitching. A Watras, J Ke, Z Zeng, JJ Kim, H Liu, H Jiang, YH Hu. 2017 International Conference on Computational Science and Computational …, 2017 | 4 | 2017 |
| IM-Unpack: Training and Inference with Arbitrarily Low Precision Integers. Z Zeng, K Sankaralingam, V Singh. arXiv preprint arXiv:2403.07339, 2024 | 1 | 2024 |
| On the Efficiency of Transformers. Z Zeng. The University of Wisconsin-Madison, 2024 | | 2024 |
| Controlled differential equations on long sequences via non-standard wavelets. S Pal, Z Zeng, SN Ravi, V Singh. International Conference on Machine Learning, 26820-26836, 2023 | | 2023 |