Jialin Zhao (赵嘉霖)


I’m a Ph.D. candidate in Computer Science and Technology at Tsinghua University, advised by Prof. Carlo Cannistraci. I obtained my Master’s degree in Data Science from Tsinghua University and the University of Washington, advised by Prof. Jie Tang, and my B.E. in Computer Science and Technology from Tsinghua University.

My research focuses on efficient AI, natural language processing, and graph learning.

Email: jialin [dot] zhao97 [at] gmail [dot] com

Vitæ

Full Resume in PDF.

Selected Publications

(*: co-first author; ^: corresponding author)

  1. Preprint
    Accelerating Attention with Basis Decomposition
    Jialin Zhao^
    Preprint: Under review, 2025
  2. ICML’25
    Pivoting Factorization: A Compact Meta Low-Rank Representation of Sparsity for Efficient Inference in Large Language Models
    Jialin Zhao^, Yingtao Zhang, and Carlo Vittorio Cannistraci^
    ICML’25: Forty-second International Conference on Machine Learning, 2025
  3. ICML’25
    Sparse Spectral Training and Inference on Euclidean and Hyperbolic Neural Networks
    Jialin Zhao^, Yingtao Zhang, Xinghang Li, Huaping Liu, and Carlo Vittorio Cannistraci^
    ICML’25: Forty-second International Conference on Machine Learning, 2025
  4. NeurIPS’25
    Adaptive Cannistraci-Hebb Network Automata Modelling of Complex Networks for Path-based Link Prediction
    Jialin Zhao, Alessandro Muscoloni, Umberto Michieli, Yingtao Zhang, and Carlo Vittorio Cannistraci^
    NeurIPS’25: Advances in Neural Information Processing Systems, 2025
  5. NeurIPS’21
    Adaptive Diffusion in Graph Neural Networks
    Jialin Zhao, Yuxiao Dong, Ming Ding, Evgeny Kharlamov, and Jie Tang^
    NeurIPS’21: Advances in Neural Information Processing Systems, 2021
  6. NeurIPS’25
    Brain network science modelling of sparse neural networks enables Transformers and LLMs to perform as fully connected
    Yingtao Zhang, Diego Cerretti, Jialin Zhao, Wenjing Wu, Ziheng Liao, Umberto Michieli, and Carlo Vittorio Cannistraci^
    NeurIPS’25: Advances in Neural Information Processing Systems, 2025
  7. ICLR’24
    Plug-and-Play: An Efficient Post-training Pruning Method for Large Language Models
    Yingtao Zhang^, Haoli Bai, Haokun Lin, Jialin Zhao, Lu Hou, and Carlo Vittorio Cannistraci^
    ICLR’24: The Twelfth International Conference on Learning Representations, 2024
  8. ICLR’24
    Epitopological learning and Cannistraci-Hebb network shape intelligence brain-inspired theory for ultra-sparse advantage in deep learning
    Yingtao Zhang, Jialin Zhao, Wenjing Wu, Alessandro Muscoloni, and Carlo Vittorio Cannistraci^
    ICLR’24: The Twelfth International Conference on Learning Representations, 2024

Services

Conference reviewer: ICML (2025), NeurIPS (2025), ICLR (2026)

Journal reviewer: IEEE Transactions on Big Data, Applied Network Science, Scientific Reports