Yongjiang Mathematics Forum, Lecture 559 (Mingli Mathematics Lecture Hall, 2026 Mathematics Lecture No. 4): Constant Bit-size Transformers are Turing Complete
Posted: 2026-03-17 14:26

Time: April 1, 2026, starting at 16:00

Speaker: Qian Li (Shenzhen Research Institute of Big Data)

Venue: Room 9-113

Title: Constant Bit-size Transformers are Turing Complete

Abstract: The empirical success of Transformer-based LLMs on complex reasoning tasks has motivated research on their expressiveness. A central result in this line of research is that Transformers are Turing complete; that is, Transformers possess sufficient expressive power to compute any computable function. In this talk, I will first show that even constant bit-size Transformers are Turing complete. In particular, greater reasoning power requires only longer context windows and longer chain-of-thought (CoT), while all learnable parameters—including parameter count and numerical precision—may remain fixed constants. Moreover, the required context-window length scales with the space complexity, rather than the time complexity, of the simulated computation. I will then discuss how log-sparse attention preserves this efficient universality, providing a formal explanation for its practical success. These results help bridge modern neural architectures with classical models of computation and contribute to a deeper theoretical understanding of the computational power of LLMs.
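As a purely illustrative analogy (my own sketch, not the construction presented in the talk), the following minimal Python program mimics the abstract's central idea: a constant-size transition table stands in for the frozen Transformer parameters, the transcript of configurations stands in for the chain-of-thought, and each step reads only the most recent configuration, a window whose length tracks the tape space used rather than the total running time. The example machine (a unary incrementer) and all names (DELTA, step, run) are hypothetical.

```python
# Fixed, constant-size "controller" (analogous to frozen parameters):
# (state, symbol) -> (new_state, symbol_to_write, head_move)
DELTA = {
    ("scan", "1"): ("scan", "1", +1),   # skip over existing 1s
    ("scan", "_"): ("halt", "1", 0),    # write one more 1, then halt
}

def step(config):
    """One 'decoding step': map the latest configuration (the context
    window) to the next one, using only the constant-size table DELTA."""
    state, head, tape = config
    symbol = tape[head] if head < len(tape) else "_"
    new_state, write, move = DELTA[(state, symbol)]
    cells = list(tape)
    if head == len(cells):          # extend the tape on demand
        cells.append("_")
    cells[head] = write
    return (new_state, head + move, "".join(cells))

def run(tape, max_steps=100):
    # The transcript plays the role of the CoT: one configuration per step.
    transcript = [("scan", 0, tape)]
    while transcript[-1][0] != "halt" and len(transcript) < max_steps:
        transcript.append(step(transcript[-1]))
    return transcript

for state, head, tape in run("111"):
    print(state, head, tape)
# The controller (DELTA) never grows; only the transcript does, and each
# entry -- the window the controller actually reads -- has length
# proportional to the tape space used, not to the number of steps taken.
```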

Speaker bio: Qian Li is a research scientist at the Shenzhen International Center for Industrial and Applied Mathematics, Shenzhen Research Institute of Big Data. Before joining SRIBD, he was an assistant professor at the Institute of Computing Technology, Chinese Academy of Sciences. His interests span theoretical computer science, machine learning, and artificial intelligence.


