 |
|
|
(Seminar) Can physics contribute to natural language processing? |
CAS Key Laboratory of Theoretical Physics
|
Institute of Theoretical Physics
|
Chinese Academy of Sciences
|
Seminar
|
Title
題目
|
Can physics contribute to natural language processing? |
Speaker
報告人
|
|
Affiliation
所在單位
|
PhD student at the University of Padua, Italy, and an EU Marie Curie Fellow.
He received his master's degree from Tianjin University and has held visiting positions at the University of Copenhagen (Denmark), the University of Montreal (Canada), the University of Amsterdam (Netherlands), and Huawei's Noah's Ark Lab, and has been invited to give talks at research institutes and companies including MILA, Toutiao, Tencent, and Huawei. On the industrial side, he worked full-time at Tencent starting in 2017, where, as a principal algorithm designer, he built a robust intelligent customer-service system from scratch on Tencent Cloud, serving major clients such as the Bank of China and the Yunnan Provincial Tourism Bureau; a book co-authored with Tencent colleagues, "Recommender Systems and Deep Learning" (《推薦系統與深度學習》), was published by Tsinghua University Press. In his relatively short academic career, he has been committed to building more robust and intelligent natural language processing systems that balance technical soundness with linguistic motivation. To date, he and his collaborators have received the Best Paper Honorable Mention at SIGIR 2017, a top international information-retrieval conference, and the Best Explainable NLP Paper award at NAACL 2019, a top international natural-language-processing conference, and have published more than 20 papers at top international venues including ICLR, SIGIR, WWW, NAACL, AAAI, IJCAI, and CIKM. |
Date
日期
|
December 2, 2020 (Wednesday), 10:00-11:00 AM |
Venue
地點
|
ITP South Building 6420 |
Contact Person
所內聯系人
|
張潘 |
Abstract
摘要
|
The physics community has worked for many decades on processing data in high-dimensional spaces, and quantum computing also shows great potential to accelerate computation. Recently, natural language processing has been greatly improved by large-scale pretrained language models (huge neural networks such as BERT and GPT) trained on massive, cheap, unstructured corpora. Such neural networks, which deal with massive data, may therefore benefit from principles and tools developed in the physics community, especially for processing large amounts of high-dimensional data in models like BERT and GPT. How physics can contribute to NLP (or vice versa) in the context of large-scale pretrained language models is worth investigating. In this talk, many aspects of natural language processing, e.g., efficiency, effectiveness, and interpretability, will be discussed as a promising playground for well-designed tools invented by the physics community, e.g., tensor networks. |
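To give a flavor of the tensor-network tools the abstract alludes to, below is a minimal sketch (not taken from the talk) of a tensor-train decomposition, known in physics as a matrix product state, which compresses a high-dimensional tensor into a chain of small three-way cores via repeated truncated SVDs. All function names here are illustrative, not part of any library discussed in the seminar.

```python
import numpy as np

def tensor_train(tensor, max_rank):
    """Decompose a d-way tensor into a tensor train (matrix product state)
    by sweeping left to right with truncated SVDs.
    Returns a list of cores with shapes (r_{k-1}, dim_k, r_k)."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, len(s))          # truncate the bond dimension
        u, s, vt = u[:, :new_rank], s[:new_rank], vt[:new_rank]
        cores.append(u.reshape(rank, dims[k], new_rank))
        rank = new_rank
        # fold the next physical index into the row dimension
        mat = (np.diag(s) @ vt).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def reconstruct(cores):
    """Contract the train of cores back into the full tensor."""
    out = cores[0]                                 # shape (1, dim_0, r_1)
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape(out.shape[1:-1])            # drop the boundary 1-dims
```

With a large enough `max_rank` the reconstruction is exact; with a small `max_rank` the bond dimensions are capped, trading accuracy for an exponential reduction in storage, which is the sense in which such tools might compress the large weight tensors of models like BERT.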
|
|
 |
|