BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Computer Science - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Computer Science
X-ORIGINAL-URL:https://csc.ncsu.edu
X-WR-CALDESC:Events for Computer Science
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20240310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20241103T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20250309T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20251102T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20260308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20261101T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250919T110000
DTEND;TZID=America/New_York:20250919T120000
DTSTAMP:20260427T123334Z
CREATED:20250915T141215Z
LAST-MODIFIED:20250930T181812Z
UID:10000029-1758279600-1758283200@csc.ncsu.edu
SUMMARY:Breaking Barriers: Advancing Long Context LLMs
DESCRIPTION:Speaker:\nZirui Liu\, University of Minnesota\nAbstract:\nLLMs have demonstrated impressive conversational abilities. However\, scaling them to handle longer contexts\, such as extracting information from lengthy articles (a critical task in healthcare\, law\, and finance applications)\, presents significant challenges. The two main obstacles are: first\, LLMs struggle to process input lengths beyond what they encountered during pre-training; second\, even when information is accurately extracted from extended contexts\, deploying LLMs in real-world scenarios is limited by hardware capacity. I will discuss recent advances in serving long-context LLMs at scale. To address the first challenge\, I'll present our work on extending LLM context length 10X by coarsening the positional encoding. For the second challenge\, I will highlight our recent success in 2-bit KV cache quantization. Lastly\, I will briefly discuss the reproducibility issue of reasoning evaluation.\nSpeaker Bio:\nZirui Ray Liu is an Assistant Professor of Computer Science at the University of Minnesota. His interests lie in the broad area of Machine Learning and Data Mining. He regularly publishes papers in top venues such as NeurIPS\, ICML\, ICLR\, and MLSys. His work has been integrated into widely used NLP tools like Llama.cpp and Hugging Face Transformers\, and was highlighted at Google I/O sessions. Website: https://zirui-ray-liu.github.io/\nSpecial Instructions:\nThis seminar will be hosted online only.\nHost:\nXiaorui Liu\, CSC
URL:https://csc.ncsu.edu/event/breaking-barriers-advancing-long-context-llms/
CATEGORIES:CS AI Seminar Series,Lecture/Seminar
LOCATION:https://ncsu.zoom.us/j/91683735738
END:VEVENT
END:VCALENDAR