The SCIPE Workshop on Large Language Models brings together researchers, practitioners, and students interested in understanding and working with state-of-the-art language models. This intensive three-day program covers everything from foundational concepts to advanced research topics.
Whether you're looking to apply LLMs in your work or explore cutting-edge research directions, this workshop provides hands-on experience with modern tools, access to high-performance computing resources, and guidance from experts in the field.
Learn to deploy and use LLMs effectively with practical tools like transformers, vLLM, and LangChain. Gain hands-on experience with inference optimization, RAG systems, and API integration.
Explore advanced topics including training methodologies, fine-tuning techniques, test-time scaling, compression strategies, and the latest developments in agentic AI systems.
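As a preview of the practical sessions, the sketch below prompts a small open model through the transformers pipeline API mentioned above. The checkpoint ("gpt2") is only a convenient example; any causal LM from the Hugging Face Hub could be substituted.

```python
# Minimal sketch: prompt an open model with the Hugging Face transformers
# pipeline. "gpt2" is just a small example checkpoint, not a workshop requirement.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation for a prompt.
result = generator("Large language models are", max_new_tokens=30)
print(result[0]["generated_text"])
```

The same interface carries over to larger instruction-tuned checkpoints once you have GPU access; only the model name and device placement change.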
Foundations and Getting Started
Opening remarks and networking
Overview of LLM architecture, current capabilities, generative AI, agentic AI, and active research areas
Includes 15-minute Q&A session
Setting up your environment: GitHub, SSH, and HPC basics using Jupyter Notebooks
Run your first LLM on HPC infrastructure
Run reasoning and non-reasoning LLMs on your local machine
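A typical first exercise for these hands-on sessions is batched generation. Below is a minimal sketch using vLLM, one of the serving libraries named above; it assumes a machine with a supported GPU and uses facebook/opt-125m purely as a small example model.

```python
# Minimal sketch: batched offline inference with vLLM on a GPU node.
# The model id is an example; substitute any checkpoint available to you.
from vllm import LLM, SamplingParams

prompts = [
    "Explain what a transformer is in one sentence.",
    "Write a haiku about GPUs.",
]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

llm = LLM(model="facebook/opt-125m")  # example small model
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```

vLLM batches requests and manages the KV cache for you, which is one reason it is a common choice on shared GPU nodes.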
Training and Optimization
Review of the take-home assignment, followed by pretraining fundamentals and post-training methods, including supervised fine-tuning and reinforcement learning approaches
Techniques for efficient serving, compression strategies, and quantization methods
Understanding metrics and benchmarking methodologies for LLMs (a minimal perplexity sketch appears after this day's schedule)
Practical session on quantization, KV-cache optimization, and inference acceleration (a 4-bit loading sketch also appears after this day's schedule)
Literature review of recent research papers in LLMs
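To make the evaluation session above concrete, here is a minimal sketch of computing perplexity with transformers. It is only an illustration under simple assumptions: the checkpoint ("gpt2") and the sample text are placeholders, and real benchmarking uses held-out corpora and standardized harnesses.

```python
# Minimal sketch: perplexity of a text under a causal LM, via transformers.
# "gpt2" and the sample sentence are placeholders for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "Large language models are evaluated with held-out text."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels equal to the inputs, the model returns the average
    # next-token cross-entropy loss; perplexity is its exponential.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity: {torch.exp(loss).item():.2f}")
```

Lower perplexity means the model assigns higher probability to the reference text.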
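For the quantization practical, a common starting point is 4-bit weight loading with bitsandbytes through transformers. The sketch below assumes a CUDA GPU and the bitsandbytes package, and uses an example checkpoint id; it is one possible approach rather than the workshop's prescribed setup.

```python
# Minimal sketch: load a model with 4-bit quantized weights via bitsandbytes.
# Assumes a CUDA GPU; the model id is an example checkpoint, not a requirement.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example checkpoint

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # store weights in 4-bit format
    bnb_4bit_quant_type="nf4",              # NF4 quantization scheme
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                      # place layers on available GPUs
)

inputs = tokenizer("Quantization reduces memory by", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

NF4 storage roughly quarters weight memory relative to fp16, at some cost in accuracy, with compute still performed in bf16.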
Advanced Applications and Research
Specialized applications of LLMs for programming tasks
Review of assigned papers and initial research idea presentations
Continued presentation of research proposals
10-minute presentations from participants on their workshop projects and insights
Curated materials to help you prepare for the workshop and continue learning afterward
Andrej Karpathy provides a comprehensive 1-hour introduction to LLMs, covering architecture, training, and applications
Practical insights from Andrej Karpathy on leveraging LLMs in real-world workflows
IBM Technology's comprehensive playlist covering LLM fundamentals and chatbot development
3Blue1Brown's intuitive visual explanations of neural network concepts
Python AI tutorial from a LangChain engineer on building retrieval-augmented generation systems
Deep dive into implementing GPT-2 from scratch with Andrej Karpathy
CME 295, Fall 2025 - Comprehensive course on transformer architectures and large language models
Fall 2025 - Advanced topics in NLP with video lectures available
Fall 2025 - Focused course on inference optimization techniques
CS 230, Fall 2025 - Foundational deep learning concepts and applications
The workshop will be offered in two formats to accommodate different participant needs:
All registered participants will receive access to high-performance computing resources for the duration of the workshop. Access credentials and setup instructions will be provided after registration confirmation.
While the workshop accommodates participants at different levels, you'll get the most value with:
Registration is open from November 13 to December 13, 2025. Space is limited, so we encourage early registration.
Register Here