SCIPE Workshop on Large Language Models

Workshop Dates: January 16–18, 2026
Registration Period: November 13 – December 13, 2025
Register Now

About the Workshop

The SCIPE Workshop on Large Language Models brings together researchers, practitioners, and students interested in understanding and working with state-of-the-art language models. This intensive three-day program covers everything from foundational concepts to advanced research topics.

Whether you're looking to apply LLMs in your work or explore cutting-edge research directions, this workshop provides hands-on experience with modern tools, access to high-performance computing resources, and guidance from experts in the field.

For Practitioners

Learn to deploy and use LLMs effectively with practical tools such as Hugging Face Transformers, vLLM, and LangChain. Gain hands-on experience with inference optimization, RAG systems, and API integration.
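
For a sense of the level of the hands-on sessions, here is a minimal text-generation sketch using Hugging Face Transformers; the model name is only an example, and the workshop materials may use different models:

```python
# Minimal text generation with Hugging Face Transformers.
# "gpt2" is just a small example model that runs on CPU.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Large language models are", max_new_tokens=30)
print(result[0]["generated_text"])
```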

For Researchers

Explore advanced topics including training methodologies, fine-tuning techniques, test-time scaling, compression strategies, and the latest developments in agentic AI systems.

Workshop Schedule

Day 1 - January 16, 2026

Foundations and Getting Started

12:30 PM - 1:30 PM

Welcome and Lunch

Opening remarks and networking

1:30 PM - 2:45 PM

Introduction to Large Language Models

Overview of LLM architecture, current capabilities, generative AI, agentic AI, and active research areas

Includes a 15-minute Q&A session

3:00 PM - 4:00 PM

High-Performance Computing Access

Setting up your environment: GitHub, SSH, and HPC basics using Jupyter Notebooks

4:00 PM - 5:00 PM

Hands-on Session

Run your first LLM on HPC infrastructure

Evening

Take-Home Assignment

Run reasoning and non-reasoning LLMs on your local machine

Day 2 - January 17, 2026

Training and Optimization

9:00 AM - 10:30 AM

Training Deep Dive

Review of the take-home assignment, followed by pretraining fundamentals and post-training methods, including supervised fine-tuning and reinforcement learning approaches

11:00 AM - 12:00 PM

Efficient LLM Deployment

Techniques for efficient serving, compression strategies, and quantization methods

12:00 PM - 1:00 PM

Lunch Break

1:00 PM - 2:30 PM

Evaluation and Benchmarking

Understanding metrics and benchmarking methodologies for LLMs

3:00 PM - 5:00 PM

Inference Optimization Workshop

Practical session on quantization, KV-cache optimization, and inference acceleration

Evening

Take-Home Assignment

Literature review of recent research papers on LLMs

Day 3 - January 18, 2026

Advanced Applications and Research

9:00 AM - 10:30 AM

Code Generation with LLMs

Specialized applications of LLMs for programming tasks

11:00 AM - 12:00 PM

Paper Discussion and Research Ideas

Review of assigned papers and initial research idea presentations

12:00 PM - 1:00 PM

Lunch Break

1:00 PM - 2:00 PM

Research Ideas Presentations

Continued presentation of research proposals

2:30 PM - 5:00 PM

Final Presentations

10-minute presentations from participants on their workshop projects and insights

Topics Covered

Core Concepts

  • Transformer architecture and attention mechanisms
  • Generative AI fundamentals
  • Agentic AI systems
  • Current research landscape
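
To make the first bullet concrete, here is a minimal, single-head sketch of scaled dot-product attention (no masking, no learned projections), the core operation inside a transformer layer:

```python
# Toy scaled dot-product attention: each query attends over all keys,
# and the output is a softmax-weighted sum of the values.
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

Q = np.random.randn(4, 8)   # 4 query positions, head dimension 8
K = np.random.randn(6, 8)   # 6 key/value positions
V = np.random.randn(6, 8)
print(attention(Q, K, V).shape)  # (4, 8)
```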

Infrastructure

  • High-performance computing access
  • GitHub and version control
  • SSH and remote access
  • Jupyter Notebook workflows

Inference and Deployment

  • CUDA programming basics
  • Hugging Face Transformers
  • vLLM for efficient serving
  • Local deployment with llama.cpp
  • LlamaIndex for data integration
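
As a rough illustration of the serving tools listed above, offline batch generation with vLLM can be sketched as follows; the model name is only an example, and a GPU is assumed:

```python
# Offline generation with vLLM (no serving endpoint needed).
from vllm import LLM, SamplingParams

llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")        # example model; swap in your own
params = SamplingParams(temperature=0.7, max_tokens=64)
outputs = llm.generate(["Explain KV caching in one sentence."], params)
print(outputs[0].outputs[0].text)
```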

Retrieval-Augmented Generation

  • LangChain framework
  • Vector databases
  • RAG system architecture
  • API integration patterns
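
To show the overall RAG architecture without tying it to any particular framework, here is a toy sketch in plain Python; embed() is a stand-in for a real embedding model, and the final prompt would be sent to an LLM:

```python
# Toy RAG loop: embed documents, retrieve the closest ones for a query,
# then build a prompt that grounds the LLM in the retrieved context.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding; a real system would call an embedding model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(128)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    def score(d):
        e = embed(d)
        return float(q @ e) / (np.linalg.norm(q) * np.linalg.norm(e))
    return sorted(docs, key=score, reverse=True)[:k]

docs = ["vLLM serves models efficiently.",
        "LoRA adds low-rank adapters.",
        "RAG retrieves documents to ground answers."]
context = "\n".join(retrieve("How does retrieval-augmented generation work?", docs))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How does RAG work?"
print(prompt)  # in a real system, this prompt goes to an LLM
```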

Training and Fine-tuning

  • Pretraining methodologies
  • Instruction fine-tuning
  • Parameter-efficient fine-tuning (PEFT)
  • LoRA, DoRA, and PiSSA
  • Reinforcement learning from human feedback
  • PPO, DPO, and ORPO algorithms
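
As a rough sketch of the LoRA idea listed above, the snippet below adds a trainable low-rank update to a frozen linear layer in plain PyTorch; real fine-tuning setups typically use a library such as peft rather than hand-rolled adapters:

```python
# Minimal LoRA adapter: freeze the pretrained layer and learn a low-rank update B @ A.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                       # freeze pretrained weights
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: starts as a no-op
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(512, 512))
print(layer(torch.randn(2, 512)).shape)                   # torch.Size([2, 512])
print(sum(p.numel() for p in layer.parameters() if p.requires_grad))  # only A and B train
```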

Advanced Topics

  • Test-time scaling and reasoning
  • Model compression and quantization
  • Evaluation metrics and benchmarking
  • AI agents and Model Context Protocol
  • Code generation applications
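
To illustrate the compression and quantization bullet, here is a toy per-tensor symmetric int8 quantization sketch; production methods (e.g., GPTQ, AWQ) are considerably more sophisticated:

```python
# Toy symmetric int8 quantization: map float weights to 8-bit integers with one
# per-tensor scale, then dequantize and measure the round-trip error.
import numpy as np

def quantize_int8(w: np.ndarray):
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)
q, s = quantize_int8(w)
print("max abs error:", float(np.abs(w - dequantize(q, s)).max()))
```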

Learning Resources

Curated materials to help you prepare for the workshop and continue learning afterward

Video Lectures and Tutorials

Introduction to Large Language Models

Andrej Karpathy provides a comprehensive 1-hour introduction to LLMs, covering architecture, training, and applications

Watch Video

How I Use LLMs

Practical insights from Andrej Karpathy on leveraging LLMs in real-world workflows

Watch Video

Large Language Models and Chatbots

IBM Technology's comprehensive playlist covering LLM fundamentals and chatbot development

View Playlist

Neural Networks

3Blue1Brown's intuitive visual explanations of neural network concepts

View Playlist

Learn RAG From Scratch

A Python tutorial from a LangChain engineer on building retrieval-augmented generation systems

Watch Tutorial

Reproducing GPT-2 (Advanced)

Andrej Karpathy's in-depth walkthrough of implementing GPT-2 from scratch

Watch Video

University Courses

Transformers & LLMs - Stanford University

CME 295, Fall 2025 - Comprehensive course on transformer architectures and large language models

View Syllabus

Advanced Natural Language Processing - Carnegie Mellon University

Fall 2025 - Advanced topics in NLP with video lectures available

View Course

Inference Algorithms for Language Modeling - Carnegie Mellon University

Fall 2025 - Focused course on inference optimization techniques

View Course

Deep Learning - Stanford University

CS 230, Fall 2025 - Foundational deep learning concepts and applications

View Syllabus

Logistics

Attendance Options

The workshop will be offered in two formats to accommodate different participant needs:

  • On-site: Full in-person experience with direct access to instructors and networking opportunities
  • Remote: Virtual attendance with live streaming and interactive Q&A sessions

HPC Access

All registered participants will receive access to high-performance computing resources for the duration of the workshop. Access credentials and setup instructions will be provided after registration confirmation.

Prerequisites

While the workshop accommodates participants at different levels, you'll get the most value with:

  • Basic programming knowledge (Python preferred)
  • Familiarity with machine learning concepts
  • Access to a computer with an internet connection

Registration

Registration is open from November 13 to December 13, 2025. Space is limited, so we encourage early registration.

Register Here