PowerInfer-2: Fast Large Language Model Inference on a Smartphone

12 minutes

A podcast discussion about PowerInfer-2, a framework for running large language models on smartphones, focusing on its neuron cluster design, adaptive computation strategies, and I/O optimizations.
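The episode's central idea, treating small neuron clusters rather than whole weight matrices as the unit of computation and I/O, can be sketched roughly as below. This is a minimal, illustrative Python example, not PowerInfer-2's actual code: the shapes, the cheap activation predictor, and the top-k gating rule are assumptions made only for demonstration.

```python
import numpy as np

# Hypothetical sizes for illustration only (not from the paper).
HIDDEN, FFN, CLUSTER = 64, 256, 16   # CLUSTER = neurons grouped per I/O unit

rng = np.random.default_rng(0)
W_up = rng.standard_normal((FFN, HIDDEN)).astype(np.float32)    # up projection
W_down = rng.standard_normal((HIDDEN, FFN)).astype(np.float32)  # down projection

def predict_active_clusters(x, top_k=4):
    """Stand-in for an activation predictor: guess which neuron clusters
    will fire so only their weights need to be computed (or fetched)."""
    scores = np.abs(W_up @ x)                               # proxy score per neuron
    cluster_scores = scores.reshape(-1, CLUSTER).mean(axis=1)
    return np.sort(np.argsort(cluster_scores)[-top_k:])     # keep the top-k clusters

def sparse_ffn(x):
    """FFN forward pass over the predicted neuron clusters only,
    skipping the weights (and the I/O to load them) for inactive ones."""
    active = predict_active_clusters(x)
    rows = (active[:, None] * CLUSTER + np.arange(CLUSTER)).ravel()
    h = np.maximum(W_up[rows] @ x, 0.0)                     # ReLU over active neurons
    return W_down[:, rows] @ h

x = rng.standard_normal(HIDDEN).astype(np.float32)
print(sparse_ffn(x).shape)   # (64,) -- same output shape as the dense FFN
```

In this sketch only 4 of 16 clusters are ever touched per token, which is the kind of sparsity that lets weights be streamed from flash on demand instead of kept fully resident in smartphone RAM.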