<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>TLMs: Tiny LLMs and Agents on Edge Devices with LiteRT-LM — Cormac Brick, Google</title>
        <link>https://video.ut0pia.org/videos/watch/a91b6f80-bd83-4223-8641-26fb88af3d72</link>
        <description>Tiny LLMs are making on-device agents much more practical. In this workshop, Cormac Brick walks through how LiteRT-LM brings language models to edge devices, with a focus on Gemma, agent skills, and the real engineering tradeoffs behind running LLM workflows on phones and other constrained hardware. The session covers performance across edge devices, on-device function calling, fine-tuning and deployment, platform support across Android and iOS, and the memory, safety, and UX constraints that shape edge-native AI systems. If you're building local agents or want a practical look at where edge LLMs are headed, this is a useful hands-on overview. Speaker info: https://www.linkedin.com/in/cbrick/</description>
        <lastBuildDate>Mon, 04 May 2026 11:29:11 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>PeerTube - https://video.ut0pia.org</generator>
        <image>
            <title>TLMs: Tiny LLMs and Agents on Edge Devices with LiteRT-LM — Cormac Brick, Google</title>
            <url>https://video.ut0pia.org/lazy-static/avatars/0287a09a-aae7-4840-9843-b416426e7046.webp</url>
            <link>https://video.ut0pia.org/videos/watch/a91b6f80-bd83-4223-8641-26fb88af3d72</link>
        </image>
        <copyright>All rights reserved, unless otherwise specified in the terms at https://video.ut0pia.org/about or in licenses granted by each content's rights holder.</copyright>
        <atom:link href="https://video.ut0pia.org/feeds/video-comments.xml?videoId=a91b6f80-bd83-4223-8641-26fb88af3d72" rel="self" type="application/rss+xml"/>
    </channel>
</rss>