EDGE AI POD

Author(s): EDGE AI FOUNDATION

About this audio

Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed covering all things edge AI from the world's largest EDGE AI community.

The feed includes shows such as EDGE AI Talks and EDGE AI Blueprints, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.

Join us to stay informed and inspired!

© 2026 EDGE AI FOUNDATION
Episodes
  • Atym and WASM are revolutionizing edge AI computing for resource-constrained devices.
    Feb 3 2026

    Most conversations about edge computing gloss over the enormous challenge of actually deploying and managing software on constrained devices in the field. As Jason Shepherd, Atym's founder, puts it: "I've seen so many architecture diagrams with data lakes and cloud hubs, and then this tiny little box at the bottom labeled 'sensors and gateways' - which means you've never actually done this in the real world, because that stuff is some of the hardest part."

    Atym tackles this challenge head-on by bringing cloud principles to devices that traditionally could only run firmware. Their revolutionary approach uses WebAssembly to enable containerization on devices with as little as 256 kilobytes of memory - creating solutions thousands of times lighter than Docker containers.

    Founded in 2023, Atym represents the natural evolution of edge computing. While previous solutions focused on extending cloud capabilities to Linux-based edge servers and gateways, Atym crosses what they call "the Linux barrier" to bring containerization to microcontroller-based devices. This fundamentally changes how embedded systems can be developed and maintained.

    The impact extends beyond technical elegance. By enabling containers on constrained devices, Atym bridges the skills gap between embedded engineers who understand hardware and firmware, and application developers who work with higher-level languages and AI. A machine learning engineer can now deploy models to microcontrollers without learning embedded C, while the embedded team maintains the core device functionality.

    This capability becomes increasingly crucial as edge AI proliferates and cybersecurity regulations tighten. Devices that once performed simple functions now need to run sophisticated intelligence that may come from third parties and require frequent updates - a scenario traditional firmware development approaches cannot efficiently support.

    Ready to revolutionize how you manage your edge devices? Explore how Atym's lightweight containerization could transform your edge deployment strategy.

    Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org

    25 min
  • Honey, I Shrunk the LLMs: Edge-Deployed AI Agents
    Jan 27 2026

    The landscape of artificial intelligence is experiencing a profound transformation, with AI capabilities moving from distant cloud servers directly to edge devices where your data lives. This pivotal shift isn't just about running small models locally—it represents a fundamental reimagining of how we interact with AI systems.

    In this fascinating exploration, Dell Technologies' Aruna Kolluru takes us deep into the world of edge-deployed AI agents that can perceive their surroundings, generate language, plan actions, remember context, and use tools—all without requiring cloud connectivity. These aren't simple classification systems but fully autonomous digital partners capable of making complex decisions where your data is generated.
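The perceive-plan-act-remember loop described here can be sketched in a few lines. This is a hypothetical illustration, not code from the episode: the `EdgeAgent` class, its rule-based planner (standing in for a small local LLM), and the tool names are all invented for demonstration.

```python
# Hypothetical minimal edge-agent loop: perceive -> plan -> act,
# with a simple memory of past observations for context.
class EdgeAgent:
    def __init__(self, tools):
        self.tools = tools      # name -> callable: the agent's tool belt
        self.memory = []        # remembered observations

    def perceive(self, observation):
        self.memory.append(observation)

    def plan(self):
        # Trivial rule-based planner standing in for a small local model.
        last = self.memory[-1]
        if last.get("defect"):
            return ("alert", last["part_id"])
        return ("log", last["part_id"])

    def act(self):
        tool, arg = self.plan()
        return self.tools[tool](arg)

# Example tools an on-device factory agent might call.
actions = []
agent = EdgeAgent({
    "alert": lambda pid: actions.append(f"alert:{pid}"),
    "log":   lambda pid: actions.append(f"log:{pid}"),
})
agent.perceive({"part_id": "A42", "defect": True})
agent.act()
```

In a real deployment the planner would be a quantized language model and the tools would wrap actuators or local APIs; the control flow, however, follows this same loop.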

    Discover how miniaturized foundation models like Mistral and TinyLlama, combined with agentic frameworks and edge-native runtimes, have made this revolution possible. Through compelling real-world examples, Aruna demonstrates how these systems are transforming industries today: autonomous factory agents detecting defects and triggering interventions, rural healthcare assistants providing offline medical guidance, disaster response drones generating situational awareness, and personalized retail advisors creating real-time offers for shoppers.

    The technical journey doesn't stop at deployment. We examine the sophisticated optimization techniques making these models edge-friendly, the memory systems enabling contextual awareness, and the planning frameworks orchestrating multi-step workflows. Importantly, we tackle the critical governance considerations for these autonomous systems, including encrypted storage, tool access control, and comprehensive audit logging.
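One of the optimization techniques alluded to above is weight quantization. As an illustrative sketch (not from the episode), symmetric int8 post-training quantization maps each float weight to an 8-bit integer plus a shared scale factor, cutting storage to a quarter of float32:

```python
# Hypothetical sketch of symmetric int8 quantization, one way models
# are shrunk for edge deployment. Function names are illustrative.
def quantize_int8(weights):
    """Map float weights to int8 values plus a shared scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Approximately recover the original floats."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.003, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
```

Values near zero round away (0.003 becomes 0 here), which is the accuracy/size trade-off that the calibration and fine-tuning steps in real toolchains exist to manage.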

    Whether you're a developer looking to build edge AI solutions, an enterprise decision-maker exploring AI deployment options, or simply curious about where AI is headed, this episode offers invaluable insights into a technology that's bringing intelligence directly to where it's needed most. Subscribe to our podcast and join the conversation about the future of AI at the edge!

    42 min
  • Ambient Scientific's Journey: From Personal Tragedy to Ultra-Low Power AI Innovation
    Jan 20 2026

    When personal tragedy strikes, some find a way to transform pain into purpose. Such is the remarkable story behind Ambient Scientific, where founder GP Singh's mission to prevent falls after losing a family member evolved into groundbreaking semiconductor technology enabling AI at the ultra-low power edge.

    The journey wasn't simple. Creating chips that could run sophisticated deep learning algorithms on tiny batteries proved more challenging than building data center processors. This demanded innovation at every level – from custom instruction sets and compilers to complete software stacks. What emerged wasn't just a single-purpose chip but a programmable platform with the versatility to support diverse applications while consuming a fraction of the power of conventional solutions.

    Most fascinating is what GP calls the "gravitational pull" toward edge computing. Applications initially deployed in the cloud inevitably migrate closer to where data originates – from data centers to on-premises, to desktops, to mobile devices, and ultimately to tiny wearables. This migration stems from fundamental business concerns: operating costs, data sovereignty, vendor lock-in, and the inherent distrust organizations have for cloud dependencies. The evidence? In hundreds of customer conversations, GP has yet to meet a single organization content with keeping their AI exclusively in the cloud.

    Ready to explore ultra-low power AI? Ambient Scientific offers development kits accessible to anyone familiar with embedded systems programming and Python-based deep learning. Join the revolution bringing intelligence to where data is created, not where it's processed. Your next innovation might be powered by a chip that sips power while delivering remarkable AI capabilities.

    23 min