Local AI News & Updates

Google Launches AI Edge Gallery App for Local Model Execution on Mobile Devices

Google has quietly released an experimental app called AI Edge Gallery that lets users download AI models from Hugging Face and run them directly on their Android phones without an internet connection. Using models such as Google's Gemma 3n, the app supports local execution of tasks including image generation, question answering, and code editing. It is currently in alpha, with an iOS version planned, and performance varies with device hardware and model size.
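For developers who want to experiment with the same kind of on-device inference outside the Gallery app, Google's MediaPipe LLM Inference API offers a comparable route on Android. The sketch below is a minimal, hypothetical example, not code from the Gallery app itself: it assumes a Gemma model bundle has already been downloaded to local storage, and the model path, token limit, and function name are illustrative placeholders.

```kotlin
// Minimal sketch of on-device text generation with MediaPipe's LLM Inference API.
// Assumes a model bundle (e.g. a Gemma variant from Hugging Face) is already on disk;
// the path and limits below are placeholder assumptions, not values from AI Edge Gallery.
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

fun runLocalPrompt(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma3n.task") // hypothetical local path
        .setMaxTokens(512)                                // cap on prompt + response tokens
        .build()

    // Inference runs entirely on-device; no network call is made.
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(prompt)
}
```

Response latency in a setup like this depends heavily on the phone's chipset and the size of the model bundle, which is consistent with the performance caveats Google attaches to the Gallery app.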

Framework Launches Desktop PC Optimized for Local AI Model Inference

Framework has released its first desktop computer, built around AMD's Strix Halo architecture (Ryzen AI Max processors) and aimed at gaming and local AI inference. The compact 4.5L machine can run large language models locally, including Llama 3.3 70B, with configurations offering up to 128GB of soldered RAM and 256GB/s of memory bandwidth.
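To see why the 128GB capacity and 256GB/s bandwidth figures matter for a 70B-parameter model, a rough back-of-envelope estimate helps: at 4-bit quantization the weights take roughly half a byte per parameter, and on a memory-bound system the decode speed is capped by bandwidth divided by the bytes read per generated token. The numbers in the sketch below are illustrative assumptions, not Framework's published benchmarks.

```kotlin
// Back-of-envelope sizing for a 70B-parameter model on this class of hardware.
// All constants are rough assumptions, not measured or vendor-published figures.
fun main() {
    val params = 70e9                  // Llama 3.3 70B parameter count
    val bytesPerParamQ4 = 0.56         // ~4.5 bits/param for a typical 4-bit quant
    val weightsGb = params * bytesPerParamQ4 / 1e9

    val ramGb = 128.0                  // maximum configurable soldered RAM
    val bandwidthGbPerS = 256.0        // quoted memory bandwidth

    // Memory-bound decoding reads (roughly) all weights once per generated token,
    // so bandwidth / weight size gives an optimistic tokens-per-second ceiling.
    val tokensPerSecCeiling = bandwidthGbPerS / weightsGb

    println("Quantized weights: ~%.0f GB (fits in %.0f GB RAM)".format(weightsGb, ramGb))
    println("Decode ceiling:    ~%.1f tokens/s".format(tokensPerSecCeiling))
}
```

Under these assumptions the quantized weights land around 40GB, well within 128GB, while the bandwidth bound suggests single-digit tokens per second for a dense 70B model, which is why this class of machine is pitched at local inference rather than high-throughput serving.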