Spatial Perception at the Edge
Open-source libraries and microservices for AI-driven spatial perception on embedded devices.
Au-Zone Technologies builds EdgeFirst — a comprehensive platform for deploying AI perception on edge devices. We support cameras, LiDAR, radar, and time-of-flight sensors for real-time object detection, segmentation, sensor fusion, and 3D spatial understanding.
Our stack spans four layers: Foundation (hardware abstraction and inference delegates), Zenoh (pub/sub microservices), GStreamer (media pipeline plugins), and ROS 2 integration. All released under Apache 2.0.
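The Zenoh layer means perception services publish their results as topics that any client can subscribe to. As a minimal sketch, here is what consuming a detection stream could look like with the `eclipse-zenoh` Python client; the topic name `rt/detect/boxes2d` and the JSON message schema are illustrative assumptions, not the documented EdgeFirst wire format.

```python
import json

def parse_boxes(payload: bytes):
    """Decode a hypothetical JSON detection message into (label, score) pairs.

    Assumed schema: {"boxes": [{"label": str, "score": float}, ...]}
    """
    msg = json.loads(payload)
    return [(b["label"], b["score"]) for b in msg.get("boxes", [])]

def main():
    # Requires `pip install eclipse-zenoh`; the topic name below is
    # illustrative, not a documented EdgeFirst topic.
    import zenoh

    session = zenoh.open(zenoh.Config())
    session.declare_subscriber(
        "rt/detect/boxes2d",
        lambda sample: print(parse_boxes(bytes(sample.payload))),
    )

if __name__ == "__main__":
    main()
```

Because Zenoh is plain pub/sub, the same subscriber works whether the detector runs on the same board or elsewhere on the network.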
Pre-trained models validated on real edge hardware with full-dataset accuracy metrics and detailed timing breakdowns per device.
Browse the EdgeFirst Model Zoo →

| Task | Models |
|---|---|
| Detection | YOLO26 · YOLO11 · YOLOv8 · YOLOv5 |
| Segmentation | YOLO26 · YOLO11 · YOLOv8 |
EdgeFirst Studio is our MLOps platform for the complete perception development lifecycle — dataset management, model training, INT8 quantization, on-target validation, and deployment. Free tier available.
| Resource | Link |
|---|---|
| Website | edgefirst.ai |
| Studio | edgefirst.studio |
| GitHub | github.com/EdgeFirstAI |
| Documentation | doc.edgefirst.ai |
| Model Zoo | EdgeFirst/Models |
| Company | Au-Zone Technologies |