Meet our client
Client:
Industry:
Market:
Technology:
Client’s Challenge
A leading producer of handheld devices and self-checkout systems needed to optimize its computer vision models for edge devices to stay competitive. Its entire model zoo had to be converted, and the main challenges were achieving high computational speed without sacrificing quality, all within a tight deadline.
Our Solution
We compressed and converted the models to ensure compatibility with edge devices and optimized the inference pipeline for edge hardware, closely monitoring quality throughout to minimize performance loss. Additionally, we developed an easy-to-use API that lets application developers integrate the models without deep expertise in computer vision.
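The case study does not name the specific compression technique used. As an illustration only, the sketch below shows the core idea behind one common approach, post-training weight quantization: mapping float weights to small integers so edge hardware can run cheaper integer arithmetic, while the quantization scale bounds the accuracy loss. All function names here are hypothetical.

```python
def quantize_weights(weights, num_bits=8):
    """Symmetric post-training quantization: map floats to signed ints."""
    qmax = 2 ** (num_bits - 1) - 1            # 127 for int8
    scale = max(abs(w) for w in weights) / qmax or 1.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float weights, e.g. for accuracy checks."""
    return [q * scale for q in quantized]

# Toy weight vector standing in for a real model layer.
weights = [0.91, -0.42, 0.07, -1.27]
q, scale = quantize_weights(weights)
approx = dequantize(q, scale)
# Per-weight rounding error is bounded by scale / 2, which is what
# keeps the quality degradation small after compression.
```

In practice this kind of quantization is usually done by the conversion toolchain (e.g. an edge runtime's converter) rather than by hand, but the trade-off is the same: fewer bits per weight in exchange for a bounded, measurable accuracy loss.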
Client’s Benefits
The optimized solution enabled ML models to run directly on edge devices, increasing inference speed by up to 10x compared to the baseline. The quality of results remained high, with less than 1% degradation compared to non-edge inference. Additionally, the solution improved scalability and reduced costs.