
Mobile app development for crop analysis based on computer vision

Android app development that allows farmers to run an extensive AI analysis of harvested crops using their smartphones

About the client

Inarix is a French company that builds an advanced AI-based platform for all actors in the agritech industry. Their Pocket Lab app allows unprecedented multi-criteria harvest analysis anywhere and anytime.

The challenge

With the harvest season just around the corner, Inarix aimed to improve their app quickly in order to deliver it to their agritech clients on time. Given the wide scope of their work, they brought Lemberg Solutions on board to help create a new Kotlin version of their Android app.

Delivered value

Lemberg Solutions helped Inarix rewrite their existing app from React Native to Kotlin and improved the barcode scanner, API error handling, and app responsiveness across different device screens. The dynamic background logic of the app and offline mode development were large-scale tasks requiring the most effort. Our team developed an offline feature that allows the app to queue the necessary data, so users can still get their AI analysis in the cloud even with an intermittent or no internet connection.

The process

The crop analysis project is our second collaboration with Inarix. Initially, they were looking for an experienced DevOps engineer at LS to handle MLOps tasks, and since our competence matched their needs, Inarix decided to also engage our Android developers in the Pocket Lab app rewrite.

The project was a race against time, since our client wanted the app ready before the harvest season started. We received project specifications, UI/UX designs, legacy code, and requirements for new features and user flows. We then agreed on a fixed-price collaboration for the project.

Here’s how the app works: users take pictures of the crops via the app, which sends them to the server for analysis in the cloud and then displays the crop qualification results. To match user-specific use cases and processes, Pocket Lab uses scenarios that it downloads from the server. These contain the steps for the app to perform, and they make the UI dynamic, expanding the scope of its work.
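A server-provided scenario can be thought of as an ordered list of steps the app interprets to build its UI. A minimal Kotlin sketch of the idea (the step types, field names, and scenario contents here are hypothetical illustrations, not Inarix's actual schema):

```kotlin
// Hypothetical model of a downloaded scenario: an ordered list of steps
// that the app walks through, building its screens dynamically.
sealed class Step {
    data class ScanBarcode(val prompt: String) : Step()
    data class TakePhoto(val label: String) : Step()
    data class ShowResult(val title: String) : Step()
}

data class Scenario(val id: String, val steps: List<Step>)

// Interpreting the scenario: each step maps to a screen or action.
fun describe(scenario: Scenario): List<String> = scenario.steps.map { step ->
    when (step) {
        is Step.ScanBarcode -> "Scanner screen: ${step.prompt}"
        is Step.TakePhoto -> "Camera screen: ${step.label}"
        is Step.ShowResult -> "Result screen: ${step.title}"
    }
}

fun main() {
    val wheat = Scenario(
        id = "wheat-analysis",
        steps = listOf(
            Step.ScanBarcode("Scan the sample bag"),
            Step.TakePhoto("Photograph the grains"),
            Step.ShowResult("Analysis results"),
        ),
    )
    describe(wheat).forEach(::println)
}
```

Because scenarios arrive as data rather than hard-coded flows, adding a new crop workflow only requires publishing a new scenario on the server, not shipping an app update.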

We started by rewriting the app from React Native to Kotlin to improve its performance and implement an offline-mode feature. To make the app process images smoothly, our team employed the CameraX library, Jetpack's modern API for camera app development. For fast barcode scanning, we proposed Google's ML Kit.
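Wiring ML Kit's barcode scanner into a CameraX analysis pipeline typically looks like the sketch below (illustrative only; the callback shape and class names are our own, and the real app's integration may differ):

```kotlin
// Sketch: an ImageAnalysis.Analyzer that feeds CameraX frames to
// ML Kit's barcode scanner and reports the first decoded value.
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.common.InputImage

class BarcodeAnalyzer(
    private val onBarcode: (String) -> Unit,
) : ImageAnalysis.Analyzer {
    private val scanner = BarcodeScanning.getClient()

    @androidx.camera.core.ExperimentalGetImage
    override fun analyze(imageProxy: ImageProxy) {
        val mediaImage = imageProxy.image ?: run { imageProxy.close(); return }
        val image = InputImage.fromMediaImage(
            mediaImage,
            imageProxy.imageInfo.rotationDegrees,
        )
        scanner.process(image)
            .addOnSuccessListener { barcodes ->
                barcodes.firstOrNull()?.rawValue?.let(onBarcode)
            }
            // Always close the frame so CameraX can deliver the next one.
            .addOnCompleteListener { imageProxy.close() }
    }
}
```

The analyzer is attached to an `ImageAnalysis` use case alongside the preview, so scanning happens on live frames without a separate capture step.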

In offline mode, LS engineers had to remove the app's dependence on a live connection to the server: users needed the ability to check the result later. For this feature, the app collects all the data a user provides, including the pictures. When the user connects to the internet, the app uploads the data to the server and delivers the result. A lack of internet connection doesn't affect the field operator's work; the only difference is that users don't get the result immediately. To implement the offline feature, we developed a queue mechanism using the Retrofit library and Kotlin coroutines. Data exchange relies heavily on JSON, since the app depends on reliable communication with the server. We built the app architecture following the MVVM design pattern.
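The core of the offline feature is the queue itself. A deliberately simplified, framework-free sketch of the idea follows; a production version would persist the queue to disk and hand items to Retrofit suspend calls, whereas here the upload is a stand-in function and all names are hypothetical:

```kotlin
// Simplified sketch of the offline queue: captures are queued locally
// and flushed to the server once connectivity returns.
data class PendingAnalysis(val photoPath: String, val scenarioId: String)

class UploadQueue(private val upload: (PendingAnalysis) -> Boolean) {
    private val queue = ArrayDeque<PendingAnalysis>()

    // Called whenever the user completes a capture, online or offline.
    fun enqueue(item: PendingAnalysis) = queue.addLast(item)

    val pendingCount: Int get() = queue.size

    // Called when connectivity is restored: drain the queue in order,
    // stopping at the first failed upload so it can be retried later.
    fun flush(): Int {
        var sent = 0
        while (queue.isNotEmpty()) {
            if (!upload(queue.first())) break
            queue.removeFirst()
            sent++
        }
        return sent
    }
}
```

Flushing in order and stopping at the first failure keeps uploads consistent with the sequence in which the field operator captured them.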

We also implemented an upload manager that saves its state and resumes when the app reconnects to the internet, allowing tasks to be added to a queue. Building the queue required extensive development; our Android developer came up with several ideas that he later turned into the open source RetrofitRetry library. On this project, these ideas drove the repetition of requests that failed.
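The request-repetition idea boils down to re-running a failing operation a bounded number of times with growing delays. A generic sketch of that pattern in plain Kotlin (illustrative only; this is not the actual RetrofitRetry API):

```kotlin
// Retry a failing operation up to maxAttempts times with exponential
// backoff between attempts; rethrow the last error if all attempts fail.
fun <T> retry(
    maxAttempts: Int = 3,
    initialDelayMs: Long = 100,
    block: (attempt: Int) -> T,
): T {
    var delayMs = initialDelayMs
    var lastError: Exception? = null
    repeat(maxAttempts) { attempt ->
        try {
            return block(attempt) // non-local return: success ends the loop
        } catch (e: Exception) {
            lastError = e
            Thread.sleep(delayMs)
            delayMs *= 2 // exponential backoff before the next attempt
        }
    }
    throw lastError ?: IllegalStateException("retry exhausted")
}
```

In a coroutine-based app the same shape would use a `suspend` block and `delay` instead of `Thread.sleep`, so retries don't block a thread.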

The project was divided into separate milestones, some of which included UAT phases. Inarix provided their workers with Android phones to test interim releases.

After the project was completed, Inarix planned to spec out and deliver an iOS version of Pocket Lab with our help.

Technologies
Kotlin
MVVM architecture
Coroutines
Retrofit
JSON
Google ML Kit
CameraX

How it works

[Scheme: how the Pocket Lab app works]

Lemberg Solutions helped us transfer our Pocket Lab application from React Native to Kotlin within a short period of time. Collaborating with them allowed us to bring new features, including an offline mode, to our customers. They successfully adapted to our requirements with a dedicated team of experts ready to tackle our challenges.

Guillaume Robin, Inarix
INTERESTED IN THIS CASE STUDY?
Download the full case study in PDF to save it for later.