Announcing Verification Projects: A Scalable Way of Validating Machine-Generated Data

With Verification Projects, map makers and GIS experts can add human validation to the workflow by reviewing machine-extracted map data. Comprising a project dashboard and a simple game-based tool for verifying object detections, Verification Projects enable quality assurance of the data available in any area and help improve Mapillary’s machine learning algorithms with human feedback.

Mapillary Verification Projects

Mapillary helps people understand the world and update maps at scale by using computer vision technology. The Mapillary platform automatically extracts map data from street-level imagery with the power of machine learning. Just recently, we made 42 new types of objects available as map features, in addition to the 1,500 traffic sign classes already on the map. All of this data is generated by our machine learning algorithms and is available to anyone through our data subscriptions.

But even the best machine learning algorithms sometimes make mistakes. For example, a traffic sign may be mistaken for a different, similar sign (a “false positive” in industry terminology). Or a sign only visible from a challenging viewpoint may be missed altogether (a “false negative”). It is worth noting, however, that on object recognition tasks such as traffic signs, state-of-the-art algorithms perform as well as or better than humans.
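As a quick refresher on that terminology, detection quality is usually summarized with precision (how many reported detections are correct, which penalizes false positives) and recall (how many real objects were found, which penalizes false negatives). The snippet below is a generic illustration of these two metrics with made-up numbers, not a description of Mapillary’s internal evaluation.

```python
def precision_recall(true_positives: int, false_positives: int, false_negatives: int):
    """Standard detection metrics: precision penalizes false positives,
    recall penalizes false negatives (missed objects)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Example: 95 correct sign detections, 3 wrong ones, 7 signs missed (made-up numbers).
p, r = precision_recall(95, 3, 7)
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.97, recall=0.93
```

Seen through this lens, Verification Projects are a way to raise precision after the fact by filtering out false positives with human review.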

Example of object recognition: machine learning algorithms have detected crosswalks, support poles, and traffic lights in the image

These machine mistakes can affect the quality of your final data output. Map features are created by detecting the same object, such as a traffic sign, in multiple images (using triangulation, we estimate the position of the sign on the map). So if a traffic sign is misrecognized in an image, the position of the resulting map feature may be calculated less accurately, or the wrong type of map feature may be created altogether, affecting the count of traffic signs by type in your area.
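To make the triangulation step a little more concrete, here is a rough, simplified 2D sketch of the idea: each image contributes a bearing ray from a known camera position towards the detected sign, and the estimated position is the point closest to all rays. The function name, coordinate convention, and numbers are illustrative only, not Mapillary’s actual pipeline.

```python
import numpy as np

def triangulate_2d(cam_positions, bearings_deg):
    """Least-squares intersection of bearing rays from several camera positions.

    cam_positions: list of (x, y) camera locations, e.g. in local metres.
    bearings_deg:  compass-style bearing from each camera to the object,
                   measured clockwise from north, in degrees.
    Returns the (x, y) point closest to all rays.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for (px, py), bearing in zip(cam_positions, bearings_deg):
        theta = np.radians(bearing)
        d = np.array([np.sin(theta), np.cos(theta)])  # unit direction of the ray
        P = np.eye(2) - np.outer(d, d)                # projects onto the ray's normal
        A += P
        b += P @ np.array([px, py])
    return np.linalg.solve(A, b)

# Two images of the same stop sign, taken 10 m apart along a street (made-up numbers).
print(triangulate_2d([(0.0, 0.0), (10.0, 0.0)], [45.0, 315.0]))  # ~[5.0, 5.0]
```

With fewer or wrong rays, the intersection is less constrained, which is how recognition errors translate into less accurate positions on the map.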

There are two strategies to address these occasional mistakes in the data: first, to fix the issues in your area of interest by manual quality assurance, and second, to improve the algorithms so they don’t make these mistakes in the future. With today’s release of Verification Projects, we address both.

Verification Projects let people review whether the machine recognized an object correctly by simply voting yes or no on it. To make it more fun for participants, the process is set up as a game with scores and leaderboards. All detections that humans have deemed false are removed from the data available on the Mapillary platform. At the same time, Mapillary uses this feedback to further train our algorithms, making them more precise in the future.

Getting started

Verification Projects are available to anyone on the Mapillary web app. To get started, create an account and set up an organization (if you haven’t done so yet, it’s quick and free). Navigate to your organization’s dashboard, and you’ll find Verification Projects right there in the sidebar.

Head over to your organization dashboard in the Mapillary web app to set up a verification project. →

Access Verification Projects from your organization dashboard in the Mapillary web app

As with capture projects, you can set up a verification project on any of your geographic areas of interest, which are represented as “Shapes” in our web application. After picking a shape, you choose the object classes you would like to have verified by humans, for example benches or stop signs. You can also combine multiple classes in one project.

For each class, the system generates a verification task and gives you a summary of how many detections of that class are present in the images in your shape. There is a unique link for your project (and a separate one for each task) that you can distribute to anyone doing verifications for you: people on your team, contacts in your network reached via social media, or a dedicated outsourcing provider you hire. Get in touch with us if you need help with getting your verifications done.

Anyone opening the link will see a list of simple yes/no tasks, one for each object class included in the project. Within a task, you’ll be shown detections from your area that have been classified as that object, and you simply confirm or reject each one.

Example of a verification task

To further improve quality, each verification is only deemed “done” when at least two people have confirmed or rejected it in the same way. Everyone who verifies detections also collects points and climbs a leaderboard shown in the task, making it fun to participate.
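As a minimal sketch of what such an agreement rule could look like (hypothetical function and data structures, not Mapillary’s implementation):

```python
from collections import Counter

def verification_status(votes):
    """Return 'confirmed', 'rejected', or 'pending' for a detection,
    applying a simple rule: done once at least two voters agree.
    `votes` is a list of booleans, True = confirmed, False = rejected."""
    counts = Counter(votes)
    if counts[True] >= 2:
        return "confirmed"
    if counts[False] >= 2:
        return "rejected"
    return "pending"

print(verification_status([True]))               # pending: only one vote so far
print(verification_status([True, False, True]))  # confirmed: two matching "yes" votes
print(verification_status([False, False]))       # rejected: removed from the platform data
```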

As a project admin, you can track progress through the project dashboard, which shows how many detections have already been verified and how many are still pending. Map features based on these detections are periodically recalculated, so the work done in the verification project will soon be reflected in the map data that you download from the platform or access via the API.

Follow the progress of your project on the dashboard
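The recalculated features are the same ones you can fetch programmatically. The sketch below shows roughly what such a request could look like; the endpoint, token format, and field names are assumptions based on Mapillary’s public Graph API and may differ, so check the API documentation for the exact details.

```python
import requests  # third-party HTTP client: pip install requests

# Assumed parameters -- verify the endpoint, token format, and field names
# against the API documentation before relying on this sketch.
ACCESS_TOKEN = "MLY|your-token-here"
BBOX = "12.95,55.58,13.05,55.62"  # minLon,minLat,maxLon,maxLat (example area)

resp = requests.get(
    "https://graph.mapillary.com/map_features",
    params={
        "access_token": ACCESS_TOKEN,
        "fields": "id,object_value,geometry",
        "bbox": BBOX,
    },
    timeout=30,
)
resp.raise_for_status()
for feature in resp.json().get("data", []):
    print(feature["object_value"], feature["geometry"])
```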

For more detailed documentation of the toolset, please head over to our Help Center.

What’s next?

The attentive reader may have noticed that in this first stage of Verification Projects we cover wrong detections (false positives), but not yet missed detections (false negatives). Extending Verification Projects accordingly will be the next step. Stay tuned.

/Till, VP of Product

Go to the Mapillary web app to set up an organization and get started with your first verification project. →
