From Satellite to Street-Level: How MapSwipe’s New Mapillary Integration is Transforming Mapping

MapSwipe now includes street-level imagery thanks to a new integration with Mapillary, giving volunteers another way to explore and contribute to open mapping. This means people can now see details that aren’t visible from above, like road surfaces, signs of waste, or other conditions on the ground, helping to fill in data gaps and build a more complete map.
Nicole Siggins
22 May 2025

What is MapSwipe?

The MapSwipe website offers many ways to get started and get involved with the MapSwipe community.

MapSwipe is a free app (for mobile and web) that helps volunteers around the world contribute high-quality information to geospatial data projects, while making it as easy as scrolling through social media. Volunteers can contribute by looking for features in satellite imagery, like buildings or roads; comparing before-and-after scenes; or validating OpenStreetMap (OSM) data.

MapSwipe first launched in 2016, and today it counts 190 million swipes from 101K volunteers across 1,739 active and completed projects, with 5,356,000 square kilometers mapped and a combined 12 years and 6 months spent MapSwiping!

A contribution heatmap, showing all the projects that have ever been on MapSwipe from 2016 to present day!

Until recently, MapSwipe only supported satellite imagery. The need to incorporate street-level imagery had been recognized for some time by the OSM community and humanitarian organizations. HeiGIT had already begun beta development to address this gap, but the project lacked the resources to move forward.

When I secured funding through the Humanitarian OpenStreetMap Team in 2024, it provided the boost needed to bring the project to life. As a result, the street-level imagery feature officially launched in March 2025!

So, why does street-level imagery matter?

Satellite imagery isn’t enough to complete the map: you can’t see a road's surface, whether a building is a hospital or a school, or if there’s garbage on the ground. Additionally, trees, building overhangs, and elevated roadways can be obstructions, often blocking features like sidewalks, crossings, or signage.

Enter the Street Level Imagery project, MapSwipe’s new mission that uses the power of Mapillary to help volunteers review and validate what they see in street-level photos. While these contributions don’t directly edit OpenStreetMap, they support mappers by pointing to where updates or additions might be needed, helping improve the accuracy of the map.

With the Mapillary integration, you can now use MapSwipe to look at on-the-ground images to determine what should be on the map: whether you’re reporting that a road is paved or unpaved, that a building doesn’t exist, or if there’s garbage on the side of the road. Street-level imagery opens up so many new possibilities that simply aren’t there with satellite imagery alone.

A street-level imagery MapSwipe project where volunteers can determine if a road is paved or unpaved. By HeiGIT.

How does it work? Let’s take a tour!

First, find a street-level imagery project on MapSwipe.

Like other projects on MapSwipe, the street-level imagery project comes with a tutorial that explains the project’s scope and objectives and helps you provide the best-quality data.

The tutorial walks through the various options you have for the project and how you might respond depending on the images you see. In this project, you have the option to say:

  • Yes, there’s solid waste in this image,
  • No, there’s no solid waste, or,
  • It’s bad imagery, or you’re unsure.

Instructions for a street-level imagery project

The great thing about MapSwipe is that you don’t have to be perfect: just do your best with each image you see. At least three volunteers review each image, so the folks who made the projects are never relying on your input alone. Contributing directly to OpenStreetMap can be stressful for beginners, who often work independently without oversight; in this sense, MapSwipe offers a much gentler on-ramp for providing data to important projects.

Once inside the project, you can use your mouse to drag your view around (if it's a 360-degree image) or zoom in to get a closer view.

The MapSwipe UI makes it easy to zoom in on images — you can use the on-image zoom arrows or use a mouse or trackpad.

Once you’ve made your selection for each image and gotten to the end of a group of images, you can save your work for the entire group. Your progress is saved to the database, and you can move on to more images or leave the project to do something else.

The Street Level Imagery project, like all MapSwipe projects, is made with micro-volunteering in mind. Each contribution you make is only a few images and decisions, but with the power of many, it can lead to a real-world impact in many vulnerable communities. It’s as easy as scrolling aimlessly through social media, but you can leave your coffee break feeling good about helping a community project!

Why Mapillary?

For years, HeiGIT has researched using Mapillary street-level imagery to improve and enhance OpenStreetMap in ways that would support humanitarian response, urban planning, and progress towards the Sustainable Development Goals. In November 2024, they launched a global dataset identifying whether roads are paved or unpaved, derived from Mapillary street-view images and deep learning.

At the time, only 33% of OpenStreetMap roads had surface data, which can affect transportation safety, emergency routing, business logistics, and rural accessibility. The new dataset improves global coverage to ~36%, representing 36.7 million km of roads, but there’s still work to be done!

HeiGIT’s research supported a dataset based on Mapillary images by applying advanced deep learning techniques to classify global road surfaces as paved or unpaved. Image courtesy of HeiGIT.

When deciding to produce a production-ready version of the Street Level Imagery project on MapSwipe, Mapillary was a clear choice, as it’s already well-integrated into the OpenStreetMap community.

Mapillary is the largest crowdsourcing platform for street-level imagery in the world, with over 2.5 billion street-level images under an open license. Additionally, Mapillary’s APIs are open and come with generous rate limits. Most importantly, Mapillary makes capturing imagery incredibly accessible, offering support for a multitude of cameras and even allowing you to upload imagery directly from your smartphone.
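To give a feel for how approachable the API is, here is a minimal sketch of querying image metadata for a bounding box via Mapillary’s Graph API (v4). The token is a placeholder you would replace with your own client token, and the field names shown are assumptions drawn from the public API documentation, not part of the MapSwipe integration itself.

```python
import json
import urllib.parse
import urllib.request

# Placeholder: substitute a client token from your Mapillary developer account.
MAPILLARY_TOKEN = "MLY|app-id|token"

API_URL = "https://graph.mapillary.com/images"

def build_image_query(bbox, limit=10):
    """Query parameters for an image search inside a bounding box.

    bbox is (min_lon, min_lat, max_lon, max_lat), the order Mapillary expects.
    """
    return {
        "access_token": MAPILLARY_TOKEN,
        "bbox": ",".join(str(c) for c in bbox),
        "fields": "id,captured_at,thumb_1024_url,is_pano",
        "limit": limit,
    }

def fetch_images(bbox, limit=10):
    """Fetch image metadata for a bounding box (needs a valid token and network access)."""
    url = f"{API_URL}?{urllib.parse.urlencode(build_image_query(bbox, limit))}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["data"]

# e.g. fetch_images((39.27, -6.82, 39.29, -6.80)) for a slice of Dar es Salaam
```

A response entry includes the image ID and a thumbnail URL, which is enough to surface the photo in a review workflow like MapSwipe’s.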

The value of using Mapillary and MapSwipe together

Integrating Mapillary into MapSwipe opens a new frontier of opportunity for organizations and communities that collect their own street-level imagery. Mapillary detects over 1,500 traffic sign classes, trained on signage from all over the globe; however, there are cases where niche signage is not classified.

Imagine you were interested in identifying a unique type of street sign, hyperlocal to your community. As a first pass, you could use Mapillary object detections to filter for imagery that contains signage. Next, you could bring this imagery to MapSwipe and have volunteers validate the presence of your desired sign type. With the completed MapSwipe project, you could use the MapSwipe data both as a signal of which assets currently exist and as training data for custom machine learning models.
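The “first pass” described above can be sketched in a few lines: given object-detection records (an image ID plus a detected class value), keep only the images whose detections look like signage, so only those images get queued for MapSwipe validation. The record layout and the class-name prefixes here are illustrative assumptions, not Mapillary’s exact schema.

```python
def images_with_signage(detections,
                        sign_prefixes=("regulatory--", "information--", "warning--")):
    """Return the set of image IDs whose detections include a sign-like class.

    Each detection is a dict with an "image_id" and a "value" (the class name);
    the prefixes are hypothetical stand-ins for sign class families.
    """
    return {
        d["image_id"]
        for d in detections
        if any(d["value"].startswith(p) for p in sign_prefixes)
    }

# Toy detection records: two images contain signs, one contains vegetation.
sample = [
    {"image_id": "100", "value": "regulatory--stop--g1"},
    {"image_id": "101", "value": "nature--vegetation"},
    {"image_id": "102", "value": "warning--curve-left--g1"},
]
print(sorted(images_with_signage(sample)))  # -> ['100', '102']
```

The surviving image IDs would then seed a MapSwipe project, where volunteers confirm whether the specific sign type of interest actually appears.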

An example of a scene that MapSwipe users would swipe on in Dar es Salaam, Tanzania.

A great example of the Mapillary integration in action comes from the OpenMap Development Tanzania (OMDTZ) team. They used GoPro Max cameras mounted on “Bajajs” (three-wheeled motorized vehicles) to collect and upload images, on the order of millions, to Mapillary. Those images are now being used to train machine learning models to detect solid waste, and calling on MapSwipe volunteers to validate the presence of waste is a key part of their workflow. Their results will be used to update OpenStreetMap, helping to inform development and humanitarian efforts.

What’s next with MapSwipe?

MapSwipe has a lot coming up on the horizon! In summer 2025, we plan to launch:

  1. A completeness project type that crowdsources gaps in OpenStreetMap data, and
  2. An object detection project type using machine learning.

By the end of the year, we plan to launch a conflation project type. In this project, volunteers can look at machine learning datasets produced by the Humanitarian OpenStreetMap Team’s fAIr project and compare them to existing OpenStreetMap data.

Prototype of the upcoming Conflation MapSwipe project type - anticipated to launch in late 2025. Image courtesy of the Humanitarian OpenStreetMap Team and HeiGIT.

The MapSwipe governance team is also considering a new project type on MapSwipe where volunteers can help validate machine learning datasets. This validation could improve the model over several iterations. This kind of human-in-the-loop AI map generation could result in MapSwipe becoming an OpenStreetMap editor in its own right. The future is bright for MapSwipe!

Want to get involved?

You (and your community!) can get involved in MapSwipe in several ways.

You can start swiping on street-level imagery in MapSwipe today on our web app! To start, street-level imagery projects are only available for swiping on the web, but stay tuned for the release on our iOS and Android mobile apps.

If your organization is interested in creating a project on MapSwipe, or in getting involved in other ways, such as helping translate the app or site into various languages or contributing to our code base, visit MapSwipe’s Get Involved page to learn how to get started and take the next steps.

Happy MapSwiping!

/Nicole

Nicole is an open mapping strategist focusing on practical innovation, community engagement, and ethical technology. At the Humanitarian OpenStreetMap Team (HOT), she helped secure funding and launch key initiatives, including MapSwipe4Web and the Street-Level Imagery project, expanding how open data supports disaster response and planning. Nicole is an active MapSwipe governance member, a HOT voting member, and a core contributor to Missing Maps London, where she continues to support mapping communities and advocate for accessible, community-driven geospatial tools.

You can find Nicole on GitHub, LinkedIn, and OpenStreetMap; you can also follow along on her website.