Maps iOS 15 – 3D Visual Design and Art Pipelines

It is…

The first major visual change in Maps since the 2012 initial launch. Apple Maps had continuously made underlying data collection, data model, and back-end improvements, but for the most part the cartography and visual design process for the map display stayed the same. Starting in 2016, the wheels were set in motion to step into another dimension.

I did…

I was brought in to build, scale, and then lead a team of about 50 Artists, GIS Editors, and a few Engineers to make sure the cities we were delivering for launch and beyond were up to Apple standards. While my role was clearly that of Project Manager, once things were rolling, and due to the time crunch, I would step in and help with data editing (think ZBrush), Maya modeling, Photoshop work for textures, and general Quality Assurance by visually scanning city data.

Deliverables

I was in charge of the High Zoom track for this overhaul. In addition, my team developed the procedural generation pipeline for creating 3D assets at scale using simple inputs from our data team. Simultaneously, they were developing pipelines for getting much of our map data into tools like Houdini, Maya, Blender, and custom in-house tools. We had our hands on almost all of this new visual design, as there was a lack of 3D knowledge in both the design and cartography teams. Finally, I led the upgrade of our custom 3D landmark pipeline; that work is reviewed in another section of my website.

Results

The details

At the top of my deck I mentioned Maps started in 2012. Nine years after that initial launch, Maps in iOS 15 was our biggest change to date.

Back in 2016, we were challenged by Tim with this simple question.

What are the reasons other than navigation that will get people into the map?

Tim Cook

Context: our map is not a differentiator – we all look pretty much the same.

Yellow highways! Landmark icons! Beige base map! Circle POIs!

Bing, Google, Here, Apple

Which of these things does not belong?

In addition, the foundation for a map built upon our own data, versus TomTom's data, was starting to take shape. But these vans were collecting more data than was required to recreate the existing map. We had tons of data but no story of what to do with it.

Sweet Ride

So, to answer Tim’s question we came up with the Design Tenets.

  • Create a beautiful, detailed map that only Apple could provide.
  • Make it flexible, responsive, and adaptive to user needs.
  • Take full advantage of new van Drive data.
  • Deliver returns on our huge KH investment.

The list of things needed to pull this off was, as you can imagine, enormous. My team, being well versed in 3D, focused on many aspects of not only defining what we could do but HOW we could do it.

Vans were collecting lots of data, but what was inside, and what could be extracted from that raw data? My team worked with the Engineering team to understand what could be extracted and what we in Design would be interested in pursuing. The next step was to simulate a future delivery. It was going to be months before we would receive any usable data, so my team, including yours truly, had to create data by hand, mostly in Maya.

While the vans could collect roads in great detail, buildings would still rely on building footprints and height data for extrusions. We needed a way to get custom landmarks into the world, so we leveraged our landmark data pipeline and brought it to a new level. (More on landmarks in another section.)
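To make the extrusion idea concrete, here is a minimal Python sketch of turning a footprint polygon plus a height value into a simple prism mesh. The function and data shapes are illustrative, not the production pipeline.

```python
# Minimal sketch of footprint extrusion: a 2D building footprint
# plus a height attribute becomes a simple prism mesh.

def extrude_footprint(footprint, height):
    """footprint: list of (x, y) vertices in counter-clockwise order."""
    n = len(footprint)
    # Bottom ring at z=0, top ring at z=height.
    verts = [(x, y, 0.0) for x, y in footprint] + \
            [(x, y, height) for x, y in footprint]
    # One quad per wall segment, indexed into verts.
    walls = [(i, (i + 1) % n, (i + 1) % n + n, i + n) for i in range(n)]
    # Caps are just the two rings (triangulation left to the engine).
    return verts, walls, list(range(n)), list(range(n, 2 * n))

# Example: a 10 m x 6 m footprint extruded to 25 m.
verts, walls, bottom, top = extrude_footprint(
    [(0, 0), (10, 0), (10, 6), (0, 6)], 25.0)
```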

Another totally new concept we introduced to Maps was procedurally generated objects. Custom curation is amazing, BUT imagine hand modeling in Maya every bridge in every city. Multiply that by every tunnel, pedestrian bridge, etc. This does not scale, so we needed to come up with an automated process. This process would take some standard inputs and generate an asset that looked handmade but did not require any human curation. We used this technique for bridges, pedestrian bridges, and tunnels. But we were also successful in creating things like parking garages and Taco Bells!! The team then integrated this process with the rest of our Maps services. It was an amazing accomplishment that we built from the ground up.
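As a rough illustration of the concept (not the actual pipeline, which ran in tools like Houdini), here is a hypothetical Python sketch. The key property is that everything is derived from the standard inputs, so identical data always yields an identical asset with no curation pass.

```python
# Hedged sketch of the procedural idea: standard inputs in, a
# finished-looking asset out, no human curation. Parameters and
# the output format are invented for illustration.

from dataclasses import dataclass

@dataclass
class BridgeInputs:
    span_m: float        # total length of the crossing
    deck_width_m: float  # road width on the deck
    clearance_m: float   # height above water/ground

def generate_bridge(inputs: BridgeInputs):
    # Derive everything from the inputs so the result is
    # deterministic -- the key to generating assets at scale.
    pier_spacing = 30.0
    n_piers = max(0, int(inputs.span_m // pier_spacing) - 1)
    piers = [
        {"x": (i + 1) * inputs.span_m / (n_piers + 1),
         "height": inputs.clearance_m}
        for i in range(n_piers)
    ]
    deck = {"length": inputs.span_m, "width": inputs.deck_width_m,
            "thickness": 0.8}
    return {"deck": deck, "piers": piers}

print(generate_bridge(BridgeInputs(span_m=120.0, deck_width_m=12.0,
                                   clearance_m=8.0)))
```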

Dynamic scaling based on footprint size. Yo Quiero!

With all this data, my team created a POC that demonstrated map flexing. As you can imagine, all this new data can easily overload the map, so we wanted to emphasize, de-emphasize, or even remove items on the map depending on context or user intent. This WebGL demo was a slice of the city and switched between our main map modes of Explore, Transit, and Driving.
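To give a feel for the flexing logic, here is a hypothetical Python sketch. The feature classes, weights, and threshold are invented for illustration and are not taken from the shipping style sheets.

```python
# Sketch of "map flexing": per-mode weights decide whether a feature
# class is emphasized, de-emphasized, or dropped entirely.

MODE_WEIGHTS = {
    "explore": {"poi": 1.0, "transit_line": 0.3, "road_label": 0.5,
                "building": 1.0},
    "transit": {"poi": 0.4, "transit_line": 1.0, "road_label": 0.2,
                "building": 0.6},
    "driving": {"poi": 0.2, "transit_line": 0.0, "road_label": 1.0,
                "building": 0.5},
}

def visible_features(features, mode, min_weight=0.25):
    """Drop or dim features based on the active map mode."""
    weights = MODE_WEIGHTS[mode]
    styled = []
    for f in features:
        w = weights.get(f["class"], 0.0)
        if w >= min_weight:        # below threshold: removed outright
            styled.append({**f, "opacity": w})
    return styled

demo = [{"class": "road_label", "name": "US-101"},
        {"class": "transit_line", "name": "N Judah"}]
print(visible_features(demo, "driving"))  # the transit line is culled
```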

Shrink and Grow

In iOS today, users can toggle Dark Mode on and off. What that does in the Maps application is switch style sheets, tweaking the palette to make the cartography better suited for nighttime driving, for example. Our new custom 3D landmarks had to follow suit, so my team came up with a way to merge the nighttime diffuse maps with emissive maps and pull off scenes like the Ferry Building at night.
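As a rough illustration of that merge, here is a minimal sketch assuming textures as float RGB arrays in [0, 1]. The darken-then-add composite shown is the textbook version of this technique, not necessarily the exact shader math we shipped.

```python
# Night-map merge sketch: darken the daytime diffuse, then add the
# emissive glow (lit windows, signs) on top.

import numpy as np

def night_composite(diffuse, emissive, ambient=0.25):
    """Scale the diffuse down to night levels, then add emission."""
    night = diffuse * ambient + emissive
    return np.clip(night, 0.0, 1.0)

# Toy 1x2-pixel textures: a lit window next to an unlit wall.
diffuse  = np.array([[[0.8, 0.7, 0.6], [0.8, 0.7, 0.6]]])
emissive = np.array([[[0.9, 0.8, 0.3], [0.0, 0.0, 0.0]]])
print(night_composite(diffuse, emissive))
```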

Another big change my team researched and prototyped was bathymetry for the world's oceans and seas. Previously these areas were represented by a flat blue color, but we received bathymetry data and visualized for the client team how we wanted to implement it. We even used techniques to create data where it did not exist, for rivers and lakes for example. In this shot you can see the trench under the Golden Gate Bridge.
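For a sense of how depth data drives the shading, here is an illustrative sketch mapping depth to a shallow-to-deep color ramp. The colors, depths, and ramp shape are made up for illustration.

```python
# Bathymetry shading sketch: replace flat blue with a depth-driven ramp.

import numpy as np

SHALLOW = np.array([0.55, 0.78, 0.88])  # near-shore light blue
DEEP    = np.array([0.02, 0.12, 0.35])  # trench dark blue

def shade_depth(depth_m, max_depth_m=100.0):
    """Blend from shallow to deep color; depth in meters."""
    t = np.clip(depth_m / max_depth_m, 0.0, 1.0) ** 0.5  # bias to shallows
    return (1 - t) * SHALLOW + t * DEEP

for d in (2.0, 20.0, 110.0):   # e.g., the channel under the bridge
    print(d, shade_depth(d))
```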

Finally, we helped with getting finer details into the map, like crosswalks/scrambles and grass on sports fields, including golf courses! Our product manager was thrilled…

The majority of the work my team was in charge of was pre-visualization of the various data elements we now had at our disposal. Even if we could get all this detail, did we want to render it?

Camera focal lengths and their impact on crowdedness.

Roof types and explorations in residential areas.

Building details, trees and parking lot data

We even considered special camera angles for tourism searches, delightful treatments, tilt-shift/depth of field, atmospheric touches, and animations, all to make this new “map” come to life.

Now, everything I just presented was nice but not real. What I mean is, it was pre-rendered offline in tools like Maya and Redshift. Our success would be defined by what we could actually pull off in our client/engine.

Below are some of our first tests with real data and real lighting, running on an iPad early on. Granted, that iPad was a little hot to the touch, but the results speak for themselves.