ZOMFG WHEN WILL THIS GUY JUST HYPERVENTILATE HIMSELF INTO SHUTTING UP
It's not that hard.
Google bought SketchUp in 2006. They sold it in 2012. In between, they gave everyone the tools to build whatever they wanted and put it on Google Earth. They created a marketplace of objects that had geolocation. And they gave you magic internet points for creating 3D buildings on their maps in a race to see who could mechanical-turk a 3D version of the world for them first.
That's an entire ecosystem of geospatial data that millions of fans were slavishly, exactingly interpreting for them in 3D. Then Bing rolled out sidescan imagery and Apple rolled out 3D maps to beat Google to the punch because Google wasn't ready (note that Bing's 3D imagery wasn't useful for navigation and Apple's 3D maps were painfully, dangerously wrong). Now? Now Google has sidescan from four directions, billions of data points demonstrating how humans interpret a 2D image into a 3D object, and Street View to train its maps on. Commercial districts? Bitch, please. Google used to give that stuff to you in Google Earth Pro. The databases exist and aren't even Google's. Google knows exactly who owns that parcel; so do you, if you look it up. Google is just faster and has deeper pockets. So Google is synthesizing:
- overhead aerial imagery
- oblique aerial imagery
- hand-assembled solid construction based on human interaction (exemplars and data)
- parcel data
- ownership data
But wait, there's more. Because this is Google, and because every time you use it to look up an address and go there with your phone, it also has
- ground truth verification of GPS coordinates
- travel paths
- perimeter verification
But wait. If you're using your phone to take pictures there, it also has
- imagery within
- imagery without
That's not a "moat"; that's data synthesis. The only thing Apple doesn't have is the hand-pieced stuff that allows Google to verify they got it right. And the ground truth. Here's the only important sentence in this entire article:
They didn't have to. That's the truly important thing here. Google came through and got the exemplars they needed to make their maps reliable and accurate, and they did it by hand. Once they had an accurate and reliable map, they could just train their algorithms until the algorithms recreated ground truth accurately enough that they could synthesize without needing to collect.
Because now it's just data and math. Now it's just a matter of taking the points they have, running them through a blender and confidently creating a synthetic universe to match the real one.
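That "train until the algorithm recreates ground truth" step is just supervised learning against hand-collected exemplars. A toy sketch of the idea in Python (every name and number below is fabricated for illustration): fit a model mapping an aerial-image cue to a hand-surveyed value, check it against held-out ground truth, and only then synthesize values for places nobody ever visited.

```python
# Toy version of "collect ground truth by hand, then synthesize":
# fit a 1-D linear model mapping an aerial-image cue (shadow length)
# to building height, validate on held-out hand-measured exemplars,
# then predict heights for buildings no one ever surveyed.
# All data here is fabricated for illustration.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hand-measured exemplars: (shadow length in px, surveyed height in m)
train = [(10, 5.1), (20, 10.2), (30, 14.8), (40, 20.1)]
holdout = [(25, 12.4), (35, 17.6)]   # ground truth kept for validation

a, b = fit_line([s for s, _ in train], [h for _, h in train])

# Only trust synthesis once held-out error against ground truth is small.
max_err = max(abs(a * s + b - h) for s, h in holdout)
assert max_err < 1.0, "model not ready to replace collection"

# Now "synthesize without needing to collect": estimate unseen buildings.
for shadow in (15, 50):
    print(f"shadow {shadow}px -> est. height {a * shadow + b:.1f} m")
```

The punchline is the assert: the hand-collected holdout set is what lets you know the synthesis is trustworthy, which is exactly the piece the comment argues Apple never built.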
Not mentioned in this article: Google's maps are years ahead, sure, but everything they've done after the Street View stuff is just processing. Apple could totally do that. But Apple hasn't rolled a bunch of cars everywhere to verify ground truth. So Apple's synthesis will never be as valuable as Google's, because they have no good way to verify it.
Apple created Apple Maps because they didn't want to be squeezed out by Google. They didn't want to have to pay Google to get Apple customers where they were going. They didn't want to be in a position where Google could go "who run Bartertown" and pull Google Maps access from the iOS ecosystem.
But they lost.
It's been six years since Apple and Google got into a pissing match. Google has since updated their app several times, and Apple Maps is still also-ran bad. Apple could try to catch up, but they aren't.
Google used to have a head start. Now? Now they've just got better algorithms. Consider: this is the company that clobbered hand-compiled Yahoo Search by writing a better algorithm... and it won the mapping war by driving around taking pictures. Google Street View was the antithesis of Google thinking. Slamming a dozen databases together and creating a Tron's-eye-view of the world? That's the Google we all know and love.
Now that they've been in the world and checked it, they don't have to go there ever again... at least until their self-driving cars go through and LIDAR everything to half an inch.