What would you say if I told you that Niantic’s greatest asset isn’t the multi-million-dollar machine we all know and love as Pokémon GO, but rather an ingenious software system that constantly learns and adapts to the world around us? This is a story about the Niantic Real World Platform (NRWP), an AR system that has been in development for more than 7 years and that (almost) nobody is talking about.
- March 2019 – Wizards Unite confirmed to use NRWP
The brief history of Niantic Labs
Niantic Labs was formed in 2010 as an internal startup within Google, aimed at following up on the company’s previous work in the space. With a team whose expertise was deeply entrenched in mapping, 3D modeling and map interaction, Niantic set out to create a combination of maps and games that would entice people to explore the world around them.
Their first foray into the genre was Field Trip, an innocent-looking app that enabled users to discover unique, cool and hidden locations around them. Field Trip was a far cry from NRWP, but consider this quote from Today.com and keep in mind we’re talking about the year 2012:
“When you download the app to your iPhone or Android, it will run in the background of your device. You tell “Field Trip” about your interests, and it will then prioritize suggestions of nearby locations of interest. As you approach something the app thinks you’ll find interesting, it will launch a pop-up with details about the suggested location.”
Field Trip was a powerful recommendation engine, but a limited one as well. You see, Field Trip was powered by publications such as Zagat, Thrillist and TimeOut, rather than by data provided by its users. Everything that Field Trip suggested was pre-picked by someone else, someone who often wasn’t native to that area.
Field Trip’s recommendation engine was quickly followed up by Ingress, one of the first location-based mobile games to blend augmented reality with a mysterious science-fantasy setting. Ingress was and is the core of Niantic’s technological growth. Incidentally, Ingress is also where our story begins.
Stranger in a strange land
Niantic launched Ingress in November 2012, backed by an intriguing mystery plot that connected the real world to the online world at last. Growing in popularity via the Google+ social network, Ingress spread like wildfire among tech geeks and early smartphone gamers.
Initially, Ingress had the same problem as Field Trip: how do you populate the real-world map with actual Points of Interest? Niantic tried to solve the problem by reaching into the Historical Marker Database and the map of national Post Offices, but as older players remember, it was not good. It was not good at all. You see, Niantic created most portals (Ingress POIs) using a curated data set with limited geographical reach. Niantic needed a better solution for this issue, and indeed, they delivered.
On March 14, 2013, Niantic released Ingress 1.21.3, the first version of Ingress to introduce Portal submissions. For the first time ever, players were able to influence the in-game map by submitting local highlights, attractions, and other unique places in their area. Those places would eventually become actionable POIs inside the world of Ingress, and players would be able to interact with them and further improve their gaming experience.
Players raved about the feature, and from this moment onward, Niantic never had to worry about collecting, scraping or buying geographical data – the players were providing it themselves simply by participating in the gameplay loops designed by Niantic’s ingenious developers.
Ingress Portal submission was released six years ago, and players are still submitting large numbers of Portal candidates in an effort to improve the game’s world map across the globe. Inadvertently, six years of Portal submissions have provided Niantic with the world’s largest user-generated map of unique places, local highlights and everything else that fits the Ingress Portal requirements.
This data set, humbly referred to as “POI DATA”, is one of the pillars of Niantic’s Real World Platform.
With POI data solved, Niantic set out to diversify and learn more about the real world in the context of AR gaming. It’s still not clear which data sets were actually used, but in the following years Niantic incorporated cellular data and player-concentration data, and even used OpenStreetMap to improve the map.
Analysis of Niantic’s Real World Platform
Ingress is where the story began, but it was with Pokémon GO that the Platform started shaping up as a product. Pokémon GO’s release was rocky at best, plagued with server problems, scaling challenges and 24/7 fire drills for the ops team. The game took months to stabilize, but around March 2017 Niantic was ready to start working on the Real World Platform again.
It is not clear when the decision was made to turn the Platform into a standalone product, but what we know for sure is that Niantic is planning to release it to third-party developers soon.
Additionally, Niantic has recently shared a high-level illustration of the platform (source). For transparency’s sake, we’re sharing the diagram in full:
As you may expect, the Real World Platform is comprised of several key areas and data sets populated and consumed by various parties. In the rest of this article, we’ll cover the fundamental parts of the Platform and explain how each of them works in the grand scheme of things.
Server runtime 🌍
It all starts here. Players use the game client and provide POI, AR and action data, all of which are stored in the server runtime layer. The data is not just passively stored – it’s used to power various connected services that support Niantic’s partners, marketing and finance departments. These services include:
- Live events
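To make that data flow concrete, here is a minimal sketch in Java (the server-side language Niantic’s contest materials mention). Everything in it – class names, event fields, the in-memory log – is our own invention, not Niantic’s actual runtime; it only illustrates the pattern of ingesting player action data and serving aggregates to downstream services.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a server runtime layer: store player action
// events, expose simple aggregates to connected services.
class ServerRuntimeSketch {

    // One player action: who did what, where, and when.
    record ActionEvent(String playerId, String action,
                       double lat, double lng, Instant at) {}

    private final List<ActionEvent> eventLog = new ArrayList<>();

    // Ingest an incoming event so connected services (analytics,
    // anti-cheat, live events) can consume it later.
    void ingest(ActionEvent e) {
        eventLog.add(e);
    }

    // The kind of per-player aggregate a marketing or finance
    // service might query.
    long countByPlayer(String playerId) {
        return eventLog.stream()
                       .filter(e -> e.playerId().equals(playerId))
                       .count();
    }
}
```

The point of the sketch is the shape of the pipeline: the data is written once by the game client and then read many times by very different consumers.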
Some subsystems are smarter than others. For example, take a look at the “anti-cheat security” subsystem. Niantic’s anti-cheat mechanisms are powered by machine learning algorithms that gobble up all of the player action data and flag potentially malicious activities for further investigation.
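We can only guess at the features such models consume, but one classic signal in location-based games is physically impossible movement. Purely as an illustration – the rule, the names and the 150 km/h threshold are our own, far simpler than any ML model – here is what a speed-based spoofing check could look like:

```java
// Hypothetical anti-cheat signal: flag players whose reported
// movement between two actions implies an impossible speed
// (GPS spoofing / "teleporting"). Threshold is invented.
class SpeedCheck {
    static final double EARTH_RADIUS_KM = 6371.0;
    static final double MAX_PLAUSIBLE_KMH = 150.0;

    // Great-circle distance between two lat/lng points (haversine).
    static double distanceKm(double lat1, double lng1,
                             double lat2, double lng2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1))
                 * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
    }

    // True if the implied speed between two actions is implausible.
    static boolean isSuspicious(double lat1, double lng1, long t1Sec,
                                double lat2, double lng2, long t2Sec) {
        double hours = Math.max(t2Sec - t1Sec, 1) / 3600.0;
        return distanceKm(lat1, lng1, lat2, lng2) / hours > MAX_PLAUSIBLE_KMH;
    }
}
```

A real system would feed dozens of such signals into a model rather than hard-coding one rule, but the underlying idea – derive features from the action stream, flag outliers – is the same.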
Another notable subsystem is called “social”, and it allows Niantic to segment the player base, act on player activity and reach players down the line, usually through platform-specific push notifications and in-game news.
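What might such segmentation look like in practice? As a hedged sketch – the fields and criteria below are our own assumptions, not Niantic’s – selecting lapsed players in a region for a re-engagement push could be as simple as:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical player segmentation for targeted notifications.
class SocialSegmentation {
    record Player(String id, String region, Instant lastActive) {}

    // Players in `region` who have been inactive for `days` or more:
    // the kind of segment a re-engagement push might target.
    static List<String> lapsedIn(List<Player> players, String region,
                                 int days, Instant now) {
        return players.stream()
                .filter(p -> p.region().equals(region))
                .filter(p -> ChronoUnit.DAYS.between(p.lastActive(), now) >= days)
                .map(Player::id)
                .collect(Collectors.toList());
    }
}
```

The resulting ID list would then be handed to a delivery layer (APNs, Firebase Cloud Messaging, in-game news) rather than contacted directly.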
Other parts of the Platform are more technical in nature. AR cloud and spatial serving allow for progressive loading of the AR layer and for anchoring persistent, AR-specific objects to real-world locations.
Our seasoned readers will remember that Niantic bought Escher Reality in March 2018. Escher Reality was a technology company built around providing shared multiplayer experiences in AR, but also around providing a persistent storage layer for AR objects in the real world. Want to hide a portkey in a specific locker in a public location and have it visible only to players who crack the locker’s passcode? No problem, Niantic’s AR cloud has you covered.
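To make the portkey example concrete, here is a toy model of a persistent, access-gated AR object store. Everything in it – the names, the naive `hashCode`-based gating – is our own invention and vastly simpler than a real AR cloud; it only shows the idea of an object that exists at a location but is revealed conditionally.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical persistent AR object store with per-object gating.
class ArCloudSketch {
    record ArObject(String name, double lat, double lng, int passcodeHash) {}

    private final Map<String, ArObject> objects = new HashMap<>();

    // Persist an AR object at a real-world location.
    void place(String id, ArObject obj) {
        objects.put(id, obj);
    }

    // The object is only revealed to a client presenting the right
    // passcode; otherwise it stays invisible.
    Optional<ArObject> reveal(String id, String passcode) {
        ArObject obj = objects.get(id);
        if (obj == null || obj.passcodeHash() != passcode.hashCode()) {
            return Optional.empty();
        }
        return Optional.of(obj);
    }
}
```

A production system would also need centimeter-level anchoring of the object to the physical scene – which is exactly the hard computer-vision problem Escher Reality was working on.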
These systems are core pillars of the RWP’s future and they are powering Niantic’s entire product vision:
Throughout the past year, we have made strategic investments in initiatives focused on AR mapping and computer vision. Recently, we announced the acquisition of Escher Reality, who are contributing to our “Planet Scale AR” efforts. And today, we are announcing that we have acquired the computer vision and machine learning company Matrix Mill, and have established our first London office. It’s through the coordination of these teams that we’ve been able to establish what the Niantic Real World Platform looks like today, and what it will be in the future.
Game client 📱
If you pick up Pokémon GO and put it side by side with Ingress Prime, you will notice an eerie similarity. That’s not by chance, as Niantic’s modern game clients are all based around a shared native plugin layer and follow similar design principles.
Each client has access to the same high-level map loading, player and account management solutions. Bundled with those are lower-level libraries that have been in development for years: GPS handling, authentication, caching, scheduling and cell management. Don’t believe me? Not a problem – read our Ingress Prime APK teardown for an in-depth analysis of everything shared between Pokémon GO and Ingress Prime.
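Of these, “cell management” is worth a closer look: Niantic’s games are widely reported to partition the globe into spatial cells (Google’s S2 library) so map data can be loaded and cached per cell. As a drastically simplified stand-in – the 1-degree grid and the names are our own invention, and S2 cells are hierarchical and far more sophisticated – the core idea looks like this:

```java
// Toy spatial-cell indexing: bucket coordinates into a fixed
// 1-degree grid so nearby POIs and players share a cache key.
// Real systems (e.g. Google's S2 library) use hierarchical cells
// on a sphere; this only illustrates the idea.
class CellSketch {

    // Identify the grid cell containing a lat/lng point.
    static String cellId(double lat, double lng) {
        long row = (long) Math.floor(lat + 90.0);   // 0..179
        long col = (long) Math.floor(lng + 180.0);  // 0..359
        return row + ":" + col;
    }
}
```

With a scheme like this, “load the map around the player” becomes “fetch the player’s cell and its neighbors”, which is what makes progressive map loading cheap.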
The Real World Platform doesn’t shy away from demonstrating this, as it’s only natural for businesses to reuse and generalize the solutions they once built for a specific need. Nothing wrong with some code sharing, right? When I did my first foray into Ingress Prime’s code base, I remember finding the following quite amusing:
- Unity3D powers the game engine, UI interactions, asset management and player interactions, while Protobuf and RPC are used to perform server-side communication. All of this applies to Pokémon GO as well.
- Both Ingress Prime and Pokémon GO use Niantic’s shared platform code.
- Both games use the same third-party libraries: Zenject (a dependency injection framework for Unity), Google Protobuf (lightweight, schema-based communication) and Firebase (fast key-value storage)
We expect that Wizards Unite will follow suit as well, if the client app is indeed developed by Niantic. If it’s not, then we’re in wild-west territory.
Of course, Niantic isn’t just copying the systems they previously built – they are enhancing them over time. If you were impressed by Pokemon GO’s AR features, take a look at a relatively recent tech demo that features real world AR occlusion:
Technology behind the Real World Platform
I admit: I have been deeply impressed by Niantic’s Real World Platform for a while now. However, there are still some unanswered questions in the back of my mind. How ready is it? How do you interact with it? What languages do you need to know in order to use it?
In order to answer these questions, we’ve analyzed Niantic’s recently published developer contest website. The website is basically a contest listing, but it does reveal a lot about programming languages and solutions used in Niantic’s Real World Platform.
The following bullet points were extracted from various places on the website:
- The Niantic Real World Platform requires both Unity development (client side) and Java server development experience (server side)
- Finalists will have access to the Niantic Real World AR and geospatial development kits with Unity and Java API support in order to build their projects
- […] finalists will also have access to thorough documentation and samples to ensure their questions about the software are answered
- All selected teams are required to furnish their own Mac-based hardware for development. Additionally, teams must be able to provide their own mobile hardware (with built-in ARKit or ARCore support) for playtesting.
- […] will need a separate GitLab instance stood up, which requires VPN and Maven credentials
Frankly speaking, this looks great from a techie’s perspective. The Real World Platform has bindings and APIs in popular languages (Java and C#), good documentation and uses a modern build system: Maven. It checks all of the relevant boxes in the world of modern software development.
For more details about how these technologies work together when it comes to mapping Earth in AR, we suggest you watch this video featuring Niantic’s AR mapping lead:
One of the key technologies coming into the Real World Platform is Codename: Neon, a multiplayer AR demo showcased during the MWC19 conference:
Come to “Codename: Neon” and try gaming on a new level – #Telekom, @NianticLabs, @Samsung and @mobiledgex present the performance strength of edge computing technology at #MWC19 at the #Telekom booth in hall 3 pic.twitter.com/mKvrBKKhA4
— Stephan Broszio (@StBroszioDT) February 25, 2019
Wrapping it up
We hope you enjoyed this long-form article – it took us a lot of time to research and discover all of the facts and figures behind the Platform. Niantic has really built something special with the Real World Platform and it’s not surprising that Warner Bros. decided to go with them as the development studio for Harry Potter: Wizards Unite.