
Google ARCore software development kit announced today

Google’s new ARCore software development kit (SDK), announced today, makes augmented reality on Android much more widely available.

To be clear, ARCore is not the same thing as Tango, Google’s other augmented reality project. While Tango requires specialized hardware such as depth sensors and cameras, ARCore, much like Apple’s ARKit, requires nothing more than your phone.

Google has similarly been experimenting with smartphone AR since it first showed off Project Tango to the world in 2014. Three years later the company has some great technology to show off as a result, but very little in the way of actual users. Today, the company is signaling a bit of a reactionary shift in its strategy by releasing a developer preview of ARCore, a platform that will deliver AR capabilities to Android smartphones at a scale Tango was never able to reach.

What was Tango?

Project Tango dropped the “Project” part of its name in June of last year to signal its exit from beta, but it never seems to have reached far beyond the experimental. For his part, Google AR/VR boss Clay Bavor says he has seen Tango as a way to validate the use cases of AR.

“Our goal with Tango was really to prove out the core technology and show the world that it’s possible,” Bavor told TechCrunch. “Obviously others have started to invest in smartphone AR, our goal with Tango has always been to drive that capability into as many devices as possible.”

And while we’re still waiting for iOS 11, and thus, ARKit, to make its public debut, ARCore is available right now. Starting today, developers can use ARCore on the Pixel and Samsung’s Galaxy S8, as long as they’re running Android 7.0 Nougat or above. Eventually, Google hopes for ARCore to run on millions more Android devices from manufacturers like Samsung, Huawei, LG and ASUS.

Like ARKit, ARCore works with Java/OpenGL, Unity and Unreal, and it will deliver on three features: motion tracking (it uses the phone’s camera to detect your position in the room), environmental understanding (so it can detect horizontal surfaces) and light estimation (so that the lighting and shadow of virtual objects match your surroundings).
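To make the surface-detection idea more concrete, here is a toy Java sketch. This is not ARCore’s actual algorithm and the class and method names are invented for illustration; it simply shows how a horizontal surface such as a table could be inferred from many feature points sitting at roughly the same height:

```java
import java.util.*;

// Toy sketch of environmental understanding: given the heights (y values)
// of tracked 3-D feature points, find a horizontal surface by bucketing
// points into height bins and picking the best-supported bin.
public class PlaneSketch {
    /** Return the estimated height of the dominant horizontal surface,
     *  using height bins of the given size (metres). */
    static double dominantPlaneHeight(double[] heights, double binSize) {
        Map<Long, Integer> bins = new HashMap<>();
        for (double y : heights) {
            long bin = Math.round(y / binSize); // quantize height into a bin
            bins.merge(bin, 1, Integer::sum);   // count points per bin
        }
        long best = Collections.max(bins.entrySet(),
                Map.Entry.comparingByValue()).getKey();
        return best * binSize;                  // centre height of best bin
    }

    public static void main(String[] args) {
        // Feature-point heights: most cluster near a 0.75 m table top,
        // with stray points on the floor and near the ceiling lamp.
        double[] ys = {0.74, 0.75, 0.76, 0.75, 0.74, 0.01, 1.90};
        System.out.println(dominantPlaneHeight(ys, 0.05));
    }
}
```

A real system would fit plane extents and orientation as well, but the core intuition, consensus among co-planar feature points, is the same.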

What’s special about Google ARCore?

ARCore focuses on three things:

  • Motion tracking: Using the phone’s camera to observe feature points in the room, combined with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves, so virtual objects remain accurately placed.
  • Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
  • Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
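The pose math behind the motion-tracking point can be sketched in a few lines of Java. This is not the ARCore API; the class and method are invented for illustration, and it works in 2-D for brevity:

```java
// Illustrative sketch of the pose math behind motion tracking: once the
// camera's position and yaw are known (from feature points plus IMU data,
// as described above), a virtual object anchored in world coordinates can
// be re-expressed in camera coordinates every frame, which is what makes
// it appear fixed in the room.
public class PoseSketch {
    /** Transform a world-space point into camera space, given camera yaw
     *  (radians) and camera position (x, z) on the ground plane. */
    static double[] worldToCamera(double yaw, double camX, double camZ,
                                  double px, double pz) {
        double dx = px - camX, dz = pz - camZ;         // move to camera origin
        double c = Math.cos(-yaw), s = Math.sin(-yaw); // rotate by inverse yaw
        return new double[] { c * dx - s * dz, s * dx + c * dz };
    }

    public static void main(String[] args) {
        // An anchor placed 2 m in front of the camera at startup.
        double[] before = worldToCamera(0, 0, 0, 0, 2);
        // After the user sidesteps 1 m to the right, the same world point
        // shifts left in camera space, so the object stays visually pinned.
        double[] after = worldToCamera(0, 1, 0, 0, 2);
        System.out.printf("before: (%.1f, %.1f) after: (%.1f, %.1f)%n",
                before[0], before[1], after[0], after[1]);
    }
}
```

In practice the pose is a full 3-D rotation and translation updated every frame, but the transform chain is the same idea.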




Google is also showcasing a few semi-interactive tricks built on these components. In a simple demo app, you can set a little Android mascot down in a virtual forest, where it’ll wave when you hold your phone up to its face.
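To make the light-estimation idea concrete, here is a small Java sketch. The class and method names are invented; in ARCore the ambient intensity would be estimated from the camera image, whereas here it is simply a parameter:

```java
// Sketch of light estimation's effect on rendering: scale a virtual
// object's base colour by the estimated ambient intensity, so an object
// rendered in a dim room looks dim rather than glowing unnaturally.
public class LightSketch {
    /** Multiply an sRGB colour (0-255 per channel) by an ambient intensity
     *  in [0, 1], clamping the intensity to that range first. */
    static int[] shade(int r, int g, int b, double intensity) {
        double k = Math.max(0.0, Math.min(1.0, intensity));
        return new int[] {
            (int) Math.round(r * k),
            (int) Math.round(g * k),
            (int) Math.round(b * k)
        };
    }

    public static void main(String[] args) {
        // A dim room (intensity 0.5) halves the object's colour.
        int[] dim = shade(200, 180, 160, 0.5);
        System.out.printf("dim room: (%d, %d, %d)%n", dim[0], dim[1], dim[2]);
    }
}
```

A full renderer would also estimate light direction for shadows; this only captures the overall brightness match the article describes.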

ARCore, Google’s equivalent to Apple’s ARKit, is a built-in AR platform for app makers. It is available now on Google Pixel and Samsung Galaxy S8 phones, and Google hopes it will run on 100 million phones by this winter, which could expand the audience for Google AR apps significantly. Google is also working on two experimental AR web browsers: one that will use ARCore, and one that will run on iOS and support ARKit.

What if ARCore meets Google Lens?

Google’s augmented reality program could also intersect with its push for visual search. One of the ARCore team’s members is Jon Wiley, formerly the lead designer of Google Search. Now the company’s director of immersive design, he thinks combining ARCore with a visual search tool like Google Lens could pull human-computer interaction more toward the “human” side of the spectrum. If smartphones are going to follow our thought processes and not the other way around, they need to see the world like we do, Wiley says. “Getting the phone and getting the real world to line up is an incredible technical challenge, but it also offers the opportunity to have a much more intuitive interface.”




For an example of how this might work, imagine searching for instructions — say, a guide to that complicated espresso machine — by showing Google a picture of the object. Visual search could identify it automatically, and augmented reality could offer an overlay of instructions, instead of a link to a YouTube video or written manual. “We’re working very closely with the Google Lens team, and I see ARCore as one of the many ingredients that will go into experiences like Lens,” says Bavor. “Not anything to announce on that right now, but let’s just say we think ARCore is going to make all that stuff more interesting, more powerful, and more useful for people.”

Since we are seeing such a large change in software, we can surely expect the hardware to change too. There is a good chance that Google’s next Pixel phone will be designed so that its hardware supports this next generation of software and projects.

Though ARCore, Google’s augmented reality platform, is a long-term project, hardware compatibility is a must for testing and development.
