Published in ymedialabs-innovation

Forget the iPhone 7 headphone controversy. Apple just changed the game in a big way with the iPhone 7 Plus, and no one is talking about it

I’m not going to debate the headphone jack being replaced with the Lightning connector. That’s what everyone else is going to talk about for the next month.

Apple just announced the iPhone 7 Plus; if you want the details, you can read them in our blog post, Apple’s iPhone 7 event. Before I get into why the iPhone 7 Plus is going to change our world, let me explain a few concepts.

Depth Perception

Depth perception is the visual ability to perceive the world in three dimensions (3D) and to judge the distance of objects. We all have this ability with our eyes. If our devices gain this capability, they can create depth maps that can be used for a variety of applications.

Kinect is a device that has this capability, and here is a short two-minute intro to its depth sensing.
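To make the idea concrete, here is a minimal sketch of how depth falls out of two horizontally offset cameras. The formula Z = f · B / d is the standard stereo relation; the focal length, baseline, and disparity values below are hypothetical, not the iPhone 7 Plus’s actual specs.

```python
# Sketch: recovering depth from a stereo camera pair.
# For two cameras separated by a baseline B, a point's apparent
# horizontal shift between the two images (its disparity d, in pixels)
# relates to its depth Z by:  Z = f * B / d
# where f is the focal length in pixels. Closer objects shift more.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Return depth in metres for a given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, 1 cm lens separation.
# An object producing 5 px of disparity would be about 2 m away.
print(depth_from_disparity(5, 1000, 0.01))  # 2.0
```

Note how a short baseline (the two lenses on a phone sit close together) makes disparity small, which is one reason phone depth maps lean on extra processing such as machine learning.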

Motion Tracking

Motion tracking means following a device’s position and movement with high accuracy.

Machine Learning

If you don’t know what machine learning is, then stop right here and do this.

iPhone 7 Plus dual camera system

At the Apple event, we got a glimpse of what the iPhone 7 Plus dual camera system is capable of when taking photographs. We even got to see the depth perception generated by the dual camera, along with a depth map. It was even mentioned that this is done through machine learning.

Here is the depth map shown at the event.

Not only did it separate the subject from the background, it created a depth map that can be used to construct a true stereoscopic image.
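How does a dual camera turn two photos into a depth map in the first place? A classic approach is block matching: for each patch in one image, find the horizontal shift that best aligns it with the other image. This is a toy one-row illustration of that idea, not Apple’s actual pipeline (which, per the event, also involves machine learning).

```python
# Toy block matching on two 1-D image rows (grayscale values):
# for a patch around column x in the left row, find the horizontal
# shift (disparity) that best matches the right row, scored by the
# sum of absolute differences. Real systems match 2-D patches and
# refine to sub-pixel accuracy.

def best_disparity(left, right, x, window=2, max_d=8):
    """Return the shift that best aligns left's patch around x with right."""
    patch = left[x - window : x + window + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(max_d + 1):
        if x - d - window < 0:  # shifted patch would fall off the image
            break
        cand = right[x - d - window : x - d + window + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# A bright feature at column 8 in the left row appears at column 5
# in the right row, so its disparity is 3 (nearer objects shift more).
left  = [0, 0, 0, 0, 0, 0, 0, 0, 9, 0, 0, 0]
right = [0, 0, 0, 0, 0, 9, 0, 0, 0, 0, 0, 0]
print(best_disparity(left, right, 8))  # 3
```

Repeating this for every pixel yields a disparity map, which converts directly into the kind of depth map Apple showed on stage.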

What can be done with this capability you say? Everything!

  • We can identify people in the image
  • We can identify products in the image
  • We can measure the distance of objects
  • We can detect gestures and track hand movement
  • We can construct 3D models of objects

Imagine this capability opened up to all developers. Combined with the power of the iPhone’s sensors (GPS, accelerometer, and gyroscope), which give it motion tracking, we could easily create augmented reality applications. Applications like:

  • Electrical blueprints in your walls: see where each wire runs just by pointing your camera
  • Measure distance just by opening your camera; see if the new sofa will fit through your door
  • See how a piece of furniture would look in your home
  • Capture a true 3D model of your home, along with all its textures
  • Point it at any item and see its ratings live
  • Point it at your plate in the cafeteria and skip the line by paying with Apple Pay
  • Get instructions if your yoga pose isn’t correct

If that isn’t enough, the iPhone 7 Plus, with its wide color gamut and large display, is well suited to become the ultimate VR headset.

Apple has exposed some basic machine learning APIs in iOS 10. With all the news about how Apple is using machine learning, it wouldn’t be a surprise to see these capabilities exposed to developers in the near future.

Why can’t the iPhone 7 do it?

Depth perception requires two cameras, and the iPhone 7 has only one. I would definitely buy the iPhone 7 Plus. It’s the revolutionary device we deserve, even if not the one we need right now.


Developed at Innovation Labs @ Y Media Labs

Thanks to Robbie Abed