Three-dimensional modeling used to be hard. It used to require hardware at least as big as the Xbox Kinect, and to get really high-quality scans[1] you needed high-powered laser sensor systems. Now all you need is your phone and Capture[2].

Capture is a proof of concept from a company called Standard Cyborg, led by Jeff Huber and Garrett Spiegel. These Y Combinator grads have worked at a number of high-profile vision startups and raised $2.4 million in seed funding from folks like Scott Banister, Trevor Blackwell, and Jeff Huber.

They launched the app on December 3 and it’s already making 3D waves. The tool, which uses the iPhone X’s front-facing camera and laser scanning system to create a live color point cloud, can produce 3D models that you can view inside the app or in an AR setting. You can also export them as a USDZ file[3] for use elsewhere. The app is actually a Trojan horse for the company’s other applications, including a programming framework for 3D scanning.
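The article doesn’t detail how Capture builds its point cloud, but the core step in any depth-camera scanner is back-projecting depth pixels into 3D through the camera intrinsics. Below is a minimal sketch of that step in Swift against Apple’s public AVFoundation depth API; the function name and the assumption that `depthData` arrives from an `AVCaptureDepthDataOutput` delegate callback are mine, not Standard Cyborg’s.

```swift
import AVFoundation
import simd

// A minimal sketch (not Capture's actual code): back-project a TrueDepth
// frame into a 3D point cloud using the pinhole camera model.
// Assumes `depthData` came from an AVCaptureDepthDataOutput delegate with
// camera calibration delivery enabled.
func pointCloud(from depthData: AVDepthData) -> [simd_float3] {
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    guard let calibration = converted.cameraCalibrationData else { return [] }

    let map = converted.depthDataMap
    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width    = CVPixelBufferGetWidth(map)
    let height   = CVPixelBufferGetHeight(map)
    let rowBytes = CVPixelBufferGetBytesPerRow(map)
    let base     = CVPixelBufferGetBaseAddress(map)!

    // Intrinsics are reported at a reference resolution; scale them
    // down to the depth map's resolution before using them.
    let refSize = calibration.intrinsicMatrixReferenceDimensions
    let K = calibration.intrinsicMatrix
    let sx = Float(width)  / Float(refSize.width)
    let sy = Float(height) / Float(refSize.height)
    let fx = K[0][0] * sx, fy = K[1][1] * sy   // focal lengths
    let cx = K[2][0] * sx, cy = K[2][1] * sy   // principal point

    var points: [simd_float3] = []
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let z = row[x]                      // depth in meters
            guard z.isFinite, z > 0 else { continue }
            // Pinhole back-projection of pixel (x, y) at depth z.
            points.append(simd_float3((Float(x) - cx) * z / fx,
                                      (Float(y) - cy) * z / fy,
                                      z))
        }
    }
    return points
}
```

A live scanner would then fuse these per-frame clouds across camera poses (and attach per-pixel color from the RGB stream), which is where the heavy math and GPU work the company talks about comes in.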

“We are at the bleeding edge – deploying 3D dense reconstruction and point cloud deep learning on mobile devices,” said Huber. “We package up this core technology for developers, abstracting away all the math and GPU acceleration, and giving them superpowers in just 3 lines of code.”
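The article doesn’t show Standard Cyborg’s actual SDK surface, so the interface below is purely hypothetical (`HypotheticalScanView`, `ScanMesh`, and `onMeshReady` are invented for illustration); it’s only meant to show the shape a three-line integration could take in Swift.

```swift
import UIKit

// Purely hypothetical SDK surface, invented for illustration only;
// the article does not show Standard Cyborg's real API.
protocol ScanMesh { func export(to url: URL) throws }  // e.g. writes a USDZ file
final class HypotheticalScanView: UIView {
    var onMeshReady: ((ScanMesh) -> Void)?              // fires when fusion finishes
}

// What a "three lines of code" integration could look like in a host app:
func startScanning(in host: UIView, saveTo url: URL) {
    let scanView = HypotheticalScanView()
    scanView.onMeshReady = { mesh in try? mesh.export(to: url) }
    host.addSubview(scanView)                           // scanning runs while on screen
}
```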

I’ve tried the app a few times and the resulting scans are still a little iffy. You have to take special care to slowly scan all facets of an object, and if you move mid-scan you can end up with two noses. That said, it’s an amazingly cool use of the iPhone’s powerful front-facing sensors.

“Standard Cyborg is building the API for the...

Read more from our friends at TechCrunch