Episode 120: Learn how 3-D sensors work before Apple puts them in the iPhone

13/07/2017 52 min

Listen "Episode 120: Learn how 3-D sensors work before Apple puts them in the iPhone"

Episode Synopsis

What did you buy for Prime Day this week? This week we tackle whether Amazon’s new program to help folks install Alexa-enabled devices is a big deal, along with Apple’s retail plans for HomeKit. We also discuss fashion-forward wearables and a new startup called Nodle that’s trying to create crowdsourced, Bluetooth-based IoT networks. We have a lot of data on voice thanks to IFTTT, and we spend some time discussing a friendly French IoT company.
Lighthouse combines machine learning, natural language processing and computer vision to create an assistant for your home that can see, hear and speak.
Then I chat with Alex Teichman about Lighthouse, his new startup that marries computer vision with a voice-based personal assistant to make your life easier. For the nerds out there, we also discuss the categories of sensors available for 3-D sensing and how they differ. This matters for Lighthouse, for self-driving cars, and maybe even for the next-generation iPhone. Get ready to cover everything from recurrent neural networks to frickin’ lasers!
Hosts: Stacey Higginbotham and Kevin Tofel
Guest: Alex Teichman co-founder and CEO at Lighthouse
Sponsors: Schlage and Affiliated Monitoring

Do you need a Mother? It’s on sale.
Can Apple build the right showroom to sell the smart home?
Louis Vuitton gets into wearables
How to use 3-D sensing to make computers see more
How Apple may choose to use 3-D sensors to unlock phones
