Commit 47045988 authored by Tobias Due Munk

Add README.md
# Into the Deep
## or what would the 80s have done with depth sensing technology?
Into the Deep is a conference talk given by [Tobias Due Munk](https://twitter.com/tobiasdm) at various mobile app developer conferences in 2018. The talk is interactive, and the "slides" are powered by an iPhone X/XS running a custom-built app.
### Pitch for the talk
Mobile has always been driven forward by a plethora of sensors like GPS, motion, and camera. With the advent of depth sensing technologies, new possibilities open up for apps.
Watch and learn intricate details and loads of use cases through an immersive and interactive talk packed with 80s visuals and audio.
### Recordings
- [17 April 2018](https://www.youtube.com/watch?v=SCOLRVVRDJk) – [AppBuilders](https://appbuilders.ch) – Lugano, Switzerland
- [5 July 2018](https://www.youtube.com/watch?v=XfEBtxhhI0g) – [ADDC, App Developer & Designer Conference](http://addconf.com/) – Barcelona, Spain
- 5 October 2018 – [MobiConf](http://mobilconf.org) – Krakow, Poland
- [1 November 2018](https://vimeo.com/299042781) – [MobileEra](http://mobileera.rocks) – Oslo, Norway
### Code
This repo includes all the code and commit history for the work on this talk.
- Open `Into the deep.xcodeproj` in Xcode.
- Build and run it on an iOS device with a depth sensor (i.e. one that has Face ID).
- Connect an external screen to the device using AirPlay screen mirroring or a Lightning-to-HDMI adapter.
The iOS device should now show speaker notes and controls for each slide, and the external screen will show the content of the slides.
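The two-screen setup described above can be sketched in Swift. This is a hypothetical illustration, not code from this repo: it checks for a TrueDepth camera via ARKit (devices with Face ID are exactly those that support face tracking) and drives a second `UIWindow` when an external screen connects, so the device keeps the speaker notes while the external display shows the slides. The type `SlideDeckCoordinator` and its methods are invented names for this sketch.

```swift
import UIKit
import ARKit

// Hypothetical sketch: detect depth-sensing support and render slides
// on an external screen while the device shows speaker notes.
final class SlideDeckCoordinator {
    /// Face ID devices have a TrueDepth camera, which ARKit
    /// exposes through face-tracking support.
    var hasDepthSensor: Bool {
        ARFaceTrackingConfiguration.isSupported
    }

    private var externalWindow: UIWindow?

    func startObservingScreens() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil,
            queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.showSlides(on: screen)
        }
    }

    private func showSlides(on screen: UIScreen) {
        // A dedicated window on the external screen replaces plain mirroring,
        // so the device's own screen is free for notes and slide controls.
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIViewController() // slide content goes here
        window.isHidden = false
        externalWindow = window
    }
}
```

With this approach, AirPlay mirroring or an HDMI adapter both surface as a `UIScreen` connection, so the same code path handles either setup step.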