Droidcon Bangkok - Day One

I just got back from the first day of Droidcon Bangkok. I was going to wait until the end of the conference to write anything about it, but there was enough interesting stuff on day one to justify a post of its own, so here it is.

Android and Tuk Tuk

Google is moving from Mobile First to AI First, something they have been talking about for a while now, and the keynote covered this as well. TensorFlow is a big part of it, as are backend services that provide high level interfaces to AI processes (Cloud Vision and Cloud Speech are two good examples). Mobile isn't being forgotten though - TensorFlow can be used there as well.
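To make the "high level interfaces" point concrete, here is a rough sketch of what a Cloud Vision label detection call looks like from Kotlin against the public REST endpoint - the API key, the image bytes and the labelImage function itself are placeholders for the example, and error handling is left out:

```kotlin
import java.net.HttpURLConnection
import java.net.URL
import java.util.Base64

// Hypothetical helper: posts an image to Cloud Vision and returns the raw
// JSON response, which contains a labelAnnotations array of description and
// score pairs. Parsing and error handling are left out of this sketch.
fun labelImage(imageBytes: ByteArray, apiKey: String): String {
    val body = """
        {"requests":[{
          "image":{"content":"${Base64.getEncoder().encodeToString(imageBytes)}"},
          "features":[{"type":"LABEL_DETECTION","maxResults":5}]
        }]}
    """.trimIndent()

    val url = URL("https://vision.googleapis.com/v1/images:annotate?key=$apiKey")
    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }

    return conn.inputStream.bufferedReader().readText()
}
```

The appeal is that all the heavy lifting (the trained models, the serving infrastructure) sits on Google's side; the client just posts an image and reads back labels.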

Firebase Logo

Another item that came up a lot during the first day was Firebase, which is being positioned as the preferred way to set up infrastructure-free backends for mobile apps. It provides a set of APIs to manage hosting, storage, messaging, and many other functions (all backed by Google's cloud infrastructure, of course). I haven't had a chance to play with it yet but it does seem like an easy entry point to cloud-based computing - the free tier has pretty generous quotas, so setting up a sample application wouldn't involve any additional cost.
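To give an idea of what "infrastructure free" means in practice, here is a small sketch of writing and watching a value with the Realtime Database part of the Android SDK - the readings/outside path and the Reading class are invented for the example, and the project setup (the usual google-services.json) isn't shown:

```kotlin
import com.google.firebase.database.DataSnapshot
import com.google.firebase.database.DatabaseError
import com.google.firebase.database.FirebaseDatabase
import com.google.firebase.database.ValueEventListener

// Example model class (made up for this sketch); the default values give it
// a no-arg constructor so Firebase can deserialize it from a DataSnapshot.
data class Reading(val celsius: Double = 0.0, val timestamp: Long = 0)

fun publishReading(reading: Reading) {
    // Writes are queued locally and synced to the cloud when a connection
    // is available - there is no server code to deploy.
    FirebaseDatabase.getInstance()
        .getReference("readings/outside")
        .setValue(reading)
}

fun watchReadings() {
    FirebaseDatabase.getInstance()
        .getReference("readings/outside")
        .addValueEventListener(object : ValueEventListener {
            override fun onDataChange(snapshot: DataSnapshot) {
                val latest = snapshot.getValue(Reading::class.java)
                println("Latest reading: $latest")
            }

            override fun onCancelled(error: DatabaseError) {
                println("Listener cancelled: ${error.message}")
            }
        })
}
```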

There were three sessions that particularly caught my interest - Android Things, the Beacon Interface and Project Tango.

Android Things

Android Things is essentially a trimmed down version of Android geared towards IoT-style devices (similar to Windows 10 IoT). There was a good slide in the session showing which parts of the normal Android API were left out, but I can't find a copy of it. Going from my notes, it seems you lose the following:

  • System Apps (including Play Store)
  • Notifications
  • Telephony
  • Contacts
  • Media Store (Photo Gallery)

These all make sense; they are not really required on a non-phone device. To support interaction with hardware they have added user drivers for specific hardware and a Peripheral I/O manager to control the GPIO pins. The PIO manager seems comprehensive - it supports PWM, I2C, SPI and UART as well as simple GPIO. For most applications this would probably be enough.
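To give a feel for the Peripheral I/O API, here is a small sketch that blinks an LED on a GPIO pin. The "BCM6" pin name is an assumption based on the Raspberry Pi 3 naming, and the class names follow the developer preview API, so they may change in later releases:

```kotlin
import com.google.android.things.pio.Gpio
import com.google.android.things.pio.PeripheralManagerService

// Sketch only: drive a GPIO pin high for half a second, then low again.
// "BCM6" is a Raspberry Pi 3 pin name used here purely as an example.
fun blinkOnce() {
    val service = PeripheralManagerService()
    val led: Gpio = service.openGpio("BCM6")
    try {
        led.setDirection(Gpio.DIRECTION_OUT_INITIALLY_LOW)
        led.value = true      // pin high - LED on
        Thread.sleep(500)
        led.value = false     // pin low - LED off
    } finally {
        led.close()           // always release the peripheral
    }
}
```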

One big difference between Android Things and Android is that a display is optional. You have a lot of options for providing a UI - the Cast API is still available, so you can use it to display a dashboard on any compatible cast receiver; a simple HTTP server could provide a web-based interface; or you could use the Nearby API with a normal Android application to manage the device when you are in the same room.
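As an example of the "simple HTTP server" option, a few lines of Kotlin on top of the standard library are enough to expose a status page that any browser on the network can hit - the dashboard content here is obviously invented:

```kotlin
import java.net.ServerSocket
import kotlin.concurrent.thread

// Sketch only: serve a one-page status dashboard on the given port so any
// browser on the local network can check on the device.
fun startDashboard(port: Int = 8080) = thread {
    ServerSocket(port).use { server ->
        while (true) {
            server.accept().use { client ->
                val body = "<html><body><h1>Device status: OK</h1></body></html>"
                val response = "HTTP/1.1 200 OK\r\n" +
                        "Content-Type: text/html\r\n" +
                        "Content-Length: ${body.toByteArray().size}\r\n" +
                        "Connection: close\r\n\r\n" + body
                client.getOutputStream().write(response.toByteArray())
            }
        }
    }
}
```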

It isn't a small system, so you wouldn't use it for a simple sensor, but as a data aggregator or home controller it would be perfect. Having the Android ecosystem available allows a lot of powerful functionality to be added - Firebase for cloud operations, the Vision and Speech APIs for more natural user interfaces, and TensorFlow for Android, which means you can run Machine Learning/AI functionality directly on the board.

Google have stated that they want to help minimise the development and hardware costs involved in building an IoT device. To assist with this they are using a SOM (System on Module) approach, with the BSP (Board Support Package) being provided by Google themselves. For now a range of common tinkerer boards is supported, including the Intel Edison and Joule, the NXP Pico and of course the Raspberry Pi 3. You can get a system image and the development kit right now and start playing.

Personally I'm very excited about this; I'm really hoping they will support the Raspberry Pi Zero W in the near future - that would be an excellent platform for a wide range of intelligent IoT devices.

The session was presented by Leonid Olevsky - if you would like to see some sample code he has quite a bit in his GitHub account.

Beacon Interface

Another session that was particularly interesting was the Beacon Interface. This isn't a Google project (or even Android specific, for that matter) but it addresses a very common problem - more and more services essentially require a smartphone to access them, but not everyone can use a smartphone. Look at how you launch and navigate applications: it requires visual acuity and relatively fine motor control, and if you don't have both of those things it can be at best annoying and at worst impossible.

Beacon was developed at Kasikorn Bank to provide internet banking for the elderly and those with impaired vision, and the way they approached it is very clever. Essentially they use a movement-based interface rather than a location-based one - turning the touch screen into a flexible d-pad: swiping right anywhere is like pressing the right button, swiping down for down, and so on.

This simplifies the interface and allows it to be used without requiring precise point-and-tap movements. Once you have those basic navigation operations you can walk through a menu system fairly easily. To address the sight requirement they use a number of different options: information is still displayed on the screen but uses high contrast and larger text, haptic feedback (vibration) indicates when operations have been performed, and text-to-speech conveys the relevant information as well.
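I haven't seen Beacon's code (and this is not their SDK), but the core idea is simple enough to sketch: collapse a swipe anywhere on the screen into one of four directions and drive the menu from that, so navigation needs gross motion rather than precise taps:

```kotlin
import kotlin.math.abs

// Sketch of the idea only - not Beacon's actual implementation.
enum class Direction { UP, DOWN, LEFT, RIGHT, NONE }

// Map the overall movement of a swipe (dx, dy in pixels) to a d-pad
// direction, ignoring anything smaller than the threshold.
fun swipeToDirection(dx: Float, dy: Float, threshold: Float = 100f): Direction = when {
    abs(dx) < threshold && abs(dy) < threshold -> Direction.NONE
    abs(dx) > abs(dy) -> if (dx > 0) Direction.RIGHT else Direction.LEFT
    else -> if (dy > 0) Direction.DOWN else Direction.UP
}

// A menu then only needs to react to those four events, for example:
// when (swipeToDirection(endX - startX, endY - startY)) {
//     Direction.DOWN  -> moveToNextItem()      // plus vibrate and speak the item
//     Direction.UP    -> moveToPreviousItem()
//     Direction.RIGHT -> selectCurrentItem()
//     else -> Unit
// }
```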

This combination makes it a lot easier for a lot more people to consume services through a phone - over a billion people worldwide would have physical difficulty using a smartphone, so this is a technology with major direct benefits for a significant portion of the population.

The Beacon system is being used to provide online banking for now, but it could be used for so much more - I would love to see a simple Android home screen using Beacon just to access common services like telephony and SMS messages (this would certainly make my father's life easier). The presentation mentioned an SDK for the interface but I couldn't find any reference to it on their site. The unfortunate similarity of the name to 'iBeacon' makes Google searches a bit difficult as well. This is definitely a project to keep your eye on.

This presentation had an extremely moving video about their motivation for developing the product; I don't think there was a dry eye in the house at the end of it. Not really what you expect at a developer conference.

Daydream View

Daydream and Tango are Google's solutions for Virtual Reality (VR) and Augmented Reality (AR) respectively. To be honest VR doesn't really interest me - the applications are relatively narrow (entertainment, shopping and remote viewing) - but it is interesting to see it gaining wider distribution. Daydream support is built into Android N (as long as your phone has the appropriate hardware specifications) and the viewers range from the simple Google Cardboard to the official Daydream View, complete with controller.

The controller that comes with the Daydream View looks pretty interesting - it's a Bluetooth LE device and steps have already been made towards figuring out the protocol. I can see it being a great way to interact with an Android Things based device.
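Just as an illustration (the button and orientation protocol itself lives in those reverse-engineering efforts, not here), finding the controller from an Android or Android Things app is a standard BLE scan - the name check is an assumption about how the controller advertises itself:

```kotlin
import android.bluetooth.BluetoothManager
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanResult
import android.content.Context

// Sketch only: scan for nearby BLE devices and log anything that looks like
// a Daydream controller. Requires the usual Bluetooth/location permissions.
fun scanForController(context: Context) {
    val manager = context.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager
    val scanner = manager.adapter.bluetoothLeScanner

    scanner.startScan(object : ScanCallback() {
        override fun onScanResult(callbackType: Int, result: ScanResult) {
            val name = result.device.name ?: return
            if (name.contains("Daydream", ignoreCase = true)) {
                println("Found controller at ${result.device.address}")
            }
        }
    })
}
```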

AR on the other hand is very interesting, and Tango has come a long way from the original developer device (I had a chance to work with one of those for a work project when they first came out). Tango based devices are now available to consumers, with the Lenovo Phab 2 Pro being the first out of the blocks, followed by the Asus ZenFone AR (the ZenFone supports both Daydream and Tango).

The sensors on the Tango devices allow them to detect their relative position in an environment (down to 1cm accuracy according to the presentation). With this detailed model of your environment you can map virtual objects into the real world, making your phone a window into a virtual universe. The prospects for this are endless - adding data overlays, injecting virtual objects or simply annotating physical things.

A lot of the demos are unfortunately a bit toy-like, which doesn't really show off the full potential of the technology. Hopefully that will improve in the future.

Droid and Tablet

The first day of the conference was impressive and I am looking forward to tomorrow's sessions - if a Droidcon comes up in your area I highly recommend going.