Android Dev Summit is a conference with over 25 technical talks and a chance for the Android team at Google to share the latest and greatest of what’s going on. This was my first year attending, and it lived up to the hype. Keep reading for my recommended highlights and best bits.

Learning comes in many different forms, from listening to speaker talks (live or recorded), to talking to others, to hands-on learning with codelabs. I like to mix these styles, and #AndroidDevSummit did not disappoint. They have many codelabs ready to work through here and most of the talks recorded here.


A slide reminding us that we can all get a little MAD sometimes.

I really enjoyed the talk Developing Themes with Style, which took UI programming and code organisation back to basics, explaining when and where to use themes versus styles, along with some great pro tips and gotchas in the layout system. Relying heavily on the built-in semantic attributes, such as colorSurface or colorError, can really help keep your UI code clean and minimal. The recording can be found here.
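To make the theme-versus-style distinction concrete, here is a minimal sketch of what that separation looks like in a resources file. The style names and colour resources are hypothetical; the semantic attributes (colorPrimary, colorSurface, colorError) are the Material Components theme attributes the talk highlights.

```xml
<!-- res/values/themes.xml — illustrative sketch, names are made up -->
<resources>

    <!-- A THEME sets app-wide semantic attributes that widgets resolve
         via ?attr/… lookups, e.g. ?attr/colorSurface as a background. -->
    <style name="Theme.MyApp" parent="Theme.MaterialComponents.DayNight">
        <item name="colorPrimary">@color/brand_blue</item>
        <item name="colorSurface">@color/almost_white</item>
        <item name="colorError">@color/red_600</item>
    </style>

    <!-- A STYLE configures one widget type; here it reads the theme's
         semantic error colour instead of hard-coding a value. -->
    <style name="Widget.MyApp.ErrorLabel">
        <item name="android:textColor">?attr/colorError</item>
    </style>

</resources>
```

Because the label references `?attr/colorError` rather than a literal colour, it automatically picks up the right value in a dark theme or a re-branded theme.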

This talk will start with a crash course in themes and styles and how to apply styling throughout your app while isolating theme dependent resources. We’ll then walk through applied examples of using the styling system to build material themed apps and dark themes.


A slide from the Developing Themes with Style talk.

Machine Learning on Android is getting better and better. This isn’t the underlying theory of ML changing in any way, but the tooling that novices like myself can use. It reminds me of the early days of Firebase and how that exploded into an amazing toolset. The talk ‘On-device ML: Tackling Complex Use Cases with ML Kit’ showed the suite of tools available, including not just object detection, but cropping down to a detected object’s bounding box to allow for even more accurate description and classification. Recording not yet available.

ML Kit makes it easy to integrate ML powered solutions into your apps, either through our turn-key Vision and Natural Language processing APIs or with your own custom TF Lite models. Not only can you easily tackle singular tasks like Text recognition, Face Detection or Language detection, but you can also create more complex user experiences by chaining multiple ML Kit APIs or using these in combination with your own custom models.
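The chaining idea from the talk — detect an object, crop to its bounding box, then run a classifier on the tighter crop — can be sketched without any Android or ML Kit dependencies. Everything below is a hypothetical stand-in (the `Box`, `Frame`, and `Label` types and the stub detector/classifier functions are mine, not the ML Kit API); it only shows the shape of the pipeline.

```kotlin
// Illustrative sketch of chaining detection -> crop -> classification.
// All types and functions are hypothetical stand-ins, not the ML Kit API.
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int)
data class Frame(val width: Int, val height: Int)
data class Label(val text: String, val confidence: Float)

// Step 1: a coarse object detector returns bounding boxes for a frame.
fun detectObjects(frame: Frame): List<Box> =
    listOf(Box(10, 10, 60, 60)) // pretend one object was found

// Step 2: crop the frame to the box so the classifier sees less noise.
fun crop(frame: Frame, box: Box): Frame =
    Frame(box.right - box.left, box.bottom - box.top)

// Step 3: a second model labels the tightly cropped region.
fun classify(cropped: Frame): Label =
    Label("sneaker", 0.92f) // stubbed result

// The chain: every detected object is cropped, then classified.
fun describeFrame(frame: Frame): List<Label> =
    detectObjects(frame).map { box -> classify(crop(frame, box)) }
```

In real ML Kit code each step would be an asynchronous task, but the data flow — feeding one detector’s output region into the next model — is the same.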


A slide from the On Device ML Kit talk.

The best codelab from the event has to be the introduction to Compose. Compose is the new (pre-alpha) Android reactive UI framework. It promises to deliver a faster, cleaner and more flexible way of creating your UI, drawing from the successes of React and Angular on the web, this paradigm coming to mobile with Flutter, and perhaps just a little from iOS’s SwiftUI framework. You need to install the Android Studio 4.0 Canary to have access to the Kotlin compiler plugin needed to make Compose work. 100% recommend checking out this codelab.

In this codelab, you will learn: what Compose is, how to build UIs with Compose, how to manage state in composable functions, and data flow principles in Compose.
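To give a flavour of “state in composable functions”, here is a pseudocode-style sketch of a composable with local state. Compose was pre-alpha at the time and the API has changed since; the names below follow the general shape of later releases, not an exact build, so treat this as illustration rather than copy-paste code.

```kotlin
// Sketch only: Compose was pre-alpha; names follow later releases.
@Composable
fun Counter() {
    // State lives inside the composable; when it changes,
    // the function is re-invoked and the UI re-emitted.
    var count by remember { mutableStateOf(0) }
    Button(onClick = { count++ }) {
        Text("Clicked $count times")
    }
}
```

The key idea is that there is no View object to mutate: you describe the UI as a function of the current state, and the framework re-runs that function when the state changes.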


A slide from the Compose Deep Dive talk.

In conclusion, Android Dev Summit is an amazing event for spreading knowledge, and not just for those attending in person: all the talks are put on YouTube straight away, the codelabs are freely available, and social media is abuzz with insights. All of this makes the event an awesome time of year in any Android developer’s calendar.

YouTube playlist: Android Dev Summit 2019 talk videos

Android Codelabs: Android Dev Summit 2019 developer code labs

Event page: Android Dev Summit 2019 home page

(Title spoiler: Yes it was I, who recommended the Summit ;-))