June has come around again, so for iOS developers this means one of the most exciting events of the year — Apple’s developer conference, WWDC! For the lucky ones who get to go to San Jose, it’s an opportunity to meet up with old friends, make new ones, and spend valuable time talking to Apple engineers. For those who don’t get a ticket, there’s no need to worry — all of the sessions are recorded to watch whenever it suits.
Meghan Kane was able to fly to SF and attend WWDC this year 👩💻
Our team of iOS developers pays close attention to what Apple announces, in order to figure out what will be valuable for our clients and how we can push our projects forward. But what do our developers really think, and which sessions caught our attention?
Overall I felt that WWDC was good but didn't have much of a "wow" factor. Now that we're on the 12th version of iOS, there isn't much that Apple can really improve or change. One thing I like about iOS 12 is the Digital Wellbeing tools, and I hope this is something Apple will improve in future iterations of iOS. I wish they would put more focus on the development tools, however, as I feel they're still lacking.
My favourite talk so far was A Tour of UICollectionView, as it was a good overview of the tool. It included a lot of optimisation tips and some interesting thoughts about how to make scrolling really smooth, and how you can accidentally impact the performance of your app. I wish they'd covered layout, as that is always the bit I struggle with, but I definitely understand how to use collection views better now.
A Tour of UICollectionView
For Apple, iOS is now a stable platform, and they did a great job on performance optimisation. The accusations of planned obsolescence are now in the past: the new iOS 12 runs on the same devices as iOS 11, and the strong focus on performance actually makes the life of older devices even longer.
My personal favourite is Siri Shortcuts. Siri Shortcuts is a new way to expose the actions available in your app to Siri. These actions can be performed by the user with customizable voice commands, or they can be combined with other actions to create useful workflows. The most interesting thing is that Siri will learn to predict these actions by observing how the user uses your app, and will surface them in Spotlight, on the Lock Screen, or on the Siri watch face.
Introducing Siri Shortcuts
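As a rough sketch of how an app might donate a shortcut via `NSUserActivity` (the activity type, title, and identifier below are made-up examples, not from the session):

```swift
import Intents
import UIKit

// Hypothetical "order coffee" action exposed to Siri.
// The activity type string must also be listed under the
// NSUserActivityTypes key in the app's Info.plist.
func donateOrderCoffeeActivity(for viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.order")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true        // surfaced in Spotlight
    activity.isEligibleForPrediction = true    // new in iOS 12: Siri may suggest it
    activity.suggestedInvocationPhrase = "Coffee time"
    activity.persistentIdentifier = NSUserActivityPersistentIdentifier("order-coffee")

    // Marking the activity current is the donation: Siri observes it
    // and learns when the user tends to perform this action.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```

For more involved actions with parameters, the session also covers defining custom intents in an intent definition file rather than using `NSUserActivity`.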
I was lucky enough to be in San Jose during WWDC week this year. It’s a special week, because we get to learn about the new improvements Apple has been working on for the past year and it’s somewhat of an intense reunion with developer friends (old & new) from all around the world.
My favorite WWDC 2018 talk was “Metal for Accelerated Machine Learning” (presented by Anna Tikhonova). New this year is Metal’s expressive graph API for describing your neural networks simply, so that you can train on your laptop’s GPU (it will target several types of GPUs). While there are many good options for training a neural network (e.g. the high-level Turi Create by Apple or TensorFlow by Google), Metal is a solid option for training that offers flexibility, high performance, and simplicity.
Additionally, Metal Performance Shaders (MPS) can be used to power training with Turi Create or TensorFlow. This allows you to harness the performance gains offered by some GPUs (e.g., as she showed, an external AMD Vega GPU). It is clear that Apple is investing more resources, and thus their future, into Metal, so this is an interesting space to watch.
In addition to presenting the new ML-related improvements in Metal, Anna gave a thoughtful overview of the steps involved in training a ML model. If you’re looking for a concise explanation of convolutional and recurrent neural networks through the lens of an Apple developer, don’t miss this talk. This is a refreshing break from Apple’s standard “black box” approach to explaining ML. I’ve been hoping Apple would move away from the “black box” explanation approach because it is too opaque, oversimplifies ML, and inhibits developers’ learning journey into ML. So, a shoutout and big thank you to the Metal team for this highly educational talk!
“Metal for Accelerated Machine Learning”
The WWDC keynote this year wasn't full of groundbreaking news, but Apple did make awesome improvements in iOS 12, making it a lot more performant than older versions. Xcode 10 has also received great enhancements to refactoring as well as to the source control tools.
My favorite talk so far is Creating Custom Instruments, as they dived deep into how Instruments works under the hood.
We can now use the Standard UI and Analysis Core frameworks to create our own instruments, which lets you build a custom instruments package that can analyze and profile your application the same way all the built-in instruments do.
Creating Custom Instruments
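Custom instruments typically model data your app emits as signposts, so a natural first step is instrumenting your code with `os_signpost`. A minimal sketch (the subsystem, category, and interval names below are assumptions; a custom instruments package would match on them in its XML definition):

```swift
import os.signpost

// Assumed subsystem/category names; a custom instrument's data model
// would reference these to pick up the intervals.
let log = OSLog(subsystem: "com.example.app", category: "ImageLoading")

func loadImage(named name: String) {
    let signpostID = OSSignpostID(log: log)
    os_signpost(.begin, log: log, name: "Load Image", signpostID: signpostID,
                "name: %{public}s", name)
    // ... expensive work that the instrument will visualize as an interval ...
    os_signpost(.end, log: log, name: "Load Image", signpostID: signpostID)
}
```

These intervals are also visible out of the box in the os_signpost instrument, which is a good way to verify the data before building a custom package around it.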
So, WWDC definitely wasn't about blockbusters this year, which doesn't mean that it was boring. One announcement that excited me as a developer was the promise to release an iOS-to-macOS porting framework soon. Being able to develop for both platforms at the same time more easily opens up new possibilities for both existing and new features. Another development I, as a user, really liked is the continued commitment to data privacy: for example, stronger protection for data transfers via devices’ Lightning ports, reduced tracking and fingerprinting options in Safari, and more convenient two-factor authentication.
If you want to learn more or find out how you can improve your apps, watch Better Apps through Better Privacy.
Better Apps through Better Privacy
This year's was not a dev-focused WWDC like those where UICollectionView, Auto Layout, UIStackView, or Swift were introduced; it was more about user-facing changes. I will give them a pass this year because of the groundwork done on stability, speed, and preparation for iOS apps to run on macOS. But I expect a big announcement next year!
One last session did end with a wonderful conclusion, though: No Raw Loops!
I liked the Keynote this year because they seem to be focusing on stability over shiny new features, but Apple still did not address one of the biggest issues for me, which is the lack of acknowledgement of the wider developer community. Apple has a track record of selling new APIs as big leaps for developers when other platforms already implement them, and even worse, sometimes without acknowledging the contributions of the open source community. And I am still waiting for the year that Apple announces a developer program similar to GDG for Android, where you can test your skills and join a community of iOS developers backed by Apple.
But I did love some of the talks and the new things they introduced at WWDC 2018. My personal favorites are What is new in ARKit 2? and Integrating Apps and Content with AR Quick Look. With the new ARKit 2 APIs they proved that Augmented Reality is no longer a gimmick but a really powerful piece of technology that can completely change how users interact with our apps every day. They brought 3D object detection and image detection that follows the image even when it moves, all running on the user's phone and completely integrated into ARKit. They also brought environmental texturing, which allows reflective models to mirror the AR scene, done completely automatically by ARKit.
They also improved world tracking and added the ability to share your entire world map and scene with other users, or to persist your session and experience. All of these changes, and how powerful the code behind them is, show that Apple has a clear commitment to AR and will keep pushing it to get more apps crossing into a new reality.
What is new in ARKit 2?
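A minimal sketch of the iOS 12 features mentioned above, enabling environment texturing and object detection, and serializing the world map for sharing (the resource group name "DetectedObjects" and the `send` closure are assumptions for illustration):

```swift
import ARKit

// Configure iOS 12 world tracking with the new ARKit 2 features.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // Automatic environment texturing so reflective materials pick up the scene.
    configuration.environmentTexturing = .automatic
    // 3D object detection against previously scanned reference objects
    // stored in an asset-catalog resource group (assumed name).
    configuration.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "DetectedObjects", bundle: nil) ?? []
    return configuration
}

// Sharing the world map with another user, e.g. over MultipeerConnectivity.
func shareWorldMap(from session: ARSession, send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true)
        else { return }
        send(data)
    }
}
```

On the receiving side, the archived `ARWorldMap` can be unarchived and assigned to a configuration's `initialWorldMap` to join the same shared experience.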
I was particularly interested in what Apple was going to announce regarding testing. A few things were introduced and demoed with Xcode 10, placed in a small new options menu.
Firstly, test parallelization. With Xcode 10, Apple changed the approach to parallel testing, replacing the Parallel Destination Testing introduced in Xcode 9 with Parallel Distributed Testing. Instead of running test classes one after the other, Xcode creates virtual clones of the same simulator and distributes and executes the test classes across them simultaneously. Thanks to that, execution time can decrease massively. This feature is supported in both Xcode and xcodebuild.
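From the command line, this looks roughly like the following (the project, scheme, and simulator names are placeholders):

```shell
# Run tests with Xcode 10's parallel distributed testing from xcodebuild.
xcodebuild test \
  -project MyApp.xcodeproj \
  -scheme MyApp \
  -destination 'platform=iOS Simulator,name=iPhone X' \
  -parallel-testing-enabled YES \
  -parallel-testing-worker-count 4
```

The worker count caps how many simulator clones Xcode spins up; leaving it out lets Xcode choose based on the machine.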
Secondly, test ordering. By default, tests run in order of their names, which means they always run in the same order until you rename them. By selecting “Randomize execution order”, tests run in a random order with no implicit dependencies between them. This pushes users to write and organize tests as deterministically and independently as possible.
Lastly, test selection. With Xcode 10, users are able to control which tests they want to run within a scheme. Thanks to that, a user can decide to run only particular tests for one scheme, or skip them entirely.