It was an early day for those who wanted to get a seat up-front for the WWDC keynote address, held for the first time at the San Jose Convention Center, a good hour's train ride south of San Francisco.
Plenty of new and improved hardware was presented, including the HomePod, new iPad Pros, and the iMac Pro, but there was just as much exciting news on the software front.
During the conference, we had the opportunity to attend numerous sessions and labs and to discuss our questions with Apple engineers. Here are just some of the highlights that will be of interest to app vendors and developers.
iOS 11 will see the launch of the ARKit SDK. Together with frameworks such as SpriteKit and SceneKit (mainly used in game development), it offers numerous options for overlaying 2D and 3D objects and information on the real world as seen through the iPhone's camera. The advances in resolution, quality, and tracking performance are quite impressive; in my opinion, Apple has taken a giant leap forward in the AR world.
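To make the ARKit-plus-SceneKit combination concrete, here is a minimal sketch of what such an app looks like. The view-controller name and the cube are my own illustrative choices, not from any Apple sample; the pattern (an `ARSCNView` driven by an `ARWorldTrackingConfiguration`) is the standard ARKit setup.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal ARKit + SceneKit sketch: ARKit tracks the device's position in the
// real world, SceneKit renders virtual content anchored into the camera feed.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Place a small virtual cube half a meter in front of the camera.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(cube)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking combines the camera with motion sensors so the
        // virtual cube stays fixed in real space as the user moves.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```

Because tracking and rendering are handled by the frameworks, the app code reduces to configuring a session and adding nodes to the scene graph.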
Just as exciting is Core ML, which allows trained machine-learning models to be embedded in apps. This enables features such as predictive text and the recognition of handwriting and symbols. The new Vision framework provides face detection in photos and videos, and, together with natural language processing (NLP) for analyzing text, it becomes possible to find and label objects in photos or videos.
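As a small illustration of the Vision framework mentioned above, the following sketch counts faces in an image. The helper function name is my own; the request/handler pair (`VNDetectFaceRectanglesRequest` driven by a `VNImageRequestHandler`) is Vision's standard pattern.

```swift
import UIKit
import Vision

// Hedged sketch: run Vision's face detection on a UIImage and report
// how many faces were found.
func detectFaces(in image: UIImage, completion: @escaping (Int) -> Void) {
    guard let cgImage = image.cgImage else {
        completion(0)
        return
    }
    // The request's handler receives VNFaceObservation results.
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        completion(faces.count)
    }
    // The request handler performs one or more Vision requests on the image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The same handler can run several requests at once, so face detection can be combined with, say, a Core ML classification request on a single image pass.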
NFC opened up (at least partially)
A first step has been taken as Apple opens its NFC interface, at least partially. With the Core NFC framework, apps can read NFC tags (NDEF data) to provide users with more information about their physical environment and real objects. This may include information about products in shops or objects in museums and exhibitions. For now it is one-way only, meaning that data can be read but not written, and only on the iPhone 7/7 Plus or later.
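The one-way reading described above boils down to a reader session and a delegate. The class name below is my own; the session initializer and delegate callbacks are the Core NFC API introduced with iOS 11.

```swift
import CoreNFC

// Sketch of a one-way NDEF tag reader using Core NFC.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        // invalidateAfterFirstRead: true ends the session after one tag,
        // which fits the "scan a product or exhibit" use case.
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Each NDEF message carries one or more records (e.g. a URL or text).
        for message in messages {
            for record in message.records {
                print("Record: \(record.payload.count) bytes, TNF \(record.typeNameFormat)")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called when scanning ends or the user dismisses the system NFC sheet.
    }
}
```

Note that there is no write counterpart here: the delegate only ever receives detected messages, which is exactly the "read-only for now" limitation described above.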
watchOS: SiriKit integration and more
SiriKit can now also be used with watchOS 4, which makes it possible to perform app functions via Siri. The Siri watch face gives users a custom timeline for their entire day. It is also exciting to see new applications from third-party developers, for example, blood glucose meters that can be controlled via Bluetooth. In addition, watchOS 4 offers a new, well-executed design for watch apps in the dock.
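To give a flavor of what "performing app functions via Siri" means for developers, here is a sketch of a SiriKit intents-extension entry point for a hypothetical workout app; with watchOS 4 the same Intents framework now runs on the watch. The class name is the conventional one for an Intents extension; the workout scenario is my own example.

```swift
import Intents

// Sketch: handle "Start my workout" style Siri requests in an
// Intents app extension.
class IntentHandler: INExtension, INStartWorkoutIntentHandling {
    func handle(intent: INStartWorkoutIntent,
                completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
        // Tell Siri the request was understood; the app (or watch app)
        // is launched to actually run the workout session.
        completion(INStartWorkoutIntentResponse(code: .continueInApp,
                                                userActivity: nil))
    }
}
```

Siri handles the speech recognition and intent parsing; the extension only has to map a resolved intent onto an app action.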
New App Store deployment options
The redesigned App Store will provide new opportunities for app providers as well as for users. For example, developers can choose to keep (or reset) reviews of older app versions with each new release, which is an advantage if a release is only a minor bug fix. Rollouts can now be phased, so that an update initially reaches only part of the user base; this is useful for testing new app versions. In addition, the app's marketing capabilities have been improved, for example, with up to three possible (localizable) app preview videos.
Improved developer environment
Apple is simplifying the developer's life: Xcode 9 now offers cross-language refactoring, something that had been missing for Swift ever since the language was introduced. Xcode's code intelligence has been massively improved by the new source editor, itself rewritten in Swift. Debugging apps is also more efficient, since several simulators can now run simultaneously and wireless debugging is finally possible. Swift 4.0, on the other hand, brings no major syntax changes, which suggests that the language has now stabilized.
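Swift 4's changes are largely additive rather than syntax-breaking; one example of the new standard-library surface (not covered above, added here as an illustration) is the Codable protocol, which synthesizes JSON encoding and decoding for plain structs. The `Session` type and sample data below are my own.

```swift
import Foundation

// Swift 4's Codable: conformance alone is enough for the compiler to
// synthesize JSON (de)serialization for simple value types.
struct Session: Codable {
    let title: String
    let room: String
}

let json = "{\"title\":\"Introducing ARKit\",\"room\":\"Hall 2\"}"
    .data(using: .utf8)!
let session = try! JSONDecoder().decode(Session.self, from: json)
print(session.title)  // prints "Introducing ARKit"
```

Before Codable, this kind of mapping required hand-written parsing or a third-party library, so even a "no major changes" release meaningfully reduces boilerplate.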
Personally, I find the most exciting developments to be in the area of AI/machine learning. These will permanently change the user experience on all devices. As far as the conference itself is concerned: it was a perfectly organized event! The arrangement of the session rooms and labs helped corral the mass of people quite well and the mood among the developers and scholars from all over the world was quite buoyant.
This article has also been published in Netzwoche online and in the print edition 11/2017.
Some pictures from the event: