Google rolls out third Android P developer preview
The new beta update brings the final APIs and SDK for developers to start adapting their apps to the upcoming Android version.
Four weeks ago at the I/O developer conference, Google released the first beta version of Android P, which focused on putting AI at the core of the operating system. Since then, the company has continued to add and experiment with new features in its developer previews.
Google has now launched the second beta of Android P, which is technically the third developer preview of the platform. The update brings the final APIs and SDK, letting app developers start adapting their apps to the upcoming Android version, which will be released later this year.
Google has partnered with DeepMind to introduce a new feature called Adaptive Battery, which uses machine learning to prioritise system resources for the apps users care about most.
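On the developer side, Adaptive Battery works together with Android P's app standby buckets, and an app can check which bucket the system has placed it in. A minimal Kotlin sketch of that query, assuming a Context is available and the app targets API 28 (the function name is just illustrative):

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context

// Query the standby bucket the system has assigned to this app (API 28+).
// The bucket reflects how aggressively jobs, alarms and network access
// for the app may be deferred.
fun logStandbyBucket(context: Context) {
    val usageStats = context.getSystemService(UsageStatsManager::class.java) ?: return
    when (usageStats.appStandbyBucket) {
        UsageStatsManager.STANDBY_BUCKET_ACTIVE -> println("Bucket: active")
        UsageStatsManager.STANDBY_BUCKET_WORKING_SET -> println("Bucket: working set")
        UsageStatsManager.STANDBY_BUCKET_FREQUENT -> println("Bucket: frequent")
        UsageStatsManager.STANDBY_BUCKET_RARE -> println("Bucket: rare")
        else -> println("Bucket: unknown")
    }
}
```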
App Actions is a new way to raise the visibility of an app. It takes advantage of machine learning on Android to surface the app at the right time, based on the app's semantic intents and the user's context.
The Android P platform already supports screens with display notches. With this new update, Google has added the APIs developers need to build apps that deliver an edge-to-edge experience on the latest screens.
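A rough Kotlin sketch of how an activity might use the cutout support: opt the window into drawing into the short edges, then read the cutout insets to pad important content away from the notch (the activity name here is hypothetical):

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.WindowManager

class EdgeToEdgeActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Opt the window into drawing into the cutout area on the short
        // edges of the display (API 28+).
        val params = window.attributes
        params.layoutInDisplayCutoutMode =
            WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_SHORT_EDGES
        window.attributes = params
    }

    override fun onAttachedToWindow() {
        super.onAttachedToWindow()

        // Read the cutout geometry so key content stays clear of the notch
        // while the background still runs edge to edge.
        val cutout = window.decorView.rootWindowInsets?.displayCutout ?: return
        window.decorView.setPadding(0, cutout.safeInsetTop, 0, 0)
    }
}
```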
For messaging notifications, the update lets app developers take advantage of changes in MessagingStyle that make notifications more useful and actionable. Developers can now show conversations, attach photos and stickers, and even suggest smart replies using the ML Kit support provided in the update.
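A hedged Kotlin sketch of the kind of conversation notification this enables, using the Person-based MessagingStyle and image messages added in Android P (the channel ID, contact names and image URI are placeholders, and the notification channel is assumed to exist already):

```kotlin
import android.app.Notification
import android.app.NotificationManager
import android.app.Person
import android.content.Context
import android.net.Uri

// Builds a conversation notification with the Android P MessagingStyle:
// messages are attributed to Person objects and one message carries an image.
fun postConversationNotification(context: Context, imageUri: Uri) {
    val me = Person.Builder().setName("Me").build()
    val friend = Person.Builder().setName("Alex").build()   // placeholder contact

    val style = Notification.MessagingStyle(me)
        .addMessage("Check out this photo", System.currentTimeMillis(), friend)
        .addMessage(
            Notification.MessagingStyle.Message("", System.currentTimeMillis(), friend)
                .setData("image/jpeg", imageUri)             // attach an image to the message
        )

    val notification = Notification.Builder(context, "chat_channel") // placeholder channel id
        .setSmallIcon(android.R.drawable.ic_dialog_email)
        .setStyle(style)
        .build()

    context.getSystemService(NotificationManager::class.java).notify(1, notification)
}
```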
If an app uses the device camera, the new beta update provides multi-camera APIs that let it stream simultaneously from two or more physical cameras. On devices with dual cameras, developers can now build features such as seamless zoom, bokeh and stereo vision.
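A minimal Kotlin sketch of how an app might discover a logical multi-camera and the physical cameras behind it using the camera2 additions in API 28 (the function name is illustrative):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Finds the first logical multi-camera (a camera backed by two or more
// physical cameras) and returns the IDs of its physical cameras (API 28+).
fun findPhysicalCameraIds(context: Context): Set<String> {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (cameraId in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(cameraId)
        val capabilities =
            characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES) ?: continue
        if (capabilities.contains(
                CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA)) {
            // Each physical ID can be targeted individually, e.g. to stream
            // simultaneously from a wide and a telephoto sensor.
            return characteristics.physicalCameraIds
        }
    }
    return emptySet()
}
```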
Audio app developers can use the Dynamics Processing API to access a multi-stage, multi-band dynamics processing effect that modifies the audio coming out of Android devices, optimising it for the listener's preferences or the ambient conditions.
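A small Kotlin sketch, assuming the app already has an audio session ID for its player, of attaching the new DynamicsProcessing effect with its default stages and nudging the overall input gain (the gain value is arbitrary):

```kotlin
import android.media.audiofx.DynamicsProcessing

// Attaches a DynamicsProcessing effect (API 28+) to an existing audio session,
// e.g. the session of a MediaPlayer or AudioTrack, using the default
// pre-EQ / multi-band compressor / post-EQ / limiter configuration.
fun attachDynamicsProcessing(audioSessionId: Int): DynamicsProcessing {
    val dynamics = DynamicsProcessing(audioSessionId)

    // Example tweak: raise the input gain slightly on every channel,
    // say to compensate for a quiet environment (value is illustrative).
    dynamics.setInputGainAllChannelsTo(3.0f)

    // Enable the effect so it starts processing the session's audio.
    dynamics.setEnabled(true)
    return dynamics
}
```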
Developers already enrolled in Google's developer beta program can download this update over-the-air (OTA). New users with Pixel devices who want to try out the new features can sign up for the program or flash the latest system images.