2014-10-20, 17:58
Android NDK (Native Development Kit) r10c has now been released alongside Google's announcement of Android 5.0 (Lollipop):
https://developer.android.com/tools/sdk/ndk/index.html
Has anyone looked into building Kodi with Android NDK Revision 10c (October 2014) yet, and had any success?
This is the first NDK release that officially supports building for Android TV:
http://developer.android.com/training/tv/index.html
A few small things also changed for MediaCodec in API level 21: http://developer.android.com/reference/a...Codec.html
Otherwise, the most interesting part of Android 5.0 (Lollipop) for Kodi developers is probably the new APIs in API level 21:
http://developer.android.com/about/versi...d-5.0.html
Quote: Android 5.0 APIs
API Level: 21
If you haven't tested your app against the new Android Runtime (ART)...
The 4.4 release introduced a new, experimental Android runtime, ART. Under 4.4, ART was optional, and the default runtime remained Dalvik. With Android 5.0, ART is now the default runtime.
For an overview of ART's new features, see Introducing ART. Some of the major new features are:
- Ahead-of-time (AOT) compilation
- Improved garbage collection (GC)
- Improved debugging support
Most Android apps should just work without any changes under ART. However, some techniques that work on Dalvik do not work on ART. For information about the most important issues, see Verifying App Behavior on the Android Runtime (ART). Pay particular attention if:
- Your app uses the Java Native Interface (JNI) to run C/C++ code.
- You use development tools that generate non-standard code (such as some obfuscators).
- You use techniques that are incompatible with compacting garbage collection. (ART does not currently implement compacting GC, but compacting GC is under development in the Android Open Source Project.)
If your app uses RemoteControlClient...
The RemoteControlClient class is now deprecated. Switch to the new MediaSession API as soon as possible.
To display media playback controls if your app is running on the Android TV or Wear platform, implement the MediaSession class. You should also implement MediaSession if your app needs to receive media button events on Android devices.
If you are using the Android Native Development Kit (NDK)...
Android 5.0 introduces support for 64-bit systems. The 64-bit enhancement increases address space and improves performance, while still supporting existing 32-bit apps fully. The 64-bit support also improves the performance of OpenSSL for cryptography. In addition, this release introduces new native media NDK APIs, as well as native OpenGL ES (GLES) 3.1 support.
To use the 64-bit support provided in Android 5.0, download and install NDK Revision 10c from the Android NDK page. Refer to the Revision 10c release notes for more information about important changes and bug fixes to the NDK.
If your app binds to a Service...
The Context.bindService() method now requires an explicit Intent, and throws an exception if given an implicit intent. To ensure your app is secure, use an explicit intent when starting or binding your Service, and do not declare intent filters for the service.
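As a minimal sketch of the explicit-intent form (the `PlaybackService` class and `connection` object are hypothetical placeholders, and this only runs on an Android device):

```java
// Explicit intent: names the target Service class directly, as API 21 requires.
Intent intent = new Intent(context, PlaybackService.class);
context.bindService(intent, connection, Context.BIND_AUTO_CREATE);

// An implicit intent like the following now throws on Android 5.0:
// context.bindService(new Intent("com.example.ACTION_PLAY"),
//                     connection, Context.BIND_AUTO_CREATE);
```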
Audio playback
This release includes the following changes to AudioTrack:
- Your app can now supply audio data in floating-point format (ENCODING_PCM_FLOAT). This permits greater dynamic range, more consistent precision, and greater headroom. Floating-point arithmetic is especially useful during intermediate calculations. Playback endpoints use integer format for audio data, with lower bit depth. (In Android 5.0, portions of the internal pipeline are not yet floating point.)
- Your app can now supply audio data as a ByteBuffer, in the same format as provided by MediaCodec.
- The WRITE_NON_BLOCKING option can simplify buffering and multithreading for some apps.
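The AudioTrack API itself needs an Android device, but the floating-point sample layout is easy to illustrate in plain Java. A minimal sketch (the `PcmFloat` helper is hypothetical, not part of the API) converting 16-bit PCM into the normalized [-1.0, 1.0) floats that ENCODING_PCM_FLOAT expects:

```java
// Convert 16-bit integer PCM samples to 32-bit float PCM in the
// normalized [-1.0, 1.0) range used by ENCODING_PCM_FLOAT.
public class PcmFloat {
    public static float[] toFloat(short[] pcm16) {
        float[] out = new float[pcm16.length];
        for (int i = 0; i < pcm16.length; i++) {
            // Map [-32768, 32767] onto [-1.0, 1.0) by dividing by 2^15.
            out[i] = pcm16[i] / 32768f;
        }
        return out;
    }
}
```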
Media playback control
Use the new notification and media APIs to ensure that the system UI knows about your media playback and can extract and show album art. Controlling media playback across a UI and a service is now easier with the new MediaSession and MediaController classes.
The new MediaSession class replaces the deprecated RemoteControlClient class and provides a single set of callback methods for handling transport controls and media buttons. If your app provides media playback and runs on the Android TV or Wear platform, use the MediaSession class to handle your transport controls using the same callback methods.
You can now build your own media controller app with the new MediaController class. This class provides a thread-safe way to monitor and control media playback from your app's UI process. When creating a controller, specify a MediaSession.Token object so that your app can interact with the given MediaSession. By using the MediaController.TransportControls methods, you can send commands such as play(), stop(), skipToNext(), and setRating() to control media playback on that session. With the controller, you can also register a MediaController.Callback object to listen for metadata and state changes on the session.
In addition, you can create rich notifications that allow playback control tied to a media session with the new Notification.MediaStyle class.
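The session/controller split described above can be sketched roughly like this (a hedged sketch using the real API 21 classes; the tag "KodiSession" and the callback bodies are placeholders, and this only runs on Android):

```java
// Playback side: publish a MediaSession and handle transport controls.
MediaSession session = new MediaSession(context, "KodiSession");
session.setCallback(new MediaSession.Callback() {
    @Override public void onPlay()  { /* start playback */ }
    @Override public void onPause() { /* pause playback */ }
});
session.setActive(true);

// UI side: drive the session through its token from any process.
MediaController controller =
        new MediaController(context, session.getSessionToken());
controller.getTransportControls().play();
controller.registerCallback(new MediaController.Callback() {
    @Override public void onPlaybackStateChanged(PlaybackState state) {
        /* update UI to reflect the new state */
    }
});
```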
Media browsing
Android 5.0 introduces the ability for apps to browse the media content library of another app, through the new android.media.browse API. To expose the media content in your app, extend the MediaBrowserService class. Your implementation of MediaBrowserService should provide access to a MediaSession.Token so that apps can play media content provided through your service.
To interact with a media browser service, use the MediaBrowser class. Specify the component name for a MediaSession when you create a MediaBrowser instance. Using that browser instance, your app can then connect to the associated service and obtain a MediaSession.Token object to play content exposed through that service.
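Sketched under the assumption of a made-up target service (`com.example.music.LibraryService` is not a real app), the client side looks roughly like this:

```java
// Hedged sketch: browse another app's media library via android.media.browse.
private MediaBrowser mBrowser;

void connectToLibrary(final Context context) {
    mBrowser = new MediaBrowser(context,
            new ComponentName("com.example.music",
                              "com.example.music.LibraryService"),
            new MediaBrowser.ConnectionCallback() {
                @Override public void onConnected() {
                    // Once connected, the service exposes its session token,
                    // which lets us control playback of its content.
                    MediaSession.Token token = mBrowser.getSessionToken();
                    new MediaController(context, token)
                            .getTransportControls().play();
                }
            }, null);
    mBrowser.connect();
}
```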
Storage - Directory selection
Android 5.0 extends the Storage Access Framework to let users select an entire directory subtree, giving apps read/write access to all contained documents without requiring user confirmation for each item.
To select a directory subtree, build and send an OPEN_DOCUMENT_TREE intent. The system displays all DocumentsProvider instances that support subtree selection, letting the user browse and select a directory. The returned URI represents access to the selected subtree. You can then use buildChildDocumentsUriUsingTree() and buildDocumentUriUsingTree() along with query() to explore the subtree.
The new createDocument() method lets you create new documents or directories anywhere under the subtree. To manage existing documents, use renameDocument() and deleteDocument(). Check COLUMN_FLAGS to verify provider support for these calls before issuing them.
If you're implementing a DocumentsProvider and want to support subtree selection, implement isChildDocument() and include FLAG_SUPPORTS_IS_CHILD in your COLUMN_FLAGS.
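The selection-and-query flow above can be sketched as follows (REQUEST_TREE is a hypothetical request code, and error handling is elided):

```java
// Ask the user to pick a directory subtree.
startActivityForResult(new Intent(Intent.ACTION_OPEN_DOCUMENT_TREE),
                       REQUEST_TREE);

// Later, in onActivityResult(): the returned URI covers the whole subtree.
Uri treeUri = data.getData();
Uri childrenUri = DocumentsContract.buildChildDocumentsUriUsingTree(
        treeUri, DocumentsContract.getTreeDocumentId(treeUri));
// Enumerate the documents directly under the selected directory.
Cursor children = getContentResolver()
        .query(childrenUri, null, null, null, null);
```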
Android 5.0 also introduces new package-specific directories on shared storage where your app can place media files for inclusion in MediaStore. The new getExternalMediaDirs() returns paths to these directories on all shared storage devices. Similarly to getExternalFilesDir(), no additional permissions are needed by your app to access the returned paths. The platform periodically scans for new media in these directories, but you can also use MediaScannerConnection to explicitly scan for new content.
Wireless & Connectivity - Multiple network connections
Android 5.0 provides new multi-networking APIs that let your app dynamically scan for available networks with specific capabilities, and establish a connection to them. This functionality is useful when your app requires a specialized network, such as an SUPL, MMS, or carrier-billing network, or if you want to send data using a particular type of transport protocol.
To select and connect to a network dynamically from your app, follow these steps:
1. Create a ConnectivityManager.
2. Use the NetworkRequest.Builder class to create a NetworkRequest object and specify the network features and transport type your app is interested in.
3. To scan for suitable networks, call requestNetwork() or registerNetworkCallback(), passing in the NetworkRequest object and an implementation of ConnectivityManager.NetworkCallback. Use requestNetwork() if you want to actively switch to a suitable network once it's detected; to receive only notifications for scanned networks without actively switching, use registerNetworkCallback() instead.
When the system detects a suitable network, it connects to the network and invokes the onAvailable() callback. You can use the Network object from the callback to get additional information about the network, or to direct traffic to use the selected network.
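Put together, the steps above look roughly like this (a hedged sketch requesting a Wi-Fi network with internet access; this only runs on an API 21+ device):

```java
ConnectivityManager cm = (ConnectivityManager)
        context.getSystemService(Context.CONNECTIVITY_SERVICE);

// Describe the kind of network we want.
NetworkRequest request = new NetworkRequest.Builder()
        .addTransportType(NetworkCapabilities.TRANSPORT_WIFI)
        .addCapability(NetworkCapabilities.NET_CAPABILITY_INTERNET)
        .build();

// requestNetwork() actively brings up / switches to a matching network;
// registerNetworkCallback() would only notify us instead.
cm.requestNetwork(request, new ConnectivityManager.NetworkCallback() {
    @Override public void onAvailable(Network network) {
        // Use the Network object to direct traffic onto this network,
        // e.g. network.openConnection(url).
    }
});
```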
Scheduling jobs
Android 5.0 provides a new JobScheduler API that lets you optimize battery life by defining jobs for the system to run asynchronously at a later time or under specified conditions (such as when the device is charging). Job scheduling is useful in such situations as:
- The app has non-user-facing work that you can defer.
- The app has work you'd prefer to do when the unit is plugged in.
- The app has a task that requires network access or a Wi-Fi connection.
- The app has a number of tasks that you want to run as a batch on a regular schedule.
A unit of work is encapsulated by a JobInfo object. This object specifies the scheduling criteria.
Use the JobInfo.Builder class to configure how the scheduled task should run. You can schedule the task to run under specific conditions, such as:
- Start when the device is charging
- Start when the device is connected to an unmetered network
- Start when the device is idle
- Finish before a certain deadline or with a minimum delay
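The conditions above map directly onto JobInfo.Builder calls. A hedged sketch (JOB_ID and MyJobService are placeholders; MyJobService would extend the platform's JobService class):

```java
JobScheduler scheduler = (JobScheduler)
        context.getSystemService(Context.JOB_SCHEDULER_SERVICE);

JobInfo job = new JobInfo.Builder(JOB_ID,
            new ComponentName(context, MyJobService.class))
        .setRequiresCharging(true)                              // only while plugged in
        .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED) // unmetered network
        .setMinimumLatency(60 * 1000)                           // wait at least a minute
        .setOverrideDeadline(60 * 60 * 1000)                    // but run within an hour
        .build();
// setRequiresDeviceIdle(true) is also available for work that should
// only run while the device is idle.

scheduler.schedule(job);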
Testing and accessibility improvements - Android 5.0 adds the following support for testing and accessibility:
The new getWindowAnimationFrameStats() and getWindowContentFrameStats() methods capture frame statistics for window animations and content. These methods let you write instrumentation tests to evaluate whether an app is rendering frames at a sufficient refresh frequency to provide a smooth user experience.
The new executeShellCommand() method lets you execute shell commands from your instrumentation test. The command execution is similar to running adb shell from a host connected to the device, allowing you to use shell-based tools such as dumpsys, am, content, and pm.
Accessibility services and test tools that use the accessibility APIs (such as UiAutomator) can now retrieve detailed information about the properties of windows on the screen that sighted users can interact with. To retrieve a list of AccessibilityWindowInfo objects, call the new getWindows() method.
The new AccessibilityNodeInfo.AccessibilityAction class lets you define standard or customized actions to perform on an AccessibilityNodeInfo. It replaces the actions-related APIs previously found in AccessibilityNodeInfo.
Android 5.0 provides finer-grain control over text-to-speech synthesis in your app. The new Voice class allows your app to use voice profiles associated with specific locales, quality and latency ratings, and text-to-speech engine-specific parameters.
Manifest Declarations - Declarable required features
The following values are now supported in the <uses-feature> element, so you can ensure that your app is installed only on devices that provide the features your app needs.
- FEATURE_AUDIO_OUTPUT
- FEATURE_CAMERA_CAPABILITY_MANUAL_POST_PROCESSING
- FEATURE_CAMERA_CAPABILITY_MANUAL_SENSOR
- FEATURE_CAMERA_CAPABILITY_RAW
- FEATURE_CAMERA_LEVEL_FULL
- FEATURE_GAMEPAD
- FEATURE_LIVE_TV
- FEATURE_MANAGED_USERS
- FEATURE_LEANBACK
- FEATURE_OPENGLES_EXTENSION_PACK
- FEATURE_SECURELY_REMOVES_USERS
- FEATURE_SENSOR_AMBIENT_TEMPERATURE
- FEATURE_SENSOR_RELATIVE_HUMIDITY
- FEATURE_VERIFIED_BOOT
- FEATURE_WEBVIEW
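As a hedged example, a manifest that restricts installation to devices with a gamepad and the leanback (TV) UI might declare the corresponding feature strings like this (a sketch, not a complete manifest; the string names are the documented constants for FEATURE_GAMEPAD and FEATURE_LEANBACK):

```xml
<!-- Install only on devices that report these features. -->
<uses-feature android:name="android.hardware.gamepad"
              android:required="true" />
<uses-feature android:name="android.software.leanback"
              android:required="true" />
```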