After the first burst of innovation, the landscape of mobile app development has not changed much in recent years. Although the number of smart devices each person uses is growing quickly,
the apps and services we rely on daily still struggle to share content across different devices in a seamless way.
Huawei's strategy is to build an ecosystem of connected devices and related technologies, letting developers build apps that are no longer confined by the hardware limitations of a smartphone but can be extended to create a whole smarter environment around the user.
Imagine building an app designed not only for your phone, but also for your home, your car, or your gaming and sports equipment!
As an interesting example, I'd like to show you how to build a sports app that takes advantage of smart devices such as external cameras and smartwatches.
In practical terms, let's build an Android app that can:
- Discover smart devices around you
- Connect to the action camera and the smartwatch
- Get live stream video from the camera
- Get biometrics data from the smartwatch
- Overlay biometrics data to the video stream
- Enjoy the video :)
To follow along, you will need:
- A Huawei smartphone equipped with a Kirin 980 or later and running EMUI 10.1+ (at the time of writing I recommend the Huawei P40)
- A Huawei smartwatch (I am using Huawei GT Watch Active 2)
- An action camera compatible with DV Kit (for example Drift Ghost 4K)
- Love for sport and new technologies!
Security first: let's apply for permissions. In order to use DV Kit in your app you need to register as a developer on Huawei's portal. From the console, submit an app, upload your signing certificate fingerprint, and keep note of your app ID. For more detailed steps see environment preparation.
Create a new project and add custom permissions in the manifest: https://gist.github.com/43d396f773029347d53e221baa4a9b60
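Since the gist isn't reproduced here, this is roughly the shape of those manifest additions. Note that the permission names below are placeholders of my own invention; substitute the exact strings from the DV Kit documentation:

```xml
<!-- Placeholder permission names: replace them with the exact
     strings listed in the DV Kit documentation -->
<uses-permission android:name="com.huawei.dv.permission.DISCOVERY" />
<uses-permission android:name="com.huawei.dv.permission.VIRTUALCAMERA" />
```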
Those permissions are checked at runtime and verified by Huawei Cloud; for this reason the app must be signed with the same certificate previously uploaded to the console. Also, the app ID must be specified in the manifest: https://gist.github.com/4e9357ea057183a960eb3ce95979ade0
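As a sketch, the app ID usually goes into a meta-data entry of the application element, following the common HMS convention (replace the placeholder with the ID from your console):

```xml
<application>
    <!-- App ID from the Huawei developer console (placeholder value) -->
    <meta-data
        android:name="com.huawei.hms.client.appid"
        android:value="appid=YOUR_APP_ID" />
</application>
```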
Some Android standard permissions are also needed to access the Camera API, to record audio and to access body sensors data: https://gist.github.com/aac6a6b934b6976a3c70f75ed6e73e4c
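These are standard Android permissions, so the manifest entries look like this (remember that CAMERA, RECORD_AUDIO and BODY_SENSORS are dangerous permissions and must also be requested at runtime):

```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.BODY_SENSORS" />
<uses-permission android:name="android.permission.INTERNET" />
```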
Now it's time to include the DV Kit dependencies in the Gradle files. Add the repository in both the buildscript and allprojects sections: https://gist.github.com/713cb7bdd321a5c4a7dc41fdc14cc781
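In practice that means adding Huawei's Maven repository to the project-level build file, roughly like this:

```groovy
// Project-level build.gradle
buildscript {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
```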
Then, in the app-level Gradle file, configure the build dependencies: https://gist.github.com/e7073eabd383302dfbaf2d6141353b2a
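A sketch of the app-level dependencies follows. The artifact coordinates and version numbers below are indicative only; use the ones from the published SDK list:

```groovy
// App-level build.gradle -- coordinates and versions are placeholders,
// check the published SDK list for the current ones
dependencies {
    implementation 'com.huawei.dvkit:dvkitbase:1.0.0.300'
    implementation 'com.huawei.dvkit:dvkitcamera:1.0.0.300'
}
```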
You can find all the published SDK versions here.
HiHealth Kit comes in the form of a jar file that you need to download manually from the developer console and add to the libs folder. It can only be accessed by registered partners; if you want to join the program, see the HiHealth data access introduction. Once you have the artifact, add the Gradle dependencies like this: https://gist.github.com/1102ee0dbca2ad2dbcb6e30638f890ea
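A local jar is referenced like any other file dependency; the jar name below is a placeholder for whatever version you downloaded. I also add Gson here, since we will use it later to deserialize the JSON results:

```groovy
dependencies {
    // The jar name depends on the version downloaded from the console
    implementation files('libs/hihealthkit.jar')
    // Used later to deserialize the JSON payloads
    implementation 'com.google.code.gson:gson:2.8.6'
}
```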
In an over-simplified scenario we need just two objects: a VirtualDeviceManager that handles the lifecycle of the service, and a collection of VirtualDevice objects to keep track of the smart devices around the smartphone.
For simplicity I will use a HashMap where the key is the device ID, declared like this: https://gist.github.com/4c5a53b5466445ebd7394bbd74c01d11
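The declaration is a one-liner (the property name is mine):

```kotlin
// Keyed by device ID, so a device rediscovered later simply
// overwrites its previous entry instead of creating a duplicate
private val virtualDevices = HashMap<String, VirtualDevice>()
```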
Then let's initialize the DV Kit service by calling the following function in the onCreate phase: https://gist.github.com/077254875f58b25fb80a3aaab824cc00
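As a hedged sketch of that initialization (the method and listener names below are indicative; the real DV Kit API surface may differ from this simplified version):

```kotlin
// Sketch only: connect to the DV Kit service, then obtain the
// VirtualDeviceManager and start discovering nearby devices
private lateinit var deviceManager: VirtualDeviceManager

private fun initDvKit(context: Context) {
    DvKit.getInstance().connect(context, object : IDvKitConnectCallback {
        override fun onConnect(result: Int) {
            deviceManager = DvKit.getInstance()
                .getKitService(Constants.VIRTUAL_DEVICE_CLASS) as VirtualDeviceManager
            // Register the two callbacks described below, then discover
            deviceManager.startDiscovery(discoveryCallback)
        }

        override fun onDisconnect() {
            // Service unbound: release any held resources
        }
    })
}
```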
You may notice that we need to define two additional objects to specify what happens when a new device is discovered and when a new capability is enabled.
In the first case, we just want to add the discovered device to our collection (ideally avoiding duplicates); we can do that by implementing the IDiscoveryCallback:
https://gist.github.com/8e24c4ea1a4c6be301fad1f6501dbffb
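A minimal sketch of such a callback, assuming the member names shown (check the DV Kit reference for the exact interface):

```kotlin
// Sketch: callback member names are indicative
private val discoveryCallback = object : IDiscoveryCallback {
    override fun onFound(device: VirtualDevice, state: Int) {
        // Keying on the device ID avoids duplicate entries
        if (!virtualDevices.containsKey(device.deviceId)) {
            virtualDevices[device.deviceId] = device
        }
    }

    override fun onState(state: Int) {
        // Discovery service state changes, useful for logging
    }
}
```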
In the second case, the IVirtualDeviceObserver notifies us that a new capability has been detected. In particular, we want to take the newly discovered smart camera, enable it, and start a new Activity passing the camera ID as a parameter.
https://gist.github.com/0403ebe76abed3881da05697fcc2b8a6
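Sketched out, that observer could look like this. The capability constant, the observer signature and CameraActivity are all illustrative names of mine:

```kotlin
// Sketch: observer signature and capability constants are indicative
private val deviceObserver = object : IVirtualDeviceObserver {
    override fun onDeviceCapabilityStateChange(
        device: VirtualDevice, capability: EnumDeviceCapability, state: Int
    ) {
        if (capability == EnumDeviceCapability.CAMERA) {
            // Enable the camera capability, then open the preview screen
            deviceManager.enableVirtualDevice(device.deviceId, capability, null)
            val intent = Intent(this@MainActivity, CameraActivity::class.java)
                .putExtra("cameraId", device.deviceId)
            startActivity(intent)
        }
    }
}
```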
That's it! Now the smartphone is connected with the external camera and we can work with it just like we would do with the phone's internal one. Let's see how.
For simplicity, we will use the camera in a separate Activity. An interesting use case is to put the phone in your pocket once it is connected to the external camera (mounted on a helmet or bike) while keeping the live
stream going; in that case it's better to use a Service instead, but the code is very similar.
As mentioned, once the external camera has been connected it can be used just like any other camera: it is exposed as a CameraDevice object from the Android Camera2 API.
The Activity receives the cameraId as a parameter; another property that will be useful is the range of available target frame rates. Keeping this in mind, the function to open the camera looks like this:
https://gist.github.com/5daea3bec054ffbf2299e5c7d440129f
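These are standard Camera2 calls; a sketch, assuming the Activity keeps a backgroundHandler and stores the supported FPS ranges in a fpsRanges property (both names are mine):

```kotlin
private var fpsRanges: Array<Range<Int>>? = null

@SuppressLint("MissingPermission") // CAMERA permission granted at runtime
private fun openCamera(cameraId: String) {
    val manager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
    // Query the target FPS ranges supported by the external camera
    val characteristics = manager.getCameraCharacteristics(cameraId)
    fpsRanges = characteristics.get(
        CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)
    // The external camera opens exactly like a built-in one
    manager.openCamera(cameraId, stateCallback, backgroundHandler)
}
```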
The stateCallback needs to be initialized:
https://gist.github.com/60a57ec41d46fae4f96e21d898564182
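This is the usual CameraDevice.StateCallback from the Camera2 API; a minimal version could be:

```kotlin
private val stateCallback = object : CameraDevice.StateCallback() {
    override fun onOpened(camera: CameraDevice) {
        // The camera is ready: keep a reference and start the preview
        cameraDevice = camera
        startPreview(camera)
    }

    override fun onDisconnected(camera: CameraDevice) {
        camera.close()
    }

    override fun onError(camera: CameraDevice, error: Int) {
        camera.close()
        finish() // minimal handling: leave the Activity on error
    }
}
```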
Now, the tough part. This article is not meant to be an exhaustive guide on how to work with cameras on Android; many things can be done, and my suggestion is to take a look at the official documentation.
However, it's important to note that we need to define a CaptureRequest to be used in a CameraCaptureSession.
So the startPreview(cameraDevice) looks like this:
https://gist.github.com/1fefa82e243983be3379e3b545520fc6
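A sketch of that function with standard Camera2 calls, assuming the Surface comes from the custom OverlayView introduced next (overlayView and fpsRanges are names of mine):

```kotlin
private fun startPreview(cameraDevice: CameraDevice) {
    // The Surface is provided by our custom OverlayView
    val surface = overlayView.surface
    val requestBuilder = cameraDevice
        .createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
        .apply {
            addTarget(surface)
            // Ask for the highest available target frame rate range
            fpsRanges?.maxByOrNull { it.upper }?.let {
                set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, it)
            }
        }
    cameraDevice.createCaptureSession(
        listOf(surface),
        object : CameraCaptureSession.StateCallback() {
            override fun onConfigured(session: CameraCaptureSession) {
                // Repeat the preview request until the session is closed
                session.setRepeatingRequest(
                    requestBuilder.build(), null, backgroundHandler)
            }

            override fun onConfigureFailed(session: CameraCaptureSession) {
                Log.e(TAG, "Capture session configuration failed")
            }
        },
        backgroundHandler
    )
}
```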
Since we want to overlay biometrics data on top of the video stream, we are going to use a custom SurfaceView which also acts as an OpenGL renderer. Let's call this class OverlayView and explore its main characteristics. From the previously defined startPreview(cameraDevice) function we know that the class holds a Surface; images drawn to this Surface can be attached to an OpenGL ES texture. Without going too deep into the OpenGL ES functionality, let's assume we have a draw(data: BiometricData) function that simply draws a sprite icon and a text label on the surface.
It's now time to retrieve this real time data from the smartwatch!
HUAWEI HiHealth is an open platform that facilitates data access and service aggregation for biometrics data; it consists of the HUAWEI Health app and the HUAWEI HiHealth cloud.
The HUAWEI Health app runs on the smartphone, providing users with diverse functions, such as workout data tracking, advanced fitness training, sleep monitoring, and healthy lifestyle services.
The HUAWEI HiHealth cloud provides users with secure data storage services, enabling them to upload historical data to the cloud.
In particular, we are going to use the Java APIs to retrieve data from the HUAWEI Health app, which can be synchronized in real time with the smartwatch.
The user needs to grant permission for third-party apps to access health data by accepting a dialog box. The dialog can be shown using the requestAuthorization(context, writingPermissions, readingPermissions, listener) method from the HiHealthAuth class. In our case, a simple method would be this: https://gist.github.com/89c3db9ecd44b61f11f5a1431d5a4397
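A sketch of such a method follows. The permission constants and listener type below are indicative names only; take the exact ones from the HiHealth permission list:

```kotlin
// Sketch: permission constants and listener type are indicative,
// check the HiHealth documentation for the exact identifiers
private fun requestHealthPermissions(context: Context) {
    val writePermissions = intArrayOf() // we only read data
    val readPermissions = intArrayOf(
        HiHealthKitConstant.DATA_TYPE_REALTIME_HEART_RATE,
        HiHealthKitConstant.DATA_TYPE_REALTIME_SPORT
    )
    HiHealthAuth.requestAuthorization(
        context, writePermissions, readPermissions,
        object : IAuthorizationListener {
            override fun onResult(state: Int, message: String) {
                // state tells whether the user granted access
            }
        })
}
```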
In particular, we are asking the user to let our app access the heart rate and other real-time sport data. All the possible read/write permissions are listed here.
As a starting point, let's see how to read heart rate information. Additional data can be retrieved with a similar approach.
It's time to meet our special guest, the HiHealthDataStore. This object is the main interface between our app and user's data.
It has two convenient methods that we are going to use: startReadingHeartRate and stopReadingHeartRate. Since the results are in JSON format, it's common to define a data class to store the information and take advantage of the Gson deserializer: https://gist.github.com/4c5055e351efa0ea81fd748881b938dd
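Such a data class could look like this; the field names are illustrative and should be matched to the JSON payload you actually receive:

```kotlin
// Field names are illustrative -- align them with the JSON
// payload returned by startReadingHeartRate
data class HeartRateData(
    @SerializedName("hr_info") val heartRate: Int,
    @SerializedName("time_info") val timestamp: Long
)
```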
Getting the desired data now looks like this: https://gist.github.com/1adbfbb8c060b016f7b78717ed2e56d7
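As a hedged sketch, assuming a HeartRateData class like the one just described and a hypothetical heartRateLiveData used to notify observers (the listener type is indicative):

```kotlin
// Sketch: the listener type and JSON shape are indicative
HiHealthDataStore.startReadingHeartRate(context, object : HiRealTimeListener {
    override fun onResult(state: Int) {
        // Subscription state (e.g. success or failure)
    }

    override fun onChange(state: Int, value: String) {
        // value is a JSON string: deserialize it and notify the overlay
        val data = Gson().fromJson(value, HeartRateData::class.java)
        heartRateLiveData.postValue(data.heartRate)
    }
})
```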
The heartBeatRate value can be consumed by an observer that will notify our OverlayView, which will eventually draw the data on screen.
Another interesting method is startRealTimeSportData; it can be used in a similar way with a HiSportDataCallback to obtain a Bundle of data including speed, calories burnt, altitude and many others. A complete list of all the keys can be found here.
This is the end of our journey! If you want to see a wonderful example, check out the Drift Life app, which fully leverages DV Kit and HiHealth to bring an innovative experience to its users.
By the way, this is just one of the possible scenarios that can be realized using the newest HMS kits.
A further step could be real-time processing of the video stream using HiAI, Huawei's powerful tool for running neural networks on device. If the smartphone is going to be the center of a connected smart environment, we'd better learn how to fully leverage the potential of its "brain"!
I hope this article has intrigued you; if you are intrepid enough to try developing an app that uses these technologies, do not hesitate to contact me!