About a year ago I had the idea to try out Flutter, Google’s UI framework for building cross-platform (desktop, mobile & web) applications from a single code-base. As Flutter application development was all in Dart, I took this as an opportunity to learn a new programming language as well.
Originally I had in mind to develop an NDI Camera application for Android devices, as (prior to late 2020) the official NDI Camera application was pulled from the Google Play Store. Whilst I had the old APK file from a system backup, I wanted to make my own application that I could develop and tailor to my needs. (EDIT: Here’s a link to the latest NDI|HX Camera)
Having never used Flutter / Dart before, my first attempts at doing anything were quite futile; as I had to learn the syntax and nuances of the workflow. After a day or two of wandering aimlessly, I decided to pause the project and do other work until my brain subconsciously clicked with what to do.
Prior to the release of NDI 5 (mid-2021), the NDI SDK for ARM devices only supported the sending of NDI streams, not the receiving / decoding of them; however, with the release of the NDI 5 SDK, ARM devices became able to both send AND receive sources.
Fast forward to July 2021. After countless video productions and live-streams as a consequence of COVID-19 restrictions, I decided to pick up this project again. With the release of the NDI|HX Camera and the NDI 5 SDK, I shifted focus to building an NDI Receiver - as I was unaware of any existing solution.
The Sienna NDI Monitor for iOS is AU $14.99, and works somewhat “okay”.
Regardless, there was no Android version.
There were quite a few hurdles to overcome, especially with picking up a new framework and language.
Setting up the Environment
Installing Flutter Dependencies
It was quite simple to set up the developer environment for Flutter! The Flutter SDK contains a diagnostic tool (flutter doctor) that helps to check that everything needed is installed and configured correctly.
My installation process was basically a process of running the diagnostic tool, and installing the packages/libraries/etc that it told me to.
Development of Flutter seems to be primarily done in IntelliJ / Android Studio, or in Visual Studio Code, through the official Flutter extension. This provides a bunch of development features like IntelliSense, Debugging, Profiling, Logging, etc.
When initialising Flutter starter code for Android, you have a choice between Kotlin and Java; however, after being unable to set up a working environment for Kotlin (code formatting wasn’t working?), I reverted to using Java instead, bit of a bummer. (More on why we need to write Kotlin/Java code later)
When it came to building my Flutter project, I kept running into compile/build issues. Yet, my original code base (using whatever Flutter / Dart / ??? version was available in 2020) was able to build successfully.
Eventually I found a GitHub issue which outlined my exact problem: it seemed to occur when the codebase was not on an NTFS file system (when building on Windows). This made sense, as I had often run into issues with Node.js and npm when trying to install packages on my D: drive, which was formatted as exFAT (which does not implement every filesystem feature). After moving my project to my C: NTFS drive, all was well - it was probably some symbolic link issue..?
Hopefully the developers of Flutter (Google) fix this issue in the future.
Whilst Flutter is able to build cross-platform applications, my primary target platform is Android devices; in the future I would definitely like to make this application work on other platforms too.
To debug Android applications, you can either use an Android device emulator (AVD) or connect the computer to an actual Android device using the Android Debug Bridge (ADB). I presumed that the latter was a better idea, as I know that my application was more complicated than a simple mobile application (read below: Native Implementations)
To my future self who will probably forget how to use ADB again.
0. Enable USB debugging in the Android device’s developer options
- Connect to a device over WiFi / TCP (ensure the computer is authenticated to the device first):
adb tcpip 5555
adb connect <ipAddress>:5555
- Launch a shell: adb shell
- Uninstall a package: adb uninstall <package>
- Install a package: adb install <packageFilePathOnYourComputer>
scrcpy is an application that allows you to remotely mirror your Android device’s screen, as well as to remotely control it! Run it with no arguments to connect to the default device. This makes it easy to interact with your phone from the convenience of your computer!
Useful because half the time I don’t know where my phone is…
I don’t know why I keep doing this, but I keep doing projects that require low-level interaction as my ‘starter project’… Throwing myself into the deep end eh.
In order to use the NDI SDK, I had to find out how to communicate with the NDI libraries within the Flutter application, which led to many rabbit holes.
FFI (Foreign Function Interface)
In order to call the NDI library functions (from here on, “native code”), we need to create a Foreign Function Interface: an abstraction layer that represents the native code as Dart functions which can be used in the application. I decided to write the FFI bindings as a separate plugin (or package?), so that I can reuse them for any future NDI-enabled Flutter application.
flutter-ndi is available on GitHub.
Creating FFI bindings generally requires looking at the library headers and reimplementing the functions and datatypes with their native Dart equivalents, which can be quite tedious. Thankfully there is an official tool called ffigen which is able to automatically generate the bindings.
ffigen wasn’t perfect: the union datatypes in the NDI SDK’s C headers weren’t accounted for, and I lost some structure member definitions. I decided to modify the NDI SDK C headers that I had committed into the flutter-ndi repo, so that any future FFI generation would keep my patches.
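For a sense of what such bindings look like, here is a minimal hand-written sketch of a single binding. NDIlib_initialize is a real NDI SDK function, but the library filename and loading logic below are my assumptions about how the .so is bundled; ffigen generates the full set of bindings in much the same shape.

```dart
import 'dart:ffi';
import 'dart:io';

// C signature: bool NDIlib_initialize(void);
typedef _NdiInitNative = Bool Function();
typedef _NdiInitDart = bool Function();

// Assumption: on Android the NDI library is bundled as libndi.so in the
// APK's jniLibs; elsewhere we look it up in the current process.
final DynamicLibrary _ndi = Platform.isAndroid
    ? DynamicLibrary.open('libndi.so')
    : DynamicLibrary.process();

// Look up the symbol and expose it as an ordinary Dart function.
final _NdiInitDart ndiInitialize =
    _ndi.lookupFunction<_NdiInitNative, _NdiInitDart>('NDIlib_initialize');
```

Once bound, the native function is called like any other Dart function: `ndiInitialize()`.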
ABI (Application Binary Interface)
Whilst the FFI provides the Dart bindings to use native code, I also had to include the libraries themselves into the project (and ultimately into the APK file).
I looked through heaps of StackOverflow pages without much success.
Many replies suggested putting your additional files into a certain directory and including it within the Gradle properties, however this seems to only work for Java libraries. Other replies suggested putting my libraries in the jniLibs/ABI directory, but this didn’t seem to work until I found this post, which said to put the files in jniLibs/<ABI>, where <ABI> is a placeholder for the device architecture (e.g. arm64-v8a). It worked. Ohhhhhhhhhhhhhhhhhhhhh.
To my surprise, putting the library files in the flutter-ndi plugin source still copied the libraries into the application, despite some people commenting that I had to copy the library files into the application’s ABI directory.
“Because Android handles discovery differently than other NDI platforms, some additional work is needed. The NDI library requires use of the “NsdManager” from Android and, unfortunately, there is no way for a third-party library to do this on its own. As long as an NDI sender, finder, or receiver is instantiated, an instance of the NsdManager will need to exist to ensure that Android’s Network Service Discovery Manager is running and available to NDI”
Whilst the NDI native code bindings were working, the NDI SDK documentation also stated that I had to initialise the Network Service Discovery Manager for Android devices. In order to do this, I learned about Platform Channels.
Platform Channels are a way for the Flutter application to call functions that are implemented differently depending on platform (i.e getting the battery level on an Android device would involve different code as compared to an iOS device)
As NsdManager is specific to Android systems, Platform Channels were the way to implement this functionality: when the application setup runs on an iOS device I can simply do nothing, whilst for Android devices I perform the necessary setup.
Platform-specific code for Android is written either in Java or Kotlin (for iOS it is Objective-C or Swift), and as mentioned earlier I opted to use Java as the Kotlin code was giving me issues within VS Code.
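A rough sketch of what the Dart side of such a Platform Channel can look like; the channel name and method name here are made up for illustration, not the actual names used in aNDI.

```dart
import 'dart:io';
import 'package:flutter/services.dart';

// Hypothetical channel name; the Java side must register the same one.
const MethodChannel _nsdChannel = MethodChannel('andi/nsd');

// Kick off Android's Network Service Discovery Manager before any NDI
// sender/finder/receiver is instantiated; no-op on other platforms.
Future<void> ensureNsdManagerRunning() async {
  if (!Platform.isAndroid) return;
  await _nsdChannel.invokeMethod<void>('startDiscovery');
}
```

The matching Java side registers a MethodChannel under the same name and, on receiving the call, obtains the NsdManager system service so that it exists for the lifetime of the NDI usage.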
FFI vs Platform Channels - My understanding
Whilst developing aNDI I was unsure if my FFI implementation should have been utilised as part of a Platform Channel interface; however, I decided against it for the following reasons:
- FFI bindings interact with native code / machine code (i.e. assembly)
- Platform Channels interact with platform-specific code (i.e. Java / Objective-C)
- FFI bindings are part of the Dart language
- Platform Channels are part of the Flutter framework, which itself uses Dart
Basically, my conclusion was that implementing the FFI within a Platform Channel would be inefficient as there is extra overhead to call a function.
To get the application working, I knew that I would need some way to run multiple pieces of code in parallel (regardless of whether that parallelism is legitimate or faux). This led me to learn about futures (async), isolates, and computes.
I found this YouTube video which provided a good introduction to Isolates (Note: IPA ˈaɪsəˌləts), the Dart version of a thread. From what I understand, Isolates are the primitives for multi-threaded operation. I was able to spawn an Isolate to act as the NDI receive loop, so as not to interfere with Flutter’s frontend event loop.
A gotcha with Isolates is that they only allow primitive data types (and, under specific circumstances, copies of objects) to be passed between Isolates, which is facilitated through the use of SendPort / ReceivePort pairs. Of the two, only the ReceivePort has a close method, and neither has a concrete closed event, which means that a sender cannot close the port (possibly for good reason). As a SendPort can itself be sent as a message, bi-directional Isolate communication can be established, which allowed me to perform some witchcraft to send a close instruction to the NDI receive loop.
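The handshake can be sketched roughly like this: the spawned Isolate sends its own SendPort back to the main Isolate, which can then use it to deliver a close instruction (the message value and function names are illustrative).

```dart
import 'dart:isolate';

// Entry point of the spawned Isolate (stand-in for the NDI receive loop).
void receiveLoop(SendPort toMain) {
  final fromMain = ReceivePort();
  toMain.send(fromMain.sendPort); // hand our SendPort back to main
  fromMain.listen((message) {
    if (message == 'close') {
      fromMain.close(); // the loop shuts *itself* down; main cannot
    }
  });
}

Future<void> main() async {
  final fromIsolate = ReceivePort();
  await Isolate.spawn(receiveLoop, fromIsolate.sendPort);
  // First message is the Isolate's SendPort, completing the handshake.
  final toIsolate = await fromIsolate.first as SendPort;
  toIsolate.send('close'); // the "witchcraft" close instruction
  fromIsolate.close();
}
```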
Computes seem to be a short-lived implementation of Isolates, designed to “compute” some heavy calculation / operation on a separate thread whilst the Flutter application does other things. It’s basically a multi-threaded async-await I guess?
I’m currently (as of 13th August 2021) using a Compute to handle the conversion of raw image data (RGBA / RGBX) into a bitmap, however I believe this is currently the bottleneck of my application, as each computation takes more than 16 ms (at 60 FPS, 1/60 s ≈ 16 ms).
Consequently the decoding operation is extremely slow (giving performance of only around 2 FPS). In addition, the slow processing speed means that multiple Isolates would be spawned, and compete for resources - eventually causing an Out Of Memory Error (OOME) and crashing the application.
My way around this is to have a remedial lock variable that only starts a Compute if no other Compute is running, however I believe this is rather un-ideal. Another idea was to add new frames into a queue, however this would immediately cause sync drift alongside huge memory requirements (and an inevitable OOME). I think the best method would be to write / find some native code that can convert the raw image data into a bitmap format.
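The lock-variable workaround looks roughly like this; the names are illustrative and the actual conversion logic is elided.

```dart
import 'dart:typed_data';
import 'package:flutter/foundation.dart';

// Hypothetical converter; the real RGBA/RGBX-to-bitmap code lives elsewhere.
Uint8List rgbaToBitmap(Uint8List rgba) => rgba;

bool _converting = false;

// Remedial lock: drop incoming frames while a conversion is running, so
// Computes don't pile up, compete for resources, and eventually OOME.
Future<Uint8List?> onFrame(Uint8List rgba) async {
  if (_converting) return null; // frame dropped
  _converting = true;
  try {
    return await compute(rgbaToBitmap, rgba);
  } finally {
    _converting = false;
  }
}
```

Dropping frames keeps memory bounded at the cost of frame rate, which is why queueing (and its sync drift) was rejected.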
Better yet would be to find a way to turn the frames into a “video stream” of sorts, as I’m not sure how well Flutter’s Image widget can handle fast and continuous frame updates (at a rate of 30 or 60 FPS).
Dev Rants and Rookie Mistakes
The biggest issue I faced was that my application only behaved correctly when I launched the application from the computer in debug mode - when launching the application in release mode I was not detecting any NDI sources???
It eventually turned out to be some issue with the NDIlib_find_get_current_sources function, which kept returning false despite there being detected sources. I assumed that it might have been some sort of timing / race-condition issue: the application executes more slowly in debug mode, giving source discovery enough time to finish before the function is called, whereas the release build calls it before any sources have been detected… but yeah, idk.
In terms of rookie mistakes, I forgot to add the android.permission.INTERNET permission to the manifest file for the release version. Simple fix, and I had network connectivity!
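For reference, the permission is a one-line addition to AndroidManifest.xml:

```xml
<!-- Required for any network access in release builds; debug builds
     get it implicitly via the Flutter debug tooling. -->
<uses-permission android:name="android.permission.INTERNET" />
```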
But for now, after a week’s worth of working on this project; I finally have some sort of minimal viable product, yay!
I plan to still add camera functionality in the future, but also add some extra tools like a Test Pattern generator, Screen Capture utility, maybe a Title generator too; who knows!
Flutter / Dart
Although I don’t work with front-end as much as I work with back-end, I found developing applications in Flutter quite similar to how one might design the UI in React (albeit with class components rather than functional components).
I also enjoyed the language syntax! Oh how I miss the nullish coalescing operator 😇.
The cascade notation is also kinda cool though I haven’t fully wrapped my head around it yet.