Nanjing is a software solution for the events industry that aids technicians in assembling large-scale screens for festivals such as Tomorrowland or conferences such as the Samsung keynote.
One of the biggest problems in the events field is the logistics of transporting and assembling audio and video equipment. On the video side, festivals such as Tomorrowland use screens measuring 40 by 40 metres, and these screens have to be transported, installed and maintained.
Screens destined for festivals are composed of LED panels of roughly 500x500 millimetres. These panels are daisy-chained to one another so that the video signal can propagate through their internal wiring.
To distribute the video signal to these panels, multiple distroboxes (signal distribution boxes) are used, each connected in turn to a video processor. This video processor is the heart of the show: it is the point at which the A/V engineers intervene to adjust panel-related settings such as colour calibration or signal routing.
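The topology above implies a simple piece of routing arithmetic: since panels are daisy-chained per distrobox port, a panel's linear index determines which box, port and chain position feed it. A hypothetical sketch of that mapping (the capacity constants and names are ours, not the real hardware limits):

```typescript
interface Route {
  distrobox: number; // which distribution box feeds this panel
  port: number;      // which output port on that box
  hop: number;       // position in the daisy chain on that port
}

const PANELS_PER_PORT = 8; // assumed chain length per port
const PORTS_PER_BOX = 4;   // assumed output ports per distrobox

// Map a panel's linear index to the (distrobox, port, hop) that feeds it.
function routeFor(panelIndex: number): Route {
  const chain = Math.floor(panelIndex / PANELS_PER_PORT);
  return {
    distrobox: Math.floor(chain / PORTS_PER_BOX),
    port: chain % PORTS_PER_BOX,
    hop: panelIndex % PANELS_PER_PORT,
  };
}
```

With these illustrative numbers, panel 35 sits three hops down the first port of the second distrobox.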
The Nanjing project lives inside the image processor hardware and runs on a quad-core ARM processor with 1 GB of RAM.


Since the Nanjing solution was destined to run in a very constrained environment, we started by benchmarking multiple technology stacks for performance, in order to find the stack best able to handle enterprise-tier features.
The project itself was a challenge in terms of both performance and user experience.
The UI had to let users easily manipulate and map visual representations of LED panels, which in real life can exceed 1000 in number. This meant creating a kind of "work area" where the A/V engineers could place panels in their desired slots and then send video signals to them. Render speed thus became important.
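The work-area idea can be sketched as a grid of slots, each of which maps to a region of the video signal. The panel resolution and function names below are assumptions for illustration; a sparse map keeps lookups cheap even with 1000+ placed panels:

```typescript
const PANEL_PX = 128; // assumed pixel resolution per panel edge

interface Slot { col: number; row: number; }

// Top-left pixel of the video signal that a given slot should display.
function slotToOrigin(s: Slot): { x: number; y: number } {
  return { x: s.col * PANEL_PX, y: s.row * PANEL_PX };
}

// Sparse placement map: only occupied slots consume memory,
// which matters when the workspace can hold over 1000 panels.
const placements = new Map<string, Slot>();

function placePanel(panelId: string, slot: Slot): void {
  placements.set(panelId, slot);
}
```

Placing a panel in slot (2, 3) would have it display the signal region starting at pixel (256, 384) under these assumptions.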
Because most A/V engineers were used to working on Macs, we designed the UX according to the Apple Human Interface Guidelines.
The application logic layer had to allow realtime updates to multiple panel attributes. For this we based all communication on WebSockets, using our own client implementation (at the time there was no non-blocking STOMP client implementation for Spring).
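The realtime flow described here amounts to the server pushing per-panel attribute patches over the socket, which the client folds into its local state. A minimal sketch of that idea; the message shape, attribute names and defaults are our assumptions, not the actual wire protocol:

```typescript
interface PanelState { brightness: number; gamma: number; }

// A patch carries only the attributes that changed.
interface PanelUpdate {
  panelId: string;
  patch: Partial<PanelState>;
}

// Fold one socket message into the client-side panel state.
function applyUpdate(
  state: Map<string, PanelState>,
  msg: PanelUpdate,
): void {
  const prev = state.get(msg.panelId) ?? { brightness: 100, gamma: 2.2 };
  state.set(msg.panelId, { ...prev, ...msg.patch });
}
```

Sending partial patches rather than full panel objects keeps messages small, which matters when many panels are recalibrated at once.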
Once the feasibility study was complete, we set up a development process rooted in the Agile methodology to ship the actual solution.



Spring WebFlux as a fast, non-blocking solution that interacts with the low-level FPGA code to manipulate panel states


Angular 6 as a client SPA consuming a socket-based API


Redis as a fast key-value store kept in sync with the panels


D3.js for building the user "workspace", allowing operations such as zooming, panning, and dragging and dropping panels to draw shapes
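The pan-and-zoom behaviour D3 provides for such a workspace boils down to an affine transform (a scale plus a translation) applied to every panel's coordinates. A simplified model of what d3.zoom's transform computes, including the inverse needed to work out which slot a dragged panel was dropped on:

```typescript
// Mirrors the {k, x, y} shape of a d3.zoom transform:
// k is the scale factor, (x, y) the translation.
interface Transform { k: number; x: number; y: number; }

// Workspace coordinates -> screen coordinates.
function applyTransform(t: Transform, px: number, py: number): [number, number] {
  return [px * t.k + t.x, py * t.k + t.y];
}

// Screen coordinates -> workspace coordinates, e.g. to resolve
// the slot under the cursor at the end of a drag.
function invertTransform(t: Transform, sx: number, sy: number): [number, number] {
  return [(sx - t.x) / t.k, (sy - t.y) / t.k];
}
```

Because the transform is invertible, hit-testing against 1000+ panels can run in workspace coordinates regardless of the current zoom level.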


To continuously measure for performance issues, we created a set of socket-based Gatling tests that Jenkins CI would run against an environment each time a merge request was opened.