Blog Posts

Week 27: Home stretch!

SyncAssist in our weekly liaison meeting

We’re finally in the home stretch! This week has been all about FDR preparation and final bug fixes for SyncAssist.

Last week, we mentioned some strange things happening with the audio quality on mobile. It turns out that when screen-sharing to mobile, the screen audio is duplicated (which we observed last Friday with Sriram) — but this only happens in P2P mode. As soon as a third person joins the call, or if the P2P session initiation fails, the duplicate stream vanishes. After extensive testing, the issue seems to lie in how lib-jitsi-meet handles the switch between P2P and the Videobridge, but until we can narrow down the bug further and patch it, we at least have a temporary workaround: disabling P2P connections on mobile.
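For the curious, Jitsi Meet exposes a `p2p.enabled` flag in its conference options, so the workaround amounts to a small platform check when building those options. A minimal sketch (the helper name and the mobile check are illustrative, not our exact code):

```typescript
// Sketch of the workaround: build conference options that disable direct
// P2P connections when the local client is on mobile. The p2p.enabled
// flag mirrors Jitsi Meet's config option; the helper name and the
// shape of the surrounding code are illustrative.

interface ConferenceOptions {
  p2p: { enabled: boolean };
}

function buildConferenceOptions(isMobile: boolean): ConferenceOptions {
  // Forcing the Videobridge path on mobile sidesteps the duplicated
  // screen-audio stream that shows up during P2P sessions.
  return { p2p: { enabled: !isMobile } };
}
```

With `p2p.enabled` set to false, the session always relays through the Videobridge, which is exactly the condition under which the duplicate stream disappears.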

We’ve also made progress in keyboard remote control latency by switching to fully peer-to-peer data channels to send the remote control events. Now with batched keyboard actions, the remote control experience is far smoother than before, and we hope to show off these improvements at the FDR prototype showcase!
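As a rough sketch of the batching idea (the message shape, class name, and flush interval here are illustrative, not the exact SyncAssist protocol): key events queue up and are flushed over the data channel together, so a burst of typing travels as one message instead of dozens:

```typescript
// Sketch of batching remote-control key events before sending them over
// a peer-to-peer data channel. The KeyEvent shape and the 16 ms flush
// interval are illustrative assumptions, not the exact SyncAssist code.

type KeyEvent = { type: "keydown" | "keyup"; code: string; ts: number };

class KeyEventBatcher {
  private queue: KeyEvent[] = [];

  constructor(
    // e.g. (batch) => dataChannel.send(JSON.stringify(batch))
    private send: (batch: KeyEvent[]) => void,
    private flushMs = 16, // roughly one flush per rendered frame
  ) {}

  push(event: KeyEvent): void {
    if (this.queue.length === 0) {
      // Arm the flush timer only when the queue was empty, so a lone
      // keystroke still goes out within flushMs.
      setTimeout(() => this.flush(), this.flushMs);
    }
    this.queue.push(event);
  }

  flush(): void {
    if (this.queue.length === 0) return;
    const batch = this.queue;
    this.queue = [];
    this.send(batch); // one data-channel message instead of one per key
  }
}
```

Sending key-down and key-up as separate events also lets the host reproduce held modifiers (like Shift or Alt) faithfully.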

On the documentation side, we’ve finished our first draft of the FDR report and are working on revisions before sending it off to our liaisons for approval. We also gave our peer-reviewed FDR presentation, and (somewhat to our surprise) it was well received! We’re planning on making some tweaks to the presentation to make the takeaways of the graphs and charts clearer, but other than that, we feel pretty confident in our final days leading up to the FDR!

Week 26: So close, but so far away

Team 5 (minus Jose and Reneca) with Sriram at Kinya Ramen

Though we’re less than two weeks away from FDR, it seems like Team 5 still has a lot of work to do! This week, our liaison Sriram came into town to see a live demo of our mobile app, and unfortunately it didn’t exactly go as planned. Since we’d just gotten the iOS app to work on a physical phone, we hadn’t had much time to try out how the native screen reader works with our app — which Sriram immediately took note of. On top of that, the keyboard and audio lagged significantly during mobile remote control, and the audio quality was not exactly the greatest.

On the bright side, though, we’ve made some progress in other areas — namely, in gathering some metrics for latency, bitrate, and stress level for our server as the number of participants grows. Here are some graphs to give you a sense of the relationship between these three things:

It seems like the stress level (or percent CPU usage) of the Jitsi Videobridge grows linearly with the number of video/audio-sharing participants, and as the stress level rises, the round-trip time also increases linearly. Besides these metrics, Sriram also asked us for clarification on how to scale the SFUs and for cost-per-participant metrics for a given SFU setup, so we’ll work on getting these benchmarks for him along with fixing the miscellaneous bugs by the time FDR rolls around.

Week 25: Demo day debugging

SyncAssist taking down equipment after Prototype Inspection Day

Things are heating up for SyncAssist! With only two and a half weeks left until our Final Design Review, we definitely have our work cut out for us. This week, we showcased our app at Prototype Inspection Day, and despite testing our app quite thoroughly the day before, we ran into some unexpected bugs during the actual live demo. Fortunately, we were able to recover from these issues relatively quickly and resolve them before the next round of judges came, so our PID was overall a success!

On the technical side, we’ve essentially completed the mobile app, aside from testing it to ensure compatibility with iOS. Last week, we mentioned hurdles with getting keyboard-based remote control on Android to work as expected, since there were key combinations like ‘Tab’ and ‘Alt + Tab’ that were reserved for certain actions by the OS. It turns out that in order to block these, you need to create an accessibility service, which essentially grants your app full control of the device in order to help users with disabilities interact with your app. In our case, we used it to block all keyboard commands after remote control begins and instead send over the command to the remote control target — which is exactly what we want!

Other than that, we’re still in the process of stress testing our app and are having some issues linking up child nodes with the parent node to make our Selenium Grid work. Hopefully by next week, we’ll have some metrics for how CPU usage and bitrate are affected by the number of peers in a conference, but you’ll have to stay tuned until then!

Week 24: Back from break! (for possibly the last time)

SyncAssist showing off our mobile remote control progress during our weekly liaison meeting

After a restful spring break, SyncAssist is ready to finish strong! This week, we focused on merging any open pull requests from before break and fixing small bugs before our Prototype Inspection Day.

On the desktop side, we added some changes to the layout of video components and buttons to make everything fit on a screen and scale properly — you can check it out live on dev.ippd-jitsi.com! We also did some preliminary testing on how much of a load the server can handle using Selenium tests on our GCP instance, and we can reliably hold 50 users in a single meeting without using much CPU on the server. Moving forward, since the desktop app is pretty much stable now, we plan to home in on load testing and get some statistics for how CPU usage, bitrate, and number of load-generating endpoints are related.

On the mobile side, we added some gestures for switching windows — swipe up and hold, then swipe right to trigger Alt + Tab. Since we don’t want to make the gestures too complicated, we currently plan to limit the gestures to the following:

  1. Swipe right: Tab
  2. Swipe left: Shift + Tab
  3. Swipe up + swipe right: Alt + Tab
  4. Tap: Enter
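
The mapping above boils down to a small lookup table on the controller side. A sketch, with illustrative gesture identifiers and key-combo shape (not the exact SyncAssist code):

```typescript
// Sketch of the gesture-to-shortcut lookup used for gesture-based
// remote control. Gesture names and the KeyCombo shape are illustrative.

type Gesture = "swipeRight" | "swipeLeft" | "swipeUpRight" | "tap";
type KeyCombo = { key: string; modifiers: string[] };

const GESTURE_SHORTCUTS: Record<Gesture, KeyCombo> = {
  swipeRight:   { key: "Tab",   modifiers: [] },        // focus next element
  swipeLeft:    { key: "Tab",   modifiers: ["Shift"] }, // focus previous element
  swipeUpRight: { key: "Tab",   modifiers: ["Alt"] },   // switch windows
  tap:          { key: "Enter", modifiers: [] },        // activate element
};

function shortcutFor(gesture: Gesture): KeyCombo {
  return GESTURE_SHORTCUTS[gesture];
}
```

Keeping the table this small is deliberate: each new gesture has to earn its place against the cost of the user memorizing it.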

For more complicated shortcuts, the user can just use an external keyboard instead (since the number of use cases for purely gesture-based remote control probably isn’t that large). We are having a bit of trouble getting the keyboard hook to work properly on Android, since it seems that the system handles global shortcuts before they ever reach our process. Fortunately for us, we found out that shortcuts don’t do anything on iOS 🎉 so we’ll be able to just pass the keys directly to the host.

Stay tuned for updates on PID!

Week 23: Mobile gestures and keyboard pressers

SyncAssist coding on our free work day!

This week for SyncAssist has been all about mobile remote control! Based on our discussion with our liaison engineers on how mobile remote control should work, we’ve since worked on two main modes of control: gesture control, and keyboard control.

In order to be able to navigate a remote desktop using only touch gestures, we decided that we would at minimum need to simulate the Tab, Shift + Tab, and Enter key combinations so that the user could focus on elements and select them. Thus, we mapped these shortcuts to the ‘swipe right’, ‘swipe left’, and ‘tap’ gestures, respectively. We are in the process of simulating additional shortcuts like Alt + Tab for switching applications by using more complex gestures, such as a swipe up and hold, then swipe right and release, but we also recognize that there’s a fine line between an additional gesture being useful and being too complex to use.

On the keyboard side, we’ve been able to map all the keys from external keyboards to be properly handled by the controller and sent to the host! Unfortunately, we’ve encountered a similar issue as we have with desktop remote control — we’ll have to write native code to block any keyboard inputs received by the controller from being simulated, because certain shortcuts — like Alt + Tab — are protected. And though this will probably be pretty straightforward on Android, in typical Apple fashion we’re doubtful that we’ll be able to block any system-level shortcuts on iOS. If this really is the case, then we might only be able to allow full remote control on Android devices, and gesture-based control on iOS.

Regardless, we’ve made a ton of progress in the past few weeks and will be enjoying our well-deserved spring break! Until next time!

Week 22: Mobile, mobile, mobile

The current state of SyncAssist’s mobile app

This week, SyncAssist has gone all in on mobile! We were able to implement video tracks and audio tracks in the app, allowing users to send and receive media streams from other users, as well as the chat and participants list. With this work done, we are now ready to tackle mobile remote control — the most important part of the app. Initially, we thought that the implementation of this feature would be fairly similar to how we accomplished it in the desktop app. However, it seems there may be more to it.

The first potentially tough aspect is the multimodal nature of interactions with a phone. We need to be able to handle both touch gestures and events coming from peripheral devices like a Bluetooth keyboard. Handling remote control when a user has an external keyboard connected should be fairly simple — they should be able to navigate the remote desktop using the keyboard just as they would be able to if it were their own desktop. But what if the user doesn’t have an external keyboard attached? Should they navigate the screen in the same way as they would with their phone with VoiceOver/TalkBack on?

The second aspect is handling the interaction between JAWS and the VoiceOver/TalkBack screen readers. With the way that our screenshare is set up now, if the user already had VoiceOver or TalkBack running during remote control, then they would hear both the feedback from JAWS on the remote desktop and VoiceOver/TalkBack locally. Of course, this isn’t ideal. How do we make sure that the user gets all the information needed from JAWS for the remote desktop, but is also able to navigate the mobile app locally with VoiceOver/TalkBack?

These are questions that we’ll be figuring out over the next week as we implement remote control, and we’ll come back with updates in the next blog post!

Week 21: Working hard or hardly working?

SyncAssist’s web app deployed on dev.ippd-jitsi.com!

This week, SyncAssist has been hard at work. On Tuesday, we presented our current prototype and testing plan to the QRB 2 committee, where we received valuable feedback from our coach Dr. Grant and other IPPD coaches about our scope of work and our presentation skills. By the Prototype Inspection Day, we’ll be sure to have all their advice internalized and nail that demo!

On the development side, we’ve made strides in several areas. First, I (Mai) was able to finally deploy our website to the AWS server, so now anyone can access it at dev.ippd-jitsi.com! The deployment is automated so that any time we make a change to the main branch of our repo, the website is automatically updated.

With the website deployed, Lucas and Johnny were able to set up some tests using Selenium to simulate users joining a meeting. We plan to use these tests to see how many users can join without audio/video quality degrading or straining the server too much and then report these findings back to our liaisons.
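Each simulated user is essentially a headless Chrome instance pointed at the meeting URL with Chromium’s fake-media switches, which let getUserMedia return synthetic audio/video without any real hardware. A sketch of the per-user setup (the URL pattern and helper function are illustrative; the flags themselves are real Chromium switches):

```typescript
// Sketch of how each simulated participant's browser is configured for
// the Selenium load tests. The meeting URL pattern and helper name are
// illustrative; the flags are real Chromium command-line switches.

interface SimulatedUser {
  url: string;
  chromeArgs: string[];
}

function simulatedUsers(baseUrl: string, room: string, count: number): SimulatedUser[] {
  const args = [
    "--headless=new",
    "--use-fake-ui-for-media-stream",     // auto-grant the mic/camera permission prompt
    "--use-fake-device-for-media-stream", // synthetic media instead of real devices
  ];
  return Array.from({ length: count }, (_, i) => ({
    url: `${baseUrl}/${room}`,
    chromeArgs: args,
  }));
}

// Each entry would then be passed to selenium-webdriver's Chrome options
// (e.g. chrome.Options().addArguments(...args)) before driver.get(url).
```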

On the mobile side, Anh has been working to add video tracks using the react-native-webrtc library, and we are now able to add the mobile user’s camera to the conference! We’re having a bit of trouble getting the remote tracks to display properly, but we’ll continue debugging and report back next week. And over the next few days, the whole team will be renewing our focus on mobile development, hopefully getting to remote control, the main feature that our liaisons are requesting from us.

On the topic of remote control, we’ve encountered some strange issues regarding global shortcuts. Jose has been working to add a C++ native hook to block global shortcuts on the controller’s side, but found that having a single local audio track makes the hook break. After hours of debugging, we finally found the issue: getUserMedia in Chromium blocks the native hook when an audio channel is active. Fortunately, since other developers have encountered this issue before, we were able to replicate their solution and are now able to disable native Windows shortcuts!

That’s all for now, until next time!

Week 20: On the up and up

SyncAssist (minus Lucas) with Sriram Ramanathan and Ryan Jones of Vispero at Yummy House

After last week’s fiasco, we figured that things could only get better from here on out. And better indeed they got! This week, we homed in on improving the performance of our app by fixing state-related bugs and optimizing the resolution and frame rate of our media streams as much as possible. With these improvements, the baseline audio/video/remote control experience of our app is far better than before.

To improve our video frame rate and resolution, we had to experiment with both the desktop-sharing frame rate and the sender/receiver video resolution constraints. The lib-jitsi-meet library conveniently provides functions to handle these, so we don’t have to modify anything on the server side, but unfortunately its documentation for them is not so great. After countless trials and a long back-and-forth with the maintainers of lib-jitsi-meet, we finally landed on cranking the video resolution up to 2000 and setting the frame rate to 30 fps, which maintained the right balance.
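For reference, the settings we landed on can be sketched against a minimal interface for the conference methods we rely on. The method names follow lib-jitsi-meet’s API, but treat the exact signatures as an assumption and double-check the library before copying:

```typescript
// Sketch of the quality settings, written against a minimal interface
// for the slice of lib-jitsi-meet's conference API we use. Method names
// follow the library, but the exact signatures are an assumption here.

interface ConferenceLike {
  // Caps the frame height (in pixels) of the video we send.
  setSenderVideoConstraint(maxFrameHeight: number): void;
  // Asks remote senders for at most the given resolution.
  setReceiverConstraints(constraints: { defaultConstraints: { maxHeight: number } }): void;
}

const TARGET_HEIGHT = 2000; // the resolution that balanced sharpness vs. bandwidth
const TARGET_FPS = 30;      // the desktop-sharing frame rate we settled on

function applyQualitySettings(conference: ConferenceLike): void {
  conference.setSenderVideoConstraint(TARGET_HEIGHT);
  conference.setReceiverConstraints({ defaultConstraints: { maxHeight: TARGET_HEIGHT } });
  // The TARGET_FPS cap is applied where the desktop track is created,
  // via the library's desktop-sharing frame rate option.
}
```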

For remote control, we identified several scenarios for ending remote control that weren’t being handled properly — for example, when the host leaves the conference — then went about implementing those scenarios correctly. We also optimized the remote keyboard by sending the key-down/key-up state of each key to the host rather than locking the keys, which we demonstrated to our liaisons through a typing test.

Overall, we (and our liaisons) are much happier with the current state of our app and are looking forward to seeing what’s next for it!

Week 19: Demos and Renos

The latest updates on SyncAssist’s mobile UI

Over the past week, mobile development has been going swimmingly. We managed to get the mobile development environment set up properly on all our team members’ devices and started making basic screens allowing users to make new conference rooms and join the conference. With the refactoring that we accomplished last week, along with some additional refactoring of conference functions, we’re feeling confident that we can implement the major features requested by our liaisons for mobile in a fairly short amount of time.

Unfortunately, in our quest to streamline our mobile development, we had to somewhat abandon our desktop and web app for a bit, and the repercussions of this became clear at our regular liaison meeting this week. During the meeting, we attempted to do a live demo of our app to show off our improvements in remote control latency and screen-shared video quality, but instead ended up showing off the plethora of bugs in our state management and our rather unremarkable video/audio quality.

But as our IPPD Lab Manager Larry Warren likes to say, “Bad news is good news. Good news is no news. And no news is bad news”. With these bugs exposed, we now have renewed priorities: make our baseline video, audio, and remote control experience as seamless and performant as possible.

With that in mind, we’ve refreshed our issues list to prioritize any bugs related to video/audio performance and state management. Here’s a quick sneak peek:

SyncAssist’s issue list

Next week we’ll hopefully have updates on the improved performance of our app, so stay tuned for that!

Week 18: Refactor extractor

SyncAssist’s target supported platforms (mobile, desktop and browser)

This week, SyncAssist has been doing a pretty heavy overhaul of our code! As we started our mobile development work, we realized that several of the functions and states used for conference handling would essentially have to be copied into our mobile app. To avoid this, we began by refactoring our state into global contexts, which lets us reuse it on the mobile side and avoids having to pass state down as props.
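In the app itself this lives in React contexts, but the underlying idea, one shared store that both the web and mobile UIs read and subscribe to instead of prop-drilling, can be sketched framework-agnostically (all names here are illustrative):

```typescript
// Framework-agnostic sketch of the refactor's idea: conference state
// lives in one shared store that every view subscribes to, rather than
// being threaded down through props. In the real app this is a React
// context; the state fields and names here are illustrative.

interface ConferenceState {
  roomName: string | null;
  participants: string[];
  isScreenSharing: boolean;
}

class Store<S extends object> {
  private listeners = new Set<(state: S) => void>();

  constructor(private state: S) {}

  get(): S {
    return this.state;
  }

  set(patch: Partial<S>): void {
    this.state = { ...this.state, ...patch };
    this.listeners.forEach((l) => l(this.state)); // notify every subscribed view
  }

  subscribe(listener: (state: S) => void): () => void {
    this.listeners.add(listener);
    return () => this.listeners.delete(listener); // unsubscribe handle
  }
}

// A single instance imported by both platforms' UI code.
const conferenceStore = new Store<ConferenceState>({
  roomName: null,
  participants: [],
  isScreenSharing: false,
});
```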

Along with refactoring, we’ve fixed a number of existing bugs. Here’s the list, pulled straight from our commit history: fixed the home page, fixed user joining getting a message that the host joined, disabled remote control requests for non-Electron users, patched react-native-performance, reset the screen-sharing states, fix low screen-share FPS, update logic to stop screen-sharing, fix share screen, fix div error with RemoteControl.tsx — and these are only a fraction of the commits. It’s almost like every time we create something, we introduce new bugs 😃.

Although it’s been tough trying to implement new features while fixing old bugs, we’re making slow but steady progress and are planning on having most of the mobile app features built out over the next two weeks. Stay tuned for updates on our mobile app, and until next time!