
This week, SyncAssist has gone all in on mobile! We implemented video and audio tracks in the app, allowing users to send and receive media streams from other users, and we brought over the chat and participants list as well. With that work done, we're ready to tackle mobile remote control, the most important part of the app. Initially, we thought the implementation would look fairly similar to how we accomplished it in the desktop app, but it seems there may be more to this feature than we expected.
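For readers curious about the plumbing: on the receiving side, wiring up a remote peer's media mostly amounts to attaching the incoming video track to a renderer once it arrives. Here's a minimal sketch, assuming an Android client built on the org.webrtc library; `attachRemoteTrack` is a hypothetical helper we'd call from our `PeerConnection.Observer`'s `onAddTrack` callback.

```kotlin
import org.webrtc.RtpReceiver
import org.webrtc.SurfaceViewRenderer
import org.webrtc.VideoTrack

// Hypothetical helper invoked from PeerConnection.Observer.onAddTrack
// when the remote peer's tracks arrive.
fun attachRemoteTrack(receiver: RtpReceiver, remoteVideoView: SurfaceViewRenderer) {
    when (val track = receiver.track()) {
        // Video needs an explicit sink to render into; here we use a
        // SurfaceViewRenderer already initialized with an EglBase context.
        is VideoTrack -> track.addSink(remoteVideoView)
        // Audio tracks are routed to the device's audio output automatically,
        // so there is nothing extra to wire up for them.
        else -> Unit
    }
}
```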
The first potentially tough aspect is the multimodal nature of interactions with a phone. We need to handle both touch gestures and events coming from peripheral devices like a Bluetooth keyboard. Handling remote control when a user has an external keyboard connected should be fairly simple: they should be able to navigate the remote desktop with the keyboard just as they would if it were their own machine. But what if the user doesn't have an external keyboard attached? Should they navigate the screen the same way they would navigate their phone with VoiceOver/TalkBack on?
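To make the two modalities concrete, here's a rough sketch of what that input handling could look like on Android, assuming we forward events to the controlled desktop over some data channel. `RemoteControlChannel`, `sendClick`, and `sendKey` are hypothetical placeholders for whatever protocol we end up with; only the `Activity` callbacks are real framework API.

```kotlin
import android.app.Activity
import android.view.GestureDetector
import android.view.KeyEvent
import android.view.MotionEvent

class RemoteControlActivity : Activity() {

    private val channel = RemoteControlChannel()

    // Translate local touch gestures (here, just single taps) into
    // remote mouse events.
    private val gestureDetector by lazy {
        GestureDetector(this, object : GestureDetector.SimpleOnGestureListener() {
            override fun onSingleTapUp(e: MotionEvent): Boolean {
                channel.sendClick(e.x, e.y)
                return true
            }
        })
    }

    override fun onTouchEvent(event: MotionEvent): Boolean =
        gestureDetector.onTouchEvent(event) || super.onTouchEvent(event)

    // Physical (e.g., Bluetooth) keyboard events arrive here and can be
    // forwarded to the remote desktop more or less verbatim.
    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        channel.sendKey(keyCode, event.isShiftPressed, event.isCtrlPressed)
        return true
    }
}

// Hypothetical transport; the real app would serialize these into whatever
// protocol the desktop side expects.
class RemoteControlChannel {
    fun sendClick(x: Float, y: Float) { /* serialize + send over data channel */ }
    fun sendKey(keyCode: Int, shift: Boolean, ctrl: Boolean) { /* ditto */ }
}
```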
The second aspect is handling the interaction between JAWS and the VoiceOver/TalkBack screen readers. With the way our screen share is set up now, a user who already has VoiceOver or TalkBack running during remote control hears both the feedback from JAWS on the remote desktop and VoiceOver/TalkBack speaking locally. Of course, this isn't ideal. How do we make sure that the user gets all the information they need from JAWS for the remote desktop, while still being able to navigate the mobile app locally with VoiceOver/TalkBack?
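Whatever approach we land on, one building block we'll almost certainly need is knowing whether a local screen reader is running at all. A small sketch for the Android side, using the platform's AccessibilityManager (TalkBack is the service that enables touch exploration):

```kotlin
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Returns true when a touch-exploration screen reader (in practice, TalkBack)
// is active on this device. Knowing this lets the app decide how much of the
// remote-control UI should speak locally versus defer to JAWS on the far end.
fun isLocalScreenReaderActive(context: Context): Boolean {
    val manager =
        context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    return manager.isEnabled && manager.isTouchExplorationEnabled
}
```

One idea this enables: when a local screen reader is active, we could mark the screen-share view as unimportant for accessibility (`View.IMPORTANT_FOR_ACCESSIBILITY_NO`) so TalkBack stays quiet over that region while JAWS speaks for the remote machine.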
These are questions that we’ll be figuring out over the next week as we implement remote control, and we’ll come back with updates in the next blog post!