Getting Started

This guide will help you create a simple conference application. Make sure to read the Run Time Dependencies section before getting started to note any platform-specific runtime requirements.

Contents of the SDK package

All C++ SDK packages are available on GitHub: https://github.com/DolbyIO/comms-sdk-cpp/releases.

The following table lists the contents of the C++ SDK package:

Directory   Contents
include     Header files constituting the public API
lib         Shared libraries constituting the C++ SDK
share       CMake files for building targets, samples, docs, and licenses
bin         Prebuilt sample application binaries

Note

The macOS package contains two subdirectories inside the unzipped sdk-release directory, with libraries for the respective platforms: the sdk-release-arm packages are designed for Arm architectures, and the sdk-release-x86 packages are designed for x86 architectures. To simplify this guide, we mention only the sdk-release directory. We also provide a macos64-universal.zip for download. This package contains universal binaries (both x86_64 and arm64 architectures) and can therefore be used on both Intel and Apple Silicon MacBooks.

CA certificates

The SDK uses Certificate Authority (CA) certificates to authenticate the identity of remote servers during an SSL handshake. The SDK library provides two options for selecting CA certificates:

  • The SDK library contains built-in CA certificates that are used by default.

  • The SDK library checks the value of the DOLBYIO_COMMS_CA_CERT_FILE environment variable. If the variable points to a valid certificate file, those certificates are loaded instead (see the sketch below).

The SDK does not try to use the system-installed CA certificate files.
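
A minimal sketch of the second option, assuming you want to set the variable from within the application on a POSIX platform (exporting it in the shell before launching works just as well); the certificate path is only a placeholder:

#include <cstdlib>

// Placeholder path to a PEM bundle containing the CA certificates to trust.
// Set the variable before creating the SDK so it is visible when the SDK
// selects its CA certificates.
::setenv("DOLBYIO_COMMS_CA_CERT_FILE", "/path/to/ca-bundle.pem", /*overwrite=*/1);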

Getting the access token

Applications using the C++ SDK must provide an access token when creating the SDK. The sample applications provided as a part of the sdk-release package require the access token to be provided as a command line parameter when launching the executable. To get the access token from the Dolby.io dashboard, follow these instructions: https://docs.dolby.io/communications-apis/docs/overview-developer-tools

After getting the access token, you can build and run the Desktop sample application.
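
The following sketch shows one way to create the SDK from such a token. It assumes the sdk::create factory takes the access token and a token-refresh callback; verify the exact signature against the headers in the include/ directory:

#include <dolbyio/comms/sdk.h>

#include <memory>
#include <string>

// Create the SDK from an access token. The refresh callback is invoked when
// the token is about to expire; a real application should fetch a new token
// from its own backend there and hand it back to the SDK.
std::unique_ptr<dolbyio::comms::sdk> create_sdk(const std::string& access_token) {
  return dolbyio::comms::sdk::create(
      access_token,
      [](std::unique_ptr<dolbyio::comms::refresh_token>&& refresher) {
        // Obtain a fresh token, then pass it to the SDK: (*refresher)(new_token);
      });
}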

Spatial audio conferences

If you join a spatial audio conference using the Desktop application, you need to set your spatial position. The sample application sets the default spatial position of the local participant to (0, 0, 0). For more information on interactively changing the position, refer to the interactive commands section of the sample application.
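
As a rough illustration only, setting the local participant's position could look like the fragment below. The set_spatial_position call and the spatial_position type are assumptions here, so check the conference service header for the exact names; local_participant_id stands for the participant ID obtained when opening the session, and wait() is the sample's helper for awaiting asynchronous results:

// Assumed API names, for illustration only.
dolbyio::comms::spatial_position origin{0, 0, 0};
wait(sdk->conference().set_spatial_position(local_participant_id, origin));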

Sample application

Note

Samples are only provided to illustrate programming concepts with the C++ SDK and should only be used as a design reference by developers.

A basic C++ sample application is available in the sdk-release/share/dolbyio/comms/sample/desktop_app/ directory of the package. This directory contains C++ code for the application and the respective CMakeLists.txt file, which you can use to build the application.

Note that most of the interaction with the SDK itself is located in the sdk-release/share/dolbyio/comms/sample/utilities/ directory of the package. This is still sample code, but it can be thought of as an example of a simple wrapper over the SDK.

To showcase all of the features of the SDK, the sample application can be run in two modes: Standard or Media IO. Standard mode demonstrates the standard client application workflow that uses media peripheral devices, for example, how to implement a client that connects to a spatial audio conference. The Media IO features are enabled by setting the --enable-media-io command line switch when starting the application. Our sample application links all of the additional libraries required for the Media IO features because it is a showcase application; please note that you should only link what you need in order to reduce your binary size.

All applications must perform the following steps to connect to and disconnect from Dolby.io conferences (a condensed sketch of the flow follows the list):

  1. Create and initialize the SDK. This must be done first to get access to the SDK services.

  2. Open a Dolby.io session. The session::open call returns a user_info object which contains the participant ID.

  3. Create and join a specific conference.

  4. Leave the conference.

  5. Close the Dolby.io session.
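
The fragment below condenses steps 2-5, assuming the SDK instance was created as shown earlier and that asynchronous results are awaited with the sample's wait() helper; the option structs and field names should be verified against the headers in include/:

// 2. Open a Dolby.io session; the returned user_info carries the participant ID.
dolbyio::comms::services::session::user_info participant{};
participant.name = "USERNAME";
auto session_info = wait(sdk->session().open(std::move(participant)));

// 3. Create and join a specific conference (default options, alias only).
dolbyio::comms::services::conference::conference_options create_opts{};
create_opts.alias = "my-conference";
auto conf_info = wait(sdk->conference().create(create_opts));
dolbyio::comms::services::conference::join_options join_opts{};
auto joined = wait(sdk->conference().join(conf_info, join_opts));

// 4. Leave the conference, then 5. close the Dolby.io session.
wait(sdk->conference().leave());
wait(sdk->session().close());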

Standard Mode

In Standard mode, the sample app can also do the following (a short sketch of some of these calls follows the list):

  1. Manage local audio and video devices using the device_management service.

  2. Provide a camera device and, optionally, a video frame handler to stream video into the conference.

  3. Use the video frame handler to display the camera locally, inject non-camera video, or process camera frames and then inject the processed video. The Video processor section shows how to create a video processor, which can then be attached to the SDK as described in step 2.

  4. Set the audio capture mode to alter how captured microphone audio is processed before being sent to the conference.

  5. Use the local audio and local video services to start/stop audio/video whilst in the conference.

  6. Set a video sink using the remote video service's set sink function to receive incoming video streams from remote participants.

  7. Start/stop receiving the audio streams of desired participants using the remote audio service.
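
A few of these calls, sketched with the sample's wait() helper. Here sdk is the instance created earlier, my_video_sink and participant_id are hypothetical, and the remote-service function names are assumptions to be confirmed against the audio and video service headers:

// Start sending microphone audio and the default camera's video.
wait(sdk->audio().local().start());
wait(sdk->video().local().start());

// Receive decoded video frames from remote participants (assumed name).
wait(sdk->video().remote().set_video_sink(my_video_sink));

// Subscribe to a single remote participant's audio stream (assumed name).
wait(sdk->audio().remote().start(participant_id));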

Media IO Mode

In the Media IO mode, the sample app can do the following (a sketch follows the list):

  1. Set an Audio Source for the conference to provide audio to the conference from some external source. Because this means the Opus codec will be used, the Audio Source must be set prior to conference creation.

  2. Set an Audio Sink for the conference to receive PCM audio from conference participants. Because this also means the Opus codec will be used, the Audio Sink must be set prior to conference creation.

  3. Set Encoded Video Sink for the conference to receive video frames before they are decoded. This must be called before the conference is created, but it does not prevent applications from receiving the decoded video frames as well.
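
As a rough sketch only, the Media IO setup could look like the fragment below, performed before the conference is created. The media_io service and the setter names are assumptions here, and injector, recorder, and encoded_sink stand for hypothetical objects implementing the corresponding SDK interfaces:

// All three calls are made before conference creation, as noted above.
wait(sdk->media_io().audio().set_audio_source(injector.get()));            // external audio in
wait(sdk->media_io().audio().set_audio_sink(recorder.get()));              // PCM audio out
wait(sdk->media_io().video().set_encoded_video_sink(encoded_sink.get()));  // encoded video out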

The sample application makes use of the Default Media Recorder module. If you would like to write your own Media Recorder, see the Example Recorder Implementation section. The sample application also uses the Default Media Injector module and the sample Media Source File library. If you would like to write your own Media Injector, see the Example Injector Implementation section. If you would like to write your own Media Source for the injector, see the Media Source File section, as well as the sample library itself.

Build and run the sample application

  1. Build the desktop application in a few simple steps from the sample/ directory:

$ cd sdk-release/share/dolbyio/comms/sample/
$ mkdir build
$ cd build/ && cmake ../
$ cmake --build . -j 8
        [  4%] Building CXX object custom_recorder/CMakeFiles/custom_recorder.dir/custom_recorder.cc.o
        [ 13%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/source_capture.cc.o
        [ 21%] Building CXX object custom_injector/CMakeFiles/custom_injector.dir/custom_injector.cc.o
        [ 21%] Building CXX object custom_video_processor/CMakeFiles/custom_video_processor.dir/custom_video_processor.cc.o
        [ 30%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/utils/audio_buffer.cc.o
        [ 30%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/libav_wrapper/avcontext.cc.o
        [ 30%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/libav_wrapper/decoder.cc.o
        [ 34%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/libav_wrapper/frame.cc.o
        [ 39%] Building CXX object media_source/CMakeFiles/media_source_file.dir/file/utils/media_frame.cc.o
        [ 43%] Linking CXX static library libcustom_injector.a
        [ 43%] Built target custom_injector
        [ 52%] Linking CXX static library libmedia_source_file.a
        [ 52%] Linking CXX static library libcustom_recorder.a
        [ 56%] Built target custom_recorder
        [ 56%] Built target media_source_file
        [ 56%] Built target custom_video_processor
        [ 65%] Building CXX object utilities/CMakeFiles/utils.dir/sdk/interactions.cc.o
        [ 69%] Building CXX object utilities/CMakeFiles/utils.dir/sdk/events.cc.o
        [ 78%] Building CXX object utilities/CMakeFiles/utils.dir/commands_handler.cc.o
        [ 82%] Building CXX object utilities/CMakeFiles/utils.dir/ui_loop/ui.cc.o
        [ 86%] Building CXX object utilities/CMakeFiles/utils.dir/ui_loop/macos_ui.mm.o
        [ 86%] Building CXX object utilities/CMakeFiles/utils.dir/media/media_io_interactions.cc.o
        [ 86%] Building CXX object utilities/CMakeFiles/utils.dir/sdk/device_manager/interactions.cc.o
        [ 91%] Linking CXX static library libutils.a
        [ 91%] Built target utils
        [ 95%] Building CXX object desktop_app/CMakeFiles/desktop_app.dir/desktop_app.cc.o
        [100%] Linking CXX executable desktop_app
        [100%] Built target desktop_app
  2. Run the created desktop_app executable with the help option to see the available command line parameters:

$ ./desktop_app -h
$ ./desktop_app --help

For example:

$ ./desktop_app -u USERNAME -k ACCESS_TOKEN -i CONF_ID -l LOG_LEVEL -ld LOG_DIRECTORY -lmedia LOG_LEVEL_MEDIA -p user -m AV -spatial shared

                 or

$ ./desktop_app -u USERNAME -k ACCESS_TOKEN -i CONF_ID -l LOG_LEVEL -ld LOG_DIRECTORY -lmedia LOG_LEVEL_MEDIA -p user -enable-media-io -d PATH_TO_DUMP_DIRECTORY -m AV -f SOME_FILE.mp4 -spatial shared

Note that when using the Media IO enabled mode, the sample application methods that start and stop audio consist of two parts. For example, for starting audio, the first step uses the dolbyio::comms::services::local_audio::start() method, which attaches an audio track to the active PeerConnection. This method resolves once the audio track has been successfully added. Once this method has resolved, the injection of audio from the Media Source starts. The following code block presents the entire procedure:

sdk->audio().local()
.start()
.then([injection_src]() {
  std::cerr << "start audio success" << std::endl;
  if (injection_src)
    injection_src->set_audio_capture(true);
})
.on_error([](std::exception_ptr&& e) {
  try {
    std::rethrow_exception(e);
  } catch (std::exception& e) {
    std::cerr << "start audio failed: " << e.what() << std::endl;
  }
});
  3. Interact with the desktop_app executable using the available interactive command options. The application will prompt the user for input and explain each of the options. If the chosen option requires more input, the user will be explicitly prompted for this as well.

Video processor

The share/dolbyio/comms/sample/custom_video_processor/ directory contains the C++ source and header files to build a library implementing a custom video processor. For this processor to be used by the C++ SDK, it must be provided as the dolbyio::comms::video_frame_handler in the dolbyio::comms::services::local_video::start() function call.

Note

On macOS, the captured camera frames are stored in a shared IOSurface. This sample processes the frames in place. A proper application implementation should make a copy of the CVPixelBuffer and the underlying IOSurface and only then process its own copy of these memory buffers. Otherwise, other applications capturing from the same video device will have their frames altered as well.

Note

On macOS, video support relies on the macOS main event loop being present and running. If you try to start video before you have executed something like [NSApp run], the SDK throws an exception. Thus, any calls to start video must be made after the main event loop has been started.

The interfaces inherited by this custom_video_processor example must be implemented by any custom video processor that wishes to receive video frames and have somewhere to inject them.

Adding this sample custom video processor to any app is straightforward. Here are basic instructions for integrating it with our sample desktop_app:

  1. Edit the share/dolbyio/comms/sample/utilities/CMakeLists.txt file so that it links the custom_video_processor library by modifying the following:

target_link_libraries(utils PUBLIC
        custom_video_processor
        media_source_file
        DolbyioComms::multimedia_streaming_addon
        DolbyioComms::media
)

  2. Next, edit the share/dolbyio/comms/sample/utilities/ui_loop/ui.cc file to include the custom_video_processor header file. Then, inside the ui_interface::ui_loop_on_helper_thread method, create a processor and provide it to the SDK using the start video method. Essentially, you just need to add three lines of code in the respective places, as shown below:

#include "comms/sample/custom_video_processor/custom_video_processor.h"

...

void ui_interface::ui_loop_on_helper_thread() {
        auto processor = std::make_unique<custom_video_processor>();
        try {
                ...
                wait(sdk->video().local().start({}, processor.get()));
                ...
        }
  3. Then follow the Build and run the sample application instructions of Getting Started, and you will have built your edited desktop_app.

Issues Linking?

If you have problems running the application due to a media library not being found, set LD_LIBRARY_PATH to the location of the sdk-release/lib directory. If you plan to move the libraries to different locations, include these locations in the path. The path can be set as follows:

$ export LD_LIBRARY_PATH=/path/to/sdk-release/lib/:$LD_LIBRARY_PATH