Video Calling
Project homepage on GitHub — https://github.com/QuickBlox/quickblox-ios-sdk/tree/master/sample-videochat-webrtc
Download ZIP - https://github.com/QuickBlox/quickblox-ios-sdk/archive/master.zip
Overview
The VideoChat code sample allows you to easily add video calling and audio calling features into your iOS app. Enable a video call function similar to FaceTime or Skype using this code sample as a basis.
It is built on top of WebRTC technology.
System requirements
The QuickbloxWebRTC.framework supports the following:
* Quickblox.framework v2.7 (pod QuickBlox)
* iPhone 4S+.
* iPad 2+.
* iPod Touch 5+.
* iOS 8+.
* iOS simulator 32/64 bit (audio might not work on simulators).
* Wi-Fi and 4G/LTE connections.
Getting Started with Video Calling API
Installation with CocoaPods
CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.
Step 1: Downloading CocoaPods
CocoaPods is distributed as a ruby gem, and is installed by running the following commands in Terminal.app:
$ sudo gem install cocoapods
$ pod setup
Step 2: Creating a Podfile
Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:
$ touch Podfile
$ open -e Podfile
TextEdit should open showing an empty file: you have just created the Podfile and opened it. Now add some content to it.
Copy and paste the following lines into the TextEdit window:
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '8.0'
pod 'Quickblox-WebRTC', '~> 2.6'
pod 'QuickBlox'
Step 3: Installing Dependencies
Now you can install the dependencies in your project:
$ pod install
From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:
$ open ProjectName.xcworkspace
Step 4: Importing Headers
At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:
#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>
Add the Framework to your Xcode Project
Please note that the Quickblox iOS SDK is required for apps using QuickbloxWebRTC.
Step 1: Download & unzip the Framework
QuickbloxWebRTC.framework
Step 2: Add the framework to your Xcode Project
Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items if needed" checkbox is checked.
Step 3: Link Binary With Library Frameworks
Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.
Here is the list of required Apple library frameworks:
libicucore.dylib
libc++.dylib
libresolv.dylib
libxml2.dylib
libz.dylib
CFNetwork.framework
GLKit.framework
MobileCoreServices.framework
SystemConfiguration.framework
VideoToolbox.framework
Accelerate.framework
Step 4: Embedded binary for Dynamic framework
QuickbloxWebRTC.framework is a dynamic framework, so you must also add it to the Embedded Binaries section in the General tab of your target.
Step 5: Importing Headers
At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:
#import <SystemConfiguration/SystemConfiguration.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>
Run Script Phase for Dynamic framework
Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:
bash "${BUILT_PRODUCTS_DIR}/${FRAMEWORKS_FOLDER_PATH}/QuickbloxWebRTC.framework/strip-framework.sh"
This works around a known Apple bug that prevents publishing archives to the App Store when dynamic frameworks contain simulator slices. The script only runs when archiving.
Life cycle
// Initialize QuickbloxWebRTC and configure signaling
// You should call this method before any interaction with QuickbloxWebRTC
[QBRTCClient initializeRTC];
// Call this method when you are done working with QuickbloxWebRTC
[QBRTCClient deinitializeRTC];
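For example, a common place to initialize is the application delegate (a minimal sketch; the AppDelegate method shown is the standard UIKit one, not part of QuickbloxWebRTC):
// AppDelegate.m — initialize QuickbloxWebRTC once at app launch
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [QBRTCClient initializeRTC];
    return YES;
}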
Call users
To call users just use this method:
[[QBRTCClient instance] addDelegate:self];
// self class must conform to QBRTCClientDelegate protocol
// 3245, 2123, 3122 - opponents' IDs
NSArray *opponentsIDs = @[@3245, @2123, @3122];
QBRTCSession *newSession = [[QBRTCClient instance] createNewSessionWithOpponents:opponentsIDs withConferenceType:QBRTCConferenceTypeVideo];
// userInfo - the custom user information dictionary for the call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[newSession startCall:userInfo];
After this your opponents (users with IDs 3245, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo
{
if (self.session)
{
// we already have a video/audio call session, so we reject another one
// userInfo - the custom user information dictionary for the call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[session rejectCall:userInfo];
return;
}
self.session = session;
}
self.session here refers to the current session. Each audio/video call has a unique sessionID, which allows you to run more than one independent audio/video conference.
If you want to increase the call timeout, e.g. set to 60 seconds:
[QBRTCConfig setAnswerTimeInterval:60];
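QBRTCConfig also lets you adjust how often the call request is re-sent. A short sketch, assuming the +setDialingTimeInterval: setter in QBRTCConfig:
// re-send the call request every 5 seconds and wait 60 seconds for an answer
[QBRTCConfig setDialingTimeInterval:5];
[QBRTCConfig setAnswerTimeInterval:60];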
Accept a call
To accept a call request just use this method:
// userInfo - the custom user information dictionary for the accept call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session acceptCall:userInfo];
After this your opponent will receive an accept signal:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session acceptedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo
{
}
Reject a call
To reject a call request just use this method:
// userInfo - the custom user information dictionary for the reject call. May be nil.
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session rejectCall:userInfo];
// and release session instance
self.session = nil;
After this your opponent will receive a reject signal:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session rejectedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo
{
NSLog(@"Rejected by user %@", userID);
}
Connection life-cycle
Called when connection is initiated with user:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID
{
NSLog(@"Started connecting to user %@", userID);
}
Called when connection is closed for user:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID
{
NSLog(@"Connection is closed for user %@", userID);
}
Called when connection is established with user:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID
{
NSLog(@"Connection is established with user %@", userID);
}
Called when user is disconnected:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID
{
NSLog(@"Disconnected from user %@", userID);
}
Called when the user did not respond to your call within the timeout. Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID
{
NSLog(@"User %@ did not respond to your call within timeout", userID);
}
Called when connection has failed with user:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID
{
NSLog(@"Connection has failed with user %@", userID);
}
States
Called when the QBRTCSession state changes. The session state can be new, pending, connecting, connected, or closed.
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session didChangeState:(QBRTCSessionState)state
{
NSLog(@"Session did change state to %tu", state);
}
Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup, or failed.
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session didChangeConnectionState:(QBRTCConnectionState)state forUser:(NSNumber *)userID
{
NSLog(@"Session did change state to %tu for userID %@", state, userID);
}
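For UI or logging it can be handy to map the connection state to a readable string. A minimal sketch, assuming the QBRTCConnectionState constants follow the state names listed above (verify the exact names against your SDK headers):
// hypothetical helper: human-readable name for a connection state
static NSString *StringForConnectionState(QBRTCConnectionState state)
{
    switch (state) {
        case QBRTCConnectionStateConnecting:   return @"connecting";
        case QBRTCConnectionStateConnected:    return @"connected";
        case QBRTCConnectionStateDisconnected: return @"disconnected";
        case QBRTCConnectionStateFailed:       return @"failed";
        default:                               return @"other";
    }
}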
Manage remote media tracks
#pragma mark -
#pragma mark QBRTCClientDelegate
// Called when a remote video track is received from the opponent
- (void)session:(QBRTCSession *)session receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack fromUser:(NSNumber *)userID
{
// we assume you have created a UIView and set its class to QBRTCRemoteVideoView
// we also suggest setting the view's content mode to UIViewContentModeScaleAspectFit or
// UIViewContentModeScaleAspectFill
[self.opponentVideoView setVideoTrack:videoTrack];
}
You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in a call):
#pragma mark -
#pragma mark QBRTCClientDelegate
// Called when a remote audio track is received from the opponent
- (void)session:(QBRTCSession *)session receivedRemoteAudioTrack:(QBRTCAudioTrack *)audioTrack fromUser:(NSNumber *)userID
{
// mute specific user audio track here (for example)
// you can also do it later using the '[QBRTCSession remoteAudioTrackWithUserID:]' method
audioTrack.enabled = NO;
}
You can always get both remote video and audio tracks for a specific user ID in call using these QBRTCSession methods:
/**
 *  Remote audio track with opponent user ID.
 *
 *  @param userID opponent user ID
 *
 *  @return QBRTCAudioTrack audio track instance
 */
- (QBRTCAudioTrack *)remoteAudioTrackWithUserID:(NSNumber *)userID;
/**
 *  Remote video track with opponent user ID.
 *
 *  @param userID opponent user ID
 *
 *  @return QBRTCVideoTrack video track instance
 */
- (QBRTCVideoTrack *)remoteVideoTrackWithUserID:(NSNumber *)userID;
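For example, to mute a specific opponent at any point during the call (a short sketch using the methods above; 2123 is one of the opponent IDs from the earlier example):
QBRTCAudioTrack *audioTrack = [self.session remoteAudioTrackWithUserID:@2123];
audioTrack.enabled = NO; // silence this user only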
Manage local video track
In order to show your local video track from the camera, create a UIView in your storyboard and then use the following code:
// your view controller interface code
@interface CallController()
@property (weak, nonatomic) IBOutlet UIView *localVideoView;
// your video view to render local camera video stream
@property (strong, nonatomic) QBRTCCameraCapture *videoCapture;
@property (strong, nonatomic) QBRTCSession *session;
@end
@implementation CallController
- (void)viewDidLoad
{
[super viewDidLoad];
[[QBRTCClient instance] addDelegate:self];
QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
videoFormat.frameRate = 30;
videoFormat.pixelFormat = QBRTCPixelFormat420f;
videoFormat.width = 640;
videoFormat.height = 480;
// QBRTCCameraCapture class used to capture frames using AVFoundation APIs
self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront];
// or AVCaptureDevicePositionBack
// add video capture to session's local media stream
// from version 2.3 you no longer need to wait for 'initializedLocalMediaStream:' delegate to do it
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.videoCapture startSession];
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
// start call
}
@end
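Since the preview layer is a plain CALayer, you may also want to keep its frame in sync when the container view is resized or rotated. A minimal sketch:
- (void)viewDidLayoutSubviews
{
    [super viewDidLayoutSubviews];
    // keep the camera preview sized to its container
    self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
}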
Hang up
To hang up a call:
NSDictionary *userInfo = @{ @"key" : @"value" };
[self.session hangUp:userInfo];
After this your opponents will receive a hangUp signal:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo
{
// For example: update the GUI
// or hang up when the initiator ended the call:
if ([session.initiatorID isEqualToNumber:userID])
{
[session hangUp:@{}];
}
}
Once all opponents become inactive, QBRTCClient delegates will be notified via:
#pragma mark -
#pragma mark QBRTCClientDelegate
- (void)sessionDidClose:(QBRTCSession *)session
{
// release session instance
self.session = nil;
}
Disable / enable audio stream
You can disable / enable the audio stream during a call:
self.session.localMediaStream.audioTrack.enabled ^= 1;
Please note: due to WebRTC restrictions, silence will be placed into the stream if audio is disabled.
Disable / enable video stream
You can disable / enable the video stream during a call:
self.session.localMediaStream.videoTrack.enabled ^= 1;
Please note: due to WebRTC restrictions, black frames will be placed into the stream if video is disabled.
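Both toggles are typically wired to buttons in the call UI. A minimal sketch with hypothetical IBAction handlers:
- (IBAction)didTapMuteButton:(UIButton *)sender
{
    // toggle the microphone
    self.session.localMediaStream.audioTrack.enabled ^= 1;
}
- (IBAction)didTapVideoButton:(UIButton *)sender
{
    // toggle the camera stream
    self.session.localMediaStream.videoTrack.enabled ^= 1;
}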
Switch camera
You can switch the video capture position during a call (Default: front camera):
'videoCapture' below is the QBRTCCameraCapture instance described in CallController.
// to set default (preferred) camera position
- (void)viewDidLoad
{
[super viewDidLoad];
QBRTCVideoFormat *videoFormat = [[QBRTCVideoFormat alloc] init];
videoFormat.frameRate = 30;
videoFormat.pixelFormat = QBRTCPixelFormat420f;
videoFormat.width = 640;
videoFormat.height = 480;
self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat position:AVCaptureDevicePositionFront];
// or AVCaptureDevicePositionBack
}
// to change some time after, for example, at the moment of call
AVCaptureDevicePosition position = self.videoCapture.position;
AVCaptureDevicePosition newPosition = position == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
// check whether the capture device has a camera at the new position
// for example, some iPods do not have a front camera
if ([self.videoCapture hasCameraForPosition:newPosition])
{
self.videoCapture.position = newPosition;
}
Audio Session (Previously Sound Router)
QBRTCSoundRouter is deprecated as of version 2.3; from now on you should use the QBRTCAudioSession class instead. Its methods look almost the same as the Sound Router ones, but it is more customizable and conforms to more requirements.
// Save the current audio configuration before starting or accepting a call
[[QBRTCAudioSession instance] initialize];
// OR you can initialize the audio session with a specific configuration
[[QBRTCAudioSession instance]
initializeWithConfigurationBlock:^(QBRTCAudioSessionConfiguration *configuration)
{
    // adding bluetooth support
    configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowBluetooth;
    configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowBluetoothA2DP;
    // adding airplay support
    configuration.categoryOptions |= AVAudioSessionCategoryOptionAllowAirPlay;
if (_session.conferenceType == QBRTCConferenceTypeVideo)
{
// setting mode to video chat to enable airplay audio and speaker only for video call
configuration.mode = AVAudioSessionModeVideoChat;
}
}];
// Set headphone or phone receiver
[QBRTCAudioSession instance].currentAudioDevice = QBRTCAudioDeviceReceiver;
// or set speaker
[QBRTCAudioSession instance].currentAudioDevice = QBRTCAudioDeviceSpeaker;
// deinitialize after the session closes
[[QBRTCAudioSession instance] deinitialize];
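A speaker toggle for your in-call UI could then look like this (a sketch; the button action name is illustrative):
- (IBAction)didTapSpeakerButton:(UIButton *)sender
{
    QBRTCAudioSession *audioSession = [QBRTCAudioSession instance];
    BOOL onSpeaker = (audioSession.currentAudioDevice == QBRTCAudioDeviceSpeaker);
    audioSession.currentAudioDevice = onSpeaker ? QBRTCAudioDeviceReceiver : QBRTCAudioDeviceSpeaker;
}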
QBRTCAudioSession also has a delegate protocol with helpful methods:
/**
 *  Notifying about current audio device being updated by QBRTCAudioSession.
 *
 *  @param audioSession QBRTCAudioSession instance
 *  @param updatedAudioDevice new audio device
 *
 *  @discussion Called, for example, when headphones are plugged in. In that case audio will
 *  automatically be updated from speaker/receiver to headphones. Headphones are considered
 *  to be receiver. You can use this delegate to keep your current audio device state
 *  up-to-date in your UI.
 *
 *  @note Only called if the audio device was changed by QBRTCAudioSession itself, and not
 *  on user request.
 */
- (void)audioSession:(QBRTCAudioSession *)audioSession didChangeCurrentAudioDevice:(QBRTCAudioDevice)updatedAudioDevice;
/**
 *  Notifying when an audio device change requested by the user has failed.
 *
 *  @param audioSession QBRTCAudioSession instance
 *  @param error error
 *
 *  @discussion Called when an audio device change is not possible. For example, when the audio
 *  session options are set to speaker only, you cannot update the device to receiver, etc.
 */
- (void)audioSession:(QBRTCAudioSession *)audioSession didFailToChangeAudioDeviceWithError:(NSError *)error;
/**
 *  Called when the audio device is notified to begin playback or recording.
 *
 *  @param audioSession QBRTCAudioSession instance
 */
- (void)audioSessionDidStartPlayOrRecord:(QBRTCAudioSession *)audioSession;
/**
 *  Called when the audio device is notified to stop playback or recording.
 *
 *  @param audioSession QBRTCAudioSession instance
 */
- (void)audioSessionDidStopPlayOrRecord:(QBRTCAudioSession *)audioSession;
/**
 *  Called when AVAudioSession starts an interruption event.
 *
 *  @param session QBRTCAudioSession instance
 */
- (void)audioSessionDidBeginInterruption:(QBRTCAudioSession *)session;
/**
 *  Called when AVAudioSession ends an interruption event.
 *
 *  @param session QBRTCAudioSession instance
 */
- (void)audioSessionDidEndInterruption:(QBRTCAudioSession *)session shouldResumeSession:(BOOL)shouldResumeSession;
QBRTCAudioSession also introduces some new properties that might be helpful:
/**
 *  Determines whether QBRTCAudioSession is initialized and has saved previously active audio session settings.
 */
@property (nonatomic, readonly, getter=isInitialised) BOOL initialised;
/**
 *  Represents permission for WebRTC to initialize the VoIP audio unit.
 *  When set to NO, if the VoIP audio unit used by WebRTC is active, it will be
 *  stopped and uninitialized. This will stop incoming and outgoing audio.
 *  When set to YES, WebRTC will initialize and start the audio unit when it is
 *  needed (e.g. due to establishing an audio connection).
 *  This property was introduced to work around an issue where if an AVPlayer is
 *  playing audio while the VoIP audio unit is initialized, its audio would be
 *  either cut off completely or played at a reduced volume. By preventing
 *  the audio unit from being initialized until after the audio has completed,
 *  we are able to prevent the abrupt cutoff.
 *
 *  @remark As the issue only affects AVPlayer, the default value is always YES.
 */
@property (assign, nonatomic, getter=isAudioEnabled) BOOL audioEnabled;
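For example, to let an AVPlayer play its sound cleanly during a call, you might temporarily hand the audio unit over and back (a sketch based on the property description above):
// stop the VoIP audio unit before starting AVPlayer playback
[QBRTCAudioSession instance].audioEnabled = NO;
// ... play your AVPlayer content, then, when it finishes ...
// allow WebRTC to start the audio unit again
[QBRTCAudioSession instance].audioEnabled = YES;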
Background mode
Use the QuickbloxWebRTC.framework in applications running in the background state.
Set the app permissions
In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this array. Do not add voip to this array. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.
When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session — in this case, your app.
Custom video capture
Note: a CADisplayLink object is a timer object that allows your application to synchronise its drawing to the refresh rate of the display.
/**
 *  By default, screen-share frames are sent using the BiPlanarFullRange pixel format type.
 *  You can also send them using ARGB by setting this constant to NO.
 */
static const BOOL kQBRTCUseBiPlanarFormatTypeForShare = YES;
@interface QBRTCScreenCapture()
@property (weak, nonatomic) UIView *view;
@property (strong, nonatomic) CADisplayLink *displayLink;
// queue used below in sendPixelBuffer: (declared here so the sample compiles)
@property (strong, nonatomic) dispatch_queue_t videoQueue;
@end
@implementation QBRTCScreenCapture
- (instancetype)initWithView:(UIView *)view
{
self = [super init];
if (self)
{
    _view = view;
    // serial queue used to send frames off the main thread (name is illustrative)
    _videoQueue = dispatch_queue_create("QBRTCScreenCapture.videoQueue", DISPATCH_QUEUE_SERIAL);
}
return self;
}
#pragma mark - Enter BG / FG notifications
- (void)willEnterForeground:(NSNotification *)note
{
self.displayLink.paused = NO;
}
- (void)didEnterBackground:(NSNotification *)note
{
self.displayLink.paused = YES;
}
#pragma mark -

- (UIImage *)screenshot
{
UIGraphicsBeginImageContextWithOptions(_view.frame.size, YES, 1);
[_view drawViewHierarchyInRect:_view.bounds afterScreenUpdates:NO];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
- (CIContext *)qb_sharedGPUContext
{
static CIContext *sharedContext;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    NSDictionary *options = @{ kCIContextPriorityRequestLow : @YES };
    sharedContext = [CIContext contextWithOptions:options];
});
return sharedContext;
}
- (void)sendPixelBuffer:(CADisplayLink *)sender
{
dispatch_async(self.videoQueue, ^{
    @autoreleasepool
{
UIImage *image = [self screenshot];
int renderWidth = image.size.width;
int renderHeight = image.size.height;
CVPixelBufferRef buffer = NULL;
OSType pixelFormatType;
CFDictionaryRef pixelBufferAttributes = NULL;
if (kQBRTCUseBiPlanarFormatTypeForShare)
{
pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange;
pixelBufferAttributes = (__bridge CFDictionaryRef) @{
    (__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{},
};
} else {
pixelFormatType = kCVPixelFormatType_32ARGB;
pixelBufferAttributes = (__bridge CFDictionaryRef) @{
    (NSString *)kCVPixelBufferCGImageCompatibilityKey : @NO,
    (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @NO
};
}
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, renderWidth, renderHeight, pixelFormatType, pixelBufferAttributes, &buffer);
if (status == kCVReturnSuccess && buffer != NULL)
{
CVPixelBufferLockBaseAddress(buffer, 0);
if (kQBRTCUseBiPlanarFormatTypeForShare)
{
CIImage *rImage = [[CIImage alloc] initWithImage:image];
[self.qb_sharedGPUContext render:rImage toCVPixelBuffer:buffer];
} else {
void *pxdata = CVPixelBufferGetBaseAddress(buffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
uint32_t bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst;
CGContextRef context = CGBitmapContextCreate(pxdata, renderWidth, renderHeight, 8, renderWidth * 4, rgbColorSpace, bitmapInfo);
CGContextDrawImage(context, CGRectMake(0, 0, renderWidth, renderHeight), [image CGImage]);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
}
CVPixelBufferUnlockBaseAddress(buffer, 0);
QBRTCVideoFrame *videoFrame = [[QBRTCVideoFrame alloc] initWithPixelBuffer:buffer videoRotation:QBRTCVideoRotation_0];
[super sendVideoFrame:videoFrame];
}
CVPixelBufferRelease(buffer);
}
});
}
#pragma mark - <QBRTCVideoCapture>
- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack
{
[super didSetToVideoTrack:videoTrack];
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(sendPixelBuffer:)];
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
self.displayLink.frameInterval = 12; // 60 Hz / 12 = 5 fps
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(willEnterForeground:) name:UIApplicationWillEnterForegroundNotification object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didEnterBackground:) name:UIApplicationDidEnterBackgroundNotification object:nil];
}
- (void)didRemoveFromVideoTrack:(QBRTCLocalVideoTrack *)videoTrack
{
[super didRemoveFromVideoTrack:videoTrack];
self.displayLink.paused = YES;
[self.displayLink removeFromRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
self.displayLink = nil;
[[NSNotificationCenter defaultCenter] removeObserver:self name:UIApplicationWillEnterForegroundNotification object:nil];
[[NSNotificationCenter defaultCenter] removeObserver:self name:UIApplicationDidEnterBackgroundNotification object:nil];
}
@end
To link this capture to your local video track simply use:
// Save previous video capture
self.capture = self.session.localMediaStream.videoTrack.videoCapture;
self.screenCapture = [[QBRTCScreenCapture alloc] initWithView:self.view];
// Switch to sharing
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
// here videoTrack calls didSetToVideoTrack:
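When sharing ends, you can switch back to the camera capture saved earlier (a sketch continuing the snippet above):
// Restore the previous camera capture and drop the screen capture
self.session.localMediaStream.videoTrack.videoCapture = self.capture;
self.screenCapture = nil;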
Calling offline users
To notify offline opponents about a new call, you can send them a push notification:
- (void)sendPushToOpponentsAboutNewCall
{
NSString *currentUserLogin = [[[QBSession currentSession] currentUser] login];
[QBRequest sendPushWithText:[NSString stringWithFormat:@"%@ is calling you", currentUserLogin] toUsers:[self.session.opponentsIDs componentsJoinedByString:@","]
successBlock:^(QBResponse * _Nonnull response, NSArray<QBMEvent *> * _Nullable events)
{
NSLog(@"Push sent!");
}
errorBlock:^(QBError * _Nullable error)
{
    NSLog(@"Can not send push: %@", error);
}];
.......
}