Real-Time Screen Sharing on iOS in Practice (with Detailed Code)

Many people still associate screen sharing with PowerPoint presentations on a PC, but in fact screen sharing has long since outgrown that scenario. Take a familiar example: game live-streaming, where the streamer presents their gameplay to viewers via "screen sharing", with very high demands on latency and smoothness.

For many mobile-game streamers, the common approach today is to relay the phone's game footage through a PC for broadcasting. In fact, by calling RongCloud's screen-sharing SDK, real-time screen sharing can be done directly on the phone.

This article focuses on screen sharing on iOS: how the ReplayKit framework has evolved, what each stage added, and the code and design for implementing screen sharing with RongCloud's screen-sharing SDK.

01 A Brief History of ReplayKit

ReplayKit, iOS's screen-recording framework, first appeared in iOS 9.

iOS 9

ReplayKit was first introduced at WWDC15; in its initial form it was mainly for recording video and saving it to the photo library.

The start-recording and stop-recording APIs in iOS 9 had major limitations:

You could only obtain the MP4 file generated by the system, and not directly: it had to be saved to the photo library first and then fetched from there;

The raw data (PCM audio and YUV video) was not accessible;

Developer access was limited: you could not record other apps, recording stopped as soon as your app went to the background, and only your own app's screen could be recorded.

What you could control:

Stopping the recording could present a video preview window where the user could save, discard, or share the video file;

After recording, the video could be viewed, edited, or shared through the designated channels.

The API for starting a recording is shown below.

/*!
 Deprecated. Use startRecordingWithHandler: instead.

 @abstract Starts app recording with a completion handler. Note that before recording actually starts, the user may be prompted with UI to confirm recording.
 @param microphoneEnabled Determines whether the microphone input should be included in the recorded movie audio.
 @discussion handler Called after user interactions are complete. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue starting the recording.
 */
[[RPScreenRecorder sharedRecorder] startRecordingWithMicrophoneEnabled:YES handler:^(NSError * _Nullable error) {
    if (error) {
        // Handle the error, e.g. the user declined the recording prompt.
    }
}];

When the start-recording API is called, the system shows a confirmation dialog; recording only begins after the user confirms.
[Figure 1: the system confirmation dialog shown when recording starts]
The API for stopping a recording is shown below.


/*! @abstract Stops app recording with a completion handler.
 @discussion handler Called when the movie is ready. Will return an instance of RPPreviewViewController on success which should be presented using [UIViewController presentViewController:animated:completion:]. Will be passed an optional NSError in the RPRecordingErrorDomain domain if there was an issue stopping the recording.
 */
[[RPScreenRecorder sharedRecorder] stopRecordingWithHandler:^(RPPreviewViewController *previewViewController, NSError *error) {
    if (error || !previewViewController) {
        // Handle the error; previewViewController is nil on failure.
        return;
    }
    [self presentViewController:previewViewController animated:YES completion:nil];
}];

iOS 10

At WWDC16, Apple upgraded ReplayKit, opening a path to the raw data and adding two Extension targets. The changes include:

Two new Extension targets: Broadcast UI and Broadcast Upload;

Broader developer access: users can sign in to a broadcast service, and live broadcasting and raw-data handling become possible;

Recording is done only through the extensions, which can capture not just your own app but other apps as well;

Only app screens can be captured; the iOS system screen cannot.

The steps for creating the Extensions are shown in the figure below.
[Figure 2: creating the Broadcast UI / Broadcast Upload Extension targets in Xcode]
UI Extension

/*
 These two APIs can be thought of as callbacks for the setup UI;
 both forward the result to the Host App.
*/
- (void)userDidFinishSetup {
    // Notifies the Host App via RPBroadcastActivityViewControllerDelegate
}

- (void)userDidCancelSetup {
    // Notifies the Host App via RPBroadcastActivityViewControllerDelegate
}

Upload Extension

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension
    // can be supplied but is optional.
    // Do any initialization work here.
}

- (void)broadcastPaused {
    // The system has paused the broadcast. Samples stop being delivered.
}

- (void)broadcastResumed {
    // The system has resumed the broadcast. Sample delivery resumes.
}

- (void)broadcastFinished {
    // The system has finished the broadcast.
}

// The highlight of this release: we can now receive the raw sample buffers,
// split by the system into three types: video frames, in-app audio, and microphone audio.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
   
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
            
        default:
            break;
    }
}

Host App

// Start a broadcast; the presenting view controller adopts
// RPBroadcastActivityViewControllerDelegate and RPBroadcastControllerDelegate.
if (![RPScreenRecorder sharedRecorder].isRecording) {
    [RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
        if (error) {
            NSLog(@"RPBroadcast err %@", [error localizedDescription]);
            return;
        }
        broadcastActivityViewController.delegate = self;
        [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
    }];
}

#pragma mark- RPBroadcastActivityViewControllerDelegate
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController didFinishWithBroadcastController:(RPBroadcastController *)broadcastController error:(NSError *)error {
  if (error) {
    //TODO:
    NSLog(@"broadcastActivityViewController:%@",error.localizedDescription);
    return;
  }
  
   [broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            NSLog(@"success");
        } else {
            NSLog(@"startBroadcast:%@",error.localizedDescription);
        }
    }];
}

#pragma mark- RPBroadcastControllerDelegate
- (void)broadcastController:(RPBroadcastController *)broadcastController didFinishWithError:(nullable NSError *)error{
    NSLog(@"didFinishWithError: %@", error);
}
- (void)broadcastController:(RPBroadcastController *)broadcastController didUpdateServiceInfo:(NSDictionary<NSString *, NSObject<NSCoding> *> *)serviceInfo {
    NSLog(@"didUpdateServiceInfo: %@", serviceInfo);
}

iOS 11

At WWDC17, Apple upgraded the framework again as ReplayKit 2; screen data can now be obtained directly in the Host App. Specifically:

Your app's recorded screen data can be processed directly in the Host App;

The iOS system screen can also be recorded, but this has to be started manually from Control Center.

Start app screen recording

[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    [self.videoOutputStream write:sampleBuffer error:nil];
} completionHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"startCaptureWithHandler: %@", error.localizedDescription);
    }
}];

Stop app screen recording


[[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
    [self.assetWriter finishWritingWithCompletionHandler:^{
        // The recorded file is now complete.
    }];
}];

iOS 12

At WWDC18, Apple updated ReplayKit with RPSystemBroadcastPickerView, a view that lets an app start the system broadcast from inside the app, which greatly simplifies kicking off a screen recording.


if (@available(iOS 12.0, *)) {
    self.systemBroadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 50, 80)];
    // preferredExtension is the bundle ID of the broadcast Upload Extension.
    self.systemBroadcastPickerView.preferredExtension = ScreenShareBuildID;
    self.systemBroadcastPickerView.showsMicrophoneButton = NO;
    self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:self.systemBroadcastPickerView];
} else {
    // Fallback on earlier versions
}

02 RongCloud RongRTCReplayKitExt

To reduce the integration burden on developers, RongCloud built the RongRTCReplayKitExt library specifically for the screen-sharing use case.

Design

Upload Extension

SampleHandler receives the sample buffers and sets up RCRTCReplayKitEngine;

RCRTCReplayKitEngine initializes the socket channel, converts the YUV data to I420, and keeps memory usage below the system cap.

App

The existing publishing flow:

Connect to IM → join the room → publish the stream (RCRTCScreenShareOutputStream);

Internally, the stream initializes the socket, implements the protocol that receives the processed data, and pushes the stream.
[Figure 3: RongRTCReplayKitExt architecture]
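The "YUV to I420" step above is, at its core, a de-interleave: ReplayKit typically delivers NV12 pixel buffers, whose chroma samples are packed as UVUV…, while I420 keeps separate U and V planes (the Y plane has the same layout in both formats). Below is a minimal C sketch of the chroma de-interleave; the SDK's actual implementation is not public, and the function name and loop structure here are our own illustration:

```c
#include <assert.h>
#include <stdint.h>

/* De-interleave NV12's packed chroma plane (UVUVUV...) into I420's separate
 * U and V planes. Strides are in bytes; chroma_w/chroma_h are half the luma
 * width/height for 4:2:0 video. */
static void nv12_to_i420_chroma(const uint8_t *uv, int uv_stride,
                                uint8_t *u, int u_stride,
                                uint8_t *v, int v_stride,
                                int chroma_w, int chroma_h) {
    for (int row = 0; row < chroma_h; row++) {
        const uint8_t *src = uv + row * uv_stride;
        uint8_t *dst_u = u + row * u_stride;
        uint8_t *dst_v = v + row * v_stride;
        for (int col = 0; col < chroma_w; col++) {
            dst_u[col] = src[2 * col];     /* even bytes are U samples */
            dst_v[col] = src[2 * col + 1]; /* odd bytes are V samples  */
        }
    }
}
```

In the real pipeline the Y plane can be copied (or referenced) as-is, and the row strides come from `CVPixelBufferGetBytesPerRowOfPlane` on the locked pixel buffer.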

Code examples

Upload Extension

#import "SampleHandler.h"
#import <RongRTCReplayKitExt/RongRTCReplayKitExt.h>

static NSString *const ScreenShareGroupID = @"group.cn.rongcloud.rtcquickdemo.screenshare";

@interface SampleHandler () <RongRTCReplayKitExtDelegate>
@end

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    [[RCRTCReplayKitEngine sharedInstance] setupWithAppGroup:ScreenShareGroupID delegate:self];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Samples delivery will resume.
}

- (void)broadcastFinished {
    [[RCRTCReplayKitEngine sharedInstance] broadcastFinished];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType  API_AVAILABLE(ios(10.0)) {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            [[RCRTCReplayKitEngine sharedInstance] sendSampleBuffer:sampleBuffer withType:RPSampleBufferTypeVideo];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;

        default:
            break;
    }
}

#pragma mark - RongRTCReplayKitExtDelegate
- (void)broadcastFinished:(RCRTCReplayKitEngine *)broadcast reason:(RongRTCReplayKitExtReason)reason {
    NSString *tip = @"";
    switch (reason) {
        case RongRTCReplayKitExtReasonRequestedByMain:
            tip = @"Screen sharing has ended";
            break;
        case RongRTCReplayKitExtReasonDisconnected:
            tip = @"The app disconnected";
            break;
        case RongRTCReplayKitExtReasonVersionMismatch:
            tip = @"Integration error (SDK version mismatch)";
            break;
    }

    NSError *error = [NSError errorWithDomain:NSStringFromClass(self.class)
                                         code:0
                                     userInfo:@{
                                         NSLocalizedFailureReasonErrorKey:tip
                                     }];
    [self finishBroadcastWithError:error];
}

Host App

- (void)joinRoom {
    RCRTCVideoStreamConfig *videoConfig = [[RCRTCVideoStreamConfig alloc] init];
    videoConfig.videoSizePreset = RCRTCVideoSizePreset720x480;
    videoConfig.videoFps = RCRTCVideoFPS30;
    [[RCRTCEngine sharedInstance].defaultVideoStream setVideoConfig:videoConfig];

    RCRTCRoomConfig *config = [[RCRTCRoomConfig alloc] init];
    config.roomType = RCRTCRoomTypeNormal;

    [self.engine enableSpeaker:YES];

    __weak typeof(self) weakSelf = self;
    [self.engine joinRoom:self.roomId
                   config:config
               completion:^(RCRTCRoom *_Nullable room, RCRTCCode code) {
                   __strong typeof(weakSelf) strongSelf = weakSelf;
                   if (code == RCRTCCodeSuccess) {
                       strongSelf.room = room;
                       room.delegate = strongSelf;
                       [strongSelf publishScreenStream];
                   } else {
                       [UIAlertController alertWithString:@"Failed to join the room" inCurrentViewController:strongSelf];
                   }
               }];
}

- (void)publishScreenStream {
    self.videoOutputStream = [[RCRTCScreenShareOutputStream alloc] initWithAppGroup:ScreenShareGroupID];

    RCRTCVideoStreamConfig *videoConfig = self.videoOutputStream.videoConfig;
    videoConfig.videoSizePreset = RCRTCVideoSizePreset1280x720;
    videoConfig.videoFps = RCRTCVideoFPS24;
    [self.videoOutputStream setVideoConfig:videoConfig];
    
    [self.room.localUser publishStream:self.videoOutputStream
                            completion:^(BOOL isSuccess, RCRTCCode desc) {
                                if (isSuccess) {
                                    NSLog(@"Published the screen-share stream");
                                } else {
                                    NSLog(@"Failed to publish the screen-share stream: %ld", (long)desc);
                                }
                            }];
}

03 Notes and Caveats

First, ReplayKit 2 limits the Broadcast Upload Extension to 50 MB of memory; exceed that peak and the system forcibly kills the extension, so take special care to release memory promptly when processing data in the Extension.
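One common mitigation, sketched below as an assumption about where the pressure usually comes from rather than as the SDK's actual code, is to drain per-frame temporaries with an autorelease pool inside the sample callback:

```objc
// Hypothetical sketch: wrap each frame's processing in an autorelease pool so
// temporary buffers are freed every frame instead of accumulating toward 50 MB.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   withType:(RPSampleBufferType)sampleBufferType {
    @autoreleasepool {
        if (sampleBufferType == RPSampleBufferTypeVideo) {
            [[RCRTCReplayKitEngine sharedInstance] sendSampleBuffer:sampleBuffer
                                                           withType:RPSampleBufferTypeVideo];
        }
    }
}
```

Dropping frames when a downstream queue backs up is another lever, since holding on to CMSampleBuffers pins their underlying pixel buffers in memory.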

Second, on cross-process communication: the Darwin notification center (CFNotificationCenterGetDarwinNotifyCenter) cannot carry a payload, it can only deliver the notification itself; if you need to pass data along, you must stage it in a local file shared through the App Group container. One gotcha: in debug builds the data can be read and printed, but in release builds the local file data may not be readable; see the walkthrough on GitHub for the details.
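As an example of that pattern, a writer process can stage its payload in the shared container and then post a payload-less Darwin notification to wake the reader; the notification name and file name below are hypothetical, and only the App Group ID comes from this article's demo:

```objc
// Hypothetical sketch: stage the payload in the App Group container, then post
// a Darwin notification (which cannot carry userInfo) to signal the other process.
NSURL *container = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.cn.rongcloud.rtcquickdemo.screenshare"];
NSURL *payloadURL = [container URLByAppendingPathComponent:@"frame_meta.plist"]; // hypothetical file name
[@{ @"width" : @1280, @"height" : @720 } writeToURL:payloadURL error:nil];

CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                     CFSTR("cn.rongcloud.screenshare.frameReady"), // hypothetical name
                                     NULL, NULL, true);
```

The reader registers for the same notification name with CFNotificationCenterAddObserver and, on receipt, reads the plist back from the shared container.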

One last gripe: when a screen recording ends abnormally, the system often shows a dialog that cannot be dismissed; the only fix is to reboot the device. That has to count as one of iOS's more annoying bugs.
