iOS Audio/Video Communication with WebRTC: A Summary (Updated 2020)

Attached is my Swift project. It covers the overall application framework for a Swift app, a networking layer, DSBridge for native-to-H5 interaction, uses of reflection, a WCDB database wrapper, a WebRTC audio/video live demo, socket usage, a socket protocol wrapper, and more. I hope it is useful. --> Complete Swift project, continuously updated and improved in 2020

My company needed WebRTC for audio/video communication. After going through many blogs and demos from home and abroad, here is a summary of the experience:
WebRTC official site
Notes on using WebRTC on iOS

Another standalone demo that is very helpful for understanding WebRTC: https://github.com/Xianlau/WebRTC_Demo

First, a screenshot of the demo:

[Screenshot: image.png]

WebRTC Architecture

A complete WebRTC setup consists of two parts: the server side and the client side.

  • Server side:
    STUN server: lets a device discover its public (external) network address
    TURN server: relays traffic when the point-to-point connection fails
    Signaling server: coordinates the end-to-end connection. At the start of a session the two peers must exchange signaling messages such as the SDP and candidates, and the signaling server forwards these between them.
  • Client side, four platforms:
    Android, iOS, PC, Browser

Below are the three main WebRTC APIs, followed by the flow for establishing a point-to-point connection.

  1. MediaStream: captures synchronized video and audio streams from the device's camera and microphone
  2. RTCPeerConnection: the component WebRTC uses to build a stable, efficient point-to-point media stream between peers
  3. RTCDataChannel: establishes a high-throughput, low-latency channel between peers for transferring arbitrary data.
    Among these, RTCPeerConnection is the core component of WebRTC.

WebRTC Connection Flow

[Figure: WebRTC connection flow diagram (webrtc流程图.png)]

Walk-through of the WebRTC connection flow

The main flow is shown in the diagram above; the detailed steps are as follows:

  1. The client establishes a persistent TCP connection to the server over a socket. WebRTC does not provide an API for this part, so use a third-party framework. For Objective-C, CocoaAsyncSocket is recommended: https://github.com/robbiehanson/CocoaAsyncSocket
    For Swift, engineers abroad tend to prefer Starscream (WebSocket):
    https://github.com/daltoniam/Starscream

  2. The client performs the offer SDP handshake through the signaling server.

    SDP (Session Description Protocol): describes the properties of the audio/video connection being established, such as the audio codec, the video codec, and whether audio/video is sent and/or received.
    The SDP is created by the PeerConnection in the WebRTC framework; see my demo for the details.

  3. The client performs the candidate handshake through the signaling server.

    Candidate: mainly contains each party's IP information, including the local LAN IP, the public IP, the TURN server IP, and the STUN server IP.
    Candidates are also created by the PeerConnection in the WebRTC framework; see my demo for the details. (A sketch of the signaling payloads used in steps 2 and 3 follows this list.)

  4. Once the SDP and candidate handshakes succeed, a point-to-point connection is established and the video stream is transmitted directly, without going through the server.
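The exact wire format of these signaling messages depends on your server. The code later in this post wraps every message in a SocketInfo envelope carrying an SDP or candidate payload; purely as an illustration (the field names below are assumptions, not a fixed protocol), the payloads could be modeled like this:

import Foundation
import WebRTC

// Hypothetical signaling payloads; adjust the field names to whatever your signaling server expects.
struct SDPSocket: Codable {
    let sdp: String   // the raw SDP string from RTCSessionDescription.sdp
    let type: String  // "offer" or "answer"
}

struct IceCandidate_Socket: Codable {
    let candidate: String     // the candidate line from RTCIceCandidate.sdp
    let sdpMLineIndex: Int32  // index of the media description this candidate belongs to
    let sdpMid: String?       // media stream identification tag

    init(from rtcCandidate: RTCIceCandidate) {
        self.candidate = rtcCandidate.sdp
        self.sdpMLineIndex = rtcCandidate.sdpMLineIndex
        self.sdpMid = rtcCandidate.sdpMid
    }
}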

The SDP handshake and the candidate handshake follow a similar pattern, but the SDP one is a little more involved, so here is a brief description of the SDP handshake flow:

The figure below shows how WebRTC completes an SDP handshake through signaling. Only after the SDP handshake do the two sides know each other's information, which is the basis for building the p2p channel.


[Figure: SDP handshake sequence (SDP.jpg)]

1. The anchor (caller) generates an SDP description with createOffer.
2. The anchor sets its local description with setLocalDescription.
3. The anchor sends the offer SDP to the viewer.
4. The viewer sets the remote description with setRemoteDescription.
5. The viewer creates its own SDP description with createAnswer.
6. The viewer sets its local description with setLocalDescription.
7. The viewer sends the answer SDP back to the anchor.
8. The anchor sets the remote description with setRemoteDescription.
9. After the SDP handshake, a direct end-to-end communication channel is established between the two peers.
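On the anchor side, steps 1-3 and 8 map onto the RTCPeerConnection API roughly as follows. This is a minimal sketch: sendToSignalingServer stands in for whatever signaling transport you use, and error handling is omitted.

import WebRTC

// Anchor side: create the offer, set it locally, send it over signaling,
// and later apply the viewer's answer when it arrives.
func startOffer(on peerConnection: RTCPeerConnection,
                sendToSignalingServer: @escaping (RTCSessionDescription) -> Void) {
    let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    peerConnection.offer(for: constraints) { sdp, error in
        guard let sdp = sdp else { return }                // step 1: createOffer
        peerConnection.setLocalDescription(sdp) { _ in     // step 2: setLocalDescription
            sendToSignalingServer(sdp)                     // step 3: send the offer via signaling
        }
    }
}

// Step 8: called when the viewer's answer SDP arrives over signaling.
func handleRemoteAnswer(_ answer: RTCSessionDescription, on peerConnection: RTCPeerConnection) {
    peerConnection.setRemoteDescription(answer) { error in
        if let error = error { debugPrint("setRemoteDescription failed: \(error)") }
    }
}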

Because real-world network environments are complicated and users are often behind private NATs, a p2p transmission can be blocked by NAT and firewalls. In that case the SDP handshake needs STUN/TURN/ICE NAT traversal techniques to guarantee that the p2p link can be established.
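Concretely, that just means listing your STUN/TURN servers in RTCConfiguration.iceServers when the peer connection is created. A minimal sketch, with placeholder URLs and credentials:

import WebRTC

// Placeholder STUN/TURN servers; replace them with your own deployment.
let config = RTCConfiguration()
config.iceServers = [
    RTCIceServer(urlStrings: ["stun:stun.example.com:3478"]),
    RTCIceServer(urlStrings: ["turn:turn.example.com:3478"],
                 username: "user",
                 credential: "password")
]
config.sdpSemantics = .unifiedPlan
// Pass `config` to RTCPeerConnectionFactory.peerConnection(with:constraints:delegate:), as shown in the WebRTCClient code below.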

1. Establish a persistent socket connection, laying the groundwork for the signaling that follows.

For the persistent connection to the server I chose a plain socket, using the third-party framework CocoaAsyncSocket; a WebSocket would work just as well, depending on your team's technology choice.

  • Below is the logic for establishing the socket connection and then the WebRTC connection. The socket part is actually very little code; reading the CocoaAsyncSocket README on GitHub is enough, so don't spend too much time there. The focus is on establishing the WebRTC connection. Also note that when exchanging data with the server you may need a framing (packet-splitting) strategy; a sketch of one possible implementation follows the class below.
  • Most of the code available online is Objective-C, much of it outdated and scattered; the OC version is relatively simple. What follows is a Swift version. Before reading the code, please make sure you have studied the two sequence diagrams above.
import Foundation
import CocoaAsyncSocket
import WebRTC

// MARK: - Socket state delegate
protocol SocketClientDelegate: class {
    
    func signalClientDidConnect(_ signalClient: SocketClient)
    func signalClientDidDisconnect(_ signalClient: SocketClient)
    func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription)
    func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate)
}

final class SocketClient: NSObject {
    
    // The underlying socket
    var socket: GCDAsyncSocket = {
       return GCDAsyncSocket.init()
    }()
    
    private var host: String? // Server IP
    private var port: UInt16? // Port
    weak var delegate: SocketClientDelegate? // Delegate
    
    var receiveHeartBeatDuation = 0 // Heartbeat counter
    let heartBeatOverTime = 10 // Heartbeat timeout (seconds)
    var sendHeartbeatTimer: Timer? // Timer for sending heartbeats
    var receiveHeartbearTimer: Timer? // Timer for checking received heartbeats

    // Buffer for received data
    var dataBuffer:Data = Data.init()
    
    // Our peer_id obtained at login
    var peer_id = 0
    // The remote device's peer_id obtained at login
    var remote_peer_id = 0

    // MARK:- Init
    init(hostStr: String , port: UInt16) {
        super.init()
        
        self.socket.delegate = self
        self.socket.delegateQueue = DispatchQueue.main
        self.host = hostStr
        self.port = port
        // Start connecting
        connect()
    }

    // MARK:- Connect
    func connect() {
        
        do {
            try  self.socket.connect(toHost: self.host ?? "", onPort: self.port ?? 6868, withTimeout: -1)
            
        }catch {
            print(error)
        }
    }
    
    // MARK:- Send a message
    func sendMessage(_ data: Data){
        self.socket.write(data, withTimeout: -1, tag: 0)
    }

    // MARK:- Send SDP offer/answer
    func send(sdp rtcSdp: RTCSessionDescription) {
        
        // Convert to our own SDP wire format
        let type = rtcSdp.type
        var typeStr = ""
        switch type {
        case .answer:
            typeStr = "answer"
        case .offer:
            typeStr = "offer"
        default:
            print("sdpType错误")
        }
        let newSDP:SDPSocket = SDPSocket.init(sdp: rtcSdp.sdp, type: typeStr)
        let jsonInfo = newSDP.toJSON()
        let dic = ["sdp" : jsonInfo]
        let info:SocketInfo = SocketInfo.init(type: .sdp, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
        let data = self.packData(info: info)
        //print(data)
        self.sendMessage(data)
        print("发送SDP")
    }

    // MARK:- Send ICE candidate
    func send(candidate rtcIceCandidate: RTCIceCandidate) {
        
        let iceCandidateMessage = IceCandidate_Socket(from: rtcIceCandidate)
        let jsonInfo = iceCandidateMessage.toJSON()
        let dic = ["icecandidate" : jsonInfo]
        let info:SocketInfo = SocketInfo.init(type: .icecandidate, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
        let data = self.packData(info: info)
        //print(data)
        self.sendMessage(data)
         print("发送ICE")
    }
}

extension SocketClient: GCDAsyncSocketDelegate {
    
    // MARK:- Socket connected
    func socket(_ sock: GCDAsyncSocket, didConnectToHost host: String, port: UInt16) {
        
        debugPrint("socket连接成功")
        self.delegate?.signalClientDidConnect(self)
        
        // Log in to obtain our peer_id
        login()
        // Start sending heartbeats
        startHeartbeatTimer()
        // Start the received-heartbeat timer
        startReceiveHeartbeatTimer()
        
        // Keep reading data
        socket.readData(withTimeout: -1, tag: 0)
    }
    
    // MARK:- Receive data: the socket received a packet
    func socket(_ sock: GCDAsyncSocket, didRead data: Data, withTag tag: Int) {
        
        //debugPrint("socket接收到一个数据包")
        let _:SocketInfo? = self.unpackData(data)
        //let type:SigType = SigType(rawValue: socketInfo?.type ?? "")!
        //print(socketInfo ?? "")
        //print(type)

        // Keep reading data
        socket.readData(withTimeout: -1, tag: 0)
    }
    
    // MARK:- Disconnected
    func socketDidDisconnect(_ sock: GCDAsyncSocket, withError err: Error?) {
        
        debugPrint("socket断开连接")
        print(err ?? "")
        
        self.disconnectSocket()
        
        // try to reconnect after five seconds
        DispatchQueue.global().asyncAfter(deadline: .now() + 5) {
            debugPrint("Trying to reconnect to signaling server...")
            self.connect()
        }
    }

}
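The class above calls packData(info:) and unpackData(_:), which are not shown here because they depend entirely on your server's wire protocol. Purely as an illustration, a length-prefixed JSON framing could look like the sketch below; the SocketInfo helpers toJSONData() and init(json:) are assumed, not part of the original code.

extension SocketClient {

    // Hypothetical framing: a 4-byte big-endian length prefix followed by a JSON body.
    func packData(info: SocketInfo) -> Data {
        let body = info.toJSONData() // assumed helper that serializes SocketInfo to JSON Data
        var frame = Data()
        withUnsafeBytes(of: UInt32(body.count).bigEndian) { frame.append(contentsOf: $0) }
        frame.append(body)
        return frame
    }

    // Accumulate incoming bytes in dataBuffer and peel off one complete frame, if available.
    func unpackData(_ data: Data) -> SocketInfo? {
        dataBuffer.append(data)
        guard dataBuffer.count >= 4 else { return nil }
        let header = [UInt8](dataBuffer.prefix(4))
        let length = header.reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
        guard dataBuffer.count >= 4 + Int(length) else { return nil }
        let body = dataBuffer.dropFirst(4).prefix(Int(length))
        dataBuffer = Data(dataBuffer.dropFirst(4 + Int(length)))
        return SocketInfo(json: Data(body)) // assumed initializer that decodes the JSON body
    }
}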

2. Perform the signaling and establish the end-to-end connection.


import Foundation
import WebRTC

// MARK: - WebRTC connection state delegate
protocol WebRTCClientDelegate: class {
    func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate)
    func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState)
    func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data)
}

final class WebRTCClient: NSObject {

    // MARK:- Lazily created peer connection factory
    private static let factory: RTCPeerConnectionFactory = {
        RTCInitializeSSL()
        let videoEncoderFactory = RTCVideoEncoderFactoryH264()
        let videoDecoderFactory = RTCVideoDecoderFactoryH264()
        let factory = RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)

//        let options = RTCPeerConnectionFactoryOptions()
//        options.ignoreVPNNetworkAdapter = true
//        options.ignoreWiFiNetworkAdapter = true
//        options.ignoreCellularNetworkAdapter = true
//        options.ignoreEthernetNetworkAdapter = true
//        options.ignoreLoopbackNetworkAdapter = true
//        factory.setOptions(options)
        return factory
    }()
    
    weak var delegate: WebRTCClientDelegate?
    
    private let peerConnection: RTCPeerConnection
    private let rtcAudioSession =  RTCAudioSession.sharedInstance()
    private let audioQueue = DispatchQueue(label: "audio")
    private let mediaConstrains = [kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueTrue,
                                   kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueTrue]    
    private var videoCapturer: RTCVideoCapturer?
    private var localVideoTrack: RTCVideoTrack?
    private var remoteVideoTrack: RTCVideoTrack?
    private var localDataChannel: RTCDataChannel?
    private var remoteDataChannel: RTCDataChannel?

    @available(*, unavailable)
    override init() {
        fatalError("WebRTCClient:init is unavailable")
    }
    
    required init(iceServers: [String]) {

//        // gatherContinually lets WebRTC listen for network changes and send any new candidates to the other client
//        config.continualGatheringPolicy = .gatherContinually

        //config.iceTransportPolicy = .all
        //constraints: control the contents of the MediaStream (media type, resolution, frame rate)
//        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
//                                              optionalConstraints: ["DtlsSrtpKeyAgreement":kRTCMediaConstraintsValueTrue])
        
        let config = RTCConfiguration()
        config.iceServers = [RTCIceServer(urlStrings: iceServers)]
        // Unified Plan is preferred over the older Plan B
        config.sdpSemantics = .unifiedPlan
        // constraints: control the contents of the MediaStream (media type, resolution, frame rate)
        let mediaConstraints = RTCMediaConstraints.init(mandatoryConstraints: nil, optionalConstraints: nil)
        self.peerConnection = WebRTCClient.factory.peerConnection(with: config, constraints: mediaConstraints, delegate: nil)
        
        super.init()
        self.createMediaSenders()
        self.configureAudioSession()
        self.peerConnection.delegate = self
    }
    
    
    // MARK: Hang up (close the connection)
    func disconnect(){
        self.peerConnection.close()
    }
    
    // MARK: Signaling - create the local SDP offer to send to the signaling server
    func offer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void) {
        let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
                                             optionalConstraints: nil)
        self.peerConnection.offer(for: constrains) { (sdp, error) in
            guard let sdp = sdp else {
                return
            }
            
            self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
                completion(sdp)
            })
        }
    }
    // MARK:- Reply to the signaling server with an SDP answer
    func answer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void)  {
        let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
                                             optionalConstraints: nil)
        // Create the local SDP (answer)
        self.peerConnection.answer(for: constrains) { (sdp, error) in
            guard let sdp = sdp else {
                return
            }
            // Set the local SDP
            self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
                // Hand the SDP back to the caller to send
                completion(sdp)
            })
        }
    }
    // MARK:- Set the remote SDP
    func set(remoteSdp: RTCSessionDescription, completion: @escaping (Error?) -> ()) {
        self.peerConnection.setRemoteDescription(remoteSdp, completionHandler: completion)
    }
    
    // MARK:- Add a remote candidate
    func set(remoteCandidate: RTCIceCandidate) {
        self.peerConnection.add(remoteCandidate)
    }
    
    // MARK: Media
    func startCaptureLocalVideo(renderer: RTCVideoRenderer) {
        guard let capturer = self.videoCapturer as? RTCCameraVideoCapturer else {
            return
        }

        guard
            // Use the front camera (.front); use .back for the rear camera
            let frontCamera = (RTCCameraVideoCapturer.captureDevices().first { $0.position == .front }),
            // choose highest res
            let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
                let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
                let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
                return width1 < width2
            }).last,
        
            // choose highest fps
            let fps = (format.videoSupportedFrameRateRanges.sorted { return $0.maxFrameRate < $1.maxFrameRate }.last) else {
            return
        }

        capturer.startCapture(with: frontCamera,
                              format: format,
                              fps: Int(fps.maxFrameRate))
        
        self.localVideoTrack?.add(renderer)
    }
    
    func renderRemoteVideo(to renderer: RTCVideoRenderer) {

        self.remoteVideoTrack?.add(renderer)
    }
    
    private func configureAudioSession() {
        self.rtcAudioSession.lockForConfiguration()
        do {
            try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
            try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
        } catch let error {
            debugPrint("Error changeing AVAudioSession category: \(error)")
        }
        self.rtcAudioSession.unlockForConfiguration()
    }
    
    // MARK:- Create the media senders
    private func createMediaSenders() {
        let streamId = "stream"
        
        // Audio
        let audioTrack = self.createAudioTrack()
        self.peerConnection.add(audioTrack, streamIds: [streamId])
        
        // Video
        let videoTrack = self.createVideoTrack()
        self.localVideoTrack = videoTrack
        self.peerConnection.add(videoTrack, streamIds: [streamId])
        self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack

        // Optionally constrain the remote video output format:
        //self.remoteVideoTrack?.source.adaptOutputFormat(toWidth: 960, height: 480, fps: 30)

        // Data
        if let dataChannel = createDataChannel() {
            dataChannel.delegate = self
            self.localDataChannel = dataChannel
        }
    }
    
    // MARK:- Create the audio track
    private func createAudioTrack() -> RTCAudioTrack {
        let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        let audioSource = WebRTCClient.factory.audioSource(with: audioConstrains)
        let audioTrack = WebRTCClient.factory.audioTrack(with: audioSource, trackId: "audio0")
        return audioTrack
    }
    
    // MARK:- Create the video track
    private func createVideoTrack() -> RTCVideoTrack {
        let videoSource = WebRTCClient.factory.videoSource()
        
        #if targetEnvironment(simulator)
        self.videoCapturer = RTCFileVideoCapturer(delegate: videoSource)
        #else
        self.videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
        #endif
        
        let videoTrack = WebRTCClient.factory.videoTrack(with: videoSource, trackId: "video0")
        return videoTrack
    }
    
    // MARK: Data Channels
    // MARK:- Create the data channel
    private func createDataChannel() -> RTCDataChannel? {
        let config = RTCDataChannelConfiguration()
        guard let dataChannel = self.peerConnection.dataChannel(forLabel: "WebRTCData", configuration: config) else {
            debugPrint("Warning: Couldn't create data channel.")
            return nil
        }
        return dataChannel
    }
    
    // MARK:- Send data over the data channel
    func sendData(_ data: Data) {
        let buffer = RTCDataBuffer(data: data, isBinary: true)
        self.remoteDataChannel?.sendData(buffer)
    }
}

// MARK:- Audio control
extension WebRTCClient {
    func muteAudio() {
        self.setAudioEnabled(false)
    }
    
    func unmuteAudio() {
        self.setAudioEnabled(true)
    }
    
    // Fallback to the default playing device: headphones/bluetooth/ear speaker
    func speakerOff() {
        self.audioQueue.async { [weak self] in
            guard let self = self else {
                return
            }
            
            self.rtcAudioSession.lockForConfiguration()
            do {
                try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
                try self.rtcAudioSession.overrideOutputAudioPort(.none)
            } catch let error {
                debugPrint("Error setting AVAudioSession category: \(error)")
            }
            self.rtcAudioSession.unlockForConfiguration()
        }
    }
    
    // Force speaker
    func speakerOn() {
        self.audioQueue.async { [weak self] in
            guard let self = self else {
                return
            }
            
            self.rtcAudioSession.lockForConfiguration()
            do {
                try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
                try self.rtcAudioSession.overrideOutputAudioPort(.speaker)
                try self.rtcAudioSession.setActive(true)
            } catch let error {
                debugPrint("Couldn't force audio to speaker: \(error)")
            }
            self.rtcAudioSession.unlockForConfiguration()
        }
    }
    
    private func setAudioEnabled(_ isEnabled: Bool) {
        let audioTracks = self.peerConnection.transceivers.compactMap { return $0.sender.track as? RTCAudioTrack }
        audioTracks.forEach { $0.isEnabled = isEnabled }
    }
}

extension WebRTCClient: RTCDataChannelDelegate {
    func dataChannelDidChangeState(_ dataChannel: RTCDataChannel) {
        debugPrint("dataChannel did change state: \(dataChannel.readyState)")
    }
    
    func dataChannel(_ dataChannel: RTCDataChannel, didReceiveMessageWith buffer: RTCDataBuffer) {
        self.delegate?.webRTCClient(self, didReceiveData: buffer.data)
    }
}
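One thing to note: the initializer above sets self.peerConnection.delegate = self, so WebRTCClient must also conform to RTCPeerConnectionDelegate; that extension is not shown in the listing above. A minimal sketch of the conformance, forwarding the callbacks the rest of this post relies on (local ICE candidates, ICE connection state changes, and the remote data channel), looks roughly like this:

extension WebRTCClient: RTCPeerConnectionDelegate {

    func peerConnection(_ peerConnection: RTCPeerConnection, didChange stateChanged: RTCSignalingState) {
        debugPrint("peerConnection new signaling state: \(stateChanged)")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {
        debugPrint("peerConnection did add stream")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove stream: RTCMediaStream) {
        debugPrint("peerConnection did remove stream")
    }

    func peerConnectionShouldNegotiate(_ peerConnection: RTCPeerConnection) {
        debugPrint("peerConnection should negotiate")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {
        // Forward ICE connection state changes (connected/failed/...) to the delegate.
        self.delegate?.webRTCClient(self, didChangeConnectionState: newState)
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceGatheringState) {
        debugPrint("peerConnection new gathering state: \(newState)")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didGenerate candidate: RTCIceCandidate) {
        // A local ICE candidate was gathered; hand it to the delegate so it can be sent over signaling.
        self.delegate?.webRTCClient(self, didDiscoverLocalCandidate: candidate)
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove candidates: [RTCIceCandidate]) {
        debugPrint("peerConnection did remove candidate(s)")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didOpen dataChannel: RTCDataChannel) {
        // The remote side opened a data channel; keep it so sendData(_:) can use it.
        self.remoteDataChannel = dataChannel
    }
}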

3. Wrap the WebRTC module in a manager

import Foundation
import AVFoundation
import WebRTC

// MARK:- Video link connection state
public enum RtcConnectedState {
    
    case sucessed   // connected successfully
    case falure     // connection failed
    case connecting // connecting
}


protocol WebRTCManagerDelegate: class {

    // Whether the socket is connected
    func webRTCManager(_ manager: WebRTCManager, socketConnectState isSucessed: Bool)
    
    // WebRTC connection state
    func webRTCManager(_ manager: WebRTCManager, didChangeConnectionState state: RTCIceConnectionState)
}

class WebRTCManager {
    
    static let shareInstance:WebRTCManager  = WebRTCManager()
    
    //private let signalClient: SignalingClient
    var signalClient: SocketClient?
    var webRTCClient: WebRTCClient?

    /// Pass in the config when initializing
    var sockitConfig: SocketConfig =  SocketConfig.default
    
    // Delegate
    weak var delegate: WebRTCManagerDelegate?
    
    var remoteCandidate: Int = 0
    /// Callback invoked when the RTC connection succeeds
    var feedbackConnectedBlock: ((_ webClient: WebRTCClient)->())?

    // MARK:- Disconnect the socket
    public func disconnect(){
        
        self.signalClient?.disconnectSocket()
        
        self.webRTCClient?.disconnect()
        self.signalClient?.delegate = nil
        self.webRTCClient?.delegate = nil
        self.signalClient = nil
        self.webRTCClient = nil
        remoteCandidate = 0
    }
    // MARK:- Connect the socket
    public func connect(){

        // Print RTC logs
        //RTCSetMinDebugLogLevel(.verbose)
        //let log = RTCFileLogger.init()
        //log.start()
        
        // Create the socket and RTC objects
        signalClient = SocketClient.init(hostStr: sockitConfig.host, port: sockitConfig.port)
        webRTCClient = WebRTCClient(iceServers: sockitConfig.webRTCIceServers)
        webRTCClient?.delegate = self
        signalClient?.delegate = self

        self.signalClient?.connect()
    }
    
}

extension WebRTCManager: SocketClientDelegate {
    
    // Socket login succeeded
    func signalClientdidLogin(_ signalClient: SocketClient) {
        logger.info("********socket登录成功************************")
    }
    
    // MARK:- Socket connected
    func signalClientDidConnect(_ signalClient: SocketClient) {

        self.delegate?.webRTCManager(self, socketConnectState: true)
    }

    // MARK:- Socket disconnected
    func signalClientDidDisconnect(_ signalClient: SocketClient) {
        
        self.delegate?.webRTCManager(self, socketConnectState: false)
    }
    
    // MARK:- Received the remote SDP
    func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription) {
        logger.info("************收到对方sdp****************************")
        
        // Set the remote SDP, then reply with an answer
        self.webRTCClient?.set(remoteSdp: sdp) { (error) in
            
            self.webRTCClient?.answer { (localSdp) in
                
                self.signalClient?.send(sdp: localSdp)
            }
            
            logger.error(error.debugDescription)
        }
        
    }
    // MARK:- Received a remote ICE candidate
    func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate) {
         logger.info("************收到对方ice****************************")
        
        self.remoteCandidate += 1
        // Add the remote ICE candidate
        self.webRTCClient?.set(remoteCandidate: candidate)
    }
}

extension WebRTCManager: WebRTCClientDelegate {
    
    // MARK:- Discovered a local ICE candidate
    func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate) {
        logger.info("********************************发现本地 ice candidate **********")
        
        self.signalClient?.send(candidate: candidate)
    }
    // MARK:- RTC connection state changed
    func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState) {
        
        self.delegate?.webRTCManager(self, didChangeConnectionState: state)
        
        switch state {
        case .connected, .completed:
            
            logger.info("*********RTC连接状态成功*****************************************")
            if let block = feedbackConnectedBlock {
                block(client)
            }
            
        case .disconnected:
            logger.info("*********RTC失去连接*****************************************")
            
        case .failed, .closed:
            
            logger.info("*********RTC连接失败*****************************************")
        case .new, .checking, .count: break
            
        @unknown default: break
            
        }

    }
    // MARK:- Received data on the RTC data channel
    func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data) {
//        DispatchQueue.main.async {
//            let message = String(data: data, encoding: .utf8) ?? "(Binary: \(data.count) bytes)"
//            let alert = UIAlertController(title: "Message from WebRTC", message: message, preferredStyle: .alert)
//            alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
//            self.present(alert, animated: true, completion: nil)
//        }
    }
}
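To round things off, here is a rough sketch of how a view controller might drive the manager. The view names and the render view are assumptions for illustration; see the demo project for the full wiring (local preview, the offer side, UI, and so on).

import UIKit
import WebRTC

class VideoCallViewController: UIViewController, WebRTCManagerDelegate {

    private let remoteVideoView = RTCMTLVideoView() // Metal-backed render view (use RTCEAGLVideoView on the simulator)

    override func viewDidLoad() {
        super.viewDidLoad()
        remoteVideoView.frame = view.bounds
        view.addSubview(remoteVideoView)

        let manager = WebRTCManager.shareInstance
        manager.delegate = self
        // Render the remote stream once the RTC connection comes up.
        manager.feedbackConnectedBlock = { [weak self] webRTCClient in
            guard let self = self else { return }
            DispatchQueue.main.async {
                webRTCClient.renderRemoteVideo(to: self.remoteVideoView)
            }
        }
        manager.connect()
    }

    // MARK:- WebRTCManagerDelegate
    func webRTCManager(_ manager: WebRTCManager, socketConnectState isSucessed: Bool) {
        debugPrint("socket connected: \(isSucessed)")
    }

    func webRTCManager(_ manager: WebRTCManager, didChangeConnectionState state: RTCIceConnectionState) {
        debugPrint("RTC connection state: \(state)")
    }
}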

Continuously being updated...
If you have any questions, reach me on QQ: 506299396