I built a minimal implementation that captures the screen of an iPhone connected to a Mac by cable, in real time.
Until now you had to launch QuickTime Player and pick the iPhone under "New Movie Recording"; now you can do the same thing from your own program.

The code is uploaded to GitHub: https://github.com/satoshi0212/DeviceCameraMonitorSample
I post updates on topics such as virtual cameras, AR, and video expression, including this implementation, on Twitter: https://twitter.com/shmdevelop
"Hardware" and "Camera" must be selected.

plist
Add "Privacy - Camera Usage Description" (the NSCameraUsageDescription key) to Info.plist.
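Without this entry, recent versions of macOS terminate the app as soon as it tries to access the camera, so it has to be in place before any capture code runs.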

Setting the following property before running `AVCaptureDevice.DiscoverySession` makes external (screen-capture) devices visible, on an opt-in basis:
        import CoreMediaIO

        // Opt in to CoreMediaIO screen-capture (DAL) devices, which include a cabled iPhone.
        var prop = CMIOObjectPropertyAddress(
            mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
            mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
            mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
        var allow: UInt32 = 1
        CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &prop, 0, nil,
                                  UInt32(MemoryLayout.size(ofValue: allow)), &allow)
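This flag is process-wide and only needs to be set once, before the first discovery query (for example at application launch); without it, screen-capture devices such as the iPhone never show up in the results at all.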
With that flag set, a search with the following parameters includes the iPhone in `devices`.
You can identify the iPhone by filtering the discovered devices on `modelID` and `manufacturer`:
        // .externalUnknown is the device type a cabled iPhone is exposed under on macOS.
        let devices = AVCaptureDevice.DiscoverySession(deviceTypes: [.externalUnknown],
                                                       mediaType: nil,
                                                       position: .unspecified).devices
        if let device = devices.first(where: { $0.modelID == "iOS Device" && $0.manufacturer == "Apple Inc." }) {
            ...
        }
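Once the device is found, the rest is the standard AVFoundation capture pipeline. Here is a minimal sketch of that wiring; the sample on GitHub does roughly this, but the class name, queue label, and error handling below are my own:

    import AVFoundation

    final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()

        func startCapture(device: AVCaptureDevice) throws {
            // Feed the iPhone's screen into the session like any other camera input.
            let input = try AVCaptureDeviceInput(device: device)
            guard session.canAddInput(input) else { return }
            session.addInput(input)

            // Deliver raw frames to the captureOutput(_:didOutput:from:) delegate method.
            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
            guard session.canAddOutput(output) else { return }
            session.addOutput(output)

            session.startRunning()
        }
    }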
However, the iPhone may not be found immediately after launch or right after the search, so it was also necessary to observe the `AVCaptureDeviceWasConnected` notification:
        let nc = NotificationCenter.default
        // Fires once the iPhone becomes available, which can be after the initial search.
        nc.addObserver(forName: .AVCaptureDeviceWasConnected, object: nil, queue: .main) { notification in
            print(notification)
            guard let device = notification.object as? AVCaptureDevice else { return }
            ...
        }
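The snippet above only covers connection. For robustness you would probably also want to react to the device going away; a sketch of that, which is my addition rather than part of the sample:

        // Tear down when the iPhone is unplugged (not shown in the original snippet).
        nc.addObserver(forName: .AVCaptureDeviceWasDisconnected, object: nil, queue: .main) { notification in
            guard let device = notification.object as? AVCaptureDevice else { return }
            print("Disconnected: \(device.localizedName)")
            // e.g. stop the session and remove the matching input here.
        }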
In the uploaded implementation, the image is resized for screen display.
The height is treated as a fixed value, the width is derived from the buffer's aspect ratio, and that size is applied to the imageView.
The image itself is resized with a CGAffineTransform:
    private func resizeIfNeeded(w: CGFloat, h: CGFloat) {
        // Compute the target rect only once, from the first frame's dimensions.
        guard targetRect == nil else { return }
        // Fix the height and scale the width to preserve the frame's aspect ratio.
        let aspect = h / fixedHeight
        let rect = CGRect(x: 0, y: 0, width: floor(w / aspect), height: fixedHeight)
        imageView.frame = rect
        targetRect = rect
    }
    ...
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        connection.videoOrientation = .portrait
        DispatchQueue.main.async {
            // Wrap the incoming pixel buffer in a CIImage for transformation.
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            let w = CGFloat(CVPixelBufferGetWidth(pixelBuffer))
            let h = CGFloat(CVPixelBufferGetHeight(pixelBuffer))
            // Size the imageView once, based on the first frame.
            self.resizeIfNeeded(w: w, h: h)
            guard let targetRect = self.targetRect else { return }
            // Scale the frame down to the target rect and render it for display.
            let m = CGAffineTransform(scaleX: targetRect.width / w, y: targetRect.height / h)
            let resizedImage = ciImage.transformed(by: m)
            guard let cgImage = self.context.createCGImage(resizedImage, from: targetRect) else { return }
            self.imageView.image = NSImage(cgImage: cgImage, size: targetRect.size)
        }
    }
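The code above relies on a few properties declared elsewhere in the sample. Roughly, the declarations would look like this; the names come from the snippets, but the types and the fixed height value are my assumptions:

    // Assumed supporting declarations (names taken from the snippets above).
    private let context = CIContext()       // created once; building a CIContext per frame is expensive
    private var targetRect: CGRect?         // set on the first frame by resizeIfNeeded
    private let fixedHeight: CGFloat = 480  // hypothetical fixed display height
    @IBOutlet private weak var imageView: NSImageView!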