RealTimeEngine

public class RealTimeEngine : NSObject

A class to perform real-time optical character recognition

  • The region within the AVCaptureVideoPreviewLayer in which OCR is to be performed. If a UIView is used to define the region of interest, this property must be assigned the UIView’s frame, and that view must be a subview of the AVCaptureVideoPreviewLayer’s parent view.

    Declaration

    Swift

    public var regionOfInterest: CGRect?
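
    For example, if a translucent UIView named regionOfInterestView outlines the scan area, its frame can be assigned directly (a minimal sketch; the view names are illustrative and not part of the API):

    // engine is an existing RealTimeEngine instance.
    // regionOfInterestView is a subview of the view that hosts the
    // AVCaptureVideoPreviewLayer and visually outlines the scan area.
    engine.regionOfInterest = regionOfInterestView.frame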
  • Controls whether recognition is running. The default is true. Setting the value to false allows the preview to remain active without processing incoming video frames. If recognition should not be active immediately after initialization, set this value to false right after creating an instance of SwiftyTesseractRTE.

    Declaration

    Swift

    public var recognitionIsActive: Bool
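
    For example, recognition can be paused immediately after initialization and resumed when the user is ready to scan (a sketch; swiftyTesseract is an existing instance, and .verifiable is assumed to be a valid RecognitionReliability case):

    let engine = RealTimeEngine(swiftyTesseract: swiftyTesseract,
                                desiredReliability: .verifiable)
    engine.recognitionIsActive = false  // preview runs, but frames are not processed

    // Later, e.g. when the user taps a scan button:
    engine.recognitionIsActive = true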
  • The quality of the previewLayer video session. The default is .medium. This setting only affects how the video is displayed to the user; values above .medium do not improve OCR results. Setting the quality higher results in decreased performance.

    Declaration

    Swift

    public var cameraQuality: AVCaptureSession.Preset { get set }
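
    A quick sketch of raising the preview quality for display purposes only:

    // Sharper preview for the user at the cost of performance;
    // no effect on OCR results beyond .medium.
    engine.cameraQuality = .high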
  • Action to be performed after successful recognition

    Declaration

    Swift

    public var onRecognitionComplete: ((String) -> ())?
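
    The closure receives the recognized string. A minimal sketch that updates a label from inside a view controller (resultLabel is an assumed UILabel outlet; dispatching to the main queue is a precaution, since the delivery queue is not specified here):

    engine.onRecognitionComplete = { [weak self] recognizedString in
      DispatchQueue.main.async {
        self?.resultLabel.text = recognizedString
      }
    }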
  • Primary Initializer - Uses SwiftyTesseractRTE defaults

    Declaration

    Swift

    public convenience init(swiftyTesseract: SwiftyTesseract,
                            desiredReliability: RecognitionReliability,
                            cameraQuality: AVCaptureSession.Preset = .medium,
                            onRecognitionComplete: ((String) -> ())? = nil)

    Parameters

    swiftyTesseract

    Instance of SwiftyTesseract

    desiredReliability

    The desired reliability of the recognition results.

    cameraQuality

    The desired camera quality output seen by the end user. The default is .medium. Anything higher than .medium has no impact on recognition reliability.

    onRecognitionComplete

    Action to be performed after successful recognition
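
    A minimal end-to-end sketch of the primary initializer inside a view controller (the module names, the .english language case, and the .verifiable reliability case are assumptions based on typical SwiftyTesseract usage):

    import SwiftyTesseract
    import SwiftyTesseractRTE
    import UIKit

    final class ScannerViewController: UIViewController {
      @IBOutlet weak var previewView: UIView!

      private var engine: RealTimeEngine!

      override func viewDidLoad() {
        super.viewDidLoad()
        let swiftyTesseract = SwiftyTesseract(language: .english)
        engine = RealTimeEngine(swiftyTesseract: swiftyTesseract,
                                desiredReliability: .verifiable) { recognizedString in
          print(recognizedString)
        }
        engine.bindPreviewLayer(to: previewView)
      }
    }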

  • Initializer that accepts a custom AVSampleProcessor

    Declaration

    Swift

    public convenience init(swiftyTesseract: SwiftyTesseract,
                            desiredReliability: RecognitionReliability,
                            imageProcessor: AVSampleProcessor,
                            cameraQuality: AVCaptureSession.Preset = .medium,
                            onRecognitionComplete: ((String) -> ())? = nil)

    Parameters

    swiftyTesseract

    Instance of SwiftyTesseract

    desiredReliability

    The desired reliability of the recognition results.

    imageProcessor

    Performs conversion and processing from CMSampleBuffer to UIImage

    cameraQuality

    The desired camera quality output seen by the end user. The default is .medium. Anything higher than .medium has no impact on recognition reliability.

    onRecognitionComplete

    Action to be performed after successful recognition
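
    A sketch of supplying a custom processor; MyThresholdingProcessor is a hypothetical type conforming to AVSampleProcessor, not something provided by the library:

    // MyThresholdingProcessor performs custom CMSampleBuffer-to-UIImage
    // conversion and processing before recognition.
    let engine = RealTimeEngine(swiftyTesseract: swiftyTesseract,
                                desiredReliability: .verifiable,
                                imageProcessor: MyThresholdingProcessor())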

  • Initializer that accepts a custom AVManager

    Declaration

    Swift

    public convenience init(swiftyTesseract: SwiftyTesseract,
                            desiredReliability: RecognitionReliability,
                            avManager: AVManager,
                            onRecognitionComplete: ((String) -> ())? = nil)

    Parameters

    swiftyTesseract

    Instance of SwiftyTesseract

    desiredReliability

    The desired reliability of the recognition results.

    avManager

    Manages the AVCaptureSession

    onRecognitionComplete

    Action to be performed after successful recognition
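
    A sketch of injecting a custom capture manager; MyAVManager is a hypothetical type conforming to AVManager (useful, for example, for testing without a live camera):

    // MyAVManager owns and configures the AVCaptureSession.
    let engine = RealTimeEngine(swiftyTesseract: swiftyTesseract,
                                desiredReliability: .verifiable,
                                avManager: MyAVManager())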

  • Initializer that accepts both a custom AVSampleProcessor and a custom AVManager

    Declaration

    Swift

    public convenience init(swiftyTesseract: SwiftyTesseract,
                            desiredReliability: RecognitionReliability,
                            imageProcessor: AVSampleProcessor,
                            avManager: AVManager,
                            onRecognitionComplete: ((String) -> ())? = nil)

    Parameters

    swiftyTesseract

    Instance of SwiftyTesseract

    desiredReliability

    The desired reliability of the recognition results.

    imageProcessor

    Performs conversion and processing from CMSampleBuffer to UIImage

    avManager

    Manages the AVCaptureSession

    onRecognitionComplete

    Action to be performed after successful recognition
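
    Both dependencies can be injected together; as above, MyThresholdingProcessor and MyAVManager are hypothetical conformances:

    let engine = RealTimeEngine(swiftyTesseract: swiftyTesseract,
                                desiredReliability: .verifiable,
                                imageProcessor: MyThresholdingProcessor(),
                                avManager: MyAVManager())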

  • Stops the camera preview

    Declaration

    Swift

    public func stopPreview()
  • Restarts the camera preview

    Declaration

    Swift

    public func startPreview()
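
    For example, the preview can be tied to the view controller's visibility (a sketch; pausing in viewWillDisappear is a design choice, not a requirement):

    override func viewWillAppear(_ animated: Bool) {
      super.viewWillAppear(animated)
      engine.startPreview()
    }

    override func viewWillDisappear(_ animated: Bool) {
      super.viewWillDisappear(animated)
      engine.stopPreview()
    }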
  • Binds the SwiftyTesseractRTE AVCaptureVideoPreviewLayer to a UIView.

    Declaration

    Swift

    public func bindPreviewLayer(to view: UIView)

    Parameters

    view

    The view to present the live preview
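
    Typically called once during view setup; a sketch assuming previewView is a plain UIView in the view controller's hierarchy:

    override func viewDidLoad() {
      super.viewDidLoad()
      // previewView's bounds define where the live camera preview is displayed.
      engine.bindPreviewLayer(to: previewView)
    }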

  • Provides conformance to AVCaptureVideoDataOutputSampleBufferDelegate

    Declaration

    Swift

    public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

    Parameters

    output

    AVCaptureOutput

    sampleBuffer

    CMSampleBuffer

    connection

    AVCaptureConnection