AVSampleProcessor
public protocol AVSampleProcessor
To be implemented if custom image processing is needed. The default SwiftyTesseractRTE implementation performs a small amount of image enhancement and conversion to grayscale. If the results from RealTimeEngine do not come back reliably, performing your own image processing may be necessary to achieve optimal results.
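A minimal sketch of adopting the protocol is shown below. The pass-through bodies are placeholders to illustrate the conformance shape only; they are not the library's default enhancement pipeline.

```swift
import AVFoundation
import UIKit

// A minimal AVSampleProcessor conformance; both bodies are placeholders.
struct PassthroughProcessor: AVSampleProcessor {
  func convertToGrayscaleUiImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    // Placeholder: wraps the frame without any grayscale conversion.
    return UIImage(ciImage: CIImage(cvPixelBuffer: pixelBuffer))
  }

  func crop(_ image: UIImage,
            toBoundsOf areaOfInterest: CGRect,
            containedIn previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Placeholder: returns the image uncropped.
    return image
  }
}
```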
-
Converts a CMSampleBuffer into a grayscale UIImage.

Declaration

Swift

func convertToGrayscaleUiImage(from sampleBuffer: CMSampleBuffer) -> UIImage?

Parameters

sampleBuffer — The incoming CMSampleBuffer from the AVCaptureSession

Return Value

An optional grayscale UIImage
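One possible implementation sketch for this requirement uses Core Image's CIPhotoEffectMono filter to desaturate the captured frame. The filter choice is an assumption; any grayscale conversion that suits your capture pipeline will do.

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch: convert a captured frame to grayscale with CIPhotoEffectMono.
func grayscaleImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
  guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
  let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
  guard let filter = CIFilter(name: "CIPhotoEffectMono") else { return nil }
  filter.setValue(ciImage, forKey: kCIInputImageKey)
  guard let output = filter.outputImage,
        let cgImage = CIContext().createCGImage(output, from: output.extent)
  else { return nil }
  return UIImage(cgImage: cgImage)
}
```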
Crops a UIImage to the bounds of areaOfInterest. The areaOfInterest must be located within the bounds of the AVCaptureVideoPreviewLayer or recognition will not be properly performed.

Declaration

Swift

func crop(_ image: UIImage, toBoundsOf areaOfInterest: CGRect, containedIn previewLayer: AVCaptureVideoPreviewLayer) -> UIImage?

Parameters

image — The image to be processed for OCR
areaOfInterest — The area within the AVCaptureVideoPreviewLayer to explicitly perform recognition on
previewLayer — Internal `RealTimeEngine` AVCaptureVideoPreviewLayer

Return Value

Final UIImage ready for OCR