AVSampleProcessor
public protocol AVSampleProcessor
Implement this protocol if custom image processing is needed.
The default SwiftyTesseractRTE implementation performs a small
amount of image enhancement and converts the image to grayscale.
If the results from RealTimeEngine
do not come back reliably,
performing your own image processing may be necessary to
achieve optimal results.
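A conformance might be sketched as follows. The type name and processing steps are illustrative only, and the assumption that RealTimeEngine accepts a custom processor at initialization is not confirmed by this page:

```swift
import AVFoundation
import UIKit

// Hypothetical conforming type -- name and internals are illustrative,
// not part of SwiftyTesseractRTE itself.
struct CustomSampleProcessor: AVSampleProcessor {
  func convertToGrayscaleUiImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    // Your custom enhancement and grayscale conversion goes here.
    return nil
  }

  func crop(_ image: UIImage,
            toBoundsOf areaOfInterest: CGRect,
            containedIn previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Crop the processed image to the region framed on screen.
    return nil
  }
}
```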
-
Converts CMSampleBuffer into a grayscale UIImage.
Declaration
Swift
func convertToGrayscaleUiImage(from sampleBuffer: CMSampleBuffer) -> UIImage?
Parameters
sampleBuffer
The incoming
CMSampleBuffer
from the AVCaptureSession
Return Value
An optional grayscale
UIImage
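A minimal sketch of this method using Core Image, assuming the `CIPhotoEffectNoir` filter is an acceptable stand-in for the library's own enhancement and grayscale step:

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch only: converts the sample buffer's pixel buffer to a grayscale
// UIImage via a Core Image filter. The filter choice is an assumption.
func convertToGrayscaleUiImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
  guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
  let input = CIImage(cvPixelBuffer: pixelBuffer)
  guard let filter = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
  filter.setValue(input, forKey: kCIInputImageKey)
  guard let output = filter.outputImage else { return nil }
  // Render the filtered CIImage into a CGImage, then wrap it in a UIImage.
  let context = CIContext()
  guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
  return UIImage(cgImage: cgImage)
}
```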
-
Crops
UIImage
to the bounds of areaOfInterest. The areaOfInterest must be located within the bounds of the AVCaptureVideoPreviewLayer or recognition will not be performed properly.
Declaration
Swift
func crop(_ image: UIImage, toBoundsOf areaOfInterest: CGRect, containedIn previewLayer: AVCaptureVideoPreviewLayer) -> UIImage?
Parameters
image
The image to be processed for OCR
areaOfInterest
The area within the
AVCaptureVideoPreviewLayer
to explicitly perform recognition on
previewLayer
Internal `RealTimeEngine` `AVCaptureVideoPreviewLayer`
Return Value
Final
UIImage
ready for OCR
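One way to implement this crop, assuming areaOfInterest is expressed in the preview layer's coordinate space (as the parameter names suggest), is to convert it to a normalized capture-space rect with `metadataOutputRectConverted(fromLayerRect:)` and scale that into pixel coordinates:

```swift
import AVFoundation
import UIKit

// Sketch only: maps a layer-space area of interest onto the captured image
// and crops to it. Coordinate-space assumptions are noted above.
func crop(_ image: UIImage,
          toBoundsOf areaOfInterest: CGRect,
          containedIn previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
  // Convert the layer-space rect to a normalized (0...1) rect in capture space.
  let normalized = previewLayer.metadataOutputRectConverted(fromLayerRect: areaOfInterest)
  guard let cgImage = image.cgImage else { return nil }
  let width = CGFloat(cgImage.width)
  let height = CGFloat(cgImage.height)
  // Scale the normalized rect up to the image's pixel dimensions.
  let cropRect = CGRect(x: normalized.origin.x * width,
                        y: normalized.origin.y * height,
                        width: normalized.size.width * width,
                        height: normalized.size.height * height)
  guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
  return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}
```

Handling device orientation is left out here; a production implementation would also account for the image's orientation relative to the preview layer.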