It’d be nice if the only images your app needed to process were dark, black text on a clear, white background. Processing would then be both fast and accurate, but that won’t always be the case. Your users might be bad at taking photos or might be working with weathered documents. Some fonts are notoriously hard for OCR systems. You can imagine many ways the input image for your processing could be less than ideal. You can offer suggestions to users about how to capture higher-quality images, which can help. A document scanning app might remind a user to find a well-lit area and place the document on a dark, uncluttered background.
Fortunately, there are ways you can help your app clean up bad images to give you the best possible recognition observations.
Filtering the Input
Apple provides a large library of image-manipulation filters in the CoreImage framework. You can use these filters to change the contrast, straighten skewed text and much more. A deep dive into CoreImage filters is well beyond this lesson’s scope. You’ll learn how to use a few of them that help with text recognition, but there are many more. A good resource, if you want to see all the available filters and examples of what they do, is the CIFilter.io site. As with Vision requests, once you know the basic pattern for one filter, you can easily figure out how to use the others.
Working with the filters introduces a new image type, CIImage, to your pipeline. You might worry that the extra conversion and rendering steps needed to move your image to and from CIImage will be expensive. Thankfully, CIImage and CGImage share the same underlying pixel data. So working with both versions of the same image is cheap. Nice!
The basic workflow is to create a CIImage version of the input image. Apply filters to that image to clean it up. Then, use that image in the request handler by using the ciImage: initializer for the handler. Then, if you’re going to draw rectangles on the image or something like that, use the original CGImage version just like you’ve been doing all throughout these lessons.
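That workflow can be sketched end to end. This is a hedged sketch, not code from the lesson’s project: the function name and the `inputImage` parameter are placeholders, and the filtering step is left as a pass-through for you to fill in with whatever clean-up your images need.

```swift
import CoreImage
import UIKit
import Vision

// A sketch of the convert-filter-recognize workflow.
// `inputImage` is a hypothetical UIImage loaded elsewhere in your app.
func recognizeText(in inputImage: UIImage) {
  guard let cgImage = inputImage.cgImage else { return }

  // 1. Make a CIImage version of the input image.
  let ciImage = CIImage(cgImage: cgImage)

  // 2. Apply any clean-up filters here (contrast, noise reduction, ...).
  //    This sketch passes the image through unchanged.
  let filteredImage = ciImage

  // 3. Hand the filtered image to Vision via the ciImage: initializer.
  let request = VNRecognizeTextRequest { request, error in
    guard let observations = request.results as? [VNRecognizedTextObservation] else {
      return
    }
    for observation in observations {
      // 4. For drawing, map observation.boundingBox onto the *original*
      //    CGImage, as in earlier lessons. Here we just print the text.
      print(observation.topCandidates(1).first?.string ?? "")
    }
  }
  let handler = VNImageRequestHandler(ciImage: filteredImage, options: [:])
  try? handler.perform([request])
}
```

The key point is step 3: because the handler has a ciImage: initializer, the filtered image goes straight to Vision without rendering it back to a CGImage first.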
Some CIFilters to consider for problematic input images are in this list.
CIColorInvert inverts the colors of an image, which can make light text on a dark background more recognizable by a Vision request.
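For instance, CIColorInvert can flip light text on a dark background into dark-on-light before recognition. A sketch, with a helper name of my own choosing:

```swift
import CoreImage

// Inverts an image's colors so light text on a dark background becomes
// dark text on a light background, which OCR tends to handle better.
// Returns nil if the filter can't be created.
func inverted(_ image: CIImage) -> CIImage? {
  guard let filter = CIFilter(name: "CIColorInvert") else { return nil }
  filter.setValue(image, forKey: kCIInputImageKey)
  return filter.outputImage
}
```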
Adjusting contrast with a CIFilter can look like this:
import CoreImage
let ciImage = CIImage(cgImage: inputImage.cgImage!)
let filter = CIFilter(name: "CIColorControls")!
filter.setValue(ciImage, forKey: kCIInputImageKey)
filter.setValue(1.2, forKey: kCIInputContrastKey)
let outputImage = filter.outputImage!
First, the image is converted to a CIImage. Then, an instance of a filter is created. Each filter can have different parameters to set. All CIFilters have an .inputImage and an .outputImage property. There’s no need to execute a separate “process” function for a CIFilter - as soon as the .inputImage property gets set, the filter produces the .outputImage. You can apply one or many filters to your image. It isn’t uncommon to see one CIFilter feeding into another CIFilter. After the image has been filtered, use the ciImage: initializer for the handler.
let recognitionRequestHandler = VNImageRequestHandler(ciImage: outputImage,
                                                      options: [:])
Because of its history, CIFilter uses “stringly-typed” identifiers. That means creating a filter involves passing a string of its name. This is prone to error, of course, and the compiler won’t help you see your mistakes. Fortunately, since iOS 13, Apple provides type-safe initializers and properties for many of the filters within the CIFilterBuiltins.
To compose the same image using the built-ins, you’d write something like this.
import CoreImage.CIFilterBuiltins
let ciImage = CIImage(cgImage: inputImage.cgImage!)
let filter = CIFilter.colorControls()
filter.inputImage = ciImage
filter.contrast = 1.2
filter.saturation = 1.0
let outputImage = filter.outputImage
Now, the compiler can help keep your code correct, and the code is easier to read. Here’s a list of filters from this lesson and how their names map to the type-safe forms from CIFilterBuiltins:
CIColorControls to CIFilter.colorControls()
CIGaussianBlur to CIFilter.gaussianBlur()
CIEdgeWork is not available as a type-safe initializer
CIEdges to CIFilter.edges()
CINoiseReduction to CIFilter.noiseReduction()
CISharpenLuminance to CIFilter.sharpenLuminance()
CIExposureAdjust to CIFilter.exposureAdjust()
CILineOverlay is not available as a type-safe initializer
CIMaximumComponent is not available as a type-safe initializer
CIColorInvert to CIFilter.colorInvert()
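Putting a few type-safe filters together with Vision, a pre-processing pipeline might read like this sketch. The function name and the filter values are my own illustrative choices, not settings from the lesson:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import Vision

// A sketch of a type-safe pre-processing pipeline feeding Vision.
// The filter values are illustrative, not tuned.
func makeHandler(for image: CIImage) -> VNImageRequestHandler? {
  // Stage 1: soften sensor noise.
  let noise = CIFilter.noiseReduction()
  noise.inputImage = image

  // Stage 2: sharpen the luminance channel, where text edges live.
  let sharpen = CIFilter.sharpenLuminance()
  sharpen.inputImage = noise.outputImage

  // Stage 3: boost contrast between text and background.
  let contrast = CIFilter.colorControls()
  contrast.inputImage = sharpen.outputImage
  contrast.contrast = 1.2

  guard let output = contrast.outputImage else { return nil }
  return VNImageRequestHandler(ciImage: output, options: [:])
}
```

Notice how much easier this reads than the stringly-typed version: each property assignment is checked by the compiler.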
By pre-processing images with CIFilter types, you can make your text recognition more accurate. Unless you’re making a general-purpose text recognition app, during development, you should try to get some examples of the sort of images you’ll process so you can figure out which filters work best for your app.
This content was released on Oct 9 2025. The official support period is 6 months from this date.