You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!
What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.
Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!
What Are Facial Blend Shapes?
ARFaceAnchor tracks many key facial features, including blinking eyes, an opening mouth and moving eyebrows. These tracked features are known as blend shapes.
You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.
Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.
Each key tracking feature is represented by a floating point number that indicates the current position of the corresponding facial feature.
These blend shape values range from 0.0, indicating a neutral position, to 1.0, indicating the maximum position. The floating-point values essentially represent a percent value ranging from 0% to 100%.
As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.
The mouth works the same way, starting at 100% open then reducing to 0% open.
You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.
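If you want to see that mapping in code, here’s a tiny sketch with an illustrative value:

// A blend shape coefficient ranges from 0.0 (neutral) to 1.0 (maximum).
let eyeBlink: Float = 0.25
// The matching percent-open value: (1.0 - 0.25) * 100 = 75% open.
let percentOpen = (1.0 - eyeBlink) * 100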
You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
Building the Robot
Next, it’s time to build the Mighty Green Robot head. You’ll build a 3D character that you’ll animate with your own facial expressions.
Open starter/ARFunnyFace/ARFunnyFace.xcodeproj in Xcode, then select Experience.rcproject and open it in Reality Composer.
Open the Scenes panel and create a new scene that uses a Face Anchor. With the Properties panel open, rename the scene to Robot.
Under the Transform section, set the Position and Rotation values, and leave the Scale at 100%.
Finally, go to the Look section, choose Matte Paint for the Material and set the Material Color to Green. Then set the Capsule Diameter and the Height.
Keep the face mask in the scene; the user’s face should now be fully anchored.
Now you’ll create the rest of the robot head, which consists of four basic parts: a RobotEye, a RobotEyelid, a RobotJaw and a RobotSkull.
Everything is still very static… you’ll animate that next.
Using the ARSessionDelegate Protocol
To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
Updated Frame Data: Provides a newly-captured camera image, along with AR information, to the delegate, provided as an ARFrame.
Added Anchors: Informs the delegate that one or more anchors have been added to the session.
Removed Anchors: Informs the delegate that one or more anchors have been removed from the session.
Updated Anchors: Informs the delegate that the session has adjusted the properties of one or more anchors. This is where you can monitor any changes to the blend shapes you’re tracking. Adjusting a blend shape will trigger a session update. You can see all four callbacks sketched below.
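In Swift terms, the four callbacks are methods of ARSessionDelegate. Here’s a minimal sketch using a placeholder class name; you’ll build the real handler shortly:

import ARKit

class SessionHandler: NSObject, ARSessionDelegate {
  // Updated Frame Data
  func session(_ session: ARSession, didUpdate frame: ARFrame) { }
  // Added Anchors
  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) { }
  // Removed Anchors
  func session(_ session: ARSession, didRemove anchors: [ARAnchor]) { }
  // Updated Anchors: a changing blend shape lands here
  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { }
}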
Adding ARDelegateHandler
For your next step, you’ll create a new class that conforms to this protocol so you can track changes to the facial blend shapes.
Add the following class to ARViewContainer:
// 1
class ARDelegateHandler: NSObject, ARSessionDelegate {
  // 2
  var arViewContainer: ARViewContainer
  // 3
  init(_ control: ARViewContainer) {
    arViewContainer = control
    super.init()
  }
}
Here’s a closer look at what this does:
This defines a new class named ARDelegateHandler, which adopts the ARSessionDelegate protocol.
When the class instantiates, it requires an ARViewContainer, which it stores in arViewContainer.
This is the class initializer, which simply stores the provided ARViewContainer, then initializes the super class.
From a SwiftUI perspective, you may need to create a custom instance to communicate changes from the view controller to the other parts of the SwiftUI interface. You’ll use makeCoordinator to create this custom instance.
To do so, add the following function to ARViewContainer:
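Here’s a minimal sketch of that function:

func makeCoordinator() -> ARDelegateHandler {
  return ARDelegateHandler(self)
}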
This defines makeCoordinator and indicates that it will provide an instance of ARDelegateHandler. It then creates an actual instance of ARDelegateHandler, providing self as the ARViewContainer.
Now that everything’s in place, you can set the session delegate for the view. Add the following line of code to makeUIView(context:), just after initializing arView:
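Assuming arView is the ARView you created at the top of makeUIView(context:), the line wires your coordinator up as the session delegate:

arView.session.delegate = context.coordinator

Inside ARDelegateHandler, the updates then arrive through session(_:didUpdate:). A sketch of how that handler could start, assuming the face anchor arrives as the first element of the anchors array:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
  // Grab the first anchor, which is the face anchor (if any).
  let faceAnchor = anchors.first as? ARFaceAnchor
  // Blend shape handling goes here.
}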
Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.
You’ll use the eyeBlinkLeft and eyeBlinkRight blend shapes to track the user’s eyes.
Start by adding the following block of code to the bottom of session(_:didUpdate:):
// faceAnchor is the updated ARFaceAnchor delivered to session(_:didUpdate:).
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue
Here, you access the blendShapes through the updated faceAnchor. You then extract the specific blend shape for eyeBlinkLeft to get its current value, which is provided as a floatValue.
Then you use the same approach to get the current value for eyeBlinkRight.
Tracking Eyebrows
To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inwards or outwards around the z axis. This makes the robot look angry or sad, depending on the user’s expression.
Bu lin cxed ifza nfefa, etf kto joywupikj li dci cobpem uc jewgaef(_:kadOxlola:):
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue
Great, now you’re tracking the eyebrows. The only thing left to do is to adjust the orientation of the eyelids with these blend shape values. To do that, though, you’ll also need to track what the user is doing with their jaw.
Tracking the Jaw
Now, you’ll track the user’s jaw and use it to update the jaw’s orientation. You’ll use the jawOpen blend shape to track the user’s jaw movement.
Add the following code to the bottom of session(_:didUpdate:):
let jawOpen = blendShapes?[.jawOpen]?.floatValue
Now, you’re going to use special numbers to align both the eyelids and the jaw.
Positioning with Quaternions
In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion.
A quaternion is a four-element vector used to encode any possible rotation in a 3D coordinate system. A quaternion represents two components: a rotation axis and the amount of rotation around the rotation axis.
Three vector components, x, y and z, represent the axis, while a w component represents the rotation amount.
Quaternions are difficult to use. Luckily, there are a few handy functions that make working with them a breeze.
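One such helper is simd_quatf(angle:axis:), which builds a quaternion from an angle in radians and a rotation axis. For example:

import simd

// A 30° tilt around the z-axis, expressed as a quaternion.
let angle: Float = 30 * .pi / 180
let tilt = simd_quatf(angle: angle, axis: SIMD3<Float>(0, 0, 1))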
At the same time, you’ll apply a rotation around the z-axis to make the eye appear angry or sad. You’ll use the same approach with the captured brow blend shapes.
Here’s what all that looks like in code. Add the following block of code to the bottom of session(_:didUpdate:):
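The exact values depend on how you rigged the scene, so treat this as a sketch: the entity references (robot.eyeLidL, robot.eyeLidR, robot.jaw), the angle constants and the deg2Rad(_:) helper are all illustrative assumptions.

// Convert degrees to radians (helper assumed for this sketch).
func deg2Rad(_ value: Float) -> Float {
  return value * .pi / 180
}

guard let eyeBlinkLeft = eyeBlinkLeft,
  let eyeBlinkRight = eyeBlinkRight,
  let browInnerUp = browInnerUp,
  let browLeft = browLeft,
  let browRight = browRight,
  let jawOpen = jawOpen
else { return }

// Blink: rotate each eyelid around the x-axis.
// Brows: tilt each eyelid around the z-axis for an angry or sad look.
robot.eyeLidL?.orientation = simd_mul(
  simd_quatf(angle: deg2Rad(-120 + (90 * eyeBlinkLeft)), axis: [1, 0, 0]),
  simd_quatf(angle: deg2Rad((90 * browLeft) - (30 * browInnerUp)), axis: [0, 0, 1]))
robot.eyeLidR?.orientation = simd_mul(
  simd_quatf(angle: deg2Rad(-120 + (90 * eyeBlinkRight)), axis: [1, 0, 0]),
  simd_quatf(angle: deg2Rad((-90 * browRight) + (30 * browInnerUp)), axis: [0, 0, 1]))

// Jaw: swing open around the x-axis in proportion to jawOpen.
robot.jaw?.orientation = simd_quatf(
  angle: deg2Rad(-100 + (60 * jawOpen)), axis: [1, 0, 0])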
Similar to how the eyelids work, the jaw rests at a default angle, with a limited range of motion mapped to the jawOpen blend shape.
And that’s it, you’re all done! Time for another build and run test.
You can now blink, frown and control that huge robot jaw. However, this robot looks a bit on the angry side! :]
Adding Lasers
The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?
Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.
While the lasers are firing, you have to wait for them to finish before you can fire another laser. To achieve that, you’ll send a notification to your code to indicate when the lasers have finished.
The first thing you need to do is to hide the lasers when the scene starts. Open the Behaviors panel, then add a Start Hidden behavior.
Rename the behavior, then add the two lasers as the affected objects for the Hide action.
Save the Reality Composer project, then jump back to Xcode.
Xual ghucder uqa nin civikmo ed pfe jqoqohf. Voe’gv puqi o ceet ut zpa yuguzl wefu uq pogvkoqv jumamusufoihz pobm.
Coding the Notifications
Now, you’ll add the code that prevents other things from happening while the lasers are firing.
Lhemz cj irxosf qmi nixtazegm jnuzepxx fi slu nit eg npa AGFuhugubuQonrxuc qcebf:
var isLasersDone = true
You’ll use this variable to block additional triggers. When this value is false, the lasers are currently active and you have to wait for the action sequence to complete before triggering the lasers again.
Add the following block of code to the bottom of session(_:didUpdate:):
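Here’s a sketch of that block. It assumes your code holds a reference to the loaded scene (robot, here) and that the scene’s behaviors expose a notification trigger named showLasers and a Notify action named lasersDone; those names, and the 0.9 threshold, are illustrative:

// Fire the lasers once the jaw is almost fully open.
if jawOpen > 0.9 && isLasersDone {
  isLasersDone = false
  // Trigger the laser behavior in the Reality Composer scene...
  robot.notifications.showLasers.post()
  // ...and re-arm once the scene's Notify action reports completion.
  robot.actions.lasersDone.onAction = { _ in
    self.isLasersDone = true
  }
}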
Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter.
To recap:
Facial blend shapes: You’ve learned about facial blend shapes and how they’re used to track a face’s key points.
ARSessionDelegate: You learned how to handle scene updates via the ARSessionDelegate. Every time a blend shape updates, it triggers a session update, allowing you to update the entities within the scene.
Using blend shapes: You’ve learned how to track blend shapes and use the data to update entity orientations.
Quaternions: You know what quaternions are and how to use helper functions to construct them, making rotations a breeze to work with.
Notifications: Triggering action sequences from code and receiving Notify actions from scenes is simple.
Enjoy that cup of coffee. See you in the next section, where you’ll learn more about ARKit and SpriteKit.