You have now built Tools to give the model hands (actions) and Resources to give the model eyes (context). The final piece of the puzzle is Prompts.
While Tools and Resources are designed for the Model and the Application respectively, Prompts are designed for the User.
What Are Prompts?
An MCP Prompt is a pre-defined template that users can select to start a conversation with an LLM.
Think about the scenario users face when working in a blank chat box. They might want to plan a complex trip, but they don't want to type out a long, detailed note:
“Please act as a travel agent. I want to go to Japan. I am leaving from London. I want to stay for 1 week. Please make sure the budget is moderate and include local food recommendations…”
This is tedious and error-prone.
MCP allows the server to define that structure as a Prompt. The server provides a template ahead of time that only requires specific arguments: destination and days.
The Interaction Flow
It is important to understand that LLMs do not access Prompts.
Prompts are a UI convenience feature. Here is the flow:
Discovery: The user sees a list of available prompts in their client (like Claude Desktop or an IDE) and selects “Draft Itinerary”.
Input: The UI asks the user to fill in the fields:
Destination: “Japan”
Days: “3”
(The user does not need to type the full instructions manually.)
Generation: The MCP Server takes those inputs, merges them into the prompt template, and returns the full, rendered instruction text to the Host Application.
This ensures that the LLM receives a well-crafted, structured request every time, without the user needing to be a “prompt engineering” expert.
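The three steps above can be sketched in plain Python. This is only an illustration of the flow, not the MCP SDK: the dictionary and variable names below are made up, and a real client and server would exchange these messages over the protocol.

```python
# Hypothetical, protocol-free sketch of the Discovery -> Input -> Generation
# flow. In real MCP, the client and server exchange these over the wire.

def draft_itinerary(destination: str, days: int) -> str:
    """The server-side template: expands two small inputs into full text."""
    return (
        f"Please create a detailed {days}-day itinerary for a trip to "
        f"{destination}. List attractions, recommend local food, and keep "
        f"the budget moderate."
    )

# 1. Discovery: the client learns which prompts the server offers.
available_prompts = {"draft_itinerary": draft_itinerary}
print(list(available_prompts))  # the UI renders this as a menu

# 2. Input: the UI collects only the argument values from the user.
user_input = {"destination": "Japan", "days": 3}

# 3. Generation: the server merges the inputs into the template and
#    hands the full text back to the host application.
full_prompt = available_prompts["draft_itinerary"](**user_input)
print(full_prompt)
```

Note that the LLM never sees steps 1 and 2; it only receives full_prompt, exactly as if the user had typed it by hand.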
Building a Prompt Server
You will now create a server that provides a travel itinerary helper. Ensure you are still inside your lesson_2 directory.
Create a new file named prompts_demo_server.py and add the following code:
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Prompt Demo")

@mcp.prompt()
def draft_itinerary(destination: str, days: int) -> str:
    """Creates a prompt for the LLM to generate a travel plan."""
    return f"""
    Please create a detailed {days}-day itinerary for a trip to {destination}.
    1. List specific tourist attractions.
    2. Recommend local food.
    3. Keep the budget moderate.
    """

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
Analyzing the Code
@mcp.prompt(): This decorator registers the function as a user-facing prompt.
Arguments: The function arguments (destination, days) automatically become the form fields the user needs to fill out in the UI.
Return Value: The string returned by this function is exactly what will be sent to the LLM.
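You can sanity-check that return value without any MCP client by calling the template logic directly. The snippet below repeats the body of draft_itinerary without the decorator so it runs standalone; it is just a quick check, not part of the server.

```python
# Standalone copy of the template body (no @mcp.prompt() decorator),
# useful for seeing exactly what text the LLM would receive.
def draft_itinerary(destination: str, days: int) -> str:
    """Creates a prompt for the LLM to generate a travel plan."""
    return f"""
Please create a detailed {days}-day itinerary for a trip to {destination}.
1. List specific tourist attractions.
2. Recommend local food.
3. Keep the budget moderate.
"""

rendered = draft_itinerary("Japan", 3)
print(rendered)  # contains "3-day itinerary for a trip to Japan"
```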
Running the Server
Run your prompt server using uv:
$ uv run --with mcp prompts_demo_server.py
Your server is now running. It might look like nothing is happening, but that is expected: the server simply holds the template ready to be called, and it won't do anything until a user (or a developer tool) asks for it.
This content was released on Apr 10 2026. The official support period is six months from this date.
In this lesson, you will explore the final core component of the Model Context Protocol: Prompts. You’ll learn how Prompts serve as reusable templates designed to help users interact with LLMs more effectively. Unlike Tools (for models) or Resources (for applications), Prompts are user-facing features that streamline complex requests. You will conclude by implementing a server that provides a dynamic travel itinerary template.