Microsoft’s POML: A New Language Turns AI Prompts Into Code

Screenshot of Microsoft’s POML turning AI prompts into structured code with HTML-like tags

Microsoft quietly released the Prompt Orchestration Markup Language (POML) on GitHub, and developers are buzzing. POML borrows concepts from HTML and CSS to transform prompt engineering into structured, reusable code and is already climbing GitHub’s trending charts.

  • Microsoft introduced POML, a markup language that structures AI prompts using HTML‑like tags and a CSS‑style theming system.

  • POML supports external data embedding, loops, conditionals and variables; it ships with a VS Code extension and SDKs for Node.js and Python.

  • Developers on GitHub and Reddit are excited about modular, reusable prompts and plan to integrate POML into agent workflows.

What happened

On Aug. 17, 2025 (IST), Microsoft’s open‑source team posted POML (Prompt Orchestration Markup Language) to GitHub. The project shot to the top of GitHub’s trending page, amassing over 2,800 stars and hundreds of forks within a day. In a Reddit thread on r/LocalLLaMA, users described POML as “HTML for AI prompts” and praised its tag‑based structure.

POML allows developers to break long prompts into semantically meaningful components—<role>, <task>, <example> and more—similar to HTML tags. A CSS‑like styling system decouples formatting from content, enabling global changes in tone or persona. The language also provides a templating engine with variables, loops, conditionals and <let> tags. It supports embedding external data sources like documents, tables and images into prompts via <document>, <table> and <img> tags. Microsoft shipped IDE support through a VS Code extension and released SDKs for Node.js and Python.
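Based on the tags described above, a minimal POML file might look like the following sketch; the tag names match those listed in the article, but the exact nesting rules and root element should be checked against the project’s documentation:

```xml
<poml>
  <!-- Semantic components replace one long prose prompt -->
  <role>You are a meticulous technical reviewer.</role>
  <task>Summarize the pull request below in three bullet points.</task>
  <example>
    <input>Refactored the auth middleware to use async handlers.</input>
    <output>- The auth middleware now uses async handlers.</output>
  </example>
</poml>
```

Because each tag carries semantic meaning, the styling system can later change how the prompt is rendered (tone, persona, verbosity) without editing the content itself.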

The day after release, POML sparked discussions on GitHub Issues, Hacker News and Discord. Some developers called it a game‑changer for scaling prompt engineering. Others were skeptical, arguing that LLM context windows are large enough that prompt structuring is unnecessary. Meanwhile, the VS Code demo video showcasing real‑time syntax highlighting and auto‑completion reached more than 50,000 views.

Why This Matters

For everyday workers

AI chatbots are becoming tools for everything from scheduling to summarizing documents. POML could empower non‑programmers to reuse high‑quality prompts. Imagine HR teams maintaining a library of <template> prompts for performance reviews or teachers sharing <example> tags for lesson planning.

For tech professionals

For prompt engineers, POML introduces software‑engineering discipline. Version control, code reuse and modular design can reduce prompt drift in production systems. The separation of presentation and logic resembles the way web developers use HTML and CSS, making it easier for teams to collaborate on large prompt libraries.

For businesses and startups

Companies building AI products can now standardize prompts across teams. A POML file can embed a knowledge base via <document> tags and call dynamic variables from a CRM. This encourages maintainability, reduces errors, and may shorten the development cycle for AI workflows.
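As a sketch of that scenario, a support prompt could embed a policy document and a CRM export while interpolating a template variable; the file paths and variable name here are hypothetical, and the attribute syntax should be verified against POML’s documentation:

```xml
<poml>
  <role>You are a support assistant for {{ company_name }}.</role>
  <task>Answer the customer using only the embedded knowledge base.</task>
  <!-- External data sources rendered into the prompt text at build time -->
  <document src="kb/returns-policy.docx" />
  <table src="crm/customer-tiers.csv" />
</poml>
```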

For ethics and society

Structured prompts could enhance transparency. Stakeholders can audit POML files to see exactly which data sources and personas influence an AI’s output. Yet the ability to embed external documents raises privacy questions; developers must ensure they’re not inadvertently exposing sensitive information in prompts.

Key details & context

  • Modular tags: POML uses HTML‑like tags (e.g., <role>, <task>, <example>, <system>) and allows nested structures.

  • Data embedding: <document>, <table>, <img> tags allow integration of external data sources into prompts.

  • Templating engine: Supports variables, loops, conditionals and <let> blocks for dynamic prompts.

  • Styling: A separate styling system decouples presentation from content, akin to CSS, enabling consistent persona or tone across prompts.

  • Tooling: Comes with a VS Code extension, Node.js and Python SDKs, and an online playground.

  • Licensing: Released under the MIT License with contributions encouraged.
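The templating features in the list above (variables, loops, conditionals, <let>) can be combined in a single file. The loop attribute syntax below follows the style of the project’s examples but should be treated as a sketch, not a definitive reference:

```xml
<poml>
  <!-- <let> binds a variable usable in {{ }} template expressions -->
  <let name="products" value='["POML", "VS Code extension", "Python SDK"]' />
  <task>Write one short release note per product.</task>
  <list>
    <item for="p in products">Cover {{ p }} in two sentences.</item>
  </list>
</poml>
```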

Community pulse

  • u/Technical‑Love‑8479 (Reddit): “It’s like HTML but for prompts. Instead of plain text you break them into tag‑based chunks. Can’t wait to use it in our Llama agents.” (38 upvotes, r/LocalLLaMA).

  • @promptsmith on X: “POML = HTML + CSS + AI prompts. This will change how we build multi‑step agents. Already ported my entire prompt library!” (621 likes).

  • HN comment by versteegen: “This language seems quite similar to Scallop… Both generalise Datalog to arbitrary semirings. POML’s weight system could enable probabilistic reasoning.” (18 points).

  • Discord user @ai_chad: “Finally, a standard for prompts! No more copy‑pasting 3‑page system prompts into every notebook.”

What’s next / watchlist

Expect community‑built libraries of reusable POML components and maybe even marketplaces selling premium prompt modules. Tool vendors could integrate POML into low‑code platforms. Microsoft may extend support to Visual Studio and JetBrains IDEs, while third‑party authors will likely create POML linters and static analysis tools. Adoption depends on how well POML plays with large context windows and whether developers find its syntax intuitive.

FAQs

  1. Do I need to learn a new language to use POML?
    POML’s syntax is intentionally similar to HTML. Learning the tags takes a few hours at most, and anyone familiar with web markup should feel at home.

  2. Will POML increase latency in AI calls?
    Since POML ultimately outputs a plain text prompt for an LLM, any overhead depends on how you pre‑process and parse POML files. Microsoft’s SDK handles rendering and caching to minimize delays.

  3. Can POML embed proprietary data?
    Yes, via <document> and <table> tags. However, embedding sensitive data requires caution and proper access controls.
