Prompt

Official Definition

An input or instruction provided to a generative AI model to guide or direct the model’s output.

Source: AIEOG AI Lexicon (Feb 2026), adapted from NIST AI 100-1

What prompt means in plain language

A prompt is the text, question, or instruction a user gives to a generative AI model to tell it what to do. When you type a question into ChatGPT, write instructions for an AI assistant, or configure a system prompt that guides an AI application’s behavior, you are creating prompts.

Prompts are the primary interface between humans and generative AI systems. The quality and specificity of the prompt significantly affects the quality of the output. This has given rise to “prompt engineering” — the practice of crafting prompts that produce desired, reliable, and consistent outputs.

In enterprise contexts, prompts exist at multiple levels: system prompts (configured by developers to set the model’s behavior and constraints), user prompts (provided by individual users at the point of interaction), and prompt chains (sequences of prompts that break a task into multiple steps; related to, but distinct from, chain-of-thought prompting, which elicits step-by-step reasoning within a single prompt).
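The layering described above can be sketched using an OpenAI-style chat message list. This is only one common convention for separating system and user prompts; the role names and structure are an assumption for illustration, not a universal standard:

```python
# Sketch of prompt layers in an OpenAI-style chat message format.
# The "system" message is configured by developers and stays fixed;
# the "user" message comes from the person at the point of interaction.

def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Combine a fixed system prompt with a per-interaction user prompt."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

# Hypothetical system prompt for a regulated context; in production this
# text would sit under change-management controls.
SYSTEM_PROMPT = (
    "You are a customer-service assistant for a retail bank. "
    "Do not provide investment advice or discuss specific account data."
)

messages = build_messages(
    SYSTEM_PROMPT,
    "How do I reset my online banking password?",
)
```

The point of the separation is governance: the system prompt is a controlled artifact owned by the application team, while user prompts vary per interaction and are logged rather than controlled.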

Why it matters in financial services

Prompts are a governance concern because they directly influence what AI systems produce. Poorly designed prompts can lead to inaccurate outputs, compliance violations, or inappropriate content. Well-designed prompts with appropriate guardrails can improve accuracy and reduce risk.

In financial services, prompt governance considerations include system prompt security (preventing unauthorized modification of prompts that define model behavior), prompt standardization (ensuring consistent, compliant prompts for regulated processes), prompt logging (maintaining records of prompts for audit and investigation), and prompt testing (validating that prompts produce appropriate outputs across a range of scenarios).

Key considerations for compliance teams

  1. Govern system prompts. System prompts that define AI behavior in production applications should be subject to change management controls.
  2. Standardize prompts for regulated processes. Where AI is used in compliance-relevant workflows, standardize prompts to ensure consistency.
  3. Log prompts and outputs. Maintain records of prompts and corresponding outputs for audit, compliance, and investigation purposes.
  4. Test prompt effectiveness. Validate that prompts produce accurate, appropriate outputs across a range of inputs and edge cases.
  5. Train users on prompt best practices. Educate staff on how to craft effective prompts and what information to avoid including.
  6. Include prompt management in AI governance. Prompt design, testing, and management should be part of the overall AI governance framework.
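As a rough illustration of points 3 and 4 above, a prompt audit record might capture the prompt, the output, a timestamp, and a content hash for tamper evidence. The field names and hashing scheme here are hypothetical, a minimal sketch rather than a recommended schema:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_prompt(user_id: str, prompt: str, output: str) -> dict:
    """Build an audit-log record for one prompt/output pair.

    A SHA-256 digest over the canonical JSON of the pair gives a simple
    tamper-evidence check; a real system would also sign records or
    write them to append-only storage.
    """
    payload = json.dumps({"prompt": prompt, "output": output}, sort_keys=True)
    return {
        "user_id": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "output": output,
        "sha256": hashlib.sha256(payload.encode("utf-8")).hexdigest(),
    }

record = log_prompt(
    "analyst-42",                      # hypothetical user identifier
    "Summarize this KYC file.",        # user prompt
    "Summary: ...",                    # model output
)
```

Records like this support both audit (who asked what, and when) and prompt testing (re-running logged prompts against a new model version and comparing outputs).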
