Writing a Prompt: Technical Guide
Posted on April 18, 2025 • 3 min read • 623 words

Prompt engineering lets you get the most out of language models. This technical guide presents the basics of crafting effective prompts, especially with LangChain.

In LangChain, a prompt is a textual structure designed to guide a language model. A well-constructed prompt includes several key components:

- **Instruction**: the task the model should perform
- **Context**: background information that frames the task
- **Input Data**: the specific content to process
- **Output Indicator**: the expected format or length of the response

This format enables clear, framed, and reproducible queries.
Here’s an example using LangChain in Python with this structure:
```python
from langchain.prompts import PromptTemplate

# Define a structured prompt template
prompt = PromptTemplate(
    input_variables=["product_name"],
    template="""
Instruction: Write an appealing marketing description for a product.
Context: The target audience is young adults interested in technology.
Input Data: {product_name}
Output Indicator: A short description with a maximum of 2 sentences.
""",
)

# Generate the final prompt
final_prompt = prompt.format(product_name="waterproof Bluetooth wireless earbuds")
print(final_prompt)

# Example LLM response:
# "Experience total freedom with our waterproof wireless earbuds. Perfect for workouts, they deliver immersive sound with no compromise."
```

Thanks to PromptTemplate, you can design reusable and composable prompts. Here’s a modular approach:
```python
from langchain.prompts import PromptTemplate

base_template = """\
Instruction: {instruction}
Input: {input_data}
Respond with: {output_indicator}
"""

template = PromptTemplate(
    input_variables=["instruction", "input_data", "output_indicator"],
    template=base_template,
)
```

This makes it possible to dynamically build prompts for various use cases without duplicating code.
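As a sketch of that modularity, the same template string can serve several tasks. This uses plain-Python `str.format` so it runs without LangChain; the two use cases (summarization and translation) are hypothetical illustrations:

```python
# The same base template string serves different tasks; str.format
# fills the slots just as PromptTemplate.format would.
base_template = """\
Instruction: {instruction}
Input: {input_data}
Respond with: {output_indicator}
"""

summarize = base_template.format(
    instruction="Summarize the following article.",
    input_data="LangChain is a framework for building LLM applications...",
    output_indicator="a summary of at most 3 sentences",
)

translate = base_template.format(
    instruction="Translate the following sentence into French.",
    input_data="Prompt engineering is a craft.",
    output_indicator="the translated sentence only",
)

print(summarize)
print(translate)
```

Only the variable values change between use cases; the structure stays identical, which keeps prompts consistent across an application.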
Here’s a complete example combining this template structure with an LLM chain:
```python
from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
from langchain.chains import LLMChain

template = """\
Instruction: {instruction}
Context: {context}
Input: {input_data}
Respond with: {output_indicator}
"""

prompt = PromptTemplate(
    input_variables=["instruction", "context", "input_data", "output_indicator"],
    template=template,
)

# Low temperature favors a deterministic, classification-style answer
llm = OpenAI(temperature=0.3)
chain = LLMChain(llm=llm, prompt=prompt)

response = chain.run({
    "instruction": "Classify the sentiment of the following text.",
    "context": "Use only the categories: positive, negative, neutral.",
    "input_data": "I am very satisfied with this product, it exceeded my expectations.",
    "output_indicator": "a single word indicating the sentiment"
})
print(response)
```
Expected response: positive
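In practice, the raw string returned by a chain may carry extra whitespace or casing, so it is worth validating it against the allowed categories from the context. A minimal sketch in plain Python (`raw` stands in for a hypothetical chain output):

```python
# Validate the model's one-word answer against the allowed categories
# listed in the prompt's context.
ALLOWED = {"positive", "negative", "neutral"}

def parse_sentiment(raw: str) -> str:
    # Normalize whitespace, case, and a trailing period before checking
    label = raw.strip().lower().rstrip(".")
    if label not in ALLOWED:
        raise ValueError(f"Unexpected label: {label!r}")
    return label

print(parse_sentiment(" Positive\n"))  # -> positive
```

Raising on unexpected labels surfaces prompt drift early instead of letting malformed outputs flow downstream.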
Designing a good prompt also means choosing the right prompting strategy:
| Strategy | Structure | Complexity | Level of Control |
|---|---|---|---|
| Zero-shot | A direct instruction without example | Low | Low |
| Few-shot | Instruction + a few input/output examples | Medium | Medium |
| Chain-of-Thought | Instruction + step-by-step reasoning | High | High |
Each strategy fits different use cases: zero-shot for simple tasks, few-shot to improve accuracy, and chain-of-thought for complex reasoning.
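The few-shot row of the table can be sketched in plain Python by prepending labeled input/output examples to the query (the example texts and labels below are hypothetical; LangChain's FewShotPromptTemplate offers the same idea with templating):

```python
# Hypothetical labeled examples shown to the model before the real query
examples = [
    ("The delivery was fast and the packaging flawless.", "positive"),
    ("The battery died after two days.", "negative"),
]

def build_few_shot_prompt(examples, query):
    # Instruction first, then each example as a Text/Sentiment pair,
    # then the query with the Sentiment slot left open for the model.
    lines = ["Classify the sentiment as positive, negative, or neutral.", ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(examples, "I am very satisfied with this product.")
print(prompt)
```

Ending the prompt on the open `Sentiment:` line nudges the model to complete it with a single label, matching the format of the examples above it.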
The prompt structure should also depend on the nature of the task: a classification task calls for a strict output indicator (as in the sentiment example above), while open-ended generation benefits from richer context (as in the marketing example).
Here’s a selection of trusted resources to reuse or get inspiration from existing prompts:
| Name | Prompt Type | Source Link |
|---|---|---|
| Prompt Engineering Guide | General-purpose, annotated | github.com/dair-ai |
| Flowise AI | Visual, workflow-based | flowiseai.com |
| OpenPrompt | NLP & classification-focused | github.com/thunlp |
| LangChain Templates | Modular prompt chains | python.langchain.com |
These sources provide an excellent foundation for experimenting and iterating with your prompts.
Flowise makes it easy to integrate prompts into no-code pipelines through a visual, drag-and-drop workflow interface.
For data scientists looking to prototype without writing code, Flowise is an efficient alternative.
The quality of a prompt directly impacts model performance. By structuring prompts around instruction, context, input data, and output indicator, you enhance the precision, coherence, and relevance of generated responses.
Continue experimenting with different strategies (zero-shot, few-shot, chain-of-thought) and explore tools like Flowise and OpenPrompt to accelerate your iterations.