
Writing a Prompt: Technical Guide

Posted on April 18, 2025 • 3 min read • 623 words
LLM   Langchain   General   Helene  

Prompt engineering lets you get the most out of language models. This technical guide covers the basics of writing effective prompts, particularly with LangChain.

On this page

  • I. Understanding What a Prompt Is in LangChain
  • II. Designing a Modular Prompt with LangChain
  • III. Best Practices
  • IV. Prompting Strategy Comparison
  • V. Adapting Prompts to the Task
  • VI. Where to Find Ready-to-Use Prompts?
  • VII. Integrating Prompts into Visual Workflows with Flowise
  • Conclusion
Photo by Helene Hemmerter with DALL·E

I. Understanding What a Prompt Is in LangChain  

In LangChain, a prompt is a textual structure designed to guide a language model. A well-constructed prompt includes several key components:

  • Instruction: What you explicitly ask the model to do.
  • Context: Additional information to frame the answer.
  • Input Data: The specific input provided at runtime.
  • Output Indicator: What kind of output is expected.

This format enables clear, framed, and reproducible queries.
Here’s an example using LangChain in Python with this structure:

from langchain.prompts import PromptTemplate

# Define a structured prompt template
prompt = PromptTemplate(
    input_variables=["product_name"],
    template="""
Instruction: Write an appealing marketing description for a product.
Context: The target audience is young adults interested in technology.
Input Data: {product_name}
Output Indicator: A short description with a maximum of 2 sentences.
"""
)

# Generate the final prompt
final_prompt = prompt.format(product_name="waterproof Bluetooth wireless earbuds")

print(final_prompt)

# Example LLM response:
# "Experience total freedom with our waterproof wireless earbuds. Perfect for workouts, they deliver immersive sound with no compromise."

II. Designing a Modular Prompt with LangChain  

Thanks to PromptTemplate, you can design reusable and composable prompts. Here’s a modular approach:

from langchain.prompts import PromptTemplate

base_template = """\
Instruction: {instruction}
Input: {input_data}
Respond with: {output_indicator}
"""

template = PromptTemplate(
    input_variables=["instruction", "input_data", "output_indicator"],
    template=base_template,
)

This makes it possible to dynamically build prompts for various use cases without duplicating code.

Here’s a complete example that runs this structured prompt through an LLM chain:

from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
from langchain.chains import LLMChain

template = """\
Instruction: {instruction}
Context: {context}
Input: {input_data}
Respond with: {output_indicator}
"""

prompt = PromptTemplate(
    input_variables=["instruction", "context", "input_data", "output_indicator"],
    template=template,
)

llm = OpenAI(temperature=0.3)
chain = LLMChain(llm=llm, prompt=prompt)

response = chain.run({
    "instruction": "Classify the sentiment of the following text.",
    "context": "Use only the categories: positive, negative, neutral.",
    "input_data": "I am very satisfied with this product, it exceeded my expectations.",
    "output_indicator": "a single word indicating the sentiment"
})

print(response)
# Expected response: positive


III. Best Practices  

Designing a good prompt relies on several key principles:

  • Be explicit: Clearly define the goal.
  • Provide useful context: format, rules, examples.
  • Specify the output: expected format, style, constraints.
  • Test and iterate: adjust based on outputs.
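The principles above can be sketched without any framework. Here is a minimal, framework-free comparison of a vague prompt versus an explicit one (the strings are invented for illustration):

```python
# A vague prompt leaves goal, scope, and format entirely to the model.
vague = "Summarize this article."

# An explicit prompt states the goal, frames the context,
# and pins down the expected output.
explicit_template = (
    "Instruction: Summarize the article below for a non-technical reader.\n"
    "Context: Keep only the main argument; ignore code samples.\n"
    "Input Data: {article}\n"
    "Output Indicator: exactly 3 bullet points, one sentence each."
)

prompt = explicit_template.format(article="LangChain structures prompts around four components...")
print(prompt)
```

The explicit version is longer, but every constraint it adds is one less decision the model makes on your behalf.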

IV. Prompting Strategy Comparison  

Strategy         | Structure                                 | Complexity | Level of Control
Zero-shot        | A direct instruction without examples     | Low        | Low
Few-shot         | Instruction + a few input/output examples | Medium     | Medium
Chain-of-Thought | Instruction + step-by-step reasoning      | High       | High

Each strategy fits different use cases: zero-shot for simple tasks, few-shot to improve accuracy, and chain-of-thought for complex reasoning.
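To make the comparison concrete, here is a framework-free sketch that builds the same sentiment task in each of the three styles (the example texts are invented):

```python
task = "Classify the sentiment of: {text}"

# Zero-shot: the bare instruction, no examples.
zero_shot = task

# Few-shot: prepend a handful of input/output examples.
examples = [
    ("I love this phone.", "positive"),
    ("The battery died in an hour.", "negative"),
]
few_shot = "\n".join(f"Text: {t}\nSentiment: {s}" for t, s in examples) + "\n" + task

# Chain-of-thought: ask for reasoning before the final answer.
chain_of_thought = task + "\nThink step by step, then give the final label."

for name, p in [("zero-shot", zero_shot), ("few-shot", few_shot), ("chain-of-thought", chain_of_thought)]:
    print(f"--- {name} ---")
    print(p.format(text="Service was fine, nothing special."))
```

The same instruction string serves all three strategies; only the surrounding scaffolding changes.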


V. Adapting Prompts to the Task  

The prompt structure should depend on the nature of the task:

  • Classification: use short prompts with explicit options.
  • Text Generation: include examples and rich context.
  • Data Extraction: clearly specify the expected output format.
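For data extraction in particular, the output format can be pinned down in the prompt and then validated on return. A minimal sketch, with the model reply hard-coded for illustration (in a real chain it would come from the LLM):

```python
import json

extraction_prompt = (
    "Instruction: Extract the person and city from the text.\n"
    "Input Data: {text}\n"
    'Output Indicator: JSON only, with keys "person" and "city".'
)

print(extraction_prompt.format(text="Marie moved to Lyon last spring."))

# Hard-coded stand-in for the LLM's reply.
reply = '{"person": "Marie", "city": "Lyon"}'
data = json.loads(reply)  # fails fast if the model ignored the format
print(data["person"], data["city"])
```

Parsing the reply immediately turns a soft format request into a hard check: a malformed response raises an error instead of silently propagating downstream.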

VI. Where to Find Ready-to-Use Prompts?  

Here’s a selection of trusted resources for reusing existing prompts or drawing inspiration from them:

Name                     | Prompt Type                  | Source Link
Prompt Engineering Guide | General-purpose, annotated   | github.com/dair-ai
Flowise AI               | Visual, workflow-based       | flowiseai.com
OpenPrompt               | NLP & classification-focused | github.com/thunlp
LangChain Templates      | Modular prompt chains        | python.langchain.com

These sources provide an excellent foundation for experimenting and iterating with your prompts.


VII. Integrating Prompts into Visual Workflows with Flowise  

Flowise makes it easy to integrate prompts into no-code pipelines. Benefits include:

  • Drag-and-drop interface to build LLM chains
  • Direct integration with LangChain
  • Rapid deployment for business use cases

For data scientists looking to prototype without writing code, Flowise is an efficient alternative.


Conclusion  

The quality of a prompt directly impacts model performance. By structuring prompts around instruction, context, input data, and output indicator, you enhance the precision, coherence, and relevance of generated responses.

Continue experimenting with different strategies (zero-shot, few-shot, chain-of-thought) and explore tools like Flowise and OpenPrompt to accelerate your iterations.


References:

  • LangChain Documentation
  • Prompt Engineering Guide (dair-ai)
  • Flowise
  • OpenPrompt (THUNLP)