AI-powered applications are revolutionizing the way users interact with software, from intelligent chatbots to complex document analysis tools. In this tutorial, you’ll learn how to build a secure and scalable AI app using FastAPI for the backend, LangChain for prompt orchestration, and Hugging Face Transformers for natural language processing.
By the end of this guide, you'll have a working AI API that accepts user prompts, routes them through LangChain to a Hugging Face model, and returns the response, all while being protected with modern JWT authentication. Whether you’re building an internal AI tool or a public-facing application, this stack offers high performance, modularity, and security.
What We’ll Use:
- ⚡ FastAPI – A high-performance Python web framework for building APIs quickly and intuitively.
- 🧱 LangChain – A framework for developing applications powered by language models.
- 🤗 Hugging Face Transformers – Access to state-of-the-art NLP models like BERT, GPT-2, and more.
- 🔐 JWT Authentication – To secure endpoints and prevent unauthorized usage.
Let’s get started!
Project Setup
1. Create a Project Directory
First, create and navigate to your project folder:
mkdir fastapi-langchain-ai
cd fastapi-langchain-ai
2. Create and Activate a Virtual Environment
It's good practice to isolate dependencies in a virtual environment:
python3 -m venv venv
source venv/bin/activate # On Windows use: venv\Scripts\activate
3. Install Required Dependencies
Install the core packages needed for the app:
pip install fastapi uvicorn transformers langchain "python-jose[cryptography]" "passlib[bcrypt]" httpx
4. Project Structure
Here’s how your project will be organized:
fastapi-langchain-ai/
│
├── main.py              # FastAPI entry point
├── auth.py              # JWT token and auth logic
├── langchain_app.py     # LangChain logic with Hugging Face
├── models/              # Pydantic models
│   └── schemas.py
├── users/               # User data and logic (in-memory or DB)
│   └── users.py
└── requirements.txt     # Freeze dependencies (optional)
You can initialize your files like so:
touch main.py auth.py langchain_app.py
mkdir models users
touch models/schemas.py users/users.py
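The optional requirements.txt simply lists the packages from the install step; once everything works, you can pin exact versions with pip freeze > requirements.txt. Unpinned, it looks like this:

```text
fastapi
uvicorn
transformers
langchain
python-jose[cryptography]
passlib[bcrypt]
httpx
```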
Build the FastAPI Backend
In this section, we’ll set up the main FastAPI application, define basic routes, and prepare the structure for JWT authentication and LangChain integration.
1. Create the FastAPI App (main.py)
This will be the entry point for your backend:
# main.py
from fastapi import FastAPI, Depends, HTTPException, Body
from fastapi.security import OAuth2PasswordRequestForm
from fastapi.middleware.cors import CORSMiddleware
from auth import authenticate_user, create_access_token, get_current_user
from models.schemas import Token, User
from langchain_app import generate_response

app = FastAPI()

# Enable CORS for local frontend (optional)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # Set to specific origins in production
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

@app.get("/")
def read_root():
    return {"message": "Welcome to the Secure AI App!"}

@app.post("/login", response_model=Token)
async def login(form_data: OAuth2PasswordRequestForm = Depends()):
    user = authenticate_user(form_data.username, form_data.password)
    if not user:
        raise HTTPException(status_code=400, detail="Incorrect username or password")
    token = create_access_token(data={"sub": user.username})
    return {"access_token": token, "token_type": "bearer"}

@app.post("/prompt")
async def ai_prompt(
    prompt: str = Body(..., embed=True),  # expects a JSON body: {"prompt": "..."}
    current_user: User = Depends(get_current_user),
):
    response = generate_response(prompt)
    return {"user": current_user.username, "response": response}
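Before wiring up the real auth module, it helps to see what Depends(get_current_user) does conceptually: FastAPI resolves the dependency first and, only if it succeeds, calls the endpoint with the result. A toy sketch of that pattern (this is not FastAPI's actual implementation; all names here are made up for illustration):

```python
# Toy illustration of dependency injection: resolve the declared
# dependency first, then call the endpoint with its return value.
def get_current_user_stub(token: str) -> dict:
    # Stand-in for the real get_current_user defined later in auth.py
    if token != "valid-token":
        raise PermissionError("401 Unauthorized")
    return {"username": "admin"}

def call_endpoint(endpoint, token: str, **kwargs):
    current_user = get_current_user_stub(token)  # dependency resolved first
    return endpoint(current_user=current_user, **kwargs)

def ai_prompt(prompt: str, current_user: dict) -> dict:
    return {"user": current_user["username"], "response": f"echo: {prompt}"}

result = call_endpoint(ai_prompt, "valid-token", prompt="hello")
```

With a bad token, the dependency raises before the endpoint body ever runs, which is exactly the behavior you get from the real get_current_user below.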
2. Create a Token and User Schema (models/schemas.py)
# models/schemas.py
from pydantic import BaseModel

class Token(BaseModel):
    access_token: str
    token_type: str

class User(BaseModel):
    username: str
    disabled: bool = False
3. Add a Dummy User Store (users/users.py)
# users/users.py
from passlib.context import CryptContext

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

fake_users_db = {
    "admin": {
        "username": "admin",
        "hashed_password": pwd_context.hash("admin123"),
        "disabled": False,
    }
}

def get_user(username: str):
    return fake_users_db.get(username)
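passlib's bcrypt scheme stores a per-password salt inside the hash string and recomputes the digest on verify. The same salted-hash idea can be sketched with only the standard library (using PBKDF2 rather than bcrypt, purely for illustration; keep using passlib in the app itself):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> str:
    # A fresh random salt per password; stored alongside the digest.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + "$" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    # Recompute the digest with the stored salt and compare in constant time.
    salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), 100_000
    )
    return hmac.compare_digest(digest.hex(), digest_hex)

stored = hash_password("admin123")
```

This is why two users with the same password still get different stored hashes, and why verification never needs the plaintext on disk.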
Implement JWT Authentication (auth.py)
Create auth.py and add the following:
# auth.py
from datetime import datetime, timedelta, timezone
from jose import JWTError, jwt
from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from passlib.context import CryptContext
from users.users import get_user
from models.schemas import User

# Secret key to encode JWT
SECRET_KEY = "your-secret-key"  # Change this in production!
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 30

# OAuth2 scheme to extract token from Authorization header
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="login")

# Password hashing
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

def verify_password(plain_password: str, hashed_password: str) -> bool:
    return pwd_context.verify(plain_password, hashed_password)

def authenticate_user(username: str, password: str):
    user_data = get_user(username)
    if not user_data or not verify_password(password, user_data["hashed_password"]):
        return None
    return User(username=user_data["username"], disabled=user_data.get("disabled", False))

def create_access_token(data: dict, expires_delta: timedelta | None = None):
    to_encode = data.copy()
    expire = datetime.now(timezone.utc) + (
        expires_delta or timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
    )
    to_encode.update({"exp": expire})
    return jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)

def get_current_user(token: str = Depends(oauth2_scheme)):
    credentials_exception = HTTPException(
        status_code=status.HTTP_401_UNAUTHORIZED,
        detail="Could not validate credentials",
        headers={"WWW-Authenticate": "Bearer"},
    )
    try:
        payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
        username: str = payload.get("sub")
        if username is None:
            raise credentials_exception
    except JWTError:
        raise credentials_exception
    user_data = get_user(username)
    if user_data is None:
        raise credentials_exception
    return User(username=user_data["username"], disabled=user_data.get("disabled", False))
How It Works
- /login uses authenticate_user() to verify credentials and return a JWT token.
- /prompt is protected with Depends(get_current_user), so only users with a valid token can access it.
- The token is passed in the Authorization: Bearer <token> header.
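Under the hood, the HS256 tokens that python-jose encodes and decodes are just three base64url segments (header, payload, signature) tied together with HMAC-SHA256. A minimal stdlib sketch of that sign/verify cycle, for intuition only — keep python-jose for the real app:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    # base64url without padding, as the JWT spec requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_hs256(token: str, secret: str):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch -> reject (401)
    padded = body + "=" * (-len(body) % 4)
    payload = json.loads(base64.urlsafe_b64decode(padded))
    if payload.get("exp", 0) < time.time():
        return None  # token expired -> reject (401)
    return payload

token = sign_hs256({"sub": "admin", "exp": time.time() + 1800}, "your-secret-key")
claims = verify_hs256(token, "your-secret-key")
```

The two failure branches map directly onto the credentials_exception raised in get_current_user: a tampered signature or an expired exp claim both yield a 401.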
LangChain Integration with Hugging Face (langchain_app.py)
In this example, we’ll use LangChain’s HuggingFacePipeline with a local transformer model (like gpt2) to handle AI responses. You can later swap this for another model or even the Hugging Face Inference API.
1. Set Up the LangChain Pipeline
# langchain_app.py
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
from langchain.llms import HuggingFacePipeline
from langchain import PromptTemplate, LLMChain

# Load tokenizer and model (you can replace 'gpt2' with any model you prefer)
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Create the pipeline
generator = pipeline("text-generation", model=model, tokenizer=tokenizer, max_length=100)

# Wrap with LangChain
llm = HuggingFacePipeline(pipeline=generator)

# Optional: Add prompt template
template = PromptTemplate(
    input_variables=["prompt"],
    template="Answer the following: {prompt}",
)

llm_chain = LLMChain(prompt=template, llm=llm)

def generate_response(prompt: str) -> str:
    result = llm_chain.run(prompt)
    return result.strip()
💡 Tip: If you're running low on memory, switch to a smaller model like sshleifer/tiny-gpt2.
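Conceptually, all LLMChain does here is render the template with your variables and hand the result to the model. A toy stand-in with a fake model makes the wiring visible (SimpleChain and fake_llm are illustrative names, not LangChain APIs):

```python
# A toy stand-in for the LangChain pieces above: a prompt template plus a
# pluggable "llm" callable, wired together the way LLMChain wires them.
class SimpleChain:
    def __init__(self, template: str, llm):
        self.template = template
        self.llm = llm

    def run(self, **variables) -> str:
        rendered = self.template.format(**variables)  # the PromptTemplate step
        return self.llm(rendered)                     # the model-call step

def fake_llm(text: str) -> str:
    return f"[model output for: {text}]"

chain = SimpleChain("Answer the following: {prompt}", fake_llm)
result = chain.run(prompt="What is the capital of Indonesia?")
```

Because the model is just a callable behind the chain, swapping gpt2 for another pipeline (or a hosted API) never changes the calling code.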
2. Use the generate_response Function in FastAPI
This function is already called in your /prompt endpoint:
@app.post("/prompt")
async def ai_prompt(
    prompt: str = Body(..., embed=True),  # expects a JSON body: {"prompt": "..."}
    current_user: User = Depends(get_current_user),
):
    response = generate_response(prompt)
    return {"user": current_user.username, "response": response}
When a logged-in user sends a prompt, LangChain wraps it using the template and passes it through the Hugging Face model.
Sample Request
curl -X POST http://localhost:8000/prompt \
-H "Authorization: Bearer <your_token>" \
-H "Content-Type: application/json" \
-d '{"prompt": "What is the capital of Indonesia?"}'
Sample Response
{
  "user": "admin",
  "response": "The capital of Indonesia is Jakarta."
}
That’s it! You’ve now connected FastAPI, LangChain, and Hugging Face into a secure, working AI backend.
Final Testing of the API
Now that everything is wired up, let's test the full flow of your secure AI app.
1. Start the FastAPI Server
In your project root:
uvicorn main:app --reload
Your app will be running at:
👉 http://localhost:8000
2. Get a JWT Token (Login)
Use Postman, curl, or httpx to login and retrieve a token:
Request
curl -X POST http://localhost:8000/login \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "username=admin&password=admin123"
Response
{
  "access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "token_type": "bearer"
}
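As a debugging aid, you can inspect (but not verify!) the claims inside a JWT by base64url-decoding its middle segment. The sample token below is hand-made for illustration; in practice you would pass the access_token string returned by /login:

```python
import base64
import json

def peek_claims(token: str) -> dict:
    # The payload is the middle dot-separated, base64url-encoded segment.
    body = token.split(".")[1]
    body += "=" * (-len(body) % 4)  # restore the padding the encoder strips
    return json.loads(base64.urlsafe_b64decode(body))

# Hand-made sample whose payload segment encodes {"sub": "admin"}:
sample = "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiAiYWRtaW4ifQ.sig"
claims = peek_claims(sample)
```

Note that decoding proves nothing about authenticity; only the server-side signature check in auth.py does.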
3. Send a Prompt Request
Use the token you received to make an authorized request:
Request
curl -X POST http://localhost:8000/prompt \
-H "Authorization: Bearer <your_token>" \
-H "Content-Type: application/json" \
-d '{"prompt": "Explain quantum computing in simple terms."}'
Response
{
  "user": "admin",
  "response": "Quantum computing is a type of computing that uses qubits..."
}
If you try without a token or with an invalid token, you’ll get a 401 Unauthorized error — just as expected.
Conclusion
In this tutorial, you built a secure AI-powered application using:
- FastAPI for a fast and clean API backend
- JWT Authentication to protect endpoints and users
- LangChain to orchestrate prompts and structure LLM logic
- Hugging Face Transformers to generate responses using state-of-the-art language models
This architecture provides a flexible foundation for building intelligent applications, such as chatbots, summarizers, code assistants, or internal tools, all while remaining secure and scalable.
You can get the full source code on our GitHub.
That's just the basics. If you'd like to dig deeper into Python and these frameworks, the following courses are a good next step:
- Edureka's Django course helps you gain expertise in the Django REST framework, Django Models, Django AJAX, Django jQuery, and more. You'll master the Django web framework while working on real-time use cases and receive a Django certification at the end of the course.
- Unlock your coding potential with Python Certification Training. Get a flat 25% off with coupon code: TECHIE25
- Database Programming with Python
- Python Programming: Build a Recommendation Engine in Django
- Python Course: Learn Python by Building Games in Python
- Learn API Development with FastAPI + MySQL in Python
- Learn Flask, a Web Development Framework of Python
Thanks!