Creating a Text-to-SQL App with OpenAI + FastAPI + SQLite

by SkillAiNest


# Introduction

Data has become an indispensable resource for any successful business, as it provides valuable insights for informed decision-making. Given the importance of data, many companies are building systems to store and analyze it. However, there are many times when it is difficult to obtain and analyze the necessary data, especially with the increasing complexity of data systems.

With the advent of generative AI, working with data has become significantly easier: we can use plain natural language and get mostly accurate outputs that follow our instructions. This applies to data processing and analysis with SQL as well, where we can ask a model to write queries for us.

In this article, we’ll build a simple API application that translates natural language into SQL queries our database understands. We will use three main tools: OpenAI, FastAPI, and SQLite.

Here is the plan.

# Text to SQL App Development

First, we’ll prepare everything we need for the project. All you need to provide is an OpenAI API key, which we will use to access the generative model. To simplify deployment, we will use Docker, which you can get locally by installing Docker Desktop.

Other components, such as SQLite, are already available once you install Python, and FastAPI will be installed later.

For the overall structure of the project, we will use the following:

text_to_sql_app/
├── app/
│   ├── __init__.py          
│   ├── database.py           
│   ├── openai_utils.py       
│   └── main.py               
├── demo.db                   
├── init_db.sql               
├── requirements.txt          
├── Dockerfile                
├── docker-compose.yml        
├── .env

Create the structure as above, or use the accompanying repository to simplify things. We will still go through each file to understand how the application is built.

Let’s start by setting up the .env file with the OpenAI API key we obtained earlier:

OPENAI_API_KEY=YOUR-API-KEY

Then, fill requirements.txt with the libraries we will use:

fastapi
uvicorn
sqlalchemy
openai
pydantic
python-dotenv

Next, create the __init__.py file and put the following code inside:

from pathlib import Path
from dotenv import load_dotenv

load_dotenv(dotenv_path=Path(__file__).resolve().parent.parent / ".env", override=False)

The above code loads the .env file so that the environment contains all of our necessary keys.
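To see what this accomplishes, here is what load_dotenv does in miniature: it parses KEY=VALUE lines and puts them into the process environment without overriding existing values. This is a simplified sketch with a hypothetical key name, not a replacement for python-dotenv:

```python
import os

def load_env_lines(lines: list[str]) -> None:
    # Minimal sketch of load_dotenv: parse KEY=VALUE pairs and put them
    # into the process environment (setdefault = do not override).
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

# DEMO_API_KEY is a made-up name for illustration
load_env_lines(["DEMO_API_KEY=sk-demo-123"])
print(os.getenv("DEMO_API_KEY"))
```

After this runs, any module in the app can read the key via os.getenv, which is exactly how openai_utils.py will pick up OPENAI_API_KEY.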

Then, we’ll write the database.py file, which connects to the SQLite database (demo.db) and provides a helper to run SQL queries.

from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session

ENGINE = create_engine("sqlite:///demo.db", future=True, echo=False)

def run_query(sql: str) -> list[dict]:
    with Session(ENGINE) as session:
        rows = session.execute(text(sql)).mappings().all()
    return [dict(r) for r in rows]
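The rows-to-dicts pattern above can be sketched with the standard-library sqlite3 module alone (no SQLAlchemy required), which makes it easy to see what run_query returns. The helper name run_query_stdlib is hypothetical:

```python
import sqlite3

def run_query_stdlib(conn: sqlite3.Connection, sql: str) -> list[dict]:
    # sqlite3.Row lets us address columns by name,
    # similar to SQLAlchemy's .mappings()
    conn.row_factory = sqlite3.Row
    rows = conn.execute(sql).fetchall()
    return [dict(r) for r in rows]

# Tiny in-memory database for illustration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob')")

print(run_query_stdlib(conn, "SELECT * FROM customers"))
```

Returning plain dicts (rather than ORM objects) keeps the result trivially JSON-serializable, which matters later when FastAPI returns it to the client.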

After that, we will prepare the openai_utils.py file, which accepts the database schema and the user’s question. The output is JSON containing the SQL query (with a guard to prevent any write operations).

import os
import json
from openai import OpenAI        

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

_SYSTEM_PROMPT = """
You convert natural-language questions into read-only SQLite SQL.
Never output INSERT / UPDATE / DELETE.
Return JSON: { "sql": "..." }.
"""

def text_to_sql(question: str, schema: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",        
        temperature=0.1,
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": _SYSTEM_PROMPT},
            {"role": "user",
             "content": f"schema:\n{schema}\n\nquestion: {question}"}
        ]
    )
    payload = json.loads(response.choices[0].message.content)
    return payload["sql"]
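The last two lines parse the model's JSON reply and pull out the query. Here is that parsing step in isolation, with a hypothetical response string of the shape the system prompt requests:

```python
import json

# Hypothetical model output matching the JSON shape our
# system prompt asks for: { "sql": "..." }
raw_content = '{ "sql": "SELECT COUNT(*) AS n FROM customers" }'

payload = json.loads(raw_content)  # parse the JSON envelope
sql = payload["sql"]               # extract the query string
print(sql)
```

Using response_format={"type": "json_object"} in the API call is what makes this parse reliable: the model is constrained to emit valid JSON rather than free-form prose.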

With both the model call and the database connection ready, we’ll build the application with FastAPI in main.py. The application accepts a natural-language question, converts it (together with the database schema) into a SQL SELECT query, runs it against the SQLite database, and returns the results as JSON. We can then call the API from the CLI.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import inspect
from .database import ENGINE, run_query
from .openai_utils import text_to_sql

app = FastAPI(title="Text-to-SQL Demo")

class NLRequest(BaseModel):
    question: str

@app.on_event("startup")
def capture_schema() -> None:
    insp = inspect(ENGINE)
    global SCHEMA_STR
    SCHEMA_STR = "\n".join(
        f"CREATE TABLE {t} ({', '.join(c['name'] for c in insp.get_columns(t))})"
        for t in insp.get_table_names()
    )

@app.post("/query")
def query(req: NLRequest):
    try:
        sql = text_to_sql(req.question, SCHEMA_STR)
        if not sql.lstrip().lower().startswith("select"):
            raise ValueError("Only SELECT statements are allowed")
        return {"sql": sql, "result": run_query(sql)}
    except Exception as e:
        raise HTTPException(status_code=400, detail=str(e))

That’s all we need for the main application. The next thing we will create is the database. Use the init_db.sql below for example purposes, but you can always change it if you want.


DROP TABLE IF EXISTS order_items;
DROP TABLE IF EXISTS orders;
DROP TABLE IF EXISTS payments;
DROP TABLE IF EXISTS products;
DROP TABLE IF EXISTS customers;

CREATE TABLE customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    country TEXT,
    signup_date DATE
);

CREATE TABLE products (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    category TEXT,
    price REAL
);

CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_date DATE,
    total REAL,
    FOREIGN KEY (customer_id) REFERENCES customers(id)
);

CREATE TABLE order_items (
    order_id INTEGER,
    product_id INTEGER,
    quantity INTEGER,
    unit_price REAL,
    PRIMARY KEY (order_id, product_id),
    FOREIGN KEY (order_id) REFERENCES orders(id),
    FOREIGN KEY (product_id) REFERENCES products(id)
);

CREATE TABLE payments (
    id INTEGER PRIMARY KEY,
    order_id INTEGER,
    payment_date DATE,
    amount REAL,
    method TEXT,
    FOREIGN KEY (order_id) REFERENCES orders(id)
);

INSERT INTO customers (id, name, country, signup_date) VALUES
 (1,'Alice','USA','2024-01-05'),
 (2,'Bob','UK','2024-03-10'),
 (3,'Choi','KR','2024-06-22'),
 (4,'Dara','ID','2025-01-15');

INSERT INTO products (id, name, category, price) VALUES
 (1,'Laptop Pro','Electronics',1500.00),
 (2,'Noise-Canceling Headphones','Electronics',300.00),
 (3,'Standing Desk','Furniture',450.00),
 (4,'Ergonomic Chair','Furniture',250.00),
 (5,'Monitor 27"','Electronics',350.00);

INSERT INTO orders (id, customer_id, order_date, total) VALUES
 (1,1,'2025-02-01',1850.00),
 (2,2,'2025-02-03',600.00),
 (3,3,'2025-02-05',350.00),
 (4,1,'2025-02-07',450.00);

INSERT INTO order_items (order_id, product_id, quantity, unit_price) VALUES
 (1,1,1,1500.00),
 (1,2,1,300.00),
 (1,5,1,350.00),
 (2,3,1,450.00),
 (2,4,1,250.00),
 (3,5,1,350.00),
 (4,3,1,450.00);

INSERT INTO payments (id, order_id, payment_date, amount, method) VALUES
 (1,1,'2025-02-01',1850.00,'Credit Card'),
 (2,2,'2025-02-03',600.00,'PayPal'),
 (3,3,'2025-02-05',350.00,'Credit Card'),
 (4,4,'2025-02-07',450.00,'Bank Transfer');

Next, run the following command in your CLI to create the SQLite database for our project.

sqlite3 demo.db < init_db.sql  
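If you prefer not to shell out to the sqlite3 binary, the same initialization can be done from Python with executescript. The snippet below uses a trimmed-down stand-in for init_db.sql, purely for illustration:

```python
import sqlite3

# Trimmed-down stand-in for init_db.sql (illustration only)
init_sql = """
DROP TABLE IF EXISTS customers;
CREATE TABLE customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
INSERT INTO customers (id, name) VALUES (1, 'Alice'), (2, 'Bob');
"""

conn = sqlite3.connect(":memory:")  # use "demo.db" for a file-backed database
conn.executescript(init_sql)        # runs multiple statements, like the CLI does

count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(count)
```

executescript handles multi-statement scripts, which a plain execute call would reject.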

With the database ready, we’ll create a Dockerfile to containerize the application.

FROM python:3.12-slim
WORKDIR /code

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

We will also create a docker-compose.yml file to run the application more easily.

services:
  text2sql:
    build: .
    env_file: .env    
    ports:
      - "8000:8000"    
    restart: unless-stopped
    volumes:
      - ./demo.db:/code/demo.db

With everything ready, start Docker Desktop and run the following commands to build and start the application.

docker compose build --no-cache   
docker compose up -d 

If everything went well, you can test the application with the following command. We will ask how many customers are in the data.

curl -X POST "http://localhost:8000/query" -H "Content-Type: application/json" -d "{\"question\":\"How many customers?\"}"

The output will look like this.

{"sql":"SELECT COUNT(*) AS customer_count FROM customers;","result":[{"customer_count":4}]}

We can try something more complex, like the number of orders for each customer:

curl -X POST "http://localhost:8000/query" -H "Content-Type: application/json" -d "{\"question\":\"What is the number of orders placed by each customer\"}"

with output like below.

{"sql":"SELECT customer_id, COUNT(*) AS number_of_orders FROM orders GROUP BY customer_id;","result":[{"customer_id":1,"number_of_orders":2},{"customer_id":2,"number_of_orders":1},{"customer_id":3,"number_of_orders":1}]}

That’s all you need to build a basic text-to-SQL application. You can further enhance it with a front-end interface or a more sophisticated pipeline as your needs grow.

# Wrap Up

Data is at the heart of business decision-making, but the systems that hold it are often complex. Generative AI can help us navigate them with nothing more than natural language.

In this article, we have learned how to develop a simple text-to-SQL application using an OpenAI model, FastAPI, and SQLite.

I hope this has helped!

Cornelius Yudhavijaya is a Data Science Assistant Manager and data writer. While working full-time at Allianz Indonesia, he shares Python and data tips through social and written media. Cornelius writes on a variety of AI and machine learning topics.
