December 16, 2025

Integrating ChatGPT with Oracle for SQL Optimization

 
AI assistants for Database Administrators (DBAs) are transforming how databases are managed, optimized, and secured.

AI assistants such as ChatGPT (OpenAI) and Microsoft Copilot can be integrated with Oracle databases to help DBAs automate routine tasks, assist with performance tuning, simplify query generation, and provide real-time insights and recommendations.

  • Microsoft Copilot Studio - Copilot Studio can be integrated directly with an Oracle database (no-code integration). It now lets users add Oracle as a knowledge source, enabling natural language querying and real-time reasoning.
  • ChatGPT (OpenAI) - A Python script can connect to the Oracle database, extract the top SQLs, their execution plans, and other statistics, and send them to ChatGPT (OpenAI). The script stores ChatGPT's recommendations in JSON format and generates test scripts from them.

In this blog, I am going to demonstrate how to integrate ChatGPT (OpenAI) with an Oracle database for SQL tuning and index optimization.

Note: My YouTube video below demonstrates these steps in detail, showing how to integrate ChatGPT (OpenAI) with an Oracle database for SQL optimization.


Detailed Demo Video


Architecture Diagram:



Benefits of AI assistants in SQL optimization:

SQL optimization has traditionally been manual, time-consuming, and dependent on years of experience. AI assistants help DBAs:

  • Quickly identify high-cost queries.
  • Automatically rewrite inefficient SQL queries, reducing manual optimization effort.
  • Interpret complex execution plans, highlight bottlenecks, and recommend indexes.
  • Review AWR reports, identify bottlenecks, and make suggestions.

Sample response/recommendations from ChatGPT for one SQL query:







Implementation:

Environment/Components:

  • Oracle Database (19c or later recommended)
  • Python 3.9+ runtime (or latest version)
  • OpenAI API key
  • Install Python packages - oracledb, openai


Steps to integrate ChatGPT with Oracle database:


Step 1 - Generate an API key

  • Go to the OpenAI platform (https://platform.openai.com/) and log in with your OpenAI account.
  • Once logged in, click your profile icon (top right) and select "View API Keys".
  • Click "Create new secret key".
  • Copy the API key and use it in your code/app.
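
The script in Step 3 reads the key from the OPENAI_API_KEY environment variable rather than hard-coding it; a minimal sketch of that pattern (the error message is just illustrative):

import os

# The key is expected in the OPENAI_API_KEY environment variable,
# e.g. set with: export OPENAI_API_KEY="sk-..." before running the script
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise SystemExit("OPENAI_API_KEY is not set")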


Step 2 - Install Python package “oracledb” and “openai”

  • pip install oracledb
  • pip install openai
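
A quick sanity check that both packages are installed and the database is reachable; the connection details below are placeholders, substitute your own user, password, and DSN:

import oracledb
import openai   # imported only to confirm the package installed cleanly

# Placeholder connection details - replace with your own credentials and DSN
conn = oracledb.connect(user="<username>", password="<password>",
                        dsn="<host>:<port>/<service_name>")
print("Connected to Oracle", conn.version)
conn.close()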


Step 3 - Prepare a Python script that connects to the Oracle database and OpenAI to perform the steps below. The high-level Python code is listed under each step.


Step 3.1. Identify High-Cost SQL Workloads from the Last Week

SELECT *
FROM (SELECT ss.sql_id,
             SUM(ss.elapsed_time_delta) AS total_elapsed_time,
             SUM(ss.executions_delta)   AS total_executions,
             ss.plan_hash_value,
             sn.begin_interval_time,
             ss.parsing_schema_name,
             SUM(ss.cpu_time_delta)     AS total_cpu_time,
             SUM(ss.disk_reads_delta)   AS total_disk_reads,
             DBMS_LOB.SUBSTR(st.sql_text, 4000, 1) AS sql_text
      FROM   dba_hist_sqlstat ss, dba_hist_sqltext st, dba_hist_snapshot sn
      WHERE  ss.sql_id = st.sql_id
      AND    ss.snap_id = sn.snap_id
      AND    ss.instance_number = sn.instance_number
      AND    sn.begin_interval_time >= SYSDATE - 7
      AND    ss.parsing_schema_name NOT IN ('SYS', 'SYSTEM')
      AND    ss.elapsed_time_delta IS NOT NULL
      AND    ss.executions_delta > 0
      AND    ss.plan_hash_value > 0
      GROUP BY ss.sql_id, sn.begin_interval_time, ss.parsing_schema_name,
               DBMS_LOB.SUBSTR(st.sql_text, 4000, 1), ss.plan_hash_value
      ORDER BY total_elapsed_time DESC
     )
WHERE ROWNUM <= 10;

Note:

  • Real-time SQL (from the library cache): pulled from V$SQL
  • Historical SQL: pulled from DBA_HIST_SQLSTAT joined with DBA_HIST_SNAPSHOT
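
For completeness, a minimal sketch of how the script can run the query above and collect (sql_id, sql_text) pairs; the helper name get_top_sql and the top_sql_query variable are my own illustration, not part of the original script:

def get_top_sql(conn, top_sql_query):
    """Run the Step 3.1 query and return (sql_id, sql_text) pairs."""
    cur = conn.cursor()
    # top_sql_query holds the statement above, without the trailing semicolon
    cur.execute(top_sql_query)
    # Column positions follow the SELECT list: sql_id is the first column,
    # sql_text is the last
    return [(row[0], row[8]) for row in cur]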

Step 3.2. Retrieve the execution plan for each SQL, using the statement below:

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_AWR('<sql_id>',
    plan_hash_value => <plan_hash_value>,
    format => 'ALL -ROWS -BYTES +COST +PARALLEL +PARTITION +PEEKED_BINDS +OUTLINE +IOSTATS +MEMSTATS +PREDICATE +REMOTE +NOTE'));

Note:

  • DBMS_XPLAN.DISPLAY_CURSOR - Displays the execution plan of a SQL statement currently in the cursor cache.
  • DBMS_XPLAN.DISPLAY_AWR - Displays execution plans stored in AWR snapshots
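
For example, for a statement still in the cursor cache, a roughly equivalent call would be (format options trimmed for brevity):

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR('<sql_id>', NULL, 'ALLSTATS LAST +PEEKED_BINDS +OUTLINE'));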

Step 3.3. Send Workload Data to ChatGPT

import oracledb, os, json
from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Database connection - replace with your own credentials and DSN
conn = oracledb.connect(user="<username>", password="<password>",
                        dsn="<host>:<port>/<service_name>")

def get_plan(sql_id):
    cur = conn.cursor()
    # Bind the SQL_ID and pull the AWR plan in the same format as Step 3.2
    cur.execute("""
        SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_AWR(
            sql_id => :sql_id,
            format => 'ALL -ROWS -BYTES +COST +PARALLEL +PARTITION +PEEKED_BINDS +OUTLINE +IOSTATS +MEMSTATS +PREDICATE +REMOTE +NOTE'))
    """, sql_id=sql_id)
    return "\n".join(r[0] for r in cur)

def ask_gpt(sql_id, sql_text, plan):
    prompt = f"""
    You are helping to optimize SQL and indexes for Oracle.
    Given a SQL statement and its execution plan,
    recommend up to 3 B*Tree indexes or say "none".
    Also suggest SQL improvements or rewritten SQL where appropriate.
    Return strictly in this JSON format:

    {{
      "sql_id": "{sql_id}",
      "recommendations": [
        {{
          "create_sql": "CREATE INDEX ...",
          "reason": "why it helps",
          "risk": "low/medium/high",
          "benefits": "estimated percentage of improvement"
        }}
      ]
    }}
    SQL:
    {sql_text}

    PLAN:
    {plan}
    """
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role":"user","content":prompt}]
    )
    return response.choices[0].message.content
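
Optionally, the client.chat.completions.create call inside ask_gpt can also pass the response_format parameter to force a bare JSON object (supported by gpt-4o-mini); this makes the json.loads() call in the next step less likely to fail:

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        # Ask the API for a plain JSON object instead of free-form text
        response_format={"type": "json_object"},
    )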


Step 3.4. Collect GPT output and save to files

results = []
# In the full script, the (sql_id, sql_text) pairs come from the Step 3.1 query;
# a single hard-coded example is shown here.
for sql_id, sql_text in [("03guhbfpak0w7", "SELECT ... FROM ...")]:
    plan = get_plan(sql_id)
    suggestion = ask_gpt(sql_id, sql_text, plan)
    results.append({"sql_id": sql_id, "suggestions": suggestion})


# Write JSON
with open("recommendations.json", "w", encoding="utf-8") as f:
    json.dump(results, f, indent=2, ensure_ascii=False)

# Build index scripts
create_lines = ["-- Test indexes (INVISIBLE)"]

for r in results:
    try:
        recs = json.loads(r["suggestions"])
        for rec in recs.get("recommendations", []):
            stmt = rec["create_sql"].rstrip(";") + " INVISIBLE;"
            create_lines.append(stmt)
            idx_name = stmt.split()[2]  # index name (3rd token of CREATE INDEX <name> ...), kept for a later DROP/VISIBLE script
    except Exception as e:
        create_lines.append(f"-- Error parsing {r['sql_id']}: {e}")

with open("create_indexes.sql", "w", encoding="utf-8") as f:
    f.write("\n".join(create_lines))


Step 4. Run Python script

python3 <script name>


Step 5. Review the output files. The script generates two files from the ChatGPT responses:

  • recommendations.json ⇒ Contains ChatGPT's index recommendations and SQL rewrite suggestions. Convert the JSON to HTML using any tool (or ChatGPT itself) for easier review, as shown in the screenshot below:


  • create_indexes.sql - Contains the script for creating the new test indexes (created INVISIBLE; see the sketch below for how to evaluate them)
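
Because the generated indexes are created INVISIBLE, they don't change existing execution plans until you deliberately test them. A typical way to evaluate one of them in a single session (index name taken from create_indexes.sql) looks roughly like this:

-- Let only this session's optimizer consider invisible indexes
ALTER SESSION SET optimizer_use_invisible_indexes = TRUE;

-- Re-run the problem SQL, compare the plan and elapsed time, then either
ALTER INDEX <index_name> VISIBLE;   -- keep the index
-- or
DROP INDEX <index_name>;            -- discard it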


Limitations:

  • ChatGPT doesn’t see the full schema context
  • LOBs and complex data types may require special handling
  • Over-indexing can slow down DML

My YouTube video below demonstrates the above steps in detail, showing how to integrate ChatGPT (OpenAI) with an Oracle database for SQL optimization.

Detailed Demo Video