JSON to SQL: How to Generate INSERT Statements from JSON Data
Converting JSON to SQL INSERT statements is a common task in data migration, test database seeding, and ETL pipelines. You have a JSON file from an API, an export, or a config, and you need to get it into a relational database. This guide covers the core patterns, how nested JSON maps to SQL, and how to handle edge cases.
The Simple Case: A Flat JSON Array
If your JSON is an array of flat objects where each key maps directly to a column, the conversion is straightforward.
```json
[
  { "id": 1, "name": "Alice", "email": "[email protected]", "active": true },
  { "id": 2, "name": "Bob", "email": "[email protected]", "active": false }
]
```

```sql
INSERT INTO users (id, name, email, active) VALUES
(1, 'Alice', '[email protected]', TRUE),
(2, 'Bob', '[email protected]', FALSE);
```

Generate this instantly
Paste your JSON into our JSON to SQL Converter to generate INSERT statements in seconds. Works for any flat or moderately nested JSON structure.
Data Type Mapping: JSON → SQL
| JSON Type | SQL Type | Example |
|---|---|---|
| string | VARCHAR / TEXT | "Alice" → 'Alice' |
| number (integer) | INTEGER / BIGINT | 42 → 42 |
| number (float) | DECIMAL / FLOAT | 3.14 → 3.14 |
| boolean | BOOLEAN / TINYINT(1) | true → TRUE / 1 |
| null | NULL | null → NULL |
| array | JSON / TEXT (serialized) | ["a","b"] → '["a","b"]' |
| object | JSON / TEXT (serialized) | {"x":1} → '{"x":1}' |
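The mapping above can be sketched as a small helper. This is an illustrative Python sketch, not a production escaper (`sql_literal` is a hypothetical name); the branches follow the table, with booleans checked before numbers and arrays/objects serialized to JSON strings:

```python
import json

def sql_literal(val):
    """Render a parsed JSON value as a SQL literal, per the mapping table.
    Sketch only: for untrusted data, prefer parameterized queries."""
    if val is None:
        return 'NULL'
    if isinstance(val, bool):          # check bool before int: bool is an int subclass
        return 'TRUE' if val else 'FALSE'
    if isinstance(val, (int, float)):
        return str(val)
    if isinstance(val, (list, dict)):  # serialize arrays/objects to a JSON string
        return "'" + json.dumps(val, separators=(',', ':')).replace("'", "''") + "'"
    return "'" + str(val).replace("'", "''") + "'"  # escape embedded single quotes

print(sql_literal("O'Brien"))     # 'O''Brien'
print(sql_literal(["a", "b"]))    # '["a","b"]'
```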
Handling Nested JSON
Nested objects and arrays do not map directly to SQL columns. You have three options:
Option 1: Serialize to JSON column (PostgreSQL / MySQL 5.7+)
```json
{ "id": 1, "name": "Alice", "address": { "city": "New York", "zip": "10001" } }
```

```sql
-- PostgreSQL / MySQL (JSON column type)
INSERT INTO users (id, name, address) VALUES
(1, 'Alice', '{"city":"New York","zip":"10001"}');

-- Query it later with:
SELECT address->>'city' FROM users WHERE id = 1; -- PostgreSQL
SELECT JSON_EXTRACT(address, '$.city') FROM users WHERE id = 1; -- MySQL
```

Option 2: Flatten into separate columns

```sql
INSERT INTO users (id, name, address_city, address_zip) VALUES
(1, 'Alice', 'New York', '10001');
```

Option 3: Normalize into a related table

```sql
INSERT INTO users (id, name) VALUES (1, 'Alice');
INSERT INTO addresses (user_id, city, zip) VALUES (1, 'New York', '10001');
```

Escaping and Safety
When generating SQL from untrusted JSON, proper escaping is critical to prevent SQL injection.
```javascript
function jsonToInsert(tableName, records) {
  if (!records.length) return '';
  const columns = Object.keys(records[0]);
  const escape = (val) => {
    if (val === null || val === undefined) return 'NULL';
    if (typeof val === 'boolean') return val ? 'TRUE' : 'FALSE';
    // NaN and Infinity are not valid SQL literals; emit NULL instead
    if (typeof val === 'number') return Number.isFinite(val) ? String(val) : 'NULL';
    // serialize nested objects/arrays, escaping embedded single quotes
    if (typeof val === 'object') return `'${JSON.stringify(val).replace(/'/g, "''")}'`;
    return `'${String(val).replace(/'/g, "''")}'`; // escape single quotes
  };
  const rows = records.map(r => `(${columns.map(c => escape(r[c])).join(', ')})`);
  // note: tableName and column names are interpolated unescaped;
  // use this only with trusted schema names
  return `INSERT INTO ${tableName} (${columns.join(', ')}) VALUES\n${rows.join(',\n')};`;
}
```

Always use parameterized queries in production
Generated INSERT statements are great for one-off migrations and seeding. For application code that runs repeatedly with user-provided data, always use parameterized queries or an ORM to prevent SQL injection.
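For comparison, here is the same kind of insert done with parameterized queries, using Python's stdlib sqlite3 as a stand-in for any DB-API driver (placeholder syntax varies by driver: `?` in SQLite, `%s` in psycopg2). The driver handles quoting, so the apostrophe in the data needs no manual escaping:

```python
import json
import sqlite3

# Sample payload with an embedded single quote (would break naive string-building)
users = json.loads('[{"id": 1, "name": "O\'Brien", "email": "[email protected]"}]')

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO users (id, name, email) VALUES (?, ?, ?)",  # '?' placeholders: SQLite
    [(u["id"], u["name"], u["email"]) for u in users],
)
name = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0]
print(name)  # O'Brien — the quote round-trips safely
```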
Common Use Cases
- Database seeding: Convert JSON fixture files into SQL seed scripts for development and test environments.
- Data migration: Export data from a NoSQL store (MongoDB, DynamoDB) as JSON, convert to SQL, import into Postgres or MySQL.
- API response archiving: Snapshot API responses as JSON and import them into a database for analysis.
- Spreadsheet to database: Export CSV → convert to JSON → generate INSERT statements.
- ETL pipelines: Transform JSON from a data lake into structured SQL inserts for a data warehouse.
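As a sketch of the spreadsheet path above, the CSV-to-JSON step can be folded straight into INSERT generation with Python's stdlib csv module. The inline `csv_text` sample is made up for illustration; note that CSV is untyped, so every value lands as a quoted string unless you add type inference:

```python
import csv
import io

# Hypothetical CSV export from a spreadsheet (header row gives column names)
csv_text = "id,name,email\n1,Alice,[email protected]\n2,Bob,[email protected]\n"

rows = list(csv.DictReader(io.StringIO(csv_text)))  # CSV rows -> list of dicts
cols = list(rows[0].keys())
values = ",\n".join(
    "(" + ", ".join("'" + r[c].replace("'", "''") + "'" for c in cols) + ")"
    for r in rows
)
stmt = f"INSERT INTO users ({', '.join(cols)}) VALUES\n{values};"
print(stmt)
```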
```python
import json
import psycopg2

with open('users.json') as f:
    users = json.load(f)

conn = psycopg2.connect(DATABASE_URL)  # connection string defined elsewhere
cur = conn.cursor()
for user in users:
    cur.execute(
        "INSERT INTO users (id, name, email) VALUES (%s, %s, %s)",
        (user['id'], user['name'], user['email'])
    )
conn.commit()
```

Convert JSON to SQL instantly
Paste any JSON array and generate ready-to-run INSERT statements for PostgreSQL, MySQL, SQLite, and more.