Convert Oracle SQL to Databricks SQL
The free Oracle → Databricks SQL converter. Handles ROWNUM, CONNECT BY, NVL, DECODE, SYSDATE, sequences, and 20+ Oracle-specific functions. No signup.
✓ No signup
✓ No data stored
✓ Instant
Load a sample query:
Ctrl+Enter to convert
Oracle SQL
Paste any Oracle SQL — SELECT, INSERT, DDL, stored procedure fragments
Databricks / Spark SQL
Conversion Notes
Conversion details will appear here after converting...
SQL Conversion Reference
Complete reference for Oracle → Spark SQL conversions. Bookmark for quick lookup during your migration.
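For a feel of how several rules combine in practice, here is a typical Oracle query and its Spark SQL equivalent (table and column names are illustrative, not from a real schema):

```sql
-- Oracle
SELECT employee_id,
       NVL(commission_pct, 0)                                    AS commission,
       DECODE(status, 'A', 'Active', 'I', 'Inactive', 'Unknown') AS status_label,
       SYSDATE                                                   AS loaded_at
FROM   employees
WHERE  ROWNUM <= 10;

-- Databricks / Spark SQL
SELECT employee_id,
       COALESCE(commission_pct, 0)        AS commission,
       CASE status WHEN 'A' THEN 'Active'
                   WHEN 'I' THEN 'Inactive'
                   ELSE 'Unknown' END     AS status_label,
       CURRENT_TIMESTAMP                  AS loaded_at
FROM   employees
LIMIT  10;
```

Note that LIMIT applies after any ORDER BY, whereas Oracle's ROWNUM filter is applied before sorting in the same query block, so queries that combine ROWNUM with ORDER BY may need a subquery to preserve semantics.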
| Oracle | Spark SQL / Databricks | Notes |
|---|---|---|
| SYSDATE | CURRENT_TIMESTAMP | Direct replacement |
| SYSTIMESTAMP | CURRENT_TIMESTAMP | Direct replacement |
| NVL(x, y) | COALESCE(x, y) | Identical behavior; Spark SQL also supports NVL natively |
| NVL2(x, y, z) | IF(x IS NOT NULL, y, z) | Returns y when x is NOT NULL, else z; Spark SQL also supports NVL2 natively |
| DECODE(col, v1, r1, …, def) | CASE WHEN col=v1 THEN r1 … ELSE def END | Trailing odd argument becomes ELSE |
| TO_DATE(str, fmt) | TO_DATE(str, fmt) | Format syntax differs: Oracle YYYY → Spark yyyy, DD → dd |
| TO_CHAR(date, fmt) | DATE_FORMAT(date, fmt) | Format strings differ significantly; verify each format mask |
| TO_NUMBER(x) | CAST(x AS DECIMAL) | Specify precision/scale if known |
| TO_CLOB(x) | CAST(x AS STRING) | Spark STRING is unbounded |
| VARCHAR2(n) | STRING | No length limit needed in Spark |
| NUMBER(p, s) | DECIMAL(p, s) | With precision and scale |
| NUMBER | BIGINT | Without precision; use DECIMAL for fractional values |
| CLOB | STRING | Spark STRING has no size limit |
| BLOB | BINARY | Binary data type |
| DATE (type) | TIMESTAMP | Oracle DATE stores time; Spark DATE is date-only. Use TIMESTAMP to preserve time. |
| FROM DUAL | (omit FROM clause) | Spark allows SELECT without FROM; drop the dummy table |
| MINUS | EXCEPT | Set subtraction operator |
| x &#124;&#124; y | CONCAT(x, y) | Spark also supports &#124;&#124; natively |
| SUBSTR(x, y, z) | SUBSTRING(x, y, z) | Same argument order |
| INSTR(str, sub) | LOCATE(sub, str) | Argument order is reversed! Spark's INSTR(str, sub) keeps Oracle's order |
| LENGTHB(x) | OCTET_LENGTH(x) | Byte count; Spark's LENGTH counts characters, not bytes |
| TRUNC(date) | DATE_TRUNC('day', date) | Truncate to day boundary |
| TRUNC(number) | CAST(number AS BIGINT) | Spark has no numeric TRUNCATE function; CAST truncates toward zero, matching Oracle TRUNC |
| MOD(x, y) | x % y | Modulo operator; Spark also supports MOD(x, y) |
| CEIL(x) | CEILING(x) | Ceiling function |
| LPAD / RPAD | LPAD / RPAD | Identical in Spark |
| WHERE ROWNUM <= N | LIMIT N | Moved to end of query; note LIMIT applies after ORDER BY, unlike ROWNUM |
| ROWID | monotonically_increasing_id() | No direct equivalent; consider row_number() OVER() |
| seq.NEXTVAL | — (manual) | Use Delta identity columns (GENERATED ALWAYS AS IDENTITY) or monotonically_increasing_id() |
| seq.CURRVAL | — (not supported) | No equivalent; requires redesign |
| CONNECT BY PRIOR | WITH RECURSIVE cte AS (…) | Hierarchical queries must be rewritten as recursive CTEs |
| table.col = other.col(+) | LEFT JOIN | Oracle (+) outer join → ANSI JOIN syntax |
| MERGE INTO … USING | MERGE INTO … USING | Delta Lake supports MERGE; verify MATCHED/NOT MATCHED syntax |
| CREATE OR REPLACE | CREATE OR REPLACE | Supported in Delta Lake |
| DBMS_OUTPUT.PUT_LINE | print() / display() | Use Python/Scala notebook output functions |
| EXCEPTION / WHEN OTHERS | try/except (Python) | PL/SQL exception blocks → Python/Scala wrapper code |
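The trickiest row above is CONNECT BY, which has no drop-in replacement. A sketch of the rewrite, assuming an illustrative employees(employee_id, manager_id) table and a runtime version that supports WITH RECURSIVE:

```sql
-- Oracle: hierarchical query
SELECT employee_id, manager_id, LEVEL
FROM   employees
START WITH manager_id IS NULL
CONNECT BY PRIOR employee_id = manager_id;

-- Spark SQL: recursive CTE equivalent
WITH RECURSIVE org AS (
  -- Anchor member: rows matching START WITH
  SELECT employee_id, manager_id, 1 AS level
  FROM   employees
  WHERE  manager_id IS NULL
  UNION ALL
  -- Recursive member: CONNECT BY PRIOR employee_id = manager_id
  SELECT e.employee_id, e.manager_id, o.level + 1
  FROM   employees e
  JOIN   org o ON e.manager_id = o.employee_id
)
SELECT employee_id, manager_id, level FROM org;
```

The START WITH clause becomes the anchor member, the CONNECT BY PRIOR condition becomes the recursive join, and LEVEL is reproduced by incrementing a counter column.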
Need Help With the Full Migration?
Oracle-to-Databricks migrations typically take 3–6 months with traditional consultants. Our AI-powered approach cuts that to 3–6 weeks.