SAPSQL_ARRAY_INSERT_DUPREC¶
Duplicate record when inserting into database table.
A bulk INSERT (array insert) tried to write a row whose primary key already exists in the table, and the code didn't handle the conflict.
Symptom¶
ST22 shows:
Runtime Errors SAPSQL_ARRAY_INSERT_DUPREC
Short text Duplicate record when inserting into database table.
The dump details include the table name and the offending key values.
Cause¶
ABAP's INSERT ... FROM TABLE is a single SQL statement that inserts all rows at once. If any row violates the primary key constraint, the whole statement fails and dumps — unless you handle it explicitly.
Common root causes:
- Running the same program twice (reprocessing without cleanup).
- Two parallel jobs or users inserting the same key at the same time.
- A dataset that already contains duplicate keys internally (the first duplicate row was inserted fine; the second hits the existing one).
- Missing a DELETE/cleanup step before re-inserting in a load program.
Reproduce¶
DATA lt_data TYPE STANDARD TABLE OF ztable.
" Insert the same key twice
APPEND VALUE #( key1 = 'A' key2 = '001' value = 'X' ) TO lt_data.
APPEND VALUE #( key1 = 'A' key2 = '001' value = 'Y' ) TO lt_data.
INSERT ztable FROM TABLE lt_data. " <-- dumps on second row
Fix¶
For single-row inserts, check SY-SUBRC and handle duplicates:
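A minimal sketch (ztable and its fields key1/key2/value are placeholders for your own table): a single-row INSERT does not dump on a duplicate key; it sets SY-SUBRC = 4, so you can react in code.

```abap
DATA(ls_row) = VALUE ztable( key1 = 'A' key2 = '001' value = 'X' ).

INSERT ztable FROM ls_row.
IF sy-subrc = 4.
  " Key already exists: log it, skip the row, or raise a message.
  MESSAGE |Duplicate key { ls_row-key1 }/{ ls_row-key2 }| TYPE 'I'.
ENDIF.
```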
If your intent is "insert if new, update if exists", use MODIFY:
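An upsert with MODIFY might look like this (same hypothetical ztable as above):

```abap
" MODIFY inserts rows whose keys are new and overwrites rows whose keys exist,
" so the duplicate-key dump cannot occur.
MODIFY ztable FROM TABLE lt_data.
IF sy-subrc <> 0.
  " A database error other than a duplicate occurred.
  MESSAGE 'MODIFY ztable failed' TYPE 'E'.
ENDIF.
```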
Note: MODIFY does a full row overwrite: all non-key fields are replaced by the values in the work area. If you only want to update specific columns, use UPDATE ... SET instead.
Remove duplicates from the internal table before the bulk insert:
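One way to deduplicate, assuming the primary key of the hypothetical ztable is key1 + key2:

```abap
" Keep exactly one row per primary key; the sort order decides which survives.
SORT lt_data BY key1 key2.
DELETE ADJACENT DUPLICATES FROM lt_data COMPARING key1 key2.

INSERT ztable FROM TABLE lt_data.
```

Note that this only removes duplicates *within* the internal table; rows that clash with keys already in the database still need one of the other fixes.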
In newer releases you can catch database exceptions:
TRY.
    INSERT ztable FROM TABLE lt_data.
  CATCH cx_sy_open_sql_db INTO DATA(lx_db).
    " log lx_db->get_text( ) and continue
ENDTRY.
Note: this catches the dump but no rows are inserted — the whole array insert is atomic at DB level. You'll need to fall back to row-by-row inserts with individual error handling if partial success is needed.
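The row-by-row fallback could be sketched like this: each single-row INSERT sets SY-SUBRC = 4 on a duplicate instead of failing the whole batch, so the good rows still go in (ztable and lt_data are the placeholders from the examples above).

```abap
LOOP AT lt_data INTO DATA(ls_row).
  INSERT ztable FROM ls_row.
  IF sy-subrc = 4.
    " Duplicate key: log the row and continue with the next one.
    CONTINUE.
  ENDIF.
ENDLOOP.
```

This trades one fast array insert for many single inserts, so reserve it for loads where partial success matters more than throughput.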
Prevention¶
- Always deduplicate (SORT + DELETE ADJACENT DUPLICATES) before a bulk insert.
- In load/migration programs, add a DELETE FROM ztable WHERE ... or TRUNCATE step before re-inserting.
- Use MODIFY instead of INSERT when idempotency is more important than detecting unexpected duplicates.
- For concurrent scenarios (parallel jobs), use database locking (ENQUEUE) or a sequence/GUID key that is inherently unique.
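For the GUID-key option, CL_SYSTEM_UUID (7.40+) can generate an inherently unique key. A sketch, assuming a hypothetical table ztable_log whose primary key is a RAW16 field named guid:

```abap
TRY.
    " create_uuid_x16_static returns a 16-byte UUID.
    DATA(lv_guid) = cl_system_uuid=>create_uuid_x16_static( ).
  CATCH cx_uuid_error.
    " UUID generation failed; handle or abort.
    RETURN.
ENDTRY.

DATA(ls_log) = VALUE ztable_log( guid = lv_guid value = 'X' ).
INSERT ztable_log FROM ls_log.  " key collisions are practically impossible
```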
See also¶
- CX_SY_OPEN_SQL_DB: catchable DB exception (7.40+)
- MODIFY statement: upsert semantics
- Transaction ST22: ABAP runtime error analysis