
Step-by-Step Guide to Solving the n8n Merge Node Type Mismatch Error
Who this is for: n8n developers who hit a Merge node type mismatch error in production or testing workflows and need a concise, repeatable fix. We cover this in detail in the n8n Node Specific Errors Guide.
Quick Diagnosis
The error means the two incoming streams have different data shapes (e.g., one is an array, the other a single object or binary).
Fix in 4 steps
- Open Execution > Data on the failing run and inspect each input’s top‑level structure.
- Insert a Set (or Function) node before the Merge to standardise both streams to the same type (usually an array of objects).
- Pick a Merge mode that matches the normalised shape (Append / Merge / Wait / Pass‑Through).
- Re‑run the workflow; the error disappears.
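Step 2 of the fix above can be sketched in a Code node. This is a minimal, illustrative sketch (the helper name is ours, not an n8n built-in) that assumes the mismatched branch emits either a bare object or a bare array:

```javascript
// Illustrative helper for a Code node: coerce whatever the branch
// produced into the array-of-items shape the Merge node expects.
function toItemArray(payload) {
  const records = Array.isArray(payload) ? payload : [payload];
  return records.map(json => ({ json })); // n8n items look like { json: {...} }
}
```

In a Code node you would `return toItemArray($json);` on each branch so both Merge inputs present the same container type.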
Why does the Merge node throw “type mismatch”?
The node validates the top‑level json (or binary) payload of each incoming connection. If they differ, execution stops.
| Cause | What n8n sees | Why it fails |
|---|---|---|
| One branch returns JSON, the other binary (file, image) | json vs binary | Merge can only combine matching data sections. |
| One branch outputs a single object, the other an array of objects | {…} vs [{…}] | Append/merge expects identical container types. |
| Different property names when using *Merge by key* | {id:1} vs {userId:1} | No common key → rows cannot be aligned. |
| Mixed null/undefined values in one stream | null in field price | Null breaks structural comparison in *Append* mode. |
Diagnosing the mismatch – step‑by‑step
- Open the failed execution → Execution > Data → click the Merge node.
- Expand “Input Data” for each incoming connection.
- Look at the top‑level keys (json, binary) and the structure of json.
- Note any array vs object, missing keys, or presence of binary.
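To make that comparison mechanical, a throwaway Code node can print a one-line description of each input’s container. This is an illustrative sketch, not an n8n built-in:

```javascript
// Describe the top-level container of a payload so a mismatch
// between two Merge inputs is obvious at a glance.
function describeShape(data) {
  if (data === null) return 'null';
  if (Array.isArray(data)) return `array[${data.length}]`;
  return typeof data; // 'object', 'string', 'number', …
}

console.log(describeShape([{ id: 1 }])); // → "array[1]"
```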
EEFA tip: Enable *“Save Execution Data”* only for failures in production to keep storage lean while still giving you a debug view.
Normalising the data shape
If the Set node itself reports a type mismatch while normalising, resolve that before continuing with the setup.
Typical mismatches & the node you need
| Situation | Normalisation node | Minimal configuration |
|---|---|---|
| Object → Array | Set (or Function) | Wrap the object in an array. |
| Array → Single object | Set | Pull the first element. |
| Binary ↔ JSON | Set (keep binary untouched) + Pass‑Through mode | No conversion, just adjust merge mode. |
| Different key names | Set | Rename the key to a common identifier. |
Example 1 – Convert a single webhook payload into an array
Purpose: The webhook returns an object, but the downstream Merge expects an array.
{
  "json": {
    "items": [
      {{ $json }}
    ]
  }
}
(Place this in a Set node before the Merge.)
Example 2 – Extract the first element from an array
Purpose: The upstream node returns an array, but you need a single object for the merge.
{
  "json": {{ $json[0] }}
}
Example 3 – Rename a mismatched key
Purpose: Align keys for a “Merge by key” operation.
{
  "json": {
    "id": {{ $json.userId }}
  }
}
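The same rename can live in a Code node instead of a Set node. A minimal sketch, using the field names from the example above:

```javascript
// Rename userId → id so both streams share the merge key.
const renameKey = ({ userId, ...rest }) => ({ id: userId, ...rest });
```

In a Code node this would be applied per item, e.g. `return $input.all().map(item => ({ json: renameKey(item.json) }));`.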
Choosing the correct Merge mode
| Mode | When to use | Required input shape |
|---|---|---|
| Append | Concatenate two *arrays* of objects | Both inputs → [{…}] |
| Merge (by key) | Combine rows based on a common field | Both inputs → arrays with matching key |
| Wait | Keep both objects separate (e.g., binary + JSON) | Any shape, but Pass‑Through must be selected |
| Pass‑Through | Forward one branch unchanged while the other is ignored | Useful for binary‑only merges |
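Conceptually, the two most common modes behave like the following plain-JavaScript sketch (this models the behaviour; it is not n8n’s actual source):

```javascript
// Append: concatenate two arrays of rows.
const append = (a, b) => [...a, ...b];

// Merge by key: join fields from b onto rows of a that share the same key value.
function mergeByKey(a, b, key) {
  const index = new Map(b.map(row => [row[key], row]));
  return a.map(row => ({ ...row, ...(index.get(row[key]) ?? {}) }));
}
```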
Real‑world example: Merging a paginated API response with a static config object
[HTTP Request] (page 1) → [Set] → Merge (input #1)
[Set (static config)] → Merge (input #2)
Problem: HTTP Request returns [{id:…, name:…}]; the config node returns {env: "prod"} → type mismatch.
Step 1 – Wrap the config object in an array
Purpose: Align the config payload with the API array shape.
{
  "json": {
    "items": [
      { "env": "prod" }
    ]
  }
}
Step 2 – Use Append mode on the Merge node
Both inputs are now arrays, so the Merge node can concatenate them. If the Function node used in the next step throws a reference error, resolve it before continuing.
Step 3 – Add the config field to each API record
Purpose: Enrich each API item with the environment value.
// Code node (Run Once for All Items), placed after the Merge.
// Assumes the last merged item is the wrapped config: { items: [{ env: "prod" }] }.
const all = $input.all();
const env = all[all.length - 1].json.items[0].env;
return all.slice(0, -1).map(item => ({ json: { ...item.json, env } }));
Result:
[
  { "id": 1, "name": "A", "env": "prod" },
  { "id": 2, "name": "B", "env": "prod" }
]
Checklist – before you hit Execute
- Both incoming branches output the same top‑level type (json or binary).
- For Append or Merge, both are arrays of objects.
- All required key fields exist in both arrays (when merging by key).
- Any binary data is isolated or the Merge mode is set to Pass‑Through.
- Test with a minimal data set to avoid rate‑limit or memory issues.
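The first two checklist items can be enforced with a small guard in a Code node placed just before the Merge. An illustrative sketch:

```javascript
// Fail fast if either stream is not an array of objects,
// mirroring the check the Merge node will apply anyway.
function assertMergeable(a, b) {
  const ok = x => Array.isArray(x) && x.every(r => r !== null && typeof r === 'object');
  if (!ok(a) || !ok(b)) {
    throw new Error('Merge inputs must both be arrays of objects');
  }
}
```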
EEFA (Experience, Errors, Fixes, Advice) notes
- Production warning: Merging very large arrays (> 10 k records) can exhaust memory. Use Pagination + SplitInBatches before the Merge.
- Error nuance: The message may read “Expected input to be of type
array, gotobject”. It’s the same root cause – mismatched container type. - Provider constraint: Some APIs (e.g., Airtable) always return an object with a
recordsarray; you must unwrap it ($json["records"]) before merging with another source. - Why the fix works: The Merge node internally runs
Array.isArray()on each input. Aligning both payloads to arrays satisfies this check, allowing the subsequentconcatorlodash.mergeto run without error. - If you encounter any n8n function node syntax error resolve them before continuing with the setup.
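The Airtable-style unwrap mentioned above can be sketched like this (the records field name follows Airtable’s response format; the helper itself is illustrative):

```javascript
// Unwrap { records: [...] } into a bare array before handing it to Merge.
function unwrapRecords(payload) {
  return Array.isArray(payload.records) ? payload.records : [payload];
}
```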
Conclusion
The “Merge node type mismatch” error is always a shape problem: the two streams feeding the node don’t share the same top‑level structure. By inspecting the execution data, normalising each branch to a matching type (usually an array of objects), and selecting the appropriate Merge mode, you can resolve the error in minutes and keep your workflow reliable in production.
Key takeaways
- Inspect the exact payload shape in Execution > Data.
- Standardise both inputs with a tiny Set/Function node.
- Match the Merge mode to the normalised shape.
- Validate with a quick test run before scaling.
Apply this pattern whenever you encounter type mismatches, and your n8n workflows will stay robust and maintainable.



