How to Convert JSON to CSV: Methods, Tools, and What to Consider
JSON and CSV are two of the most common data formats you'll encounter in tech, but they serve very different purposes. Converting between them is a routine task — but the right method depends heavily on your data structure, technical comfort level, and what you plan to do with the output.
What Makes JSON and CSV Different
JSON (JavaScript Object Notation) is a hierarchical, flexible format. It supports nested objects, arrays, and mixed data types within a single file. It's the go-to format for APIs, configuration files, and web applications.
CSV (Comma-Separated Values) is flat. Every row has the same columns, and there's no concept of nesting. It's ideal for spreadsheets, database imports, and data analysis tools like Excel or Google Sheets.
That structural difference is the core challenge when converting JSON to CSV. A simple, flat JSON file converts cleanly. A deeply nested JSON with arrays inside objects requires decisions about how to "flatten" the data before it can sit neatly in rows and columns.
When the Conversion Is Straightforward
If your JSON looks like a simple list of objects with consistent keys, conversion is almost mechanical:
[ {"name": "Alice", "age": 30, "city": "Austin"}, {"name": "Bob", "age": 25, "city": "Denver"} ] This maps directly to CSV rows with headers name, age, city. Most tools handle this in seconds without any configuration.
When the Conversion Gets Complicated
Nested structures are where most conversion headaches begin. Consider:
[ {"name": "Alice", "address": {"city": "Austin", "zip": "78701"}} ] A flat CSV can't represent the address object natively. Tools will typically either:
- Flatten the keys — creating columns like `address.city` and `address.zip`
- Stringify nested values — dumping the entire nested object as a text string in one cell
- Expand arrays into multiple rows — which can multiply your row count significantly
Which approach is correct depends entirely on how you intend to use the resulting CSV.
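To make the first two strategies concrete, here's a small stdlib-only Python sketch. The record and the `flatten` helper are illustrative, not taken from any particular tool:

```python
import json

record = {"name": "Alice", "address": {"city": "Austin", "zip": "78701"}}

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into dot-notation keys."""
    flat = {}
    for key, value in obj.items():
        full_key = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{full_key}."))
        else:
            flat[full_key] = value
    return flat

# Strategy 1: flatten into dot-notation columns
print(flatten(record))
# {'name': 'Alice', 'address.city': 'Austin', 'address.zip': '78701'}

# Strategy 2: stringify the nested object into a single cell
print({"name": record["name"], "address": json.dumps(record["address"])})
```

The flattened version gives you queryable columns; the stringified version preserves the structure but pushes the parsing burden onto whoever consumes the CSV.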
Methods for Converting JSON to CSV 🔄
Online Tools
Browser-based converters like ConvertCSV, JSON to CSV Converter, or similar utilities let you paste or upload a JSON file and download the result. These work well for:
- One-off conversions
- Non-sensitive data
- Users who don't need scripting or automation
Most handle flat JSON reliably. Results with nested data vary by tool, so it's worth reviewing the output before trusting it.
Spreadsheet Applications
Google Sheets has no native JSON importer; you can pull JSON in via Google Apps Script (for example, a custom function that fetches and parses the data) or a third-party add-on. Microsoft Excel supports JSON import through Power Query (available in Excel 2016 and later on Windows):
- Go to Data → Get Data → From File → From JSON
- Use the Power Query editor to expand nested fields as needed
- Load the result into a worksheet
Power Query's visual interface makes it accessible for non-developers, and it handles moderate nesting reasonably well.
Python (for Developers and Data Analysts)
Python is one of the most flexible options, especially for complex or large datasets. The pandas library is the standard approach:
```python
import json

import pandas as pd

with open('data.json') as f:
    data = json.load(f)

df = pd.json_normalize(data)
df.to_csv('output.csv', index=False)
```

`pd.json_normalize()` automatically flattens nested structures using dot notation for column names. For deeply nested or irregular JSON, you may need to pre-process the data before normalizing it.
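When each record contains an array, `json_normalize` can also expand the array into one row per element via its `record_path` and `meta` parameters. A sketch with a hypothetical orders dataset:

```python
import pandas as pd

data = [
    {"name": "Alice", "orders": [{"id": 1, "total": 20.0}, {"id": 2, "total": 35.5}]},
    {"name": "Bob", "orders": [{"id": 3, "total": 12.0}]},
]

# record_path expands each array element into its own row;
# meta copies parent-level fields onto every expanded row
df = pd.json_normalize(data, record_path="orders", meta=["name"])
df.to_csv("orders.csv", index=False)
# The CSV now has three rows: two for Alice's orders, one for Bob's
```

Note that this is exactly the "expand arrays into multiple rows" strategy described earlier, so the row count of the output no longer matches the record count of the input.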
Command-Line Tools
jq is a lightweight command-line JSON processor. Combined with tools like csvkit or simple shell piping, it can handle transformations without writing full scripts. This suits developers working in terminal environments or building data pipelines.
Node.js
For JavaScript developers, libraries like json2csv (available via npm) offer programmatic conversion with fine-grained control over field selection, flattening behavior, and output formatting.
Key Variables That Affect Your Approach
| Factor | Impact on Method Choice |
|---|---|
| JSON structure | Flat vs. nested determines complexity |
| File size | Large files may require scripting, not browser tools |
| Data sensitivity | Avoid online tools for private or confidential data |
| Frequency | One-time vs. recurring conversions favor different tools |
| Technical skill | No-code tools vs. Python/CLI vs. custom scripts |
| Downstream use | Excel import vs. database load vs. analysis pipeline |
What "Correct" Output Actually Looks Like
There's no universal standard for how nested JSON should appear in CSV. A flattened column named `address.city` is perfectly valid in one context and confusing in another. If you're feeding the CSV into a database, you may want specific column names. If it's going into a BI tool, the structure may need to match an existing schema.
This also means that validating your output matters — especially when arrays are involved. A JSON record with three items in an array might become three rows in CSV, which can unintentionally change aggregation results if you're not expecting it. 📊
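A minimal pandas sketch of that pitfall, using a hypothetical record with one customer and two orders:

```python
import pandas as pd

# One customer, two orders: a hypothetical record for illustration
data = [{"name": "Alice", "credit_limit": 500,
         "orders": [{"total": 20}, {"total": 35}]}]

df = pd.json_normalize(data, record_path="orders", meta=["name", "credit_limit"])

print(len(data))                 # 1 JSON record
print(len(df))                   # 2 CSV rows after array expansion
print(df["credit_limit"].sum())  # 1000, not 500: the parent value repeats on every row
```

Any downstream sum or count over a parent-level column will be inflated unless you deduplicate by the parent key first.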
Handling Edge Cases
A few things commonly cause silent conversion errors:
- Missing keys — not all JSON objects have the same fields; some tools fill blanks, others drop rows
- Special characters — commas, quotes, or newlines inside values must be properly escaped in CSV
- Data type coercion — JSON distinguishes between strings, numbers, and booleans; CSV treats everything as text unless the receiving application interprets it otherwise
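Python's standard-library `csv` module handles the first two cases if you collect the union of keys and let `DictWriter` do the quoting. A sketch with hypothetical records:

```python
import csv
import io

# Hypothetical records: 'city' is missing from the second object,
# and the first note contains a comma and quotes that need escaping
records = [
    {"name": "Alice", "city": "Austin", "note": 'said "hi", then left'},
    {"name": "Bob", "note": "no city on file"},
]

# Collect the union of keys so missing fields become blanks, not dropped rows
fieldnames = []
for rec in records:
    for key in rec:
        if key not in fieldnames:
            fieldnames.append(key)

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames, restval="")
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
# name,city,note
# Alice,Austin,"said ""hi"", then left"
# Bob,,no city on file
```

`restval=""` fills missing keys with empty cells, and the writer quotes and doubles embedded quotes per standard CSV rules, so the comma inside Alice's note doesn't break the column alignment.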
The right approach depends on how clean and consistent your source JSON is, what tooling you're comfortable with, and what the resulting CSV needs to do. Those specifics point toward very different paths. 🗂️