JSON to CSV Conversion Guide
Complete JSON to CSV Conversion Guide | 3 Methods to Export Excel-Readable Format in 1 Minute
Your exported JSON data is a mess, and your manager wants it in an Excel-readable format? Chinese characters turn into gibberish, columns don't align, and you're not sure how to handle complex structures? Don't panic! This article teaches you 3 practical methods, from one-click online tools to Python automation, and even direct export from frontend JavaScript. Whether you're a data analyst, backend engineer, or product manager, you'll find the right solution in 1 minute!
Why Convert JSON to CSV?
Differences Between JSON and CSV
JSON and CSV are two completely different data formats:
JSON (JavaScript Object Notation):
- Hierarchical structure, supports nested objects and arrays
- Suitable for data exchange between programs
- Human-readable but not user-friendly for non-technical people
CSV (Comma-Separated Values):
- Flat table structure, one row per record
- Can be opened directly in Excel, Google Sheets
- Simple and intuitive, suitable for data analysis and reports
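The contrast is easiest to see with the same record in both formats. A minimal sketch using only Python's standard library (the sample record is made up for illustration):

```python
import csv
import io
import json

record = {"name": "Amy", "age": 28, "city": "Taipei"}

# JSON: hierarchical, program-friendly
print(json.dumps(record))  # {"name": "Amy", "age": 28, "city": "Taipei"}

# CSV: one flat row per record, spreadsheet-friendly
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
print(buf.getvalue())
```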
Three Common Conversion Scenarios
Scenario 1: Reports for Non-Technical Staff
Marketing department needs to view user data from API responses—JSON is too complex, converting to CSV for Excel is most intuitive.
Scenario 2: Data Analysis and Visualization
Import backend API data into analysis tools like Tableau, Power BI—CSV is the most universal format.
Scenario 3: System Integration and Data Migration
Old system exports JSON, new system only accepts CSV—need to batch convert thousands of records.
Preparation Before Conversion
Before starting the conversion, we recommend validating the data format with JSON Parser. Confirm that:
- JSON syntax is correct (no extra commas, brackets matched)
- Data structure is consistent (every object in array has same fields)
- No special characters or encoding issues
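If you prefer to script these checks, the list above can be sketched in Python (the function name and error messages are our own, not from any particular library):

```python
import json

def validate_for_csv(path):
    # 1. Syntax check: json.load raises on trailing commas, unbalanced brackets, etc.
    with open(path, 'r', encoding='utf-8') as f:
        data = json.load(f)

    # 2. Structure check: expect an array of objects sharing the same fields
    if not isinstance(data, list) or not all(isinstance(x, dict) for x in data):
        raise ValueError("Expected a JSON array of objects")
    if not data:
        return data
    keys = set(data[0].keys())
    if any(set(x.keys()) != keys for x in data):
        raise ValueError("Objects in the array have inconsistent fields")
    return data
```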
Method 1: Quick Conversion with Online Tools
Tool Master JSON Parser (Recommended)
The fastest method is an online tool: completely free, no installation required.
Steps:
1. Go to JSON Parser
2. Paste JSON data
3. Click "Convert to CSV" button
4. Download CSV file
Advantages:
- ✅ 100% local processing, data not uploaded to server
- ✅ Auto-detect character encoding (UTF-8 BOM)
- ✅ Supports automatic flattening of complex nested structures
- ✅ Customizable delimiters (comma, semicolon, tab)
Other Online Tool Options
ConvertCSV:
- Supports multiple format conversions (JSON, CSV, XML, YAML)
- Provides API service (paid)
JSON-CSV.com:
- Simple interface, intuitive operation
- But doesn't support complex nested structures
Selection Advice:
If data contains sensitive information (like customer personal data, financial data), strongly recommend using Tool Master's local processing tool to ensure data security.
Method 2: Python Automation Processing
Using Built-in csv Module
For simple JSON arrays, Python built-in modules can handle it:
```python
import json
import csv

# Read JSON file
with open('data.json', 'r', encoding='utf-8') as f:
    data = json.load(f)

# Write CSV
with open('output.csv', 'w', encoding='utf-8-sig', newline='') as f:
    if data:
        writer = csv.DictWriter(f, fieldnames=data[0].keys())
        writer.writeheader()
        writer.writerows(data)

print("✅ Conversion complete!")
```
Key Points:
- encoding='utf-8-sig': Solves Chinese character gibberish in Excel
- newline='': Avoids extra blank lines on Windows
- DictWriter: Automatically handles dictionary to table conversion
Using pandas for Complex Structures
For nested JSON (objects containing arrays or sub-objects), pandas is more powerful:
```python
import pandas as pd
import json

# Read JSON
with open('nested_data.json', 'r', encoding='utf-8') as f:
    data = json.load(f)

# Method 1: Normalize the nested structure into flat columns
df = pd.json_normalize(data)

# Method 2 (alternative): keep nested values, serialized as strings
# df = pd.DataFrame(data)
# df = df.map(lambda x: str(x) if isinstance(x, (dict, list)) else x)  # use .applymap() on pandas < 2.1

# Save as CSV
df.to_csv('output.csv', index=False, encoding='utf-8-sig')
```
json_normalize Example:
Original JSON:
```json
[
  {
    "name": "John",
    "age": 30,
    "address": {
      "city": "New York",
      "district": "Manhattan"
    }
  }
]
```

Converted CSV:

```csv
name,age,address.city,address.district
John,30,New York,Manhattan
```
Batch Processing Multiple Files
```python
import glob
import pandas as pd

# Convert every JSON file in the data/ folder
for file in glob.glob('data/*.json'):
    df = pd.read_json(file)
    output = file.replace('.json', '.csv')
    df.to_csv(output, index=False, encoding='utf-8-sig')
    print(f"✅ {file} conversion complete")
```
Method 3: JavaScript Frontend Conversion
Using json2csv Package (Node.js)
In Node.js environment:
```javascript
const fs = require('fs');
const { Parser } = require('json2csv');

// Read JSON
const data = JSON.parse(fs.readFileSync('data.json', 'utf8'));

// Set fields
const fields = ['name', 'age', 'city'];
const opts = { fields };

// Convert
const parser = new Parser(opts);
const csv = parser.parse(data);

// Save with a UTF-8 BOM so Excel detects the encoding
fs.writeFileSync('output.csv', '\uFEFF' + csv);
console.log('✅ Conversion complete!');
```
Key Points:
- '\uFEFF': UTF-8 BOM, solves Chinese character gibberish in Excel
- fields: Can customize field order and filtering
Direct Export in Browser Frontend
Let users download CSV directly on webpage:
```javascript
function jsonToCSV(jsonData) {
  const array = typeof jsonData !== 'object' ? JSON.parse(jsonData) : jsonData;
  // Get field names from the first record
  const headers = Object.keys(array[0]);
  // Build CSV content
  let csv = headers.join(',') + '\n';
  array.forEach(obj => {
    const row = headers.map(header => {
      const value = obj[header] ?? '';
      // Quote values containing commas, newlines, or double quotes;
      // embedded double quotes are escaped by doubling them
      return typeof value === 'string' && /[",\n]/.test(value)
        ? `"${value.replace(/"/g, '""')}"`
        : value;
    });
    csv += row.join(',') + '\n';
  });
  return csv;
}

// Trigger download
function downloadCSV(csv, filename) {
  const blob = new Blob(['\uFEFF' + csv], { type: 'text/csv;charset=utf-8;' });
  const link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = filename;
  link.click();
}

// Usage example
const jsonData = [
  { name: 'John Doe', age: 25, city: 'New York' },
  { name: 'Jane Smith', age: 30, city: 'Los Angeles' }
];

const csv = jsonToCSV(jsonData);
downloadCSV(csv, 'export.csv');
```
Integration into React Application
```jsx
import React from 'react';

function ExportButton({ data }) {
  const handleExport = () => {
    const csv = jsonToCSV(data);
    downloadCSV(csv, `export_${new Date().getTime()}.csv`);
  };

  return (
    <button onClick={handleExport} className="btn-export">
      Export CSV
    </button>
  );
}

export default ExportButton;
```
Handling Complex JSON Structures (Nested Objects and Arrays)
Flattening Nested Objects
Original Data:
```json
{
  "user": {
    "name": "John",
    "profile": {
      "age": 30,
      "address": {
        "city": "New York",
        "zipcode": "10001"
      }
    }
  }
}
```
Flattening Strategy 1: Dot Separation (Recommended)
```csv
user.name,user.profile.age,user.profile.address.city,user.profile.address.zipcode
John,30,New York,10001
```
Python Implementation:
```python
import pandas as pd

def flatten_json(nested_json, prefix=''):
    result = {}
    for key, value in nested_json.items():
        new_key = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested objects, extending the dotted prefix
            result.update(flatten_json(value, new_key))
        else:
            result[new_key] = value
    return result

# Usage: flatten every object in a JSON array
flat_data = [flatten_json(item) for item in data]
df = pd.DataFrame(flat_data)
```
Flattening Strategy 2: Underscore Separation
```csv
user_name,user_profile_age,user_profile_address_city
John,30,New York
```
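With pandas, the underscore style above needs no custom code: `json_normalize` accepts a `sep` parameter. A minimal sketch with made-up sample data:

```python
import pandas as pd

data = [{"user": {"name": "John", "profile": {"age": 30}}}]

# sep='_' joins nested keys with underscores instead of the default dots
df = pd.json_normalize(data, sep='_')
print(list(df.columns))  # ['user_name', 'user_profile_age']
```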
Handling Array Fields
Original Data:
```json
[
  {
    "name": "John",
    "skills": ["Python", "JavaScript", "SQL"]
  }
]
```
Method 1: Convert to String (Simple)
```csv
name,skills
John,"Python, JavaScript, SQL"
```
Method 2: Explode into Multiple Rows (Suitable for analysis)
```csv
name,skill
John,Python
John,JavaScript
John,SQL
```
Python Implementation:
```python
# Method 1: join each list into a single string
df['skills'] = df['skills'].apply(lambda x: ', '.join(x) if isinstance(x, list) else x)

# Method 2 (alternative): one row per list element
df = df.explode('skills')
```
🎯 Try Tool Master's JSON Tool Combination!
Solving complex data conversion problems requires multiple tools working together:
| Tool | Function | Use Case |
|---|---|---|
| JSON Parser | Validate, format, fix errors | Check data format before conversion |
| Unit Converter | Unified unit processing | Convert numeric units in CSV |
| Color Shower | Color code conversion | Process data containing color codes |
💡 Practical Workflow:
1. First use JSON parser to validate format
2. Convert to CSV
3. Use unit converter to standardize numeric units
4. Finally import into Excel for analysis
👉 Use JSON Parser Now | Explore More Developer Tools
Chinese Encoding Problem Solutions
Why Chinese Characters Become Gibberish in Excel
Excel on Windows opens CSV files using the system's legacy ANSI code page (for example, Big5 on Traditional Chinese Windows), while modern JSON data is almost always UTF-8, so opening the file directly produces gibberish.
Solution 1: Add UTF-8 BOM
BOM (Byte Order Mark) is a special marker at the beginning of a file that tells Excel the file is UTF-8.
Python Implementation:
```python
# Method 1: Add BOM when writing
df.to_csv('output.csv', index=False, encoding='utf-8-sig')

# Method 2: Add BOM to an existing UTF-8 file
with open('input.csv', 'r', encoding='utf-8') as f:
    content = f.read()
with open('output.csv', 'w', encoding='utf-8-sig') as f:
    f.write(content)
```
JavaScript Implementation:
```javascript
const csv = jsonToCSV(data);
const csvWithBOM = '\uFEFF' + csv; // \uFEFF is the UTF-8 BOM
```
Solution 2: Excel Import Wizard
If the file cannot be modified, use Excel's built-in import feature:
- Open Excel → Data tab
- Click From Text/CSV
- After selecting file, choose "65001: Unicode (UTF-8)" for File Origin
- Click Load
Solution 3: Google Sheets
Simplest method: upload to Google Sheets, it automatically detects UTF-8 encoding.
CSV Format Optimization Tips
Handling Special Characters
Three types of characters in CSV need special handling:
1. Comma (,):
Values containing commas must be wrapped in double quotes.
```csv
name,address
John,"New York, Manhattan, 5th Avenue"
```
2. Newline (\n):
Preserving newlines requires double quote wrapping.
```csv
name,description
Product A,"Line 1
Line 2
Line 3"
```
3. Double Quote ("):
Double quotes need escaping with two double quotes.
```csv
name,quote
John,"He said: ""Hello"""
```
Python Auto-Handling:
```python
import csv

# f is an open file handle; headers is the list of field names
# QUOTE_MINIMAL: quote only when necessary (the default)
# QUOTE_ALL: quote every field
writer = csv.DictWriter(f, fieldnames=headers, quoting=csv.QUOTE_MINIMAL)
```
Custom Delimiters
Some regions use different delimiters:
Semicolon (;): common in Europe, where the comma serves as the decimal separator.

```python
df.to_csv('output.csv', sep=';', encoding='utf-8-sig')
```
Tab (\t): useful when the data contains both commas and semicolons.

```python
df.to_csv('output.tsv', sep='\t', encoding='utf-8-sig')
```
Real-World Use Cases (Data Analysis, Report Generation, System Integration)
Case 1: E-commerce Order Report
Requirement: Daily fetch order data from API (JSON), convert to CSV for finance department reconciliation.
JSON Example:
```json
[
  {
    "order_id": "A001",
    "customer": "John Doe",
    "items": [
      {"product": "Product A", "price": 100, "qty": 2},
      {"product": "Product B", "price": 200, "qty": 1}
    ],
    "total": 400
  }
]
```
Python Script (Auto-execute daily):
```python
import requests
import pandas as pd
from datetime import datetime

# Fetch data from API
response = requests.get('https://api.example.com/orders')
data = response.json()

# Flatten the items array: one row per order line
rows = []
for order in data:
    for item in order['items']:
        rows.append({
            'Order ID': order['order_id'],
            'Customer': order['customer'],
            'Product': item['product'],
            'Unit Price': item['price'],
            'Quantity': item['qty'],
            'Subtotal': item['price'] * item['qty']
        })

df = pd.DataFrame(rows)
today = datetime.now().strftime('%Y%m%d')
df.to_csv(f'orders_{today}.csv', index=False, encoding='utf-8-sig')
```
Case 2: Social Media Data Analysis
Requirement: Fetch post data from Instagram API, import into Tableau for analysis.
Key Points:
- Flatten hashtags array (convert to comma-separated string)
- Convert timestamp to readable date format
- Remove emoji to avoid encoding issues
```python
df['hashtags'] = df['hashtags'].apply(lambda x: ', '.join(x) if x else '')
df['created_at'] = pd.to_datetime(df['timestamp'], unit='s')
# Caution: encoding to ASCII strips ALL non-ASCII characters, including Chinese text, not just emoji
df['caption'] = df['caption'].apply(lambda x: x.encode('ascii', 'ignore').decode('ascii'))
```
Case 3: Multilingual Product Catalog
Requirement: Export product data from website backend (JSON), convert to multilingual CSV for translation agency.
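No script is shown for this case, but a minimal sketch could look like the following. The field names (`sku`, `translations`) and language codes are hypothetical; adjust them to match your backend's actual export format:

```python
import pandas as pd

# Hypothetical backend export: one translations object per product
products = [
    {"sku": "P001", "translations": {"en": "Red Mug", "zh-TW": "紅色馬克杯", "ja": "赤いマグ"}},
    {"sku": "P002", "translations": {"en": "Blue Mug", "zh-TW": "藍色馬克杯", "ja": "青いマグ"}},
]

# Flatten the translations object into one column per language
df = pd.json_normalize(products)
df.columns = [c.replace('translations.', '') for c in df.columns]

# utf-8-sig keeps Chinese and Japanese readable when the agency opens the file in Excel
df.to_csv('catalog_i18n.csv', index=False, encoding='utf-8-sig')
```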
Common Errors and Solutions
Error 1: KeyError (Field Does Not Exist)
Error Message:
```
KeyError: 'email'
```
Cause: Inconsistent object fields in JSON array.
Solution:
```python
# Use get() to supply a default value when a field is missing
rows = []
for item in data:
    rows.append({
        'name': item.get('name', 'Not Provided'),
        'email': item.get('email', 'Not Provided'),
        'phone': item.get('phone', 'Not Provided')
    })
```
Error 2: UnicodeDecodeError (Encoding Error)
Error Message:
```
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff
```
Cause: File encoding is not UTF-8.
Solution:
```python
import json

# Try different encodings until one decodes cleanly
encodings = ['utf-8', 'big5', 'gbk', 'latin1']
for enc in encodings:
    try:
        with open('data.json', 'r', encoding=enc) as f:
            data = json.load(f)
        print(f"✅ Using encoding: {enc}")
        break
    except (UnicodeDecodeError, json.JSONDecodeError):
        continue
```
Error 3: CSV Columns Misaligned in Excel
Cause: Data contains unescaped commas or newlines.
Solution:
```python
import csv

# pandas escapes embedded commas and newlines automatically;
# QUOTE_NONNUMERIC additionally quotes every non-numeric field
df.to_csv('output.csv', index=False, encoding='utf-8-sig', quoting=csv.QUOTE_NONNUMERIC)
```
Summary: Choosing the Best Conversion Method
| Method | Use Case | Advantages | Limitations |
|---|---|---|---|
| Online Tool | One-time conversion, small files (<10MB) | Zero code, instantly available | Not suitable for automation |
| Python | Batch processing, automated scheduling | Powerful, customizable | Requires programming basics |
| JavaScript | Frontend web export | Great user experience | Browser memory limits |
Recommended Process:
1. Small Test: First test with JSON Parser online tool
2. Verify Format: Confirm conversion results meet requirements
3. Automate Deployment: Write Python script, schedule daily execution
If encountering complex syntax errors, recommend first reading Complete JSON Syntax Error Solutions to fix data, then proceed with conversion. For readers wanting to dive deeper into JSON processing, refer to Complete JSON Parser Guide for more advanced techniques.
Reference Resources
Official Documentation:
- Python csv Module Official Documentation
- pandas.DataFrame.to_csv
- RFC 4180 - CSV Format Standard
Recommended Tools:
- Tool Master JSON Parser - Free online tool, local processing protects privacy
- json2csv NPM Package - Most popular Node.js conversion package
Further Reading:
- Complete JSON Parser Guide - 7 Essential Tips from Beginner to Expert
- JSON Formatter Best Practices - Learn How to Optimize JSON Format
- JSON Beautifier Usage Guide - Beautify JSON Data to Improve Readability
- JSON Schema Validation Practical Guide - Build Automated Validation Systems
- Complete JSON Syntax Error Solutions - 30+ Error Examples with Fixes
- JSON Tree Viewer Visualization Guide - Visualize Complex JSON Structures
Try Tool Master JSON Tools Now, experience 100% local processing safety and convenience! No registration required, completely free, data never leaves your browser.