1. Introduction

When manipulating and analyzing data in Linux, we often need to convert between formats. A common task is converting JSON (JavaScript Object Notation) to CSV (Comma-Separated Values), and we can accomplish it with several standard command-line tools.

In this tutorial, we’ll understand the importance of this conversion and explore the different methods to accomplish it using the Linux command-line tools.

2. Why Convert JSON to CSV?

To begin, let’s see why it’s necessary to convert JSON to CSV:

  • Data Compatibility: as the CSV format is highly compatible, widely supported, and easily manipulated by various applications and tools, it makes the data more accessible and shareable.
  • Data Analysis: as many data analysis and processing tools have built-in support for CSV, converting JSON to CSV allows us to utilize these tools efficiently, facilitating insightful data analysis.
  • Data Export: as some databases and applications prefer data in CSV format for import, converting JSON to CSV simplifies the data export process, ensuring seamless integration.

Next, let’s explore the different methods for converting JSON to CSV using Linux commands.

3. Using the jq Tool

jq is a simple yet powerful command-line JSON processor for Linux. It allows us to parse, filter, and manipulate JSON data.
Let’s use the following data.json as the input file throughout this article:

$ cat data.json
{
  "employees": [
    {
      "name": "Alice",
      "department": "HR",
      "age": 28
    },
    {
      "name": "Bob",
      "department": "IT",
      "age": 32
    }
  ]
}

Now, let’s use jq to convert JSON to CSV:

$ jq -r '.employees[] | [.name, .department, .age] | @csv' data.json > data.csv


  • -r instructs jq to operate in raw output mode, which prints strings without JSON quoting, so the output is plain CSV text
  • .employees[] iterates over each object in the employees array
  • [.name, .department, .age] | @csv builds an array of the three values, and the @csv filter handles proper quoting and escaping while converting them to a CSV row

The redirection in the command above already saves the result as data.csv, so let’s verify it:

$ cat data.csv
"Alice","HR",28
"Bob","IT",32

Further, we can adjust the JSON path in the filter to match our data structure. Since jq streams its input, it also handles large files efficiently, making it a good choice for data processing in performance-sensitive environments.
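The command above produces only data rows. If we also want a header, jq lets us emit a literal array before the employee rows. Here’s a sketch of that variant, with the sample data recreated inline so the snippet runs on its own:

```shell
# Recreate the article's data.json so the example is self-contained
cat > data.json <<'EOF'
{
  "employees": [
    { "name": "Alice", "department": "HR", "age": 28 },
    { "name": "Bob", "department": "IT", "age": 32 }
  ]
}
EOF

# Emit a literal header array first, then one CSV row per employee
jq -r '["name", "department", "age"], (.employees[] | [.name, .department, .age]) | @csv' data.json > data.csv
cat data.csv
```

The comma operator outputs the header array first, and @csv then formats both the header and each employee row.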

4. Using the awk Tool

Alternatively, we can use awk, a versatile text-processing tool in Linux, to convert JSON to CSV. It scans the file line by line and splits each line into fields.

Using the same data.json, let’s extract the values and build the CSV:

$ awk -F '[:,]' 'BEGIN { print "name,department,age" }
/"name"/ { gsub(/[" ]/, "", $2); name = $2 }
/"department"/ { gsub(/[" ]/, "", $2); department = $2 }
/"age"/ { gsub(/[" ]/, "", $2); age = $2
    print name "," department "," age }' data.json > data.csv


  • -F '[:,]' sets both the colon and the comma as field delimiters, effectively splitting each JSON line into key and value.
  • BEGIN { print "name,department,age" } initializes the output CSV with column headers before any input is read.
  • gsub(/[" ]/, "", $2) removes unwanted spaces and double quotes from the extracted value.
  • print name "," department "," age prints one comma-separated record per employee once the age field is seen.

Again, the redirection saves the output as data.csv:

$ cat data.csv
name,department,age
Alice,HR,28
Bob,IT,32

In some cases, we may need to customize this command to match the structure of our JSON data.

5. Using Python

Sometimes, we might not have jq or awk installed on our Linux system. In such cases, Python is a versatile alternative to convert JSON to CSV.
Let’s see how to use a Python script to achieve this:

$ python3 - <<'EOF'
import csv, json

with open('data.json') as f:
    employees = json.load(f)['employees']

with open('data.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    writer.writerow(['name', 'department', 'age'])
    for e in employees:
        writer.writerow([e['name'], e['department'], e['age']])
EOF

Consequently, this script loads the JSON data, extracts each employee record, and writes the header and rows to a CSV file named data.csv:

$ cat data.csv
name,department,age
Alice,HR,28
Bob,IT,32

Let’s remember that this approach requires Python installed on our Linux system.
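If our JSON objects carry many or varying keys, listing every field by hand becomes tedious. As a sketch of an alternative, the standard library’s csv.DictWriter can map dictionary keys to columns automatically; the sample data is inlined here so the snippet is self-contained:

```python
import csv

# Inline copy of the article's data.json, for a self-contained example
data = {
    "employees": [
        {"name": "Alice", "department": "HR", "age": 28},
        {"name": "Bob", "department": "IT", "age": 32},
    ]
}
employees = data["employees"]

# DictWriter derives the columns from fieldnames; here we take the
# keys of the first record, so no field names are hard-coded twice
with open('data.csv', 'w', newline='') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=employees[0].keys())
    writer.writeheader()
    writer.writerows(employees)
```

Since the columns come from fieldnames, adapting this to a different JSON structure only requires changing the key list (or keeping the first record’s keys, as above).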

6. Using sed and paste

Another method to convert JSON to CSV is to combine the sed and paste commands. The sed command is a powerful text stream editor in Linux, widely used for text manipulation. In this case, we can use sed to extract the values and strip away the surrounding JSON syntax, such as keys, quotes, and commas.

Now, let’s see how to use the sed command:

$ sed -n '/"name":/s/^[[:space:]]*"name": "\(.*\)",$/\1/gp;/"department":/s/^[[:space:]]*"department": "\(.*\)",$/\1/gp;/"age":/s/^[[:space:]]*"age": \([0-9]*\)$/\1/gp' data.json | paste -d, - - -


  • /"name":/ matches the lines containing the "name" key
  • s/^[[:space:]]*"name": "\(.*\)",$/\1/ is a substitution command that extracts the value associated with "name": it strips the leading whitespace, the key, and the surrounding double quotes, keeping only the captured value
  • the g modifier applies the substitution to every occurrence on the line, and p prints a line only after a successful substitution, which is required because -n suppresses the default output
  • paste -d, - - - merges every three consecutive lines of the sed output into one record, using a comma (,) as the delimiter

In this manner, the combination of sed and paste cleans up and transforms the JSON data into a CSV format.
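The paste output above lacks a header row. One way to add it is to group an echo with the pipeline in a compound command and redirect both to data.csv; here’s a sketch, with data.json recreated inline so it runs standalone:

```shell
# Recreate the article's data.json so the example is self-contained
cat > data.json <<'EOF'
{
  "employees": [
    {
      "name": "Alice",
      "department": "HR",
      "age": 28
    },
    {
      "name": "Bob",
      "department": "IT",
      "age": 32
    }
  ]
}
EOF

# Print a header row, then the extracted and pasted values, into data.csv
{
  echo "name,department,age"
  sed -n '/"name":/s/^[[:space:]]*"name": "\(.*\)",$/\1/gp;/"department":/s/^[[:space:]]*"department": "\(.*\)",$/\1/gp;/"age":/s/^[[:space:]]*"age": \([0-9]*\)$/\1/gp' data.json | paste -d, - - -
} > data.csv
cat data.csv
```

The braces group both commands so a single redirection captures the header and the data rows together.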

7. Using the dasel Command

Another method is dasel, a command-line utility that manipulates data in various formats, including JSON. Fundamentally, dasel simplifies this conversion with a user-friendly command structure.

Now, let’s explore how to convert JSON to CSV using the dasel command:

$ dasel -i data.json to csv -a employees -p name,department,age


  • -i data.json specifies the input JSON file.
  • to csv instructs dasel to convert the JSON data to CSV.
  • -a employees specifies the path to the array we want to convert.
  • -p name,department,age specifies the properties we want to include in the CSV.

As we can see, whether we need to extract, transform, or convert data, dasel is a valuable addition to our data toolkit.

8. Conclusion

In conclusion, this article showed how converting JSON to CSV in a Linux environment is a valuable skill when working with different data formats and tools. In addition, we learned to use tools such as jq, awk, sed with paste, Python, and dasel to perform this conversion efficiently without resorting to online converters.

Hence, the choice of method ultimately depends on our specific requirements and the tools available on the system.
