
1. Introduction

Importing data from a CSV (Comma-Separated Values) file into an SQL table is a common task, since CSV files are widely used to transfer and store data across applications. Because the format is lightweight and human-readable, we can easily load CSV data into an SQL database for further analysis, manipulation, or storage.

In this tutorial, we’ll focus on how to import data from a CSV file into an SQL table. In particular, we’ll explore this solution in databases such as PostgreSQL, MySQL, and SQL Server. For illustration, we’ll create a sample CSV file named employee_data.csv that contains a list of employees with columns for their ID, name, age, department, and hire date.

2. Sample Data

First, let’s save the sample data in a location on the system:

$ cd import-csv
$ cat employee_data.csv 
id,name,age,department,hire_date
1,John Doe,28,Sales,2021-05-01
2,Jane Smith,34,Marketing,2020-07-15
3,Emily Johnson,45,Engineering,2018-03-10
4,Michael Brown,29,Human Resources,2019-11-23
5,Sarah Davis,31,Finance,2022-01-12

Here, we save the sample data in a folder named import-csv. Let’s use the pwd command to get the full path of the directory containing the CSV file:

$ pwd
/home/kali/import-csv
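Before importing, it can help to sanity-check the file programmatically: confirm the header matches the target table's columns and count the data rows. Here's a small sketch using Python's standard-library csv module (the file contents are inlined so the snippet is self-contained; in practice we'd open /home/kali/import-csv/employee_data.csv instead):

```python
import csv
import io

# Sample contents of employee_data.csv, inlined for a self-contained example.
data = """id,name,age,department,hire_date
1,John Doe,28,Sales,2021-05-01
2,Jane Smith,34,Marketing,2020-07-15
3,Emily Johnson,45,Engineering,2018-03-10
4,Michael Brown,29,Human Resources,2019-11-23
5,Sarah Davis,31,Finance,2022-01-12
"""

reader = csv.DictReader(io.StringIO(data))

# The header row becomes the field names; these should match the table columns.
print(reader.fieldnames)  # ['id', 'name', 'age', 'department', 'hire_date']

rows = list(reader)
print(len(rows))          # 5 data rows (header excluded)
print(rows[0]["name"])    # John Doe
```

Checking the header up front avoids the most common import failure: a column-count or column-order mismatch between the file and the table.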

3. Implementation in PostgreSQL

In PostgreSQL, we can use the psql \copy meta-command to load data from files directly into an SQL table. The command reads the data from a CSV file on the client machine and inserts it into the specified table's columns.

To begin, we need to create a table structure in our database that matches the CSV file:

CREATE TABLE employees (
    id SERIAL PRIMARY KEY,
    name VARCHAR(50),
    age INTEGER,
    department VARCHAR(30),
    hire_date DATE
);

This command creates an employees table with columns that align with our CSV file’s structure.

Now that the table exists, we can load the data from employee_data.csv into the employees table using the \copy command:

\copy employees(id, name, age, department, hire_date)
FROM '/home/kali/import-csv/employee_data.csv'
DELIMITER ','
CSV HEADER;
COPY 5
Time: 0.004s

The command inserts the data from the employee_data.csv file into the employees table. We can verify the import by running a SELECT query to display the data:

SELECT * FROM employees;
+----+---------------+-----+-----------------+------------+
| id | name          | age | department      | hire_date  |
|----+---------------+-----+-----------------+------------|
| 1  | John Doe      | 28  | Sales           | 2021-05-01 |
| 2  | Jane Smith    | 34  | Marketing       | 2020-07-15 |
| 3  | Emily Johnson | 45  | Engineering     | 2018-03-10 |
| 4  | Michael Brown | 29  | Human Resources | 2019-11-23 |
| 5  | Sarah Davis   | 31  | Finance         | 2022-01-12 |
+----+---------------+-----+-----------------+------------+
SELECT 5
Time: 0.009s

The output shows all the rows from the employee_data.csv file successfully loaded into the PostgreSQL table.
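The \copy meta-command is psql-specific, but the underlying idea is portable: parse the CSV, skip the header, and insert the rows. As an illustrative sketch of those same steps in Python's standard library (with sqlite3 standing in for PostgreSQL, purely so the example is self-contained and runnable):

```python
import csv
import io
import sqlite3

# SQLite stands in for PostgreSQL here; the table mirrors the article's schema.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE employees (
    id INTEGER PRIMARY KEY,
    name TEXT,
    age INTEGER,
    department TEXT,
    hire_date TEXT
)""")

# Inlined CSV contents; in practice this would be open('employee_data.csv').
data = """id,name,age,department,hire_date
1,John Doe,28,Sales,2021-05-01
2,Jane Smith,34,Marketing,2020-07-15
3,Emily Johnson,45,Engineering,2018-03-10
4,Michael Brown,29,Human Resources,2019-11-23
5,Sarah Davis,31,Finance,2022-01-12
"""

reader = csv.reader(io.StringIO(data))
next(reader)  # skip the header row, like CSV HEADER in \copy

conn.executemany(
    "INSERT INTO employees (id, name, age, department, hire_date) VALUES (?, ?, ?, ?, ?)",
    reader,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(count)  # 5
```

Dedicated bulk-load commands like \copy do the same work far more efficiently for large files, since they avoid per-row statement overhead.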

4. Implementation in MySQL

In MySQL, there are several ways to insert a CSV file into an SQL table. In particular, we can use the LOAD DATA INFILE command to load data from a CSV file into a database table efficiently. Alternatively, we can use the mysqlimport command-line utility, which is well suited to batch imports of larger files.

4.1. Using the LOAD DATA INFILE Command

The LOAD DATA INFILE command is a native MySQL command that efficiently loads CSV data directly into a table. Additionally, we need to ensure that the table structure aligns with the CSV file’s format and that we have the appropriate permissions to access the CSV file.

First, let’s log in to MySQL as the root user to have full control over settings:

$ mysql -h 127.0.0.1 -P 3306 -u root -p
Enter password: 
Welcome to the MariaDB monitor.  Commands end with ; or \g.
...

Once we’re logged in, let’s run a query to enable local file loading:

SET GLOBAL local_infile=1;
Query OK, 0 rows affected (0.001 sec)

After enabling this setting, we exit MySQL:

quit

Then, we log back in as the root user (passing the --local-infile=1 option so the client side also permits local file loads) and create the target table, which will store the imported CSV data:

CREATE TABLE employees (
    id INT PRIMARY KEY,
    name VARCHAR(50),
    age INT,
    department VARCHAR(30),
    hire_date DATE
);

With the table structure in place, we use the LOAD DATA LOCAL INFILE command to import the CSV file:

LOAD DATA LOCAL INFILE '/home/kali/import-csv/employee_data.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, name, age, department, hire_date);
Query OK, 5 rows affected (0.006 sec)
Records: 5 Deleted: 0 Skipped: 0 Warnings: 0

Now, let’s explain what the command does:

  • FIELDS TERMINATED BY ',': sets commas as column separators
  • ENCLOSED BY '"': handles text fields enclosed in double quotes
  • LINES TERMINATED BY '\n': treats each line as a new record
  • IGNORE 1 ROWS: skips the CSV header row
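These options map naturally onto how any CSV parser is configured. As a rough analogy (Python's csv.reader shown purely for illustration), FIELDS TERMINATED BY corresponds to the delimiter, ENCLOSED BY to the quote character, and IGNORE 1 ROWS to skipping the first row:

```python
import csv
import io

# A record whose fields are quoted, as ENCLOSED BY '"' allows.
line = '4,"Michael Brown",29,"Human Resources",2019-11-23\n'

# delimiter=','  ~ FIELDS TERMINATED BY ','
# quotechar='"' ~ ENCLOSED BY '"'
row = next(csv.reader(io.StringIO(line), delimiter=",", quotechar='"'))
print(row)  # ['4', 'Michael Brown', '29', 'Human Resources', '2019-23'] is NOT the output; see test
```

The quotes are consumed by the parser, so the field values come back clean, without the enclosing double quotes.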

To confirm the data was successfully imported, let’s query the employees table:

SELECT * FROM employees;
+----+---------------+------+-----------------+------------+
| id | name          | age  | department      | hire_date  |
+----+---------------+------+-----------------+------------+
|  1 | John Doe      |   28 | Sales           | 2021-05-01 |
|  2 | Jane Smith    |   34 | Marketing       | 2020-07-15 |
|  3 | Emily Johnson |   45 | Engineering     | 2018-03-10 |
|  4 | Michael Brown |   29 | Human Resources | 2019-11-23 |
|  5 | Sarah Davis   |   31 | Finance         | 2022-01-12 |
+----+---------------+------+-----------------+------------+
5 rows in set (0.001 sec)

This shows that we’ve successfully imported CSV data into the MySQL table using the LOAD DATA INFILE command.

4.2. Using the mysqlimport Command-Line Utility

The mysqlimport command is a command-line utility for importing CSV files directly into MySQL tables. In particular, the tool is efficient for batch imports of large datasets. Furthermore, since it runs from the shell, we can use it to import CSV files into MySQL without entering the MySQL monitor at all.

First, let’s create a table in MySQL that matches the structure of the CSV file:

CREATE TABLE employee_data (
    id INT PRIMARY KEY,
    name VARCHAR(50),
    age INT,
    department VARCHAR(30),
    hire_date DATE
);

Next, we’ll use the mysqlimport command to load the CSV data into the employee_data table:

$ mysqlimport --local --fields-terminated-by=',' --lines-terminated-by='\n' --ignore-lines=1 --columns='id,name,age,department,hire_date' -u root -p --host=127.0.0.1 University /home/kali/import-csv/employee_data.csv
Enter password: 
University.employee_data: Records: 5  Deleted: 0  Skipped: 0  Warnings: 0

Now, let’s break down the command:

  • --local: specifies that the file is located on the local client machine
  • --fields-terminated-by=',': indicates fields are separated by commas
  • --lines-terminated-by='\n': defines each record as ending with a newline
  • --ignore-lines=1: skips the header row in the CSV file
  • --columns='id,name,age,department,hire_date': maps the CSV columns to the corresponding table columns
  • -u root -p: specifies the MySQL user (in this case, the root user) and prompts for the password
  • University: specifies the target database
  • /home/kali/import-csv/employee_data.csv: path to the CSV file

Additionally, mysqlimport strips the filename extension and uses the result as the name of the table into which to import the file’s contents.
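That naming rule is simply "base name of the file, minus the extension", which is why our file had to be named employee_data.csv to target the employee_data table. A one-line Python sketch of the same derivation:

```python
from pathlib import Path

# mysqlimport derives the target table name from the file name:
# the directory and the extension are stripped, leaving the stem.
table = Path("/home/kali/import-csv/employee_data.csv").stem
print(table)  # employee_data
```

Consequently, renaming the CSV file is the simplest way to point mysqlimport at a differently named table.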

Finally, let’s verify the imported data by querying the employee_data table in MySQL:

SELECT * FROM employee_data;
+----+---------------+------+-----------------+------------+
| id | name          | age  | department      | hire_date  |
+----+---------------+------+-----------------+------------+
|  1 | John Doe      |   28 | Sales           | 2021-05-01 |
|  2 | Jane Smith    |   34 | Marketing       | 2020-07-15 |
|  3 | Emily Johnson |   45 | Engineering     | 2018-03-10 |
|  4 | Michael Brown |   29 | Human Resources | 2019-11-23 |
|  5 | Sarah Davis   |   31 | Finance         | 2022-01-12 |
+----+---------------+------+-----------------+------------+
5 rows in set (0.001 sec)

The result confirms that the import was successful.

5. Implementation in SQL Server

In SQL Server, we can use the BULK INSERT command to load data from a CSV file into a table efficiently. Additionally, we need to ensure that the table structure in SQL Server aligns with the data in our CSV file.

Furthermore, for the command to work, we need to specify the full path of the file and other parameters to configure the data format.

To get started, let’s create an employees table in SQL Server that matches the columns in employee_data.csv:

CREATE TABLE employees (
    id INT PRIMARY KEY,
    name NVARCHAR(50),
    age INT,
    department NVARCHAR(30),
    hire_date DATE
);

Then, let's import the data from employee_data.csv into the employees table:

BULK INSERT employees
FROM '/home/kali/import-csv/employee_data.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);
Time: 0.002s

Now, let’s break down the command:

  • FIELDTERMINATOR = ',': specifies that fields in the CSV file are separated by commas
  • ROWTERMINATOR = '\n': defines each line break as the end of a row
  • FIRSTROW = 2: skips the first row (the header) in the CSV file

The query inserts each row in the CSV file into the employees table.
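The FIRSTROW option counts rows starting from 1, so FIRSTROW = 2 means "begin importing at the second row", i.e., skip the header. A tiny sketch of that 1-based indexing (Python shown purely for illustration):

```python
lines = [
    "id,name,age,department,hire_date",     # row 1: the header
    "1,John Doe,28,Sales,2021-05-01",       # row 2: first data row
    "2,Jane Smith,34,Marketing,2020-07-15", # row 3
]

FIRSTROW = 2                      # as in the BULK INSERT options
data_rows = lines[FIRSTROW - 1:]  # rows are 1-based, Python lists are 0-based
print(len(data_rows))  # 2
```

Setting FIRSTROW higher would skip additional leading rows, which is occasionally useful for files with multi-line preambles.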

6. Conclusion

In this article, we’ve explored how to import CSV data into SQL tables across PostgreSQL, MySQL, and SQL Server. Specifically, we used the \copy instruction in PostgreSQL, the LOAD DATA INFILE and mysqlimport commands in MySQL, and the BULK INSERT command in SQL Server.

Furthermore, these methods provide a streamlined way to load data from CSV files into structured databases, supporting further analysis and reporting.

The code backing this article is available on GitHub.