In this post I am going to share a very useful Linux command for splitting a big CSV file into smaller files.
Suppose you have a CSV file around 800MB in size with 5 million (50 lakh) rows. If you try to open this file in any application, it can hang your system, and you probably won't be able to open it in any editor. So the best idea is to split the file into smaller pieces by size or by number of records.

I had the same problem: I had an 800MB CSV file which I needed to import into a MySQL database after making some changes, but I was not able to open it in OpenOffice or MS Office. So I used this fabulous Linux command on my Ubuntu PC, and it took hardly a few seconds to split the file into chunks of 100,000 records each.

Here is the command to split a CSV file by number of records:

split -d -l 100000 file.csv new/file_part_

Here 100000 is the number of records per file; you can change it as per your need.
file.csv is the source file that needs to be split into 100,000-record chunks.
new/file_part_ means the pieces are written into the new directory (create it first with mkdir new) with the file name prefix file_part_. The new files will be named file_part_00, file_part_01, file_part_02, and so on; with GNU split you can add --additional-suffix=.csv to keep the .csv extension on each piece.
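One wrinkle with a plain split is that only the first piece keeps the CSV header row. Here is a minimal sketch of one way to handle that, assuming GNU coreutils; data.csv, header.csv, and the parts/ directory are example names, not files from this post. It strips the header, splits the remaining rows, and prepends the header to every chunk:

```shell
# Build a small sample CSV to demonstrate: header plus 25 data rows.
printf 'id,value\n' > data.csv
seq 1 25 | sed 's/.*/&,row&/' >> data.csv

mkdir -p parts
# Save the header, split only the data rows (10 per piece here),
# then put the header back at the top of each piece.
head -n 1 data.csv > header.csv
tail -n +2 data.csv | split -d -l 10 --additional-suffix=.csv - parts/part_
for f in parts/part_*.csv; do
  cat header.csv "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
rm header.csv
ls parts   # part_00.csv  part_01.csv  part_02.csv
```

For a real 5-million-row file you would raise -l back to 100000; the `-` argument tells split to read from the pipe instead of a file.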



For Windows Users – Split a Large CSV File Using the CSV Splitter Tool

Windows users can simply download CSV Splitter, a Windows tool with a GUI for splitting a large CSV file into smaller files. CSV Splitter is a simple tool for your CSV files: it splits large comma-separated files into smaller files based on a number of lines, and it can process millions of records in just a few minutes. It works in the background, so you can continue your work without having to wait for it to finish.
DOWNLOAD

Thanks :)

If you like this post, please don't forget to subscribe to My Public Notebook for more useful stuff.