Logstash process getting killed when CSV has a large number of columns

11/6/2021

I have a pod with 6 GB of RAM and 6 GB of storage, and I have been using Logstash to sync data from a CSV file to Elasticsearch, but the Logstash process gets killed after syncing some of the data. I have also tried it with a minimum of 1 pipeline worker and a batch size of 500 (see the sketch below).
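For reference, this is roughly how those pipeline settings can be expressed in config/logstash.yml (a minimal sketch, assuming the default Logstash settings path; the values mirror what I tried):

    # config/logstash.yml -- pipeline tuning (sketch)
    pipeline.workers: 1        # minimum number of pipeline workers
    pipeline.batch.size: 500   # events per worker batch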

**CSV file**
{
    rows => 5 million
    columns => 641
}

**Pod configuration**
{
    ram => 6 GB
    storage => 6 GB
}
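The Logstash JVM heap has to fit inside that pod limit; as a point of reference, a minimal sketch of config/jvm.options under that assumption (heap values are illustrative, not my exact settings):

    # config/jvm.options -- heap sizing sketch (illustrative values)
    -Xms4g    # initial heap size
    -Xmx4g    # maximum heap size, kept below the 6 GB pod limit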

**Logstash config**:

input {
    file {
        path => "/app/table1.csv"        # CSV of 5 million rows and 641 columns
        start_position => "beginning"
        sincedb_path => "/dev/null"
        ignore_older => 36000000
        close_older => 36000000
    }
}

filter {
    csv {
        separator => ";"                 # CSV separator
        remove_field => ["message"]
    }
}

output {
    elasticsearch {                      # output Elasticsearch host
        hosts => ["localhost:9200"]
        index => "activity_test04232_3"
        retry_max_interval => 5
        retry_initial_interval => 30
    }
}
-- amanlalwani007
kubernetes
linux
logstash
operating-system
python

0 Answers