My MariaDB suddenly stopped working and is looking for a table that does not exist when uploading my CSV files

For 4 years my MariaDB has been working, but this month it suddenly became less responsive, so I ran a check, optimize, and repair, and did an optimization pass on MariaDB-server.cnf. I use the following bash script, which I call "process_all_files.sh", to upload my CSV files to the MariaDB server:

#!/bin/bash
for file in /home/devdb/incoming/*.csv;
do 
	file_name=${file##*/}
	file_path=${file%/*}
	file_path_incoming="${file_path}/${file_name}"
	file_path_process="${file_path}/processing/${file_name}"
	file_path_archive="${file_path}/archive/${file_name}"
	
	#echo $file_path;
	#echo $file_path_incoming;
	#echo $file_path_process;
	#echo $file_path_archive;	
	
	# Check whether the csv file has a header
	header=$(head -n 1 "$file_path_incoming")
	header_string="IMO_Number,SerialNo,ChannelNo,ChannelDescription,TimeStamp,Value,Unit"

	# Add header to csv file if not found
	if [[ $header != "$header_string" ]]
	then
		sed -i "1s/^/${header_string}\n/" "$file_path_incoming"
	fi
	
	# Move file from incoming folder to processing folder.
	echo "Move $file_name to processing to folder"
	mv -f "$file_path_incoming" "$file_path_process"
	

	# Import csv file into the database. The header line will be skipped.
	echo "Import $file into database"
	sql="
	LOAD DATA LOCAL INFILE '$file_path_process'
	INTO TABLE Monitoring
	FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
	LINES TERMINATED BY '\n'
	IGNORE 1 LINES
	(IMO_No,SerialNo,ChannelNo,ChannelDescription,TimeStampOriginal,@Value,Unit) 
	set TimeStamp=STR_TO_DATE(TimeStampOriginal, '%Y-%m-%dT%H:%i:%s+00:00'),Value = @Value,FileName='${file_name}';
	"
	
	echo "$sql" > "${file_path}/import.sql"
	
	mysql -u root -ppassword01 drums < "${file_path}/import.sql" && mv -f "$file_path_process" "$file_path_archive"
	
	echo "";
done
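One quoting detail in the header-insertion step above: inside single quotes the shell does not expand variables, so sed would insert the literal text $header_string rather than the header itself; double quotes expand it. A minimal sketch on a throwaway file (the sample values here are made up):

```shell
# Demonstration: variable expansion inside a sed script (throwaway temp file)
header_string="IMO_Number,SerialNo,ChannelNo"
tmp=$(mktemp)
printf '1234,SN1,CH1\n' > "$tmp"

# Double-quoted sed script: ${header_string} is expanded by the shell before
# sed runs, and \n in the replacement becomes a newline (GNU sed)
sed -i "1s/^/${header_string}\n/" "$tmp"

first_line=$(head -n 1 "$tmp")
echo "$first_line"    # IMO_Number,SerialNo,ChannelNo
rm -f "$tmp"
```

With single quotes instead, the inserted first line would literally read $header_string.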

And my import.sql is this:

LOAD DATA LOCAL INFILE '/home/devdb/incoming/processing/*.csv' INTO TABLE Monitoring FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES (IMO_No,SerialNo,ChannelNo,ChannelDescription,TimeStampOriginal,@Value,Unit) set TimeStamp=STR_TO_DATE(TimeStampOriginal, '%Y-%m-%dT%H:%i:%s+00:00'),Value = @Value,FileName='*.csv';
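A side note on how import.sql gets written: an unquoted $sql in echo undergoes word splitting, which collapses the statement's newlines into single spaces, and that is consistent with the file coming out as one long line; quoting preserves the line breaks. A minimal sketch with hypothetical strings:

```shell
# Word splitting: unquoted expansion collapses internal newlines to spaces
sql="LINE ONE
LINE TWO"

unquoted=$(echo $sql)      # word-split: becomes "LINE ONE LINE TWO"
quoted=$(echo "$sql")      # quoted: the newline is preserved

echo "$unquoted"
echo "$quoted"
```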

This script runs in a cron job every 5 minutes. When I inspect the logs, there is an error:

Move MODmQueenie-2023-12-30-12-40.csv to processing to folder
Import /home/devdb/incoming/MODmQueenie-2023-12-30-12-40.csv into database
ERROR 1146 (42S02) at line 1: Table 'drums.a' doesn't exist
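For reference, the cron entry that drives this every 5 minutes would look something like the following (the script and log paths here are placeholders, not taken from the actual crontab):

```
# Hypothetical crontab entry: run the loader every 5 minutes, log its output
*/5 * * * * /home/devdb/process_all_files.sh >> /home/devdb/process_all_files.log 2>&1
```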

I tried a different approach using Python, with print(load_sql) to see whether the script references "drums.a", but it does not. This is my Python code:

import os
import pandas as pd
import pymysql
import shutil
from datetime import datetime

# Database credentials and configuration
db_config = {
    'user': 'root',
    'password': 'password01',
    'host': 'localhost',
    'database': 'drums',
    'local_infile': True
}

# Directories
incoming_dir = '/home/devdb/incoming'
processing_dir = os.path.join(incoming_dir, 'processing')
archive_dir = os.path.join(incoming_dir, 'archive')

# Header for CSV files
header_string = ["IMO_No", "SerialNo", "ChannelNo", "ChannelDescription", "TimeStampOriginal", "Value", "Unit"]

# Function to convert TimeStampOriginal to TimeStamp
def convert_timestamp(original_timestamp):
    return datetime.strptime(original_timestamp, '%Y-%m-%dT%H:%M:%S+00:00').strftime('%Y-%m-%d %H:%M:%S')

# Process each CSV file in the incoming directory
for file_name in os.listdir(incoming_dir):
    if file_name.endswith('.csv'):
        file_path_incoming = os.path.join(incoming_dir, file_name)
        file_path_process = os.path.join(processing_dir, file_name)
        file_path_archive = os.path.join(archive_dir, file_name)

        # Read CSV
        df = pd.read_csv(file_path_incoming, names=header_string, header=None)

        # Convert TimeStampOriginal to TimeStamp
        df['TimeStamp'] = df['TimeStampOriginal'].apply(convert_timestamp)

        # Save the modified CSV file to processing directory
        df.to_csv(file_path_process, index=False, header=True)

        # Connect to the database and load the data
        conn = None  # so the finally block is safe if connect() fails
        try:
            conn = pymysql.connect(**db_config)
            cursor = conn.cursor()
            # Construct the SQL query
            load_sql = f"""
            LOAD DATA LOCAL INFILE '{file_path_process}'
            INTO TABLE monitoring
            FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
            IGNORE 1 LINES
            (IMO_No, SerialNo, ChannelNo, ChannelDescription, TimeStamp, Value, Unit)
            """
            print(load_sql)
            cursor.execute(load_sql)
            conn.commit()
        except pymysql.MySQLError as error:
            print(f"Error: {error}")
        finally:
            if conn and conn.open:
                cursor.close()
                conn.close()

        # Move file to archive directory
        shutil.move(file_path_process, file_path_archive)

        print(f"Processed and archived {file_name}")

This is the output of the Python script, which shows it is still looking for the "a" table:

Processed and archived MODmQueenie-2023-12-30-14-13.csv

            LOAD DATA LOCAL INFILE '/home/devdb/incoming/processing/MODmQueenie-2023-12-30-14-16.csv'
            INTO TABLE monitoring
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\n'
            IGNORE 1 LINES
            (IMO_No, SerialNo, ChannelNo, ChannelDescription, TimeStamp, Value, Unit)

Error: (1146, "Table 'drums.a' doesn't exist")

I inspected my database, and there is no "a" table in my schema, but the error tells me it is looking for 'drums.a'.

I dumped all the data, about 40 GB, and restored it on a cloud server hosted on Ubuntu, but I still get the same error about 'drums.a'.

Please help me figure out what is causing my script to look for an "a" table that does not exist, when this code has worked for 4 years.

Thanks and Regards,

Romel
