Ora2Pg Migration – Custom PostgreSQL Log Parser

In our Oracle to PostgreSQL migration project, once we had covered the issues reported by the static code analysers, we brought up the instances in development and did brute-force testing to capture all SQL failures in the database layer.

We enabled all debug logs on the database and, as a result, it started recording errors like the one below in the log,

2020-01-18 09:07:30 PDT [2502783]: user=db_user, db=my-dev-db-01, app=service-1, client=XX.XX.XX.XX ERROR:  cache lookup failed for type 2465323

2020-01-18 09:07:30 PDT [2502783]: user=db_user, db=my-dev-db-01, app=service-1,client=XX.XX.XX.XX STATEMENT:  call  PKG_V2.GET_DATA($1,$2,$3,$4,$5,$6,$7,$8,$9)

2020-01-18 09:07:30 PDT [2502783]: user=db_user, db=my-dev-db-01, app=service-1,client=XX.XX.XX.XX LOG:  duration: 0.260 ms
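
For reference, a logging configuration roughly like the one below in postgresql.conf would produce lines in this format; the exact prefix and levels here are an assumption based on the output shown above, so adjust them to your own setup,

# Assumed postgresql.conf settings - adjust to your environment
log_min_messages = debug1                  # capture debug-level messages as well
log_min_duration_statement = 0             # log the duration of every statement
log_line_prefix = '%t [%p]: user=%u, db=%d, app=%a, client=%h '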

Since the PG logs are at the finest level and therefore heavy, we needed a script to extract each unique problem together with the query in its context. I developed the following script to do that,

#!/bin/bash

# Matches a bare line number returned by "grep -n ... | cut"
NUMBER_EXP='^[0-9]+$'

# Change these constants to suit your environment before running
LOG_FILE_PATTERN="postgresql_*"
DB_FILTER="db=my-dev-db-01"
USER_FILTER="user=db_user"
CLIENT_FILTER="client=XX."

Q_FILE="queries.txt"
rm -f "$Q_FILE"
touch "$Q_FILE"

# Process every log file matching the pattern
for file in $LOG_FILE_PATTERN
do
    # For each file - collect the unique ERROR messages for our db/user/client,
    # normalising whitespace and stripping everything up to "ERROR: "
    ERROR_LIST=$(grep "$DB_FILTER" "$file" | grep "ERROR:" | grep "$CLIENT_FILTER" | grep "$USER_FILTER" \
        | awk '{$1=$1};1' | awk 'sub("^.*ERROR: ", "")' | sort | uniq)

    if [ -n "$ERROR_LIST" ]
    then
        echo "The error list is $ERROR_LIST"
        # For each unique error, pull the first occurrence plus its context
        while read -r error_txt
        do
            ERROR_LINE=$(grep -n "$DB_FILTER" "$file" | grep "ERROR:" | grep -F "$error_txt" | head -n 1 | cut -f1 -d':')
            if ! [[ $ERROR_LINE =~ $NUMBER_EXP ]] ; then
                echo "Unable to get line for \"$error_txt\"...got only \"$ERROR_LINE\""
            else
                # Skip errors that are already in the output file
                FOUND_ALREADY=$(grep -F -c "$error_txt" "$Q_FILE")
                if [ "$FOUND_ALREADY" = "0" ]; then
                    # Grab the ERROR line plus the next two lines (STATEMENT and duration)
                    ERROR_LINE_END=$((ERROR_LINE + 2))
                    ERROR_TXT=$(awk -v start="$ERROR_LINE" -v end="$ERROR_LINE_END" 'NR >= start && NR <= end' "$file")
                    echo "$error_txt" >> "$Q_FILE"
                    # Frame the context with '#' separators as wide as the error text
                    LENGTH=$(echo -n "$error_txt" | wc -m)
                    for (( c=1; c<=LENGTH; c++ ))
                    do
                        echo -n "#" >> "$Q_FILE"
                    done
                    echo "" >> "$Q_FILE"
                    echo "$ERROR_TXT" >> "$Q_FILE"
                    for (( c=1; c<=LENGTH; c++ ))
                    do
                        echo -n "#" >> "$Q_FILE"
                    done
                    echo "" >> "$Q_FILE"
                fi
            fi
        done <<< "$ERROR_LIST"
        sleep 1
    fi
done

The script parses the log files matched by "LOG_FILE_PATTERN" and creates queries.txt for you to check. Try running it against your PG logs and report back if you would like to see improvements. Before running, make sure the constants at the top of the script are changed to suit your environment!
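
For example, assuming you save the script as parse_pg_errors.sh (the name is mine) in the directory that holds the logs, a run would look like this,

cd /var/lib/pgsql/data/log        # or wherever your postgresql_* log files live
chmod +x parse_pg_errors.sh
./parse_pg_errors.sh
less queries.txt

Using the sample log lines from earlier, an entry in queries.txt would look roughly like this: the unique error text, a '#' separator of the same width, the three lines of context (ERROR, STATEMENT and duration), and a closing separator,

cache lookup failed for type 2465323
####################################
<the ERROR, STATEMENT and duration log lines shown earlier>
####################################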

If you prefer a UI, go check out pgBadger.
