Sometimes we need to write error details (such as the error number and description) into a table.
Say a job is executing and an error occurs in the middle of the run; the error details are written to the job's error log file.
Now, I want to capture those error details and preserve them in a backend table.
Are there any metadata tables that hold this information? I didn't find any. I looked at the metadata table AL_HISTORY_INFO; it holds the path of the log files, but not the error values themselves (error number, description, etc.).
There is one solution: we can use the get_error_filename() environment function, which returns the current job's error log filename along with its complete path.
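As a minimal sketch, the function can be called directly in a script to see what it returns (print() and the || concatenation operator are standard Data Services script syntax):

```
# Print the current job's error log file name, including its full path
print('Error log file: ' || get_error_filename());
```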
Here is the way to go:
I have a simple dataflow that loads distinct data into the target. To generate an error on purpose, I load the same data into the target table again; since the table has a unique constraint on the CUSTOMER_ID column, this raises a constraint violation.
In the catch block, I have a script that retrieves the error log, monitor log, and trace log filenames along with their path. In the next step I placed a workflow, and inside it I designed one dataflow per log file to load it.
In the script, I use the environment functions to get the respective filenames, pass those values into three global variables, and then trim the filename off one of them to get the path:
$PATH - complete path of the log files
$GV_ERR_FILENAME - error log filename
$GV_MTR_FILENAME - monitor log filename
$GV_TRC_FILENAME - trace log filename
Define these global variables at the job level.
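The catch-block script described above can be sketched as follows. The variable names are the ones defined earlier; the path-trimming step is one possible approach using the word_ext() and replace_substr() string functions (word_ext() with a negative word number counts separators from the right), and the '\' separator assumes a Windows job server, so adjust it for your platform:

```
# Capture the three log file names from the environment functions
$GV_ERR_FILENAME = get_error_filename();
$GV_MTR_FILENAME = get_monitor_filename();
$GV_TRC_FILENAME = get_trace_filename();

# Trim the bare file name off the full path: word_ext() with -1 returns
# the last '\'-separated token (the file name), which we then strip out
$PATH = replace_substr($GV_ERR_FILENAME,
                       word_ext($GV_ERR_FILENAME, -1, '\\'), '');
```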
I have defined a fixed-width flat file format for each log file.
Flatfile format for Error log
Flatfile format for Monitor log
Flatfile format for Trace log
After defining these file formats, design a dataflow for each file.
Trace log window
Error log window
Error table data