NPrinting Audit Logs

It has been a long time since I wrote my previous blog post. This one is going to be short, but I hope you find it useful. NPrinting November 2020 introduced audit logs. It is a long-awaited feature allowing us to capture practically every action happening in the NPrinting environment. For it to work we have to enable audit logging in the NPrinting admin console settings. There are also 2 parameters available which dictate how much data we want to keep in the logging database.

Admin console settings for audit logging

Now, if you are still on the November 2020 version, the only supported way to get those logs is via the API, which enables you to download a CSV file with the logs. This is documented here, so I will not duplicate that documentation in my post. There is also an unsupported option. It is unsupported as it involves messing with the NPrinting repository database config files. Since the audit logs are kept in a PostgreSQL database on the NPrinting server, we can simply connect to it using the Qlik Sense or QlikView PostgreSQL connector. So, what will we have to change?

The NPrinting DB configuration lives in the C:\ProgramData\NPrinting\data\pg_hba.conf file. We need to stop all services, edit it to allow "listening" from other machines, save, and restart the DB. The bit we need to edit is at the very end of the file (you should see the comment below there):

# If you want to allow non-local connections, you need to add more
# "host" records. In that case you will also need to make PostgreSQL
# listen on a non-local interface via the listen_addresses
# configuration parameter, or via the -i or -h command line switches.

This is also documented here: link (just add a host record for, for example, the Qlik Sense server from which you would like to connect to the database).
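As a minimal sketch only – the address 10.0.0.25 below is made up, so substitute the address of the machine you want to connect from – the extra host record in pg_hba.conf, plus the matching listen_addresses change in postgresql.conf (which sits in the same data directory), could look like this:

# pg_hba.conf - allow the Qlik Sense server (hypothetical address 10.0.0.25)
# to reach the audit database with password authentication
host    nprinting_audit    postgres    10.0.0.25/32    md5

# postgresql.conf - make PostgreSQL listen on non-local interfaces as well
listen_addresses = '*'

Remember that the NPrinting services need to be stopped before the edit and restarted afterwards for the change to take effect.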

Once you do this you will be able to easily create a PostgreSQL connection to your NPrinting repository with audit logs. The connection details are:

  • hostname: your nprinting server
  • port: 4992
  • database: nprinting_audit
  • user: postgres
  • password: your repository password

With the above set, you can just SQL select the audit logs table and create your analysis as you like. There is nothing else to explain other than that the timestamps used are in UTC and may require conversion. My sample script is here:

LIB CONNECT TO 'PostgreSQL';
[audit_event]:
LOAD 
ConvertToLocalTime(event_time , 'Sydney') 					as [local_date_time],	//	Adjust city to show your local time
Date(Floor(ConvertToLocalTime(event_time , 'Sydney')))     	as [local_date],
Time(Frac(ConvertToLocalTime(event_time , 'Sydney')))      	as [local_time],
id, 
user_id, 
user_name, 
ip_address, 
source, 
target, 
target_id, 
target_name, 
app_id, 
event_time, 
action, 
description, 
data, 
data_type
;
SELECT "id",
"user_id",
"user_name",
"ip_address",
"source",
"target",
"target_id",
"target_name",
"app_id",
"event_time",
"action",
"description",
"data",
"data_type"
FROM "public"."audit_event";

With the above you can come up with a way of loading only the last day of logs and storing them locally as .qvd files. There are plenty of known incremental load methods you can use, so you can end up with a full history of logs available for analysis without putting too much stress on the logging database. A rough sketch of one such approach is shown below.
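This is only a sketch under a few assumptions – the connection is still called 'PostgreSQL', the history QVD lives at the hypothetical path lib://Files/NP_AuditEvent.qvd, and a one-day look-back window fits your reload schedule:

//	Incremental load sketch: fetch only the last day of audit rows,
//	then append the locally stored history and de-duplicate on id
LIB CONNECT TO 'PostgreSQL';

LET vFromDate = Date(Today() - 1, 'YYYY-MM-DD');	//	adjust the look-back window to your reload frequency

[audit_event]:
LOAD *;
SELECT "id", "user_id", "user_name", "ip_address", "source", "target",
       "target_id", "target_name", "app_id", "event_time", "action",
       "description", "data", "data_type"
FROM "public"."audit_event"
WHERE "event_time" >= '$(vFromDate)';

//	Append the history already stored locally, skipping ids we have just reloaded
IF Not IsNull(FileTime('lib://Files/NP_AuditEvent.qvd')) THEN
	Concatenate ([audit_event])
	LOAD * FROM [lib://Files/NP_AuditEvent.qvd] (qvd)
	WHERE Not Exists(id);
END IF

STORE [audit_event] INTO [lib://Files/NP_AuditEvent.qvd] (qvd);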

Now, what if you want to go with the supported solution?

With the NPrinting February 2021 version we get a couple more API methods we can use. This is not officially released yet but is available as a beta/testing version, and I got my hands on it. Those who have played with the NPrinting API know that we can use the REST connector from Qlik Sense/QlikView to execute calls to the NPrinting API. I have written an "NPrinting.qvs" library which has plenty of subroutines you can use. I have now updated it with one large subroutine called "NP_Audit_Model", which returns 1000 records from the audit log (this is the maximum number I was able to use as the vQueryLimit variable – putting in a greater value causes an error). The official documentation explains the additional query parameters, but I have not used them. These are:

  • Query parameters
    • offset (optional) The number of entries to skip. Default is 0.
    • limit (optional) The maximum number of entries to return. Default is 50.
    • start (optional) The start date to filter on in UTC timezone and format yyyy-MM-ddTHH:mm:ssZ. Example: 2020-09-15T11:59:08Z
    • end (optional) The end date to filter on in UTC timezone and format yyyy-MM-ddTHH:mm:ssZ. Example: 2020-09-15T11:59:08Z
    • appId (optional) The ID of the app to filter on.
    • userId (optional) The ID of the user to filter on.
    • targetId (optional) The ID of the target to filter on.
    • target (optional) The target name to filter on.
    • dataType (optional) The data type to filter on.

With those in place you can again come up with a solution for loading the logs incrementally and storing them in .qvd files; a sketch of how those query parameters are passed through the REST connector follows below.
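For illustration only – vURL_Audit, vCookie and vQueryLimit are the variables the NPrinting.qvs library works with, while the date values here are made up – each query parameter goes into its own QUERY clause of the WITH CONNECTION part of the REST call, separated by commas:

//	Sketch of the WITH CONNECTION clause appended to the SELECT that the
//	REST connector (or the NP_Audit_Model subroutine) generates
WITH CONNECTION (
	URL "$(vURL_Audit)",
	HTTPHEADER "cookie" "$(vCookie)",
	QUERY "limit" "$(vQueryLimit)",
	QUERY "start" "2021-02-01T00:00:00Z",	//	UTC, format yyyy-MM-ddTHH:mm:ssZ
	QUERY "end"   "2021-02-28T23:59:59Z"
);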

Summary:

  • My NPrinting.qvs is now updated with the NP_Audit_Model subroutine and you can download it from here.
    • note that you have to authenticate first (sample script below – replace Sydney with your city, as it is a required parameter for UTC time conversion)
//	==================================================================
// Adjust below include file so NPrinting.qvs is properly referenced
            
	$(Must_Include=lib://Files (elvaio_lech)/NPrinting/NPrinting.qvs);
//	Authenticate
	Call NP_Authenticate;
//	Get all audit data
	Call NP_Audit_Model('Sydney');
//	END
	exit script;
//	==================================================================
  • I am also attaching a .qvf file here called NPrinting Audit Trail.qvf. It is a Qlik Sense app with the scripts and a sample visualisation of the data from the API call. It should be a good starting point for anyone who needs this data. It is an application which I put together in 5 minutes for the purpose of this blog post, so don't expect anything great in it, and remember that the API call to the audit data requires NPrinting February 2021!!!

This is it… hope you enjoy it.

4 Comments

  1. Dear,
    I really love the NPrinting API from Qlik Sense. I wondered, is it possible to link the taskid to the current users?

    Thanks in advance


  2. Hi Lech,

    Firstly, thank you for this great blogpost.

    I want to monitor all the logging on the Qlik NPrinting server via the Audit logs in Qlik Sense. Now I got everything working to import the Audit log in Qlik Sense, but I obviously only receive the last 1000 rows because of the query limit. Now I was trying to make the script more dynamic so I can do an import that only requests the data larger than the datetime I already have with the start query parameter, but if I use this line the script fails.

    QUERY "Limit" "$(vQueryLimit)" & "Start" "2023-01-01T00:00:00Z"

    If I only use the Limit it works fine, and if I remove Limit and only leave Start the script works, but with the & sign the script returns an error:

    Connector reply error: Unknown substring, string, or value at (128,375): '& "Start" "2023-01-01T00:00:00Z")'

    This is the WITH I use. It’s just like the one in the nprinting.qvs but I extended the QUERY parameters.

    CONNECTION(URL "$(vURL_Audit)", HTTPHEADER "cookie" "$(vCookie)", QUERY "Limit" "$(vQueryLimit)" & "Start" "2023-01-01T00:00:00Z")

    Hope you can help. Thanks

