All you need to know about Query Supervisor

Every sysadmin dreams of monitoring SQL usage on the system: how much temporary storage SQL statements consume, which statements run for a long time, and so on. In my daily job I often see users running lots of SQL statements from ODBC tools in Excel and, without them noticing, temporary storage starts to grow because of all the joins involved.

But what if I told you that on IBM i this feature is already built into the OS?

Since IBM i V7R3M0 there is a built-in mechanism called Query Supervisor, whose goal is to let you define thresholds on the execution of SQL statements, based on metrics such as elapsed time, temporary storage, I/O count and CPU time.

So, as I said, this feature is part of the operating system, so you don’t need any additional product; the only thing you have to decide is what to do when a threshold is reached. With this real-time monitor you can set up several different actions, such as:

  • holding the job
  • ending the SQL statement
  • sending a message to a message queue
  • logging the SQL statement

At this point, we can start with some considerations about this topic:

  1. as mentioned earlier, this is real-time monitoring, so unlike other mechanisms or tools it evaluates thresholds on live data, not on estimates or assumptions.
  2. the metrics that can be monitored are CPU time (the CPU seconds used while executing an SQL statement), elapsed time (the wall-clock seconds taken to execute the statement), temporary storage (the temporary memory allocated, expressed in MB) and I/O count (the number of I/O operations performed by the statement).
  3. not all SQL statements trigger this mechanism: statements that do not require any query processing by the system (such as a trivial INSERT) are not considered. In addition, for some SQL statements the metrics are computed not over the whole execution but only over some of its steps (for example, in a DELETE only the search for the records matching the WHERE clause counts towards the metrics, while the actual deletion of the records does not). If an SQL statement references or calls functions or procedures, the metrics also include the time taken to execute each individual function.
  4. thresholds, and the filters that trigger them, can be defined very precisely: you can scope them to individual subsystems, users or jobs, and by combining these filters you can handle each situation in the optimal way.

Let’s add a new threshold on my dummy LPAR!

So, before starting, we can check whether something is already registered with this query:

SELECT * FROM QSYS2.QUERY_SUPERVISOR;

As you can see, I don’t get any rows (no threshold has been set up).

Now we will create a new threshold that checks every 5 minutes for statements that have been running for more than 30 minutes:

CALL QSYS2.ADD_QUERY_THRESHOLD(
    THRESHOLD_NAME      => 'MAXTIME',
    THRESHOLD_TYPE      => 'ELAPSED TIME',
    THRESHOLD_VALUE     => 1800,
    DETECTION_FREQUENCY => 300,
    LONG_COMMENT        => 'Check queries running for 30 mins');

In addition, I want to create another threshold only for BIUSER’s jobs in the BISBS subsystem; in this case I will monitor current temporary storage:

CALL QSYS2.ADD_QUERY_THRESHOLD(
    THRESHOLD_NAME      => 'BIUSER',
    THRESHOLD_TYPE      => 'TEMPORARY STORAGE',
    THRESHOLD_VALUE     => 1024,
    DETECTION_FREQUENCY => 300,
    INCLUDE_USERS       => 'BIUSER',
    SUBSYSTEMS          => 'BISBS',
    LONG_COMMENT        => 'Check queries with tempstg > 1GB');
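The other metrics from point 2 above work the same way. For example, a CPU-time threshold could be defined like this (I am not adding it on my LPAR, just showing the shape; this sketch assumes 'CPU TIME' as the threshold type, with the value in seconds, and REPORTUSR is a hypothetical user name):

```sql
CALL QSYS2.ADD_QUERY_THRESHOLD(
    THRESHOLD_NAME      => 'CPUHOG',
    THRESHOLD_TYPE      => 'CPU TIME',
    THRESHOLD_VALUE     => 300,           -- 5 minutes of CPU time
    DETECTION_FREQUENCY => 60,            -- check every minute
    INCLUDE_USERS       => 'REPORTUSR',   -- hypothetical user
    LONG_COMMENT        => 'Queries using more than 5 mins of CPU');
```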

To check that everything is OK, let’s run the previous query again; I expect to find two rows:

Please pay attention to the DETECTION_FREQUENCY parameter: it sets the interval between monitoring checks. You need to choose a value that lets you intercept critical conditions in time without adding unnecessary overhead; the IBM documentation has a specific page about this parameter.
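As far as I know a threshold cannot be altered in place, so if the frequency turns out to be wrong you drop it and re-create it; a sketch:

```sql
-- Drop the threshold, then re-create it with a tighter check interval
CALL QSYS2.REMOVE_QUERY_THRESHOLD(THRESHOLD_NAME => 'MAXTIME');

CALL QSYS2.ADD_QUERY_THRESHOLD(
    THRESHOLD_NAME      => 'MAXTIME',
    THRESHOLD_TYPE      => 'ELAPSED TIME',
    THRESHOLD_VALUE     => 1800,
    DETECTION_FREQUENCY => 60,    -- check every minute instead of every 5
    LONG_COMMENT        => 'Check queries running for 30 mins');
```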

Now we have to build the program that will intercept and handle the crossing of a threshold. The IBM documentation presents some examples; mine simply e-mails me the details of the job that crossed the threshold:

To complete everything, we only need to register this program on the exit point QIBM_QQQ_QRY_SUPER with this command:

ADDEXITPGM EXITPNT(QIBM_QQQ_QRY_SUPER) FORMAT(QRYS0100) PGMNBR(1) PGM(UTILITIES/QRYSUP) TEXT('Query supervisor EXITPGM')
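The registration can be verified from SQL as well; QSYS2.EXIT_PROGRAM_INFO lists the programs attached to each exit point:

```sql
SELECT *
FROM QSYS2.EXIT_PROGRAM_INFO
WHERE EXIT_POINT_NAME = 'QIBM_QQQ_QRY_SUPER';
```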


Here you can find some documentation:

Andrea

Playing with SQL: translator function

Oh yes, playing with SQL on an IBM i system can be more and more fun. SQL on these systems can also help sysadmins (like me) who don’t know how to program in RPG & co. perfectly to optimize day-to-day activities; that’s because since V7R3M0 IBM has invested a lot in native SQL services, which nowadays cover many different areas of system management.

Let’s start with a small example to break the ice: a translator function.

Sometimes, in multilanguage environments, it can be really convenient to get information in your current language. IBM i, with secondary languages and their message files, will do this for you only for operating system menus and commands, so for applications you need some middleware unless you want to translate every piece of data by hand. In this short example you will see how easy it is to create your own SQL service that helps with that.

Firstly, in this example I will use Azure’s public translation API. When I built this solution at work I chose this service because it is easy to use and has a generous free tier: you can translate 2,000,000 characters a month (see the FAQ page). Obviously, to use this API your partition needs to be able to reach the Internet. Another thing to consider is that you need to trust some public CAs via DCM, because without that you will get SQL errors.

At this point, we only need to play with SQL, have fun!


So, let’s test the QSYS2.HTTP_POST function in your SQL IDE (ACS, VS Code, DataGrip, etc.) to check your credentials and see how this API works:

VALUES QSYS2.HTTP_POST(
    'https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&from=it&to=en',
    '[{"text":"proviamo a tradurre qualcosa con questa API"}]',
    '{"headers":{"Content-Type":"application/json","Ocp-Apim-Subscription-Key":"SUBSCRIPTION-KEY","Ocp-Apim-Subscription-Region":"REGION"}}');

As you can see, this API gives me back a JSON string with the translation.
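For reference, the response body looks roughly like this (illustrative: the translated text shown is my own guess, but the array and field shape match the JSON path we will extract with):

```json
[
  {
    "translations": [
      { "text": "let's try to translate something with this API", "to": "en" }
    ]
  }
]
```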

So as not to overcomplicate the example, we will assume we give the API only one string to translate, even though it can translate as many strings as you want in a single call.

So, if we know we have only one string, it’s quite easy to manipulate this JSON; to do that we will use JSON_TABLE. This table function converts a JSON object (or array) into a relational table, so you only need to pass it the JSON document and the relevant paths:

SELECT *
FROM JSON_TABLE(
        QSYS2.HTTP_POST(
            'https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&from=it&to=en',
            '[{"text":"proviamo a tradurre qualcosa con questa API"}]',
            '{"headers":{"Content-Type":"application/json","Ocp-Apim-Subscription-Key":"SUBSCRIPTION-KEY","Ocp-Apim-Subscription-Region":"REGION"}}'),
        'lax $'
        COLUMNS(dataout VARCHAR(3000) PATH 'lax $[0].translations[0].text'));

I know my example is not hard: the JSON object doesn’t have many attributes, so we only get back the string in the target language (in my example, DATAOUT VARCHAR(3000)).

Now we can easily proceed to create our function. It will take three parameters: the source language, the target language and the text to translate:

CREATE OR REPLACE FUNCTION sqltools.translate (lanin CHAR(2), lanout CHAR(2), textin VARCHAR(10000))
    RETURNS VARCHAR(10000)
    LANGUAGE SQL
    BEGIN
        DECLARE txtout VARCHAR(10000);
        -- Build the URL and the JSON body from the parameters;
        -- note: textin is embedded as-is, so escape any double quotes in it
        SELECT dataout
          INTO txtout
          FROM JSON_TABLE(
                 QSYS2.HTTP_POST(
                     'https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&from=' || lanin || '&to=' || lanout,
                     '[{"text":"' || textin || '"}]',
                     '{"headers":{"Content-Type":"application/json","Ocp-Apim-Subscription-Key":"SUBSCRIPTION-KEY","Ocp-Apim-Subscription-Region":"REGION"}}'),
                 'lax $'
                 COLUMNS(dataout VARCHAR(10000) PATH 'lax $[0].translations[0].text'));
        RETURN txtout;
    END;

To check that everything is OK, you can verify that the object exists in your library, as in the screenshot:
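If you prefer SQL to the GUI, the catalog can confirm that the function exists (QSYS2.SYSROUTINES is the Db2 for i catalog view for procedures and functions):

```sql
SELECT ROUTINE_SCHEMA, ROUTINE_NAME, ROUTINE_TYPE
FROM QSYS2.SYSROUTINES
WHERE ROUTINE_SCHEMA = 'SQLTOOLS'
  AND ROUTINE_NAME = 'TRANSLATE';
```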


Now let’s try our function with a sample database (you can create it with this statement: CALL QSYS.CREATE_SQL_SAMPLE('DEMODB')):

SELECT DEPTNAME, sqltools.translate('en','it',DEPTNAME) AS DEPTIT FROM demodb.DEPT;

As you can see, the DEPTIT column contains the translation of the column DEPTNAME.


So, this was only one example of what you can do with SQL on IBM i; we will soon see some more complex examples of SQL usage, such as monitoring tools and so on.

And… what about you: do you have or use any SQL functions that you have built yourself? Please tell us about your experience in the comments.

Andrea