
A screen a story – to include or to filter

Once in a while an ad hoc job ends up on your desk. A while ago I had such a day: although the disk space usage was only around 70%, the backup of the Integrated File System (IFS) part of the system was consuming more and more time every night.

The decision had been made not to make any incremental daily backups, so it was time to examine the space usage of the IFS. Although there are DB2 for i Services available for investigating the IFS, I often use the RTVDIRINF command to build up a history and see where the growth of the IFS is occurring.

When you have the RTVDIRINF information available, it is less time consuming to use that data instead of collecting it from scratch with the DB2 for i Services IFS_OBJECT_STATISTICS table function. Armed with this data, I used the IBM knowledge base item titled: How to Locate Which “User directories” are Using the Most Space in the IFS.
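If you do want to collect the information live instead of reusing the RTVDIRINF files, a minimal sketch of that table function is shown below. The starting path is just an example, and the parameter and column names are as I know them from the DB2 for i Services documentation, so verify them on your own release:

-- List the largest objects under /home (example path), biggest first.
SELECT PATH_NAME,
       OBJECT_TYPE,
       DATA_SIZE
  FROM TABLE (
         QSYS2.IFS_OBJECT_STATISTICS(
           START_PATH_NAME     => '/home',
           SUBTREE_DIRECTORIES => 'YES')
       )
 ORDER BY DATA_SIZE DESC
 FETCH FIRST 25 ROWS ONLY;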


If you run the command below frequently, you can build up a repository to keep an eye on the space consumption:

SBMJOB CMD(RTVDIRINF DIR('/') INFFILEPFX(IFS) SUBTREE(*ALL)
  OMIT('/QNTC' '/QFilesvr.400' '/QDLS' '/QSYS.lib' '/QOPT')) JOB(RTVDIRINF)

I decided not to follow the example exactly, but used a shorter prefix to leave room for a sequence number in the file name, just in case you want to keep the data available on your system after you have extracted information from it.
ProTip: I like using QRPLOBJ as the target library, because that library is cleared when you IPL the LPAR. As long as you do not IPL very frequently, that makes sense, I guess.

Using the files generated by this command, you can use the IBM i Access Client Solutions (ACS) Run SQL Scripts (RSS) example called “Analyze IFS storage consumption”.

When you run the last statement from that example, a list is built, sorted on the total size of each directory. If you follow this method, do not forget to adjust the file names to match the names and location of your data.
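To give you an idea, the kind of statement that example ends with is sketched below. The file names (here IFSD and IFSO in QRPLOBJ, matching the prefix used above) and the column names (QEZDIRNAM1, QEZDIRIDX, QEZDTASIZE) are assumptions taken from the IBM example, so check them against the files RTVDIRINF actually generated on your system:

-- Sketch only: total object size per directory, largest first.
-- File and column names are assumptions; adjust to your own data.
SELECT d.QEZDIRNAM1      AS DIRECTORY,
       SUM(o.QEZDTASIZE) AS TOTAL_SIZE
  FROM QRPLOBJ.IFSD d
       JOIN QRPLOBJ.IFSO o
         ON d.QEZDIRIDX = o.QEZDIRIDX
 GROUP BY d.QEZDIRNAM1
 ORDER BY TOTAL_SIZE DESC
 FETCH FIRST 25 ROWS ONLY;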

This will allow you to determine where a lot of time is spent when running the backup. In some cases a directory contains a lot of files that are all small in size; sometimes only big files, or any other possible combination of big and small files in a directory.

In my case, for this particular customer, we discovered that a purge process had never been activated in one of their applications. As a result, one single directory contained over 30,000 files. That is where the real work started.
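If you want to double-check such a suspect directory before you start cleaning it up, a small query along the lines below, again using IFS_OBJECT_STATISTICS, lists the oldest stream files in it first. The path is obviously a placeholder, and the column names should again be verified for your release:

-- Oldest stream files in one directory (placeholder path), no subtree.
SELECT PATH_NAME,
       CREATE_TIMESTAMP,
       DATA_SIZE
  FROM TABLE (
         QSYS2.IFS_OBJECT_STATISTICS(
           START_PATH_NAME     => '/path/to/suspect/directory',
           SUBTREE_DIRECTORIES => 'NO')
       )
 WHERE OBJECT_TYPE = '*STMF'
 ORDER BY CREATE_TIMESTAMP
 FETCH FIRST 50 ROWS ONLY;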

 
When you open the Integrated File System view from ACS, you will get a screen that looks similar to the one below:

I would like to draw your attention to the “Filter” and “Include” fields, as they can be really useful and save you a lot of time and effort:

The “Filter” option

This allows you to filter the content after the information has been presented in the GUI. In other words, the filter is applied at the client level. The function looks for the string you enter anywhere in the names shown in the list.

The “Include” option

This is used to select, at the server level, which data needs to be displayed. So, when you need to work with a large number of files in a directory, the Include function allows you to limit the amount of data being brought to your client.

When you combine the information about where the data is stored with the ability to filter and include data from it, it becomes much easier to understand what data you have, where it is, what sort of data it might be, and when it was created.

Armed with all this information, it is much easier to clean up your system, which in this case meant a faster backup containing just the data the customer really needed.

So, it may take a little effort, but the results made everybody happy.

I do know that, working with QSHELL (the QSH command), you can delete files from a directory with the “rm” command. My problem is that I do not feel comfortable deleting files without first seeing what I am deleting. Maybe it is just me, but was that not part of the idea when the GUI was designed?
