Eliminate unused space in datasets

Clean up mainframe files to speed backups.

Rick Cook

One of the simplest ways to reduce backup times is to avoid backing up allocated space that has never been used. Many datasets on a mainframe have allocated space that is currently empty. This space can easily be 'squeezed out' of backups without affecting the quality of the backup.

Many modern S/390 backup utilities, such as IBM's DFSMS and Innovation Data Processing's FDR and ABR, can edit out the unused space with one or a few simple commands. However, in some utilities, such as Innovation's, the default is to back up all the space allocated to each dataset.

In the case of Innovation's products, the setting can be changed from DATA=ALL to DATA=USED. In DFSMS, the parameters ALLDATA and ALLEXCP back up all the allocated space. They can be changed using the PATCH command.
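As a rough illustration of the DFSMS side, here is a sketch of a DFSMSdss (ADRDSSU) logical dump job. The job, dataset, and DD names are hypothetical, and exact syntax varies by release; the point is simply that when the ALLDATA and ALLEXCP keywords are left off the DUMP statement, only the used portion of each dataset is dumped:

```jcl
//BACKUP   JOB (ACCT),'USED SPACE ONLY',CLASS=A
//* Logical dump with DFSMSdss. Omitting ALLDATA(*) and ALLEXCP
//* tells DFSMSdss to dump only the used tracks of each dataset,
//* not its full allocation. Names below are examples only.
//STEP1    EXEC PGM=ADRDSSU
//SYSPRINT DD  SYSOUT=*
//TAPE     DD  DSN=BACKUP.PROD.DAILY,UNIT=TAPE,
//             DISP=(NEW,CATLG)
//SYSIN    DD  *
  DUMP DATASET(INCLUDE(PROD.**)) -
       OUTDDNAME(TAPE)
/*
```

With Innovation's FDR/ABR products, the analogous control is the DATA=USED keyword mentioned above; consult the product manual for where it goes in your DUMP statement.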

Backing up only the actual data doesn't change the dataset's space allocation. When restored, the dataset will occupy the same number of tracks, but only the tracks that contain data are actually backed up and restored.


Rick Cook has been writing about mass storage since the days when the term meant an 80K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last twenty years he has been a freelance writer specializing in storage and other computer issues.

Editor's note: The mention of products or services in this tip is for example only and is not an all-inclusive list. Moreover, the mention of such products or services does not imply endorsement of them by SearchStorage.com.


Follow-up reader comments: Rick Cook's recent tip on editing out free space in mainframe backups didn't go far enough on one hand, and went too far on the other. A one-size solution, while easy, does not necessarily fit all. Some MVS file types (e.g., VSAM and proprietary databases such as Nomad) embed free space within the file structure and are often better handled with the appropriate database unload utility. Other file types, such as HFS, deliberately carry free space to ease expansion and fragmentation problems. There are special system files, such as page space and dump datasets, that you want restored at a specific size. In a disaster, you won't have time to remember to fix things that get restored too small. Bottom line: Know your data, and then decide the backup methodology: full, data, or unload utility. -- John Weinhoeft


This was first published in April 2002
