ibmi-brunch-learn

Performance improvement for large file


  • Performance improvement for large file

    Hi, we have a file with more than 10 million records and 150 fields in the record format. It is a history file, and every month around 80,000 more records are added to it. As a result, the performance of the programs accessing this file is very poor.
    Is there any solution on the AS400 to improve the performance of the programs using this file? Please note that hundreds of programs use this file, so it is practically impossible to amend all the RPGLE programs.
    Currently we have the idea of splitting the file and moving the records that are no longer needed to a new file. Is there any other solution?
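    For example, the split we have in mind would look something like this in SQL (just a rough sketch; the file, archive, and date column names are invented):

    -- Untested sketch: HISTORYF/HISTARCH are placeholder names, and
    -- TRNDATE stands in for whatever date column marks old records.
    CREATE TABLE MYLIB.HISTARCH LIKE MYLIB.HISTORYF;

    INSERT INTO MYLIB.HISTARCH
      SELECT * FROM MYLIB.HISTORYF
      WHERE TRNDATE < '2005-01-01';

    DELETE FROM MYLIB.HISTORYF
      WHERE TRNDATE < '2005-01-01';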

  • #2
    Re: Performance improvement for large file

    150 fields! 10 million records is not all that much in this day and age, but 150 fields! I wonder how many of those fields are blank in a typical row?

    The biggest factor in performance for any database is the design. A table should always represent one thing. Each field is a fact about that one thing. Having 150 fields tells me this table is representing more than one thing. I see this a lot on the AS400 where developers tend to see the database as an inconvenience and create tables that fit the programs rather than create programs that fit the tables.

    If the programs accessing this file are SQLRPG programs, then you could try building some indexes. It might take a while to build them on a file of this size, though!
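    For example (just a sketch; HISTORYF and the key columns are placeholders for whatever your slowest queries actually select on):

    -- Sketch: index the columns your long-running queries filter and
    -- order by, so the optimizer can avoid full-file scans.
    CREATE INDEX MYLIB.HISTORYF_IX1
      ON MYLIB.HISTORYF (ACCTNO, TRNDATE);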
    Ben



    • #3
      Re: Performance improvement for large file

      +1.
      Check this file in Index Advisor in iSeries Navigator and build the index and/or EVI suggested there.
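      An EVI takes its own CREATE statement, something like this (sketch; REGION is an invented column, the low-cardinality kind EVIs work best on):

      -- Sketch: EVIs suit selection columns with few distinct values,
      -- e.g. a status or region code.
      CREATE ENCODED VECTOR INDEX MYLIB.HISTORYF_EVI1
        ON MYLIB.HISTORYF (REGION);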
      Philippe



      • #4
        Re: Performance improvement for large file

        You don't need to change all the programs that use the file, just the big-impact ones.

        Look at which programs are running long and see how they are accessing the file. They may be fixable just by using a new logical file with better keys. You might find a few programs reading the file from the top down, where a little keyed access will make life much better. How many of the programs have been used within the last year, or the last month? Extract that and go from there.
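        If your release has the QSYS2 services, a query like this (sketch; MYLIB is a placeholder library) shows when each program was last used:

        -- Sketch: list programs in a library by last-used date.
        SELECT OBJNAME, LAST_USED_TIMESTAMP, DAYS_USED_COUNT
          FROM TABLE(QSYS2.OBJECT_STATISTICS('MYLIB', '*PGM')) AS T
          ORDER BY LAST_USED_TIMESTAMP DESC;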
        Hunting down the future ms. Ex DeadManWalks. *certain restrictions apply



        • #5
          Re: Performance improvement for large file

          Besides creating an index or a better key, how about running N programs (submitted jobs) that each process part of the file?

          For example:
          if the file has 100 records, then you can run:
          - program A, which processes records 1 through 10,
          - program B, which processes records 11 through 20,
          .
          .
          - program J, which processes records 91 through 100.


          I think it will be faster (but I have never tried it; it is just my idea).

          I also think this approach is only suitable if all the records in the file need to be processed, for example if you need to convert one file to another file.

          Please don't mind it if I'm wrong.
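          Each submitted job could pick out its slice by relative record number, something like this (untested sketch; names invented, and only sensible if deleted record slots are not being reused):

          -- Sketch for job A's slice; each job gets a different range.
          SELECT *
            FROM MYLIB.HISTORYF H
            WHERE RRN(H) BETWEEN 1 AND 1000000;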



          • #6
            Re: Performance improvement for large file

            If it were me, I'd break the file into important pieces, linking them back to the information I need. Like...

            AcctNo, Name, Addresses, CSZip, Phones, Contacts, etc ==> Contact Info
            AcctNo, Balances, Totals, Values, Details ==> Detail Info
            AcctNo, Misc Info, Comments, etc ==> Misc Info

            That way you can always pull just the info you need as you need it, and the size of the file you're dealing with is somewhat manageable. This also allows for a "drill down" effect that you don't have to utilize unless the user needs that specific information.
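            In SQL terms, it might look roughly like this (a sketch with invented names; the view at the end is one way to shield existing SQL-based programs from the split):

            CREATE TABLE MYLIB.CONTACTS (
              ACCTNO DECIMAL(9, 0) NOT NULL PRIMARY KEY,
              NAME   CHAR(40),
              PHONE  CHAR(15)
            );

            CREATE TABLE MYLIB.BALANCES (
              ACCTNO  DECIMAL(9, 0) NOT NULL PRIMARY KEY,
              BALANCE DECIMAL(11, 2)
            );

            -- A view that joins the pieces back together, so programs that
            -- need the old wide layout can still read one "record".
            CREATE VIEW MYLIB.HISTORYV AS
              SELECT C.ACCTNO, C.NAME, C.PHONE, B.BALANCE
                FROM MYLIB.CONTACTS C
                JOIN MYLIB.BALANCES B ON B.ACCTNO = C.ACCTNO;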

            But, that's just me...

            I would say that's my 2 cents' worth, but the money here isn't worth that much now.

            -R
