ibmi-brunch-learn


Bad data in file


  • Bad data in file

    While querying a file to locate bad data, I found a numeric field used for dates that contains '+++++++++++' in some records. I want to update this bad data to zero, or convert it to correct values if possible.

  • #2
    But I am also not able to locate the bad data using a query; the query gives an error like: Selection error involving field C9CLSDT.



    • #3
      Close Date
      0
      0
      5,092,018
      0
      0
      9,022,019
      9,022,019
      5,092,018
      +++++++++++
      +++++++++++
      +++++++++++
      9,022,019
      9,022,019
      +++++++++++
      +++++++++++
      5,092,018
      +++++++++++



      • #4
        I can tell you how to do it in RPG, but in SQL maybe you can cast the value to hex and look at it that way.
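        As a quick sketch of that hex approach (hypothetical names - substitute your own file for MYFILE; C9CLSDT is the field mentioned in #2), RRN() gives the relative record number of each row so you can find the offenders:
        Code:
        -- Valid zoned-decimal digit bytes show up as X'F0'-X'F9';
        -- bytes like X'4E' ('+') or X'40' (blank) indicate bad data.
        Select RRN(t) As Rrn, Hex(C9CLSDT) As HexVal
        From MYFILE t;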



        • #5
          That's a little tricky, since SQL does not allow invalid numeric values to be entered into SQL tables (contrary to DDS-described physical files).
          But you can use the following trick:
          1. Run your SELECT * statement in ACS (Access Client Solutions - Run SQL Scripts) and make sure all rows are read.
          2. Each invalid numeric value causes a CPF5035 joblog entry.
          3. You can then search the joblog (for the current job) for the CPF5035 messages with the following SQL statement:
          Code:
          Select * From Table(QSYS2.Joblog_Info('*')) x
          Where Message_Id = 'CPF5035';
          Executing the following SQL statement returns the file name, library name, field name, and relative record number of each invalid numeric value:
          Code:
             Select Cast(Trim(Substr(Message_Tokens,  1, 10)) as VarChar(10))  File,
                    Cast(Trim(Substr(Message_Tokens, 21, 10)) as VarChar(10))  Library,
                    Cast(Trim(Substr(Message_Tokens, 31, 10)) as VarChar(10))  Field,
                    BigInt(Substr(Hex(Message_Tokens), 81, 8))                 RRN,
                    Cast(Trim(Substr(Message_Second_Level_Text, 57))
                                                              as VarChar(256)) Text
             From Table(QSYS2.JobLog_Info('*')) x
             Where Message_Id = 'CPF5035';
          ... and now you know exactly where your invalid data is and can update it directly.





          • JonBoy commented:
            Given this amount of work, isn't it easier just to write it using native I/O in the first place? That way you can correct the entry as soon as it is detected.

        • #6
          EDIT: I had an old page open; it looks like this was answered clearly.
          Last edited by Ghost +; May 5, 2020, 06:09 AM. Reason: Deleted my comment :) No value added!



          • #7
            @JonBoy: Sure, you need a little program, but ... we just create a view over the SELECT statement, then loop through the view and update the appropriate columns with dynamic SQL
            ... and voilà, you have a small tool that can revise any physical file.
            ... maybe 20 statements



            • JonBoy commented:
              Maybe - but I've found that in these cases reporting is needed. I know we see these things differently, but to me this kind of thing is in the category of "The answer is SQL ... now what was the problem?" SQL is a fabulous tool - I use it all the time - but it wasn't designed for this kind of task and just feels forced to me.

          • #8
            Wow, thanks both. I'm finally able to locate and update these bad data records:
            Code:
            Update file Set field = 0
            Where Hex(field) = '4040404040404040';



            • #9
              But this query will capture only one specific kind of bad data.
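              One way to generalize the check (a sketch, assuming C9CLSDT is a zoned-decimal field and REGEXP_LIKE is available on your release): in a valid zoned value every byte carries an F zone nibble, except the last byte, whose zone may also be the sign (C, D or F). Anything else is bad data, whatever the garbage character. Hypothetical names (MYFILE / C9CLSDT):
              Code:
              -- Flags every row whose internal bytes do not match the
              -- zoned-decimal pattern, regardless of which characters
              -- (plus signs, blanks, ...) were stuffed into the field.
              Select RRN(t) As Rrn, Hex(C9CLSDT) As HexVal
              From MYFILE t
              Where Not Regexp_Like(Hex(C9CLSDT), '^(F[0-9])*[CDF][0-9]$');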



              • #10
                Of course this kind of error requires a lot of analysis.

                But I think you can use the good old CPYF for this.
                If you try to copy the file containing the invalid data to another, temporary file,
                CPYF will react to the invalid data. You can specify that errors are allowed to happen and that offending records should
                be skipped and listed in character and hex format in a spool file.
                Then you can investigate what is wrong and correct it.
                Repeat this until no errors occur, and you have corrected all invalid data - at least into data in a valid format.

                CPYF FROMFILE(*LIBL/myBadFile) TOFILE(QTEMP/XXX) MBROPT(*ADD) CRTFILE(*YES) OUTFMT(*HEX) PRINT(*ERROR) ERRLVL(*NOMAX)
