Loading new and updated records with incremental load

If your app contains a large amount of data from database sources that are continuously updated, reloading the entire data set can be time consuming. In this case, you want to load only new or changed records from the database. All other data should already be available in the app. Incremental load, using QVD files, makes it possible to achieve this.

The basic process is described below; a load script sketch of the pattern follows the list:

  1. Load the new or updated data from the database source table.

    This is a slow process, but only a limited number of records are loaded.

  2. Load data that is already available in the app from the QVD file.

    Many records are loaded, but this is a much faster process.

  3. Create a new QVD file.

    This is the file you will use the next time you do an incremental load.

  4. Repeat the procedure for every table loaded.
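As a rough sketch, the pattern for a single table could look like the script below. It combines the steps above in one place; the details differ by case, as the sections that follow show. The table, field, and file names are the same placeholders used in the examples on this page, and the handling of the LastExecTime variable is an assumption for illustration.

// 1. Load only the new or updated rows from the source (slow, but few rows).
QV_Table:
SQL SELECT PrimaryKey, X, Y FROM DB_TABLE
WHERE ModificationTime >= #$(LastExecTime)#;

// 2. Add the rows that are already in the app from the previous QVD (fast),
// skipping any key that was just reloaded from the database.
Concatenate LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD]
WHERE NOT Exists(PrimaryKey);

// 3. Write a new QVD for the next incremental load to use.
STORE QV_Table INTO [lib://DataFiles/File.QVD];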

The following examples show cases where incremental load is used. However, a more complex solution might be necessary, depending on the source database structure and mode of operation.

  • Insert only (no update or delete)
  • Insert and update (no delete)
  • Insert, update and delete

You can read QVD files in either optimized mode or standard mode. (The method employed is automatically selected
by the Qlik Sense engine depending on the complexity of the operation.)
Optimized mode is about 10 times faster than standard mode, or about 100 times faster than loading the database
in the ordinary fashion.
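As a rough illustration (the table label History is a placeholder), a load that takes the QVD fields as-is can typically run in optimized mode, while adding a per-record condition, as the insert-and-update examples below do with WHERE NOT Exists(PrimaryKey), drops the read to standard mode:

// This form can typically be read in optimized mode:
// the fields are taken from the QVD without transformation.
History:
LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD];

// A per-record condition forces standard mode instead, which is still much
// faster than reloading the database:
// LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD] WHERE NOT Exists(PrimaryKey);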

For more information, see Working with QVD files.

Insert only (no update or delete)

If the data resides in a database other than a simple log file, the append approach will not work. However, the problem can still be solved with
a minimum amount of extra work. The following conditions apply:

  • The data source can be any database.

  • Qlik Sense loads records inserted
    in the database after the last script execution.

  • A ModificationTime field (or similar)
    is required for Qlik Sense to recognize which records are new.

Example:

QV_Table:
SQL SELECT PrimaryKey, X, Y FROM DB_TABLE
WHERE ModificationTime >= #$(LastExecTime)#
AND ModificationTime < #$(BeginningThisExecTime)#;

Concatenate LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD];

STORE QV_Table INTO [lib://DataFiles/File.QVD];

 

The hash signs in the SQL WHERE clause define the beginning and end of a date. Check your database manual for the correct date syntax for your database.
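The example assumes that LastExecTime and BeginningThisExecTime already hold timestamp values. How they are maintained is not shown in the original example; one possible sketch (an assumption, not part of the documented script) is to capture the start time before the SQL SELECT and advance LastExecTime only once the execution has succeeded:

// Capture the start of this execution before querying the database, so rows
// modified while the script is running are picked up by the next execution.
Let BeginningThisExecTime = Now();

// ... the incremental load and STORE shown above run here ...

// Move the marker forward only after a successful execution.
Let LastExecTime = BeginningThisExecTime;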

Insert and update (no delete)

The next case is applicable when data in previously loaded records may have changed between script executions. The following conditions apply:

  • The data source can be any database.

  • Qlik Sense loads records inserted
    into the database or updated in the database after the last script execution.

  • A ModificationTime field (or similar)
    is required for Qlik Sense to recognize which records are new.

  • A primary key field is required
    for Qlik Sense to sort out updated records from the QVD file.

  • This solution will force the reading
    of the QVD file to standard mode (rather than optimized), which is still considerably faster than loading the entire
    database.

Example:

QV_Table:
SQL SELECT PrimaryKey, X, Y FROM DB_TABLE
WHERE ModificationTime >= #$(LastExecTime)#;

Concatenate LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD]
WHERE NOT Exists(PrimaryKey);

STORE QV_Table INTO [lib://DataFiles/File.QVD];
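The order of the two loads matters: the rows from the database are loaded first, so WHERE NOT Exists(PrimaryKey) then drops the older QVD copies of any keys that were just reloaded, leaving the updated rows in place. A small self-contained sketch with inline data (the values are invented purely for illustration) shows the effect:

// Stands in for the database: key 1 was updated, key 3 is new.
QV_Table:
LOAD * INLINE [
PrimaryKey, X
1, updated value
3, new row
];

// Stands in for the old QVD: only key 2 passes the filter, because
// keys 1 and 3 already exist in the table loaded above.
Concatenate LOAD * INLINE [
PrimaryKey, X
1, old value
2, unchanged row
] WHERE NOT Exists(PrimaryKey);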

Insert, update and delete

The most difficult case to handle is when records are actually deleted from the source database between script executions. The following conditions apply:

  • The data source can be any database.

  • Qlik Sense loads records inserted
    into the database or updated in the database after the last script execution.

  • Qlik Sense removes records deleted
    from the database after the last script execution.

  • A ModificationTime field (or similar)
    is required for Qlik Sense to recognize which records are new.

  • A primary key field is required
    for Qlik Sense to sort out updated records from the QVD file.

  • This solution will force the reading
    of the QVD file to standard mode (rather than optimized), which is still considerably faster than loading the entire database.

Example:

Let ThisExecTime = Now();

QV_Table:
SQL SELECT PrimaryKey, X, Y FROM DB_TABLE
WHERE ModificationTime >= #$(LastExecTime)#
AND ModificationTime < #$(ThisExecTime)#;

Concatenate LOAD PrimaryKey, X, Y FROM [lib://DataFiles/File.QVD]
WHERE NOT Exists(PrimaryKey);

Inner Join SQL SELECT PrimaryKey FROM DB_TABLE;

If ScriptErrorCount = 0 then
STORE QV_Table INTO [lib://DataFiles/File.QVD];
Let LastExecTime = ThisExecTime;
End If
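The Inner Join against the primary keys still present in the database removes any rows whose keys have been deleted from the source since the last execution. One detail the example leaves open is what LastExecTime contains the very first time the script runs; a possible sketch (the fallback value and its handling are assumptions, not part of the original example) is to default it to a point far enough in the past that everything is loaded once:

// If LastExecTime has never been set, fall back to a very early timestamp
// so that the first execution loads the full table from the database.
If Len('$(LastExecTime)') = 0 then
Let LastExecTime = Timestamp(MakeDate(1900, 1, 1), 'YYYY-MM-DD hh:mm:ss');
End If

This check would sit at the top of the script, before LastExecTime is first used in the WHERE clause.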