When attempting to duplicate a company, an error message states that the maximum file size has been exceeded. The NTFS-formatted hard disk still has plenty of free space (80 GB), and the machine is equipped with ample RAM (>= 20 GB) and virtual memory as well.

When trying to duplicate or import a large .DAT file (9 GB), the following error message is thrown:

"Error in the file C:\Users\Admin~1.SIN\AppData\Local\Temp\$tmp0030005.$$$ while writing in record=FFFFEB40 Windows error : =

Error Code: 50110 = Maximum file size exceeded Operation will fail"


This is a 32-bit file size limit (< 4 GB). The company copy process uses several temporary tables; once a temporary table exceeds a certain size, it is no longer held in the cache but is written to an ISAM-style file on the hard disk, and that file cannot grow beyond 4 GB.
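As a sanity check, the record value from the error message is consistent with this explanation. Assuming it can be read as a byte offset (an inference on my part, not something the error text states), it lands just below the 32-bit boundary:

```shell
# FFFFEB40 is the record value quoted in the error message; interpreted as a
# byte offset it sits just under the 4 GB (2^32) limit described above.
offset=$(( 0xFFFFEB40 ))
limit=$(( 1 << 32 ))
echo "offset: $offset bytes"   # 4294961984
echo "limit:  $limit bytes"    # 4294967296
echo "gap:    $(( limit - offset )) bytes"   # 5312
```

The write that failed was roughly 5 KB short of the 4 GB ceiling, which matches a file hitting a 32-bit size limit.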


To optimize the .DAT file method of export/import, there are a number of things you can do:
1. Change the TmpRecIdMap and TmpTransactionIdMap tables in the class SysDataImport\classDeclaration to use real tables instead of temporary tables.
Otherwise they may consume all the temporary space available on the machine.

Old code -

    TmpRecIdMap             old2NewRecId;
    TmpRecIdMap             newRecId;
    TmpTableIdMap           tmpTableIdMap;

    // Transaction id
    TmpTransactionIdMap     old2NewCreatedTransactionId;
    TmpTransactionIdMap     old2NewModifiedTransactionId;

Example of new code (using four new tables) -

//    TmpRecIdMap             old2NewRecId;
//    TmpRecIdMap             newRecId;
    DD_RecIdMapOldRecId     old2NewRecId;
    DD_RecIdMapNewRecId     newRecId;
    TmpTableIdMap           tmpTableIdMap;

    // Transaction id
//    TmpTransactionIdMap     old2NewCreatedTransactionId;
//    TmpTransactionIdMap     old2NewModifiedTransactionId;
    DD_TransactionIdMapCreated  old2NewCreatedTransactionId;
    DD_TransactionIdMapModified old2NewModifiedTransactionId;

Download the attached sample .xpo to use as an example of the change.

2. In SQL Server, set the database recovery model to Simple (only while the import is running, to prevent excessive logging).
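Setting the database to "simple" here refers to SQL Server's Simple recovery model. If you prefer to script the change rather than use Management Studio, it is a one-line T-SQL statement; AXDB below is a placeholder for your actual AX database name:

```sql
-- AXDB is a placeholder for the actual AX database name.
-- Switch to the Simple recovery model before starting the import:
ALTER DATABASE [AXDB] SET RECOVERY SIMPLE;

-- After the import completes, switch back and take a full backup,
-- because changing the recovery model breaks the transaction log chain:
ALTER DATABASE [AXDB] SET RECOVERY FULL;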

3. Inside AX, export the data in several separate groups of tables (by configuring multiple definition groups).

4. Keep tables that reference a RecId in another table within the same definition group as the related table; otherwise the RecId relationship will not be maintained.

5. Make sure the database files are sized large enough that they do not have to grow while the import is running.
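Pre-sizing the files can also be scripted in T-SQL. The database name, logical file names, and target sizes below are placeholders for illustration; adjust them to your environment:

```sql
-- Placeholder names and illustrative sizes; check sys.database_files for the
-- actual logical file names, and size the files larger than the expected import.
ALTER DATABASE [AXDB] MODIFY FILE (NAME = AXDB_Data, SIZE = 60GB);
ALTER DATABASE [AXDB] MODIFY FILE (NAME = AXDB_Log,  SIZE = 20GB);
```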

6. Set the following options: "Execute on AOS: Yes", "Do not search for existing records: Yes", "Indexing: Re-index after import", "Use Record ID Compression: Yes".

7. Run the import/export of each definition group on a separate client or even on separate AOSs in parallel.

Using these options maximizes performance and, because the definition groups run as separate imports in parallel, minimizes the risk of a single error bringing the whole process to a halt.
