Have you ever wondered if there is a better way to import large Excel files into D365 Finance and Operations? Using the standard Data Management workspace to upload files is relatively easy, but depending on the complexity of the import, it can take hours to get your data into D365. This is especially true for imports that touch many tables in D365, such as vendors, customers, or items. Even worse, you may end up with a variety of error messages if even one field contains a typo.
After far too many slow, 45-minute imports of released products (entity name “Released Products V2”), I decided there must be a way to expedite the process. This is when I stumbled upon the Entity import execution parameters. This is an entity-specific setting that changes the way D365 handles uploads to the given entity.
Setting up Configuration Parameters:
- Navigate to the Data Management Workspace
- Click on ‘Framework Parameters’
- Go to the ‘Entity Settings’ tab
- Select ‘Configure entity execution parameters’
- Within the Entity Import Execution Parameters table, you should find 3 fields:
- a. Entity – This is a dropdown of all available data entities in the system. Select one you want to speed up.
- Unfortunately, these settings cannot be applied in mass to all entities or a group of entities. Each data entity must be set up individually.
- b. Import threshold record count – This field defines how many records (lines) D365 should upload per ‘Task’ (described below)
- c. Import task count – This field defines the number of tasks that should run simultaneously.
- You can think of each task as an individual upload in itself.
- Most environments can only support up to 12 tasks simultaneously
- A task count of 7-9 is usually a good balance that will not consume excessive system resources
- In the above screenshot, you can see an example of how I have configured the released product entity.
- For example, my most recent released product upload contained over 2,200 products and took 45 minutes on average.
- I set my task count to 8 and the record count at 300
- Effectively, D365 splits my upload file into chunks of 300 records and runs up to 8 of those chunks simultaneously
- Run the import according to your standard procedure.
- With these import execution parameters, I was able to get the upload run time down to 15 minutes!
- Keep in mind that these settings may need to be tweaked depending on:
- Upload size – Larger file sizes may require a higher record count per task
- Import entity – Some entities only validate against one or two tables in D365, whereas an entity like released products touches more than 10
- System resources – Setting a task count higher than 9 may have an impact on system performance. If users need to work in the environment while an upload is running, try lowering the task count.
- There is no one-size-fits-all approach to these settings. The first time I set up the released product settings, the upload actually took longer than before. Trial and error is your friend: throw some settings at the wall and see what sticks.
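To build intuition for how the two parameters interact, here is a minimal sketch of the batching arithmetic. The function name and the calculation are illustrative assumptions on my part; the actual task scheduling is internal to D365 and not exposed as code.

```python
import math

def plan_import(total_records, threshold, task_count):
    """Illustrative only: estimate how an upload is divided given the
    'Import threshold record count' and 'Import task count' settings.

    Returns (batches, waves), where a batch is one chunk of up to
    `threshold` records and a wave is one round of `task_count`
    batches running simultaneously.
    """
    batches = math.ceil(total_records / threshold)
    waves = math.ceil(batches / task_count)
    return batches, waves

# The released-product example from this post:
# 2,200 records, threshold of 300, task count of 8.
batches, waves = plan_import(2200, 300, 8)
print(batches, waves)  # 8 batches, all fitting in a single parallel wave
```

Under these assumptions, the 2,200-record file divides into 8 batches of roughly 300 records, and with a task count of 8 they can all run at once. If the threshold were dropped to 100, the same file would need 22 batches across 3 waves, which is why a threshold that is too low can actually slow an import down.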
Thanks for reading! You can find even more data management related blogs here, as well as many other D365 for Finance and Operations topics.