D365 Finance & Operations and Dynamics AX Forum


D365 F&O Incremental Data Exports in v10

  • 1.  D365 F&O Incremental Data Exports in v10

    Posted 22 days ago
    All...

    I'm interested in any feedback that is available for Incremental Data Exports in D365 F&O v10 to AZURE BYODB.

    Prior to migrating to the current v10.0.x release from 7.2, we had tried unsuccessfully to implement Incremental Exports...  Now we are potentially interested in trialing Incremental Exports under v10...  But before setting forth with a trial and assigning resources, I wanted to gather feedback from others who may have trialed Incremental Exports under v10...  Some of the feedback I've seen in blogs is less than positive...

    In our environment, we have a number of Data Entities that are in excess of 5-6 million rows and growing, some of them custom entities, that are exported nightly to support third-party interfaces and reporting in Power BI and SSRS...  It's certainly not necessary to export all historical data, so we've retooled the process by writing SQL Stored Procedures to append/merge data into an Azure BYODB (AX_PROD) based on a minimized Data Entity export.

    Essentially we're exporting 2 days of Entity Data (just to cover the UTC/ET time-zone extract windows) using a defined filter, mostly on a ModifiedDateTime field with criteria of (DayRange(-2,0)), and subsequently executing a SQL Agent job that calls a set of SQL Stored Procedures...  This is working well, and the full set of jobs (minimized exports PLUS stored procedure execution) certainly takes much less time than some of the Full Data Entity exports...
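    For readers curious what that append/merge step looks like, here is a minimal T-SQL sketch.  The table names (dbo.InventTransExport as the 2-day landing table, dbo.InventTransHistory as the accumulating table) and the columns are hypothetical illustrations, not Joe's actual schema:

    ```sql
    -- Hypothetical merge procedure, called nightly by the SQL Agent job after
    -- the minimized (2-day) entity export lands in dbo.InventTransExport.
    CREATE OR ALTER PROCEDURE dbo.usp_MergeInventTransExport
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.InventTransHistory AS tgt
        USING dbo.InventTransExport  AS src          -- 2-day minimized export
            ON tgt.RECID = src.RECID                 -- the entity's unique key
        WHEN MATCHED AND src.MODIFIEDDATETIME > tgt.MODIFIEDDATETIME THEN
            UPDATE SET tgt.QTY              = src.QTY,
                       tgt.MODIFIEDDATETIME = src.MODIFIEDDATETIME
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (RECID, QTY, MODIFIEDDATETIME)
            VALUES (src.RECID, src.QTY, src.MODIFIEDDATETIME);
    END;
    ```

    The MERGE keys on the entity's unique identifier, so re-exporting the same 2-day window is idempotent: overlapping rows are updated rather than duplicated.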

    The concern is that this approach just adds another layer of development, support, and maintenance to the environment, so I'm more curious than anxious to make a move...

    Thanks in advance to all-
    Joe MacPherson

    ------------------------------
    Joe MacPherson
    Coldwater Creek
    Hingham MA
    ------------------------------


  • 2.  RE: D365 F&O Incremental Data Exports in v10

    SILVER CONTRIBUTOR
    Posted 22 days ago
    Joe,
    I've set up incremental exports to BYOD where possible.  There are a couple of standard entities that don't seem to like incremental pushes very much (Released products V2, Sites, and Warehouses are the ones I've come across), even when change tracking is turned on.  I think this tends to be due to the design of the entity.  I ended up building a number of my own entities to get around those issues.  One of the key issues I have noticed is when virtual fields are used: this tends to mean you can't do incremental exports or skip staging.  I must admit we don't have the volume that you seem to, so it's been running rather smoothly.

    I recommend skipping staging as well.  I found that when staging was used, the staging tables weren't clearing out, and there are no automated jobs that can be scheduled to clear them.  If you don't skip staging, there is extra IO and DB space overhead.  When we did refreshes to TEST, our DB size was growing rather large due to the data in the staging tables.

    I did extend the deletion of the history of data management jobs.  I have a function that calls the same logic as the delete function, wrapped up in a class that extends RunBaseBatch.  It's a particularly handy little function, especially as any staging data for a job seems to be deleted along with its history when you delete it.  Another trap to watch out for is the start and end time on the project history: it's not reflective of the overall runtime of the job.  Microsoft doesn't seem to want to fix that little bug, so when testing how long a project takes to run, you essentially have to watch the job run and time it yourself.

    When I ensured that my entities didn't have virtual fields and I skipped staging, I found that my exports were all rather smooth.  It's worth some experimentation in a DEV environment to see how you go.  Incremental exports have some other tricks to watch out for.  If you do an incremental push in a project, then remove the entity and re-add it, the system will still think it has done the initial full push.  So if you cleared the data out of your BYOD, you will lose it.  You essentially have to delete the project and recreate it to get that initial full push happening again (or find some other creative way).  There are definitely some things to watch out for, but if you can get it working well for you, I think it would be well worth removing the extra layer of development that needs to be done.

    It's been a journey in getting incremental pushes working the way I want them, but a good learning experience.  I hope you find this information useful and helpful.

    Good luck.

    Regards,
    Andrew

    ------------------------------
    Andrew Friebel
    Gale Pacific Limited
    ------------------------------



  • 3.  RE: D365 F&O Incremental Data Exports in v10

    SILVER CONTRIBUTOR
    Posted 21 days ago

    Hi Joe, we've been using BYODB export for 1.5 years now.  At the beginning it was on 7.1 (PU?), and for about a month now on v10 (PU24).  For us it's working well.  We encountered issues in the early days, learning what works and what doesn't, but since PU21 it's been pretty stable.  To highlight what Andrew already said about getting better performance, and more:
    - Don't use virtual fields.  You can use computed fields if you need to.
    - Don't use the postLoad() method.  Like virtual fields, this forces the export to go through staging, and that is a major slowdown.
    - Build your own entities: this helps you extract only what you need.  Standard entities are not always optimized for export and contain many data sources (that you don't necessarily need).
    - Full pushes are slow for two reasons.  The obvious one: they push all the data every time.  The sneaky one: if your table is big enough (it used to happen on tables with 2 million rows), the DELETE command on the BYODB table can take so long that it makes the export time out (1-hour limit).  So we used to truncate the table manually before doing a full push.
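    To illustrate the truncate workaround (table name hypothetical): TRUNCATE deallocates whole pages and is minimally logged, whereas the DELETE the export issues is fully logged row by row, which is what blows past the 1-hour window on large tables.

    ```sql
    -- Run against the BYOD database (e.g. from SSMS) just before a full push,
    -- so the export's slow DELETE finds an already-empty table.
    TRUNCATE TABLE dbo.InventTransExport;
    ```

    Note that TRUNCATE requires ALTER permission on the table and cannot be used if the table is referenced by a foreign key, in which case DELETE remains the only option.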

    Andrew, you don't need to delete the project.  Just change the default refresh type to "Full push only"; when it's completed, switch it back to "Incremental".

    For more info, here's a presentation I made on our experience with BYOD for our chapter :
    https://www.axug.com/HigherLogic/System/DownloadDocumentFile.ashx?DocumentFileKey=59ff18c4-c7f2-af6d-c868-e43f4f0e69f8&forceDialog=0

    Let us know if you have more questions



    ------------------------------
    Steeve Gilbert
    Software Development Supervisor
    Boa-Franc S.E.N.C.
    QC, Canada
    ------------------------------



  • 4.  RE: D365 F&O Incremental Data Exports in v10

    Posted 18 days ago
    Installing SQL Server Always On will already give you this functionality

    ------------------------------
    michael dooley
    LV
    ------------------------------



  • 5.  RE: D365 F&O Incremental Data Exports in v10

    TOP CONTRIBUTOR
    Posted 16 days ago
    Could you provide some context to this comment?  I have no idea how it relates to incremental updates to a BYOD database from a D365FO environment.

    ------------------------------
    Mark Schurmann
    Accounting Systems Manager
    Automobile Protection Corp
    Norcross GA
    ------------------------------



  • 6.  RE: D365 F&O Incremental Data Exports in v10

    TOP CONTRIBUTOR
    Posted 16 days ago
    Sorry, my previous comment is in reference to the Sql Always On post.

    ------------------------------
    Mark Schurmann
    Accounting Systems Manager
    Automobile Protection Corp
    Norcross GA
    ------------------------------



  • 7.  RE: D365 F&O Incremental Data Exports in v10

    Posted 16 days ago
    Installing SQL Server in Always On mode gives you the full main AX D365 DB plus a full, real-time, read-only copy of it.
    This makes BYOD and all its complications irrelevant: just use the copy for reporting (and other) requirements.

    ------------------------------
    michael dooley
    LV
    ------------------------------



  • 8.  RE: D365 F&O Incremental Data Exports in v10

    Posted 16 days ago
    ~michael dooley...

    I just want to clarify your comment...

    Correct me if I'm wrong, but your response, "Installing SQL Server in always-on mode gives you a full main AX D365 DB ...", may be relevant only for 'on-premise' installations and does not impact those of us who are SaaS users.

    ------------------------------
    Joe MacPherson
    Coldwater Creek
    Hingham MA
    ------------------------------



  • 9.  RE: D365 F&O Incremental Data Exports in v10

    TOP CONTRIBUTOR
    Posted 15 days ago
    I agree.  I think the only reason anyone is chasing the BYOD solution is that Microsoft will not give SaaS customers access to the AxDB.

    BTW, I wouldn't hold my breath waiting for Microsoft to respond to the many requests and votes for a read-only copy.  I raised this with the lead architect at last year's D365UG Summit during a Q&A session, and his reason for not doing it was nonsense.  I think they have visions of the Common Data Service replacing the SQL databases underlying their products, and any customer reliance on the old models will impede progress toward that goal.

    ------------------------------
    Mark Schurmann
    Accounting Systems Manager
    Automobile Protection Corp
    Norcross GA
    ------------------------------



  • 10.  RE: D365 F&O Incremental Data Exports in v10

    Posted 10 days ago
    Yes, you are correct - sorry, I should have been clearer.

    ------------------------------
    michael dooley
    LV
    ------------------------------



  • 11.  RE: D365 F&O Incremental Data Exports in v10

    Posted 2 days ago
    Steeve, I'm intrigued by your statement about truncating the table manually before doing a full push. How and where did you do this? I do this in Azure Data Factory as we're moving the data around, but it sounds like you're able to do this in DMF somehow?

    ------------------------------
    Paul Tansey
    WhiteWater West Industries, Ltd
    Richmond BC
    ------------------------------



  • 12.  RE: D365 F&O Incremental Data Exports in v10

    SILVER CONTRIBUTOR
    Posted 2 days ago
    Paul, no, it's not in DMF.  We would truncate with a SQL command from SSMS each time.

    ------------------------------
    Steeve Gilbert
    Software Development Supervisor
    Boa-Franc S.E.N.C.
    QC, Canada
    ------------------------------



  • 13.  RE: D365 F&O Incremental Data Exports in v10

    Posted 2 days ago

    Thanks for the clarification, @Steeve Gilbert. I'd hoped we could use TRUNCATE rather than DELETE in DMF. Every little bit of performance helps in that environment!
    fyi, @Graham Kemp



    ------------------------------
    Paul Tansey
    WhiteWater West Industries, Ltd
    Richmond BC
    ------------------------------



  • 14.  RE: D365 F&O Incremental Data Exports in v10

    SILVER CONTRIBUTOR
    Posted 21 days ago
    I encourage you all to vote for this idea about having a read-only copy of the DB for reporting.  That would save us from all the BYODB work!
    https://experience.dynamics.com/ideas/idea/?ideaid=a0d34018-04d6-e711-80c0-00155d7cd0b4

    ------------------------------
    Steeve Gilbert
    Software Development Supervisor
    Boa-Franc S.E.N.C.
    QC, Canada
    ------------------------------



  • 15.  RE: D365 F&O Incremental Data Exports in v10

    Posted 21 days ago
    Hi Joe

    Very interesting to hear about your experiences with BYOD.

    We're also using BYOD, mostly on standard entities with incremental push activated whenever possible and where the entity allows it.
    For us it seems to be running rather smoothly so far; however, we are still learning, since our D365 environment is not yet in production and the amount of data is manageable.
    We started with Version 8 PU 15.  Now we are on Version 10 PU 27, and I have to say a lot of bugs with specific entities, as well as with the delete mechanism, have been fixed since.

    The main issues we faced so far:

    - Data quality issues, where duplicates in the standard entities prevented BYOD from exporting data (you have to clean up your data first or set up a corresponding filter)
    - We had some cases where changes were not reflected in our BYOD database... we had to run a full export to fix the issue. We're monitoring these cases closely because it's a big risk
    - Timeouts, as mentioned by Andrew & Steeve
    - We're running a multi-client D365 environment and have BYOD jobs for each client. You have to make sure to set up the BYOD with the correct client context, otherwise the wrong data will be exported
    - We wanted to export BYOD data directly to an Azure SQL Managed Instance (since our data warehouse is located there), which is technically not yet possible. So we have to export the data to the Azure SQL DB first.

    I also heard Microsoft is promoting the mechanism of the Entity Store export to an Azure Data Lake V2.0 (which is still in preview, I believe). However, I would still recommend the BYOD mechanism at the moment.

    Bottom line, we trust Microsoft to keep improving the BYOD functionality and keep it stable. There is still a steep learning curve and not much knowledge out on the web.

    Good luck, and looking forward to hearing more about your BYOD experiences.

    Andreas



    ------------------------------
    Andreas Fitzi
    SIGVARIS
    ------------------------------



  • 16.  RE: D365 F&O Incremental Data Exports in v10

    Posted 15 days ago
    Edited by Neill Riordan 4 days ago

    The currently accepted approach with BYODB is to make a copy of the entity in scope and remove the fields not needed in the BYODB, in order to reduce the transmission of data that isn't required.

    There are some key considerations with this approach, and they all revolve around One Version.  The main one is that you are adding code to F&O, and of course added code can mean extra effort when upgrades come around on their cycle.  However, it does protect you to a certain extent from Microsoft changing their entities to V2 or V3 and that change hitting you directly.

     

    So when looking at some data to move to the BYODB, the following steps should be considered:

    • Is there a standard entity that could do the job? If yes, copy it and drop the unnecessary fields
    • Does the standard entity you are copying have a primary key? If not, add one to your copy (InventTrans, for example)
    • Will this entity be passing a lot of data? Consider increasing the timeout for the process
    • Can you use change tracking to reduce load on the system? Entities based on views can't do this
    • Can you skip staging? Staging is only needed for virtual columns

     

    Then there are some general practices you should follow:

    • Always make sure the scope of the data you are pushing is the minimum you need
    • The BYODB needs to be sized correctly, and you need to make sure the SQL tier you select has the resiliency the business requires.  Consider running a PowerShell script to turn the SQL tier up during large movements of data and then turn it down again
    • Time your batch schedule to stay out of peak hours if possible, to lower the load on the system
    • Use shadow tables in your BYODB to avoid locks and to keep the overall solution performant
    • Clean up the staging tables regularly in Dynamics 365 for Finance and Operations
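    On the scale-up/scale-down point: besides PowerShell, the tier change can also be issued as plain T-SQL against an Azure SQL database.  The database name and tiers below are examples only, and the operation is asynchronous, so allow it to complete before starting the push:

    ```sql
    -- Run while connected to the logical server's master database.
    -- Scale the BYOD database up before a large data movement...
    ALTER DATABASE [AX_PROD] MODIFY (EDITION = 'Premium',  SERVICE_OBJECTIVE = 'P2');

    -- ...and back down to the everyday tier afterwards.
    ALTER DATABASE [AX_PROD] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3');
    ```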

    In terms of the Data Lake through CDS v2, you need to make sure you are mature enough as an organisation, with the skills and the cost support, to leverage information from the data lake.




    ------------------------------
    Neill Riordan
    Dynamics 365 Solution Architect
    IBM
    ------------------------------


