Picture opening Dynamics 365 on a Monday morning and every form loading in under a second. Views render instantly. Workflows fire without delay. Your sales and service teams zip through their work with no waiting on slow pages or dropped workflows. That's the kind of leverage you can realize with Dynamics 365 fine-tuning done just right.
In this blog, you'll learn how to improve productivity with Dynamics 365.
We'll dig into form design, data model design, and Power Platform performance tuning.
You’ll discover how to spot bottlenecks, implement best practices, and monitor performance over time.
By the time we're done, you'll have a blueprint for turning Dynamics 365 into a high-velocity business engine, and you'll know when to reach out to DAX Software Solutions for assistance via the DAX software home page.
Before you change any settings, you need solid data on how Dynamics 365 performs as-is. A clear baseline lets you quantify the impact of every change you make later.
Here is how you can establish your baseline:
| Scenario | Avg. Form Load (s) | 95th %-ile Script Time (ms) | Avg. Workflow Time (ms) | Avg. Active Queue Length |
| --- | --- | --- | --- | --- |
| Open Account Form | 2.8 | 450 | N/A | N/A |
| Save Opportunity (plugin) | N/A | N/A | 1,200 | N/A |
| Run Daily Invoice Workflow | N/A | N/A | 3,500 | 75 |
| Bulk Update Orders via API | 1.5 | 200 | N/A | 30 |
Pro Tip: Collect these metrics for seven days straight to cover workload fluctuations. Compare peak-hour vs. off-peak variation.
You'll be able to take this fully loaded performance baseline and measure exactly how much each optimization below pays off.
One of the most frequent reasons for poor Dynamics 365 performance is bloated, busy forms. Every additional field, script, or control adds weight to the page. Here’s how to simplify your forms to create a smooth user experience:
Why it works: Each field and tab contributes its own DOM elements and data requests. Reducing the initial payload significantly cuts form-load time.
function onLoad(executionContext) {
    // Run invoice-specific initialization only when the Invoice form is displayed.
    var formName = executionContext.getFormContext()
        .ui.formSelector
        .getCurrentItem()
        .getLabel();
    if (formName !== "Invoice") return;
    initializeInvoiceScripts(executionContext);
}
Why it works: Excessive or unnecessary scripts block rendering. Consolidation and conditional loading reduce client CPU usage and improve load times.
Why it helps: Each ribbon command adds client-side metadata and event handlers. Stripping out unused buttons reduces initial page processing.
function onLoad(executionContext) {
    var formContext = executionContext.getFormContext();
    var lookupField = formContext.getControl("new_relatedentityid");
    if (!lookupField) return;
    // Constrain the lookup to the entity types users actually need,
    // so the pre-search query stays small.
    lookupField.addPreSearch(function () {
        lookupField.setLookupTypes(["account", "contact"]);
    });
}
Why it works: Deferring data calls until user interaction reduces the amount of data transferred on initial load, making the form feel faster.
Scope your plugin steps:
In the Plugin Registration Tool, register your plugin only for the required message (Create, Update, Delete) and pipeline stage (Pre-Validation, Pre-Operation, Post-Operation).
Don't register for every stage or attribute:
Doing so makes your code execute far more often than it needs to.
Filter by entity and attribute:
Set filtering attributes so the plugin runs only when specific fields are updated (for example, only when Status changes).
Example configuration:
If your plugin only recalculates a discount when the Price or Quantity of an Order changes, register it for those attributes alone rather than for all updates.
Benefit:
This reduces the number of plugin executions, which in turn lowers CPU and database load and improves overall system performance.
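To complement registration-time filtering, a defensive guard inside Execute can exit early when none of the relevant fields changed. Here is a minimal C# sketch of that pattern; the entity and attribute names are illustrative assumptions, not taken from the original example:

using System;
using Microsoft.Xrm.Sdk;

public class DiscountRecalculationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // The Target entity carries only the attributes being changed.
        if (!context.InputParameters.Contains("Target") ||
            !(context.InputParameters["Target"] is Entity target))
            return;

        // Early exit: skip the work unless a relevant field changed.
        if (!target.Contains("priceperunit") && !target.Contains("quantity"))
            return;

        // ... discount recalculation logic would run here ...
    }
}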
What it is:
The ExecuteMultipleRequest message lets you bundle a set of create, update, and delete operations and execute them in a single request.
How to implement:
Gather OrganizationRequest objects (like CreateRequest, UpdateRequest) in your plugin code.
Add them to the Requests collection of an ExecuteMultipleRequest.
Pass the batch request to the organization service.
Key settings:
ContinueOnError: whether the batch stops at the first error or keeps going.
ReturnResponses: whether individual responses are returned for additional logic.
Advantage:
Minimizes network round-trips and database hits, yielding increased throughput, particularly for bulk transactions.
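Here is a minimal C# sketch of batching creates this way; the helper name is an illustrative assumption, and note that the platform caps a single batch at 1,000 requests:

using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class BatchHelper
{
    public static void BulkCreate(IOrganizationService service, IEnumerable<Entity> records)
    {
        var batch = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true,   // keep processing past individual failures
                ReturnResponses = false   // skip per-request responses we don't need
            },
            Requests = new OrganizationRequestCollection()
        };

        foreach (var record in records)
            batch.Requests.Add(new CreateRequest { Target = record });

        // One round-trip instead of one per record.
        service.Execute(batch);
    }
}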
Enable the Profiler:
In the Plugin Registration Tool, right-click the plugin step and select Start Profiling.
Capture execution:
Perform the actions in Dynamics 365 that cause your plugin to execute. The profiler captures the entire execution context.
Analyze the snapshot:
Download the snapshot file and replay it in the registration tool. Review the full call stack and the timing of each method and call.
Identify slow code paths:
Search for time-consuming loops and repeated service calls within loops.
Refactor inefficient loops:
Push service calls out of loops if you can.
Exit early when a condition means there is nothing to process.
Get all the data in a single fetch (QueryExpression or FetchXML) instead of nesting queries, as sketched below.
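For instance, here is a minimal C# sketch that replaces a per-record Retrieve inside a loop with one QueryExpression fetch; the entity, attribute, and helper names are illustrative assumptions:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class OrderDetailReader
{
    public static EntityCollection GetDetails(IOrganizationService service, Guid orderId)
    {
        // One query retrieves every related row up front...
        var query = new QueryExpression("salesorderdetail")
        {
            ColumnSet = new ColumnSet("priceperunit", "quantity")
        };
        query.Criteria.AddCondition("salesorderid", ConditionOperator.Equal, orderId);

        // ...so the calling loop works on in-memory results,
        // with no service calls per iteration.
        return service.RetrieveMultiple(query);
    }
}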
Benefit:
Profiling uncovers bottlenecks in your plugin code that were previously hidden, so you can apply targeted optimizations or rewrite the slowest paths.
One retail customer faced frequent account save failures and an average save time of 2.5 seconds as a synchronous plugin validated customer addresses against a third-party API. By:
Shifting the validation logic to a separate module outside the plugin.
Developing an Azure Function to handle validation with async support.
Calling the function on demand using a Power Automate flow after the account record is added.
They achieved:
A 95% reduction in account save errors (the plugin no longer blocked saves when the external API was unresponsive).
A 40% cut in save time (from an average of 2.5 seconds down to 1.5 seconds).
With these strategies (minimizing real-time workflows, tightening plugin registration, batching requests, and profiling your code), your server-side logic stays slim and lean, meaning faster, more responsive Dynamics 365 experiences for all your users.
A well-designed schema and the right indexes are essential for efficient queries and data operations in Dynamics 365. Here is how to organize and tune your data model for maximum performance:
Why it matters
Fetching thousands of rows at a time strains both the server and the client.
How to configure
FetchXML/QueryExpression: set the PageInfo properties (Count, PageNumber, and PagingCookie).
OData queries: use the $top and $skiptoken parameters to retrieve a subset of records.
Result
The server sends only the "page" of records the client wants to see, cutting memory consumption and response time.
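Here is a minimal C# sketch of that paging loop with QueryExpression; the page size, entity name, and helper name are illustrative assumptions:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class AccountPager
{
    public static void ProcessAllAccounts(IOrganizationService service, Action<Entity> process)
    {
        var query = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet("name"),
            PageInfo = new PagingInfo { Count = 500, PageNumber = 1 }
        };

        while (true)
        {
            EntityCollection page = service.RetrieveMultiple(query);

            foreach (Entity account in page.Entities)
                process(account);

            if (!page.MoreRecords) break;

            // Advance to the next page using the cookie the server returned.
            query.PageInfo.PageNumber++;
            query.PageInfo.PagingCookie = page.PagingCookie;
        }
    }
}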
The big picture
Oversized tables slow down searches, exports, and plugin queries.
Archiving strategy
Identify records more than two years old (e.g., resolved Cases, fulfilled Orders).
Move them to a custom Archive entity in Dataverse or export them to an external Azure SQL store.
Remove the archived records from the main table, or deactivate them.
Use Power Automate scheduled flows or Azure Data Factory pipelines to automate nightly archiving.
Result
The main tables become smaller, making queries faster overall and reducing storage costs.
Why it matters
Lookup columns are used for joins and appear frequently in filters and views. Without indexes in place, these queries trigger full table scans.
How to add indexes
In the Power Platform admin center, go to Data > Tables > [Your Entity] > Indexes.
Create a new index on each heavily used lookup column (e.g., customerid, ownerid).
Result
Queries filtering or ordering by these fields now run in milliseconds instead of tens of seconds.
The big picture
Pulling whole tables into reports or integrations leads to slowdowns and wasted resources.
How to implement
SQL integrations (e.g., Data Export Service): create filtered views in the target database that return only the columns and rows you need (e.g., active records in the current fiscal year).
FetchXML / Power BI: write dataset queries with filter clauses instead of importing entire entities.
Outcome
Downstream processes operate on a smaller, pre-filtered data set, so exports and dashboard refreshes finish faster.
Real-World Example
A nonprofit logged fundraising activities in a single Dataverse table that grew to more than half a million records. Full-table exports took 8 minutes, delaying reporting. They built an archival solution:
Scheduled a Data Factory pipeline to move campaigns finished more than two years ago into an Azure SQL database.
Indexed active records on statuscode and ownerid.
Created filtered views in Azure SQL for large exports.
After these updates, bulk data exports finished in under 30 seconds, a 94% reduction in processing time.
When you combine server-side paging, archiving of old records, indexes on key lookups, and filtered views, you'll keep your Dataverse tables trim and your queries fast, ensuring your Dynamics 365 data model scales efficiently as your organization grows.
Dynamics 365 uses several levels of caching to avoid unnecessary database round-trips and speed up data reads. You can improve performance further by tuning these settings and favoring server-side processing:
What it does
Tells browsers to save (cache) static files — JavaScript, CSS, images — so they’re not downloaded again for every page visit.
How to configure
In your web server or CDN settings, apply a Cache-Control header with a max-age (in seconds) to all your static assets:
Cache-Control: public, max-age=31536000
Version your file names (e.g., formScripts.v2.js) so browsers pull the new script when it changes instead of reusing a stale cached copy.
Why it matters
Reduces the number of HTTP requests and makes client-side form rendering faster.
What it is
Caches recently or frequently read records in platform memory to avoid repeated database queries.
How to enable
Go to the Power Platform admin portal, click:
Environments > [Your Env] > Settings > Product > Performance
Enable Data Caching and set the Cache Duration (for example, 5–10 minutes) for entities that are mostly read, such as Price Lists or Product Catalogs.
Why it matters
Repeated requests for the same records are served from cache, reducing database CPU load and server I/O and producing quicker responses.
What it does
Runs efficient SQL on the server, instead of transferring raw data to the client and computing it with client-side LINQ or JavaScript.
How to implement
Use the ExecuteFetchRequest class to send FetchXML requests directly to the server from within your plugins or custom APIs.
Don’t loop through thousands of records on the client—push filtering/sorting/aggregation back to the server.
Why it matters
Server-side processing uses precompiled SQL execution plans and database indexes to return only the requested data, without extra network and CPU cost on the client.
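As one sketch of this idea, an aggregate FetchXML query executed through the SDK returns a single count instead of streaming every record to the caller. This example uses RetrieveMultiple with FetchExpression (a common alternative to ExecuteFetchRequest); the "policy" entity and helper names are illustrative assumptions:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class PolicyStats
{
    public static int CountPolicies(IOrganizationService service)
    {
        // The aggregation runs in the database; only one row comes back.
        const string fetchXml = @"
            <fetch aggregate='true'>
              <entity name='policy'>
                <attribute name='policyid' alias='policycount' aggregate='count' />
              </entity>
            </fetch>";

        EntityCollection result = service.RetrieveMultiple(new FetchExpression(fetchXml));
        return (int)((AliasedValue)result.Entities[0]["policycount"]).Value;
    }
}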
One insurance company complained about sluggish dashboards on their Policy entity (10,000+ records). They adjusted their cache settings and moved processing server-side:
Set up Data Caching on the Policy entity with a TTL of 10 minutes.
Added HTTP Cache-Control with a max-age of one year and versioned filenames to custom form scripts.
Replaced lots of LINQ client-side loops with ExecuteFetchRequest in their reporting plugin.
Result: Dashboard load time went from 12 seconds to under 5 seconds, increasing user satisfaction.
With a combination of HTTP-level caching, platform-level data caching, and server-side query processing, you can take the load off the database and ensure a snappier Dynamics 365 experience so that users get and stay productive.
Dynamics 365 sits alongside the other products in the Power Platform family. The better you fine-tune these components, the more smoothly your custom apps and automated flows will run, and the more productive your users will be. Here's how to make the most of each layer:
Minimize data sources
Each connector adds latency. Include only the tables and views that are absolutely necessary for your app.
For Model-Driven Apps, clean up the sitemap by removing unused areas and hiding unneeded tables.
Use variables instead of repeated lookups.
In Canvas apps, assign the result of a lookup to a variable in OnStart or OnVisible instead of calling the lookup in multiple places.
// Instead of the following in each control:
LookUp(Accounts, AccountId = varSelectedId).Name
// Do this once:
Set(varAccount, LookUp(Accounts, AccountId = varSelectedId));
// Then reference:
varAccount.Name
Disable experimental features.
Prefer stable features where you can (e.g., modern grids, page components).
Trim unused data sources.
Review the Data panel periodically and delete connections, tables, or formulas the app no longer uses.
Why it matters:
Fewer connector calls and cached lookups mean fewer server round-trips and less compute on the client side.
As organizations adopt Power Automate flows to automate business processes and move data between cloud services, those flows need regular monitoring and tuning so they don't become performance bottlenecks.
Identify parallel branches that create bottlenecks.
Review each flow's run history to see which branches are waiting on the same connector.
Consolidate dependent steps and similar tasks into a single branch.
Cut redundant approvals and loops.
Consider replacing multi-step loops with the native Apply to each action, which is optimized to process even huge collections in batches.
Archive or deactivate flows that haven't run in 30 days to keep the environment clean.
Use concurrency control
For triggers such as "When an item is created", limiting concurrency prevents downstream systems from being overloaded.
The big picture: Misconfigured flows can blow past API limits and slow connected apps, which can drag down the entire platform.
Monitor throttling events
Review Throttled Requests under Metrics in the Power Platform admin center.
Scale with additional DTUs
If you are regularly hitting throttling limits, you can ask Microsoft Support for additional Database Throughput Units (DTUs) or upgrade your capacity tier.
Partition and shard.
For very large tables, you may need custom partitioning schemes or multiple Dataverse environments to distribute the data.
Why this matters: Sufficient DTUs give your apps and integrations the performance buffer they need to absorb usage spikes without slowdowns.
A logistics company found that its custom Model-Driven App for tracking shipments had slow load times. They optimized it by:
Turning off experimental grid features that were still in preview.
Removing three unused tables from the app's sitemap.
Caching lookup values in the form's OnLoad event instead of calling LookUp repeatedly.
Result: Average form render time dropped by 45% (from 4.4s to 2.4s), sending user satisfaction and throughput through the roof.
And when you fine-tune your Canvas and Model-Driven Apps, monitor and refine Power Automate flows, and maintain adequate Dataverse capacity, you can deliver smoother, faster experiences across the Power Platform, amplifying the productivity gains from your Dynamics 365 investment.
Dynamics 365 is part of the larger Microsoft cloud ecosystem. Where it handles a high volume of data operations and complex integrations, Azure services can deliver major performance improvements, providing scalability and efficiency across use cases.
What it is
Azure Data Factory (ADF) is a cloud-based Extract, Transform, Load (ETL) service used to move and process massive amounts of data from different sources.
How to configure
Stage data to Azure Blob Storage: You can use ADF to copy and ingest massive datasets (for example, large CSV files) to Azure Blob Storage.
Transform and load into Dataverse
Use ADF pipelines to transform the data as required and ingest it into Dataverse with the available connectors.
Why it matters
This strategy offloads computationally intensive data processing from Dataverse, freeing API call capacity and raising overall system throughput.
What it is
Azure Functions is a serverless compute service that runs event-triggered code without requiring you to provision infrastructure. It's ideal for executing custom logic in response to an event or trigger.
How to implement
Event-triggered functions: fire an Azure Function from events in Dataverse, such as records being created or updated.
CPU-bound work handlers
Use these for CPU-bound activities like document generation, heavy calculations, or data manipulation.
Why it matters
Offloading intensive processing to Azure Functions reduces the load on Dataverse and improves performance and responsiveness.
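Here is a minimal sketch of such a function using the Azure Functions isolated worker model with an Event Grid trigger; the class name, function name, and payload handling are illustrative assumptions, not a confirmed implementation:

using Azure.Messaging.EventGrid;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class DataverseEventHandler
{
    private readonly ILogger<DataverseEventHandler> _logger;

    public DataverseEventHandler(ILogger<DataverseEventHandler> logger)
    {
        _logger = logger;
    }

    [Function("HandleDataverseEvent")]
    public void Run([EventGridTrigger] EventGridEvent gridEvent)
    {
        _logger.LogInformation("Received {EventType} for {Subject}",
            gridEvent.EventType, gridEvent.Subject);

        // CPU-bound work (document generation, heavy calculations)
        // runs here, outside the Dataverse transaction.
    }
}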
What it is
Azure Event Grid is a fully managed event-routing service that enables you to build event-based architectures and allows you to easily respond to events with intelligent event filtering.
How to implement
Configure Dataverse to publish events: Set Dataverse up to emit events (for example, record changes) to Azure Event Grid.
Send events to the processing pipeline: Use Event Grid to route these events to Azure Functions or other services for processing.
Why it matters: This event-driven design scales on demand and reacts to data changes as they happen.
A procurement client had a nightly invoice process that took around 4 hours. They saw significant gains after moving this workload to an Azure Function triggered by Event Grid events from Dataverse:
Faster processing: batches that took 4 hours now finish in just 20 minutes.
Better scalability: the process handles large data sets without losing performance.
Enhanced responsiveness: customers received their generated invoices sooner and were more satisfied.
With Azure services such as Data Factory, Functions, and Event Grid handling the heavy lifting alongside Dataverse, organizations can manage workloads at scale and keep their Dynamics 365 environment responsive wherever users are located.
Round-the-clock performance monitoring sustains your productivity gains:
Power Platform Admin Center
Watch latency, API request volumes, and database growth.
Azure Application Insights
Instrument your plugins and custom APIs for end-to-end tracing. Add alerts so you're notified when error rates or execution times exceed a threshold.
Custom telemetry
Log client-side performance metrics from JavaScript back to Azure with XrmTelemetry.
Example
A software house set alerts for form load times greater than 2 seconds. Proactive notifications let their support team deal with rogue scripts before end users ever noticed.
Effective governance works hand in hand with technical tuning:
Solution segmentation: split customizations by type (Foundation, Feature, Hotfix).
Release pipelines: deploy automatically with Azure DevOps or GitHub Actions to guarantee consistency across environments.
Scheduled health checks: run quarterly check-ups with out-of-the-box tools (OrgInsights) or community tools like xRMToolBox (MetadataDocumentGenerator).
Training and docs: give admins and customizers best-practice guidance for each component.
Yet no matter how good the in-house work, complex environments typically require expert guidance, and that's where DAX Software Solutions comes in.
Take a look at our DAX software home page. When you want to fuel rapid progress and run like a well-oiled machine, get in touch with DAX Software Solutions for expert advice and hands-on assistance.
By applying these Dynamics 365 fine-tuning techniques, you can reduce friction, speed up daily work, and remove the barriers that keep your users from doing what they do best: driving your business. Regular, ongoing monitoring and tuning ensure your Dynamics investment keeps operating at peak efficiency long after go-live.