
TL;DR
Legacy ERP workflows rarely map cleanly into Dynamics 365. Familiar screens, custom approvals, and patched permissions often break because Dynamics 365 enforces a modern, role-based model with standardised navigation and workflow logic. This isn’t a bug but the natural result of moving from heavily customised systems to a scalable platform. To avoid adoption failures, don’t replicate the old system screen by screen. Focus on what users actually need, rebuild tasks with native Dynamics 365 tools, redesign security around duties and roles, and test real scenarios continuously. Migration is your chance to modernise, and if you align workflows with Dynamics 365 patterns, you’ll gain a system that’s more secure, more scalable, and better suited to how your business works today.
Why your ERP workflows fail after migration
You’ve migrated the data. You’ve configured the system. But now your field teams are stuck.
They can’t find the screens they used to use. The approvals they rely on don’t trigger. And the reports they count on? Missing key data.
This isn’t a technical glitch. It’s a user experience mismatch.
When companies move from legacy ERP systems to Dynamics 365, the shift isn’t just in the database. The entire way users interact with the system changes, from navigation paths and field layouts to permission models and process logic.
If you don’t redesign around those changes, your ERP transformation will quietly break the very workflows it was meant to improve.
This is the fourth part of our series on ERP data migration. In our previous posts, we covered:
- Our step-by-step playbook for successful ERP migration
- Why your ERP migration is delayed and how to speed it up without sacrificing quality
- Why bad data breaks ERP and what to do about it
Dynamics 365 isn’t your old ERP — and that’s the point
Legacy systems were often customised beyond recognition. Buttons moved, fields were renamed, permissions manually patched. As a result, they were highly tailored systems that made sense to a small group of users but were nearly impossible to scale or maintain.
Dynamics 365 takes a different approach. It offers a modern, role-based experience that works consistently across modules. But that means some of your old shortcuts, forms, and approvals won’t work the same way — or at all.
This can catch users off guard. Especially if no one has taken the time to explain why the system changed, or to align the new setup with how teams actually work.
Where the breakage usually happens
Navigation
Field engineers can’t find the work order screens they’re used to. Finance can’t locate posting profiles. Procurement doesn’t know how to filter vendor invoices. Familiar menus are gone and replaced with new logic that’s more powerful, but also less intuitive if you’re coming from a heavily customised legacy system.
Permissions
Old ERPs often relied on manual access grants or flat permission sets. In Dynamics 365, role-based security is more structured, and less forgiving. If roles aren’t mapped correctly, users lose access to critical features or gain access they shouldn’t have.
Workflow logic
Your old approval chains or posting setups may not map cleanly to Dynamics 365. For example, workflow conditions that relied on specific field values may behave differently now, or require reconfiguration using the new workflow engine.
Day-to-day tasks
Sometimes it’s as simple as a missing field or a renamed dropdown. But that can be enough to stall operations if users aren’t involved in the migration process and given time to learn the new flow.
How to avoid workflow disruption in your Dynamics 365 migration
Don’t try to copy the old system screen for screen. That’s a common mistake and it leads to frustration. Instead, map legacy processes to Dynamics 365 patterns. Focus on what the user is trying to achieve, not what the old screen looked like.
Start with your core user tasks.
- What does a finance user need to do each day?
- What about warehouse staff?
- Field service engineers?
Identify their critical workflows, then rebuild them in Dynamics 365 using native forms, views, and automation.
Review and rebuild your security model. Role-based security is at the heart of Dynamics 365. You’ll need to define duties, privileges, and roles properly, not just copy old access tables. Test for least privilege and ensure segregation of duties.
Test user scenarios in every sprint. Don’t wait for UAT to catch usability issues. Include key personas in each migration cycle. Run scenario-based smoke tests and regression checks. Use test environments that mirror production, and validate real-life tasks end to end.
Provide context and support. Users aren’t just learning a new tool — they’re changing habits built over years. Train them with use-case-driven sessions, not generic walkthroughs. Show them how the new process works and why it changed.
The takeaway: migration is your chance to modernise — don’t waste it
Your ERP isn’t just a backend system. It’s where users spend hours every day getting work done. If it doesn’t work the way they expect, adoption will suffer, and so will your transformation goals.
✅ Legacy workflows may not map to D365 one-to-one
✅ Navigation, permissions, and logic will differ
✅ Redesign processes around D365 patterns, not legacy layouts
✅ Review role-based permissions carefully
✅ Involve users early and test real scenarios every sprint
Done right, this isn’t just damage control. It’s an opportunity to rebuild your processes in a way that’s more secure, more scalable, and actually aligned with how your business works today.
Just because your old system lets you do something doesn’t mean it’s how your new system should work — and now’s the perfect time to make that shift. Consult our experts before you begin your migration and set the project up for success.

TL;DR
ERP projects don’t collapse because of the software — they collapse when messy, inconsistent, or incomplete data is pushed into the new system. Bad data leads to broken processes, frustrated users, unreliable reports, and compliance risks. The only way to avoid this is to treat data quality as a business priority, not an IT afterthought. That means profiling and cleansing records early, aligning them to Dynamics 365’s logic instead of legacy quirks, validating and reconciling every migration cycle, and involving business owners throughout. Clean data isn’t a nice-to-have — it’s the foundation of a successful ERP rollout.
Your ERP is only as good as your data
You can have the best ERP system in the world. But if your data is incomplete, inconsistent, or poorly structured, the result is always the same: broken processes, frustrated users, and decisions based on guesswork.
We’ve seen Dynamics 365 projects delayed or derailed not because of the software — but because teams underestimated how messy their data really was.
So let’s talk about what happens when you migrate bad data, and what you should do instead.
This is the third part of our series on ERP data migration. In our previous posts, we covered:
- Our step-by-step playbook for successful ERP migration
- Why your ERP migration is delayed and how to speed it up without sacrificing quality
What bad data looks like in an ERP system
A new ERP system is supposed to streamline your operations, not create more chaos. But if your master data isn’t accurate, it starts causing problems right away.
Broken workflows
- Sales orders that fail because customer IDs don’t match
- Invoices that bounce because VAT settings are missing
- Stock that disappears because unit of measure conversions weren’t mapped
User frustration
- Employees waste hours trying to fix errors manually
- They lose confidence in the system
- Adoption suffers. Shadow systems start creeping in
Poor reporting
- Your shiny new dashboards don’t reflect reality.
- Finance teams can’t close the books.
- Operations can’t trust inventory figures.
- Leadership can’t make informed decisions.
Compliance risks
- Missing fields
- Outdated codes
- Unauthorised access to sensitive records
- Inconsistent naming conventions that make data hard to track
All of these can lead to audit issues or worse.
And yet, despite the stakes, data quality is often treated as a “technical detail” — something IT will sort out later. That’s exactly how costly mistakes happen.
Why data quality needs to be a business priority — not an IT afterthought
Data quality isn’t just about spreadsheets. It’s about trust, efficiency, and the ability to run your business.
Good ERP data supports business processes. It aligns with how teams actually work. And it evolves with your organisation — if you put the right structures in place.
Too many ERP projects approach migration as a technical handover: extract, transform, load. But that mindset ignores a crucial reality — only business users know what the data should say.
That’s why successful migrations start with cross-functional ownership, not just technical execution.
What good data management looks like before ERP migration
Identify and fix bad data early
Run profiling tools or even basic Excel checks to spot issues:
- duplicates,
- incomplete fields,
- outdated reference codes.
Don’t wait for them to break test environments.
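These checks don’t need specialised tooling to get started. Here is a minimal Python sketch of the three checks above — duplicates, incomplete fields, outdated reference codes — where the sample customer rows and the list of valid codes are invented for illustration:

```python
from collections import Counter

# Hypothetical customer extract as (id, name, country_code) tuples;
# all values here are illustrative, not real data.
rows = [
    ("C001", "Acme Ltd", "GB"),
    ("C002", "Globex", "DE"),
    ("C002", "Globex", "DE"),   # duplicate customer ID
    ("C003", "", "XX"),         # missing name, outdated reference code
]

valid_codes = {"GB", "DE", "FR"}  # assumed reference list

# Duplicates: any ID that appears more than once
id_counts = Counter(r[0] for r in rows)
duplicate_ids = {cid for cid, n in id_counts.items() if n > 1}

# Incomplete fields: mandatory name is blank
incomplete = [r for r in rows if not r[1]]

# Outdated codes: value not in the current reference list
bad_codes = [r for r in rows if r[2] not in valid_codes]

print(f"duplicate IDs: {sorted(duplicate_ids)}")
print(f"incomplete rows: {len(incomplete)}")
print(f"rows with unknown codes: {len(bad_codes)}")
```

The same logic scales up in a profiling tool or a pandas notebook; the point is to produce these counts before the first test load, not after.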
Fit your data to the new system — not the old one
Legacy ERPs allowed all sorts of workarounds. Dynamics 365 has stricter logic.
You’ll need to normalise values, align with global address book structures, and reformat transactional data to fit new entities.
Validate everything
Don’t assume data has moved just because the ETL pipeline ran.
Build checks into each stage:
- record counts
- value audits
- referential integrity checks.
Set up test environments that reflect real-world usage.
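The three checks above can be sketched in a few lines of Python. The entity names, fields, and sample rows below are illustrative, not a real pipeline:

```python
# Stage-gate checks on a load, assuming rows arrive as plain dicts.
# "orders" and "customers" are hypothetical entities for illustration.
source_orders = [{"order_id": 1, "customer_id": "C001", "total": 120.0},
                 {"order_id": 2, "customer_id": "C002", "total": 80.0}]
target_orders = [{"order_id": 1, "customer_id": "C001", "total": 120.0},
                 {"order_id": 2, "customer_id": "C009", "total": 80.0}]
target_customers = {"C001", "C002"}

# 1. Record counts: did every row arrive?
count_ok = len(source_orders) == len(target_orders)

# 2. Value audit: do control totals match between source and target?
totals_ok = (sum(o["total"] for o in source_orders)
             == sum(o["total"] for o in target_orders))

# 3. Referential integrity: does every order point at a known customer?
orphans = [o for o in target_orders
           if o["customer_id"] not in target_customers]

print(f"counts ok: {count_ok}, totals ok: {totals_ok}, orphans: {len(orphans)}")
```

Note how the counts and totals can both pass while a referential check still fails — which is exactly why all three belong in every stage, not just the last one.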
Involve business users early
IT can move data. But only process owners know if it's right.
Loop in finance, sales, procurement, inventory — whoever owns the source and target data. Get their input before migration begins.
Plan for reconciliation
Post-load, run audits to confirm data landed correctly. Compare source and target figures.
Validate key reports. Fix gaps before go-live, not after.
Data quality isn’t a nice-to-have. It’s a critical success factor
ERP migration is the perfect time to raise the bar. But to do that, you have to treat data quality as a core deliverable — not a side task.
That means budgeting time and effort for:
- Cleansing legacy records
- Enriching and standardising key fields
- Testing and validating multiple migration cycles
- Assigning ownership to named individuals
- Reviewing outcomes collaboratively with business teams
When done right, good data management saves time, cost, and credibility — not just during the migration, but long after go-live.
Clean data makes or breaks your ERP project
If you migrate messy, unvalidated data into Dynamics 365, the problems don’t go away — they multiply. This looks like:
- Broken processes
- Unhappy users
- Useless reports
- Extra costs
- Compliance headaches
But with the right approach, data becomes a strength — not a liability.
- Profile and clean your data early
- Fit it to Dynamics 365, not legacy quirks
- Validate, reconcile, and document
- Involve business users throughout
- Treat data quality as a strategic priority
A clean start in Dynamics 365 begins with clean data. And that’s something worth investing in.
The good news is that these potential mistakes can be avoided early on with a detailed migration plan. Consult with our experts to create one for your team.
Up next in our series on ERP migration:
How to migrate ERP data from legacy tools to D365

TL;DR:
Copilot Studio isn’t inherently expensive — unexpected costs usually come from how Microsoft’s licensing model counts Copilot Credits. Every generative answer, connector action, or automated trigger consumes quota, and without planning, usage can escalate fast. In 2025 you can choose from three models: pay-as-you-go for pilots and seasonal spikes, prepaid subscriptions for steady internal usage, or Microsoft 365 Copilot subscriptions for enterprise-wide internal agents. The key is to start small, measure consumption for 60–90 days, and then align the right model with your organisation’s actual usage. Done right, Copilot Studio becomes a cost-efficient productivity engine rather than a budget surprise.
“Why did our agent cost more than we expected?”
It’s a question many IT leaders have faced in recent months. A pilot agent launches and quickly starts answering HR questions, booking meetings, or integrating with Outlook and SharePoint. Then Finance opens the first month’s bill and spots something that doesn’t add up.
The surprise rarely comes from Copilot Studio being unreasonably expensive. More often, it’s because the licensing model works differently to what Microsoft 365 administrators are used to. Without planning, credit consumption can escalate quickly, especially when generative answers, connectors, and automated triggers are involved.
In this post, we break down exactly how Copilot Studio licensing works in 2025, how credits are counted, and how to pick the right model for your organisation. We’ll also share real examples of how Copilot Studio can deliver a return on investment, helping you build a strong case for adoption.
The basics: what you must have
When planning a secure Copilot Studio rollout, it’s important to understand who needs a licence and who doesn’t.
- End-user access — Once an agent is published, anyone with access to it can use it without needing a special licence. The only exception is agents surfaced through Microsoft 365 Copilot, which is licensed separately.
- Creator licences — Anyone building or editing Copilot agents must have a Microsoft 365 plan that includes the Copilot Studio creator capability. These plans include the Copilot Studio user licence.
- Custom integrations — Included at no additional cost. This lets you build integrations so your agents can connect to internal systems, databases, or APIs beyond the standard connectors.
Three licensing options — and when to use them
Pay-as-you-go (without M365 Copilot licence)
How it works:
- Standard (non-generative) answer = $0.01 per message
- Generative answer = 2 Copilot Credits
- Action via connector/tool = 5 Copilot Credits
- Agent Flow Actions (100 runs) = 13 Copilot Credits
Additional AI tool usage
If your agents use certain built-in AI tools, these also consume credits from your licence capacity. The number of credits deducted depends on the tool type and usage level:
- Text and generative AI tools (basic) — 1 Copilot Credit per 10 responses
- Text and generative AI tools (standard) — 15 Copilot Credits per 10 responses
- Text and generative AI tools (premium) — 100 Copilot Credits per 10 responses
These deductions apply in addition to the per-credit costs outlined above for standard answers, generative answers, and actions. Make sure to factor these into your usage forecasts to avoid hitting capacity limits sooner than expected.
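To see how quickly this adds up, here is a back-of-the-envelope monthly estimate using the rates listed above. The traffic volumes are invented for illustration, and the assumed pay-as-you-go price of $0.01 per Copilot Credit should be checked against current Microsoft pricing:

```python
# Illustrative monthly forecast; volumes are made up, rates from the list above.
CREDIT_PRICE = 0.01           # assumed pay-as-you-go price per Copilot Credit (USD)

generative_answers = 5_000    # 2 credits each
connector_actions = 1_000     # 5 credits each
flow_runs = 2_000             # 13 credits per 100 runs

credits = (generative_answers * 2
           + connector_actions * 5
           + flow_runs / 100 * 13)

print(f"{credits:,.0f} credits ≈ ${credits * CREDIT_PRICE:,.2f}/month")
```

Even at these modest volumes, connector actions and generative answers dominate the bill — which is why action-heavy agents are the usual source of billing surprises.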
When to use:
- Pilots or early adoption, where usage is unpredictable
- Seasonal projects
- Avoiding unused prepaid capacity
Watch out for: Public-facing agents without access control. A customer support agent on your website could easily generate thousands of Copilot Credits a day — and a large bill.
Example: A regional HR team builds an agent to answer policy questions during onboarding periods. In January and September usage spikes, but it’s minimal the rest of the year. Pay-as-you-go keeps costs in proportion to demand.
Prepaid subscription
How it works: $200/month for 25,000 Copilot Credits at the tenant level, with the option to add pay-as-you-go for overages.
Cost efficiency compared to pay-as-you-go
This plan still counts usage in Copilot Credits, just like pay-as-you-go, but the effective rate works out lower — around $0.008 per Copilot Credit ($200 ÷ 25,000 credits). If your organisation has steady or high Copilot Studio usage, this can offer better value than relying solely on PAYG pricing.
When to use:
- Predictable, steady internal usage
- Large organisations where agents are part of daily operations
Watch out for: Unused Copilot Credits don’t roll over. Overages revert to pay-as-you-go rates.
Example: An IT operations team runs an agent that handles password resets and system outage FAQs. Daily traffic is consistent year-round, making a prepaid subscription more cost-effective.
M365 Copilot subscription
$30/user/month for M365 Copilot — includes Copilot Studio within Teams, SharePoint, and Outlook.
Best for:
- Enterprises where everyone already uses M365
- Internal-only agents
Limitation: Agents can’t be published to public websites or customer portals.
Example: Sales uses Copilot Studio inside Teams to retrieve up-to-date product information and customer account details from SharePoint. No public access is required, so the M365 Copilot subscription covers the entire team without additional credit planning.
Copilot Credit counting: the hidden budget driver
Understanding your licence is one thing; knowing how fast you consume it is another.
Why it matters:
- Actions add up quickly — especially if your agent uses multiple connectors.
- MCP (Model Context Protocol) allows multiple tools per agent, but each action still counts.
- Triggers can be on-demand or autonomous — an agent that checks a knowledge base every morning will consume quota even if no one interacts with it.
How to increase the ROI of Copilot Studio
Licensing becomes an easier conversation when you can link it to measurable value.
Productivity gains examples:
- IT Ops: An L1 support agent resolving 200 password resets per month. Even at $0.02 per generative answer, that’s $4 — far less than the labour cost of 200 helpdesk tickets.
- HR: An onboarding agent that answers repetitive questions instantly, freeing HR staff for complex cases.
- Finance: An agent that automatically answers FAQs on expense policies during busy quarter-end periods.
- Field Operations: Scheduling and updating tasks via chat, without logging into multiple systems.
Best practice: Start with pay-as-you-go. Track Copilot Credit usage for 60–90 days to create a baseline. Then decide if prepaid makes financial sense.
Avoid these common Copilot Studio licensing mistakes
- Public agents without usage limits — ideal for engagement, dangerous for budgets.
- Overbuying prepaid packages “just in case” — unused capacity is wasted.
- Action-heavy flows — five Copilot Credits per action means complex workflows can burn through quotas quickly.
- Ignoring triggers — scheduled or event-based actions still count towards your total.
Choosing the right model
- Pay-as-you-go — pilots, seasonal spikes, and unpredictable usage
- Prepaid subscription — steady, predictable internal usage at scale
- M365 Copilot subscription — enterprises running internal-only agents
Making Copilot Studio licensing work for your organisation
Copilot Studio is one of Microsoft’s most flexible AI building tools — but that flexibility has cost implications. If you design agents without thinking about Copilot Credit counts, licence tiers, and ROI, your pilot could become an expensive surprise. If you start small, measure everything, and match the licensing model to actual use, you can turn Copilot Studio into a self-funding productivity engine.
The organisations getting the most from Copilot Studio in 2025 are not the ones simply buying licences. They are the ones designing for efficiency from the start.
Not sure which licensing model fits your use cases? Contact us to review your use case and estimate costs.

TL;DR:
GPT-5 (launched 7 Aug 2025) now powers Microsoft 365 Copilot, GitHub Copilot, Copilot Studio, and Azure AI Foundry. It adds a “smart mode” for speed vs. reasoning, but reactions are mixed: slower on complex tasks, overly strict filtering, change fatigue, and licensing concerns. Still, strengths stand out — clearer outputs, better sequencing, role‑aware customisation, and solid knowledge management. To get the most out of it, use precise prompts, link it to reliable sources, and build role-specific prompt libraries. Our take: ignore the noise, test it on your real processes, and judge by outcomes, not clickbait headlines and personal opinions.
What’s happening and why it matters
When OpenAI released GPT-5, Microsoft wasted no time weaving it into its ecosystem. GPT-5 is now the default engine behind Copilot experiences across Microsoft 365, GitHub and Azure, making Copilot the front door to how most people will feel the upgrade. What makes it different is the addition of “smart mode”: dynamic routing that automatically switches between lighter, faster models for simple tasks and slower, more reasoning-heavy ones for complex, multi-step problems. You don’t need to choose; Copilot does it for you.
Licensed Microsoft 365 Copilot users got priority access on day one, with others following in phased waves. In Copilot Studio, GPT-5 is already available in early release cycles for building and testing custom agents.
Why do some people dislike GPT-5?
Online discussions around GPT-5 have been a bit chaotic. When the announcement dropped, the hype took off like wildfire: bold claims of “human-level reasoning,” “a game-changer,” even “the biggest leap since GPT-3.” But then came reality. Early adopters ran tests and the backlash followed. Here are the main pain points they mentioned:
- Overpromised, underdelivered
People expected perfection. Instead, they found a smarter but still fallible LLM. It makes fewer reasoning mistakes, but hallucinations and generic text haven’t disappeared.
- Slower on complex tasks
Deep reasoning mode can feel sluggish compared to quick drafts. That’s because it’s doing more thinking. Still, in an operations setting, slow answers can feel like lost time.
This shows up most clearly in Copilot when you’re drafting long documents or analysing complex data models.
- Too safe, too filtered
Some users noticed that Copilot holds back even on harmless internal queries, as Microsoft’s enterprise safety filters are set quite conservatively.
- Change fatigue
Frequent Copilot updates introduce variability in workflows and prompt behaviour, requiring users to continually adapt. This can disrupt established processes and create the perception of enforced adoption rather than incremental, planned change management.
- Licensing worries
Licensing itself hasn’t changed for now — GPT-5 is included in existing Microsoft 365 Copilot licences. However, heavy use of custom agents in Copilot Studio could generate additional compute or API costs, particularly in high-volume scenarios. Keep a close eye on usage metrics to anticipate potential budget impacts.
Why it’s still worth your time
If you only read LinkedIn takes, you might think GPT-5 is either magic or useless. But if you check the LMArena benchmarks — a popular platform combining crowd-sourced and in-house evaluations — you’ll see GPT-5 ranked as one of the top models, taking first place in multiple categories, such as hard prompts, coding, math, and instruction following (as of 20 August 2025).
In practice, this can bring tangible benefits to operations teams using Copilot:
- Better sequencing: It arranges tasks in the right order with less manual prompting.
- Improved clarity: Outputs read more like a professional SOP than a messy checklist.
- Role-specific tailoring: You can adapt outputs for different audiences, from new hires to experienced staff.
- Knowledge management: Scattered notes, Teams chats and old docs can quickly become structured guides.
- Field support: With Copilot on mobile, field workers can query internal manuals instantly, from safety procedures to troubleshooting steps. Add image upload and ticket creation, and you have lightweight expert support in the field.
- Everyday admin: Drafting project updates, onboarding packs, change announcements, policies, and even scheduling factory visits works better than in GPT-4.
In short: it’s not flawless, but it is useful — if you know how to use it.
How to make GPT-5 work for you
The difference between frustration and real value with GPT-5 usually comes down to how you use it.
The more context you give, the better the results. Instead of “Write the meeting notes,” try “Summarise the July 10 shift handover meeting, highlighting safety incidents, equipment downtime, and actions assigned to each supervisor.”
For anything going to external stakeholders or compliance teams, human review is still essential. It also pays to connect Copilot to authoritative sources like SharePoint libraries or Teams channels, and to keep those sources tidy so you’re not pulling in outdated content.
If you want consistent tone and structure, build reusable templates in Copilot Studio and share a simple prompt cheat sheet with your team — common starters like “Draft…”, “Compare…” or “Summarise…” go a long way.
Training should be role-specific, showing how GPT-5 makes day-to-day tasks easier, starting small and scaling up as confidence grows.
And perhaps most importantly, set expectations: deep reasoning is for complex, critical work, while fast mode is best for quick drafts — and in all cases, human judgement still matters.
Don’t rely on benchmarks alone — see how it performs on your tasks
GPT-5 is a meaningful upgrade that, in the right hands, can save hours of manual effort and make knowledge more accessible across teams. But it demands smart adoption: clear prompts, proper governance, and a willingness to test it against your real processes.
Ignore the noise. Don’t take the word of content creators or sales decks at face value. Put GPT-5 to work on your use cases, measure the outcomes, and decide for yourself.
Want your teams to use Copilot like a pro? Request a live workshop so you can scale operations without increasing headcount.

TL;DR
ERP migrations don’t fail because of the software itself — they fail when teams don’t align data, processes, and ownership from the start. Projects stall because bloated legacy records get migrated without a plan, data issues surface late with no accountability, and migration is treated as a side task instead of a core business initiative. The way forward is to define a clear data strategy early, align migration decisions with business process transformation, assign real data owners, validate records in test cycles, and resource migration as a dedicated workstream. If your Dynamics 365 migration feels stuck, the solution isn’t more developers or patches — it’s stronger governance, upfront planning, and real ownership.
It’s not the software, it’s the strategy
ERP migrations don’t fail because of bad software. They fail because too many teams treat them like an IT task, not a business transformation.
Many teams see their Dynamics 365 migration drift past deadlines or costs creep up while progress stalls. Even well-scoped projects with strong business cases run into blockers.
Here’s what we’ve learnt after seeing (and guiding) dozens of ERP migration projects:
Delays usually start with data. But the root problem is bigger — it’s a lack of upfront alignment between data, process, and ownership.
This is the second part of our series about ERP migration. In our previous article, we shared our playbook for a successful migration.
Four reasons your ERP migration is falling behind
1. Unmanaged customer and vendor data
Most legacy systems are bloated. One client we worked with had over 16,000 customer records, but only 2,700 were active. The rest were either duplicates, stale entries, or test data nobody cleaned up.
If your team doesn’t decide what to migrate and why before the project starts, you’re setting yourself up to move a mess from one system to another. And in Dynamics 365, that mess costs you — in time, storage, and performance.
2. No data readiness plan
Too many projects start with: “We’ll clean the data as we go.”
But once integrations begin and deadlines approach, that plan gets dropped. Then problems surface late: incorrect tax setups, mismatched payment terms, missing inventory units. These are all issues that weren’t visible until they broke something in testing.
Without a readiness plan, your team ends up fixing issues reactively. That’s when rework kicks in, timelines double, and your team’s confidence plummets.
3. Dirty data, no ownership
Inconsistent customer names. Duplicate vendors. Blank fields in mandatory columns. This is what we see when no one owns the data and no one’s accountable for fixing it.
ERP migration isn’t just about moving fields between tables. It’s about aligning business-critical records with how your new system is supposed to work. That takes decisions, not just scripts.
4. No dedicated migration team
One of the biggest red flags? When data migration is assigned “as a side task.”
ERP migrations affect every business unit. But many companies staff the project like it’s a back-office upgrade, until go-live panic sets in and suddenly, every department is firefighting data issues that should’ve been solved months ago.
What to do instead — A better way to plan your Dynamics 365 migration
Build a real data strategy before you migrate
Start by asking:
- What data do we actually use?
- What do we need to keep for compliance?
- What can we safely archive?
Then structure it for Dynamics 365. If you’re using Finance and Operations, that might mean reviewing data entities in the Data Management Framework, validating reference tables, and making sure your master data aligns with global address book logic.
Don’t wait for developers to ask these questions mid-project. Answer them early with business owners in the room.
Treat migration as transformation, not a lift and shift
The way you quote customers, manage stock, or post invoices might evolve in D365. So involve business process owners before you start migrating records.
Work backwards from your future-state workflows. Ask what data supports those processes, and define what needs to change, structurally, not just technically.
This approach helps avoid surprises later, like finding out your old pricing model doesn’t map cleanly to the new sales order flow.
Clean, enrich, and validate data upfront
Treat data quality as a project deliverable. Assign owners for each dataset — customers, vendors, products — and give them time to review, correct, and enrich records.
Use validation runs in a test environment to catch issues early. We typically run weekly load cycles into a staging environment, using Azure Data Factory + SQL with a bronze–silver–gold architecture to control quality step by step.
Allocate real resources early
Your best functional experts should be part of the migration team. If they're only looped in at go-live, they’ll spend weeks untangling misaligned setups that could’ve been avoided.
This isn’t just an IT project. It’s a business-critical initiative. Treat it like one.
TL;DR: You can still get back on track
If your ERP migration feels stuck, it’s not too late, but the fix probably isn’t more developers or another integration patch.
It’s a shift in how you’re planning, resourcing, and governing the work.
- Define a clear data strategy
- Align migration to business process transformation
- Assign data owners and validate early
- Treat migration as a dedicated workstream, not an afterthought
Our clients turn to us to get a structured approach tailored for Dynamics 365 — so if you’re feeling stuck with your ERP migration or planning one, let’s talk.

TL;DR
A successful Dynamics 365 Finance and Operations migration isn’t just about moving records; it’s about creating alignment between business needs, legacy structures, and the new system’s logic. That requires collaborative scoping to define what really matters, treating data quality as a deliverable, documenting every field and transformation, and building a repeatable ETL pipeline with staged validation. Instead of a one-shot cutover, the process should be incremental and transparent, with weekly test loads, clear mapping, and active business involvement. Good tools like Azure Data Factory help, but methodology is what prevents the “six months became sixteen” scenario. Done right, migration gives your business a clean, functional start in Dynamics 365 — not just a new system with old problems.
Why most ERP data migrations go wrong — and how to make it work
ERP migrations have a reputation for being expensive, exhausting, and unpredictable. And it’s not just the big, global rollouts. Even small, focused projects can spiral.
A six-month timeline turns into sixteen.
Your customisations don’t fit cleanly.
Your “clean” data turns out to be anything but.
Sound familiar?
That’s exactly why we’ve spent the past year refining a data migration playbook that works — especially for Microsoft Dynamics 365 Finance and Operations. It’s not flashy, but it’s structured, scalable, and realistic.
Let’s walk through what that actually means.
What does a “good” D365 F&O migration look like?
A solid migration process doesn’t just move data. It creates alignment between business needs, legacy structures, and the new system’s logic — and it makes that alignment visible to everyone involved.
Here’s what that looks like in practice:
Shared understanding from the start
Kick off with collaborative scoping workshops. These sessions help define the core data entities, the essential fields, and the real business requirements behind each dataset. By the end, you’ve got a prioritised list of master, transactional, and configuration data — and a clear agreement on what matters at go-live.
Data quality as a first-class citizen
Next, analyse source data for inconsistencies, duplicates, missing values, and odd formatting. This isn't just about cleaning up typos: it's about reconciling structure, aligning reference values, and spotting critical gaps before they break downstream processes.
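To make this concrete, here is a minimal profiling sketch in plain Python. The file name, column names, and checks are illustrative assumptions, not part of our actual tooling; a real profiling pass would cover far more rules.

```python
import csv
from collections import Counter

def profile_customers(path: str) -> dict:
    """Count duplicates, missing values, and oddly formatted fields
    in a hypothetical legacy customer export (CSV with header row)."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    ids = Counter(r["customer_id"] for r in rows)
    return {
        "total_rows": len(rows),
        # Duplicate keys will collide when loaded into D365.
        "duplicate_ids": [cid for cid, n in ids.items() if n > 1],
        "missing_email": sum(1 for r in rows if not r["email"].strip()),
        # Legacy systems often mix formats; flag non-numeric postcodes.
        "bad_postcodes": [r["customer_id"] for r in rows
                          if not r["postcode"].strip().isdigit()],
    }
```

A report like this gives data owners a concrete, reviewable list instead of a vague "the data is dirty".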
Clear mapping, not just assumptions
Every entity, every field, every transformation should be documented. Use an Excel-based mapping and metadata tracker that defines exactly how data flows from your old system into D365, including rules for enrichment, defaulting, and lookups. The goal is traceability and clarity, not guesswork.
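The tracker itself lives in Excel, but each row boils down to the same structure: source field, target field, and transformation rule. The sketch below shows that shape in Python; the entity and field names are hypothetical examples, not a prescription for your mapping.

```python
# Illustrative mapping entries mirroring what the Excel tracker captures.
# Source fields, target fields, and rules here are invented examples.
FIELD_MAP = [
    {"source": "CUSTMAST.CUSTNO",
     "target": "Customer.CustomerAccount",
     "rule": "strip leading zeros"},
    {"source": "CUSTMAST.TERMS",
     "target": "Customer.PaymentTerms",
     "rule": "lookup: legacy terms code -> D365 terms code"},
    {"source": None,  # no legacy equivalent: defaulted on load
     "target": "Customer.CustomerGroup",
     "rule": "default: 'DOM' when country == 'GB'"},
]

def unmapped_targets(field_map, required_targets):
    """Flag required target fields with no documented mapping or default,
    so gaps surface in review rather than at load time."""
    mapped = {m["target"] for m in field_map}
    return sorted(set(required_targets) - mapped)
```

Running a gap check like this against the list of mandatory D365 fields turns "we assumed it was mapped" into an explicit, reviewable artefact.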
A real ETL backbone
The process is supported by a proper technical foundation. We use Azure Data Factory to orchestrate data movement, and Azure SQL as a staging layer with bronze, silver, and gold schema structures. These layers help us filter, transform, and validate data in stages, ensuring accuracy and referential integrity before anything lands in production.
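To illustrate the layering idea, here is a deliberately minimal sketch in plain Python. In the real pipeline these layers are Azure SQL schemas orchestrated by Azure Data Factory; here each layer is just a list of dicts, and all field names are invented.

```python
def to_bronze(raw_rows):
    """Bronze: land source data as-is, tagged with lineage metadata."""
    return [dict(r, _source="legacy_erp") for r in raw_rows]

def to_silver(bronze_rows):
    """Silver: cleanse and conform - trim whitespace, normalise case,
    and drop rows with no key (they can't satisfy referential integrity)."""
    out = []
    for r in bronze_rows:
        if not r.get("customer_id", "").strip():
            continue
        out.append({**r, "name": r["name"].strip().title()})
    return out

def to_gold(silver_rows, valid_groups):
    """Gold: only rows passing business validation are staged for D365."""
    return [r for r in silver_rows if r["group"] in valid_groups]
```

The point of the pattern is that each layer has one job, so when a record is rejected you can see exactly which stage rejected it and why.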
Repeatable, testable load cycles
Instead of a one-shot migration, use a gradual approach with weekly sprints to load data into test environments. Validate each cycle with smoke tests, basic process flows, and regression checks. This gives stakeholders the confidence that, come go-live, the data will actually support the processes it’s meant to.
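A couple of the simplest per-cycle checks can be sketched as follows. The check names, thresholds, and tolerance are illustrative assumptions; real cycles also include functional flow tests inside D365 itself.

```python
def smoke_checks(source_count, loaded_count, control_totals):
    """Return a list of failures for one test load cycle:
    row-count drift plus control-total mismatches (e.g. AR balance)."""
    failures = []
    if loaded_count != source_count:
        failures.append(f"row count drift: {source_count} -> {loaded_count}")
    for name, (expected, actual) in control_totals.items():
        if abs(expected - actual) > 0.01:  # small tolerance for rounding
            failures.append(f"{name}: expected {expected}, got {actual}")
    return failures
```

An empty failure list after each weekly load is what gradually builds stakeholder confidence in the go-live numbers.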
But what about the dreaded stuff?
If you’ve browsed forums about ERP migration, you’ve seen it all:
“We thought it’d take 6 months. It took 16.”
“Our partner didn’t know how to handle our workflows.”
“The data blew up in size, and we got hit with surprise storage costs.”
These stories are real — and usually, the problem isn’t just the software. It’s the process. Most migration issues come down to underestimating three things:
- The complexity of legacy data
- The effort required to map custom logic
- The importance of incremental testing and validation
You can address these issues by making every step visible, documented, and testable.
How we approach the D365 migration process
Here’s what our migration plan includes for customers switching to D365:
- A scoped, categorised list of entities, reviewed with business stakeholders
- A detailed mapping document with transformation logic, dependencies, and field-level rules
- A repeatable ETL framework using Azure Data Factory and SQL, with bronze–silver–gold architecture
- Weekly test loads with functional smoke tests and UAT-ready validation
- Structured cutover plans, including manual tasks such as posting journals or setting up number sequences post-migration
- Azure DevOps for tracking tasks, bugs, and decisions
And once you’re live, we don’t disappear — we provide post-go-live support with daily check-ins, KPI monitoring, and issue triaging via DevOps.
Why optimising your ERP data migration matters now
Whether you're moving from a homegrown system, migrating on-prem, or finally replacing your 1980s-era ERP, the success of your Dynamics 365 rollout depends on how well you plan, document, and validate the work. Good tooling helps — but good methodology makes the difference.
If you’re planning a migration (or in the middle of one and feeling stuck), let’s talk. Our process isn’t just about moving data — it’s about giving your business a clean, functional start in Dynamics 365.
Because no one wants to be the person saying “we thought it’d take six months…”
Need an experienced partner to oversee your ERP migration? Contact us for a free audit.
This is the first part of our series on ERP data migration. In the coming weeks, we will explore:
- Reasons why your ERP migration is taking longer than expected,
- The importance of high-quality data management,
- Migrating ERP from legacy tech to D365, and
- The best practices for a successful ERP migration