
Dynamics 365 licensing and access management basics
10 mins read
July 9, 2025


Even with licences assigned, users can’t always access Dynamics 365 — permissions, environments, and licence-role mismatches often get in the way. Here’s what you need to know.

"We assigned the licence. Why can’t they log in?"

For many IT Operations teams, managing Dynamics 365 feels like tiptoeing through a minefield of licences, environments, and entitlements.

One admin summed it up like this:

“We’re paying for licences, but people still can’t access what they need. I just want to keep the system running.”

Dynamics 365 licensing is confusing at first sight, but it makes sense once you understand the principles. Between base licences, attach licences, user vs. capacity models, and silent limits on environments, even experienced IT pros get blindsided.

This is the first part of our Dynamics 365 licensing series. In this post, we’ll break down the key licensing concepts that matter for IT operations in 2025, without repeating the entire Microsoft guide.

Why your users have licences but still can’t access D365

Let’s start with the most frustrating scenario: your users are licensed, but they still hit access errors.

Here’s why that happens:

  • A licence doesn’t guarantee access to the environment. Users need permissions to the right environment and the underlying database.
  • Some roles need more than one licence. A single app licence (like for Finance) might not be enough if the user also needs to work in another module (like Sales).
  • Attach licences only work if you have a valid base licence. You can’t stack attach licences on a user without a qualifying base licence in place.

And don’t forget: environments and storage come with limits too. If the environment is out of capacity or misconfigured, licensing won’t save you.

D365 licensing basics

Here’s what actually matters to IT Operations teams:

The three licence types you’ll encounter most

  • User licence: The most common. Tied to a named user.
  • Device licence: For shared workstations or kiosks, especially in warehouses or retail.
  • Tenant licence: Grants capacity (like storage or API calls) at the environment or tenant level.

Base vs. Attach licences

  • Base licence: The first licence a user gets. It must be the most expensive one they need.
  • Attach licence: Discounted licences for additional apps. Only valid if you have a base licence from the right product family.

Many teams overpay by assigning multiple base licences instead of using attach licences strategically.
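To see why this matters for the budget, here is a small illustrative calculation, a sketch in JavaScript. The prices are hypothetical placeholders, not Microsoft's actual price list, so check the current licensing guide for real figures:

```javascript
// Cost comparison for a user who needs both Sales and Customer Service.
// All prices below are assumed, illustrative values (per user/month).
const prices = {
  salesBase: 95,             // assumed base licence price
  customerServiceBase: 95,   // assumed base licence price
  customerServiceAttach: 20, // assumed discounted attach price
};

// Option A: two full base licences
const twoBaseLicences = prices.salesBase + prices.customerServiceBase;

// Option B: one qualifying base licence plus an attach licence
const basePlusAttach = prices.salesBase + prices.customerServiceAttach;

console.log(`Two base licences: £${twoBaseLicences}/user/month`);
console.log(`Base + attach:     £${basePlusAttach}/user/month`);
console.log(`Monthly saving:    £${twoBaseLicences - basePlusAttach}/user`);
```

Even with made-up numbers, the shape of the saving is clear: the attach price only has to be lower than the second base price for the attach route to win, multiplied across every eligible user.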

“We’re cleaning up old users — what do we do with the licences?”

This is a common scenario. Old users are still active in Entra ID or assigned licences in the Microsoft 365 Admin Centre, but nobody’s checked if they even use the system.

Here’s what we recommend:

  • Audit licence assignments quarterly. Look for inactive users still assigned premium licences.
  • Clean up orphaned access. If a user has been removed from Dynamics but still exists in Entra ID with a licence, that’s wasted spend.
  • Map access by role, not by person. Don’t assign licences just because “that’s what they had before.” Reassess by function.

Start with Team Member licences for light users — just make sure their access needs fall within its limits (read-only, self-service, or basic workflows only).

Are we paying for duplicate licences across environments?

Short answer: probably. Here’s how to spot waste:

  • Check for users with licences in multiple environments, especially sandbox copies or legacy orgs that no one cleaned up.
  • Review capacity add-ons — many are environment-bound and often over-provisioned.
  • Look at attached Power Platform licences. Are you paying for capacity through both Dynamics and Power Apps? You might be.

Storage maths in a nutshell

  • Your tenant includes a default allocation of 10 GB of database storage
  • Each full user licence adds 250 MB on top

Need more? You’ll have to buy additional capacity.
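The maths above can be sketched as a quick calculation. The 10 GB and 250 MB constants follow the rule of thumb in this post; treat them as assumptions and verify them against Microsoft's current licensing guide:

```javascript
// Rough Dataverse database-storage estimate:
// tenant default of 10 GB plus 250 MB per full user licence (assumed values).
function estimateDbStorageGB(fullUserLicences) {
  const tenantDefaultGB = 10; // assumed tenant-level default allocation
  const perLicenceGB = 0.25;  // 250 MB per full user licence (assumption)
  return tenantDefaultGB + fullUserLicences * perLicenceGB;
}

console.log(estimateDbStorageGB(0));  // 10 (tenant default only)
console.log(estimateDbStorageGB(40)); // 10 + 40 * 0.25 = 20
```

If your actual consumption is above the number this kind of estimate gives you, that gap is what you end up buying as additional capacity.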

Three tech stacks = three licensing mindsets

Dynamics 365 isn’t a single application. It’s a suite with different behaviours and pricing models:

  1. Customer Engagement Apps: Sales, Customer Service, Field Service — often used together, but watch out for duplicate entitlements.
  2. Business Central: Sold as a bundle. Easy to manage, but not as flexible with attach licences.
  3. Finance and Operations: High-value, high-cost — and the source of many of the trickiest licence combinations (Finance alone is £180/user/month).

Each stack handles users, storage, and automation differently. If you’re mixing these, map your licensing strategy accordingly.

Licensing isn’t just compliance

Many access issues, slow processes, or broken workflows are actually licensing issues in disguise:

  • Overloaded storage = system slowdowns
  • Misassigned licences = user lockouts
  • Missing entitlements = failed automations

Licensing is now directly tied to performance. Microsoft is enforcing this more aggressively, especially for Team Member misuse and capacity overages.

Your 2025 checklist for cost-efficient Dynamics 365 licensing

  • Audit users and licences: Identify who has what, where, and whether they actually use it.
  • Map roles to licence types: Use role-based access and the CRUD model to assign only what’s needed. (More on CRUD in our next post!)
  • Use attach licences wisely: Don’t pay for multiple base licences — add attach licences where eligible.
  • Clean up unused environments: Retire or merge low-use or duplicate instances.
  • Align storage to actual need: Remove excess capacity and avoid default overprovisioning.
  • Consolidate across teams: Prevent duplicated licensing in siloed regions or departments.
  • Reclaim unused licences: Remove entitlements from inactive or former users.
  • Review quarterly: Make licence audits a recurring practice, not a one-off cleanup.
  • Monitor Microsoft policy changes: Stay compliant by keeping up with evolving licensing rules.

Don’t let licensing be an afterthought

You don’t need to master every nuance of Microsoft’s licensing. But you do need to understand the mechanics that impact performance and budget.

Licensing should be among the first priorities in your Dynamics 365 environment, right alongside security, data, and automation.

Need a health check on your setup?
Request a free audit and make sure you’re not leaving money (or performance) on the table.

Up next in our D365 Licensing series:  

  • Right-size your D365 licences by aligning them with actual user roles
  • D365 or Power Platform: Which one is right for your use case?
  • How to keep control over licence sprawl

Life in the ERP business
October 18, 2024
5 min read

Want to get an insight into the daily life of a close-knit and enthusiastic team? Read our blog post about the Visual Labs ERP team! Find out how we spend our colourful and varied days, what we have in common at work and outside of work, and how we support each other in every situation. Keep scrolling to discover why our team is so special.

The Visual Labs ERP team is a very cohesive, fun team. We implement, develop and support enterprise management systems. Fortunately, our work is extremely varied; no two days are the same. We talk to customers, answer questions and bug reports, design solutions for new needs, and develop and test. We assess our new customers' business processes online or in person, and train users in English or Hungarian.

Who are we?

Our team members come from all over the Danube region, from recent graduates to people with 15 years of professional experience. Our clients like to call us programmers or developers, but underneath we are really IT-savvy functional consultants with an economics background.

Everyone in the team has their own specialisation and super skills. One of us can build a cool custom GPT of their own, others are experts in creating new ERP environments, and we also know exactly which of our partners to ask when it comes to VAT, or when a configuration needs a final thorough check.

We're proud to have three of our team on parental leave. On office days we try to come in at the same time, sitting in a group near the team's own houseplant. It's hard to decide which is growing faster: the 'ERP palm' or the team.

There's always a good atmosphere in the office and we laugh a lot together, whether it's at meetings, a cigarette break or swapping wallpapers. Several of the team are also hobby baristas who are always happy to make you a nice (or not so nice) cappuccino.

Regular meetings

We start the week with a WSM (weekly standup meeting), where after a quick debrief we present to the company the progress of our projects and the tasks and milestones for the week ahead. We also have a quarter-hour DSM (daily standup meeting) every morning, where we can ask each other for quick professional help, list our daily priorities, or volunteer for a task that has just fallen in that we don't have time for. Fortunately, we have a very good team spirit and proactively support each other to ensure a balanced distribution of tasks.

We end Friday afternoon with a one-hour sprint round meeting to discuss the week's events and lessons learned, review the progress of our projects, identify next steps and prepare for the WSM on Monday morning.

Once a month, we have the opportunity to attend a coaching session where we can talk through the issues that are bothering us and our current stuck points, whether they are work-related or personal.

The monthly team retro meeting is a completely offline session where we 'process' together what happened in the past month, looking at what went well and what went wrong. It's a great opportunity to give each other feedback and draw lessons on how we're doing in order to make the next time even smoother for working together and delivering projects.

In our bi-weekly knowledge-sharing sessions, we pass this specialist knowledge on to other team members. Recent sessions have covered topics such as using ChatGPT and building your own GPT, new Microsoft ERP releases, and the ERP business aspects of organisational-level changes.

We also have weekly so-called customer status meetings with several of our customers. Sometimes, though, it is easier to discuss an issue ad hoc over a screen share, and we are available for that as well. During these calls we can discuss support issues that have arisen, new needs, and our proposed solutions. The same team of experts guides our customers from the sales phase through implementation to operation. This ensures that, even during the operational phase, the customer's request is handled by an ERP expert who is fully familiar with both the customer and the solution delivered.

"Is everyone in this team a basketball player?"

It can be an odd sight when I arrive at a new client's premises for a demo or consultation with two of my 190 cm tall "bodyguards". We're not basketball players, but our average height is over 180 cm 😀 Several team members have a professional sporting background: we have handball players, footballers, canoeists, bowlers and marathon runners, but nowadays we mostly go to the gym. We try to make sport part of our week, alongside work. And on our team-building evenings, we test our skills in pub sports or poker, although unfortunately Dani is unbeatable at all of them.

Party planning committee

In addition to team events, we are also happy to take part in company events, even as organisers. Two members of the ERP team founded the Party Planning Committee, which for example organised a carnival party with a doughnut competition and a fun quiz. The winning chef in the cooking competition we organised was also from the ERP team (and no, the contest wasn't rigged). The rest of the team played beach volleyball while the cooking was under way.

Summary

As we have seen, Visual Labs' ERP business is not only made up of experts who are at home in the world of ERP systems, but also a close-knit community where work is combined with shared experiences and personal development. We are proud that everyone contributes to the diversity and strength of our team, whether it's a deep knowledge of different disciplines, a background in sports or even the art of coffee making.

Your time here will not only give you an insight into the ins and outs of ERP systems, but will also be part of a supportive environment where team building and social experiences are paramount alongside professional growth. The Visual Labs ERP team is growing dynamically, as is our office's famous ERP palm tree - and both symbolize the continued development and growth of our community.

Thank you for joining us for this brief glimpse. We hope you found it inspiring to read about our team, and maybe one day you'll be a part of the Visual Labs community too!

How to Set Your Local Currency as the Default in Dynamics 365
August 23, 2024
3 min read

Have you ever wondered how you can efficiently manage a business that operates in multiple currencies? Dynamics 365 offers a seamless solution for handling such scenarios.

Changing the default currency in D365 to your local currency can streamline financial transactions and reporting.

Dynamics 365 and Dataverse offer robust support for multiple currencies, allowing flexibility in international business operations. When setting up an environment, you choose a base currency, such as EUR. However, if your users operate in different currencies, they can adjust their default currency settings.

To update the default Currency, the user needs to follow these steps:

Step 1: Open your Dynamics application

Open any model-driven app, such as Sales Hub. Click on the gear icon and open Personalization Settings:

Step 2: Update the Currency

Under the ’General’ tab, choose the desired local Currency and save the changes.

After saving, whenever the user opens an entity form which contains a Currency type field, their selected Currency will be visible.  

However, this only modifies the local currency, as the base currency is selected at the time of the environment setup and thus cannot be modified later. It is good to know that system administrators can set the default currency of users as a bulk operation.  

How the Currency field stores data

In Dataverse, data is structured with two distinct columns for currency values: one column for the local currency and another column for the base currency. These two columns are automatically created upon adding a Currency type field to a record. For instance, the ‘Estimated revenue’ field on the Opportunity has two underlying fields in the Dataverse: ‘Estimated Revenue’, which captures the value in the user’s local currency, and ‘Estimated Revenue (Base)’, which stores the field’s value in the base or organization currency.

How does Dynamics calculate the conversion between the local and the base currency? To account for the different values of currencies, Dynamics uses the underlying Currency table, where each available currency has its own record. You can reach the Currency table by opening the Advanced Settings:

And navigating to the Business Management section and selecting Currencies.  

However, if frequently modifying the exchange rates is part of your organization's day-to-day work, you can also modify your application so that the Currency table can be reached directly from the menu.  

Moreover, you can set up an integration with an exchange rate provider. This integration ensures that all financial data reflects the most current rates, reducing the risk of errors and improving the accuracy of your financial reports. This is particularly beneficial for businesses that deal with multiple currencies and need real-time data to make informed decisions.
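The local-to-base conversion described above can be sketched as follows. This is an illustrative calculation only, not a real Dataverse API call; the currency pair and the 400 HUF/EUR rate are assumptions for the example:

```javascript
// Sketch of how the base-currency column is derived from the local value:
// each record in the Currency table stores an exchange rate relative to the
// base currency, and Base = local amount / exchange rate.
function toBaseCurrency(localAmount, exchangeRate) {
  // exchangeRate: units of local currency per 1 unit of base currency
  return localAmount / exchangeRate;
}

// Example: base currency EUR, user works in HUF, assumed rate of 400 HUF/EUR.
const estimatedRevenueLocalHUF = 4000000;
const estimatedRevenueBaseEUR = toBaseCurrency(estimatedRevenueLocalHUF, 400);
console.log(estimatedRevenueBaseEUR); // 10000
```

This is also why stale exchange rates quietly distort reports built on the base columns: the stored base value reflects the rate at the time of calculation, which is exactly what an exchange-rate integration keeps fresh.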

Summary

The currency type field in Dynamics 365 is a powerful tool for businesses operating in multiple currencies. By setting a local currency as the default and understanding how currency data is stored and converted, users can better manage their financial transactions. Additionally, integrating with an exchange rate provider can further enhance the accuracy and efficiency of your financial operations.

My Journey with CI/CD in Power BI: A Personal Tale of Transformation Part 3
July 10, 2024
3 min read

In part 3, I’m going to give you a step-by-step description of the implementation process of source control in Power BI. This can be divided into 4 parts:

  1. Modify settings in Power BI Desktop
  2. Download & install the necessary software
  3. Set up environments
  4. Use it!

Step 1 - Modify settings in Power BI Desktop: Enable preview feature: Power BI Project (*.pbip) save option

  1. Open Power BI Desktop  
  2. Go to Options and settings and select Options

   

  3. Click on Preview features and enable the Power BI Project (*.pbip) save option. Optionally, I’d also recommend ticking the box next to Store semantic model using TMDL format
  4. Hit OK

And now we can move to Step 2.

Step 2 - Download & install the necessary software

At VisualLabs we decided to use VS Code, but you can do the basics in PowerShell as well. The reason I prefer VS Code is that you get a visual interpretation of your project (you can track all the branches, merges, etc. at the same time).

  1. Download and install VS Code - https://code.visualstudio.com/download

Feel free to install it with the default settings.

  2. Download and install Git. You can download it from here: https://www.git-scm.com/downloads Feel free to install it with the default settings; the only thing worth changing is the default editor, which you can set to Visual Studio Code.

 

3. Add GitGraph to VS Code – this will allow you to see the historical changes of your repo as mentioned above.

  1. Open VS Code
  2. Click on Extension on the right
  3. Type Git graph
  4. Select from list
  5. Click Install

Step 3 – Set up GIT and Azure DevOps environments

  1. Set up VS Code as your default Git editor - open a new terminal in VS Code and run this command (you may need to restart VS Code or your machine for the command to take effect):

git config --global core.editor "code --wait"  

2. Set up your Git identity – run these commands in the terminal:

git config --global user.name "FirstName LastName"
git config --global user.email firstname.lastname@myorganization.com

3. Create a repo on Azure DevOps. You can follow this MS documentation: https://learn.microsoft.com/en-us/azure/devops/repos/git/create-new-repo?view=azure-devops#create-a-repo-using-the-web-portal

4. Once the repo is there, you’ll see this on your screen and now you can clone it onto your computer  

   

5. Select Clone in VS Code option  

 

6. Select destination folder

My recommendation is to create a separate folder where you can store all your repos from this point. I’d also opt for a cloud location for this repo collector folder – like OneDrive.  

 

7. In VS Code, you can check the current status of your repo  

 

 8. The last step is to save your Power BI file as .pbip to this folder.

   

9. Click on Yes, I trust the authors to move to the next step. You’ll see that VS Code recognises that there are new files in the folder.

 

  10. Now you can add a commit message, select the changes you want to keep (this is the step called staging changes; feel free to click Select all), and click Commit (this only saves your changes locally)

11. Click Sync changes (now it’s in the cloud – you can check it in the repo created on Azure DevOps)

12. GitGraph will look like this:  

   

13. Congrats!

Your source control journey has officially begun! Feel free to create branches, repos, etc., and start co-developing with your colleagues – or simply enjoy knowing you’ll never have to name a file “MyProject_final_v124_final12.pbix” again.

Unified Monitoring: Using Workbooks for Logic Apps, Azure Functions, and Microsoft Flows
July 4, 2024
7 min read

Problem Statement

Monitoring of the three platforms mentioned in the title is solved independently in different locations. Logic Apps can be monitored either from the resource’s run history page or through the Logic App Management solution deployed to a Log Analytics workspace. Azure Functions have Application Insights, while the run history of Microsoft Flows is available on the Power Platform.

Most of our clients’ solutions consist of these resources, which often chain together and call each other to implement business processes and automations. Their supervision was not centralized, making error tracking and analysis difficult for employees. Moreover, they had to log into the client’s environment to perform these tasks.

Goal

We wanted to get a general overview of the status of the solutions we deliver to our clients, reduce our response time, and proactively prevent error reports submitted by our clients. We aimed to track our deployments in real-time, providing a more stable system and a more convenient user experience. We wanted to make our monitoring solution available within Visuallabs so that we could carry out monitoring tasks from the tenant that hosts our daily development activities.

Solution

Infrastructure Separation

Our solution is built on the infrastructure of a client used as a test subject, whose structure can be considered a prerequisite. On the Azure side, separate subscriptions were created for each project and environment, while for Dynamics, only separate environments were used. Project-based distinction for Flows is solved based on naming conventions, and since log collection is manual, the target workspace can be freely configured.

Centralized Log Collection

It was obvious to use Azure Monitor with Log Analytics workspaces for log collection. Diagnostic settings were configured for all Azure resources, allowing us to send logs to a Log Analytics workspace dedicated to the specific project and environment. For Microsoft Flows, we forward logs to a custom monitor table created for Flows using the built-in Azure Log Analytics Data Collector connector data-sending step. This table was created to match the built-in structure of the Logic Apps log table, facilitating the later merging of the tables.

Monitoring
Diagnostic settings

Log Analytics workspace

Log tables

Making Logs Accessible in Our Tenant

An important criterion for the solution was that we did not want to move the logs; they would still be stored in the client’s tenant; we only wanted to read/query them. To achieve this, we used Azure Lighthouse, which allows a role to be enforced in a delegated scope. In our case, we set up a Monitoring contributor role for the client’s Azure subscriptions for a security group created in our tenant. This way, we can list, open, and view resources and make queries on Log Analytics workspaces under the role’s scope from our tenant.

Visualization

For visualization, we used Azure Monitor Workbook, which allows data analysis and visual report creation, as well as combining logs, metrics, texts, and embedding parameters. All Log Analytics workspaces we have read access to via Lighthouse can be selected as data sources. Numerous visualizations are available for data representation; we primarily used graphs, specifically honeycomb charts, but these can easily be converted into tables or diagrams.

Combining, Customizing, and Filtering Tables

To process log tables from different resources together, we defined the columns that would be globally interpretable for all resource types and necessary for grouping and filtering.

These include:

  • Client/Tenant ID
  • Environment/Subscription ID
  • Resource ID/Resource Name
  • Total number of runs
  • Number of successful runs
  • Number of failed runs

Based on these, we could later determine the client, environment, project, resource, and its numerical success rate, as well as the URLs needed for references. These formed the basis for combining tables from various Log Analytics Workspaces and resources for our visualizations.
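The per-resource aggregation these columns enable can be sketched in a few lines. This is an illustrative model in JavaScript of what the Workbook queries compute (resource names and run counts are made up), not actual KQL:

```javascript
// Normalized log rows in the globally interpretable shape described above:
// resource identifier plus total / successful / failed run counts.
const rows = [
  { resource: "la-orders",   total: 120, succeeded: 118, failed: 2 },
  { resource: "func-sync",   total: 300, succeeded: 297, failed: 3 },
  { resource: "flow-notify", total: 50,  succeeded: 45,  failed: 5 },
];

// Derive a percentage success rate per resource (one decimal place).
const withRates = rows.map(r => ({
  ...r,
  successRate: Math.round((r.succeeded / r.total) * 1000) / 10,
}));

// Order ascending by success rate, as in the consolidated honeycomb view,
// so the most problematic resources surface first.
withRates.sort((a, b) => a.successRate - b.successRate);
console.log(withRates.map(r => `${r.resource}: ${r.successRate}%`).join("\n"));
```

In the real solution the same grouping and ordering happens in Log Analytics queries across the merged tables; the point is that once every source emits these shared columns, the visualization layer no longer cares which platform a run came from.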

Log Analytics

User Interface and Navigation

When designing the user interface, we focused on functionality and design. Our goal was to create a visually clear, well-interpreted, interactive solution suitable for error tracking. Workbooks allow embedding links and parameterizing queries, enabling interactivity and interoperability between different Workbooks. Utilizing this, we defined the following levels/types of pages:

  • Client
  • Project
  • Resources
  • Logic App
  • Azure Function
  • Flow
Customers

Projects

Resources

Resources
Resources [Azure Function]

At the Client and Project levels, clicking on their names displays the next subordinate Workbook in either docked or full-window view, passing the appropriate filtering parameters. Time is passed as a global parameter during page navigation, but it can be modified and passed deeply on individual pages. We can filter runs retrospectively by a specific minute, hour, day, or even between two dates.

On the page displaying resources, we provide multiple interactions for users. Clicking on resource names navigates to the resource’s summary page on the Azure Portal within the tenant, thanks to Lighthouse, without tenant switching (except for Power Automate Flows).

Clicking on the percentage value provides a deeper insight into the resource’s run history and errors in docked view. This detailed view is resource type-specific, meaning each of the three resources we segregated has its own Workbook. We always display what percentage of all runs were successful and how many faulty runs occurred, with details of these runs.

Logic App

Beyond general information, faulty runs (status, error cause, run time) are displayed in tabular form if any occurred during the specified time interval. Clicking the INSPECT RUN link redirects the user to the specific run where all successful and failed steps in the process can be viewed. At the bottom, the average run time and the distribution of runs are displayed in diagram form.

Logic App

Logic App [INSPECT RUN]

Logic App [diagrams]

Microsoft Flow

For Flows, the same information as for Logic Apps is displayed. The link also redirects to the specific run, but since it involves leaving Azure, logging in again is required because Dynamics falls outside the scope of Lighthouse.

Microsoft Flow

Azure Function

The structure is the same for Azure Functions, with the addition that the link redirects to another Workbook instead of the specific run’s Function App monitor page. This is necessary because only the last 20 runs can be reviewed on the Portal. For older runs, we need to use Log Analytics, so to facilitate error tracking, the unique logs determined by developers in the code for the faulty run are displayed in chronological order.

Azure Function

Azure Function

Consolidated View

Since organizationally, the same team may be responsible for multiple projects, a comprehensive view was also created where all resources are displayed without type-dependent grouping. This differs from the Workbook of a specific project’s resources in that the honeycombs are ordered by success rate, and the total number of runs is displayed. Clicking on the percentage value brings up the previously described resource type-specific views.

Resources

Usability

This solution can be handy in cases where we want to get a picture of the status of various platform services in a centralized location. This can be realized interactively for all runs, except for Flows, without switching tenants or possibly different user accounts. Notification rules can also be configured based on queries used in Workbooks.

Advantages:

  • The monitoring system and visualization are flexible and customizable.
  • New resources of the same type can be added with a few clicks to already defined resource types (see: configuring diagnostic settings for Logic Apps).

Disadvantages:

  • Custom log tables, visualizations, and navigation between Workbooks require manual configuration.
  • Integrating Flows requires significantly more time investment during development and planning.
  • Combining tables, separating environments and projects can be cumbersome due to different infrastructure schemas.
  • Basic knowledge of KQL (Kusto Query Language) or SQL is necessary for queries.

Experience

The team that implemented the solution for the client provided positive feedback. They use it regularly, significantly easing the daily work of developer colleagues and error tracking. Errors have often been detected and fixed before the client noticed them. It also serves well after the deployment of new developments and modifications. For Logic Apps, diagnostic settings are included in ARM (Azure Resource Manager) templates during development, so runs can be tracked from the moment of deployment in all environments using release pipelines.

Hiding Subgrid Buttons Specifically
July 4, 2024
3 min read

Depending on the stage of a sales process, different functions should be available on a form's Subgrid. In practice, this means that at the beginning of the process interests can be added to a Lead, but these should not be modifiable later, in the Opportunity phase.

This article requires some technical knowledge for understanding and application, so it is recommended for Dynamics 365 CE app makers who are already familiar with the Power Platform world.

Starting point:

new products

Goal:

product insert

Tools Used for the Solution:

The Solution:

1. Solution

Create a Solution that will be loaded into the Ribbon Workbench. Add the entity whose SubGrid you want to modify into this solution. (Important: when adding the existing entity to the Solution, do not import any other elements). The name of the Solution should always be constructed based on the following logic: Ribbon_VL_[entity name] e.g., Ribbon_VL_Product_Interest.

subgrid

2. Subgrid

Name the SubGrid with a unique, identifiable name. Do not use the automatically generated name, as you will refer to this later.

3. JavaScript

Create the following JavaScript as a .js file (using VS Code), then upload it to the solution containing the Web resources. It is advisable to name the file the same as its content to make it easier to find later.

subgrid

forProductInterestView: function (selectedControl) {
    console.log("start.forProductInterestView");
    "use strict";
    debugger;
    var currentGridName = selectedControl._controlName;
    console.log("forProductInterestView-currentGridName: " + currentGridName);
    var excludedPayRun = "subgrid_prodinterest"; // Name of the subgrid
    if (currentGridName == excludedPayRun) {
        console.log("end.forProductInterestView.true");
        return false;
    } else {
        console.log("end.forProductInterestView.false");
        return true;
    }
}

4. Ribbon Workbench

Open the Ribbon workbench and add the solution created in step one. Each entity has 3 ribbons: Home, Subgrid, Form. We now need the Subgrid.

Select the button you want to remove by right-clicking on it and pressing "Customise Button." A red checkmark will appear, and it will also be added to the Buttons section below. If it is already checked, it means a command is already associated with it; in that case, you need to add a new command and can skip this step.

Next, add a Command, which can be done by clicking the plus sign in the Commands section. The command should look like this:


Explanation:

  • Library: the web resource you added to the solution (this is where the good naming pays off)
  • Function name: the name given in the JavaScript (the part before `function`, e.g. forProductInterestView)
  • CRM Parameter: the parameter to pass; in this case SelectedControl, which represents SubGrids on forms and all list views. PrimaryControl represents the form itself.

Next, add an EnableRule that hides the buttons.

Explanation:

  • Library: the web resource you added to the solution (this is where the good naming pays off)
  • Function name: the name given in the JavaScript (the part before `function`)
  • CRM Parameter: the parameter to pass; in this case SelectedControl
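One possible generalisation (not part of the original walkthrough): Ribbon Workbench also lets you add a String parameter to a command after the CRM parameter, so the excluded subgrid name does not have to be hard-coded and one web resource can serve several subgrids. A minimal sketch, with the hypothetical function name hideOnNamedSubgrid:

```javascript
var VLRibbon = {
    // Reusable enable rule: selectedControl arrives via the CRM Parameter
    // (SelectedControl); excludedGridName via a String parameter configured
    // on the same command in Ribbon Workbench.
    hideOnNamedSubgrid: function (selectedControl, excludedGridName) {
        "use strict";
        // false hides the button on the named subgrid, true shows it elsewhere.
        return selectedControl._controlName !== excludedGridName;
    }
};
```

The command for each subgrid would then pass that grid's own name (e.g. "subgrid_prodinterest") without any edits to the script.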

Only one step remains before publishing: for each button, specify which Command should be associated with it.

I hope you find this article useful and that it provides a solution idea.

The Past, Present, and Future of ERP Systems
July 4, 2024
5 min read

When I started working at VisualLabs, during the first WSM (weekly standup meeting) where each business division reports on their current weekly tasks, I noticed how many abbreviations we use. As a member of the ERP team, I wondered if we know exactly how these abbreviations came about and what they stand for.

The term ERP (Enterprise Resource Planning) is familiar to everyone today, but few know its exact origins and development path. Therefore, I decided to gather information on where it started and the major milestones that helped shape the ERP systems we know today. Looking back in time, we will realize how deeply this technology is rooted in the modern business world.

In this blog, I have compiled seven milestones that contributed to the development of the ERP system as we know it today.

In today’s world, it would be unimaginable for a company not to use some kind of computer system for its various processes. Before the advent of computers, however, companies had to manage these processes (be it accounting or production planning) by other means. Take accounting, for example. Accountants recorded every financial transaction manually on paper in different books, which they managed daily and monthly. It is hard to imagine that companies often had rooms full of ledgers and files, each containing dozens of transactions. At the center of it all was the accountants’ most precious asset: the general ledger. It is daunting to think about how much work the year-end closing process entailed and how many errors could occur along the way.


1. Birth of Computers (1950s):

In the 1950s, with the birth of computers – theoretically founded by John von Neumann – a new dimension opened up in the operation of companies and the transformation of their processes. Although these computers were primarily used in the military and scientific fields in the 50s – due to their large size and cost – continuous technological developments soon brought them into the business world. These devices allowed faster data processing and analysis and helped automate business activities.


2. Inventory Management and Control (1960s):

One of the first milestones in recognizing the business potential of computers dates back to the 1960s. The manufacturing industry realized the need for a system that would enable inventory management, monitoring, and control. The emergence of information technology allowed companies to integrate and automate their business processes, improving the efficiency and accuracy of inventory management. This was one of the first steps toward developing ERP systems.

3. Material Requirements Planning (MRP I, 1970s)

The concept of MRP (Material Requirements Planning) first appeared in 1970 and fundamentally represented a software-based approach to planning and controlling manufacturing processes. MRP’s application primarily focused on planning and tracking material requirements. This approach allowed companies to predict more accurately the type and amount of materials needed during production processes. With MRP, companies could manage material procurement and production scheduling more effectively, reducing losses from over- or underestimation. This innovation had a significant impact on the manufacturing industry and fundamentally transformed companies’ material planning processes. This approach contributed to increased efficiency and competitiveness of manufacturing companies in the 1970s.

4. Manufacturing Resource Planning (MRP II, 1980s):

The 1980s marked a significant milestone with the advent of MRP II systems. While MRP focused solely on the inventories and materials needed based on real or forecasted customer demands, MRP II provided greater insight into all other manufacturing resources. By extending manufacturing planning beyond materials to include labor, machinery, and other production resources, it gave companies much greater control over their manufacturing processes.

5. Enterprise Resource Planning Systems (ERP, 1990s):

In the 1990s, the first true ERP systems were introduced. (The term ERP itself was coined by the research firm Gartner in the 1990s.) ERP systems represented a significant advancement compared to MRP II systems as they focused not only on manufacturing but also on the full integration and automation of business processes. Such processes included procurement, sales, finance, human resources, and accounting. With full integration, companies could manage their business processes in a unified database, offering numerous advantages. The unified storage and management of information ensured access to accurate, up-to-date data, improving decision-making and efficiency. The connected business areas helped formulate and implement unified strategies. As a result, the ERP system became a “one-stop solution” that managed all company information.

6. Web-Based Functionalities with the Rise of the Internet (ERP II, 2000s):

In the mid-2000s, as the internet’s role grew in the business world, ERP systems also adapted to this change. Systems began incorporating customer relationship management (CRM) and supply chain management (SCM) functionalities. ERP II emphasized user-friendly interfaces and customization. Modular systems were developed, allowing businesses to select and implement the components most relevant to their operations.

7. Cloud-Based ERP (2010s):

In the 2010s, the emergence of cloud technology added a new dimension to ERP systems. Cloud-based ERP solutions allowed companies to store and run their ERP systems in the cloud instead of traditional “on-premise” installations. This offered significant advantages, including greater flexibility, lower costs, and easier access to critical data. With cloud-based ERP systems, companies no longer had to worry about server maintenance or software updates, as these tasks were handled by their providers. This allowed companies to focus on their business goals and processes while ensuring their system was always up-to-date and accessible.

+1 The Future of ERP:

So where is the development of ERP systems headed today? With intelligent algorithms and artificial intelligence, systems are increasingly able to automate and optimize business processes, reducing the need for human intervention. Data will continue to play a key role in the future, as more efficient analysis of data enables companies to make better business decisions. The integration of ERP systems with various IoT devices allows real-time data exchange, providing companies with quicker and more accurate answers to support different business questions.


ERP systems are also increasingly providing personalized user experiences and offering expandable integrations with other business applications and technologies. In the future, ERP systems will not just function as tools but will provide true business intelligence and competitiveness, helping companies keep pace with the rapidly changing business environment and stand out from their competitors.

Are you exploring the world of ERP systems? Visual Labs can help you uncover the possibilities within.

Sources:

https://www.geniuserp.com/resources/blog/a-brief-history-of-erps
https://www.fortunetechnologyllc.com/history-of-erp-systems/
https://www.erp-information.com/history-of-erp.html
https://www.techtarget.com/searcherp/tip/MRP-vs-MRP-II-Learn-the-differences
https://www.business-case-analysis.com/account.html
https://www.britannica.com/technology/computer/IBM-develops-FORTRAN
https://business.joellemena.com/business/when-did-computers-start-being-used-in-business-2/


Ready to talk about your use cases?

Request your free audit by filling out this form. Our team will get back to you to discuss how we can support you.
