Blog

Insights & ideas

Stay ahead with expert articles, industry trends, and actionable insights to help you grow.

Microsoft Power Platform licensing: What’s changed in 2025 and how it affects you
10 mins read
May 19, 2025

Microsoft Power Platform licensing in 2025

Licensing rules are tighter, enforcement is stricter, and the risks are real. This post explains what’s changed, where teams slip up, and how to stay compliant without breaking your apps or your budget.

“I just want to build and share apps. Why is licensing so hard?”

If you’ve ever said this or heard it from someone on your team, you’re not alone. In 2025, licensing remains one of the most frustrating parts of working with Power Platform. It’s a constantly recurring topic in community forums like Reddit, Slack threads, and internal support channels, discussed by admins, creators, and even casual users.

The system is full of fine print, scattered across admin centers, with policies that quietly shift from one month to the next. And just when you think you’ve figured it out, boom, an app fails to launch due to a missing license.

The frustration is real. One admin put it bluntly on Reddit:

“I’ve been in the Microsoft 365 Admin Center, Azure AD, Power Platform Admin Center… and I still can’t figure out how to assign a license to a user.”

So why bother trying to make sense of it?

Because Microsoft is now actively enforcing licensing rules, particularly around API usage, multiplexing, Copilot access, and entitlement compliance.

Licensing is no longer just a back-office detail. It now directly affects whether your apps run properly or slow down dramatically mid-process.

In 2025, Microsoft has tightened compliance requirements, especially around how requests are tracked, who’s licensed, and how apps are built. But if you know where to look, there’s more clarity too: Microsoft has finally provided better tools to help you stay ahead.

This post kicks off our new series on Power Platform licensing. If you’re in IT operations, managing Power Platform environments, or supporting citizen developers, this one’s for you.

What are the Power Platform licensing options in 2025?

In 2025, Microsoft offers three main premium licensing options for Power Platform:

  • Per App Plan: Best for single, focused apps. Includes one app or one portal per user. Lacks built-in consumption tracking, so admins rely on custom monitoring.
  • Per User Plan: Ideal for power users and admins. Grants access to unlimited apps and environments, making it easier to manage at scale.
  • Pay-As-You-Go: Great for pilots or variable usage. Billed through Azure, but requires extra setup and ongoing oversight.

Choosing the right model depends on your usage patterns, scalability needs, and how much visibility you require.

Wait, isn’t Power Platform free with M365?

Yes and no.

Microsoft 365 plans (like E3 and E5) include Power Apps, but only for standard connectors like SharePoint or Outlook. The moment you introduce Dataverse, SQL, or custom APIs, you’ve stepped into premium territory.

And here’s the catch: read-only access to premium data? Still requires a premium license.

Why is my automation suddenly slowing down? The hidden cost of exceeding licensing limits

If your flow is throttling, your app is stuck, or your chatbot has gone quiet, the culprit might not be a technical bug — it might be your licensing.

Fragmented admin centers = Fragmented visibility

One major reason automations break or slow down is that teams unknowingly exceed API or capacity limits. This often happens because the fragmented admin experience makes it difficult to get a clear, centralised view of what’s being used and what’s licensed.

Licensing and usage insights are spread across multiple portals:

  • Licenses are assigned in the Microsoft 365 Admin Center
  • Group-based licensing is managed in Entra ID
  • Usage data lives in the Power Platform Admin Center

No single place gives you the full picture, so IT teams are forced to piece together licensing status and consumption manually.
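
If you want to script that piecing-together instead of clicking through three portals, Microsoft Graph exposes per-user license details. Below is a minimal, illustrative sketch (not an official audit tool): it assumes an Entra ID app registration with an admin-consented User.Read.All application permission, and the POWERAPPS/FLOW SKU filter is an assumption you should verify against your own tenant’s subscribedSkus list.

# Minimal sketch: inventory Power Platform license assignments via Microsoft Graph.
# Assumes an Entra ID app registration with the User.Read.All application permission
# (admin-consented). TENANT_ID / CLIENT_ID / CLIENT_SECRET are placeholders.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# Page through all users and list anyone holding a Power Apps / Power Automate SKU.
# The POWERAPPS / FLOW substrings are illustrative; check the skuPartNumber values
# returned by GET /subscribedSkus in your own tenant before relying on them.
url = "https://graph.microsoft.com/v1.0/users?$select=id,displayName,userPrincipalName"
while url:
    page = requests.get(url, headers=headers).json()
    for user in page.get("value", []):
        details = requests.get(
            f"https://graph.microsoft.com/v1.0/users/{user['id']}/licenseDetails",
            headers=headers,
        ).json()
        skus = [d["skuPartNumber"] for d in details.get("value", [])]
        power_skus = [s for s in skus if "POWERAPPS" in s or "FLOW" in s]
        if power_skus:
            print(user["userPrincipalName"], power_skus)
    url = page.get("@odata.nextLink")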

You might be using features that aren’t actually covered

It’s common to assume that Power Apps are “free” with Microsoft 365. But once you start using premium connectors, Dataverse, or custom APIs, you’ve stepped into premium territory, and that can lead to access issues or performance slowdowns if the right licenses aren’t in place.

Power Platform = Multiple products, each with their own licensing rules

What makes it harder is that Power Automate, AI Builder, and Copilot Studio all come with separate entitlements and limitations. Even though they’re part of the same ecosystem, each requires different types of licenses, usage monitoring, and setup practices.

  • Power Automate offers per-user and per-flow plans. Flows tied to individual accounts often fail when roles change or users leave. Using service accounts with Per Flow licenses can improve reliability. Also: every API call now counts toward your usage limits, background processes included.

AI features = New licensing surprises

  • Copilot Studio is not bundled with most Power Apps plans by default. If your bots use custom plugins, external data sources, or generative AI, you may need extra capacity or Azure billing.
  • AI Builder credits are included in some plans, but they’re limited, and they run out fast if you’re using features like form recognition or prediction models at scale.

Bottom line: If your automations are slowing down, it’s probably not random. It’s likely a licensing boundary you didn’t know you crossed.

To stay compliant and maintain performance, operations teams need to be fluent in both legacy and modern models, a growing challenge for anyone managing Power Platform at scale.

What are some common licensing pitfalls?

You don’t need to be an expert in every detail of Microsoft’s SKU catalogue, but you do need to know where teams get tripped up. These are the biggest traps we’re seeing in 2025:

Multiplexing

What it is: Multiple users interact with an app using a single licensed account, often via embedded tools, shared portals, or apps embedded in Teams or SharePoint.

Why it’s risky: Microsoft explicitly forbids it, and yes, they’re checking. This is a fast track to non-compliance.

Request enforcement

Every. Single. API. Call. Counts.

That means background syncs, Power Automate flows, and even system-generated updates all contribute to usage limits. And when those limits are exceeded, restrictions like throttling or flow suspension kick in.
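
To get a feel for how quickly those limits are reached, do the arithmetic before a flow goes live. The sketch below is a back-of-envelope estimate only; the daily limits are illustrative placeholders, so check Microsoft’s current request-limit documentation for the numbers that actually apply to your licenses.

# Back-of-envelope check: will a scheduled flow's API calls fit inside the owner's
# daily request entitlement? The limits below are illustrative placeholders; always
# confirm the current per-license request limits in Microsoft's documentation.
DAILY_REQUEST_LIMIT = {
    "m365_seeded_rights": 6_000,      # assumed value for seeded M365 entitlements
    "power_apps_per_user": 40_000,    # assumed value for a standalone per-user plan
}

def daily_requests(runs_per_day: int, actions_per_run: int) -> int:
    """Every action/connector call in a run counts toward the owner's limit."""
    return runs_per_day * actions_per_run

# Example: a sync flow that runs every 5 minutes and executes 15 actions per run.
usage = daily_requests(runs_per_day=288, actions_per_run=15)
for license_name, limit in DAILY_REQUEST_LIMIT.items():
    status = "OK" if usage <= limit else "OVER LIMIT - expect throttling"
    print(f"{license_name}: {usage} / {limit} requests per day -> {status}")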

How can I audit my team before Microsoft does?

Start with mapping user roles and needs before assigning licenses. Who’s building apps? Who’s using them? Which connectors are involved? This upfront planning helps avoid deployment issues later.

Here’s our recommended approach:

  1. Map app dependencies

Make a list of who’s using what. Understanding which users rely on which apps and connectors helps prevent disruptions and supports better license planning (a minimal sketch of this mapping follows the list).

  2. Track requests

Mark usage spikes and high-risk flows. Monitoring API consumption helps you identify patterns, avoid overages, and spot potential performance or compliance risks.

  3. Watch for multiplexing

Shared accounts are a red flag. Using a single licensed account to serve multiple users violates Microsoft’s licensing terms and can trigger audits or enforcement actions.

  4. Audit license assignments

Ensure users have the right entitlements. Regularly reviewing who has what license helps close gaps, prevent over-licensing, and maintain compliance.

  5. Plan for scale

Anticipate growth before it breaks your budget. Projecting future app usage and user needs lets you adjust licensing proactively and avoid costly surprises later.
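
Referring back to step 1, here is a toy sketch of what that dependency map can look like once you have exported app, connector, and user data from the admin centers. The connector names, the premium-connector set, and the "svc-" naming heuristic for spotting shared accounts are all assumptions made purely for illustration.

# Toy sketch for steps 1 and 3 above: map who uses which app and connectors, then
# flag apps on premium connectors and accounts shared across apps (a multiplexing smell).
# The inventory list is a stand-in for data exported from your admin centers.
from collections import defaultdict

PREMIUM_CONNECTORS = {"Dataverse", "SQL Server", "Custom API"}  # illustrative set

inventory = [
    {"app": "Expense Tracker", "connector": "Dataverse",  "user": "svc-shared@contoso.com"},
    {"app": "Expense Tracker", "connector": "Dataverse",  "user": "anna@contoso.com"},
    {"app": "Site Checklist",  "connector": "SharePoint", "user": "ben@contoso.com"},
    {"app": "Quote Portal",    "connector": "SQL Server", "user": "svc-shared@contoso.com"},
]

apps_needing_premium = {row["app"] for row in inventory if row["connector"] in PREMIUM_CONNECTORS}
print("Apps that need premium licenses:", sorted(apps_needing_premium))

# Accounts that front several apps are worth a closer look for multiplexing.
apps_per_user = defaultdict(set)
for row in inventory:
    apps_per_user[row["user"]].add(row["app"])
for user, apps in apps_per_user.items():
    if len(apps) > 1 and user.startswith("svc-"):
        print(f"Review {user}: used across {sorted(apps)}")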

What tools can I use to monitor my team’s Power Platform usage?

Power Platform Admin Center

It helps you get a detailed breakdown of:

  • Request volumes per user/app
  • API usage across environments
  • Gaps between license assignment and actual usage

Access is available to environment and tenant-level admins with appropriate roles (such as Power Platform admin or Global admin). To get meaningful insights, ensure that telemetry and usage reporting are enabled and your environments are correctly configured.

Azure Monitor integration

You can connect your Power Platform environment for real-time insights. Set alerts when nearing request limits or use it to prove compliance during audits. This integration is available to admins with Azure and Power Platform access, and requires environment-level configuration along with proper permissions to set up diagnostics and monitoring rules.

Licensing simulators

Microsoft has introduced calculators to model license needs based on usage and app scope. These tools are available to administrators and licensing managers with appropriate access to the Power Platform Admin Center or Microsoft licensing portals, and are most effective when accurate usage data and app requirements are already mapped out. Use these early before rollout, not after failure.

A little prep goes a long way in staying compliant and avoiding surprises.

Make licensing work for your team

Licensing may never be simple, but with the right strategy and regular health checks, it’s manageable. Whether you're launching your first app or scaling across teams, clarity is key to staying compliant and avoiding surprises.

You don’t need to know every rule, just how to navigate the essentials. Stay informed and stay in control.

If you’re not sure which license is best for your team, contact us to discuss your use cases.

Up next in our Power Platform licensing series:

  • Power Platform Licensing within D365 & M365
  • Staying ahead of connector changes in Power Platform
  • Request management made easy: How to stay within limits
  • Scaling without breaking your budget

Bulk Work Item Management: the Dynamic Duo of Azure DevOps and Microsoft Excel
October 18, 2024
3 min read

When using Azure DevOps, it is often a challenge to modify the status of multiple Work Items in large projects at the same time. This can significantly complicate admin tasks and make the project management process time-consuming.

However, Azure DevOps’ bulk editing feature can provide a solution to this problem. It allows you to edit the state of multiple Work Items at once, either tagging them or moving them between iterations.
If you need to make more complex bulk changes, however, a Microsoft Excel add-in can be of great help. It supports adding Work Items, updating existing ones, adding links or attachments to multiple Work Items, and much more.

By using these, consultants and project teams can respond quickly and efficiently to changing requirements or needs related to current project processes. This can result in improved project efficiency and reduced administrative workload. The ability to manage work items in bulk can result in significant time and resource savings throughout the project lifecycle.

So how can we use this solution?

There are a few prerequisites that are necessary to use this operation:

  • Microsoft Excel 2010 or later must be installed
  • You need to install Azure DevOps Office Integration 2019 (Available for free at https://visualstudio.microsoft.com/downloads/?ref=techielass.com#other-family)
  • You must be a member of the project you want to edit.
  • You must be allowed to view and edit Work Items in the project
  • If you want to add or edit Work Items, you must have stakeholder access

If all the necessary conditions have been met, the following steps can be followed to modify and create work items en masse:

  1. Open Excel
  2. Click on the Team tab in the top menu (this should appear automatically after installing the extension; if it is not visible, you can enable it in Excel’s ribbon customization settings).

To be able to make bulk changes, we need to connect the Excel plugin to the Azure DevOps environment. This requires the following steps:

  1. Click on New List
  2. This will activate a pop-up window that will ask us to connect to an Azure DevOps Server.
  3. Click on the Servers button
  4. Click on the Add button
  5. Enter the URL of the Azure DevOps server
  6. Then click OK
  7. You will be prompted to authenticate the Azure DevOps server
  8. Then click Close to connect Excel to the DevOps server. The add-in will now receive information about your team projects.
  9. Here we will be able to select the project within which we want to make the changes

10. You need to select which Work Items you want to download from the Azure DevOps tables to Excel (the connection will always pull the items from a query created in DevOps, so it is a prerequisite that there is a view, a list created under the project).

11. Now we can start making the necessary changes. When you are done, click on the Publish button under the Team tab. This will send the completed changes to the server where Azure DevOps will do its job.

If you refresh the Azure DevOps view in your browser, you will see the changes you have made.

Bulk editing Azure DevOps items using Excel is a significant time and resource saver for everyone. On larger Visual Labs projects, this feature is regularly used by our project managers and consultants, and it greatly helps us operate more efficiently.

Monitoring Standardization: using Workbooks for Logic Apps, Azure Functions and Microsoft Flows
October 18, 2024
7 min read

Problem Statement

Monitoring of the three platforms mentioned in the title is handled independently, in different locations. Logic Apps can be monitored either from the resource’s run history page or through the Logic App Management solution deployed to a Log Analytics workspace. Azure Functions have Application Insights, while the run history of Microsoft Flows is available on the Power Platform.

Most of our clients’ solutions consist of these resources, which often chain together and call each other to implement business processes and automations. There was no centralized supervision for them, which made error tracking and analysis difficult for our colleagues, who also had to log into the client’s environment to perform these tasks.

Goal

We wanted to get a general overview of the status of the solutions we deliver to our clients, reduce our response time, and catch issues proactively before our clients report them. We aimed to track our deployments in real time, providing a more stable system and a more convenient user experience. We also wanted to make the monitoring solution available within Visuallabs, so that we could carry out monitoring tasks from the tenant that hosts our daily development activities.

Solution

Infrastructure Separation

Our solution was built on the infrastructure of a pilot client, whose setup can be considered a prerequisite. On the Azure side, separate subscriptions were created for each project and environment, while for Dynamics only separate environments were used. Flows are distinguished per project by naming conventions, and since their log collection is configured manually, the target workspace can be chosen freely.

Centralized Log Collection

It was obvious to use Azure Monitor with Log Analytics workspaces for log collection. Diagnostic settings were configured for all Azure resources, allowing us to send logs to a Log Analytics workspace dedicated to the specific project and environment. For Microsoft Flows, we forward logs to a custom monitor table created for Flows using the built-in Azure Log Analytics Data Collector connector data-sending step. This table was created to match the built-in structure of the Logic Apps log table, facilitating the later merging of the tables.

Making Logs Accessible in Our Tenant

An important criterion for the solution was that we did not want to move the logs; they would still be stored in the client’s tenant; we only wanted to read/query them. To achieve this, we used Azure Lighthouse, which allows a role to be enforced in a delegated scope. In our case, we set up a Monitoring contributor role for the client’s Azure subscriptions for a security group created in our tenant. This way, we can list, open, and view resources and make queries on Log Analytics workspaces under the role’s scope from our tenant.

Visualization

For visualization, we used Azure Monitor Workbook, which allows data analysis and visual report creation, as well as combining logs, metrics, texts, and embedding parameters. All Log Analytics workspaces we have read access to via Lighthouse can be selected as data sources. Numerous visualizations are available for data representation; we primarily used graphs, specifically honeycomb charts, but these can easily be converted into tables or diagrams.

Combining, Customizing, and Filtering Tables

To process log tables from different resources together, we defined the columns that would be globally interpretable for all resource types and necessary for grouping and filtering. These include:

  • Client/Tenant ID
  • Environment/Subscription ID
  • Resource ID/Resource Name
  • Total number of runs
  • Number of successful runs
  • Number of failed runs

Based on these, we could later determine the client, environment, project, resource, and its numerical success rate, as well as the URLs needed for references. These formed the basis for combining tables from various Log Analytics Workspaces and resources for our visualizations.
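
For illustration, the per-resource totals can come from a query along these lines. The sketch below uses the azure-monitor-query SDK with an AzureDiagnostics-shaped Logic Apps query; the table names, column names, and workspace IDs are assumptions to adapt to your own diagnostic settings and to the custom _CL table your Flow logging step writes to.

# Sketch of the kind of query behind the resource views: total / successful / failed
# runs per Logic App across several workspaces. Column names such as status_s and
# resource_workflowName_s are illustrative AzureDiagnostics fields - adjust them
# (and add a union with your custom Flow *_CL table) to match your own schema.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

KQL = """
AzureDiagnostics
| where Category == "WorkflowRuntime"
| where OperationName == "Microsoft.Logic/workflows/workflowRunCompleted"
| summarize Total = count(),
            Succeeded = countif(status_s == "Succeeded"),
            Failed = countif(status_s == "Failed")
  by ResourceName = resource_workflowName_s
"""

client = LogsQueryClient(DefaultAzureCredential())
workspace_ids = ["<project-dev-workspace-id>", "<project-prod-workspace-id>"]  # placeholders

for ws in workspace_ids:
    result = client.query_workspace(ws, KQL, timespan=timedelta(days=1))
    for table in result.tables:
        for row in table.rows:
            print(ws, list(row))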

User Interface and Navigation

When designing the user interface, we focused on functionality and design. Our goal was to create a visually clear, easy-to-interpret, interactive solution suitable for error tracking. Workbooks allow embedding links and parameterizing queries, enabling interactivity and interoperability between different Workbooks. Utilizing this, we defined the following levels/types of pages:

  • Client
  • Project
  • Resources
  • Logic App
  • Azure Function
  • Flow

At the Client and Project levels, clicking on a name displays the next subordinate Workbook in either docked or full-window view, passing the appropriate filtering parameters. Time is passed as a global parameter during page navigation, but it can be modified and passed down on individual pages. We can filter runs retrospectively by a specific minute, hour or day, or even between two dates.

On the page displaying resources, we provide several interactions for users. Clicking on a resource name navigates to the resource’s summary page on the Azure Portal within our tenant, thanks to Lighthouse, without tenant switching (except for Power Automate Flows). Clicking on the percentage value provides a deeper insight into the resource’s run history and errors in docked view. This detailed view is resource-type-specific, meaning each of the three resource types we segregated has its own Workbook. We always display what percentage of all runs were successful and how many faulty runs occurred, with details of those runs.

Logic App

Beyond general information, faulty runs (status, error cause, run time) are displayed in tabular form if any occurred during the specified time interval. Clicking the INSPECT RUN link redirects the user to the specific run where all successful and failed steps in the process can be viewed. At the bottom, the average run time and the distribution of runs are displayed in diagram form.

Microsoft Flow

For Flows, the same information as for Logic Apps is displayed. The link also redirects to the specific run, but since it involves leaving Azure, logging in again is required because Dynamics falls outside the scope of Lighthouse.

Azure Function

The structure is the same for Azure Functions, with the addition that the link redirects to another Workbook instead of the specific run’s Function App monitor page. This is necessary because only the last 20 runs can be reviewed on the Portal; for older runs we need to use Log Analytics. To facilitate error tracking, the custom log entries that developers write in the code are displayed in chronological order for the faulty run.
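
A sketch of the kind of trace lookup such a Workbook can run is shown below. The AppTraces table and OperationId column apply to workspace-based Application Insights (a classic resource exposes the same data as traces / operation_Id), and the invocation id and workspace id are placeholders.

# Sketch of the per-run trace lookup behind the Azure Function detail view: pull the
# developer-written log lines of one failed invocation in chronological order.
# AppTraces / OperationId assume workspace-based Application Insights.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

TRACES_KQL = """
AppTraces
| where OperationId == "<failed-invocation-id>"   // placeholder invocation id
| project TimeGenerated, SeverityLevel, Message
| order by TimeGenerated asc
"""

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace("<workspace-id>", TRACES_KQL, timespan=timedelta(days=7))
for row in result.tables[0].rows:
    print(list(row))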

Consolidated View

Since organizationally, the same team may be responsible for multiple projects, a comprehensive view was also created where all resources are displayed without type-dependent grouping. This differs from the Workbook of a specific project’s resources in that the honeycombs are ordered by success rate, and the total number of runs is displayed. Clicking on the percentage value brings up the previously described resource type-specific views.

Usability

This solution can be handy in cases where we want to get a picture of the status of various platform services in a centralized location. This can be realized interactively for all runs, except for Flows, without switching tenants or possibly different user accounts. Notification rules can also be configured based on queries used in Workbooks.

Advantages:

  • The monitoring system and visualization are flexible and customizable.
  • New resources of the same type can be added with a few clicks to already defined resource types (see: configuring diagnostic settings for Logic Apps).

Disadvantages:

  • Custom log tables, visualizations, and navigation between Workbooks require manual configuration.
  • Integrating Flows requires significantly more time investment during development and planning.
  • Combining tables, separating environments and projects can be cumbersome due to different infrastructure schemas.
  • Basic knowledge of KQL (Kusto Query Language) or SQL is necessary for queries.

Experience

The team that implemented the solution for the client provided positive feedback. They use it regularly, significantly easing the daily work of developer colleagues and error tracking. Errors have often been detected and fixed before the client noticed them. It also serves well after the deployment of new developments and modifications. For Logic Apps, diagnostic settings are included in ARM (Azure Resource Manager) templates during development, so runs can be tracked from the moment of deployment in all environments using release pipelines.

The past, present and future of ERP systems
October 18, 2024
6 min read

When I first started working at VisualLabs, during the first WSM (weekly standup meeting), where each business unit reports on their current weekly tasks, I noticed how many acronyms we use. As a member of the ERP team, the question arose in my mind: besides the fact that we use them, do we know exactly how they were developed and what they mean?

Everyone is now familiar with the term ERP (Enterprise Resource Planning), but few people know its exact origins and how it evolved. So I thought I'd gather together where it all started and what the main milestones were that helped shape the ERP systems we know today. As we look back in time, we will realise how deeply rooted this technology is in the modern business world.

In this blog, I've collected 7 milestones that helped shape the ERP system we know today.

In today's world, it would be unthinkable for a company not to use some kind of computer system for its various processes. However, even before the advent of the computer, companies had to manage these processes (be it accounting or production planning) in some way. Let's take accounting as an example. Accountants recorded every single financial transaction on paper, by hand, in different books. And they were managed day by day, month by month. It is hard to imagine that companies often had rooms full of (general) ledgers and files, each containing dozens of transactions. And at the heart of it all was the accountants' most treasured possession: the ledger. It is hard to imagine how much work must have been involved in the year-end closing process and how many mistakes must have been made in the process.

  1. The birth of computers (1950s):

In the 1950s, with the birth of the computer - for which János Neumann laid the theoretical foundations - a new dimension opened up in the way companies operate and transform their processes. Although these computers were used in the 1950s mainly in the military and scientific fields - because of their large size and price - they soon became part of the business world thanks to continuous technological developments. These tools enabled faster processing and analysis of data and helped automate the activities of companies.

2. Inventory management and control (1960s):

One of the first milestones in the growing uptake of the computer and the realisation of its potential for business dates back to the 1960s. It was then that the manufacturing industry recognised the need for a system that would allow them to manage, monitor and control their inventory. The advent of information technology allowed companies to integrate and automate their business processes and, as a result, improve the efficiency and accuracy of their inventory management. This was one of the first steps towards the emergence of ERP systems.

3. Material Requirement Planning (MRP I, 1970s):

The concept of MRP (Material Requirements Planning) first emerged in 1970 and was essentially a software-based approach to planning and managing manufacturing processes. MRP focused primarily on planning and tracking material requirements, allowing companies to predict more accurately the quantities and types of materials they would need in their production processes. MRP enabled companies to manage material procurement and production scheduling more efficiently, reducing losses caused by over- or under-ordering. This innovation had a significant impact on the manufacturing industry and fundamentally transformed the way companies plan materials, helping to increase the efficiency and competitiveness of manufacturing companies in the 1970s.

4. Production Resource Planning (MRP II, 1980s):

The 1980s marked a major milestone with the advent of MRP II systems. While MRP focused exclusively on the inventory and materials needed to meet actual or forecast customer demand, MRP II provided greater insight into all other manufacturing resources. By extending production planning beyond materials to labor, machinery, and other production resources, it gave companies much greater control over their manufacturing processes.

5. Enterprise resource planning systems (ERP, 1990s)

It was in the 1990s that the first true ERP systems were introduced (the term ERP itself was first used in the 1990s by the research firm Gartner). ERP systems were a significant improvement over MRP II, as they focused not only on the full integration and automation of manufacturing processes but of business processes as well, such as purchasing, sales, finance, human resources and accounting. As a result of this full integration, companies became able to manage their business processes in a single database. This brought many benefits: the unified storage and management of information ensured access to accurate, up-to-date data, which improved decision-making and efficiency, and the interconnected business areas helped companies make and implement coherent strategies. As a result, the ERP system became a 'one-stop shop' that brought together and managed all corporate information.

6. Web-based functionalities with the advent of the Internet (ERP II, 2000s):

In the mid-2000s, the role of the Internet in the business world increased and ERP systems adapted to this change. Systems began to incorporate customer relationship management (CRM) and supply chain management (SCM) functionality. With ERP II, the focus has been on user-friendliness and customisation. Modular systems were developed that allowed businesses to select and implement the components that best fit their operations.

7. Cloud ERPs (2010s):

In the 2010s, the emergence of cloud technology gave a new dimension to ERP systems. Cloud ERP solutions have enabled companies to host and run their ERP systems in the cloud instead of traditional on-premise deployments. This has offered significant benefits, including greater flexibility, lower costs and easier access to critical data. Thanks to cloud ERP systems, companies no longer have to worry about server maintenance or software updates, as these tasks are handled by their service providers. This allows companies to focus on their business goals and processes while ensuring that their systems are always up-to-date and available.

+1 The future of ERP:

And where is the development of ERP systems heading today? With the help of intelligent algorithms and artificial intelligence, systems are increasingly capable of automating and optimising business processes, reducing the need for human intervention. Data will continue to play a key role in the future, as companies are able to make better business decisions by analysing it more effectively. The integration of ERP systems with various IoT tools will enable real-time data exchange and real-time analysis to provide companies with faster and more accurate answers to support different business issues.

ERP systems also increasingly offer a personalised user experience and extensible integrations with other business applications and technologies. In the future, ERP systems will not only function as a tool, but will provide companies with real business intelligence and competitiveness, helping them to keep pace with the rapidly changing business environment and to stand out from their competitors.

Are you familiar with the world of ERP systems? Visual Labs can help you explore its potential.

Sources:

https://www.geniuserp.com/resources/blog/a-brief-history-of-erps

https://www.fortunetechnologyllc.com/history-of-erp-systems/

https://www.erp-information.com/history-of-erp.html#google_vignette

https://www.techtarget.com/searcherp/tip/MRP-vs-MRP-II-Learn-the-differences

https://www.business-case-analysis.com/account.html

https://www.britannica.com/technology/computer/IBM-develops-FORTRAN

https://business.joellemena.com/business/when-did-computers-start-being-used-in-business-2/

Hide subgrid buttons specifically
October 18, 2024
3 min read

Depending on where a sale is in the process, different functions should be available on a subgrid on a form. In practice this means that you can add enquiries to a Lead at the beginning of the process, but they cannot be changed later, in the Opportunity phase.

A little technical knowledge is required to understand and apply this article, so it is recommended for Dynamics 365 CE app makers who are already comfortable in the Power Platform world.

Default position:

[Screenshot: new products]

THE GOAL:

[Screenshot: product insert]

Tools used for the solution:
  • JavaScript (the WebResources Manager plugin for XrmToolBox is recommended)
  • Ribbon Workbench for Dynamics 365 (Develop 1 Ltd)

1. Solution

We need to create a Solution, which we will load into the Ribbon Workbench. Into this Solution we need to load the entity whose SubGrid we want to modify. (It is important that when we add the existing entity to the Solution we do not import any other elements). The Solution name should always be built according to the following logic: Ribbon_VL_[entity name] e.g. Ribbon_VL_Product_Interest

2. Subgrid

The SubGrid must be given a unique, identifiable name. Do not keep the auto-generated default name, because you will need to refer to this name later.

3. JavaScript

The JavaScript below should be created as a .js file (for example in VS Code) and uploaded to the solution containing the web resources. When creating the file, it is a good idea to give the web resource the same name as the file, to make it easier to find later.

forProductInterestView: function (selectedControl) {
    "use strict";
    console.log("start.forProductInterestView");
    // Name of the subgrid whose buttons should be hidden
    var excludedPayRun = "subgrid_prodinterest";
    var currentGridName = selectedControl._controlName;
    console.log("forProductInterestView - currentGridName: " + currentGridName);
    // Return false (hide the button) when the command runs on the targeted subgrid
    if (currentGridName == excludedPayRun) {
        console.log("end.forProductInterestView.true");
        return false;
    } else {
        console.log("end.forProductInterestView.false");
        return true;
    }
}

4. Ribbon Workbench

Open the Ribbon Workbench and add the solution you created in the first step. Each entity has three ribbons; we need the SubGrid one.

Select the button you want to hide by right-clicking on it and choosing "Customise Button". A red tick will then appear, and the button will be added to the Buttons section below. If it is already ticked, the button already has a command, so you can skip this step and simply add a new command to it. After that, you have to add a Command, which can be done with the plus sign in the Commands section. The command should look like this:

subgrid

Explanation:

  • Library: the web resource you added to the solution (this is where the descriptive name pays off)
  • Function name: the name you gave the function in the JavaScript file
  • CRM Parameter: the parameter to pass, in this case SelectedControl. This control handles the SubGrids on forms and any list view; PrimaryControl handles the form itself.

Next, we need to add an Enable Rule that hides the buttons.

Explanation:

  • Library: the web resource you added to the solution (this is where the descriptive name pays off)
  • Function name: the name you gave the function in the JavaScript file
  • CRM Parameter: the parameter to pass, in this case SelectedControl

Only one step left before Publish: for each button, you need to specify which Command belongs to it.

I hope you find this article useful and that we have been able to give you some ideas.

Top features of Dynamics 365 Business Central 2024 Release Wave 1
October 18, 2024
6 min read

It's that time of year again: Microsoft has released the first major update bundle for its Dynamics 365 Business Central enterprise management system.

There are two major update packs a year, in April and October, and smaller updates also arrive monthly. As part of the Wave 1 release, Microsoft has announced a number of new features. Beyond the usual user experience improvements, this year there is a big focus on the project module and on further enhancements to the artificial intelligence line, with new functionality added to Copilot, which was introduced last year.

In this article we'll take a closer look at what we think are the most exciting features of Wave 1.

AI & Copilot

For some time now, artificial intelligence has been increasingly embedded in all areas of life, including business. Microsoft's Copilot has given Business Central its first AI features, which will be further enhanced with the latest version.

Copilot, for example, makes it easier to fill out sales order lines by entering a few simple criteria.

The analytical views generated by Copilot also make it possible to produce increasingly faster and more accurate reports.

It generates an analysis in seconds based on the instructions given:

The system also has a Copilot chat feature, where you can ask the AI, for example, about mapping a process within Business Central, and get a very useful - step by step - description back:

What's unfortunate is that the new features still don't support the Hungarian language, so for the time being English-speaking users are at an advantage.

Inventory cost recovery tool

We have a completely new interface that extends the range of cost recovery methods we are already familiar with.

But what exactly do we get?

With the new tool, you can check and track the cost recovery for all items or only selected items.

You can now easily identify problematic articles and run automatic corrections for the rest without any problems.

It should be stressed that the new tool does not replace existing processes, but instead helps troubleshooting and improves system performance.

Multi-line text export/import with Config. package

Fields of type BLOB, which usually contain multi-line texts, have been added to the optional fields in the config. packages thanks to the new update.

A good example is the 'Job Description' field on the Sales Invoices page, which can now be imported from Excel to Business Central. This is particularly important as the content of this field is also displayed on the invoice screen.

Sharing error message details with another user

We often encounter error messages while using the system. These messages are not always clear to the average user, so asking for help may require additional information.

Gathering this information can be a time-consuming process, especially if a developer also needs to be involved.

This small (but very useful) improvement makes that process easier: it allows all the necessary error details (with the relevant identifiers) to be forwarded to the right person by email or Teams in just two clicks.

New Excel reports

Microsoft has added 8 new Excel reports to the system this year, supporting financial, sales and purchasing processes. These reports allow reporting without the need for a developer.

Users can also use the latest reports as templates or enhance them to present their business data in the best way.

For the time being, the new reports have the suffix preview.

Project module updates

The project module has been given special attention this year.

In addition to the new fields that can be displayed on the Projects page with Personalisation, the details panel on the right-hand side of the information panel has been expanded.

The invoicing process for projects has been simplified from start to finish. With the new action for retrieving project planning lines, you can now add planning lines from multiple projects to a single sales invoice, and also invoice projects to multiple customers.

You can now add items to be assembled to order to projects, similar to sales orders. In this case, an assembly order is automatically created in the background, containing the required components and their quantities.

A project archiving function has also been added, which works in a similar way to the archiving of sales and purchase orders.

Field Service and Business Central integration

Businesses using Field Service can rejoice. The new integration will make data communication between the two systems smoother.

Users can manage their work orders, service task progress, resource scheduling and consumption management on the Field Service side. Once a work order is completed, the integration allows for easy transfer of the necessary data to Business Central.

The integration also facilitates billing processes. Users can now generate accurate invoices based on service activities performed and consumption recorded on the Field Service side.

New role available for receivables management

The role for receivables management has been expanded. We get new lists, tiles and a revamped menu with an embedded action menu.

Automation of payment reminders

The new update includes a major enhancement to the payment reminders feature. It is now possible to automatically send reminders to customers about overdue amounts based on predefined conditions.

You can set filters for the whole process or for individual transactions and attach a PDF of the receivables to the emails.

Connect the Business Central environment to the Power Platform environment

We can integrate Business Central with a range of Power Platform and Dynamics 365 products. But with the new update, we have the ability to connect environments through Business Central's admin center.

This will allow BC to inherit the settings of the associated Power Platform environment, including the encryption keys managed by the customer. This linked environment serves as the default target for other Dynamics 365 and Power Platform products such as Sales or Power Automate.

Worksheets now also available on mobile

It is increasingly important that more areas of Business Central can be accessed from mobile as more and more users choose to work this way.

Worksheets, such as the various journal pages, can now be accessed from mobile, something that was previously only available from tablets and desktops.

Developments in warehousing and stock management

Some changes to stock and inventory management have also been made based on regular user feedback:

  • We can now assign lot and package numbers to existing tracking lines
  • We can manage warehouse item tracking queues
  • The warehouse items page inherits the item’s name from the item card

If you're interested in further developments or just want to understand more about the article, feel free to contact the Visual Labs team and we'll be happy to help: info@visuallabs.hu

Should we implement a cloud or on-premise Business Central system in our company?
October 18, 2024
6 min read

Introduction

In today's post, we compare Dynamics 365 Business Central ERP in the cloud (Software as a Service, SaaS) and on-premise (hosted locally) from several angles, to give business decision-makers helpful perspectives when weighing the complexities of an ERP implementation. Microsoft sells two ERP systems: one is the aforementioned Business Central; the other is Dynamics 365 Finance and Operations, designed for large enterprises.

Every day we hear that more and more business IT solutions are "running in the cloud" or "going to the cloud". In our personal lives, cloud-based software solutions are also becoming more and more common. Just think of our data stored in Google Drive, iCloud or Dropbox.

As business leaders, we can also choose to implement our enterprise resource planning (ERP) system in the cloud. Flexera estimates that the global cloud ERP market will grow from $64.7 billion in 2022 to around $130 billion by 2027, an annualised growth rate of 15%. Covid has also given a big boost to the expansion of cloud ERP systems, so they clearly have great potential.

Before we start implementing our Business Central system, we first have to decide whether we want to run the software in the cloud or store the data on our own server.

Costs

A cloud-based solution has lower initial costs, as it is usually subscription-based and does not require a large initial investment. Costs are more flexible and easier to manage, whereas a deployed version has higher upfront costs, including hardware, software, installation and maintenance. In addition, ongoing operations (infrastructure and staff) and upgrades can be costly.

Data storage location

The most fundamental difference between cloud and on-premise systems is the physical location of the data itself. With cloud-based Business Central, your company's data is stored in one of Microsoft's data centres. The data stored in this way complies with EU data protection requirements, including the GDPR. In addition, Microsoft guarantees 99.99% availability of the Business Central application, which means the system is down at most 0.01% of the time, and even then typically outside working hours.

Developments

Every company has unique requirements. These can usually be addressed through extensions or custom development. Modern cloud solutions offer more and more customisation and integration options, and in on-premise environments it is also possible to modify the source code directly. The end result is the same: in both cases, you have the flexibility to extend out-of-the-box functionality with the careful expertise of a systems integrator like VisualLabs.

Performance and scalability

The performance of the new system is a key consideration when choosing a solution. For the cloud, Microsoft invests significant resources to ensure maximum availability and speed for its users, so businesses do not need to pay special attention to this.

For on-premise solutions, system performance depends far more on the infrastructure installed or leased by the company, since an installed Business Central requires hardware of appropriate size and power. The company itself is also responsible for performance issues; there is no availability guarantee of the kind Microsoft provides as a baseline for the cloud-based service.

Maintenance and updates

With a cloud-based system, software updates are done automatically by Microsoft, requiring less IT activity from the company. In this way, Microsoft ensures that subscribers have a modern, up-to-date, evergreen system in every respect. The timing of these updates can be set flexibly within an update window.

In the case of on-premise versions, updates have to be done by the company's IT team, which requires more resources. With on-premise, there is a risk that the latest software updates are not installed, leaving users 'stuck' with an old version. The result is outdated software over the years, which could mean yet another IT project for the business to replace it.

Whether we are talking about cloud-based or on-premise versions, it is highly recommended to test new updates in a dedicated test environment before deploying them to the live environment. These updates contain a number of new features that should make users' daily lives easier, and twice a year Microsoft releases a major update package with a range of new functionality.

Data security

The most important consideration when choosing software is that your data is secure. Microsoft provides built-in backup and disaster recovery solutions that enable faster and more reliable recovery; an outage in a Microsoft data centre goes unnoticed by the end user. In addition, Microsoft enables up-to-the-minute data recovery for up to a month.

With an on-premise solution, the company has full control over data security and access management, which means more control but also more responsibility. It can more easily comply with regulatory and legal requirements that require data to be stored on-site, especially in industries with strict data protection regulations.

Limitations of the on-premise version

The on-premise version has several limitations compared to the cloud-based solution. The AppSource marketplace is not available in the on-premise version; it contains thousands of applications, many of them free, that can be added to Business Central's functionality.

Power Platform applications (e.g. Power Apps, Power BI) cannot be natively integrated with on-premise Business Central; this requires creating and maintaining a data gateway.

In addition, the list of features that Microsoft makes available only in the cloud version keeps growing. These include integration with Microsoft Teams and the Shopify webshop engine, and Microsoft's AI solution, Copilot, is also available exclusively in the cloud-based Business Central version.

Conclusion

The table below summarises the main points of our post:

Costs
  • Cloud: Lower upfront costs with subscription pricing; flexible and easy to manage.
  • On-premise: Higher initial costs, including hardware, software, installation and maintenance. Ongoing operations (infrastructure and staff) and upgrades can also be costly.

Data storage location
  • Cloud: The data is stored in Microsoft’s data centres in compliance with EU data protection requirements (GDPR). Microsoft guarantees 99.99% availability.
  • On-premise: The data is stored on the company’s premises. The company is responsible for data protection compliance and system availability.

Developments
  • Cloud: Extensive customisation and integration options via extensions.
  • On-premise: Direct source-code modification is possible. In both cases, functionality can be extended flexibly with the help of a systems integrator.

Performance and scalability
  • Cloud: Microsoft ensures maximum availability and speed, so businesses don’t have to pay extra attention to this.
  • On-premise: Performance depends on the company’s infrastructure. It requires optimally sized hardware, and the company is responsible for performance problems.

Maintenance and updates
  • Cloud: Software updates are done automatically by Microsoft, requiring less IT activity. Flexible update scheduling is possible.
  • On-premise: The company’s IT team has to do this, which requires more resources. There is a risk that users will be stuck with old versions if the latest updates are not installed. Testing new updates before deploying them to the live environment is recommended.

Data security
  • Cloud: Microsoft provides built-in backup and disaster recovery solutions, enabling fast and reliable recovery.
  • On-premise: The company has full control over data security and access, which means more control but also more responsibility. Easier compliance with strict data protection regulations.

The final decision always requires careful consideration by the organisation facing digital transformation. Visual Labs offers and implements a cloud-based Business Central for its clients, which, based on years of experience, is a well-established model that delivers the benefits detailed above in the short term. If you need help with ERP system implementation, please contact us!

Ready to talk about your use cases?

Request your free audit by filling out this form. Our team will get back to you to discuss how we can support you.

Stay ahead with the latest insights
Subscribe to our newsletter for expert insights, industry updates, and exclusive content delivered straight to your inbox.