
How to build, deploy, and share custom AI agents with Copilot Studio
10 min read
October 15, 2025

Learn how to build, deploy, and share your own AI agents in Copilot Studio, from defining purpose and prompts to connecting data and publishing for your team. A practical, step-by-step guide to turning ideas into working copilots.


TL;DR

Copilot Studio makes it possible for IT Ops and business teams to create custom AI agents that can answer questions, run processes, and even automate workflows across Microsoft 365. We recommend a structured approach: define the use case, create the agent, set knowledge sources, add tools and actions, then shape its topics and triggers. But before you dive into building, check your data quality, define a clear purpose, and configure the right policies.  

Before you start setting anything up, stop

It’s tempting to open Copilot Studio and start dragging in tools, uploading files, and typing instructions. But an agent without a plan is just noise.

  • Get your data in order first. Bad data means bad answers, and no amount of clever configuration can save it.
  • Define the “why” before the “how.” Build around a specific use case, such as sales support, finance queries, or service troubleshooting.
  • Don’t build for the sake of building. Just because you can spin up a chatbot doesn’t mean you should. The best agents are purposeful, not experimental toys.  

Think of your agent as a new team member. Would you hire someone without a role description? Exactly.

Building your first Copilot: practical steps

Once you know the “why”, here’s how to get your first custom agent working.

1. Author clear instructions

Clear instructions are the foundation of your agent. Keep them simple, unambiguous, and aligned to the tools and data you’ve connected. Microsoft even provides guidance on how to write effective instructions.

2. Configure your agent

  • Content moderation: In the Agent → Generative AI menu, set rules for what your Copilot should (and shouldn’t) say. For example, if it can’t answer “XY”, define a safer fallback response.
  • Knowledge sources: You can upload multiple knowledge sources, or add trusted public websites so the agent uses those instead of a blind web search.
  • Add tools: Agents → Tools → Add a tool lets you extend functionality. For instance, connect a Meeting Management MCP server so your Copilot inherits scheduling skills without rebuilding them.

You’re not just configuring settings — you’re composing a system that reflects how your organisation works.

3. Validate your agent’s behaviour

As of now, there’s no automated testing of agents, but that doesn’t mean you should skip this step. You can manually test your agent as the author. The Copilot Studio test panel allows you to simulate conversations, trace which topics and nodes are activated, and identify unexpected behaviours. The panel is there to help you spot gaps, so take the time to run realistic scenarios before publishing.

4. Pick the right model

Copilot Studio now offers GPT-5 Auto (Experimental) alongside the default GPT-4o. The experimental model can feel sharper, but it may not follow instructions reliably. If stability matters more than novelty (and for most IT Ops rollouts it does), stick with the default until you’re confident.
(As of 15 October 2025; note that model availability and behaviour may change over time.)

The difference between noise and value

Rolling out a custom agent isn’t about dropping a chatbot into Teams. Done right, it’s about embedding AI into workflows where it drives real value — answering finance queries with authority, guiding service agents through processes, or combining AI with agent flows for end-to-end automation.

The difference between a useless bot and a trusted agent is preparation. Build with intent, configure with care, and test until you're sure it works properly.

You wouldn’t give a new hire access to your systems without training, policies, and supervision. Treat your AI agents the same way.

Need help automating workflows with Copilot Studio? Get in touch with our experts to discuss your use case.

Monitoring Standardization: using Workbooks for Logic Apps, Azure Functions and Microsoft Flows
October 18, 2024
7 min read

Problem Statement

Monitoring of the three platforms mentioned in the title is handled independently, in different places. Logic Apps can be monitored either from the resource’s run history page or through the Logic Apps Management solution deployed to a Log Analytics workspace. Azure Functions have Application Insights, while the run history of Microsoft Flows is available on the Power Platform. Most of our clients’ solutions consist of these resources, which often chain together and call each other to implement business processes and automations. Their centralized supervision was not solved, making error tracking and analysis difficult, and our engineers also had to log into each client’s environment to perform these tasks.

Goal

We wanted a general overview of the status of the solutions we deliver to our clients, to reduce our response time, and to catch problems before our clients report them. We aimed to track our deployments in real time, providing a more stable system and a better user experience. We also wanted the monitoring solution to be available within Visuallabs, so that we could carry out monitoring tasks from the tenant that hosts our daily development work.

Solution

Infrastructure Separation

Our solution was built on the infrastructure of a pilot client, whose structure can be considered a prerequisite. On the Azure side, separate subscriptions were created for each project and environment, while for Dynamics only separate environments were used. Project-based distinction for Flows relies on naming conventions, and since log collection is manual, the target workspace can be freely configured.

Centralized Log Collection

The obvious choice for log collection was Azure Monitor with Log Analytics workspaces. Diagnostic settings were configured for all Azure resources, sending logs to the Log Analytics workspace dedicated to the given project and environment. For Microsoft Flows, we forward logs to a custom table using the built-in Azure Log Analytics Data Collector connector. This custom table was created to match the structure of the built-in Logic Apps log table, facilitating the later merging of the two.
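
To give an idea of what ends up in the workspace, here is a minimal KQL sketch of querying such a custom table. The table and column names (FlowRuns_CL, Status_s, ResourceName_s) are hypothetical; the Data Collector API appends _CL to custom tables and type suffixes such as _s to columns:

    // Hypothetical custom table fed by the Data Collector connector
    FlowRuns_CL
    | where TimeGenerated > ago(24h)
    | summarize TotalRuns = count(), FailedRuns = countif(Status_s == "Failed") by ResourceName_s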

[Screenshots: diagnostic settings, Log Analytics workspace, log tables]

Making Logs Accessible in Our Tenant

An important criterion for the solution was that we did not want to move the logs; they would still be stored in the client’s tenant; we only wanted to read/query them. To achieve this, we used Azure Lighthouse, which allows a role to be enforced in a delegated scope. In our case, we set up a Monitoring contributor role for the client’s Azure subscriptions for a security group created in our tenant. This way, we can list, open, and view resources and make queries on Log Analytics workspaces under the role’s scope from our tenant.
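
Once the delegation is in place, a client workspace can be queried from our tenant with Azure Monitor’s cross-workspace syntax. A minimal sketch, with a placeholder workspace name:

    // Query a delegated client workspace from our own tenant
    workspace("client-projectA-prod").AzureDiagnostics
    | where ResourceProvider == "MICROSOFT.LOGIC" and Category == "WorkflowRuntime"
    | take 10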

Visualization

For visualization, we used Azure Monitor Workbook, which allows data analysis and visual report creation, as well as combining logs, metrics, texts, and embedding parameters. All Log Analytics workspaces we have read access to via Lighthouse can be selected as data sources. Numerous visualizations are available for data representation; we primarily used graphs, specifically honeycomb charts, but these can easily be converted into tables or diagrams.

Combining, Customizing, and Filtering Tables

To process log tables from different resources together, we defined the columns that are globally interpretable for all resource types and necessary for grouping and filtering. These include:

  • Client/Tenant ID
  • Environment/Subscription ID
  • Resource ID/Resource Name
  • Total number of runs
  • Number of successful runs
  • Number of failed runs

Based on these, we could later determine the client, environment, project, resource, and its numerical success rate, as well as the URLs needed for references. These formed the basis for combining tables from various Log Analytics Workspaces and resources for our visualizations.
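
As an illustration, the merge can be written along these lines in KQL; the workspace name and the Flow table and column names are assumptions carried over from the sketches above:

    // Logic App runs, reduced to the shared columns
    let logicAppRuns = workspace("client-projectA-prod").AzureDiagnostics
        | where ResourceProvider == "MICROSOFT.LOGIC" and Category == "WorkflowRuntime"
        | summarize TotalRuns = count(),
                    SuccessfulRuns = countif(status_s == "Succeeded"),
                    FailedRuns = countif(status_s == "Failed")
          by ResourceName = Resource, SubscriptionId;
    // Flow runs from the custom table, reduced to the same columns
    let flowRuns = FlowRuns_CL
        | summarize TotalRuns = count(),
                    SuccessfulRuns = countif(Status_s == "Succeeded"),
                    FailedRuns = countif(Status_s == "Failed")
          by ResourceName = ResourceName_s, SubscriptionId = SubscriptionId_s;
    union logicAppRuns, flowRuns
    | extend SuccessRatePct = round(100.0 * SuccessfulRuns / TotalRuns, 1)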


User Interface and Navigation

When designing the user interface, we focused on functionality as well as design. Our goal was a visually clear, easy-to-interpret, interactive solution suitable for error tracking. Workbooks allow embedding links and parameterizing queries, enabling interactivity and navigation between different Workbooks. Using this, we defined the following levels/types of pages:

  • Client
  • Project
  • Resources
  • Logic App
  • Azure Function
  • Flow

[Screenshots: Customers, Projects, Resources]

At the Client and Project levels, clicking on a name displays the next subordinate Workbook in either docked or full-window view, passing the appropriate filtering parameters. Time is passed as a global parameter during page navigation, but it can be modified and passed on deeply on individual pages. We can filter runs retrospectively by a specific minute, hour, or day, or even between two dates.

On the page displaying resources, we provide multiple interactions. Clicking on a resource name navigates to the resource’s summary page on the Azure Portal; thanks to Lighthouse, this happens without tenant switching (except for Power Automate Flows). Clicking on the percentage value opens a deeper, docked view of the resource’s run history and errors. This detailed view is resource-type-specific, meaning each of the three resource types has its own Workbook. We always display what percentage of all runs were successful and how many faulty runs occurred, with details of those runs.

Logic App

Beyond general information, faulty runs (status, error cause, run time) are displayed in tabular form if any occurred during the specified time interval. Clicking the INSPECT RUN link redirects the user to the specific run where all successful and failed steps in the process can be viewed. At the bottom, the average run time and the distribution of runs are displayed in diagram form.

[Screenshots: Logic App view, INSPECT RUN, diagrams]

Microsoft Flow

For Flows, the same information as for Logic Apps is displayed. The link also redirects to the specific run, but since it involves leaving Azure, logging in again is required because Dynamics falls outside the scope of Lighthouse.


Azure Function

The structure is the same for Azure Functions, with one addition: the link redirects to another Workbook rather than to the specific run’s page in the Function App monitor. This is necessary because only the last 20 runs can be reviewed on the Portal. For older runs we need Log Analytics, so to facilitate error tracking, the custom log lines written by developers in the code are displayed for the faulty run in chronological order.
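
The underlying query is straightforward Application Insights KQL; the operation id placeholder stands for the failed run selected in the Workbook (in workspace-based Application Insights the table is AppTraces rather than traces):

    // All log lines for one failed invocation, oldest first
    traces
    | where operation_Id == "<operation-id>"
    | project timestamp, severityLevel, message
    | order by timestamp asc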


Consolidated View

Since the same team may be responsible for multiple projects, a comprehensive view was also created in which all resources are displayed without type-dependent grouping. It differs from a specific project’s resource Workbook in that the honeycombs are ordered by success rate and the total number of runs is displayed. Clicking on the percentage value brings up the previously described resource-type-specific views.


Usability

This solution is handy when we want a picture of the status of various platform services in one centralized location. This works interactively for all runs, except for Flows, without switching tenants or user accounts. Notification rules can also be configured based on the queries used in the Workbooks.
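
For example, a minimal sketch of an alert-style query over the Logic Apps diagnostics table that could back such a notification rule:

    // Raise a notification if any Logic App run failed in the last hour
    AzureDiagnostics
    | where ResourceProvider == "MICROSOFT.LOGIC" and status_s == "Failed"
    | where TimeGenerated > ago(1h)
    | summarize FailedRuns = count() by Resource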

Advantages:

  • The monitoring system and visualization are flexible and customizable.
  • New resources of the same type can be added with a few clicks to already defined resource types (see: configuring diagnostic settings for Logic Apps).

Disadvantages:

  • Custom log tables, visualizations, and navigation between Workbooks require manual configuration.
  • Integrating Flows requires significantly more time investment during development and planning.
  • Combining tables and separating environments and projects can be cumbersome due to differing infrastructure schemas.
  • Basic knowledge of KQL (Kusto Query Language) or SQL is necessary for queries.

Experience

The team that implemented the solution for the client provided positive feedback. They use it regularly, significantly easing the daily work of developer colleagues and error tracking. Errors have often been detected and fixed before the client noticed them. It also serves well after the deployment of new developments and modifications. For Logic Apps, diagnostic settings are included in ARM (Azure Resource Manager) templates during development, so runs can be tracked from the moment of deployment in all environments using release pipelines.

The past, present and future of ERP systems
October 18, 2024
6 min read

When I first started working at VisualLabs, during the first WSM (weekly standup meeting), where each business unit reports on their current weekly tasks, I noticed how many acronyms we use. As a member of the ERP team, the question arose in my mind: besides the fact that we use them, do we know exactly how they were developed and what they mean?

Everyone is now familiar with the term ERP (Enterprise Resource Planning), but few people know its exact origins and how it evolved. So I thought I'd gather where it all started and what the main milestones were that shaped the ERP systems we know today. As we look back in time, we will realise how deeply rooted this technology is in the modern business world.

In this blog, I've collected 7 milestones that helped shape the ERP system we know today.

In today's world, it would be unthinkable for a company not to use some kind of computer system for its various processes. However, even before the advent of the computer, companies had to manage these processes (be it accounting or production planning) in some way. Let's take accounting as an example. Accountants recorded every single financial transaction on paper, by hand, in different books, maintained day after day and month after month. Companies often had rooms full of (general) ledgers and files, each containing dozens of transactions. And at the heart of it all was the accountants' most treasured possession: the ledger. It is hard to imagine how much work the year-end closing process must have involved, and how many mistakes must have been made along the way.


1. The birth of computers (1950s):

In the 1950s, with the birth of the computer - for which John von Neumann laid the theoretical foundations - a new dimension opened up in how companies operate and transform their processes. Although computers in the 1950s were used mainly in the military and scientific fields, because of their size and price, continuous technological development soon brought them into the business world. These tools enabled faster processing and analysis of data and helped automate companies' activities.


2. Inventory management and control (1960s):

One of the first milestones in the growing uptake of the computer, and in the realisation of its potential for business, dates back to the 1960s. It was then that the manufacturing industry recognised the need for a system that would allow them to manage, monitor and control their inventory. The advent of information technology allowed companies to integrate and automate their business processes, improving the efficiency and accuracy of their inventory management. This was one of the first steps towards the emergence of ERP systems.

3. Material Requirement Planning (MRP I, 1970s):

The concept of MRP (Material Requirements Planning) first emerged in 1970 as an essentially software-based approach to planning and managing manufacturing processes. MRP focused primarily on planning and tracking material requirements, allowing companies to predict more accurately the quantities and types of materials they would need in production. It enabled companies to manage material procurement and production scheduling more efficiently, reducing losses from over- or under-ordering. This innovation had a significant impact on the manufacturing industry and fundamentally transformed the way companies plan materials, increasing the efficiency and competitiveness of manufacturing companies in the 1970s.

4. Production Resource Planning (MRP II, 1980s):

The 1980s marked a major milestone with the advent of MRP II systems. While MRP focused exclusively on the inventory and materials needed to meet actual or forecast customer demand, MRP II provided insight into all other manufacturing resources as well. By extending production planning beyond materials to labor, machinery and other production resources, it gave companies much greater control over their manufacturing processes.

5. Enterprise resource planning systems (ERP, 1990s)

It was in the 1990s that the first true ERP systems were introduced (the term ERP itself was coined in the 1990s by the research firm Gartner). ERP systems were a significant improvement over MRP II, extending full integration and automation beyond manufacturing to business processes such as purchasing, sales, finance, human resources and accounting. As a result of this integration, companies became able to manage their business processes in a single database, which brought many benefits. The unified storage and management of information ensured access to accurate, up-to-date data, improving decision-making and efficiency, and the interconnected business areas helped companies form and implement coherent strategies. The ERP system thus became a 'one-stop shop' that brought together and managed all corporate information.

6. Web-based functionality with the advent of the Internet (ERP II, 2000s):

In the mid-2000s, the role of the Internet in the business world grew, and ERP systems adapted to this change. Systems began to incorporate customer relationship management (CRM) and supply chain management (SCM) functionality. With ERP II, the focus shifted to user-friendliness and customisation. Modular systems were developed that allowed businesses to select and implement the components that best fit their operations.

7. Cloud ERPs (2010s):

In the 2010s, the emergence of cloud technology gave a new dimension to ERP systems. Cloud ERP solutions have enabled companies to host and run their ERP systems in the cloud instead of traditional on-premise deployments. This has offered significant benefits, including greater flexibility, lower costs and easier access to critical data. Thanks to cloud ERP systems, companies no longer have to worry about server maintenance or software updates, as these tasks are handled by their service providers. This allows companies to focus on their business goals and processes while ensuring that their systems are always up-to-date and available.

+1 The future of ERP:

And where is the development of ERP systems heading today? With the help of intelligent algorithms and artificial intelligence, systems are increasingly capable of automating and optimising business processes, reducing the need for human intervention. Data will continue to play a key role in the future, as companies are able to make better business decisions by analysing it more effectively. The integration of ERP systems with various IoT tools will enable real-time data exchange and real-time analysis to provide companies with faster and more accurate answers to support different business issues.


ERP systems also increasingly offer a personalised user experience and extensible integrations with other business applications and technologies. In the future, ERP systems will not only function as a tool, but will provide companies with real business intelligence and competitiveness, helping them to keep pace with the rapidly changing business environment and to stand out from their competitors.

Are you familiar with the world of ERP systems? Visual Labs can help you explore its potential.

Sources:

https://www.geniuserp.com/resources/blog/a-brief-history-of-erps

https://www.fortunetechnologyllc.com/history-of-erp-systems/


https://www.erp-information.com/history-of-erp.html#google_vignette

https://www.techtarget.com/searcherp/tip/MRP-vs-MRP-II-Learn-the-differences

https://www.business-case-analysis.com/account.html

https://www.britannica.com/technology/computer/IBM-develops-FORTRAN

https://business.joellemena.com/business/when-did-computers-start-being-used-in-business-2/

Hide subgrid buttons specifically
October 18, 2024
3 min read

Depending on where a sale stands in the sales process, different functions should be available on a subgrid on a form. In practice this means that enquiries can be added to a Lead at the beginning of the process, but can no longer be changed later, in the Opportunity phase.

A little technical knowledge is required to understand and apply this article, so it is recommended for Dynamics 365 CE app makers who are already comfortable in the Power Platform world.

Default position:

[Screenshot: new products]

THE GOAL:

[Screenshot: product insert]

Tools used:
  • JavaScript (XrmToolBox Web Resources Manager recommended) - WebResources Manager - XrmToolBox
  • Ribbon Workbench - Develop 1 Ltd | Ribbon Workbench for Dynamics 365 & Dynamics CRM

1. Solution

We need to create a Solution, which we will load into the Ribbon Workbench. Into this Solution we load the entity whose subgrid we want to modify. (It is important that when adding the existing entity to the Solution we do not import any other components.) The Solution name should always follow this logic: Ribbon_VL_[entity name], e.g. Ribbon_VL_Product_Interest.


2. Subgrid

The subgrid must be given a unique, identifiable name. Do not keep the boxed auto-generated name; you will refer to this name later.

3. JavaScript

The JavaScript below should be created as a .js file (e.g. in VS Code) and uploaded to the solution containing the web resources. When creating the web resource, it is a good idea to give it the same name as the file, to make it easier to find later.


forProductInterestView: function (selectedControl) {
    "use strict";
    // Name of the subgrid whose buttons should be hidden
    var excludedGridName = "subgrid_prodinterest";
    // Name of the subgrid the ribbon button belongs to
    var currentGridName = selectedControl._controlName;
    if (currentGridName === excludedGridName) {
        return false; // hide the button on this subgrid
    }
    return true; // show it everywhere else
}

4. Ribbon Workbench

Open the Ribbon Workbench and load the solution you created in the first step. Each entity has three ribbons; we need the Subgrid one.

Select the button you want to hide by right-clicking on it and pressing "Customise Button". A red tick will then appear, and the button is added to the Buttons section below. If it is already ticked, it already has a command; in that case add a new command to it and skip this step. After that, add a Command, which can be done with the plus sign in the Commands section.


Explanation:

  • Library: the web resource you added to the solution (this is where the good name pays off)
  • Function name: the name you gave the function in the JavaScript file
  • CRM Parameter: the parameter to pass, in this case SelectedControl. This control handles the subgrids on forms and any list view; PrimaryControl handles the form.

Next, we need to add an Enable Rule that hides the buttons.

Explanation:

  • Library, Function name and CRM Parameter are the same as for the Command; the parameter to pass is again SelectedControl.

Only one step is left before publishing: for each button, specify which Command should belong to it.

I hope you find this article useful and that we have been able to give you some ideas.

Top features of Dynamics 365 Business Central 2024 Release Wave 1
October 18, 2024
6 min read

It's that time of year again, when Microsoft has released the first major bundle for its Dynamics 365 Business Central enterprise management system.

There are two major update packs a year, in April and October, and smaller updates also arrive on a monthly basis. As part of the Wave 1 release, the company has announced a number of new features: in addition to the usual user experience improvements, this year there is a big focus on the project module and on further enhancements to the artificial intelligence line, with new functionality added to Copilot, which was introduced last year.

In this article we'll take a closer look at what we think are the most exciting features of Wave 1.


AI & Copilot

For some time now, artificial intelligence has been increasingly embedded in all areas of life, including business. Microsoft's Copilot has given Business Central its first AI features, which will be further enhanced with the latest version.

Copilot, for example, makes it easier to fill out sales order lines by entering a few simple criteria.


The analytical views generated by Copilot also make it possible to produce reports faster and more accurately.


It generates an analysis in seconds based on the instructions given.


The system also has a Copilot chat feature, where you can ask the AI, for example, about mapping a process within Business Central, and get back a very useful step-by-step description.


Unfortunately, the new features still do not support the Hungarian language, so for the time being English-speaking users are at an advantage.

Inventory cost recovery tool

We have a completely new interface that extends the range of cost recovery methods we are already familiar with.

BC

But what exactly do we get?

With the new tool, you can check and track the cost recovery for all items or only selected items.

You can now easily identify problematic items and run automatic corrections for the rest without any problems.

It should be stressed that the new tool does not replace existing processes, but instead helps troubleshooting and improves system performance.

Multi-line text export/import with Config. package

Fields of type BLOB, which usually contain multi-line texts, have been added to the optional fields in the config. packages thanks to the new update.

A good example is the 'Job Description' field on the Sales Invoices page, which can now be imported from Excel to Business Central. This is particularly important as the content of this field is also displayed on the invoice screen.


Sharing error message details with another user

We often encounter error messages while using the system. These messages are not always clear to the average user, so asking for help may require additional information.

Collecting this information can be a time-consuming process, especially if a developer needs to be involved.

This small (but very useful) improvement makes the process easier: all the necessary error details (with the relevant identifiers) can be forwarded to the right person in two clicks, by email or Teams.


New Excel reports

Microsoft has added 8 new Excel reports to the system this year, supporting financial, sales and purchasing processes. These reports allow reporting without the need for a developer.


Users can also use the latest reports as templates or enhance them to present their business data in the best way.

For the time being, the new reports carry the 'preview' suffix.

Project module updates

The project module has been given special attention this year.

In addition to the new fields that can be displayed on the Projects page with Personalisation, the details panel on the right-hand side of the information panel has been expanded.

The invoicing process for projects has been simplified from start to finish. With the new action for pulling in project planning lines, you can now add planning lines from multiple projects to a single sales invoice, and even invoice projects to multiple customers.

You can now add assemble-to-order items to projects, similar to sales orders. In this case an assembly order is automatically created in the background, containing the required components and their quantities.

A project archiving function has also been added, which works in a similar way to the archiving of sales and purchase orders.

Field Service and Business Central integration

Businesses using Field Service can rejoice. The new integration will make data communication between the two systems smoother.

Users can manage their work orders, service task progress, resource scheduling and consumption management on the Field Service side. Once a work order is completed, the integration allows for easy transfer of the necessary data to Business Central.

The integration also facilitates billing processes. Users can now generate accurate invoices based on service activities performed and consumption recorded on the Field Service side.

New role available for claims management

The role for claims management has been expanded. We get new lists, tiles and a revamped menu with an embedded action menu.


Automation of reminders

The new update includes a major enhancement to the reminders feature: it is now possible to automatically send notices to customers about overdue debts, based on predefined conditions.

You can set filters for the whole process or for individual transactions and attach a PDF of the receivables to the emails.


Connect the Business Central environment to the Power Platform environment

We can integrate Business Central with a range of Power Platform and Dynamics 365 products. But with the new update, we have the ability to connect environments through Business Central's admin center.

This will allow BC to inherit the settings of the associated Power Platform environment, including the encryption keys managed by the customer. This linked environment serves as the default target for other Dynamics 365 and Power Platform products such as Sales or Power Automate.

Worksheets now also available on mobile

It is increasingly important that more areas of Business Central can be accessed from mobile as more and more users choose to work this way.

Worksheets, such as the various journal pages, can now be accessed from mobile, something that was previously only possible from tablets and desktops.


Developments in warehousing and stock management

Some changes to stock and inventory management have also been made based on regular user feedback:

  • We can now assign lot and package numbers to existing tracking lines
  • We can manage warehouse item tracking lines
  • The warehouse items page inherits the item's name from the item card

If you're interested in further developments or just want to understand more of the article, feel free to contact the Visual Labs team and we'll be happy to help: info@visuallabs.hu

Should we implement a cloud or on-premise Business Central system in our company?
October 18, 2024
6 min read

Introduction

In today's post we compare the cloud (Software as a Service, SaaS) and on-premise (hosted on the company's own infrastructure) editions of Dynamics 365 Business Central ERP from several perspectives, offering useful viewpoints for business decision makers weighing the complexities of an ERP implementation. Microsoft sells two ERP systems: one is the aforementioned Business Central; the other is Dynamics 365 Finance and Operations, designed for large enterprises.

Every day we hear that more and more business IT solutions are "running in the cloud" or "going to the cloud". In our personal lives, cloud-based software solutions are also becoming more and more common. Just think of our data stored in Google Drive, iCloud or Dropbox.

As business leaders, we can also choose to implement our enterprise resource planning (ERP) system in the cloud. Flexera estimates that the global cloud ERP market will grow from $64.7 billion in 2022 to around $130 billion by 2027, an annualised growth rate of 15%; Covid also gave a big boost to this expansion, so cloud ERP systems have great potential. Before implementing a Business Central system, we first have to decide whether to run the software in the cloud or store the data on our own server.

Costs

A cloud-based solution has lower initial costs, as it is usually subscription-based and does not require a large initial investment. Costs are more flexible and easier to manage, whereas a deployed version has higher upfront costs, including hardware, software, installation and maintenance. In addition, ongoing operations (infrastructure and staff) and upgrades can be costly.

Data storage location

The most fundamental difference between cloud and on-premise systems is the physical location of the data itself. With cloud-based Business Central, your company's data is stored in one of Microsoft's data centres, in compliance with EU data protection law, including the GDPR. In addition, Microsoft guarantees 99.99% availability for the Business Central application, meaning the system is down at most 0.01% of the time, and then mostly outside business hours.

Developments

Every company has unique requirements. These can usually be met with extensions or custom development. Modern cloud solutions offer ever more customisation and integration options, while in on-premise environments the source code can be modified directly. The end result is the same: in both cases you have the flexibility to extend out-of-the-box functionality with the expertise of a systems integrator like VisualLabs.

Performance and scalability

The performance of the new system is a key consideration when choosing a solution. For the cloud, Microsoft invests significant resources to ensure maximum availability and speed, so businesses do not need to pay special attention to this. For on-premise solutions, system performance depends on the infrastructure installed or leased by the company, as an installed Business Central requires hardware of adequate size and power. The company itself is also responsible for performance issues; there is no Microsoft-guaranteed baseline availability as with the cloud service.

Maintenance and updates

With a cloud-based system, software updates are performed automatically by Microsoft, requiring less IT effort from the company. In this way, Microsoft ensures that subscribers have a modern, up-to-date, 'evergreen' system, and the timing of updates can be set flexibly within an update window. With on-premise versions, these updates have to be done by the company's IT team, which requires more resources, and there is a risk that the latest updates are not installed, leaving users stuck on an old version. The result is outdated software over the years, which could mean another IT project in the life of the business to replace it. Whether cloud-based or on-premise, it is highly recommended to test new updates in a dedicated test environment before deploying them to the live environment. These updates contain many new features that make users' daily lives easier; twice a year, Microsoft releases a major update package with a range of new functionality.

Data security

The most important consideration when choosing software is that your data is secure. Microsoft provides built-in backup and disaster recovery solutions that enable fast and reliable recovery; an outage in a Microsoft data centre goes unnoticed by the end user, and Microsoft enables point-in-time data recovery for up to a month. With an on-premise solution, the company has full control over data security and access management, which means more control but also more responsibility. It can more easily comply with regulatory and legal requirements that demand on-site data storage, especially in industries with strict data protection regulations.

Limitations of the on-premise version

The on-premise version has several limitations compared to the cloud-based solution. The AppSource marketplace is not available on-premise; it contains thousands of applications, many of them free, that extend Business Central's functionality. Power Platform applications (e.g. Power Apps, Power BI) cannot be natively integrated with on-premise Business Central; this requires creating and maintaining a data gateway. In addition, the list of features available only in the cloud version keeps growing, including integration with Microsoft Teams and the Shopify webshop engine. Microsoft's AI solution, Copilot, is also available exclusively in the cloud-based Business Central.

Conclusion

The comparison below summarises the main points of our post:

  • Costs - Cloud: lower upfront costs with subscription pricing; flexible and easy to manage. On-premise: higher initial costs (hardware, software, installation, maintenance); ongoing operations and upgrades can also be costly.
  • Data storage location - Cloud: data is stored in Microsoft's data centres in compliance with EU data protection law, including the GDPR; Microsoft guarantees 99.99% availability. On-premise: data is stored on the company's premises; the company is responsible for data protection compliance and system availability.
  • Developments - Cloud: extensive customisation and integration options via extensions. On-premise: direct source code modification is possible. In both cases, functionality can be flexibly extended with the help of a systems integrator.
  • Performance and scalability - Cloud: Microsoft ensures maximum availability and speed, so businesses do not have to pay extra attention to it. On-premise: performance depends on the company's infrastructure; it requires adequately sized hardware, and the company is responsible for performance problems.
  • Maintenance and updates - Cloud: software updates are performed automatically by Microsoft, requiring less IT activity; update scheduling is flexible. On-premise: the company's IT team must perform updates, which requires more resources, and there is a risk of being stuck on an old version; testing updates before live deployment is recommended.
  • Data security - Cloud: Microsoft provides built-in backup and disaster recovery, enabling fast and reliable recovery. On-premise: the company has full control over data security and access, which means more control but also more responsibility; easier compliance with strict on-site data storage requirements.

The final decision always requires careful consideration by the organisation facing digital transformation. Visual Labs offers and implements a cloud-based Business Central for its clients, which, based on years of experience, is a well-established model that delivers the benefits detailed above in the short term. If you need help with ERP system implementation, please contact us!

Create efficient and customized Release Notes with Bravo Notes
October 18, 2024
4 min read

For our customers, it is important that, when we deliver a new version of an existing IT system, we also provide a release note describing the content and functionality of the released package. At Visuallabs, we constantly strive to meet our customers' needs while simplifying our own workflows and increasing our administrative efficiency. The Bravo Notes extension for Azure DevOps supports us in this. Using this plug-in, we produce a unique yet standardized release note with each new development package delivery, allowing us to meet our customers' requirements in a fast and standardized way.

What is needed to do this?

By following a few simple principles in our delivery processes, the documentation we already produce provides a good basis for generating standard version documents in a few steps for our releases or bug fixes.

How do we document?

  • The conventions for the purpose-specific fields on a given DevOps work item are strictly adhered to, and the fields are filled in a way that suits the document being generated.
  • User Story descriptions are prepared in a standard format. This allows us to provide consistent quality for our customers and to build in automated document generation.
  • Tickets are sorted by delivery unit. This helps when responding to multiple business challenges from the customer at the same time; documentation of delivered enhancements and system changes can then be categorised in one document.

Using Bravo Notes

Bravo Notes provides the technical assistance to meet these requirements with the right customisation. The main functions we use:

  • Compiling content: there are several options to choose from when selecting items from DevOps. We use Query most often among the options shown in the screenshot below, because the multiple filtering criteria allow us to select relevant elements more efficiently, thus making the documentation more precise.
  • Template: In Bravo Notes, we have created various templates to organise the news into a proper structure.  

Main units of the template developed:

  • In the case where several delivery units or business processes are involved for a system release, the relevant descriptions are grouped together in the document.
  • A further organizing principle in the template is that new developments are shown in a feature-by-feature breakdown, and solutions to bugs are also shown in a separate unit. This makes it clear which supported feature a given release item refers to, whether it is a new development or a bug fix.
  • Use parameters: parameters based on business processes allow you to customise the generation of documents. During generation, you can change the title, date, release date and add comments to the document. You can also specify the applications and resources involved, for example, which business area or environment is affected.
  • Display of document units and headings based on rules: the template shows only the relevant headings and document parts; e.g. if there was no bug fix in a given delivery unit, its heading is not displayed either.
  • Fields used in the template: as defined above, we provide easy-to-read descriptions for the released developments. The consistent documentation of the DevOps tickets used in the design or development process allows this to be done quickly and in a standardized way. The content of the fields defined in the template about the tickets is automatically included when the document is generated.
  • Export: after generation and verification, we export the document to PDF format.

Summary: overall, it is important for our customers to receive detailed, business-relevant documentation about the new versions of the systems they use, while we also simplify our own workflows. The Bravo Notes module integrated into DevOps supports us in achieving both goals. With this plug-in, we create customized yet standardized Release Notes with each new development package delivery, allowing us to meet our customers' requirements quickly and consistently, providing the necessary information and transparency about system changes and enhancements.
