A lot has happened since our blog post on Knowledge management trends. We found Fabric to be a cost-effective solution, and this is reflected in the number of new Fabric deployments. The increasing use of artificial intelligence was also one of last year’s three themes, and this year we expect it to continue to grow in popularity and adoption.
For example, AI capabilities are now included in all of Fabric’s paid capacities, and new AI features, such as AI agents, are being released at a rapid pace. The growing volume and use of data is also why the third theme of the year is Data Governance. A new feature in this context is Fabric One Security, which could even be described as revolutionary for access management.
Last year we thought this might be the case, but now we are certain: Microsoft Fabric is a big hit! I would say that no other Microsoft product has created such a buzz in social media and similar communities. Best of all, there is a lot of free material available for training, brainstorming, and problem-solving. The product has also been enthusiastically received by customers, and Pinja already has several completed and ongoing Fabric projects. Virtually all of our new reporting and data warehouse projects are based on Fabric and Lakehouse. We also have ongoing and completed migration projects where the legacy environment is converted to Fabric. This is good for us consultants, of course, but Fabric is easy to recommend because of the cost savings and great scalability. It is also a breeze to deploy, and we have developed an in-house product that sets up Fabric the right way from the start.
Fabric implements what we call Lakehouse architecture. It is a modern way to build a data warehouse, and its biggest strengths are separate storage and compute layers with “unlimited” and independent scaling. These solutions are also characterized by very cheap storage, which matters for the ever-increasing data volumes of modern data warehouses and makes it feasible to store data from source systems as unchanged as possible. Read more about Fabric in the Fabric blog post series by Pinja’s experts.
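As an illustration, below is a minimal sketch of what landing source data unchanged into a lakehouse can look like in a Fabric notebook. The file path and table name (Files/landing/orders.csv, raw_orders) are hypothetical placeholders, and the built-in spark session of Fabric notebooks is assumed.

```python
# Minimal sketch: land source data unchanged into a lakehouse Delta table.
# Runs in a Fabric (PySpark) notebook; the path and table name are illustrative.
from pyspark.sql import functions as F

raw = (
    spark.read
    .option("header", True)                              # keep source columns as-is
    .csv("Files/landing/orders.csv")                     # hypothetical landing file
    .withColumn("_ingested_at", F.current_timestamp())   # add only load metadata
)

# Cheap storage makes it feasible to keep the full, unmodified history
raw.write.format("delta").mode("append").saveAsTable("raw_orders")
```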
In last year’s Pinja trends blog, we already noted that artificial intelligence and machine learning are still hot topics. The use of AI models (language models) has become commonplace for many (and not just for generating memes), and they are used for a wide variety of tasks. In line with our trend forecast last year, Microsoft Fabric and Copilot have started to deliver on their AI promises. Copilot has come a long way and is starting to be of real use in Power BI development. Enabling AI capabilities for all paid Fabric capacities has paved the road to AI for everyone. How about building your own “ChatGPT” that mines answers from your data? It is already possible, and it is not as expensive as you might think. Fabric’s AI capabilities already include sentiment analysis and language translation. Building these models used to take a lot of work and knowledge of various tools, but now they are readily available in the data engineer’s toolkit, and AI is a line of code away in a notebook. This is something that many couldn’t even dream of a few years ago.
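As a rough sketch, calling these built-in capabilities from a notebook can look like the following. The function names ai.analyze_sentiment and ai.translate follow Fabric’s AI functions preview for pandas DataFrames; they are assumptions, so check the exact names and availability in your own capacity and runtime.

```python
# Rough sketch: built-in AI functions on a pandas DataFrame in a Fabric notebook.
# Function names follow the Fabric AI functions preview and may differ by release.
import pandas as pd

reviews = pd.DataFrame(
    {"text": ["Toimitus oli nopea ja tuote erinomainen!",
              "The product broke after two days."]}
)

reviews["sentiment"] = reviews["text"].ai.analyze_sentiment()    # positive / negative / neutral
reviews["in_english"] = reviews["text"].ai.translate("english")  # translate to a target language

print(reviews)
```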
Even the traditional data warehouse and BI combination is a strong indicator of what has happened, and for some time now it has also been possible to drill down behind the numbers and analyze why something happened.
Looking into the future, on the other hand, has been made easier by data platforms with integrated AI capabilities and a simpler way to use them.
Reacting to those insights has so far been left to humans, but knowing what you can and should do next calls for an agent culture. In the data world, the primary interest lies in data agents.
This year, AI agents are also gaining traction in Microsoft Fabric in the form of Data Agents. An AI agent can be described as a dedicated helper: while Copilot or some other AI assists with individual tasks (e.g. a Power BI report or a pipeline), an AI agent can work independently and also use other tools, such as Teams. In Fabric, the AI agent is configured with instructions written in plain English and can change its behavior as needed for each task.
With all the hype around language models, we should not forget deeper analytics, where it still pays off to train more accurate, purpose-built AI models. Such cases include predicting sales, forecasting equipment maintenance needs, or dynamic pricing of a product, for which Pinja has a ready solution. Read more about how Pinja’s AI-based dynamic pricing solution optimized prices in real time at Carspect.
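For example, training a purpose-built prediction model in a Fabric notebook can be sketched roughly as follows. The data path, column names, and experiment name are hypothetical, and Fabric’s built-in MLflow tracking is assumed to be available.

```python
# Rough sketch: train a purpose-built sales prediction model in a Fabric notebook.
# Paths, columns, and the experiment name are illustrative; MLflow tracking is built into Fabric.
import mlflow
import mlflow.sklearn
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

history = pd.read_parquet("/lakehouse/default/Files/curated/sales_history.parquet")
features = history[["weekday", "campaign_active", "unit_price"]]
target = history["units_sold"]

X_train, X_test, y_train, y_test = train_test_split(features, target, test_size=0.2)

mlflow.set_experiment("sales-forecast")   # experiment name is an example
with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=200)
    model.fit(X_train, y_train)
    mlflow.log_metric("r2_test", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")
```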
Data Governance is a boring topic. However, in 2025 it is more important than ever, and it is well worth getting familiar with it, perhaps even digging the old-school textbooks out of the drawer. Although the traditional data warehouse has evolved into a versatile lakehouse concept, data quality is at least as important as before. It is no longer just about providing the right numbers for management; it is also a vital input for artificial intelligence. If you feed AI garbage, it will give you garbage back, as AI cannot (yet) fix the data. Nor does artificial intelligence eliminate the need to monitor and log data usage. It is more important than ever to keep track of what data is fed into an AI model and what data it uses to give you answers.
Agents like the Microsoft Fabric Data Agent are tools with tremendous potential, but they need to be manageable. When you connect an agent to your data, you do not want it to reach content that the end user commanding it is not allowed to see. This is where a well-implemented governance model comes into play: users cannot find data they don’t have access to, regardless of the prompt. In some environments, the data may also need to be protected so that an adversary cannot modify it without authorization and thereby affect the behavior of the AI. For example, attempts have been made to manipulate popular AI models by feeding them massive amounts of false information.
The recently released Fabric One Security provides the tools to implement such a governance model, and it is a truly revolutionary feature in Fabric: you define data permissions once and they apply everywhere in Fabric. We still lack proper tools for tracking the origin of data and the impact of changes, but I’m sure those will be added to the product sooner or later.
We are making incredible leaps forward now with Fabric and AI. Interested? Contact our experts and we will tailor the latest development steps to your needs.
Fabric blog series part 1
Carspect success story
Pinja's AI solutions
Knowledge management and business intelligence