Consolidating our enterprise data in the Azure Data Lake to enrich our data-driven solutions

Knowledge is power, and in business, knowledge comes from data-driven insights. We are often accustomed to viewing our data in separate applications, isolated from each other even though their data is contextually connected and related: a sales system, a billing system, various production pipelines and a logistics system, just to name a few.

We may have powerful reports on our sales pipelines, and another set on production or logistics, and these reports do tell us how each of these processes works in a vacuum. But the truth is that nothing happens in a vacuum; everything we do is connected.

To give you an example, our sales are driven, among other things, by customer satisfaction. Customer satisfaction can be boosted by providing high-quality products and services on time, without delays. Problems in our logistics chains can cause delays in both production and delivery, which makes our customers unhappy. The same goes doubly for any issues during production – maybe the devices used in quality assurance were poorly configured and we shipped low-quality products as a result?

Customer satisfaction drops further, and landing new sales deals becomes more difficult. But if we looked at the data from our sales processes on its own, we would only see the result – sales going down – and not the actual, underlying cause of poor customer satisfaction, which was in turn caused by poor performance in production.

In this blog post I will show you how we can use Microsoft’s Azure services to consolidate all of the aforementioned enterprise data, and more, from different sources together to create even more powerful data-driven solutions. For the previous post on my series on Microsoft’s data platform, please see Building a real-time, cloud-based monitoring solution using Power BI and Microsoft’s data offerings.

Case: Combined dashboards on the full sales-to-delivery chain with Azure Data Lake

Consolidating data between systems can often seem like an insurmountable task. There are the technical challenges of managing multiple data sources and formats and combining them together in a meaningful way, and then there are the semantic challenges of finding relationships between different systems: How do we relate a customer in our sales pipeline to the same customer in our logistics system? Or what are the product codes for the items we sold in our production systems? Although this may feel like a lot to do before we can get any meaningful end results implemented, I’ve got some good news for you: With Azure it really isn’t that bad.
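To make the semantic-matching challenge concrete, here is a toy sketch of conforming customer identifiers from two systems onto one shared key. The system names and ID formats are invented for illustration; in a real project the mapping rules would come from your master data:

```python
def conform_customer_id(system: str, raw_id: str) -> str:
    """Map a system-specific customer identifier to a shared key.

    The prefixes and formats below are hypothetical examples.
    """
    if system == "sales":       # sales CRM uses e.g. "CUST-00042"
        return raw_id.removeprefix("CUST-").lstrip("0")
    if system == "logistics":   # logistics uses e.g. "42/FI"
        return raw_id.split("/")[0]
    raise ValueError(f"unknown system: {system}")

# Both identifiers resolve to the same shared key "42".
print(conform_customer_id("sales", "CUST-00042"))
print(conform_customer_id("logistics", "42/FI"))
```

Once such a shared key exists, relating a customer in the sales pipeline to the same customer in logistics becomes a plain join.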

At the center of it all lies Azure Data Lake – a service for storing virtually all the data your enterprise ever needs or generates. Data Lake is schema-agnostic, meaning it does not care what format your data is in, although it is preferable to store the data in a format supported by all of the services involved in your data pipelines.

Other than that, storing business data in Data Lake is as simple as extracting it from the source system and saving the extract in Data Lake as-is. And since we are using Data Lake simply as centralized storage for all of the data coming from our various separate systems, we can implement these data extraction processes independently of each other.

When it comes to implementing a Data Lake solution there is no need for a single massive project where we identify all of the relevant business data at once, perform transformations on it and then copy that one, huge blob to Data Lake. Instead, we will be agile, implementing the data store piece by piece and effortlessly adding new data sources in tiny development projects as they are identified.

In practice the data stored in Azure Data Lake can come from virtually anywhere: production management databases or documents stored in network file shares on your on-premises networks, Excel files located in Microsoft Teams, data in SharePoint lists, or any non-Microsoft cloud-based systems, provided they support extracting data – just to name a few.
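Because the extracts are stored as-is, the main design decision at this stage is a folder convention for the raw zone. One common layout – the names here are just an example, not a requirement of Data Lake – partitions extracts by source system, entity and extract date:

```python
from datetime import date

def raw_zone_path(source_system: str, entity: str, extract_date: date) -> str:
    """Build a raw-zone path: raw/<system>/<entity>/<yyyy>/<mm>/<dd>/<entity>.csv"""
    return f"raw/{source_system}/{entity}/{extract_date:%Y/%m/%d}/{entity}.csv"

print(raw_zone_path("sales-crm", "customers", date(2020, 4, 1)))
# raw/sales-crm/customers/2020/04/01/customers.csv
```

A consistent convention like this lets each extraction pipeline land its data independently while keeping the lake navigable for every downstream consumer.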

We choose the tools for integrating these systems into Data Lake on a case-by-case basis, as dictated by technical requirements and cost-efficiency. Azure Data Factory is my personal favorite, since it runs in the cloud and can be optimized for both performance and cost, and with the option of using Databricks it becomes a very powerful product. Other common tools include SQL Server Integration Services (SSIS) for on-premises environments, Azure Stream Analytics for real-time data, and Logic Apps for its Microsoft 365 connectors.

Once our data has been stored in Data Lake, we can get to implementing the actual business solutions. Although it is possible to connect Power BI directly to Data Lake, the most common scenario is to implement a database for our solution, ingest transformed data from Data Lake into this database, and then build the business solutions on top of that.

There are multiple reasons for this: First, we are using Data Lake simply as storage for our existing data “as-is.” This means that before we can use the data from our various systems – be it sensor data from production management, logistics information or sales data – we likely need to transform the data to make it all usable together. These transformations are done in this new reporting database.

Second, separating the reporting data source from the initial data storage means we can make changes to the reporting layer without affecting the data storage layer. Similarly, we can add new reporting databases for other, separate solutions without affecting our existing databases.

This results in a modular structure where we can have one single Azure Data Lake instance serving multiple separate reporting databases, all of which can be doing their own thing. It is not uncommon to have just one Data Lake in an enterprise and having each business area develop their own reporting databases with their own unique business requirements based on that shared Data Lake.

Since the Data Lake and our reporting databases all are located on the Azure cloud, that is where our transformation logic will reside as well. Copying data from Data Lake to Azure SQL databases is commonly done using Azure Data Factory, which gives us access to a wide array of tools, from Data Flows to Azure SSIS and Databricks, to choose from.

For special cases, such as handling more exotic data formats, we can implement Azure Function Apps to bring in custom code. For the reporting databases we can make use of Azure SQL’s wide array of performance options to find the most suitable solution for our needs: serverless databases provide an excellent price/performance ratio for reports that are updated once or twice a day, while for more frequently updated databases, provisioned instances can be automatically scaled to meet changing demands.

Now, with our enterprise data combined together in the reporting database we can create the solution layer. For example, we can create a Power BI report with which our production managers can identify delays in our suppliers’ logistics and then use a Power Apps tool to inform our customer service agents that all of the orders affected by this supplier delay might be delayed as well.

They, in turn, can then contact the customers in question and let them know of the potential delay in delivery and possible solutions before it has even occurred, resulting in better customer satisfaction. All of this is possible by being able to match the supplier logistics information with associated production pipelines, production with orders and orders with customers within our reporting database.
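The matching chain described above can be sketched in a few lines of plain Python. The table contents are invented for illustration; in practice this would be a SQL join across the conformed tables in the reporting database:

```python
# Hypothetical rows already ingested into the reporting database.
production_runs = [
    {"run": "R1", "supplier": "ACME"},
    {"run": "R2", "supplier": "Globex"},
]
orders = [
    {"order": "O-1", "run": "R1", "customer": "C-9"},
    {"order": "O-2", "run": "R2", "customer": "C-7"},
]

def customers_hit_by_delay(delayed_supplier: str) -> set:
    """Follow supplier -> production run -> order -> customer."""
    delayed_runs = {p["run"] for p in production_runs
                    if p["supplier"] == delayed_supplier}
    return {o["customer"] for o in orders if o["run"] in delayed_runs}

print(customers_hit_by_delay("ACME"))  # {'C-9'}
```

The value is not in the code itself but in having all four datasets in one place with shared keys – without that, the chain simply cannot be followed.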

The possibilities provided by bringing all of our data together are endless. Do you feel like there might be ways in which your existing business data could be harnessed more effectively? Do get in touch and let’s see what we could do to help you!

Writer: Joonas Äijälä

Automated Machine Learning has hit GA – and why it matters

Microsoft’s Ignite conference brought with it a ton of news as usual, and among them was the long-awaited announcement of Azure’s Automated Machine Learning (ML) finally hitting general availability, leaving behind a preview state of over a year. Not only did the service reach GA, but Microsoft also announced that it will be merging the previously “Enterprise-grade” features of Azure Machine Learning, including Automated ML, into the cheaper Basic tier. This means that not only are these features now officially suitable for production use, but they are also very affordable: with Automated ML you only pay for the resources you use, meaning that training new machine learning models can cost less than a cup of coffee.

So, what is machine learning and why does this matter to me?

Of course, having a service reach GA and become cheap to use is great and all, but with the myriad of Azure services around, we must ask ourselves why this one is something to be excited about. Machine learning itself is a data science discipline concerned with finding the best possible answers to problems based on historical data. For example, you may have a list of patient records including details such as age, body mass index, blood pressure and whether the person is diabetic or not. Using this data, you could then train a machine learning model that predicts whether someone else has diabetes based on their age, body mass index and blood pressure.
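As a back-of-the-envelope illustration of the idea – not of Automated ML itself; the patient records and the nearest-neighbour rule below are made up for this sketch – a classifier can be as simple as finding the most similar known patient:

```python
import math

# Toy patient records: (age, BMI, systolic blood pressure, diabetic?)
records = [
    (55, 31.0, 145, True),
    (62, 29.5, 150, True),
    (30, 22.0, 118, False),
    (41, 24.5, 125, False),
]

def predict_diabetes(age: float, bmi: float, bp: float) -> bool:
    """Classify a new patient by the single most similar known record."""
    return min(records, key=lambda r: math.dist((age, bmi, bp), r[:3]))[3]

print(predict_diabetes(58, 30.2, 148))  # True
```

Real machine learning models are far more sophisticated, but the principle is the same: historical examples in, a prediction for a new case out.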

Similarly, you could train machine learning models to estimate daily sales, the likelihood of a mechanical failure in a production machine, and many other things, as long as you have the historical training data available. The movies Netflix recommends to you are a result of machine learning, too. While the results of machine learning are only predictions, these predictions can be made at a rapid pace, faster than any human is capable of. As such, machine learning can produce valuable insights into amounts of data that would be near impossible for humans to analyze, be it in real-time IoT systems or quarterly BI reporting. And even in daily decision-making, machine learning can provide human decision-makers with valuable second opinions that are less likely to carry human biases.

But despite all the benefits machine learning can bring, it has been rarely utilized, mostly due to its high barrier of entry: producing accurate machine learning models has required a lot of manual work from highly skilled and sought-after data science specialists. Training the models has been an iterative process with lots of trial and error, and as such, the investments required for a machine learning project have been very high. Until now, that is.

Enter Automated Machine Learning

This is where Automated Machine Learning, Microsoft’s answer to making machine learning more affordable and available to all, comes in. It simplifies the process of training machine learning models by performing most of the iterative data science work for you, turning projects that previously were 95% data science and 5% data engineering into more like 75% data engineering and 25% data science. Models can be trained and deployed to testing in a matter of hours – given good-quality training data, of course – and proof-of-concept machine learning solutions with high-quality models can be implemented quite handily. So, if you’ve thought that machine learning could be of use to you but were skeptical of the time and monetary investments required, worry no more.

Excited yet? Do get in touch and let us see how machine learning can help you!


Power Virtual Agents & You: Integrating Bots Into Microsoft Teams Without Coding

Applications built into Microsoft Teams make it the hub for work by delivering integrated experiences to users on a unified canvas. Teams apps can empower your business, office, and first-line workers by letting them take advantage of your backend systems, all from the palm of their hand.

Teams Application Templates and Power Virtual Agents let you extend Teams with bots quickly. You can take App Template bots into production in a matter of hours, and with a simple conversation flow, Power Virtual Agents are even faster than that.

When working in a real-world situation, you always do the first tests and proofs in a development or testing environment. This is where you can see what value you will get from the app or bot and how well it fits the needs of your organization. The key here is that you can start your bot proof-of-concept (POC) in a matter of hours!

How can you extend Teams with these apps?

First, it’s important to state that bots are apps as well. They’re just more conversational in their nature than, say, sending a message via Teams to every user in your organization.

Apps extend Teams in several ways including:

  • Tab applications (like PowerBI reports or SharePoint pages in a channel tab)
  • Message Extensions (like Praise or Stream)
  • Message Actions (like creating a Polly poll based on a single message)
  • Notifications from various apps. Instead of checking the app every ten minutes, you can act the moment you get the notification.
  • Bringing apps to Teams. PowerApps can be brought into Teams without changes, SharePoint web parts (SPFx) can be added to Teams as apps with minor changes.
  • Having bots you can use in a conversational way (and that can be made richer and more interactive with adaptive cards)

Some uses for bots in various organizations include:

  • FAQs. These can save your organization time and help you get answers without involving IT in the process
  • Several forms (IT support/team/travel/day off and various other requests)
  • Querying information from the back end (available vacation days, flexitime balances, updating personal information, etc.)
  • Reporting leads, getting customer or project information, reporting hours, asking for assistance, and so on

As can be seen, bots are very versatile and can be used in many processes. Instead of filling out a plain form, bots let you answer questions interactively. On the one hand, these bots already work with mobile Teams. On the other hand, trying to fit just any form into a bot conversation won’t do; some just don’t work properly. Instead of trying to recreate your current processes 1:1 in Teams, reconsider what’s essential and how the process would be best supported by Teams bots (or other platform capabilities).

What Are Teams Application Template Bots?

These bots are production-ready packages you can deploy and start using in your Teams. They use several Azure resources, so you need to have enough permissions to spin up resources and a suitable subscription when you are doing the deployment.

All application templates are open source and come with source code that you can download, edit, and build on further. This way you can also inspect the application internals if your organization requires a review process. There’s also information on each app’s architecture and even cost estimates.

There are several benefits you get from application templates:

  • They make it easy to get started. You don’t have to be a developer to deploy these.
  • You get more understanding of how to utilize Teams as an app platform.
  • If a POC/test/pilot proves that these apps match your needs, you can deploy them to production.
  • For developers, they serve as a kickstart and example code showing what utilizing several different Teams app capabilities can do.

Power Virtual Agents

If you thought it was fast to get started with Teams bots using Application Templates then you’ve seen nothing yet!

The new cool player in town is Power Virtual Agents. These agents come in a package that has it all: the platform, AI, machine learning, integrations, and a design canvas. With Power Virtual Agents you can create versatile conversational bots without any code or understanding of machine learning/AI, and you can even connect them to business processes without being a developer with a coding background. Power Automate can be used to extend conversational processes, and with this #lowcode platform you can use resources and do actions both in and out of Office 365.

When you sign up for the Power Virtual Agents service, it only takes about 15 minutes to get an example bot in place. Modifying the example topics and setup to suit your immediate needs can be done in just minutes. Of course, it often takes a lot longer, since the process is iterative and you rarely have the design done in advance. If you just have some simple Q&A questions it doesn’t take long, but businesses typically have larger needs.

What is especially great about PVAs is their language understanding (limited to English at this point). You don’t have to define topics for every possible variation—instead, the PVA platform will try to figure out what you meant and pick paths based on that. It has proven to be rather useful in my experiments, and it’s easy to edit these topics later as well. And yes, this is done without any coding in a visual design canvas.

Power Virtual Agents are the new “superpower” for answering quickly changing or newly arising needs. Combining the conversational AI “magic” with Power Automate allows power users and citizen developers to digitalize conversational processes super quickly.

Do not wait – get started with these #nocode #lowcode Application Template & Power Virtual Agent bots!

…they are not scary at all!

If you are interested in learning more about the subject, a longer version of this post can be found here.



Vesku has worked for over 20 years in various specialist, consulting and training roles. He is a Microsoft MVP (Most Valuable Professional). Vesku specializes in the modern work solutions offered by Office 365 and Microsoft Teams, but he also has experience across many different industries and technologies.

Building a real-time, cloud-based monitoring solution using Power BI and Microsoft’s data offerings

You may have heard the saying “Data is the new oil”, which has been spreading around for the last few years. The word “data” in this statement usually refers to the data that you and I leave behind when using technology, to then be made use of by advertisers, influencers, and other parties.

But just as well, the data your business generates is the new oil, which, if refined properly, will allow you to gain new and powerful insights into your business practices. This allows you to enhance, streamline and improve your business processes from research, development and manufacturing to sales, logistics and customer support. In this blog series I will go through a number of topics on how we can leverage Microsoft’s cloud-based business intelligence and data offerings to bring new insights into your data.

Topics of this series:
1. Building a real-time, cloud-based monitoring solution using Power BI and Microsoft’s data offerings (this post)
2. Consolidating our enterprise data in the Azure Data Lake to enrich our data-driven solutions

Power BI is a powerful tool for building reports that allow you to gain new insights from your data. Often these reports are refreshed daily or even weekly, which can be enough for performing historical analysis on concerns such as sales performance, customer satisfaction or expenses.

However, Power BI can also be used to build reports that give you a clear look into what is going on in your organization right now, so that you can act on the information immediately. Such reports can display information from live manufacturing processes, logistics chains or even sales negotiations, all in real time.

Out of the box, Power BI supports both streaming and pushing real-time data. Both mechanisms allow us to easily build reports with live data, but they also come with limitations: streamed data is fast and powerful to use, but it is stored for at most an hour, which limits its use to monitoring live data. No historical analysis is possible with streamed data.

Pushed data, on the other hand, is stored by Power BI and supports historical analysis. This makes it ideal for reporting scenarios where we can identify a set of real-time data that only needs to be available to a single Power BI application. The downside to pushed data is that we cannot reuse the same dataset anywhere else: if we wanted to create multiple reports using the same data, we would need to build a separate pushed dataset for each of them – or come up with an alternative solution.
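Pushing rows into such a dataset happens through the Power BI REST API, as a POST to the dataset table’s `rows` endpoint with a bearer token. The sketch below only builds the documented JSON body; the column names are invented for illustration, and the authenticated HTTP call itself is omitted:

```python
import json

def rows_payload(rows):
    """Build the JSON body for POST .../datasets/{id}/tables/{table}/rows."""
    return json.dumps({"rows": rows})

payload = rows_payload([
    {"timestamp": "2020-04-01T12:00:00Z", "line": "A", "unitsProduced": 118},
])
print(payload)
```

Each call appends rows to the dataset, which is what lets pushed data accumulate history, unlike a streaming dataset.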

Case: Live industrial monitoring whiteboards with Power BI

Being aware of the technical options and limitations our tools give us is definitely important, but with data everything begins with identifying the data we have, where it comes from and what we want to do with it. In this case example we will be building industrial monitoring whiteboards, which are used to display many different kinds of data detailing the goings-on in our factory:

  • Production speeds and targets
  • State of the machinery: Temperatures, cleanliness, jams or any other issues
  • Free storage space, inventory of parts and materials
  • Environmental data: Outside temperatures, wind speed and direction
  • Safety information: Number of accidents, latest notifications

Our production data is also timestamped, which allows us to attach it to the shifts working in the factory. This way we can not only see what is currently going on, but also which teams have been performing beyond expectations. Next, we will classify our data based on how we need to use it in our reports.

As you recall from before, with live data in Power BI our choices are streaming or pushing data. We want to avoid pushing, because we have identified that there can be multiple future use cases for the data, and as such we want to store it somewhere more reusable than a Power BI dataset. Streaming data is stored for an hour in Power BI, so we can use it to display immediate sensor information such as production speeds and temperatures, but we also need to capture it separately for historical analysis. This leads us to adopt a hybrid model for our data and reports.
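The decision rule we just applied can be distilled into a few lines. The wording of the outcomes is my own shorthand, but the trade-offs are the ones described above:

```python
def delivery_mechanism(needs_history: bool, reused_elsewhere: bool) -> str:
    """Pick a Power BI real-time mechanism from two requirements."""
    if not needs_history:
        # Streaming keeps data for about an hour - fine for live meters.
        return "streaming dataset"
    if not reused_elsewhere:
        # Push datasets keep history but lock the data into one dataset.
        return "push dataset"
    return "hybrid: stream to Power BI and capture to SQL"

print(delivery_mechanism(needs_history=True, reused_elsewhere=True))
```

Our factory data needs both history and reuse, which is exactly why we land on the hybrid branch.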

In our hybrid model we will stream live data directly from our data sources to Power BI, where our BI developers create meters and real-time visuals for the data. The same data will also be captured into a SQL database, from where we will supply the reports with near real-time historic data.

To achieve this, we will use Power BI’s DirectQuery method to fetch recent data (the last few factory shifts, perhaps) frequently. Power BI’s Import method will then be used to fetch any data needed for longer historical analysis, such as analyzing factory teams’ performance over the year. An on-premises data gateway will be installed between the cloud-based Power BI service and our on-premises factory servers to facilitate data transfer.

Finally, once our data is in place and the reports have been developed, we still need to figure out how to display the reports to our end users: both the analysts viewing our historic data and the factory workers themselves.
Our analysts are likely to be happy with the Power BI dashboard, and for their purposes it will work perfectly. For the factory workers, who rely more on the live data, the out-of-the-box Power BI interface can be found lacking: it contains lots of visual clutter and controls that are in the way when the report is viewed on a factory hall monitor.

In addition, while live streaming data is automatically refreshed in Power BI reports, our near real-time DirectQuery data is only refreshed when the report updates – and by default, Power BI reports update only every fifteen minutes. To circumvent this, we create a custom Power BI Embedded app, which allows us to minimize visual clutter and enforce report refresh times. In our factory we settle on refreshing the near real-time data once every minute.

With these high-level steps we have identified the data in our possession, developed use-case driven ways to capture and deliver it to Power BI and then created reports and ways to display them catered to our different users’ needs.

Do you feel like there might be ways in which your existing business data could be harnessed more effectively? Do get in touch and let’s see what we could do to help you!


How to transfer your customer meetings online and create easy-to-buy services with Microsoft Bookings

Microsoft Bookings is an online and mobile app for small businesses that provide services to internal or external customers on an appointment basis. With Bookings, you can easily create online services for your customers to buy.

For example, let’s say you have a business that sells social media content to customers. Before, you would have your staff’s phone numbers on your webpage and maybe a “contact us” form for customers. When a customer calls or sends a message, you would spend time selling the product and then finding a meeting slot in both of your calendars. After that, you would travel to the meeting, perhaps spending a lot of time in traffic. Not to mention the restrictions created by the corona situation, which can make it impossible to even meet customers face-to-face.

Did you know that there is a more effective way to sell, book and deliver services? You can use Microsoft Bookings and Teams.

Microsoft Bookings

In Bookings, your customer can book a time online from your available hours. The appointment can be held online using Microsoft Teams. Photos by Microsoft.

What will be different when you have Bookings + Teams?

Let’s say your customer browses your website, finds interesting content and is assured of your expertise. From there, it’s a low threshold for the customer to click your “Book a 30-minute sparring session for social media marketing online” button. The customer can book a time then and there, as they can see your available times and choose the one best suited to them. Once they do, a notification is sent to you and you can confirm the appointment.

The appointment can be held online using Microsoft Teams (even if the customer doesn’t have Teams). So there! You have just sold and delivered your services without spending any time on the phone or in a car, and you can focus on the key things – the things you love to do.

Microsoft Bookings

When a new booking is made, you and your customer both get notifications from it. You can also set up an alarm to remind your customer when the appointment is getting closer. Photo by Microsoft

See all this in practice, join our webinar! You’ll learn:

  • What is Bookings
  • How to make a “reservation line” available online without having to have a person waiting by the phone. Customers can find free times and schedule appointments at any time of the day.
  • How you can implement Bookings quickly in your organization.
  • The possibilities for digitalizing your business, for example in finance & insurance, healthcare, consultative work, helpdesk services and small business.

See the webinar

Note: Microsoft Bookings is available as part of Office 365 Business Premium and Enterprise E3 & E5 subscriptions.

Microsoft Bookings

You can view and manage your booked hours in Bookings. Photo by Microsoft




Senior Consultant. Tuulia has a strong background as a communications specialist, and she believes that effective internal communication is also key to shining out to customers. She supports clients in conceptualizing and planning modern information work, communicating and implementing change, and rolling out modern tools. She helps clients develop internal communications, teamwork and business processes with Office 365, SharePoint and Teams.

It’s high time to head for the cloud if your company intends to keep up with development

“We could test new experimental services there, but we’re never going to use the cloud in production.” I well remember all the discussions and suspicions surrounding cloud services back in the early 2010s. But reality surprised us all once again. Concerns about cloud security and availability proved groundless. Instead, the cloud has brought significant benefits, a few of which I’ll list a bit later on. Many who once viewed the cloud in a negative light have already migrated, with no intention of going back. And why would they?

The business advantages of cloud solutions have long been undeniable. There’s no need to invest in IT infrastructure, meaning that the only costs will be operational. And these costs are also predictable and flexible. Capacity is always available as required. The cloud easily scales both up and down. All this is familiar to many, but the cloud also offers many other benefits.

Cloud services have grown rapidly. The research company IDC forecasts that this year, public and private cloud investments will permanently exceed investments in traditional IT. In itself, this 50 per cent marker is just one milestone, but it’s still a strong indication that every company should be migrating to the cloud now if ever. The cloud is no longer “fashionable” – it’s where you have to go if you want to stay up to date.

 

The cloud boosts company development

Development is one of the most important reasons to enter the cloud, in terms of both services and the company’s own development.

The majority of service development is already happening there. Many of the latest features are available initially or only in the cloud, which means that traditional on-prem applications and solutions are starting to get left behind. Investments in data security are also mostly occurring in the cloud. If your company intends to keep up with updates and features, you don’t really have any other option.

The cloud will also boost your own company’s development. Teams, along with its integrated Microsoft and third-party applications, is fundamentally changing and enhancing the way we work. Services found in the Azure cloud, such as machine learning and analytics tools, also enable us to create new business solutions much faster than before. Data volumes have exploded in recent years, and making decisions with the aid of data likewise calls for the cloud’s help.

However, migrating to the cloud won’t happen on its own. Although it will require new technical skills, the shift will above all be a cultural one. If you don’t change the way you work when your organisation migrates to the cloud, you’ll need luck to succeed. People play a huge role in change projects.

Their approach to change may be enthusiastic, critical or anything in between. Either way, the cloud is here to stay. And this year is your last chance to consider your next steps towards migrating to the cloud.



Mikko Torikka has spent more than ten years in supervisory and managerial positions in different organisations. He has a broad view of digitalisation, change implementation, product development and agile methods at Finnish companies. Mikko is also familiar with Finland’s corporate IT leadership field, and has a solid understanding of the importance of technical solutions for both business and the employee experience.