
· 2 min read

I was checking out a cool feature on the Azure portal today. I usually spend about three hours per week evaluating new features or building something new on Azure. For a lot of developers, the Azure portal is the go-to place to manage all of their Azure resources and services. One thing I often hear from developers is that the portal takes a long time to load after login, and sometimes it simply feels slow. Today I figured out a way to debug the portal loading time. If you are a developer who treats the Azure portal as your website and wants to know the load duration for each view of the page, press the keyboard shortcut CTRL + ALT + D while you are logged in to the Azure portal. You can then see the load time and other useful information for every tile.

Azure Portal Load Time

You can enable or disable certain features on the portal by toggling the Experiments. You can also enable the Debug Hub to see whether there are any exceptions or issues while the portal's elements load.

Enable/Disable certain features

One other tip I would like to highlight here: there are keyboard shortcuts you can use specifically within the Azure portal.

To see them all, you can open the keyboard shortcut help item in the Help menu at the top-right of the portal.

Shortcut keys

I hope this helps you figure out whether there is any slowness while loading the Azure portal. If you are new to Azure and want to get started, explore my tool Azure360. Share this tip with your colleagues.

· 5 min read

We have a wide variety of options to store data in Microsoft Azure, and every storage option has a unique purpose for its existence. In this blog, we will discuss ADLS (Azure Data Lake Storage) and its multi-protocol access, which Microsoft introduced in 2019.

Introduction to ADLS (Azure Data Lake Storage)

According to Microsoft's definition, it is an enterprise-wide, hyper-scale repository for big data analytics workloads that enables you to capture data of any size and ingestion speed in one single place for operational and exploratory analytics.

The main purpose of its existence is to enable analytics on the stored data (which may be of any type: structured, semi-structured, or unstructured) and to provide enterprise-grade capabilities such as scalability, manageability, and reliability.

What is it built on?

ADLS is built on top of Azure Blob Storage. Blob Storage is one of the storage services under the Storage account suite. Blob storage lets you store any type of data; it does not have to be of a specific data type.

Does the functionality of ADLS sound like Blob storage?

From the above paragraphs, it looks like ADLS and Blob storage have the same functionality, since both services can be used to store any type of data. But, as I said before, every service has a purpose for its existence. Let us explore the differences between ADLS and Blob storage below.

Difference between ADLS and Blob storage

Purpose

ADLS is optimized for analytics on the data stored in it, whereas Blob storage is the usual way of storing file-based information in Azure, particularly data that will not be accessed very often (also called cold storage).

Cost

With both storage options, you pay for the data stored and for I/O operations. In the case of ADLS, the cost is slightly higher than Blob storage.

Support for the WebHDFS interface

ADLS supports a standard WebHDFS interface, so Hadoop can access its files and directories directly. Blob storage does not support this feature.

I/O performance

ADLS is built for running large-scale systems that require massive read throughput when querying data at any pace. Blob storage is used to store data that will be accessed infrequently.

Encryption at rest

ADLS has supported encryption at rest since it became generally available; it encrypts data both in transit over public networks and at rest. Blob storage did not support encryption at rest when this comparison was made. See more details on the comparison here.

Now, without any further delay, let us dig into multi-protocol access for ADLS.

Multi-protocol access for ADLS

This is one of the most significant announcements Microsoft made in 2019 as far as ADLS is concerned. Multi-protocol access to the same data allows you to leverage existing object storage capabilities on Data Lake Storage accounts, which are hierarchical namespace-enabled storage accounts built on top of Blob storage. This allows you to put all your different types of data in the data lake so that users can make the best use of it as the use case evolves.

Multi-protocol access is achieved via the Azure Blob storage API and the Azure Data Lake Storage API. The convergence of the two existing services, ADLS Gen1 and Blob storage, paved the path to a new term: Azure Data Lake Storage Gen2.
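To make this concrete, here is a minimal Azure CLI sketch of creating a Gen2 account; the resource group and account names are placeholders, and it assumes you are already logged in with az login. The only thing that distinguishes it from a plain blob account at creation time is the hierarchical namespace flag:

# Placeholder names; assumes an existing resource group named demo-rg.
az storage account create \
  --name demodatalake001 \
  --resource-group demo-rg \
  --location westeurope \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true

# The same data is then reachable over both protocols:
#   Blob API:      https://demodatalake001.blob.core.windows.net/<container>/<path>
#   Data Lake API: https://demodatalake001.dfs.core.windows.net/<filesystem>/<path>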

Expanded feature set

With the announcement of multi-protocol access, existing blob features such as access tiers and lifecycle management policies are now unlocked for ADLS. Furthermore, many of the features and much of the ecosystem support of Blob storage are now available for your data lake storage.

This could be a great shift, because your blob data can now be used for analytics. The best thing is that you do not need to update your existing applications to get access to data stored in Data Lake Storage. Moreover, you can leverage the power of both your analytics and object storage applications to use your data most effectively.
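As an illustration of one of these unlocked features, here is a hedged sketch of applying a lifecycle management policy to the hierarchical namespace-enabled account from the earlier sketch; the account name and rule name are placeholders:

# Tier data to cool after 30 days and to archive after 90 days.
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "cool-then-archive",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 }
          }
        },
        "filters": { "blobTypes": [ "blockBlob" ] }
      }
    }
  ]
}
EOF

az storage account management-policy create \
  --account-name demodatalake001 \
  --resource-group demo-rg \
  --policy @policy.json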

While exploring the expanded feature set, one of the best things I found is that ADLS can now be integrated with Azure Event Grid.

Yes, we have one more publisher on the list for Azure Event Grid. Events generated by Azure Data Lake Storage Gen2 can now be consumed by Azure Event Grid and routed to its subscribers, with webhooks, Azure Event Hubs, Azure Functions, and Logic Apps as endpoints.
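As a rough sketch of what that wiring looks like with the Azure CLI (every name here, including the function resource path, is a placeholder), you can subscribe a function to BlobCreated events from the account:

# Look up the resource ID of the Gen2 storage account.
STORAGE_ID=$(az storage account show \
  --name demodatalake001 \
  --resource-group demo-rg \
  --query id -o tsv)

# Route BlobCreated events from the data lake to an Azure Function.
az eventgrid event-subscription create \
  --name datalake-events \
  --source-resource-id "$STORAGE_ID" \
  --endpoint-type azurefunction \
  --endpoint "/subscriptions/<sub-id>/resourceGroups/demo-rg/providers/Microsoft.Web/sites/<function-app>/functions/<function-name>" \
  --included-event-types Microsoft.Storage.BlobCreated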

Modern Data Warehouse scenario

The above image depicts a use case scenario of ADLS integration with Event Grid. First off, data arrives from different sources such as logs, media, files, and business apps. That data lands in ADLS via Azure Data Factory, and the Event Grid subscription that listens to ADLS is triggered once the data arrives. The event is routed via Event Grid and Azure Functions to Azure Databricks. A Databricks job processes the file and writes the output back to Azure Data Lake Storage Gen2. Meanwhile, Azure Data Lake Storage Gen2 pushes a notification to Event Grid, which triggers an Azure Function to copy the data to Azure SQL Data Warehouse. Finally, the data is served via Azure Analysis Services and Power BI.

Wrap-up

In this blog, we saw an introduction to Azure Data Lake Storage and the differences between ADLS and Blob storage. Further, we investigated multi-protocol access, one of the newest additions to ADLS. Finally, we looked into one item from the expanded feature set: the integration of ADLS with Azure Event Grid and its use case scenario.

I hope you enjoyed reading this article. Happy Learning!

Image Credits: Microsoft

This article was contributed to my site by Nadeem Ahamed, and you can read more of his articles here.

· One min read

There are cases where you want to start or stop all the VMs in a particular resource group in parallel within Azure. You can set this up with Azure Automation using scheduled actions; another way is to use PowerShell or the Azure CLI.

If you are using PowerShell, you can simply run:

# -AsJob starts each VM as a background job, so the VMs start in parallel
Get-AzVM -ResourceGroupName 'MyResourceGroup' | Start-AzVM -AsJob

If you are using the Azure CLI/Bash:

az vm start --ids $(az vm list -g MyResourceGroup --query "[].id" -o tsv)
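For the stop side, the same pattern works; a minimal sketch is below. Note that deallocate, unlike stop, releases the underlying compute so billing for it ends, and --no-wait returns immediately instead of blocking on each VM:

# Deallocate every VM in the resource group without waiting on each one
az vm deallocate --no-wait --ids $(az vm list -g MyResourceGroup --query "[].id" -o tsv)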

Here, MyResourceGroup is the name of your resource group. Happy Azurifying!

· 2 min read

I have come across the question "How to deploy a web app within a sub-folder on Azure" on Stack Overflow many times. Even though there is official documentation, the question has not been addressed in general. With virtual directories, you can keep your web sites in separate folders and use the 'virtual directories and applications' settings in Azure to publish two different projects under the same site.

However, say you have an ASP.NET Core/Angular app and you want to deploy it to a sub-folder inside an Azure Web App (App Service). You can simply navigate to the Azure portal -> select the Web App -> Overview, then:

  • Download the publish profile
  • Import it in Visual Studio
  • Edit the Web Deploy profile (normally the publish profile contains a Web Deploy profile as well as an FTP profile)
    • Change the Site Name from your-site to your-site\folder\sub-folder
    • Change the Destination URL from http://your-site.azurewebsites.net to http://your-site.azurewebsites.net/folder/sub-folder
  • Publish

At this point, you will likely get an error like the following:

System.TypeLoadException: Method ‘get_Settings’ in type ‘Microsoft.Web.LibraryManager.Build.HostInteraction’ from assembly ‘Microsoft.Web.LibraryManager.Build

You can resolve the above issue by updating the NuGet package Microsoft.Web.LibraryManager.Build in your project.

One other thing you should be aware of: go to portal > demo-site App Service > Configuration > Path Mappings > Virtual applications and directories, and add the following:

Virtual Path         Physical Path                    Type
/folder              site\wwwroot\folder              Folder
/folder/sub-folder   site\wwwroot\folder\sub-folder   Application

Configuration Virtual Directory
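If you would rather script this step than click through the portal, here is a hedged sketch using the Azure CLI's generic resource update; demo-rg and demo-site are placeholder names, and the JSON mirrors the shape of the portal's path mapping entries:

# Build the resource ID of the site's web configuration.
SITE_CONFIG_ID="$(az webapp show --resource-group demo-rg --name demo-site --query id -o tsv)/config/web"

# Append the sub-folder as a virtual application.
az resource update \
  --ids "$SITE_CONFIG_ID" \
  --add properties.virtualApplications '{"virtualPath": "/folder/sub-folder", "physicalPath": "site\\wwwroot\\folder\\sub-folder", "preloadEnabled": false}'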

Now publish from Visual Studio. If you only need to publish to a first-level folder, i.e. to your-site\folder, then all you have to do is change the Type to Application in the path mapping for /folder and skip the sub-folder entry, since you do not need it. Correct the Site Name and Destination URL in the publish profile accordingly.

Hopefully there will be no more questions on this one. Happy Coding!
