
· 5 min read

It's been almost a week since Angular 9 was released, which included several new features such as the new Ivy renderer, improved SSR, and stricter type checking. In this post, I will walk through how to build and deploy an Angular 9 application as a static website to Azure with GitHub Actions.

Prerequisites:

  • GitHub account
  • Azure account
  • VS Code

Step 1: Install latest Angular CLI

In order to create the Angular application, you need to install the latest Angular CLI, which can be done with the following command:

 npm install -g @angular/cli@latest

Step 2: Create a New Angular 9 App

Run the following command to create the Angular 9 app with the default template. Let's name the application ga-azure.

ng new ga-azure

Step 3: Install the Hexa.run CLI

We will use the Hexa.run CLI by Wassim Chegham to deploy the application to Azure. Next, in your Angular project, make sure to install the Hexa.run CLI (you can also add it as a dependency, as you can see in the package.json of this project):

npm i -g @manekinekko/hexa
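
The later steps invoke Hexa through npm scripts (npm run hexa:login, npm run hexa:init, npm run hexa:ci and npm run hexa:deploy). A minimal sketch of how those scripts could be wired up in package.json is shown below; the exact mapping to the underlying hexa commands is an assumption and may differ from the project's actual setup:

"scripts": {
  "hexa:login": "hexa login",
  "hexa:init": "hexa init",
  "hexa:ci": "hexa ci",
  "hexa:deploy": "hexa deploy"
}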

Step 4: Log In to Your Azure Account

The next step is to create the necessary resources to deploy the Angular application. We will deploy our application as a static website on Azure Storage, which is a great option for hosting a single-page application (SPA) on Azure. Hosting a SPA in pure Storage is by far the cheapest and most efficient way of running it in Azure.
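
For context, Hexa automates roughly what you would otherwise do by hand with the Azure CLI. A sketch of the equivalent manual steps is shown below, with the storage account, resource group, and build folder names as placeholders:

az storage account create --name <storage-account> --resource-group <resource-group> --location westeurope --sku Standard_LRS --kind StorageV2
az storage blob service-properties update --account-name <storage-account> --static-website --index-document index.html --404-document index.html
az storage blob upload-batch --account-name <storage-account> --source <build-folder> --destination '$web'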

You can log in to your Azure account with the command,

npm run hexa:login

which will list the available subscriptions, and you need to pick the subscription where you want to deploy the application.

Step 5: Initialize the Hexa Settings

The next step is to initialize the configuration needed for the deployment of the application. Run the Hexa CLI command as follows,

npm run hexa:init

which will ask for a few inputs such as the project name, storage account name, and the destination folder. Eventually, you will see a new file generated named hexa.json, which will look like the below:

{
  "subscription": {
    "name": "Azure Demo"
  },
  "project": {
    "location": "westeurope",
    "name": "ga-azure"
  },
  "storage": {
    "location": "westeurope",
    "name": "gaazure10940"
  },
  "hosting": {
    "folder": "./gist/ga-azure"
  }
}

Now you have everything needed to deploy the application with GitHub Actions.

Step 6: Generate Service Principal

You need to use the service principal identity mechanism to authorize the deployment. To generate the service principal using Hexa, run the command below:

npm run hexa:ci

Hexa.run will automatically:

  1. create an Azure resource group (or let you choose an existing one)
  2. create the Azure storage account
  3. configure the storage account and make it static-website ready
  4. upload the Angular bundle
  5. print the generated URL from the Azure service

It will also print the necessary credentials as JSON; make a note of them.

{
  appId: 'xx4362xx-aaxx-40xx-8bxx-xx6ea0c351xx',
  displayName: 'ga-azure',
  name: 'http://ga-azure',
  password: 'xxce72xx-1axx-44xx-81xx-35xxb15xxa1e',
  tenant: 'xxf988xx-86xx-41xx-91xx-2d7cd011dbxx'
}

Step 7: Commit and Push to the GitHub Repository

Once you are done with all the above steps, commit your changes and push them to the remote repository on GitHub.

git remote add origin https://github.com/sajeetharan/ga-azure.git
git add -A && git commit -m "First commit"
git push origin master

Step 8: Create the GitHub Actions Workflow

Now it's time to create the Actions workflow. You can create a new workflow by navigating to the Actions tab and clicking New workflow. There are a few sample template workflows available, but in this case we will use our own workflow, so click Set up a workflow yourself.

Setting up own workflow GitHub Actions

You will immediately see a new workflow.yml file created, where you need to add the steps and actions needed to deploy the app. Here is what the workflow file looks like after adding all the steps:

name: Deploy to Azure with Hexa.run
on:
  push:
    branches:
      - master
      - release/*

jobs:
  build:

    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:
      - uses: actions/checkout@v1
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - name: npm install
        run: |
          npm install
      - name: npm build, and deploy
        env:
          AZURE_SERVICE_PRINCIPAL_ID: ${{ secrets.AZURE_SERVICE_PRINCIPAL_ID }}
          AZURE_SERVICE_PRINCIPAL_PASSWORD: ${{ secrets.AZURE_SERVICE_PRINCIPAL_PASSWORD }}
          AZURE_SERVICE_PRINCIPAL_TENANT: ${{ secrets.AZURE_SERVICE_PRINCIPAL_TENANT }}
        run: |
          npm run hexa:login
          npm run build -- --prod
          npm run hexa:deploy

As you can see, the steps are quite simple: install the dependencies, then build and deploy the Angular application using the hexa:deploy command.

You also need to configure the secrets in the GitHub repository that were generated in Step 6. You can create a new secret by navigating to Settings and then Secrets. Define the secrets below, which are associated with the service principal.
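
If you prefer the command line, the same secrets can also be created with the GitHub CLI (assuming gh is installed and authenticated against the repository), using the values printed in Step 6:

gh secret set AZURE_SERVICE_PRINCIPAL_ID --body "<appId>"
gh secret set AZURE_SERVICE_PRINCIPAL_PASSWORD --body "<password>"
gh secret set AZURE_SERVICE_PRINCIPAL_TENANT --body "<tenant>"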

Github Secrets

The rest of the workflow is easy to understand: it covers the environment and the trigger (whenever someone pushes changes to master or a release branch, a build should run).

Step 9: See GitHub Actions in Action

As soon as you save the workflow.yml, a new build is triggered and the steps are executed, which you can see in the Actions tab as follows:

Deploy using hexa:run

Once the application is deployed, you will be able to access it at the generated URL, which will look like https://gaazure10940.z6.web.core.windows.net/.

That's all you need to do in order to deploy the Angular application to Azure. If you need to include end-to-end testing or other tasks, you can simply modify the workflow accordingly. GitHub Actions is definitely a future to believe in! Try this out and let me know if you have any queries! Cheers!


· 6 min read

Microsoft's Power Platform is a low-code platform that enables organizations to analyze data from multiple sources, act on it through applications, and automate business processes. Power Platform has three main components (Power Apps, Power BI, and Microsoft Flow) and integrates with two main ecosystems (Office 365 and Dynamics 365). In this blog we will look in detail at Power Apps, which helps you quickly build apps that connect to data and run on web and mobile devices.

Overview:

We will see how to build a simple Power App to automate scaling of resources in Azure in a simple yet powerful fashion, using multiple Microsoft tools to help save money in the cloud. In this post, we will use Azure Automation runbooks for the Azure PowerShell scripts, Microsoft Flow as the orchestration tool, and Microsoft Power Apps as the simple interface. If you have an Azure environment with a lot of resources, managing scaling becomes a hassle if you don't have auto-scaling implemented. The following Power App will help you understand how easy it is to build the above scenario.

Prerequisites:

We will use the following Azure resources to showcase how scaling and automation can save a lot of cost.

  • App Service Plan
  • Azure SQL database
  • Virtual Machines

Step 1: Create and Update the Automation Account

First we need to create the Azure Automation account. Navigate to Create a resource -> search for Automation Account in the search bar. Create a new resource as follows.

Once you have created the resource, you also need to install the required modules for this particular demo, which include:

  • AzureRM.Resources
  • AzureRM.Profile

Installing necessary Modules

Step 2: Create Azure PowerShell RunBooks

The next step is to create the runbooks by going to the Runbooks blade. For this tutorial, let's create six runbooks, one for each resource and purpose. We need to create a PowerShell script for each of these:

  • Scale Up SQL
  • Scale Up ASP
  • Start VM
  • Scale Down SQL
  • Scale Down ASP
  • Stop VM

Creating RunBook

The PowerShell scripts for each of these can be found in the GitHub repository. We need to create a runbook for each of the scripts.
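
To give an idea of what such a script looks like, here is a minimal sketch of a "Scale Up SQL" runbook. The resource names and target tier are placeholders, and scaling a SQL database this way also assumes the AzureRM.Sql module is available in the Automation account:

# Authenticate with the Automation Run As connection
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint

# Scale the database up to the target service objective (placeholder values)
Set-AzureRmSqlDatabase -ResourceGroupName "my-resource-group" `
    -ServerName "my-sql-server" `
    -DatabaseName "my-database" `
    -RequestedServiceObjectiveName "S3"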

All the runbooks above can be scheduled to run for a desired length of time, on particular days of the week and time frames, or continuously with no expiry. In an enterprise, for non-production environments you would typically want resources to scale down at the end of business hours and over the weekend.

Step 3: Create the Flow

Once the above PowerShell scripts and runbooks are tested and published, we can move on to the next step and create the Flow. Navigate to Flow and create a new app from a template.

New Flow from the template

Select the PowerApps button template; the first step we need to add is the Automation job. When you search for "automation" you will see the list of actions available. Select Create job and pick the runbook you created above. If you want all the actions in one app, you can add them one by one; if not, you need to create a separate flow for each one.

In this example, I have created one with the ScaleUpDB job, which executes the scale-up command for the database.

Once you are done with all the steps, save the flow with a suitable name.

Step 4: Create the Power App

Once the PowerApp flow buttons are created, log in to Microsoft Power Apps with a work/school account. Power Apps gives you a blank canvas for mobile or tablet. You can then begin to customize the PowerApp with text labels, colours, and buttons as below.

PowerApps Name

In this case we will have buttons to increase/decrease the instance count of the SQL database; my app looked like the below with a few labels and buttons.

AutoMateApp

Next, link the flow to the button of the Power App by navigating to Action -> Power Automate.

Linking Button Action

Once both Scale Up/Scale Down actions are linked, save the app and publish it.

Step 5 : Verify Automation

In order to verify that things are working correctly, click on Scale Up and Scale Down a few times, then navigate to the Azure portal and open the Automation account we created.

Navigate to the overview tab to see the requests for each job made via the power app as below.

Jobs executed.

To look at the history, navigate to the Jobs blade.

Jobs Execution

Further, you can build a customized app for different environments with different screens using Power Apps. With the help of Azure Alerts, whenever you get an alert about heavy resource usage or spikes, a single click of a button lets you scale the resources up or down as you need.

Improvements and things to consider:

This is just a starting point to explore more on this functionality, but there are improvements you could add to make this really useful.

(i) Sometimes the Azure Automation action fails to start the runbook; the flow you implement needs to handle this condition.

(ii) Sometimes a runbook action will succeed, but the runbook execution itself errors out. Consider using try/catch blocks in the PowerShell and outputting the final result as a JSON object that can be parsed and further handled by the flow (see the sketch after this list).

(iii) You should update the code to use the Az modules rather than AzureRM.
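
A minimal sketch of that try/catch pattern, with placeholder resource names, could look like this:

try {
    Set-AzureRmSqlDatabase -ResourceGroupName "my-resource-group" `
        -ServerName "my-sql-server" `
        -DatabaseName "my-database" `
        -RequestedServiceObjectiveName "S3"
    $result = @{ status = "Succeeded"; objective = "S3" }
}
catch {
    $result = @{ status = "Failed"; error = $_.Exception.Message }
}
# The JSON written to output can be parsed by the flow that started the runbook
Write-Output ($result | ConvertTo-Json)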

Note : The user who executes the PowerApp also needs permission to execute runbooks in the Automation Account.

With this app, it becomes easy for the operations team to manage scaling without logging in to the portal. Hope it helps someone out there! Cheers.


· 7 min read

Many times you may have wanted a single view/dashboard of all the GitHub issues created across your open source repositories. I have almost 150 repositories, and it becomes really hard to find which issues should be fixed first. In this post we will see how you can create a single dashboard/report to view all your GitHub issues using an Azure Function (3.x with TypeScript) and Azure Cosmos DB.

Prerequisites:

You will need an Azure subscription and a GitHub account. If you do not have an Azure subscription you can simply create one with a free trial, which provides you with 12 months of free services. We will use an Azure Function and Cosmos DB to build this solution.

Step 1: Create a Resource Group

In order to manage and deploy the function app and Cosmos DB, we first need to create a resource group. You can create one named "gh-issue-report".

Step 2: Create the Azure Cosmos DB Account

To store the data related to GitHub issues we need to create a Cosmos DB account. To create the account, navigate to the Azure portal and click Create a resource. Search for Azure Cosmos DB in the marketplace and create the account as follows.

CosmosDB Creation

Step 3: Create the Function App

In my previous blog I mentioned how to create an Azure Function. Here is an image of the Function App I created.

Creating Function App

Create the TypeScript Function:

As you can see, I selected Node.js as the runtime stack, which will be used to run the function written in TypeScript. Open Visual Studio Code (make sure you have already installed VS Code with the Functions core tools and extension). Press Ctrl + Shift + P to create a new function project and select the language as TypeScript.

Create Typescript Function

Select the Timer trigger template, as we need the function to run every 5 minutes, and configure the cron expression (0 */5 * * * *) as well. (You can use a custom schedule.)

Give the function the name gitIssueReport, and you will see the function getting created with the necessary files.
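
If you prefer the command line over the VS Code palette, roughly the same project can be scaffolded with the Azure Functions Core Tools; this is a sketch assuming Core Tools v3 is installed, and the project folder name is a placeholder:

func init gh-issue-report --worker-runtime node --language typescript
cd gh-issue-report
func new --name gitIssueReport --template "Timer trigger"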

Step 4: Add Dependencies to the Function App

Let's add the necessary dependencies to the project. We will use bluebird as a dependency to handle the promises, and the gh-issues-api library to interact with GitHub and fetch the issues. You need to add these under dependencies in the package.json file.

 "dependencies": {
"@types/node": "^13.7.0",
"bluebird": "^3.4.7",
"gh-issues-api": "0.0.2"
}

You can view the whole package.json here.

Step 5: Set Output Binding

Let's set the output binding to Cosmos DB to write the issues to the collection. You can set it by modifying function.json as follows:

{
  "type": "cosmosDB",
  "name": "issueReport",
  "databaseName": "gh-issues",
  "collectionName": "open-issues",
  "createIfNotExists": true,
  "connectionStringSetting": "gh-issue_DOCUMENTDB",
  "direction": "out"
}

Here the type cosmosDB denotes the Cosmos DB output binding, and you can see the database name and collection configured.
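
For context, the complete function.json combines the timer trigger with this Cosmos DB output binding. A sketch is shown below; the trigger name myTimer matches the second parameter of the function code, and the scriptFile path is an assumption that depends on your TypeScript build output:

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    },
    {
      "type": "cosmosDB",
      "name": "issueReport",
      "databaseName": "gh-issues",
      "collectionName": "open-issues",
      "createIfNotExists": true,
      "connectionStringSetting": "gh-issue_DOCUMENTDB",
      "direction": "out"
    }
  ],
  "scriptFile": "../dist/gitIssueReport/index.js"
}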

Step 6: Code to Retrieve the GitHub Repository Issues

The actual logic of the function is as follows,


import Promise = require('bluebird');

import {
  GHRepository,
  IssueType,
  IssueState,
  IssueActivity,
  IssueActivityFilter,
  IssueLabelFilter,
  FilterCollection
} from 'gh-issues-api';

export function index(context: any, myTimer: any) {
  var timeStamp = new Date().toISOString();

  if (myTimer.isPastDue) {
    context.log('Function trigger timer is past due!');
  }

  const repoName = process.env['repositoryName'];
  const repoOwner = process.env['repositoryOwner'];
  const labels = [
    'bug',
    'build issue',
    'investigation required',
    'help wanted',
    'enhancement',
    'question',
    'documentation',
  ];

  const repo = new GHRepository(repoOwner, repoName);
  var report = {
    name: repoName,
    at: new Date().toISOString()
  };

  context.log('Issues for ' + repoOwner + '/' + repoName, timeStamp);
  repo.loadAllIssues().then(() => {
    var promises = labels.map(label => {
      var filterCollection = new FilterCollection();
      filterCollection.label = new IssueLabelFilter(label);
      return repo.list(IssueType.All, IssueState.Open, filterCollection).then(issues => report[label] = issues.length);
    });
    var last7days = new Date(Date.now() - 604800000);
    var staleIssuesFilter = new IssueActivityFilter(IssueActivity.Updated, last7days);
    staleIssuesFilter.negated = true;
    var staleFilters = new FilterCollection();
    staleFilters.activity = staleIssuesFilter;
    promises.push([
      repo.list(IssueType.Issue, IssueState.Open).then(issues => report['total'] = issues.length),
      repo.list(IssueType.PulLRequest, IssueState.Open).then(issues => report['pull_request'] = issues.length),
      repo.list(IssueType.All, IssueState.Open, staleFilters).then(issues => report['stale_7days'] = issues.length)
    ]);

    return Promise.all(promises);
  }).then(() => {
    var reportAsString = JSON.stringify(report);
    context.log(reportAsString);
    context.bindings.issueReport = reportAsString;
    context.done();
  });
}

You can see that the report document is written to Cosmos DB through the output binding named issueReport.

Step 7: Deploy the Function

Now deploy the Function App. You can deploy it to Azure from VS Code by pressing Ctrl+Shift+P and selecting Deploy to Function App.

Deploy Function App

Step 8: Verify/Install the Dependencies

Once the deployment is successful, navigate to the Azure portal and open the function app to make sure that everything looks good. If you don't see the dependencies, install them manually by navigating to the Kudu console of the function app.

Note : Make sure to stop the Function app before you head over to Kudu.

Click on the Platform features tab. Under Development Tools, click Advanced tools (Kudu). Kudu will open on its own in a new window.

Navigate to KUDU console

In the top menu of the Kudu console, click Debug console and select CMD.

In the command prompt, we’ll want to navigate to D:\home\site\wwwroot. You can do so by using the command cd site\wwwroot and press enter on your keyboard. Once you’re in wwwroot, run the command npm i bluebird to install the package. Also do the same for gh-issues-api

Step 9: Set Environment Variables (Repository)

As you can see in the code above, we read two environment variables, the repository name and the repository owner, which are needed to fetch the issue information. You can set those variables in the Azure portal as follows.

Navigate to the Overview tab for your function and click Configuration. As you can see below, I've configured those values.
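
If you prefer the Azure CLI, the same application settings can be set with a command along these lines (the function app name and setting values are placeholders):

az functionapp config appsettings set --name <function-app-name> --resource-group gh-issue-report --settings repositoryName=<repo> repositoryOwner=<owner>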

Function App Settings

Step 10: Verify the Output Binding

Just to make sure that our settings in function.json have been reflected, navigate to Functions, select the function, and check that all the binding values are correct. If not, create a new binding to the Cosmos DB account you created, as mentioned in Step 3 (instead of Twilio, select Cosmos DB).

Step 11: Run and Test the Function

Now it's time to see the function running and issues being reported. Navigate to your function and click Run. You can see the function running as shown below.

Run Function App

Step 12: Check Live App Metrics

If you see any errors, you can always navigate to the Monitor section of the function app and select Live app metrics.

Live metrics of the function app

Step 13: Verify the Data in Cosmos DB

If everything goes well, you can navigate to the Cosmos DB account and open the collection with the Data Explorer.

Data Explorer Cosmosdb

You will see that there are many documents inserted in the collection.

Cosmosdb collection with Github repository Issues

Now you can modify this function to retrieve the issues from all of your repositories and use the data stored in the Cosmos DB collection to build a dashboard that shows the issues by priority. You can also make use of this post to send a notification to someone about an issue.

Hope this simple function helps someone build a dashboard out of the collected data and become more productive. Cheers!