
16 posts tagged with "microsoft"


· 6 min read

Microsoft's Power Platform is a low-code platform that enables organizations to analyse data from multiple sources, act on it through applications, and automate business processes. Power Platform has three main components (Power Apps, Power BI, and Microsoft Flow) and integrates with two major ecosystems (Office 365 and Dynamics 365). In this blog we will look in detail at Power Apps, which helps you quickly build apps that connect to data and run on web and mobile devices.

Overview:

We will see how to build a simple Power App to automate the scaling of resources in Azure, in a simple yet powerful fashion, to help save you money in the cloud using multiple Microsoft tools. In this post, we will use Azure Automation runbooks for the Azure PowerShell scripts, Microsoft Flow as the orchestration tool, and Microsoft Power Apps as the simple interface. If you have an Azure environment with a lot of resources, managing scaling becomes a hassle if you don't have autoscaling implemented. The following Power App will help you understand how easy it is to build the above scenario.

Prerequisites:

We will use the following Azure resources to showcase how scaling and automation can save a lot of cost:

  • App Service Plan
  • Azure SQL database
  • Virtual Machines

Step 1: Create and Update the Automation Account

First we need to create the Azure Automation account. Navigate to Create a Resource -> search for Automation Account in the search bar. Create a new resource as follows,

Once you have created the resource, you also need to install the required modules for this particular demo, which include:
- AzureRM.Resources
- AzureRM.Profile

Installing necessary Modules

Step 2: Create Azure PowerShell Runbooks

The next step is to create the runbooks from the Runbooks blade. For this tutorial, let's create six runbooks, one for each resource and purpose. We need a PowerShell script for each of the following:

- Scale Up SQL
- Scale Up ASP
- Start VM
- Scale Down SQL
- Scale Down ASP
- Stop VM

Creating RunBook

The PowerShell scripts for each of these can be found in the GitHub repository. We need to create a runbook for each of the scripts.
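To give a feel for what these scripts contain, here is a minimal sketch of what the Scale Up SQL runbook might look like, assuming the AzureRM.Sql module is also imported and a classic Run As connection is in place. Resource names here are placeholders; refer to the repository for the actual scripts.

Param(
    [string]$ResourceGroupName = "my-resource-group",
    [string]$ServerName        = "my-sql-server",
    [string]$DatabaseName      = "my-database",
    [string]$TargetTier        = "S3"   # service objective to scale up to
)

# Authenticate using the Automation account's Run As connection
$conn = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $conn.TenantId `
    -ApplicationId $conn.ApplicationId `
    -CertificateThumbprint $conn.CertificateThumbprint | Out-Null

# Scale the database to the requested service objective
Set-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName `
    -ServerName $ServerName `
    -DatabaseName $DatabaseName `
    -RequestedServiceObjectiveName $TargetTier

The Scale Down SQL runbook would be the same call with a smaller tier, and the VM runbooks would use Start-AzureRmVM and Stop-AzureRmVM in the same pattern.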

All the runbooks above can be scheduled to run for a desired length of time, on particular days of the week and time frames, or continuously with no expiry. For an enterprise non-production environment, you would want resources to scale down at the end of business hours and over the weekend.
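As a hedged example, a weekday-evening scale-down schedule could be created and linked to a runbook with the AzureRM.Automation cmdlets; all names and times below are assumptions to adjust for your environment.

# Create a weekly schedule that fires at 7 PM on weekdays
New-AzureRmAutomationSchedule `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -Name "ScaleDownWeekdays" `
    -StartTime (Get-Date "19:00").AddDays(1) `
    -WeekInterval 1 `
    -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday

# Attach the schedule to the Scale Down SQL runbook
Register-AzureRmAutomationScheduledRunbook `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -RunbookName "ScaleDownSQL" `
    -ScheduleName "ScaleDownWeekdays"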

Step 3: Create the Flow

Once the above PowerShell scripts and runbooks are tested and published, we can move on to the next step: creating the flow. Navigate to Flow and create a new flow from the template.

New Flow from the template

Select the PowerApps Button template. The first step we need to add is the automation job: when you search for "automation" you will see the list of actions available. Select Create job and pick the runbook you created above. If you want all the actions in one app, you can add them one by one; if not, you need to create a separate flow for each one.

In this example, I have created one with the ScaleUpDB job, which executes the scale-up command for the database.

Once you are done with all the steps, save the flow with a suitable name.

Step 4: Create the Power App

Once the PowerApps button flows are created, log in to Microsoft Power Apps with a work/school account. Power Apps gives you a blank canvas for mobile or tablet. You can then begin to customise the Power App with text labels, colours, and buttons as below.

PowerApps Name

In this case we will have buttons to increase/decrease the instance count of the SQL database; my app looked like the one below, with a few labels and buttons.

AutoMateApp

Next, link the flow to the button of the Power App by navigating to Actions -> Power Automate.

Linking Button Action

Once both scale-up and scale-down actions are linked, save the app and publish it.

Step 5: Verify Automation

In order to verify that things are working correctly, click scale up and scale down a few times, then navigate to the Azure Portal and open the Automation account we created.

Navigate to the overview tab to see the requests for each job made via the Power App, as below.

Jobs executed.

In order to look at the history, navigate to the Jobs blade.

Jobs Execution

Further, you can build a customised app for different environments with different screens using Power Apps. With the help of Azure Alerts, whenever you get an alert about heavy resource usage or spikes, a single click of a button lets you scale resources up or down as needed.

Improvements and things to consider:

This is just a starting point to explore this functionality further, but there are improvements you could add to make it really useful.

(i) Sometimes the Azure Automation action fails to start the runbook. The flow you implement needs to handle this condition.

(ii) Sometimes a runbook action will be successful, but the runbook execution itself errors. Consider using try/catch blocks in the PowerShell and outputting the final result as a JSON object that can be parsed and further handled by the flow.
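A minimal sketch of that try/catch-with-JSON pattern, reusing the placeholder variables from the earlier runbook sketch (the shape of the result object is my own assumption):

try {
    Set-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName `
        -ServerName $ServerName `
        -DatabaseName $DatabaseName `
        -RequestedServiceObjectiveName $TargetTier `
        -ErrorAction Stop
    $result = @{ status = "Succeeded"; resource = $DatabaseName }
}
catch {
    $result = @{ status = "Failed"; resource = $DatabaseName; error = $_.Exception.Message }
}

# Emit a JSON result that the flow can parse and branch on
Write-Output ($result | ConvertTo-Json -Compress)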

(iii) You should update the code to use the Az modules rather than AzureRM.
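For reference, the Az equivalents of the calls used in the sketches above would look roughly like this (same placeholder names as before):

# Az-module equivalent of the AzureRM authentication and scale-up calls
Connect-AzAccount -ServicePrincipal `
    -TenantId $conn.TenantId `
    -ApplicationId $conn.ApplicationId `
    -CertificateThumbprint $conn.CertificateThumbprint | Out-Null

Set-AzSqlDatabase -ResourceGroupName $ResourceGroupName `
    -ServerName $ServerName `
    -DatabaseName $DatabaseName `
    -RequestedServiceObjectiveName $TargetTier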

Note: The user who executes the Power App also needs permission to execute runbooks in the Automation Account.
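One way to grant that permission is the built-in Automation Job Operator role, scoped to the Automation account; a sketch with placeholder names:

New-AzureRmRoleAssignment `
    -SignInName "operator@contoso.com" `
    -RoleDefinitionName "Automation Job Operator" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.Automation/automationAccounts/my-automation-account"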

With this app, it becomes easy for the operations team to manage these resources without logging in to the portal. Hope it helps someone out there! Cheers.

· 8 min read

A programmer can code for days continuously without a break; I have done it when I started my career as a programmer. In the IT field, it gets worse if you work continuously without taking a 5-minute break every 30 minutes. In this blog I will explain how to build a reminder for someone (or yourself) to get up and take that mandatory break.

Prerequisites:

Sign up for Twilio:

In order to use Twilio, you need to sign up and purchase an SMS-enabled phone number. If you're a new user to Twilio, you can start with a free trial.

Sign up for Azure:

In order to deploy your Azure Function, you need an Azure subscription. You can create a FREE Azure subscription to set up your function. The free trial provides 12 months of free services.

Steps to Create the Function:

Step 1: Create the Function App

Let's start off by creating an app for our requirement. In the Azure portal, click + Create a Resource.

When the Azure Marketplace appears, click Compute in the list. In the Featured list, click Function App (note: if Function App does not appear, click See all).

Then you need to fill in the function app settings; you can follow the image to set up your function.

Step 2: Add a Function

We just completed creating the function app; now we need to add a function that alerts the user, configured with a trigger. The trigger will start the function, which sends the Twilio SMS message. We'll be using a Timer trigger for this tutorial.

In the left menu, click Resource groups and select the resource group you created in the last step.

Click on the highlighted App Service. Once the page loads, click the + button next to Functions to create a new function.

Add a Function

On the next screen, you’ll need to choose a development environment. Since we’ll be creating the function in the Azure Portal, select In-Portal and Continue.

Select In Portal

Since we want to create a Timer trigger, you’ll need to select Timer and click Create.

You should now see TimerTrigger1 listed under Functions in the left menu.

Step 3: Integrate with Twilio SMS

As the next step, we need to integrate Twilio SMS with the function app we created. Under TimerTrigger1, click Integrate.

Integrate Function

Under Outputs, click + New Output and select Twilio SMS.

When you click on that, you will get a warning saying the extensions are not installed. You'll need to install the Microsoft.Azure.WebJobs.Extensions.Twilio extension. You can do so by clicking Install. This process can take up to 20 minutes, so give it a moment to complete.

While the function extensions are being installed, you need to update the values in the relevant fields; the values can be obtained from your Twilio dashboard as shown below. Once the extension is installed, fill these values into the environment variables, which will be used within the function.

Step 4: Set Environment Variables

For a quick test, it's OK to hardcode the Twilio credentials in the output configuration. However, if you are running this app in production you should always use environment variables so that you don't expose the credentials to others. Having obtained the values from the Twilio dashboard, copy and save them.

You can create environment variables in the Azure Portal by going to the Overview tab for your function and clicking Configuration.

Add the environment variables one by one:

Key                 Value
TWILIO_SID          ACXXXXXXXXXXX
TWILIO_TOKEN        Auth token obtained from the Twilio dashboard
SENDER_NUMBER       Your Twilio number, e.g. +94 77 330 XXXXX
RECIPIENT_NUMBER    Phone number that receives the message

Environment variables

You can create the first environment variable by clicking New Application Setting, then repeat the same for the rest of the variables shown above.

New appsettings Configuration

Add the first setting TWILIO_SID

TWILIO_SID Environment Variable

Once you are done with each setting value, click OK. When you are adding SENDER_NUMBER and RECIPIENT_NUMBER, be extra careful, and make sure to use the E.164 format referenced above. After all the environment variables have been added, click Save to save the updates made to the Application Settings.

Step 5: Timer Settings

Whenever you create a timer function, Azure by default sets your function to trigger the text message every 5 minutes. You can change how frequently the timer fires by going to Integrate and updating the values in the Timer trigger.

Go to your function, select Integrate, and update the interval as you need. The Schedule field contains an NCRONTAB (CRON-style) expression with six fields: {second} {minute} {hour} {day} {month} {day-of-week}. For the purpose of testing the function, change the 5 to a 2 (turning 0 */5 * * * * into 0 */2 * * * *, i.e. every two minutes) and click Save. You can change the frequency back after you confirm that the function works properly.

CRON Expression

If you're creating the function app using Visual Studio Code, follow this sample app to create a timer function, which is even easier with Visual Studio Code.

Step 6: Modify function.json File

As we are done with all the configuration steps, we now need to update the function.json within our TimerTrigger1 function. Go back over to the function app and click TimerTrigger1. On the far-right side of the screen, click View Files.

You will see two files:

  • function.json
  • index.js

Click function.json to open the file. Since the file is currently missing the "to": "RECIPIENT_NUMBER" binding, we'll need to add it.

Now we need to add the logic to create the message and send it via Twilio to the relevant receiver.

Navigate to my GitHub repo and grab the code for this file. Replace the existing code in the function.json file with the new code you just copied from GitHub. Note that accountSidSetting and authTokenSetting take the names of the app settings created in Step 4, not the raw values:

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */2 * * * *"
    },
    {
      "type": "twilioSms",
      "name": "message",
      "accountSidSetting": "TWILIO_SID",
      "authTokenSetting": "TWILIO_TOKEN",
      "from": "SENDER_NUMBER",
      "to": "RECIPIENT_NUMBER",
      "direction": "out"
    }
  ]
}

Step 7: Add the Logic to the index.js File

When Azure creates a function, it adds default code to help set up your function. We will add the code for the Twilio SMS message to this file.

In the View Files menu, click the index.js file. You’ll want to replace the existing code in the index.js file with the code below.

// Twilio credentials come from the app settings configured in Step 4
const twiAccountSid = process.env.TWILIO_SID;
const twiAuthToken = process.env.TWILIO_TOKEN;
const client = require('twilio')(twiAccountSid, twiAuthToken);

module.exports = async function (context, myTimer) {
    const timeStamp = new Date().toISOString();
    if (myTimer.IsPastDue) {
        context.log('JavaScript is running late!');
    }
    try {
        // Await the Twilio call so the async function does not
        // complete before the message has been accepted
        await client.messages.create({
            from: process.env.SENDER_NUMBER,
            body: "Time to have coffee and take a break for 5 minutes!",
            to: process.env.RECIPIENT_NUMBER
        });
        context.log("Message sent");
        context.res = {
            body: 'Text successfully sent'
        };
        context.log('JavaScript timer trigger done!', timeStamp);
    } catch (err) {
        context.log.error("Twilio Error: " + err.message + " -- " + err.code);
        context.res = {
            status: 500,
            body: `Twilio Error Message: ${err.message}\nTwilio Error code: ${err.code}`
        };
    }
};

Step 8: Install the Dependencies (Twilio)

As you can see, the first line of the code requires the Twilio helper library. We'll need to install twilio from npm so that it's available to our function. To do so, we'll first add a new file to our function.

Add package.json file

In the View Files window, click Add. Type the file name package.json and press Enter. You will see an empty content page in the middle of the screen.

Add Package.json

Add the code below to the package.json file.

{
  "name": "doc247",
  "version": "1.0.0",
  "description": "Alert an employee with an SMS to take a break",
  "main": "index.js",
  "scripts": {
    "test": "echo \"No tests yet...\""
  },
  "author": "Sajeetharan",
  "dependencies": {
    "twilio": "^3.0.0"
  }
}

Now that we have added Twilio as a dependency in the package.json file (note that it belongs under dependencies, not devDependencies, since the function needs it at runtime), the next step is to install it on the environment itself. You can install dependencies as part of your deployment, or install them using Kudu.

Note: Make sure to stop the function app before you head over to Kudu.

Click on the Platform Features tab. Under Development Tools, click Advanced tools (Kudu). Kudu will open on its own in a new window.

Navigate to Kudu from the function app

In the top menu of the Kudu Console, click Debug Console and select CMD.

Kudu Debug Console

In the command prompt, navigate to D:\home\site\wwwroot by typing cd site\wwwroot and pressing Enter. Once you're in wwwroot, run the command npm i twilio to install the package.

Install Dependencies Twilio

You will also notice a new node_modules folder added to the file structure. Back in the Overview tab for the function app, click Start.

node_modules Folder

Step 9: Run and Test the Function

Open the function app and click TimerTrigger1. Make sure that you're in the index.js file. Click Test next to View Files (on the far right side of the screen). At the top of the index.js file, click Run.

If everything was successful, the person should receive a text message within a couple of minutes, per the two-minute schedule set earlier!

You can change the frequency for your timer by heading back to Integrate and changing the Schedule field. Be sure to read up on CRON expressions before entering a new frequency.

If you’re curious to learn more about Azure Functions, I would suggest taking this Microsoft Learn module. You can access complete source code from here. If you want to learn more on Azure visit http://azure360.info/ .Happy Coding!

· 3 min read

This week I participated in my first-ever OpenHack on DevOps, organized by Microsoft in Singapore from 26th to 29th November 2019. The hackathon focused on Azure DevOps and Azure Kubernetes Service. Participants from all over the world gathered in one place.

There were over 90 participants, comprising internal employees as well as customers, divided into teams of six with one coach each. The content was set as eight challenges. Each team's coach was a Cloud Solution Architect from Microsoft who helped and guided the team through the challenges with hints. One of the cool things about the hack was that each team could apply its own solutions in unique ways. As a team, we were expected to find our own way to solve the challenges; there was no single way, and we were free to take whatever decisions and paths we saw fit. If you are wondering about the agenda and what happened at the hack, here you go.

My Team RockStars - Announced as Happiest team among all

What I really liked about the OpenHack was that each team member could really understand the challenge and get the team's support whenever they were stuck. Before we started each challenge, one member of the team was assigned as Scrum Master and had to drive the entire team to complete the challenge. In each challenge, someone had to explain the features of whatever tools and technologies we would use, and a whiteboarding session was included before we tried to solve it. It was hands-on, rather than attending a tech talk about a specific topic. The tasks were set, the challenges were well organized, the environment was prepared, and the code was almost ready (with some changes needed) so that we could focus on learning how to use Azure DevOps to ensure zero downtime for a production-ready application. Kubernetes was chosen as the orchestration framework, and Azure Monitor was used as the monitoring service.

Microsoft OpenHack is a developer focused event where a wide variety of participants (Open) learn through hands-on experimentation (Hack) using challenges based on real world customer engagements designed to mimic the developer journey.

For every challenge, links to documentation and resources were provided to help us understand the relevant topics and areas at hand. Besides the actual work, it was a great opportunity to network and discuss broader topics with fellow team members and other participants. It was not just about solving challenges; everyone appreciated each other's work whenever we accomplished something. We were given some cool swag including stickers, a notebook, a wireless charger, and an Azure DevOps badge.

There was no real winning team in this OpenHack; all the teams that participated thoroughly enjoyed it, and it was about sharing and solving real issues. Overall, I think it was a great learning experience for all the participants, with a strong focus on getting things done. I will definitely keep an eye on such events in the future and try to join as a DevOps coach for upcoming ones. More than a hackathon, it was not just about technology but about teamwork. If you want to have the same experience, try to join any of the OpenHacks from here.

· 2 min read

One of the cool features that Microsoft Teams provides is that you can load any website as a tab on a particular channel. One handy thing I noticed is that if you want to access Azure Cloud Shell in Teams, you can add it as a separate tab and manage your resources without navigating to the browser. With a few simple steps you can access the Azure portal as a tab in Teams.

Step 1: Go to your channel on Teams and click on + (Add a tab).

It will open up a window.

Step 2: Just type "website" in the search box and you should be able to add a new website as a tab.

Step 3: Just add http://shell.azure.com as a website tab, and that's it. Now you should be able to execute all your commands without opening a separate shell in the browser.

You can do the same with most of your frequently accessed websites if you want to demonstrate something while you are on Teams. I have done it with Stack Overflow and it's really helpful, while GitHub can be added as a separate tab from the available list.

Try it out today. Even though the page refreshes when you move to another tab and come back, it is really a good feature to have as a developer. Here is a small GIF demonstrating how effectively you can navigate between those tabs.

Microsoft Teams is getting better with more features day by day!