
· 4 min read

Technology moves fast and it's hard to keep up, but you always have to find a way to stay updated, especially in the cloud world where keeping your knowledge current is practically mandatory. Social media is a great place to learn about what's happening, new innovations, and more. With so many services and features announced on Azure every day, this blog helps you unleash the power of Azure Charts, a great resource for tracking Azure updates. Azure Charts is a set of auto-rebuilt charts that keep you updated on Azure changes, news, and stats. Public updates, RSS channels, and web pages are used as data sources.

A few weeks ago I saw a tweet from Mark Russinovich, the CTO of Azure and definitely somebody you should follow if you want to learn about Azure.

As you can see, he mentioned in the tweet that if you want to see exactly what updates are being made to Azure services, you should check out this cool site that lets you see all of them as charts. The site was developed by Alexey Polkovnikov, a Cloud Solution Architect. Azure Charts is not a Microsoft service or product; it is his personal project, and quite honestly I think it is awesome. He has delivered a great way to consolidate everything going on across the different Azure update channels, covering RSS feeds, the Azure updates page, news, statistics, and other public updates, so it is an awesome place to see exactly what's going on.

HeatMap View:

It pulls updates from the portal and other locations to give us many views, including a heat map. Now there is one place where you can see everything across all major pillars and services in Azure, which is just absolutely amazing. The heat map view specifically highlights the hottest areas of Azure update items; you'll find those across the top, shown in the strongest colors, and there are many different ways to slice the data. You can also look at it by regions and data centers, see which features are being deprecated in the retirement section, and browse security, compliance, open source, other Azure services, and individual feature updates.

In addition, you can look at this based on a specific role. For example, if I am an Azure developer, then I care about what's going on with the SDKs and the specific features relevant to me.

SLA

It is very difficult to keep track of the SLA of each and every service on Azure, so this feature comes in very handy. If we click on a tile, it takes us over to the SLA page where you can see the SLA details and read up on the different service levels.

Regions

Region lookup is a pretty cool feature where we can take a particular geography and compare it to another region in more detail. Specifically, if you want to look at features that are GA in the United States geography regions, or filter by GA, preview, and future, you just select GA, hit search regions, and see all of the matching services.

Fun

Last but not least, the Quiz helps you validate your knowledge of different services with a bit of gamification through multiple-choice questions.

Thanks, Alexey, for putting this together. Consider this a shout-out for everybody to go explore the tool and give feedback; it will definitely help in some way.

· 3 min read

This week I participated in my first ever OpenHack on DevOps, organized by Microsoft in Singapore. It was a three-day event from 26th to 29th November 2019. The hackathon was focused on Azure DevOps and Azure Kubernetes Service, and participants from all over the world gathered in one place.

There were over 90 participants, comprising internal employees as well as customers. Participants were divided into teams of six members plus one coach, and the content was set as eight challenges. The coach for each team was a Cloud Solution Architect from Microsoft who helped and guided the team through the challenges with hints. One of the cool things about the hack was that each team could apply its own solutions in unique ways. As a team, we were expected to find our own way to solve the challenges; there was no single right way, and we were free to make whatever decisions and take whatever paths we deemed fit. If you are wondering about the agenda and what happened in the hack, here you go.

My team RockStars - announced as the happiest team of all

What I really liked about the OpenHack was that each team member could really understand the challenge and get the team's support whenever they were stuck. Before we started each challenge, one member of the team was assigned as Scrum Master and had to drive the entire team to complete the challenge. In each challenge, someone had to elaborate on the features of whatever tools and technologies we would use, and there was a whiteboarding session before we got in and tried to solve it. It was hands-on rather than a tech talk about a specific topic. The tasks were set, the challenges were well organized, the environment was prepared, and the code was almost ready (with some changes needed) so that we could focus on learning how to use Azure DevOps as a tool to ensure zero downtime for a production-ready application. Kubernetes was chosen as the orchestration framework, and Azure Monitor was used as the monitoring service.

Microsoft OpenHack is a developer focused event where a wide variety of participants (Open) learn through hands-on experimentation (Hack) using challenges based on real world customer engagements designed to mimic the developer journey.

For every challenge, links to documentation and resources were provided to understand the relevant topics at hand. Besides the actual work, it was a great opportunity to network and discuss broader topics with fellow team members and other participants. It was not just about solving challenges; everyone appreciated each other's work whenever we accomplished something. We were also given some cool swag, including stickers, a notebook, a wireless charger, and an Azure DevOps badge.

There was no real winning team in this OpenHack; all the teams thoroughly enjoyed it, and it was about sharing and solving real issues. Overall, I think it was a great learning experience for all the participants, with a strong focus on getting things done. I will definitely keep an eye on such events in the future and try to join as a DevOps coach for upcoming editions. More than a hackathon, it was not just about technology but about teamwork. If you want to have the same experience, try to join any of the OpenHacks from here.

· 2 min read

One of the cool features Microsoft Teams provides is that you can load any website as a tab in a particular channel. One handy thing I noticed is that if you want to access Azure Cloud Shell in Teams, you can add it as a separate tab and manage your resources without navigating to the browser. With a few simple steps you can access the Azure portal as a tab in Teams.

Step 1: Go to your channel in Teams and click on + (Add a tab).

This will open up a window.

Step 2: Type "website" in the search box and you should be able to add a new website as a tab.

Step 3: Add http://shell.azure.com as a website tab, and that's it. Now you should be able to execute all your commands without opening up a separate browser window.

You can do the same with most of your frequently accessed websites if you want to demonstrate something while you are on Teams. I have done it with Stack Overflow and it's really helpful, while GitHub can be added as a separate tab from the available list.

Try it out today. Even though the page refreshes when you move to another tab and come back, it is still a really good feature to have as a developer. Here is a small GIF demonstrating how effectively you can navigate between those tabs.

Microsoft Teams is getting better with more features day by day!

· 6 min read

The first ever GitHub Universe viewing party in Sri Lanka took place last Thursday, organized by the GitHub Campus Experts in the country. It was an event to share all the exciting news and updates on GitHub, and it was a great success. I decided to write this blog based on the session I presented on "GitHub Actions". It's amazing to see the new features GitHub has announced over the span of the last 12 months, of which GitHub Actions is the latest; it was made generally available a few days ago (November 13, 2019) to build CI/CD pipelines from GitHub itself. I was excited about this announcement, tested it with two of my projects, and I have to say I'm impressed.

As an example, in this post I will explain how to build an "emotion detection app" with Angular and deploy it to a public cloud (Azure) with GitHub Actions. Below is the simple architecture diagram of the application I demonstrated, to give an understanding of how I am going to leverage GitHub Actions to deploy my Angular application to the cloud.

Prerequisites:

Step 1: Create the Resource Group on Azure

As the first step, we need to create the App Service on Azure to deploy the Angular application. Navigate to https://portal.azure.com/ and you will be directed to the portal home page. Let's create a resource group to group the resources we will create.
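If you prefer scripting this step instead of clicking through the portal, here is a minimal sketch using the Azure SDK for JavaScript/TypeScript (assuming the @azure/arm-resources and @azure/identity packages; the subscription ID, resource group name, and region are placeholders):

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { ResourceManagementClient } from "@azure/arm-resources";

// Placeholder values: replace with your own subscription ID, group name, and region
const subscriptionId = "<your-subscription-id>";
const client = new ResourceManagementClient(new DefaultAzureCredential(), subscriptionId);

async function createResourceGroup() {
  // Creates (or updates) the resource group that will hold the App Service and Cognitive Service
  const group = await client.resourceGroups.createOrUpdate("emotion-detection-rg", {
    location: "eastus",
  });
  console.log(`Resource group ready: ${group.name}`);
}

createResourceGroup().catch(console.error);
```

The portal works just as well; the SDK route is handy if you want to recreate the environment repeatedly.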

Step 2: Create the App service to deploy the Angular app.

As the second step, create an App Service to deploy the Angular application.

Step 3: Create the Cognitive Service

Create a Cognitive Service to integrate the emotion detection part. We will use the detect API to detect the attributes in a picture.
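To get a feel for the detect API before wiring it into Angular, here is a minimal sketch of the raw call; the endpoint region and key are placeholders, and the query parameter and header mirror what the component code further below uses:

```typescript
// Minimal sketch of calling the Face detect endpoint directly (Node 18+ or browser fetch)
const endpoint =
  "https://eastus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceAttributes=emotion";

async function detectEmotion(imageUrl: string) {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Ocp-Apim-Subscription-Key": "<your-cognitive-service-key>",
    },
    // For a remote image you send its URL; a raw file upload uses application/octet-stream instead
    body: JSON.stringify({ url: imageUrl }),
  });

  // The API returns an array of detected faces; the emotion scores live under faceAttributes.emotion
  const faces = await response.json();
  console.log(faces[0]?.faceAttributes?.emotion);
}
```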

If you want to persist the data, you can create a new Cosmos DB to store the results, which I have not included here.

Step 4: Code the Angular App

You need to create a component that uploads a file, passes it to the Cognitive Service to detect the attributes, and uses *ngFor in the template to display the results.
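A minimal sketch of that component wiring might look like the following (the component and selector names are illustrative; the full makeRequest() implementation is shown further below):

```typescript
import { Component } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  selector: 'app-emotion-detector',
  template: `
    <input type="file" (change)="handleFileInput($event.target.files)" />
    <button (click)="makeRequest()">Detect emotion</button>
    <!-- *ngFor renders one entry per detected face -->
    <ul>
      <li *ngFor="let face of output">{{ face.faceAttributes.emotion | json }}</li>
    </ul>
  `
})
export class EmotionDetectorComponent {
  fileToUpload: File | null = null;
  output: any[] = [];

  constructor(private http: HttpClient) {}

  handleFileInput(files: FileList) {
    // Keep the selected file so makeRequest() can post it as application/octet-stream
    this.fileToUpload = files.item(0);
  }

  makeRequest() {
    // Calls the Cognitive Service detect endpoint; see the full implementation below
  }
}
```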

Get the keys and the URL of the Cognitive Service from the portal as follows.

You can access the whole code from here. Make sure to replace the Ocp-Apim-Subscription-Key and the URL according to the endpoint you created above.

makeRequest() {
  let data, contentType;
  // If the image is a URL (and not a data URI), send it as JSON; otherwise upload the raw file bytes
  if (typeof this.image === 'string' && !this.image.startsWith('data')) {
    data = { url: this.image };
    contentType = 'application/json';
  } else {
    data = this.fileToUpload;
    contentType = 'application/octet-stream';
  }

  // The Face API needs the content type plus the subscription key (replace the key with your own, see above)
  const httpOptions = {
    headers: new HttpHeaders({
      'Content-Type': contentType,
      'Ocp-Apim-Subscription-Key': 'eb491c17bd874d2f9d410eedde346366'
    })
  };

  // Call the detect endpoint and request only the emotion attributes
  this.http
    .post(
      'https://eastus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceId=true&returnFaceLandmarks=false&returnFaceAttributes=emotion',
      data,
      httpOptions
    )
    .subscribe(body => {
      if (body && body[0]) {
        // At least one face was detected; keep the emotion scores of the first one
        console.log(body);
        this.output = body;
        this.thing = body[0].faceAttributes.emotion;
        this.result = this.getTop();
        this.noFace = false;
      } else {
        this.noFace = true;
      }
    });
}

Step 5: Push the Code to GitHub

Push the code to your own repository on GitHub, and let's create the build and deploy pipeline via GitHub Actions. Navigate to your repository and click on Actions.

Step 6: Create the GitHub Actions Workflow

Create a new workflow by clicking on New workflow. By default you will see different templates for building the pipeline according to the application language, as shown below.

In this case, I will create my own workflow by clicking on "Set up a workflow yourself". Name the workflow angular.yaml. You can see a new file being generated in your repository as github_action_angular/.github/workflows/azure.yml.

name: Deploy to Azure
on:
  push:
    branches:
      - master
env:
  AZURE_WEBAPP_NAME: github-actions-spa
  AZURE_WEBAPP_PACKAGE_PATH: './dist/angulargithubaction'
  NODE_VERSION: '10.x'

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@master
      - name: Use Node.js ${{ env.NODE_VERSION }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ env.NODE_VERSION }}
      - name: Install dependencies
        run: npm install
      - name: Build
        run: npm run build -- --prod
      - name: 'Deploy to Azure WebApp'
        uses: azure/webapps-deploy@v1
        with:
          app-name: ${{ env.AZURE_WEBAPP_NAME }}
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: ${{ env.AZURE_WEBAPP_PACKAGE_PATH }}

The workflow is really simple. As you can see, it includes a name and a few actions, starting with when the build and deploy should run: on: push indicates that whenever there is a new commit, the code is built again. You also define the Node.js version, and the build runs on an Ubuntu runner. Then come the few regular steps you would usually do when building an Angular application, if you are familiar with Angular app development.

As an option, you can run a build-only configuration for branches other than master and keep a separate configuration (with deployment to Azure) for the master branch. So it is flexible to maintain different workflows for different branches and environments. Isn't that cool?

Step 7: Configure the Pipeline Secrets

As the next step, you need to create new secrets on the GitHub Secrets page. It's important to note the secret name, and using secrets whenever you deploy to production or development is one of the best practices. You can get the keys from the publish profile of the App Service.

Create a new secret as above with the values taken from the publish profile.

  We have to configure the values in angular.yaml as follows:

  • app-name: the application name in Azure
  • publish-profile: the name of the secret from GitHub
  • package: the path to the directory we would like to deploy (in the above example: ./dist/yourSPAApp)

And that's it. Really clear and simple! You can check whether the deployment was successful by navigating to Kudu.

And you can see the application working successfully on Azure. As a next step, you can include unit tests to run as part of the build; a spec is sketched below. Using the Angular CLI and GitHub Actions, it has become very easy to create and test frontend web apps. Check out the fully working demo repo below, as well as the current build status for the demo!
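For example, a minimal spec like the sketch below (the component name is assumed from the earlier sketch, not taken from the demo repo) could be run in the workflow by adding a test step such as npm test -- --watch=false --browsers=ChromeHeadless before the deploy step:

```typescript
import { TestBed } from '@angular/core/testing';
import { HttpClientTestingModule } from '@angular/common/http/testing';
import { EmotionDetectorComponent } from './emotion-detector.component';

describe('EmotionDetectorComponent', () => {
  beforeEach(() => {
    // Provide a fake HttpClient so the spec never calls the real Face API
    TestBed.configureTestingModule({
      imports: [HttpClientTestingModule],
      declarations: [EmotionDetectorComponent],
    });
  });

  it('should create the component', () => {
    const fixture = TestBed.createComponent(EmotionDetectorComponent);
    expect(fixture.componentInstance).toBeTruthy();
  });
});
```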

Start using GitHub Actions and deploy your app within a few seconds. You can use GitHub Actions to deploy any application to any cloud, as I've explained above.

You can access the Session Slides from here and the repository from here.

· One min read

One of the interesting queries I got from a colleague was how to get rid of the metadata properties when retrieving documents from Cosmos DB. It seemed like a very reasonable expectation to have the option, with the document GET API call, of retrieving exactly what was created with the document POST API call, without these Cosmos DB metadata properties mixed in:

"_rid":"ehszALxRRgACAAAAAAAAAA==", "_ts":1408340640, "_self":"dbs\/ehszAA==\/colls\/ehszALxRRgAALxRRgACAAAAAAAAAA==\/", "_etag":"00002500-0000-0000-0000-53f192a00000", "_attachments":"attachments\/"

As of now there is no direct way to omit these properties when querying documents. However, the Cosmos DB team is aware of this feature request, understands the reasons for it, and is considering it for a future release.

For those wondering how to omit these system-generated properties, you can simply handle it with a User Defined Function (UDF).

function stripMeta(doc) {
  // Copy every property except the Cosmos DB system-generated ones
  var metaProps = ["_rid", "_ts", "_self", "_etag", "_attachments"];
  var newDoc = {};
  for (var prop in doc) {
    if (metaProps.indexOf(prop) == -1) {
      newDoc[prop] = doc[prop];
    }
  }

  return newDoc;
}

Then you can retrieve your documents with whatever query you like, as follows:

select value udf.stripMeta(c) from c
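If you are querying from application code, a minimal sketch with the JavaScript/TypeScript SDK (@azure/cosmos) could look like this; the endpoint, key, database, and container names are placeholders, and the UDF body is the same stripMeta function shown above:

```typescript
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient({ endpoint: "<your-endpoint>", key: "<your-key>" });
const container = client.database("<database-id>").container("<container-id>");

async function queryWithoutMeta() {
  // Register the stripMeta UDF shown above (this only needs to be done once per container)
  await container.scripts.userDefinedFunctions.create({
    id: "stripMeta",
    body: `function stripMeta(doc) {
      var metaProps = ["_rid", "_ts", "_self", "_etag", "_attachments"];
      var newDoc = {};
      for (var prop in doc) {
        if (metaProps.indexOf(prop) == -1) {
          newDoc[prop] = doc[prop];
        }
      }
      return newDoc;
    }`
  });

  // SELECT VALUE udf.stripMeta(c) returns the documents without the system properties
  const { resources } = await container.items
    .query("SELECT VALUE udf.stripMeta(c) FROM c")
    .fetchAll();
  console.log(resources);
}

queryWithoutMeta().catch(console.error);
```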

Hope this helps someone out there.