
· 3 min read

Overview:

In November 2021, the Cosmos DB team announced support for patching documents with the SQL API, which was one of the top-requested features on UserVoice. This is a useful and long-awaited capability: prior to this announcement, the only way to change a stored document was to replace it completely. Partial updates can now be performed with all of the Cosmos DB SDKs (Java, .NET, JS) as well as the Cosmos DB REST API.

Use Case:

Let's look at a dashboard application where the user needs to display a system and its features, and has decided to use Cosmos DB as the database to store that data. A sample document looks as follows.

{
  "ApplicationId": "App-01",
  "ApplicationName": "New Cosmos Client",
  "Description": "This shows the users and interactions",
  "IsDraft": 0,
  "IsSaved": 1,
  "Features": [
    {
      "SystemID": 1,
      "Features": [
        {
          "FeatureID": 4,
          "FeatureName": "Patch",
          "IsActive": false
        },
        {
          "FeatureID": 35,
          "FeatureName": "Feature-a-2",
          "IsActive": false
        },
        {
          "FeatureID": 36,
          "FeatureName": "test-feature-a-3",
          "IsActive": true
        }
      ]
    }
  ],
  "id": "af3dfd7a-b7e9-4467-a2ca-9c6be7ad5b9f",
  "_rid": "S1xQAKKOTBABAAAAAAAAAA==",
  "_self": "dbs/S1xQAA==/colls/S1xQAKKOTBA=/docs/S1xQAKKOTBABAAAAAAAAAA==/",
  "_etag": "\"31003582-0000-0700-0000-6213cb6b0000\"",
  "_attachments": "attachments/",
  "_ts": 1645464427
}

Let's look at some scenarios where the user needs to perform patch operations.

Patch Scenario 1: Add a new record (JSON object) under the Features parent array

JSON object to Add:

{
  "SystemID": 2,
  "Features": [
    {
      "FeatureID": 1,
      "FeatureName": "Reports",
      "IsActive": false
    }
  ]
}

You can refer to my previous blog on how to set up REST operations with Postman. Once you've configured the setup with the underlying database and container, we need to insert a new object within the Features array, so we can leverage the single Add operation. The sample request body will look as follows.

Patch URL: https://account/dbs/dbName/colls/Patch/docs/{id of the document}, with the header "x-ms-documentdb-partitionkey" set to the partition key value, which is the "ApplicationId" value "App-01". We also need to specify the path as "/Features/-".

Add new JSON object under Features Array
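Based on the single add operation (the same operation that appears in the combined request in Scenario 4 below), the request body looks like this; the "/Features/-" path appends the object to the end of the Features array:

{
  "operations": [
    {
      "op": "add",
      "path": "/Features/-",
      "value": {
        "SystemID": 2,
        "Features": [
          {
            "FeatureID": 1,
            "FeatureName": "Reports",
            "IsActive": false
          }
        ]
      }
    }
  ]
}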

On a successful response, you will see in the backend that the new object shown above has been added to the Features array.

Patch Scenario 2: Update the Application Description in the root object to "Testing Patch with REST API"
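Using the same Patch URL and partition key header, the request body for this scenario is a single set operation (the same operation that appears in the combined request in Scenario 4 below):

{
  "operations": [
    {
      "op": "set",
      "path": "/Description",
      "value": "Testing Patch with REST API"
    }
  ]
}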

Patch Scenario 3: Update only a specific property, such as FeatureName, of a Feature object within the array of a given Features parent object. Let's assume we need to update the parent object at the 0th index and, within it, the first Features object.
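This translates to a replace operation with an index-based path (again, the same operation used in the combined request in Scenario 4 below):

{
  "operations": [
    {
      "op": "replace",
      "path": "/Features/0/Features/0/FeatureName",
      "value": "Patch Testing"
    }
  ]
}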

Patch Scenario 4: Perform all the above operations as a single request

Since partial document update supports up to 10 operations in a single request, all the above operations can be merged. The user needs to combine all the operations into a single request body, as below:

{
  "operations": [
    {
      "op": "replace",
      "path": "/Features/0/Features/0/FeatureName",
      "value": "Patch Testing"
    },
    {
      "op": "set",
      "path": "/Description",
      "value": "Testing Patch with REST API"
    },
    {
      "op": "add",
      "path": "/Features/-",
      "value": {
        "SystemID": 2,
        "Features": [
          {
            "FeatureID": 1,
            "FeatureName": "Reports",
            "IsActive": false
          }
        ]
      }
    }
  ]
}

Hope these samples help you perform different operations at different paths using the Cosmos DB REST API. Cheers!

· 6 min read

The first-ever GitHub Universe viewing party in Sri Lanka took place last Thursday, organized by the GitHub Campus Experts in the country. It was an event to share all the exciting news and updates on GitHub, and it was a great success. I decided to write this blog based on the session I presented on “GitHub Actions”. It’s amazing to see the new features GitHub has announced over the span of the last 12 months, of which GitHub Actions is the latest: it was made generally available a few days ago (November 13, 2019) to build CI/CD pipelines from GitHub itself. I was excited about this announcement, tested it with two of my projects, and I have to say I’m impressed.

As an example, in this post I will explain how to build an emotion detection app with Angular and deploy it to a public cloud (Azure) with GitHub Actions. Below is a simple architecture diagram to give an understanding of how I am going to leverage GitHub Actions to deploy my Angular application to the cloud.

Prerequisites:

Step 1: Create the Resource Group on Azure

As the first step, we need to create an App Service on Azure to deploy the Angular application. Navigate to https://portal.azure.com/ and you will be directed to the portal home page. Let’s create a resource group to group the resources we create.

Step 2: Create the App service to deploy the Angular app.

As the second step, create an App Service to deploy the Angular application.

Step 3: Create the Cognitive Service

Create a Cognitive Services resource to integrate the emotion detection part. We will use the Face detect API to detect the attributes in a picture.

If you want to store the data, you can create a new Cosmos DB account to store the results, which I have not included here.

Step 4: Code the Angular App

You need to create a component to upload a file, pass the file to the Cognitive Service to detect the attributes, and use ngFor in the template to display the results.

Get the keys and the URL of the Cognitive Service from the portal as follows.

You can access the whole code from here. Make sure to replace the Ocp-Apim-Subscription-Key and the URL according to the endpoint you created above.

makeRequest() {
  // Send either an image URL (JSON) or the raw uploaded file (binary) to the Face API
  let data, contentType;
  if (typeof this.image === 'string' && !this.image.startsWith('data')) {
    data = { url: this.image };
    contentType = 'application/json';
  } else {
    data = this.fileToUpload;
    contentType = 'application/octet-stream';
  }

  const httpOptions = {
    headers: new HttpHeaders({
      'Content-Type': contentType,
      'Ocp-Apim-Subscription-Key': 'eb491c17bd874d2f9d410eedde346366'
    })
  };

  // Call the Face detect endpoint, requesting only the emotion attributes
  this.http
    .post(
      'https://eastus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceId=true&returnFaceLandmarks=false&returnFaceAttributes=emotion',
      data,
      httpOptions
    )
    .subscribe(body => {
      if (body && body[0]) {
        // At least one face was detected; keep the emotion scores of the first face
        console.log(body);
        this.output = body;
        this.thing = body[0].faceAttributes.emotion;
        this.result = this.getTop();
        this.noFace = false;
      } else {
        this.noFace = true;
      }
    });
}
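As mentioned in Step 4, the template can then use ngFor to display the results. Here is a minimal sketch of such a template, assuming the component fields (thing, result, noFace) used in makeRequest above and Angular's built-in keyvalue pipe (available from Angular 6.1):

<div *ngIf="!noFace">
  <p>Top emotion: {{ result }}</p>
  <ul>
    <!-- Iterate over the emotion scores returned by the Face API -->
    <li *ngFor="let emotion of thing | keyvalue">
      {{ emotion.key }}: {{ emotion.value }}
    </li>
  </ul>
</div>
<p *ngIf="noFace">No face was detected in the uploaded image.</p>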

Step 5: Push the Code to Github

Push the code to your own repository on GitHub, and let’s create the build and deploy pipeline via GitHub Actions. Navigate to your repository and click on Actions.

Step 6: Create Github Action with Workflow

Create a new workflow by clicking on New workflow. By default you will see different templates for building the pipeline according to the application's language, as below.

In this case, I will create my own workflow by clicking on "Set up a workflow yourself". Name the workflow angular.yaml. You will see a new file generated in your repository at github_action_angular/.github/workflows/azure.yml.

name: Deploy to Azure
on:
  push:
    branches:
      - master
env:
  AZURE_WEBAPP_NAME: github-actions-spa
  AZURE_WEBAPP_PACKAGE_PATH: './dist/angulargithubaction'
  NODE_VERSION: '10.x'

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@master
      - name: Use Node.js ${{ env.NODE_VERSION }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ env.NODE_VERSION }}
      - name: Install dependencies
        run: npm install
      - name: Build
        run: npm run build -- --prod
      - name: 'Deploy to Azure WebApp'
        uses: azure/webapps-deploy@v1
        with:
          app-name: ${{ env.AZURE_WEBAPP_NAME }}
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: ${{ env.AZURE_WEBAPP_PACKAGE_PATH }}

The workflow is really simple. As you can see, it includes a name and a few steps, starting with when the build and deploy should run. on: push indicates that whenever there is a new commit, the code needs to be built again. You also define the Node.js version, and the build runs on an Ubuntu runner. The remaining steps are the regular ones we usually perform when building an Angular application.

As an option, you can run one configuration only for branches other than master and keep a separate configuration (with deployment to Azure) for the master branch, as sketched below. So it is flexible enough to maintain different workflows for different branches/environments. Isn't that cool?
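For example, a non-master workflow could use GitHub Actions' branch filters so it is skipped for master (a minimal sketch of just the trigger section):

on:
  push:
    branches-ignore:
      - master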

Step 7: Configure the Pipeline Secrets

As the next step, you need to create new secrets on the GitHub Secrets page. It's important to note the secret name, and using secrets whenever you deploy to production or development is one of the best practices. You can get the keys from the publish profile of the App Service.

Create a new secret as above, with the values taken from the publish profile.

  We have to configure the values in angular.yaml as follows:

  • app-name — the application name in Azure
  • publish-profile — the name of the secret from GitHub
  • package — the path to the directory we would like to deploy (in the above example: ./dist/angulargithubaction)

And that’s it. Really clear and simple! You can check whether the deployment has been successful by navigating to Kudu.

And you can see the application working successfully on Azure. As a next step, you can include unit tests to run during the build. Using the Angular CLI and GitHub Actions, it has become very easy to create and test frontend web apps. Check out the fully working demo repo below, as well as the current build status for the demo!

Start using GitHub Actions and deploy your app within a few seconds. You can use GitHub Actions to deploy any application to any cloud, as I've explained above.

You can access the Session Slides from here and the repository from here.

· One min read

One of the interesting queries I got from my colleague is how to get rid of the metadata properties when retrieving documents from Cosmos DB. It seemed like a very reasonable expectation to have the option, with the document "GET" API call, to retrieve exactly what he created using the document "POST" API call, without these Cosmos DB metadata properties mixed in:

"_rid":"ehszALxRRgACAAAAAAAAAA==", "_ts":1408340640, "_self":"dbs\/ehszAA==\/colls\/ehszALxRRgAALxRRgACAAAAAAAAAA==\/", "_etag":"00002500-0000-0000-0000-53f192a00000", "_attachments":"attachments\/"

As of now, there is no direct way to omit these properties when you query the documents. However, the Cosmos DB team is aware of this feature request, understands the reasons for it, and is considering it for a future release.

For those who are wondering how to omit these system-generated properties, you can simply handle this with a user-defined function (UDF).

// UDF that copies every property except the Cosmos DB system-generated ones
function stripMeta(doc) {
  var metaProps = ["_rid", "_ts", "_self", "_etag", "_attachments"];
  var newDoc = {};
  for (var prop in doc) {
    if (metaProps.indexOf(prop) == -1) {
      newDoc[prop] = doc[prop];
    }
  }

  return newDoc;
}

And you can retrieve your documents with whatever query you need, as follows:

select value udf.stripMeta(c) from c
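If you are working from the JavaScript/TypeScript SDK rather than the portal, a minimal sketch for registering this UDF and querying through it could look like the following (assuming @azure/cosmos v3; the endpoint, key, and database/container names are placeholders):

import { CosmosClient } from "@azure/cosmos";

async function main() {
  // Placeholder connection details
  const client = new CosmosClient({ endpoint: "https://<account>.documents.azure.com:443/", key: "<key>" });
  const container = client.database("<database>").container("<container>");

  // Register the stripMeta UDF on the container
  await container.scripts.userDefinedFunctions.create({
    id: "stripMeta",
    body: `function stripMeta(doc) {
      var metaProps = ["_rid", "_ts", "_self", "_etag", "_attachments"];
      var newDoc = {};
      for (var prop in doc) {
        if (metaProps.indexOf(prop) == -1) { newDoc[prop] = doc[prop]; }
      }
      return newDoc;
    }`
  });

  // Query through the UDF so the metadata properties are stripped from the results
  const { resources } = await container.items
    .query("SELECT VALUE udf.stripMeta(c) FROM c")
    .fetchAll();
  console.log(resources);
}

main().catch(console.error);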

Hope this helps someone out there.