
· 3 min read

Overview:

In November 2021, the Cosmos DB team announced support for patching documents with the SQL API, which had been one of the top requested features on UserVoice. This is a useful and long-awaited feature among users. Prior to this announcement, the only way to change a stored document was to replace it completely. Partial updates can be performed with all the Cosmos DB SDKs (Java, .NET, JS) as well as the Cosmos DB REST API.

Use Case:

Let's look at the scenario of a dashboard application where a user needs to display systems and their features. The user has decided to use Cosmos DB as the database to store the data for each system and its features. A sample document looks as follows.

{
  "ApplicationId": "App-01",
  "ApplicationName": "New Cosmos Client",
  "Description": "This shows the users and interactions",
  "IsDraft": 0,
  "IsSaved": 1,
  "Features": [
    {
      "SystemID": 1,
      "Features": [
        {
          "FeatureID": 4,
          "FeatureName": "Patch",
          "IsActive": false
        },
        {
          "FeatureID": 35,
          "FeatureName": "Feature-a-2",
          "IsActive": false
        },
        {
          "FeatureID": 36,
          "FeatureName": "test-feature-a-3",
          "IsActive": true
        }
      ]
    }
  ],
  "id": "af3dfd7a-b7e9-4467-a2ca-9c6be7ad5b9f",
  "_rid": "S1xQAKKOTBABAAAAAAAAAA==",
  "_self": "dbs/S1xQAA==/colls/S1xQAKKOTBA=/docs/S1xQAKKOTBABAAAAAAAAAA==/",
  "_etag": "\"31003582-0000-0700-0000-6213cb6b0000\"",
  "_attachments": "attachments/",
  "_ts": 1645464427
}

Let's look at some scenarios where a user needs to perform patch operations.

Patch Scenario 1: Add a new record (JSON object) under the Features parent array

JSON object to Add:

{
  "SystemID": 2,
  "Features": [
    {
      "FeatureID": 1,
      "FeatureName": "Reports",
      "IsActive": false
    }
  ]
}

You can refer to my previous blog on how to set up REST operations with Postman, and configure the setup with the underlying database and container. Here we need to insert a new object within the Features array, for which we can leverage the single Add operation.

Patch URL: https://account/dbs/dbName/colls/Patch/docs/{id of the document}, with the header "x-ms-documentdb-partitionkey" set to the partition key value, which is the "ApplicationId" value "App-01". We also need to specify the path as "/Features/-".
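
A minimal request body for this add operation would be the following (the same operation appears again in the combined request of Scenario 4; the trailing "-" in the path appends the object to the end of the array):

{
  "operations": [
    {
      "op": "add",
      "path": "/Features/-",
      "value": {
        "SystemID": 2,
        "Features": [
          {
            "FeatureID": 1,
            "FeatureName": "Reports",
            "IsActive": false
          }
        ]
      }
    }
  ]
}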

Add new JSON object under Features Array

On a successful response, you will see in the backend that the new object shown above has been added to the Features array.

Patch Scenario 2: Update the application Description in the root object to "Testing Patch with REST API"
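
This scenario uses the set operation on the /Description path; a minimal request body (the same operation reappears in the combined request of Scenario 4) would be:

{
  "operations": [
    {
      "op": "set",
      "path": "/Description",
      "value": "Testing Patch with REST API"
    }
  ]
}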

Patch Scenario 3: Update only a specific property of a feature object in the array for a given Features parent object, such as FeatureName. Let's assume we need to update the object at the 0th index of the outer array, and within it the first Features object.
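
This maps to the replace operation with an index-based path (mirroring the replace operation in the combined request of Scenario 4):

{
  "operations": [
    {
      "op": "replace",
      "path": "/Features/0/Features/0/FeatureName",
      "value": "Patch Testing"
    }
  ]
}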

Patch Scenario 4: Perform all the above operations as a single request

Since partial document update supports up to 10 operations in a single request, all the above operations can be merged. Combine all the operations into one request as below:

{
  "operations": [
    {
      "op": "replace",
      "path": "/Features/0/Features/0/FeatureName",
      "value": "Patch Testing"
    },
    {
      "op": "set",
      "path": "/Description",
      "value": "Testing Patch with REST API"
    },
    {
      "op": "add",
      "path": "/Features/-",
      "value": {
        "SystemID": 2,
        "Features": [
          {
            "FeatureID": 1,
            "FeatureName": "Reports",
            "IsActive": false
          }
        ]
      }
    }
  ]
}

Hope these samples help you perform different operations at different paths using the Cosmos DB REST API. Cheers!

· 3 min read

Overview:

As a developer, suppose you have a requirement to display a list of approved work items in a grid view. You would tend to query for all approved work items and display them in the grid. One point to note is that, since the number of approved work items might grow over time, loading the data would get slower and slower due to having to download more items. In general, this is where pagination comes into play in front-end development: you don't necessarily need to download all records, just a page of records at a time. If you are using Cosmos DB as a backend, this feature is supported out of the box via continuation tokens. The Cosmos DB SDKs utilize a continuation strategy when managing the results returned from queries.

Pagination with the @azure/cosmos SDK:

One of the frequent questions I have come across on forums is how to implement pagination with the @azure/cosmos SDK in JavaScript. With this post, I wanted to keep it simple and add a very minimal quick start on how to implement pagination with the JS SDK. You can follow the steps given in the repository.

Prerequisites:

  • Create a Cosmos DB account of type SQL API, with a database named 'javascript' and a collection named 'products'
  • Insert the data 'ProductsData.json' into Cosmos DB using the Data Migration Tool

execute.ts is the starting point of the application; its callPagination function invokes the executeSample method defined in pagination.service.ts.

import dotenv from "dotenv";
import { Helper } from './helper';
import PaginationService from './pagination.service';

dotenv.config();

const cosmosDB = new Helper();
const pageMod = new PaginationService(cosmosDB);

const callPagination = async () => {
  const result = await pageMod.executeSample(2, "");
  console.log({
    data: result.result,
    dataLength: result.result.length,
    hasMoreResult: result.hasMoreResults,
    contToken: result.continuationToken
  });
};

callPagination();

The implementation is really simple: we pass a simple query that matches more than 40 records and set a pageLimit, which is the maximum number of items to be returned in a single call.

public async executeSample(
  pageLimit: number = 5,
  contToken: string = ""
): Promise<{
  result: Array<any>;
  hasMoreResults: boolean;
  continuationToken: string;
}> {
  const sqlQuery =
    `SELECT * from products where products.CategoryName = "Bikes, Road Bikes"`;
  console.log({ sqlQuery });

  const { result, hasMoreResults, continuationToken } =
    await this.cosmosHelper.queryContainerNext(
      {
        query: sqlQuery
      },
      {
        maxItemCount: pageLimit,
        continuationToken: contToken
      },
      process.env.CONTAINER_NAME
    );

  return {
    result: result || [],
    hasMoreResults: hasMoreResults || false,
    continuationToken: continuationToken || "",
  };
}
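
To fetch the next page, pass the continuationToken returned by one call into the next call. A minimal sketch, reusing the pageMod instance from execute.ts:

const fetchNextPage = async () => {
  // First page: no continuation token yet.
  const firstPage = await pageMod.executeSample(5, "");
  console.log("page 1 items:", firstPage.result.length);

  // Second page: only requested when the SDK reports more results,
  // passing back the token returned by the first call.
  if (firstPage.hasMoreResults) {
    const secondPage = await pageMod.executeSample(5, firstPage.continuationToken);
    console.log("page 2 items:", secondPage.result.length);
  }
};

fetchNextPage();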

You can access the whole sample from this link.

Steps to Run the application

  • Clone the Repository
  • Run npm install
  • Replace the environment variables COSMOS_ENDPOINT, COSMOS_KEY, COSMOS_DATABASE_NAME, and CONTAINER_NAME in the .env file
  • Run npm start to run the application

Hope this helps someone who wants to implement pagination with JavaScript. Cheers!

· 2 min read

Hello folks!

Azure Data Lake is a big data solution which allows organizations to ingest multiple data sets covering structured, unstructured, and semi-structured data into an infinitely scalable data lake enabling storage, processing, and analytics. It enables users to build their own customized analytical platform to fit any analytical requirements in terms of volume, speed, and quality.

Cloud Scale Analytics with Azure Data Services: Build modern data warehouses on Microsoft Azure

In this context, the book "Cloud Scale Analytics with Azure Data Services" is your guide to learning all the features and capabilities of Azure data services for storing, processing, and analyzing data (structured, unstructured, and semi-structured) of any size. You will explore key techniques for ingesting and storing data and perform batch, streaming, and interactive analytics.

The book also shows you how to overcome various challenges and complexities relating to productivity and scaling. Next, you will be able to develop and run massive data workloads to perform different actions. Using a cloud-based big data and modern data warehouse analytics setup, you will also be able to build secure, scalable data estates for enterprises. Finally, you will not only learn how to develop a data warehouse but also understand how to create enterprise-grade security and auditing for big data programs.

By the end of this Azure book, you will have learned how to develop a powerful and efficient analytical platform to meet enterprise needs.

WIN YOUR FREE COPY

Enter your details and click Submit to enter our monthly prize draw, for a chance to receive one of the 10 free copies we are giving away.

· 8 min read

Overview:

In this blog post you will learn about Azure Cosmos DB SQL API queries and how to get started with the Cosmos DB SQL API. I recently published a video on YouTube and decided to make it available as a blog post as well. Azure Cosmos DB is a fully managed, multi-model NoSQL database service provided by Azure that is highly available, globally distributed, and responds with single-digit-millisecond latency. It's becoming the preferred database for developers on Azure to build modern applications.

You can access the slides here and the repository for the queries here.

Azure supports multiple data models, including documents, key-value, graph, and column-family, with multi-model APIs such as SQL, MongoDB, Cassandra, Gremlin, and Table. The SQL API is the oldest of these offerings on Cosmos DB. It is also known as the Core API, which means any new feature rolled out to Cosmos DB is usually available first in SQL API accounts. It supports querying items using Structured Query Language (SQL) syntax, which provides a way to query JSON objects.

Cosmos DB SQL API queries can also be issued using any of the SDKs provided for .NET, Java, Node, and Python.

Azure Cosmos DB SQL API

Azure Cosmos DB is truly schema-free. Whenever you store data, it provides automatic indexing of JSON documents without requiring explicit schema or creation of secondary indexes.

The Azure Cosmos DB database account is a unique namespace that gives you access to Azure Cosmos DB.

A database account consists of a set of databases, each containing multiple collections, each of which can contain stored procedures, triggers, UDFs, documents, and related attachments.

Cosmos DB SQL API Account Structure

Ways to Manage Cosmos DB Documents:

With the Cosmos DB SQL API, you can create documents using a variety of different tools:

Portal: The Data Explorer is a tool embedded within the Azure Cosmos DB blade in the Azure portal that allows you to view, modify, and add documents to your Cosmos DB SQL API collections. Within the explorer, you can upload one or more JSON documents directly into a specific database or collection, which I will show in a bit.

SDK: The DocumentDB service that was released prior to Azure Cosmos DB featured a variety of SDKs available across many languages.

REST API: As mentioned previously, JSON documents stored in the SQL API are managed through a well-defined hierarchy of database resources, each addressable using a unique URI. Since each resource has a unique URI, many of the concepts of RESTful API design apply to SQL API resources.

Data Migration Tool: The open-source Cosmos DB Data Migration Tool allows you to import data into a SQL API collection from various sources, including MongoDB, SQL Server, Table Storage, Amazon DynamoDB, HBase, and other Cosmos DB collections.

Prerequisites:

For this overview demo, I will be using a dataset named Tweets, which contains tweets by users across the world on certain tags such as #Azure and #CosmosDB.

To replicate the demo, you can use the emulator, which you can download from here: Cosmos DB Emulator. Or you can create a Cosmos DB free tier account, which is handy for developers. Azure Cosmos DB free tier makes it easy to get started, develop, test your applications, or even run small production workloads for free. When free tier is enabled on an account, you'll get the first 1000 RU/s and 25 GB of storage in the account for free.

In this demo, let me migrate this sample dataset of tweets, which contains 1000 recent tweets from users who tweeted about different technologies. I have uploaded the dataset to my GitHub account, and we will use it to understand the different queries.

  • Create a Cosmos DB account of type SQL API and a database/collection named Tweets
  • Insert the data inside the Tweets folder into Cosmos DB using the Data Migration Tool

SQL API Queries:

Once you have created the Cosmos DB account on Azure, navigate to Settings -> Keys, copy the endpoint URI and key for the Cosmos DB account, and replace the values in Program.cs (this is not recommended for production use; use Key Vault instead).

Obtain Keys and Endpoint from Azure portal

// The Azure Cosmos DB endpoint for running this sample.
private static readonly string EndpointUri = "https://sajee-cosmos-notebooks.documents.azure.com:443/";

// The primary key for the Azure Cosmos account.
private static readonly string PrimaryKey = "==";

Before diving into the queries, let me explain one of the most important things about querying Cosmos DB. In Cosmos DB SQL API accounts, there are two ways to read data.

Point reads – a key/value lookup on a single item ID and partition key. Point reads usually cost 1 RU, with latency under 10 milliseconds.

SQL queries – SQL queries in general consume more RUs than point reads. So if you need a single item, point reads are cheaper and faster.

As developers, we tend to execute select * from table, which might cost you more RUs since you are reading the whole dataset across multiple partitions.

Query vs Point Read
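
Here is a minimal sketch of a point read using the @azure/cosmos JS SDK; the database/container names and the id/partition key values below are placeholders for illustration, not values from this demo:

import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient({
  endpoint: process.env.COSMOS_ENDPOINT!, // your account endpoint
  key: process.env.COSMOS_KEY!            // your account key
});
const container = client.database("Tweets").container("Tweets");

// Point read: a key/value lookup by item id and partition key value,
// typically ~1 RU, instead of a SQL query that scans partitions.
const pointRead = async () => {
  const { resource, requestCharge } = await container
    .item("some-item-id", "some-partition-key-value")
    .read();
  console.log({ resource, requestCharge });
};

pointRead();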

Let's dive into some of the queries with SQL API

Query 1: Select All

To retrieve all items from the container, in this case all the tweets, type this query into the query editor:

SELECT *
FROM tweets

...and click on Execute Query!

Cosmos DB SQL API queries

A few things to note here: it retrieves the first 100 items from the container, and if you need to retrieve the next 100, you can click on Load more; under the hood it uses the pagination mechanism. Another handy thing here is that you can navigate to Query Stats and see more details on the query, such as RUs, document size, execution time, etc.

Query Metrics

Query 2: Filter Path

When referring to fields, you must use the alias you define in the FROM clause. We have to provide the full "path" to the properties of the objects within the container. For example, if you need to get the RetweetCount from the container for all the items:

SELECT tweets.RetweetCount
FROM tweets

Query 3: Join

Let's see how we can find the hashtags that have been used in all the tweets. We can use the JOIN keyword to join to our hashtags array in each tweet. We can also give it an alias and inspect its properties.

Let's see the JOIN in action. Try this query:

SELECT hashtags
FROM tweets
JOIN hashtags IN tweets.Hashtags

Now that we know how to join to our child array, we can use it for filtering. Let's find all other hashtags that have been used along with the known hashtags (#Azure, #CosmosDB):

SELECT hashtags
FROM tweets
JOIN hashtags IN tweets.Hashtags
WHERE hashtags.text NOT IN ("CosmosDB", "Azure")

Similarly, we can use OR, IN, DISTINCT, GROUP BY, etc.

Query 4: KEYWORDS (IN, OR, etc.)

To return the entire tweet where the indices of the hashtag are between 11 and 18, simply select the tweets rather than the indices:

SELECT tweets
FROM tweets
JOIN hashtags IN tweets.Hashtags
JOIN indices IN hashtags.indices
WHERE indices BETWEEN 11 AND 18

Query 5: ORDER BY and TOP

We can also order our query so that we can find the most recent tweet(s) and retrieve the top 5 tweets (use ASC for ascending and DESC for descending):

SELECT TOP 5 tweets
FROM tweets
JOIN hashtags IN tweets.Hashtags
JOIN indices IN hashtags.indices
WHERE indices BETWEEN 21 AND 28
ORDER BY tweets

Query 6: PROJECTION

We can use a feature called Projection to create an entirely new result set. We could use this to create a common structure or to make it match a structure we already have.

Try this query:

SELECT tweets.CreatedBy.Name AS Name,
       tweets.FullText AS Text,
       tweets.CreatedAt AS CreatedTime,
       tweets.TweetDTO.metadata.iso_language_code AS LanguageCode
FROM tweets

Query 7: USER DEFINED FUNCTION

The SQL API supports JavaScript user-defined functions (UDFs); there is one that you can use on this server called displayDate, which removes the time part of a UTC date string.

This is the function:

function displayDate(inputDate) {
    return inputDate.split('T')[0];
}
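
If the UDF is not already on your server, you can register it yourself. A minimal sketch with the @azure/cosmos JS SDK, assuming the same Tweets database/container as above:

import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient({
  endpoint: process.env.COSMOS_ENDPOINT!,
  key: process.env.COSMOS_KEY!
});
const container = client.database("Tweets").container("Tweets");

// Register the displayDate UDF so queries can call udf.displayDate(...).
const registerUdf = async () => {
  await container.scripts.userDefinedFunctions.create({
    id: "displayDate",
    body: "function displayDate(inputDate) { return inputDate.split('T')[0]; }"
  });
};

registerUdf();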

Let's have a go at using it:

SELECT tweets.CreatedAt,
       udf.displayDate(tweets.CreatedAt) AS FormattedDate
FROM tweets

The SQL API also supports stored procedures written in JavaScript, which enable you to perform ACID transactions over multiple records. This allows scalable and almost unlimited expandability of the functionality Azure Cosmos DB can offer.
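
As a flavour of what that looks like, here is a minimal stored procedure sketch (an assumed example, not one from this demo): it creates two documents in one server-side call, and because stored procedures run in a transaction, an error in either create rolls back both.

function createTwoDocs(doc1, doc2) {
    // getContext() exposes the server-side API available inside stored procedures.
    var collection = getContext().getCollection();
    var selfLink = collection.getSelfLink();

    var accepted = collection.createDocument(selfLink, doc1, function (err) {
        if (err) throw err;
        // Runs only after the first create succeeds; throwing here aborts
        // the transaction and rolls back the first create as well.
        collection.createDocument(selfLink, doc2, function (err2) {
            if (err2) throw err2;
        });
    });
    if (!accepted) throw new Error("createDocument request was not accepted");
}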

These are some of the basic queries to get started with the Cosmos DB SQL API. If you want to get to know more about the SQL API, the following references would be useful.

Hope this post helps you get started with the Cosmos DB SQL API! Cheers!

· 5 min read

Microsoft introduced App Service Static Web Apps in preview at Build 2020 as "Azure Static Web Apps", a service which allows web developers to build and deploy websites to any environment from a GitHub repository for free. Developers can deploy applications in minutes while taking advantage of the scaling and cost savings offered by Azure Functions as a backend. One of the frequent questions that I heard from developers was about the availability of Azure DevOps support with Azure Static Web Apps.

I have already published an article which demonstrates how to deploy to Azure Static Web Apps using GitHub Actions. The Azure Static Web Apps team announced the public preview of Azure DevOps support with Azure Static Web Apps yesterday.

In this post I will walk you through how to deploy an Angular application to Azure Static Web Apps using Azure DevOps.

Prerequisites:

Step 1: Create a DevOps repository

Navigate to dev.azure.com and create a new DevOps repository as below.

Azure DevOps repository

Step 2: Import your static web application into the Azure DevOps repository

The next step is to import the web application from any source control into your newly created Azure DevOps repository. In this case I am importing the same "Meme generator" app, which was built with Angular. Meme4Fun is an app to create custom programming memes from a picture, and it also identifies features of people in the image; it is available as part of the code samples for Azure Static Web Apps: https://github.com/microsoft/static-web-apps-gallery-code-samples

Import repository to Azure DevOps

Step 3: Create a static web app on Azure

The next step is to create the static web application on Azure. Navigate to the Azure portal, create a new resource by searching for Static Web Apps, and create it.

Note: Since I am going to leverage Azure DevOps as the deployment method, select Other as the option.

Choose DevOps as the deployment

Once the deployment is successful, navigate to the new Static Web Apps resource and select Manage deployment token.

Manage deployment token

Step 4: Create the pipeline task in Azure DevOps

If you have used Azure DevOps to deploy applications in the past, you will know that you need a pipeline in place to deploy to a particular resource. In this case, let's create a build pipeline to deploy the Angular application.

The next step is to select the template. Since we are deploying an Angular application, you can select the Angular with Node.js template.

Angular template

Also, for static web apps it is important to include the following step in the YAML, which has the app_location, api_location, and output_location; you need to pass those values if you have the details, if not keep them empty.

Details of the different values

The final configured YAML will look similar to the below:

# Node.js with Angular
# Build a Node.js project that uses Angular.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install -g @angular/cli
    npm install
    ng build --prod
  displayName: 'npm install and build'

- task: AzureStaticWebApp@0
  inputs:
    app_location: "/"
    api_location: "api"
    output_location: "dist/meme4fun"
  env:
    azure_static_web_apps_api_token: $(deployment_token)

The next step is to provide the **azure_static_web_apps_api_token** value, which is self-managed and manually configured. Select Variables in the pipeline and create a new variable named "deployment_token" (matching the name in the workflow), as below.

Add new Variable

Paste the deployment_token value which was copied from the Azure portal.

Make sure to check "Keep this value secret", then save the workflow and run it.

With that step, Azure DevOps will execute the pipeline and deploy the application to Azure Static Web Apps.

Successful deployment

Now you can access the application from the static web app URL.

Static web app with Azure DevOps deployment

If you have any trouble executing the above steps, you can watch the same tutorial published as a video here.

It's great to see that with few steps of configuration, i was able to deploy the application with Azure Devops. If you have enabled backend support with the azure functions, it can be deployed as well. Support with Azure Devops has been one of the most requested features by developers and great to see it in live now. It's your chance to explore various application deployments with different frameworks, hope this is useful. Cheers!