Microsoft DP420 Study Guide – Integrate an Azure Cosmos DB solution

This post is part of a series focussing on the Azure Cosmos DB (DP-420) exam topics. As the other posts are ready, I will update this page with links to them.

  • Design and implement data models
  • Design and implement data distribution
  • Integrate an Azure Cosmos DB solution (you are here)
  • Optimize an Azure Cosmos DB solution
  • Maintain an Azure Cosmos DB solution

Integrate an Azure Cosmos DB solution (5 – 10%)

Enable Azure Cosmos DB analytical workloads

Azure Synapse Link for Azure Cosmos DB lets you connect Azure Cosmos DB containers directly to Azure Synapse Analytics, with no separate connectors or ETL pipelines to maintain. Synapse currently supports Synapse Link from Synapse Apache Spark and the serverless SQL pool.

Turning on the analytical store is pretty easy for new containers (there’s a toggle in the portal, or you can set AnalyticalStoreTimeToLiveInSeconds when creating the container programmatically; see the sketch under “Enable the analytical store on a container” below), but existing containers currently have to be registered for the feature, which can take up to a week.

It’s also worth noting that enabling the analytical store incurs additional cost, and it currently cannot be turned off once enabled (you can delete the resource, but that’s the only way).

You also can’t have continuous backups and Synapse Link enabled on the same account, as they rely on the same underlying mechanism. Periodic backups work fine alongside Synapse Link; continuous backups do not.

https://docs.microsoft.com/en-gb/azure/cosmos-db/synapse-link#enable-htap-scenarios-for-your-operational-data

Enable the analytical store on a container

https://docs.microsoft.com/en-us/azure/cosmos-db/configure-synapse-link
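
As a rough sketch of the programmatic route, the Python SDK exposes the analytical store TTL when you create a container. The endpoint, key, database and container names below are placeholders; analytical_storage_ttl=-1 retains analytical data indefinitely, while a positive value expires it after that many seconds.

from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint and key; in practice pull these from configuration or Key Vault.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
database = client.get_database_client("Sales")

# Passing analytical_storage_ttl enables the analytical store on the new container;
# leaving it out creates a container without one.
container = database.create_container_if_not_exists(
    id="Orders",
    partition_key=PartitionKey(path="/customerId"),
    analytical_storage_ttl=-1,
)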


Enable a connection to an analytical store and query from Azure Synapse Spark or Azure Synapse SQL

https://docs.microsoft.com/en-us/azure/cosmos-db/sql/create-sql-api-spark
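
From a Synapse Spark notebook, reading the analytical store looks roughly like the below. cosmos.olap is the analytical-store format; the linked service name, container name and column names are assumptions for illustration, and spark is the notebook’s session.

# Requires a Cosmos DB linked service to already exist in the Synapse workspace.
df = (spark.read.format("cosmos.olap")
      .option("spark.synapse.linkedService", "CosmosDbLinkedService")
      .option("spark.cosmos.container", "Orders")
      .load())

df.createOrReplaceTempView("orders")
spark.sql("SELECT customerId, SUM(total) AS spend FROM orders GROUP BY customerId").show()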

Perform a query against the transactional store from Spark

https://docs.microsoft.com/en-us/learn/modules/query-azure-cosmos-db-with-apache-spark-for-azure-synapse-analytics/
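
Switching the format to cosmos.oltp queries the transactional store from the same notebook instead; unlike analytical-store reads, this consumes request units on the container. Same assumed names as above.

# Reads against the transactional store cost RUs, so filter early where possible.
df = (spark.read.format("cosmos.oltp")
      .option("spark.synapse.linkedService", "CosmosDbLinkedService")
      .option("spark.cosmos.container", "Orders")
      .load())

df.filter(df["customerId"] == "C-123").show()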

Write data back to the transactional store from Spark

https://docs.microsoft.com/en-us/learn/modules/query-azure-cosmos-db-with-apache-spark-for-azure-synapse-analytics/7-write-data-back-to-transactional-store
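
Writes also go through the transactional store. A sketch with the same assumed names, where enriched_df is a DataFrame whose schema matches the container’s documents (including id and the partition key) and upserts are enabled so existing documents are updated rather than rejected as conflicts.

(enriched_df.write.format("cosmos.oltp")
    .option("spark.synapse.linkedService", "CosmosDbLinkedService")
    .option("spark.cosmos.container", "Orders")
    .option("spark.cosmos.write.upsertEnabled", "true")
    .mode("append")
    .save())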


Implement solutions across services

Integrate events with other applications by using Azure Functions and Azure Event Hubs

https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-cosmos-db-triggered-function
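
The usual shape here is a Cosmos DB-triggered function that pushes changed documents onto an Event Hub for other applications to consume. A minimal sketch using the Python v2 programming model; the database, container, hub and connection-setting names are placeholders, and the binding parameter names assume the newer (v4) Cosmos DB extension bundle, so check them against the binding reference.

import json
import azure.functions as func

app = func.FunctionApp()

@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDbConnection",
                       database_name="Sales",
                       container_name="Orders",
                       lease_container_name="leases",
                       create_lease_container_if_not_exists=True)
@app.event_hub_output(arg_name="events",
                      event_hub_name="order-changes",
                      connection="EventHubConnection")
def forward_order_changes(documents: func.DocumentList, events: func.Out[str]):
    # Each invocation receives a batch of changed documents from the change feed;
    # forward the batch to the Event Hub as a single JSON payload.
    changed = [json.loads(doc.to_json()) for doc in documents]
    events.set(json.dumps(changed))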


Denormalize data by using Change Feed and Azure Functions

https://docs.microsoft.com/en-us/azure/cosmos-db/sql/change-feed-design-patterns
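
A common denormalisation is keeping a copy of the data partitioned by a different key (a materialised view). A sketch below, with the same Python v2 model caveats and placeholder names as above: every changed order is upserted into a second container partitioned by /customerId, so per-customer reads stay within one partition.

import json
import os
import azure.functions as func
from azure.cosmos import CosmosClient

app = func.FunctionApp()

# The client is created once per worker and reused; settings come from app configuration.
cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
target = cosmos.get_database_client("Sales").get_container_client("OrdersByCustomer")

@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDbConnection",
                       database_name="Sales",
                       container_name="Orders",
                       lease_container_name="leases",
                       create_lease_container_if_not_exists=True)
def denormalise_orders(documents: func.DocumentList):
    # Copy each changed order into the container partitioned by /customerId.
    for doc in documents:
        target.upsert_item(json.loads(doc.to_json()))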


Enforce referential integrity by using Change Feed and Azure Functions

https://docs.microsoft.com/en-us/learn/modules/advanced-modeling-patterns-azure-cosmos-db/5-exercise-use-change-feed
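
A sketch of the propagation side of this pattern, with hypothetical names and schema: when a category document is renamed, the trigger fans the new name out to every product that references it, keeping the denormalised copies consistent.

import json
import os
import azure.functions as func
from azure.cosmos import CosmosClient

app = func.FunctionApp()

cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
products = cosmos.get_database_client("Retail").get_container_client("Products")

@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDbConnection",
                       database_name="Retail",
                       container_name="Categories",
                       lease_container_name="leases",
                       create_lease_container_if_not_exists=True)
def propagate_category_changes(documents: func.DocumentList):
    for doc in documents:
        category = json.loads(doc.to_json())
        # Update the copied categoryName on every product referencing this category.
        for product in products.query_items(
                query="SELECT * FROM p WHERE p.categoryId = @id",
                parameters=[{"name": "@id", "value": category["id"]}],
                enable_cross_partition_query=True):
            product["categoryName"] = category["name"]
            products.upsert_item(product)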


Aggregate data by using Change Feed and Azure Functions, including reporting

This objective and the next couple are, in my opinion, basically the same pattern as the links above. The change feed is an incredibly useful tool that lets you process pretty much every write asynchronously, within the context of your database.

You can use the feed to maintain aggregate values or summary documents (counts, running totals and so on) so they can be queried quickly without recomputing them on every read.
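
As a sketch of that, with hypothetical names: each batch of changed orders bumps a per-customer summary document, so reporting queries read one small document rather than aggregating raw orders. Note that the change feed also surfaces updates to existing documents, so a production version would need to handle re-processing; this assumes insert-only orders.

import json
import os
import azure.functions as func
from azure.cosmos import CosmosClient, exceptions

app = func.FunctionApp()

cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
summaries = cosmos.get_database_client("Sales").get_container_client("CustomerSummaries")

@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDbConnection",
                       database_name="Sales",
                       container_name="Orders",
                       lease_container_name="leases",
                       create_lease_container_if_not_exists=True)
def update_customer_summaries(documents: func.DocumentList):
    for doc in documents:
        order = json.loads(doc.to_json())
        customer_id = order["customerId"]
        try:
            summary = summaries.read_item(item=customer_id, partition_key=customer_id)
        except exceptions.CosmosResourceNotFoundError:
            # First order seen for this customer: start a fresh summary document.
            summary = {"id": customer_id, "customerId": customer_id,
                       "orderCount": 0, "totalSpend": 0}
        summary["orderCount"] += 1
        summary["totalSpend"] += order.get("total", 0)
        summaries.upsert_item(summary)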


Archive data by using Change Feed and Azure Functions

For archiving, use the change feed to monitor writes and copy the documents into cheaper long-term storage such as Azure Blob Storage; a TTL on the source container can then expire the hot copies.
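
A minimal sketch of that, again with placeholder names: each changed document is copied out to Blob Storage as JSON, and a TTL on the source container (or an explicit delete) then ages the hot copy out.

import json
import os
import azure.functions as func
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp()

blobs = BlobServiceClient.from_connection_string(os.environ["ARCHIVE_STORAGE_CONNECTION"])
archive = blobs.get_container_client("order-archive")

@app.cosmos_db_trigger(arg_name="documents",
                       connection="CosmosDbConnection",
                       database_name="Sales",
                       container_name="Orders",
                       lease_container_name="leases",
                       create_lease_container_if_not_exists=True)
def archive_orders(documents: func.DocumentList):
    # Write each changed document out as a JSON blob keyed by its id.
    for doc in documents:
        order = json.loads(doc.to_json())
        archive.upload_blob(name=f"{order['id']}.json", data=doc.to_json(), overwrite=True)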


Implement Azure Cognitive Search for an Azure Cosmos DB solution

https://docs.microsoft.com/en-us/azure/search/search-howto-index-cosmosdb
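
The usual approach is a Cognitive Search indexer with a Cosmos DB data source, which crawls the container and keeps a search index up to date. A hedged sketch with the azure-search-documents SDK; the service, index, database and container names and the connection string are all placeholders, and the target index is assumed to exist already.

from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient("https://<search-service>.search.windows.net",
                             AzureKeyCredential("<admin-key>"))

# Data source pointing at the Cosmos DB container; the database is named in the connection string.
data_source = SearchIndexerDataSourceConnection(
    name="orders-cosmosdb",
    type="cosmosdb",
    connection_string=("AccountEndpoint=https://<account>.documents.azure.com;"
                       "AccountKey=<key>;Database=Sales"),
    container=SearchIndexerDataContainer(name="Orders"),
)
client.create_data_source_connection(data_source)

# Indexer that pulls from the data source into an existing index.
indexer = SearchIndexer(name="orders-indexer",
                        data_source_name="orders-cosmosdb",
                        target_index_name="orders-index")
client.create_indexer(indexer)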
