AZ-204 Microsoft Azure Developer Associate Topic: Develop for Azure Storage Part 3
December 16, 2022

27. AZ-203/204 – Lab – Azure Functions – Queue and Table binding

Hi, and welcome back. Now, in the earlier chapter, we saw how to work with the queue trigger. So we had the Azure function itself, and it was going ahead and processing messages in a queue. Now we’ll go a step further: we’ll actually take the object from the queue and write it as an entity onto Azure Table Storage. So if I go on to my storage account, I’ll go on to the table service. Let me go ahead and add a table and give it a table name. So again, let’s say I’m storing information about courses in this table. Let me go ahead and hit OK.

Now please note that, from the perspective of the AZ-204 exam, there is no stress on the objective of using tables in the Azure Storage service. I’m going to be covering tables when it comes to Azure Cosmos DB. Even over here, it mentions that you get a better table experience with Azure Cosmos DB, so we’ll be covering tables in that particular section. For now, just understand that if you want to have a flat table structure, you can go ahead and use the table service that’s available in your Azure Storage account. This is like a NoSQL data store that you have as part of your storage account. Now, if you want to go ahead and see the entities—or, let’s say, the rows of the table—you can go to Storage Explorer, which is available in this experience, go on to Tables, and then go on to the Course table. So currently, there is absolutely no data in this table. What I want to do is go ahead and add a message to a queue. When the message is added to the queue, our Azure function will be triggered, and it is going to go ahead and add the details of this queue message onto the table.

So let me go ahead on to the function. I’ve already gone ahead and added the code that is going to do this, so let’s go ahead and inspect our code. Again, I have my run method, I have my JSON object, and I have my Course class, and I’m going ahead and creating a new object of my Course class. Now, when it comes to working with entities in the table service, there are two properties that are mandatory. The first is the partition key, so you need to have a property for the partition key, and the second is something known as the row key. The partition key is used to ensure that the data that’s stored in the table is divided into multiple logical partitions; this improves the performance of how the underlying data is stored and queried. The row key then allows us to go ahead and uniquely identify each row, or entity, within a particular partition.

So over here, I’m going ahead and mapping my ID property onto the partition key and my name property onto the row key. And I have an extra property known as the rating. So this is the structure of my Course class, and then I’m setting the properties for my object accordingly; I’m getting them from the JSON object. Now over here, there is one more aspect that we have to implement. So remember, we have our input trigger, right? Our message is coming from the queue into our Azure function. But in the function, I can’t see any code that is implemented to add an entity, or a row, to our Azure table. Yes, you can add that code, but there is an easier way when it comes to Azure Functions: we can go ahead and add a binding to our Azure table. Now, when we add a binding to our Azure table, remember that we are taking that object and writing it out, so it’s an output of our Azure function. So we are going to go ahead and add an output binding for the Azure function. Remember, the queue was an input binding because the data is coming into our Azure function from the queue. That same data is now being sent outside of our function to an Azure table. Over here, you can see I also have a return clause, so I’m basically returning my Course object. I want this Course object to be automatically added to the table as an entity. So we can move on to the Integrate section and implement this by adding an output binding.
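To make this concrete, here is a minimal sketch of what such a queue-triggered run.csx could look like, with the Course object returned so that a table output binding can pick it up. The class name, property names, and the shape of the queue message are assumptions for illustration, not the exact code from the demo.

```csharp
// run.csx - minimal sketch of a queue-triggered function that returns an
// entity for a Table storage output binding (names are illustrative).
#r "Newtonsoft.Json"

using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public class Course
{
    // PartitionKey and RowKey are the two mandatory Table storage properties.
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public int Rating { get; set; }
}

public static Course Run(string myQueueItem, ILogger log)
{
    log.LogInformation($"Queue message received: {myQueueItem}");

    // Assume the queue message is JSON, e.g. {"Id":"1","Name":"AZ-204","Rating":5}
    dynamic data = JsonConvert.DeserializeObject(myQueueItem);

    // Map Id onto the partition key and Name onto the row key, as described above.
    // Because the table binding in function.json uses $return, the returned object
    // is written to the table as an entity.
    return new Course
    {
        PartitionKey = (string)data.Id,
        RowKey = (string)data.Name,
        Rating = (int)data.Rating
    };
}
```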

So let me go ahead and first click on Save to ensure that our function compiles. Over here, you can see we are getting an error in the compilation, and that’s because we have not defined a binding for the return clause. So let me go ahead and show you how to do that. First, let me go ahead and clear the logs over here; I can go on to the right and click on Clear. Then let me go on to the Integrate section and add a new output. Over here, you can see there are a lot of services that you can add as an output for the Azure function. I’m going to go ahead and choose Azure Table Storage, scroll down, and click on Select. Now, in the Azure Table Storage output, we have to mention what the name of our table is, so that’s the Course table we created earlier. And remember, our table is in the storage account, so we have to ensure that we choose that particular storage account connection.

We’ve already created that connection in the earlier chapter. Now, what is the table parameter name? Over here, we can go ahead and use the function return value, and this adds the parameter “$return”. So let me go ahead and click on Save. Now once this is done, if I go back to my function.json file, you can see that we have one more binding: a table binding. Its name is the $return value, so our object can be returned to a table. It mentions the table name, it mentions the connection, and it is an output binding, which is why we have the direction “out” (a sketch of this function.json is shown at the end of this chapter). Right, so now that we have this in place, let me go ahead and add a message onto our queue. I’ll click on “Add message,” add the message over here, and go ahead and click on OK. If I click on Refresh again, I can see the message has been processed. If I go on to Storage Explorer and just refresh the table here, you can see we have our data in place. We have our partition key, which is the ID of the course, the row key, which is the name of the course, and the rating. So now, automatically, our Azure function is taking messages from a queue and adding them to a table, right? So this marks the end of this chapter.
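For reference, the function.json described above would look roughly like this. The queue name, table name, and connection setting name below are placeholders; the important parts are the queueTrigger input binding and the table output binding named $return with direction out.

```json
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "appqueue",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "$return",
      "type": "table",
      "direction": "out",
      "tableName": "Course",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```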

28. AZ-203/204 – Introduction to Azure SQL Database


29. AZ-203/204 – Lab – Azure SQL Database

Hi, and welcome back. Now in this chapter, I want to show you how you can actually connect an Azure web app, or at least a web application that’s hosted in the Azure App Service, onto an Azure SQL database. Now, please note this is not part of the objectives of the AZ-204 exam, but I do get requests from a lot of students on how to actually connect a web application hosted on the App Service to a SQL database, and this is pretty useful information. So I just thought that we should add this set of bonus chapters to this course. First, in the Azure portal, we are going to go ahead and create a new Azure SQL database.

So again, the Azure SQL Database service is a platform-as-a-service offering where the underlying infrastructure is completely managed by Azure. Azure even goes ahead and installs and manages the entire SQL Server database engine; all you have to worry about is maintaining your database. There are even features available such as automated backups, and there are also secure ways to connect to your Azure SQL database. So let’s go ahead and add a new resource. I’ll go ahead and select SQL Database, choose my subscription and my resource group, and give the database a name. Now, even though you don’t need to maintain the server, you still have to give some details about the server. So remember, when you go ahead and connect to the database, you will actually be connecting to the database server.

So I’m going to go ahead and hit “Create new,” since I don’t have a database server in place. I’m going to choose the Central US location, and I’ll give a server name; this has to be a unique server name. I’ll enter a server admin login and password. Please keep a note of these credentials; they will be required to go ahead and connect to your SQL database server. Let me go ahead and hit OK. Next, in the configure database section, I’m going to go ahead and choose the DTU model. This is a more cost-effective option when you want to go ahead and work with an Azure SQL database. If you go to the side here, you can see the estimated cost per month. If you want better performance for your underlying database, then you can go ahead and choose Premium; this is for intensive workloads. Over here, you can see that it’s much more expensive when it comes to the cost per month, but you get better performance.

Because this SQL database will only be used for demonstration purposes, I’ll go with the Basic option for less demanding workloads. Let me go ahead and hit Apply. Again, I’m just showing you the cost; this is even lower. I’ll press Apply and go on to Next for the networking. Now, in order to go ahead and access our SQL database, we also have to configure something known as the server firewall. By default, the connectivity method is “No access”; that means no client or service can actually go ahead and connect to this database server. So you can go ahead and choose a public endpoint. Now, in the firewall rules, you can add your current client’s IP address. For example, I am on a Windows workstation, and I want to connect to my SQL database so that I can create a table and maybe add rows. For that, you have to go ahead and ensure that your client’s IP address is added as a firewall rule. So remember that this firewall is being used as an extra security measure to protect the traffic that is flowing into your database server.

Now, if you want Azure services to be able to access your SQL database as well, go ahead and click Yes over here. In the additional settings, if you want, you can use existing data, which could either be the sample database or an existing backup. For now, I’ll leave everything as is, go on to Next for the tags, and go on to Review and Create. And let me go ahead and create this Azure SQL database. Now, please note that this is going to go ahead and create two resources: the first is your SQL database server, and the second is the Azure SQL database itself. This could take around five minutes, so let’s come back once both of these resources are in place. Now, in order to connect to your SQL database server, you can download SQL Server Management Studio, which is a free tool. This allows you to connect to the database server; you can create databases, create tables, and work with the data in the underlying tables.

So I’ve already gone ahead and installed SQL Server Management Studio. Now that we have the resource in place, let’s get started. Here you can see your SQL database, and there are a lot of features that are available for it. For now, let’s go ahead and connect to the SQL database server. You can go ahead and take the server name from here, and I’m going to open SQL Server Management Studio. I’ll give the server name over here, choose SQL authentication, and add that admin username and password. And now you can see your AppDB database on the database server, right? Let’s go over to the next chapter to see how we can connect an application in our App Service to the SQL database.

30. AZ-203/204 – Azure Web App – Azure SQL Database – Part 2

Hi, and welcome back. Now, in the prior chapter, we had seen how to create an Azure SQL database. Now, in Visual Studio, I have an MVC-based ASP.NET application that is using the Entity Framework to go ahead and interact with a SQL database. If I go on to the web configuration file, over here I’m actually referencing the Entity Framework, and if I scroll down, here is where I need to add the connection string.

So for the connection string, you can go on to your SQL database and show the database connection strings. You can go ahead and copy the entire connection string; please note that over here you have to go ahead and replace the password. So I’ll copy it into the connection string setting over here and just ensure that I change the password. Let me now go ahead and run this particular project. Now, after running the application, this is the home page. Currently, I can’t see any data, and that’s because the Entity Framework will first just go ahead and create the table in our SQL database. So if you go on to our SQL database, which we created in the earlier chapter, remember that we didn’t create any tables manually; this is done by the Entity Framework that’s part of the MVC application. So over here, we can see our table. Now we can go ahead and add some rows to this table, so let me go on to the database and add a new query.
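Before adding the rows, here is a rough sketch of how these pieces fit together in this kind of project: the web.config connection string is referenced by name from the Entity Framework context, and Code First then creates the table on first use. The names below (DBConnection, Course, CourseDbContext) are illustrative, not the exact ones from the demo project.

```csharp
// Models/CourseDbContext.cs - illustrative Entity Framework 6 Code First sketch.
using System.Data.Entity;

public class Course
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int Rating { get; set; }
}

public class CourseDbContext : DbContext
{
    // "name=DBConnection" must match the connection string name in web.config, e.g.:
    // <connectionStrings>
    //   <add name="DBConnection"
    //        connectionString="Server=tcp:<server>.database.windows.net,1433;Database=appdb;User ID=<user>;Password=<password>;"
    //        providerName="System.Data.SqlClient" />
    // </connectionStrings>
    public CourseDbContext() : base("name=DBConnection") { }

    // On first use, Code First creates a Courses table in the Azure SQL database.
    public DbSet<Course> Courses { get; set; }
}
```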

So I’ll go ahead and add these rows of data. I’ve now got three rows of data, and if I now go and refresh my application page, you can see the application data. Remember that we are operating from our local system. Now, one of the reasons we can actually connect from our local Visual Studio system onto an Azure SQL database is because of the firewall settings that we set when we created our Azure SQL database. So if you go back to the overview, there is something known as “Set server firewall.” Here, you will be directed to the firewall settings of the actual SQL server. So remember, this is our client’s IP address over here. Because we have a rule for the client IP address, my workstation, from where I’m working, can actually connect to the Azure SQL database. So that’s step number one, and that’s complete. Now let’s say we want to ensure that we can publish this onto an app service, and the app service should be able to connect to this database. So first, let’s go on to Visual Studio. I’m going to go ahead and publish this project onto an existing app service that we have, so I’ll go ahead and ensure that I select an existing app. I’ll go on to demo Group One.

I’ll go with the staging web app 4000, click on OK, and then go ahead and publish. Now that the publish is complete, if I go to the home page for my staging web application, I can now see the details of my application. Now, how do I know that this app service is actually connecting to our Azure SQL database? So, if I go on to the Azure SQL database over here, you can see that these are the rows that we currently have in place. So let me go ahead and add a new course using this application. Let me go ahead and create a new record and go back to the list. So we have our new record in place over here. Now let me go on to the Azure SQL database and fetch all the rows from the courses table. And now you can see the additional data, or the additional row, over here. So our Azure web app is also connecting to our Azure SQL database. Please keep in mind that this is due to the firewall setting that allows Azure services. Remember, our app service is an Azure service, so since we have enabled “Allow Azure services to access this server,” our Azure web app can actually access our SQL database, right? So this marks the end of this chapter.

31. Using Azure Web App – Connection strings

Hi, and welcome back. Now in this chapter, I want to talk about how you can set the connection strings for your app services. I already have one SQL database in place, and in the SQL database, I have one simple course table; these are the rows in the table. Now, in Visual Studio, I have an ASP.NET MVC application. You can use connection strings in your web configuration to connect to an Azure SQL database: you have a connection strings property in here, and you mention what the name of the connection string is and what the connection string itself is. So if I go ahead and run this program, it will run locally, and you can see that this page is displaying the data from the database. Now, you can also define these connection strings in Azure and use them from within your web application.

So if I go on to an existing app service, which I already have in place, and go on to the configuration settings, then in addition to application settings, we also have something known as connection strings. So we can go ahead and add a new connection string. For the value, I’m going to go ahead and take the value that I have over here; remember, this is the connection string to your database. Next, for the type, choose SQLAzure. And what is the name? The name of this connection string should be the same name that the application uses, so add it over here. Click on OK, make sure to click on Save, and click on Continue.

So the update is complete. Now, from within Visual Studio, the way I’m actually accessing this is through the Configuration Manager. If I go on to my models, I have a database model, and over here, I am actually using the ConfigurationManager class to go ahead and fetch the connection string. So whether the connection string is in the web configuration or whether it’s now in Azure, the code stays the same (a sketch of this lookup is shown at the end of this chapter). So now I’m going to go ahead and delete the connection string from here and click on Save. Now let me go ahead and publish this to my app service. I’ll go ahead and create a new profile, and I want to make certain that I publish it to the appropriate app service, so I’ll choose that app service, create the profile, and click on Publish. So now, when you browse to your app service, you can see that it is connecting to the database. Remember that it is now using the connection string that is attached to the app service, right? So this marks the end of this chapter.
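For reference, here is a minimal sketch of the ConfigurationManager lookup described above. The connection string name is a placeholder and must match whatever name was used in web.config and in the App Service connection strings.

```csharp
using System.Configuration;

public static class ConnectionHelper
{
    public static string GetDatabaseConnectionString()
    {
        // An App Service connection string (type: SQLAzure) with this name overrides
        // the web.config entry at runtime, so this code works unchanged in both places.
        return ConfigurationManager
            .ConnectionStrings["DBConnection"]
            .ConnectionString;
    }
}
```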

32. AZ-203 – Lab – Azure SQL Database – Reading data

Hi, and welcome back. Now in this chapter, let’s quickly go through how you can connect to and read data from your Azure SQL database. So remember, we’ve already spun up a database, and we have the sample database installed, which is the AdventureWorks database. You can now create a simple program in .NET Core, PHP, or the .NET Framework.

You can use an MVC framework; there are different ways you can actually access the database. It’s just like a simple database on the Azure cloud. So I’m just giving an example over here where I’m reading data from the database itself. First, I’m actually setting the data source: over here, I’m mentioning the server name, the user ID, the password, and the initial catalog, which is our demodb database that we want to connect to. Next, I’m just selecting the top five rows from the product table, taking just the first three columns. So I’m just building a string, building a SQL command, doing a read operation, and then writing it to the console: a very simple program to read data from the SQL database. So now let’s go ahead and just run the program. Once again, it’s very simple; I am getting the output of the first five rows. So again, it is very simple to read data from SQL Server on the Azure platform. A sketch of this kind of program is shown below.
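Here is a minimal sketch of that kind of read, assuming the AdventureWorksLT sample schema; the server name, credentials, database, and column names are placeholders rather than the exact values from the demo.

```csharp
using System;
using System.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Placeholder server, database, and credentials.
        string connectionString =
            "Server=tcp:demoserver.database.windows.net,1433;" +
            "Initial Catalog=demodb;User ID=sqladmin;Password=<your-password>;";

        // Select the top five rows and the first three columns of the product table
        // (column names here assume the AdventureWorksLT sample schema).
        string query = "SELECT TOP 5 ProductID, Name, ProductNumber FROM SalesLT.Product";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Write the three selected columns to the console.
                    Console.WriteLine($"{reader[0]} - {reader[1]} - {reader[2]}");
                }
            }
        }
    }
}
```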

33. AZ-203 – Exam Extra – Entity Framework

Hello and welcome back! Now in this chapter, let’s talk about how we can use the Entity Framework. This is important from an exam perspective because one of the objectives is how you can actually query data in an Azure SQL database, and one of the ways is to go ahead and use the Entity Framework. So the Entity Framework basically maps the tables that you have in a SQL database onto objects. I’ve gone ahead and done that: in an Azure SQL database, I’ve created a table and inserted some rows into it. So when you go ahead and use the Entity Framework and include the table, it automatically picks up the columns that are part of the table. If you go on to the class, it automatically creates a Customer class as well. So here, it’s gone ahead and created a representation of the columns in your table. It generates a class based on that information, as well as a database context with which you will interact.

So you’re going to be using the database context to actually work with the table itself. If I go on to my main program, here I have one method to basically get the value of a customer where the ID is equal to one, and another to add a customer to the table as well. So, over here, I’m not explicitly writing the code to fetch data from the table itself; this is all going to be done by the Entity Framework. We’re going to call a couple of methods, and then we can also go ahead and even add data to the table as well. So in order to get customers, I’m going to be using the demo entities class; remember, that’s the context class that we have over here. And then I can go ahead and use LINQ to fetch the first row in the table where the customer ID is equal to 1, and then I can write it onto the console. And then, when adding a customer, again I can use the demo entities class here: I can go ahead and add a new object to the table, save the changes, and it will actually be saved onto our database. Let’s go ahead and run the program. So here it’s gone ahead and fetched the row where the customer ID is equal to one, and it’s gone ahead and added a new customer. If I go ahead and fetch the rows from the table, you can see it’s gone ahead and added a new row to our table. Now, what I’ve done next is gone ahead and created an orders table, which basically refers to our customer table, and I’ve inserted some rows in the orders table. Now, when I use the Entity Framework and include these two tables, it not only has the columns from both tables, but it also has the relationship between them. As a result, it shows a one-to-many relationship.
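Before moving on to relationships, here is a minimal sketch of the fetch-and-add flow just described. The DemoEntities context and Customer class stand in for the classes the EF designer generates in the database-first workflow; names and columns are illustrative.

```csharp
using System;
using System.Data.Entity;
using System.Linq;

// Simplified stand-ins for the classes the EF designer generates in the
// database-first workflow (names are illustrative).
public class Customer
{
    public int CustomerId { get; set; }
    public string CustomerName { get; set; }
}

public class DemoEntities : DbContext
{
    // "name=DemoEntities" refers to a connection string with that name in the config file.
    public DemoEntities() : base("name=DemoEntities") { }
    public DbSet<Customer> Customers { get; set; }
}

class Program
{
    static void Main()
    {
        using (var context = new DemoEntities())
        {
            // Use LINQ to fetch the first customer whose ID is 1 and print it.
            var customer = context.Customers.First(c => c.CustomerId == 1);
            Console.WriteLine($"{customer.CustomerId} - {customer.CustomerName}");

            // Add a new customer and persist it to the Azure SQL database
            // (assuming the key column is generated by the database).
            context.Customers.Add(new Customer { CustomerName = "NewUser" });
            context.SaveChanges();
        }
    }
}
```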

Now, if you go on to the class itself, the way it basically maintains that one-to-many relationship is by creating an ICollection of the Order class within the Customer class. Now, if you go back to the program, if you want to get the value of the customer, you can do so, and you can also get the number of orders the customer has placed, because there is a reference to the orders; remember, there is an orders collection. And if you want to add an order itself, you can go ahead and first get the customer from the customer table, and then, when adding the order, you can go ahead and reference the customer ID. So it’s as simple as that. Now, you can also model a many-to-many relationship using the Entity Framework. Remember, in your SQL database, if you want to maintain a many-to-many relationship, you have to ensure that you create another table known as a junction table. The junction table holds the keys of both related tables; this is how you maintain a many-to-many relationship. Now, in the resources for this chapter, I have the statements which you can use to actually create a students table, a courses table, and then a junction table for the courses and the students. So if you want to practice, you can create these tables by executing the queries and then go ahead and use the Entity Framework, and it will automatically detect that there is a many-to-many relationship. And once more, you can look at the generated classes themselves.

So if you go on to the Course class, you will see it has an ICollection of students, and if you proceed to the Student class, you will notice that it has an ICollection of courses. Just a quick note on the context: if you’re using the fluent API to define a code-first many-to-many relationship, this is how it will look. The HasMany and WithMany calls specifically mention that the student and the course have a many-to-many relationship. This is only needed when using the code-first model; in this case, we’re using database-first (a sketch of the code-first mapping follows at the end of this chapter). Now, let’s say in the main code you want to add a student. So you’re adding a new student with a first name and, let’s say, a last name, but it has a reference to an existing course. The first thing that you have to do is get a reference to the existing course itself. Then create a new student, create a new collection of courses, add the existing course to it, and then assign the collection to the student object. Then go ahead and add your student. When it comes to the Entity Framework, there is a lot to consider. If you’re already very familiar with the Entity Framework, this should be easy for you. But for those students who are unfamiliar with it, these are, just from an exam standpoint, some important Entity Framework notes. In the resources section of this particular video, I’ve added all of these code examples along with the scripts, which you can use to create the tables and practice on your own. So this marks the end of this chapter.
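To make the many-to-many setup concrete, here is a hedged sketch in the code-first style, with illustrative class, table, and column names. The HasMany/WithMany mapping and the junction table correspond to what was described above.

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Student
{
    public int StudentId { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public virtual ICollection<Course> Courses { get; set; }
}

public class Course
{
    public int CourseId { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Student> Students { get; set; }
}

public class SchoolContext : DbContext
{
    public DbSet<Student> Students { get; set; }
    public DbSet<Course> Courses { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // HasMany/WithMany declares the many-to-many relationship;
        // the Map call points it at the junction table and its key columns.
        modelBuilder.Entity<Student>()
            .HasMany(s => s.Courses)
            .WithMany(c => c.Students)
            .Map(m =>
            {
                m.ToTable("StudentCourse");
                m.MapLeftKey("StudentId");
                m.MapRightKey("CourseId");
            });
    }
}

class Demo
{
    static void AddStudentWithExistingCourse()
    {
        using (var context = new SchoolContext())
        {
            // First get a reference to the existing course.
            var existingCourse = context.Courses.First(c => c.CourseId == 1);

            // Create the student, attach the existing course, then save.
            var student = new Student
            {
                FirstName = "Jane",
                LastName = "Doe",
                Courses = new List<Course> { existingCourse }
            };
            context.Students.Add(student);
            context.SaveChanges();
        }
    }
}
```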

34. AZ-203 – Azure SQL Database Serverless + Hyperscale

Hello and welcome back! In this chapter, we’ll go over some of the other objectives. The first is the Azure SQL Database serverless service, and we’ll look at a lab to see how we can provision this. This is a new feature that’s currently available in preview mode. Now, the serverless compute tier for a single database can automatically scale based on the load on the database itself. In the serverless compute tier, you can choose the Gen5 option. Please note that this is only available, as I said, in the vCore-based model; if you’re looking at basic, standard, or premium, it’s not available there, because everything in those tiers is based on the number of DTUs, or database transaction units. So you have to go to the vCore-based purchasing option and choose Serverless. You can set the maximum and minimum vCores here, and here you have the auto-pause delay. You can go ahead and hit Apply. Again, let me return to the configure database section. Now, over here, you have the preview terms; since this is just a preview, you might have to agree to these terms and conditions, then click on Apply. Now we’ll go on to the next step. Everything else remains the same, and you can then go on to review and create the database instance. Once your resource is in place, you can go ahead and access the resource and see that it’s like any other database; you can actually go and connect to it. If you go on to configure, you will see the compute utilization, and at any time you can choose to increase the maximum number of vCores or the minimum number of vCores. So all of this you can configure after the database is up and running. Right, so this marks the end of this chapter.

35. AZ-203/204 – Azure CosmosDB

Hi, and welcome back. Now, in this chapter, I want to give an introduction to the Cosmos DB service. There are many features that are available with this service; let’s look at the important ones. First, Cosmos DB is a multi-model database. Now, what is the meaning of this? When you create a Cosmos DB account (so it’s known as an account), you can actually specify something known as an API. The API you specify basically decides what type of database engine, or what type of data, can be stored in this Cosmos DB account. So let’s say that you want to have a NoSQL-based database that stores documents; then you can choose the Core (SQL) API. So when you create your Cosmos DB account, you can say that you want to store JSON-based documents, and you would then choose the Core (SQL) API. Or maybe you want to store table-like data; then you choose the Azure Table API. Or maybe you want to migrate an existing MongoDB database to Azure.

You can specify the API as MongoDB, or you could specify Cassandra. Or maybe you have a graph database, and then you can choose the Gremlin API. So that’s the meaning of a multi-model database: in a Cosmos DB account, you can have different types of APIs, and based on the API you choose, you can then create your database and start storing your data. So that’s from the perspective of a multi-model database. Then there’s the quick access that Cosmos DB provides. With Cosmos DB, you get single-digit-millisecond access to your data. So if you want a database that can give you the lowest latency and the fastest access to your data, then you can go ahead and use Cosmos DB. Another advantage of using Cosmos DB is that it is globally distributed. So let’s have a better understanding of what “globally distributed” means.

Now, in Cosmos DB, when you create an account and a database, you can actually enable something known as multi-region accounts. Let’s say you create your database in the US East region. You can request, with a single click, that your data be replicated to another region, such as Central US. You can also add another region, for example, to replicate your data to a region other than Central US as well. So whenever Azure launches data centres in a new region around the world, Cosmos DB is one of the first services that comes up. And why is this? It’s because they want to have multiple data centres in the world that can be used to host your Cosmos DB data. So you enable multi-region accounts by enabling the replication of your data in your Cosmos DB account across multiple regions. So what is the core benefit of this? Let’s say you have an application with users across the world who are accessing it. Now, let’s say that your Cosmos DB account and your data are in the East US region, right? So it’s only in one region. But since you have users across the world who need access to your data, by enabling replication to different regions across the world, you can bring the data closer to your users. That’s the core benefit.

Another core benefit is availability. If this region fails, the other regions still have copies of your data because you enabled replication on those regions, so your application can do a quick switchover to the data in the other regions. These are the core benefits of having multi-region accounts. Now, please remember this from a cost perspective: even if you have multiple regions for high availability and close proximity of data to your users, there are still costs associated with such a feature. If you have your Cosmos DB data in only one region, you only bear the cost of the data and the throughput for that particular region. But if you have your data replicated across regions, then you bear the cost for all of the regions that you enable replication for. So if you have three regions, your cost would be three times as much. Please make a note of this: even though you have high availability, you still have to pay more. However, if organisations require high availability of their data, they are willing to accept this cost factor. So let’s go over the aspects of Azure Cosmos DB again, along with some additional aspects.

So Azure Cosmos DB, as mentioned before, has support for multiple APIs. It has high availability of 99.99 percent, so as I mentioned, if you want a highly available solution, then you should choose Azure Cosmos DB. It provides the lowest latency to your data, so it provides fast access to your data: it guarantees less than 10-millisecond latencies for both indexed reads and writes at the 99th percentile, all around the world. So how does it actually achieve high availability? Well, every partition of data that’s in your Cosmos DB account is actually protected on the physical layer by something known as a replica set. There are multiple replicas of your data, and all writes to your Cosmos DB database are committed to the replicas. So even if one replica were to go down, you still have the others available, which means you still have your data available. The replicas themselves are distributed across 10 to 20 fault domains. Now, when it comes to the charge for Azure Cosmos DB, you are charged based on something known as the number of request units and the storage consumed. So what exactly is a request unit? Well, a request unit is a blend of a percentage of memory, a percentage of CPU, and a percentage of IOPS. Now, Azure Cosmos DB is a fully managed service, so over here, you don’t need to manage the infrastructure at all. You just go ahead and define your database and your data itself.

So you don’t need to mention server details, and you don’t need to mention the type of virtual machine that’s used to host your database. No, nothing. You just start working directly with your data. That’s why there is a metric in place known as request units, on which you will be charged. This is a blend of the amount of memory, CPU, and IOPS that is assigned to your Cosmos DB database. Now, you consume request units (RUs) whenever you perform any operation, such as a read, an insert, an upsert, a delete, or a query. The request units are also known as the throughput that you have assigned to either your Cosmos DB database or something known as a container. So the throughput is basically the number of request units you’ve actually assigned to your Cosmos DB database. Please note that when we go into the lab, you will see where you can actually assign request units for your Cosmos DB database; you can assign the number of request units either at the database level or at the container level. And remember that if you enable replication for your data, you will again get charged for the data in the subsequent regions. So this is an introduction to Azure Cosmos DB. Let’s move on to the next chapter in this course.
