Design and implement data storage – Basics
6. A quick note when it comes to the Azure Free Account
In this chapter, I'd like to make a quick note for students who are using the Azure free account. The first note is that whenever you log in, you should be getting a notification about the amount of credit that is remaining. So remember, you get this credit for the first 30 days when it comes to using a free account. Another note is when it comes to creating resources. So, for example, say I want to create an Azure SQL database; we'll be creating an Azure SQL database in this course. I can quickly enter a database name and also create a new server. This is just a logical server name, so I'll just hit OK. Now here, when I choose Configure Database, this is the amount of compute and storage that you can actually allocate for the database. So here, when you look at the service tier, you can see that there are some service tiers that are actually greyed out. One reason could be that we're using the Azure free account, so there could be some limitations, and that's fine. When you're just testing or experimenting, you would not want to use the really high-performance tiers anyway; even the general-purpose ones will do. Another thing to note is that you might not be able to create a resource in the region of your choice. This could be the case for some services. For example, we are going to be looking at the data warehousing service that is available in Azure, and that's Azure Synapse. So if I go to Azure Synapse Analytics and click on Create, here I have an existing resource group, and here I'll just enter a workspace name. I'll leave the default region as "East US 2". Now here, I need to enter a Data Lake Gen 2 account name. Please note that I will actually go through this entire process in a later chapter. Here, I just want to make a point when it comes to some aspects of the Azure free account.
Go on Next to Security and just enter a password, leave everything else as it is, go on to Networking, go on to Tags, go on to Review + create, and let me go ahead and hit Create. Now let's come back after three to four minutes to see if we have our resource in place. And here, you can see that we were not able to create the resource; the deployment actually failed. If you click here for more details, it says that location East US 2 is not accepting the creation of new Windows Azure SQL Database servers for this particular subscription. So there are some restrictions when it comes to deploying resources in certain regions. If you try to create this resource in the Central US location, it will work. So there are some limitations on choosing regions. Now, when it comes to working on Azure myself, I use another Azure account with the pay-as-you-go subscription model, and since I've been using that Azure account for a few years, I don't have many limitations when it comes to creating resources in Azure. With the Azure free account, I wanted to give a note on some of these important points.
7. Lab - Application connecting to Azure Storage and SQL database
In the previous two chapters, we had done a couple of things: we had created an Azure storage account and an Azure SQL database. Now, in this chapter, I'll show you the .NET program that I created previously, which will interact with our SQL database and our storage account. If you cannot open the .NET program on your machine, you can install Visual Studio 2019, Community edition. It's a free download; it's a free tool. You can download the tool and then open up the project that I'm going to show you. Also, I'll include an optional chapter next on how you can install Visual Studio, just for those students who don't have Visual Studio in place or don't know how to install the tool itself. Now, the .NET program will be attached as a zip file to the resources of this chapter. The zip file will be something like this: a SQL application zip file. You can right-click on the file and extract all of the contents. I'll just extract it into my temp directory. So I've gone ahead and downloaded that zip file, and I'm just extracting it over here. Once the contents are extracted, I'll go on to the folder. Here we have a solution file, a Visual Studio solution file. I'll double-click on this Visual Studio solution file, and it will open up my project. Once you have the project opened up in Visual Studio, I'm not going into the details of the project; I'll just show you the changes I'm going to make to this project. So this code is going to interact with our Azure SQL database. It will take that data and display it on a simple webpage. It will also fetch images from my Azure storage account. The first thing I need to do is set up the connection to my Azure SQL database. For that, I'll go on to the Services folder and double-click on the file that holds the configuration. Here I have embedded all the details of my SQL database, so I need to change them.
Now, the details are over here. First, my server name: I'll go on to Azure, copy the server name from there, and paste it here. Next is my user, so that's the SQL user we created, and my password. And my database is AppDB. So, what data will the application retrieve from the Azure SQL database? I'll be fetching data such as the course ID, the exam image, the course name, and the rating. So we have to go ahead and actually create a table in our database and insert some data. These commands will also be available as resources. Please note that all the commands that I'm going to execute in this course will be available in one form or another as resources so that you can follow along. The first thing I need to do is to create a table. We're creating a very simple table. The name of the table is "Course". We have four columns: the course ID, the exam image, the course name, and the rating, all with their associated data types. So I'll copy all of this into SQL Server Management Studio. For my database, I'll right-click and choose New Query, so it will open up a SQL query window. I'll paste the contents to create the table and hit Execute. We'll have our table in place now. You can see the Course table if you go to Object Explorer and expand the tables over here. "dbo" is just the default schema, which will be prepended to the table name. Now we need to insert data into the table. So we have the course ID, and we have something known as the exam image. Now, this image will be fetched from my Azure storage account, and here we are just mentioning the URL for that particular image. When you upload a blob or object onto an Azure storage account, it's going to get a unique URL. So instead of storing the image anywhere else, you can actually store that image as a blob in our storage account, and our application can fetch that image.
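As a rough sketch of what the embedded connection details boil down to, here is how an ODBC-style connection string for an Azure SQL database is typically assembled. All the names below (server, user, password) are placeholders, not the actual values used in the course:

```python
# Sketch: assembling an Azure SQL connection string like the one embedded
# in the sample app's configuration. Server, user, and password values
# below are hypothetical placeholders - use your own from the portal.

def build_connection_string(server: str, database: str, user: str, password: str) -> str:
    """Build an ODBC-style connection string for an Azure SQL database."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={user};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;"
    )

conn_str = build_connection_string("appserver9999", "AppDB", "sqlusr", "<your-password>")
print(conn_str)
```

The same four pieces of information (server, database, user, password) are what the course has you paste into the app's configuration file.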
Next, I have the course name, and I also have the rating. So, before we execute these commands, we need to upload the images onto our Azure storage account. Please note that we also have to change the name of the storage account in these statements. So let's do that. First, I'll go on to All resources. I'll filter by my data-grp resource group and hit Apply. Now, just to ensure that we always see this particular view, I'll go on to Manage view and save the view. I'll give it a proper name and hit OK. So now, at any point in time, you can just switch on to that view. Next, we'll go on to the storage account. I'll take the storage account name, copy it, and paste it over here; let me paste it everywhere else as well. Next, we need to create a container. In a storage account, in order to hold all of your objects, you create something known as a container. Here, the statements make a reference to a container named "data". So in the storage account, I'll go on to Containers and create a new container. I'll call it "data". And for the public access level, I'll select "Blob (anonymous read access for blobs only)". That is, I'm providing public access to the blobs or objects that I upload to this container. Please note that there are other security levels available to protect the data that you have in your container; to make things much simpler, I am giving the public access level of anonymous blob access so that there is public access available for this particular container. I'll hit Create. I'll click on the container, and I'll click on the Upload button. I'm going to upload files that I have on my local system. I've copied three images into my temp folder, so if I go on to my temp folder, I have three images in place. I'll choose those images, hit Open, and hit Upload. So now I have my three images in place. If I go on to any image and go on to Edit, you can see these are some simple images about the different exams.
I'll go back to my statements. I'll now paste all three of my insert statements into my query window, replacing the earlier statement, and hit Execute. So I have the rows in place. We can do a select statement to ensure that we have all of the data in our table. I'll just do a select on the Course table and hit Execute, and I can see all of my data in place. Now, let me go on to my program and run it. This program will first connect to my SQL database. It will read that particular table and fetch the information from it. At the same time, remember, each row in the table has a reference to an image in the storage account. So here we can see our application in a running state. All of these images are being fetched from the Azure storage account, and the other information is coming in from the table that we have in our SQL database. As I said, the entire purpose of this set of chapters was to make one thing very familiar: your data is the most important aspect of your application, or of any other system that relies on data. Here, our data is residing not only in an Azure storage account in the form of images, but also in an Azure SQL database, right? So this marks the end of this set of chapters. In the next chapters, we'll look at Azure Data Lake Gen 2 storage accounts.
8. Different file formats
9. Azure Data Lake Gen-2 storage accounts
So now we come to Azure Data Lake Gen-2 storage accounts. This is just a service that provides the option of hosting a data lake on Azure. When working with large data sets, with data arriving in large volumes and at a rapid rate, companies consider having data lakes in place for hosting the data. In Azure, you can make use of Azure Data Lake Gen-2 storage accounts for this. Now, the Azure Data Lake Gen-2 storage account is just a service that is built on top of Azure Blob storage. In the earlier chapters, we looked at Azure storage accounts, and Azure Data Lake is based on Azure storage accounts themselves. With Azure Data Lake Gen-2 storage accounts, you have the ability to host an enterprise data lake. Essentially, you get something known as the feature of a hierarchical namespace on top of Azure Blob storage itself. This hierarchy helps to organise the objects and the files into a hierarchy of directories for efficient data access. As I said, when it comes to storing data, initially, when a company wants to take and store data coming from multiple sources, this data could be in different formats. You could have image files, you could have documents, you could have text-based files, you could have JSON-based files; files in different formats. And at first, the company simply wants a location to store all of that data in whatever format it is. So they would go ahead and have something known as a data lake, and when it comes to Azure, you can make use of Azure Data Lake Gen-2 storage accounts. In the background, when it comes to storage, you don't have to worry about it. You don't have to think about adding more and more storage to the storage account. You can just keep on uploading your data; the service itself will manage the storage for you.
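To make the hierarchical-namespace idea concrete, here is a small illustrative sketch (the object names are made up). In a flat blob store, "directories" are just shared name prefixes; a hierarchical namespace turns those prefixes into real directories, so operations on a folder don't have to touch every object inside it:

```python
# Illustrative sketch (hypothetical object names): how slashes in flat
# object names map onto the directory hierarchy that a hierarchical
# namespace exposes.
from collections import defaultdict

object_names = [
    "raw/sales/2023-01.json",
    "raw/sales/2023-02.json",
    "raw/logs/activity.csv",
    "curated/sales_summary.parquet",
]

# Group each object under its "directory" prefix.
tree = defaultdict(list)
for name in object_names:
    directory, _, filename = name.rpartition("/")
    tree[directory].append(filename)

for directory in sorted(tree):
    print(directory, "->", tree[directory])
```

With the hierarchical namespace enabled, the storage service itself maintains this directory structure, which is what makes directory-level listing and renames efficient for analytics workloads.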
And Azure Data Lake is actually built for big data, for hosting large amounts of data. You can upload data in its native raw format, and it is optimised for storing terabytes and even petabytes of data. The data can come from a variety of data sources, and it can be in a variety of formats, whether structured, unstructured, or a combination of both. So now, in the next chapter, let's go ahead and create an Azure Data Lake Gen-2 storage account.
10. Lab - Creating an Azure Data Lake Gen-2 storage account
So here we are in Azure. Now we'll create a new resource. To create an Azure Data Lake Gen-2 storage account, we have to create nothing but a normal storage account. So you can search for the Storage account service and select the Create button. I'll choose our resource group; that's our data-grp resource group. I'll go with North Europe as the location. Again, I need to give a unique data lake storage account name. So that's fine. For redundancy, I'll again choose locally redundant storage. I'll go on Next to Advanced. On the Advanced screen, this is what is important: there is an option for Data Lake Storage Gen 2, where we have to enable the hierarchical namespace. This ensures that our storage account now behaves as a Data Lake Storage Gen-2 account. I'll enable the setting, and all the other settings in the subsequent screens I'll just leave as they are; I won't make any changes. I'll go on to Review + create, and I'll hit Create. This will just take a couple of minutes, so let's wait till we have the storage account in place. Once our deployment is complete, I can go on to the resource. The entire layout, the entire overview of the data lake storage account, is similar to the normal storage account which you had seen earlier on. On the left-hand side, again, we have containers, we have file shares, we have queues, and we have tables. The hub of the Data Lake service is again the containers, which is based on the Blob service. If I go on to Containers here, again, I can create a container, and then within the container, I can start uploading my objects. So here, if I create a simple container known as "data", again, we have the public access level of either private, blob, or container anonymous access. I'll leave it at private, no anonymous access, and hit Create. If you go on to the container now, you can also add directories to the container. So let's say you want to store raw files in a particular directory.
You can create a directory known as "raw" and hit Save. You can then go on to the directory and start uploading your files and objects over there. So, when it comes to the Blob service, and when it comes to data lakes, when you upload something to it, say a file, this file is referred to as a blob or an object, because in the end it's actually stored in binary format on the underlying storage service. Also, another quick note before I forget, since I didn't mention this in the earlier chapters on storage accounts, when it comes to the Blob service. If I go back on to All resources and back onto my view, I want to go onto the storage account we had created earlier on, the data store. Here, if I go on to Containers and on to our data container, and if I click on any object, we have seen that if you go on to the Edit section, we can see the contents of that particular file or blob. At the same time, if you go onto the Overview, every object or blob in the storage account gets a unique URL. This URL can be used to access the blob, since we've given access to the container. Remember, in terms of the access level, we had granted "blob anonymous read access" on this container; that means we can read the blobs as an anonymous user. And what does this mean? If I click on an object and copy the URL onto the clipboard, then go to a new tab and press Ctrl-V, I paste that complete URL. Here we have the name of our storage account, then "blob" because this is the Blob service, on "core.windows.net"; then the name of our container, and finally the name of the image. If I hit Enter, I can see the blob itself. In this tab, we are an anonymous user; we are not logged into Azure in this tab. In the other tab, we are actually logged into our Azure account, but here we are browsing as an anonymous user who is not logged in at all.
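The URL anatomy described above follows a fixed pattern, which a small sketch makes explicit (the account, container, and blob names below are placeholders, not the ones from the course):

```python
# Sketch: every blob in an Azure storage account gets a URL of the form
# https://<storage-account>.blob.core.windows.net/<container>/<blob-name>
# The names used here are hypothetical placeholders.

def blob_url(account: str, container: str, blob_name: str) -> str:
    """Construct the public URL for a blob in an Azure storage account."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"

url = blob_url("datastore9999", "data", "exam-image1.png")
print(url)
```

With "blob anonymous read access" on the container, pasting such a URL into a browser returns the blob directly; with the private access level, the same URL returns an authorization error instead.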
So these are all the different security measures that you should consider when it comes to accessing your objects, and as I said, in subsequent chapters we will actually look at the different security measures in place. But I thought, before I forget, let me give you that note. The URL feature, which is available for blobs in your storage account, is also available for Data Lake Gen 2 storage accounts. So, going back onto our Data Lake Gen 2 storage account, I'll just go onto Containers, on to my data container, and on to my raw folder. I'll upload a file that I have on my local system, so I'll click on Upload. Again, in my temp folder, I have a JSON-based file. Let me just open up that file, and don't worry, I'll include this file as a resource for this chapter as well. I'll hit Upload. So if I go to the file and go to Edit, I have the JSON file in place, and I've got some information here. This information is actually based on the diagnostic settings that are available for an Azure SQL database. A diagnostic setting can send diagnostic information about the database; for example, it can send metrics about the database itself. You have different metrics, such as the CPU percentage, the memory percentage, et cetera, and at different points in time, it sends that information. So I just have this sample JSON file in place, and I've uploaded it onto my data lake. Please note that we have a lot of chapters in which I'll actually show how we can continuously stream data onto your Data Lake Gen 2 storage account, because we still have a lot to go in this particular course. At this point in time, I just want to show you how we can upload a simple file onto a Data Lake Gen 2 storage account. So at this point, you should understand what the purpose of a data lake is.
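As a rough sketch of working with a metrics file like the one just uploaded, here is how you might pull out one metric series in Python. The field names below are simplified stand-ins, not the exact schema of the Azure diagnostics output:

```python
# Sketch: reading metric records shaped roughly like the diagnostic JSON
# uploaded to the data lake. Field names here are hypothetical
# simplifications of the real diagnostic-settings schema.
import json

sample = """
{"records": [
  {"metricName": "cpu_percent",    "time": "2023-01-01T10:00:00Z", "average": 12.5},
  {"metricName": "memory_percent", "time": "2023-01-01T10:00:00Z", "average": 40.0},
  {"metricName": "cpu_percent",    "time": "2023-01-01T10:01:00Z", "average": 15.0}
]}
"""

records = json.loads(sample)["records"]
# Filter out just the CPU-percentage samples over time.
cpu_points = [r["average"] for r in records if r["metricName"] == "cpu_percent"]
print("cpu samples:", cpu_points)
```

This is exactly the kind of raw, semi-structured file a data lake is meant to hold: you land it as-is and shape it later.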
When it comes to Azure, it is based on the Blob service, and here you have the ability to store different kinds of files, in different formats and of different sizes. So at this point in time, I just want you all to know about the service that is available in Azure for hosting a data lake, and that is the Azure Data Lake Gen 2 storage account, where you can upload different types of files of varying sizes. Right, so this marks the end of this chapter.
11. Using PowerBI to view your data
In this chapter, I just want to give a quick example of the visualisation of data which is available in a Data Lake Gen 2 storage account. So I'll go on to my Data Lake Gen 2 account, on to my Containers, onto my data container, and on to the raw folder, where I have my JSON-based file. I'll go ahead and upload a new file into this folder. In my temp directory, I have a Log.csv file. I'll hit Open, and I'll hit Upload. Let me just quickly open up this Log.csv file; we'll be using the same Log.csv file in subsequent chapters as well. This actually contains the information from my Azure activity logs. Here I have an ID column, which I have self-generated. Then I have something known as the correlation ID, the operation name, the status, the event category, the level, the timestamp, the subscription, who the event was initiated by, the resource type, and the resource group. I'll tell you the way that I actually generated this particular file. If I just quickly open up All resources in a new tab, I want to go on to the Azure Monitor service. The Azure Monitor service is a central monitoring service for all of the resources that you have in Azure. I'll search for Monitor and go to the activity log; let me just hide this. So, all of the administrative-level activities that I perform as part of my Azure account appear here. For example, if I've gone ahead and created a storage account, or deleted a storage account, or created a SQL database, everything will be listed over here. Now, what I've actually done is change the time span over here: I've chosen a custom duration of three months, so we can look at the last three months' data. You can then download the contents as a CSV file. The downloaded CSV file doesn't contain quite the same columns, though: you don't get this ID column, so I've gone ahead and manually created the ID column in Excel. And you'll also be having one more column, the Resource column.
For now, I've just gone ahead and deleted the data in this Resource column, right? So I have all of this information in my Log.csv file. Next, I'll go to my data lake storage account, select my container, and change the access level just for now: I'll select "Blob (anonymous read access for blobs only)" and click OK. Now we can start working with Power BI. Power BI is a powerful visualisation tool. You can use this tool to create reports based on different data sources; there is integration in Power BI with data sources that are available not only in Azure but on other third-party platforms as well. You can go ahead and download the Power BI Desktop tool. This is a freely available tool that you can download onto your local system. I'm on a Windows 10 system, and I've already gone ahead and downloaded and installed the Power BI Desktop tool. So I'm just starting the Power BI Desktop tool. Now, the first thing I'll do is get my data source. Actually, let's close all of these screens first. I'll click on Get data and hit More. Here, I can choose Azure, and you have a lot of options in place. I'll choose Azure Data Lake Storage Gen2 and hit Connect. Now, here I need to give a URL, so I'll go back onto my Data Lake Gen 2 storage account. I'll go on to Endpoints, scroll down, and take the endpoint which is available for Data Lake Storage. I'll copy this and place it over here. Now, my file is in the data container, in the raw folder, and it's my Log.csv file. I'll hit OK. I can now authenticate with my account key, which I can enter here. So I'll scroll to the top, go on to Access keys, show the keys, and take either key one or key two. I'll take key one, place it over here, and hit Connect, so I can get my Log.csv file. I'll hit Transform Data, and here I get the Power Query Editor.
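Note that the endpoint Power BI asks for here uses the Data Lake Storage ("dfs") host rather than the "blob" host we saw earlier. A quick sketch of the pattern (the account, container, and path values are placeholders):

```python
# Sketch: the Data Lake Storage Gen2 endpoint uses the "dfs" host:
# https://<storage-account>.dfs.core.windows.net/<container>/<path>
# Account/container/path names below are hypothetical placeholders.

def dfs_url(account: str, container: str = "", path: str = "") -> str:
    """Construct a Data Lake Storage Gen2 (dfs) endpoint URL."""
    url = f"https://{account}.dfs.core.windows.net"
    if container:
        url += f"/{container}"
    if path:
        url += f"/{path}"
    return url

print(dfs_url("datalake9999"))                          # bare endpoint, as shown in the portal
print(dfs_url("datalake9999", "data", "raw/Log.csv"))   # full path to the uploaded file
```

So when Power BI asks for the URL, you paste the account's dfs endpoint and then navigate down to the container and folder that hold the file.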
Now I'll click on this Binary link for the content, and then I should see all of the information in my Log.csv file. So I can see all of the information being displayed over here. I can just right-click and rename this particular query "log data". Then I can click Close & Apply, and my data will be loaded into Power BI. I can go ahead and close this. So now it has loaded all of the rows, and here we should see all of our columns. So, for example, if you want to have a clustered column chart, you can just click on it and expand it over here; I can close the Filters pane. Let's say I want to display the count of the IDs based on the operation name: I'll get an entire graph over here. So now, based on the data that we have in Azure Data Lake Gen 2, you can see that you can already start building some basic analysis on it. However, when it comes to raw data, you will typically perform data cleansing and transformation first. These are concepts that we learn a little bit later on in this particular section. Here, I just wanted to show that you can already start visualising the data that you store in your Data Lake Gen 2 storage account.
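The aggregation behind that clustered column chart, a count of IDs per operation name, can be sketched in plain Python over rows shaped like the Log.csv columns (the sample values below are made up):

```python
# Sketch: the "count of IDs per operation name" aggregation that the
# Power BI clustered column chart displays, over made-up sample rows
# with the same column layout as the Log.csv file.
import csv
import io
from collections import Counter

sample_csv = """Id,Operation name,Status
1,Create storage account,Succeeded
2,Delete storage account,Succeeded
3,Create storage account,Succeeded
4,Create SQL database,Succeeded
"""

reader = csv.DictReader(io.StringIO(sample_csv))
counts = Counter(row["Operation name"] for row in reader)
print(counts)
```

Power BI computes the same kind of grouped count for you when you drop the ID field onto the chart's values and the operation name onto its axis.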