Microsoft Azure Fundamentals AZ-900 Topic: Azure Core Solutions
December 13, 2022
1. Internet of Things (IoT) Solutions

This section covers "describe core solutions and management tools," which is worth 10 to 15% of the exam score. We can see the exam requirements on screen. We just talked about some of the core Azure products: compute, networking, storage, databases, etc. Now we're moving on to more complex tools. These are almost software as a service, where Azure gives you access to more complex services. So we're talking about the Internet of Things. We're talking about big data, Synapse Analytics, which is a data warehouse product, Databricks, and machine learning.

All of these things are a level above the core elements of virtual machines, storage, and networking. A lot of these are platform services or software as a service. And then later in this course, we'll talk about management tools, which include all of the things that you use and need to manage your Azure environment and your applications and services. So, the first things we'll talk about are the Internet of Things services. The products that Azure provides here are IoT Hub, IoT Central, and Azure Sphere. The Internet of Things refers to devices that you may have in your home, on your body, or on your wrist that communicate with the cloud in order to provide you with a real service. Your fridge could be like that, etc. IoT Hub and IoT Central are ways to allow these devices to communicate with the cloud in a high-speed, low-latency way. You can imagine that if you've got a million thermostats deployed around the world, you don't want a really heavy solution that's slow and expensive.

So an IoT hub is something that receives little bits of data from millions of devices and is designed to handle that kind of volume. From there, you can do some processing and store that data somewhere. Azure Sphere is really interesting because it's both a chip and an operating system; it's basically a security platform for these devices. So you can think about these millions of thermostats, fridges, and watches, and about keeping those things secure so that hackers can't break in, install their own code, and turn your webcam into something that spies on you. Security at that scale is a big challenge. As a result, Microsoft created this Azure Sphere product, which assists manufacturers in creating secure devices, as well as an operating system against which they can program.
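To make the device-to-cloud idea concrete, here is a minimal sketch of a device sending telemetry to an IoT hub using the azure-iot-device Python SDK. The connection string, device ID, and payload are placeholders, not values from the course.

```python
# pip install azure-iot-device
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder: a real device connection string comes from the IoT hub's
# device registry in the Azure portal.
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=thermostat-001;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# A small telemetry payload, the kind of "little bits of data" an IoT
# hub is designed to ingest at high volume.
reading = {"deviceId": "thermostat-001", "temperatureC": 21.5}
client.send_message(Message(json.dumps(reading)))

client.shutdown()
```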

2. Big Data Solutions

So the next category of services that we're going to talk about falls under the umbrella of "big data" and data analytics. Within it, there's a type of database called a "data warehouse" that has been around for a long time and is designed to pull in massive quantities of data and make it easy to run reports on that data. You see, prior to the data warehouse, people would run reports on the production database, in this case SQL Server, and those reports would actually interfere with the performance of the day-to-day operations of the system. So somebody would run a big report, and that would slow down the processing of sales and orders. And so somebody came up with the brilliant idea of pulling the data that is needed for reporting out into its own database. This could be done overnight; it was called a roll-up. That database could then be expanded to include data from other sources, and could also include full historical data, because in an operational system you might want to trim data that is no longer relevant to keep it lean and running well. So the data warehouse was born. Now, Microsoft used to have a product called SQL Data Warehouse that was this style of big database used for reporting. Within the last couple of years, they introduced a new product called Synapse Analytics that includes the data warehouse, but the idea has been expanded even more.

There are also some open-source big data solutions, most famously Apache Hadoop, which you may have heard about; it's also available on other cloud platforms. Microsoft offers managed versions of these open-source products, such as Hadoop, Apache Spark, Apache Hive, Apache Kafka, and Apache Storm. That means Microsoft operates these services, they have the technical ability to keep them running, and you're just interacting with them in a software-as-a-service or platform-as-a-service type of way. The umbrella term that refers to all of these managed Apache products is HDInsight.

And another really interesting product, which I'll show you in a moment, is called Azure Databricks. Databricks is also built on an open-source project, and it is basically a data analytics platform, in this case running inside of Microsoft Azure as a managed platform. The idea is that you're going to have all of your data analysts, your database administrators, the business side, the technical side, and the reporting side all working in one workspace to make sense of the data. I'll show you this thing called notebooks (in the Jupyter style) in a second, where you can actually run queries against the data, see the results, and then annotate those results so that they become a report. So it's basically real-time reporting that combines queries, data, and the handwritten notes of the data analyst. Databricks is a modern platform for many areas of your organisation to work together to derive insights from this big data. So those are the three large big data platforms that we're going to talk about with Microsoft Azure. This does veer into what is called the analytics space, which is, again, pulling insights out of the data, and I'm going to show this to you in a second.
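As a taste of what these platforms make easy, here is a hedged PySpark sketch of the kind of reporting query you'd run on a warehouse or Spark cluster instead of the production database. The file path and column names are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

# Hypothetical sales extract; in a real warehouse this would be the
# overnight "roll-up" pulled out of the production database.
sales = spark.read.parquet("/data/sales/2022/*.parquet")

# A typical reporting query: revenue per region, computed without
# touching the live transactional system.
report = (sales.groupBy("region")
               .agg(F.sum("amount").alias("revenue"))
               .orderBy(F.desc("revenue")))
report.show()
```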

3. DEMO: Azure Databricks

So let's demonstrate this Azure Databricks service. I'm going to go into Azure, under the Marketplace, and look for Azure Databricks, and when I say "Create," I'm basically taken through a very simple wizard in order to give it a name, put it into a resource group, and choose the location. I'm not choosing to put it on my own network right now, but I could do that if security was more of a concern. This is just for demonstration purposes, so I'm going to click Review and Create and then click Create. So now we've created the Azure Databricks workspace, but we haven't actually created any servers for it yet. So we go into the Databricks workspace and we say, "Launch workspace." You'll notice that it takes us to a different website to work with Azure Databricks.

Let me zoom in a little bit here. Now, we don't currently have any servers associated with this Databricks workspace, and so we are going to have to create a Databricks cluster. Do keep in mind that anytime you're dealing with servers, you're dealing with cost, and so starting up this set of servers is going to have a certain cost associated with it. We'll be sure to choose small enough servers so that the cost isn't too high, but we do want to have some servers available for this cluster. So I gave it a name and left a lot of the default options. I'm going to make sure it terminates after 45 minutes of inactivity, and again, inside a cloud computing environment, when the servers are not running, you're not charged for those servers. In this case, you'll notice that I can have between two and four workers, and these are DS3 v2 servers that have 14 gigabytes of memory, four cores of CPU, and a fraction of a Databricks unit (DBU), which is a measure of performance. And we can see that we have other options in here, so we can choose bigger servers if we need to. In a test environment, going with the smallest one isn't a bad idea.
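As an aside, the same cluster settings can be submitted through the Databricks REST API instead of the UI. A hedged sketch; the workspace URL, token, and runtime-version string are placeholders you'd fill in from your own workspace.

```python
# pip install requests
import requests

WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

# Mirrors the choices made in the portal: 2-4 autoscaling workers,
# small DS3 v2 nodes, auto-terminate after 45 idle minutes.
payload = {
    "cluster_name": "demo-cluster",
    "spark_version": "<runtime-version>",  # e.g. a current LTS runtime
    "node_type_id": "Standard_DS3_v2",
    "autoscale": {"min_workers": 2, "max_workers": 4},
    "autotermination_minutes": 45,
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
print(resp.json())  # returns the new cluster_id on success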

So now I can click "Create cluster." Now, again, the purpose of this is not so much for you to step through it with me as it is to show you the power of being able to create, in this case, a data analytics system, particularly when we get into the notebooks, which are pretty powerful stuff. I'm now requesting that two to four servers be created, and this can be done in just a few minutes. All right, so after a few minutes, we now have a green dot here, and we can see that the cluster is running. I'm going to head back to the homepage here, and what we want to do now is look at the notebook feature. To me, the cluster is sort of a behind-the-scenes server that we just created. What we want to see is how the notebooks can be used to bring teams together: the people who know the data, the database administrators and the developers, with the people who know the business, the data analysts and the business leaders. So I'm going to create a new notebook called "firstnotebook," and you can see that it will run Python. The languages of the data analyst are often things like R and Scala, but you can choose different languages here. We're going to choose the Python notebook and say "Create." We now have a place where we can run queries against the data and effectively create annotated reports. Now we're going to use what's called an Azure Open Dataset.

I'm going to pull it in here. Azure has what are called Open Datasets; they live in a storage account that's public, and you can run queries against them. Instead of inventing fake data, we'll pull over three and a half million rows from the City of Boston. The first thing we'll do is set up the variables. This little box is where you type your Python commands. Forgive me for pasting this in, but I'm basically going to set up four variables: the Open Dataset account name, the container, where the data lives within that container, and a public security token, because even though it's a public dataset, it's not a publicly open storage account. You hit Shift+Enter, and that runs the code. It reports that the command took 0.6 seconds. So far all I'm doing is setting some variables; it's not actually doing anything. Next up, we're going to run a command that points to the remote data. This basically builds the path based on the variables we set up. So now there's some output here, a remote blob path, and it's the place where the City of Boston data is. I'm not trying to teach you Python here; I'm just trying to show you that you can build a query, and it can be much more than that. Next we make a data frame; it took 5 seconds to load that data. We passed the path into the read, and Parquet is a file format that can be read extremely efficiently.
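Pieced together, the notebook commands described above look roughly like the standard Azure Open Datasets sample for the Boston safety data; treat the exact values as illustrative.

```python
# Run inside a Databricks (PySpark) notebook, where `spark` and
# `display` are provided for you.
blob_account_name = "azureopendatastorage"
blob_container_name = "citydatacontainer"
blob_relative_path = "Safety/Release/city=Boston"
blob_sas_token = r""  # public read access; an empty token works here

# Build the remote blob path and register the (public) credentials.
wasbs_path = "wasbs://%s@%s.blob.core.windows.net/%s" % (
    blob_container_name, blob_account_name, blob_relative_path)
spark.conf.set(
    "fs.azure.sas.%s.%s.blob.core.windows.net" % (
        blob_container_name, blob_account_name),
    blob_sas_token)

# Read the Parquet files into a data frame and show a few rows.
df = spark.read.parquet(wasbs_path)
display(df.limit(10))
```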

So instead of XML or JSON, Parquet is sort of like the efficient format for this kind of structured data. And lastly, we're going to output some of the rows. So there's a traditional query, but it's going to "display" the output from that data frame; you can see that it took 13 seconds to run, and now we have the rows being shown. Now let's say you want to take these results and, instead of displaying them in tabular form, treat them as a graph. We can say we want a bar chart, set up the options here, and build ourselves a bar chart. Or, instead, make this a pie chart, and instead of a sum, we'll make it a count. And so we can see here in the top ten results that around 10 percent of the requests came from the Citizens Connect app, and about 50 percent are based on phone calls. When I click Apply, my data is replaced with a pie chart. Again, you can add commentary to this, so I can note that these four lines are setting up variables, and over here we're defining the path to the data. And so you can see, it's not just queries; it's results; it's commentary. You can turn results into diagrams. It's all saved; you can come back to it. It's all pretty cool stuff, and you can get much deeper into it than I've shown here.

4. AI Solutions

So in this section of the course, we've been talking about all of the core solutions that are available within Azure. We talked before about how there are services such as storage, compute, and networking that you can build your own solutions on top of. On top of those, Microsoft makes available hundreds of its own services that you can build into your own applications. One of the more interesting ones is the artificial intelligence set of services, which generally revolves around the concept of machine learning.

Now, if you've heard that term before, great; if you haven't, that's okay as well. Machine learning is a whole branch of computer science that involves computers making decisions similar to the way that humans would. And obviously, once you teach a computer to do something, it often does it much better. We're seeing this in self-driving cars and in systems all around us, but we even have consumer-facing devices that contain this type of technology. You can communicate with your Nest thermostat, Google Home, Amazon Alexa, or smartwatch; they understand you, and they can act upon your requests. You can order a Tide Pods refill from Amazon and it will be delivered to your home. That technology is not restricted to Google or Amazon; you can use the same technology in your own custom applications. Microsoft created a whole set of services called Cognitive Services, which contain a bunch of APIs that you can call.

There are services centred on vision. This usually involves analysing the contents of a photo. You can essentially send a photo to the vision service, and it can identify whether there are faces in the photo, how old they are, and their gender, or, if it's a celebrity, who they are. If it's a location, it can try to identify the place. It can identify objects such as bicycles, cars, and chairs, effectively return a tag cloud, or even just describe the picture: if the image is of a man swimming in blue water, the computer can analyse everything in the picture and tell you that. And imagine you have images being uploaded to your application; they even have moderation. If you're trying to stop adult or illegal content from making its way in, there are APIs that deal with adult filters and moderation, etc. They also have audio-related APIs, so you can take human speech and turn it into text, or vice versa: take a piece of text and turn it into speech. There are also translation APIs that allow you to switch from one language to another, and they support dozens of languages at this point. That's more than just taking single words and translating them one at a time. Computer-aided translation has really evolved over the last five or ten years, so that it understands the nature of human language and the context of what you're saying. If you're using a word that can have several definitions in a different language, it will know the context in which you are speaking and choose the correct translation. These translator APIs have gotten much, much better, even in the last five years; it's a huge increase in the quality of translations.
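As a hedged sketch of what calling the vision service looks like from Python (the endpoint, key, and image URL are placeholders, not values from the course):

```python
# pip install azure-cognitiveservices-vision-computervision
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: you get these when you create a Computer Vision
# resource in the portal.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Ask the service to describe and tag a picture, as discussed above.
image_url = "https://example.com/man-swimming.jpg"  # hypothetical image
analysis = client.analyze_image(
    image_url,
    visual_features=[VisualFeatureTypes.description, VisualFeatureTypes.tags])

print(analysis.description.captions[0].text)  # e.g. "a man swimming"
print([tag.name for tag in analysis.tags])    # the "tag cloud"
```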

We also have various decision-making APIs that are able to categorise things and make predictions. So, if you've got a set of data and you want to understand which people are more likely to lose weight, let's use losing weight as the thing being predicted. You can feed in the data about the people, their characteristics, their weights, their behaviours, and then predict in advance the types of people who are more likely to achieve their goal. That example might be a terrible one, but think about all of the things that might be helpful for you to understand as a prediction. Maybe you want to know which of your customers are your best customers, or which of your customers are most likely to rate your service highly in the next three months. If you have a list of customers that have used your service in the last three months, you can train a model on them so that it can look at the customers you don't know about and make a prediction as to which ones do or do not need your service. There's lots of potential in that: clustering analysis, the ability to categorise things, et cetera.
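Azure exposes this through managed services, but the underlying train-then-predict idea is easy to sketch with plain Python and scikit-learn (a library not mentioned in the course; the customer data here is entirely invented):

```python
# pip install scikit-learn
from sklearn.linear_model import LogisticRegression

# Invented training data: [visits_per_month, years_as_customer]
# and whether each known customer rated the service highly (1) or not (0).
X_train = [[12, 3], [1, 1], [8, 5], [2, 2], [15, 4], [0, 1]]
y_train = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X_train, y_train)

# Predict for customers we don't know about yet.
X_new = [[10, 2], [1, 4]]
print(model.predict(X_new))        # predicted labels
print(model.predict_proba(X_new))  # confidence for each class
```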

So machine learning has seen huge growth. There's a lot of research being done, and you can act like the big guys and use it within your applications. Another interesting and potentially helpful service is what's called the Bot Service. This takes natural language processing, the ability of a computer to understand human speech and even written text, and responds to it in an intelligent way. You've got an example on your screen of somebody asking, "Do you know if my package has arrived?" and the computer on the other end knowing who they are, looking up their history in the database to see when their packages were sent, and then asking them which package they mean. So there's a context to what's being said, and the computer can respond intelligently. You yourself can add a bot like this onto your website using the Bot Service. It's also good for answering questions about your business: if your hours, your locations, your products, et cetera, are all on your website, a bot can answer those questions as well. So those are the AI services, the umbrella categories of AI. Machine learning is really the application of AI; there are all sorts of cognitive services for it; and finally, there is a bot service with a conversational, chat-style nature. All of that can be used by your applications, which do not even have to be hosted in Azure; they could be hosted on your own website outside of Azure, and those APIs can be called from there.

5. Azure Functions

Now, the next topic to talk about for this exam is the concept of serverless computing. The name is a bit of a misnomer, because there are servers there; they're just hidden from you. A couple of the most famous serverless options within Azure are called Azure Functions and Azure Logic Apps. Azure Functions are small pieces of code that are hosted in Azure, and they basically run off of some trigger. It could be a timer, a data change, or a stored file.

And that trigger causes some code to be run and some action to occur. So instead of creating a whole programme for this, creating a web app or a virtual machine and having to host that, functions are much smaller pieces of code that are hosted directly within Azure. In fact, there is a functions editor within the Azure platform, and you can simply create the function right within the editor, and it will execute on the specified schedule or trigger. It's "serverless" because you're not defining the instance, the number of CPUs, the amount of RAM or storage, any of those things; you're not choosing a hosting plan like you do in a web application. Now, Event Grid is also categorised as serverless because it performs a function for you: it does messaging, communicating with the outside world, passing a message into Azure, and then forwarding that message to a listener who will act on it.
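For instance, a minimal timer-triggered function in Python might look like the sketch below. The portal demo that follows uses C#; this is the equivalent idea in the Python Functions programming model, where the schedule itself lives in the function's function.json binding rather than in the code.

```python
# __init__.py of a timer-triggered Azure Function (Python).
# The schedule (a CRON expression such as "0 */5 * * * *") is defined
# in the accompanying function.json binding, not in this file.
import datetime
import logging

import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    # This body runs on every tick of the timer; no server to manage.
    utc_now = datetime.datetime.utcnow().isoformat()
    logging.info("Timer trigger fired at %s", utc_now)
```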

Event Grid, though, is not programmable in that sense; it's configurable, not programmable. I'm going to actually show you the Azure Functions service to demonstrate its power. So I enter the portal and look for the Function App. I see that I don't have any, so I say, "Create a function app." I do have to create a resource group for this, and I do have to give the app a name. Notice the name of the app has the azurewebsites.net domain attached to it. That means it's a fully qualified domain name, and if this app has external access, that's the URL at which you're going to connect with it. You do choose the runtime and the region, so let's say I want .NET Core for the runtime and a nearby region. As always, it does have to have a storage account, because you're creating files, and those files do get stored somewhere. I also choose the underlying operating system; .NET Core is a cross-platform framework, so you can choose Windows or Linux. And this is where you identify the serverless form: the hosting plan. Now, Azure Functions can also run within a Web App service plan.

And in those forms, it is not serverless, because you're paying a fixed amount per month. But with the consumption plan, it is a serverless form. I'm not going to set up any monitoring, I'll skip tagging, and then I'm just going to hit the Create button. That was done within a couple of minutes. Let's go to the resource. We're going to skip right into the function settings. Right now, we don't have any functions; this is just a shell. So we're going to create our first function, and this is one of the cool features: we can develop this code right within the portal. We don't need Visual Studio; we don't need an external tool to develop it and push it in there. We can write it right within the browser. It does start with some basic triggers. I'm going to do it as an HTTP trigger, so we're going to refer to that URL we created, mynewfunctionsjd.azurewebsites.net, and that'll be the way that we start the function. You'll see that you can also start it based on Event Hubs, Cosmos DB, Blob storage, and Service Bus; there are lots of options for triggering a function to run. It does have to have a name, and I'm going to make this an anonymous function, so it doesn't require any key to run. Let's go into the code for it. We chose a C# (.NET Core) function, and so we'll be presented with the code right here in the browser. A constant reminder: you don't need to know how to do this for the exam.

And I know that as soon as we see code on screen, we risk you thinking this is too much, but you don't need to know how to do this. I'm just showing you that this is code that somebody understands. And if all you need to do is deploy a little bit of code, you don't want to spin up a server and pay hundreds of dollars a month for it to be running; you just want it to be available. This is one way to do it. Now, the default code that they've given me here takes in a name as a parameter and then says "Hello, name" with the string you passed. So I'm going to change this to say "Hello, name. The function works," and I'm going to save it. There is a test interface, so I'm going to go into the Test tab, and since this is an anonymous function, we don't need to pass a key. We're going to pass in the parameter name, set it to Scott, and say Run. And so this is simulating the calling of the function.

And as you can see, it has returned "Hello Scott. The function works," which is the code that we created within the function. So, copy the URL, paste it into a browser window, enter your own name after the equals sign, and press Enter. I can zoom in a little bit: we can see that it says hello with the string that I passed, plus "The function works." And so you can use a function to execute a piece of code in Azure without any servers, a coding environment, GitHub, or a publication step. It's as easy as that, and it's really powerful for small pieces of code that have a very specific purpose. How much do you think this is going to cost? That's the other question. This function that I just created on the serverless model was completely free. There is an enormous free grant, which I believe is one million executions per month. And of course, the duration of your function and the amount of CPU it uses factor into that. But a million executions a month for free is basically one run every few seconds, around the clock. And so this is really cool. That's a serverless function.
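The demo used the C# scaffold, but the same function in the Python programming model is only a handful of lines; a sketch, assuming the standard HTTP-trigger binding:

```python
# __init__.py of an HTTP-triggered Azure Function (Python),
# equivalent to the portal's default "hello, name" sample.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read ?name=... from the query string (e.g. ...?name=Scott).
    name = req.params.get("name")
    if name:
        return func.HttpResponse(f"Hello, {name}. The function works.")
    return func.HttpResponse(
        "Please pass a name on the query string.", status_code=400)
```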

6. Azure Logic Apps

So functions are pretty cool, as we saw. But what are logic apps, then? Logic apps are the other famous example, I would say, of serverless computing within Azure. Logic apps are workflow services. If you have a workflow that has several steps, step one, do this; step two, do that; step three, do that, you can develop that as a logic app, and the logic app takes care of executing the steps. You can also have logic in there, such as "if this, then that," and you can have it trigger things that are outside of Azure. We're going to switch over to the portal again, and I'm going to show you a logic app. We won't set it up completely, but I'll go into the interface and show you how it acts as a connector between all the different Azure services as well as external services.

So we've switched over to the portal. We see the resource group we created, which has the function in it. I'm simply going to add a logic app to the same resource group. So I type "Add," go to search, find the logic app, and say "Create." Hopefully, we know this by now: you give it a name, you choose the region, you put it in the right resource group, etc. We don't need tags at this time, and we can say "Review and create." It only took a couple of seconds, and I can say, "Go to resource." The first time you go into a logic app, you're thrown into this designer view, the Logic App Designer. The second time you go into it, it's back to more of a normal view. You could read up on logic apps here, but I prefer to just dive right in. Logic apps also start with a trigger, based on a schedule or an event that is taking place. But here you'll see some other event sources, like OneDrive, Twitter, and Outlook.com. We can go down here and look at some of the examples. So there's Dropbox as a trigger: when a new file is created in Dropbox, copy it to OneDrive. There's getting daily reminders emailed to you. And this one is Ethereum.

It publishes a smart contract event to a SQL database; when something happens on the Ethereum blockchain, you push that into the SQL database. So there are some really cool external connectors; there are a couple hundred services to connect to, you can create your own, etc. Well, let's do the HTTP trigger, because that's the easiest one to test. It isn't validated until you save, and the HTTP URL isn't created until you save. So let's say the URL is called; what do we want to happen? Well, let's say the first step is to call the function that we just created, so I can actually search for it. Azure Functions is one of the connectors, and my new function app is there, with the test function we just created, the one that returns "Hello, name. The function works." Now it's asking for the body. I'm going to do the parameter, and I'm going to hard-code it so that name equals Scott. I could have put it in the JSON body, and that would also have worked. Now we can say what we want to happen once that function runs and returns successfully. What else do we want to happen? Let's say we want to store that result: whatever string comes back from my function goes into Azure Storage.

So I want to use the "Create blob" action. I can choose the storage account associated with this function, give the connection a name, say "Create," and then we can define the container, the folder, the path to the file, and the contents of the file. So I'm going to call the container "test," the blob name will just be the date and time that it occurred, and the contents of the blob will be the function's return value. Once again, you don't need to know how to do this; this is just an example of one thing happening causing another thing to happen. And we can just keep going: we can also have an email go out; we can tweet; we can send to LinkedIn; and on and on. It's a big list of actions based on an event. So logic apps are connectors between various services inside and outside of Azure, including functions. Again, we've not seen any definition of the server this is running on, the amount of RAM, or the amount of CPU. This is serverless, and you're going to be charged based on the number of executions; it's a consumption-based plan as well. Depending on how frequently this runs, it could be very inexpensive.
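Once saved, the HTTP trigger gets a callback URL, and kicking off the whole workflow is a single web request. A hedged sketch; the URL shown is a placeholder for the signed one the designer generates:

```python
# pip install requests
import requests

# Placeholder for the callback URL the Logic App designer generates
# when you save an HTTP-triggered workflow (it includes a signature).
LOGIC_APP_URL = "https://<region>.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?<signed-query>"

# POSTing to the URL fires the trigger; the workflow then calls the
# function and writes the result to blob storage, as configured above.
resp = requests.post(LOGIC_APP_URL, json={"name": "Scott"})
print(resp.status_code)  # 200/202 indicates the run was accepted
```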

7. Azure DevOps Solutions

Now, in this final video of this section of the course, we'll talk about the concept of DevOps. DevOps is one of those hybrid words; it's an amalgamation of development and operations. Now, I'm not sure about your organization specifically, but many organizations have a whole team of developers called the development team, and then a separate department called the operations team or IT team that manages the servers and the hardware and keeps everything running. So there are people who keep the lights on day to day, and another group of people who work on maintaining the applications and the code and on future development. DevOps is sort of a hybrid role, where even though you are working with the operations team, keeping the lights on, keeping the servers running, and keeping everything from crashing, you also have the development skills to write scripts and do automation.

You basically tie some of these development functions into the operations of your company's infrastructure. This could take the form of what's called continuous integration or continuous deployment, where the development team checks some code into GitHub and that code is automatically deployed to a development or testing server. There is no hands-on operation; no one is waiting on a ticket to deploy the code. It's all automated. So the DevOps function really is involved in automation, but there's a significant amount of programming behind that automation. Microsoft Azure has a DevOps suite of tools that ties a lot of these back-end functions together. There's a Pipelines function that you can use to create logic, just like a logic app, from the time that the code is ready to be deployed, through testing suites, copying of files, pushing to the development environment, and sending out an email, and then, when that gets authorized, sending it to staging, etc. So you can create a pipeline from that. There are also tools for agile development processes, Kanban boards, and other agile methods within the DevOps suite as well. Now, some of you may know that Microsoft purchased GitHub a couple of years ago, and they have pretty much left it alone, except that they're starting to tie GitHub into Azure.

Behind the scenes, there's a new feature called GitHub Actions, which we'll talk about in a second, that lets you push code right from GitHub into Azure. GitHub, if you don't know, is a website that was independently founded, and it was designed for you to store your source code in a centralised location. We've had code repositories and various types of source control since Visual SourceSafe years ago. But the same guy who created Linux, Linus Torvalds, invented this technology called Git, which is used for code repositories. GitHub, a public host for Git repositories, became very popular, and Microsoft purchased it. Now, that doesn't mean your code is public. There are public repositories, which you'll find in some of my other courses; I am quick to recommend going to GitHub to look for sample code. So if you are wanting to become a developer or an administrator, I do recommend you get involved in Git and find some excellent code that you can borrow from and build upon. But there are also private repositories, which means your company can store its code there and no one outside your organisation can see it. It also has some pretty good integrations with the tools you may already be working with, such as Visual Studio. Now, I want to mention GitHub Actions.

These are relatively new. It's an automated software workflow built into GitHub that can detect that some source code has been changed, then compile that code and push it into your development environment, so that as soon as a developer makes a change, there's already a website that incorporates the change. This is called continuous integration. It can also test the code to see if there are bugs: you have a set of test cases that are automatically run, and then you get an automatic notification once you've done a check-in that the code you just checked in contained a couple of bugs, or, more likely, broke a couple of test cases. So again, it's a pretty cool idea. Not everyone does this, but having your web apps automatically update when new code is there, not necessarily in production but in development and testing, means that people can do a little bit of development, check in their code, switch over to a browser window and test the code, then go back, fix it, check that in, and test again. It becomes a quicker cycle of development and testing in a real environment outside of the developer's computer.
