DP-300 Microsoft Azure Database Topic: Implement compliance controls for sensitive data
December 20, 2022

1. 34. apply a data classification strategy

Now, some of your data is sensitive, but people might still need to be able to see it. Sensitive data scenarios include data privacy, regulatory requirements, and national requirements. We've talked extensively about GDPR and its California counterpart, the CCPA. And there may be security scenarios that involve controlling access. So what you can do is mark a particular column as sensitive. To do that, we go into the database this time.

Since this is about the data itself, we go to the database. Under Security, we've got Data Discovery and Classification, and I'm going to go into the Classification tab. Here we can see that nothing has been classified yet; however, there are 25 columns with classification recommendations. So, for instance, maybe the employee's first name is confidential under GDPR. Whatever you choose, just check them, accept the selected recommendations, and then save. But maybe I want to do a more manual one. That's fine; just go ahead and add a classification. You select the schema, the table name, and the column name, and then you select the information type. There's N/A, and then there's Networking; there's personal data such as Contact Info, Name, National ID, Social Security Number, Health, and Date Of Birth; there are Credentials; and there are financial types such as Credit Card, Banking, and Financial. So I'm going to put this one in as Contact Info. And then we have a sensitivity label. N/A is not sensitive at all.

It's data that is completely fine for anyone to see. "Public" refers to freely available business data or information that's been released to the public. "General" is business data not meant for the public, such as emails, documents, and files that do not contain confidential information. Then there are "Confidential" and "Highly Confidential" (each with a GDPR variant): data that, if leaked, would cause significant harm to your company. Now, just to point out, you can't select N/A for both the information type and the sensitivity label; at least one of them is required. So I'm going to say that this is Contact Info with General sensitivity. Now, the following roles can modify and read a database's data classification: Owner, Contributor, and SQL Security Manager. Additionally, the following can read a database's data classification but not modify it: Reader and User Access Administrator. So I'll add this and save. Once columns have been classified, you can use auditing to drill into security insights, such as access to sensitive data. You will also see it in what's called "intelligent insights." Now, let's manage the classifications elsewhere. Let's take a look at T-SQL.

So we have sys.sensitivity_classifications, and that shows you your classifications. We've got the information types, and we've also got the labels describing what they are. This may need a bit of explaining, but if I go into sys.columns and filter where the object_id is equal to this long number, for instance, then you'll see that the first one is the employee's first name and the last one is the employee's last name. So you can see where the object_id and column_id match up. Suppose I wanted to add a sensitivity classification. Well, I can do that with ADD SENSITIVITY CLASSIFICATION TO, followed by the schema name, table name, and column name separated by dots, and then WITH and, in brackets, the label and information type. The information types we've seen already: Networking, Contact Info, Credentials, Credit Card, Banking, National ID (another name for which is the SSN), Health, Date Of Birth, and so on. And then the rank: NONE, LOW, MEDIUM, HIGH, or CRITICAL.
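As a rough sketch of what that looks like in T-SQL — the table and column here are illustrative, assuming the AdventureWorksLT sample schema:

    -- Mark a column as sensitive: label and information type, plus an optional rank
    ADD SENSITIVITY CLASSIFICATION TO
        SalesLT.Customer.EmailAddress
    WITH (LABEL = 'Confidential', INFORMATION_TYPE = 'Contact Info', RANK = MEDIUM);

    -- Review existing classifications, resolving object and column IDs to names
    SELECT o.name AS table_name, c.name AS column_name,
           sc.label, sc.information_type, sc.rank_desc
    FROM sys.sensitivity_classifications AS sc
    JOIN sys.objects AS o ON o.object_id = sc.major_id
    JOIN sys.columns AS c ON c.object_id = sc.major_id AND c.column_id = sc.minor_id;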

So if I put this on the postal code, as I've already put one on the city, you can see that it ran successfully. And if I go back into the portal, instead of three classified columns, we have four. Now you can see something in the overview: the label distribution in terms of Confidential, General, and Highly Confidential, as well as the information type distribution. And then, if you no longer want it, you can drop the sensitivity classification. So I will drop it again for the postal code, and you'll see we've gone from four back down to three. So this is the data classification strategy. You can do this in SSMS, and you can also do it in the portal. You can see there are various suggestions, and you can always add a new classification with an information type and a sensitivity label.
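For completeness, removing a classification is the mirror image of adding one — a sketch, again assuming the sample SalesLT schema:

    -- Remove the classification from a column (the data itself is untouched)
    DROP SENSITIVITY CLASSIFICATION FROM SalesLT.Address.PostalCode;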

2. 35. configure server and database audits

In this video, we're going to configure server and database audits. Why would you want to do that? Well, you can use auditing to retain a trail of selected database actions, report on database activities using preconfigured reports and a dashboard, and analyse reports for suspicious events, unusual activity, and trends. Please keep in mind that auditing to storage is not supported for premium storage accounts or for storage with hierarchical namespace enabled. We won't encounter hierarchical namespaces in this particular course; that's what's used by things like Data Lake Storage Gen2.

Now, if an Azure database is really busy, Azure will prioritise other actions and might not record some audited events. Audit logs are written as append blobs to blob storage. So let's talk about server audit policies. These apply to all existing and newly created databases on the server. A server audit policy always applies to the database, regardless of any database-level auditing policies you may have. They can sit side by side, so you can have a server audit policy as well as specific database-level auditing policies. Now, Microsoft recommends using only server-level auditing unless you want to audit different event types or event categories for a specific database. The default auditing policy includes the batch completed group, which covers all queries and stored procedures, plus the successful and failed database authentication groups.

So that's successful and failed logins, and an audit record stores up to around 4,000 characters of data. So let's add auditing to our SQL database. To begin, I'm going to go to the server, and under the Security heading on the left, we have Auditing. I'm going to enable Azure SQL auditing. As you can see, it will audit all the queries and stored procedures executed against the database, as well as successful and failed logins. Now, where are you going to store all of your auditing? We could use an existing or new storage account, an existing Azure Monitor Log Analytics workspace, or an existing event hub. So you have a choice of options, and you might want to try one or two of them and see which you would use. Now, in the server area, you can also enable auditing of Microsoft support operations.

If I go to a specific database and go down to Auditing on the left-hand side, then I don't have that option, but I can view the server settings if I wish, and that gets me something similar to what we've got on the server. So we have the choice. Let's start, for instance, with the Azure Monitor Log Analytics workspace. Unfortunately, there isn't a plus sign for creating new ones from here, so we've got to create one separately. I'm going to go into a separate tab and type in Log Analytics, and there we have our workspaces. So I'll create a Log Analytics workspace, choosing a resource group. As you can see at the top, Log Analytics says a workspace is the basic management unit for logs. Let's go down and give it a name. Log Analytics is broken down by region, and I'm going to stay in the vicinity of where I am, UK South. Click on the pricing tier: we have a pay-as-you-go pricing tier, which is actually quite a good one.

You won't incur any charges until you've accumulated a sufficient amount of data. So let's review and create this. After quite a while, it validated, so I'll click Create, and it will begin creating. Now, while it's creating, let's take a look at Event Hub. Event hubs do similar things, but you would need to set up a stream to consume the events and write them to a target. Alternatively, you might just want to have it in storage. Going back to the auditing page, there is a Create New button for a storage account. So I'm going to have some storage here. This is the DP300 storage account, but as you can see, the name can only contain lowercase letters and numbers, so let's try that again. General-purpose storage with standard performance and locally redundant storage is fine for the rest of it. So I'm creating that at the same time; we've got all of these things happening. Now I'm going to acknowledge that the deployment is complete, go back to my DP300 server, go back into Auditing, enable Azure SQL auditing, and click on Log Analytics. There is the Log Analytics workspace that I previously set up. I'm going to click on Save, and you can see it's now saving the auditing settings. We've also now got the storage account set up, so I'm going to add the storage account once the save operation has finished, and I'll save that as well.

So what I'm going to do now is pause this video, record quite a number of further videos, and then come back here, so that we've actually got events that we can audit to see what I did. Well, it's a few days later, and I've got server-level auditing enabled, but database auditing is not enabled. So I'm here in my SQL database, and I'm going to click on "View audit logs." There's nothing in the database audit because I don't have one, but in the server audit you can see things like "batch completed" and "RPC completed" (that's a remote procedure call), so you can look at all of those. What you can also do is go to SSMS, and go to File, Open, Merge Audit Files. I'll click on "Add" and add from Azure Blob Storage, which I'll have to connect to first, so I'll sign in to Azure Storage and select my storage account and my blob container. Notice I've got a vulnerability assessment.

I enable that a few videos from now. You can use exactly the same approach to merge vulnerability assessment files. I select my server, select my database, and select the from and to dates. So I'm going to have multiple files downloading, and then it's going to merge them all together so I can view them in SSMS as well. With my audit files, once they are created, I can just go to the audit and view the audit records. I can also go into Log Analytics if I have it enabled, and the results can be seen there. So you can enable SQL auditing at the server level and also at the database level, and these audits apply at the same time. You can have database-level audit policies alongside server audits, but Microsoft recommends using only server-level auditing unless you want to audit different event types for a specific database. You can get the batch completed group (all queries and stored procedures) and the successful and failed database authentication groups (logins), and you can save the audit to an existing or new storage account, an existing Azure Monitor Log Analytics workspace, or an existing event hub.
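If the audit files are in blob storage, you can also read them directly with T-SQL rather than merging them in SSMS. A minimal sketch, assuming a storage account named dp300storage and the default sqldbauditlogs container (your URL will differ):

    -- Read audit records straight from the append blobs in blob storage
    SELECT event_time, action_id, succeeded, statement,
           server_principal_name, database_name
    FROM sys.fn_get_audit_file(
        'https://dp300storage.blob.core.windows.net/sqldbauditlogs/',
        default, default)
    ORDER BY event_time DESC;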

3. 36. implement data change tracking – Change Tracking

In this video, we're going to have a look at data change tracking. Now, change tracking is separate from change data capture, which we'll have a look at in a few videos' time.

So CT is not the same as CDC. Additionally, while both change tracking and change data capture can be used in an Azure SQL database, the only one that can be used in a SQL managed instance is change data capture. Now, what does change tracking do? Well, let's suppose that you have a table like SalesLT.Address. Now suppose a row gets changed, or a particular column gets changed. That is what change tracking tracks. So, for instance, it could tell you that this row and this column have changed. However, it doesn't track how many times something has changed, nor does it track historic data, so I couldn't go back and say what it was changed from. That makes it more lightweight, requiring less storage, than a feature that would do all that: CDC, change data capture. So what is change tracking used for? Well, it enables applications to determine which rows have changed and then only request those rows.

So that can save a lot of time: when you open up an app, it doesn't have to reload the entire database, just the things that have changed. The change tracking data is stored in an in-memory rowstore and flushed to internal on-disk tables at every checkpoint. In other words, it's kept in memory and then saved every so often. Now, you might want to consider using snapshot isolation with this, so that any changes made while you're retrieving the data are not visible within the transaction; you see a fixed set of data rather than a changing set of data. You do that by altering the database to set ALLOW_SNAPSHOT_ISOLATION ON, and only when that is on can you set the transaction isolation level to SNAPSHOT and then run your transactions (there's a sketch of this after the database-level command below). So it may sound like a good idea; how do you actually implement it? Well, if we go to the database in SSMS, right-click and go to Properties, there is a Change Tracking section on the left-hand side. At the moment, the property indicating whether change tracking is enabled for the database is set to False.

So I'm going to change that to True. You can then set the retention period, so you don't have an endlessly growing list; two days is the default. Change information will be kept for at least this long. The minimum is 1 minute, and there is no maximum. You can also select whether data is automatically cleaned up after that retention period. If it's True, then the tracking data will be removed periodically, so if an app has not retrieved the updated information in time, all of its data would need to be refreshed; that's the trade-off of the retention period. However, if auto cleanup is False, the change tracking information will not be removed and will continue to grow. So I'm going to set that to True.

Now, there is an alternative in T-SQL: ALTER DATABASE with the database name, SET CHANGE_TRACKING = ON, and then, in brackets, CHANGE_RETENTION equals a time period — two days — comma, AUTO_CLEANUP equals ON. That enables change tracking for a particular database, but it has not been enabled on any tables yet, so we will need to do that as well.
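A sketch of those database-level commands, assuming a database called DP300 (substitute your own name); the snapshot isolation step is the optional one discussed earlier:

    -- Enable change tracking at the database level
    ALTER DATABASE [DP300]
    SET CHANGE_TRACKING = ON
        (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

    -- Optional: allow snapshot isolation so readers see a consistent set of data
    ALTER DATABASE [DP300] SET ALLOW_SNAPSHOT_ISOLATION ON;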

For the tables, you can use SSMS: if I go to the Address table, right-click on it, and go to Properties again, there is a Change Tracking section on the left-hand side, and I can set Change Tracking to True for this specific table (remember, we needed it on the database first). We can also set Track Columns Updated to True; this is useful if you use the UPDATE command, as otherwise it won't track which columns were updated. The T-SQL equivalent is ALTER TABLE with the table name, ENABLE CHANGE_TRACKING, WITH TRACK_COLUMNS_UPDATED equals ON or OFF. When you've had enough of change tracking, you can disable it, but you need to disable it on all of the tables before you can disable it on the database. It's very easy to do in SSMS: again, you just right-click on the table (and then the database), go to Change Tracking, and change it from True to False. In T-SQL, it's DISABLE CHANGE_TRACKING as opposed to ENABLE for the table, and then SET CHANGE_TRACKING = OFF for the database. Right, so now that's done, what can you get? Well, first of all, you can see whether your databases are enabled for change tracking. Here we can see database five, and we can always find out what database five is from sys.databases: it's the DP300 database. It is on, with a two-day retention, and auto cleanup is on. We can also see what tables are being tracked, and, once again, we may need some help from sys.objects or sys.tables to interpret the results; a sketch of these queries follows.
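A sketch of the table-level commands and those catalog views, again assuming the SalesLT.Address sample table:

    -- Enable change tracking on a table, recording which columns were updated
    ALTER TABLE SalesLT.Address
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);

    -- Later, to turn it all off: tables first, then the database
    -- ALTER TABLE SalesLT.Address DISABLE CHANGE_TRACKING;
    -- ALTER DATABASE [DP300] SET CHANGE_TRACKING = OFF;

    -- Which databases and tables are being tracked?
    SELECT d.name, ctd.retention_period, ctd.retention_period_units_desc,
           ctd.is_auto_cleanup_on
    FROM sys.change_tracking_databases AS ctd
    JOIN sys.databases AS d ON d.database_id = ctd.database_id;

    SELECT t.name, ctt.is_track_columns_updated_on
    FROM sys.change_tracking_tables AS ctt
    JOIN sys.tables AS t ON t.object_id = ctt.object_id;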

So you can see that object ID 1778, et cetera, is on. Let's find 1778. There it is: we can see that it is the Address table, and Track Columns Updated is on. Yes, that's absolutely fine. Now, how would we use it in real life? You probably won't need this for the exam, but first of all you'll do the initial sync, so you'll record where you've synchronised up to; we can use CHANGE_TRACKING_CURRENT_VERSION() to get the current version. Right, let's make some changes. I'm just going to edit the top 200 rows, and I'm going to change the very top row from 8713 Yosemite to 8712. So the last sync was version zero, and now if we have a look at what the current version is, it's version 1, so we can find out what's happened since version zero. So there we are.

We can see that there is an update operation, and the SYS_CHANGE_COLUMNS column will tell you which particular column has been updated. However, Microsoft says you shouldn't interpret this value directly; instead, it recommends using another function called CHANGE_TRACKING_IS_COLUMN_IN_MASK. And this is when it gets a bit complicated, so I'm not going to go much deeper into that, but hopefully you can see the principle. It is telling us that one particular column has changed, and when we drill down, we will see that it is AddressLine1. Now, I've stated that the data may flush to disk, which is perfectly fine, but it can also auto-clean up, which means that after the specified retention period it will disappear. What CHANGE_TRACKING_MIN_VALID_VERSION() gives you is the earliest tracking number for the retained information we've got — the information that hasn't been automatically cleaned up. So if the last sync is later than or equal to this, then it's fine.

We don't have to refresh the entire table. However, if the last time we synchronised was version 80 and the earliest we can get back to is version 90, then we'll have to refresh the entire table, because we don't know what happened in versions 81, 82, 83, 84, and so on. This is known as change tracking, or data change tracking. You enable it for a database, then you enable it for as many tables as you want and look at the information you can get out. You can get the primary key, but you can't get other columns — so selecting something like CT.AddressLine1 from the change table will not work. You can get an idea of what sort of operation it was (in this case, an update), and with a bit more work you can see which particular columns were changed, but you can't see what the value was changed from or to. If that's sufficient, it is fairly lightweight, certainly compared to change data capture, and it requires less storage. So this is how you can do change tracking: just right-click on a database or a table and go to Change Tracking on the left-hand side. A sketch of the sync pattern just described follows.
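As a sketch of that sync loop in T-SQL, assuming SalesLT.Address (with primary key AddressID) and the last-synced version held in @last_sync:

    DECLARE @last_sync bigint = 0;  -- version recorded at the previous sync

    -- If our last sync is older than the minimum valid version,
    -- the change information has been cleaned up: refresh the whole table
    IF @last_sync < CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('SalesLT.Address'))
        PRINT 'Full refresh needed';

    -- Otherwise, fetch only the rows changed since the last sync
    SELECT ct.AddressID, ct.SYS_CHANGE_OPERATION, ct.SYS_CHANGE_VERSION,
           CHANGE_TRACKING_IS_COLUMN_IN_MASK(
               COLUMNPROPERTY(OBJECT_ID('SalesLT.Address'), 'AddressLine1', 'ColumnId'),
               ct.SYS_CHANGE_COLUMNS) AS AddressLine1_changed
    FROM CHANGETABLE(CHANGES SalesLT.Address, @last_sync) AS ct;

    -- Record the new high-water mark for the next sync
    SELECT CHANGE_TRACKING_CURRENT_VERSION() AS new_last_sync;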

4. 36. implement data change tracking – Change Data Capture

In the previous video, we implemented data change tracking (CT). However, that did not tell us what a specific value was changed from or to. If we want that, then we'll have to use a different technology called Change Data Capture (CDC). However, I cannot do it on the database as it exists, because its tier is a bit too low: we have it currently on the Basic pricing tier, and that won't work. So I could upgrade it, do the change data capture, and then remove the change data capture to go back to Basic. Or what I could do is create a database just for this particular video. So I'm going to create a single database called DP300 Database Two. It's going to be on the same server, and let's see what configuration we need. Well, it can't be on the Basic tier; it could be on the Standard tier, but only if we get it up to a minimum of S3, so S0 won't work. This is, as you can see on the screen, the minimum that you can use for change data capture. Change data capture is available on Azure SQL Database, Azure SQL Managed Instance, and SQL Server on a virtual machine. So I'm just going to leave it at S3.

I'm going to add my current IP address and some sample data. Let's create that, and I'll pause the video until it's finished. Right, that's been done. I'll go to this new resource, which is a Standard S3, and if I go into the server behind the database you can see that there are now two Azure SQL databases. The latest one is Database Two, so make sure you're on Database Two when you follow along. Now, the first thing you need to do before you enable change data capture for a table is, just like change tracking, enable it for the database, which you do with the sys.sp_cdc_enable_db stored procedure. To run this command, you need to be a sysadmin. What happens is that it creates additional objects: if I go into the tables and refresh, you see no change immediately, but if I go into the system tables, we can see the additional objects. Note that you can only use this on user databases, not system databases. So now that it's enabled for the database, I can enable it for a table. Here we have an example of that command, sys.sp_cdc_enable_table.

Some of the arguments are @source_schema, which is what comes before the dot; @source_name, which is the table name; and, in this case, a @role_name. This database role is used to gate access to the change data. Notice I don't actually have a role called "New Role" — it simply does not exist; if I go into Roles, it isn't there. So the procedure is going to create it. Then there's the @captured_column_list: what do you want to monitor? The captured columns must include the primary key, in this case AddressID, and you can separate them with commas. By the way, you can't use encrypted columns. So let's run this and turn it on for this specific table. Now I can see what my configuration is by using sys.sp_cdc_help_change_data_capture. You can see that I have SalesLT.Address, the index, and the role name, and over on the right-hand side we've got the columns we're capturing.
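As a sketch, enabling CDC for the database and then for the table looks like this; the role name and captured column list are illustrative (the transcript uses a role called "New Role"):

    -- Step 1: enable CDC for the database (requires sysadmin)
    EXEC sys.sp_cdc_enable_db;

    -- Step 2: enable CDC for a table; the role is created if it doesn't exist,
    -- and the captured column list must include the primary key
    EXEC sys.sp_cdc_enable_table
        @source_schema        = N'SalesLT',
        @source_name          = N'Address',
        @role_name            = N'NewRole',
        @captured_column_list = N'AddressID, City';

    -- Review the current CDC configuration
    EXEC sys.sp_cdc_help_change_data_capture;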

So what I'm going to do is just change one row: AddressID 9, the city of Buffalo. I'm going to update it so that Buffalo is spelled with one L, not two — a simple UPDATE statement. After that's done, you can see we now have one L in Buffalo. Now I need to find out what the new rows are, so we can see how applications would use them. I'm declaring two variables as binary and getting what's called the LSN, the log sequence number — the earliest one and the latest one — and then finding out all of the changes between them. Let's run that, and you can see that we have a change. It is operation 4, which means the row values after an update; 1 is a delete, 2 is an insert, and 3 is the row values before an update. We see AddressID 9 and the city of Buffalo. Now, if I set it back — this is a second update, as opposed to a rollback — then we'll have a second operation showing where it's going, and it's now Buffalo with two Ls again. Just a word about this function: it is a custom function generated by CDC, and it actually has, right at the end, the name of the capture instance. So it's not one function for all tables; it's one function for one particular table. And there are arguments: the from and to LSNs, and a row filter option — I want 'all' in this case.
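A sketch of that query, assuming the capture instance got the default name SalesLT_Address:

    -- Find the LSN range covering all retained changes
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('SalesLT_Address');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

    -- One generated function per capture instance; __$operation tells you
    -- 1 = delete, 2 = insert, 3 = values before update, 4 = values after update
    SELECT __$operation, AddressID, City
    FROM cdc.fn_cdc_get_all_changes_SalesLT_Address(@from_lsn, @to_lsn, N'all');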

If I need to disable it for a table, the first two arguments are exactly the same as for enabling; the one difference is the last argument, where I say I want to disable all capture instances. I could have one capture instance tracking City while another instance is tracking something else; here I say I want all of them. And then, finally, I can disable it for the database. So, this is how you implement change data capture. It's supported in all the versions of Azure SQL that we're talking about, and it tracks historic data. It needs a minimum of 1 vCore or 100 DTUs/eDTUs. Because of that requirement, it can't be used in the Azure SQL Basic tier or the Standard tiers S0, S1, and S2, and it can't be used in an Azure SQL Database elastic pool with fewer than 1 vCore or 100 eDTUs. You can see on the screen all of the various commands you need: you enable it for the database, then for a table, and then at any point you can ask, "What has changed?"
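And the tear-down, sketched with the same assumed names as before:

    -- Disable CDC on the table: 'all' removes every capture instance for it
    EXEC sys.sp_cdc_disable_table
        @source_schema    = N'SalesLT',
        @source_name      = N'Address',
        @capture_instance = N'all';

    -- Then disable CDC for the database as a whole
    EXEC sys.sp_cdc_disable_db;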

5. 37. perform a vulnerability assessment

In this video, we're going to perform a vulnerability assessment. This is an online database, so it's possible that some things may go wrong in terms of security, which makes vulnerability assessments fairly important. Azure Defender for SQL costs around $15 per server per month. When you consider that some of the databases we're looking at, such as this Standard S3, cost around $180 a month, that's not a huge increase. It is, of course, if you're using the Basic tier, but you should really only be using Basic and S0 for testing purposes.

So, in the SQL database, navigate to Security and then Security Center. We don't have any recommendations to show right now, but we do have a note at the top: Azure Defender for SQL is enabled at the subscription level. I'm going to click on "Configure" next to it. I've already selected my subscription; it's already on. I'm going to select a storage account — these are the storage accounts in the selected subscription and in UK South — and you can turn on periodic recurring scans to get weekly scans. The scan will generally run on the day of the week that vulnerability assessment was enabled and saved: if you enable and save it on a Wednesday, then you will get an email on Wednesdays. You can send scan reports to a particular address, so I'm going to send them to an address like this, and you can also send email notifications to admins and subscription owners. So let's click Save. There it is, saved.

So I'm going to go back into my database now to view the details of the findings. Again, it's the same place, the Security Center, and we can scroll down to the vulnerability assessment findings and click on "View additional findings in Vulnerability Assessment." I'm going to do a one-off scan, so I'll just click "Scan"; as you can see, the last scan time was completely blank. I'll just pause the video until it's finished — it only took a few seconds. You can see that it has made four findings: I have 30 passes, but I've also got four fails, one of which is high risk. I can click on any of these and get more information about it. Here we have "Server-level firewall rules should be tracked and maintained at a strict minimum," and you have additional information about it, as well as how we can remediate it. We also have the option to approve a result as a baseline.

If I were to click on that, I would be saying that this is not actually a failure — this is fine — and it will go into the passed section. I can also say, "No, this isn't part of the baseline at all," if I want to. So I'm setting this as the baseline and going back into my vulnerability assessment. You can see there are pending baseline changes, so I run a new scan, and this particular item goes into the passed section. And here we are: I've got 31 passes. I can also click on "Scan history" to see the history of my vulnerability assessments, and I can export the scan results to Excel, so I can have a copy on my computer and say, "Okay, this is the result as of this particular date," and compare it with previous results. So this is how you can perform a vulnerability assessment. You can set it up to run automatically by going to Azure Defender for SQL at the top and configuring a storage account and periodic recurring scans, and you can also look at the vulnerability assessment findings, do a one-off check by clicking on Scan, view the scan history, and export scan results. So that is performing a vulnerability assessment.
