Microsoft PL-200 Power Platform Functional Consultant Exam Dumps and Practice Test Questions Set6 Q101-120


Question 101: 

You are configuring a business rule that needs to set a field value based on the current date. Can business rules access the current date?

A) Yes, using NOW() function

B) Yes, using CURRENT_DATE

C) No, business rules cannot access current date

D) Yes, through date field default value

Answer: C) No, business rules cannot access current date

Explanation:

Business rules in Dataverse cannot access the current date or time dynamically within their conditions or actions. They are designed to provide a no-code way to implement simple logic on forms and fields, such as setting values, showing or hiding fields, making fields required, or locking fields. However, business rules are limited in scope: they can only evaluate data that exists on the current record or is explicitly available through related fields. They do not have access to system functions, environmental context, or dynamically changing values like the current date and time. This limitation means that any logic that relies on comparing a record’s field to the current date cannot be handled directly with a business rule.

For example, if a business requirement states that a field should be updated when a date field is earlier than today, a business rule alone cannot perform this comparison because it cannot retrieve the current date. Similarly, if you need to calculate the number of days between today and a due date or dynamically set reminders based on today’s date, business rules are insufficient. While field default values in Dataverse allow you to set a field to the current date/time at the moment of record creation, this is a one-time operation and cannot be used for ongoing evaluations or conditional logic, and it does not provide dynamic comparison capabilities within business rules.

For scenarios that require date comparisons or actions based on the current date, Power Automate flows are a more appropriate solution. Power Automate provides access to dynamic expressions like utcNow() or addDays(), allowing flows to evaluate the current date and perform actions accordingly. Flows can be triggered on record creation, modification, or on a scheduled basis, and can conditionally update fields, send notifications, or execute other business logic based on the result of date calculations. Similarly, JavaScript running on the form can access the client’s current date and time, perform calculations, and manipulate fields or sections dynamically, offering a flexible alternative for client-side logic that needs date awareness.

Functions such as NOW() or CURRENT_DATE, which are available in SQL or other programming environments, do not exist in the context of business rules. Business rules also cannot retrieve user environment information beyond the owner or basic record context, further limiting their ability to implement temporal logic. Therefore, when business requirements involve evaluating or acting based on the current date and time, relying solely on business rules is insufficient. Instead, Power Automate flows or JavaScript must be used to implement logic that requires environmental context, current timestamps, or dynamic temporal calculations.
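As a sketch of the flow-based alternative, the expressions below compare a record's date column to the current date in Power Automate. The column name duedate is a hypothetical example, not part of the question:

```
// Condition expression in a flow — true when the record's due date is before today
// ('duedate' is a hypothetical Dataverse date column)
less(triggerOutputs()?['body/duedate'], utcNow())

// A reminder threshold seven days from now
addDays(utcNow(), 7)
```

In a scheduled flow, the same expressions can also drive a filter query so that only overdue rows are retrieved in the first place.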

Question 102: 

You need to create a canvas app that displays data from a custom connector to an external API. The API requires authentication. What should you configure?

A) Custom connector with OAuth2 or API key

B) Direct HTTP requests with headers

C) Data gateway

D) Service principal authentication

Answer: A) Custom connector with OAuth2 or API key

Explanation:

Using a custom connector with OAuth2 or API key authentication is the correct approach for connecting canvas apps to external APIs that require secure authentication. Canvas apps often need to interact with external services to retrieve data, send updates, or trigger processes, but many APIs enforce authentication to ensure that only authorized clients can access their resources. Custom connectors in Power Platform provide a structured way to encapsulate API endpoints, handle authentication, and expose operations in a reusable manner across multiple apps and flows.

When creating a custom connector, you define the authentication type supported by the external API. This could be OAuth2, which involves user authorization and token management, or an API key, which is a static credential included in requests. For OAuth2, the connector handles the full authorization flow, including token acquisition and automatic refresh when tokens expire, removing the need for developers to manually manage these processes. For API keys, the connector securely stores the key and injects it into the request headers or query parameters as required. This centralized authentication management ensures consistency and reduces the risk of exposing sensitive credentials in multiple places in your app.

Beyond authentication, custom connectors define the operations available on the external API, including the HTTP method (GET, POST, PUT, DELETE) and the input and output parameters. This allows app makers to interact with APIs using clearly defined actions, with all the necessary parameters and expected responses available in the app formula environment. By abstracting low-level HTTP request details, custom connectors improve maintainability, as changes to the API or authentication settings can be managed in one place without modifying multiple app formulas or flows.

Alternative approaches, like using direct HTTP requests from the canvas app, require embedding authentication logic in each request, which is less secure and harder to maintain. Each request must manage tokens, refresh logic, or API key inclusion, increasing the potential for errors or security leaks. Data gateways facilitate access to on-premises resources but do not handle external API authentication. Service principals are suitable for app-to-app authentication scenarios but are not designed for direct API calls from a canvas app where user context may be needed.

By using a custom connector, the authentication complexity is encapsulated, token handling is automated, and API operations are reusable and easily integrated into multiple canvas apps or Power Automate flows. This approach ensures that the integration remains secure, maintainable, and aligned with best practices for managing external API connections, while providing a simple interface for app makers to use without delving into the underlying authentication mechanisms.

Question 103: 

You are configuring a model-driven app form and need to display a field from a related record in read-only mode. What should you add?

A) Lookup field

B) Quick view form

C) Subgrid

D) Related records display

Answer: B) Quick view form

Explanation:

Quick view form is the correct component for displaying read-only fields from a related record directly on a form in model-driven apps. Quick view forms allow you to show key information from a related table without requiring users to navigate away from the current record. They are designed to provide contextual details about a lookup record, improving user experience and reducing the need to switch between forms to view related data. This functionality is particularly useful when you want users to quickly reference important fields, such as account contact information or related case details, while working on a record.

To implement a quick view form, you first create the form on the related table, specifying which fields should be displayed. Once the quick view form is created, it can be added to the main form that contains the lookup field. Placement of the quick view form is typically near the corresponding lookup to maintain contextual relevance and improve readability. When a user selects a lookup record, the quick view form automatically populates with data from the related record. If the user changes the lookup to a different record, the quick view form updates dynamically to show information from the newly selected record, ensuring the displayed information is always accurate and current.

For example, consider a contact form with an Account lookup field. Adding a quick view form from the Account table allows fields such as the account’s phone number, address, or annual revenue to appear on the contact form. Users can immediately see relevant account information without opening the account form, which reduces clicks and streamlines workflow. This also ensures users have critical data at their fingertips, enhancing efficiency and supporting better decision-making.

Other components do not provide the same functionality. Lookup fields allow users to select related records but do not display additional fields from those records. Subgrids show lists of related records, which is useful for displaying multiple child records but not for showing individual record details in a read-only format. Generic related record displays or other custom controls may require additional development and do not provide the same out-of-the-box integration, automatic updating, and read-only display functionality.

Quick view forms are also configurable with permissions and security roles, ensuring that users see only the fields they are authorized to view. They support responsive layouts and can be included in multiple forms across the application, making them highly reusable. For scenarios where read-only information from a related record needs to be visible directly on the form to provide context without navigation, quick view forms offer a purpose-built, declarative solution that is efficient, maintainable, and enhances user productivity.

Question 104: 

You need to configure a Power Apps portal entity list that allows users to export displayed data to Excel. What should you enable?

A) Enable entity permissions

B) Enable Excel export in entity list options

C) Add custom button for export

D) This feature is not available in portals

Answer: B) Enable Excel export in entity list options

Explanation:

Enabling Excel export in entity list options is the correct configuration for allowing portal users to export data from entity lists in Power Apps portals. Entity lists are components that display tabular data from Dataverse tables on the portal and can be configured to support a range of interactive features. One of these features is Excel export, which provides users with the ability to download the currently displayed records into an Excel file for offline analysis or reporting. Once the Excel export option is enabled, an export button automatically appears on the entity list, giving users an intuitive way to generate a spreadsheet with the data visible in the list.

When a user exports records, the system respects table-level and row-level permissions, ensuring that only data the user is authorized to view is included in the exported file. This ensures that security and compliance requirements are maintained while providing convenient access to data. The exported Excel file includes only the columns configured in the entity list, reflecting the view and any filters or sorting applied by the user. This guarantees that users receive meaningful and relevant data without extra information that is unnecessary or restricted.

While entity permissions determine which records a user can access, they do not, by themselves, enable the ability to export records. Similarly, creating custom buttons or scripts could allow export functionality, but this would require additional development effort and maintenance. The built-in Excel export feature provides a secure, reliable, and user-friendly solution that requires minimal configuration. For scenarios where portal users need to download data for offline review, analysis, or integration with external tools, enabling Excel export in the entity list options is the most efficient and maintainable approach, delivering data in a structured, accessible format while respecting existing security rules.

Question 105: 

You are creating a calculated field that needs to return an empty string if a text field is null, otherwise return the field value in uppercase. What formula should you use?

A) IF(ISBLANK(field), "", UPPER(field))

B) ISNULL(field, UPPER(field))

C) UPPER(field) OR ""

D) COALESCE(UPPER(field), "")

Answer: A) IF(ISBLANK(field), "", UPPER(field))

Explanation:

The formula IF(ISBLANK(field), "", UPPER(field)) is the correct approach for conditionally applying an uppercase transformation to a field while safely handling null or empty values in Dataverse calculated fields. This formula works in two steps. First, it checks whether the field is blank using the ISBLANK function. If the field is blank, the formula returns an empty string, preventing errors that could occur if a function like UPPER were applied to a null value. If the field is not blank, it applies the UPPER function to convert the text value to uppercase. This ensures that the calculated field produces clean and predictable results regardless of whether the source field contains data.

Handling null values explicitly is important in calculated field formulas because text functions can behave unexpectedly when applied to blank or null inputs. Without the ISBLANK check, the formula might produce errors or unwanted results, such as displaying the word “null” or generating runtime calculation errors. Using IF(ISBLANK()) allows the formula to gracefully skip the transformation when the field is empty while still applying the desired logic when data is present.

Alternative approaches like ISNULL are not available in Dataverse calculated fields, and COALESCE is also not supported in this context. Using logical operators like OR without the proper structure would not provide the intended behavior and could lead to syntax errors. By combining IF with ISBLANK and UPPER, the formula provides a robust, maintainable, and best-practice approach for text transformations that need to account for optional or missing values. This pattern is widely applicable for calculated fields where safe handling of blank inputs is required before performing text operations.

Question 106: 

You need to create a view in a model-driven app that shows accounts sorted by annual revenue in descending order with the highest revenue accounts first. What should you configure?

A) Primary sort on annual revenue, ascending

B) Primary sort on annual revenue, descending

C) Group by annual revenue

D) Filter by annual revenue

Answer: B) Primary sort on annual revenue, descending

Explanation:

Primary sort on the annual revenue field with descending order is the correct configuration for displaying accounts with the highest revenue first. In view configuration for model-driven apps, you can define which field serves as the primary sort and specify the sort direction—ascending for lowest to highest or descending for highest to lowest. By setting annual revenue as the primary sort field and choosing descending order, the view automatically lists accounts starting with the largest revenue values at the top, allowing users to quickly focus on high-value accounts without manual scanning or additional filtering.

You can also configure secondary and tertiary sort fields to handle situations where multiple records have identical values in the primary sort field. For example, after sorting primarily by annual revenue descending, a secondary sort by account name ascending ensures consistent and predictable order among accounts sharing the same revenue. Multi-level sorting helps maintain a logical and user-friendly display of data, making it easier for sales teams and managers to prioritize accounts efficiently.

This configuration applies automatically whenever the view is loaded and does not require additional manual intervention. It also works in combination with filters, which restrict records shown in the view but do not affect the sort order. Grouping changes the visual presentation by categories rather than ordering records by value. Ascending sort would display the lowest revenue accounts first, which does not meet the requirement. Therefore, setting annual revenue as the primary sort field in descending order provides the precise mechanism to ensure records are consistently presented from highest to lowest, enhancing usability and decision-making for users focused on top-performing accounts.

Question 107: 

You are configuring a Power Automate flow that needs to send different email templates based on the case priority. What is the most maintainable approach?

A) Multiple Send email actions with conditions

B) HTML template stored in SharePoint

C) Switch/condition with composed email body

D) Separate flows for each priority

Answer: B) HTML template stored in SharePoint

Explanation:

Storing HTML email templates in SharePoint is the most maintainable approach for dynamic email content. You would create HTML template files for each priority level in SharePoint, then use conditions in your flow to determine which template to retrieve based on case priority. The flow gets the file content, replaces placeholder tokens with actual data, and sends the formatted email. This separates content from flow logic, allowing marketing or communications teams to update email content without modifying flows.

This pattern provides several benefits: templates can be updated by non-technical users, SharePoint supplies version history, templates can be previewed before use, and quality control is easier. You can use simple tokens like {{CaseNumber}} in templates and replace() expressions in the flow to inject actual values. This approach scales well when you have many templates or frequently changing content.

Multiple send email actions work but duplicate email configuration. Composed email bodies in the flow are harder to maintain and require flow edits for content changes. Separate flows per priority creates unnecessary complexity. For implementing dynamic emails with different content based on conditions where email content needs to be maintainable by non-developers, storing HTML templates externally in SharePoint provides the architecture that separates content management from flow logic enabling easier updates and better governance.
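A minimal sketch of the token-replacement step, assuming a "Get file content" action and hypothetical Dataverse columns ticketnumber and customername:

```
// Compose action — inject record values into the HTML template
// retrieved from SharePoint. Action and column names are assumptions.
replace(
  replace(
    outputs('Get_file_content')?['body'],
    '{{CaseNumber}}',
    triggerOutputs()?['body/ticketnumber']
  ),
  '{{CustomerName}}',
  triggerOutputs()?['body/customername']
)
```

The nested replace() calls compose left to right, so each additional token adds one more wrapping call; for many tokens, a loop over a token/value array keeps the expression manageable.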

Question 108: 

You need to configure a security role that prevents users from deleting any records but allows all other operations. Which privilege should you remove?

A) Delete privilege

B) Write privilege

C) Remove privilege

D) Deactivate privilege

Answer: A) Delete privilege

Explanation:

Delete privilege is the correct privilege to remove for preventing users from deleting records while allowing other operations. The Delete privilege specifically controls whether users can permanently delete records from tables. By removing or not granting Delete privilege for specific tables in a security role, you prevent users with that role from deleting records while still allowing them to create, read, and update records if those privileges are granted.

Delete is a separate privilege from other CRUD operations, allowing granular control over this potentially destructive action. Many organizations restrict delete capabilities to administrators or specific roles to prevent accidental or unauthorized data loss. Users without Delete privilege will see delete buttons disabled or receive insufficient privileges errors if they attempt deletion.

Write privilege controls creating and updating records, not deletion. Remove is not a standard privilege name. Deactivate privilege controls status changes but not permanent deletion. For preventing users from permanently deleting records while allowing them to perform other operations like creating, reading, and updating, removing the Delete privilege from their security role provides the specific access control that blocks deletion while preserving other capabilities.

Question 109: 

You are creating a canvas app with a form that submits data to multiple Dataverse tables. What is the best approach?

A) Multiple SubmitForm functions

B) Patch function for each table

C) Collect then batch insert

D) Power Automate flow triggered by app

Answer: B) Patch function for each table

Explanation:

Patch function for each table is the best approach for submitting data to multiple Dataverse tables from a canvas app. The Patch function allows explicit control over data submission to each table, enabling you to create or update records in multiple tables with a single button click. You can sequence the patches to create parent records first, capture their IDs, then create child records with proper relationships.

For example, Set(varParent, Patch(ParentTable, Defaults(ParentTable), {Field1: Value1})); Patch(ChildTable, Defaults(ChildTable), {ParentLookup: varParent}) creates a parent record, captures the record returned by Patch in a variable, then creates a child record linked to the parent. Because Patch returns the record it creates, the parent can be passed directly into the child's lookup column without a separate query. This provides complete control over the submission process with error handling at each step.

Multiple SubmitForm functions are less flexible and harder to coordinate across tables. Collect and batch insert adds unnecessary complexity. Power Automate adds latency and complexity. For submitting data to multiple related tables from canvas apps with proper relationship handling and sequential control, using Patch functions for each table provides the explicit submission pattern that handles multi-table data creation with relationship management and error handling.
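A minimal Power Fx sketch of the sequential pattern, assuming hypothetical Orders and OrderLines tables and text input controls:

```
// Power Fx sketch — table, column, and control names are hypothetical.
// Patch returns the record it creates, so the parent can be captured
// and passed straight into the child's lookup column.
Set(
    varParent,
    Patch(Orders, Defaults(Orders), { Name: txtOrderName.Text })
);
Patch(
    OrderLines,
    Defaults(OrderLines),
    { Order: varParent, Quantity: Value(txtQty.Text) }
);
If(
    IsEmpty(Errors(OrderLines)),
    Notify("Saved", NotificationType.Success),
    Notify("Child record failed to save", NotificationType.Error)
)
```

The Errors function checks the data source for failures from the most recent write, which is how per-step error handling is typically layered onto a Patch sequence.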

Question 110: 

You need to create a business process flow stage that requires users to complete tasks before proceeding. Can business process flows check for completed tasks?

A) Yes, through data steps on tasks

B) Yes, through stage conditions

C) No, requires Power Automate integration

D) Yes, through workflow checks

Answer: C) No, requires Power Automate integration

Explanation:

Business process flows cannot directly check for the existence or completion of related task records as a condition for stage progression. Business process flows work with data steps that reference fields on the current record but cannot evaluate whether related records exist or meet criteria. To implement logic that checks for completed tasks before allowing stage progression, you need Power Automate flow integration.

You would create a flow that monitors the business process flow stage and related tasks, then automatically advances the stage when task completion criteria are met. Alternatively, you could use a rollup field counting completed tasks and reference that field in a data step, though this still requires the rollup field setup. The business process flow itself lacks the ability to query and evaluate related records in its native functionality.

Data steps reference fields on the current record only. Stage conditions are not a native feature for this scenario. Workflows are deprecated. For implementing business logic that requires checking related records like task completion before allowing business process flow progression, integrating with Power Automate flows provides the capability to evaluate related data and programmatically control stage progression based on complex criteria that business process flows cannot natively evaluate.

Question 111: 

You are configuring a model-driven app and need to ensure certain fields are visible only when a record is in a specific state. What should you use?

A) Business rule with visibility conditions

B) Field-level security

C) Form properties

D) JavaScript with state checking

Answer: A) Business rule with visibility conditions

Explanation:

Business rule with visibility conditions is the correct no-code approach for controlling field visibility based on record state. Business rules support Show or Hide actions that can make fields, sections, or tabs visible or invisible based on conditions including record state. You would create a business rule with a condition checking the state field value, then use visibility actions to show fields when the state matches specific values and hide them otherwise.

This approach updates dynamically as the state changes, immediately showing or hiding fields as users modify the state field. Business rules execute in real-time on forms, providing responsive UI behavior without page reloads. This is more maintainable than JavaScript and accessible to non-developers who need to configure visibility logic based on business requirements.

Field-level security controls access for all users based on roles, not conditional visibility based on record data. Form properties provide static configuration. JavaScript could achieve this but business rules are the declarative approach. For implementing field visibility that responds to record state values in model-driven apps without custom code, business rules with conditions and visibility actions provide the configuration-based approach that makes fields conditionally visible based on record data values.

Question 112: 

You need to create a Power Automate flow that processes records in batches of 100 to avoid timeouts. What pattern should you implement?

A) Pagination with do until loop

B) Apply to each with chunking

C) Multiple flows with filters

D) Scheduled flow runs

Answer: A) Pagination with do until loop

Explanation:

Pagination with do until loop is the correct pattern for processing large datasets in controlled batches. You would use a do until loop that continues while more records exist, with each iteration retrieving a batch of records using pagination parameters like skip and top in queries. After processing each batch, the loop retrieves the next batch until all records are processed. This prevents timeouts and manages memory efficiently.

The pattern involves initializing a skip counter, using do until to loop while records remain, querying with pagination parameters to get the current batch, processing those records, then incrementing the skip counter for the next iteration. This controlled batching ensures the flow processes all records without overwhelming system resources or exceeding timeout limits, making it suitable for bulk operations on thousands of records.

Apply to each doesn’t provide explicit batch control. Multiple flows would be complex to coordinate. Scheduled flows don’t address batching within a single run. For processing large datasets in Power Automate without timeouts or resource exhaustion, implementing pagination with do until loops provides the batch processing pattern that systematically works through large record sets in manageable chunks while maintaining processing state across iterations.
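The loop described above can be sketched as flow steps with expression snippets. The action layout, the numeric skip variable, and the batch size of 100 are illustrative assumptions:

```
Initialize variable:   skipCount = 0
Initialize variable:   keepGoing = true
Do until:              keepGoing is equal to false
    List rows:         top = 100, skip = skipCount
    Apply to each:     process the current batch of rows
    Condition:         length(body('List_rows')?['value']) is less than 100
                       -> Set variable: keepGoing = false
    Increment variable: skipCount by 100
```

Note that the Dataverse "List rows" action pages with a skip token (the @odata.nextLink value) rather than a numeric skip parameter, so in practice the loop would carry the returned paging link forward instead of incrementing a counter.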

Question 113: 

You are configuring a canvas app that needs to display a countdown until a specific future date. What formula approach should you use?

A) Timer control with DateDiff calculation

B) DateDiff function in label

C) Calculate days and display

D) All of the above could work

Answer: D) All of the above could work

Explanation:

All of the suggested approaches could work for displaying a countdown in canvas apps, each with different characteristics. A Timer control with DateDiff calculation provides real-time updating, refreshing the countdown display at regular intervals. A DateDiff function in a label text property calculates the difference between the target date and current date, displaying the result but updating only when the app recalculates or screen refreshes. Calculating days and displaying is essentially the same as using DateDiff but might involve more manual calculation.

The best choice depends on requirements: if you need second-by-second updates showing time counting down, use a Timer control that triggers calculations repeatedly. For a simple display of days remaining that doesn’t need constant updates, a Label with DateDiff(Today(), TargetDate, TimeUnit.Days) is simpler. The calculation approach would be DateDiff(Now(), DateField, TimeUnit.Days) formatted appropriately for display.

Each approach has tradeoffs in complexity, performance, and update frequency. Timer-based updates consume more resources but provide dynamic countdown displays. Simple label formulas are efficient but static until refresh. For displaying time remaining until future dates in canvas apps, multiple approaches exist with varying levels of dynamism and complexity, allowing selection based on specific user experience requirements and performance considerations.
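The two main variants can be sketched in Power Fx. The names varNow, dtTarget, and the control configuration are hypothetical:

```
// Simple label: days remaining until a target date
"Days remaining: " & DateDiff(Today(), dtTarget, TimeUnit.Days)

// Timer-driven version: a Timer control with Duration = 1000 and
// Repeat = true, whose OnTimerEnd runs: Set(varNow, Now())
// The label's Text property then recalculates every second:
DateDiff(varNow, dtTarget, TimeUnit.Seconds) & " seconds remaining"
```

The label formula alone recalculates only when the app re-evaluates it; binding it to a timer-updated variable is what produces the continuously ticking countdown.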

Question 114: 

You need to configure a Power Apps portal where users can view records but cannot export or print the data. What should you configure?

A) Table permissions with read-only

B) Disable browser print and export via portal configuration

C) Web role with restricted permissions

D) This level of control is not available

Answer: D) This level of control is not available

Explanation:

This level of control is not available in Power Apps portals through standard configuration. While you can control whether users can read, write, create, or delete records through table permissions, you cannot prevent users from using browser features like print or selecting and copying displayed text. Once data is displayed in a user’s browser, they have access through standard browser functionality regardless of portal configuration.

Entity list export features can be disabled by not enabling the Excel export option, preventing the built-in export button. However, users can still use browser print functionality, take screenshots, or copy displayed data. True prevention of data extraction would require complex client-side restrictions that could still be circumvented. If data security requires preventing any form of data extraction, the data should not be displayed to users at all.

Table permissions control CRUD operations. Web roles control portal access. Portal configuration can disable specific export features but not browser capabilities. For scenarios requiring strong data protection where users should view but never extract data, the technical limitations of web applications mean that once data appears in a browser, complete prevention of data extraction is not feasible through portal configuration alone, requiring alternative security approaches if this level of protection is critical.

Question 115: 

You are creating a calculated field that needs to display the month name from a date field. What formula should you use?

A) TEXT(MONTH(datefield), "MMMM")

B) MONTHNAME(datefield)

C) No function available for this

D) FORMAT(datefield, "MMMM")

Answer: C) No function available for this

Explanation:

No function is available in Dataverse calculated fields to convert month numbers to month names or to extract month names directly from dates. Calculated fields support only a limited set of functions; TEXT with format patterns, MONTHNAME, and FORMAT with date formatting are not among them. The MONTH function extracts the month number from a date, returning an integer from 1 to 12, but there’s no native way to convert that number to a month name like “January” or “February” within calculated field formulas.

To achieve month name display, you would need alternative approaches such as using Power Automate to set a text field with the month name, using JavaScript on forms to display month names dynamically, or creating an option set mapping month numbers to names and using workflows to set values. The limitations of calculated field functions mean certain common formatting requirements cannot be achieved within calculated fields alone.

TEXT, MONTHNAME, and FORMAT with these patterns are not available in Dataverse calculated fields. For displaying month names from date fields, the limitations of calculated field functions require alternative implementation approaches using automation or client-side scripting, as calculated fields lack the text formatting functions necessary to convert month numbers or dates to textual month names.

Question 116: 

You need to create a Power Automate flow that runs when a field value changes from one specific value to another specific value. What trigger and condition should you use?

A) When a row is modified trigger with condition checking old and new values

B) When specific fields change trigger

C) Use trigger conditions with from/to values

D) Requires custom connector

Answer: C) Use trigger conditions with from/to values

Explanation:

Using trigger conditions with from/to value checks is the most efficient approach for detecting specific value transitions. While the “When a row is added, modified or deleted” trigger with filtering inside the flow is possible, trigger conditions let the flow decide whether to run before consuming a flow run, making them more efficient. Trigger conditions can access both previous and current values of fields using trigger outputs, enabling detection of specific transitions like status changing from “Draft” to “Approved.”

The trigger condition would evaluate expressions comparing the old and new values of the field, running the flow only when both conditions match: the old value equals the “from” value and the new value equals the “to” value. This prevents the flow from running on irrelevant changes, improving efficiency and reducing flow run consumption.
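As a sketch, the “to” half of such a condition might look like the expression below; the column name and choice value are hypothetical, and how the previous (“from”) value is surfaced depends on what the chosen trigger exposes in its outputs:

```
@equals(triggerOutputs()?['body/statuscode'], 2)
```

Trigger conditions are entered in the trigger’s Settings pane and must all evaluate to true for the run to start; multiple checks can be combined with @and(...).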

A “When a row is modified” trigger with conditions inside the flow works but is less efficient. Filtering the trigger to specific columns exists but still requires condition checking for specific value transitions. Custom connectors aren’t needed. For detecting and responding to specific value transitions in Dataverse fields, configuring trigger conditions that evaluate both previous and current field values provides the efficient pattern that runs flows only for relevant transitions, minimizing unnecessary flow executions.

Question 117: 

You are configuring a business rule and need to set a field to a value from another record. Can business rules access data from other records?

A) Yes, through lookup relationships

B) Yes, through rollup fields

C) No, business rules cannot access other records

D) Yes, through calculated fields

Answer: C) No, business rules cannot access other records

Explanation:

Business rules cannot access data from other records including related records through lookups. Business rules are limited to working with field values that exist directly on the current record. They cannot traverse lookup relationships to read values from related records or query other records to retrieve data. This is a fundamental limitation that requires alternative approaches when logic needs data from related or other records.

To use data from related records in business rule-like logic, you would need calculated fields (which can reference parent-record values but cannot be set by business rules), rollup fields for aggregates from child records, Power Automate flows that can query and use data from any record, or JavaScript that can make additional Web API queries. The business rule itself operates in isolation, with only the current record’s field values available.
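The JavaScript alternative can be sketched as a model-driven form script. Table, column, and function names here are illustrative assumptions, though Xrm.WebApi.retrieveRecord is the documented client API for reading another record:

```javascript
// Form script sketch: copy a value from the record selected in a lookup
// onto the current record. Register on the lookup column's OnChange event.
async function copyParentPhone(executionContext) {
    const formContext = executionContext.getFormContext();
    const lookup = formContext.getAttribute("parentaccountid").getValue();
    if (!lookup || lookup.length === 0) {
        return; // lookup is empty; nothing to copy
    }

    // Lookup values are returned as [{ id, name, entityType }].
    const accountId = lookup[0].id.replace(/[{}]/g, "");

    // Query the related record through the Dataverse Web API.
    const account = await Xrm.WebApi.retrieveRecord(
        "account", accountId, "?$select=telephone1");

    formContext.getAttribute("telephone1").setValue(account.telephone1);
}
```

Unlike a business rule, this script runs in the form’s browser context and can reach beyond the current record, which is exactly the capability business rules lack.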

Lookup relationships provide access to the related record ID but not field values from the related record. Rollup and calculated fields can surface data from other records but business rules can’t set their values dynamically. For implementing logic that requires data from related or other records, business rules’ limitations necessitate alternative approaches like flows or JavaScript that have the capability to access and use data from beyond the current record context.

Question 118: 

You need to create a view that shows opportunities with estimated close dates in the current quarter. What filter should you use?

A) This quarter dynamic filter

B) Date range with calculated dates

C) Greater than quarter start

D) Custom FetchXML with quarter calculation

Answer: A) This quarter dynamic filter

Explanation:

This quarter dynamic filter is the correct option for showing records with dates in the current quarter. Dataverse view filters include relative date operators like “This Quarter” that automatically calculate the current quarter’s date range based on the organization’s fiscal year settings and today’s date. This filter dynamically adjusts as time passes, always showing opportunities in whichever quarter is current without manual updates.

Relative date filters like This Quarter, This Year, Next Month, and many others provide dynamic date filtering that remains current automatically. These filters respect the organization’s fiscal year configuration, so “This Quarter” refers to the current fiscal quarter, which may not align with calendar quarters depending on fiscal year setup. This makes views dynamically current for date-based filtering.

Date ranges with specific dates would require manual updates each quarter. Greater than quarter start alone wouldn’t exclude future quarters. FetchXML could work but isn’t necessary. For creating views that dynamically filter to date ranges like current quarter that automatically adjust as time passes, using built-in relative date operators like This Quarter provides the dynamic date filtering that keeps views automatically current without configuration updates.
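Although hand-written FetchXML isn’t necessary here, the view filter translates into FetchXML behind the scenes. A minimal sketch using a fiscal relative date operator (column names match the standard opportunity table; treat the exact operator choice as an assumption about your fiscal configuration):

```xml
<fetch>
  <entity name="opportunity">
    <attribute name="name" />
    <attribute name="estimatedclosedate" />
    <filter>
      <!-- Relative operator: resolves to the current fiscal period at query time -->
      <condition attribute="estimatedclosedate" operator="this-fiscal-period" />
    </filter>
  </entity>
</fetch>
```

Because the operator is evaluated when the query runs, the view stays current without anyone editing the filter each quarter.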

Question 119: 

You are creating a canvas app that needs to upload files to SharePoint. What function should you use?

A) Patch to SharePoint list with attachment

B) SharePoint connector Add file action

C) Office 365 connector upload

D) Cannot upload files from canvas apps

Answer: A) Patch to SharePoint list with attachment

Explanation:

Patch to SharePoint list with attachment is the correct approach for uploading files from canvas apps. When you connect a canvas app to a SharePoint list, you can use the Patch function to create or update list items and include attachments. The Attachments field in SharePoint lists accepts files from controls like Add Picture or Attachments controls in canvas apps, allowing users to select files and upload them as part of the Patch operation.

The formula would be structured like Patch(SharePointList, Defaults(SharePointList), {Title: TextInput.Text, Attachments: {Value: UploadControl.Media}}) to create a list item with an attachment. This approach integrates file upload into the normal data submission process, making it straightforward to combine file uploads with other field values in a single operation.

SharePoint connector actions are for Power Automate, not canvas app formulas. Office 365 connector is not the right tool. Canvas apps can definitely upload files. For uploading files from canvas apps to SharePoint lists as attachments associated with list items, using the Patch function with attachment field mapping provides the direct file upload capability that integrates naturally with SharePoint list connectivity in canvas apps.
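A minimal Power Fx sketch of the pattern; the list, control, and column names are illustrative, and the exact shape of the Attachments value depends on which control supplies the file:

```
// Create a SharePoint list item and attach the selected file in one Patch.
Patch(
    SharePointList,              // SharePoint list data source
    Defaults(SharePointList),    // new item
    {
        Title: TextInput.Text,
        // Attachments expects a table of files; an Attachments control's
        // Attachments property (Name/Value records) fits this shape.
        Attachments: AttachmentControl.Attachments
    }
)
```

Wiring the upload through Patch keeps the file and the other field values in a single submission, so validation and error handling happen in one place.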

Question 120: 

You need to configure a security role that allows users to assign their own records to other users but prevents them from assigning others’ records. What privilege configuration should you use?

A) Assign: User level

B) Assign: Business Unit level

C) Share: User level

D) Assign: Organization with conditions

Answer: A) Assign: User level

Explanation:

Assign privilege at User level is the correct configuration for allowing users to assign only their own records to others. The Assign privilege controls whether users can change record ownership by assigning records to other users or teams. When set to User level, users can only assign records where they are currently the owner, preventing them from reassigning records owned by others.

This is useful in scenarios where team members should be able to distribute their own workload by assigning their records to colleagues without having authority to reassign others’ work. When a user assigns their record, ownership transfers to the assigned user or team. The User level restriction ensures this capability is limited to personally owned records, maintaining appropriate boundaries around ownership changes.

Business Unit level would allow assigning any record in the business unit, not just owned records. Share privilege controls shared access, not ownership transfer. Organization level with conditions isn’t a standard configuration pattern. For enabling users to reassign their own records to others while preventing them from changing ownership of records they don’t own, configuring the Assign privilege at User level provides the granular control over ownership transfer capabilities based on current record ownership.
