Microsoft PL-200 Power Platform Functional Consultant Exam Dumps and Practice Test Questions Set 2 (Q21-40)

Visit here for our full Microsoft PL-200 exam dumps and practice test questions.

Question 21: 

You are creating a Power Automate flow that needs to parse JSON data from an HTTP request. Which action should you use?

A) Compose

B) Parse JSON

C) Select

D) Initialize variable

Answer: B) Parse JSON

Explanation:

Parse JSON is the correct action to use when working with JSON data in Power Automate flows because it allows the flow to interpret and structure incoming JSON content in a predictable and accessible way. When an HTTP request or API response returns JSON data, the information typically arrives as a raw text string. Without parsing, the flow cannot easily recognize the individual fields within that JSON. The Parse JSON action solves this by converting the raw JSON into a structured format and generating tokens that can be referenced directly in later actions. This makes it possible to work with the data using simple dynamic content instead of complicated expressions.

To set up the Parse JSON action, you provide the raw JSON output and define a schema that describes the structure of the data. Power Automate includes a helpful “Generate from sample” feature that automatically builds the schema from example JSON, eliminating the need to write schema manually. Once parsed, each property within the JSON becomes available as a token that can be used in conditions, applied in field mappings, or passed into other system operations. This makes Parse JSON essential when integrating with external APIs, custom connectors, or webhook payloads where specific values must be isolated for further processing.
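For illustration, a hypothetical webhook payload like the one below could be pasted into “Generate from sample”; Power Automate would then build the matching schema, and properties such as orderId or the customer’s email would surface as dynamic content tokens in later actions:

    {
      "orderId": "A-1001",
      "customer": { "name": "Contoso Ltd", "email": "support@contoso.com" },
      "items": [ { "sku": "X-100", "quantity": 2 } ]
    }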

Other actions do not offer this functionality. Compose allows you to create or manipulate data but cannot break JSON into individual fields. Select is designed to transform arrays but not to parse raw JSON into usable tokens. Initialize variable simply creates a variable and does not interpret structured content. Parse JSON is the action specifically designed to expose the internal structure of JSON so flows can work with it seamlessly. Whenever a Power Automate flow needs to extract and use values from JSON data, Parse JSON is the most efficient and reliable approach.

Question 22: 

Your organization uses business process flows with multiple stages. You need to ensure specific fields are required only in certain stages. What should you configure?

A) Field-level security

B) Business rules

C) Data steps

D) Stage categories

Answer: C) Data steps

Explanation:

Data steps are the correct configuration for making fields required in specific stages of a business process flow. Within each stage, data steps are added to reference fields from the underlying table, and each step can be set as required. When a field is marked as required in a data step, users must complete it before they can progress to the next stage. This allows for stage-specific requirements that guide users to enter the correct information at the appropriate point in the process, ensuring data completeness and accuracy throughout the business process flow.

Data steps are visually represented in the business process flow bar at the top of the form, providing users with a clear view of the fields they must complete in each stage. Required data steps block progression, so users cannot move forward until all mandatory fields in the current stage are filled. This approach offers greater flexibility than making fields universally required on the form because different stages often need different information. For example, early stages in a sales process might require only basic contact or qualification details, while later stages may require more detailed financial, contractual, or operational information. Using data steps ensures that users are prompted for the right information at the right time without overloading them with unnecessary mandatory fields.

Alternative configurations do not meet this stage-specific requirement. Field-level security controls who can view or edit a field but does not make it required conditionally. Business rules can enforce required fields but apply across the form rather than being tied to a specific stage. Stage categories help classify stages but do not control field requirements. Therefore, data steps are the ideal mechanism for implementing stage-specific field requirements, providing a structured and user-friendly way to ensure that critical information is collected at the correct points in a business process flow.

Question 23: 

You need to create a Power Automate flow that runs daily to send reminder emails for tasks due within the next seven days. Which trigger should you use?

A) When a row is added, modified or deleted

B) Recurrence

C) When an action is performed

D) When a row is added

Answer: B) Recurrence

Explanation:

Recurrence is the correct trigger to use when creating scheduled flows in Power Automate that need to run at regular intervals. This trigger allows you to configure a flow to execute automatically on a predefined schedule, such as daily, weekly, monthly, or even at custom intervals. For example, to implement a daily task reminder, you would configure the Recurrence trigger to run once per day, then use subsequent actions to query Dataverse for tasks with due dates between the current day and the next seven days. After retrieving the relevant records, the flow can send email reminders to the owners of those tasks, ensuring they receive timely notifications without manual intervention.

The Recurrence trigger provides extensive flexibility in scheduling, including the ability to set specific times of day, choose time zones, and define complex frequency patterns. For instance, the flow could be set to run every day at 8 AM in a particular time zone, guaranteeing that reminders are sent consistently and at the intended time. After the trigger fires, actions like List Rows in Dataverse can filter tasks based on due dates, and the Apply to Each action can iterate through the filtered results to send individual emails to each task owner, making the process fully automated and repeatable.
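As a sketch, the List rows filter for tasks due within the next seven days could look like the following, assuming the standard Task table where the due date is stored in the scheduledend column (adjust the column name and date handling to your schema):

    scheduledend ge @{formatDateTime(utcNow(), 'yyyy-MM-dd')} and scheduledend le @{formatDateTime(addDays(utcNow(), 7), 'yyyy-MM-dd')}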

Alternative triggers do not meet the requirements for scheduled execution. The “When a row is added, modified, or deleted” trigger responds to changes in data rather than running on a schedule. The “When an action is performed” trigger responds only to user-initiated actions in model-driven apps. Similarly, “When a row is added” fires only for new records. None of these alternatives can ensure that the flow executes at a regular, defined interval independent of user actions or data changes. 

Therefore, for automated processes such as daily reminders or scheduled reports, the Recurrence trigger is the ideal and most reliable choice, providing predictable execution and consistent results.

Question 24: 

You are configuring a model-driven app and need to hide a tab on a form based on the value of a field. What should you use?

A) Business rule

B) JavaScript

C) Form properties

D) Security role

Answer: A) Business rule

Explanation:

A business rule is the correct no-code solution for showing or hiding form tabs based on field values in model-driven apps. Business rules allow administrators and makers to create declarative logic that can dynamically control form behavior, including field, section, and tab visibility, without writing any code. To implement tab-level visibility, you would create a business rule with a condition that evaluates a specific field value. Based on that condition, the “Show or hide” action is used to control the visibility of the target tab. This approach is ideal for non-developers because it provides a simple, maintainable, and supported way to manage form behavior.

Business rules execute in real-time as users interact with the form, ensuring that tab visibility updates immediately when the controlling field value changes. You can define multiple conditions and actions within a single business rule to handle different scenarios. For instance, a “Partnership Details” tab could be hidden whenever the account type is not set to “Partner,” while it remains visible for partner accounts. Business rules are scoped to specific tables and can be applied to all forms or selected forms for that table, offering flexibility in implementation while ensuring consistent behavior across forms.

While JavaScript can also be used to achieve dynamic tab visibility, it requires writing custom code and maintaining scripts, which increases complexity and potential for errors. Static form properties can control the default visibility of tabs but cannot respond dynamically to field values. Security roles manage access at the user or role level but do not provide record-specific conditional behavior. Business rules, on the other hand, are fully declarative, easy to configure, and maintainable over time, making them the optimal choice for dynamically showing or hiding tabs based on field values in model-driven apps. This ensures a user-friendly experience while minimizing development overhead.

Question 25: 

Your organization uses Dataverse to track projects and tasks. You need to ensure that when a project status changes to Completed, all related tasks are automatically marked as Completed. What should you implement?

A) Calculated field

B) Rollup field

C) Power Automate flow

D) Business rule

Answer: C) Power Automate flow

Explanation:

A Power Automate flow is the correct solution for implementing cascading updates from parent records to related child records. In scenarios where a project’s status changes to Completed and all related tasks need to be updated accordingly, automation is required to handle the one-to-many relationship efficiently. By creating a flow triggered when a project row is modified, you can first check whether the status changed to Completed using a condition. Then, the “List rows” action can retrieve all tasks related to that project, and an “Apply to each” action can iterate through each task to update its status. This ensures that all associated records are updated automatically without requiring manual intervention.

This approach provides flexibility and reliability in managing data consistency. The flow can include additional conditions to only update tasks that are not already completed or to exclude certain tasks based on business rules. Error handling can be implemented to log failures or notify project managers if some tasks fail to update, ensuring accountability and visibility. Notifications can also be included to confirm successful updates, providing a complete automated process that maintains data integrity across related records.

Alternative solutions are not suitable for this requirement. Calculated fields only compute values within the same record and cannot push updates to related child records. Rollup fields aggregate data from child records to parent records but do not allow changes to flow downward. Business rules operate within a single form and record context and cannot update related records across relationships. Power Automate flows, however, are specifically designed to query, iterate, and update multiple related records in a scalable and controlled manner. For scenarios requiring cascading updates from parent to child records, flows provide the most robust, maintainable, and flexible solution, ensuring that changes in parent records are consistently reflected in all related child records.

Question 26: 

You are creating a canvas app with a complex data entry form. You need to ensure that data is saved to Dataverse even if the user loses internet connectivity. What should you implement?

A) SaveData function

B) Connection checking

C) Offline caching

D) LoadData function

Answer: A) SaveData function

Explanation:

The SaveData function is the correct approach for enabling offline data persistence in canvas apps. SaveData stores collection data locally on the device, allowing the app to save user-entered information even without internet connectivity. You would use SaveData to persist form data to the device’s local storage, then use LoadData when connectivity is restored to retrieve that saved data and submit it to Dataverse using the Patch function.

This pattern involves collecting user input into a collection as they fill out the form, then using SaveData to store that collection locally. The function saves data in the browser’s local storage or on the mobile device, making it available even if the app is closed and reopened. When the user regains connectivity, you can check the saved data, present it for confirmation if needed, and then submit it to Dataverse. This ensures no data loss during connectivity interruptions.

Connection checking detects connectivity status but doesn’t save data. Offline caching is a general concept but SaveData is the specific function that implements it. LoadData retrieves previously saved data but doesn’t save it initially. Together, SaveData and LoadData create offline capabilities, but SaveData specifically addresses the requirement of saving data when offline. For canvas apps requiring data persistence during connectivity loss, the SaveData function provides the necessary local storage capability to ensure user-entered data is preserved until it can be synchronized with Dataverse.

Question 27: 

You need to create a view in a model-driven app that shows only records the current user owns. What type of view should you create?

A) Public view

B) Personal view

C) System view

D) Quick Find view

Answer: A) Public view

Explanation:

A public view is the correct choice for creating a view accessible to all users that filters to show only records owned by the current user. Public views are created by system customizers and administrators and are available to all users with appropriate permissions. You can configure a public view with a filter condition that uses the “Equals current user” operator on the Owner field, which dynamically shows each user only their own records when they access the view.

This dynamic filtering is powerful because the view definition is created once but behaves differently for each user, automatically showing them only their owned records. Public views appear in the view selector for all users and can be set as the default view for a table. They support all view capabilities including custom columns, sorting, filtering, and grouping. This approach is more efficient than having each user create their own personal view with the same filter.

Personal views are created by individual users for their own use and aren’t shared with others. System views are built-in views provided by Microsoft that typically shouldn’t be heavily customized. Quick Find views define search behavior rather than being selectable list views. For creating a commonly used view that filters dynamically based on the current user’s ownership while being available to all users, a public view with an “Equals current user” filter provides the appropriate solution that combines shareability with user-specific filtering.

Question 28: 

You are designing a Power Apps portal where customers can submit support cases. You need to ensure customers can only view and edit their own cases. What should you configure?

A) Web role

B) Table permissions

C) Page permissions

D) Field permissions

Answer: B) Table permissions

Explanation:

Table permissions are the correct configuration for controlling record-level access in Power Apps portals. Table permissions define which records users can create, read, update, and delete in Dataverse tables accessed through the portal. For customer support cases, you would configure table permissions on the Case table with a scope of “Contact” which ensures users can only access records where they are the related contact. This implements the requirement that customers see only their own cases.

Table permissions include several scope options: Global allows access to all records, Contact limits access to records related to the signed-in user’s contact record, Account extends access to records related to the user’s parent account, Self grants access only to the user’s own contact record, and Parent grants access through a relationship to another table permission. For customer scenarios where each contact should access only their records, the Contact scope is typical. You also specify which actions are allowed, such as read, write, create, and delete.

Web roles define which users can access which portal features but don’t provide record-level security. Page permissions control access to specific portal pages but not data records. Field permissions control access to specific fields within records. For implementing record-level security in Power Apps portals ensuring users only access their own data records, table permissions with appropriate scope configuration provide the necessary data access control mechanism.

Question 29: 

Your organization uses multiple environments for development, testing, and production. You need to move a solution containing custom tables, forms, and flows from development to production. What should you use?

A) Export and import data

B) Copy environment

C) Solution export and import

D) Manual recreation

Answer: C) Solution export and import

Explanation:

Solution export and import is the correct approach for moving customizations between environments in Power Platform. Solutions serve as containers that package customizations, including tables, forms, views, business rules, flows, apps, dashboards, and other components. By exporting a solution from a development environment, you capture all related components and their dependencies in a single package. This exported solution can then be imported into another environment, such as a test, staging, or production environment, ensuring all customizations are transferred consistently and accurately.

During export, you can choose between managed and unmanaged solutions. Managed solutions are recommended for production environments because they cannot be directly modified in the target environment, which preserves the integrity of the customizations and prevents unintended changes. Managed solutions also allow for clean uninstallation if the solution needs to be removed. Unmanaged solutions, on the other hand, are more appropriate for development or sandbox environments where modifications may still be needed. The export process tracks component versions and dependencies, ensuring that components are imported in the correct order and that related functionality continues to work without disruption.

The import process maintains all relationships between components, such as lookup fields, business rules, and flow connections, reducing the risk of missing dependencies. For complex deployments, multiple solutions can be combined into solution packages, enabling staged or modular deployment strategies. This approach aligns with Application Lifecycle Management (ALM) practices by supporting version control, promoting consistency across environments, and enabling testing in non-production environments before deploying to live systems.
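If you script deployments with the Power Platform CLI as part of an ALM pipeline, the export and import can also be automated; a rough sketch follows (the solution name and paths are placeholders, and the flags should be verified against your pac CLI version):

    pac solution export --name ContosoProjects --path ./ContosoProjects_managed.zip --managed
    pac solution import --path ./ContosoProjects_managed.zip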

Alternative approaches are less effective. Exporting and importing data moves only record-level data and does not transfer forms, tables, or other customizations. Copying an environment duplicates the entire environment, including production data, which is useful for backup or sandbox creation but is not suitable for selective deployment. Manual recreation of customizations is error-prone, time-consuming, and difficult to maintain across multiple environments, especially as solutions grow in complexity.

Using solution export and import also supports collaboration among multiple development teams. Developers can work on different solutions in parallel, merge changes through controlled exports, and deploy updates incrementally. Versioning allows teams to track updates and revert to previous versions if issues arise. Furthermore, solution import logs provide detailed information about component updates, conflicts, or failures during deployment, enabling administrators to troubleshoot and resolve issues quickly.

Question 30: 

You are creating a calculated field that needs to display a contact’s full name combining first name and last name with a space between them. Which formula should you use?

A) CONCATENATE(firstname, lastname)

B) firstname + " " + lastname

C) firstname & " " & lastname

D) CONCAT(firstname, " ", lastname)

Answer: C) firstname & " " & lastname

Explanation:

The correct formula syntax for calculated fields in Dataverse uses the ampersand operator for string concatenation. The formula firstname & " " & lastname combines the first name field, a space character, and the last name field into a single text value. The ampersand operator is the standard string concatenation operator in calculated field formulas, allowing you to join multiple text values and literal strings.

This formula approach handles null values appropriately, and you can extend it to include additional elements like middle names or suffixes. For example, you could use firstname & " " & middlename & " " & lastname to create a full name with middle name included. Calculated fields evaluate when records are retrieved or when source field values change, ensuring the full name is always current based on the component name fields.

CONCATENATE is not a function available in Dataverse calculated fields. Using the plus operator for string concatenation is not supported in calculated field formulas; it’s used for numeric addition. CONCAT is not the correct function name in this context. The ampersand operator is the proper and supported method for string concatenation in Dataverse calculated field formulas. For creating text fields that combine multiple field values with literal strings like spaces or punctuation, using the ampersand operator provides the correct syntax.

Question 31: 

You need to configure a security role that allows users to create new account records but prevents them from reading existing accounts created by others. Which privilege configuration should you use?

A) Create: Organization, Read: User

B) Create: User, Read: Organization

C) Create: Organization, Read: None

D) Create: User, Read: None

Answer: A) Create: Organization, Read: User

Explanation:

The correct configuration is Create privilege at Organization level and Read privilege at User level. The Create privilege at Organization level allows users to create new account records without restrictions on which business unit or ownership. The Read privilege at User level ensures users can only view account records they own. This combination enables users to create accounts freely while restricting their view to only accounts where they are the owner.

When a user creates a record, they typically become the owner by default, which means they will be able to read that record due to the User-level read privilege. This configuration is useful in scenarios where you want to encourage record creation but maintain data privacy so users don’t see accounts managed by others. It’s common in sales scenarios where representatives create and manage their own customer accounts without visibility into other representatives’ accounts.

Option B would allow creating only user-owned records but reading all accounts, which is the opposite of the requirement. Option C would allow creating accounts but not reading any accounts, which would prevent users from even seeing the accounts they create. Option D would restrict both creation and reading to user context only. The combination of Organization-level Create with User-level Read correctly implements the requirement of unrestricted account creation with restricted visibility to owned accounts only.

Question 32: 

You are creating a business rule that needs to set a field value based on multiple conditions using OR logic. What should you configure?

A) Multiple condition groups with AND logic

B) Multiple condition groups with OR logic

C) Single condition group with OR logic

D) Nested business rules

Answer: C) Single condition group with OR logic

Explanation:

A single condition group with OR logic is the correct configuration for business rules that need to trigger based on any one of multiple conditions being true. In business rule configuration, you can add multiple conditions to a condition group and specify whether all conditions must be true using AND logic or at least one must be true using OR logic. For OR logic scenarios, you create one condition group, add all relevant conditions, and set the group to use OR logic.

For example, if you want to set a priority field to High when either the revenue exceeds one million OR the number of employees exceeds 500, you would add both conditions to a single group with OR logic. When any condition in the group is met, the business rule’s actions execute. This is straightforward and efficient in the business rule designer, where you can clearly see all conditions that could trigger the action.

Multiple condition groups with AND logic would require all groups to be true, which is not suitable for OR scenarios. Multiple condition groups with OR logic between groups would be overly complex for simple OR conditions. Nested business rules are not a feature in Power Platform. For implementing actions that should occur when any one of several conditions is met, configuring a single condition group with OR logic provides a clear and efficient way to evaluate multiple criteria where meeting any one triggers the business rule actions.

Question 33: 

Your organization needs to display real-time inventory data from an external SQL database in a model-driven app. The data updates frequently and should not be stored in Dataverse. What should you use?

A) Virtual table

B) Import data

C) Power Automate sync

D) Azure SQL connector

Answer: A) Virtual table

Explanation:

Virtual tables are the correct solution for displaying external data in model-driven apps without storing it in Dataverse. Virtual tables, also called virtual entities, create a table definition in Dataverse that retrieves data in real-time from external sources like SQL Server, SharePoint, or custom data providers. When users access records through the virtual table, the system queries the external source and displays current data without creating duplicate copies in Dataverse.

This approach is ideal for frequently changing data like inventory levels because users always see the most current information without needing synchronization processes. Virtual tables appear and behave like regular tables in model-driven apps, supporting forms, views, and most standard table features. You configure a virtual table by creating a data source connection to your SQL database and defining the table structure to map to your database schema. The external data remains in its source system while being accessible through the Dataverse interface.

Importing data creates copies in Dataverse which would become outdated quickly with frequent updates. Power Automate sync could update Dataverse periodically but creates unnecessary data duplication. The Azure SQL connector can surface external data in canvas apps but does not present it as native tables within model-driven apps. For displaying real-time external data in model-driven apps without data duplication, virtual tables provide the appropriate architecture that queries source systems on demand while presenting a native table experience to users.

Question 34: 

You are configuring a Power Apps portal and need to display a list of articles filtered by category selected by the user. What should you use?

A) Entity list

B) Entity form

C) Web template

D) Content snippet

Answer: A) Entity list

Explanation:

Entity list is the correct portal component for displaying filtered lists of records from Dataverse tables. Entity lists provide configurable views of data with built-in support for filtering, sorting, pagination, and search. You would configure an entity list on a portal page to display articles from a table, then enable filter options that allow users to select categories. The entity list can be configured to dynamically filter results based on user selections.

Entity lists support multiple view configurations, allow customization of displayed columns, and can include actions like view details or download. They automatically handle pagination for large datasets and provide search capabilities across displayed fields. For filtering by category, you can configure the entity list with metadata filters that present users with category options, and selecting a category refreshes the list to show only matching articles.

Entity forms are for creating or editing single records, not displaying lists. Web templates define page structure and layout but aren’t specifically for data lists. Content snippets display static content blocks. For displaying filterable, searchable lists of Dataverse records in Power Apps portals with user-selectable filter criteria like categories, entity lists provide the purpose-built component with all necessary features for presenting and filtering data effectively.

Question 35: 

You need to create a Power Automate flow that waits for approval before creating a record in Dataverse. Which action should you use?

A) Condition

B) Start and wait for an approval

C) Delay

D) Do until

Answer: B) Start and wait for an approval

Explanation:

Start and wait for an approval is the correct action for implementing approval workflows in Power Automate. This action sends an approval request to designated approvers and pauses the flow until a response is received. The action integrates with Microsoft Teams, Outlook, and the Power Automate mobile app, allowing approvers to respond from multiple channels. After receiving approval or rejection, the flow continues with subsequent actions based on the outcome.

The approval action provides various options including single approver, multiple approvers with first-to-respond, or multiple approvers requiring everyone to approve. You can customize the approval request with dynamic content, add detailed information for approvers to consider, and configure timeout periods. After the approval step, you would add a condition that checks the outcome, then create the Dataverse record only if the outcome is “Approve”, or take alternative actions if rejected.

The Condition action evaluates expressions but doesn’t send approval requests or wait for human input. Delay simply pauses flow execution for a specified time without interaction. Do until loops until a condition is met but doesn’t facilitate approval workflows. For implementing business approval processes where flows need to pause and wait for authorized approvers to grant or deny permission before proceeding with actions like record creation, the Start and wait for an approval action provides the complete approval workflow functionality with proper notification and response tracking.

Question 36: 

You are configuring a model-driven app form with multiple tabs. You need to ensure a specific tab loads immediately while other tabs load only when users click on them. What should you configure?

A) Tab order

B) Form load performance

C) Tab properties – Delay load

D) Asynchronous loading

Answer: C) Tab properties – Delay load

Explanation:

Tab properties with the Delay load option is the correct configuration for controlling when tabs load their content in model-driven app forms. Each tab has properties where you can enable “Delay Tab Loading” which prevents that tab’s content from loading until the user clicks on the tab. Tabs without this option enabled load immediately when the form opens. This improves initial form load performance by deferring non-critical content.

For optimal form performance, you should identify which information users need immediately and ensure those tabs load by default while enabling delay load for tabs containing supplementary information or rarely accessed data. This is particularly beneficial on forms with many tabs, complex controls, or subgrids that retrieve large amounts of related data. Users experience faster initial form load times while still having access to all information when needed.

Tab order controls the sequence tabs appear but not loading timing. Form load performance is a general concern but delay load is the specific feature that addresses it. Asynchronous loading is not a configurable option for tabs. For optimizing form load performance by controlling when different tabs retrieve and display their content, configuring the Delay load property on individual tabs provides the specific mechanism to improve initial load times while maintaining access to all form information.

Question 37: 

Your organization uses Dynamics 365 Customer Service. You need to create a workflow that automatically assigns cases to service representatives based on their availability and skill sets. What should you use?

A) Assignment rules

B) Routing rules

C) Business process flow

D) Power Automate flow

Answer: B) Routing rules

Explanation:

Routing rules are the correct feature for automatically assigning cases based on criteria in Dynamics 365 Customer Service. Routing rules evaluate incoming cases against configured conditions and automatically assign them to appropriate queues or users based on factors like case type, priority, customer segment, or product. You can create rule items that check for specific skills or attributes and route cases to qualified service representatives.

Routing rules work automatically as cases are created or updated, evaluating them against rule conditions in priority order. When a case matches a rule, it gets routed according to the rule’s action, which can assign to a queue, user, or team. This automation ensures cases reach the right people quickly without manual intervention. You can configure multiple routing rule items to handle different scenarios, creating sophisticated distribution logic based on your organizational structure and service delivery model.

Assignment rules in Dynamics 365 are simpler and typically based on round-robin or load balancing rather than complex skill-based routing. Business process flows guide users through stages but don’t automatically assign records. Power Automate flows could implement assignment logic but routing rules are the built-in feature specifically designed for this purpose in Customer Service. For automatically routing and assigning cases based on availability and skills in Dynamics 365 Customer Service, routing rules provide the purpose-built functionality optimized for service case management.

Question 38: 

You are creating a canvas app that needs to display different content based on whether the user is accessing from a phone or tablet. What function should you use?

A) Device function

B) Screen.Size property

C) Platform function

D) App.Device property

Answer: B) Screen.Size property

Explanation:

The Screen.Size property is the correct approach for detecting device form factor and adapting canvas app layouts. Screen.Size returns a ScreenSize enumeration value (Small, Medium, Large, or ExtraLarge) determined by the app’s size breakpoints, with Small typically corresponding to phones and the larger values to tablets and desktops. You can use this property in conditional statements to show different controls, adjust layouts, or modify behavior based on the device type. For example, you might display a compact single-column layout for phones and a two-column layout for tablets.

Using Screen.Size in formulas like If(Screen.Size = ScreenSize.Small, …) allows you to create responsive designs that adapt to different form factors. You can control the Visible property of containers or controls, adjust Height and Width properties, or switch between entirely different screens based on device type. This ensures optimal user experience across different devices by presenting appropriately sized and arranged content for each form factor.
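As an example, a second-column container could be hidden on phone-sized screens with Power Fx like the following (Screen1, conRightColumn, and galItems are hypothetical names; Small maps to phone widths under the default App.SizeBreakpoints):

    // Visible property of conRightColumn: hide the second column on phones
    !(Screen1.Size = ScreenSize.Small)

    // Width property of galItems: full width on phones, half width on larger screens
    If(Screen1.Size = ScreenSize.Small, Parent.Width, Parent.Width / 2)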

Power Apps has no Device or Platform function, and App.Device is not a valid property. Screen.Size specifically provides the size category information needed for responsive design. For creating canvas apps that adapt their interface and behavior based on whether users access from phones, tablets, or desktop devices, the Screen.Size property provides the correct device detection mechanism to implement responsive design patterns.

Question 39: 

You need to configure a field in Dataverse that automatically displays the current date and time when a record is created and never changes afterward. What type of field should you use?

A) Date and Time field with default value

B) Calculated field with NOW function

C) Rollup field

D) Autonumber field

Answer: A) Date and Time field with default value

Explanation:

A Date and Time field with a default value set to the current date and time is the correct solution for capturing creation timestamp. When you configure a field’s default value to use the “Date and Time” option set to “Current Date and Time”, the field automatically populates with the timestamp when the record is created. Importantly, this value doesn’t update when the record is modified later, permanently recording the original creation time.

This approach provides a simple, reliable way to track when records were created without requiring custom automation. The default value applies automatically whether records are created through forms, API calls, or imports. You can make the field read-only or remove it from forms if you don’t want users to modify it, though by default its value is set at creation and typically wouldn’t change. This is commonly used for audit trails and reporting on record age.

Calculated fields with NOW function would recalculate every time the record is retrieved, showing the current time rather than creation time. Rollup fields aggregate data from related records. Autonumber fields generate sequential numbers, not timestamps. For automatically capturing and permanently storing the creation date and time of records, a Date and Time field with a default value of current date and time provides the appropriate configuration that records the timestamp once at creation without ongoing updates.

Question 40: 

You are configuring a Power Automate flow that needs to create multiple related records in Dataverse as a single transaction. If any record fails to create, all should be rolled back. What should you use?

A) Change set

B) Batch operations

C) Parallel branch

D) Scope with error handling

Answer: A) Change set

Explanation:

Change set is the correct feature for executing multiple Dataverse operations as an atomic transaction in Power Automate. Change sets ensure that either all operations within the set succeed, or if any operation fails, all changes are rolled back, maintaining data integrity. You configure a change set by adding a Change Set scope to your flow, then placing all related Dataverse actions like Add a new row or Update a row inside that scope.

This transactional behavior is critical when creating interdependent records where partial completion would leave data in an inconsistent state. For example, if you’re creating an order header and multiple order line items, you want all line items created or none at all if something fails. Change sets provide this all-or-nothing guarantee. If any operation within the change set fails, none of the changes are committed to Dataverse, preventing orphaned or incomplete data.

Batch operations group multiple operations for efficiency but don’t provide transactional rollback. Parallel branches execute actions simultaneously but independently without transaction control. Scope with error handling can catch errors but doesn’t automatically rollback successful operations that occurred before a failure. For ensuring multiple Dataverse record operations either all succeed or all fail together maintaining data consistency, change sets provide the proper transactional control mechanism with automatic rollback on any failure.
