Friday, 24 January 2025

AZ-204 Question and Answer Part 18

Question #271

You develop and deploy an ASP.NET Core application that connects to an Azure Database for MySQL instance.
Connections to the database appear to drop intermittently and the application code does not handle the connection failure.
You need to handle the transient connection errors in code by implementing retries.
What are three possible ways to achieve this goal? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Close the database connection and immediately report an error.
B. Disable connection pooling and configure a second Azure Database for MySQL instance.
C. Wait five seconds before repeating the connection attempt to the database.
D. Set a maximum number of connection attempts to 10 and report an error on subsequent connections.
E. Increase connection repeat attempts exponentially up to 120 seconds.

Correct Answer:
ACD
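The three correct options together describe the standard transient-fault retry pattern: wait a fixed interval before retrying (C), cap the number of attempts (D), and close the connection and report an error once the cap is reached (A). A minimal, language-agnostic sketch of that pattern in Python; the names `connect_with_retry` and `flaky_connect` are illustrative, not part of any Azure SDK:

```python
import time

class TransientConnectionError(Exception):
    """Raised by connect() when the database drops the connection."""

def connect_with_retry(connect, max_attempts=10, wait_seconds=5, sleep=time.sleep):
    """Retry a flaky connect() call.

    Waits a fixed interval between attempts (answer C) and gives up,
    reporting an error, after max_attempts (answers A and D).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return connect()
        except TransientConnectionError:
            if attempt == max_attempts:
                # Final attempt failed: report an error to the caller.
                raise ConnectionError(f"Giving up after {max_attempts} attempts")
            sleep(wait_seconds)  # Wait before the next attempt.

# Example: a connection that fails twice, then succeeds.
attempts = {"n": 0}
def flaky_connect():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise TransientConnectionError()
    return "connection"

# Inject a no-op sleep so the example runs instantly.
result = connect_with_retry(flaky_connect, sleep=lambda s: None)
```

In production code the sleep would be a real delay (Microsoft's guidance for Azure Database for MySQL suggests 5 seconds before the first retry, growing per attempt but not beyond 60 seconds, which is why option E's 120 seconds is wrong).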
Question #272

You are building a B2B web application that uses Azure B2B collaboration for authentication. Paying customers authenticate to Azure B2B using federation.
The application allows users to sign up for trial accounts using any email address.
When a user converts to a paying customer, the data associated with the trial should be kept, but the user must authenticate using federation.
You need to update the user in Azure Active Directory (Azure AD) when they convert to a paying customer.
Which Graph API parameter is used to change authentication from one-time passcodes to federation?
A. resetRedemption
B. Status
C. userFlowType
D. invitedUser

Correct Answer:
A
Setting resetRedemption to true on the invitation resets the guest user's redemption status while keeping the existing user object and its associated data, so the user redeems the invitation again and authenticates through federation.
Question #273

Introductory Info Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background -
Wide World Importers is moving all their datacenters to Azure. The company has developed several applications and services to support supply chain operations and would like to leverage serverless computing where possible.

Current environment -
Windows Server 2016 virtual machine
This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
Ocean Transport - This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
Inland Transport - This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
The VM supports the following REST API calls:
Container API - This API provides container information including weight, contents, and other attributes.
Location API - This API provides location information regarding shipping ports of call and trucking stops.
Shipping REST API - This API provides shipping information for use and display on the shipping website.

Shipping Data -
The application uses a MongoDB JSON document database for all container and transport information.

Shipping Web Site -
The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

Proposed solution -
The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server. The Azure architecture diagram for the proposed solution is shown below:


Requirements -

Shipping Logic app -
The Shipping Logic app must meet the following requirements:
Support the ocean transport and inland transport workflows by using a Logic App.
Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

Shipping Function app -
Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

REST APIs -
The REST APIs that support the solution must meet the following requirements:
Secure resources to the corporate VNet.
Allow deployment to a testing location within Azure while not incurring additional costs.
Automatically scale to double capacity during peak shipping times while not causing application downtime.
Minimize costs when selecting an Azure payment model.

Shipping data -
Data migration from on-premises to Azure must minimize costs and downtime.

Shipping website -
Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

Issues -

Windows Server 2016 VM -
The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

Shipping website and REST APIs -
The following error message displays while you are testing the website:
Failed to load http://test-shippingapi.wideworldimporters.com/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://test.wideworldimporters.com/' is therefore not allowed access.

Question HOTSPOT -
You need to configure Azure CDN for the Shipping web site.
Which configuration options should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:

Scenario: Shipping website -
Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

Tier: Standard -

Profile: Akamai -
Optimization: Dynamic site acceleration
Dynamic site acceleration (DSA) is available for Azure CDN Standard from Akamai, Azure CDN Standard from Verizon, and Azure CDN Premium from Verizon profiles.
DSA includes various techniques that benefit the latency and performance of dynamic content. Techniques include route and network optimization, TCP optimization, and more.
You can use this optimization to accelerate a web app that includes numerous responses that aren't cacheable. Examples are search results, checkout transactions, or real-time data. You can continue to use core Azure CDN caching capabilities for static data.
Reference:
https://docs.microsoft.com/en-us/azure/cdn/cdn-optimization-overview
Question #274

(This question uses the same Wide World Importers case study as Question #273.)

Question HOTSPOT -
You need to correct the VM issues.
Which tools should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:

Box 1: Azure Backup -
Scenario: The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.
In-place restore of disks in IaaS VMs is a feature of Azure Backup.

Box 2: Accelerated networking -
Scenario: The VM shows high network latency, jitter, and high CPU utilization.
Accelerated networking enables single root I/O virtualization (SR-IOV) to a VM, greatly improving its networking performance. This high-performance path bypasses the host from the datapath, reducing latency, jitter, and CPU utilization, for use with the most demanding network workloads on supported VM types.
Reference:
https://azure.microsoft.com/en-us/blog/an-easy-way-to-bring-back-your-azure-vm-with-in-place-restore/
Question #275

Introductory Info Case study -
(The standard case-study instructions apply; see Question #273.)

Background -
You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web API project that is deployed as an Azure Web App.

Overall architecture -
Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

Receipt processing -
Employees may upload receipts in two ways:
Uploading using an Azure Files mounted folder
Uploading using the web application

Data Storage -
Receipt and employee information is stored in an Azure SQL database.

Documentation -
Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for Azure Files upload, and instructions on how to configure the mounted folder.

Solution details -

Users table -

Web Application -
You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

Processing -
Processing is performed by an Azure Function that uses version 2 of the Azure Functions runtime. Once processing is completed, results are stored in Azure Blob Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

Logging -
Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
Application Insights must always contain all log messages.

Requirements -

Receipt processing -
Concurrent processing of a receipt must be prevented.

Disaster recovery -
Regional outage must not impact application availability. All DR operations must not depend on the application running and must ensure that data in the DR region is up to date.

Security -
Users' SecurityPins must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
All certificates and secrets used to secure data must be stored in Azure Key Vault.
You must adhere to the principle of least privilege and provide only the privileges that are essential to perform the intended function.
All access to Azure Storage and Azure SQL Database must use the application's Managed Service Identity (MSI).
Receipt data must always be encrypted at rest.
All data must be protected in transit.
A user's expense account number must be visible only to logged-in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

Issues -

Upload format issue -
Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure file share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal Server Error page.

Capacity issue -
During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

Log capacity issue -
Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

Application code -

Processing.cs -

Database.cs -

ReceiptUploader.cs -

ConfigureSSE.ps1 -

Question DRAG DROP -
You need to add code at line PC32 in Processing.cs to implement the GetCredentials method in the Processing class.
How should you complete the code? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

Box 1: AzureServiceTokenProvider()
Box 2: tp.GetAccessTokenAsync("..")
Acquiring an access token is then quite easy. Example code:

private async Task<string> GetAccessTokenAsync()
{
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}

Reference:
https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity
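Under the hood, a token provider like this caches the token per resource and refreshes it when it expires, so repeated calls don't hit the managed-identity endpoint. A minimal Python sketch of that provider-with-cache idea; `TokenProvider` and `token_source` are illustrative stand-ins, not a real Azure SDK (in a real app the source would be an HTTP call that only works inside Azure):

```python
import time

class TokenProvider:
    """Caches one access token per resource, refreshing on expiry."""

    def __init__(self, token_source, clock=time.time):
        self._source = token_source      # callable: resource -> (token, lifetime_seconds)
        self._clock = clock
        self._cache = {}                 # resource -> (token, expires_at)

    def get_access_token(self, resource):
        cached = self._cache.get(resource)
        if cached and cached[1] > self._clock():
            return cached[0]             # still valid: serve from cache
        token, lifetime = self._source(resource)
        self._cache[resource] = (token, self._clock() + lifetime)
        return token

# Example with a fake token source issuing 60-second tokens.
calls = []
def fake_source(resource):
    calls.append(resource)
    return (f"token-for-{resource}", 60)

tp = TokenProvider(fake_source)
t1 = tp.get_access_token("https://storage.azure.com/")
t2 = tp.get_access_token("https://storage.azure.com/")  # served from cache
```

The second call returns the cached token without invoking the source again, which is the behavior that makes calling the provider on every request cheap.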
Question #276

(This question uses the same Litware Inc. case study as Question #275.)

Question DRAG DROP -
You need to ensure disaster recovery requirements are met.
What code should you add at line PC16?
To answer, drag the appropriate code fragments to the correct locations. Each code fragment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:


Correct Answer:

Scenario: Disaster recovery. Regional outage must not impact application availability. All DR operations must not depend on the application running and must ensure that data in the DR region is up to date.

Box 1: DirectoryTransferContext -
We transfer all files in the directory.
Note: The TransferContext object comes in two forms: SingleTransferContext and DirectoryTransferContext. The former is for transferring a single file and the latter is for transferring a directory of files.

Box 2: ShouldTransferCallbackAsync
The DirectoryTransferContext.ShouldTransferCallbackAsync delegate callback is invoked to decide whether a given transfer should be performed.

Box 3: False -
If you want the copy to use the retry policy and to be resumable if it is interrupted partway through, use a synchronous copy (isServiceCopy = false).
Note that if you choose a service-side copy ('isServiceCopy' set to true), Azure (currently) does not provide an SLA for it. Setting 'isServiceCopy' to false downloads the source blob locally and then uploads it to the destination.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-use-data-movement-library
https://docs.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.datamovement.directorytransfercontext.shouldtransfercallbackasync?view=azure-dotnet
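The Data Movement types above are .NET-specific, but the should-transfer callback is a general pattern: the transfer engine asks a per-file predicate whether to copy each item. A small Python sketch of that idea; nothing here is from the Azure SDK, and `transfer_directory` is a hypothetical helper standing in for the library's directory-transfer loop:

```python
def transfer_directory(files, copy, should_transfer=None):
    """Copy each file for which should_transfer(src) returns True.

    files: iterable of source paths.
    copy: function performing one copy.
    should_transfer: optional per-file predicate, playing the role of
    DirectoryTransferContext.ShouldTransferCallbackAsync in .NET.
    """
    copied = []
    for src in files:
        if should_transfer is None or should_transfer(src):
            copy(src)
            copied.append(src)
    return copied

# Example: skip temporary files during the DR copy.
destination = []
copied = transfer_directory(
    ["receipt1.json", "receipt2.tmp", "receipt3.json"],
    copy=destination.append,
    should_transfer=lambda src: not src.endswith(".tmp"),
)
```

The callback keeps the filtering decision out of the transfer engine itself, which is why the exam answer wires the DR logic through ShouldTransferCallbackAsync rather than pre-filtering the file list.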
Question #277

Introductory Info Case study -
(The standard case-study instructions apply; see Question #273.)

          LabelMaker app -
          Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
          Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
          External partners send data to the LabelMaker application to include artwork and text for custom label designs.

          Requirements. Data -
          You identify the following requirements for data management and manipulation:
          Order data is stored as nonrelational JSON and must be queried using SQL.
          Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

          Requirements. Security -
          You have the following security requirements:
          Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

          External partners must use their own credentials and authenticate with their organization's identity management solution.
          External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
          Storage of e-commerce application settings must be maintained in Azure Key Vault.
          E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
          Conditional access policies must be applied at the application level to protect company content.
          The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

          Requirements. LabelMaker app -
          Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure
          Kubernetes Service (AKS).
          You must use Azure Container Registry to publish images that support the AKS deployment.

          Architecture -


          Issues -
          Calls to the Printer API App fail periodically due to printer communication timeouts.
          Printer communication timeouts occur after 10 seconds. The label printer must receive no more than 5 attempts within one minute.
          The order workflow fails to run upon initial deployment to Azure.
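
          The retry constraint above (10-second timeout per attempt, at most 5 attempts to the printer within one minute) can be sketched as a simple retry policy. This is an illustrative sketch only; `send_to_printer` is a hypothetical stand-in for the real Printer API call, not part of the case study.

```python
import time

MAX_ATTEMPTS = 5
DELAY_SECONDS = 12  # 5 attempts spaced 12 seconds apart stay within one minute

def print_label(send_to_printer, sleep=time.sleep):
    """Retry a printer call up to MAX_ATTEMPTS times on timeout."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            # Each attempt gives the printer at most 10 seconds to respond.
            return send_to_printer(timeout=10)
        except TimeoutError:
            if attempt == MAX_ATTEMPTS:
                raise  # give up after the fifth attempt
            sleep(DELAY_SECONDS)
```

The `sleep` parameter is injected so the policy can be exercised in tests without real delays.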

          Order.json -
          Relevant portions of the app files are shown below. Line numbers are included for reference only.
          This JSON file contains a representation of the data for an order that includes a single item.

          Order.json -
           Question HOTSPOT -
          You need to configure Azure Cosmos DB.
          Which settings should you use? To answer, select the appropriate options in the answer area.
          NOTE: Each correct selection is worth one point.
          Hot Area:


            Correct Answer:

            Box 1: Strong -
            When the consistency level is set to strong, the staleness window is equivalent to zero, and the clients are guaranteed to read the latest committed value of the write operation.
            Scenario: Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.
            Note: You can choose from five well-defined models on the consistency spectrum. From strongest to weakest, the models are: Strong, Bounded staleness, Session, Consistent prefix, Eventual.

            Box 2: SQL -
            Scenario: You identify the following requirements for data management and manipulation:
            Order data is stored as nonrelational JSON and must be queried using Structured Query Language (SQL).
            Question #278
            Introductory Info Case study -
            This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
            To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
            At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

            To start the case study -
            To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

            LabelMaker app -
            Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
            Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
            External partners send data to the LabelMaker application to include artwork and text for custom label designs.

            Requirements. Data -
            You identify the following requirements for data management and manipulation:
            Order data is stored as nonrelational JSON and must be queried using SQL.
            Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

            Requirements. Security -
            You have the following security requirements:
            Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

            External partners must use their own credentials and authenticate with their organization's identity management solution.
            External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
            Storage of e-commerce application settings must be maintained in Azure Key Vault.
            E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
            Conditional access policies must be applied at the application level to protect company content.
            The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

            Requirements. LabelMaker app -
            Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).
            You must use Azure Container Registry to publish images that support the AKS deployment.

            Architecture -


            Issues -
            Calls to the Printer API App fail periodically due to printer communication timeouts.
            Printer communication timeouts occur after 10 seconds. The label printer must receive no more than 5 attempts within one minute.
            The order workflow fails to run upon initial deployment to Azure.

            Order.json -
            Relevant portions of the app files are shown below. Line numbers are included for reference only.
            This JSON file contains a representation of the data for an order that includes a single item.

            Order.json -
             Question HOTSPOT -
            You need to retrieve all order line items from Order.json and sort the data alphabetically by the city.
            How should you complete the code? To answer, select the appropriate options in the answer area.
            NOTE: Each correct selection is worth one point.
            Hot Area:


              Correct Answer:

              Box 1: orders o -
              Scenario: Order data is stored as nonrelational JSON and must be queried using SQL.

              Box 2: li -

              Box 3: o.line_items -

              Box 4: o.city -
              The city field is in the Order document, not in the line items.
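
              The query the boxes reconstruct (FROM orders o JOIN li IN o.line_items ... ORDER BY o.city) can be illustrated in plain Python. Since Order.json is not reproduced here, the field values below are made-up assumptions; only the `line_items` and `city` field names come from the answer.

```python
# Hand-made stand-ins for Order documents (values are assumptions).
orders = [
    {"id": "2", "city": "Seattle", "line_items": [{"sku": "W1"}, {"sku": "W2"}]},
    {"id": "1", "city": "Auckland", "line_items": [{"sku": "W3"}]},
]

# Equivalent of: SELECT li FROM orders o JOIN li IN o.line_items ORDER BY o.city
# Sort the parent orders by city, then flatten each order's line items.
rows = [
    li
    for o in sorted(orders, key=lambda o: o["city"])
    for li in o["line_items"]
]
print([r["sku"] for r in rows])  # → ['W3', 'W1', 'W2']
```

The sort key is `o["city"]` (on the order), not a field of the line item, which is the point made above.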
              Question #279
              Introductory Info Case study -
              This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
              To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
              At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

              To start the case study -
              To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

              Background -
              VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

              Current environment -

              Corporate website -
              The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

              Retail Store Locations -
              The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

              Requirements -
              The application components must meet the following requirements:

              Corporate website -
              Secure the website by using SSL.
              Minimize costs for data storage and hosting.
              Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
              Distribute the website content globally for local use.
              Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
              The website must have 99.95 percent uptime.

              Retail store locations -
              Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
              Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

              Delivery services -
              Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
              Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

              Inventory services -
              The company has contracted a third-party to develop an API for inventory processing that requires read-only access to a specific blob within the retail store storage account for three months.

              Security -
              All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
              Authentication and authorization must use Azure AD and services must use managed identities where possible.

              Issues -

              Retail Store Locations -
              You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
              Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

              Question HOTSPOT -
              You need to implement the Azure Function for delivery driver profile information.
              Which configurations should you use? To answer, select the appropriate options in the answer area.
              NOTE: Each correct selection is worth one point.
              Hot Area:


                Correct Answer:

                Box 1: Azure Identity library -
                Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.
                We recommend that you use a managed identity for applications deployed to Azure.
                The preceding authentication scenarios are supported by the Azure Identity client library and integrated with Key Vault SDKs.
                Note: What is Managed Service Identity?
                Azure Key Vault avoids the need to store keys and secrets in application code or source control. However, in order to retrieve keys and secrets from Azure Key Vault, you need to authorize a user or application with Azure Key Vault, which in turn needs another credential. Managed Service Identity avoids the need to store credentials for Azure Key Vault in application or environment settings by creating a Service Principal for each application or cloud service on which Managed Service Identity is enabled. This Service Principal enables you to call a local MSI endpoint to get an access token from Azure AD using the credentials of the Service Principal. This token is then used to authenticate to an Azure service, for example Azure Key Vault.

                Box 2: Azure Key Vault -
                Azure Key Vault allows you to securely access sensitive information from within your applications: keys, secrets, and certificates are protected without you having to write the code yourself, and you can easily use them from your applications.
                Use Azure Key Vault to store only secrets for your application. Examples of secrets that should be stored in Key Vault include:
                Client application secrets
                Connection strings
                Passwords
                Shared access keys
                SSH keys
                Reference:
                https://docs.microsoft.com/en-us/azure/key-vault/general/developers-guide
                https://integration.team/blog/retrieve-azure-key-vault-secrets-using-azure-functions-and-managed-service-identity
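
                The "local MSI endpoint" mentioned above is the Azure Instance Metadata Service (IMDS) token endpoint. The sketch below only builds the request a managed-identity client would send; no network call is made, since the endpoint is reachable only from inside an Azure compute resource.

```python
from urllib.parse import urlencode

# Documented IMDS managed-identity token endpoint (only reachable on Azure).
IMDS_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_msi_token_request(resource: str):
    """Return the URL and headers for an IMDS access-token request.

    On a real Azure VM or App Service, a GET to this URL with the
    'Metadata: true' header returns JSON containing an access_token
    for the given resource (e.g. Azure Key Vault).
    """
    query = urlencode({"api-version": "2018-02-01", "resource": resource})
    return f"{IMDS_ENDPOINT}?{query}", {"Metadata": "true"}

url, headers = build_msi_token_request("https://vault.azure.net")
print(url)
```

In practice the Azure Identity client library wraps this flow for you; the sketch just makes visible what "call a local MSI endpoint" means.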
                Question #280
                Introductory Info Case study -
                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                To start the case study -
                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                Background -
                VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                Current environment -

                Corporate website -
                The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                Retail Store Locations -
                The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                Requirements -
                The application components must meet the following requirements:

                Corporate website -
                Secure the website by using SSL.
                Minimize costs for data storage and hosting.
                Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                Distribute the website content globally for local use.
                Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                The website must have 99.95 percent uptime.

                Retail store locations -
                Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                Delivery services -
                Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                Inventory services -
                The company has contracted a third-party to develop an API for inventory processing that requires read-only access to a specific blob within the retail store storage account for three months.

                Security -
                All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                Authentication and authorization must use Azure AD and services must use managed identities where possible.

                Issues -

                Retail Store Locations -
                You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

                Question -
                You need to grant access to the retail store location data for the inventory service development effort.
                What should you use?
                1. A
                  Azure AD access token
                2. B
                  Azure RBAC role
                3. C
                  Shared access signature (SAS) token
                4. D
                  Azure AD ID token
                5. E
                  Azure AD refresh token

                Correct Answer:
                C
                A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:
                What resources the client may access.
                What permissions they have to those resources.
                How long the SAS is valid.
                Note: Inventory services:
                The company has contracted a third-party to develop an API for inventory processing that requires read-only access to a specific blob within the retail store storage account for three months.
                Reference:
                https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
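
                The SAS properties listed above (resources, permissions, validity window) work because the token is an HMAC signature over those grant parameters, computed with the account key. The sketch below is a simplified illustration of that idea; the real Azure string-to-sign format has more fields and a fixed layout, so this is not the SDK's implementation.

```python
import base64
import hashlib
import hmac

def make_sas(account_key_b64: str, resource: str, permissions: str, expiry: str) -> str:
    """Illustrative SAS-style token: sign the grant parameters with the key.

    The server holding the same key can recompute the signature, so it can
    verify the grant (what, which permissions, until when) without any
    directory lookup. Simplified format, for illustration only.
    """
    string_to_sign = "\n".join([resource, permissions, expiry])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return f"sp={permissions}&se={expiry}&sig={sig}"

# Read-only ("r") grant on one blob, expiring after the three-month window.
token = make_sas(
    base64.b64encode(b"demo-key").decode(),
    "/containers/retail/blobs/inventory.json",
    "r",
    "2025-04-30T00:00:00Z",
)
print(token)
```

This is why a SAS fits the scenario: the grant is scoped to one blob, read-only, and time-limited, with no Azure AD account needed for the third party.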
                Question #281
                Introductory Info Case study -
                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                To start the case study -
                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                Background -
                VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                Current environment -

                Corporate website -
                The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                Retail Store Locations -
                The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                Requirements -
                The application components must meet the following requirements:

                Corporate website -
                Secure the website by using SSL.
                Minimize costs for data storage and hosting.
                Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                Distribute the website content globally for local use.
                Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                The website must have 99.95 percent uptime.

                Retail store locations -
                Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                Delivery services -
                Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                Inventory services -
                The company has contracted a third-party to develop an API for inventory processing that requires read-only access to a specific blob within the retail store storage account for three months.

                Security -
                All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                Authentication and authorization must use Azure AD and services must use managed identities where possible.

                Issues -

                Retail Store Locations -
                You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

                Question HOTSPOT -
                You need to reliably identify the delivery driver profile information.
                How should you configure the system? To answer, select the appropriate options in the answer area.
                NOTE: Each correct selection is worth one point.
                Hot Area:


                  Correct Answer:

                  Box 1: ID -
                  Scenario: Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.
                  ID token - A JWT that contains claims that you can use to identify users in your application. This token is securely sent in HTTP requests for communication between two components of the same application or service. You can use the claims in an ID token as you see fit. They're commonly used to display account information or to make access control decisions in an application. ID tokens are signed, but they're not encrypted. When your application or API receives an ID token, it must validate the signature to prove that the token is authentic. Your application or API must also validate a few claims in the token to prove that it's valid.
                  Depending on the scenario requirements, the claims validated by an application can vary, but your application must perform some common claim validations in every scenario.

                  Box 2: Oid -
                  Oid - The immutable identifier for the "principal" of the request - the user or service principal whose identity has been verified. In ID tokens and app+user tokens, this is the object ID of the user. In app-only tokens, this is the object ID of the calling service principal. It can also be used to perform authorization checks safely and as a key in database tables. This ID uniquely identifies the principal across applications - two different applications signing in the same user will receive the same value in the oid claim.
                  Incorrect:
                  Aud - Identifies the intended recipient of the token. For Azure AD B2C, the audience is the application ID. Your application should validate this value and reject the token if it doesn't match. Audience is synonymous with resource.
                  Idp - Records the identity provider that authenticated the subject of the token. This value is identical to the value of the Issuer claim unless the user account is not in the same tenant as the issuer - guests, for instance. If the claim isn't present, the value of iss can be used instead. For personal accounts used in an organizational context (for instance, a personal account invited to an Azure AD tenant), the idp claim may be 'live.com' or an STS URI containing the Microsoft account tenant.
                  Reference:
                  https://docs.microsoft.com/en-us/azure/active-directory-b2c/tokens-overview
                  https://docs.microsoft.com/en-us/azure/active-directory/develop/access-tokens
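
                  Where the oid and aud claims live can be shown by decoding a token payload. The token below is hand-made for illustration, not a real Azure AD token, and a real application must validate the signature before trusting any claim; this sketch only decodes.

```python
import base64
import json

def decode_claims(jwt: str) -> dict:
    """Decode the (unverified) payload segment of a JWT.

    A JWT is header.payload.signature, each base64url-encoded. Real code
    must verify the signature first; this only shows where claims sit.
    """
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Hand-crafted example token: "e30" is the base64url of an empty header "{}".
payload = {
    "oid": "00000000-0000-0000-0000-000000000001",  # immutable principal ID
    "aud": "api://driver-profile-app",              # intended recipient
}
fake_jwt = (
    "e30."
    + base64.urlsafe_b64encode(json.dumps(payload).encode()).decode().rstrip("=")
    + ".sig"
)
claims = decode_claims(fake_jwt)
print(claims["oid"])
```

The oid claim is the stable key to use for the driver profile, since it identifies the same principal across applications, unlike display names or emails.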
                  Question #282
                  Introductory Info Case study -
                  This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                  To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                  At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                  To start the case study -
                  To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                  Background -
                  VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                  Current environment -

                  Corporate website -
                  The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                  Retail Store Locations -
                  The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                  Requirements -
                  The application components must meet the following requirements:

                  Corporate website -
                  Secure the website by using SSL.
                  Minimize costs for data storage and hosting.
                  Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                  Distribute the website content globally for local use.
                  Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                  The website must have 99.95 percent uptime.

                  Retail store locations -
                  Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                  Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                  Delivery services -
                  Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                  Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                  Inventory services -
The company has contracted a third party to develop an API for inventory processing. The API requires read-only access to a specific blob within the retail store storage account for three months.

                  Security -
                  All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                  Authentication and authorization must use Azure AD and services must use managed identities where possible.

                  Issues -

                  Retail Store Locations -
You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app scales.

Question
You need to secure the Azure Functions to meet the security requirements.
                  Which two actions should you perform? Each correct answer presents part of the solution.
                  NOTE: Each correct selection is worth one point.
                  1. A
                    Store the RSA-HSM key in Azure Key Vault with soft-delete and purge-protection features enabled.
                  2. B
                    Store the RSA-HSM key in Azure Blob storage with an immutability policy applied to the container.
                  3. C
                    Create a free tier Azure App Configuration instance with a new Azure AD service principal.
                  4. D
                    Create a standard tier Azure App Configuration instance with an assigned Azure AD managed identity.
                  5. E
                    Store the RSA-HSM key in Azure Cosmos DB. Apply the built-in policies for customer-managed keys and allowed locations.

                  Correct Answer:
                  AD
                  Scenario: All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                  Microsoft Azure Key Vault is a cloud-hosted management service that allows users to encrypt keys and small secrets by using keys that are protected by hardware security modules (HSMs).
                  You need to create a managed identity for your application.
                  Reference:
                  https://docs.microsoft.com/en-us/azure/app-service/app-service-key-vault-references
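As an illustrative sketch only, a standard-tier App Configuration store with a system-assigned managed identity and a customer-managed key can be declared in an ARM template fragment along these lines. The resource names and key URI are placeholders, and the property names should be verified against the Microsoft.AppConfiguration ARM template reference:

```json
{
  "type": "Microsoft.AppConfiguration/configurationStores",
  "name": "contoso-config",
  "location": "canadacentral",
  "sku": { "name": "standard" },
  "identity": { "type": "SystemAssigned" },
  "properties": {
    "encryption": {
      "keyVaultProperties": {
        "keyIdentifier": "https://contoso-vault.vault.azure.net/keys/config-cmk"
      }
    }
  }
}
```

The managed identity must also be granted wrap/unwrap permissions on the key in Key Vault before encryption with the customer-managed key takes effect.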
                  Question #283
                  Introductory Info Case study -
                  This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                  To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                  At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                  To start the case study -
                  To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                  Background -

                  Overview -
                  You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                  You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                  * Messages are sent to ContentUploadService.
                  * Content is processed by ContentAnalysisService.
                  * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                  The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                  The solution will use eight CPU cores.

                  Azure Active Directory -
                  Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                  Requirements -

                  ContentAnalysisService -
                  The company's data science group built ContentAnalysisService which accepts user generated content as a string and returns a probable value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                  You must create an Azure Function named CheckUserContent to perform the content checks.

                  Costs -
                  You must minimize costs for all Azure services.

                  Manual review -
To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. In order to review content, a user must be part of the ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                  High availability -
                  All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                  Monitoring -
                  An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                  Security -
                  You have the following security requirements:
                  Any web service accessible over the Internet must be protected from cross site scripting attacks.
                  All websites and services must use SSL from a valid root certificate authority.
                  Azure Storage access keys must only be stored in memory and must be available only to the service.
                  All Internal services must only be accessible from internal Virtual Networks (VNets).
                  All parts of the system must support inbound and outbound traffic restrictions.
                  All service calls must be authenticated by using Azure AD.

                  User agreements -
When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
                  Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                  User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                  Validation testing -
                  When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                  Issues -
                  Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                  Code -

                  ContentUploadService -


                  ApplicationManifest -
                   Question DRAG DROP -
                  You need to add markup at line AM04 to implement the ContentReview role.
                  How should you complete the markup? To answer, drag the appropriate json segments to the correct locations. Each json segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
                  NOTE: Each correct selection is worth one point.
                  Select and Place:


                    Correct Answer:

                    Box 1: allowedMemberTypes -
allowedMemberTypes specifies whether this app role can be assigned to users and groups (by including "User"), to other applications that access this application in daemon or service scenarios (by including "Application"), or to both.
Note: The following example shows appRoles that you can assign to users.
"appId": "8763f1c4-f988-489c-a51e-158e9ef97d6a",
"appRoles": [
  {
    "allowedMemberTypes": [
      "User"
    ],
    "displayName": "Writer",
    "id": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f",
    "isEnabled": true,
    "description": "Writers have the ability to create tasks.",
    "value": "Writer"
  }
],
"availableToOtherTenants": false,

                    Box 2: User -
                    Scenario: In order to review content a user must be part of a ContentReviewer role.

                    Box 3: value -
                    value specifies the value which will be included in the roles claim in authentication and access tokens.
                    Reference:
                    https://docs.microsoft.com/en-us/graph/api/resources/approle
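Once the ContentReviewer app role is defined with its value property set to "ContentReviewer", that string appears in the roles claim of tokens issued for the app. A minimal standard-library sketch of gating review access on that claim (the token payload below is hypothetical example data; real tokens must be signature-validated first):

```python
# Hedged sketch: authorize a reviewer from an already-validated Azure AD
# token payload. The payload dict is hypothetical example data.

def is_content_reviewer(token_payload: dict) -> bool:
    """Return True when the roles claim contains the ContentReviewer app role value."""
    return "ContentReviewer" in token_payload.get("roles", [])

payload = {
    "preferred_username": "reviewer@contoso.com",  # used for review auditing
    "roles": ["ContentReviewer"],
}

print(is_content_reviewer(payload))          # True
print(is_content_reviewer({"roles": []}))    # False
```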
                    Question #284
Introductory Info Case study -
(The case study background and code listings for this question are identical to Question #283 above.)
                     Question HOTSPOT -
                    You need to add code at line AM09 to ensure that users can review content using ContentAnalysisService.
                    How should you complete the code? To answer, select the appropriate options in the answer area.
                    NOTE: Each correct selection is worth one point.
                    Hot Area:


                      Correct Answer:

                      Box 1: "oauth2Permissions": ["login"]
                      oauth2Permissions specifies the collection of OAuth 2.0 permission scopes that the web API (resource) app exposes to client apps. These permission scopes may be granted to client apps during consent.
                      Box 2: "oauth2AllowImplicitFlow":true
For single-page applications (Angular, Ember.js, React.js, and so on), the Microsoft identity platform supports the OAuth 2.0 implicit grant flow.
                      Reference:
                      https://docs.microsoft.com/en-us/azure/active-directory/develop/reference-app-manifest
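Taken together, a hedged sketch of the completed manifest fragment at line AM09, exactly as the answer boxes give it, might read:

```json
"oauth2Permissions": ["login"],
"oauth2AllowImplicitFlow": true
```

Note that in the full Azure AD app manifest, oauth2Permissions is normally an array of permission-scope objects; the shorthand above follows the answer as given.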
                      Question #285
Introductory Info Case study -
(The case study background and code listings for this question are identical to Question #283 above.)
                       Question HOTSPOT -
                      You need to ensure that network security policies are met.
                      How should you configure network security? To answer, select the appropriate options in the answer area.
                      NOTE: Each correct selection is worth one point.
                      Hot Area:


                        Correct Answer:

                        Box 1: Valid root certificate -
                        Scenario: All websites and services must use SSL from a valid root certificate authority.
                        Box 2: Azure Application Gateway
                        Scenario:
✑ Any web service accessible over the Internet must be protected from cross site scripting attacks.
✑ All internal services must only be accessible from internal Virtual Networks (VNets).
✑ All parts of the system must support inbound and outbound traffic restrictions.

                        Azure Web Application Firewall (WAF) on Azure Application Gateway provides centralized protection of your web applications from common exploits and vulnerabilities. Web applications are increasingly targeted by malicious attacks that exploit commonly known vulnerabilities. SQL injection and cross-site scripting are among the most common attacks.
Application Gateway supports autoscaling, SSL offloading, end-to-end SSL, a web application firewall (WAF), cookie-based session affinity, URL path-based routing, multisite hosting, redirection, HTTP header rewriting, and other features.
Note: Both Nginx and Azure Application Gateway act as reverse proxies with Layer 7 load-balancing features plus a WAF, providing strong protection against common web vulnerabilities and exploits.
You can modify the Nginx web server configuration to inject the X-XSS-Protection HTTP header, which helps prevent some cross-site scripting exploits.
                        Reference:
                        https://docs.microsoft.com/en-us/azure/web-application-firewall/ag/ag-overview https://www.upguard.com/articles/10-tips-for-securing-your-nginx-deployment
                        Question #286
Introductory Info Case study -
(The case study background and code listings for this question are identical to Question #283 above.)
                         Question DRAG DROP -
                        You need to add YAML markup at line CS17 to ensure that the ContentUploadService can access Azure Storage access keys.
                        How should you complete the YAML markup? To answer, drag the appropriate YAML segments to the correct locations. Each YAML segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
                        NOTE: Each correct selection is worth one point.
                        Select and Place:


                          Correct Answer:

                          Box 1: volumeMounts -
                          Example:
                          volumeMounts:
                          - mountPath: /mnt/secrets
                            name: secretvolume1
                          volumes:
                          - name: secretvolume1
                            secret:
                              mysecret1: TXkgZmlyc3Qgc2VjcmV0IEZPTwo=

                          Box 2: volumes -

                          Box 3: secret -
                          Reference:
                          https://docs.microsoft.com/en-us/azure/container-instances/container-instances-volume-secret
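Once a secret volume is mounted, each key in the volume's secret map appears as a file named after the key, so the service can read the storage access key into memory without persisting it elsewhere. A minimal Python sketch of that read, where the mount path and key name are illustrative assumptions rather than values from the case study:

```python
from pathlib import Path

def read_secret(name: str, mount_path: str = "/mnt/secrets") -> str:
    """Read a secret exposed as a file by an ACI secret volume.

    Each key under the volume's 'secret' map appears as a file named
    after the key; the file contents are the decoded secret value.
    Trailing whitespace is stripped because base64-encoded secrets
    often carry a trailing newline.
    """
    return Path(mount_path, name).read_text().strip()
```

Reading the key on demand keeps it only in process memory, which lines up with the requirement that storage access keys be stored in memory and available only to the service.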
                          Question #287
                          Introductory Info Case study -
                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                          To start the case study -
                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                          Background -

                          Overview -
                          You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                          You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                          * Messages are sent to ContentUploadService.
                          * Content is processed by ContentAnalysisService.
                          * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                          The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                          The solution will use eight CPU cores.

                          Azure Active Directory -
                          Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                          Requirements -

                          ContentAnalysisService -
                          The company's data science group built the ContentAnalysisService, which accepts user-generated content as a string and returns a probability value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                          You must create an Azure Function named CheckUserContent to perform the content checks.

                          Costs -
                          You must minimize costs for all Azure services.

                          Manual review -
                          To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using
                          React and all pages and API endpoints require authentication. In order to review content a user must be part of a ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                          High availability -
                          All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                          Monitoring -
                          An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                          Security -
                          You have the following security requirements:
                          Any web service accessible over the Internet must be protected from cross-site scripting attacks.
                          All websites and services must use SSL from a valid root certificate authority.
                          Azure Storage access keys must only be stored in memory and must be available only to the service.
                          All internal services must only be accessible from internal Virtual Networks (VNets).
                          All parts of the system must support inbound and outbound traffic restrictions.
                          All service calls must be authenticated by using Azure AD.

                          User agreements -
                          When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
                          Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                          User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                          Validation testing -
                          When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                          Issues -
                          Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                          Code -

                          ContentUploadService -


                          ApplicationManifest -
                           Question HOTSPOT -
                          You need to add code at line AM10 of the application manifest to ensure that the requirement for manually reviewing content can be met.
                          How should you complete the code? To answer, select the appropriate options in the answer area.
                          NOTE: Each correct selection is worth one point.
                          Hot Area:


                            Correct Answer:

                            Box 1: sid -
                            sid: Session ID, used for per-session user sign-out. Available for both personal Microsoft accounts and Azure AD accounts.

                            Scenario: Manual review -
                            To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using
                            React and all pages and API endpoints require authentication. In order to review content a user must be part of a ContentReviewer role.

                            Box 2: email -
                            Scenario: All completed reviews must include the reviewer's email address for auditing purposes.
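In the application manifest, claims such as sid and email are typically requested through the optionalClaims section. The fragment below is a hedged sketch of that shape built in Python; it is not a reproduction of the case study's line AM10, and the surrounding manifest fields are omitted:

```python
import json

# Sketch of an Azure AD application manifest fragment requesting the
# 'sid' and 'email' optional claims in the ID token.  Field names follow
# the optionalClaims schema; the rest of the manifest is omitted.
optional_claims = {
    "optionalClaims": {
        "idToken": [
            {"name": "sid", "essential": False},
            {"name": "email", "essential": False},
        ]
    }
}

manifest_fragment = json.dumps(optional_claims, indent=2)
```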
                            Question #288
                            Introductory Info Case study -
                            This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                            To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                            At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                            To start the case study -
                            To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                            Background -
                            Wide World Importers is moving all their datacenters to Azure. The company has developed several applications and services to support supply chain operations and would like to leverage serverless computing where possible.

                            Current environment -
                            Windows Server 2016 virtual machine
                            This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
                            Ocean Transport - This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
                            Inland Transport - This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
                            The VM supports the following REST API calls:
                            Container API - This API provides container information including weight, contents, and other attributes.
                            Location API - This API provides location information regarding shipping ports of call and trucking stops.
                            Shipping REST API - This API provides shipping information for use and display on the shipping website.

                            Shipping Data -
                            The application uses a MongoDB JSON document database for all container and transport information.

                            Shipping Web Site -
                            The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

                            Proposed solution -
                            The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server. The Azure architecture diagram for the proposed solution is shown below:


                            Requirements -

                            Shipping Logic app -
                            The Shipping Logic app must meet the following requirements:
                            Support the ocean transport and inland transport workflows by using a Logic App.
                            Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
                            Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
                            Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

                            Shipping Function app -
                            Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

                            REST APIs -
                            The REST APIs that support the solution must meet the following requirements:
                            Secure resources to the corporate VNet.
                            Allow deployment to a testing location within Azure while not incurring additional costs.
                            Automatically scale to double capacity during peak shipping times while not causing application downtime.
                            Minimize costs when selecting an Azure payment model.

                            Shipping data -
                            Data migration from on-premises to Azure must minimize costs and downtime.

                            Shipping website -
                            Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

                            Issues -

                            Windows Server 2016 VM -
                            The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

                            Shipping website and REST APIs -
                            The following error message displays while you are testing the website:
                            Failed to load http://test-shippingapi.wideworldimporters.com/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://test.wideworldimporters.com/' is therefore not allowed access.
                             Question HOTSPOT -
                            You need to secure the Shipping Function app.
                            How should you configure the app? To answer, select the appropriate options in the answer area.
                            NOTE: Each correct selection is worth one point.
                            Hot Area:


                              Correct Answer:

                              Scenario: Shipping Function app: Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

                              Box 1: Function -
                              Box 2: JSON Web Token (JWT)
                              Azure AD uses JSON Web Tokens (JWTs) that contain claims.

                              Box 3: HTTP -
                              How a web app delegates sign-in to Azure AD and obtains a token
                              User authentication happens via the browser. The OpenID Connect protocol uses standard HTTP messages.
                              Reference:
                              https://docs.microsoft.com/en-us/azure/active-directory/develop/authentication-scenarios
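As an illustration of how claims travel in the JWTs mentioned above, the payload segment of a token can be decoded with nothing more than base64 and JSON. This sketch deliberately skips signature verification, which a real API must perform before trusting any claim:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the payload (claims) segment of a JWT without verifying it.

    A production API must also validate the signature, issuer, and
    expiry; this sketch only shows how claims are carried in the token.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def audience_matches(token: str, app_id: str) -> bool:
    """Check the token's 'aud' claim against the expected application ID."""
    return decode_jwt_payload(token).get("aud") == app_id
```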
                              Question #289
                              Introductory Info Case study -
                              This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                              To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                              At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                              To start the case study -
                              To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                              Background -
                              Wide World Importers is moving all their datacenters to Azure. The company has developed several applications and services to support supply chain operations and would like to leverage serverless computing where possible.

                              Current environment -
                              Windows Server 2016 virtual machine
                              This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
                              Ocean Transport - This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
                              Inland Transport - This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
                              The VM supports the following REST API calls:
                              Container API - This API provides container information including weight, contents, and other attributes.
                              Location API - This API provides location information regarding shipping ports of call and trucking stops.
                              Shipping REST API - This API provides shipping information for use and display on the shipping website.

                              Shipping Data -
                              The application uses a MongoDB JSON document database for all container and transport information.

                              Shipping Web Site -
                              The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

                              Proposed solution -
                              The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server. The Azure architecture diagram for the proposed solution is shown below:


                              Requirements -

                              Shipping Logic app -
                              The Shipping Logic app must meet the following requirements:
                              Support the ocean transport and inland transport workflows by using a Logic App.
                              Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
                              Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
                              Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

                              Shipping Function app -
                              Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

                              REST APIs -
                              The REST APIs that support the solution must meet the following requirements:
                              Secure resources to the corporate VNet.
                              Allow deployment to a testing location within Azure while not incurring additional costs.
                              Automatically scale to double capacity during peak shipping times while not causing application downtime.
                              Minimize costs when selecting an Azure payment model.

                              Shipping data -
                              Data migration from on-premises to Azure must minimize costs and downtime.

                              Shipping website -
                              Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

                              Issues -

                              Windows Server 2016 VM -
                              The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

                              Shipping website and REST APIs -
                              The following error message displays while you are testing the website:
                              Failed to load http://test-shippingapi.wideworldimporters.com/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://test.wideworldimporters.com/' is therefore not allowed access.
                               Question You need to secure the Shipping Logic App.
                              What should you use?
                              1. A
                                Azure App Service Environment (ASE)
                              2. B
                                Integration Service Environment (ISE)
                              3. C
                                VNet service endpoint
                              4. D
                                Azure AD B2B integration

                              Correct Answer:
                              B
                              Scenario: The Shipping Logic App requires secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
                              You can access Azure Virtual Network resources from Azure Logic Apps by using integration service environments (ISEs).
                              Sometimes, your logic apps and integration accounts need access to secured resources, such as virtual machines (VMs) and other systems or services, that are inside an Azure virtual network. To set up this access, you can create an integration service environment (ISE) where you can run your logic apps and create your integration accounts.
                              Reference:
                              https://docs.microsoft.com/en-us/azure/logic-apps/connect-virtual-network-vnet-isolated-environment-overview
                              Question #290
                              Introductory Info Case study -
                              This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                              To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                              At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                              To start the case study -
                              To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                              Background -
                              City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                              Current environment -

                              Architecture overview -
                              The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
                              API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
                              Event Grid handle messaging and events.
                              The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                              Architecture diagram -
                              The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                              User authentication -
                              The following steps detail the user authentication process:
                              1. The user selects Sign in in the website.
                              2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                              3. The user signs in.
                              4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                              5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                              6. The back-end API validates the access token.
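The redirect in steps 2 and 4 amounts to sending the browser to the Azure AD authorize endpoint; a minimal sketch of building that URL follows, where the tenant, client ID, and redirect URI are placeholders and a real application would also send state and nonce values:

```python
from urllib.parse import urlencode

def build_authorize_url(tenant: str, client_id: str, redirect_uri: str) -> str:
    """Build the Azure AD v2.0 authorize URL used in step 2.

    response_type=token asks Azure AD to return an access token on the
    redirect back to the web application (step 4).
    """
    params = {
        "client_id": client_id,
        "response_type": "token",
        "redirect_uri": redirect_uri,
        "scope": "openid",
    }
    return (
        f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize"
        f"?{urlencode(params)}"
    )
```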

                              Requirements -

                              Corporate website -
                              Communications and content must be secured by using SSL.
                              Communications must use HTTPS.
                              Data must be replicated to a secondary region and three availability zones.
                              Data storage costs must be minimized.

                              Azure Database for PostgreSQL -
                              The database connection string is stored in Azure Key Vault with the following attributes:
                              Azure Key Vault name: cpandlkeyvault
                              Secret name: PostgreSQLConn
                              Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                              The connection information is updated frequently. The application must always use the latest information to connect to the database.
                              Azure Service Bus and Azure Event Grid
                              Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                              Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                              Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                              Security -
                              All SSL certificates and credentials must be stored in Azure Key Vault.
                              File access must restrict access by IP, protocol, and Azure AD rights.
                              All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                              Compliance -
                              Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                              Issues -

                              Corporate website -
                              While testing the site, the following error message displays:
                              CryptographicException: The system cannot find the file specified.

                              Function app -
                              You perform local testing for the RequestUserApproval function. The following error message displays:
                              'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
                              The same error message displays when you test the function in an Azure development environment when you run the following Kusto query:

                              FunctionAppLogs -
                              | where FunctionName == "RequestUserApproval"

                              Logic app -
                              You test the Logic app in a development environment. The following error message displays:
                              '400 Bad Request'
                              Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

                              Code -

                              Corporate website -
                              Security.cs:


                              Function app -
                              RequestUserApproval.cs:
                               Question HOTSPOT -
                              You need to retrieve the database connection string.
                              Which values should you use? To answer, select the appropriate options in the answer area.
                              NOTE: Each correct selection is worth one point.
                              Hot Area:


                                Correct Answer:


                                Box 1: cpandlkeyvault -
                                We specify the key vault, cpandlkeyvault.
                                Scenario: The database connection string is stored in Azure Key Vault with the following attributes:
                                Azure Key Vault name: cpandlkeyvault

Secret name: PostgreSQLConn
                                Id: 80df3e46ffcd4f1cb187f79905e9a1e8

                                Box 2: PostgreSQLConn -
                                We specify the secret, PostgreSQLConn
                                Example, sample request:
https://myvault.vault.azure.net/secrets/mysecretname/4387e9f3d6e14c459867679a90fd0f79?api-version=7.1

                                Box 3: Querystring -
                                Reference:
                                https://docs.microsoft.com/en-us/rest/api/keyvault/getsecret/getsecret
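The Box 1-3 mapping can be illustrated with a short sketch that assembles the Get Secret request URI from the scenario's values. This is a minimal illustration, not production code; `build_secret_url` is a hypothetical helper name, and only the vault name, secret name, and secret Id come from the case study.

```python
def build_secret_url(vault_name: str, secret_name: str,
                     secret_version: str = "", api_version: str = "7.1") -> str:
    """Assemble the Key Vault Get Secret REST URI:
    Box 1 -> the vault name forms the host,
    Box 2 -> the secret name goes in the path,
    Box 3 -> the api-version is passed in the query string."""
    url = f"https://{vault_name}.vault.azure.net/secrets/{secret_name}"
    if secret_version:
        url += f"/{secret_version}"
    return f"{url}?api-version={api_version}"

# Scenario values: vault cpandlkeyvault, secret PostgreSQLConn.
# Omitting the version returns the latest secret, which matches the
# requirement that the application always uses the newest connection string.
print(build_secret_url("cpandlkeyvault", "PostgreSQLConn"))
# -> https://cpandlkeyvault.vault.azure.net/secrets/PostgreSQLConn?api-version=7.1
```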
                                Question #291
                                Introductory Info Case study -
                                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                To start the case study -
                                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                Background -
                                City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                                Current environment -

                                Architecture overview -
                                The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.
                                The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                                Architecture diagram -
                                The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                                User authentication -
                                The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                                2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                3. The user signs in.
                                4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                6. The back-end API validates the access token.

                                Requirements -

                                Corporate website -
                                Communications and content must be secured by using SSL.
                                Communications must use HTTPS.
                                Data must be replicated to a secondary region and three availability zones.
                                Data storage costs must be minimized.

                                Azure Database for PostgreSQL -
                                The database connection string is stored in Azure Key Vault with the following attributes:
                                Azure Key Vault name: cpandlkeyvault
                                Secret name: PostgreSQLConn
                                Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                                The connection information is updated frequently. The application must always use the latest information to connect to the database.
                                Azure Service Bus and Azure Event Grid
                                Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                                Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                                Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                                Security -
                                All SSL certificates and credentials must be stored in Azure Key Vault.
                                File access must restrict access by IP, protocol, and Azure AD rights.
                                All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                                Compliance -
                                Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                                Issues -

                                Corporate website -
                                While testing the site, the following error message displays:
                                CryptographicException: The system cannot find the file specified.

                                Function app -
                                You perform local testing for the RequestUserApproval function. The following error message displays:
                                'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

                                Logic app -
                                You test the Logic app in a development environment. The following error message displays:
                                '400 Bad Request'
Troubleshooting of the error shows an HttpTrigger action that calls the RequestUserApproval function.

                                Code -

                                Corporate website -
                                Security.cs:


                                Function app -
                                RequestUserApproval.cs:
                                 Question DRAG DROP -
                                You need to correct the corporate website error.
                                Which four actions should you recommend be performed in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
                                Select and Place:


                                  Correct Answer:

                                  Scenario: Corporate website -
                                  While testing the site, the following error message displays:
                                  CryptographicException: The system cannot find the file specified.

                                  Step 1: Generate a certificate -
                                  Step 2: Upload the certificate to Azure Key Vault
                                  Scenario: All SSL certificates and credentials must be stored in Azure Key Vault.
                                  Step 3: Import the certificate to Azure App Service
Step 4: Update line SC05 of Security.cs to include error handling and then redeploy the code
                                  Reference:
                                  https://docs.microsoft.com/en-us/azure/app-service/configure-ssl-certificate
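The error handling that Step 4 calls for can be sketched as follows. The case study's Security.cs is not shown here, so this Python fragment only illustrates the pattern: fail with a clear, handled error when the certificate file is missing instead of letting the cryptographic API throw "The system cannot find the file specified". The path and helper name `load_certificate_bytes` are made up for the example.

```python
import os

def load_certificate_bytes(path: str) -> bytes:
    """Read a certificate file, converting a bare 'file not found'
    cryptographic failure into an explicit, handled error."""
    if not os.path.isfile(path):
        raise FileNotFoundError(
            f"Certificate not found: {path}. "
            "Verify it was imported from Azure Key Vault into App Service.")
    with open(path, "rb") as f:
        return f.read()

# Hypothetical path -- demonstrates the handled failure mode.
try:
    load_certificate_bytes("/var/ssl/private/missing.pfx")
except FileNotFoundError as e:
    print(f"Handled: {e}")
```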
                                  Question #292
                                  Introductory Info Case study -
                                  This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                  To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                  At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                  To start the case study -
                                  To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                  Background -
                                  City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                                  Current environment -

                                  Architecture overview -
                                  The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.
                                  The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                                  Architecture diagram -
                                  The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                                  User authentication -
                                  The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                                  2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                  3. The user signs in.
                                  4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                  5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                  6. The back-end API validates the access token.

                                  Requirements -

                                  Corporate website -
                                  Communications and content must be secured by using SSL.
                                  Communications must use HTTPS.
                                  Data must be replicated to a secondary region and three availability zones.
                                  Data storage costs must be minimized.

                                  Azure Database for PostgreSQL -
                                  The database connection string is stored in Azure Key Vault with the following attributes:
                                  Azure Key Vault name: cpandlkeyvault
                                  Secret name: PostgreSQLConn
                                  Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                                  The connection information is updated frequently. The application must always use the latest information to connect to the database.
                                  Azure Service Bus and Azure Event Grid
                                  Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                                  Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                                  Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                                  Security -
                                  All SSL certificates and credentials must be stored in Azure Key Vault.
                                  File access must restrict access by IP, protocol, and Azure AD rights.
                                  All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                                  Compliance -
                                  Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                                  Issues -

                                  Corporate website -
                                  While testing the site, the following error message displays:
                                  CryptographicException: The system cannot find the file specified.

                                  Function app -
                                  You perform local testing for the RequestUserApproval function. The following error message displays:
                                  'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

                                  Logic app -
                                  You test the Logic app in a development environment. The following error message displays:
                                  '400 Bad Request'
Troubleshooting of the error shows an HttpTrigger action that calls the RequestUserApproval function.

                                  Code -

                                  Corporate website -
                                  Security.cs:


                                  Function app -
                                  RequestUserApproval.cs:
                                   Question HOTSPOT -
                                  You need to configure API Management for authentication.
                                  Which policy values should you use? To answer, select the appropriate options in the answer area.
                                  NOTE: Each correct selection is worth one point.
                                  Hot Area:


                                    Correct Answer:

                                    Box 1: Validate JWT -
                                    The validate-jwt policy enforces existence and validity of a JWT extracted from either a specified HTTP Header or a specified query parameter.
                                    Scenario: User authentication (see step 5 below)
                                    The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                                    2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                    3. The user signs in.
                                    4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                    5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                    6. The back-end API validates the access token.
                                    Incorrect Answers:
                                    ✑ Limit call rate by key - Prevents API usage spikes by limiting call rate, on a per key basis.
                                    ✑ Restrict caller IPs - Filters (allows/denies) calls from specific IP addresses and/or address ranges.
✑ Check HTTP header - Enforces existence and/or value of an HTTP header.

                                    Box 2: Outbound -
                                    Reference:
                                    https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies
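What validate-jwt enforces can be approximated in a few lines: pull the token from the Authorization header, decode the payload, and confirm the audience ('aud') claim matches the application ID, as in step 5 of the scenario. This is an illustrative sketch only; it deliberately skips signature verification, which the real policy performs, and every name and value below is invented for the example.

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the middle (payload) segment of a JWT WITHOUT verifying
    the signature -- illustration only, never do this in production."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def check_audience(headers: dict, expected_app_id: str) -> bool:
    """Mimic the existence and audience checks that validate-jwt applies."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False  # the policy would reject the call: no token present
    claims = decode_jwt_payload(auth[len("Bearer "):])
    return claims.get("aud") == expected_app_id

def toy_jwt(claims: dict) -> str:
    """Build a toy, unsigned two-segment token for the demo."""
    seg = lambda d: base64.urlsafe_b64encode(
        json.dumps(d).encode()).decode().rstrip("=")
    return f"{seg({'alg': 'none'})}.{seg(claims)}."

# 'api://cpandl-app' is a hypothetical application ID.
token = toy_jwt({"aud": "api://cpandl-app"})
print(check_audience({"Authorization": f"Bearer {token}"}, "api://cpandl-app"))
# -> True
```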
                                    Question #293
                                    Introductory Info Case study -
                                    This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                    To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                    At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                    To start the case study -
                                    To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                    Background -
                                    City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                                    Current environment -

                                    Architecture overview -
                                    The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.
                                    The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                                    Architecture diagram -
                                    The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                                    User authentication -
                                    The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                                    2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                    3. The user signs in.
                                    4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                    5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                    6. The back-end API validates the access token.

                                    Requirements -

                                    Corporate website -
                                    Communications and content must be secured by using SSL.
                                    Communications must use HTTPS.
                                    Data must be replicated to a secondary region and three availability zones.
                                    Data storage costs must be minimized.

                                    Azure Database for PostgreSQL -
                                    The database connection string is stored in Azure Key Vault with the following attributes:
                                    Azure Key Vault name: cpandlkeyvault
                                    Secret name: PostgreSQLConn
                                    Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                                    The connection information is updated frequently. The application must always use the latest information to connect to the database.
                                    Azure Service Bus and Azure Event Grid
                                    Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                                    Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                                    Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                                    Security -
                                    All SSL certificates and credentials must be stored in Azure Key Vault.
                                    File access must restrict access by IP, protocol, and Azure AD rights.
                                    All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                                    Compliance -
                                    Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                                    Issues -

                                    Corporate website -
                                    While testing the site, the following error message displays:
                                    CryptographicException: The system cannot find the file specified.

                                    Function app -
                                    You perform local testing for the RequestUserApproval function. The following error message displays:
                                    'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

                                    Logic app -
                                    You test the Logic app in a development environment. The following error message displays:
                                    '400 Bad Request'
Troubleshooting of the error shows an HttpTrigger action that calls the RequestUserApproval function.

                                    Code -

                                    Corporate website -
                                    Security.cs:


                                    Function app -
                                    RequestUserApproval.cs:
                                     Question You need to authenticate the user to the corporate website as indicated by the architectural diagram.
                                    Which two values should you use? Each correct answer presents part of the solution.
                                    NOTE: Each correct selection is worth one point.
                                    1. A
                                      ID token signature
                                    2. B
                                      ID token claims
                                    3. C
                                      HTTP response code
                                    4. D
                                      Azure AD endpoint URI
                                    5. E
                                      Azure AD tenant ID

                                    Correct Answer:
                                    AD
                                    A: Claims in access tokens -
                                    JWTs (JSON Web Tokens) are split into three pieces:
                                    ✑ Header - Provides information about how to validate the token including information about the type of token and how it was signed.
                                    ✑ Payload - Contains all of the important data about the user or app that is attempting to call your service.
                                    ✑ Signature - Is the raw material used to validate the token.
D: Your client can get an access token from either the v1.0 endpoint or the v2.0 endpoint using a variety of protocols.
                                    Scenario: User authentication (see step 5 below)
                                    The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                                    2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                    3. The user signs in.
                                    4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                    5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                    6. The back-end API validates the access token.
                                    Reference:
                                    https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies
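The role of answer D, the Azure AD endpoint URI, can be shown by assembling the v2.0 authorize URL the browser is redirected to in step 2 of the flow. This is a sketch under assumptions: the tenant, client ID, and redirect URI below are placeholders, not values from the case study; only the https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize endpoint shape follows the Microsoft identity platform.

```python
from urllib.parse import urlencode

def authorize_url(tenant: str, client_id: str, redirect_uri: str) -> str:
    """Build the Azure AD v2.0 authorization endpoint URI used to start
    the sign-in redirect (step 2 of the scenario's auth flow)."""
    base = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize"
    params = {
        "client_id": client_id,        # the application (client) ID
        "response_type": "id_token",   # token comes back in the URL (step 4)
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "response_mode": "fragment",
    }
    return f"{base}?{urlencode(params)}"

# Placeholder tenant and client values -- not from the case study.
url = authorize_url("contoso.onmicrosoft.com",
                    "00000000-0000-0000-0000-000000000000",
                    "https://www.cpandl.com/auth")
print(url.split("?")[0])
# -> https://login.microsoftonline.com/contoso.onmicrosoft.com/oauth2/v2.0/authorize
```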
                                    Question #294
                                    Introductory Info Case study -
                                    This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                    To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                    At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                    To start the case study -
                                    To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                    Background -
                                    City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                                    Current environment -

                                    Architecture overview -
                                    The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
                                    API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
                                    Event Grid handle messaging and events.
                                    The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                                    Architecture diagram -
                                    The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                                    User authentication -
                                    The following steps detail the user authentication process:
                                    1. The user selects Sign in on the website.
                                    2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                    3. The user signs in.
                                    4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                    5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                    6. The back-end API validates the access token.

                                    Requirements -

                                    Corporate website -
                                    Communications and content must be secured by using SSL.
                                    Communications must use HTTPS.
                                    Data must be replicated to a secondary region and three availability zones.
                                    Data storage costs must be minimized.

                                    Azure Database for PostgreSQL -
                                    The database connection string is stored in Azure Key Vault with the following attributes:
                                    Azure Key Vault name: cpandlkeyvault
                                    Secret name: PostgreSQLConn
                                    Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                                    The connection information is updated frequently. The application must always use the latest information to connect to the database.
                                    Azure Service Bus and Azure Event Grid -
                                    Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                                    Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                                    Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                                    Security -
                                    All SSL certificates and credentials must be stored in Azure Key Vault.
                                    File access must restrict access by IP, protocol, and Azure AD rights.
                                    All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                                    Compliance -
                                    Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                                    Issues -

                                    Corporate website -
                                    While testing the site, the following error message displays:
                                    CryptographicException: The system cannot find the file specified.

                                    Function app -
                                    You perform local testing for the RequestUserApproval function. The following error message displays:
                                    'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
                                    The same error message displays when you test the function in an Azure development environment when you run the following Kusto query:

                                    FunctionAppLogs
                                    | where FunctionName == "RequestUserApproval"

                                    Logic app -
                                    You test the Logic app in a development environment. The following error message displays:
                                    '400 Bad Request'
                                    Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

                                    Code -

                                    Corporate website -
                                    Security.cs:


                                    Function app -
                                    RequestUserApproval.cs:
                                     Question HOTSPOT -
                                    You need to correct the Azure Logic app error message.
                                    Which configuration values should you use? To answer, select the appropriate options in the answer area.
                                    NOTE: Each correct selection is worth one point.
                                    Hot Area:


                                      Correct Answer:

                                      Scenario: You test the Logic app in a development environment. The following error message displays:
                                      '400 Bad Request'
                                      Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.
                                      Note: If the inbound call's request body doesn't match your schema, the trigger returns an HTTP 400 Bad Request error.

                                      Box 1: function -
                                      If you have an Azure function where you want to use the system-assigned identity, first enable authentication for Azure functions.

                                      Box 2: system-assigned -
                                      Your logic app or individual connections can use either the system-assigned identity or a single user-assigned identity, which you can share across a group of logic apps, but not both.
                                      Reference:
                                      https://docs.microsoft.com/en-us/azure/logic-apps/create-managed-service-identity
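                                      As an illustrative sketch only (the URI and audience value are placeholders, not taken from the case study), an HTTP action in the Logic app's workflow definition that calls the function with the system-assigned managed identity might look like this:

```json
"RequestUserApproval": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://<function-app-name>.azurewebsites.net/api/RequestUserApproval",
    "authentication": {
      "type": "ManagedServiceIdentity",
      "audience": "<application-id-uri-of-the-function-app>"
    }
  }
}
```

                                      With this configuration the Logic Apps runtime acquires a token for the stated audience at run time, so no function key or credential needs to be stored in the workflow definition.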
                                      Question #295
                                      Introductory Info Case study -
                                      This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                      To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                      At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                      To start the case study -
                                      To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                      Background -
                                      City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                                      Current environment -

                                      Architecture overview -
                                      The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
                                      API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
                                      Event Grid handle messaging and events.
                                      The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                                      Architecture diagram -
                                      The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                                      User authentication -
                                      The following steps detail the user authentication process:
                                      1. The user selects Sign in on the website.
                                      2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                                      3. The user signs in.
                                      4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
                                      5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
                                      6. The back-end API validates the access token.

                                      Requirements -

                                      Corporate website -
                                      Communications and content must be secured by using SSL.
                                      Communications must use HTTPS.
                                      Data must be replicated to a secondary region and three availability zones.
                                      Data storage costs must be minimized.

                                      Azure Database for PostgreSQL -
                                      The database connection string is stored in Azure Key Vault with the following attributes:
                                      Azure Key Vault name: cpandlkeyvault
                                      Secret name: PostgreSQLConn
                                      Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                                      The connection information is updated frequently. The application must always use the latest information to connect to the database.
                                      Azure Service Bus and Azure Event Grid -
                                      Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                                      Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                                      Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                                      Security -
                                      All SSL certificates and credentials must be stored in Azure Key Vault.
                                      File access must restrict access by IP, protocol, and Azure AD rights.
                                      All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                                      Compliance -
                                      Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                                      Issues -

                                      Corporate website -
                                      While testing the site, the following error message displays:
                                      CryptographicException: The system cannot find the file specified.

                                      Function app -
                                      You perform local testing for the RequestUserApproval function. The following error message displays:
                                      'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
                                      The same error message displays when you test the function in an Azure development environment when you run the following Kusto query:

                                      FunctionAppLogs
                                      | where FunctionName == "RequestUserApproval"

                                      Logic app -
                                      You test the Logic app in a development environment. The following error message displays:
                                      '400 Bad Request'
                                      Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

                                      Code -

                                      Corporate website -
                                      Security.cs:


                                      Function app -
                                      RequestUserApproval.cs:
                                       Question HOTSPOT -
                                      You need to configure Azure Service Bus to Event Grid integration.
                                      Which Azure Service Bus settings should you use? To answer, select the appropriate options in the answer area.
                                      NOTE: Each correct selection is worth one point.
                                      Hot Area:


                                        Correct Answer:

                                        Box 1: Premium -
                                        Service Bus can now emit events to Event Grid when there are messages in a queue or a subscription and no receivers are present. You can create Event Grid subscriptions to your Service Bus namespaces, listen to these events, and then react to the events by starting a receiver. With this feature, you can use Service Bus in reactive programming models.
                                        To enable the feature, you need the following items:
                                        A Service Bus Premium namespace with at least one Service Bus queue or a Service Bus topic with at least one subscription.
                                        Contributor access to the Service Bus namespace.

                                        Box 2: Contributor -
                                        Reference:
                                        https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-to-event-grid-integration-concept
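                                        As a hedged sketch of how such a subscription could be wired up with the Azure CLI (all resource names and IDs below are placeholders, and this assumes a Premium namespace already exists):

```shell
# Subscribe an endpoint (for example, an Azure Function) to the Service Bus
# namespace's "active messages, no listeners" event. Placeholders must be
# replaced with real subscription, resource group, and namespace values.
az eventgrid event-subscription create \
  --name servicebus-to-receiver \
  --source-resource-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.ServiceBus/namespaces/<namespace>" \
  --endpoint "<endpoint-url-or-resource-id>" \
  --included-event-types Microsoft.ServiceBus.ActiveMessagesAvailableWithNoListeners
```

                                        Filtering to the `ActiveMessagesAvailableWithNoListeners` event type means the receiver is only triggered when messages are waiting and no listener is connected, which is the reactive pattern the answer describes.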
                                        Question #296
                                        Introductory Info Case study -
                                        This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                        To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                        At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                        To start the case study -
                                        To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                        Background -
                                        You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web
                                        API project that is deployed as an Azure Web App.

                                        Overall architecture -
                                        Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

                                        Receipt processing -
                                        Employees may upload receipts in two ways:
                                        Uploading using an Azure Files mounted folder
                                        Uploading using the web application

                                        Data Storage -
                                        Receipt and employee information is stored in an Azure SQL database.

                                        Documentation -
                                        Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for
                                        Azure File upload, and instructions on how to configure the mounted folder.

                                        Solution details -

                                        Users table -


                                        Web Application -
                                        You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

                                        Processing -
                                        Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob
                                        Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

                                        Logging -
                                        Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
                                        Application Insights must always contain all log messages.

                                        Requirements -

                                        Receipt processing -
                                        Concurrent processing of a receipt must be prevented.

                                        Disaster recovery -
                                        A regional outage must not impact application availability. All DR operations must not depend on the application running and must ensure that data in the DR region is up to date.

                                        Security -
                                        User's SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
                                        All certificates and secrets used to secure data must be stored in Azure Key Vault.
                                        You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
                                        All access to Azure Storage and Azure SQL database must use the application's Managed Service Identity (MSI).
                                        Receipt data must always be encrypted at rest.
                                        All data must be protected in transit.
                                        User's expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
                                        In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

                                        Issues -

                                        Upload format issue -
                                        Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File
                                        Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal
                                        Server error page.

                                        Capacity issue -
                                        During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

                                        Log capacity issue -
                                        Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

                                        Application code -

                                        Processing.cs -


                                        Database.cs -


                                        ReceiptUploader.cs -


                                        ConfigureSSE.ps1 -
                                         Question HOTSPOT -
                                        You need to add code at line PC26 of Processing.cs to ensure that security policies are met.
                                        How should you complete the code that you will add at line PC26? To answer, select the appropriate options in the answer area.
                                        NOTE: Each correct selection is worth one point.
                                        Hot Area:


                                          Correct Answer:

                                          Box 1: var key = await Resolver.ResolveKeyAsync(keyBundle.KeyIdentifier, CancellationToken.None);
                                          Box 2: var x = new BlobEncryptionPolicy(key, resolver);
                                          Example:
                                          // We begin with cloudKey1, and a resolver capable of resolving and caching Key Vault secrets.
                                          BlobEncryptionPolicy encryptionPolicy = new BlobEncryptionPolicy(cloudKey1, cachingResolver);
                                          client.DefaultRequestOptions.EncryptionPolicy = encryptionPolicy;
                                          Box 3: cloudBlobClient.DefaultRequestOptions.EncryptionPolicy = x;
                                          Reference:
                                          https://github.com/Azure/azure-storage-net/blob/master/Samples/GettingStarted/EncryptionSamples/KeyRotation/Program.cs
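                                          Pulled together, the three boxes form a sequence along the following lines. This is a sketch only: it targets the legacy Microsoft.Azure.Storage client (where `BlobEncryptionPolicy` lives), and the `Resolver`, `keyBundle`, and `cloudBlobClient` identifiers are assumed to be declared earlier in Processing.cs.

```csharp
// Sketch of client-side blob encryption with a Key Vault key (legacy SDK).
// 1. Resolve the Key Vault key for encryption via the key resolver.
var key = await Resolver.ResolveKeyAsync(keyBundle.KeyIdentifier, CancellationToken.None);

// 2. Create a client-side encryption policy from the key and resolver.
var x = new BlobEncryptionPolicy(key, Resolver);

// 3. Apply the policy so all blob requests from this client encrypt at rest.
cloudBlobClient.DefaultRequestOptions.EncryptionPolicy = x;
```

                                          This satisfies the "receipt data must always be encrypted at rest" requirement by making encryption the default for every blob operation issued through the client.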
                                          Question #297
                                          Introductory Info Case study -
                                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                          To start the case study -
                                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                          Background -
                                          You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web
                                          API project that is deployed as an Azure Web App.

                                          Overall architecture -
                                          Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

                                          Receipt processing -
                                          Employees may upload receipts in two ways:
                                          Uploading using an Azure Files mounted folder
                                          Uploading using the web application

                                          Data Storage -
                                          Receipt and employee information is stored in an Azure SQL database.

                                          Documentation -
                                          Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for
                                          Azure File upload, and instructions on how to configure the mounted folder.

                                          Solution details -

                                          Users table -


                                          Web Application -
                                          You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

                                          Processing -
                                          Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob
                                          Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

                                          Logging -
                                          Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
                                          Application Insights must always contain all log messages.

                                          Requirements -

                                          Receipt processing -
                                          Concurrent processing of a receipt must be prevented.

                                          Disaster recovery -
                                          A regional outage must not impact application availability. All DR operations must not depend on the application running and must ensure that data in the DR region is up to date.

                                          Security -
                                          User's SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
                                          All certificates and secrets used to secure data must be stored in Azure Key Vault.
                                          You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
                                          All access to Azure Storage and Azure SQL database must use the application's Managed Service Identity (MSI).
                                          Receipt data must always be encrypted at rest.
                                          All data must be protected in transit.
                                          User's expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
                                          In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

                                          Issues -

                                          Upload format issue -
                                          Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal Server error page.

                                          Capacity issue -
                                          During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

                                          Log capacity issue -
                                          Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

                                          Application code -

                                          Processing.cs -


                                          Database.cs -


                                          ReceiptUploader.cs -


                                          ConfigureSSE.ps1 -
                                          Question
                                          You need to ensure the security policies are met.
                                          What code do you add at line CS07 of ConfigureSSE.ps1?
                                          1. A
                                            -PermissionsToKeys create, encrypt, decrypt
                                          2. B
                                            -PermissionsToCertificates create, encrypt, decrypt
                                          3. C
                                            -PermissionsToCertificates wrapkey, unwrapkey, get
                                          4. D
                                            -PermissionsToKeys wrapkey, unwrapkey, get

                                          Correct Answer:
                                          D
                                          Scenario: All certificates and secrets used to secure data must be stored in Azure Key Vault.
                                          You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
                                          The Set-AzureRmKeyVaultAccessPolicy parameter -PermissionsToKeys specifies an array of key operation permissions to grant to a user or service principal.
                                          The acceptable values for this parameter are: decrypt, encrypt, unwrapKey, wrapKey, verify, sign, get, list, update, create, import, delete, backup, restore, recover, purge. Storage Service Encryption only needs to wrap and unwrap the key encryption key and read its metadata, so wrapkey, unwrapkey, get is the least-privilege grant.
                                          Incorrect Answers:
                                          A: create, encrypt, decrypt grants key operations the service does not need, which violates the principle of least privilege.
                                          B, C: The Set-AzureRmKeyVaultAccessPolicy parameter -PermissionsToCertificates specifies an array of certificate permissions to grant to a user or service principal. The acceptable values for this parameter are: get, list, delete, create, import, update, managecontacts, getissuers, listissuers, setissuers, deleteissuers, manageissuers, recover, purge, backup, restore. Certificate permissions do not apply here because Storage Service Encryption uses a Key Vault key, not a certificate.
                                          Reference:
                                          https://docs.microsoft.com/en-us/powershell/module/azurerm.keyvault/set-azurermkeyvaultaccesspolicy
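                                          For readers who want to see the shape of the least-privilege grant the explanation describes, the same policy can be sketched with the Azure CLI. This is only an illustration: the vault name and service principal ID below are invented placeholders, not values from the case study, and the exhibit itself uses the AzureRM PowerShell cmdlet rather than the CLI.

```shell
# Placeholder names - substitute the vault and principal from ConfigureSSE.ps1.
# Grants only the key operations Storage Service Encryption needs:
# wrap/unwrap the key encryption key and read key metadata.
az keyvault set-policy \
  --name ContosoKeyVault \
  --spn 00000000-0000-0000-0000-000000000000 \
  --key-permissions wrapKey unwrapKey get
```

                                          In current PowerShell scripts the equivalent Az-module cmdlet is Set-AzKeyVaultAccessPolicy with the same -PermissionsToKeys parameter.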
                                          Question #298
                                          Introductory Info Case study -
                                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                          To start the case study -
                                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                          Background -
                                          VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                                          Current environment -

                                          Corporate website -
                                          The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                                          Retail Store Locations -
                                          The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                                          Requirements -
                                          The application components must meet the following requirements:

                                          Corporate website -
                                          Secure the website by using SSL.
                                          Minimize costs for data storage and hosting.
                                          Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                                          Distribute the website content globally for local use.
                                          Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                                          The website must have 99.95 percent uptime.

                                          Retail store locations -
                                          Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                                          Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                                          Delivery services -
                                          Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                                          Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                                          Inventory services -
                                          The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

                                          Security -
                                          All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                                          Authentication and authorization must use Azure AD and services must use managed identities where possible.

                                          Issues -

                                          Retail Store Locations -
                                          You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                                          Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.
                                          Question
                                          You need to reduce read latency for the retail store solution.
                                          What are two possible ways to achieve the goal? Each correct answer presents a complete solution.
                                          NOTE: Each correct selection is worth one point.
                                          1. A
                                            Create a new composite index for the store location data queries in Azure Cosmos DB. Modify the queries to support parameterized SQL and update the Azure Function app to call the new queries.
                                          2. B
                                            Provision an Azure Cosmos DB dedicated gateway. Update the Azure Function app connection string to use the new dedicated gateway endpoint.
                                          3. C
                                            Configure Azure Cosmos DB consistency to session consistency. Cache session tokens in a new Azure Redis cache instance after every write. Update reads to use the session token stored in Azure Redis.
                                          4. D
                                            Provision an Azure Cosmos DB dedicated gateway. Update blob storage to use the new dedicated gateway endpoint.
                                          5. E
                                            Configure Azure Cosmos DB consistency to strong consistency. Increase the RUs for the container supporting store location data.

                                          Correct Answer:
                                          BC
                                          Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.
                                          B: A dedicated gateway is server-side compute that is a front-end to your Azure Cosmos DB account. When you connect to the dedicated gateway, it both routes requests and caches data.
                                          You can provision a dedicated gateway to improve performance at scale.
                                          You must connect to Azure Cosmos DB using the dedicated gateway in order to use the integrated cache. The dedicated gateway has a different endpoint from the standard one provided with your Azure Cosmos DB account. When you connect to your dedicated gateway endpoint, your application sends a request to the dedicated gateway, which then routes the request to different backend nodes. If possible, the integrated cache will serve the result.
                                          C: Azure Cache for Redis perfectly complements Azure database services such as Cosmos DB. It provides a cost-effective solution to scale read and write throughput of your data tier. Store and share database query results, session states, static contents, and more using a common cache-aside pattern.
                                          Reference:
                                          https://docs.microsoft.com/en-us/azure/architecture/solution-ideas/articles/data-cache-with-redis-cache https://docs.microsoft.com/en-us/azure/cosmos-db/dedicated-gateway
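                                          As a hedged sketch of what provisioning the dedicated gateway involves (the account and resource group names are invented, and `az cosmosdb service create` may require a recent CLI version), the setup for answer B might look roughly like this:

```shell
# Placeholder account and resource group names.
# Provision a dedicated gateway (SqlDedicatedGateway) with a single D4s node.
az cosmosdb service create \
  --resource-group vanarsdel-rg \
  --account-name vanarsdel-cosmos \
  --name SqlDedicatedGateway \
  --kind SqlDedicatedGateway \
  --count 1 \
  --size Cosmos.D4s

# The Function app connection string is then switched to the dedicated
# gateway endpoint (a *.sqlx.cosmos.azure.com address, distinct from the
# standard endpoint) so that reads can be served from the integrated cache.
```

                                          The design point of answer B is that only the connection string changes; the query code in the Function app stays the same.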
                                          Question #299
                                          Introductory Info Case study -
                                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                          To start the case study -
                                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                          Background -
                                          VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                                          Current environment -

                                          Corporate website -
                                          The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                                          Retail Store Locations -
                                          The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                                          Requirements -
                                          The application components must meet the following requirements:

                                          Corporate website -
                                          Secure the website by using SSL.
                                          Minimize costs for data storage and hosting.
                                          Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                                          Distribute the website content globally for local use.
                                          Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                                          The website must have 99.95 percent uptime.

                                          Retail store locations -
                                          Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                                          Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                                          Delivery services -
                                          Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                                          Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                                          Inventory services -
                                          The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

                                          Security -
                                          All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                                          Authentication and authorization must use Azure AD and services must use managed identities where possible.

                                          Issues -

                                          Retail Store Locations -
                                          You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                                          Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.
                                          Question
                                          You need to audit the retail store sales transactions.
                                          What are two possible ways to achieve the goal? Each correct answer presents a complete solution.
                                          NOTE: Each correct selection is worth one point.
                                          1. A
                                            Update the retail store location data upload process to include blob index tags. Create an Azure Function to process the blob index tags and filter by store location.
                                          2. B
                                            Process the change feed logs of the Azure Blob storage account by using an Azure Function. Specify a time range for the change feed data.
                                          3. C
                                            Enable blob versioning for the storage account. Use an Azure Function to process a list of the blob versions per day.
                                          4. D
                                            Process an Azure Storage blob inventory report by using an Azure Function. Create rule filters on the blob inventory report.
                                          5. E
                                            Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter the events by store location.

                                          Correct Answer:
                                          BE
                                          Scenario: Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.
                                          "Process the change feed logs of the Azure Blob storage account by using an Azure Function. Specify a time range for the change feed data": Change feed support is well-suited for scenarios that process data based on objects that have changed. For example, applications can:
                                          Store, audit, and analyze changes to your objects, over any period of time, for security, compliance or intelligence for enterprise data management.
                                          "Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter the events by store location": Azure Storage events allow applications to react to events, such as the creation and deletion of blobs. It does so without the need for complicated code or expensive and inefficient polling services. The best part is you only pay for what you use.
                                          Blob storage events are pushed using Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even to your own http listener. Event Grid provides reliable event delivery to your applications through rich retry policies and dead-lettering.
                                          Incorrect Answers:
                                          "Enable blob versioning for the storage account. Use an Azure Function to process a list of the blob versions per day": You can enable Blob storage versioning to automatically maintain previous versions of an object. When blob versioning is enabled, you can access earlier versions of a blob to recover your data if it is modified or deleted.
                                          Reference:
                                          https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview
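                                          Both correct answers involve a small setup step that the explanation glosses over. A hedged CLI sketch follows; the storage account, resource group, subscription name and function resource ID are all invented placeholders, not part of the case study.

```shell
# Placeholder names throughout.
# Answer B: enable the blob change feed so an Azure Function can replay
# changes for a specified time range during the nightly audit.
az storage account blob-service-properties update \
  --resource-group vanarsdel-rg \
  --account-name vanarsdelstore \
  --enable-change-feed true

# Answer E: push blob events to an Azure Function through Event Grid; the
# Function then filters events by store location (taken from the blob path).
az eventgrid event-subscription create \
  --name store-sales-audit \
  --source-resource-id $(az storage account show -g vanarsdel-rg \
      -n vanarsdelstore --query id -o tsv) \
  --endpoint-type azurefunction \
  --endpoint "<function-resource-id>/functions/AuditBlobEvents" \
  --included-event-types Microsoft.Storage.BlobCreated
```

                                          Note the difference in delivery model: the change feed is pulled and replayed on a schedule, while Event Grid pushes events as they occur; either satisfies the nightly audit requirement.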
                                          Question #300
                                          Introductory Info Case study -
                                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                          To start the case study -
                                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                          Background -

                                          Overview -
                                          You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                                          You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                                          * Messages are sent to ContentUploadService.
                                          * Content is processed by ContentAnalysisService.
                                          * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                                          The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                                          The solution will use eight CPU cores.

                                          Azure Active Directory -
                                          Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                                          Requirements -

                                          ContentAnalysisService -
                                          The company's data science group built ContentAnalysisService which accepts user generated content as a string and returns a probable value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                                          You must create an Azure Function named CheckUserContent to perform the content checks.

                                          Costs -
                                          You must minimize costs for all Azure services.

                                          Manual review -
                                          To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. To review content, a user must be part of a ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                                          High availability -
                                          All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                                          Monitoring -
                                          An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                                          Security -
                                          You have the following security requirements:
                                          Any web service accessible over the Internet must be protected from cross site scripting attacks.
                                          All websites and services must use SSL from a valid root certificate authority.
                                          Azure Storage access keys must only be stored in memory and must be available only to the service.
                                          All Internal services must only be accessible from internal Virtual Networks (VNets).
                                          All parts of the system must support inbound and outbound traffic restrictions.
                                          All service calls must be authenticated by using Azure AD.

                                          User agreements -
                                          When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track user's IP addresses.
                                          Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                                          User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                                          Validation testing -
                                          When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                                          Issues -
                                          Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                                          Code -

                                          ContentUploadService -


                                          ApplicationManifest -
                                           
                                          Question
                                          You need to monitor ContentUploadService according to the requirements.
                                          Which command should you use?
                                          1. A
                                            az monitor metrics alert create -n alert -g … --scopes … --condition "avg Percentage CPU > 8"
                                          2. B
                                            az monitor metrics alert create -n alert -g … --scopes … --condition "avg Percentage CPU > 800"
                                          3. C
                                            az monitor metrics alert create -n alert -g … --scopes … --condition "CPU Usage > 800"
                                          4. D
                                            az monitor metrics alert create -n alert -g … --scopes … --condition "CPU Usage > 8"

                                          Correct Answer:
                                          B
                                          Scenario: An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.
                                          Reference:
                                          https://docs.microsoft.com/sv-se/cli/azure/monitor/metrics/alert
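                                          Written out in full, the command from answer B takes roughly the following shape. The alert name, resource group and scope below are placeholders standing in for the ellipses in the options; the scope must be the resource ID of the ContentUploadService container group. The threshold is 800 because the solution uses eight CPU cores and the Percentage CPU metric is reported per core, so 100 percent on all eight cores totals 800.

```shell
# Placeholders: substitute the real resource group and the resource ID
# of the ContentUploadService Azure Container Instances container group.
az monitor metrics alert create \
  --name contentupload-cpu-alert \
  --resource-group <resource-group> \
  --scopes <container-group-resource-id> \
  --condition "avg Percentage CPU > 800"
```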

