Friday, 24 January 2025

AZ-204 Question and Answer Part 19

 Question #301

Introductory Info Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background -

Overview -
You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user-uploaded content from blob storage.
You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
* Messages are sent to ContentUploadService.
* Content is processed by ContentAnalysisService.
* After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
The solution will use eight CPU cores.

Azure Active Directory -
Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

Requirements -

ContentAnalysisService -
The company's data science group built ContentAnalysisService, which accepts user-generated content as a string and returns a probability value indicating inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
You must create an Azure Function named CheckUserContent to perform the content checks.

Costs -
You must minimize costs for all Azure services.

Manual review -
To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. To review content, a user must be a member of the ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

High availability -
All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

Monitoring -
An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

Security -
You have the following security requirements:
Any web service accessible over the Internet must be protected from cross-site scripting (XSS) attacks.
All websites and services must use SSL certificates issued by a valid root certificate authority.
Azure Storage access keys must only be stored in memory and must be available only to the service.
All internal services must be accessible only from internal virtual networks (VNets).
All parts of the system must support inbound and outbound traffic restrictions.
All service calls must be authenticated by using Azure AD.

User agreements -
When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
Information regarding agreements is used by multiple divisions within Contoso, Ltd.
User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

Validation testing -
When a new version of the ContentAnalysisService is available, the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

Issues -
Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

Code -

ContentUploadService -


ApplicationManifest -
 
Question: You need to investigate the http server log output to resolve the issue with the ContentUploadService.
Which command should you use first?
  1. A
    az webapp log
  2. B
    az ams live-output
  3. C
    az monitor activity-log
  4. D
    az container attach
Correct Answer:
C
Scenario: Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.
"502 bad gateway" and "503 service unavailable" are common errors in your app hosted in Azure App Service.
Microsoft Azure publicizes each time there is a service interruption or performance degradation.
The az monitor activity-log command manages activity logs.
Note: Troubleshooting can be divided into three distinct tasks, in sequential order:
1. Observe and monitor application behavior
2. Collect data
3. Mitigate the issue
Reference:
https://docs.microsoft.com/en-us/cli/azure/monitor/activity-log
Question #302
Introductory Info Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background -
City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

Current environment -

Architecture overview -
The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
Event Grid handle messaging and events.
The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

Architecture diagram -
The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


User authentication -
The following steps detail the user authentication process:
1. The user selects Sign in on the website.
2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
3. The user signs in.
4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
6. The back-end API validates the access token.
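The validation in step 6 reduces to checking claims such as the audience. Below is a minimal, illustrative Python sketch of the audience check only; it is not a complete JWT validation, and a real back end must also verify the token signature, issuer, and expiry. The app ID value is a placeholder.

```python
def validate_access_token(claims, expected_audience):
    """Minimal audience check from step 6; a real back end must also
    verify the token signature, issuer, and expiry."""
    return claims.get("aud") == expected_audience

# The application ID is sent as the audience ('aud') claim.
app_id = "00000000-0000-0000-0000-000000000000"  # placeholder app ID
assert validate_access_token({"aud": app_id}, app_id)
assert not validate_access_token({"aud": "other-app"}, app_id)
print("token audience checks passed")
```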

Requirements -

Corporate website -
Communications and content must be secured by using SSL.
Communications must use HTTPS.
Data must be replicated to a secondary region and three availability zones.
Data storage costs must be minimized.

Azure Database for PostgreSQL -
The database connection string is stored in Azure Key Vault with the following attributes:
Azure Key Vault name: cpandlkeyvault
Secret name: PostgreSQLConn
Id: 80df3e46ffcd4f1cb187f79905e9a1e8
The connection information is updated frequently. The application must always use the latest information to connect to the database.
Azure Service Bus and Azure Event Grid -
Azure Event Grid must use Azure Service Bus for queue-based load leveling.
Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.
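The queue-based load leveling requirement above can be illustrated with a toy Python simulation (not Azure SDK code): a bursty producer enqueues events, and a consumer drains them at a steady rate, so the downstream service sees a flat load regardless of arrival spikes.

```python
from collections import deque

def simulate_load_leveling(burst_sizes, drain_rate):
    """Toy model of queue-based load leveling: bursty arrivals are
    buffered in a queue and consumed at a fixed rate per tick."""
    queue = deque()
    processed_per_tick = []
    for burst in burst_sizes:
        # Producer: a burst of events arrives this tick.
        for event in range(burst):
            queue.append(event)
        # Consumer: drains at most drain_rate events per tick.
        handled = 0
        while queue and handled < drain_rate:
            queue.popleft()
            handled += 1
        processed_per_tick.append(handled)
    return processed_per_tick, len(queue)

# Bursts of 10, 0, 0 events; the consumer handles 4 per tick.
processed, backlog = simulate_load_leveling([10, 0, 0], 4)
print(processed, backlog)  # [4, 4, 2] 0
```

The buffer absorbs the burst of 10 while the consumer works at its own pace, which is exactly the role the Service Bus queues play between Event Grid and the processing functions.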

Security -
All SSL certificates and credentials must be stored in Azure Key Vault.
File access must restrict access by IP, protocol, and Azure AD rights.
All user accounts and processes must receive only those privileges which are essential to perform their intended function.

Compliance -
Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

Issues -

Corporate website -
While testing the site, the following error message displays:
CryptographicException: The system cannot find the file specified.

Function app -
You perform local testing for the RequestUserApproval function. The following error message displays:
'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

Logic app -
You test the Logic app in a development environment. The following error message displays:
'400 Bad Request'
Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

Code -

Corporate website -
Security.cs:


Function app -
RequestUserApproval.cs:
Question: You need to investigate the Azure Function app error message in the development environment.
What should you do?
  1. A
    Connect Live Metrics Stream from Application Insights to the Azure Function app and filter the metrics.
  2. B
    Create a new Azure Log Analytics workspace and instrument the Azure Function app with Application Insights.
  3. C
    Update the Azure Function app with extension methods from Microsoft.Extensions.Logging to log events by using the log instance.
  4. D
    Add a new diagnostic setting to the Azure Function app to send logs to Log Analytics.
Correct Answer:
A
Azure Functions offers built-in integration with Azure Application Insights to monitor functions.
The following areas of Application Insights can be helpful when evaluating the behavior, performance, and errors in your functions:
Live Metrics: View metrics data as it's created, in near real time.
Failures: Review failed requests and exceptions.
Performance: Identify slow requests and dependencies.
Metrics: Chart and analyze metric values over time.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring
Question #303
Introductory Info Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

Background -
City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

Current environment -

Architecture overview -
The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
Event Grid handle messaging and events.
The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

Architecture diagram -
The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


User authentication -
The following steps detail the user authentication process:
1. The user selects Sign in in the website.
2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
3. The user signs in.
4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
6. The back-end API validates the access token.

Requirements -

Corporate website -
Communications and content must be secured by using SSL.
Communications must use HTTPS.
Data must be replicated to a secondary region and three availability zones.
Data storage costs must be minimized.

Azure Database for PostgreSQL -
The database connection string is stored in Azure Key Vault with the following attributes:
Azure Key Vault name: cpandlkeyvault
Secret name: PostgreSQLConn
Id: 80df3e46ffcd4f1cb187f79905e9a1e8
The connection information is updated frequently. The application must always use the latest information to connect to the database.
Azure Service Bus and Azure Event Grid -
Azure Event Grid must use Azure Service Bus for queue-based load leveling.
Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

Security -
All SSL certificates and credentials must be stored in Azure Key Vault.
File access must restrict access by IP, protocol, and Azure AD rights.
All user accounts and processes must receive only those privileges which are essential to perform their intended function.

Compliance -
Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

Issues -

Corporate website -
While testing the site, the following error message displays:
CryptographicException: The system cannot find the file specified.

Function app -
You perform local testing for the RequestUserApproval function. The following error message displays:
'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

Logic app -
You test the Logic app in a development environment. The following error message displays:
'400 Bad Request'
Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

Code -

Corporate website -
Security.cs:


Function app -
RequestUserApproval.cs:
Question: HOTSPOT -
You need to configure security and compliance for the corporate website files.
Which Azure Blob storage settings should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


    Correct Answer:

    Box 1: role-based access control (RBAC)
    Azure Storage supports authentication and authorization with Azure AD for the Blob and Queue services via Azure role-based access control (Azure RBAC).
    Scenario: File access must restrict access by IP, protocol, and Azure AD rights.

    Box 2: storage account type -
    Scenario: The website uses files stored in Azure Storage
    Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR).
    Creating a diagnostic setting:
    1. Sign in to the Azure portal.
    2. Navigate to your storage account.
    3. In the Monitoring section, click Diagnostic settings (preview).
    4. Choose file as the type of storage that you want to enable logs for.
    5. Click Add diagnostic setting.
    Reference:
    https://docs.microsoft.com/en-us/azure/storage/common/storage-introduction
    https://docs.microsoft.com/en-us/azure/storage/files/storage-files-monitoring
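The compliance requirement behind this question (audit records that are read-only, stored in the order they occurred, and limited to create, update, delete, and copy operations) can be illustrated with a small Python sketch. The class and paths below are illustrative, not an Azure API.

```python
# Operations the compliance requirement says must be audited.
ALLOWED_OPS = {"create", "update", "delete", "copy"}

class AuditLog:
    """Append-only, ordered audit trail mirroring the GDPR requirement
    that file-update records be read-only and stored in order."""
    def __init__(self):
        self._entries = []

    def record(self, operation, path):
        if operation not in ALLOWED_OPS:
            return False  # e.g. plain reads are not audited
        # Sequence number preserves the order in which updates occurred.
        self._entries.append((len(self._entries), operation, path))
        return True

    def entries(self):
        # Return a copy so callers cannot mutate the stored history.
        return list(self._entries)

log = AuditLog()
log.record("create", "site/logo.png")
log.record("read", "site/logo.png")   # ignored: not an audited operation
log.record("delete", "site/logo.png")
print(log.entries())
# [(0, 'create', 'site/logo.png'), (1, 'delete', 'site/logo.png')]
```

In Azure the same properties come from the storage logging pipeline itself (ordered, append-style diagnostic logs), not from application code; the sketch only shows what the requirement asks for.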
    Question #304
    Introductory Info Case study -
    This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
    To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
    At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

    To start the case study -
    To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

    Background -
    City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

    Current environment -

    Architecture overview -
    The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
    API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and
    Event Grid handle messaging and events.
    The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

    Architecture diagram -
    The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


    User authentication -
    The following steps detail the user authentication process:
    1. The user selects Sign in on the website.
    2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
    3. The user signs in.
    4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
    5. The web application calls an API and includes the access token in the authentication header. The application ID is sent as the audience ('aud') claim in the access token.
    6. The back-end API validates the access token.

    Requirements -

    Corporate website -
    Communications and content must be secured by using SSL.
    Communications must use HTTPS.
    Data must be replicated to a secondary region and three availability zones.
    Data storage costs must be minimized.

    Azure Database for PostgreSQL -
    The database connection string is stored in Azure Key Vault with the following attributes:
    Azure Key Vault name: cpandlkeyvault
    Secret name: PostgreSQLConn
    Id: 80df3e46ffcd4f1cb187f79905e9a1e8
    The connection information is updated frequently. The application must always use the latest information to connect to the database.
    Azure Service Bus and Azure Event Grid -
    Azure Event Grid must use Azure Service Bus for queue-based load leveling.
    Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
    Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

    Security -
    All SSL certificates and credentials must be stored in Azure Key Vault.
    File access must restrict access by IP, protocol, and Azure AD rights.
    All user accounts and processes must receive only those privileges which are essential to perform their intended function.

    Compliance -
    Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

    Issues -

    Corporate website -
    While testing the site, the following error message displays:
    CryptographicException: The system cannot find the file specified.

    Function app -
    You perform local testing for the RequestUserApproval function. The following error message displays:
    'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
    The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

    FunctionAppLogs
    | where FunctionName == "RequestUserApproval"

    Logic app -
    You test the Logic app in a development environment. The following error message displays:
    '400 Bad Request'
    Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

    Code -

    Corporate website -
    Security.cs:


    Function app -
    RequestUserApproval.cs:
    Question: You need to correct the RequestUserApproval Function app error.
    What should you do?
    1. A
      Update line RA13 to use the async keyword and return an HttpRequest object value.
    2. B
      Configure the Function app to use an App Service hosting plan. Enable the Always On setting of the hosting plan.
    3. C
      Update the function to be stateful by using Durable Functions to process the request payload.
    4. D
      Update the functionTimeout property of the host.json project file to 15 minutes.

    Correct Answer:
    C
    Async operation tracking -
    Durable Functions helps implement long-running HTTP async APIs by returning an HTTP 202 (Accepted) response with a status endpoint that the client polls. This pattern is sometimes referred to as the polling consumer pattern.
    Both the client and server implementations of this pattern are built into the Durable Functions HTTP APIs.

    Function app -
    You perform local testing for the RequestUserApproval function. The following error message displays:
    'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
    The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

    FunctionAppLogs
    | where FunctionName == "RequestUserApproval"
    Reference:
    https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-http-features
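The polling consumer pattern the answer relies on can be sketched independently of Azure: instead of one HTTP call that times out at 00:10:00, the client starts the long-running operation, receives a status handle, and polls until the work completes. This is a toy Python model with illustrative names, not the Durable Functions SDK.

```python
class Orchestration:
    """Toy stand-in for a durable orchestration: work advances out of
    band, and each status check observes the current progress."""
    def __init__(self, steps):
        self.remaining = steps
        self.status = "Running"

    def check_status(self):
        # Simulate background progress between client polls.
        if self.remaining > 0:
            self.remaining -= 1
        if self.remaining == 0:
            self.status = "Completed"
        return self.status

def poll_until_done(orchestration, max_polls=10):
    """Client side of the polling consumer pattern: short status
    requests instead of one long-running HTTP call."""
    for polls in range(1, max_polls + 1):
        if orchestration.check_status() == "Completed":
            return polls
    raise TimeoutError("gave up polling")

print(poll_until_done(Orchestration(steps=3)))  # 3
```

Each poll returns quickly, so no single request can exceed the function timeout, which is why restructuring with Durable Functions fixes the error rather than merely extending the limit.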
    Question #305
    Introductory Info Case study -
    This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
    To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
    At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

    To start the case study -
    To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

    Background -
    You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

    Requirements -

    Policy service -
    You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
    The application must include the Event Grid Event ID field in all Application Insights telemetry.
    Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

    Policies -

    Log policy -
    All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

    Authentication events -
    Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

    PolicyLib -
    You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
    Exclude non-user actions from Application Insights telemetry.
    Provide methods that allow a web service to scale itself.
    Ensure that scaling actions do not disrupt application usage.

    Other -

    Anomaly detection service -
    You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

    Health monitoring -
    All web applications and services have health monitoring at the /health service endpoint.

    Issues -

    Policy loss -
    When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

    Performance issue -
    When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

    Notification latency -
    Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

    App code -

    EventGridController.cs -
    Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


    LoginEvent.cs -
    Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
     

    Question 
    DRAG DROP -
    You need to implement the Log policy.
    How should you complete the Azure Event Grid subscription? To answer, drag the appropriate JSON segments to the correct locations. Each JSON segment may be used once, more than once, or not at all. You may need to drag the split bar between panes to view content.
    NOTE: Each correct selection is worth one point.
    Select and Place:


      Correct Answer:

      Box 1: WebHook -
      Scenario: If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.
      endpointType: The type of endpoint for the subscription (WebHook/HTTP, Event Hub, or queue).

      Box 2: SubjectBeginsWith -
      Box 3: Microsoft.Storage.BlobCreated

      Scenario: Log Policy -
      All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

      Example subscription schema -
{
  "properties": {
    "destination": {
      "endpointType": "webhook",
      "properties": {
        "endpointUrl": "https://example.azurewebsites.net/api/HttpTriggerCSharp1?code=VXbGWce53l48Mt8wuotr0GPmyJ/nDT4hgdFj9DpBiRt38qqnnm5OFg=="
      }
    },
    "filter": {
      "includedEventTypes": [ "Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted" ],
      "subjectBeginsWith": "blobServices/default/containers/mycontainer/log",
      "isSubjectCaseSensitive": "true"
    }
  }
}
      Reference:
      https://docs.microsoft.com/en-us/azure/event-grid/subscription-creation-schema
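      Applying the three box answers to the Log policy scenario, the completed subscription might look roughly like the following sketch. The endpoint URL and function name are placeholders, not values from the case study:

      ```json
      {
        "properties": {
          "destination": {
            "endpointType": "WebHook",
            "properties": {
              "endpointUrl": "https://contoso.azurewebsites.net/api/LogPolicy?code=<function-key>"
            }
          },
          "filter": {
            "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ],
            "subjectBeginsWith": "blobServices/default/containers/logdrop"
          }
        }
      }
      ```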
      Question #306
      Introductory Info Case study -
      This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
      To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
      At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

      To start the case study -
      To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

      Background -
      You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

      Requirements -

      Policy service -
      You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
      The application must include the Event Grid Event ID field in all Application Insights telemetry.
      Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

      Policies -

      Log policy -
      All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

      Authentication events -
      Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

      PolicyLib -
      You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
      Exclude non-user actions from Application Insights telemetry.
      Provide methods that allow a web service to scale itself.
      Ensure that scaling actions do not disrupt application usage.

      Other -

      Anomaly detection service -
      You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

      Health monitoring -
      All web applications and services have health monitoring at the /health service endpoint.

      Issues -

      Policy loss -
      When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

      Performance issue -
      When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

      Notification latency -
      Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

      App code -

      EventGridController.cs -
      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


      LoginEvent.cs -
      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
       Question
      You need to ensure that the solution can meet the scaling requirements for Policy Service.
      Which Azure Application Insights data model should you use?
      1. A
        an Application Insights dependency
      2. B
        an Application Insights event
      3. C
        an Application Insights trace
      4. D
        an Application Insights metric

      Correct Answer:
      D
      Application Insights provides three additional data types for custom telemetry:
      Trace - used either directly, or through an adapter to implement diagnostics logging using an instrumentation framework that is familiar to you, such as Log4Net or
      System.Diagnostics.
      Event - typically used to capture user interaction with your service, to analyze usage patterns.
      Metric - used to report periodic scalar measurements.
      Scenario:
      Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.
      Reference:
      https://docs.microsoft.com/en-us/azure/azure-monitor/app/data-model
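      As an illustration, a custom metric counting policy actions could be reported with the Application Insights SDK's TelemetryClient roughly as follows. The metric and class names are invented for this sketch:

      ```csharp
      using Microsoft.ApplicationInsights;

      public class PolicyActionReporter
      {
          private readonly TelemetryClient _telemetry;

          public PolicyActionReporter(TelemetryClient telemetry)
          {
              _telemetry = telemetry;
          }

          public void ReportPolicyAction()
          {
              // GetMetric pre-aggregates values locally and sends periodic
              // scalar measurements - the Metric data model - which an
              // autoscale rule can then act on.
              _telemetry.GetMetric("PolicyActionsPerformed").TrackValue(1);
          }
      }
      ```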
      Question #307
      Introductory Info Case study -
      This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
      To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
      At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

      To start the case study -
      To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

      Background -
      You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

      Requirements -

      Policy service -
      You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
      The application must include the Event Grid Event ID field in all Application Insights telemetry.
      Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

      Policies -

      Log policy -
      All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

      Authentication events -
      Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

      PolicyLib -
      You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
      Exclude non-user actions from Application Insights telemetry.
      Provide methods that allow a web service to scale itself.
      Ensure that scaling actions do not disrupt application usage.

      Other -

      Anomaly detection service -
      You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

      Health monitoring -
      All web applications and services have health monitoring at the /health service endpoint.

      Issues -

      Policy loss -
      When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

      Performance issue -
      When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

      Notification latency -
      Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

      App code -

      EventGridController.cs -
      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


      LoginEvent.cs -
      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
       
      Question 
      DRAG DROP -
      You need to implement telemetry for non-user actions.
      How should you complete the Filter class? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
      NOTE: Each correct selection is worth one point.
      Select and Place:


        Correct Answer:

        Scenario: Exclude non-user actions from Application Insights telemetry.

        Box 1: ITelemetryProcessor -
        To create a filter, implement ITelemetryProcessor. This technique gives you more direct control over what is included or excluded from the telemetry stream.

        Box 2: ITelemetryProcessor -

        Box 3: ITelemetryProcessor -

        Box 4: RequestTelemetry -

        Box 5: /health -
        To filter out an item, terminate the chain: return without calling the next processor.
        Reference:
        https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling
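        Putting the boxes together, the completed filter might look roughly like the following sketch. The class name and the null-handling details are assumptions; the SDK interfaces and types are as documented:

        ```csharp
        using Microsoft.ApplicationInsights.Channel;
        using Microsoft.ApplicationInsights.DataContracts;
        using Microsoft.ApplicationInsights.Extensibility;

        public class HealthRequestFilter : ITelemetryProcessor
        {
            private readonly ITelemetryProcessor _next;

            public HealthRequestFilter(ITelemetryProcessor next)
            {
                _next = next;
            }

            public void Process(ITelemetry item)
            {
                // Drop telemetry for the /health monitoring endpoint by
                // returning without calling the next processor in the chain.
                if (item is RequestTelemetry request &&
                    request.Url?.AbsolutePath.StartsWith("/health") == true)
                {
                    return;
                }
                _next.Process(item);
            }
        }
        ```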
        Question #308
        Introductory Info Case study -
        This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
        To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
        At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

        To start the case study -
        To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

        Background -
        You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

        Requirements -

        Policy service -
        You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
        The application must include the Event Grid Event ID field in all Application Insights telemetry.
        Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

        Policies -

        Log policy -
        All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

        Authentication events -
        Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

        PolicyLib -
        You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
        Exclude non-user actions from Application Insights telemetry.
        Provide methods that allow a web service to scale itself.
        Ensure that scaling actions do not disrupt application usage.

        Other -

        Anomaly detection service -
        You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

        Health monitoring -
        All web applications and services have health monitoring at the /health service endpoint.

        Issues -

        Policy loss -
        When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

        Performance issue -
        When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

        Notification latency -
        Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

        App code -

        EventGridController.cs -
        Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


        LoginEvent.cs -
        Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
         Question
        DRAG DROP -
        You need to ensure that PolicyLib requirements are met.
        How should you complete the code segment? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
        NOTE: Each correct selection is worth one point.
        Select and Place:


          Correct Answer:

          Scenario: You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
          ✑ Exclude non-user actions from Application Insights telemetry.
          ✑ Provide methods that allow a web service to scale itself.
          ✑ Ensure that scaling actions do not disrupt application usage.

          Box 1: ITelemetryInitializer -
          Use telemetry initializers to define global properties that are sent with all telemetry, and to override selected behavior of the standard telemetry modules.

          Box 2: Initialize -

          Box 3: Telemetry.Context -
          Box 4: ((EventTelemetry)telemetry).Properties["EventID"]
          Reference:
          https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling
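          A telemetry initializer that stamps the Event Grid Event ID onto all telemetry could be sketched as below. The class name is invented; note that newer SDK versions expose Context.GlobalProperties, while the 2.1-era SDK used Context.Properties:

          ```csharp
          using Microsoft.ApplicationInsights.Channel;
          using Microsoft.ApplicationInsights.Extensibility;

          public class EventIdInitializer : ITelemetryInitializer
          {
              private readonly string _eventId;

              public EventIdInitializer(string eventId)
              {
                  _eventId = eventId;
              }

              public void Initialize(ITelemetry telemetry)
              {
                  // Properties set on the context here are attached to every
                  // telemetry item the application sends.
                  telemetry.Context.GlobalProperties["EventID"] = _eventId;
              }
          }
          ```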
          Question #309
          Introductory Info Case study -
          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

          To start the case study -
          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

          Background -
          You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web
          API project that is deployed as an Azure Web App.

          Overall architecture -
          Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

          Receipt processing -
          Employees may upload receipts in two ways:
          Uploading using an Azure Files mounted folder
          Uploading using the web application

          Data Storage -
          Receipt and employee information is stored in an Azure SQL database.

          Documentation -
          Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for
          Azure File upload, and instructions on how to configure the mounted folder.

          Solution details -

          Users table -


          Web Application -
          You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

          Processing -
          Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob
          Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

          Logging -
          Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
          Application Insights must always contain all log messages.

          Requirements -

          Receipt processing -
          Concurrent processing of a receipt must be prevented.

          Disaster recovery -
          Regional outage must not impact application availability. All DR operations must not be dependent on application running and must ensure that data in the DR region is up to date.

          Security -
          User's SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
          All certificates and secrets used to secure data must be stored in Azure Key Vault.
          You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
          All access to Azure Storage and Azure SQL database must use the application's Managed Service Identity (MSI).
          Receipt data must always be encrypted at rest.
          All data must be protected in transit.
          User's expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
          In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

          Issues -

          Upload format issue -
          Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File
          Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal
          Server error page.

          Capacity issue -
          During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

          Log capacity issue -
          Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

          Application code -

          Processing.cs -


          Database.cs -


          ReceiptUploader.cs -


          ConfigureSSE.ps1 -
           Question
          You need to ensure receipt processing occurs correctly.
          What should you do?
          1. A
            Use blob properties to prevent concurrency problems
          2. B
            Use blob SnapshotTime to prevent concurrency problems
          3. C
            Use blob metadata to prevent concurrency problems
          4. D
            Use blob leases to prevent concurrency problems

          Correct Answer:

          D

          A lease on a blob establishes an exclusive lock for write and delete operations. While the lease is active, no other client can modify or delete the blob, and a competing attempt to acquire the lease fails, so the same receipt cannot be processed by two workers at the same time. A snapshot, by contrast, is only a read-only, point-in-time copy of a blob; snapshots, blob properties, and blob metadata do not provide this pessimistic concurrency control.
          Scenario: Receipt processing: Concurrent processing of a receipt must be prevented.
          Reference:
          https://docs.microsoft.com/en-us/rest/api/storageservices/lease-blob
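          Preventing concurrent processing of a blob is a pessimistic-concurrency problem, which Blob storage solves with leases. A rough sketch using the Azure.Storage.Blobs SDK follows; the method and parameter names are illustrative, not taken from the case study code:

          ```csharp
          using System;
          using System.Threading.Tasks;
          using Azure.Storage.Blobs;
          using Azure.Storage.Blobs.Specialized;

          public static class ReceiptLock
          {
              // Acquire an exclusive lease on the receipt blob before
              // processing. While the lease is held no other client can
              // modify or delete the blob, and a second worker's
              // AcquireAsync call fails, so the same receipt is never
              // processed twice at the same time.
              public static async Task ProcessWithLeaseAsync(
                  BlobClient receiptBlob, Func<Task> processAsync)
              {
                  BlobLeaseClient lease = receiptBlob.GetBlobLeaseClient();
                  await lease.AcquireAsync(TimeSpan.FromSeconds(60));
                  try
                  {
                      await processAsync();
                  }
                  finally
                  {
                      await lease.ReleaseAsync();
                  }
              }
          }
          ```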


          Question #310
          Introductory Info Case study -
          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

          To start the case study -
          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

          Background -
          You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web
          API project that is deployed as an Azure Web App.

          Overall architecture -
          Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

          Receipt processing -
          Employees may upload receipts in two ways:
          Uploading using an Azure Files mounted folder
          Uploading using the web application

          Data Storage -
          Receipt and employee information is stored in an Azure SQL database.

          Documentation -
          Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for
          Azure File upload, and instructions on how to configure the mounted folder.

          Solution details -

          Users table -


          Web Application -
          You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

          Processing -
          Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob
          Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

          Logging -
          Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
          Application Insights must always contain all log messages.

          Requirements -

          Receipt processing -
          Concurrent processing of a receipt must be prevented.

          Disaster recovery -
          Regional outage must not impact application availability. All DR operations must not be dependent on application running and must ensure that data in the DR region is up to date.

          Security -
          User's SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
          All certificates and secrets used to secure data must be stored in Azure Key Vault.
          You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
          All access to Azure Storage and Azure SQL database must use the application's Managed Service Identity (MSI).
          Receipt data must always be encrypted at rest.
          All data must be protected in transit.
          User's expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
          In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

          Issues -

          Upload format issue -
          Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File
          Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal
          Server error page.

          Capacity issue -
          During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

          Log capacity issue -
          Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

          Application code -

          Processing.cs -


          Database.cs -


          ReceiptUploader.cs -


          ConfigureSSE.ps1 -
           Question
          You need to resolve the capacity issue.
          What should you do?
          1. A
            Convert the trigger on the Azure Function to an Azure Blob storage trigger
          2. B
            Ensure that the consumption plan is configured correctly to allow scaling
          3. C
            Move the Azure Function to a dedicated App Service Plan
          4. D
            Update the loop starting on line PC09 to process items in parallel

          Correct Answer:
          D
          The loop starting on line PC09 awaits each receipt before starting the next, so receipts are processed one at a time. Starting all of the tasks first and awaiting them together with Task.WhenAll processes the items in parallel, reducing the delay between upload and the receipt appearing in the web application.
          Scenario: Capacity issue: During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

          Reference:
          https://docs.microsoft.com/en-us/dotnet/api/system.threading.tasks.task.whenall
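          Assuming the PC09 loop awaits each receipt in turn, a parallel version could be sketched as follows. The delegate and names are hypothetical, not taken from the actual Processing.cs:

          ```csharp
          using System;
          using System.Collections.Generic;
          using System.Linq;
          using System.Threading.Tasks;

          public static class ParallelProcessing
          {
              // Instead of awaiting each item inside the loop (sequential),
              // start all of the tasks first and await them together so the
              // receipts are processed in parallel.
              public static Task ProcessAllAsync(
                  IEnumerable<string> receipts,
                  Func<string, Task> processReceiptAsync)
              {
                  IEnumerable<Task> tasks = receipts.Select(processReceiptAsync);
                  return Task.WhenAll(tasks);
              }
          }
          ```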
          Question #311
          Introductory Info Case study -
          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

          To start the case study -
          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

          Background -
          You are a developer for Litware Inc., a SaaS company that provides a solution for managing employee expenses. The solution consists of an ASP.NET Core Web
          API project that is deployed as an Azure Web App.

          Overall architecture -
          Employees upload receipts for the system to process. When processing is complete, the employee receives a summary report email that details the processing results. Employees then use a web application to manage their receipts and perform any additional tasks needed for reimbursement.

          Receipt processing -
          Employees may upload receipts in two ways:
          Uploading using an Azure Files mounted folder
          Uploading using the web application

          Data Storage -
          Receipt and employee information is stored in an Azure SQL database.

          Documentation -
          Employees are provided with a getting started document when they first use the solution. The documentation includes details on supported operating systems for
          Azure File upload, and instructions on how to configure the mounted folder.

          Solution details -

          Users table -


          Web Application -
          You enable MSI for the Web App and configure the Web App to use the security principal name WebAppIdentity.

          Processing -
          Processing is performed by an Azure Function that uses version 2 of the Azure Function runtime. Once processing is completed, results are stored in Azure Blob
          Storage and an Azure SQL database. Then, an email summary is sent to the user with a link to the processing report. The link to the report must remain valid if the email is forwarded to another user.

          Logging -
          Azure Application Insights is used for telemetry and logging in both the processor and the web application. The processor also has TraceWriter logging enabled.
          Application Insights must always contain all log messages.

          Requirements -

          Receipt processing -
          Concurrent processing of a receipt must be prevented.

          Disaster recovery -
          A regional outage must not impact application availability. DR operations must not depend on the application running and must ensure that data in the DR region is up to date.

          Security -
          User's SecurityPin must be stored in such a way that access to the database does not allow the viewing of SecurityPins. The web application is the only system that should have access to SecurityPins.
          All certificates and secrets used to secure data must be stored in Azure Key Vault.
          You must adhere to the principle of least privilege and provide privileges which are essential to perform the intended function.
          All access to Azure Storage and Azure SQL database must use the application's Managed Service Identity (MSI).
          Receipt data must always be encrypted at rest.
          All data must be protected in transit.
          User's expense account number must be visible only to logged in users. All other views of the expense account number should include only the last segment, with the remaining parts obscured.
          In the case of a security breach, access to all summary reports must be revoked without impacting other parts of the system.

          Issues -

          Upload format issue -
          Employees occasionally report an issue with uploading a receipt using the web application. They report that when they upload a receipt using the Azure File
          Share, the receipt does not appear in their profile. When this occurs, they delete the file in the file share and use the web application, which returns a 500 Internal
          Server error page.

          Capacity issue -
          During busy periods, employees report long delays between the time they upload the receipt and when it appears in the web application.

          Log capacity issue -
          Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.

          Application code -

          Processing.cs -


          Database.cs -


          ReceiptUploader.cs -


          ConfigureSSE.ps1 -
           Question: You need to resolve the log capacity issue.
          What should you do?
          1. A
            Create an Application Insights Telemetry Filter
          2. B
            Change the minimum log level in the host.json file for the function
          3. C
            Implement Application Insights Sampling
          4. D
            Set a LogCategoryFilter during startup

          Correct Answer:
          C
          Scenario, the log capacity issue: Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.
          Sampling is a feature of Azure Application Insights. It is the recommended way to reduce telemetry traffic and storage while preserving a statistically correct analysis of application data. The sampling algorithm selects related items, so that you can still navigate between them when you are doing diagnostic investigations. When metric counts are presented in the portal, they are renormalized to take account of the sampling, to minimize any effect on the statistics.
          Sampling reduces traffic and data costs, and helps you avoid throttling.
          Reference:
          https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling
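          For an Azure Functions app on the v2 runtime, sampling is configured in host.json. A minimal sketch is below; the numeric limit is illustrative, and depending on the exact runtime version the section may be named sampling rather than samplingSettings:

```json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "maxTelemetryItemsPerSecond": 20
      }
    }
  }
}
```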
          Question #312
          Introductory Info Case study -

          Background -
          VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

          Current environment -

          Corporate website -
          The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

          Retail Store Locations -
          The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

          Requirements -
          The application components must meet the following requirements:

          Corporate website -
          Secure the website by using SSL.
          Minimize costs for data storage and hosting.
          Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
          Distribute the website content globally for local use.
          Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
          The website must have 99.95 percent uptime.

          Retail store locations -
          Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
          Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

          Delivery services -
          Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
          Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

          Inventory services -
          The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

          Security -
          All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
          Authentication and authorization must use Azure AD and services must use managed identities where possible.

          Issues -

          Retail Store Locations -
          You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
          Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

          Question: HOTSPOT -
          You need to implement event routing for retail store location data.
          Which configurations should you use? To answer, select the appropriate options in the answer area.
          NOTE: Each correct selection is worth one point.
          Hot Area:


            Correct Answer:

            Box 1: Azure Blob Storage -
            Azure event publishers and event handlers are at the core of the Event Grid routing service. Event Grid listens to Azure event publishers, such as Blob Storage, then reacts by routing specific events to Azure event handlers, such as WebHooks. You can easily control this entire process at a granular level through event subscriptions and event filters.

            Box 2: Azure Event Grid -
            Azure Event Grid is a highly scalable event-routing service that listens for specific system events, then reacts to them according to your precise specifications. In the past, event handling has relied largely on polling, a high-latency, low-efficiency approach that can prove prohibitively expensive at scale.

            Box 3: Azure Logic App -
            Event Grid's supported event handlers currently include Event Hubs, WebHooks, Logic Apps, Azure Functions, Azure Automation and Microsoft Flow.
            Reference:
            https://www.appliedi.net/blog/using-azure-event-grid-for-highly-scalable-event-routing
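            With this routing in place, a blob upload from a retail store produces a Microsoft.Storage.BlobCreated event that Event Grid delivers to the Logic App. The payload delivered to the handler looks roughly like the sketch below; the subscription ID, account, container, and blob names are illustrative:

```json
[
  {
    "topic": "/subscriptions/<sub-id>/resourceGroups/retail/providers/Microsoft.Storage/storageAccounts/retailstore",
    "subject": "/blobServices/default/containers/sales/blobs/store-1001.json",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2025-01-24T10:00:00Z",
    "id": "00000000-0000-0000-0000-000000000000",
    "data": {
      "api": "PutBlob",
      "contentType": "application/json",
      "url": "https://retailstore.blob.core.windows.net/sales/store-1001.json"
    },
    "dataVersion": "1",
    "metadataVersion": "1"
  }
]
```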
            Question #313
            Introductory Info Case study -

            LabelMaker app -
            Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
            Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
            External partners send data to the LabelMaker application to include artwork and text for custom label designs.

            Requirements. Data -
            You identify the following requirements for data management and manipulation:
            Order data is stored as nonrelational JSON and must be queried using SQL.
            Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

            Requirements. Security -
            You have the following security requirements:
            Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

            External partners must use their own credentials and authenticate with their organization's identity management solution.
            External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
            Storage of e-commerce application settings must be maintained in Azure Key Vault.
            E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
            Conditional access policies must be applied at the application level to protect company content.
            The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

            Requirements. LabelMaker app -
            Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure
            Kubernetes Service (AKS).
            You must use Azure Container Registry to publish images that support the AKS deployment.

            Architecture -


            Issues -
            Calls to the Printer API App fail periodically due to printer communication timeouts.
            Printer communication timeouts occur after 10 seconds. The label printer must only receive up to 5 attempts within one minute.
            The order workflow fails to run upon initial deployment to Azure.

            Order.json -
            Relevant portions of the app files are shown below. Line numbers are included for reference only.
            This JSON file contains a representation of the data for an order that includes a single item.
             

            Question: You need to troubleshoot the order workflow.
            Which two actions should you perform? Each correct answer presents part of the solution.
            NOTE: Each correct selection is worth one point.
            1. A
              Review the API connections.
            2. B
              Review the activity log.
            3. C
              Review the run history.
            4. D
              Review the trigger history.

            Correct Answer:
            CD
            Scenario: The order workflow fails to run upon initial deployment to Azure.
            Check the runs history: each time the trigger fires for an item or event, the Logic Apps engine creates and runs a separate workflow instance for that item or event. If a run fails, you can review what happened during that run, including the status for each step in the workflow plus the inputs and outputs for each step.
            Check the workflow's run status by checking the runs history. To view more information about a failed run, including all the steps in that run in their status, select the failed run.
            Example:

            Check the trigger's status by checking the trigger history
            To view more information about the trigger attempt, select that trigger event, for example:

            Reference:
            https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-diagnosing-failures
            Question #314
            Introductory Info Case study -

            LabelMaker app -
            Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
            Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
            External partners send data to the LabelMaker application to include artwork and text for custom label designs.

            Requirements. Data -
            You identify the following requirements for data management and manipulation:
            Order data is stored as nonrelational JSON and must be queried using SQL.
            Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

            Requirements. Security -
            You have the following security requirements:
            Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

            External partners must use their own credentials and authenticate with their organization's identity management solution.
            External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
            Storage of e-commerce application settings must be maintained in Azure Key Vault.
            E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
            Conditional access policies must be applied at the application level to protect company content.
            The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

            Requirements. LabelMaker app -
            Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure
            Kubernetes Service (AKS).
            You must use Azure Container Registry to publish images that support the AKS deployment.

            Architecture -


            Issues -
            Calls to the Printer API App fail periodically due to printer communication timeouts.
            Printer communication timeouts occur after 10 seconds. The label printer must only receive up to 5 attempts within one minute.
            The order workflow fails to run upon initial deployment to Azure.

            Order.json -
            Relevant portions of the app files are shown below. Line numbers are included for reference only.
            This JSON file contains a representation of the data for an order that includes a single item.
             Question HOTSPOT -
            You need to update the order workflow to address the issue when calling the Printer API App.
            How should you complete the code? To answer, select the appropriate options in the answer area.
            NOTE: Each correct selection is worth one point.
            Hot Area:


              Correct Answer:

              Box 1: fixed -
              The 'Default' policy performs four exponential retries, and in practice the intervals are often too short for situations like this one.

              Box 2: PT60S -
              We could set a fixed interval, e.g. 5 retries every 60 seconds (PT60S).
              PT60S is 60 seconds.
              Scenario: Calls to the Printer API App fail periodically due to printer communication timeouts.
              Printer communication timeouts occur after 10 seconds. The label printer must only receive up to 5 attempts within one minute.

              Box 3: 5 -
              Reference:
              https://michalsacewicz.com/error-handling-in-power-automate/
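              In the Logic App workflow definition, the retry policy on the HTTP action that calls the Printer API would then look roughly like this. The action name and URI are illustrative; the retryPolicy block combines the three answers above:

```json
"Call_Printer_API": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://printer-api.example.com/print",
    "retryPolicy": {
      "type": "fixed",
      "interval": "PT60S",
      "count": 5
    }
  }
}
```

              A fixed interval of PT60S with a count of 5 satisfies the constraint that the printer receives at most 5 attempts within one minute windows, while comfortably exceeding the 10-second communication timeout.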
              Question #315
              Introductory Info Case study -

              Background -
              Wide World Importers is moving all their datacenters to Azure. The company has developed several applications and services to support supply chain operations and would like to leverage serverless computing where possible.

              Current environment -
              Windows Server 2016 virtual machine
              This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
               Ocean Transport - This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
               Inland Transport - This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
               The VM supports the following REST API calls:
               Container API - This API provides container information including weight, contents, and other attributes.
               Location API - This API provides location information regarding shipping ports of call and trucking stops.
               Shipping REST API - This API provides shipping information for use and display on the shipping website.

              Shipping Data -
               The application uses a MongoDB JSON document storage database for all container and transport information.

              Shipping Web Site -
              The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

              Proposed solution -
              The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server. The Azure architecture diagram for the proposed solution is shown below:


              Requirements -

              Shipping Logic app -
              The Shipping Logic app must meet the following requirements:
              Support the ocean transport and inland transport workflows by using a Logic App.
              Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
              Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
              Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

              Shipping Function app -
              Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

              REST APIs -
              The REST API's that support the solution must meet the following requirements:
              Secure resources to the corporate VNet.
              Allow deployment to a testing location within Azure while not incurring additional costs.
              Automatically scale to double capacity during peak shipping times while not causing application downtime.
              Minimize costs when selecting an Azure payment model.

              Shipping data -
              Data migration from on-premises to Azure must minimize costs and downtime.

              Shipping website -
              Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

              Issues -

              Windows Server 2016 VM -
              The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

              Shipping website and REST APIs -
              The following error message displays while you are testing the website:
               Failed to load http://test-shippingapi.wideworldimporters.com/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://test.wideworldimporters.com/' is therefore not allowed access.

               Question: DRAG DROP -
              You need to support the message processing for the ocean transport workflow.
              Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
              Select and Place:


                Correct Answer:

                Step 1: Create an integration account in the Azure portal
                You can define custom metadata for artifacts in integration accounts and get that metadata during runtime for your logic app to use. For example, artifacts such as partners, agreements, schemas, and maps all store metadata as key-value pairs.
                Step 2: Link the Logic App to the integration account
                A logic app that's linked to the integration account and artifact metadata you want to use.
                Step 3: Add partners, schemas, certificates, maps, and agreements
                Step 4: Create a custom connector for the Logic App.
                Reference:
                https://docs.microsoft.com/bs-latn-ba/azure/logic-apps/logic-apps-enterprise-integration-metadata
                Question #316
                Introductory Info Case study -

                Background -
                Wide World Importers is moving all their datacenters to Azure. The company has developed several applications and services to support supply chain operations and would like to leverage serverless computing where possible.

                Current environment -
                Windows Server 2016 virtual machine
                This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
Ocean Transport - This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
Inland Transport - This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
The VM supports the following REST API calls:
Container API - This API provides container information including weight, contents, and other attributes.
Location API - This API provides location information regarding shipping ports of call and trucking stops.
Shipping REST API - This API provides shipping information for use and display on the shipping website.

                Shipping Data -
The application uses a MongoDB JSON document storage database for all container and transport information.

                Shipping Web Site -
                The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/

                Proposed solution -
                The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server. The Azure architecture diagram for the proposed solution is shown below:


                Requirements -

                Shipping Logic app -
                The Shipping Logic app must meet the following requirements:
                Support the ocean transport and inland transport workflows by using a Logic App.
                Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
                Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
                Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.

                Shipping Function app -
                Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).

                REST APIs -
The REST APIs that support the solution must meet the following requirements:
                Secure resources to the corporate VNet.
                Allow deployment to a testing location within Azure while not incurring additional costs.
                Automatically scale to double capacity during peak shipping times while not causing application downtime.
                Minimize costs when selecting an Azure payment model.

                Shipping data -
                Data migration from on-premises to Azure must minimize costs and downtime.

                Shipping website -
                Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.

                Issues -

                Windows Server 2016 VM -
                The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.

                Shipping website and REST APIs -
                The following error message displays while you are testing the website:
Failed to load http://test-shippingapi.wideworldimporters.com/: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://test.wideworldimporters.com/' is therefore not allowed access.
Question: You need to support the requirements for the Shipping Logic App.
                What should you use?
A. Azure Active Directory Application Proxy
B. Site-to-Site (S2S) VPN connection
C. On-premises Data Gateway
D. Point-to-Site (P2S) VPN connection

                Correct Answer:
                C
                Before you can connect to on-premises data sources from Azure Logic Apps, download and install the on-premises data gateway on a local computer. The gateway works as a bridge that provides quick data transfer and encryption between data sources on premises (not in the cloud) and your logic apps.
                The gateway supports BizTalk Server 2016.
Note: Microsoft has now fully incorporated the Azure BizTalk Services capabilities into Logic Apps and Azure App Service Hybrid Connections.
The Logic Apps Enterprise Integration Pack adds enterprise B2B capabilities, such as support for the AS2 and X12 EDI standards.
                Scenario: The Shipping Logic app must meet the following requirements:
                ✑ Support the ocean transport and inland transport workflows by using a Logic App.
                ✑ Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
                ✑ Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
                ✑ Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.
                Reference:
                https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-install
                Question #317
                Introductory Info Case study -
                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                To start the case study -
                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                Background -
                City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                Current environment -

                Architecture overview -
                The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.
                The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                Architecture diagram -
                The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                User authentication -
                The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                3. The user signs in.
                4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
5. The web application calls an API and includes the access token in the Authorization header. The application ID is sent as the audience ('aud') claim in the access token.
                6. The back-end API validates the access token.

                Requirements -

                Corporate website -
                Communications and content must be secured by using SSL.
                Communications must use HTTPS.
                Data must be replicated to a secondary region and three availability zones.
                Data storage costs must be minimized.

                Azure Database for PostgreSQL -
                The database connection string is stored in Azure Key Vault with the following attributes:
                Azure Key Vault name: cpandlkeyvault
                Secret name: PostgreSQLConn
                Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                The connection information is updated frequently. The application must always use the latest information to connect to the database.
Azure Service Bus and Azure Event Grid -
                Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                Security -
                All SSL certificates and credentials must be stored in Azure Key Vault.
                File access must restrict access by IP, protocol, and Azure AD rights.
                All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                Compliance -
                Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                Issues -

                Corporate website -
                While testing the site, the following error message displays:
                CryptographicException: The system cannot find the file specified.

                Function app -
                You perform local testing for the RequestUserApproval function. The following error message displays:
                'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

                Logic app -
                You test the Logic app in a development environment. The following error message displays:
                '400 Bad Request'
                Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

                Code -

                Corporate website -
                Security.cs:


                Function app -
                RequestUserApproval.cs:
Question: HOTSPOT -
                You need to configure the integration for Azure Service Bus and Azure Event Grid.
                How should you complete the CLI statement? To answer, select the appropriate options in the answer area.
                NOTE: Each correct selection is worth one point.
                Hot Area:


                  Correct Answer:

                  Box 1: eventgrid -
To create an event subscription, use: az eventgrid event-subscription create

                  Box 2: event-subscription -

                  Box 3: servicebusqueue -
                  Scenario: Azure Service Bus and Azure Event Grid
                  Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                  Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                  Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.
                  Reference:
                  https://docs.microsoft.com/en-us/cli/azure/eventgrid/event-subscription?view=azure-cli-latest#az_eventgrid_event_subscription_create
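Put together, the completed statement routes Event Grid events directly to a Service Bus queue for buffering. The subscription name and resource IDs below are hypothetical placeholders; only the command and flag names come from the referenced az CLI documentation.

```shell
# Route events from a source (here, a hypothetical storage account) to a Service Bus queue.
# --endpoint-type servicebusqueue tells Event Grid to deliver into the queue for buffering.
az eventgrid event-subscription create \
  --name route-to-servicebus \
  --source-resource-id "/subscriptions/<sub-id>/resourceGroups/cpandl-rg/providers/Microsoft.Storage/storageAccounts/cpandlstorage" \
  --endpoint-type servicebusqueue \
  --endpoint "/subscriptions/<sub-id>/resourceGroups/cpandl-rg/providers/Microsoft.ServiceBus/namespaces/cpandl-sb/queues/buffer-queue"
```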
                  Question #318
                  Introductory Info Case study -
                  This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                  To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                  At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                  To start the case study -
                  To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                  Background -
                  City Power & Light company provides electrical infrastructure monitoring solutions for homes and businesses. The company is migrating solutions to Azure.

                  Current environment -

                  Architecture overview -
                  The company has a public website located at http://www.cpandl.com/. The site is a single-page web application that runs in Azure App Service on Linux. The website uses files stored in Azure Storage and cached in Azure Content Delivery Network (CDN) to serve static content.
API Management and Azure Function App functions are used to process and store data in Azure Database for PostgreSQL. API Management is used to broker communications to the Azure Function app functions for Logic app integration. Logic apps are used to orchestrate the data processing while Service Bus and Event Grid handle messaging and events.
                  The solution uses Application Insights, Azure Monitor, and Azure Key Vault.

                  Architecture diagram -
                  The company has several applications and services that support their business. The company plans to implement serverless computing where possible. The overall architecture is shown below.


                  User authentication -
                  The following steps detail the user authentication process:
1. The user selects Sign in on the website.
                  2. The browser redirects the user to the Azure Active Directory (Azure AD) sign in page.
                  3. The user signs in.
                  4. Azure AD redirects the user's session back to the web application. The URL includes an access token.
5. The web application calls an API and includes the access token in the Authorization header. The application ID is sent as the audience ('aud') claim in the access token.
                  6. The back-end API validates the access token.

                  Requirements -

                  Corporate website -
                  Communications and content must be secured by using SSL.
                  Communications must use HTTPS.
                  Data must be replicated to a secondary region and three availability zones.
                  Data storage costs must be minimized.

                  Azure Database for PostgreSQL -
                  The database connection string is stored in Azure Key Vault with the following attributes:
                  Azure Key Vault name: cpandlkeyvault
                  Secret name: PostgreSQLConn
                  Id: 80df3e46ffcd4f1cb187f79905e9a1e8
                  The connection information is updated frequently. The application must always use the latest information to connect to the database.
Azure Service Bus and Azure Event Grid -
                  Azure Event Grid must use Azure Service Bus for queue-based load leveling.
                  Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
                  Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.

                  Security -
                  All SSL certificates and credentials must be stored in Azure Key Vault.
                  File access must restrict access by IP, protocol, and Azure AD rights.
                  All user accounts and processes must receive only those privileges which are essential to perform their intended function.

                  Compliance -
                  Auditing of the file updates and transfers must be enabled to comply with General Data Protection Regulation (GDPR). The file updates must be read-only, stored in the order in which they occurred, include only create, update, delete, and copy operations, and be retained for compliance reasons.

                  Issues -

                  Corporate website -
                  While testing the site, the following error message displays:
                  CryptographicException: The system cannot find the file specified.

                  Function app -
                  You perform local testing for the RequestUserApproval function. The following error message displays:
                  'Timeout value of 00:10:00 exceeded by function: RequestUserApproval'
The same error message displays when you test the function in an Azure development environment and run the following Kusto query:

FunctionAppLogs
| where FunctionName == "RequestUserApproval"

                  Logic app -
                  You test the Logic app in a development environment. The following error message displays:
                  '400 Bad Request'
                  Troubleshooting of the error shows an HttpTrigger action to call the RequestUserApproval function.

                  Code -

                  Corporate website -
                  Security.cs:


                  Function app -
                  RequestUserApproval.cs:
                  Question: You need to ensure that all messages from Azure Event Grid are processed.
                  What should you use?
A. Azure Event Grid topic
B. Azure Service Bus topic
C. Azure Service Bus queue
D. Azure Storage queue
E. Azure Logic App custom connector

                  Correct Answer:
                  C
                  As a solution architect/developer, you should consider using Service Bus queues when:
✑ Your solution needs to receive messages without having to poll the queue. With Service Bus, you can achieve this by using a long-polling receive operation over the TCP-based protocols that Service Bus supports.
                  Reference:
                  https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted
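The queue-based load leveling pattern this answer relies on can be illustrated with a small standard-library sketch (no Azure SDK involved): a producer bursts messages into a queue, and a slower consumer drains it at its own pace, so no message is dropped under load.

```python
import queue
import threading

def produce(q: queue.Queue, count: int) -> None:
    # Simulates Event Grid pushing a burst of events into the buffer.
    for i in range(count):
        q.put(f"event-{i}")

def consume(q: queue.Queue, processed: list) -> None:
    # Simulates a downstream service draining the queue at its own pace.
    while True:
        try:
            msg = q.get(timeout=0.1)
        except queue.Empty:
            break  # queue fully drained
        processed.append(msg)
        q.task_done()

buffer_queue: queue.Queue = queue.Queue()
processed: list = []

producer = threading.Thread(target=produce, args=(buffer_queue, 100))
producer.start()
producer.join()  # the whole burst is buffered before the consumer starts

consumer = threading.Thread(target=consume, args=(buffer_queue, processed))
consumer.start()
consumer.join()

print(len(processed))  # every buffered event was eventually processed
```

The queue decouples the burst rate of the producer from the throughput of the consumer, which is exactly what routing Event Grid events into a Service Bus queue achieves at cloud scale.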
                  Question #319
                  Introductory Info Case study -
                  This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                  To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                  At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                  To start the case study -
                  To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                  Background -
                  You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

                  Requirements -

                  Policy service -
                  You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
                  The application must include the Event Grid Event ID field in all Application Insights telemetry.
                  Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

                  Policies -

                  Log policy -
                  All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

                  Authentication events -
                  Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

                  PolicyLib -
                  You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
                  Exclude non-user actions from Application Insights telemetry.
                  Provide methods that allow a web service to scale itself.
                  Ensure that scaling actions do not disrupt application usage.

                  Other -

                  Anomaly detection service -
                  You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

                  Health monitoring -
                  All web applications and services have health monitoring at the /health service endpoint.

                  Issues -

                  Policy loss -
                  When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

                  Performance issue -
                  When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

                  Notification latency -
                  Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

                  App code -

                  EventGridController.cs -
                  Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


                  LoginEvent.cs -
                  Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
                   

                  Question: 
                  DRAG DROP -
                  You need to add code at line EG15 in EventGridController.cs to ensure that the Log policy applies to all services.
                  How should you complete the code? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
                  NOTE: Each correct selection is worth one point.
                  Select and Place:


                    Correct Answer:

                    Scenario, Log policy: All Azure App Service Web Apps must write logs to Azure Blob storage.

                    Box 1: Status -

                    Box 2: Succeeded -

                    Box 3: operationName -
Microsoft.Web/sites/write is a resource provider operation. It creates a new Web App or updates an existing one.
                    Reference:
                    https://docs.microsoft.com/en-us/azure/role-based-access-control/resource-provider-operations
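The completed code at line EG15 is C# in the actual exhibit; as a language-neutral illustration only, the same filter (status Succeeded, operationName Microsoft.Web/sites/write) can be sketched in Python over Event Grid resource-event payloads. The sample events below are hypothetical.

```python
def is_web_app_write(event: dict) -> bool:
    # Keep only successful create/update operations on App Service Web Apps.
    data = event.get("data", {})
    return (data.get("status") == "Succeeded"
            and data.get("operationName") == "Microsoft.Web/sites/write")

events = [
    {"data": {"status": "Succeeded", "operationName": "Microsoft.Web/sites/write"}},
    {"data": {"status": "Failed", "operationName": "Microsoft.Web/sites/write"}},
    {"data": {"status": "Succeeded", "operationName": "Microsoft.Storage/storageAccounts/write"}},
]

matching = [e for e in events if is_web_app_write(e)]
print(len(matching))  # only the first event passes both checks
```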
                    Question #320
                    Introductory Info Case study -
                    This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                    To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                    At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                    To start the case study -
                    To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                    Background -
                    You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

                    Requirements -

                    Policy service -
                    You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
                    The application must include the Event Grid Event ID field in all Application Insights telemetry.
                    Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

                    Policies -

                    Log policy -
                    All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

                    Authentication events -
                    Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

                    PolicyLib -
                    You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
                    Exclude non-user actions from Application Insights telemetry.
                    Provide methods that allow a web service to scale itself.
                    Ensure that scaling actions do not disrupt application usage.

                    Other -

                    Anomaly detection service -
                    You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

                    Health monitoring -
                    All web applications and services have health monitoring at the /health service endpoint.

                    Issues -

                    Policy loss -
                    When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

                    Performance issue -
                    When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

                    Notification latency -
                    Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

                    App code -

                    EventGridController.cs -
                    Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


                    LoginEvent.cs -
                    Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
                     Question
                     HOTSPOT -
                    You need to insert code at line LE03 of LoginEvent.cs to ensure that all authentication events are processed correctly.
                    How should you complete the code? To answer, select the appropriate options in the answer area.
                    NOTE: Each correct selection is worth one point.
                    Hot Area:


                      Correct Answer:

                      Box 1: id -
                      id is a unique identifier for the event.

                      Box 2: eventType -
                      eventType is one of the registered event types for this event source.

                      Box 3: dataVersion -
                      dataVersion is the schema version of the data object. The publisher defines the schema version.
                      Scenario: Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.
                      The following example shows the properties that are used by all event publishers:
                      [
                        {
                          "topic": string,
                          "subject": string,
                          "id": string,
                          "eventType": string,
                          "eventTime": string,
                          "data": {
                            object-unique-to-each-publisher
                          },
                          "dataVersion": string,
                          "metadataVersion": string
                        }
                      ]
                      Reference:
                      https://docs.microsoft.com/en-us/azure/event-grid/event-schema
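For reference, a minimal Python sketch of an event in this schema. The topic, subject, and event-type values below are illustrative placeholders (the actual LoginEvent.cs payload is not shown above), not names from the case study code:

```python
import uuid
from datetime import datetime, timezone

# Fields every Event Grid event must carry; id, eventType, and
# dataVersion are the three values the answer boxes supply.
REQUIRED_FIELDS = {"id", "eventType", "subject", "eventTime", "data", "dataVersion"}

def make_login_event(user: str, event_type: str) -> dict:
    """Build an event in the Event Grid schema (hypothetical payload)."""
    return {
        "topic": "/subscriptions/.../providers/Microsoft.EventGrid/topics/auth",
        "subject": f"users/{user}",
        "id": str(uuid.uuid4()),          # unique identifier for the event
        "eventType": event_type,          # a registered event type for this source
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": {"user": user},           # object unique to each publisher
        "dataVersion": "1.0",             # schema version defined by the publisher
    }

def is_valid(event: dict) -> bool:
    """Check that all publisher-required fields are present."""
    return REQUIRED_FIELDS.issubset(event)
```

An event missing id, eventType, or dataVersion fails the check, which mirrors why those three values complete the code at line LE03.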
                      Question #321
                      Introductory Info Case study -
                      This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                      To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                      At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                      To start the case study -
                      To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                      Background -
                      You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

                      Requirements -

                      Policy service -
                      You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
                      The application must include the Event Grid Event ID field in all Application Insights telemetry.
                      Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

                      Policies -

                      Log policy -
                      All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

                      Authentication events -
                      Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

                      PolicyLib -
                      You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
                      Exclude non-user actions from Application Insights telemetry.
                      Provide methods that allow a web service to scale itself.
                      Ensure that scaling actions do not disrupt application usage.

                      Other -

                      Anomaly detection service -
                      You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

                      Health monitoring -
                      All web applications and services have health monitoring at the /health service endpoint.

                      Issues -

                      Policy loss -
                      When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

                      Performance issue -
                      When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

                      Notification latency -
                      Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

                      App code -

                      EventGridController.cs -
                      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


                      LoginEvent.cs -
                      Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.
                       
                      Question 
                      HOTSPOT -
                      You need to implement the Log policy.
                      How should you complete the EnsureLogging method in EventGridController.cs? To answer, select the appropriate options in the answer area.
                      NOTE: Each correct selection is worth one point.
                      Hot Area:


                        Correct Answer:

                        Box 1: logdrop -
                        All log files should be saved to a container named logdrop.

                        Box 2: 15 -
                        Logs must remain in the container for 15 days.
                        Box 3: UpdateApplicationSettings
                        All Azure App Service Web Apps must write logs to Azure Blob storage.
                        Reference:
                        https://blog.hompus.nl/2017/05/29/adding-application-logging-blob-to-a-azure-web-app-service-using-powershell/
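The 15-day retention requirement reduces to simple date arithmetic. The helper below only illustrates that rule; it is not the App Service logging API, which applies retention automatically once the blob logging settings are configured:

```python
from datetime import datetime, timedelta, timezone

# From the Log policy: logs must remain in the logdrop container for 15 days.
RETENTION_DAYS = 15

def is_expired(blob_last_modified: datetime, now: datetime) -> bool:
    """A log blob is past retention once it is older than RETENTION_DAYS."""
    return now - blob_last_modified > timedelta(days=RETENTION_DAYS)
```

For example, against a reference date of 2025-01-24, a blob written on 2025-01-20 is kept, while one written on 2025-01-01 is past retention.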
                        Question #322
                        Introductory Info Case study -
                        This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                        To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                        At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                        To start the case study -
                        To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                        Background -
                        You are a developer for Proseware, Inc. You are developing an application that applies a set of governance policies for Proseware's internal services, external services, and applications. The application will also provide a shared library for common functionality.

                        Requirements -

                        Policy service -
                        You develop and deploy a stateful ASP.NET Core 2.1 web application named Policy service to an Azure App Service Web App. The application reacts to events from Azure Event Grid and performs policy actions based on those events.
                        The application must include the Event Grid Event ID field in all Application Insights telemetry.
                        Policy service must use Application Insights to automatically scale with the number of policy actions that it is performing.

                        Policies -

                        Log policy -
                        All Azure App Service Web Apps must write logs to Azure Blob storage. All log files should be saved to a container named logdrop. Logs must remain in the container for 15 days.

                        Authentication events -
                        Authentication events are used to monitor users signing in and signing out. All authentication events must be processed by Policy service. Sign outs must be processed as quickly as possible.

                        PolicyLib -
                        You have a shared library named PolicyLib that contains functionality common to all ASP.NET Core web services and applications. The PolicyLib library must:
                        Exclude non-user actions from Application Insights telemetry.
                        Provide methods that allow a web service to scale itself.
                        Ensure that scaling actions do not disrupt application usage.

                        Other -

                        Anomaly detection service -
                        You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.

                        Health monitoring -
                        All web applications and services have health monitoring at the /health service endpoint.

                        Issues -

                        Policy loss -
                        When you deploy Policy service, policies may not be applied if they were in the process of being applied during the deployment.

                        Performance issue -
                        When under heavy load, the anomaly detection service undergoes slowdowns and rejects connections.

                        Notification latency -
                        Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.

                        App code -

                        EventGridController.cs -
                        Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.


                        LoginEvent.cs -
                        Relevant portions of the app files are shown below. Line numbers are included for reference only and include a two-character prefix that denotes the specific file to which they belong.

                        Question:
                        You need to resolve a notification latency issue.
                        Which two actions should you perform? Each correct answer presents part of the solution.
                        NOTE: Each correct selection is worth one point.
                        A. Set Always On to true.
                        B. Ensure that the Azure Function is using an App Service plan.
                        C. Set Always On to false.
                        D. Ensure that the Azure Function is set to use a consumption plan.

                        Correct Answer:
                        AB
                        Azure Functions can run on either a Consumption plan or a dedicated App Service plan. If you run in dedicated mode, you need to turn on the Always On setting for your function app to run properly: otherwise the Functions runtime goes idle after a few minutes of inactivity, and only an incoming HTTP trigger will "wake up" your functions. This is similar to how WebJobs must have Always On enabled.
                        Scenario: Notification latency: Users report that anomaly detection emails can sometimes arrive several minutes after an anomaly is detected.
                        Anomaly detection service: You have an anomaly detection service that analyzes log information for anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service.
                        If an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.
                        Reference:
                        https://github.com/Azure/Azure-Functions/wiki/Enable-Always-On-when-running-on-dedicated-App-Service-Plan
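The anomaly-to-email path described above is a plain HTTP WebHook call. Everything in this sketch is hypothetical (the URL and payload fields do not appear in the case study); the point is that with the Function on a dedicated App Service plan and Always On enabled, the host stays warm and the POST is not delayed by a cold start:

```python
import json
import urllib.request

# Hypothetical endpoint for the Azure Function that emails administrators.
FUNCTION_WEBHOOK_URL = "https://contoso-alerts.azurewebsites.net/api/notify"

def build_payload(metric: str, value: float) -> bytes:
    """Serialize the anomaly details for the WebHook POST body."""
    return json.dumps({"metric": metric, "value": value}).encode("utf-8")

def notify_admins(metric: str, value: float, url: str = FUNCTION_WEBHOOK_URL) -> int:
    """Call the Function's HTTP WebHook. On a dedicated plan with Always On,
    the function host stays warm, so no cold-start delay precedes the email."""
    req = urllib.request.Request(
        url,
        data=build_payload(metric, value),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call; needs a live endpoint
        return resp.status
```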
                        Question #323
                        Introductory Info Case study -
                        This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                        To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                        At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                        To start the case study -
                        To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                        Background -

                        Overview -
                        You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                        You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                        * Messages are sent to ContentUploadService.
                        * Content is processed by ContentAnalysisService.
                        * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                        The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                        The solution will use eight CPU cores.

                        Azure Active Directory -
                        Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                        Requirements -

                        ContentAnalysisService -
                        The company's data science group built ContentAnalysisService, which accepts user-generated content as a string and returns a probability value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                        You must create an Azure Function named CheckUserContent to perform the content checks.

                        Costs -
                        You must minimize costs for all Azure services.

                        Manual review -
                        To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. Reviewing content requires membership in the ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                        High availability -
                        All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                        Monitoring -
                        An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                        Security -
                        You have the following security requirements:
                        Any web service accessible over the Internet must be protected from cross site scripting attacks.
                        All websites and services must use SSL from a valid root certificate authority.
                        Azure Storage access keys must only be stored in memory and must be available only to the service.
                        All Internal services must only be accessible from internal Virtual Networks (VNets).
                        All parts of the system must support inbound and outbound traffic restrictions.
                        All service calls must be authenticated by using Azure AD.

                        User agreements -
                        When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
                        Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                        User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                        Validation testing -
                        When a new version of the ContentAnalysisService is available, the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                        Issues -
                        Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                        Code -

                        ContentUploadService -


                        ApplicationManifest -
                        Question
                        HOTSPOT -
                        You need to ensure that validation testing is triggered per the requirements.
                        How should you complete the code segment? To answer, select the appropriate values in the answer area.
                        NOTE: Each correct selection is worth one point.
                        Hot Area:


                          Correct Answer:

                          Box 1: RepositoryUpdated -
                          When a new version of the ContentAnalysisService is available, the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                          Box 2: service -

                          Box 3: imageCollection -
                          Reference:
                          https://docs.microsoft.com/en-us/azure/devops/notifications/oob-supported-event-types
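A hypothetical handler for such a notification might filter on the event type and the image repository before kicking off the seven-day reprocessing run. The payload shape below is illustrative only, not the exact service-hook schema:

```python
# Assumed repository name for the service image; illustrative only.
WATCHED_REPOSITORY = "contentanalysisservice"

def should_trigger_validation(event: dict) -> bool:
    """Start validation testing (reprocessing the previous seven days of
    content) only for repository-update events on the watched image."""
    return (
        event.get("eventType") == "RepositoryUpdated"
        and event.get("resource", {}).get("repository") == WATCHED_REPOSITORY
    )
```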
                          Question #324
                          Introductory Info Case study -
                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                          To start the case study -
                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                          Background -

                          Overview -
                          You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                          You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                          * Messages are sent to ContentUploadService.
                          * Content is processed by ContentAnalysisService.
                          * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                          The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                          The solution will use eight CPU cores.

                          Azure Active Directory -
                          Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                          Requirements -

                          ContentAnalysisService -
                          The company's data science group built ContentAnalysisService, which accepts user-generated content as a string and returns a probability value for inappropriate content. Any values over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                          You must create an Azure Function named CheckUserContent to perform the content checks.

                          Costs -
                          You must minimize costs for all Azure services.

                          Manual review -
                          To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. Reviewing content requires membership in the ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                          High availability -
                          All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                          Monitoring -
                          An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                          Security -
                          You have the following security requirements:
                          Any web service accessible over the Internet must be protected from cross site scripting attacks.
                          All websites and services must use SSL from a valid root certificate authority.
                          Azure Storage access keys must only be stored in memory and must be available only to the service.
                          All Internal services must only be accessible from internal Virtual Networks (VNets).
                          All parts of the system must support inbound and outbound traffic restrictions.
                          All service calls must be authenticated by using Azure AD.

                          User agreements -
                          When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
                          Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                          User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                          Validation testing -
                          When a new version of the ContentAnalysisService is available, the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                          Issues -
                          Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                          Code -

                          ContentUploadService -


                          ApplicationManifest -
                          Question
                          You need to deploy the CheckUserContent Azure Function. The solution must meet the security and cost requirements.
                          Which hosting model should you use?
                          A. Premium plan
                          B. App Service plan
                          C. Consumption plan

                          Correct Answer:
                          B
                          Scenario:
                          You must minimize costs for all Azure services.
                          All Internal services must only be accessible from internal Virtual Networks (VNets).
                          An App Service plan is best for long-running scenarios where Durable Functions can't be used. Consider an App Service plan in the following situations:
                          ✑ You have existing, underutilized VMs that are already running other App Service instances.
                          ✑ You want to provide a custom image on which to run your functions.
                          ✑ Predictive scaling and costs are required.
                          Note: When you create a function app in Azure, you must choose a hosting plan for your app. There are three basic hosting plans available for Azure Functions:
                          Consumption plan, Premium plan, and Dedicated (App Service) plan.
                          Incorrect Answers:
                          A: A Premium plan would be more costly than a dedicated App Service plan.
                          C: The Consumption plan does not provide the VNet connectivity that the internal-services requirement demands.
                          Reference:
                          https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale
                          Question #325
                          Introductory Info Case study -
                          This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                          To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                          At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                          To start the case study -
                          To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                          LabelMaker app -
                          Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
                          Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
                          External partners send data to the LabelMaker application to include artwork and text for custom label designs.

                          Requirements. Data -
                          You identify the following requirements for data management and manipulation:
                          Order data is stored as nonrelational JSON and must be queried using SQL.
                          Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

                          Requirements. Security -
                          You have the following security requirements:
                          Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

                          External partners must use their own credentials and authenticate with their organization's identity management solution.
                          External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
                          Storage of e-commerce application settings must be maintained in Azure Key Vault.
                          E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
                          Conditional access policies must be applied at the application level to protect company content.
                          The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

                          Requirements. LabelMaker app -
                          Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).
                          You must use Azure Container Registry to publish images that support the AKS deployment.

                          Architecture -


                          Issues -
                          Calls to the Printer API App fail periodically due to printer communication timeouts.
                          Printer communication timeouts occur after 10 seconds. The label printer must receive no more than 5 attempts within one minute.
                          The order workflow fails to run upon initial deployment to Azure.

                          Order.json -
                          Relevant portions of the app files are shown below. Line numbers are included for reference only.
                          This JSON file contains a representation of the data for an order that includes a single item.

                          Order.json -
                           Question DRAG DROP -
                          You need to deploy a new version of the LabelMaker application to ACR.
                          Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
                          Select and Place:


                            Correct Answer:

                            Step 1: Build a new application image by using a Dockerfile
                            Step 2: Create an alias of the image with the fully qualified path to the registry
                            Before you can push the image to a private registry, you have to ensure a proper image name. This is done with the docker tag command. For demonstration purposes, we'll use Docker's hello-world image, rename it, and push it to ACR.
                            # pulls hello-world from the public docker hub
                            $ docker pull hello-world
                            # tag the image in order to be able to push it to a private registry
                            $ docker tag hello-world <REGISTRY_NAME>/hello-world
                            # push the image
                            $ docker push <REGISTRY_NAME>/hello-world
                            Step 3: Log in to the registry and push image
                            In order to push images to the newly created ACR instance, you need to log in to ACR from the Docker CLI. Once logged in, you can push any existing Docker image to your ACR instance.
                            Scenario:
                            Coho Winery plans to move the application to Azure and continue to support label creation.

                            LabelMaker app -
                            Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).
                            You must use Azure Container Registry to publish images that support the AKS deployment.
                            Reference:
                            https://thorsten-hans.com/how-to-use-a-private-azure-container-registry-with-kubernetes-9b86e67b93b6
                            https://docs.microsoft.com/en-us/azure/container-registry/container-registry-tutorial-quick-task
                            Question #326
                            Introductory Info Case study -
                            This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                            To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                            At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                            To start the case study -
                            To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                            LabelMaker app -
                            Coho Winery produces, bottles, and distributes a variety of wines globally. You are a developer implementing highly scalable and resilient applications to support online order processing by using Azure solutions.
                            Coho Winery has a LabelMaker application that prints labels for wine bottles. The application sends data to several printers. The application consists of five modules that run independently on virtual machines (VMs). Coho Winery plans to move the application to Azure and continue to support label creation.
                            External partners send data to the LabelMaker application to include artwork and text for custom label designs.

                            Requirements. Data -
                            You identify the following requirements for data management and manipulation:
                            Order data is stored as nonrelational JSON and must be queried using SQL.
                            Changes to the Order data must reflect immediately across all partitions. All reads to the Order data must fetch the most recent writes.

                            Requirements. Security -
                            You have the following security requirements:
                            Users of Coho Winery applications must be able to provide access to documents, resources, and applications to external partners.

                            External partners must use their own credentials and authenticate with their organization's identity management solution.
                            External partner logins must be audited monthly for application use by a user account administrator to maintain company compliance.
                            Storage of e-commerce application settings must be maintained in Azure Key Vault.
                            E-commerce application sign-ins must be secured by using Azure App Service authentication and Azure Active Directory (AAD).
                            Conditional access policies must be applied at the application level to protect company content.
                            The LabelMaker application must be secured by using an AAD account that has full access to all namespaces of the Azure Kubernetes Service (AKS) cluster.

                            Requirements. LabelMaker app -
                            Azure Monitor Container Health must be used to monitor the performance of workloads that are deployed to Kubernetes environments and hosted on Azure Kubernetes Service (AKS).
                            You must use Azure Container Registry to publish images that support the AKS deployment.

                            Architecture -


                            Issues -
                            Calls to the Printer API App fail periodically due to printer communication timeouts.
                            Printer communication timeouts occur after 10 seconds. The label printer must receive no more than 5 attempts within one minute.
                            The order workflow fails to run upon initial deployment to Azure.

                            Order.json -
                            Relevant portions of the app files are shown below. Line numbers are included for reference only.
                            This JSON file contains a representation of the data for an order that includes a single item.

                            Order.json -
                             

                            Question
                            You need to access data from the user claim object in the e-commerce web app.
                            What should you do first?
                            A. Write custom code to make a Microsoft Graph API call from the e-commerce web app.
                            B. Assign the Contributor RBAC role to the e-commerce web app by using the Resource Manager create role assignment API.
                            C. Update the e-commerce web app to read the HTTP request header values.
                            D. Using the Azure CLI, enable Cross-origin resource sharing (CORS) from the e-commerce checkout API to the e-commerce web app.

                            Correct Answer:
                            C
                            Methods to Get User Identity and Claims in a .NET Azure Functions App include:
                            ✑ ClaimsPrincipal from the Request Context
                            The ClaimsPrincipal object is also available as part of the request context and can be extracted from the HttpRequest.HttpContext.
                            ✑ User Claims from the Request Headers.
                            App Service passes user claims to the app by using special request headers.
                            Reference:
                            https://levelup.gitconnected.com/four-alternative-methods-to-get-user-identity-and-claims-in-a-net-azure-functions-app-df98c40424bb
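                            As a rough sketch of the request-header mechanism described above (the claim names and sample values here are illustrative, not from the case study), a web app can decode the base64-encoded X-MS-CLIENT-PRINCIPAL header that App Service authentication injects into each authenticated request:

```python
import base64
import json

def read_client_principal(headers):
    """Decode the X-MS-CLIENT-PRINCIPAL header injected by App Service
    authentication and return the claims as a {type: value} dictionary."""
    raw = headers.get("X-MS-CLIENT-PRINCIPAL")
    if raw is None:
        return {}  # anonymous request: no principal header was injected
    decoded = json.loads(base64.b64decode(raw))
    return {c["typ"]: c["val"] for c in decoded.get("claims", [])}

# Simulate the header as App Service would send it (sample claim values).
sample = {"claims": [{"typ": "name", "val": "user@contoso.com"}]}
headers = {
    "X-MS-CLIENT-PRINCIPAL":
        base64.b64encode(json.dumps(sample).encode()).decode()
}
print(read_client_principal(headers)["name"])  # -> user@contoso.com
```

Reading the headers requires no extra API calls or role assignments, which is why answer C is the first step.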
                            Question #327
                            Introductory Info Case study -
                            This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                            To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                            At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                            To start the case study -
                            To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                            Background -
                            VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                            Current environment -

                            Corporate website -
                            The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                            Retail Store Locations -
                            The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                            Requirements -
                            The application components must meet the following requirements:

                            Corporate website -
                            Secure the website by using SSL.
                            Minimize costs for data storage and hosting.
                            Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                            Distribute the website content globally for local use.
                            Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                            The website must have 99.95 percent uptime.

                            Retail store locations -
                            Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                            Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                            Delivery services -
                            Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                            Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                            Inventory services -
                            The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

                            Security -
                            All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                            Authentication and authorization must use Azure AD and services must use managed identities where possible.

                            Issues -

                            Retail Store Locations -
                            You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                            Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

                            Question HOTSPOT -
                            You need to implement the retail store location Azure Function.
                            How should you configure the solution? To answer, select the appropriate options in the answer area.
                            NOTE: Each correct selection is worth one point.
                            Hot Area:


                              Correct Answer:

                              Scenario: Retail store locations: Azure Functions must process data immediately when data is uploaded to Blob storage.

                              Box 1: HTTP -
                              Binding configuration example: https://<storage_account_name>.blob.core.windows.net

                              Box 2: Input -
                              Read blob storage data in a function: Input binding

                              Box 3: Blob storage -
                              The Blob storage trigger starts a function when a new or updated blob is detected.
                              Azure Functions integrates with Azure Storage via triggers and bindings. Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.
                              Reference:
                              https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-trigger
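                              A blob trigger binding of the kind described above can be declared in a function.json fragment like the following (the container path and connection setting name are placeholders, not values from the case study):

```json
{
  "bindings": [
    {
      "name": "inputBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "storedata/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```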
                              Question #328
                              Introductory Info Case study -
                              This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                              To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                              At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                              To start the case study -
                              To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                              Background -
                              VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                              Current environment -

                              Corporate website -
                              The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                              Retail Store Locations -
                              The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                              Requirements -
                              The application components must meet the following requirements:

                              Corporate website -
                              Secure the website by using SSL.
                              Minimize costs for data storage and hosting.
                              Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                              Distribute the website content globally for local use.
                              Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                              The website must have 99.95 percent uptime.

                              Retail store locations -
                              Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                              Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                              Delivery services -
                              Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                              Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                              Inventory services -
                              The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

                              Security -
                              All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                              Authentication and authorization must use Azure AD and services must use managed identities where possible.

                              Issues -

                              Retail Store Locations -
                              You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                              Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

                              Question HOTSPOT -
                              You need to implement the corporate website.
                              How should you configure the solution? To answer, select the appropriate options in the answer area.
                              NOTE: Each correct selection is worth one point.
                              Hot Area:


                                Correct Answer:

                                Box 1: Standard -
                                Below is a high-level comparison of the features as per the pricing tier for the App Service Plan.


                                Note: Corporate website -
                                The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.
                                Corporate website requirements:
                                ✑ Secure the website by using SSL.
                                ✑ Minimize costs for data storage and hosting.
                                ✑ Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                                ✑ Distribute the website content globally for local use.
                                ✑ Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                                ✑ The website must have 99.95 percent uptime.

                                Box 2: App Service Web App -
                                A Web App is a web application hosted in App Service, the managed Azure service that enables you to deploy a web application and make it available to customers on the Internet in a very short amount of time.
                                Incorrect:
                                A Static Web Application is any web application that can be delivered directly to an end user's browser without any server-side alteration of the HTML, CSS, or JavaScript content.
                                Reference:
                                https://azure-training.com/2018/12/27/understanding-app-services-app-service-plan-and-ase/
                                https://docs.microsoft.com/en-us/azure/app-service/overview
                                Question #329
                                Introductory Info Case study -
                                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                To start the case study -
                                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                Background -
                                VanArsdel, Ltd. is a global office supply company. The company is based in Canada and has retail store locations across the world. The company is developing several cloud-based solutions to support their stores, distributors, suppliers, and delivery services.

                                Current environment -

                                Corporate website -
                                The company provides a public website located at http://www.vanarsdelltd.com. The website consists of a React JavaScript user interface, HTML, CSS, image assets, and several APIs hosted in Azure Functions.

                                Retail Store Locations -
                                The company supports thousands of store locations globally. Store locations send data every hour to an Azure Blob storage account to support inventory, purchasing and delivery services. Each record includes a location identifier and sales transaction information.

                                Requirements -
                                The application components must meet the following requirements:

                                Corporate website -
                                Secure the website by using SSL.
                                Minimize costs for data storage and hosting.
                                Implement native GitHub workflows for continuous integration and continuous deployment (CI/CD).
                                Distribute the website content globally for local use.
                                Implement monitoring by using Application Insights and availability web tests including SSL certificate validity and custom header value verification.
                                The website must have 99.95 percent uptime.

                                Retail store locations -
                                Azure Functions must process data immediately when data is uploaded to Blob storage. Azure Functions must update Azure Cosmos DB by using native SQL language queries.
                                Audit store sale transaction information nightly to validate data, process sales financials, and reconcile inventory.

                                Delivery services -
                                Store service telemetry data in Azure Cosmos DB by using an Azure Function. Data must include an item id, the delivery vehicle license plate, vehicle package capacity, and current vehicle location coordinates.
                                Store delivery driver profile information in Azure Active Directory (Azure AD) by using an Azure Function called from the corporate website.

                                Inventory services -
                                The company has contracted a third-party to develop an API for inventory processing that requires access to a specific blob within the retail store storage account for three months to include read-only access to the data.

                                Security -
                                All Azure Functions must centralize management and distribution of configuration data for different environments and geographies, encrypted by using a company-provided RSA-HSM key.
                                Authentication and authorization must use Azure AD and services must use managed identities where possible.

                                Issues -

                                Retail Store Locations -
                                You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                                Azure Cosmos DB queries from the Azure Function exhibit high Request Unit (RU) usage and contain multiple, complex queries that exhibit high point read latency for large items as the function app is scaling.

                                Question
                                You need to implement a solution to resolve the retail store location data issue.
                                Which three Azure Blob features should you enable? Each correct answer presents part of the solution.
                                NOTE: Each correct selection is worth one point.
A. Soft delete
B. Change feed
C. Snapshots
D. Versioning
E. Object replication
F. Immutability

                                Correct Answer:
                                ABD
                                Scenario: You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
                                Before you enable and configure point-in-time restore, enable its prerequisites for the storage account: soft delete, change feed, and blob versioning.
                                Reference:
                                https://docs.microsoft.com/en-us/azure/storage/blobs/point-in-time-restore-manage
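As a sketch, the three prerequisites and the restore policy itself might be enabled with the Azure CLI as follows; the resource group and storage account names are placeholders, not values from the case study:

```shell
# Placeholder names (assumptions, not from the case study).
RG=retail-rg
ACCOUNT=retailstorelocations

# Enable the three prerequisites: soft delete, change feed, and versioning.
az storage account blob-service-properties update \
  --resource-group "$RG" --account-name "$ACCOUNT" \
  --enable-delete-retention true --delete-retention-days 14 \
  --enable-change-feed true \
  --enable-versioning true

# With the prerequisites in place, turn on point-in-time restore itself.
# The restore window must be shorter than the delete-retention period.
az storage account blob-service-properties update \
  --resource-group "$RG" --account-name "$ACCOUNT" \
  --enable-restore-policy true --restore-days 7
```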
                                Question #330
                                Introductory Info Case study -
                                This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
                                To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
                                At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

                                To start the case study -
                                To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. When you are ready to answer a question, click the Question button to return to the question.

                                Background -

                                Overview -
                                You are a developer for Contoso, Ltd. The company has a social networking website that is developed as a Single Page Application (SPA). The main web application for the social networking website loads user uploaded content from blob storage.
                                You are developing a solution to monitor uploaded data for inappropriate content. The following process occurs when users upload content by using the SPA:
                                * Messages are sent to ContentUploadService.
                                * Content is processed by ContentAnalysisService.
                                * After processing is complete, the content is posted to the social network or a rejection message is posted in its place.
                                The ContentAnalysisService is deployed with Azure Container Instances from a private Azure Container Registry named contosoimages.
                                The solution will use eight CPU cores.

                                Azure Active Directory -
                                Contoso, Ltd. uses Azure Active Directory (Azure AD) for both internal and guest accounts.

                                Requirements -

                                ContentAnalysisService -
The company's data science group built ContentAnalysisService, which accepts user-generated content as a string and returns a probable value for inappropriate content. Any value over a specific threshold must be reviewed by an employee of Contoso, Ltd.
                                You must create an Azure Function named CheckUserContent to perform the content checks.

                                Costs -
                                You must minimize costs for all Azure services.

                                Manual review -
To review content, the user must authenticate to the website portion of the ContentAnalysisService using their Azure AD credentials. The website is built using React, and all pages and API endpoints require authentication. In order to review content, a user must be part of the ContentReviewer role. All completed reviews must include the reviewer's email address for auditing purposes.

                                High availability -
                                All services must run in multiple regions. The failure of any service in a region must not impact overall application availability.

                                Monitoring -
                                An alert must be raised if the ContentUploadService uses more than 80 percent of available CPU cores.

                                Security -
                                You have the following security requirements:
                                Any web service accessible over the Internet must be protected from cross site scripting attacks.
                                All websites and services must use SSL from a valid root certificate authority.
                                Azure Storage access keys must only be stored in memory and must be available only to the service.
                                All Internal services must only be accessible from internal Virtual Networks (VNets).
                                All parts of the system must support inbound and outbound traffic restrictions.
                                All service calls must be authenticated by using Azure AD.

                                User agreements -
When a user submits content, they must agree to a user agreement. The agreement allows employees of Contoso, Ltd. to review content, store cookies on user devices, and track users' IP addresses.
                                Information regarding agreements is used by multiple divisions within Contoso, Ltd.
                                User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.

                                Validation testing -
                                When a new version of the ContentAnalysisService is available the previous seven days of content must be processed with the new version to verify that the new version does not significantly deviate from the old version.

                                Issues -
                                Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages.

                                Code -

                                ContentUploadService -


                                ApplicationManifest -
Question
You need to store the user agreements.
Where should you store the agreement after it is completed?
A. Azure Storage queue
B. Azure Event Hub
C. Azure Service Bus topic
D. Azure Event Grid topic

                                Correct Answer:
                                B
Azure Event Hub is used for telemetry and distributed data streaming.
This service provides a single solution that enables rapid data retrieval for real-time processing as well as repeated replay of stored raw data, and it can capture the streaming data into a file for processing and analysis.
Scenario: User responses must not be lost and must be available to all parties regardless of individual service uptime. The volume of agreements is expected to be in the millions per hour.
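As an illustration only (not part of the exam answer), a completed agreement could be serialized as JSON and published with the `azure-eventhub` Python SDK; the connection string and hub name below are placeholders, and the record fields are assumptions, not from the case study:

```python
import datetime
import json

def build_agreement_event(user_id: str, agreement_version: str) -> bytes:
    """Serialize a completed user-agreement record as a JSON event body."""
    record = {
        "userId": user_id,
        "agreementVersion": agreement_version,
        "acceptedUtc": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return json.dumps(record).encode("utf-8")

def send_agreement(event_body: bytes, conn_str: str, hub_name: str) -> None:
    # Requires the azure-eventhub package; conn_str and hub_name are
    # placeholders for a real Event Hubs namespace.
    from azure.eventhub import EventHubProducerClient, EventData
    with EventHubProducerClient.from_connection_string(
        conn_str, eventhub_name=hub_name
    ) as producer:
        batch = producer.create_batch()
        batch.add(EventData(event_body))
        producer.send_batch(batch)
```

Downstream divisions can then read the agreement stream independently through their own consumer groups, which matches the requirement that responses be available to all parties regardless of individual service uptime.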
