Design and deploy apps with Microsoft Power Platform. Following best practices during this process decreases the chances of future technical issues related to poor application lifecycle management and storage limitations. Before any customization and configuration of the solution takes place, the first step is to configure your online tenant and environment.

Learning objectives

In this module, we’ll cover the following topics:

  • Explore Microsoft Power Platform deployment foundation concepts
  • Strategy and vision
  • Plan your deployment
  • Secure deployments
  • Determine storage requirements
  • Manage authentication
  • Connect and authenticate to data sources

Explore Microsoft Power Platform deployment foundation concepts

Microsoft Power Platform is a high-productivity application development platform from Microsoft. Microsoft itself uses the platform to build its first-party Customer Engagement applications, such as Dynamics 365 Sales, Service, Field Service, and Marketing. Organizations across the world are using Power Platform to build entire solutions that help them meet the ever-changing challenges of today’s world. Additionally, more organizations are empowering individual users and teams to build personal or team productivity applications that add value to their jobs or simplify the way they work.

Microsoft Power Platform includes a core set of tools that organizations can use to build these solutions: Power Apps, Power BI, Power Automate, and Power Virtual Agents. By using the common infrastructure of Microsoft Dataverse and more than 700 prebuilt connectors, solutions built on Power Platform can connect to multiple data sources and, when needed, can also include Azure cloud services to scale from individual productivity to enterprise, mission-critical, line-of-business applications.

Diagram of Microsoft Power Platform products.

The evolution of an organization adopting Power Apps, Power Automate, and Dataverse starts with the administrator. As an administrator of Microsoft Power Platform, you begin your journey by asking how you can protect your organization’s data:

  • What data is accessible through these services?
  • Are there best practices that we should be following?
  • What is the Power Apps security model and how should I control access to data?

Once you determine how to proceed with data access, you’ll then want to know how you can monitor and manage what users are doing with these services. When you’ve figured out control and visibility, the next part of your journey takes you to deployment. Individual users and teams can deploy apps on their own, but how do you centrally deploy solutions for your entire organization? And how do you orchestrate updates and identify and fix issues?

Throughout the remainder of this module, we’ll examine several elements related to deploying and administering the Power Platform.

Strategy and vision

Organizations that implement Power Platform solutions often have power users and professional developers working hand in hand on projects. The power users build applications that improve a job function or enhance productivity, while developers build the more technical components that make these solutions work.

This process is different than what has been done in the past. When you have this many people collaborating and building solutions, it can lead to different challenges related to security, compliance, performance, and more.

Before you start building solutions with Microsoft Power Platform, it’s important to consider what that process looks like. A little planning and consideration can go a long way toward the success of a project, including putting in place the tools and practices that help ensure the solutions you build are delivered smoothly.

Establish a Center of Excellence

One of the first things that you should consider is establishing a Microsoft Power Platform Center of Excellence. Establishing a Center of Excellence (CoE) means investing in and nurturing the organic growth the Power Platform can provide while maintaining governance and control. For many organizations, the CoE is the first step in fostering greater creativity and innovation across the organization. It empowers different business units to digitize and automate their business processes while maintaining the necessary level of central oversight and governance.

One key principle is to clarify why you’re setting up a CoE, what you aim to accomplish, and the key business outcomes you hope to achieve. Then get started and learn and evolve along the way.

A CoE is designed to drive innovation and improvement. It can break down geographic and organizational silos to bring together like-minded people with similar business goals to share knowledge and success, while at the same time providing standards, consistency, and governance to the organization. In summary, a CoE can be a powerful way for an organization to align around business goals rather than individual department metrics.

Typically, the following people or departments are key drivers or stakeholders when establishing a Center of Excellence:

  • App and flow makers
  • Application lifecycle management and DevOps users
  • Central IT
  • Support and training engineers
  • Business change management

At first, establishing a Center of Excellence might start off simply with a single individual using the provided tools and best practices to get a view of their Microsoft Power Platform adoption in their organization. As your organization evolves, it might grow into a more mature investment with multiple functions and roles to manage multiple aspects of governance, training, support, and automated app deployment across the organization.

We recommend the following strategy for getting started with your journey of establishing a CoE:

  • Secure the platform by establishing data loss prevention policies and managing licenses and access to data sources.
  • Evangelize by providing a community space on Teams, Yammer, or SharePoint, with a collection of links for people to start their learning.
  • Monitor your usage, see who is creating apps, what apps are being created, and how they’re used.
  • Evolve your CoE strategy with those learnings.

You can learn more about creating the Microsoft Power Platform Center of Excellence here: Get started with the Microsoft Power Platform Center of Excellence.

Roles and responsibilities

Planning and maintaining Power Platform solutions, as well as establishing a Center of Excellence, typically requires input and feedback from many different stakeholders to be effective. To assist with this, we recommend that you include the following roles and responsibilities as part of your strategy. Doing so helps provide guidance for application creation, helps keep data secure, and helps ensure that app makers follow best practices as they build solutions. The list below is a suggested starting point. In your organization, this might look different, or you might start with only a few roles and grow to more as your adoption journey continues.

Low-code strategy team

The low-code strategy team represents the key decision-makers and ensures the Microsoft Power Platform strategy is aligned with organizational goals. This team is also responsible for adoption and change management, and for looking at ways of working across the organization. As a driver for digital innovation, they ensure a concrete action plan for increasing digital literacy is in place, often through a combination of bottom-up and top-down initiatives.

  • Bottom-up: Educate your makers, make it less scary, and drive self-enablement.
  • Top-down: Work on executive literacy and creating an innovation-friendly culture.

Microsoft Power Platform admin team

The Microsoft Power Platform admin team is responsible for establishing an environment strategy, setting up data loss prevention (DLP) policies, and managing users, capacity, and licensing. They also make data available to makers through connectors, integration, or migration.

Microsoft Power Platform nurture team

The Microsoft Power Platform nurture team, which can consist of your champions, organizes app-in-a-day events and hackathons, provides mentorship to makers, ensures new makers get off to a good start, and evangelizes the platform.

Automation and reusable components

Another team or function that you should consider is one that looks at automating tasks, such as archiving unused resources, identifying highly used resources to provide more formal support, and approving environment and license requests from end users. This team would also set up application lifecycle management using the Microsoft Power Platform Build Tools for Azure DevOps, support architecture reviews with makers, and share common templates and reusable components. Having these functions in place ensures that your organization realizes benefits more quickly, by keeping processes consistent and copying best practices across the organization.

Delivery models

Another early consideration is how you’ll deliver solutions to the organization. Depending on the size of your organization, you might want to formalize your Microsoft Power Platform adoption approach by implementing a structured organization model. Consider the following ways to structure your team, and decide what best fits your situation and organization.

Microsoft Power Platform has four delivery models, but each of these is just a mental model; every organization has a variation of multiple models along this continuum. For example, even if you opt for a centralized model, where all requirements come into a central delivery team, you’ll still have citizen developers discovering the platform and building apps for their teams. You’ll have elements of matrix or BizDevOps regardless.

These models can help you consider what your current software delivery model is and how Microsoft Power Platform might overlay into it, or how your current model might evolve to accommodate the rapid development capability enabled by Microsoft Power Platform.

Diagram of the four available delivery models.


Centralized

In this model, you create central teams of product owners who own the low-code delivery of departmental solutions from around the organization’s business units. Professional developers who own code-first solutions work in tandem with the business to deliver in a shared model. Enterprise architects own the middle tier and services and ensure data is available to makers. Central IT owns the licensing and the systems in which everyone operates.

With this model, you create a central team that can pick up the development of apps based on organizational priorities. In addition to foundational expertise in Power Apps, your team can include members who specialize in specific parts of Microsoft Power Platform, such as Power Automate, Power BI, and the Power Apps component framework, or in third-party integration and artificial intelligence. This model is an effective way to drive change across your organization and is well suited to delivering any type of application.


Decentralized

In this model, you create multiple teams across the organization that are close to the day-to-day running of various business areas. They have the resources to deliver apps consistently within organizational guidelines. Each team can run autonomously, and teams can split and grow in a cellular fashion. However, with this model you still need centralized governance to apply high-level digital guardrails that ensure corporate compliance. These can include data loss prevention (DLP) governance, connector management, and license management, so that users and developers can safely build and release solutions with minimal intervention from IT while keeping company data safe and compliant. This is a great self-service option.


Matrix

With this model, you mix the best of decentralized and centralized. You have a centralized team of trained and certified Microsoft Power Platform specialists: leaders of change, design, delivery, and architecture, in addition to specialized trainers who train local teams across the organization. Local teams made up of citizen developers are connected with experts from the centralized structure, to make sure nothing gets lost in translation between the people doing their day-to-day jobs and the apps being built. With this model, you can scale into the thousands of people working on app creation.

This team should also consider establishing a Center of Excellence to manage their data estate and deploy solutions with guidelines for everyone. This model works well for self-service and for small teams that need to deliver solutions quickly with little IT engagement.


BizDevOps

Rapid app development can only happen at the speed that operations such as IT can support the apps created. BizDevOps is a holistic relationship between app makers and operations that works in a virtuous loop. For this to work, all teams need a clear vision of the digital culture the organization is moving toward. To get the maximum value from the apps created, they need reliable support, governance, and maintainability. As technology evolves, the apps will need updates to keep them current. Being aware of that change, and having a plan for managing it, is key to successful apps.

Now that we’ve examined some of the key elements to consider when developing a Power Platform strategy and vision, let’s examine some things to consider when planning a deployment.

Plan your deployment

Your deployment of Power Platform with Microsoft Dataverse will go more smoothly with some preliminary planning. Microsoft Power Platform adoption best practices provides guidance designed to help you create and implement the business and technology strategies necessary for your organization to succeed with Microsoft Power Platform.

Administrative roles

There are several administrative roles available to assign to users when you manage your subscription in the Microsoft Online Services environment. Administrative roles define administrative responsibilities related to subscription management activities, for example, billing administration, password administration, and user management administration.

From a Power Platform standpoint, there are two Microsoft Power Platform–related service admin roles you can assign to provide a high level of admin management:

  • Dynamics 365 admin
  • Microsoft Power Platform admin

To help you administer environments and settings, you can assign users the Microsoft Power Platform admin role to manage Microsoft Power Platform at the tenant level. These admin roles can be assigned from the Microsoft 365 admin center.

Adding an environment to your subscription

Environments are containers that administrators can use to manage apps, flows, connections, and other assets, along with permissions to allow organization members to use the resources. You can add different environments to a tenant.

Each environment is created under an Azure Active Directory (Azure AD) tenant, and its resources can only be accessed by users within that tenant. An environment is also bound to a geographic location, like the United States. When you create an app in an environment, that app is routed only to data centers in that geographic location. Any items that you create in that environment (including chatbots, connections, gateways, flows using Microsoft Power Automate, and more) are also bound to their environment’s location.

Every environment can have zero or one Microsoft Dataverse database, which provides storage for your apps and chatbots. Whether you can create a database for your environment depends on the license you purchase for Power Apps and your permissions within that environment.

When you create an app in an environment, that app is only permitted to connect to the data sources deployed in that same environment, including connections, gateways, flows, and Dataverse databases. For example, consider a scenario where you’ve created two environments named Test and Dev, each with its own Dataverse database. An app created in the Test environment is only permitted to connect to the Test database; it can’t connect to the Dev database.
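
This isolation rule can be sketched in code. The following Python model is purely illustrative (the class and function names are hypothetical, not a real Power Platform API); it captures only the rule that an app can use a resource when both live in the same environment.

```python
from dataclasses import dataclass, field

@dataclass
class Environment:
    """Hypothetical model of a Power Platform environment (illustrative only)."""
    name: str
    resources: set = field(default_factory=set)  # connections, gateways, flows, databases

def can_connect(app_env: Environment, resource: str, resource_env: Environment) -> bool:
    # An app may only use data sources deployed in its own environment.
    return app_env.name == resource_env.name and resource in resource_env.resources

test_env = Environment("Test", {"Test-Dataverse-DB"})
dev_env = Environment("Dev", {"Dev-Dataverse-DB"})

print(can_connect(test_env, "Test-Dataverse-DB", test_env))  # True
print(can_connect(test_env, "Dev-Dataverse-DB", dev_env))    # False
```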

Environments have two built-in roles that provide access to permissions within an environment:

  • Environment Admin – Can perform all administrative actions on an environment, including the following:
    • Add or remove a user or group from either the Environment Admin or Environment Maker role.
    • Provision a Dataverse database for the environment.
    • View and manage all resources created within the environment.
    • Set data loss prevention policies.
  • Environment Maker – Can create resources within an environment including apps, connections, custom connectors, gateways, and flows using Power Automate.
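
As a sketch of how the two roles differ, the following Python snippet maps each built-in role to the actions listed above. The role and action names are illustrative labels for this sketch, not real API identifiers.

```python
# Illustrative mapping only; note the Environment Admin list in the text is
# not exhaustive ("all administrative actions, including the following").
ROLE_PERMISSIONS = {
    "Environment Admin": {
        "manage_role_membership",  # add/remove users in Admin or Maker role
        "provision_database",      # provision a Dataverse database
        "manage_all_resources",    # view and manage all environment resources
        "set_dlp_policies",        # set data loss prevention policies
    },
    "Environment Maker": {
        "create_resources",        # apps, connections, custom connectors, gateways, flows
    },
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role includes the given action in this model."""
    return action in ROLE_PERMISSIONS.get(role, set())
```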

Non-Production/Sandbox environments

A Sandbox environment is any non-production environment. Because sandbox environments are isolated from production, they’re the place to safely develop and test application changes with low risk.

Some of the major advantages a non-production/Sandbox environment offers are:

  • Evaluate new functions before they’re introduced to your Production environment: a Sandbox environment can be updated before Production so you can test all your functionality before applying the update to Production.
  • Access control: a sandbox environment can be placed in Administrative Mode to allow only users with the System Administrator or System Customizer security role to access the environment.
  • Copy and Restore: you can copy the customizations and data from a Production environment into a Sandbox environment.
  • Training: after a full copy from production into a sandbox environment, you get an amazing training environment. Users will be able to experience the full capabilities of their Production solution without being afraid of adding or deleting test data during training that could disrupt the data quality maintained in Production.
  • Test new apps: a sandbox environment is a great place to install solutions and apps to be tested and considered for Production. After testing an app, the users can be trained in Sandbox ahead of the app deployment day into Production.

You can learn more about working with sandbox environments here: Sandbox environment.

Production environments

Production environments are intended to be used for permanent work in an organization. A production environment can be created and owned by an administrator or anyone with a Power Apps license, provided there’s at least 1 GB of available database capacity. These environments are also created for each existing Dataverse database when it’s upgraded to version 9.0 or later. Production environments are what you should use for any environments on which you depend.

Switching an environment

It’s important to spend time planning and designing your implementation, but you’ll always have the opportunity to switch the environment type from Production to Sandbox, and from Sandbox to Production, if needed.

For example, suppose you took a backup of a Production environment before installing a solution, and after the installation you notice the solution is causing issues, or you’re even unable to remove it. You can restore the environment from your backup; however, backups can’t be restored into a Production environment. The environment must be switched to Sandbox first; then you can restore from the backup and switch the environment back to Production. This limitation exists to avoid accidental overwrites of your Production environment.
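
The restore workflow described above can be sketched as follows. This is an illustrative model of the admin steps only, with a plain dict standing in for an environment; it is not a call into any real admin API.

```python
def restore_environment(env: dict, backup: str) -> dict:
    """Model of the switch-restore-switch steps; env is a plain dict with
    'type' and 'data' keys standing in for a real environment."""
    switched = False
    if env["type"] == "Production":
        # Backups can't be restored into a Production environment directly.
        env["type"] = "Sandbox"
        switched = True
    env["data"] = backup            # restore from the backup
    if switched:
        env["type"] = "Production"  # switch back once the restore completes
    return env
```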

Dataverse for Teams

Microsoft Dataverse for Teams delivers a built-in, low-code data platform for Microsoft Teams. It provides relational data storage, rich data types, enterprise-grade governance, and one-click solution deployment. A Dataverse for Teams environment is automatically created for the selected team when you create an app or bot in Microsoft Teams for the first time or install a Power Apps app from the app catalog for the first time. The Dataverse for Teams environment is used to store, manage, and share team-specific data, apps, and flows.

Each team can have one environment, and all data, apps, bots, and flows created with the Power Apps app inside a team are available from that team’s Dataverse for Teams database.

Security in Dataverse for Teams aligns to how security is handled in Teams, with a focus on Owners, Members, and Guests.

You can learn more about Dataverse for Teams environments here: Dataverse for Teams.

Environment details

You can see specific details related to your environments by selecting an individual environment in the Power Platform admin center. Select See all to see more environment details.

Environment details as viewed from the admin center.

Select Edit to review and edit environment details.

More environment details from the admin center.

Environment strategies

Developing an environment strategy means configuring environments and other layers of data security in a way that supports productive development in your organization, while securing and organizing resources. A strategy to manage environment provisioning and access, and to control the resources within them, is important to:

  • Secure data and access.
  • Understand how to use the default environment correctly.
  • Manage the correct number of environments to avoid sprawl and conserve capacity.
  • Facilitate application lifecycle management (ALM).
  • Organize resources in logical partitions.
  • Support operations (and helpdesk) in identifying apps that are in production by having them in dedicated environments.
  • Ensure data is being stored and transmitted in acceptable geographic regions (for performance and compliance reasons).
  • Ensure isolation of applications being developed.

You can learn more about establishing an environment strategy here: Establishing an environment strategy.

Secure deployments

One key element of administering a Power Platform deployment is security. Security is crucial to ensuring that everyone has access to only the data they need, and also to ensuring that business data isn’t accessed by people or applications that shouldn’t access it.

Establishing a DLP strategy

One of the first things you should consider when deploying a Power Platform solution is establishing a data loss prevention (DLP) strategy. DLP policies act as guardrails designed to help prevent users from unintentionally exposing organizational data and to protect information security in the tenant. DLP policies enforce rules for which connectors are enabled in each environment, and which connectors can be used together. Connectors are classified as business data only, no business data allowed, or blocked. A connector in the business data only group can only be used with other connectors from that group in the same app or flow. For example, someone might want to create an application that uses the Microsoft Dataverse connector and a third-party connector. If you classify the Microsoft Dataverse connector as a business data connector, you’ll only be able to use the third-party connector alongside it if it’s also classified as a business data connector.

Establishing your DLP policies goes hand in hand with your environment strategy.

Connector classification

Business and non-business classifications draw boundaries around what connectors can be used together in a given app or flow. Connectors can be classified across the following groups using DLP policies:

  • Business: A given Power App or Power Automate resource can use one or more connectors from a business group. If a Power App or Power Automate resource uses a business connector, it can’t use any non-business connector.
  • Non-business: A given Power App or Power Automate resource can use one or more connectors from a non-business group. If a Power App or Power Automate resource uses a non-business connector, it can’t use any business connector.
  • Blocked: No Power App or Power Automate resource can use a connector from a blocked group. All Microsoft-owned premium connectors and third-party connectors (standard and premium) can be blocked. All Microsoft-owned standard connectors and Common Data Service connectors can’t be blocked.
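
These grouping rules boil down to a simple compliance check: an app or flow is in violation if it uses any blocked connector, or if it mixes connectors from the business and non-business groups. Here is a minimal Python sketch of that check; the connector names and the default-to-non-business assumption for unclassified connectors are illustrative.

```python
def is_compliant(connectors_used, policy):
    """policy maps connector name -> 'business' | 'non_business' | 'blocked'.
    Connectors missing from the policy are assumed non-business in this sketch."""
    groups = {policy.get(c, "non_business") for c in connectors_used}
    if "blocked" in groups:
        return False  # a blocked connector can never be used
    # Business and non-business connectors can't be combined in one app/flow.
    return not ({"business", "non_business"} <= groups)

policy = {
    "Dataverse": "business",
    "SharePoint": "business",
    "Twitter": "non_business",
    "FTP": "blocked",
}
print(is_compliant({"Dataverse", "SharePoint"}, policy))  # True
print(is_compliant({"Dataverse", "Twitter"}, policy))     # False (mixed groups)
print(is_compliant({"FTP"}, policy))                      # False (blocked)
```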

Strategies for creating DLP policies

As mentioned previously, as an administrator taking over an environment or starting to support use of Power Apps and Power Automate, DLP policies should be one of the first things you set up. This ensures a base set of policies is in place, and you can then focus on handling exceptions and creating targeted DLP policies that implement these exceptions once approved.

We recommend the following starting point for DLP policies for the shared user and team productivity environments:

  • Create a policy spanning all environments except selected ones (for example, your production environments), keep the available connectors in this policy limited to Office 365 and other standard microservices, and block access to everything else. This policy applies to the default environment and to the training environments you have for running internal training events. It will also apply to any new environments that are created.
  • Create appropriate and more permissive DLP policies for your shared user and team productivity environments. These policies could allow makers to use connectors like Azure services in addition to Office 365 services. The connectors available in these environments will depend on your organization, and where your organization stores business data.

We recommend the following starting point for DLP policies for production (business unit and project) environments:

  • Exclude those environments from the shared user and team productivity policies.
  • Work with the business unit and project to establish which connectors and connector combinations they’ll use and create a tenant policy to include the selected environments only.
  • Environment admins of those environments can use environment policies to categorize custom connectors as business data only, if necessary.

In addition to the above, we also recommend:

  • Creating a minimal number of policies per environment. There’s no strict hierarchy between tenant and environment policies, and at design and runtime, all policies that are applicable to the environment in which the app or flow resides are evaluated together to decide whether the resource is in compliance or violation of DLP policies. Multiple DLP policies applied to one environment will fragment your connector space in complicated ways and might make it difficult to understand the issues your makers are facing.
  • Centrally managing DLP Policies using tenant-level policies, and using environment policies only to categorize custom connectors or in exceptional cases.

With this in place, plan how to handle exceptions. You can:

  • Deny the request.
  • Add the connector to the default DLP policy.
  • Add the environments to the All Except list for the global default DLP and create a use case-specific DLP policy with the exception included.

Example: Contoso’s DLP strategy

Let’s look at how Contoso Corporation, our sample organization for this guidance, set up its DLP policies. The setup of their DLP policies ties in closely with their environment strategy. Contoso admins want to support user and team productivity scenarios and business applications, in addition to Center of Excellence (CoE) activity management.

The environment and DLP strategy Contoso admins have applied here consists of:

  • A tenant-wide restrictive DLP policy that applies to all environments in the tenant except some specific environments that they’ve excluded from the policy scope. Admins intend to keep the available connectors in this policy limited to Office 365 and other standard microservices by blocking access to everything else. This policy will also apply to the default environment.
  • Contoso admins have created another shared environment for users to create apps for user and team productivity use cases. This environment has an associated tenant-level DLP policy that isn’t as risk-averse as a default policy and allows makers to use connectors like Azure services in addition to the Office 365 services. Because this is a non-default environment, admins can actively control the environment maker list for it. This is a tiered approach to a shared user and team productivity environment and associated DLP settings.
  • In addition, to let the business units create line-of-business applications, they have created development, test, and production environments for their tax and audit subsidiaries across various countries. Environment maker access to these environments is carefully managed, and appropriate first- and third-party connectors are made available through tenant-level DLP policies in consultation with the business unit stakeholders.
  • Similarly, dev/test/production environments are created for Central IT’s use to develop and roll out relevant applications. These business application scenarios typically have a well-defined set of connectors that need to be made available for makers, testers, and users in these environments. Access to these connectors is managed using a dedicated tenant-level policy.
  • Contoso also has a special-purpose environment dedicated to its Center of Excellence activities. In Contoso, the DLP policy for the special-purpose environment will remain high touch, given the experimental nature of the CoE team’s work. In this case, tenant admins have delegated DLP management for this environment directly to a trusted environment admin of the CoE team and excluded it from the scope of all tenant-level policies. This environment is managed only by its environment-level DLP policy, which is an exception rather than the rule at Contoso.

As expected, any new environments that are created in Contoso will map to the original all-environments policy.

This setup of tenant-centric DLP policies doesn’t prevent environment admins from coming up with their own environment-level DLP policies, if they want to introduce other restrictions or classify custom connectors.

You can learn more about creating DLP policies here: Plan and manage your Microsoft Power Platform environment.

Determine storage requirements

Full Power Apps and Power Automate capabilities are licensed on a standalone basis. Additionally, limited Power Apps and Power Automate capabilities are included in various Office 365 and Dynamics 365 licenses, which means users with those licenses already have access to Power Apps and Power Automate. The Microsoft Power Apps and Power Automate Licensing Guide provides more details.

As an administrator, you aren’t required to have a standalone Power Apps or Power Automate license to manage environments.

Some questions to consider as you plan your per-app or per-user capacity:

  • Is a per-user or per-app license more cost-effective?
  • What add-on capacity do I need? For example, Portal Page Views, and AI Builder credits.
  • How much storage capacity do I need? Such as for databases, files, and logs.

Review capacity entitlements and usage from the Power Platform admin center.

Storage capacity

Microsoft Dataverse capacity (database, file, log, and add-ons) is pooled across the tenant and shared among all environments and workloads. The first subscription to Power Apps or Power Automate provides a one-time default capacity entitlement for the tenant.

For example, a Power Apps per-user plan would set the tenant capacity initially as 10 GB of Dataverse database, 20 GB of Dataverse file, and 2 GB of Dataverse log capacity.

Each additional licensed user provides a per-user capacity grant that increases the overall available tenant capacity. There are also capacity add-ons available to purchase more database, file, and log capacity.

Every environment can have zero or one Dataverse database, which provides storage for your apps. To create a database, there must be at least 1 GB of Dataverse database capacity remaining. Capacity is also consumed by normal Dataverse storage consumption by storing data, files, and logs.

As an administrator, you can monitor your capacity usage in the admin portal.

Tenant subscriptions include 10 GB of default database storage as long as at least one instance in the tenant is on v8.2.

More storage capacity is granted at no charge at the rate of 5 GB for every 20 full users. For example, for every increment of 20 Dynamics 365 for Sales SLs, the included storage capacity increases by 5 GB. A customer with 20 Dynamics 365 for Sales SLs receives 15 GB of database storage (the 10-GB default plus 5 GB of additional database storage). The amount of free storage that can be earned per tenant is subject to the technical limit of 30 TB.
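The accrual rule above can be expressed as a quick back-of-the-envelope calculation: the 10-GB default plus 5 GB per 20 full user licenses, capped at the 30-TB technical limit. The figures come from this module; check the current licensing guide before planning real capacity.

```python
# Included Dataverse database storage per the figures in this module:
# 10-GB default + 5 GB per 20 full user licenses, capped at 30 TB.
DEFAULT_GB = 10
GRANT_GB_PER_INCREMENT = 5
USERS_PER_INCREMENT = 20
CAP_GB = 30 * 1024  # the 30-TB technical limit, expressed in GB

def included_db_storage_gb(full_user_licenses):
    earned = (full_user_licenses // USERS_PER_INCREMENT) * GRANT_GB_PER_INCREMENT
    return min(DEFAULT_GB + earned, CAP_GB)

print(included_db_storage_gb(20))   # 15 -> matches the 20-license example above
print(included_db_storage_gb(100))  # 35
```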

Additional database storage add-on

The Database Additional Storage Add-on provides flexibility to increase the storage capacity associated with your Dynamics 365 Online subscription in increments of 1 GB per Additional Storage Add-on license, up to 30 TB of storage.

Storage for a customer subscription is tracked across all the instances associated with the tenant.

Determine how storage is being used across all instances within a tenant

Storage usage is displayed on the Power Platform Admin Center within the Capacity tab.

Screenshot of the storage usage displayed on the Power Platform Admin Center within the Capacity tab.

Manage authentication

Power Platform authentication involves a sequence of requests, responses, and redirects between the user’s browser and Power Platform or Azure services. The sequence follows the Azure Active Directory (Azure AD) auth code grant flow.
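The first leg of that auth code grant flow is a browser redirect to the tenant's /authorize endpoint; the app later exchanges the returned code at the /token endpoint. As a sketch of what that redirect URL contains, the snippet below builds one for a hypothetical app registration (the client_id and redirect_uri are placeholders, not real registrations):

```python
# Builds the Azure AD /authorize URL used in the first leg of the
# authorization code grant flow. No request is actually sent.
from urllib.parse import urlencode

def build_authorize_url(tenant, client_id, redirect_uri,
                        scope="openid profile", state="xyz"):
    params = {
        "client_id": client_id,
        "response_type": "code",   # ask for an authorization code
        "redirect_uri": redirect_uri,
        "response_mode": "query",
        "scope": scope,
        "state": state,            # CSRF protection, echoed back to the app
    }
    return (f"https://login.microsoftonline.com/{tenant}"
            f"/oauth2/v2.0/authorize?{urlencode(params)}")

url = build_authorize_url("contoso.onmicrosoft.com",
                          "00000000-0000-0000-0000-000000000000",
                          "https://localhost/callback")
print(url)
```

After the user authenticates, Azure AD redirects back to `redirect_uri` with a `code` parameter that the service exchanges for tokens.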

You can choose from three main identity models in Microsoft 365 when you set up and manage user accounts:

  • Cloud identity: Manage your user accounts in Microsoft 365 only. No on-premises servers are required to manage users; it’s all done in the cloud.
  • Synchronized identity: Synchronize on-premises directory objects with Microsoft 365 and manage your users on-premises. You can also synchronize passwords so that the users have the same password on-premises and in the cloud, but they’ll have to sign in again to use Microsoft 365.
  • Federated identity: Synchronize on-premises directory objects with Microsoft 365 and manage your users on-premises. The users have the same password on-premises and in the cloud, and they don’t have to sign in again to use Microsoft 365. This is often referred to as single sign-on.

It’s important to carefully consider which identity model to use to get up and running. Think about time, existing complexity, and cost. These factors are different for every organization. Your choice is based largely on the size of your company and the depth and breadth of your IT resources.

Understanding Microsoft 365 identity and Azure Active Directory

Microsoft 365 uses the cloud-based user identity and authentication service Azure Active Directory (Azure AD) to manage users. Deciding how identity management is configured between your on-premises organization and Microsoft 365 is an early decision that forms one of the foundations of your cloud infrastructure. Because changing this configuration later can be difficult, carefully consider the options to determine what works best for the needs of your organization.

You can choose from two main authentication models in Microsoft 365 to set up and manage user accounts: cloud authentication and federated authentication.

Cloud authentication

Depending on whether you have an existing on-premises Active Directory environment, you have several options to manage authentication and identity services for your users with Microsoft 365.

Cloud Only

With the cloud-only model, you manage your user accounts in Microsoft 365 only. No on-premises servers are required; it’s all handled in the cloud by Azure AD. You create and manage users in the Microsoft 365 admin center or by using Windows PowerShell cmdlets, and identity and authentication are handled completely in the cloud by Azure AD.

The cloud-only model is typically a good choice if:

  • You have no other on-premises user directory.
  • You have a complex on-premises directory and simply want to avoid the work to integrate with it.
  • You have an existing on-premises directory, but you want to run a trial or pilot of Microsoft 365. Later, you can match the cloud users to on-premises users when you’re ready to connect to your on-premises directory.

Password hash sync with seamless single sign-on

Password hash sync (PHS) is the simplest way to enable authentication for on-premises directory objects in Azure AD. With PHS, you synchronize your on-premises Active Directory user account objects with Microsoft 365 and manage your users on-premises. Hashes of user passwords are synchronized from your on-premises Active Directory to Azure AD so that the users have the same password on-premises and in the cloud. When passwords are changed or reset on-premises, the new password hashes are synchronized to Azure AD so that your users can always use the same password for cloud resources and on-premises resources. The passwords are never sent to Azure AD or stored in Azure AD in clear text. Some premium features of Azure AD, such as Identity Protection, require PHS regardless of which authentication method is selected. With seamless single sign-on, users are automatically signed in to Azure AD when they are on their corporate devices and connected to your corporate network.
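The "never sent in clear text" point can be made concrete: per Microsoft's documentation, what PHS sends to Azure AD is a per-user salted re-hash of the on-premises NT hash, derived with 1,000 iterations of PBKDF2 using HMAC-SHA256. The sketch below illustrates only that transform; the `nt_hash` bytes are a placeholder, not a real hash.

```python
# Conceptual sketch of the documented PHS transform: the on-premises
# NT hash is re-hashed with a per-user salt using PBKDF2-HMAC-SHA256
# (1,000 iterations) before it is sent to Azure AD. Neither the raw
# hash nor the clear-text password ever leaves the server.
import hashlib
import os

def phs_transform(nt_hash, salt):
    # The derived 32-byte value is what gets synchronized.
    return hashlib.pbkdf2_hmac("sha256", nt_hash, salt, 1000)

nt_hash = bytes.fromhex("aabbccddeeff00112233445566778899")  # placeholder
salt = os.urandom(10)
synced = phs_transform(nt_hash, salt)
print(len(synced))  # 32
```

Because the salt differs per user, identical passwords produce different synchronized values, which is why the synced hash can't be replayed against the on-premises domain.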

Pass-through authentication with seamless single sign-on

Pass-through authentication (PTA) provides simple password validation for Azure AD authentication services by using a software agent running on one or more on-premises servers to validate users directly against your on-premises Active Directory. With PTA, you synchronize on-premises Active Directory user account objects with Microsoft 365 and manage your users on-premises. PTA allows your users to sign in to both on-premises and Microsoft 365 resources and applications by using their on-premises account and password. This configuration validates users' passwords directly against your on-premises Active Directory without sending password hashes to Microsoft 365. Companies with a security requirement to immediately enforce on-premises user account states, password policies, and sign-in hours would use this authentication method. With seamless single sign-on, users are automatically signed in to Azure AD when they are on their corporate devices and connected to your corporate network.

Single Sign-On

By default, Dynamics 365 Online uses Azure Active Directory for authentication, however, many organizations around the world use their Local Active Directory to do authentication in-house.

Azure Active Directory Seamless Single Sign-On (Azure AD Seamless SSO) automatically signs users in when they are on their corporate devices connected to your corporate network. When enabled, users don’t need to type in their passwords to sign in to Azure AD, and usually don’t even need to type in their usernames. This feature provides your users easy access to your cloud-based applications without needing any more on-premises components.

Seamless SSO can be combined with either the Password Hash Synchronization or Pass-through Authentication sign-in methods. Seamless SSO isn’t applicable to Active Directory Federation Services (ADFS).

Diagram of Seamless Single Sign-On with Password Hash Synchronization and Pass-through Authentication methods.


Seamless SSO needs the user’s device to be domain-joined but doesn’t require the device to be Azure AD joined.

Key benefits

  • Great user experience
    • Users are automatically signed into both on-premises and cloud-based applications.
    • Users don’t have to enter their passwords repeatedly.
  • Easy to deploy and administer
    • No other components are needed on-premises to make this work.

Things to consider

  • Sign-in username can be either the on-premises default username (userPrincipalName) or another attribute configured in Azure AD Connect (Alternate ID). Both use cases work because Seamless SSO uses the security identifier claim in the Kerberos ticket to look up the corresponding user object in Azure AD.
  • Seamless SSO is an opportunistic feature. If it fails for any reason, the user sign-in experience falls back to its regular behavior; that is, the user needs to enter their password on the sign-in page.
  • If an application forwards a domain_hint (OpenID Connect) or whr (SAML) parameter identifying your tenant, or a login_hint parameter identifying the user, in its Azure AD sign-in request, users are automatically signed in without entering their usernames or passwords.
  • Users also get a silent sign-on experience if an application sends sign-in requests to Azure AD’s tenanted endpoints – that is, endpoints that include your tenant name or <tenant_ID> – instead of Azure AD’s common endpoint.
  • Sign out is supported. This allows users to choose another Azure AD account to sign in with, instead of being automatically signed in with Seamless SSO.
  • Microsoft 365 Win32 clients (Outlook, Word, Excel, and others) with versions 16.0.8730.xxxx and above are supported using a non-interactive flow. For OneDrive, you’ll have to activate the OneDrive silent config feature for a silent sign-on experience.
  • It can be enabled via Azure AD Connect.
  • It’s a free feature, and you don’t need any paid editions of Azure AD to use it.
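The domain_hint and login_hint behavior above amounts to adding query parameters to the Azure AD sign-in request. The snippet below illustrates that; the client_id and account values are placeholders for illustration.

```python
# Adds the hint parameters described above to an Azure AD sign-in
# request so Seamless SSO can complete without prompting the user.
# client_id and the account names are placeholders.
from urllib.parse import urlencode

def signin_url_with_hints(base_params, domain_hint, login_hint):
    params = dict(base_params, domain_hint=domain_hint, login_hint=login_hint)
    return ("https://login.microsoftonline.com/common/oauth2/v2.0/authorize?"
            + urlencode(params))

url = signin_url_with_hints(
    {"client_id": "app-id-placeholder", "response_type": "code"},
    domain_hint="contoso.com",          # identifies the tenant
    login_hint="alice@contoso.com",     # identifies the user
)
print(url)
```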

Federate a single AD forest environment to the cloud

The following tutorial will walk you through creating a hybrid identity environment using federation. This environment can then be used for testing or for getting more familiar with how a hybrid identity works.

For more information, see Tutorial: Federate a single AD forest environment to the cloud.

Azure AD Conditional Access

Conditional Access policies in Azure Active Directory (Azure AD) at their simplest are if-then statements: if a user wants to access a resource, then they must complete an action.

Example: A payroll manager wants to access the payroll app that has been built with Power Apps and is required to perform multifactor authentication to access it.

Administrators are faced with two primary goals:

  • Empower users to be productive wherever and whenever.
  • Protect the organization’s assets.

By using Conditional Access policies, you can apply the right access controls when needed to keep your organization secure and stay out of your users’ way when they’re not needed. Conditional Access policies are enforced after the first-factor authentication has been completed.
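The payroll example can be written out as exactly the if-then statement described above. The sketch below is a toy model of policy evaluation, not an Azure AD API; the app names and rule are illustrative.

```python
# A minimal if-then model of the Conditional Access example above:
# accessing the payroll app requires completed multifactor authentication.
from dataclasses import dataclass

@dataclass
class SignIn:
    user: str
    app: str
    mfa_completed: bool

def evaluate_policy(signin):
    # "If a user wants to access a resource, then they must complete an action."
    if signin.app == "Payroll" and not signin.mfa_completed:
        return "blocked: MFA required"
    return "granted"

print(evaluate_policy(SignIn("manager@contoso.com", "Payroll", False)))  # blocked: MFA required
print(evaluate_policy(SignIn("manager@contoso.com", "Payroll", True)))   # granted
```

Note that, matching the text above, the policy is evaluated only after first-factor authentication has already succeeded.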

Only Global Admins can configure Conditional Access policies. This isn’t available for Microsoft Power Platform or Dynamics 365 admins.

Diagram of a conceptual conditional access process flow.

For reference see Plan a Conditional Access deployment.

Connect and authenticate to data sources

Many of the applications and solutions that you build on the Power Platform will require data from other data sources. For example, a canvas application built by a power user might need to include data coming from Microsoft Dataverse and data from another source like a SQL database or some other location.

Connecting and authenticating to a data source is done separately from authenticating to a Power Platform service. When looking at how connection authentication is done, you first need to understand how Power Platform services connect with different data sources. Depending on the data source, Power Platform services connect in various ways; however, the general pattern is the same. Based on the app and the data source that is being used, the authentication credentials may be the same as those for the Power Platform service, or they may be different.

Connecting to Microsoft Dataverse

Power Apps canvas and model-driven apps connect directly to Dataverse without the need for a separate connector. Canvas apps store consent to work with other Dataverse environments in the Power Apps Resource Provider (RP). Power Automate authenticates using an API Hub, but all data interactions after that are direct to Dataverse. Both Power Apps and Power Automate also support legacy connector-based access to Dataverse, such as through the now-deprecated Dynamics 365 connector and the Microsoft Dataverse (legacy) connector.

The diagram below illustrates how canvas apps work with Dataverse.

Diagram of the direct connection between the Power Apps back-end cluster and Dataverse.
  1. Power Apps back-end services request data directly from Dataverse.
  2. Dataverse returns query results back to Power Apps back-end services.
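The request/response exchange above happens over the Dataverse Web API (an OData endpoint). As a hedged sketch of what such a request looks like, the snippet below only constructs the URL and headers; the organization URL and access token are placeholders and no call is made.

```python
# Builds a Dataverse Web API (OData) query like the one in step 1 above.
# The org URL and bearer token are placeholders; nothing is sent.
from urllib.parse import quote

def build_dataverse_query(org_url, entity_set, select):
    url = f"{org_url}/api/data/v9.2/{entity_set}?$select={quote(','.join(select))}"
    headers = {
        "Authorization": "Bearer <access-token-placeholder>",
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    }
    return url, headers

url, headers = build_dataverse_query(
    "https://contoso.crm.dynamics.com", "accounts", ["name", "accountid"])
print(url)
```

The JSON response in step 2 would contain a `value` array with one object per matching row.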

Connecting to non-Dataverse data sources

In general, Power Platform services will use connectors to work with external data sources that aren’t Dataverse. These connectors act as API wrappers that help to provide access to data and commands that will be available through the connector.

The following diagram illustrates a typical pathway using an Azure API Management (APIM) connector.

Diagram of the Power Platform back-end services working with an API Hub/API Management connector to reach external data connectors.
  1. The Power Platform service sends a connection request to the Power Apps Resource Provider (RP).
  2. The Power Apps RP asks the API Hub to create a connection and store the authentication credentials.
  3. The Power Platform service sends a data query request to the API Management connector.
  4. The API Management connector sends a request to the consent service to get permission to access the data source.
  5. The consent service returns credentials to the API Management connector.
  6. The API Management connector sends the consent credentials to the Power Apps RP. The credentials are stored in the RP so that Power Apps doesn’t prompt for consent again the next time data is requested.
  7. The API Management connector passes the data query to the external connector.
  8. The connector sends the query to the data source.
  9. The data source returns the requested data to the connector.
  10. The connector passes the data back to the Power Platform back-end cluster.
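Steps 2 and 6 above are what make the consent experience one-time: once the consent service returns credentials, the Resource Provider caches them so later data requests skip the prompt. The toy model below illustrates only that caching behavior; the class and method names are invented for illustration.

```python
# Toy model of steps 2 and 6 above: the Resource Provider stores
# credentials after the first consent, so repeat requests for the same
# connection don't prompt the user again. All names are illustrative.
class ResourceProvider:
    def __init__(self):
        self._stored = {}        # connection id -> stored credentials
        self.consent_prompts = 0

    def get_credentials(self, connection_id):
        if connection_id not in self._stored:
            self.consent_prompts += 1  # user sees the consent dialog once
            self._stored[connection_id] = f"cred-for-{connection_id}"
        return self._stored[connection_id]

rp = ResourceProvider()
rp.get_credentials("sql-conn")  # first request: prompts for consent
rp.get_credentials("sql-conn")  # second request: served from the RP store
print(rp.consent_prompts)       # 1
```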

Authenticating data sources

Users authenticate to the Power Platform service first. Then, separately, users authenticate to a data source using the credentials the connector requires. The API Hub credentials service always stores and manages credentials.

Some connectors support more than one authentication method. Authentication to a data source is specific to that data source instance. It’s based on the authentication method the maker chose when creating the connection.

There are two types of data source authentication methods in Power Apps:

  • Explicit authentication means the app user’s credentials are used to access the data source.
  • Implicit authentication means the credentials the app maker provided when creating the connection are used.

We recommend you use explicit authentication whenever possible. It’s more secure.
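The difference between the two methods comes down to whose credential the data source ultimately sees. The sketch below makes that contrast explicit; the user and credential names are illustrative, not a Power Apps API.

```python
# Contrasts the two authentication methods described above: with explicit
# authentication each app user supplies their own credential; with
# implicit authentication every user shares the maker's embedded
# credential. All names are illustrative.
def resolve_credential(method, app_user, maker_credential):
    if method == "explicit":
        return f"credential-of-{app_user}"  # data source sees the real user
    if method == "implicit":
        return maker_credential             # data source sees only the maker
    raise ValueError(f"unknown method: {method}")

print(resolve_credential("explicit", "alice", "maker-sql-login"))  # credential-of-alice
print(resolve_credential("implicit", "bob", "maker-sql-login"))    # maker-sql-login
```

This is why explicit authentication is more secure: the data source can enforce each user's own permissions and audit who actually accessed the data.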

You can learn more about the difference between explicit and implicit connections here: Explicit vs implicit connections. Although the article refers to SQL Server, it applies to all relational databases.

Let’s quickly review what we covered in this module.

In this module, we examined some of the different elements that you should consider when developing an implementation strategy for Power Platform deployments. This includes developing a strategy around environments, security, and storage. Some of the key elements that we examined throughout this module include:

  • Examining different considerations related to establishing a vision and deployment strategy as it relates to the Power Platform.
  • Reviewing administrative considerations related to defining administrative roles and creating Power Platform environments.
  • Examining how to create a Data Loss Prevention (DLP) strategy for securing data.
  • Reviewing different considerations related to storage.
  • Exploring different authentication strategies.
  • Examining how to connect to different data sources.

We hope this module helps you with your future Power Platform deployments.