Dataverse best practices, summarized in table format for clarity:
| Category | Best Practice | Benefits |
|---|---|---|
| Data Modeling | Normalize data by breaking it into multiple related tables. | Reduces redundancy, improves data integrity, and simplifies maintenance. |
| | Use appropriate field types, such as Lookups for relationships and Choice fields for predefined lists. | Ensures consistency, enhances querying, and supports better data validation. |
| | Define clear naming conventions with prefixes (e.g., cm_ for Course Management). | Improves readability and maintainability of data models. |
| | Plan for scalability with indexing and optimized filtering for large datasets. | Improves performance when querying tables with millions of rows. |
| Performance | Enable delegation in Canvas Apps so queries execute server-side. | Handles large datasets efficiently without hitting row limits. |
| | Use filtered FetchXML queries with pagination for large datasets (>50,000 rows). | Reduces query execution time and improves system responsiveness. |
| | Minimize the use of Multiline Text fields and calculated fields. | Enhances performance and reduces storage overhead. |
| Security | Implement role-based security at the table and record levels. | Ensures secure and granular access control for users. |
| | Use field-level security for sensitive information like salaries or personal details. | Protects confidential data and complies with data privacy regulations. |
| | Design business units carefully to reflect the organization's structure. | Simplifies security and sharing configurations across teams or departments. |
| Automation | Use Power Automate for non-real-time processes and workflows. | Provides scalable and flexible automation with better performance. |
| | Limit calculated and rollup fields to simple calculations. | Reduces server-side processing load and improves data update efficiency. |
| Integration | Use virtual tables for read-only integration with external data sources. | Avoids data duplication and keeps external data accessible in real time. |
| | Consolidate multiple API calls into batches. | Reduces overhead, avoids API throttling, and improves integration efficiency. |
| | Monitor and manage API usage against your license's allocated request limits. | Ensures uninterrupted integrations and avoids throttling. |
| Storage Management | Regularly monitor storage usage in the Dataverse Capacity report. | Helps identify and address potential storage issues proactively. |
| | Archive large files and historical data to external storage such as Azure Blob Storage. | Reduces storage costs and maintains efficient database performance. |
| | Disable unnecessary auditing to avoid excessive log storage usage. | Frees up log storage and reduces additional storage costs. |
| Auditing and Logging | Enable auditing only for critical fields and tables. | Maintains compliance and tracks important changes without overwhelming storage. |
| | Periodically review and export logs for long-term retention. | Keeps system logs manageable while ensuring audit data is preserved. |
| Backup and Recovery | Schedule regular backups and test the restore process periodically. | Ensures data safety and quick recovery during emergencies. |
| Governance | Establish naming conventions, approval processes, and role-based permissions for customizations. | Improves consistency, maintainability, and security of the Dataverse environment. |
| | Maintain documentation for table schemas, workflows, and integrations. | Enhances collaboration and simplifies troubleshooting. |
| Licensing and Costs | Use per-app licensing for specific app users instead of per-user plans where applicable. | Optimizes costs while providing access to necessary features. |
| | Monitor and deactivate unused accounts to save on license costs. | Reduces unnecessary expenses and ensures proper license allocation. |
Source: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/best-practices/work-with-data/