
MongoDB Infrastructure Upgrade and Data Migration

This guide covers the infrastructure upgrade and data migration steps required for Brief Connect v2.5. This is Phase 2 of the three-phase upgrade process.

Work with the E2 Support Team

Please complete this phase in collaboration with the E2 support team. Our team will guide you through each step and ensure a successful migration.

Change Window Required

This phase requires a change window of several hours. The change window begins when you enable maintenance mode (step 2) and ends when Phase 2 is successfully complete.

Plan to perform this upgrade outside of core usage hours.


Overview

Phase 2 consists of the following steps:

  1. Deploy MongoDB Infrastructure
  2. Enable Maintenance Mode and Migration Mode
  3. Run Data Migration
  4. Swap to MongoDB
  5. Update Power BI Models
  6. Verify Migration Success (admin verification while in maintenance mode)
  7. Disable Maintenance Mode
  8. Run Full Smoke Tests (includes rollback procedure if needed)

Maintenance mode must remain active from step 2 until step 7 is complete.


Step 1: Deploy MongoDB Infrastructure

This step updates your existing Azure Resources release pipeline to deploy MongoDB Atlas (primary database) alongside your current infrastructure.

Prerequisites

Before starting the infrastructure deployment:

  1. If you are deploying the optional reporting database (Azure Cosmos DB for MongoDB vCore), confirm your Azure Subscription has the resource provider Microsoft.DocumentDB registered. If it doesn't, register this resource provider before proceeding.

  2. Ensure you have the MongoDB Atlas prerequisites in place:

    • An Atlas organisation (created directly in Atlas, or via the Azure Marketplace offer)
    • An Atlas OAuth service account with the Organisation Owner role
    • The Atlas Organisation ID, plus the service account Client ID and Client Secret

    If you are provisioning Atlas via the Azure Marketplace SaaS offer, you may also need the Azure resource provider Microsoft.SaaS registered in your subscription.

  3. Contact the E2 support team to request an updated Azure Resources repository containing:

    • Updated Bicep templates (v2.5.2+)
    • Provisioning scripts (e.g. Scripts/ProvisionInfra.ps1)

Non-Standard Infrastructure

If your environment uses hardened infrastructure with VNets, private endpoints, or was deployed using alternative Infrastructure as Code tools (e.g., Terraform, Pulumi), please work directly with the E2 support team for tailored deployment support. The steps below apply to standard Brief Connect deployments only.

1.1 Update DevOps App Role Assignments

The deployment pipeline requires an additional role to manage Key Vault secrets. Update the Role Based Access Control Administrator role assignment for the "[EnvPrefix] Brief Connect DevOps" Entra ID app to include the new Key Vault Secrets Officer role.

  1. Open the Azure Resource Group for the related environment in Azure Portal
  2. Go to Access Control (IAM) → Role assignments
  3. Find the existing Role Based Access Control Administrator assignment for "[EnvPrefix] Brief Connect DevOps"
  4. Edit the assignment to update the conditions:
    • In the Conditions tab, click on + Select roles and principals
    • Click Configure in the "Constrain roles" section
    • Ensure the following roles are selected:
      • Storage Account Contributor
      • Key Vault Secrets User
      • Key Vault Secrets Officer (new)
  5. Save the assignment

1.2 Update Your Release Pipeline

Your environment already has a release pipeline for Azure resources. You need to update it to use the new deployment tasks.

  1. Open Azure DevOps and navigate to Pipelines → Releases
  2. Open the Deploy Brief Connect Azure Resources release pipeline
  3. Click Edit
  4. Add the following pipeline variable:

    • Name: packageFolder; Value: the source alias of the pipeline's artifact (for example, _Brief Connect - Azure Resources); Scope: Release

  5. Ensure your linked variable group includes the Atlas variables:

    • AtlasOrganizationId
    • AtlasDevOpsAppId
    • AtlasDevOpsAppSecret (secret)
    • AtlasAllowCreateProjectWhenMissing (set to true to allow the deployment to create the Atlas Project if it does not already exist)

Upgrade note

In v2.5.2+, packageFolder must point to the artifact root (not /Bicep). The deployment stage runs Scripts/ProvisionInfra.ps1 and reads/writes ProvisionMongoAtlas.parameters.json, which live at the artifact root.

  1. Select the stage for your environment
  2. In the stage tasks:
    • Delete all existing tasks from this stage
    • Add the two new tasks as described below

1.3 Add PowerShell Task - Prepare Bicep Parameters

Add a PowerShell task with these settings:

  • Display name: Bicep param File
  • Type: Inline
  • Working Directory: $(System.DefaultWorkingDirectory)/$(packageFolder)

Script:

Use the Non-prod or Prod params script file as a base for your environment deployment.

Review and update the values in the parameters file as needed (for example, SKU sizes and resource naming). The parameters used are described below.

Bicep/parameters-1.bicepparam (for main-1.bicep):

  • region (string, no default): The Azure region where the resources will be deployed.
  • userAssignedIdentityName (string, no default): The name of the user-assigned managed identity to be created or used.
  • keyVaultName (string, no default): The name of the Azure Key Vault to be created or used.
  • mongoMainPasswordSecretName (string, no default): Key Vault secret name for the Atlas DB admin password.
  • mongoReplicaPasswordSecretName (string, no default): Key Vault secret name for the vCore DB admin password (reporting/replica).
  • mongoPasswordLength (int, default 32): Length for newly generated passwords.
  • mongoPasswordMinExistingLength (int, default 16): Minimum length required to accept an existing secret without rotation. If this requirement is not met, a new password is generated.
  • forceUpdateTag (string, default utcNow()): Force update tag for the password generation scripts. Used to trigger script execution (set to utcNow() by default so the scripts run on every deployment).
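As a reference, a minimal parameters-1.bicepparam might look like the following sketch. All resource and secret names here are hypothetical placeholders; substitute your environment's naming convention.

```bicep
// Hypothetical example values only; substitute your environment's names.
using './main-1.bicep'

param region = 'australiaeast'
param userAssignedIdentityName = 'bcdev-identity'
param keyVaultName = 'bcdev-kv'
param mongoMainPasswordSecretName = 'MongoDbMainPassword'
param mongoReplicaPasswordSecretName = 'MongoDbReplicaPassword'

// Optional: the defaults, shown here for clarity.
param mongoPasswordLength = 32
param mongoPasswordMinExistingLength = 16
```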

Bicep/parameters-2.bicepparam (for main-2.bicep):

  • serverAppRegistrationClientId (string, no default): Server App Registration application ID.
  • clientAppRegistrationClientId (string, no default): Client App Registration application ID.
  • region (string, no default): The region where the resources will be deployed.
  • existingApiAppServicePlanResourceId (string, default ''): Optional existing API Web App Service Plan resource ID. Used for non-prod environments to share a service plan across multiple environments. Leave empty to create a new one.
  • existingFunctionAppServicePlanResourceId (string, default ''): Optional existing Function App Service Plan resource ID. Used for non-prod environments to share a service plan across multiple environments. Leave empty to create a new one.
  • useApiAppServicePlanForFunctionApp (bool, default false): Set to true to reuse the API Web App Service Plan for both the API Web App and the Function App. Recommended for non-prod environments to reduce costs.
  • apiAppServicePlan (object, default {}): SKU for the API app service plan (if creating a new one).
  • apiServicePlanName (string, default ''): Name for the API app service plan (if creating a new one).
  • functionAppServicePlan (object, default {}): SKU for the function app service plan (if creating a new one).
  • functionAppServicePlanName (string, default ''): Name for the function app service plan (if creating a new one).
  • redisCacheSku (object, no default): The SKU of the Redis cache to deploy.
  • redisInstanceName (string, no default): Name of the Redis instance.
  • userAssignedIdentityName (string, no default): Name of the user-assigned managed identity.
  • apiAppName (string, no default): Name of the API App.
  • functionAppName (string, no default): Name of the Function App.
  • keyVaultName (string, no default): Name of the Key Vault.
  • logAnalyticsWorkspaceName (string, no default): Name of the Log Analytics Workspace.
  • applicationInsightsName (string, no default): Name of the Application Insights resource.
  • storageAccountWebName (string, no default): Name of the Storage Account for the web/static site.
  • storageAccountFuncName (string, no default): Name of the Storage Account for the Function App.
  • storageAccountDataName (string, no default): Name of the Storage Account for data.
  • dataStorageRegion (string, default region): Region for the data storage account. By default the 'region' parameter value is used.
  • dataMongoReplicaRegion (string, default region): Region for the MongoDB vCore deployment. By default the 'region' parameter value is used.
  • mongoMainConnectionStringSecretName (string, no default): Key Vault secret name to store the Atlas SRV connection string placeholder.
  • mongoMainPasswordSecretName (string, no default): Key Vault secret name that stores the Atlas DB admin password.
  • mongoReplicaClusterName (string, no default): MongoDB vCore cluster name (reporting/replica).
  • mongoReplicaAdminUsername (string, default 'mongoadmin'): MongoDB vCore administrator username.
  • mongoReplicaPasswordSecretName (string, no default): Key Vault secret name that stores the MongoDB vCore admin password.
  • mongoReplicaServerVersion (string, default '8.0'): MongoDB vCore server version.
  • mongoReplicaComputeTier (string, default 'M10'): MongoDB vCore compute tier.
  • mongoReplicaStorageSizeGb (int, default 32): MongoDB vCore storage size in GB.
  • mongoReplicaStorageType (string, default 'PremiumSSD'): MongoDB vCore storage type.
  • mongoReplicaShardCount (int, default 1): MongoDB vCore shard count.
  • mongoReplicaHighAvailabilityMode (string, default 'Disabled'): MongoDB vCore high availability mode.
  • mongoReplicaPublicNetworkAccess (string, default 'Enabled'): MongoDB vCore public network access.
  • mongoReplicaAllowAllIPs (bool, default false): Enable a MongoDB firewall rule that allows all IPs (for development only).
  • mongoReplicaAllowAzureServices (bool, default true): Enable a MongoDB firewall rule that allows Azure services.
  • defaultLogsRetentionInDays (int, default 90): Default retention in days for application logs.
  • auditLogsInteractiveRetentionInDays (int, default 180): Retention in days for interactive audit logs.
  • auditLogsTotalRetentionInDays (int, default 730): Total retention in days for audit logs (includes interactive and archived).

Notes:

  • For mongoPasswordLength and mongoPasswordMinExistingLength, you may use the defaults unless your security policy requires otherwise.
  • MongoDB passwords are generated automatically and stored in Key Vault. You may rotate the stored secrets to values that match your security policies.
  • For object parameters (e.g., SKUs), see Azure documentation for required fields.
  • Parameters that have the same name in both .bicepparam files must share the same values.
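To illustrate how the two password-length parameters interact, the plain-shell sketch below generates a 32-character value (the mongoPasswordLength default) and shows when an existing secret would be rotated. All values are hypothetical; the real generation is performed by the deployment scripts and the result is stored in Key Vault.

```shell
# Illustrative sketch of mongoPasswordLength / mongoPasswordMinExistingLength.
# Generate a 32-character password (the mongoPasswordLength default):
PASSWORD=$(openssl rand -hex 32 | cut -c1-32)
echo "generated length: ${#PASSWORD}"

# An existing Key Vault secret shorter than mongoPasswordMinExistingLength (16)
# does not meet the minimum, so the deployment would generate a new password:
EXISTING='shortsecret'   # hypothetical existing secret value
if [ ${#EXISTING} -lt 16 ]; then
  echo "existing secret below minimum length: a new password would be generated"
fi
```

This is only a model of the length checks; do not hand-roll production passwords this way if your security policy requires a richer character set.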

1.4 Add Azure CLI Task - Deploy Resources

Add an Azure CLI task with these settings:

  • Display name: Deploy Azure Resources
  • Azure Resource Manager connection: Select your existing service connection
  • Script Type: PowerShell
  • Script Location: Script Path
  • ErrorActionPreference: Stop
  • Working Directory: $(System.DefaultWorkingDirectory)/$(packageFolder)
  • Access service principal details in script: Checked

Script Path:

$(System.DefaultWorkingDirectory)/$(packageFolder)/Scripts/ProvisionInfra.ps1

Script Arguments:

-rootFolderPath "$(System.DefaultWorkingDirectory)/$(packageFolder)" -AtlasDevOpsAppSecret "$(AtlasDevOpsAppSecret)"
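With packageFolder set as described in section 1.2, the invocation resolves on the agent roughly as follows. The working directory path is illustrative (it varies by agent), and the secret is masked:

```shell
# Illustrative resolution of the script path and arguments on the agent.
SystemDefaultWorkingDirectory='/home/vsts/work/r1/a'   # hypothetical agent path
packageFolder='_Brief Connect - Azure Resources'       # artifact source alias

# packageFolder must point at the artifact root (not /Bicep), because
# Scripts/ProvisionInfra.ps1 lives under that root:
rootFolderPath="$SystemDefaultWorkingDirectory/$packageFolder"
echo "script: $rootFolderPath/Scripts/ProvisionInfra.ps1"
echo "args:   -rootFolderPath \"$rootFolderPath\" -AtlasDevOpsAppSecret ***"
```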

1.5 Link the Variable Group

  1. Go to the Variables tab in the pipeline editor
  2. Select Variable groups
  3. Click Link variable group
  4. Select your existing environment variable group and link it to the stage

1.6 Run the Pipeline

  1. Save the pipeline
  2. Create a new release for your environment stage only
  3. Wait for the deployment to complete
  4. Verify MongoDB Atlas has been provisioned:

    • In Atlas, confirm the expected Project and Cluster exist
    • In Azure Key Vault, confirm the Atlas connection string secret (e.g. MongoDbMainConnectionString) exists

If you are deploying the optional reporting database, also confirm the Azure Cosmos DB for MongoDB vCore resource is provisioned in your Azure resource group.

What this deploys

The pipeline provisions MongoDB Atlas (primary database) and updates your existing Azure resources (Function App, Key Vault, Storage Accounts, etc.) in place — they will not be recreated or deleted.


1.7 (Optional) Enable Data API for Azure Cosmos DB for MongoDB vCore (Power BI only)

The Data API is required for Power BI connectivity when using the Azure Cosmos DB for MongoDB vCore connector.

  1. Go to Azure portal
  2. Go to your Azure Cosmos DB for MongoDB vCore instance (reporting database)
  3. Open the 'Features' blade
  4. Click on 'Data API'
  5. Click on 'Enable'
  6. Verify the result. Your 'Features' blade should now display 'Data API' as 'On'.

Step 2: Enable Maintenance Mode and Migration Mode

Before migrating to MongoDB, place your Brief Connect environment into maintenance mode to prevent data loss during the migration.

Follow the steps in Maintenance Mode to enable maintenance mode:

  1. Open the Function App for the environment
  2. Go to Configuration → Application settings
  3. Add or update the setting MAINTENANCE_MODE to true
  4. Rename the TempMongoDbConnectionString setting to MongoDbMigrationConnectionString, keeping the value unchanged
  5. Rename the TempMongoDbPassword setting to MongoDbPassword, keeping the value unchanged
  6. If DataStorageAccountConnectionString is not present in the application settings, add it and copy the value from the StorageAccountConnectionString setting
  7. Save changes and allow the app to restart
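The renames in steps 4-6 change only the setting names; the values are untouched. Conceptually (setting names are the real ones, values are hypothetical placeholders):

```shell
# Before (hypothetical values):
TempMongoDbConnectionString='mongodb+srv://bcdev-cluster.ab1cd.mongodb.net'
TempMongoDbPassword='@Microsoft.KeyVault(SecretUri=https://bcdev-kv.vault.azure.net/secrets/MongoDbPassword)'
StorageAccountConnectionString='DefaultEndpointsProtocol=https;AccountName=bcdevdata'

# After: the same values under the new setting names.
MongoDbMigrationConnectionString="$TempMongoDbConnectionString"
MongoDbPassword="$TempMongoDbPassword"

# DataStorageAccountConnectionString is only copied from the storage account
# connection string when it is not already set:
DataStorageAccountConnectionString="${DataStorageAccountConnectionString:-$StorageAccountConnectionString}"

echo "$MongoDbMigrationConnectionString"
echo "$DataStorageAccountConnectionString"
```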

Verify maintenance mode is active:

  • Regular users should see a maintenance message and cannot access the application
  • System Administrators can still access the application with a maintenance banner visible
  • System Administrators can see the "Database Migration" button in the command bar of the Admin Panel (https://[WEB_APP_URL]/#/adminPanel)

Do not proceed until maintenance mode is confirmed active

Proceeding without maintenance mode enabled may result in data loss during migration.


Step 3: Run Data Migration

Brief Connect includes a built-in migration tool in the Admin Panel to migrate data from Azure Storage to MongoDB.

Open the Data Migration panel

  1. Log into the Brief Connect environment as a System Administrator
  2. Navigate to the Admin Panel (https://[WEB_APP_URL]/#/adminPanel)
  3. In the command bar, click Database Migration

Note

The "Database Migration" button is only visible when the required migration environment variables are configured.

Start the migration

  1. In the Selection Panel, select the tables and blob containers to migrate

    • By default, all required tables and containers should be selected

    Data Migration Selection Panel

  2. Click the Start button

  3. Review the confirmation dialog showing the number of items to migrate
  4. Click Start to begin the migration

Monitor migration progress

The Data Migration panel displays real-time progress:

  • Migration Summary Card: Shows the current run ID, elapsed time, and overall state
  • Progress Display: Shows tables completed, in progress, and failed, as well as total rows migrated
  • Diagnostics Panel: Shows connection status, performance metrics, and any warnings

The panel auto-refreshes during migration. You can also click Refresh to manually update the status.

Tip

Click Show Detailed Progress to view per-table migration metrics.

Wait for completion

The migration is complete when:

  • All tables show as "Completed" in the progress display
  • All blob containers show as "Completed"
  • The overall migration state shows "Completed"

If any items fail, review the error details in the Diagnostics Panel before proceeding.


Step 4: Swap to MongoDB

  1. Open the Function App for the environment
  2. Stop the application
  3. Go to Configuration → Application settings
  4. Rename the MongoDbMigrationConnectionString setting to MongoDbConnectionString, keeping the value unchanged.
  5. Confirm MongoDbPassword is present and still points to the same Key Vault secret.
  6. Save changes
  7. Open the API Web App for the environment
  8. Stop the application
  9. Go to Configuration → Application settings
  10. Rename the TempMongoDbConnectionString setting to MongoDbConnectionString, keeping the value unchanged.
  11. Rename the TempMongoDbPassword setting to MongoDbPassword, keeping the value unchanged.
  12. Save changes
  13. Go to the Redis Cache instance and flush all data in the cache
  14. Start the Function App and the API Web App for the environment
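After the swap, both apps should hold the same final setting names and values. A quick sanity check of the expected shapes (values hypothetical; MongoDbPassword uses the standard App Service Key Vault reference syntax):

```shell
# End state after the swap (hypothetical values):
MongoDbConnectionString='mongodb+srv://bcdev-cluster.ab1cd.mongodb.net'
MongoDbPassword='@Microsoft.KeyVault(SecretUri=https://bcdev-kv.vault.azure.net/secrets/MongoDbPassword)'

# The connection string should use the Atlas SRV scheme:
case "$MongoDbConnectionString" in
  mongodb+srv://*) echo "connection string OK" ;;
  *)               echo "unexpected scheme" ;;
esac

# The password setting should still be a Key Vault reference, not a literal value:
case "$MongoDbPassword" in
  '@Microsoft.KeyVault('*) echo "Key Vault reference OK" ;;
  *)                       echo "not a Key Vault reference" ;;
esac
```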

Step 5: Update Power BI Models

After the data migration is complete, the Power BI models need to be updated to connect to the new MongoDB data source.

E2 Support Team Responsibility

The E2 support team will prepare updated Power BI models in advance of your scheduled deployment and provide support during the upgrade to complete this step. No action is required from your team — the E2 support team will handle the Power BI model update.

Data API Preview Feature

Power BI connects to MongoDB using the Azure Cosmos DB for MongoDB vCore connector. This requires the Data API preview feature to be enabled on your Azure Cosmos DB for MongoDB vCore (reporting database) instance.

The E2 support team will enable this feature as part of the infrastructure upgrade.

Preview Feature Considerations

Both the Data API and the Power BI connector are currently in preview. While these features are fully functional and supported for Brief Connect deployments, using preview features carries some risk (albeit small) that the service or product may change or cease to operate before reaching general availability.


Step 6: Verify Migration Success

Verification is split into two phases: admin verification while maintenance mode is active, and full smoke tests after maintenance mode is disabled.

Check migration tool status

  1. In the Data Migration panel, confirm the migration state shows Completed
  2. Review the Diagnostics Panel for any errors or warnings
  3. Click Download Report to download a migration report for your records
  4. Confirm no tables or containers show a "Failed" state

If any items failed, work with the E2 support team to resolve the issues before proceeding.

Run admin verification (while in maintenance mode)

While maintenance mode is active, System Administrators can perform initial verification to confirm the migration was successful.

  • Log in as a System Administrator → Dashboard loads successfully
  • Open the Admin Panel → Admin Panel loads with all configuration options available
  • Open an existing record → Record opens and displays all metadata correctly
  • View the Documents tab on a record → All documents are listed and can be opened
  • View the Activity Log on a record → Activity history is displayed correctly
  • View the People and Roles tab → All role assignments are displayed correctly
  • Search for a record using the search box → Search returns expected results
  • Generate a PDF Pack for a record → PDF Pack generates successfully

If any of these tests fail, use the rollback procedure below before disabling maintenance mode.


Step 7: Disable Maintenance Mode

Once admin verification is complete:

  1. Open the Function App for the environment
  2. Go to Configuration → Application settings
  3. Set MAINTENANCE_MODE to false (or remove the setting entirely)
  4. Save changes and allow the app to restart
  5. Verify that users can access the application normally

Step 8: Run Full Smoke Tests

After maintenance mode is disabled, run the full smoke tests to verify Brief Connect functions correctly for all users.

Be prepared to rollback

If critical issues are discovered during smoke tests, you may need to re-enable maintenance mode and execute the rollback procedure.

Configuration and access

  • Log in as a regular user → Dashboard loads successfully with appropriate records visible
  • As a proxy user, switch to view another user's dashboard → Proxy user can see the assigned user's records

Permissions

  • As a regular user, attempt to open a record you have access to → Record opens with appropriate edit/view permissions
  • As a regular user, attempt to open a record you do not have access to → Access is denied or the record is not visible

Core functionality

  • Apply filters to the dashboard → Filters work correctly and display expected records
  • Change columns visible in the dashboard → Column changes are saved and displayed
  • Export records to Excel → Excel export generates successfully

For a comprehensive list of smoke tests, see Basic testing of Brief Connect.

Run customer-specific verification tests

Prepare your verification tests in advance

We recommend preparing a list of customer-specific verification tests before the migration window. This ensures you can quickly verify critical functionality without extending the change window.

In addition to the standard smoke tests above, you should verify any functionality specific to your Brief Connect configuration:

  • Custom workflows: Test that your configured workflows behave as expected
  • Record types and subtypes: Verify records of each type can be opened and edited
  • Custom fields and classifications: Confirm custom field values are displayed correctly
  • Integrations: Test any integrations with external systems (e.g., webhooks, APIs)
  • Reporting: Verify any reporting or Power BI connections function correctly

Document the results of your verification tests.

Rollback procedure

If critical issues are discovered and cannot be resolved, you can roll back to the previous configuration.

Rollback will discard migrated data

Rolling back reverts the environment to use Azure Table Storage. Any data written to MongoDB after the swap will not be available.

To roll back:

  1. Re-enable maintenance mode:

    1. Open the Function App for the environment
    2. Go to Configuration → Application settings
    3. Set MAINTENANCE_MODE to true
    4. Save changes and allow the app to restart
  2. Revert Power BI models to the previous configuration:

    E2 Support Team Responsibility

    The E2 support team will handle reverting the Power BI models to the previous Azure Table Storage configuration as part of the rollback process.

  3. Go to Azure DevOps, open your Brief Connect project

  4. In the left panel navigation, go to Pipelines → Library → Variable groups
  5. Locate the variable group for the target environment
  6. Revert the database connection string to the previous Azure Storage connection string (changes made in 'Step 4: Swap to MongoDB')
  7. Save changes
  8. Re-run the Deploy Brief Connect Application release pipeline to apply the reverted configuration
  9. Once the deployment completes, verify the application is functioning correctly with the original data
  10. Disable maintenance mode

Contact the E2 support team if you need assistance with the rollback procedure.


Next steps

After completing Phase 2, proceed to Phase 3: Enable MongoDB Dashboard (Beta) in the v2.5 Release Notes.