MongoDB Infrastructure Upgrade and Data Migration

This guide covers the infrastructure upgrade and data migration steps required for Brief Connect v2.5.0. This is Phase 2 of the three-phase upgrade process.

Work with the E2 Support Team

Please complete this phase in collaboration with the E2 support team. Our team will guide you through each step and ensure a successful migration.

Change Window Required

This phase requires a change window of several hours. The change window begins when you enable maintenance mode (step 2) and ends when Phase 2 is successfully complete.

Plan to perform this upgrade outside of core usage hours.


Overview

Phase 2 consists of the following steps:

  1. Deploy MongoDB Infrastructure
  2. Enable Maintenance Mode and Migration Mode
  3. Run Data Migration
  4. Swap to MongoDB
  5. Update Power BI Models
  6. Verify Migration Success (admin verification while in maintenance mode)
  7. Disable Maintenance Mode
  8. Run Full Smoke Tests (includes rollback procedure if needed)

Maintenance mode must remain active from step 2 until step 7 is complete.


Step 1: Deploy MongoDB Infrastructure

This step updates your existing Azure Resources release pipeline to deploy the new MongoDB vCore cluster alongside your current infrastructure.

Prerequisites

E2 Team Must Update Your Azure Resources Repository

Before proceeding, the E2 support team must provide you with an updated Azure Resources repository containing the new Bicep templates for MongoDB vCore provisioning. Contact the E2 support team to request this update before starting the infrastructure upgrade.

Non-Standard Infrastructure

If your environment uses hardened infrastructure with VNets, private endpoints, or was deployed using alternative Infrastructure as Code tools (e.g., Terraform, Pulumi), please work directly with the E2 support team for tailored deployment support. The steps below apply to standard Brief Connect deployments only.

1.1 Update Your Release Pipeline

Your environment already has a release pipeline for Azure resources. You need to update it to use the new deployment tasks.

  1. Open Azure DevOps and navigate to Pipelines → Releases
  2. Open the Deploy Brief Connect Azure Resources pipeline
  3. Click Edit
  4. Select the stage for your environment
  5. In the stage tasks:
    • Delete all existing tasks from this stage
    • Add the two new tasks as described below

1.2 Add PowerShell Task - Prepare Bicep Parameters

Add a PowerShell task with these settings:

  • Display name: Bicep param File
  • Type: Inline
  • Working Directory: $(System.DefaultWorkingDirectory)/$(packageFolder)

Script:

Update the following values for your environment:

  • $mainPrefix - Your environment's resource naming prefix (e.g., BC-PROD)
  • $azureRegion - Your Azure region (e.g., Australia East)
  • MongoDB cluster settings - see Deployment Architecture for recommended cluster sizes:
    • Production: M20 cluster, 64 GB storage, High Availability enabled
    • Non-production: M10 cluster, 32 GB storage, High Availability disabled
$mainPrefix = "<resource name prefix>"
$azureRegion = "Australia East"
$userAssignedIdentityName = "$($mainPrefix)-MI"
$KeyVaultName = "$($mainPrefix)-KV"
$mongoPasswordSecretName = "MongoDbAdminPassword"

@"
using './main-1.bicep'

param region = '$($azureRegion)'
param mongoPasswordSecretName = '$($mongoPasswordSecretName)'
param userAssignedIdentityName = '$($userAssignedIdentityName)'
param keyVaultName = '$($KeyVaultName)'
"@ | Out-File -FilePath "parameters-1.bicepparam" -Encoding utf8 -Force

@"
using './main-2.bicep'

var main_prefix = '$($mainPrefix)'
param region = '$($azureRegion)'
param mongoPasswordSecretName = '$($mongoPasswordSecretName)'
param userAssignedIdentityName = '$($userAssignedIdentityName)'
param keyVaultName = '$($KeyVaultName)'

param serverAppRegistrationClientId = '`${ENV:serverAppRegistrationClientId}'
param clientAppRegistrationClientId = '`${ENV:clientAppRegistrationClientId}'

param redisCacheSku = {
  name: 'Basic'
  family: 'C'
  capacity: 0
}

param apiServicePlanName = '`${main_prefix}-ASP'
param apiAppServicePlan = {
  name: 'P0v3'
  tier: 'Premium0V3'
}

param useApiAppServicePlanForFunctionApp = true

param redisInstanceName = '`${main_prefix}RC'
param apiAppName = '`${ENV:azureWebApiWebAppName}'
param functionAppName = '`${ENV:azureFunctionAppName}'
param logAnalyticsWorkspaceName = '`${main_prefix}-LAW'
param applicationInsightsName = '`${main_prefix}-AI'

param storageAccountWebName = '`${toLower(main_prefix)}dsa'
param storageAccountFuncName = '`${toLower(main_prefix)}fsa'
param storageAccountDataName = '`${toLower(main_prefix)}dsa'

param mongoClusterName = '`${toLower(main_prefix)}-mongo'
param mongoAdminUsername = 'mongoadmin'

// Non-production tier
param mongoComputeTier = 'M10'
param mongoStorageSizeGb = 32
param mongoStorageType = 'PremiumSSD'
param mongoShardCount = 1 // Single shard
param mongoHighAvailabilityMode = 'Disabled'
param mongoPublicNetworkAccess = 'Enabled'
param mongoAllowAllIPs = true // Allow all IPs
param mongoAllowAzureServices = true
"@ | Out-File -FilePath "parameters-2.bicepparam" -Encoding utf8 -Force

# Print the generated parameter files for verification
Get-Content -Path "parameters-1.bicepparam"
Get-Content -Path "parameters-2.bicepparam"

1.3 Add Azure CLI Task - Deploy Resources

Add an Azure CLI task with these settings:

  • Display name: Deploy Azure Resources
  • Azure Resource Manager connection: Select your existing service connection
  • Script Type: PowerShell
  • Script Location: Inline script
  • ErrorActionPreference: Stop
  • Working Directory: $(System.DefaultWorkingDirectory)/$(packageFolder)
  • Access service principal details in script: Checked

Inline script:

$resourceGroup = "$($ENV:azureResourceGroupName)"
az deployment group create --resource-group $resourceGroup --template-file main-1.bicep --parameters parameters-1.bicepparam
az deployment group create --resource-group $resourceGroup --template-file main-2.bicep --parameters parameters-2.bicepparam

1.4 Link Variable Group

  1. Go to the Variables tab in the pipeline editor
  2. Select Variable groups
  3. Click Link variable group
  4. Select your existing environment variable group and link it to the stage

1.5 Run the Pipeline

  1. Save the pipeline
  2. Create a new release for your environment stage only
  3. Wait for the deployment to complete
  4. Verify the MongoDB vCore cluster is provisioned in your resource group
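Provisioning can also be confirmed from the command line. A minimal sketch using the generic `az resource` command; the resource group name is a placeholder for your environment, and the cluster name should match the mongoClusterName parameter from step 1.2:

```shell
# Placeholder name; substitute your environment's resource group.
RG="BC-PROD-RG"

# List MongoDB vCore clusters in the resource group and show their
# provisioning state (expect "Succeeded" once deployment completes).
az resource list \
  --resource-group "$RG" \
  --resource-type "Microsoft.DocumentDB/mongoClusters" \
  --query "[].{name:name, state:properties.provisioningState}" \
  --output table
```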

What this deploys

The pipeline will provision a new MongoDB vCore cluster alongside your existing resources. Your existing Azure resources (Function App, Key Vault, Storage Accounts, etc.) will be updated in place — they will not be recreated or deleted.


Step 2: Enable Maintenance Mode and Migration Mode

Before migrating to MongoDB, place your Brief Connect environment into maintenance mode to prevent data loss during the migration.

Follow the steps in Maintenance Mode to enable maintenance mode:

  1. Open the Function App for the environment
  2. Go to Configuration → Application settings
  3. Add or update the MAINTENANCE_MODE setting and set its value to true
  4. Rename the TempMongoDbConnectionString setting to MongoDbMigrationConnectionString without changing its value
  5. Save changes and allow the app to restart
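The portal steps above can also be scripted. A minimal sketch using the Azure CLI from a bash shell; the app and resource group names are placeholders, and the setting is "renamed" by copying its value under the new name and deleting the old one, since app settings cannot be renamed in place:

```shell
# Placeholder names; substitute your Function App and resource group.
APP="bc-prod-func"
RG="BC-PROD-RG"

# Enable maintenance mode.
az functionapp config appsettings set --name "$APP" --resource-group "$RG" \
  --settings MAINTENANCE_MODE=true

# Copy TempMongoDbConnectionString to MongoDbMigrationConnectionString,
# then remove the old setting (the value itself is unchanged).
VALUE=$(az functionapp config appsettings list --name "$APP" --resource-group "$RG" \
  --query "[?name=='TempMongoDbConnectionString'].value" --output tsv)
az functionapp config appsettings set --name "$APP" --resource-group "$RG" \
  --settings "MongoDbMigrationConnectionString=$VALUE"
az functionapp config appsettings delete --name "$APP" --resource-group "$RG" \
  --setting-names TempMongoDbConnectionString
```

Saving application settings causes the app to restart, which covers step 5 above.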

Verify maintenance mode is active:

  • Regular users see a maintenance message and cannot access the application
  • System Administrators can still access the application, with a maintenance banner visible
  • System Administrators can see the Database Migration button in the command bar of the Admin Panel (https://[WEB_APP_URL]/#/adminPanel)

Do not proceed until maintenance mode is confirmed active

Proceeding without maintenance mode enabled may result in data loss during migration.


Step 3: Run Data Migration

Brief Connect includes a built-in migration tool in the Admin Panel to migrate data from Azure Storage to MongoDB.

Open the Data Migration panel

  1. Log into the Brief Connect environment as a System Administrator
  2. Navigate to the Admin Panel (https://[WEB_APP_URL]/#/adminPanel)
  3. In the command bar, click Database Migration

Note

The "Database Migration" button is only visible when the required migration environment variables are configured.

Start the migration

  1. In the Selection Panel, select the tables and blob containers to migrate

    • By default, all required tables and containers should be selected

    Data Migration Selection Panel

  2. Click the Start button

  3. Review the confirmation dialog showing the number of items to migrate
  4. Click Start to begin the migration

Monitor migration progress

The Data Migration panel displays real-time progress:

  • Migration Summary Card: Shows the current run ID, elapsed time, and overall state
  • Progress Display: Shows tables completed, in progress, and failed, as well as total rows migrated
  • Diagnostics Panel: Shows connection status, performance metrics, and any warnings

The panel auto-refreshes during migration. You can also click Refresh to manually update the status.

Tip

Click Show Detailed Progress to view per-table migration metrics.

Wait for completion

The migration is complete when:

  • All tables show as "Completed" in the progress display
  • All blob containers show as "Completed"
  • The overall migration state shows "Completed"

If any items fail, review the error details in the Diagnostics Panel before proceeding.


Step 4: Swap to MongoDB

  1. Open the Function App for the environment
  2. Stop the application
  3. Go to Configuration → Application settings
  4. Rename the MongoDbMigrationConnectionString setting to MongoDbConnectionString without changing its value
  5. Rename the CosmosDbConnectionString setting to OldCosmosDbConnectionString without changing its value
  6. Save changes
  7. Go to the Redis Cache instance and flush all cached data
  8. Go back to the Function App and start it
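The swap can likewise be scripted. A sketch using the Azure CLI and redis-cli from bash; all resource names are placeholders, and the cache host name assumes the redisInstanceName parameter from step 1.2:

```shell
# Placeholder names; substitute your Function App, resource group, and cache.
APP="bc-prod-func"
RG="BC-PROD-RG"
REDIS="bc-prodrc"

az functionapp stop --name "$APP" --resource-group "$RG"

# "Rename" a setting by copying its value under the new name, then
# deleting the old name (app settings cannot be renamed in place).
rename_setting() {
  local old="$1" new="$2" value
  value=$(az functionapp config appsettings list --name "$APP" --resource-group "$RG" \
    --query "[?name=='$old'].value" --output tsv)
  az functionapp config appsettings set --name "$APP" --resource-group "$RG" \
    --settings "$new=$value"
  az functionapp config appsettings delete --name "$APP" --resource-group "$RG" \
    --setting-names "$old"
}

rename_setting MongoDbMigrationConnectionString MongoDbConnectionString
rename_setting CosmosDbConnectionString OldCosmosDbConnectionString

# Flush the Redis cache (requires redis-cli 6+ for --tls).
REDIS_KEY=$(az redis list-keys --name "$REDIS" --resource-group "$RG" \
  --query primaryKey --output tsv)
redis-cli -h "$REDIS.redis.cache.windows.net" -p 6380 --tls -a "$REDIS_KEY" FLUSHALL

az functionapp start --name "$APP" --resource-group "$RG"
```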

Step 5: Update Power BI Models

After the data migration is complete, the Power BI models need to be updated to connect to the new MongoDB data source.

E2 Support Team Responsibility

The E2 support team will prepare updated Power BI models in advance of your scheduled deployment and provide support during the upgrade to complete this step. No action is required from your team — the E2 support team will handle the Power BI model update.


Step 6: Verify Migration Success

Verification is split into two phases: admin verification while maintenance mode is active, and full smoke tests after maintenance mode is disabled.

Check migration tool status

  1. In the Data Migration panel, confirm the migration state shows Completed
  2. Review the Diagnostics Panel for any errors or warnings
  3. Click Download Report to save a migration report for your records
  4. Confirm no tables or containers show a "Failed" state

If any items failed, work with the E2 support team to resolve the issues before proceeding.

Run admin verification (while in maintenance mode)

While maintenance mode is active, System Administrators can perform initial verification to confirm the migration was successful.

  • Log in as a System Administrator → Dashboard loads successfully
  • Open the Admin Panel → Admin Panel loads with all configuration options available
  • Open an existing record → Record opens and displays all metadata correctly
  • View the Documents tab on a record → All documents are listed and can be opened
  • View the Activity Log on a record → Activity history is displayed correctly
  • View the People and Roles tab → All role assignments are displayed correctly
  • Search for a record using the search box → Search returns expected results
  • Generate a PDF Pack for a record → PDF Pack generates successfully

If any of these tests fail, use the rollback procedure below before disabling maintenance mode.


Step 7: Disable Maintenance Mode

Once admin verification is complete:

  1. Open the Function App for the environment
  2. Go to Configuration → Application settings
  3. Set MAINTENANCE_MODE to false (or remove the setting entirely)
  4. Save changes and allow the app to restart
  5. Verify that users can access the application normally
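As with enabling it, maintenance mode can be disabled from the command line. A short sketch with the Azure CLI; the app and resource group names are placeholders for your environment:

```shell
# Placeholder names; substitute your Function App and resource group.
APP="bc-prod-func"
RG="BC-PROD-RG"

# Disable maintenance mode. Alternatively, remove the setting entirely with:
#   az functionapp config appsettings delete --name "$APP" --resource-group "$RG" \
#     --setting-names MAINTENANCE_MODE
az functionapp config appsettings set --name "$APP" --resource-group "$RG" \
  --settings MAINTENANCE_MODE=false
```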

Step 8: Run Full Smoke Tests

After maintenance mode is disabled, run the full smoke tests to verify Brief Connect functions correctly for all users.

Be prepared to roll back

If critical issues are discovered during smoke tests, you may need to re-enable maintenance mode and execute the rollback procedure.

Configuration and access

  • Log in as a regular user → Dashboard loads successfully with appropriate records visible
  • As a proxy user, switch to view another user's dashboard → Proxy user can see the assigned user's records

Permissions

  • As a regular user, attempt to open a record you have access to → Record opens with appropriate edit/view permissions
  • As a regular user, attempt to open a record you do not have access to → Access is denied or the record is not visible

Core functionality

  • Apply filters to the dashboard → Filters work correctly and display expected records
  • Change columns visible in the dashboard → Column changes are saved and displayed
  • Export records to Excel → Excel export generates successfully

For a comprehensive list of smoke tests, see Basic testing of Brief Connect.

Run customer-specific verification tests

Prepare your verification tests in advance

We recommend preparing a list of customer-specific verification tests before the migration window. This ensures you can quickly verify critical functionality without extending the change window.

In addition to the standard smoke tests above, you should verify any functionality specific to your Brief Connect configuration:

  • Custom workflows: Test that your configured workflows behave as expected
  • Record types and subtypes: Verify records of each type can be opened and edited
  • Custom fields and classifications: Confirm custom field values are displayed correctly
  • Integrations: Test any integrations with external systems (e.g., webhooks, APIs)
  • Reporting: Verify any reporting or Power BI connections function correctly

Document the results of your verification tests.

Rollback procedure

If critical issues are discovered and cannot be resolved, you can roll back to the previous configuration.

Rollback will discard migrated data

Rolling back reverts the environment to use Azure Table Storage. Any data written to MongoDB after the swap will not be available.

To roll back:

  1. Re-enable maintenance mode:

    1. Open the Function App for the environment
    2. Go to Configuration → Application settings
    3. Set MAINTENANCE_MODE to true
    4. Save changes and allow the app to restart
  2. Revert Power BI models to the previous configuration:

    E2 Support Team Responsibility

    The E2 support team will handle reverting the Power BI models to the previous Azure Table Storage configuration as part of the rollback process.

  3. Go to Azure DevOps, open your Brief Connect project

  4. In the left panel navigation, go to Pipelines → Library → Variable groups
  5. Locate the variable group for the target environment
  6. Revert the database connection string to the previous Azure Storage connection string
  7. Save changes
  8. Re-run the Deploy Brief Connect Application release pipeline to apply the reverted configuration
  9. Once the deployment completes, verify the application is functioning correctly with the original data
  10. Disable maintenance mode

Contact the E2 support team if you need assistance with the rollback procedure.


Next steps

After completing Phase 2, proceed to Phase 3: Enable MongoDB Dashboard (Beta) in the v2.5.0 Release Notes.