From c32cadb278ec995510c171b9eee8e60db45f3da5 Mon Sep 17 00:00:00 2001 From: Grace Rehn Date: Thu, 2 Apr 2026 09:12:42 +1000 Subject: [PATCH 1/2] Make updating installation id a required step for cloned instances --- .../administration/upgrading/guide/creating-test-instance.mdx | 1 + .../upgrade/upgrade-migrating-instances.include.md | 3 ++- .../upgrade/upgrade-update-installation-id.include.md | 4 ++++ 3 files changed, 7 insertions(+), 1 deletion(-) diff --git a/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx b/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx index 3e5e60e86e..ea82f3dfa1 100644 --- a/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx +++ b/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx @@ -81,6 +81,7 @@ Creating a clone of an existing instance involves: 1. Downloading the same version of Octopus Deploy as your main instance. 1. Installing that version on a new server and configuring it to point to the cloned database. 1. Copying all the files from the backed up folders from the source instance. +1. [**Updating the Installation ID**](#update-the-instance-id) on the cloned instance (required). 1. *Optionally*, disabling targets on the cloned instance. 1. Download the latest version of Octopus Deploy. 1. Upgrade the test instance to the latest version of Octopus Deploy. diff --git a/src/shared-content/upgrade/upgrade-migrating-instances.include.md b/src/shared-content/upgrade/upgrade-migrating-instances.include.md index 4277421fa3..94db164d1b 100644 --- a/src/shared-content/upgrade/upgrade-migrating-instances.include.md +++ b/src/shared-content/upgrade/upgrade-migrating-instances.include.md @@ -2,10 +2,11 @@ It will be possible to run both the old and cloned instances side by side. Both of them can deploy to the same targets (assuming you are not using polling Tentacles). But there are a few items to keep in mind. 
-- The Octopus Server is tightly coupled with Calamari. Deploying to the same target from both servers will result in Calamari getting upgraded/downgraded a lot. 
+- The Octopus Server is tightly coupled with Calamari. Deploying to the same target from both servers will result in Calamari getting upgraded/downgraded a lot.
 - The newer Octopus Server will prompt you to upgrade the Tentacles. While running both instances side by side, you will want to avoid this.
 - Unless the cloned instance has the same domain name, polling Tentacles will not connect to the cloned instance. A clone of the polling Tentacles might need to be created.
 - The thumbprints for certificates and other sensitive items are stored in the Octopus Deploy database. Cloning the database copies those values.
+- **You must update the Installation ID on the cloned instance.** Cloning copies the Installation ID from the original, which means both instances will report [telemetry](/docs/security/outbound-requests/telemetry) under the same identifier. This corrupts usage data. See [Creating a test instance](/docs/administration/upgrading/guide/creating-test-instance) for the SQL script to generate a new Installation ID.
 
 ### Considerations
 
diff --git a/src/shared-content/upgrade/upgrade-update-installation-id.include.md b/src/shared-content/upgrade/upgrade-update-installation-id.include.md
index 43c79767b4..f5ecc979ba 100644
--- a/src/shared-content/upgrade/upgrade-update-installation-id.include.md
+++ b/src/shared-content/upgrade/upgrade-update-installation-id.include.md
@@ -1,5 +1,9 @@
 ### Update the Instance ID
 
+:::div{.warning}
+**You must update the Installation ID after cloning.** Failing to do so means your cloned instance will report telemetry under the same ID as your original instance. This corrupts our usage data and prevents us from accurately understanding how many installations exist. Please do not skip this step. 
+:::
+
 Cloning an instance copies the unique Installation ID of your original instance. A few integrations use this ID to identify the instance, and it is included in telemetry reports.
 
 You can run this SQL script on your cloned instance database to generate a new unique Installation ID.

From cc952b60e325ad4eee6efb035b2a259597349b6f Mon Sep 17 00:00:00 2001
From: Grace Rehn
Date: Thu, 2 Apr 2026 13:14:16 +1000
Subject: [PATCH 2/2] Linting

---
 .../upgrading/guide/creating-test-instance.mdx        |  2 +-
 .../upgrade/upgrade-migrating-instances.include.md    | 10 +++++-----
 .../upgrade/upgrade-update-installation-id.include.md |  8 ++++----
 3 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx b/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx
index ea82f3dfa1..e7806453e9 100644
--- a/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx
+++ b/src/pages/docs/administration/upgrading/guide/creating-test-instance.mdx
@@ -81,7 +81,7 @@ Creating a clone of an existing instance involves:
 1. Downloading the same version of Octopus Deploy as your main instance.
 1. Installing that version on a new server and configuring it to point to the cloned database.
 1. Copying all the files from the backed up folders from the source instance.
-1. [**Updating the Installation ID**](#update-the-instance-id) on the cloned instance (required).
+1. [**Updating the Installation ID**](/docs/administration/upgrading/guide/creating-test-instance#update-the-instance-id) on the cloned instance (required).
 1. *Optionally*, disabling targets on the cloned instance.
 1. Download the latest version of Octopus Deploy.
 1. Upgrade the test instance to the latest version of Octopus Deploy.
diff --git a/src/shared-content/upgrade/upgrade-migrating-instances.include.md b/src/shared-content/upgrade/upgrade-migrating-instances.include.md
index 94db164d1b..82d896270b 100644
--- a/src/shared-content/upgrade/upgrade-migrating-instances.include.md
+++ b/src/shared-content/upgrade/upgrade-migrating-instances.include.md
@@ -1,4 +1,4 @@
-### Migrating to a new instance 
+### Migrating to a new instance
 
 It will be possible to run both the old and cloned instances side by side. Both of them can deploy to the same targets (assuming you are not using polling Tentacles). But there are a few items to keep in mind.
 
@@ -12,10 +12,10 @@ It will be possible to run both the old and cloned instances side by side. Both
 
 As you migrate your instance, here are a few items to consider.
 
-1. Will the new instance's domain name be the same, or will it change? For example, will it change from https://octopusdeploy.mydomain.com to https://octopus.mydomain.com. If it changes and you are using polling Tentacles, you will need to create new Tentacle instances for the new Octopus Deploy instance.
+1. Will the new instance's domain name be the same, or will it change? For example, will it change from `https://octopusdeploy.mydomain.com` to `https://octopus.mydomain.com`? If it changes and you are using polling Tentacles, you will need to create new Tentacle instances for the new Octopus Deploy instance.
 2. What CI, or build servers, integrate with Octopus Deploy? Do those plug-ins need to be updated? You can find several of the plug-ins on the [downloads page](https://octopus.com/downloads).
 3. Do you have any internally developed tools or scripts that invoke the Octopus API? We've done our best to maintain backward compatibility, but there might be some changes.
-4. What components do you use the most? What does a testing plan look like? 
+4. What components do you use the most? What does a testing plan look like?
 5. 
Chances are there are new features and functionality you haven't been exposed to. How will you train people on the new functionality? If unsure, please [contact us](https://octopus.com/support) to get pointed in the right direction. ### Drift concerns @@ -30,7 +30,7 @@ If you are unsure how long the migration will take, consider setting up a test i ### Polling Tentacles -A Polling Tentacle can only connect to one Octopus Deploy instance. It connects via DNS name or IP address. If the new instance's DNS name changes - for example, the old instance was https://octopusdeploy.mydomain.com with the new instance set to https://octopus.mydomain.com - you'll need to clone each Polling Tentacle instance. +A Polling Tentacle can only connect to one Octopus Deploy instance. It connects via DNS name or IP address. If the new instance's DNS name changes - for example, the old instance was `https://octopusdeploy.mydomain.com` with the new instance set to `https://octopus.mydomain.com` - you'll need to clone each Polling Tentacle instance. Each Polling Tentacle will need to be cloned on each deployment target. To make things easier, we have provided [this script](https://github.com/OctopusDeployLabs/SpaceCloner/blob/master/CloneTentacleInstance.ps1) to help clone a Tentacle instance. That script will look at the source instance, determine the roles, environments, and tenants, then create a cloned Tentacle and register that cloned Tentacle with your cloned instance. @@ -44,4 +44,4 @@ Cutting over from the old instance to the new instance will require a bit of dow 1. Enable maintenance mode on the old instance to put it into read-only mode. 1. Ensure all CI servers are pointing to the new instance (or change DNS). -1. You don't have to upgrade Tentacles right away. Newer versions of Octopus Deploy [can communicate with older versions of Tentacles](/docs/support/compatibility). 
You can upgrade a set at a time instead of upgrading everything, starting in 2020.x you can perform a search on the deployment target page and update only the returned Tentacles. Or, you can [upgrade Tentacles per environment](https://www.youtube.com/watch?v=KVxdSdYAqQU&t=352s).
\ No newline at end of file
+1. You don't have to upgrade Tentacles right away. Newer versions of Octopus Deploy [can communicate with older versions of Tentacles](/docs/support/compatibility). You can upgrade a set at a time instead of upgrading everything. Starting in 2020.x, you can perform a search on the deployment target page and update only the returned Tentacles. Or, you can [upgrade Tentacles per environment](https://www.youtube.com/watch?v=KVxdSdYAqQU&t=352s).
diff --git a/src/shared-content/upgrade/upgrade-update-installation-id.include.md b/src/shared-content/upgrade/upgrade-update-installation-id.include.md
index f5ecc979ba..115105401b 100644
--- a/src/shared-content/upgrade/upgrade-update-installation-id.include.md
+++ b/src/shared-content/upgrade/upgrade-update-installation-id.include.md
@@ -1,4 +1,4 @@
-### Update the Instance ID
+### Update the Instance ID {#update-the-instance-id}
 
 :::div{.warning}
 **You must update the Installation ID after cloning.** Failing to do so means your cloned instance will report telemetry under the same ID as your original instance. This corrupts our usage data and prevents us from accurately understanding how many installations exist. Please do not skip this step. 
 
@@ -27,9 +27,9 @@ PRINT 'The old Installation Id is ' + @oldguid + ' - Save this value'
 PRINT 'The new Installation Id will be ' + @newguid
 
 IF @dryRun = 1
-    PRINT 'This is a dry run, no update is occurring. Set @dryrun to 0 to update the Installation Id.' 
+    PRINT 'This is a dry run, no update is occurring. Set @dryRun to 0 to update the Installation Id.'
 ELSE
-    PRINT 'The Installation Id is being updated. Restart your Octopus Server service for this change to take effect.' 
+ PRINT 'The Installation Id is being updated. Restart your Octopus Server service for this change to take effect.' UPDATE dbo.Configuration SET [JSON] = @config @@ -39,4 +39,4 @@ WHERE Id = 'upgradeavailability' :::div{.hint} The script is set to do a dry run of what will change. Change @dryRun to 0 to make the change on your instance. -::: \ No newline at end of file +:::
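Reviewer note: for anyone evaluating this patch series without the full include file, the Installation ID script that the new warning points to has roughly the shape sketched below. This is a reconstruction from the fragments visible in the hunks above, not a copy of the real script: the variable names (`@dryRun`, `@oldguid`, `@newguid`, `@config`) and the `UPDATE` target (`dbo.Configuration`, `Id = 'upgradeavailability'`) come from the diff, while the `JSON_VALUE`/`JSON_MODIFY` handling and the `$.InstallationId` property path are assumptions (and would require SQL Server 2016 or later).

```sql
-- Sketch only. Names marked in the hunks above are reused; the JSON property
-- path '$.InstallationId' is an assumption about the configuration document.
DECLARE @dryRun BIT = 1;  -- dry run by default; set to 0 to apply the change

DECLARE @newguid NVARCHAR(36) = CONVERT(NVARCHAR(36), NEWID());
DECLARE @config NVARCHAR(MAX) =
    (SELECT [JSON] FROM dbo.Configuration WHERE Id = 'upgradeavailability');
DECLARE @oldguid NVARCHAR(36) = JSON_VALUE(@config, '$.InstallationId');

-- Swap the cloned Installation ID for a freshly generated GUID.
SET @config = JSON_MODIFY(@config, '$.InstallationId', @newguid);

PRINT 'The old Installation Id is ' + @oldguid + ' - Save this value';
PRINT 'The new Installation Id will be ' + @newguid;

IF @dryRun = 1
    PRINT 'This is a dry run, no update is occurring. Set @dryRun to 0 to update the Installation Id.';
ELSE
BEGIN
    PRINT 'The Installation Id is being updated. Restart your Octopus Server service for this change to take effect.';
    UPDATE dbo.Configuration SET [JSON] = @config WHERE Id = 'upgradeavailability';
END
```

After running it with `@dryRun = 0` and restarting the Octopus Server service, the change can be confirmed with `SELECT [JSON] FROM dbo.Configuration WHERE Id = 'upgradeavailability'` on the cloned database.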