docs: add migration guide for v1.6 (#2010)

Co-authored-by: Matthias Nannt <mail@matthiasnannt.com>
This commit is contained in:
Shubham Palriwala
2024-02-23 22:11:38 +05:30
committed by GitHub
parent f19e1960b7
commit 241716b4f3
5 changed files with 402 additions and 8 deletions

View File

@@ -1,7 +1,7 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
# dependencies
**/node_modules
# **/node_modules
.pnp
.pnp.js
.pnpm-store/

View File

@@ -8,7 +8,92 @@ export const metadata = {
# Migration Guide
## v1.1 -> v1.2
## v1.6
Formbricks v1.6 comes with big new features like Advanced Targeting & Segmentation of your end users, along with on-the-fly triggers for surveys and a ton of stability improvements. It also involves a few changes to our environment variables. This guide will help you migrate your existing Formbricks instance to v1.6 without any hassle or build errors.
<Note>
This upgrade requires a **data migration**. Please make sure to back up your database before proceeding with
the upgrade, and follow the steps below carefully to upgrade your Formbricks instance to v1.6.
</Note>
### Steps to Migrate
This guide is for users who are self-hosting Formbricks using our one-click setup. If you are using a different setup, you might need to adjust the commands accordingly.
To run all these steps, please navigate to the `formbricks` folder where your `docker-compose.yml` file is located.
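For the default one-click setup, that usually just means changing into the folder created during installation. A minimal sketch, assuming the default folder name:
<Col>
<CodeGroup title="Navigate to the Formbricks folder">
```bash
# Adjust the path if you cloned or installed Formbricks elsewhere
cd formbricks
ls docker-compose.yml
```
</CodeGroup>
</Col>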
1. **Backup your Database**: This is a crucial step, so please make sure to back up your database before proceeding with the upgrade. You can use the following command (a restore sketch follows this step in case you ever need to roll back):
<Col>
<CodeGroup title="Backup Postgres">
```bash
docker exec formbricks-quickstart-postgres-1 pg_dump -U postgres -d formbricks > formbricks_pre_v1.6_$(date +%Y%m%d_%H%M%S).sql
```
</CodeGroup>
</Col>
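If you ever need to roll back, restoring from this dump could look like the sketch below. The container name, database name, and dump filename are assumptions based on the backup command above; restoring into a non-empty database may require dropping and recreating it first.
<Col>
<CodeGroup title="Restore Postgres (sketch)">
```bash
# Replace <timestamp> with the suffix of the dump file created in the backup step
cat formbricks_pre_v1.6_<timestamp>.sql | docker exec -i formbricks-quickstart-postgres-1 psql -U postgres -d formbricks
```
</CodeGroup>
</Col>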
2. Stop the running Formbricks instance & remove the related containers:
<Col>
<CodeGroup title="Stop the containers">
```bash
docker-compose down
```
</CodeGroup>
</Col>
3. Restarting the containers will automatically pull the latest version of Formbricks (a quick check to confirm they came back up follows this step):
<Col>
<CodeGroup title="Restart the containers">
```bash
docker-compose up -d
```
</CodeGroup>
</Col>
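To confirm the containers came back up, a quick check could look like this; the name filter is an assumption based on the one-click setup's default container names.
<Col>
<CodeGroup title="Check the containers (sketch)">
```bash
# List the running Formbricks containers and follow their logs
docker ps --filter "name=formbricks"
docker-compose logs -f
```
</CodeGroup>
</Col>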
4. Now let's migrate the data to the latest schema:
<Note>To find the Docker network name for your Postgres database, run `docker network ls`; a filtered example follows this step.</Note>
<Col>
<CodeGroup title="Migrate the data">
```bash
docker run --rm \
--network=formbricks_default \
-e DATABASE_URL="postgresql://postgres:postgres@postgres:5432/formbricks?schema=public" \
-e UPGRADE_TO_VERSION="v1.6" \
ghcr.io/formbricks/data-migrations:v1.6
```
</CodeGroup>
</Col>
The above command migrates your existing data to the new schema. Changes are only written to the database if the script runs successfully, and the script can safely be run multiple times.
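If you are unsure which value to pass to `--network`, a filtered lookup might look like this; the `formbricks` filter is an assumption based on the one-click setup, which typically creates a network called `formbricks_default`.
<Col>
<CodeGroup title="Find the Docker network (sketch)">
```bash
docker network ls --filter "name=formbricks"
```
</CodeGroup>
</Col>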
5. That's it! Once the migration is complete, you can **now access your Formbricks instance** at the same URL as before.
### In-App Surveys with @formbricks/js
If you are using the `@formbricks/js` package, please make sure to update it to version 1.6.0 to use the latest features and improvements.
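Depending on your package manager, the update could look like one of the following; pin the exact version or a compatible range as your project requires.
<Col>
<CodeGroup title="Update @formbricks/js">
```bash
npm install @formbricks/js@1.6.0
# or
pnpm add @formbricks/js@1.6.0
# or
yarn add @formbricks/js@1.6.0
```
</CodeGroup>
</Col>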
### Deprecated Environment Variables
| Environment Variable | Comments |
| -------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| GITHUB_AUTH_ENABLED | Was used to enable GitHub OAuth, but from v1.6, you can just set the `GITHUB_ID` and `GITHUB_SECRET` environment variables. |
| GOOGLE_AUTH_ENABLED | Was used to enable Google OAuth, but from v1.6, you can just set the `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` environment variables. |
| AZUREAD_AUTH_ENABLED | Was used to enable AzureAD OAuth, but from v1.6, you can just set the `AZUREAD_CLIENT_ID`, `AZUREAD_CLIENT_SECRET` & `AZUREAD_TENANT_ID` environment variables. |
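As an illustration of the table above, a GitHub OAuth configuration that previously needed the enable flag now only needs the credentials. A sketch with placeholder values:
<Col>
<CodeGroup title="GitHub OAuth environment variables (sketch)">
```bash
# Before v1.6 (in your environment configuration; placeholder values)
GITHUB_AUTH_ENABLED=1
GITHUB_ID="<your-github-oauth-client-id>"
GITHUB_SECRET="<your-github-oauth-client-secret>"

# From v1.6 on, the enable flag can simply be dropped
GITHUB_ID="<your-github-oauth-client-id>"
GITHUB_SECRET="<your-github-oauth-client-secret>"
```
</CodeGroup>
</Col>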
## v1.2
Formbricks v1.2 ships a lot of features targeting our Link Surveys. We have also improved our security posture to be as robust as ever. However, it also comes with a few breaking changes specifically with the environment variables. This guide will help you migrate your existing Formbricks instance to v1.2 without any hassles or build errors.
@@ -25,7 +110,7 @@ Formbricks v1.2 ships a lot of features targeting our Link Surveys. We have also
| -------------------- | ------------------------------------------------------------------------- |
| SURVEY_BASE_URL | The WEBAPP_URL is now used to determine the survey base url in all places |
## v1.0 -> v1.1
## v1.1
Formbricks v1.1 includes a lot of new features and improvements. However, it also comes with a few breaking changes specifically with the environment variables. This guide will help you migrate your existing Formbricks instance to v1.1 without losing any data.
@@ -139,6 +224,7 @@ x-environment: &environment
# GOOGLE_CLIENT_SECRET:
```
</CodeGroup>
</Col>
Did we miss something? Are you still facing issues migrating your app? [Join our Discord!](https://formbricks.com/discord) We'd be happy to help!

View File

@@ -42,6 +42,10 @@ RUN pnpm install
RUN pnpm post-install --filter=web...
RUN pnpm turbo run build --filter=web...
# Transpile TypeScript migration script to JavaScript
RUN cd packages/database && pnpm generate
RUN npx tsc packages/database/migrations/20240207041922_advanced_targeting/data-migration.ts --esModuleInterop --outDir packages/database/migrations > /dev/null || true
# Runner stage: Setting up the runtime environment
FROM node:20-alpine AS runner
RUN corepack enable && corepack prepare pnpm@latest --activate
@@ -59,8 +63,19 @@ COPY --from=installer /app/apps/web/package.json .
COPY --from=installer --chown=nextjs:nextjs /app/apps/web/.next/standalone ./
COPY --from=installer --chown=nextjs:nextjs /app/apps/web/.next/static ./apps/web/.next/static
COPY --from=installer --chown=nextjs:nextjs /app/apps/web/public ./apps/web/public
COPY --from=installer --chown=nextjs:nextjs /app/packages/database/schema.prisma ./packages/database/schema.prisma
COPY --from=installer --chown=nextjs:nextjs /app/packages/database/migrations ./packages/database/migrations
# COPY --from=installer --chown=nextjs:nextjs /app/packages/database/schema.prisma ./packages/database/schema.prisma
# Copy node_modules from installer stage
# COPY --from=installer --chown=nextjs:nextjs /app/node_modules ./node_modules
RUN npm install --no-save @paralleldrive/cuid2
COPY --from=installer --chown=nextjs:nextjs /app/node_modules/prisma/ ./node_modules/prisma/
COPY --from=installer --chown=nextjs:nextjs /app/node_modules/@prisma/ ./node_modules/@prisma/
COPY --from=installer --chown=nextjs:nextjs /app/packages/database ./packages/database
COPY --from=installer /app/docker/cronjobs /app/docker/cronjobs
EXPOSE 3000

View File

@@ -4,12 +4,12 @@ version: "3.3"
x-webapp-url: &webapp_url http://localhost:3000
# PostgreSQL DB for Formbricks to connect to
x-database-url: &database_url
x-database-url: &database_url postgresql://postgres:postgres@postgres:5432/formbricks?schema=public
# NextJS Auth
# @see: https://next-auth.js.org/configuration/options#nextauth_secret
# You can use: `openssl rand -hex 32` to generate one
x-nextauth-secret: &nextauth_secret
x-nextauth-secret: &nextauth_secret luJthrnoDpVgGakjVYlccsZ1FdlwxIWogWIsrxzoQ6E=
# Set this to your public-facing URL, e.g., https://example.com
# You do not need the NEXTAUTH_URL environment variable in Vercel.
@@ -17,7 +17,7 @@ x-nextauth-url: &nextauth_url http://localhost:3000
# Encryption key
# You can use: `openssl rand -hex 32` to generate one
x-encryption-key: &encryption_key
x-encryption-key: &encryption_key b19a492fe2a9c01debe543f945d8481728e126904f5b54acc53eb0936748fb02
x-mail-from: &mail_from
x-smtp-host: &smtp_host

View File

@@ -0,0 +1,293 @@
"use strict";
var __awaiter =
(this && this.__awaiter) ||
function (thisArg, _arguments, P, generator) {
function adopt(value) {
return value instanceof P
? value
: new P(function (resolve) {
resolve(value);
});
}
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) {
try {
step(generator.next(value));
} catch (e) {
reject(e);
}
}
function rejected(value) {
try {
step(generator["throw"](value));
} catch (e) {
reject(e);
}
}
function step(result) {
result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected);
}
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
var __generator =
(this && this.__generator) ||
function (thisArg, body) {
var _ = {
label: 0,
sent: function () {
if (t[0] & 1) throw t[1];
return t[1];
},
trys: [],
ops: [],
},
f,
y,
t,
g;
return (
(g = { next: verb(0), throw: verb(1), return: verb(2) }),
typeof Symbol === "function" &&
(g[Symbol.iterator] = function () {
return this;
}),
g
);
function verb(n) {
return function (v) {
return step([n, v]);
};
}
function step(op) {
if (f) throw new TypeError("Generator is already executing.");
while ((g && ((g = 0), op[0] && (_ = 0)), _))
try {
if (
((f = 1),
y &&
(t =
op[0] & 2
? y["return"]
: op[0]
? y["throw"] || ((t = y["return"]) && t.call(y), 0)
: y.next) &&
!(t = t.call(y, op[1])).done)
)
return t;
if (((y = 0), t)) op = [op[0] & 2, t.value];
switch (op[0]) {
case 0:
case 1:
t = op;
break;
case 4:
_.label++;
return { value: op[1], done: false };
case 5:
_.label++;
y = op[1];
op = [0];
continue;
case 7:
op = _.ops.pop();
_.trys.pop();
continue;
default:
if (!((t = _.trys), (t = t.length > 0 && t[t.length - 1])) && (op[0] === 6 || op[0] === 2)) {
_ = 0;
continue;
}
if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) {
_.label = op[1];
break;
}
if (op[0] === 6 && _.label < t[1]) {
_.label = t[1];
t = op;
break;
}
if (t && _.label < t[2]) {
_.label = t[2];
_.ops.push(op);
break;
}
if (t[2]) _.ops.pop();
_.trys.pop();
continue;
}
op = body.call(thisArg, _);
} catch (e) {
op = [6, e];
y = 0;
} finally {
f = t = 0;
}
if (op[0] & 5) throw op[1];
return { value: op[0] ? op[1] : void 0, done: true };
}
};
Object.defineProperty(exports, "__esModule", { value: true });
var cuid2_1 = require("@paralleldrive/cuid2");
var client_1 = require("@prisma/client");
var prisma = new client_1.PrismaClient();
function main() {
return __awaiter(this, void 0, void 0, function () {
var _this = this;
return __generator(this, function (_a) {
switch (_a.label) {
case 0:
return [
4 /*yield*/,
prisma.$transaction(function (tx) {
return __awaiter(_this, void 0, void 0, function () {
var allSurveysWithAttributeFilters;
var _this = this;
return __generator(this, function (_a) {
switch (_a.label) {
case 0:
return [
4 /*yield*/,
prisma.survey.findMany({
where: {
attributeFilters: {
some: {},
},
},
include: {
attributeFilters: { include: { attributeClass: true } },
},
}),
];
case 1:
allSurveysWithAttributeFilters = _a.sent();
if (
!(allSurveysWithAttributeFilters === null || allSurveysWithAttributeFilters === void 0
? void 0
: allSurveysWithAttributeFilters.length)
) {
// stop the migration if there are no surveys with attribute filters
return [2 /*return*/];
}
allSurveysWithAttributeFilters.forEach(function (survey) {
return __awaiter(_this, void 0, void 0, function () {
var attributeFilters, filters;
return __generator(this, function (_a) {
switch (_a.label) {
case 0:
attributeFilters = survey.attributeFilters;
// if there are no attribute filters, we can skip this survey
if (
!(attributeFilters === null || attributeFilters === void 0
? void 0
: attributeFilters.length)
) {
return [2 /*return*/];
}
filters = attributeFilters.map(function (filter, idx) {
var attributeClass = filter.attributeClass;
var resource;
// if the attribute class is userId, we need to create a user segment with the person filter
if (
attributeClass.name === "userId" &&
attributeClass.type === "automatic"
) {
resource = {
id: (0, cuid2_1.createId)(),
root: {
type: "person",
personIdentifier: "userId",
},
qualifier: {
operator: filter.condition,
},
value: filter.value,
};
} else {
resource = {
id: (0, cuid2_1.createId)(),
root: {
type: "attribute",
attributeClassName: attributeClass.name,
},
qualifier: {
operator: filter.condition,
},
value: filter.value,
};
}
var attributeSegment = {
id: filter.id,
connector: idx === 0 ? null : "and",
resource: resource,
};
return attributeSegment;
});
return [
4 /*yield*/,
tx.segment.create({
data: {
title: "".concat(survey.id),
description: "",
isPrivate: true,
filters: filters,
surveys: {
connect: {
id: survey.id,
},
},
environment: {
connect: {
id: survey.environmentId,
},
},
},
}),
];
case 1:
_a.sent();
return [2 /*return*/];
}
});
});
});
// delete all attribute filters
return [4 /*yield*/, tx.surveyAttributeFilter.deleteMany({})];
case 2:
// delete all attribute filters
_a.sent();
return [2 /*return*/];
}
});
});
}),
];
case 1:
_a.sent();
return [2 /*return*/];
}
});
});
}
main()
.catch(function (e) {
return __awaiter(void 0, void 0, void 0, function () {
return __generator(this, function (_a) {
console.error(e);
process.exit(1);
return [2 /*return*/];
});
});
})
.finally(function () {
return __awaiter(void 0, void 0, void 0, function () {
return __generator(this, function (_a) {
switch (_a.label) {
case 0:
return [4 /*yield*/, prisma.$disconnect()];
case 1:
return [2 /*return*/, _a.sent()];
}
});
});
});