[WEB-2001]feat: Cache issues on the client (#5327)

* use common getIssues from issue service instead of multiple different services for modules and cycles

* Use SQLite to store issues locally and load issues from it.
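
The client-side cache lives in SQLite (sqlite-wasm over OPFS in the browser); the shape of the idea, sketched with Python's stdlib `sqlite3` for brevity — the table and column names here are illustrative assumptions, not the app's exact schema:

```python
import sqlite3

# In the app this runs on SQLite WASM in the browser; sqlite3 stands in here.
db = sqlite3.connect(":memory:")
db.execute(
    """CREATE TABLE issues (
        id TEXT PRIMARY KEY,
        project_id TEXT,
        name TEXT,
        priority TEXT,
        updated_at TEXT
    )"""
)
db.execute(
    "INSERT INTO issues VALUES (?, ?, ?, ?, ?)",
    ("issue-1", "proj-1", "Fix login bug", "high", "2024-09-01T10:00:00Z"),
)
# Loading issues for a project now hits the local table instead of the API.
rows = db.execute(
    "SELECT id, name FROM issues WHERE project_id = ?", ("proj-1",)
).fetchall()
print(rows)
```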

* Fix incorrect total count and filtering on assignees.

* enable parallel API calls

* chore: deleted issue list

* - Handle local mutations
- Implement getting the updates
- Use SWR to update/sync data

* Wait for sync to complete in get issues

* Fix build errors

* Fix build issue

* - Sync updates to local-db
- Fallback to server when the local data is loading
- Wait when the updates are being fetched

* Add issues in batches
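
Bulk writes go in as fixed-size batches rather than row-at-a-time; a sketch of that batching (the client uses a batch size of a few hundred rows; the schema here is simplified):

```python
import sqlite3

BATCH_SIZE = 200  # matches the BATCH_SIZE used by the client-side sync

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE issues (id TEXT PRIMARY KEY, name TEXT)")

issues = [(f"id-{i}", f"Issue {i}") for i in range(1000)]
for start in range(0, len(issues), BATCH_SIZE):
    batch = issues[start : start + BATCH_SIZE]
    # one statement per batch keeps the UI thread responsive
    # compared to issuing one INSERT per issue
    db.executemany("INSERT INTO issues VALUES (?, ?)", batch)
    db.commit()

total = db.execute("SELECT COUNT(*) FROM issues").fetchone()[0]
print(total)  # 1000
```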

* Disable skeleton loaders for first 10 issues

* Load issues in bulk

* working version of SQLite with grouped issues

* Use window queries for group by
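
"Window queries" here means fetching grouped issues in a single statement with `ROW_NUMBER() OVER (PARTITION BY …)` plus a per-group limit, instead of running one query per group; a minimal sketch (column names are assumptions):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE issues (id TEXT, state_id TEXT, sort_order REAL)")
db.executemany(
    "INSERT INTO issues VALUES (?, ?, ?)",
    [("a", "todo", 1), ("b", "todo", 2), ("c", "todo", 3),
     ("d", "done", 1), ("e", "done", 2)],
)
# First 2 issues of every group, one round trip to the database.
rows = db.execute(
    """
    SELECT id, state_id FROM (
        SELECT id, state_id,
               ROW_NUMBER() OVER (
                   PARTITION BY state_id ORDER BY sort_order
               ) AS rn
        FROM issues
    ) WHERE rn <= 2
    """
).fetchall()
print(rows)
```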

* - Fix sort by date fields
- Fix the total count

* - Fix grouping by created by
- Fix order by and limit

* fix pagination

* Fix sorting on issue priority

* - Add secondary sort order
- Fix group by priority
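
Priority is stored as text, so sorting it needs an explicit rank, with a secondary key to keep ties stable; a sketch of the CASE-based ordering (the exact rank values are an assumption):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE issues (id TEXT, priority TEXT, created_at TEXT)")
db.executemany(
    "INSERT INTO issues VALUES (?, ?, ?)",
    [("a", "low", "2024-01-02"), ("b", "urgent", "2024-01-01"),
     ("c", "none", "2024-01-03"), ("d", "urgent", "2024-01-02")],
)
rows = db.execute(
    """
    SELECT id FROM issues
    ORDER BY CASE priority
                 WHEN 'urgent' THEN 1
                 WHEN 'high'   THEN 2
                 WHEN 'medium' THEN 3
                 WHEN 'low'    THEN 4
                 ELSE 5        -- 'none' sorts last
             END,
             created_at ASC    -- secondary sort order breaks ties
    """
).fetchall()
order = [r[0] for r in rows]
print(order)  # ['b', 'd', 'a', 'c']
```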

* chore: added timestamp filter for deleted issues

* - Extract local DB into its own class
- Implement sorting by label names

* Implement subgroup by

* sub group by changes

* Refactor query constructor

* Insert or update issues instead of directly adding them.
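
Re-syncing an issue must update the existing row rather than append a duplicate; SQLite's upsert expresses that directly (schema simplified):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE issues (id TEXT PRIMARY KEY, name TEXT, updated_at TEXT)"
)

def upsert_issue(issue: dict) -> None:
    # insert a new row, or overwrite the existing one on id conflict
    db.execute(
        """
        INSERT INTO issues (id, name, updated_at)
        VALUES (:id, :name, :updated_at)
        ON CONFLICT(id) DO UPDATE SET
            name = excluded.name,
            updated_at = excluded.updated_at
        """,
        issue,
    )

upsert_issue({"id": "1", "name": "Old title", "updated_at": "2024-01-01"})
upsert_issue({"id": "1", "name": "New title", "updated_at": "2024-02-01"})
row = db.execute("SELECT COUNT(*), MAX(name) FROM issues").fetchone()
print(row)  # (1, 'New title') — one row, latest data
```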

* Segregated queries (not working yet)

* - Get filtered issues and then group them.
- Cleanup code.
- Implement order by labels.

* Fix build issues

* Remove debuggers

* remove loaders while changing sorting or applying filters

* fix loader while clearing all filters

* Fix issue with project being synced twice

* Improve project sync

* Optimize the queries

* Make create dummy data more realistic

* dev: added total pages in the global paginator

* chore: updated total_pages count

* chore: added state_group in the issues pagination

* chore: removed deleted_at from the issue pagination payload

* chore: replaced state_group with state__group

* Integrate new getIssues API, and fix sync issues bug.

* Fix issue with SWR running twice in workspace wrapper

* Fix DB initialization called when opening project for the first time.

* Add all the tables required for sorting

* Exclude description from getIssues

* Add getIssue function.

* Add only selected fields to get query.

* Fix the count query

* Minor query optimization when no joins are required.

* fetch issue description from local db

* clear local db on signout

* Correct dummy data creation

* Fix sort by assignee

* sync to local changes

* chore: added archived issues in the deleted endpoint

* Sync deletes to local db.
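
Deletes (and archivals) sync by asking the server for ids changed since the last sync — the deleted-issues endpoint added in this PR — and removing them from the local table; a sketch of the local side, with a hardcoded id list standing in for the API response:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE issues (id TEXT PRIMARY KEY)")
db.executemany("INSERT INTO issues VALUES (?)", [("a",), ("b",), ("c",)])

# ids as returned by GET .../deleted-issues/?updated_at__gt=<last sync time>
deleted_ids = ["b", "c"]

placeholders = ",".join("?" for _ in deleted_ids)
db.execute(f"DELETE FROM issues WHERE id IN ({placeholders})", deleted_ids)

remaining = [r[0] for r in db.execute("SELECT id FROM issues ORDER BY id")]
print(remaining)  # ['a']
```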

* - Add missing indexes for tables used in sorting in spreadsheet layout.
- Add options table

* Make fallback optional in getOption
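
Options are a small key-value table used for sync state and versioning; `getOption` takes an optional fallback returned when the key is absent. A Python sketch of the same pair of helpers:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE options (key TEXT PRIMARY KEY, value TEXT)")

def set_option(key: str, value: str) -> None:
    db.execute(
        "INSERT INTO options VALUES (?, ?) "
        "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
        (key, value),
    )

def get_option(key: str, fallback=None):
    # fallback is optional; None when the key is missing and none is given
    row = db.execute(
        "SELECT value FROM options WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else fallback

set_option("DB_VERSION", "1")
print(get_option("DB_VERSION"))          # 1
print(get_option("missing", "default"))  # default
print(get_option("missing"))             # None
```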

* Kanban column virtualization

* persist project sync readiness to sqlite and use that as the source of truth for the project issues to be ready

* fix build errors

* Fix calendar view

* fetch slimmed-down version of modules in project wrapper

* fetch slimmed-down modules first, then fetch complete modules

* Fix multi value order by in spreadsheet layout

* Fix sort by

* Fix the query when ordering by multi field names

* Remove unused import

* Fix sort by multi value fields

* Format queries and fix order by

* fix order by for multi-value issue fields

* fix loaders for spreadsheet

* Fallback to manual order when moving away from spreadsheet layout

* fix minor bug

* Move fix for order_by when switching from spreadsheet layout to translateQueryParams

* fix default rendering of kanban groups

* Fix none priority being saved as null

* Remove debugger statement

* Fix issue load

* chore: updated issue paginated query from  to

* Fix sub issues and start and target date filters

* Fix active and backlog filter

* Add default order by

* Update the Query param to match with backend.

* local sqlite db versioning
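
The local DB stores its own schema version; on a mismatch the cache is wiped and rebuilt rather than migrated. The gate from `Storage._initialize`, sketched with a dict standing in for OPFS storage:

```python
DB_VERSION = 1  # bumped whenever the local schema changes

def initialize(storage: dict) -> bool:
    """Sketch of the version gate in Storage._initialize (logic simplified)."""
    stored = storage.get("DB_VERSION", "")
    if stored != "" and int(stored) != DB_VERSION:
        storage.clear()             # clearStorage(): drop the cached database
        return initialize(storage)  # re-create from scratch
    storage["DB_VERSION"] = str(DB_VERSION)
    return True

cache = {"DB_VERSION": "0", "stale": "data"}
initialize(cache)
print(cache)  # {'DB_VERSION': '1'} — stale cache wiped and rebuilt
```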

* When window is hidden, do not perform any db versioning

* fix error handling and fall back to server when database errors out

* Add ability to disable local db cache

* remove db version check from getIssues function

* change db version to number and remove workspaceInitPromise in storage.sqlite

* - Sync the entire workspace in the background
- Add get sub issue method with distribution

* Make changes to get issues for sync to match backend.

* chore: handled workspace and project in v2 paginated issues

* disable issue description and title until fetched from server

* sync issues post bulk operations

* fix server error

* fix front end build

* Remove full workspace sync

* - Remove the toast message on sync.
- Update the disable local message.

* Add Hardcoded constant to disable the local db caching

* fix lint errors

* Fix order by in grouping

* update yarn lock

* fix build

* fix plane-web imports

* address review comments

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
This commit is contained in:
Satish Gandham
2024-09-24 19:01:34 +05:30
committed by GitHub
parent 8dabe839f3
commit 3df230393a
48 changed files with 2085 additions and 155 deletions


@@ -20,6 +20,7 @@ from plane.app.views import (
IssueViewSet,
LabelViewSet,
BulkArchiveIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
)
@@ -39,9 +40,9 @@ urlpatterns = [
),
name="project-issue",
),
- # updated v1 paginated issues
+ # updated v2 paginated issues
path(
- "workspaces/<str:slug>/projects/<uuid:project_id>/v2/issues/",
+ "workspaces/<str:slug>/v2/issues/",
IssuePaginatedViewSet.as_view({"get": "list"}),
name="project-issues-paginated",
),
@@ -311,4 +312,9 @@ urlpatterns = [
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/deleted-issues/",
DeletedIssuesListViewSet.as_view(),
name="deleted-issues",
),
]


@@ -114,6 +114,7 @@ from .issue.base import (
IssueViewSet,
IssueUserDisplayPropertyEndpoint,
BulkDeleteIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
)


@@ -234,11 +234,17 @@ class IssueViewSet(BaseViewSet):
@method_decorator(gzip_page)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
extra_filters = {}
if request.GET.get("updated_at__gt", None) is not None:
extra_filters = {
"updated_at__gt": request.GET.get("updated_at__gt")
}
project = Project.objects.get(pk=project_id, workspace__slug=slug)
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
- issue_queryset = self.get_queryset().filter(**filters)
+ issue_queryset = self.get_queryset().filter(**filters, **extra_filters)
# Custom ordering for priority and state
# Issue queryset
@@ -713,16 +719,43 @@ class BulkDeleteIssuesEndpoint(BaseAPIView):
)
class DeletedIssuesListViewSet(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id):
filters = {}
if request.GET.get("updated_at__gt", None) is not None:
filters = {"updated_at__gt": request.GET.get("updated_at__gt")}
deleted_issues = (
Issue.all_objects.filter(
workspace__slug=slug,
project_id=project_id,
)
.filter(Q(archived_at__isnull=False) | Q(deleted_at__isnull=False))
.filter(**filters)
.values_list("id", flat=True)
)
return Response(deleted_issues, status=status.HTTP_200_OK)
class IssuePaginatedViewSet(BaseViewSet):
def get_queryset(self):
workspace_slug = self.kwargs.get("slug")
project_id = self.kwargs.get("project_id")
# getting the project_id from the request params
project_id = self.request.GET.get("project_id", None)
issue_queryset = Issue.issue_objects.filter(
workspace__slug=workspace_slug
)
if project_id:
issue_queryset = issue_queryset.filter(project_id=project_id)
return (
Issue.issue_objects.filter(
workspace__slug=workspace_slug, project_id=project_id
issue_queryset.select_related(
"workspace", "project", "state", "parent"
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
@@ -760,17 +793,18 @@ class IssuePaginatedViewSet(BaseViewSet):
return paginated_data
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
- def list(self, request, slug, project_id):
+ def list(self, request, slug):
+ project_id = self.request.GET.get("project_id", None)
cursor = request.GET.get("cursor", None)
is_description_required = request.GET.get("description", False)
- updated_at = request.GET.get("updated_at__gte", None)
+ updated_at = request.GET.get("updated_at__gt", None)
# required fields
required_fields = [
"id",
"name",
"state_id",
"state__group",
"sort_order",
"completed_at",
"estimate_point",
@@ -787,7 +821,6 @@ class IssuePaginatedViewSet(BaseViewSet):
"updated_by",
"is_draft",
"archived_at",
"deleted_at",
"module_ids",
"label_ids",
"assignee_ids",
@@ -800,15 +833,18 @@ class IssuePaginatedViewSet(BaseViewSet):
required_fields.append("description_html")
# querying issues
- base_queryset = Issue.issue_objects.filter(
- workspace__slug=slug, project_id=project_id
- ).order_by("updated_at")
+ base_queryset = Issue.issue_objects.filter(workspace__slug=slug)
+ if project_id:
+ base_queryset = base_queryset.filter(project_id=project_id)
+ base_queryset = base_queryset.order_by("updated_at")
queryset = self.get_queryset().order_by("updated_at")
# filtering issues by greater than updated_at given by the user
if updated_at:
- base_queryset = base_queryset.filter(updated_at__gte=updated_at)
- queryset = queryset.filter(updated_at__gte=updated_at)
+ base_queryset = base_queryset.filter(updated_at__gt=updated_at)
+ queryset = queryset.filter(updated_at__gt=updated_at)
queryset = queryset.annotate(
label_ids=Coalesce(


@@ -347,7 +347,7 @@ def create_issues(workspace, project, user_id, issue_count):
)
)
- text = fake.text(max_nb_chars=60000)
+ text = fake.text(max_nb_chars=3000)
issues.append(
Issue(
state_id=states[random.randint(0, len(states) - 1)],
@@ -490,18 +490,23 @@ def create_issue_assignees(workspace, project, user_id, issue_count):
def create_issue_labels(workspace, project, user_id, issue_count):
# labels
labels = Label.objects.filter(project=project).values_list("id", flat=True)
issues = random.sample(
list(
# issues = random.sample(
# list(
# Issue.objects.filter(project=project).values_list("id", flat=True)
# ),
# int(issue_count / 2),
# )
issues = list(
Issue.objects.filter(project=project).values_list("id", flat=True)
),
int(issue_count / 2),
)
)
shuffled_labels = list(labels)
# Bulk issue
bulk_issue_labels = []
for issue in issues:
random.shuffle(shuffled_labels)
for label in random.sample(
list(labels), random.randint(0, len(labels) - 1)
shuffled_labels, random.randint(0, 5)
):
bulk_issue_labels.append(
IssueLabel(
@@ -552,25 +557,33 @@ def create_module_issues(workspace, project, user_id, issue_count):
modules = Module.objects.filter(project=project).values_list(
"id", flat=True
)
issues = random.sample(
list(
# issues = random.sample(
# list(
# Issue.objects.filter(project=project).values_list("id", flat=True)
# ),
# int(issue_count / 2),
# )
issues = list(
Issue.objects.filter(project=project).values_list("id", flat=True)
),
int(issue_count / 2),
)
)
shuffled_modules = list(modules)
# Bulk issue
bulk_module_issues = []
for issue in issues:
module = modules[random.randint(0, len(modules) - 1)]
bulk_module_issues.append(
ModuleIssue(
module_id=module,
issue_id=issue,
project=project,
workspace=workspace,
random.shuffle(shuffled_modules)
for module in random.sample(
shuffled_modules, random.randint(0, 5)
):
bulk_module_issues.append(
ModuleIssue(
module_id=module,
issue_id=issue,
project=project,
workspace=workspace,
)
)
)
# Issue assignees
ModuleIssue.objects.bulk_create(
bulk_module_issues, batch_size=1000, ignore_conflicts=True


@@ -73,7 +73,7 @@ class Command(BaseCommand):
from plane.bgtasks.dummy_data_task import create_dummy_data
- create_dummy_data.delay(
+ create_dummy_data(
slug=workspace_slug,
email=creator,
members=members,


@@ -1,3 +1,6 @@
# python imports
from math import ceil
# constants
PAGINATOR_MAX_LIMIT = 1000
@@ -36,6 +39,9 @@ def paginate(base_queryset, queryset, cursor, on_result):
total_results = base_queryset.count()
page_size = min(cursor_object.current_page_size, PAGINATOR_MAX_LIMIT)
# getting the total pages available based on the page size
total_pages = ceil(total_results / page_size)
# Calculate the start and end index for the paginated data
start_index = 0
if cursor_object.current_page > 0:
@@ -72,6 +78,7 @@ def paginate(base_queryset, queryset, cursor, on_result):
"next_page_results": next_page_results,
"page_count": len(paginated_data),
"total_results": total_results,
"total_pages": total_pages,
"results": paginated_data,
}
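
The hunk above adds `total_pages = ceil(total_results / page_size)` to the cursor paginator's response; the arithmetic in brief:

```python
from math import ceil

PAGINATOR_MAX_LIMIT = 1000

def page_stats(total_results: int, requested_page_size: int, current_page: int):
    # page size is capped at the paginator's max limit
    page_size = min(requested_page_size, PAGINATOR_MAX_LIMIT)
    # total pages available based on the page size
    total_pages = ceil(total_results / page_size)
    # start index for the paginated slice
    start_index = current_page * page_size if current_page > 0 else 0
    return total_pages, start_index

print(page_stats(2300, 1000, 0))  # (3, 0)
print(page_stats(2300, 1000, 2))  # (3, 2000)
```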


@@ -82,7 +82,7 @@ class CursorResult(Sequence):
return f"<{type(self).__name__}: results={len(self.results)}>"
- MAX_LIMIT = 100
+ MAX_LIMIT = 1000
class BadPaginationError(Exception):
@@ -118,7 +118,7 @@ class OffsetPaginator:
self.max_offset = max_offset
self.on_results = on_results
- def get_result(self, limit=100, cursor=None):
+ def get_result(self, limit=1000, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
@@ -727,7 +727,7 @@ class BasePaginator:
cursor_name = "cursor"
# get the per page parameter from request
- def get_per_page(self, request, default_per_page=100, max_per_page=100):
+ def get_per_page(self, request, default_per_page=1000, max_per_page=1000):
try:
per_page = int(request.GET.get("per_page", default_per_page))
except ValueError:
@@ -747,8 +747,8 @@ class BasePaginator:
on_results=None,
paginator=None,
paginator_cls=OffsetPaginator,
- default_per_page=100,
- max_per_page=100,
+ default_per_page=1000,
+ max_per_page=1000,
cursor_cls=Cursor,
extra_stats=None,
controller=None,


@@ -10,7 +10,16 @@ import { Disclosure, Transition } from "@headlessui/react";
// layouts
// components
import type { IUser } from "@plane/types";
import { Button, CustomSelect, CustomSearchSelect, Input, TOAST_TYPE, setPromiseToast, setToast } from "@plane/ui";
import {
Button,
CustomSelect,
CustomSearchSelect,
Input,
TOAST_TYPE,
setPromiseToast,
setToast,
ToggleSwitch,
} from "@plane/ui";
import { DeactivateAccountModal } from "@/components/account";
import { LogoSpinner } from "@/components/common";
import { ImagePickerPopover, UserImageUploadModal, PageHead } from "@/components/core";
@@ -22,10 +31,11 @@ import { ProfileSettingContentWrapper } from "@/components/profile";
import { TIME_ZONES } from "@/constants/timezones";
import { USER_ROLES } from "@/constants/workspace";
// hooks
- import { useUser } from "@/hooks/store";
+ import { useUser, useUserSettings } from "@/hooks/store";
// import { ProfileSettingsLayout } from "@/layouts/settings-layout";
// layouts
import { FileService } from "@/services/file.service";
import { ENABLE_LOCAL_DB_CACHE } from "@/plane-web/constants/issues";
// services
// types
@@ -59,6 +69,7 @@ const ProfileSettingsPage = observer(() => {
} = useForm<IUser>({ defaultValues });
// store hooks
const { data: currentUser, updateCurrentUser } = useUser();
const { canUseLocalDB, toggleLocalDB } = useUserSettings();
useEffect(() => {
reset({ ...defaultValues, ...currentUser });
@@ -387,7 +398,7 @@ const ProfileSettingsPage = observer(() => {
render={({ field: { value, onChange } }) => (
<CustomSearchSelect
value={value}
- label={value ? TIME_ZONES.find((t) => t.value === value)?.label ?? value : "Select a timezone"}
+ label={value ? (TIME_ZONES.find((t) => t.value === value)?.label ?? value) : "Select a timezone"}
options={timeZoneOptions}
onChange={onChange}
buttonClassName={errors.user_timezone ? "border-red-500" : "border-none"}
@@ -407,6 +418,37 @@ const ProfileSettingsPage = observer(() => {
</div>
</div>
</form>
{ENABLE_LOCAL_DB_CACHE && (
<Disclosure as="div" className="border-t border-custom-border-100 md:px-8">
{({ open }) => (
<>
<Disclosure.Button as="button" type="button" className="flex w-full items-center justify-between py-4">
<span className="text-lg tracking-tight">Local Cache</span>
<ChevronDown className={`h-5 w-5 transition-all ${open ? "rotate-180" : ""}`} />
</Disclosure.Button>
<Transition
show={open}
enter="transition duration-100 ease-out"
enterFrom="transform opacity-0"
enterTo="transform opacity-100"
leave="transition duration-75 ease-out"
leaveFrom="transform opacity-100"
leaveTo="transform opacity-0"
>
<Disclosure.Panel>
<div className="flex justify-between pb-4">
<span className="text-sm tracking-tight">
Toggled on by default to keep Plane performant. Disable this if you are facing any issues with
Plane. Applicable only to this device.
</span>
<ToggleSwitch value={canUseLocalDB} onChange={() => toggleLocalDB()} />
</div>
</Disclosure.Panel>
</Transition>
</>
)}
</Disclosure>
)}
<Disclosure as="div" className="border-t border-custom-border-100 md:px-8">
{({ open }) => (
<>


@@ -30,3 +30,6 @@ export const filterActivityOnSelectedFilters = (
filter: TActivityFilters[]
): TIssueActivityComment[] =>
activity.filter((activity) => filter.includes(activity.activity_type as TActivityFilters));
// boolean to decide if the local db cache is enabled
export const ENABLE_LOCAL_DB_CACHE = false;


@@ -43,6 +43,7 @@ export interface IKanBan {
isDropDisabled?: boolean;
dropErrorMessage?: string | undefined;
sub_group_id?: string;
sub_group_index?: number;
updateIssue: ((projectId: string | null, issueId: string, data: Partial<TIssue>) => Promise<void>) | undefined;
quickActions: TRenderQuickActions;
kanbanFilters: TIssueKanbanFilters;


@@ -60,7 +60,7 @@ export const IssueView: FC<IIssueView> = observer((props) => {
isArchiveIssueModalOpen,
toggleDeleteIssueModal,
toggleArchiveIssueModal,
- issue: { getIssueById },
+ issue: { getIssueById, isLocalDBIssueDescription },
} = useIssueDetail();
const issue = getIssueById(issueId);
// remove peek id
@@ -178,7 +178,7 @@ export const IssueView: FC<IIssueView> = observer((props) => {
projectId={projectId}
issueId={issueId}
issueOperations={issueOperations}
- disabled={disabled || is_archived}
+ disabled={disabled || is_archived || isLocalDBIssueDescription}
isArchived={is_archived}
isSubmitting={isSubmitting}
setIsSubmitting={(value) => setIsSubmitting(value)}
@@ -217,7 +217,7 @@ export const IssueView: FC<IIssueView> = observer((props) => {
projectId={projectId}
issueId={issueId}
issueOperations={issueOperations}
- disabled={disabled || is_archived}
+ disabled={disabled || is_archived || isLocalDBIssueDescription}
isArchived={is_archived}
isSubmitting={isSubmitting}
setIsSubmitting={(value) => setIsSubmitting(value)}


@@ -1,27 +1,32 @@
"use client";
import { FC, ReactNode } from "react";
import { observer } from "mobx-react";
import { useParams } from "next/navigation";
import useSWR from "swr";
import useSWRImmutable from "swr/immutable";
// components
import { JoinProject } from "@/components/auth-screens";
import { EmptyState, LogoSpinner } from "@/components/common";
// hooks
import {
useEventTracker,
useCommandPalette,
useCycle,
useProjectEstimates,
useEventTracker,
useLabel,
useMember,
useModule,
useProject,
useProjectEstimates,
useProjectState,
useProjectView,
useCommandPalette,
useUserPermissions,
} from "@/hooks/store";
// plane web constants
import { EUserPermissions, EUserPermissionsLevel } from "@/plane-web/constants/user-permissions";
// images
import { persistence } from "@/local-db/storage.sqlite";
import emptyProject from "@/public/empty-state/onboarding/dashboard-light.webp";
interface IProjectAuthWrapper {
@@ -37,7 +42,7 @@ export const ProjectAuthWrapper: FC<IProjectAuthWrapper> = observer((props) => {
const { fetchUserProjectInfo, allowPermissions, projectUserInfo } = useUserPermissions();
const { loader, getProjectById, fetchProjectDetails } = useProject();
const { fetchAllCycles } = useCycle();
- const { fetchModules } = useModule();
+ const { fetchModulesSlim, fetchModules } = useModule();
const { fetchViews } = useProjectView();
const {
project: { fetchProjectMembers },
@@ -50,11 +55,27 @@ export const ProjectAuthWrapper: FC<IProjectAuthWrapper> = observer((props) => {
const projectMemberInfo = projectUserInfo?.[workspaceSlug?.toString()]?.[projectId?.toString()];
useSWR(
workspaceSlug && projectId ? `PROJECT_SYNC_ISSUES_${workspaceSlug.toString()}_${projectId.toString()}` : null,
workspaceSlug && projectId
? () => {
persistence.syncIssues(projectId.toString());
}
: null,
{
revalidateIfStale: true,
revalidateOnFocus: true,
revalidateOnReconnect: true,
refreshInterval: 5 * 60 * 1000,
}
);
// fetching project details
useSWR(
workspaceSlug && projectId ? `PROJECT_DETAILS_${workspaceSlug.toString()}_${projectId.toString()}` : null,
workspaceSlug && projectId ? () => fetchProjectDetails(workspaceSlug.toString(), projectId.toString()) : null
);
// fetching user project member information
useSWR(
workspaceSlug && projectId ? `PROJECT_ME_INFORMATION_${workspaceSlug}_${projectId}` : null,
@@ -93,7 +114,12 @@ export const ProjectAuthWrapper: FC<IProjectAuthWrapper> = observer((props) => {
// fetching project modules
useSWR(
workspaceSlug && projectId ? `PROJECT_MODULES_${workspaceSlug}_${projectId}` : null,
- workspaceSlug && projectId ? () => fetchModules(workspaceSlug.toString(), projectId.toString()) : null,
+ workspaceSlug && projectId
+ ? async () => {
+ await fetchModulesSlim(workspaceSlug.toString(), projectId.toString());
+ await fetchModules(workspaceSlug.toString(), projectId.toString());
+ }
+ : null,
{ revalidateIfStale: false, revalidateOnFocus: false }
);
// fetching project views


@@ -7,9 +7,11 @@ import Link from "next/link";
import { useParams } from "next/navigation";
import { useTheme } from "next-themes";
import useSWR from "swr";
import useSWRImmutable from "swr/immutable";
import { LogOut } from "lucide-react";
// hooks
- import { Button, TOAST_TYPE, setToast, Tooltip } from "@plane/ui";
+ import { Button, setToast, TOAST_TYPE, Tooltip } from "@plane/ui";
import { LogoSpinner } from "@/components/common";
import { useMember, useProject, useUser, useUserPermissions, useWorkspace } from "@/hooks/store";
import { useFavorite } from "@/hooks/store/use-favorite";
@@ -17,6 +19,7 @@ import { usePlatformOS } from "@/hooks/use-platform-os";
// constants
import { EUserPermissions, EUserPermissionsLevel } from "@/plane-web/constants/user-permissions";
// images
import { persistence } from "@/local-db/storage.sqlite";
import PlaneBlackLogo from "@/public/plane-logos/black-horizontal-with-blue-logo.png";
import PlaneWhiteLogo from "@/public/plane-logos/white-horizontal-with-blue-logo.png";
import WorkSpaceNotAvailable from "@/public/workspace/workspace-not-available.png";
@@ -88,6 +91,20 @@ export const WorkspaceAuthWrapper: FC<IWorkspaceAuthWrapper> = observer((props)
{ revalidateIfStale: false, revalidateOnFocus: false }
);
// initialize the local database
const { isLoading: isDBInitializing } = useSWRImmutable(
workspaceSlug ? `WORKSPACE_DB_${workspaceSlug}` : null,
workspaceSlug
? async () => {
// persistence.reset();
await persistence.initialize(workspaceSlug.toString());
// Load common data
persistence.syncWorkspace();
return true;
}
: null
);
const handleSignOut = async () => {
await signOut().catch(() =>
setToast({
@@ -102,7 +119,7 @@ export const WorkspaceAuthWrapper: FC<IWorkspaceAuthWrapper> = observer((props)
const currentWorkspaceInfo = workspaceSlug && workspaceInfoBySlug(workspaceSlug.toString());
// if list of workspaces are not there then we have to render the spinner
- if (allWorkspaces === undefined || loader) {
+ if (allWorkspaces === undefined || loader || isDBInitializing) {
return (
<div className="grid h-screen place-items-center bg-custom-background-100 p-4">
<div className="flex flex-col items-center gap-3 text-center">


@@ -0,0 +1,420 @@
import set from "lodash/set";
// plane
import { EIssueGroupBYServerToProperty } from "@plane/constants";
import { TIssue } from "@plane/types";
// lib
import { rootStore } from "@/lib/store-context";
// services
import { IssueService } from "@/services/issue/issue.service";
//
import { ARRAY_FIELDS } from "./utils/constants";
import { getSubIssuesWithDistribution } from "./utils/data.utils";
import createIndexes from "./utils/indexes";
import { addIssuesBulk, syncDeletesToLocal } from "./utils/load-issues";
import { loadWorkSpaceData } from "./utils/load-workspace";
import { issueFilterCountQueryConstructor, issueFilterQueryConstructor } from "./utils/query-constructor";
import { runQuery } from "./utils/query-executor";
import { createTables } from "./utils/tables";
import { getGroupedIssueResults, getSubGroupedIssueResults } from "./utils/utils";
declare module "@sqlite.org/sqlite-wasm" {
export function sqlite3Worker1Promiser(...args: any): any;
}
const DB_VERSION = 1;
const PAGE_SIZE = 1000;
const BATCH_SIZE = 200;
const log = console.log;
const error = console.error;
const info = console.info;
type TProjectStatus = {
issues: { status: undefined | "loading" | "ready" | "error" | "syncing"; sync: Promise<void> | undefined };
};
type TDBStatus = "initializing" | "ready" | "error" | undefined;
export class Storage {
db: any;
status: TDBStatus = undefined;
dbName = "plane";
projectStatus: Record<string, TProjectStatus> = {};
workspaceSlug: string = "";
constructor() {
this.db = null;
}
reset = () => {
this.db = null;
this.status = undefined;
this.projectStatus = {};
this.workspaceSlug = "";
};
clearStorage = async () => {
try {
const storageManager = window.navigator.storage;
const fileSystemDirectoryHandle = await storageManager.getDirectory();
//@ts-expect-error
await fileSystemDirectoryHandle.remove({ recursive: true });
} catch (e) {
console.error("Error clearing sqlite sync storage", e);
}
};
initialize = async (workspaceSlug: string): Promise<boolean> => {
if (document.hidden || !rootStore.user.localDBEnabled) return false; // return if the window gets hidden
if (workspaceSlug !== this.workspaceSlug) {
this.reset();
}
try {
await this._initialize(workspaceSlug);
return true;
} catch (err) {
error(err);
this.status = "error";
return false;
}
};
_initialize = async (workspaceSlug: string): Promise<boolean> => {
if (this.status === "initializing") {
console.warn(`Initialization already in progress for workspace ${workspaceSlug}`);
return false;
}
if (this.status === "ready") {
console.warn(`Already initialized for workspace ${workspaceSlug}`);
return true;
}
if (this.status === "error") {
console.warn(`Initialization failed for workspace ${workspaceSlug}`);
return false;
}
info("Loading and initializing SQLite3 module...");
this.workspaceSlug = workspaceSlug;
this.dbName = workspaceSlug;
const { sqlite3Worker1Promiser } = await import("@sqlite.org/sqlite-wasm");
try {
const promiser: any = await new Promise((resolve) => {
const _promiser = sqlite3Worker1Promiser({
onready: () => resolve(_promiser),
});
});
const configResponse = await promiser("config-get", {});
log("Running SQLite3 version", configResponse.result.version.libVersion);
const openResponse = await promiser("open", {
filename: `file:${this.dbName}.sqlite3?vfs=opfs`,
});
const { dbId } = openResponse;
this.db = {
dbId,
exec: async (val: any) => {
if (typeof val === "string") {
val = { sql: val };
}
return promiser("exec", { dbId, ...val });
},
};
// drop the local DB if the stored db version does not match
const dbVersion = await this.getOption("DB_VERSION");
if (dbVersion !== "" && parseInt(dbVersion) !== DB_VERSION) {
await this.clearStorage();
this.reset();
await this._initialize(workspaceSlug);
return false;
}
log(
"OPFS is available, created persisted database at",
openResponse.result.filename.replace(/^file:(.*?)\?vfs=opfs$/, "$1")
);
this.status = "ready";
// Your SQLite code here.
await createTables();
await this.setOption("DB_VERSION", DB_VERSION.toString());
} catch (err) {
error(err);
throw err;
}
return true;
};
syncWorkspace = async () => {
if (document.hidden || !rootStore.user.localDBEnabled) return; // return if the window gets hidden
loadWorkSpaceData(this.workspaceSlug);
};
syncProject = async (projectId: string) => {
if (document.hidden || !rootStore.user.localDBEnabled) return false; // return if the window gets hidden
// Load labels, members, states, modules, cycles
await this.syncIssues(projectId);
// // Sync rest of the projects
// const projects = await getProjectIds();
// // Exclude the one we just synced
// const projectsToSync = projects.filter((p: string) => p !== projectId);
// for (const project of projectsToSync) {
// await delay(8000);
// await this.syncIssues(project);
// }
// this.setOption("workspace_synced_at", new Date().toISOString());
};
syncIssues = async (projectId: string) => {
if (document.hidden || !rootStore.user.localDBEnabled) return false; // return if the window gets hidden
try {
const sync = this._syncIssues(projectId);
this.setSync(projectId, sync);
await sync;
} catch (e) {
this.setStatus(projectId, "error");
}
};
_syncIssues = async (projectId: string) => {
console.log("### Sync started");
let status = this.getStatus(projectId);
if (status === "loading" || status === "syncing") {
info(`Project ${projectId} is already loading or syncing`);
return;
}
const syncPromise = this.getSync(projectId);
if (syncPromise) {
// Redundant check?
return;
}
const queryParams: { cursor: string; updated_at__gt?: string; description: boolean } = {
cursor: `${PAGE_SIZE}:0:0`,
description: true,
};
const syncedAt = await this.getLastSyncTime(projectId);
const projectSync = await this.getOption(projectId);
if (syncedAt) {
queryParams["updated_at__gt"] = syncedAt;
}
this.setStatus(projectId, projectSync === "ready" ? "syncing" : "loading");
status = this.getStatus(projectId);
log(`### ${projectSync === "ready" ? "Syncing" : "Loading"} issues to local db for project ${projectId}`);
const start = performance.now();
const issueService = new IssueService();
const response = await issueService.getIssuesForSync(this.workspaceSlug, projectId, queryParams);
addIssuesBulk(response.results, BATCH_SIZE);
if (response.total_pages > 1) {
const promiseArray = [];
for (let i = 1; i < response.total_pages; i++) {
queryParams.cursor = `${PAGE_SIZE}:${i}:0`;
promiseArray.push(issueService.getIssuesForSync(this.workspaceSlug, projectId, queryParams));
}
const pages = await Promise.all(promiseArray);
for (const page of pages) {
await addIssuesBulk(page.results, BATCH_SIZE);
}
}
if (syncedAt) {
await syncDeletesToLocal(this.workspaceSlug, projectId, { updated_at__gt: syncedAt });
}
console.log("### Time taken to add issues", performance.now() - start);
if (status === "loading") {
await createIndexes();
}
this.setOption(projectId, "ready");
this.setStatus(projectId, "ready");
this.setSync(projectId, undefined);
};
getIssueCount = async (projectId: string) => {
const count = await runQuery(`select count(*) as count from issues where project_id='${projectId}'`);
return count[0]["count"];
};
getLastUpdatedIssue = async (projectId: string) => {
const lastUpdatedIssue = await runQuery(
`select id, name, updated_at , sequence_id from issues where project_id='${projectId}' order by datetime(updated_at) desc limit 1`
);
if (lastUpdatedIssue.length) {
return lastUpdatedIssue[0];
}
return;
};
getLastSyncTime = async (projectId: string) => {
const issue = await this.getLastUpdatedIssue(projectId);
if (!issue) {
return false;
}
return issue.updated_at;
};
getIssues = async (workspaceSlug: string, projectId: string, queries: any, config: any) => {
console.log("#### Queries", queries);
const currentProjectStatus = this.getStatus(projectId);
if (
!currentProjectStatus ||
this.status !== "ready" ||
currentProjectStatus === "loading" ||
currentProjectStatus === "error" ||
!rootStore.user.localDBEnabled
) {
info(`Project ${projectId} is loading, falling back to server`);
const issueService = new IssueService();
return await issueService.getIssuesFromServer(workspaceSlug, projectId, queries);
}
const { cursor, group_by, sub_group_by } = queries;
const query = issueFilterQueryConstructor(this.workspaceSlug, projectId, queries);
const countQuery = issueFilterCountQueryConstructor(this.workspaceSlug, projectId, queries);
const start = performance.now();
const [issuesRaw, count] = await Promise.all([runQuery(query), runQuery(countQuery)]);
const end = performance.now();
const { total_count } = count[0];
const [pageSize, page, offset] = cursor.split(":");
const groupByProperty: string =
EIssueGroupBYServerToProperty[group_by as keyof typeof EIssueGroupBYServerToProperty];
const subGroupByProperty =
EIssueGroupBYServerToProperty[sub_group_by as keyof typeof EIssueGroupBYServerToProperty];
const parsingStart = performance.now();
let issueResults = issuesRaw.map((issue: any) => formatLocalIssue(issue));
console.log("#### Issue Results", issueResults.length);
const parsingEnd = performance.now();
const grouping = performance.now();
if (groupByProperty && page === "0") {
if (subGroupByProperty) {
issueResults = getSubGroupedIssueResults(issueResults);
} else {
issueResults = getGroupedIssueResults(issueResults);
}
}
const groupingEnd = performance.now();
const times = {
IssueQuery: end - start,
Parsing: parsingEnd - parsingStart,
Grouping: groupingEnd - grouping,
};
console.log(issueResults);
console.table(times);
const total_pages = Math.ceil(total_count / Number(pageSize));
const next_page_results = total_pages > parseInt(page) + 1;
const out = {
results: issueResults,
next_cursor: `${pageSize}:${parseInt(page) + 1}:${Number(offset) + Number(pageSize)}`,
prev_cursor: `${pageSize}:${parseInt(page) - 1}:${Number(offset) - Number(pageSize)}`,
total_results: total_count,
total_count,
next_page_results,
total_pages,
};
return out;
};
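The paginator above encodes cursors as `pageSize:page:offset`. A minimal sketch of that cursor arithmetic (names are illustrative, not from the PR):

```typescript
// Sketch of the cursor math in getIssues; cursors are encoded as "pageSize:page:offset".
type PageMeta = {
  next_cursor: string;
  prev_cursor: string;
  total_pages: number;
  next_page_results: boolean;
};

const buildPageMeta = (cursor: string, totalCount: number): PageMeta => {
  const [pageSize, page, offset] = cursor.split(":").map(Number);
  const totalPages = Math.ceil(totalCount / pageSize);
  return {
    next_cursor: `${pageSize}:${page + 1}:${offset + pageSize}`,
    prev_cursor: `${pageSize}:${page - 1}:${offset - pageSize}`,
    total_pages: totalPages,
    next_page_results: totalPages > page + 1,
  };
};
```

For example, `buildPageMeta("100:0:0", 250)` yields a `next_cursor` of `"100:1:100"` and three total pages.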
getIssue = async (issueId: string) => {
try {
if (!rootStore.user.localDBEnabled) return;
const issues = await runQuery(`select * from issues where id='${issueId}'`);
if (issues.length) {
return formatLocalIssue(issues[0]);
}
} catch (err) {
console.warn("unable to fetch issue from local db");
}
return;
};
getSubIssues = async (workspaceSlug: string, projectId: string, issueId: string) => {
const workspace_synced_at = await this.getOption("workspace_synced_at");
if (!workspace_synced_at) {
const issueService = new IssueService();
return await issueService.subIssues(workspaceSlug, projectId, issueId);
}
return await getSubIssuesWithDistribution(issueId);
};
getStatus = (projectId: string) => this.projectStatus[projectId]?.issues?.status;
setStatus = (projectId: string, status: "loading" | "ready" | "error" | "syncing" | undefined = undefined) => {
set(this.projectStatus, `${projectId}.issues.status`, status);
};
getSync = (projectId: string) => this.projectStatus[projectId]?.issues?.sync;
setSync = (projectId: string, sync: Promise<void> | undefined) => {
set(this.projectStatus, `${projectId}.issues.sync`, sync);
};
getOption = async (key: string, fallback = "") => {
try {
const options = await runQuery(`select * from options where key='${key}'`);
if (options.length) {
return options[0].value;
}
return fallback;
} catch (e) {
return fallback;
}
};
setOption = async (key: string, value: string) => {
await runQuery(`insert or replace into options (key, value) values ('${key}', '${value}')`);
};
getOptions = async (keys: string[]) => {
const options = await runQuery(`select * from options where key in ('${keys.join("','")}')`);
return options.reduce((acc: any, option: any) => {
acc[option.key] = option.value;
return acc;
}, {});
};
}
export const persistence = new Storage();
/**
 * Format an issue row fetched from the local db into a TIssue,
 * parsing JSON-encoded array fields back into arrays.
 * @param issue raw row from the issues table
 * @returns the issue as a TIssue
 */
export const formatLocalIssue = (issue: any) => {
const currIssue = issue;
ARRAY_FIELDS.forEach((field: string) => {
currIssue[field] = currIssue[field] ? JSON.parse(currIssue[field]) : [];
});
return currIssue as TIssue;
};
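Array fields are stored in SQLite as JSON strings and parsed back on read. A self-contained sketch of the round trip (the helper name is illustrative):

```typescript
// Sketch: array fields are persisted as JSON text; null/empty values come back as [].
const ARRAY_FIELDS = ["label_ids", "assignee_ids", "module_ids"];

const formatRow = (row: Record<string, any>) => {
  const out = { ...row };
  for (const field of ARRAY_FIELDS) {
    out[field] = out[field] ? JSON.parse(out[field]) : [];
  }
  return out;
};
```

A row with `label_ids: '["a","b"]'` and a null `assignee_ids` comes back as `["a","b"]` and `[]`.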


@@ -0,0 +1,21 @@
export const ARRAY_FIELDS = ["label_ids", "assignee_ids", "module_ids"];
export const GROUP_BY_MAP = {
state_id: "state_id",
priority: "priority",
cycle_id: "cycle_id",
created_by: "created_by",
target_date: "target_date",
// Array Props
issue_module__module_id: "module_ids",
labels__id: "label_ids",
assignees__id: "assignee_ids",
};
export const PRIORITY_MAP = {
low: 1,
medium: 2,
high: 3,
urgent: 4,
none: 0,
};
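`PRIORITY_MAP` exists because sorting the priority labels alphabetically ("high" < "low" < "medium") does not match severity order; the numeric proxy does. A small sketch of the difference:

```typescript
// Sketch: the numeric proxy gives the intended severity order that a plain
// alphabetical sort on the labels would not.
const PRIORITY_MAP = { none: 0, low: 1, medium: 2, high: 3, urgent: 4 } as const;

type Priority = keyof typeof PRIORITY_MAP;

const sortByPriority = (priorities: Priority[]): Priority[] =>
  [...priorities].sort((a, b) => PRIORITY_MAP[b] - PRIORITY_MAP[a]);
```

`sortByPriority(["low", "urgent", "medium"])` returns `["urgent", "medium", "low"]`, which is what the `priority_proxy` column makes possible in SQL.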


@@ -0,0 +1,30 @@
import { runQuery } from "./query-executor";
export const getProjectIds = async () => {
const q = `select project_id from states where project_id is not null group by project_id`;
return await runQuery(q);
};
export const getSubIssues = async (issueId: string) => {
const q = `select * from issues where parent_id = '${issueId}'`;
return await runQuery(q);
};
export const getSubIssueDistribution = async (issueId: string) => {
const q = `select s.'group', group_concat(i.id) as issues from issues i left join states s on s.id = i.state_id where i.parent_id = '${issueId}' group by s.'group'`;
const result = await runQuery(q);
if (!result.length) {
return {};
}
return result.reduce((acc: Record<string, string[]>, item: { group: string; issues: string }) => {
acc[item.group] = item.issues.split(",");
return acc;
}, {});
};
export const getSubIssuesWithDistribution = async (issueId: string) => {
const promises = [getSubIssues(issueId), getSubIssueDistribution(issueId)];
const [sub_issues, state_distribution] = await Promise.all(promises);
return { sub_issues, state_distribution };
};
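The distribution query returns one row per state group with a `group_concat` of issue ids; the reduce above collapses that into a map. A sketch of that collapse in isolation:

```typescript
// Sketch: rows of (group, comma-separated issue ids) — the shape group_concat
// produces — collapsed into a { group: issueIds[] } record.
type DistributionRow = { group: string; issues: string };

const toDistribution = (rows: DistributionRow[]): Record<string, string[]> =>
  rows.reduce((acc: Record<string, string[]>, row) => {
    acc[row.group] = row.issues.split(",");
    return acc;
  }, {});
```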


@@ -0,0 +1,68 @@
import { persistence } from "../storage.sqlite";
const log = console.log;
export const createIssueIndexes = async () => {
const columns = [
"state_id",
"sort_order",
// "priority",
"priority_proxy",
"project_id",
"created_by",
"cycle_id",
];
const promises: Promise<any>[] = [];
promises.push(persistence.db.exec({ sql: `CREATE UNIQUE INDEX issues_issue_id_idx ON issues (id)` }));
columns.forEach((column) => {
promises.push(
persistence.db.exec({ sql: `CREATE INDEX issues_issue_${column}_idx ON issues (project_id, ${column})` })
);
});
await Promise.all(promises);
};
export const createIssueMetaIndexes = async () => {
// Composite index covering issue meta lookups
await persistence.db.exec({ sql: `CREATE INDEX issue_meta_all_idx ON issue_meta (issue_id,key,value)` });
};
export const createWorkSpaceIndexes = async () => {
const promises: Promise<any>[] = [];
// Labels
promises.push(persistence.db.exec({ sql: `CREATE INDEX labels_name_idx ON labels (id,name,project_id)` }));
// Modules
promises.push(persistence.db.exec({ sql: `CREATE INDEX modules_name_idx ON modules (id,name,project_id)` }));
// States
promises.push(persistence.db.exec({ sql: `CREATE INDEX states_name_idx ON states (id,name,project_id)` }));
// Cycles
promises.push(persistence.db.exec({ sql: `CREATE INDEX cycles_name_idx ON cycles (id,name,project_id)` }));
// Members
promises.push(persistence.db.exec({ sql: `CREATE INDEX members_name_idx ON members (id,first_name)` }));
// Estimate Points @todo
promises.push(persistence.db.exec({ sql: `CREATE INDEX estimate_points_name_idx ON estimate_points (id,value)` }));
// Options
promises.push(persistence.db.exec({ sql: `CREATE INDEX options_name_idx ON options (name)` }));
await Promise.all(promises);
};
const createIndexes = async () => {
log("### Creating indexes");
const start = performance.now();
const promises = [createIssueIndexes(), createIssueMetaIndexes(), createWorkSpaceIndexes()];
try {
await Promise.all(promises);
} catch (e) {
console.log((e as Error).message);
}
log("### Indexes created in", `${performance.now() - start}ms`);
};
export default createIndexes;


@@ -0,0 +1,118 @@
import { TIssue } from "@plane/types";
import { rootStore } from "@/lib/store-context";
import { IssueService } from "@/services/issue";
import { persistence } from "../storage.sqlite";
import { ARRAY_FIELDS, PRIORITY_MAP } from "./constants";
import { issueSchema } from "./schemas";
export const PROJECT_OFFLINE_STATUS: Record<string, boolean> = {};
export const addIssue = async (issue: any) => {
if (document.hidden || !rootStore.user.localDBEnabled) return;
await persistence.db.exec("BEGIN TRANSACTION;");
stageIssueInserts(issue);
await persistence.db.exec("COMMIT;");
};
export const addIssuesBulk = async (issues: any, batchSize = 100) => {
if (!rootStore.user.localDBEnabled) return;
for (let i = 0; i < issues.length; i += batchSize) {
const batch = issues.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((issue: any) => {
if (!issue.type_id) {
issue.type_id = "";
}
stageIssueInserts(issue);
});
await persistence.db.exec("COMMIT;");
}
};
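`addIssuesBulk` commits one transaction per slice of `batchSize` rows so no single commit grows too large. The slicing pattern, extracted as a standalone sketch:

```typescript
// Sketch of the batching pattern in addIssuesBulk: walk the array in
// fixed-size slices, one transaction per slice.
const chunk = <T>(items: T[], size: number): T[][] => {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
};
```

`chunk([1, 2, 3, 4, 5], 2)` produces three batches, the last holding the single leftover element.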
export const deleteIssueFromLocal = async (issue_id: any) => {
if (!rootStore.user.localDBEnabled) return;
const deleteQuery = `delete from issues where id='${issue_id}'`;
const deleteMetaQuery = `delete from issue_meta where issue_id='${issue_id}'`;
persistence.db.exec("BEGIN TRANSACTION;");
persistence.db.exec(deleteQuery);
persistence.db.exec(deleteMetaQuery);
persistence.db.exec("COMMIT;");
};
export const updateIssue = async (issue: TIssue) => {
if (document.hidden || !rootStore.user.localDBEnabled) return;
const issue_id = issue.id;
// delete the issue and its meta data
await deleteIssueFromLocal(issue_id);
addIssue(issue);
};
export const syncDeletesToLocal = async (workspaceId: string, projectId: string, queries: any) => {
if (!rootStore.user.localDBEnabled) return;
const issueService = new IssueService();
const response = await issueService.getDeletedIssues(workspaceId, projectId, queries);
if (Array.isArray(response)) {
await Promise.all(response.map((issueId) => deleteIssueFromLocal(issueId)));
}
};
const stageIssueInserts = (issue: any) => {
const issue_id = issue.id;
issue.priority_proxy = PRIORITY_MAP[issue.priority as keyof typeof PRIORITY_MAP];
const keys = Object.keys(issueSchema);
const sanitizedIssue = keys.reduce((acc: any, key) => {
if (issue[key] || issue[key] === 0) {
acc[key] = issue[key];
}
return acc;
}, {});
const columns = "'" + Object.keys(sanitizedIssue).join("','") + "'";
const values = Object.values(sanitizedIssue)
.map((value) => {
if (value === null) {
return "";
}
if (typeof value === "object") {
return `'${JSON.stringify(value)}'`;
}
if (typeof value === "string") {
return `'${value}'`;
}
return value;
})
.join(", ");
const query = `INSERT OR REPLACE INTO issues (${columns}) VALUES (${values});`;
persistence.db.exec(query);
persistence.db.exec({
sql: `DELETE from issue_meta where issue_id='${issue_id}'`,
});
ARRAY_FIELDS.forEach((field) => {
const values = issue[field];
if (values && values.length) {
values.forEach((val: any) => {
persistence.db.exec({
sql: `INSERT OR REPLACE into issue_meta(issue_id,key,value) values (?,?,?) `,
bind: [issue_id, field, val],
});
});
} else {
// Insert a placeholder row so empty/"None" filters can still match this issue
persistence.db.exec({
sql: `INSERT OR REPLACE into issue_meta(issue_id,key,value) values (?,?,?) `,
bind: [issue_id, field, ""],
});
}
});
};
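The value mapping inside `stageIssueInserts` turns each field into a SQL literal: objects become JSON text, strings get quoted, numbers pass through. A cleaned-up sketch of that mapping (the PR's version returns a bare empty string for null, which only works because null values are filtered out earlier):

```typescript
// Sketch of the SQL literal serialization used when staging inserts.
// Note: embedded single quotes are not escaped here, matching the PR code;
// a hardened version would escape them or use bound parameters.
const toSqlLiteral = (value: unknown): string => {
  if (value === null || value === undefined) return "''";
  if (typeof value === "object") return `'${JSON.stringify(value)}'`;
  if (typeof value === "string") return `'${value}'`;
  return String(value);
};
```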


@@ -0,0 +1,142 @@
import { IEstimate, IEstimatePoint, IWorkspaceMember } from "@plane/types";
import { API_BASE_URL } from "@/helpers/common.helper";
import { EstimateService } from "@/plane-web/services/project/estimate.service";
import { CycleService } from "@/services/cycle.service";
import { IssueLabelService } from "@/services/issue/issue_label.service";
import { ModuleService } from "@/services/module.service";
import { ProjectStateService } from "@/services/project";
import { WorkspaceService } from "@/services/workspace.service";
import { persistence } from "../storage.sqlite";
import { cycleSchema, estimatePointSchema, labelSchema, memberSchema, moduleSchema, Schema, stateSchema } from "./schemas";
const stageInserts = (table: string, schema: Schema, data: any) => {
const keys = Object.keys(schema);
// Pick only the keys that are in the schema
const filteredData = keys.reduce((acc: any, key) => {
if (data[key] || data[key] === 0) {
acc[key] = data[key];
}
return acc;
}, {});
const columns = "'" + Object.keys(filteredData).join("','") + "'";
// Add quotes to column names
const values = Object.values(filteredData)
.map((value) => {
if (value === null) {
return "";
}
if (typeof value === "object") {
return `'${JSON.stringify(value)}'`;
}
if (typeof value === "string") {
return `'${value}'`;
}
return value;
})
.join(", ");
const query = `INSERT OR REPLACE INTO ${table} (${columns}) VALUES (${values});`;
persistence.db.exec(query);
};
export const loadLabels = async (workspaceSlug: string, batchSize = 500) => {
const issueLabelService = new IssueLabelService();
const objects = await issueLabelService.getWorkspaceIssueLabels(workspaceSlug);
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((label: any) => {
stageInserts("labels", labelSchema, label);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadModules = async (workspaceSlug: string, batchSize = 500) => {
const moduleService = new ModuleService();
const objects = await moduleService.getWorkspaceModules(workspaceSlug);
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((module: any) => {
stageInserts("modules", moduleSchema, module);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadCycles = async (workspaceSlug: string, batchSize = 500) => {
const cycleService = new CycleService();
const objects = await cycleService.getWorkspaceCycles(workspaceSlug);
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((cycle: any) => {
stageInserts("cycles", cycleSchema, cycle);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadStates = async (workspaceSlug: string, batchSize = 500) => {
const stateService = new ProjectStateService();
const objects = await stateService.getWorkspaceStates(workspaceSlug);
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((state: any) => {
stageInserts("states", stateSchema, state);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadEstimatePoints = async (workspaceSlug: string, batchSize = 500) => {
const estimateService = new EstimateService();
const estimates = await estimateService.fetchWorkspaceEstimates(workspaceSlug);
const objects: IEstimatePoint[] = [];
(estimates || []).forEach((estimate: IEstimate) => {
if (estimate?.points) {
objects.push(...estimate.points);
}
});
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((point: any) => {
stageInserts("estimate_points", estimatePointSchema, point);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadMembers = async (workspaceSlug: string, batchSize = 500) => {
const workspaceService = new WorkspaceService(API_BASE_URL);
const members = await workspaceService.fetchWorkspaceMembers(workspaceSlug);
const objects = members.map((member: IWorkspaceMember) => member.member);
for (let i = 0; i < objects.length; i += batchSize) {
const batch = objects.slice(i, i + batchSize);
persistence.db.exec("BEGIN TRANSACTION;");
batch.forEach((member: any) => {
stageInserts("members", memberSchema, member);
});
await persistence.db.exec("COMMIT;");
}
};
export const loadWorkSpaceData = async (workspaceSlug: string) => {
const promises = [];
promises.push(loadLabels(workspaceSlug));
promises.push(loadModules(workspaceSlug));
promises.push(loadCycles(workspaceSlug));
promises.push(loadStates(workspaceSlug));
promises.push(loadEstimatePoints(workspaceSlug));
promises.push(loadMembers(workspaceSlug));
await Promise.all(promises);
};


@@ -0,0 +1,166 @@
import {
getFilteredRowsForGrouping,
getIssueFieldsFragment,
getMetaKeys,
getOrderByFragment,
singleFilterConstructor,
translateQueryParams,
} from "./query.utils";
export const SPECIAL_ORDER_BY = {
labels__name: "labels",
"-labels__name": "labels",
assignees__first_name: "members",
"-assignees__first_name": "members",
issue_module__module__name: "modules",
"-issue_module__module__name": "modules",
issue_cycle__cycle__name: "cycles",
"-issue_cycle__cycle__name": "cycles",
state__name: "states",
"-state__name": "states",
};
export const issueFilterQueryConstructor = (workspaceSlug: string, projectId: string, queries: any) => {
const {
cursor,
per_page,
group_by,
sub_group_by,
order_by = "created_at",
...otherProps
} = translateQueryParams(queries);
const [pageSize, page, offset] = cursor.split(":");
let sql = "";
const fieldsFragment = getIssueFieldsFragment();
if (sub_group_by) {
const orderByString = getOrderByFragment(order_by);
sql = getFilteredRowsForGrouping(projectId, queries);
sql += `, ranked_issues AS ( SELECT fi.*,
ROW_NUMBER() OVER (PARTITION BY group_id, sub_group_id ${orderByString}) as rank,
COUNT(*) OVER (PARTITION by group_id, sub_group_id) as total_issues from fi)
SELECT ri.*, ${fieldsFragment}
FROM ranked_issues ri
JOIN issues i ON ri.id = i.id
WHERE rank <= ${per_page}
`;
console.log("###", sql);
return sql;
}
if (group_by) {
const orderByString = getOrderByFragment(order_by);
sql = getFilteredRowsForGrouping(projectId, queries);
sql += `, ranked_issues AS ( SELECT fi.*,
ROW_NUMBER() OVER (PARTITION BY group_id ${orderByString}) as rank,
COUNT(*) OVER (PARTITION by group_id) as total_issues FROM fi)
SELECT ri.*, ${fieldsFragment}
FROM ranked_issues ri
JOIN issues i ON ri.id = i.id
WHERE rank <= ${per_page}
`;
console.log("###", sql);
return sql;
}
if (order_by && Object.keys(SPECIAL_ORDER_BY).includes(order_by)) {
const name = order_by.replace("-", "");
const orderByString = getOrderByFragment(order_by, "i.");
sql = `WITH sorted_issues AS (`;
sql += getFilteredRowsForGrouping(projectId, queries);
sql += `SELECT fi.* , `;
if (order_by.includes("assignee")) {
sql += ` s.first_name as ${name} `;
} else {
sql += ` s.name as ${name} `;
}
sql += `FROM fi `;
if (order_by && Object.keys(SPECIAL_ORDER_BY).includes(order_by)) {
if (order_by.includes("cycle")) {
sql += `
LEFT JOIN cycles s on fi.cycle_id = s.id`;
}
if (order_by.includes("estimate_point")) {
sql += `
LEFT JOIN estimate_points s on fi.estimate_point = s.id`;
}
if (order_by.includes("state")) {
sql += `
LEFT JOIN states s on fi.state_id = s.id`;
}
if (order_by.includes("label")) {
sql += `
LEFT JOIN issue_meta sm ON fi.id = sm.issue_id AND sm.key = 'label_ids'
LEFT JOIN labels s ON s.id = sm.value`;
}
if (order_by.includes("module")) {
sql += `
LEFT JOIN issue_meta sm ON fi.id = sm.issue_id AND sm.key = 'module_ids'
LEFT JOIN modules s ON s.id = sm.value`;
}
if (order_by.includes("assignee")) {
sql += `
LEFT JOIN issue_meta sm ON fi.id = sm.issue_id AND sm.key = 'assignee_ids'
LEFT JOIN members s ON s.id = sm.value`;
}
sql += ` ORDER BY ${name} ASC NULLS LAST`;
}
sql += `)`;
sql += `SELECT ${fieldsFragment}, group_concat(si.${name}) as ${name} from sorted_issues si JOIN issues i ON si.id = i.id
`;
sql += ` group by i.id ${orderByString} LIMIT ${pageSize} OFFSET ${Number(offset) + Number(page) * Number(pageSize)};`;
console.log("######$$$", sql);
return sql;
}
const filterJoinFields = getMetaKeys(queries);
const orderByString = getOrderByFragment(order_by);
sql = `SELECT ${fieldsFragment}`;
if (otherProps.state_group) {
sql += `, states.'group' as state_group`;
}
sql += ` from issues i
`;
if (otherProps.state_group) {
sql += `LEFT JOIN states ON i.state_id = states.id `;
}
filterJoinFields.forEach((field: string) => {
const value = otherProps[field] || "";
sql += ` INNER JOIN issue_meta ${field} ON i.id = ${field}.issue_id AND ${field}.key = '${field}' AND ${field}.value IN ('${value.split(",").join("','")}')
`;
});
sql += ` WHERE i.project_id = '${projectId}' ${singleFilterConstructor(otherProps)} group by i.id `;
sql += orderByString;
// Add offset and paging to query
sql += ` LIMIT ${pageSize} OFFSET ${Number(offset) + Number(page) * Number(pageSize)};`;
console.log("$$$", sql);
return sql;
};
export const issueFilterCountQueryConstructor = (workspaceSlug: string, projectId: string, queries: any) => {
//@todo Very crude way to extract count from the actual query. Needs to be refactored
// Remove group by from the query to fallback to non group query
const { group_by, sub_group_by, order_by, ...otherProps } = queries;
let sql = issueFilterQueryConstructor(workspaceSlug, projectId, otherProps);
const fieldsFragment = getIssueFieldsFragment();
sql = sql.replace(`SELECT ${fieldsFragment}`, "SELECT COUNT(DISTINCT i.id) as total_count");
// Remove everything after group by i.id
sql = `${sql.split("group by i.id")[0]};`;
return sql;
};


@@ -0,0 +1,13 @@
import { persistence } from "../storage.sqlite";
export const runQuery = async (sql: string) => {
const data = await persistence.db.exec({
sql,
rowMode: "object",
returnValue: "resultRows",
});
return data.result.resultRows;
};


@@ -0,0 +1,335 @@
import { ARRAY_FIELDS, GROUP_BY_MAP, PRIORITY_MAP } from "./constants";
import { SPECIAL_ORDER_BY } from "./query-constructor";
import { issueSchema } from "./schemas";
import { wrapDateTime } from "./utils";
export const translateQueryParams = (queries: any) => {
const { group_by, sub_group_by, labels, assignees, state, cycle, module, priority, type, ...otherProps } = queries;
const order_by = queries.order_by;
if (state) otherProps.state_id = state;
if (cycle) otherProps.cycle_id = cycle;
if (module) otherProps.module_ids = module;
if (labels) otherProps.label_ids = labels;
if (assignees) otherProps.assignee_ids = assignees;
if (group_by) otherProps.group_by = GROUP_BY_MAP[group_by as keyof typeof GROUP_BY_MAP];
if (sub_group_by) otherProps.sub_group_by = GROUP_BY_MAP[sub_group_by as keyof typeof GROUP_BY_MAP];
if (priority) {
otherProps.priority_proxy = priority
.split(",")
.map((priority: string) => PRIORITY_MAP[priority as keyof typeof PRIORITY_MAP])
.join(",");
}
if (type) {
otherProps.state_group = type === "backlog" ? "backlog" : "unstarted,started";
}
if (order_by?.includes("priority")) {
otherProps.order_by = order_by.replace("priority", "priority_proxy");
}
// Fix invalid orderby when switching from spreadsheet layout
if ((group_by || sub_group_by) && Object.keys(SPECIAL_ORDER_BY).includes(order_by)) {
otherProps.order_by = "sort_order";
}
// For each property value, replace None with empty string
Object.keys(otherProps).forEach((key) => {
if (otherProps[key] === "None") {
otherProps[key] = "";
}
});
return otherProps;
};
export const getOrderByFragment = (order_by: string, table = "") => {
let orderByString = "";
if (!order_by) return orderByString;
if (order_by.startsWith("-")) {
orderByString += ` ORDER BY ${wrapDateTime(order_by.slice(1))} DESC NULLS LAST, datetime(${table}created_at) DESC`;
} else {
orderByString += ` ORDER BY ${wrapDateTime(order_by)} ASC NULLS LAST, datetime(${table}created_at) DESC`;
}
return orderByString;
};
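`getOrderByFragment` maps a leading `-` to DESC and always adds `created_at` as the secondary sort key. A simplified sketch, leaving out the `wrapDateTime` helper (assumed to wrap date columns in `datetime()`):

```typescript
// Sketch of the ORDER BY fragment: "-field" means DESC, and created_at is
// always appended as a secondary, stable sort key.
const orderByFragment = (orderBy: string, table = ""): string => {
  if (!orderBy) return "";
  const desc = orderBy.startsWith("-");
  const field = desc ? orderBy.slice(1) : orderBy;
  return ` ORDER BY ${field} ${desc ? "DESC" : "ASC"} NULLS LAST, datetime(${table}created_at) DESC`;
};
```

The secondary key is what keeps pagination stable when many rows share the same primary sort value.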
export const isMetaJoinRequired = (groupBy: string, subGroupBy: string) =>
ARRAY_FIELDS.includes(groupBy) || ARRAY_FIELDS.includes(subGroupBy);
export const getMetaKeysFragment = (queries: any) => {
const { group_by, sub_group_by, ...otherProps } = translateQueryParams(queries);
const fields: Set<string> = new Set();
if (ARRAY_FIELDS.includes(group_by)) {
fields.add(group_by);
}
if (ARRAY_FIELDS.includes(sub_group_by)) {
fields.add(sub_group_by);
}
const keys = Object.keys(otherProps);
keys.forEach((field: string) => {
if (ARRAY_FIELDS.includes(field)) {
fields.add(field);
}
});
return ` ('${Array.from(fields).join("','")}')`;
};
export const getMetaKeys = (queries: any): string[] => {
const { group_by, sub_group_by, ...otherProps } = translateQueryParams(queries);
const fields: Set<string> = new Set();
if (ARRAY_FIELDS.includes(group_by)) {
fields.add(group_by);
}
if (ARRAY_FIELDS.includes(sub_group_by)) {
fields.add(sub_group_by);
}
const keys = Object.keys(otherProps);
keys.forEach((field: string) => {
if (ARRAY_FIELDS.includes(field)) {
fields.add(field);
}
});
return Array.from(fields);
};
const areJoinsRequired = (queries: any) => {
const { group_by, sub_group_by, ...otherProps } = translateQueryParams(queries);
if (ARRAY_FIELDS.includes(group_by) || ARRAY_FIELDS.includes(sub_group_by)) {
return true;
}
if (Object.keys(otherProps).some((field) => ARRAY_FIELDS.includes(field))) {
return true;
}
return false;
};
// Apply filters to the query
export const getFilteredRowsForGrouping = (projectId: string, queries: any) => {
const { group_by, sub_group_by, ...otherProps } = translateQueryParams(queries);
const filterJoinFields = getMetaKeys(otherProps);
const temp = getSingleFilterFields(queries);
const issueTableFilterFields = temp.length ? "," + temp.join(",") : "";
const joinsRequired = areJoinsRequired(queries);
let sql = "";
if (!joinsRequired) {
sql = `WITH fi as (SELECT i.id,i.created_at ${issueTableFilterFields}`;
if (group_by) {
if (group_by === "target_date") {
sql += `, date(i.${group_by}) as group_id`;
} else {
sql += `, i.${group_by} as group_id`;
}
}
if (sub_group_by) {
sql += `, i.${sub_group_by} as sub_group_id`;
}
sql += ` FROM issues i `;
if (otherProps.state_group) {
sql += `LEFT JOIN states ON i.state_id = states.id `;
}
sql += `WHERE i.project_id = '${projectId}'
`;
sql += `${singleFilterConstructor(otherProps)})
`;
return sql;
}
sql = `WITH fi AS (`;
sql += `SELECT i.id,i.created_at ${issueTableFilterFields} `;
if (group_by) {
if (ARRAY_FIELDS.includes(group_by)) {
sql += `, ${group_by}.value as group_id
`;
} else if (group_by === "target_date") {
sql += `, date(i.${group_by}) as group_id
`;
} else {
sql += `, i.${group_by} as group_id
`;
}
}
if (sub_group_by) {
if (ARRAY_FIELDS.includes(sub_group_by)) {
sql += `, ${sub_group_by}.value as sub_group_id
`;
} else {
sql += `, i.${sub_group_by} as sub_group_id
`;
}
}
sql += ` from issues i
`;
if (otherProps.state_group) {
sql += `LEFT JOIN states ON i.state_id = states.id `;
}
filterJoinFields.forEach((field: string) => {
sql += ` INNER JOIN issue_meta ${field} ON i.id = ${field}.issue_id AND ${field}.key = '${field}' AND ${field}.value IN ('${otherProps[field].split(",").join("','")}')
`;
});
// If group by field is not already joined, join it
if (ARRAY_FIELDS.includes(group_by) && !filterJoinFields.includes(group_by)) {
sql += ` LEFT JOIN issue_meta ${group_by} ON i.id = ${group_by}.issue_id AND ${group_by}.key = '${group_by}'
`;
}
if (ARRAY_FIELDS.includes(sub_group_by) && !filterJoinFields.includes(sub_group_by)) {
sql += ` LEFT JOIN issue_meta ${sub_group_by} ON i.id = ${sub_group_by}.issue_id AND ${sub_group_by}.key = '${sub_group_by}'
`;
}
sql += ` WHERE i.project_id = '${projectId}'
`;
sql += singleFilterConstructor(otherProps);
sql += `)
`;
return sql;
};
export const singleFilterConstructor = (queries: any) => {
const {
order_by,
cursor,
per_page,
group_by,
sub_group_by,
state_group,
sub_issue,
target_date,
start_date,
...filters
} = translateQueryParams(queries);
let sql = "";
if (!sub_issue) {
sql += ` AND parent_id IS NULL
`;
}
if (target_date) {
sql += createDateFilter("target_date", target_date);
}
if (start_date) {
sql += createDateFilter("start_date", start_date);
}
if (state_group) {
sql += ` AND state_group in ('${state_group.split(",").join("','")}')
`;
}
const keys = Object.keys(filters);
keys.forEach((key) => {
const value = filters[key] ? filters[key].split(",") : "";
if (!value) return;
if (!ARRAY_FIELDS.includes(key)) {
sql += ` AND ${key} in ('${value.join("','")}')
`;
}
});
return sql;
};
// let q = '2_months;after;fromnow,1_months;after;fromnow,2024-09-01;after,2024-10-06;before,2_weeks;after;fromnow'
// ["2_months;after;fromnow", "1_months;after;fromnow", "2024-09-01;after", "2024-10-06;before", "2_weeks;after;fromnow"];
const createDateFilter = (key: string, q: string) => {
let sql = " ";
// get todays date in YYYY-MM-DD format
const queries = q.split(",");
const customRange: string[] = [];
let isAnd = true;
queries.forEach((query: string) => {
const [date, type, from] = query.split(";");
if (from) {
// Assuming type is always after
let after = "";
const [_length, unit] = date.split("_");
const length = parseInt(_length);
if (unit === "weeks") {
// get date in yyyy-mm-dd format one week from now
after = new Date(new Date().setDate(new Date().getDate() + length * 7)).toISOString().split("T")[0];
}
if (unit === "months") {
after = new Date(new Date().setDate(new Date().getDate() + length * 30)).toISOString().split("T")[0];
}
sql += ` ${isAnd ? "AND" : "OR"} ${key} >= date('${after}')`;
isAnd = false;
} else {
customRange.push(query);
}
});
if (customRange.length === 2) {
const end = customRange.find((date) => date.includes("before"))?.split(";")[0];
const start = customRange.find((date) => date.includes("after"))?.split(";")[0];
if (end && start) {
sql += ` ${isAnd ? "AND" : "OR"} ${key} BETWEEN date('${start}') AND date('${end}')`;
}
}
if (customRange.length === 1) {
sql += ` AND ${key}=date('${customRange[0].split(";")[0]}')`;
}
return sql;
};
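Relative tokens like `"2_weeks;after;fromnow"` resolve to a concrete `YYYY-MM-DD` date, with months approximated as 30 days. A sketch of that resolution (using UTC date arithmetic to keep it deterministic, where the PR uses local time):

```typescript
// Sketch: resolve "N_weeks;after;fromnow" / "N_months;after;fromnow" tokens
// to an ISO date N weeks (or ~30-day months) ahead of `now`.
const resolveRelativeDate = (token: string, now: Date = new Date()): string | undefined => {
  const [span, , from] = token.split(";");
  if (!from) return undefined; // absolute dates ("2024-09-01;after") are handled elsewhere
  const [lengthStr, unit] = span.split("_");
  const length = parseInt(lengthStr, 10);
  const days = unit === "weeks" ? length * 7 : unit === "months" ? length * 30 : 0;
  const resolved = new Date(now);
  resolved.setUTCDate(resolved.getUTCDate() + days);
  return resolved.toISOString().split("T")[0];
};
```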
const getSingleFilterFields = (queries: any) => {
const { order_by, cursor, per_page, group_by, sub_group_by, sub_issue, state_group, ...otherProps } =
translateQueryParams(queries);
const fields = new Set();
if (order_by && !order_by.includes("created_at") && !Object.keys(SPECIAL_ORDER_BY).includes(order_by))
fields.add(order_by.replace("-", ""));
const keys = Object.keys(otherProps);
keys.forEach((field: string) => {
if (!ARRAY_FIELDS.includes(field)) {
fields.add(field);
}
});
if (order_by?.includes("state__name")) {
fields.add("state_id");
}
if (order_by?.includes("cycle__name")) {
fields.add("cycle_id");
}
if (state_group) {
fields.add("states.'group' as state_group");
}
return Array.from(fields);
};
export const getIssueFieldsFragment = () => {
const { description_html, ...filtered } = issueSchema;
const keys = Object.keys(filtered);
const sql = ` ${keys.map((key) => `i.${key}`).join(`,
`)}`;
return sql;
};


@@ -0,0 +1,135 @@
export type Schema = {
[key: string]: string;
};
export const issueSchema: Schema = {
id: "TEXT UNIQUE",
name: "TEXT",
state_id: "TEXT",
sort_order: "REAL",
completed_at: "TEXT",
estimate_point: "REAL",
priority: "TEXT",
priority_proxy: "INTEGER",
start_date: "TEXT",
target_date: "TEXT",
sequence_id: "INTEGER",
project_id: "TEXT",
parent_id: "TEXT",
created_at: "TEXT",
updated_at: "TEXT",
created_by: "TEXT",
updated_by: "TEXT",
is_draft: "INTEGER",
archived_at: "TEXT",
state__group: "TEXT",
sub_issues_count: "INTEGER",
cycle_id: "TEXT",
link_count: "INTEGER",
attachment_count: "INTEGER",
type_id: "TEXT",
label_ids: "TEXT",
assignee_ids: "TEXT",
module_ids: "TEXT",
description_html: "TEXT",
};
export const issueMetaSchema: Schema = {
issue_id: "TEXT",
key: "TEXT",
value: "TEXT",
};
export const moduleSchema: Schema = {
id: "TEXT UNIQUE",
workspace_id: "TEXT",
project_id: "TEXT",
name: "TEXT",
description: "TEXT",
description_text: "TEXT",
description_html: "TEXT",
start_date: "TEXT",
target_date: "TEXT",
status: "TEXT",
lead_id: "TEXT",
member_ids: "TEXT",
view_props: "TEXT",
sort_order: "INTEGER",
external_source: "TEXT",
external_id: "TEXT",
logo_props: "TEXT",
total_issues: "INTEGER",
cancelled_issues: "INTEGER",
completed_issues: "INTEGER",
started_issues: "INTEGER",
unstarted_issues: "INTEGER",
backlog_issues: "INTEGER",
created_at: "TEXT",
updated_at: "TEXT",
archived_at: "TEXT",
};
export const labelSchema: Schema = {
id: "TEXT UNIQUE",
name: "TEXT",
color: "TEXT",
parent: "TEXT",
project_id: "TEXT",
workspace_id: "TEXT",
sort_order: "INTEGER",
};
export const cycleSchema: Schema = {
id: "TEXT UNIQUE",
workspace_id: "TEXT",
project_id: "TEXT",
name: "TEXT",
description: "TEXT",
start_date: "TEXT",
end_date: "TEXT",
owned_by_id: "TEXT",
view_props: "TEXT",
sort_order: "INTEGER",
external_source: "TEXT",
external_id: "TEXT",
progress_snapshot: "TEXT",
logo_props: "TEXT",
total_issues: "INTEGER",
cancelled_issues: "INTEGER",
completed_issues: "INTEGER",
started_issues: "INTEGER",
unstarted_issues: "INTEGER",
backlog_issues: "INTEGER",
};
export const stateSchema: Schema = {
id: "TEXT UNIQUE",
project_id: "TEXT",
workspace_id: "TEXT",
name: "TEXT",
color: "TEXT",
group: "TEXT",
default: "BOOLEAN",
description: "TEXT",
sequence: "INTEGER",
};
export const estimatePointSchema: Schema = {
id: "TEXT UNIQUE",
key: "TEXT",
value: "REAL",
};
export const memberSchema: Schema = {
id: "TEXT UNIQUE",
first_name: "TEXT",
last_name: "TEXT",
avatar: "TEXT",
is_bot: "BOOLEAN",
display_name: "TEXT",
email: "TEXT",
};
export const optionsSchema: Schema = {
key: "TEXT UNIQUE",
value: "TEXT",
};

View File

@@ -0,0 +1,39 @@
import { persistence } from "../storage.sqlite";
import {
labelSchema,
moduleSchema,
Schema,
issueMetaSchema,
issueSchema,
stateSchema,
cycleSchema,
estimatePointSchema,
memberSchema,
optionsSchema,
} from "./schemas";
const createTableSQLfromSchema = (tableName: string, schema: Schema) => {
let sql = `CREATE TABLE IF NOT EXISTS ${tableName} (`;
sql += Object.keys(schema)
.map((key) => `'${key}' ${schema[key]}`)
.join(", ");
sql += `);`;
return sql;
};
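As a sketch of the SQL that `createTableSQLfromSchema` emits, here is the same generator applied to a small schema. Quoting each column name is what lets reserved words like `group` and `default` (used in `stateSchema`) be legal SQLite column names.

```typescript
type Schema = { [key: string]: string };

// Mirrors the generator above: each schema key becomes a quoted column name
// followed by its SQLite type/constraint string.
const createTableSQL = (tableName: string, schema: Schema): string => {
  const columns = Object.keys(schema)
    .map((key) => `'${key}' ${schema[key]}`)
    .join(", ");
  return `CREATE TABLE IF NOT EXISTS ${tableName} (${columns});`;
};

const sql = createTableSQL("labels", { id: "TEXT UNIQUE", name: "TEXT", sort_order: "INTEGER" });
// → CREATE TABLE IF NOT EXISTS labels ('id' TEXT UNIQUE, 'name' TEXT, 'sort_order' INTEGER);
```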
export const createTables = async () => {
persistence.db.exec("BEGIN TRANSACTION;");
persistence.db.exec(createTableSQLfromSchema("issues", issueSchema));
persistence.db.exec(createTableSQLfromSchema("issue_meta", issueMetaSchema));
persistence.db.exec(createTableSQLfromSchema("modules", moduleSchema));
persistence.db.exec(createTableSQLfromSchema("labels", labelSchema));
persistence.db.exec(createTableSQLfromSchema("states", stateSchema));
persistence.db.exec(createTableSQLfromSchema("cycles", cycleSchema));
persistence.db.exec(createTableSQLfromSchema("estimate_points", estimatePointSchema));
persistence.db.exec(createTableSQLfromSchema("members", memberSchema));
persistence.db.exec(createTableSQLfromSchema("options", optionsSchema));
persistence.db.exec("COMMIT;");
};

View File

@@ -0,0 +1,134 @@
import pick from "lodash/pick";
import { TIssue } from "@plane/types";
import { rootStore } from "@/lib/store-context";
import { updateIssue } from "./load-issues";
export const log = console.log;
// export const log = () => {};
export const updatePersistentLayer = async (issueIds: string | string[]) => {
if (typeof issueIds === "string") {
issueIds = [issueIds];
}
issueIds.forEach((issueId) => {
const issue = rootStore.issue.issues.getIssueById(issueId);
if (issue) {
const issuePartial = pick(JSON.parse(JSON.stringify(issue)), [
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"created_at",
"updated_at",
"created_by",
"updated_by",
"is_draft",
"archived_at",
"state__group",
"cycle_id",
"link_count",
"attachment_count",
"sub_issues_count",
"assignee_ids",
"label_ids",
"module_ids",
"type_id",
]);
updateIssue(issuePartial);
}
});
};
export const wrapDateTime = (field: string) => {
const DATE_TIME_FIELDS = ["created_at", "updated_at", "completed_at", "start_date", "target_date"];
if (DATE_TIME_FIELDS.includes(field)) {
return `datetime(${field})`;
}
return field;
};
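The date-ish columns are stored as `TEXT`, so `wrapDateTime` routes them through SQLite's `datetime()` to make comparisons chronological regardless of the stored string format. The `orderByClause` helper below is a hypothetical caller, sketched only to show where the wrapper slots into a query; the secondary sort on `sequence_id` mirrors the "Add secondary sort order" change in this PR.

```typescript
const DATE_TIME_FIELDS = ["created_at", "updated_at", "completed_at", "start_date", "target_date"];

// Date fields get wrapped in datetime(); everything else passes through.
const wrapDateTime = (field: string): string =>
  DATE_TIME_FIELDS.includes(field) ? `datetime(${field})` : field;

// Hypothetical order-by builder, not the PR's actual query constructor.
const orderByClause = (field: string, direction: "ASC" | "DESC"): string =>
  // Secondary sort keeps ordering stable when the primary values are equal.
  `ORDER BY ${wrapDateTime(field)} ${direction}, i.sequence_id ASC`;

// orderByClause("target_date", "ASC")
// → "ORDER BY datetime(target_date) ASC, i.sequence_id ASC"
```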
export const getGroupedIssueResults = (issueResults: (TIssue & { group_id: string; total_issues: number })[]): any => {
const groupedResults: {
[key: string]: {
results: TIssue[];
total_results: number;
};
} = {};
for (const issue of issueResults) {
const { group_id, total_issues } = issue;
const groupId = group_id ? group_id : "None";
if (groupedResults?.[groupId] !== undefined && Array.isArray(groupedResults?.[groupId]?.results)) {
groupedResults?.[groupId]?.results.push(issue);
} else {
groupedResults[groupId] = { results: [issue], total_results: total_issues };
}
}
return groupedResults;
};
export const getSubGroupedIssueResults = (
issueResults: (TIssue & { group_id: string; total_issues: number; sub_group_id: string })[]
): any => {
const subGroupedResults: {
[key: string]: {
results: {
[key: string]: {
results: TIssue[];
total_results: number;
};
};
total_results: number;
};
} = {};
for (const issue of issueResults) {
const { group_id, total_issues, sub_group_id } = issue;
const groupId = group_id ? group_id : "None";
const subGroupId = sub_group_id ? sub_group_id : "None";
if (subGroupedResults?.[groupId] === undefined) {
subGroupedResults[groupId] = { results: {}, total_results: 0 };
}
if (
subGroupedResults[groupId].results[subGroupId] !== undefined &&
Array.isArray(subGroupedResults[groupId].results[subGroupId]?.results)
) {
subGroupedResults[groupId].results[subGroupId]?.results.push(issue);
} else {
subGroupedResults[groupId].results[subGroupId] = { results: [issue], total_results: total_issues };
}
}
const groupByKeys = Object.keys(subGroupedResults);
for (const groupByKey of groupByKeys) {
let totalIssues = 0;
const groupedResults = subGroupedResults[groupByKey]?.results ?? {};
const subGroupByKeys = Object.keys(groupedResults);
for (const subGroupByKey of subGroupByKeys) {
const subGroupedResultsCount = groupedResults[subGroupByKey].total_results ?? 0;
totalIssues += subGroupedResultsCount;
}
subGroupedResults[groupByKey].total_results = totalIssues;
}
return subGroupedResults;
};
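The final loop in `getSubGroupedIssueResults` derives each group's total after bucketing: every row's `total_issues` already counts its whole sub-group, so a group's total is just the sum over its sub-groups. A minimal sketch of that roll-up:

```typescript
// Sum the precomputed sub-group totals to get the parent group's total.
const rollUpGroupTotal = (subGroups: { [key: string]: { total_results: number } }): number =>
  Object.values(subGroups).reduce((sum, subGroup) => sum + subGroup.total_results, 0);

// rollUpGroupTotal({ urgent: { total_results: 3 }, low: { total_results: 2 } }) → 5
```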
export const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

View File

@@ -1,16 +1,20 @@
// types
import type {
TIssue,
IIssueDisplayProperties,
TIssueLink,
TIssueSubIssues,
TIssueActivity,
TIssuesResponse,
TBulkOperationsPayload,
TIssue,
TIssueActivity,
TIssueLink,
TIssuesResponse,
TIssueSubIssues,
} from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { persistence } from "@/local-db/storage.sqlite";
// services
import { addIssue, addIssuesBulk, deleteIssueFromLocal } from "@/local-db/utils/load-issues";
import { updatePersistentLayer } from "@/local-db/utils/utils";
import { APIService } from "@/services/api.service";
export class IssueService extends APIService {
@@ -20,13 +24,21 @@ export class IssueService extends APIService {
async createIssue(workspaceSlug: string, projectId: string, data: Partial<TIssue>): Promise<TIssue> {
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/`, data)
.then((response) => response?.data)
.then((response) => {
updatePersistentLayer(response?.data?.id);
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});
}
async getIssues(workspaceSlug: string, projectId: string, queries?: any, config = {}): Promise<TIssuesResponse> {
async getIssuesFromServer(
workspaceSlug: string,
projectId: string,
queries?: any,
config = {}
): Promise<TIssuesResponse> {
return this.get(
`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/`,
{
@@ -40,6 +52,41 @@ export class IssueService extends APIService {
});
}
async getIssuesForSync(
workspaceSlug: string,
projectId: string,
queries?: any,
config = {}
): Promise<TIssuesResponse> {
queries = { ...queries, project_id: projectId };
return this.get(
`/api/workspaces/${workspaceSlug}/v2/issues/`,
{
params: queries,
},
config
)
.then((response) => response?.data)
.catch((error) => {
throw error?.response?.data;
});
}
async getIssues(workspaceSlug: string, projectId: string, queries?: any, config = {}): Promise<TIssuesResponse> {
const response = await persistence.getIssues(workspaceSlug, projectId, queries, config);
return response as TIssuesResponse;
}
async getDeletedIssues(workspaceSlug: string, projectId: string, queries?: any): Promise<TIssuesResponse> {
return this.get(`/api/workspaces/${workspaceSlug}/projects/${projectId}/deleted-issues/`, {
params: queries,
})
.then((response) => response?.data)
.catch((error) => {
throw error?.response?.data;
});
}
async getIssuesWithParams(
workspaceSlug: string,
projectId: string,
@@ -58,7 +105,12 @@ export class IssueService extends APIService {
return this.get(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/`, {
params: queries,
})
.then((response) => response?.data)
.then((response) => {
if (response.data) {
addIssue(response?.data);
}
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});
@@ -68,7 +120,12 @@ export class IssueService extends APIService {
return this.get(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/list/`, {
params: { issues: issueIds.join(",") },
})
.then((response) => response?.data)
.then((response) => {
if (response?.data && Array.isArray(response?.data)) {
addIssuesBulk(response.data);
}
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});
@@ -90,6 +147,7 @@ export class IssueService extends APIService {
issues: string[];
}
) {
updatePersistentLayer(data.issues);
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/cycles/${cycleId}/cycle-issues/`, data)
.then((response) => response?.data)
.catch((error) => {
@@ -119,6 +177,7 @@ export class IssueService extends APIService {
relation?: "blocking" | null;
}
) {
updatePersistentLayer(issueId);
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/issue-relation/`, data)
.then((response) => response?.data)
.catch((error) => {
@@ -159,6 +218,7 @@ export class IssueService extends APIService {
}
async patchIssue(workspaceSlug: string, projectId: string, issueId: string, data: Partial<TIssue>): Promise<any> {
updatePersistentLayer(issueId);
return this.patch(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/`, data)
.then((response) => response?.data)
.catch((error) => {
@@ -167,6 +227,7 @@ export class IssueService extends APIService {
}
async deleteIssue(workspaceSlug: string, projectId: string, issueId: string): Promise<any> {
deleteIssueFromLocal(issueId);
return this.delete(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/`)
.then((response) => response?.data)
.catch((error) => {
@@ -188,6 +249,7 @@ export class IssueService extends APIService {
issueId: string,
data: { sub_issue_ids: string[] }
): Promise<TIssueSubIssues> {
updatePersistentLayer([issueId, ...data.sub_issue_ids]);
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/sub-issues/`, data)
.then((response) => response?.data)
.catch((error) => {
@@ -209,6 +271,7 @@ export class IssueService extends APIService {
issueId: string,
data: Partial<TIssueLink>
): Promise<TIssueLink> {
updatePersistentLayer(issueId);
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/issue-links/`, data)
.then((response) => response?.data)
.catch((error) => {
@@ -223,6 +286,7 @@ export class IssueService extends APIService {
linkId: string,
data: Partial<TIssueLink>
): Promise<TIssueLink> {
updatePersistentLayer(issueId);
return this.patch(
`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/issue-links/${linkId}/`,
data
@@ -234,6 +298,7 @@ export class IssueService extends APIService {
}
async deleteIssueLink(workspaceSlug: string, projectId: string, issueId: string, linkId: string): Promise<any> {
updatePersistentLayer(issueId);
return this.delete(
`/api/workspaces/${workspaceSlug}/projects/${projectId}/issues/${issueId}/issue-links/${linkId}/`
)
@@ -245,7 +310,10 @@ export class IssueService extends APIService {
async bulkOperations(workspaceSlug: string, projectId: string, data: TBulkOperationsPayload): Promise<any> {
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/bulk-operation-issues/`, data)
.then((response) => response?.data)
.then((response) => {
persistence.syncIssues(projectId);
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});
@@ -259,7 +327,10 @@ export class IssueService extends APIService {
}
): Promise<any> {
return this.delete(`/api/workspaces/${workspaceSlug}/projects/${projectId}/bulk-delete-issues/`, data)
.then((response) => response?.data)
.then((response) => {
persistence.syncIssues(projectId);
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});
@@ -275,7 +346,10 @@ export class IssueService extends APIService {
archived_at: string;
}> {
return this.post(`/api/workspaces/${workspaceSlug}/projects/${projectId}/bulk-archive-issues/`, data)
.then((response) => response?.data)
.then((response) => {
persistence.syncIssues(projectId);
return response?.data;
})
.catch((error) => {
throw error?.response?.data;
});

View File

@@ -106,7 +106,7 @@ export class ArchivedIssues extends BaseIssuesStore implements IArchivedIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId);
this.onfetchIssues(response, options, workspaceSlug, projectId, undefined, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -1,6 +1,4 @@
import isArray from "lodash/isArray";
import isEmpty from "lodash/isEmpty";
import pickBy from "lodash/pickBy";
import set from "lodash/set";
import { action, computed, makeObservable, observable, runInAction } from "mobx";
// base class
@@ -191,12 +189,10 @@ export class CycleIssuesFilter extends IssueFilterHelperStore implements ICycleI
});
});
const appliedFilters = _filters.filters || {};
const filteredFilters = pickBy(appliedFilters, (value) => value && isArray(value) && value.length > 0);
this.rootIssueStore.cycleIssues.fetchIssuesWithExistingPagination(
workspaceSlug,
projectId,
isEmpty(filteredFilters) ? "init-loader" : "mutation",
"mutation",
cycleId
);
await this.issueFilterService.patchCycleIssueFilters(workspaceSlug, projectId, cycleId, {
@@ -237,6 +233,10 @@ export class CycleIssuesFilter extends IssueFilterHelperStore implements ICycleI
});
});
if (this.getShouldClearIssues(updatedDisplayFilters)) {
this.rootIssueStore.cycleIssues.clear(true, true); // clear issues for local store when some filters like layout changes
}
if (this.getShouldReFetchIssues(updatedDisplayFilters)) {
this.rootIssueStore.cycleIssues.fetchIssuesWithExistingPagination(
workspaceSlug,

View File

@@ -179,8 +179,9 @@ export class CycleIssues extends BaseIssuesStore implements ICycleIssues {
// set loader and clear store
runInAction(() => {
this.setLoader(loadType);
this.clear(!isExistingPaginationOptions, false); // clear while fetching from server.
if (!this.groupBy) this.clear(!isExistingPaginationOptions, true); // clear while using local to have the no load effect.
});
this.clear(!isExistingPaginationOptions);
// get params from pagination options
const params = this.issueFilterStore?.getFilterParams(options, cycleId, undefined, undefined, undefined);
@@ -190,7 +191,7 @@ export class CycleIssues extends BaseIssuesStore implements ICycleIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId, cycleId);
this.onfetchIssues(response, options, workspaceSlug, projectId, cycleId, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined once errored out
@@ -233,7 +234,7 @@ export class CycleIssues extends BaseIssuesStore implements ICycleIssues {
subGroupId
);
// call the fetch issues API with the params for next page in issues
const response = await this.issueService.getIssues(workspaceSlug, projectId, cycleId, params);
const response = await this.issueService.getIssues(workspaceSlug, projectId, params);
// after the next page of issues are fetched, call the base method to process the response
this.onfetchNexIssues(response, groupId, subGroupId);

View File

@@ -103,7 +103,7 @@ export class DraftIssues extends BaseIssuesStore implements IDraftIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId);
this.onfetchIssues(response, options, workspaceSlug, projectId, undefined, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -65,6 +65,7 @@ export interface IBaseIssuesStore {
//actions
removeIssue: (workspaceSlug: string, projectId: string, issueId: string) => Promise<void>;
clear(shouldClearPaginationOptions?: boolean, clearForLocal?: boolean): void;
// helper methods
getIssueIds: (groupId?: string, subGroupId?: string) => string[] | undefined;
issuesSortWithOrderBy(issueIds: string[], key: Partial<TIssueOrderByOptions>): string[];
@@ -455,7 +456,8 @@ export abstract class BaseIssuesStore implements IBaseIssuesStore {
options: IssuePaginationOptions,
workspaceSlug: string,
projectId?: string,
id?: string
id?: string,
shouldClearPaginationOptions = true
) {
// Process the Issue Response to get the following data from it
const { issueList, groupedIssues, groupedIssueCount } = this.processIssueResponse(issuesResponse);
@@ -465,6 +467,7 @@ export abstract class BaseIssuesStore implements IBaseIssuesStore {
// Update all the GroupIds to this Store's groupedIssueIds and update Individual group issue counts
runInAction(() => {
this.clear(shouldClearPaginationOptions, true);
this.updateGroupedIssueIds(groupedIssues, groupedIssueCount);
this.loader[getGroupKey()] = undefined;
});
@@ -1139,17 +1142,22 @@ export abstract class BaseIssuesStore implements IBaseIssuesStore {
/**
* Method called to clear out the current store
*/
clear(shouldClearPaginationOptions = true) {
runInAction(() => {
this.groupedIssueIds = undefined;
this.issuePaginationData = {};
this.groupedIssueCount = {};
if (shouldClearPaginationOptions) {
this.paginationOptions = undefined;
}
});
this.controller.abort();
this.controller = new AbortController();
clear(shouldClearPaginationOptions = true, clearForLocal = false) {
if (
(this.rootIssueStore.rootStore.user?.localDBEnabled && clearForLocal) ||
(!this.rootIssueStore.rootStore.user?.localDBEnabled && !clearForLocal)
) {
runInAction(() => {
this.groupedIssueIds = undefined;
this.issuePaginationData = {};
this.groupedIssueCount = {};
if (shouldClearPaginationOptions) {
this.paginationOptions = undefined;
}
});
this.controller.abort();
this.controller = new AbortController();
}
}
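The double condition guarding `clear()` reduces to "only clear on the side that owns the data": with the local DB enabled, only `clearForLocal` calls go through (so switching filters never blanks the list while SQLite answers), and with it disabled, only the server-path calls (`clearForLocal = false`) do. A sketch of just that predicate:

```typescript
// True exactly when the caller's intent matches the active data source.
const shouldClear = (localDBEnabled: boolean, clearForLocal: boolean): boolean =>
  (localDBEnabled && clearForLocal) || (!localDBEnabled && !clearForLocal);

// Equivalent to: localDBEnabled === clearForLocal.
```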
/**
@@ -1694,7 +1702,7 @@ export abstract class BaseIssuesStore implements IBaseIssuesStore {
}
}
return isDataIdsArray ? (order ? orderBy(dataValues, undefined, [order])[0] : dataValues) : dataValues[0];
return isDataIdsArray ? (order ? orderBy(dataValues, undefined, [order]) : dataValues) : dataValues;
}
issuesSortWithOrderBy = (issueIds: string[], key: TIssueOrderByOptions | undefined): string[] => {

View File

@@ -267,6 +267,20 @@ export class IssueFilterHelperStore implements IIssueFilterHelperStore {
);
};
/**
* This method returns true if the changed display filters require the cached issue list to be cleared (e.g. a layout change)
* @param displayFilters
* @returns boolean
*/
getShouldClearIssues = (displayFilters: IIssueDisplayFilterOptions) => {
const NON_SERVER_DISPLAY_FILTERS = ["layout"];
const displayFilterKeys = Object.keys(displayFilters);
return NON_SERVER_DISPLAY_FILTERS.some((nonServerDisplayFilter: string) =>
displayFilterKeys.includes(nonServerDisplayFilter)
);
};
/**
* This Method is used to construct the url params along with paginated values
* @param filterParams params generated from filters

View File

@@ -2,6 +2,8 @@ import { makeObservable, observable } from "mobx";
import { computedFn } from "mobx-utils";
// types
import { TIssue } from "@plane/types";
// local
import { persistence } from "@/local-db/storage.sqlite";
// services
import { IssueArchiveService, IssueDraftService, IssueService } from "@/services/issue";
// types
@@ -33,12 +35,14 @@ export interface IIssueStoreActions {
export interface IIssueStore extends IIssueStoreActions {
isFetchingIssueDetails: boolean;
isLocalDBIssueDescription: boolean;
// helper methods
getIssueById: (issueId: string) => TIssue | undefined;
}
export class IssueStore implements IIssueStore {
isFetchingIssueDetails: boolean = false;
isLocalDBIssueDescription: boolean = false;
// root store
rootIssueDetailStore: IIssueDetail;
// services
@@ -72,8 +76,16 @@ export class IssueStore implements IIssueStore {
let issue: TIssue | undefined;
// fetch issue from local db
issue = await persistence.getIssue(issueId);
this.isFetchingIssueDetails = true;
if (issue) {
this.addIssueToStore(issue);
this.isLocalDBIssueDescription = true;
}
if (issueType === "ARCHIVED")
issue = await this.issueArchiveService.retrieveArchivedIssue(workspaceSlug, projectId, issueId, query);
else if (issueType === "DRAFT")
@@ -82,7 +94,10 @@ export class IssueStore implements IIssueStore {
if (!issue) throw new Error("Issue not found");
this.addIssueToStore(issue);
const issuePayload = this.addIssueToStore(issue);
this.isLocalDBIssueDescription = false;
this.rootIssueDetailStore.rootIssueStore.issues.addIssue([issuePayload]);
// store handlers from issue detail
// parent

View File

@@ -1,6 +1,4 @@
import isArray from "lodash/isArray";
import isEmpty from "lodash/isEmpty";
import pickBy from "lodash/pickBy";
import set from "lodash/set";
import { action, computed, makeObservable, observable, runInAction } from "mobx";
// base class
@@ -194,12 +192,10 @@ export class ModuleIssuesFilter extends IssueFilterHelperStore implements IModul
set(this.filters, [moduleId, "filters", _key], updatedFilters[_key as keyof IIssueFilterOptions]);
});
});
const appliedFilters = _filters.filters || {};
const filteredFilters = pickBy(appliedFilters, (value) => value && isArray(value) && value.length > 0);
this.rootIssueStore.moduleIssues.fetchIssuesWithExistingPagination(
workspaceSlug,
projectId,
isEmpty(filteredFilters) ? "init-loader" : "mutation",
"mutation",
moduleId
);
await this.issueFilterService.patchModuleIssueFilters(workspaceSlug, projectId, moduleId, {
@@ -240,6 +236,10 @@ export class ModuleIssuesFilter extends IssueFilterHelperStore implements IModul
});
});
if (this.getShouldClearIssues(updatedDisplayFilters)) {
this.rootIssueStore.moduleIssues.clear(true, true); // clear issues for local store when some filters like layout changes
}
if (this.getShouldReFetchIssues(updatedDisplayFilters)) {
this.rootIssueStore.moduleIssues.fetchIssuesWithExistingPagination(
workspaceSlug,

View File

@@ -136,8 +136,9 @@ export class ModuleIssues extends BaseIssuesStore implements IModuleIssues {
// set loader and clear store
runInAction(() => {
this.setLoader(loadType);
this.clear(!isExistingPaginationOptions, false); // clear while fetching from server.
if (!this.groupBy) this.clear(!isExistingPaginationOptions, true); // clear while using local to have the no load effect.
});
this.clear(!isExistingPaginationOptions);
// get params from pagination options
const params = this.issueFilterStore?.getFilterParams(options, moduleId, undefined, undefined, undefined);
@@ -147,7 +148,7 @@ export class ModuleIssues extends BaseIssuesStore implements IModuleIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId, moduleId);
this.onfetchIssues(response, options, workspaceSlug, projectId, moduleId, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined once errored out

View File

@@ -1,6 +1,4 @@
import isArray from "lodash/isArray";
import isEmpty from "lodash/isEmpty";
import pickBy from "lodash/pickBy";
import set from "lodash/set";
import { action, computed, makeObservable, observable, runInAction } from "mobx";
// base class
@@ -180,13 +178,7 @@ export class ProfileIssuesFilter extends IssueFilterHelperStore implements IProf
});
});
const appliedFilters = _filters.filters || {};
const filteredFilters = pickBy(appliedFilters, (value) => value && isArray(value) && value.length > 0);
this.rootIssueStore.profileIssues.fetchIssuesWithExistingPagination(
workspaceSlug,
userId,
isEmpty(filteredFilters) ? "init-loader" : "mutation"
);
this.rootIssueStore.profileIssues.fetchIssuesWithExistingPagination(workspaceSlug, userId, "mutation");
this.handleIssuesLocalFilters.set(EIssuesStoreType.PROFILE, type, workspaceSlug, userId, undefined, {
filters: _filters.filters,

View File

@@ -140,7 +140,7 @@ export class ProfileIssues extends BaseIssuesStore implements IProfileIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug);
this.onfetchIssues(response, options, workspaceSlug, undefined, undefined, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -1,6 +1,4 @@
import isArray from "lodash/isArray";
import isEmpty from "lodash/isEmpty";
import pickBy from "lodash/pickBy";
import set from "lodash/set";
import { action, computed, makeObservable, observable, runInAction } from "mobx";
// base class
@@ -188,13 +186,11 @@ export class ProjectViewIssuesFilter extends IssueFilterHelperStore implements I
});
});
const appliedFilters = _filters.filters || {};
const filteredFilters = pickBy(appliedFilters, (value) => value && isArray(value) && value.length > 0);
this.rootIssueStore.projectViewIssues.fetchIssuesWithExistingPagination(
workspaceSlug,
projectId,
viewId,
isEmpty(filteredFilters) ? "init-loader" : "mutation"
"mutation"
);
break;
}
@@ -231,6 +227,10 @@ export class ProjectViewIssuesFilter extends IssueFilterHelperStore implements I
});
});
if (this.getShouldClearIssues(updatedDisplayFilters)) {
this.rootIssueStore.projectIssues.clear(true, true); // clear issues for local store when some filters like layout changes
}
if (this.getShouldReFetchIssues(updatedDisplayFilters)) {
this.rootIssueStore.projectViewIssues.fetchIssuesWithExistingPagination(
workspaceSlug,

View File

@@ -93,8 +93,9 @@ export class ProjectViewIssues extends BaseIssuesStore implements IProjectViewIs
// set loader and clear store
runInAction(() => {
this.setLoader(loadType);
this.clear(!isExistingPaginationOptions, false); // clear while fetching from server.
if (!this.groupBy) this.clear(!isExistingPaginationOptions, true); // clear while using local to have the no load effect.
});
this.clear(!isExistingPaginationOptions);
// get params from pagination options
const params = this.issueFilterStore?.getFilterParams(options, viewId, undefined, undefined, undefined);
@@ -104,7 +105,7 @@ export class ProjectViewIssues extends BaseIssuesStore implements IProjectViewIs
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId);
this.onfetchIssues(response, options, workspaceSlug, projectId, viewId, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -1,6 +1,4 @@
import isArray from "lodash/isArray";
import isEmpty from "lodash/isEmpty";
import pickBy from "lodash/pickBy";
import set from "lodash/set";
import { action, computed, makeObservable, observable, runInAction } from "mobx";
// base class
@@ -183,13 +181,7 @@ export class ProjectIssuesFilter extends IssueFilterHelperStore implements IProj
});
});
const appliedFilters = _filters.filters || {};
const filteredFilters = pickBy(appliedFilters, (value) => value && isArray(value) && value.length > 0);
this.rootIssueStore.projectIssues.fetchIssuesWithExistingPagination(
workspaceSlug,
projectId,
isEmpty(filteredFilters) ? "init-loader" : "mutation"
);
this.rootIssueStore.projectIssues.fetchIssuesWithExistingPagination(workspaceSlug, projectId, "mutation");
await this.issueFilterService.patchProjectIssueFilters(workspaceSlug, projectId, {
filters: _filters.filters,
});
@@ -228,6 +220,10 @@ export class ProjectIssuesFilter extends IssueFilterHelperStore implements IProj
});
});
if (this.getShouldClearIssues(updatedDisplayFilters)) {
this.rootIssueStore.projectIssues.clear(true, true); // clear issues for local store when some filters like layout changes
}
if (this.getShouldReFetchIssues(updatedDisplayFilters)) {
this.rootIssueStore.projectIssues.fetchIssuesWithExistingPagination(workspaceSlug, projectId, "mutation");
}

View File

@@ -101,8 +101,9 @@ export class ProjectIssues extends BaseIssuesStore implements IProjectIssues {
// set loader and clear store
runInAction(() => {
this.setLoader(loadType);
this.clear(!isExistingPaginationOptions, false); // clear while fetching from server.
if (!this.groupBy) this.clear(!isExistingPaginationOptions, true); // clear while using local to have the no load effect.
});
this.clear(!isExistingPaginationOptions);
// get params from pagination options
const params = this.issueFilterStore?.getFilterParams(options, projectId, undefined, undefined, undefined);
@@ -112,7 +113,7 @@ export class ProjectIssues extends BaseIssuesStore implements IProjectIssues {
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug, projectId);
this.onfetchIssues(response, options, workspaceSlug, projectId, undefined, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -109,7 +109,7 @@ export class WorkspaceIssues extends BaseIssuesStore implements IWorkspaceIssues
});
// after fetching issues, call the base method to process the response further
this.onfetchIssues(response, options, workspaceSlug);
this.onfetchIssues(response, options, workspaceSlug, undefined, undefined, !isExistingPaginationOptions);
return response;
} catch (error) {
// set loader to undefined if errored out

View File

@@ -39,6 +39,7 @@ export interface IModuleStore {
updateModuleDistribution: (distributionUpdates: DistributionUpdates, moduleId: string) => void;
fetchWorkspaceModules: (workspaceSlug: string) => Promise<IModule[]>;
fetchModules: (workspaceSlug: string, projectId: string) => Promise<undefined | IModule[]>;
fetchModulesSlim: (workspaceSlug: string, projectId: string) => Promise<undefined | IModule[]>;
fetchArchivedModules: (workspaceSlug: string, projectId: string) => Promise<undefined | IModule[]>;
fetchArchivedModuleDetails: (workspaceSlug: string, projectId: string, moduleId: string) => Promise<IModule>;
fetchModuleDetails: (workspaceSlug: string, projectId: string, moduleId: string) => Promise<IModule>;
@@ -281,6 +282,32 @@ export class ModulesStore implements IModuleStore {
}
};
/**
* @description fetch a project's modules via the lighter workspace modules endpoint
* @param workspaceSlug
* @param projectId
* @returns IModule[]
*/
fetchModulesSlim = async (workspaceSlug: string, projectId: string) => {
try {
this.loader = true;
const response = await this.moduleService.getWorkspaceModules(workspaceSlug);
const projectModules = response.filter((module) => module.project_id === projectId);
runInAction(() => {
projectModules.forEach((module) => {
set(this.moduleMap, [module.id], { ...this.moduleMap[module.id], ...module });
});
set(this.fetchedMap, projectId, true);
this.loader = false;
});
return projectModules;
} catch (error) {
this.loader = false;
return undefined;
}
};
/**
* @description fetch all archived modules
* @param workspaceSlug

View File

@@ -7,6 +7,8 @@ import { TUserPermissions } from "@plane/types/src/enums";
// constants
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// local
import { persistence } from "@/local-db/storage.sqlite";
import { EUserPermissions } from "@/plane-web/constants/user-permissions";
// services
import { AuthService } from "@/services/auth.service";
@@ -17,6 +19,7 @@ import { IAccountStore } from "@/store/user/account.store";
import { ProfileStore, IUserProfileStore } from "@/store/user/profile.store";
import { IUserPermissionStore, UserPermissionStore } from "./permissions.store";
import { IUserSettingsStore, UserSettingsStore } from "./settings.store";
import { ENABLE_LOCAL_DB_CACHE } from "@/plane-web/constants/issues";
type TUserErrorStatus = {
status: string;
@@ -42,6 +45,7 @@ export interface IUserStore {
reset: () => void;
signOut: () => Promise<void>;
// computed
localDBEnabled: boolean;
canPerformAnyCreateAction: boolean;
projectsWithCreatePermissions: { [projectId: string]: number } | null;
}
@@ -91,6 +95,8 @@ export class UserStore implements IUserStore {
// computed
canPerformAnyCreateAction: computed,
projectsWithCreatePermissions: computed,
localDBEnabled: computed,
});
}
@@ -226,6 +232,7 @@ export class UserStore implements IUserStore {
*/
signOut = async (): Promise<void> => {
await this.authService.signOut(API_BASE_URL);
await persistence.clearStorage();
this.store.resetOnSignOut();
};
@@ -269,4 +276,8 @@ export class UserStore implements IUserStore {
const filteredProjects = this.fetchProjectsWithCreatePermissions();
return filteredProjects ? Object.keys(filteredProjects).length > 0 : false;
}
get localDBEnabled() {
return ENABLE_LOCAL_DB_CACHE && this.userSettings.canUseLocalDB;
}
}
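The `localDBEnabled` computed gates the cache behind two switches: the build-time flag `ENABLE_LOCAL_DB_CACHE` and the user's own `canUseLocalDB` setting. A sketch of that gate as a standalone function (the function itself is illustrative; the store exposes it as a MobX computed):

```typescript
// Two-level gate behind `localDBEnabled`: the local-db cache is active
// only when the build-time feature flag AND the user's setting agree.
function isLocalDBEnabled(featureFlag: boolean, userSetting: boolean): boolean {
  return featureFlag && userSetting;
}
```

Either switch alone can disable the cache, so a rollout can be killed globally via the flag without touching per-user settings.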

View File

@@ -1,5 +1,9 @@
import { action, makeObservable, observable, runInAction } from "mobx";
import { IUserSettings } from "@plane/types";
// hooks
import { getValueFromLocalStorage, setValueIntoLocalStorage } from "@/hooks/use-local-storage";
// local
import { persistence } from "@/local-db/storage.sqlite";
// services
import { UserService } from "@/services/user.service";
@@ -8,13 +12,17 @@ type TError = {
message: string;
};
const LOCAL_DB_ENABLED = "LOCAL_DB_ENABLED";
export interface IUserSettingsStore {
// observables
isLoading: boolean;
error: TError | undefined;
data: IUserSettings;
canUseLocalDB: boolean;
// actions
fetchCurrentUserSettings: () => Promise<IUserSettings | undefined>;
toggleLocalDB: () => Promise<void>;
}
export class UserSettingsStore implements IUserSettingsStore {
@@ -32,6 +40,7 @@ export class UserSettingsStore implements IUserSettingsStore {
invites: undefined,
},
};
canUseLocalDB: boolean = getValueFromLocalStorage(LOCAL_DB_ENABLED, true);
// services
userService: UserService;
@@ -41,13 +50,37 @@ export class UserSettingsStore implements IUserSettingsStore {
isLoading: observable.ref,
error: observable,
data: observable,
canUseLocalDB: observable.ref,
// actions
fetchCurrentUserSettings: action,
toggleLocalDB: action,
});
// services
this.userService = new UserService();
}
toggleLocalDB = async () => {
const currentLocalDBValue = this.canUseLocalDB;
try {
runInAction(() => {
this.canUseLocalDB = !currentLocalDBValue;
});
const transactionResult = setValueIntoLocalStorage(LOCAL_DB_ENABLED, !currentLocalDBValue);
if (!transactionResult) {
throw new Error("error while toggling local DB");
} else if (currentLocalDBValue) {
await persistence.clearStorage();
}
} catch (e) {
console.warn("error while toggling local DB", e);
runInAction(() => {
this.canUseLocalDB = currentLocalDBValue;
});
}
};
// actions
/**
* @description fetches user profile information

View File

@@ -20,6 +20,8 @@ const nextConfig = {
key: "Referrer-Policy",
value: "origin-when-cross-origin",
},
{ key: "Cross-Origin-Opener-Policy", value: "same-origin" },
{ key: "Cross-Origin-Embedder-Policy", value: "require-corp" },
],
},
];

View File

@@ -33,6 +33,7 @@
"@plane/ui": "*",
"@popperjs/core": "^2.11.8",
"@sentry/nextjs": "^8",
"@sqlite.org/sqlite-wasm": "^3.46.0-build2",
"axios": "^1.7.4",
"clsx": "^2.0.0",
"cmdk": "^1.0.0",

View File

@@ -2919,6 +2919,11 @@
resolved "https://registry.yarnpkg.com/@sindresorhus/merge-streams/-/merge-streams-2.3.0.tgz#719df7fb41766bc143369eaa0dd56d8dc87c9958"
integrity sha512-LtoMMhxAlorcGhmFYI+LhPgbPZCkgP6ra1YL604EeF6U98pLlQ3iWIGMdWSC+vWmPBWBNgmDBAhnAobLROJmwg==
"@sqlite.org/sqlite-wasm@^3.46.0-build2":
version "3.46.0-build2"
resolved "https://registry.yarnpkg.com/@sqlite.org/sqlite-wasm/-/sqlite-wasm-3.46.0-build2.tgz#f84c3014f3fed6db08fc585d67e386d39e3956bf"
integrity sha512-10s/u/Main1RGO+jjzK+mgC/zh1ls1CEnq3Dujr03TwvzLg+j4FAohOmlYkQj8KQOj1vGR9cuB9F8tVBTwVGVA==
"@storybook/addon-actions@8.3.2":
version "8.3.2"
resolved "https://registry.yarnpkg.com/@storybook/addon-actions/-/addon-actions-8.3.2.tgz#bded2d778f3c9309334d8e378a55723d25907f50"
@@ -10863,16 +10868,7 @@ streamx@^2.15.0, streamx@^2.20.0:
optionalDependencies:
bare-events "^2.2.0"
"string-width-cjs@npm:string-width@^4.2.0":
version "4.2.3"
resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
dependencies:
emoji-regex "^8.0.0"
is-fullwidth-code-point "^3.0.0"
strip-ansi "^6.0.1"
string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3:
"string-width-cjs@npm:string-width@^4.2.0", string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3:
version "4.2.3"
resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
@@ -10959,14 +10955,7 @@ string_decoder@^1.1.1, string_decoder@^1.3.0:
dependencies:
safe-buffer "~5.2.0"
"strip-ansi-cjs@npm:strip-ansi@^6.0.1":
version "6.0.1"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
dependencies:
ansi-regex "^5.0.1"
strip-ansi@^6.0.0, strip-ansi@^6.0.1:
"strip-ansi-cjs@npm:strip-ansi@^6.0.1", strip-ansi@^6.0.0, strip-ansi@^6.0.1:
version "6.0.1"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
@@ -12150,16 +12139,7 @@ word-wrap@^1.2.5:
resolved "https://registry.yarnpkg.com/word-wrap/-/word-wrap-1.2.5.tgz#d2c45c6dd4fbce621a66f136cbe328afd0410b34"
integrity sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==
"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0":
version "7.0.0"
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==
dependencies:
ansi-styles "^4.0.0"
string-width "^4.1.0"
strip-ansi "^6.0.0"
wrap-ansi@^7.0.0:
"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0", wrap-ansi@^7.0.0:
version "7.0.0"
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==