[full-ci] - use KQL as default search query language (#7212)

* enhancement: use kql as default search query language

* enhancement: add support for unicode search queries

* fix: escape bleve field query whitespace

* fix: search related acceptance tests

* enhancement: remove legacy search query language

* enhancement: add support for kql dateTime restriction node types

* chore: bump web to v8.0.0-alpha.2

* fix: failing search api test

* enhancement: search bleve query compiler use DateRangeQuery as DateTimeNode counterpart

* enhancement: support for colon operators in dateTime kql queries
Commit 844783b6f9 (parent eca299789d) by Florian Schade, committed via GitHub on 2023-09-07 11:13:33 +02:00.
30 changed files with 1335 additions and 355 deletions
@@ -1,3 +1,3 @@
# The test runner source for UI tests
WEB_COMMITID=de510963a4c9d9eaa05ba69512fabb323a32bd73
WEB_COMMITID=1322e5b46c827d0e7f7b8f563f302e61269f8515
WEB_BRANCH=master
@@ -1,27 +0,0 @@
Enhancement: Keyword Query Language (KQL) search syntax support
Introduce support for [KQL](https://learn.microsoft.com/en-us/sharepoint/dev/general-development/keyword-query-language-kql-syntax-reference) search syntax.
The functionality consists of a KQL lexer and a bleve query compiler.
Supported field queries:
* `Tag` search `tag:golden tag:"silver"`
* `Filename` search `name:file.txt name:"file.docx"`
* `Content` search `content:ahab content:"captain aha*"`
Supported conjunctive normal form queries:
* `Boolean`: `AND`, `OR`, `NOT`,
* `Group`: `(tag:book content:ahab*)`, `tag:(book pdf)`
For example, the query `(name:"moby di*" OR tag:bestseller) AND tag:book NOT tag:read` matches:
* Resources with `name: moby di*` `OR` `tag: bestseller`.
* `AND` with `tag:book`.
* `NOT` with `tag:read`.
https://github.com/owncloud/ocis/pull/7043
https://github.com/owncloud/ocis/issues/7042
@@ -0,0 +1,27 @@
Enhancement: Keyword Query Language (KQL) search syntax
We've introduced support for [KQL](https://learn.microsoft.com/en-us/sharepoint/dev/general-development/keyword-query-language-kql-syntax-reference) as the default oCIS search query language.
Some examples of a valid KQL query are:
* `Tag`: `tag:golden tag:"silver"`
* `Filename`: `name:file.txt name:"file.docx"`
* `Content`: `content:ahab content:"captain aha*"`
Conjunctive normal form queries:
* `Boolean`: `tag:golden AND tag:"silver"`, `tag:golden OR tag:"silver"`, `tag:golden NOT tag:"silver"`
* `Group`: `(tag:book content:ahab*)`, `tag:(book pdf)`
Complex queries:
* `(name:"moby di*" OR tag:bestseller) AND tag:book NOT tag:read`
https://github.com/owncloud/ocis/pull/7212
https://github.com/owncloud/ocis/pull/7043
https://github.com/owncloud/web/pull/9653
https://github.com/owncloud/ocis/issues/7042
https://github.com/owncloud/ocis/issues/7179
https://github.com/owncloud/ocis/issues/7114
https://github.com/owncloud/web/issues/9636
https://github.com/owncloud/web/issues/9646
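As a rough illustration of how such a boolean query decomposes (hypothetical predicate helpers, not the actual ocis implementation), the `tag` clauses of the complex example can be modeled as nested predicates:

```go
package main

import "fmt"

// Resource is a minimal stand-in for an indexed search resource.
type Resource struct {
	Tags []string
}

// Pred is a hypothetical predicate over a resource.
type Pred func(r Resource) bool

func hasTag(t string) Pred {
	return func(r Resource) bool {
		for _, tag := range r.Tags {
			if tag == t {
				return true
			}
		}
		return false
	}
}

func and(a, b Pred) Pred { return func(r Resource) bool { return a(r) && b(r) } }
func not(p Pred) Pred    { return func(r Resource) bool { return !p(r) } }

func main() {
	// tag:bestseller AND tag:book NOT tag:read (the name clause is omitted)
	q := and(and(hasTag("bestseller"), hasTag("book")), not(hasTag("read")))
	fmt.Println(q(Resource{Tags: []string{"bestseller", "book"}}))         // true
	fmt.Println(q(Resource{Tags: []string{"bestseller", "book", "read"}})) // false
}
```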
@@ -25,6 +25,22 @@ Note that as of now, the search service can not be scaled. Consider using a dedi
By default, the search service is shipped with [bleve](https://github.com/blevesearch/bleve) as its primary search engine. The available engines can be extended by implementing the [Engine](pkg/engine/engine.go) interface and making that engine available.
## Query language
By default, [KQL](https://learn.microsoft.com/en-us/sharepoint/dev/general-development/keyword-query-language-kql-syntax-reference) is used as the query language. For an overview of how the syntax works, please read the [Microsoft documentation](https://learn.microsoft.com/en-us/sharepoint/dev/general-development/keyword-query-language-kql-syntax-reference).
Not all parts are supported; the following list gives an overview of the parts that are not implemented yet:
* Synonym operators
* Inclusion and exclusion operators
* Dynamic ranking operator
* ONEAR operator
* NEAR operator
* Date intervals
The following [ADR](https://github.com/owncloud/ocis/blob/docs/ocis/adr/0020-file-search-query-language.md) explains why we chose KQL.
## Extraction Engines
The search service provides the following extraction engines, whose results are used as the index for searching:
@@ -17,7 +17,7 @@ import (
"github.com/blevesearch/bleve/v2/analysis/tokenizer/single"
"github.com/blevesearch/bleve/v2/analysis/tokenizer/unicode"
"github.com/blevesearch/bleve/v2/mapping"
bleveQuery "github.com/blevesearch/bleve/v2/search/query"
"github.com/blevesearch/bleve/v2/search/query"
storageProvider "github.com/cs3org/go-cs3apis/cs3/storage/provider/v1beta1"
"google.golang.org/protobuf/types/known/timestamppb"
@@ -27,13 +27,13 @@ import (
searchMessage "github.com/owncloud/ocis/v2/protogen/gen/ocis/messages/search/v0"
searchService "github.com/owncloud/ocis/v2/protogen/gen/ocis/services/search/v0"
"github.com/owncloud/ocis/v2/services/search/pkg/content"
"github.com/owncloud/ocis/v2/services/search/pkg/query"
searchQuery "github.com/owncloud/ocis/v2/services/search/pkg/query"
)
// Bleve represents a search engine which utilizes bleve to search and store resources.
type Bleve struct {
index bleve.Index
query query.Creator[bleveQuery.Query]
index bleve.Index
queryCreator searchQuery.Creator[query.Query]
}
// NewBleveIndex returns a new bleve index
@@ -58,10 +58,10 @@ func NewBleveIndex(root string) (bleve.Index, error) {
}
// NewBleveEngine creates a new Bleve instance
func NewBleveEngine(index bleve.Index, qbc query.Creator[bleveQuery.Query]) *Bleve {
func NewBleveEngine(index bleve.Index, queryCreator searchQuery.Creator[query.Query]) *Bleve {
return &Bleve{
index: index,
query: qbc,
index: index,
queryCreator: queryCreator,
}
}
@@ -118,15 +118,15 @@ func BuildBleveMapping() (mapping.IndexMapping, error) {
// Search executes a search request operation within the index.
// Returns a SearchIndexResponse object or an error.
func (b *Bleve) Search(_ context.Context, sir *searchService.SearchIndexRequest) (*searchService.SearchIndexResponse, error) {
createdQuery, err := b.query.Create(sir.Query)
func (b *Bleve) Search(ctx context.Context, sir *searchService.SearchIndexRequest) (*searchService.SearchIndexResponse, error) {
createdQuery, err := b.queryCreator.Create(sir.Query)
if err != nil {
return nil, err
}
q := bleve.NewConjunctionQuery(
// Skip documents that have been marked as deleted
&bleveQuery.BoolFieldQuery{
&query.BoolFieldQuery{
Bool: false,
FieldVal: "Deleted",
},
@@ -136,7 +136,7 @@ func (b *Bleve) Search(_ context.Context, sir *searchService.SearchIndexRequest)
if sir.Ref != nil {
q.Conjuncts = append(
q.Conjuncts,
&bleveQuery.TermQuery{
&query.TermQuery{
FieldVal: "RootID",
Term: storagespace.FormatResourceID(
storageProvider.ResourceId{
@@ -4,10 +4,9 @@ import (
"context"
"fmt"
"github.com/cs3org/reva/v2/pkg/storagespace"
bleveSearch "github.com/blevesearch/bleve/v2"
sprovider "github.com/cs3org/go-cs3apis/cs3/storage/provider/v1beta1"
"github.com/cs3org/reva/v2/pkg/storagespace"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
@@ -22,7 +21,6 @@ var _ = Describe("Bleve", func() {
var (
eng *engine.Bleve
idx bleveSearch.Index
ctx context.Context
doSearch = func(id string, query, path string) (*searchsvc.SearchIndexResponse, error) {
rID, err := storagespace.ParseID(id)
@@ -30,7 +28,7 @@ var _ = Describe("Bleve", func() {
return nil, err
}
return eng.Search(ctx, &searchsvc.SearchIndexRequest{
return eng.Search(context.Background(), &searchsvc.SearchIndexRequest{
Query: query,
Ref: &searchmsg.Reference{
ResourceId: &searchmsg.ResourceID{
@@ -63,7 +61,7 @@ var _ = Describe("Bleve", func() {
idx, err = bleveSearch.NewMemOnly(mapping)
Expect(err).ToNot(HaveOccurred())
eng = engine.NewBleveEngine(idx, bleve.LegacyCreator)
eng = engine.NewBleveEngine(idx, bleve.DefaultCreator)
Expect(err).ToNot(HaveOccurred())
rootResource = engine.Resource{
@@ -94,7 +92,7 @@ var _ = Describe("Bleve", func() {
Describe("New", func() {
It("returns a new index instance", func() {
b := engine.NewBleveEngine(idx, bleve.LegacyCreator)
b := engine.NewBleveEngine(idx, bleve.DefaultCreator)
Expect(b).ToNot(BeNil())
})
})
@@ -134,7 +132,7 @@ var _ = Describe("Bleve", func() {
err := eng.Upsert(parentResource.ID, parentResource)
Expect(err).ToNot(HaveOccurred())
assertDocCount(rootResource.ID, `Name:foo\ o*`, 1)
assertDocCount(rootResource.ID, `name:"foo o*"`, 1)
})
It("finds files by digits in the filename", func() {
@@ -409,14 +407,14 @@ var _ = Describe("Bleve", func() {
err = eng.Upsert(childResource.ID, childResource)
Expect(err).ToNot(HaveOccurred())
assertDocCount(rootResource.ID, parentResource.Document.Name, 1)
assertDocCount(rootResource.ID, childResource.Document.Name, 1)
assertDocCount(rootResource.ID, `"`+parentResource.Document.Name+`"`, 1)
assertDocCount(rootResource.ID, `"`+childResource.Document.Name+`"`, 1)
err = eng.Delete(parentResource.ID)
Expect(err).ToNot(HaveOccurred())
assertDocCount(rootResource.ID, parentResource.Document.Name, 0)
assertDocCount(rootResource.ID, childResource.Document.Name, 0)
assertDocCount(rootResource.ID, `"`+parentResource.Document.Name+`"`, 0)
assertDocCount(rootResource.ID, `"`+childResource.Document.Name+`"`, 0)
})
})
@@ -431,14 +429,14 @@ var _ = Describe("Bleve", func() {
err = eng.Delete(parentResource.ID)
Expect(err).ToNot(HaveOccurred())
assertDocCount(rootResource.ID, parentResource.Name, 0)
assertDocCount(rootResource.ID, childResource.Name, 0)
assertDocCount(rootResource.ID, `"`+parentResource.Name+`"`, 0)
assertDocCount(rootResource.ID, `"`+childResource.Name+`"`, 0)
err = eng.Restore(parentResource.ID)
Expect(err).ToNot(HaveOccurred())
assertDocCount(rootResource.ID, parentResource.Name, 1)
assertDocCount(rootResource.ID, childResource.Name, 1)
assertDocCount(rootResource.ID, `"`+parentResource.Name+`"`, 1)
assertDocCount(rootResource.ID, `"`+childResource.Name+`"`, 1)
})
})
@@ -1,6 +1,10 @@
// Package ast provides available ast nodes.
package ast
import (
"time"
)
// Node represents abstract syntax tree node
type Node interface {
Location() *Location
@@ -48,6 +52,14 @@ type BooleanNode struct {
Value bool
}
// DateTimeNode represents a time.Time value
type DateTimeNode struct {
*Base
Key string
Operator *OperatorNode
Value time.Time
}
// OperatorNode represents an operator value like
// AND, OR, NOT, =, <= ... and so on
type OperatorNode struct {
@@ -21,6 +21,7 @@ func DiffAst(x, y interface{}, opts ...cmp.Option) string {
cmpopts.IgnoreFields(ast.OperatorNode{}, "Base"),
cmpopts.IgnoreFields(ast.GroupNode{}, "Base"),
cmpopts.IgnoreFields(ast.BooleanNode{}, "Base"),
cmpopts.IgnoreFields(ast.DateTimeNode{}, "Base"),
)...,
)
}
@@ -5,6 +5,7 @@ import (
bQuery "github.com/blevesearch/bleve/v2/search/query"
"github.com/owncloud/ocis/v2/services/search/pkg/query"
"github.com/owncloud/ocis/v2/services/search/pkg/query/kql"
)
// Creator combines a Builder and a Compiler, which together are used to create the query.
@@ -29,5 +30,5 @@ func (c Creator[T]) Create(qs string) (T, error) {
return t, nil
}
// LegacyCreator exposes an ocis legacy bleve query creator.
var LegacyCreator = Creator[bQuery.Query]{LegacyBuilder{}, LegacyCompiler{}}
// DefaultCreator exposes a kql to bleve query creator.
var DefaultCreator = Creator[bQuery.Query]{kql.Builder{}, Compiler{}}
@@ -22,13 +22,15 @@ var _fields = map[string]string{
"type": "Type",
"tag": "Tags",
"tags": "Tags",
"content": "Content",
"hidden": "Hidden",
}
// Compiler represents the KQL-to-bleve query compiler.
type Compiler struct{}
// Compile implements the query formatter which converts the KQL query AST to the bleve query.
func (c *Compiler) Compile(givenAst *ast.Ast) (bleveQuery.Query, error) {
func (c Compiler) Compile(givenAst *ast.Ast) (bleveQuery.Query, error) {
q, err := compile(givenAst)
if err != nil {
return nil, err
@@ -52,7 +54,49 @@ func walk(offset int, nodes []ast.Node) (bleveQuery.Query, int) {
for i := offset; i < len(nodes); i++ {
switch n := nodes[i].(type) {
case *ast.StringNode:
q := bleveQuery.NewQueryStringQuery(getField(n.Key) + ":" + n.Value)
k := getField(n.Key)
v := strings.ReplaceAll(n.Value, " ", `\ `)
if k != "Hidden" {
v = strings.ToLower(v)
}
q := bleveQuery.NewQueryStringQuery(k + ":" + v)
if prev == nil {
prev = q
} else {
next = q
}
case *ast.DateTimeNode:
q := &bleveQuery.DateRangeQuery{
Start: bleveQuery.BleveQueryTime{},
End: bleveQuery.BleveQueryTime{},
InclusiveStart: nil,
InclusiveEnd: nil,
FieldVal: getField(n.Key),
}
if n.Operator == nil {
continue
}
switch n.Operator.Value {
case ">":
q.Start.Time = n.Value
q.InclusiveStart = &[]bool{false}[0]
case ">=":
q.Start.Time = n.Value
q.InclusiveStart = &[]bool{true}[0]
case "<":
q.End.Time = n.Value
q.InclusiveEnd = &[]bool{false}[0]
case "<=":
q.End.Time = n.Value
q.InclusiveEnd = &[]bool{true}[0]
default:
continue
}
if prev == nil {
prev = q
} else {
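The value handling in the `ast.StringNode` branch above can be sketched in isolation. The following is a minimal standalone sketch (not the actual compiler code): spaces are escaped for bleve's query-string syntax, and everything except the `Hidden` field is lowercased to match the lowercased index:

```go
package main

import (
	"fmt"
	"strings"
)

// normalize mirrors the compiler's field-value handling shown in the diff:
// escape whitespace so bleve treats the value as a single token, and
// lowercase it unless the field is "Hidden".
func normalize(field, value string) string {
	v := strings.ReplaceAll(value, " ", `\ `)
	if field != "Hidden" {
		v = strings.ToLower(v)
	}
	return field + ":" + v
}

func main() {
	fmt.Println(normalize("Name", "John Smith")) // Name:john\ smith
	fmt.Println(normalize("Hidden", "T"))        // Hidden:T
}
```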
@@ -2,6 +2,7 @@ package bleve
import (
"testing"
"time"
"github.com/blevesearch/bleve/v2/search/query"
tAssert "github.com/stretchr/testify/assert"
@@ -9,6 +10,15 @@ import (
"github.com/owncloud/ocis/v2/services/search/pkg/query/ast"
)
var timeMustParse = func(t *testing.T, ts string) time.Time {
tp, err := time.Parse(time.RFC3339Nano, ts)
if err != nil {
t.Fatalf("time.Parse(...) error = %v", err)
}
return tp
}
func Test_compile(t *testing.T) {
tests := []struct {
name string
@@ -36,7 +46,7 @@ func Test_compile(t *testing.T) {
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:John Smith`),
query.NewQueryStringQuery(`Name:john\ smith`),
}),
wantErr: false,
},
@@ -50,8 +60,8 @@ func Test_compile(t *testing.T) {
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:John Smith`),
query.NewQueryStringQuery(`Name:Jane`),
query.NewQueryStringQuery(`Name:john\ smith`),
query.NewQueryStringQuery(`Name:jane`),
}),
wantErr: false,
},
@@ -82,7 +92,7 @@ func Test_compile(t *testing.T) {
},
},
want: query.NewDisjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:moby di*`),
query.NewQueryStringQuery(`Name:moby\ di*`),
query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Tags:bestseller`),
query.NewQueryStringQuery(`Tags:book`),
@@ -125,7 +135,7 @@ func Test_compile(t *testing.T) {
},
want: query.NewConjunctionQuery([]query.Query{
query.NewDisjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:moby di*`),
query.NewQueryStringQuery(`Name:moby\ di*`),
query.NewQueryStringQuery(`Tags:bestseller`),
}),
query.NewQueryStringQuery(`Tags:book`),
@@ -150,7 +160,7 @@ func Test_compile(t *testing.T) {
},
want: query.NewConjunctionQuery([]query.Query{
query.NewDisjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:moby di*`),
query.NewQueryStringQuery(`Name:moby\ di*`),
query.NewQueryStringQuery(`Tags:bestseller`),
}),
query.NewQueryStringQuery(`Tags:book`),
@@ -173,8 +183,8 @@ func Test_compile(t *testing.T) {
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`author:John Smith`),
query.NewQueryStringQuery(`author:Jane`),
query.NewQueryStringQuery(`author:john\ smith`),
query.NewQueryStringQuery(`author:jane`),
}),
wantErr: false,
},
@@ -195,12 +205,129 @@ func Test_compile(t *testing.T) {
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`author:John Smith`),
query.NewQueryStringQuery(`author:Jane`),
query.NewQueryStringQuery(`author:john\ smith`),
query.NewQueryStringQuery(`author:jane`),
query.NewQueryStringQuery(`Tags:bestseller`),
}),
wantErr: false,
},
{
name: `id:b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c mtime>=2023-09-05T12:40:59.14741+02:00`,
args: &ast.Ast{
Nodes: []ast.Node{
&ast.StringNode{
Key: "id",
Value: "b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c",
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ">="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`ID:b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c`),
func() query.Query {
q := query.NewDateRangeInclusiveQuery(timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"), time.Time{}, &[]bool{true}[0], nil)
q.FieldVal = "Mtime"
return q
}(),
}),
wantErr: false,
},
{
name: `StringNode value lowercase`,
args: &ast.Ast{
Nodes: []ast.Node{
&ast.StringNode{Value: "John Smith"},
&ast.OperatorNode{Value: "AND"},
&ast.StringNode{Key: "Hidden", Value: "T"},
&ast.OperatorNode{Value: "AND"},
&ast.StringNode{Key: "hidden", Value: "T"},
},
},
want: query.NewConjunctionQuery([]query.Query{
query.NewQueryStringQuery(`Name:john\ smith`),
query.NewQueryStringQuery(`Hidden:T`),
query.NewQueryStringQuery(`Hidden:T`),
}),
wantErr: false,
},
{
name: `ast.DateTimeNode`,
args: &ast.Ast{
Nodes: []ast.Node{
&ast.DateTimeNode{
Key: "mtime",
// "=" is not supported by bleve, ignore
Operator: &ast.OperatorNode{Value: "="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
// ":" is not supported by bleve, ignore
Operator: &ast.OperatorNode{Value: ":"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
// no operator, skip
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
Operator: &ast.OperatorNode{Value: ">"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
Operator: &ast.OperatorNode{Value: ">="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
Operator: &ast.OperatorNode{Value: "<"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{
Key: "mtime",
Operator: &ast.OperatorNode{Value: "<="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
},
},
want: query.NewConjunctionQuery([]query.Query{
func() query.Query {
q := query.NewDateRangeInclusiveQuery(timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"), time.Time{}, &[]bool{false}[0], nil)
q.FieldVal = "Mtime"
return q
}(),
func() query.Query {
q := query.NewDateRangeInclusiveQuery(timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"), time.Time{}, &[]bool{true}[0], nil)
q.FieldVal = "Mtime"
return q
}(),
func() query.Query {
q := query.NewDateRangeInclusiveQuery(time.Time{}, timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"), nil, &[]bool{false}[0])
q.FieldVal = "Mtime"
return q
}(),
func() query.Query {
q := query.NewDateRangeInclusiveQuery(time.Time{}, timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"), nil, &[]bool{true}[0])
q.FieldVal = "Mtime"
return q
}(),
}),
wantErr: false,
},
}
assert := tAssert.New(t)
@@ -1,76 +0,0 @@
package bleve
import (
"regexp"
"strings"
bQuery "github.com/blevesearch/bleve/v2/search/query"
"github.com/owncloud/ocis/v2/services/search/pkg/query/ast"
)
// LegacyBuilder implements the legacy Builder interface.
type LegacyBuilder struct{}
// Build translates the ast to a valid bleve query.
func (b LegacyBuilder) Build(qs string) (*ast.Ast, error) {
return &ast.Ast{
Base: &ast.Base{
Loc: &ast.Location{
Start: ast.Position{
Line: 0,
Column: 0,
},
End: ast.Position{
Line: 0,
Column: len(qs),
},
Source: &qs,
},
},
}, nil
}
// LegacyCompiler represents a default bleve query formatter.
type LegacyCompiler struct{}
// Compile implements the default bleve query formatter which converts the bleve-like query search string to the bleve query.
func (c LegacyCompiler) Compile(givenAst *ast.Ast) (bQuery.Query, error) {
return &bQuery.QueryStringQuery{
Query: c.formatQuery(*givenAst.Base.Loc.Source),
}, nil
}
func (c LegacyCompiler) formatQuery(q string) string {
cq := q
fields := []string{"RootID", "Path", "ID", "Name", "Size", "Mtime", "MimeType", "Type"}
for _, field := range fields {
cq = strings.ReplaceAll(cq, strings.ToLower(field)+":", field+":")
}
fieldRe := regexp.MustCompile(`\w+:[^ ]+`)
if fieldRe.MatchString(cq) {
nameTagesRe := regexp.MustCompile(`\+?(Name|Tags)`) // detect "Name", "+Name, "Tags" and "+Tags"
parts := strings.Split(cq, " ")
cq = ""
for _, part := range parts {
fieldParts := strings.SplitN(part, ":", 2)
if len(fieldParts) > 1 {
key := fieldParts[0]
value := fieldParts[1]
if nameTagesRe.MatchString(key) {
value = strings.ToLower(value) // do a lowercase query on the lowercased fields
}
cq += key + ":" + value + " "
} else {
cq += part + " "
}
}
return cq // Sophisticated field based search
}
// this is a basic filename search
cq = strings.ReplaceAll(cq, ":", `\:`)
return "Name:*" + strings.ReplaceAll(strings.ToLower(cq), " ", `\ `) + "*"
}
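For comparison, the removed legacy fallback path (the branch without field restrictions) wrapped a plain query into a wildcard name search. Roughly, as a standalone sketch:

```go
package main

import (
	"fmt"
	"strings"
)

// legacyNameQuery sketches the removed fallback: escape colons, lowercase
// the input, escape spaces, and wrap it in wildcards for a name-only search.
func legacyNameQuery(q string) string {
	q = strings.ReplaceAll(q, ":", `\:`)
	return "Name:*" + strings.ReplaceAll(strings.ToLower(q), " ", `\ `) + "*"
}

func main() {
	fmt.Println(legacyNameQuery("Moby Di")) // Name:*moby\ di*
}
```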
@@ -2,6 +2,7 @@ package kql
import (
"fmt"
"time"
"github.com/owncloud/ocis/v2/services/search/pkg/query/ast"
)
@@ -13,23 +14,24 @@ func toIfaceSlice(in interface{}) []interface{} {
return in.([]interface{})
}
func toNode(in interface{}) (ast.Node, error) {
out, ok := in.(ast.Node)
func toNode[T ast.Node](in interface{}) (T, error) {
var t T
out, ok := in.(T)
if !ok {
return nil, fmt.Errorf("can't convert '%T' to ast.Node", in)
return t, fmt.Errorf("can't convert '%T' to '%T'", in, t)
}
return out, nil
}
func toNodes(in interface{}) ([]ast.Node, error) {
func toNodes[T ast.Node](in interface{}) ([]T, error) {
switch v := in.(type) {
case []interface{}:
var nodes []ast.Node
var nodes []T
for _, el := range toIfaceSlice(v) {
node, err := toNode(el)
node, err := toNode[T](el)
if err != nil {
return nil, err
}
@@ -38,7 +40,7 @@ func toNodes(in interface{}) ([]ast.Node, error) {
}
return nodes, nil
case []ast.Node:
case []T:
return v, nil
default:
return nil, fmt.Errorf("can't convert '%T' to []ast.Node", in)
@@ -52,9 +54,13 @@ func toString(in interface{}) (string, error) {
case []interface{}:
var str string
for _, i := range v {
j := i.([]uint8)
str += string(j[0])
for i := range v {
sv, err := toString(v[i])
if err != nil {
return "", err
}
str += sv
}
return str, nil
@@ -64,3 +70,12 @@ func toString(in interface{}) (string, error) {
return "", fmt.Errorf("can't convert '%T' to string", v)
}
}
func toTime(in interface{}) (time.Time, error) {
ts, err := toString(in)
if err != nil {
return time.Time{}, err
}
return time.Parse(time.RFC3339Nano, ts)
}
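The generic `toNode[T]` refactoring above replaces a single `ast.Node` assertion with a type-parameterized one. A self-contained sketch of the same pattern (hypothetical `Node`/`StringNode` types, not the ocis ast package):

```go
package main

import "fmt"

// Node is a stand-in for the ast.Node interface.
type Node interface{ Kind() string }

// StringNode is a stand-in for ast.StringNode.
type StringNode struct{ Value string }

func (StringNode) Kind() string { return "string" }

// toNode asserts an untyped parser result to the concrete node type T,
// returning a typed zero value and an error on mismatch.
func toNode[T Node](in interface{}) (T, error) {
	var t T
	out, ok := in.(T)
	if !ok {
		return t, fmt.Errorf("can't convert '%T' to '%T'", in, t)
	}
	return out, nil
}

func main() {
	var v interface{} = StringNode{Value: "ahab"}
	n, err := toNode[StringNode](v)
	fmt.Println(n.Value, err) // ahab <nil>
	_, err = toNode[StringNode](42)
	fmt.Println(err != nil) // true
}
```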
@@ -17,7 +17,7 @@ Nodes <-
(
GroupNode /
PropertyRestrictionNodes /
BooleanOperatorNode /
OperatorBooleanNode /
FreeTextKeywordNodes
)
_
@@ -30,7 +30,7 @@ Nodes <-
////////////////////////////////////////////////////////
GroupNode <-
k:(Char+)? (ColonOperator / EqualOperator)? "(" v:Nodes ")" {
k:(Char+)? (OperatorColonNode / OperatorEqualNode)? "(" v:Nodes ")" {
return buildGroupNode(k, v, c.text, c.pos)
}
@@ -40,18 +40,23 @@ GroupNode <-
PropertyRestrictionNodes <-
YesNoPropertyRestrictionNode /
DateTimeRestrictionNode /
TextPropertyRestrictionNode
YesNoPropertyRestrictionNode <-
k:Char+ (ColonOperator / EqualOperator) v:("true" / "false"){
k:Char+ (OperatorColonNode / OperatorEqualNode) v:("true" / "false"){
return buildBooleanNode(k, v, c.text, c.pos)
}
TextPropertyRestrictionNode <-
k:Char+ (ColonOperator / EqualOperator) v:(String / [^ ()]+){
return buildStringNode(k, v, c.text, c.pos)
DateTimeRestrictionNode <-
k:Char+ o:(OperatorGreaterOrEqualNode / OperatorLessOrEqualNode / OperatorGreaterNode / OperatorLessNode / OperatorEqualNode / OperatorColonNode) '"'? v:(FullDate "T" FullTime) '"'? {
return buildDateTimeNode(k, o, v, c.text, c.pos)
}
TextPropertyRestrictionNode <-
k:Char+ (OperatorColonNode / OperatorEqualNode) v:(String / [^ ()]+){
return buildStringNode(k, v, c.text, c.pos)
}
////////////////////////////////////////////////////////
// free text-keywords
@@ -62,12 +67,12 @@ FreeTextKeywordNodes <-
WordNode
PhraseNode <-
ColonOperator? _ v:String _ ColonOperator? {
OperatorColonNode? _ v:String _ OperatorColonNode? {
return buildStringNode("", v, c.text, c.pos)
}
WordNode <-
ColonOperator? _ v:[^ :()]+ _ ColonOperator? {
OperatorColonNode? _ v:[^ :()]+ _ OperatorColonNode? {
return buildStringNode("", v, c.text, c.pos)
}
@@ -75,18 +80,83 @@ WordNode <-
// operators
////////////////////////////////////////////////////////
BooleanOperatorNode <-
OperatorBooleanNode <-
("AND" / "OR" / "NOT") {
return buildOperatorNode(c.text, c.pos)
}
ColonOperator <-
OperatorColonNode <-
":" {
return buildOperatorNode(c.text, c.pos)
}
OperatorEqualNode <-
"=" {
return buildOperatorNode(c.text, c.pos)
}
OperatorLessNode <-
"<" {
return buildOperatorNode(c.text, c.pos)
}
OperatorLessOrEqualNode <-
"<=" {
return buildOperatorNode(c.text, c.pos)
}
OperatorGreaterNode <-
">" {
return buildOperatorNode(c.text, c.pos)
}
OperatorGreaterOrEqualNode <-
">=" {
return buildOperatorNode(c.text, c.pos)
}
////////////////////////////////////////////////////////
// time
////////////////////////////////////////////////////////
TimeYear <-
Digit Digit Digit Digit {
return c.text, nil
}
EqualOperator <-
"=" {
TimeMonth <-
Digit Digit {
return c.text, nil
}
TimeDay <-
Digit Digit {
return c.text, nil
}
TimeHour <-
Digit Digit {
return c.text, nil
}
TimeMinute <-
Digit Digit {
return c.text, nil
}
TimeSecond <-
Digit Digit {
return c.text, nil
}
FullDate <-
TimeYear "-" TimeMonth "-" TimeDay {
return c.text, nil
}
FullTime <-
TimeHour ":" TimeMinute ":" TimeSecond ("." Digit+)? ("Z" / ("+" / "-") TimeHour ":" TimeMinute) {
return c.text, nil
}
@@ -104,5 +174,10 @@ String <-
return v, nil
}
Digit <-
[0-9] {
return c.text, nil
}
_ <-
[ \t]*
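One detail of the operator rules above worth noting: PEG choices are ordered, which is why `OperatorGreaterOrEqualNode` is tried before `OperatorGreaterNode` in the `DateTimeRestrictionNode` rule; otherwise `>=` would match as `>` with a dangling `=`. A standalone Go sketch of the same longest-operator-first idea:

```go
package main

import (
	"fmt"
	"strings"
)

// matchOp picks the operator at the start of s. Like the grammar's ordered
// choice, two-character operators must be tried before their one-character
// prefixes.
func matchOp(s string) string {
	for _, op := range []string{">=", "<=", ">", "<", "=", ":"} {
		if strings.HasPrefix(s, op) {
			return op
		}
	}
	return ""
}

func main() {
	fmt.Println(matchOp(">=2023-09-05")) // >=
	fmt.Println(matchOp(":2023-09-05"))  // :
}
```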
(File diff suppressed because it is too large.)
@@ -3,6 +3,7 @@ package kql_test
import (
"strings"
"testing"
"time"
tAssert "github.com/stretchr/testify/assert"
@@ -11,6 +12,15 @@ import (
"github.com/owncloud/ocis/v2/services/search/pkg/query/kql"
)
var timeMustParse = func(t *testing.T, ts string) time.Time {
tp, err := time.Parse(time.RFC3339Nano, ts)
if err != nil {
t.Fatalf("time.Parse(...) error = %v", err)
}
return tp
}
var FullDictionary = []string{
`federated search`,
`federat* search`,
@@ -242,6 +252,133 @@ func TestParse(t *testing.T) {
},
},
},
{
name: "unicode",
givenQuery: []string{
` 😂 "*😀 😁*" name:😂💁👌🎍😍 name:😂💁👌 😍`,
},
expectedAst: &ast.Ast{
Nodes: []ast.Node{
&ast.StringNode{
Value: "😂",
},
&ast.StringNode{
Value: "*😀 😁*",
},
&ast.StringNode{
Key: "name",
Value: "😂💁👌🎍😍",
},
&ast.StringNode{
Key: "name",
Value: "😂💁👌",
},
&ast.StringNode{
Value: "😍",
},
},
},
},
{
name: "DateTimeRestrictionNode",
givenQuery: []string{
`Mtime:"2023-09-05T08:42:11.23554+02:00"`,
`Mtime:2023-09-05T08:42:11.23554+02:00`,
`Mtime="2023-09-05T08:42:11.23554+02:00"`,
`Mtime=2023-09-05T08:42:11.23554+02:00`,
`Mtime<"2023-09-05T08:42:11.23554+02:00"`,
`Mtime<2023-09-05T08:42:11.23554+02:00`,
`Mtime<="2023-09-05T08:42:11.23554+02:00"`,
`Mtime<=2023-09-05T08:42:11.23554+02:00`,
`Mtime>"2023-09-05T08:42:11.23554+02:00"`,
`Mtime>2023-09-05T08:42:11.23554+02:00`,
`Mtime>="2023-09-05T08:42:11.23554+02:00"`,
`Mtime>=2023-09-05T08:42:11.23554+02:00`,
},
expectedAst: &ast.Ast{
Nodes: []ast.Node{
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ":"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ":"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "<"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "<"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "<="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: "<="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ">"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ">"},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ">="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
&ast.DateTimeNode{
Key: "Mtime",
Operator: &ast.OperatorNode{Value: ">="},
Value: timeMustParse(t, "2023-09-05T08:42:11.23554+02:00"),
},
},
},
},
{
name: "id",
givenQuery: []string{
`id:b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c`,
`ID:b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c`,
},
expectedAst: &ast.Ast{
Nodes: []ast.Node{
&ast.StringNode{
Key: "id",
Value: "b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c",
},
&ast.StringNode{
Key: "ID",
Value: "b27d3bf1-b254-459f-92e8-bdba668d6d3f$d0648459-25fb-4ed8-8684-bc62c7dca29c!d0648459-25fb-4ed8-8684-bc62c7dca29c",
},
},
},
},
}
assert := tAssert.New(t)
@@ -33,7 +33,7 @@ func buildAST(n interface{}, text []byte, pos position) (*ast.Ast, error) {
return nil, err
}
nodes, err := toNodes(n)
nodes, err := toNodes[ast.Node](n)
if err != nil {
return nil, err
}
@@ -54,7 +54,7 @@ func buildNodes(e interface{}) ([]ast.Node, error) {
nodes := make([]ast.Node, len(maybeNodesGroups))
for i, maybeNodesGroup := range maybeNodesGroups {
node, err := toNode(toIfaceSlice(maybeNodesGroup)[1])
node, err := toNode[ast.Node](toIfaceSlice(maybeNodesGroup)[1])
if err != nil {
return nil, err
}
@@ -88,6 +88,35 @@ func buildStringNode(k, v interface{}, text []byte, pos position) (*ast.StringNo
}, nil
}
func buildDateTimeNode(k, o, v interface{}, text []byte, pos position) (*ast.DateTimeNode, error) {
b, err := base(text, pos)
if err != nil {
return nil, err
}
operator, err := toNode[*ast.OperatorNode](o)
if err != nil {
return nil, err
}
key, err := toString(k)
if err != nil {
return nil, err
}
value, err := toTime(v)
if err != nil {
return nil, err
}
return &ast.DateTimeNode{
Base: b,
Key: key,
Operator: operator,
Value: value,
}, nil
}
func buildBooleanNode(k, v interface{}, text []byte, pos position) (*ast.BooleanNode, error) {
b, err := base(text, pos)
if err != nil {
@@ -117,9 +146,14 @@ func buildOperatorNode(text []byte, pos position) (*ast.OperatorNode, error) {
return nil, err
}
value, err := toString(text)
if err != nil {
return nil, err
}
return &ast.OperatorNode{
Base: b,
Value: string(text),
Value: value,
}, nil
}
@@ -131,7 +165,7 @@ func buildGroupNode(k, n interface{}, text []byte, pos position) (*ast.GroupNode
key, _ := toString(k)
nodes, err := toNodes(n)
nodes, err := toNodes[ast.Node](n)
if err != nil {
return nil, err
}
@@ -9,8 +9,8 @@ import (
type Builder struct{}
// Build creates an ast.Ast based on a kql query
func (b Builder) Build(q string, opts ...Option) (*ast.Ast, error) {
f, err := Parse("", []byte(q), opts...)
func (b Builder) Build(q string) (*ast.Ast, error) {
f, err := Parse("", []byte(q))
if err != nil {
return nil, err
}
@@ -46,6 +46,14 @@ func NormalizeNodes(nodes []ast.Node) ([]ast.Node, error) {
}
currentNode = n
currentKey = &n.Key
case *ast.DateTimeNode:
if prevKey == nil {
prevKey = &n.Key
res = append(res, node)
continue
}
currentNode = n
currentKey = &n.Key
case *ast.BooleanNode:
if prevKey == nil {
prevKey = &n.Key
@@ -2,6 +2,7 @@ package kql_test
import (
"testing"
"time"
tAssert "github.com/stretchr/testify/assert"
@@ -10,6 +11,8 @@ import (
"github.com/owncloud/ocis/v2/services/search/pkg/query/kql"
)
var now = time.Now()
func TestNormalizeNodes(t *testing.T) {
tests := []struct {
name string
@@ -85,11 +88,14 @@ func TestNormalizeNodes(t *testing.T) {
givenNodes: []ast.Node{
&ast.StringNode{Key: "author", Value: "John Smith"},
&ast.StringNode{Key: "filetype", Value: "docx"},
&ast.DateTimeNode{Key: "mtime", Operator: &ast.OperatorNode{Value: "="}, Value: now},
},
expectedNodes: []ast.Node{
&ast.StringNode{Key: "author", Value: "John Smith"},
&ast.OperatorNode{Value: "AND"},
&ast.StringNode{Key: "filetype", Value: "docx"},
&ast.OperatorNode{Value: "AND"},
&ast.DateTimeNode{Key: "mtime", Operator: &ast.OperatorNode{Value: "="}, Value: now},
},
},
}
@@ -11,8 +11,8 @@ import (
rpc "github.com/cs3org/go-cs3apis/cs3/rpc/v1beta1"
provider "github.com/cs3org/go-cs3apis/cs3/storage/provider/v1beta1"
"github.com/cs3org/reva/v2/pkg/rgrpc/todo/pool"
"github.com/cs3org/reva/v2/pkg/storagespace"
"github.com/cs3org/reva/v2/pkg/utils"
"github.com/owncloud/ocis/v2/ocis-pkg/log"
searchmsg "github.com/owncloud/ocis/v2/protogen/gen/ocis/messages/search/v0"
"github.com/owncloud/ocis/v2/services/search/pkg/engine"
@@ -140,21 +140,6 @@ func convertToWebDAVPermissions(isShared, isMountpoint, isDir bool, p *provider.
return b.String()
}
func extractScope(path string) (*provider.Reference, error) {
ref, err := storagespace.ParseReference(path)
if err != nil {
return nil, err
}
return &provider.Reference{
ResourceId: &provider.ResourceId{
StorageId: ref.ResourceId.StorageId,
SpaceId: ref.ResourceId.SpaceId,
OpaqueId: ref.ResourceId.OpaqueId,
},
Path: ref.GetPath(),
}, nil
}
// ParseScope extract a scope value from the query string and returns search, scope strings
func ParseScope(query string) (string, string) {
match := scopeRegex.FindStringSubmatch(query)
@@ -97,16 +97,16 @@ func (s *Service) Search(ctx context.Context, req *searchsvc.SearchRequest) (*se
}
req.Query = query
if len(scope) > 0 {
// if req.Ref != nil {
// return nil, errtypes.BadRequest("cannot scope a search that is limited to a resource")
// }
scopeRef, err := extractScope(scope)
scopedID, err := storagespace.ParseID(scope)
if err != nil {
return nil, err
s.logger.Error().Err(err).Msg("failed to parse scope")
}
// Stat the scope to get the resource id
statRes, err := gatewayClient.Stat(ctx, &provider.StatRequest{
Ref: scopeRef,
Ref: &provider.Reference{
ResourceId: &scopedID,
},
FieldMask: &fieldmaskpb.FieldMask{Paths: []string{"space"}},
})
if err != nil {
@@ -418,7 +418,7 @@ func (s *Service) IndexSpace(spaceID *provider.StorageSpaceId, uID *user.UserId)
s.logger.Debug().Str("path", ref.Path).Msg("Walking tree")
searchRes, err := s.engine.Search(ownerCtx, &searchsvc.SearchIndexRequest{
Query: "+ID:" + storagespace.FormatResourceID(*info.Id) + ` +Mtime:>="` + utils.TSToTime(info.Mtime).Format(time.RFC3339Nano) + `"`,
Query: "id:" + storagespace.FormatResourceID(*info.Id) + ` mtime>=` + utils.TSToTime(info.Mtime).Format(time.RFC3339Nano),
})
if err == nil && len(searchRes.Matches) >= 1 {
@@ -49,7 +49,7 @@ func NewHandler(opts ...Option) (searchsvc.SearchProviderHandler, func(), error)
_ = idx.Close()
}
eng = engine.NewBleveEngine(idx, bleve.LegacyCreator)
eng = engine.NewBleveEngine(idx, bleve.DefaultCreator)
default:
return nil, teardown, fmt.Errorf("unknown search engine: %s", cfg.Engine.Type)
}
@@ -1,6 +1,6 @@
SHELL := bash
NAME := web
WEB_ASSETS_VERSION = v8.0.0-alpha.1
WEB_ASSETS_VERSION = v8.0.0-alpha.2
include ../../.make/recursion.mk
@@ -17,6 +17,10 @@ import (
"github.com/cs3org/reva/v2/pkg/storage/utils/templates"
"github.com/go-chi/chi/v5"
"github.com/go-chi/render"
"github.com/riandyrn/otelchi"
merrors "go-micro.dev/v4/errors"
grpcmetadata "google.golang.org/grpc/metadata"
"github.com/owncloud/ocis/v2/ocis-pkg/log"
"github.com/owncloud/ocis/v2/ocis-pkg/registry"
"github.com/owncloud/ocis/v2/ocis-pkg/tracing"
@@ -26,9 +30,6 @@ import (
"github.com/owncloud/ocis/v2/services/webdav/pkg/config"
"github.com/owncloud/ocis/v2/services/webdav/pkg/constants"
"github.com/owncloud/ocis/v2/services/webdav/pkg/dav/requests"
"github.com/riandyrn/otelchi"
merrors "go-micro.dev/v4/errors"
"google.golang.org/grpc/metadata"
)
func init() {
@@ -113,6 +114,7 @@ func NewService(opts ...Option) (Service, error) {
r.Get("/remote.php/dav/files/{id}/*", svc.Thumbnail)
r.Get("/dav/files/{id}", svc.Thumbnail)
r.Get("/dav/files/{id}/*", svc.Thumbnail)
r.MethodFunc("REPORT", "/remote.php/dav/files*", svc.Search)
r.MethodFunc("REPORT", "/dav/files*", svc.Search)
})
@@ -309,7 +311,7 @@ func (g Webdav) Thumbnail(w http.ResponseWriter, r *http.Request) {
user = userRes.GetUser()
} else {
// look up user from URL via GetUserByClaim
ctx := metadata.AppendToOutgoingContext(r.Context(), TokenHeader, t)
ctx := grpcmetadata.AppendToOutgoingContext(r.Context(), TokenHeader, t)
userRes, err := gatewayClient.GetUserByClaim(ctx, &userv1beta1.GetUserByClaimRequest{
Claim: "username",
Value: tr.Identifier,
@@ -74,7 +74,7 @@ Feature: REPORT request to project space
Scenario: check the response of the searched sub-folder
Given user "Alice" has created a folder "folderMain/sub-folder" in space "findData"
And using new DAV path
When user "Alice" searches for "sub" using the WebDAV API
When user "Alice" searches for "*sub*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain only these entries:
| /folderMain/sub-folder |
@@ -20,7 +20,7 @@ Feature: Search
Scenario: user can find data from the project space
When user "Alice" searches for "fol" using the WebDAV API
When user "Alice" searches for "*fol*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "4" entries
And the search result of user "Alice" should contain these entries:
@@ -31,7 +31,7 @@ Feature: Search
Scenario: user can only find data that they searched for from the project space
When user "Alice" searches for "SUB" using the WebDAV API
When user "Alice" searches for "*SUB*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "2" entries
And the search result of user "Alice" should contain these entries:
@@ -48,7 +48,7 @@ Feature: Search
| shareWith | Brian |
| role | viewer |
And user "Brian" has accepted share "/folderMain" offered by user "Alice"
When user "Brian" searches for "folder" using the WebDAV API
When user "Brian" searches for "*folder*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "4" entries
And the search result of user "Brian" should contain these entries:
@@ -60,7 +60,7 @@ Feature: Search
Scenario: user can find hidden file
Given user "Alice" has created a folder ".space" in space "find data"
When user "Alice" searches for ".sp" using the WebDAV API
When user "Alice" searches for "*.sp*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "1" entries
And the search result of user "Alice" should contain these entries:
@@ -72,7 +72,7 @@ Feature: Search
| path | folderMain |
| shareWith | Brian |
| role | viewer |
When user "Brian" searches for "folder" using the WebDAV API
When user "Brian" searches for "*folder*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "0" entries
And the search result of user "Brian" should not contain these entries:
@@ -87,7 +87,7 @@ Feature: Search
| shareWith | Brian |
| role | viewer |
And user "Brian" has declined share "/folderMain" offered by user "Alice"
When user "Brian" searches for "folder" using the WebDAV API
When user "Brian" searches for "*folder*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "0" entries
And the search result of user "Brian" should not contain these entries:
@@ -98,20 +98,20 @@ Feature: Search
Scenario: user cannot find deleted folder
Given user "Alice" has removed the folder "folderMain" from space "find data"
When user "Alice" searches for "folderMain" using the WebDAV API
When user "Alice" searches for "*folderMain*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "0" entries
Scenario: user can find project space by name
When user "Alice" searches for "find data" using the WebDAV API
When user "Alice" searches for '"*find data*"' using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "1" entries
And for user "Alice" the search result should contain space "find data"
Scenario: user can search inside folder in space
When user "Alice" searches for "folder" inside folder "/folderMain" in space "find data" using the WebDAV API
When user "Alice" searches for "*folder*" inside folder "/folderMain" in space "find data" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "3" entries
And the search result of user "Alice" should contain only these entries:
@@ -128,7 +128,7 @@ Feature: Search
| shareWith | Brian |
| role | viewer |
And user "Brian" has accepted share "/folderMain" offered by user "Alice"
When user "Brian" searches for "folder" inside folder "/folderMain" in space "Shares" using the WebDAV API
When user "Brian" searches for "*folder*" inside folder "/folderMain" in space "Shares" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Brian" should contain only these entries:
| /SubFolder1 |
@@ -136,7 +136,7 @@ Feature: Search
| /SubFolder1/subFOLDER2/insideTheFolder.txt |
But the search result of user "Brian" should not contain these entries:
| /folderMain |
@issue-enterprise-6000
Scenario: sharee cannot find resources that are not shared
Given user "Alice" has created a folder "foo/sharedToBrian" in space "Alice Hansen"
@@ -146,7 +146,7 @@ Feature: Search
| shareWith | Brian |
| role | viewer |
And user "Brian" has accepted share "/foo" offered by user "Alice"
When user "Brian" searches for "shared" using the WebDAV API
When user "Brian" searches for "shared*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Brian" should contain these entries:
| /sharedToBrian |
@@ -243,7 +243,7 @@ Feature: tag search
And user "Alice" has uploaded file with content "hello world inside folder" to "/Folder/file2.txt"
And user "Alice" has created folder "/Folder/SubFolder"
And user "Alice" has uploaded file with content "hello world inside sub-folder" to "/Folder/SubFolder/file3.txt"
When user "Alice" searches for "file" inside folder "/Folder" using the WebDAV API
When user "Alice" searches for "*file*" inside folder "/Folder" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain only these entries:
| file2.txt |
@@ -68,18 +68,10 @@ class SearchContext implements Context {
= "<?xml version='1.0' encoding='utf-8' ?>\n" .
" <oc:search-files xmlns:a='DAV:' xmlns:oc='http://owncloud.org/ns' >\n" .
" <oc:search>\n";
if ($scope !== null && $spaceName !== null) {
if ($scope !== null) {
$scope = \trim($scope, "/");
$spaceId = $this->featureContext->spacesContext->getSpaceIdByName($user, $spaceName);
$pattern .= " scope:$spaceId/$scope";
} elseif ($scope !== null) {
$scope = \trim($scope, "/");
if ($this->featureContext->getDavPathVersion() === 3) {
$rootPath = $this->featureContext->getPersonalSpaceIdForUser($user);
} else {
$rootPath = $this->featureContext->getUserIdByUserName($user);
}
$pattern .= " scope:$rootPath/$scope";
$resourceID = $this->featureContext->spacesContext->getResourceId($user, $spaceName ?? "Personal", $scope);
$pattern .= " scope:$resourceID";
}
$body .= "<oc:pattern>$pattern</oc:pattern>\n";
if ($limit !== null) {
@@ -24,7 +24,7 @@ Feature: Search
@smokeTest
Scenario Outline: search for entry by pattern
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" using the WebDAV API
When user "Alice" searches for "*upload*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain these entries:
| /upload.txt |
@@ -51,7 +51,7 @@ Feature: Search
Scenario Outline: search for entries by only some letters from the middle of the entry name
Given using <dav-path-version> DAV path
And user "Alice" has created folder "FOLDER"
When user "Alice" searches for "ol" using the WebDAV API
When user "Alice" searches for "*ol*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "4" entries
And the search result of user "Alice" should contain these entries:
@@ -72,7 +72,7 @@ Feature: Search
Scenario Outline: search for files by extension
Given using <dav-path-version> DAV path
When user "Alice" searches for "png" using the WebDAV API
When user "Alice" searches for "*png*" using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain these entries:
| /a-image.png |
@@ -110,7 +110,7 @@ Feature: Search
Scenario Outline: limit returned search entries
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" and limits the results to "3" items using the WebDAV API
When user "Alice" searches for "*upload*" and limits the results to "3" items using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain any "3" of these entries:
| /just-a-folder/upload.txt |
@@ -134,7 +134,7 @@ Feature: Search
Scenario Outline: limit returned search entries to only 1 entry
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" and limits the results to "1" items using the WebDAV API
When user "Alice" searches for "*upload*" and limits the results to "1" items using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain any "1" of these entries:
| /just-a-folder/upload.txt |
@@ -158,7 +158,7 @@ Feature: Search
Scenario Outline: limit returned search entries to more entries than there are
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" and limits the results to "100" items using the WebDAV API
When user "Alice" searches for "*upload*" and limits the results to "100" items using the WebDAV API
Then the HTTP status code should be "207"
And the search result should contain "8" entries
And the search result of user "Alice" should contain these entries:
@@ -183,7 +183,7 @@ Feature: Search
@issue-4712
Scenario Outline: report extra properties in search entries for a file
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" using the WebDAV API requesting these properties:
When user "Alice" searches for "*upload*" using the WebDAV API requesting these properties:
| oc:fileid |
| oc:permissions |
| a:getlastmodified |
@@ -216,7 +216,7 @@ Feature: Search
@issue-4712
Scenario Outline: report extra properties in search entries for a folder
Given using <dav-path-version> DAV path
When user "Alice" searches for "upload" using the WebDAV API requesting these properties:
When user "Alice" searches for "*upload*" using the WebDAV API requesting these properties:
| oc:fileid |
| oc:permissions |
| a:getlastmodified |
@@ -248,7 +248,7 @@ Feature: Search
Scenario Outline: search for entry with emoji by pattern
Given using <dav-path-version> DAV path
When user "Alice" searches for "😀 😁" using the WebDAV API
When user "Alice" searches for '"*😀 😁*"' using the WebDAV API
Then the HTTP status code should be "207"
And the search result of user "Alice" should contain these entries:
| /upload😀 😁 |