Guppy

What Does it Do

Guppy is used to render the explorer page. It queries Elasticsearch indices to populate the data shown on that page.

How to Configure it

For the full set of configuration options, see the Helm README.md for Guppy, or read the values.yaml directly.

There is also configuration that needs to be set within the global block: the tier access level, which defines how the explorer page should handle displaying unauthorized files, and the limit on how far unauthorized users can filter down files. Last, there is a guppy block that needs to be configured with the Elasticsearch indices Guppy will use to render the explorer page.

global:
  tierAccessLevel: "(libre|regular|private)"

guppy:
  # -- (int) Only relevant if tierAccessLevel is set to "regular".
  # The minimum number of files unauthorized users can filter down to
  tierAccessLimit: 1000

  # -- (list) Elasticsearch index configurations
  indices:
    - index: dev_case
      type: case
    - index: dev_file
      type: file

  # -- (string) The Elasticsearch configuration index
  configIndex: dev_case-array-config
  # -- (string) The field used for access control and authorization filters
  authFilterField: auth_resource_path
  # -- (bool) Whether or not to enable encryption for specified fields
  enableEncryptWhitelist: true
  # -- (string) A comma-separated list of fields to encrypt
  encryptWhitelist: test1

  # -- (string) Elasticsearch endpoint.
  # Defaults to "elasticsearch:9200"
  esEndpoint: ""

You will also need a mapping file that maps the fields you want to pull from Postgres into the Elasticsearch indices. There are too many fields to describe here, but an example mapping file can be found here.
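To give a sense of its shape, below is a minimal, hypothetical sketch in the etlMapping.yaml style used by the Gen3 ETL; the node and property names are placeholders and must line up with your data dictionary and with the indices configured for Guppy above.

mappings:
  # Hypothetical mapping producing the dev_case index used in the guppy config above
  - name: dev_case          # Elasticsearch index name (matches guppy.indices)
    doc_type: case          # document type (matches the guppy index type)
    type: aggregator
    root: case              # root node in the data dictionary
    props:                  # fields copied directly from the root node
      - name: submitter_id
      - name: project_id
    flatten_props:          # fields pulled in from child nodes (placeholder names)
      - path: demographics
        props:
          - name: gender
          - name: race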

Last, Guppy works closely with the portal to render the explorer page. You will need to ensure a proper dataExplorer block is set up within the gitops.json file, referencing fields that have been pulled from Postgres into the Elasticsearch indices.
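The exact keys vary between portal versions, so treat the following as a rough, hypothetical sketch of such a block rather than a definitive reference; the listed field names must exist in the Elasticsearch indices, and guppyConfig.dataType should match one of the index types configured for Guppy above.

"dataExplorerConfig": {
  "charts": {
    "project_id": { "chartType": "count", "title": "Projects" }
  },
  "filters": {
    "tabs": [
      { "title": "Case Filters", "fields": ["project_id", "gender"] }
    ]
  },
  "table": {
    "enabled": true,
    "fields": ["project_id", "submitter_id", "gender"]
  },
  "guppyConfig": {
    "dataType": "case",
    "nodeCountTitle": "Cases"
  }
}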

Extra Information

Guppy relies on these indices existing in order to run; if no indices have been created, Guppy will fail to start up.

To create these indices you can run the ETL; however, a valid ETL mapping file needs to be created and data needs to have been submitted to the commons.
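As a quick sanity check, you can list the indices that exist in Elasticsearch and compare them against the guppy configuration, or trigger a one-off ETL run. The commands below are a sketch: they assume an in-cluster Elasticsearch service reachable at elasticsearch:9200 and an ETL CronJob named etl-cronjob, both of which may be named differently in your deployment.

# List existing indices (run from a pod that can reach Elasticsearch)
curl -s "http://elasticsearch:9200/_cat/indices?v"

# Kick off a one-off ETL run from the ETL CronJob (name is deployment-specific)
kubectl create job --from=cronjob/etl-cronjob etl-manual-run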