Export environments


Overview

For restricted data, you'll often want to control whether users can move data off of Redivis and onto external systems. For example, you may want researchers to be able to download data to a compute cluster for further analysis, but not onto their personal computers.

In this section of your organization settings, you can define specific export environments that can then be applied to datasets.

Configuring export environments

All export environments are given a unique name and an optional description, which helps users understand what the environment is and when it applies. You can then specify the type of environment and any accompanying rules, as outlined in the sections below.
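As a rough mental model, an environment's configuration can be thought of as a small record of these settings. The sketch below is purely illustrative; the field names are hypothetical and mirror the settings form described here, not an actual Redivis API.

```python
from dataclasses import dataclass, field
from typing import Optional

# Purely illustrative: hypothetical field names mirroring the settings form,
# not an actual Redivis API.
@dataclass
class ExportEnvironment:
    name: str                                    # unique name shown to users
    environment_type: str                        # "ip_address", "bigquery", "gcs", "data_studio", or "custom"
    description: Optional[str] = None            # optional context on when the environment applies
    allowed_locations: list[str] = field(default_factory=list)  # CIDRs, projects, or buckets, depending on type
    require_admin_approval: bool = False         # "All exports must be approved by an administrator"
    approval_size_threshold_gb: Optional[float] = None  # if set, only larger exports require approval

# Example: downloads are allowed only from a university compute cluster.
cluster_env = ExportEnvironment(
    name="HPC cluster",
    environment_type="ip_address",
    description="Download to the university compute cluster only",
    allowed_locations=["192.0.2.0/24"],
)
```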

Export environment types

IP Address

Specify any number of IP addresses (or subnets, using CIDR notation) that represent the export environment. For example, if your on-premises cluster has persistent IP address(es), they would be listed here.

If you want to configure the workflow "Users can download derivatives of this dataset to anywhere, but only upon admin approval", create an IP Address environment with the wildcard IP address 0.0.0.0/0 and turn on All exports must be approved by an administrator.

This is often preferable to the Custom location environment type outlined below, as that mechanism is intended for granting a specific user the ability to download a specific table to a specific IP address (or cloud location). Because users' IP addresses frequently change, an approval tied to a single IP address may stop working as they move between networks.
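In practice, checking whether a request originates from an allowed IP Address environment is a CIDR membership test. A minimal sketch using Python's standard ipaddress module (illustrative only, not Redivis code) shows why a narrow subnet only matches addresses in its range, while the 0.0.0.0/0 wildcard matches everything and leaves enforcement to the approval rule:

```python
import ipaddress

def ip_allowed(requester_ip: str, allowed_cidrs: list[str]) -> bool:
    """Return True if the requester's IP falls within any configured subnet."""
    ip = ipaddress.ip_address(requester_ip)
    return any(ip in ipaddress.ip_network(cidr) for cidr in allowed_cidrs)

# A persistent on-premises cluster range only matches addresses inside it...
print(ip_allowed("192.0.2.17", ["192.0.2.0/24"]))   # True
print(ip_allowed("203.0.113.5", ["192.0.2.0/24"]))  # False

# ...whereas the wildcard matches any address, so access control relies
# entirely on the "must be approved by an administrator" rule.
print(ip_allowed("203.0.113.5", ["0.0.0.0/0"]))     # True
```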

Google BigQuery

Specify the Google Cloud project(s) to which BigQuery exports are allowed.

Google Cloud Storage

Specify the Google Cloud Storage bucket(s) to which exports are allowed.
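Both cloud environment types amount to an allow-list check against the destination a user names in their export request. A hypothetical sketch (the names below are illustrative, not a Redivis API):

```python
# Hypothetical allow-lists, as configured on the export environment.
ALLOWED_BIGQUERY_PROJECTS = {"my-lab-analytics", "shared-research-project"}
ALLOWED_GCS_BUCKETS = {"gs://lab-approved-exports"}

def destination_allowed(environment_type: str, destination: str) -> bool:
    """Return True if the export target is one of the configured destinations."""
    if environment_type == "bigquery":
        return destination in ALLOWED_BIGQUERY_PROJECTS
    if environment_type == "gcs":
        return destination in ALLOWED_GCS_BUCKETS
    return False

print(destination_allowed("bigquery", "shared-research-project"))  # True
print(destination_allowed("gcs", "gs://personal-bucket"))          # False
```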

Google Data Studio

Allow data to be used within Google Data Studio dashboards.

Custom location

A "catch-all" environment, the custom location allows for researchers to request export on a case-by-case basis. For example, if they want to export a specific table to a given GCS bucket, or to a specific IP address, they'll be able to make this request which will then be sent to administrators for approval.

Export environment rules

By default, any exports to the specified environment will be permitted (assuming the environment is assigned to a given dataset). However, you can specify additional rules associated with the environment to allow fine-grained control:

All exports must be approved by an administrator

If this rule is turned on, every export to the specified environment must first be approved by an administrator.

Export size

If enabled, only exports above the specified size will require administrator approval. The size can be specified in terms of gigabytes (GB) or number of rows in the exported table.

Export size restrictions do not strictly disallow exports above a given size, as users can always break a table up into multiple smaller subsets and export each one separately.
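Taken together, the approval and size rules combine roughly as sketched below. This is hypothetical logic for illustration only, not Redivis code:

```python
def requires_approval(
    export_size_gb: float,
    export_rows: int,
    approval_rule_on: bool,
    size_threshold_gb: float | None = None,
    row_threshold: int | None = None,
) -> bool:
    """Illustrative combination of the two rules described above."""
    if not approval_rule_on:
        return False  # exports to this environment go through without review
    if size_threshold_gb is None and row_threshold is None:
        return True   # every export must be approved by an administrator
    # With an export size rule enabled, only exports above the threshold
    # (in GB or in rows) are routed to administrators for approval.
    too_large = size_threshold_gb is not None and export_size_gb > size_threshold_gb
    too_many_rows = row_threshold is not None and export_rows > row_threshold
    return too_large or too_many_rows

# A 2 GB export under a 5 GB threshold proceeds without review...
print(requires_approval(2.0, 1_000_000, approval_rule_on=True, size_threshold_gb=5.0))   # False
# ...while a 12 GB export is routed to administrators first.
print(requires_approval(12.0, 80_000_000, approval_rule_on=True, size_threshold_gb=5.0)) # True
```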
