Redivis Documentation
Quotas and limits

Last updated 5 months ago

Redivis has limits in place to protect its systems and guide correct usage. Some of these limits are more flexible than others; please contact us if your specific use case is affected by them.

Datasets

  • Max versions: 1,000

  • Max tables: 1,000

  • Max unstructured files: 100M

  • Max unstructured file size (per-file): 5TB

  • Max supporting documentation file size: 100MB

  • Naming rules:

    • Name length: Between 3 and 80 characters

    • The escaped name (lowercase, with all runs of non-alphanumeric characters collapsed to "_") must be unique across all datasets in the organization. If the dataset is owned by a user, it must also be unique across all workflows owned by that user.
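The escaped-name rule above can be sketched as follows. This is an illustrative helper, not Redivis's actual implementation; edge cases (such as leading or trailing underscores) may be handled differently.

```python
import re

def escape_name(name: str) -> str:
    """Lowercase the name and collapse each run of non-alphanumeric
    characters to a single "_", per the escaped-name rule above
    (a sketch, not Redivis's actual implementation)."""
    return re.sub(r"[^a-z0-9]+", "_", name.lower())

# Two dataset names that would collide under this rule:
escape_name("County Health Data")   # "county_health_data"
escape_name("county-health_data")   # also "county_health_data"
```

Because uniqueness is checked against the escaped form, two datasets whose names differ only in casing or punctuation cannot coexist in the same organization.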

Exports

  • Max monthly export: 100GB (unless your organization has configured a custom egress project)

  • Max export sizes:

    • CSV: 10TB

    • NDJSON: 10TB

    • Parquet: 10TB*

    • Avro: 10TB*

    • SAV: 10GB**

    • DTA: 10GB**

* Parquet and Avro exports larger than 1GB will be exported as a ZIP composed of multiple individual files of the corresponding format.

** To load larger files into SPSS, Stata, and SAS, you can export a CSV alongside the accompanying load script.

Notebooks

  • Max concurrent notebooks, per user: 5

  • Naming rules:

    • Name length: Between 1 and 80 characters

    • The escaped name (lowercase, with all runs of non-alphanumeric characters collapsed to "_") must be unique across all notebooks in the workflow.

Organizations

  • Max datasets per organization: 10,000

  • Max members per organization: 10,000

Queries

  • Max concurrent queries, per user: 5

  • Max queued queries, per user: 1,000

  • Max query duration: 6 hours

  • Max query output size: 10GB compressed (the uncompressed size will often be larger)

  • Max query string length: 1M characters

  • Max query compute seconds: 100k

  • Max total query compute seconds (per 24hrs): 5M

Tables

  • Max size: 100TB

  • Max row size: 100MB

  • Max variables: 9,990 (across all versions of the table)*

  • Max description length: 5,000 characters

  • Naming rules:

    • Name length: Between 1 and 80 characters

    • Disallowed names: _source_

    • The escaped name (lowercase, collapse all non-alphanumeric characters to "_") must be unique across tables in the associated dataset / workflow.

* While Redivis supports up to 9,990 variables per table, we strongly recommend restructuring your data to have fewer variables if possible. Such "wide" tables will generally be less performant and harder for researchers to navigate and query compared to "tall" tables with a few variables and many records.
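The "wide" vs. "tall" distinction above can be illustrated with a toy reshape in pure Python (column and field names are hypothetical, chosen only for the example):

```python
# A "wide" row: one record, with one variable per measurement year.
wide_row = {"patient_id": 17, "score_2021": 0.4, "score_2022": 0.6, "score_2023": 0.9}

# The "tall" equivalent: a few variables (patient_id, year, score), many records.
# Each score_* column becomes its own record instead of its own variable.
tall_rows = [
    {"patient_id": wide_row["patient_id"], "year": int(col.split("_")[1]), "score": val}
    for col, val in wide_row.items()
    if col.startswith("score_")
]
# tall_rows holds three records with three variables each,
# rather than one record with four variables.
```

Adding a new year to the tall table means appending records; adding one to the wide table means adding a variable, which is what eventually runs into the 9,990-variable limit.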

Transforms

  • Max steps per transform: 100

  • Max concurrent transforms, per user: 5

  • Max generated SQL query string length: 1M characters

  • Max per-query compute seconds: 1M

  • Max transform output size: 1TB

  • Naming rules:

    • Name length: Between 1 and 80 characters

    • The escaped name (lowercase, collapse all non-alphanumeric characters to "_") must be unique across all transforms in the workflow.

Uploads

  • Max uploads per table, per version: 500*

  • Max file sizes, tabular uploads:

    • Delimited (csv, tsv, etc.): 5TB**

    • jsonl, json, geojson, geojsonl, kml, xlsx: 5TB**

    • SAS(.sas7bdat): 5TB**

    • Avro, Parquet, ORC: 5TB

    • shp, shp.zip, Stata(.dta), SPSS(.sav), Excel(xls, xlsx): 100GB

  • Max file size, unstructured data files: 5TB

Variables

  • Max label length: 256 characters

  • Max description length: 5,000 characters

  • Value labels:

    • Value length cannot exceed 32 characters

    • Label length cannot exceed 256 characters

    • Total length cannot exceed 1M characters (sum of all values + labels)

  • Naming rules:

    • Name length: Between 1 and 60 characters

    • Only alphanumeric characters and "_" are allowed; names cannot begin with a number.

    • Must be unique within its table (casing ignored)
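The variable naming rules above can be sketched as a validator. This is illustrative only (not Redivis's actual validation code), and it assumes a leading underscore is permitted, which the rules do not state either way:

```python
import re

# 1-60 characters, alphanumeric or "_", must not begin with a number.
# Assumption: a leading "_" is allowed; only a leading digit is ruled out.
VALID_VARIABLE_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]{0,59}$")

def is_valid_variable_name(name: str) -> bool:
    """Check a single variable name against the rules above (sketch)."""
    return bool(VALID_VARIABLE_NAME.match(name))

def unique_ignoring_case(names):
    """Check uniqueness within a table, ignoring casing, per the rules above."""
    lowered = [n.lower() for n in names]
    return len(lowered) == len(set(lowered))
```

For example, `age_2020` passes, `2020_age` fails (leading digit), and `Age` and `age` cannot coexist in the same table.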

Workflows

  • Max data sources: 100

  • Max transforms: 1,000

  • Max notebooks: 1,000

  • Max tables: 1,000

  • Max parameters: 1,000

  • Max values per parameter: 10,000

  • Naming rules:

    • Name length: Between 3 and 80 characters

    • The escaped name (lowercase, collapse all non-alphanumeric characters to "_") must be unique across all datasets and workflows owned by the user.

* Depending on the length of variable names, the actual limits for max uploads may be lower. In general, the total length of all variable names cannot exceed 185,000 characters, and the total number of variables multiplied by the total number of uploads (for a single version of the table) cannot exceed 400,000. If either of these limits is reached, the upload will fail with an accompanying error message.

** The 5TB limit applies to the final size of a file after decompression and/or conversion (e.g., SAS files are converted to CSV before final import, and shapefiles are converted to geojson); this final file must be less than 5TB.
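The combined upload limits in the first footnote can be checked ahead of time. The helper below is illustrative (it is not part of any Redivis API); the limit values are copied from the footnote:

```python
def upload_plan_fits(num_variables: int, num_uploads: int, total_name_length: int) -> bool:
    """Check an upload plan against the combined limits described above:
    total variable-name length <= 185,000 characters, and
    variables x uploads (per table version) <= 400,000."""
    return total_name_length <= 185_000 and num_variables * num_uploads <= 400_000

# 2,000 variables across 500 uploads exceeds the 400,000 product limit:
upload_plan_fits(num_variables=2000, num_uploads=500, total_name_length=30_000)  # False
```

A table with, say, 100 variables uploaded in 500 pieces (100 × 500 = 50,000) stays comfortably within the limit.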
