Daily Reports
Daily reports are available for Enterprise contracts and need to be enabled for you once. Please reach out to our support team, your sales contact, or our Slack channel if you want this enabled.
We aim to have data reports ready by 10am (PST), but there is always a possibility that they land late. When consuming this API, implement a fallback for such cases (for example, retrying later in the day).
Authorization
All requests must include the STATSIG-API-KEY field in the header. The value should be a Console API Key, which can be created in the 'Project Settings' > 'API Keys' tab.
To use the 'try it' section on this page, enter your Console API Key into the box below.
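Outside the 'try it' box, a minimal programmatic sketch is shown below. It is non-authoritative: the endpoint URL, query parameter, and retry cadence are placeholders rather than the documented API surface. It also illustrates the fallback behavior recommended above for days when the report lands after 10am (PST).
import time
import requests

# Placeholders -- substitute the documented reports endpoint and your own key.
REPORTS_URL = "https://<statsig-console-api>/daily_reports"  # hypothetical URL
CONSOLE_API_KEY = "console-xxxxxxxx"                         # your Console API Key

def fetch_daily_report(report_date, max_attempts=6, wait_seconds=1800):
    """Poll for a daily report, backing off if it has not landed yet."""
    headers = {"STATSIG-API-KEY": CONSOLE_API_KEY}
    for attempt in range(max_attempts):
        resp = requests.get(REPORTS_URL, headers=headers, params={"date": report_date})
        if resp.ok:
            return resp.content
        # The report may simply not be ready yet (10am PST target, occasionally late),
        # so wait and retry instead of failing immediately.
        time.sleep(wait_seconds)
    raise RuntimeError(f"Report for {report_date} not available after {max_attempts} attempts")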
Report Schema - type = first_exposures
Column Name | Data Type | Description |
---|---|---|
company_id | string | The ID of the company |
unit_id | string | Refers to the unit identifier used in the experiment |
unit_type | string | Refers to the unit type used in the experiment |
exposure_type | string | Statsig Product used: "check_gate" for Feature Gates, and "get_experiment" for experiments |
name | string | The name of the gate/experiment |
rule | string | Rule ID for gates; 'experiment' for experiments |
experiment_group | string | The group the user was assigned to |
first_exposure_utc | timestamp | The UTC timestamp when the user was first assigned to the experiment |
first_exposure_pst_date | date | The date in PST when the user was first assigned to the experiment |
as_of_pst_date | date | The date this data was generated |
percent | float | Group's rollout percent. This is the rule-level pass-rate, or the experiment's group size |
rollout | int | Rollout version number |
user_dimensions | string | JSON-formatted key-value pairs describing the user's attributes at the time of first exposure |
inserted_at | timestamp | The timestamp when this report was generated |
rule_name | string | Rule name for gates; 'experiment' for experiments |
group_id | string | True/False for gates; group_id for experiments |
- After the feature is turned on for you, it will be ready when our pipeline runs. Generally, you can expect it around the time you see cumulative exposures.
- We commit to maintaining the report format, including column order and types, but we recommend that you reference columns by name and cast column types when parsing the file, as sketched below.
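As a non-authoritative sketch of that recommendation, the snippet below parses a first_exposures report by column name and casts types explicitly instead of relying on column order. The CSV delivery and the ISO date format are assumptions; adjust the reader to the format you actually receive.
import csv
import json
from datetime import date

def parse_first_exposures(path):
    """Read a first_exposures report, referencing columns by name and casting types."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # column names, not positions
            rows.append({
                "unit_id": row["unit_id"],
                "name": row["name"],
                "experiment_group": row["experiment_group"],
                "percent": float(row["percent"]),
                "rollout": int(row["rollout"]),
                "first_exposure_pst_date": date.fromisoformat(row["first_exposure_pst_date"]),
                # user_dimensions is JSON-formatted key-value pairs
                "user_dimensions": json.loads(row["user_dimensions"]) if row["user_dimensions"] else {},
            })
    return rows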
Egress exposure data to your warehouse
Besides the exposure reports available through the API, we also offer a data warehouse export to Snowflake for our enterprise customers.
- At the top of your console, tap the "?" icon and navigate to Tools and Resources -> Exports List. Under the Scheduled tab, create a new connection.
- In your Snowflake warehouse, create a custom user for this export purpose.
- Run the script to create the table and grant your user permissions to operate on it.
-- create the table
CREATE OR REPLACE TABLE <database>.<schema>.<exportTableName> (
COMPANY_ID VARCHAR(16777216),
UNIT_ID VARCHAR(16777216),
UNIT_TYPE VARCHAR(16777216),
EXPOSURE_TYPE VARCHAR(16777216),
NAME VARCHAR(16777216),
RULE VARCHAR(16777216),
EXPERIMENT_GROUP VARCHAR(16777216),
FIRST_EXPOSURE_UTC TIMESTAMP_NTZ(9),
FIRST_EXPOSURE_PST_DATE DATE,
AS_OF_PST_DATE DATE,
PERCENT FLOAT,
ROLLOUT NUMBER(38,0),
USER_DIMENSIONS VARCHAR(16777216),
INSERTED_AT TIMESTAMP_NTZ(9) NOT NULL,
RULE_NAME VARCHAR(16777216),
GROUP_ID VARCHAR(16777216)
);
-- grant the custom user privileges to operate on the table
GRANT INSERT, UPDATE, DELETE ON TABLE <database>.<schema>.<exportTableName> TO ROLE <generatedRole>;
- Connect. Statsig will then run tests to make sure all privileges are granted. Once the tests succeed, you are enrolled in the feature.
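If you want to sanity-check the setup yourself before connecting, the rough sketch below (using the snowflake-connector-python package, with placeholder credentials and object names) lists the grants on the export table so you can confirm the generated role holds the INSERT, UPDATE, and DELETE privileges from the script above.
import snowflake.connector

# Placeholder connection details -- substitute your own account and export user.
conn = snowflake.connector.connect(
    user="<statsig_export_user>",
    password="<password>",
    account="<account_identifier>",
)
cur = conn.cursor()
# List grants on the export table; the generated role should hold INSERT, UPDATE, DELETE.
cur.execute("SHOW GRANTS ON TABLE <database>.<schema>.<exportTableName>")
for grant in cur.fetchall():
    print(grant)
cur.close()
conn.close()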
Egress event data to your warehouse
We also offer custom event exports to Snowflake for our enterprise customers.
Create a table with the following schema and follow the same steps as in the exposure export instructions above to enable the export.
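-- create the custom event export table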
CREATE OR REPLACE TABLE <DATABASE>.<SCHEMA>.<TABLE> (
USER_ID STRING,
STABLE_ID STRING,
CUSTOM_IDS VARIANT,
TIMESTAMP TIMESTAMP_NTZ,
EVENT_NAME STRING,
EVENT_VALUE STRING,
USER_OBJECT VARIANT,
STATSIG_METADATA VARIANT,
COMPANY_METADATA VARIANT
);
Report Schema - type = pulse_daily
Column Name | Description |
---|---|
company_id | STRING, the id of the company |
name | STRING, the name of the gate/experiment |
rule | STRING, rule ID for gates; 'experiment' for experiments |
rule_name | STRING, rule name for gates; 'experiment' for experiments |
experiment_group | STRING, the group the user was assigned to |
pst_date | DATE, the 24-hour window the data refers to. All dates are anchored from 12:00a -> 11:59p PST. |
metric_type | STRING, the type of the metric, e.g. ratio |
metric_name | STRING, the name of the metric |
metric_dimension | STRING, the name of the metric dimension. '!statsig_topline' is the overall metric with no slicing |
units | BIGINT, the number of users included in this metric |
participating_units | BIGINT, the number of users who participated in this metric |
mean | DOUBLE, the average value of this metric across units (or participating units when applicable) |
total | DOUBLE, the sum of this metric across units |
stddev | DOUBLE, the standard deviation of this metric across units |
stderr | DOUBLE, the standard error of this metric across units |
varpop | DOUBLE, the population variance (VAR_POP) of this metric |
- New columns may be added in the future.
- Exposures on the day when a decision is made are excluded from the report, as they are not part of the experiment.
- Metric values are non-CUPED.
- After the feature is turned on for you, it will be ready when our pipeline runs. Generally, you can expect it around the time you see pulse results.
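As with the first_exposures report, parse pulse_daily by column name. The sketch below assumes a CSV delivery (file format and path are placeholders) and collects the topline (un-sliced) daily values of one metric per experiment group.
import csv
from collections import defaultdict

def topline_metric_by_group(path, metric_name):
    """Collect the overall (un-sliced) daily values of one metric per experiment group."""
    results = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):                          # reference columns by name
            if row["metric_name"] != metric_name:
                continue
            if row["metric_dimension"] != "!statsig_topline":  # overall metric, no slicing
                continue
            results[row["experiment_group"]].append({
                "pst_date": row["pst_date"],
                "units": int(row["units"]),
                "mean": float(row["mean"]),
                "stderr": float(row["stderr"]),
            })
    return dict(results)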
Please note that the previous get_daily_reports endpoint has been deprecated. If you were using it, please migrate to the new version; the old docs are still available here.