feat(business-types): initial implementation of SIP-78 (#18794)
* add BUSINESS_TYPE_ADDONS to config with example callback
* Removing unneeded whitespace
* [Work in progress] Modifying cidr function to allow for a single IP and adding port outline
* Added test REST endpoint, added some more ports. I've thrown in a test.py script as well that will try to connect to the business_type endpoint.
* Moving code from config.py into the business API. A very simple API is exposed that allows someone to call a checkport endpoint and get back a response.
* Removing commented-out bits
* Adding function dict back to the config
* Moving business_type endpoint to charts
* Adding schema for GET endpoint
* Removing imports, updating docstring, fixing typo. Just some small changes as described in the title. I've updated test.py as well so it functions with the endpoint changes.
* Adding translation dict
* Fixing ops
* Adding check for list
* Modifying changes to add quotes where needed. Also changed BusinessTypeResponse to resp.
* Adding in some code to call the filter config. If a column starts with "cidr_" it will call the code in config.py to try to translate the filter. Nothing is changed in the JSON being executed; some information is just dumped to the console.
* Porting Ryan's changes
* Adding migration script (as per Ryan's PR)
* Fixing typo
* Prettier fixes
* [CLDN-1043] Adding rough version of filter changes for business types
* fix down migration
* Fixing bugs after merge
* adding functionality to apply filters in back end
* Fixing linting issues
* fix down revision
* Changing conversion callback to handle multiple values at once
* Adding string representation of values
* Code cleanup plus fixing debounce to only be called once for each entry
* Removing unneeded logging
* Changing operator list to use string values
* Using text value operators
* Removing clear operator call
* Moving business type endpoints
* fix down revision
* Adding port functions
* update migration
* fix bad rebase and add feature flag
* implement validator
* don't add invalid values to response
* [CLDN-1205] Added a new exception type for a business type translation error. Added the error message in the display_value field within the business type response. Modified the IP and Port business types to populate the error message field in the response if an error occurs.
* [CLDN-1205] Added meaningful error message for port translation errors
* Removing status field from BusinessTypeResponse and adding in error message
* [CLDN-1205] Added check to make sure the port business type is within the valid range of ports; if it is not, it will populate the error message
* [CLDN-1205] Fixed the if statement that checks whether string_value is in the valid range of port numbers; it did not correctly verify this before now
* [CLDN-1205] Fixed an error where it was trying to use string_value in <= statements. string_value is now cast to an integer if it is numeric, which allows <= operators to be used on it.
* [CLDN-1207] Added unit tests for the cidr_func and port_translation_func functions located in /superset/config.py
* [CLDN-1207] removed the assertRaises line as it does not work with the cidr_func and port_translation_func functions
* [CLDN-1207] Added the skeleton of the test_cidr_translate_filter_func unit test; still need to update the expected response from the function
* [CLDN-1207] Added the remainder of the back-end unit tests for the business types
* [CLDN-1207] Fixed the syntax error which caused the test_cidr_translate_filter_func_NOT_IN_double unit test to fail
* [CLDN-1207] Removed the logging that was added for debugging purposes
* [CLDN-1207] Formatted the commands_tests.py file to make it nicer to read through
* [CLDN-1207] Fixed the code so that it conforms to the pylint requirements (i.e., pylint no longer complains about the code in commands_tests.py)
* [CLDN-1207] Modified some of the docstrings so they make better use of the 100-character line limit
* [CLDN-1207] Added the beginnings of the unit tests for the business types API
* [CLDN-1207] Added a comment to the top of the commands_tests.py file explaining how to run the unit tests. This prevents the next person who tries to run them from having to waste time trying the different forms of testing that Superset supports (e.g., pytest, tox, etc.).
* [CLDN-1207] Added a grammar fix to the comments describing how to run the unit tests
* [CLDN-1207] Modified the description of the business_type API endpoints as they did not represent what the API was actually doing
* [CLDN-1207] Added further instructions on how to run the unit tests that are within the business_type/api_tests.py file
* add request validation
* disable request if business type missing
* [CLDN-1207] Unit tests for the business type API are now working; however, they need to be modified to use @mock as we don't want to have to run the server to be able to run the unit tests
* Removing business type definitions from config
* Adding select to only show valid business types
* Fixed Enzyme tests
* Added scaffolding for selecting filter dropdown
* Adding integration tests
* fix revision
* fix typos and unnecessary requests
* break out useBusinessTypes
* Added front-end RTL unit tests for the business type API endpoint
* Fixed error from unit tests
* Added a unit test to ensure the operator list is updated after a business type API response is received
* Removing select component for business types
* Adding feature flag and allowing saving when no business type present
* fixing useEffect hooks
* Adding feature flag to model
* Changing behavior such that an empty string returns a default response
* add form validation
* Modified comments in unit test as the command to run the test has changed
* Modified comments in unit test as the filename to run the test has changed
* Modified the api_tests.py file to conform to the linting requirements
* Changed the name of one of the tests to reflect what the test is actually testing
* Added Cypress back to the package.json
* Added informative comments
* Updated comments in files and removed imports which were not being used
* Changes made by npm run prettier
* Fixed spelling mistakes
* Updated models.py to remove placeholder comments used in development
* Added feature flag mocking in unit test
* Fixing OpenAPI failure
* Fixing business types to pass unit tests
* Reverting unsafe connections back to false
* Removing print statement
* Adding business type to export test
* setting default feature flag to false for business type
* Reverting pre-commit
* Reverting pre-commit and running pre-commit
* Reverting pre-commit and running pre-commit
* Fixing formatting
* Adding license
* Fixing linting
* Protecting API endpoints
* updating model
* Fixing code path when business type exists
* Linting
* Linting
* Fixing linting
* Fixing spelling
* Fixing schemas
* Fixing app import
* fixing item render
* Added RTL test to make sure business type operator list is updated after API response
* Fixing linting
* fix migration
* Changing unit tests
* Fixing import and DB migration after rebase
* Renaming to advanced types
* Fixing linting
* More renaming
* Removing unneeded change
* Fixing linting and test errors
* Removing unused imports
* linting
* Adding more detailed name for migration
* Moving files to plugins
* more renaming
* Fixing schema name
* Disabling feature flag that should not be enabled by default
* Adding extra check
* Name change
* formatting
* Fixing equals check
* Moving all advanced type classes and types to one file, and converting tests to functional
* Adding advanced type to test and fixing linting

Co-authored-by: Ville Brofeldt <ville.v.brofeldt@gmail.com>
Co-authored-by: Dan Parent <daniel.parent@cse-cst.gc.ca>
Co-authored-by: GITHUB_USERNAME <EMAIL>
Co-authored-by: cccs-Dustin <96579982+cccs-Dustin@users.noreply.github.com>
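The commit message above repeatedly refers to cidr_func and port_translation_func, the example conversion callbacks in superset/config.py that translate a user-typed string (an IP, a CIDR block, or a port number) into values a filter can use. A minimal sketch of that idea, using hypothetical helper names — the actual Superset callbacks populate a response object with display_value and error_message fields rather than returning plain values:

```python
import ipaddress


def translate_cidr(value: str):
    """Translate an IP or CIDR string into integer form for filtering.

    A lone IP becomes a single integer; a CIDR block becomes a
    (first, last) integer range. Raises ValueError on unparseable input.
    """
    if "/" in value:
        network = ipaddress.ip_network(value, strict=False)
        return (int(network[0]), int(network[-1]))
    return int(ipaddress.ip_address(value))


def translate_port(value: str):
    """Validate that a string is a port number in the range 1-65535."""
    if not value.isnumeric():
        raise ValueError(f"{value} is not a valid port number")
    port = int(value)
    if not 1 <= port <= 65535:
        raise ValueError(f"{value} is outside the valid port range")
    return port
```

Raising on bad input mirrors the range check and translation-error handling described in the CLDN-1205 items above, where the error message ends up in the response instead of an exception reaching the user.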
This commit is contained in:
parent 265013101a
commit ddc01ea781
@@ -3205,6 +3205,12 @@
         "owners": {
           "$ref": "#/components/schemas/DashboardRestApi.get_list.User2"
         },
+        "advanced_data_type": {
+          "maxLength": 255,
+          "minLength": 1,
+          "nullable": true,
+          "type": "string"
+        },
         "position_json": {
           "nullable": true,
           "type": "string"
@@ -4245,6 +4251,12 @@
         "nullable": true,
         "type": "string"
       },
+      "advanced_data_type": {
+        "maxLength": 255,
+        "minLength": 1,
+        "nullable": true,
+        "type": "string"
+      },
       "uuid": {
         "format": "uuid",
         "nullable": true,
@@ -23,6 +23,7 @@ export enum FeatureFlag {
   ALERT_REPORTS = 'ALERT_REPORTS',
   CLIENT_CACHE = 'CLIENT_CACHE',
   DYNAMIC_PLUGINS = 'DYNAMIC_PLUGINS',
+  ENABLE_ADVANCED_DATA_TYPES = 'ENABLE_ADVANCED_DATA_TYPES',
   SCHEDULED_QUERIES = 'SCHEDULED_QUERIES',
   SQL_VALIDATORS_BY_ENGINE = 'SQL_VALIDATORS_BY_ENGINE',
   ESTIMATE_QUERY_COST = 'ESTIMATE_QUERY_COST',
@@ -173,22 +173,46 @@ function ColumnCollectionTable({
   return (
     <CollectionTable
       collection={columns}
-      tableColumns={[
-        'column_name',
-        'type',
-        'is_dttm',
-        'main_dttm_col',
-        'filterable',
-        'groupby',
-      ]}
-      sortColumns={[
-        'column_name',
-        'type',
-        'is_dttm',
-        'main_dttm_col',
-        'filterable',
-        'groupby',
-      ]}
+      tableColumns={
+        isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)
+          ? [
+              'column_name',
+              'advanced_data_type',
+              'type',
+              'is_dttm',
+              'main_dttm_col',
+              'filterable',
+              'groupby',
+            ]
+          : [
+              'column_name',
+              'type',
+              'is_dttm',
+              'main_dttm_col',
+              'filterable',
+              'groupby',
+            ]
+      }
+      sortColumns={
+        isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)
+          ? [
+              'column_name',
+              'advanced_data_type',
+              'type',
+              'is_dttm',
+              'main_dttm_col',
+              'filterable',
+              'groupby',
+            ]
+          : [
+              'column_name',
+              'type',
+              'is_dttm',
+              'main_dttm_col',
+              'filterable',
+              'groupby',
+            ]
+      }
       allowDeletes
       allowAddItem={allowAddItem}
       itemGenerator={itemGenerator}
@@ -243,6 +267,20 @@ function ColumnCollectionTable({
             }
           />
         )}
+        {isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES) ? (
+          <Field
+            fieldKey="advanced_data_type"
+            label={t('Advanced data type')}
+            control={
+              <TextControl
+                controlId="advanced_data_type"
+                placeholder={t('Advanced Data type')}
+              />
+            }
+          />
+        ) : (
+          <></>
+        )}
         <Field
           fieldKey="python_date_format"
           label={t('Datetime format')}
@@ -300,62 +338,131 @@ function ColumnCollectionTable({
           </Fieldset>
         </FormContainer>
       }
-      columnLabels={{
-        column_name: t('Column'),
-        type: t('Data type'),
-        groupby: t('Is dimension'),
-        is_dttm: t('Is temporal'),
-        main_dttm_col: t('Default datetime'),
-        filterable: t('Is filterable'),
-      }}
+      columnLabels={
+        isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)
+          ? {
+              column_name: t('Column'),
+              advanced_data_type: t('Advanced data type'),
+              type: t('Data type'),
+              groupby: t('Is dimension'),
+              is_dttm: t('Is temporal'),
+              main_dttm_col: t('Default datetime'),
+              filterable: t('Is filterable'),
+            }
+          : {
+              column_name: t('Column'),
+              type: t('Data type'),
+              groupby: t('Is dimension'),
+              is_dttm: t('Is temporal'),
+              main_dttm_col: t('Default datetime'),
+              filterable: t('Is filterable'),
+            }
+      }
       onChange={onColumnsChange}
-      itemRenderers={{
-        column_name: (v, onItemChange, _, record) =>
-          editableColumnName ? (
-            <StyledLabelWrapper>
-              {record.is_certified && (
-                <CertifiedBadge
-                  certifiedBy={record.certified_by}
-                  details={record.certification_details}
-                />
-              )}
-              <TextControl value={v} onChange={onItemChange} />
-            </StyledLabelWrapper>
-          ) : (
-            <StyledLabelWrapper>
-              {record.is_certified && (
-                <CertifiedBadge
-                  certifiedBy={record.certified_by}
-                  details={record.certification_details}
-                />
-              )}
-              {v}
-            </StyledLabelWrapper>
-          ),
-        main_dttm_col: (value, _onItemChange, _label, record) => {
-          const checked = datasource.main_dttm_col === record.column_name;
-          const disabled = !columns.find(
-            column => column.column_name === record.column_name,
-          ).is_dttm;
-          return (
-            <Radio
-              data-test={`radio-default-dttm-${record.column_name}`}
-              checked={checked}
-              disabled={disabled}
-              onChange={() =>
-                onDatasourceChange({
-                  ...datasource,
-                  main_dttm_col: record.column_name,
-                })
-              }
-            />
-          );
-        },
-        type: d => (d ? <Label>{d}</Label> : null),
-        is_dttm: checkboxGenerator,
-        filterable: checkboxGenerator,
-        groupby: checkboxGenerator,
-      }}
+      itemRenderers={
+        isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)
+          ? {
+              column_name: (v, onItemChange, _, record) =>
+                editableColumnName ? (
+                  <StyledLabelWrapper>
+                    {record.is_certified && (
+                      <CertifiedBadge
+                        certifiedBy={record.certified_by}
+                        details={record.certification_details}
+                      />
+                    )}
+                    <EditableTitle
+                      canEdit
+                      title={v}
+                      onSaveTitle={onItemChange}
+                    />
+                  </StyledLabelWrapper>
+                ) : (
+                  <StyledLabelWrapper>
+                    {record.is_certified && (
+                      <CertifiedBadge
+                        certifiedBy={record.certified_by}
+                        details={record.certification_details}
+                      />
+                    )}
+                    {v}
+                  </StyledLabelWrapper>
+                ),
+              main_dttm_col: (value, _onItemChange, _label, record) => {
+                const checked = datasource.main_dttm_col === record.column_name;
+                const disabled = !columns.find(
+                  column => column.column_name === record.column_name,
+                ).is_dttm;
+                return (
+                  <Radio
+                    data-test={`radio-default-dttm-${record.column_name}`}
+                    checked={checked}
+                    disabled={disabled}
+                    onChange={() =>
+                      onDatasourceChange({
+                        ...datasource,
+                        main_dttm_col: record.column_name,
+                      })
+                    }
+                  />
+                );
+              },
+              type: d => (d ? <Label>{d}</Label> : null),
+              advanced_data_type: d => (
+                <Label onChange={onColumnsChange}>{d}</Label>
+              ),
+              is_dttm: checkboxGenerator,
+              filterable: checkboxGenerator,
+              groupby: checkboxGenerator,
+            }
+          : {
+              column_name: (v, onItemChange, _, record) =>
+                editableColumnName ? (
+                  <StyledLabelWrapper>
+                    {record.is_certified && (
+                      <CertifiedBadge
+                        certifiedBy={record.certified_by}
+                        details={record.certification_details}
+                      />
+                    )}
+                    <TextControl value={v} onChange={onItemChange} />
+                  </StyledLabelWrapper>
+                ) : (
+                  <StyledLabelWrapper>
+                    {record.is_certified && (
+                      <CertifiedBadge
+                        certifiedBy={record.certified_by}
+                        details={record.certification_details}
+                      />
+                    )}
+                    {v}
+                  </StyledLabelWrapper>
+                ),
+              main_dttm_col: (value, _onItemChange, _label, record) => {
+                const checked = datasource.main_dttm_col === record.column_name;
+                const disabled = !columns.find(
+                  column => column.column_name === record.column_name,
+                ).is_dttm;
+                return (
+                  <Radio
+                    data-test={`radio-default-dttm-${record.column_name}`}
+                    checked={checked}
+                    disabled={disabled}
+                    onChange={() =>
+                      onDatasourceChange({
+                        ...datasource,
+                        main_dttm_col: record.column_name,
+                      })
+                    }
+                  />
+                );
+              },
+              type: d => (d ? <Label>{d}</Label> : null),
+              is_dttm: checkboxGenerator,
+              filterable: checkboxGenerator,
+              groupby: checkboxGenerator,
+            }
+      }
     />
   );
 }
@@ -98,6 +98,7 @@ export default class AdhocFilterEditPopover extends React.Component {
     this.onMouseMove = this.onMouseMove.bind(this);
     this.onMouseUp = this.onMouseUp.bind(this);
     this.onAdhocFilterChange = this.onAdhocFilterChange.bind(this);
+    this.setSimpleTabIsValid = this.setSimpleTabIsValid.bind(this);
     this.adjustHeight = this.adjustHeight.bind(this);
     this.onTabChange = this.onTabChange.bind(this);
@@ -106,6 +107,7 @@ export default class AdhocFilterEditPopover extends React.Component {
       width: POPOVER_INITIAL_WIDTH,
       height: POPOVER_INITIAL_HEIGHT,
       activeKey: this.props?.adhocFilter?.expressionType || 'SIMPLE',
+      isSimpleTabValid: true,
     };
 
     this.popoverContentRef = React.createRef();
@@ -124,6 +126,10 @@ export default class AdhocFilterEditPopover extends React.Component {
     this.setState({ adhocFilter });
   }
 
+  setSimpleTabIsValid(isValid) {
+    this.setState({ isSimpleTabValid: isValid });
+  }
+
   onSave() {
     this.props.onChange(this.state.adhocFilter);
     this.props.onClose();
@@ -214,6 +220,7 @@ export default class AdhocFilterEditPopover extends React.Component {
                 onHeightChange={this.adjustHeight}
                 partitionColumn={partitionColumn}
                 popoverRef={this.popoverContentRef.current}
+                validHandler={this.setSimpleTabIsValid}
               />
             </ErrorBoundary>
           </Tabs.TabPane>
@@ -252,7 +259,7 @@ export default class AdhocFilterEditPopover extends React.Component {
           </Button>
           <Button
             data-test="adhoc-filter-edit-popover-save-button"
-            disabled={!stateIsValid}
+            disabled={!stateIsValid || !this.state.isSimpleTabValid}
             buttonStyle={
               hasUnsavedChanges && stateIsValid ? 'primary' : 'default'
             }
@@ -31,8 +31,15 @@ import {
   OPERATOR_ENUM_TO_OPERATOR_TYPE,
 } from 'src/explore/constants';
 import AdhocMetric from 'src/explore/components/controls/MetricControl/AdhocMetric';
+import { render, screen, act, waitFor } from '@testing-library/react';
+import { supersetTheme, FeatureFlag, ThemeProvider } from '@superset-ui/core';
+import * as featureFlags from 'src/featureFlags';
+import userEvent from '@testing-library/user-event';
+import fetchMock from 'fetch-mock';
 
 import AdhocFilterEditPopoverSimpleTabContent, {
   useSimpleTabFilterProps,
+  Props,
 } from '.';
 
 const simpleAdhocFilter = new AdhocFilter({
@@ -44,6 +51,15 @@ const simpleAdhocFilter = new AdhocFilter({
   clause: CLAUSES.WHERE,
 });
 
+const advancedTypeTestAdhocFilterTest = new AdhocFilter({
+  expressionType: EXPRESSION_TYPES.SIMPLE,
+  subject: 'advancedDataType',
+  operatorId: null,
+  operator: null,
+  comparator: null,
+  clause: null,
+});
+
 const simpleMultiAdhocFilter = new AdhocFilter({
   expressionType: EXPRESSION_TYPES.SIMPLE,
   subject: 'value',
@@ -55,8 +71,9 @@ const simpleMultiAdhocFilter = new AdhocFilter({
 
 const sumValueAdhocMetric = new AdhocMetric({
   expressionType: EXPRESSION_TYPES.SIMPLE,
-  column: { type: 'VARCHAR(255)', column_name: 'source' },
+  column: { type: 'VARCHAR(255)', column_name: 'source', id: 5 },
   aggregate: AGGREGATES.SUM,
   label: 'test-AdhocMetric',
 });
 
 const simpleCustomFilter = new AdhocFilter({
@@ -74,8 +91,29 @@ const options = [
   sumValueAdhocMetric,
 ];
 
+const getAdvancedDataTypeTestProps = (overrides?: Record<string, any>) => {
+  const onChange = sinon.spy();
+  const validHandler = sinon.spy();
+  const props = {
+    adhocFilter: advancedTypeTestAdhocFilterTest,
+    onChange,
+    options: [{ type: 'DOUBLE', column_name: 'advancedDataType', id: 5 }],
+    datasource: {
+      id: 'test-id',
+      columns: [],
+      type: 'postgres',
+      filter_select: false,
+    },
+    partitionColumn: 'test',
+    ...overrides,
+    validHandler,
+  };
+  return props;
+};
+
 function setup(overrides?: Record<string, any>) {
   const onChange = sinon.spy();
+  const validHandler = sinon.spy();
   const props = {
     adhocFilter: simpleAdhocFilter,
     onChange,
@@ -88,6 +126,7 @@ function setup(overrides?: Record<string, any>) {
     },
     partitionColumn: 'test',
     ...overrides,
+    validHandler,
   };
   const wrapper = shallow(
     <AdhocFilterEditPopoverSimpleTabContent {...props} />,
@@ -320,3 +359,196 @@ describe('AdhocFilterEditPopoverSimpleTabContent', () => {
     });
   });
 });
+
+const ADVANCED_DATA_TYPE_ENDPOINT_VALID =
+  'glob:*/api/v1/advanced_data_type/convert?q=(type:type,values:!(v))';
+const ADVANCED_DATA_TYPE_ENDPOINT_INVALID =
+  'glob:*/api/v1/advanced_data_type/convert?q=(type:type,values:!(e))';
+fetchMock.get(ADVANCED_DATA_TYPE_ENDPOINT_VALID, {
+  result: {
+    display_value: 'VALID',
+    error_message: '',
+    valid_filter_operators: [Operators.EQUALS],
+    values: ['VALID'],
+  },
+});
+fetchMock.get(ADVANCED_DATA_TYPE_ENDPOINT_INVALID, {
+  result: {
+    display_value: '',
+    error_message: 'error',
+    valid_filter_operators: [],
+    values: [],
+  },
+});
+
+describe('AdhocFilterEditPopoverSimpleTabContent Advanced data Type Test', () => {
+  const setupFilter = async (props: Props) => {
+    await act(async () => {
+      render(
+        <ThemeProvider theme={supersetTheme}>
+          <AdhocFilterEditPopoverSimpleTabContent {...props} />
+        </ThemeProvider>,
+      );
+    });
+  };
+
+  let isFeatureEnabledMock: any;
+  beforeEach(async () => {
+    isFeatureEnabledMock = jest
+      .spyOn(featureFlags, 'isFeatureEnabled')
+      .mockImplementation(
+        (featureFlag: FeatureFlag) =>
+          featureFlag === FeatureFlag.ENABLE_ADVANCED_DATA_TYPES,
+      );
+  });
+
+  afterAll(() => {
+    isFeatureEnabledMock.restore();
+  });
+
+  it('should not call API when column has no advanced data type', async () => {
+    fetchMock.resetHistory();
+
+    const props = getAdvancedDataTypeTestProps();
+
+    await setupFilter(props);
+
+    const filterValueField = screen.getByPlaceholderText(
+      'Filter value (case sensitive)',
+    );
+    await act(async () => {
+      userEvent.type(filterValueField, 'v');
+    });
+
+    await act(async () => {
+      userEvent.type(filterValueField, '{enter}');
+    });
+
+    // When the column is not an advanced data type,
+    // the advanced data type endpoint should not be called
+    await waitFor(() =>
+      expect(fetchMock.calls(ADVANCED_DATA_TYPE_ENDPOINT_VALID)).toHaveLength(
+        0,
+      ),
+    );
+  });
+
+  it('should call API when column has advanced data type', async () => {
+    fetchMock.resetHistory();
+
+    const props = getAdvancedDataTypeTestProps({
+      options: [
+        {
+          type: 'DOUBLE',
+          column_name: 'advancedDataType',
+          id: 5,
+          advanced_data_type: 'type',
+        },
+      ],
+    });
+
+    await setupFilter(props);
+
+    const filterValueField = screen.getByPlaceholderText(
+      'Filter value (case sensitive)',
+    );
+    await act(async () => {
+      userEvent.type(filterValueField, 'v');
+    });
+
+    await act(async () => {
+      userEvent.type(filterValueField, '{enter}');
+    });
+
+    // When the column is an advanced data type,
+    // the advanced data type endpoint should be called
+    await waitFor(() =>
+      expect(fetchMock.calls(ADVANCED_DATA_TYPE_ENDPOINT_VALID)).toHaveLength(
+        1,
+      ),
+    );
+    expect(props.validHandler.lastCall.args[0]).toBe(true);
+  });
+
+  it('save button should be disabled if error message from API is returned', async () => {
+    fetchMock.resetHistory();
+
+    const props = getAdvancedDataTypeTestProps({
+      options: [
+        {
+          type: 'DOUBLE',
+          column_name: 'advancedDataType',
+          id: 5,
+          advanced_data_type: 'type',
+        },
+      ],
+    });
+
+    await setupFilter(props);
+
+    const filterValueField = screen.getByPlaceholderText(
+      'Filter value (case sensitive)',
+    );
+    await act(async () => {
+      userEvent.type(filterValueField, 'e');
+    });
+
+    await act(async () => {
+      userEvent.type(filterValueField, '{enter}');
+    });
+
+    // When the column is an advanced data type but an error response is given by the endpoint,
+    // the save button should be disabled
+    await waitFor(() =>
+      expect(fetchMock.calls(ADVANCED_DATA_TYPE_ENDPOINT_INVALID)).toHaveLength(
+        1,
+      ),
+    );
+    expect(props.validHandler.lastCall.args[0]).toBe(false);
+  });
+
+  it('advanced data type operator list should update after API response', async () => {
+    fetchMock.resetHistory();
+
+    const props = getAdvancedDataTypeTestProps({
+      options: [
+        {
+          type: 'DOUBLE',
+          column_name: 'advancedDataType',
+          id: 5,
+          advanced_data_type: 'type',
+        },
+      ],
+    });
+
+    await setupFilter(props);
+
+    const filterValueField = screen.getByPlaceholderText(
+      'Filter value (case sensitive)',
+    );
+    await act(async () => {
+      userEvent.type(filterValueField, 'v');
+    });
+
+    await act(async () => {
+      userEvent.type(filterValueField, '{enter}');
+    });
+
+    // When the column is an advanced data type,
+    // the advanced data type endpoint should be called
+    await waitFor(() =>
+      expect(fetchMock.calls(ADVANCED_DATA_TYPE_ENDPOINT_VALID)).toHaveLength(
+        1,
+      ),
+    );
+    expect(props.validHandler.lastCall.args[0]).toBe(true);
+
+    const operatorValueField = screen.getByText('1 operator(s)');
+
+    await act(async () => {
+      userEvent.type(operatorValueField, '{enter}');
+    });
+
+    expect(screen.getByText('EQUALS')).toBeTruthy();
+  });
+});
@ -17,6 +17,7 @@
|
|||
* under the License.
|
||||
*/
|
||||
import React, { useEffect, useState } from 'react';
|
||||
import FormItem from 'src/components/Form/FormItem';
|
||||
import { Select } from 'src/components';
|
||||
import { t, SupersetClient, SupersetTheme, styled } from '@superset-ui/core';
|
||||
import {
|
||||
|
|
@ -36,13 +37,22 @@ import AdhocFilter, {
|
|||
EXPRESSION_TYPES,
|
||||
CLAUSES,
|
||||
} from 'src/explore/components/controls/FilterControl/AdhocFilter';
|
||||
import { Tooltip } from 'src/components/Tooltip';
|
||||
import { Input } from 'src/components/Input';
|
||||
import { optionLabel } from 'src/utils/common';
|
||||
import { FeatureFlag, isFeatureEnabled } from 'src/featureFlags';
|
||||
import useAdvancedDataTypes from './useAdvancedDataTypes';
|
||||
|
||||
const StyledInput = styled(Input)`
|
||||
margin-bottom: ${({ theme }) => theme.gridUnit * 4}px;
|
||||
`;
|
||||
|
||||
export const StyledFormItem = styled(FormItem)`
|
||||
&.ant-row.ant-form-item {
|
||||
margin: 0;
|
||||
}
|
||||
`;
|
||||
|
||||
const SelectWithLabel = styled(Select)<{ labelText: string }>`
|
||||
.ant-select-selector::after {
|
||||
content: ${({ labelText }) => labelText || '\\A0'};
|
||||
|
|
@ -61,6 +71,7 @@ export interface SimpleColumnType {
|
|||
optionName?: string;
|
||||
filterBy?: string;
|
||||
value?: string;
|
||||
advanced_data_type?: string;
|
||||
}
|
||||
|
||||
export interface SimpleExpressionType {
|
||||
|
|
@ -97,7 +108,15 @@ export interface Props {
|
|||
};
|
||||
partitionColumn: string;
|
||||
operators?: Operators[];
|
||||
validHandler: (isValid: boolean) => void;
|
||||
}
|
||||
|
||||
export interface AdvancedDataTypesState {
|
||||
parsedAdvancedDataType: string;
|
||||
advancedDataTypeOperatorList: string[];
|
||||
errorMessage: string;
|
||||
}
|
||||
|
||||
export const useSimpleTabFilterProps = (props: Props) => {
|
||||
const isOperatorRelevant = (operator: Operators, subject: string) => {
|
||||
const column = props.datasource.columns?.find(
|
||||
|
|
@ -136,7 +155,6 @@ export const useSimpleTabFilterProps = (props: Props) => {
|
|||
('column_name' in option && option.column_name === id) ||
|
||||
('optionName' in option && option.optionName === id),
|
||||
);
|
||||
|
||||
let subject = '';
|
||||
let clause;
|
||||
// infer the new clause based on what subject was selected.
|
||||
|
|
@ -211,11 +229,20 @@ export const useSimpleTabFilterProps = (props: Props) => {
|
|||
}),
|
||||
);
|
||||
};
|
||||
const clearOperator = (): void => {
|
||||
props.onChange(
|
||||
props.adhocFilter.duplicateWith({
|
||||
operatorId: undefined,
|
||||
operator: undefined,
|
||||
}),
|
||||
);
|
||||
};
|
||||
return {
|
||||
onSubjectChange,
|
||||
onOperatorChange,
|
||||
onComparatorChange,
|
||||
isOperatorRelevant,
|
||||
clearOperator,
|
||||
};
|
||||
};
|
||||
|
||||
|
|
@ -233,6 +260,18 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
|
|||
const [loadingComparatorSuggestions, setLoadingComparatorSuggestions] =
|
||||
useState(false);
|
||||
|
||||
const {
|
||||
advancedDataTypesState,
|
||||
subjectAdvancedDataType,
|
||||
fetchAdvancedDataTypeValueCallback,
|
||||
fetchSubjectAdvancedDataType,
|
||||
} = useAdvancedDataTypes(props.validHandler);
|
||||
// TODO: This does not need to exist, just use the advancedTypeOperatorList list
|
||||
const isOperatorRelevantWrapper = (operator: Operators, subject: string) =>
|
||||
subjectAdvancedDataType
|
||||
? isOperatorRelevant(operator, subject) &&
|
||||
advancedDataTypesState.advancedDataTypeOperatorList.includes(operator)
|
||||
: isOperatorRelevant(operator, subject);
|
||||
const onInputComparatorChange = (
|
||||
event: React.ChangeEvent<HTMLInputElement>,
|
||||
) => {
|
||||
|
|
@ -299,7 +338,7 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
|
|||
placeholder: t(
|
||||
'%s operator(s)',
|
||||
(props.operators ?? OPERATORS_OPTIONS).filter(op =>
|
||||
isOperatorRelevant(op, subject),
|
||||
isOperatorRelevantWrapper(op, subject),
|
||||
).length,
|
||||
),
|
||||
value: operatorId,
|
||||
|
|
@ -366,7 +405,25 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
|
|||
}, [props.adhocFilter.subject]);
|
||||
|
||||
useEffect(() => {
|
||||
setComparator(props.adhocFilter.comparator);
|
||||
if (isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)) {
|
||||
fetchSubjectAdvancedDataType(props);
|
||||
}
|
||||
}, [props.adhocFilter.subject]);
|
||||
|
||||
useEffect(() => {
|
||||
if (isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)) {
|
||||
fetchAdvancedDataTypeValueCallback(
|
||||
comparator === undefined ? '' : comparator,
|
||||
advancedDataTypesState,
|
||||
subjectAdvancedDataType,
|
||||
);
|
||||
}
|
||||
}, [comparator, subjectAdvancedDataType, fetchAdvancedDataTypeValueCallback]);
|
||||
|
||||
useEffect(() => {
|
||||
if (isFeatureEnabled(FeatureFlag.ENABLE_ADVANCED_DATA_TYPES)) {
|
||||
setComparator(props.adhocFilter.comparator);
|
||||
}
|
||||
}, [props.adhocFilter.comparator]);
|
||||
|
||||
return (
|
||||
|
|
@@ -376,6 +433,7 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
        marginTop: theme.gridUnit * 4,
        marginBottom: theme.gridUnit * 4,
      })}
      data-test="select-element"
      options={columns.map(column => ({
        value:
          ('column_name' in column && column.column_name) ||

@@ -396,7 +454,7 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
       <Select
         css={(theme: SupersetTheme) => ({ marginBottom: theme.gridUnit * 4 })}
         options={(props.operators ?? OPERATORS_OPTIONS)
-          .filter(op => isOperatorRelevant(op, subject))
+          .filter(op => isOperatorRelevantWrapper(op, subject))
           .map((option, index) => ({
             value: option,
             label: OPERATOR_ENUM_TO_OPERATOR_TYPE[option].display,

@@ -406,25 +464,39 @@ const AdhocFilterEditPopoverSimpleTabContent: React.FC<Props> = props => {
         {...operatorSelectProps}
       />
       {MULTI_OPERATORS.has(operatorId) || suggestions.length > 0 ? (
-        <SelectWithLabel
-          labelText={labelText}
-          options={suggestions}
-          {...comparatorSelectProps}
-        />
+        <Tooltip
+          title={
+            advancedDataTypesState.errorMessage ||
+            advancedDataTypesState.parsedAdvancedDataType
+          }
+        >
+          <SelectWithLabel
+            labelText={labelText}
+            options={suggestions}
+            {...comparatorSelectProps}
+          />
+        </Tooltip>
       ) : (
-        <StyledInput
-          data-test="adhoc-filter-simple-value"
-          name="filter-value"
-          ref={ref => {
-            if (ref && shouldFocusComparator) {
-              ref.focus();
-            }
-          }}
-          onChange={onInputComparatorChange}
-          value={comparator}
-          placeholder={t('Filter value (case sensitive)')}
-          disabled={DISABLE_INPUT_OPERATORS.includes(operatorId)}
-        />
+        <Tooltip
+          title={
+            advancedDataTypesState.errorMessage ||
+            advancedDataTypesState.parsedAdvancedDataType
+          }
+        >
+          <StyledInput
+            data-test="adhoc-filter-simple-value"
+            name="filter-value"
+            ref={ref => {
+              if (ref && shouldFocusComparator) {
+                ref.focus();
+              }
+            }}
+            onChange={onInputComparatorChange}
+            value={comparator}
+            placeholder={t('Filter value (case sensitive)')}
+            disabled={DISABLE_INPUT_OPERATORS.includes(operatorId)}
+          />
+        </Tooltip>
       )}
     </>
   );

@@ -0,0 +1,103 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
import { useCallback, useState } from 'react';
import { ensureIsArray, SupersetClient, t } from '@superset-ui/core';
import { debounce } from 'lodash';
import rison from 'rison';
import { AdvancedDataTypesState, Props } from './index';

const INITIAL_ADVANCED_DATA_TYPES_STATE: AdvancedDataTypesState = {
  parsedAdvancedDataType: '',
  advancedDataTypeOperatorList: [],
  errorMessage: '',
};

const useAdvancedDataTypes = (validHandler: (isValid: boolean) => void) => {
  const [advancedDataTypesState, setAdvancedDataTypesState] =
    useState<AdvancedDataTypesState>(INITIAL_ADVANCED_DATA_TYPES_STATE);
  const [subjectAdvancedDataType, setSubjectAdvancedDataType] = useState<
    string | undefined
  >();

  const fetchAdvancedDataTypeValueCallback = useCallback(
    (
      comp: string | string[],
      advancedDataTypesState: AdvancedDataTypesState,
      subjectAdvancedDataType?: string,
    ) => {
      const values = ensureIsArray(comp);
      if (!subjectAdvancedDataType) {
        setAdvancedDataTypesState(INITIAL_ADVANCED_DATA_TYPES_STATE);
        return;
      }
      debounce(() => {
        const queryParams = rison.encode({
          type: subjectAdvancedDataType,
          values,
        });
        const endpoint = `/api/v1/advanced_data_type/convert?q=${queryParams}`;
        SupersetClient.get({ endpoint })
          .then(({ json }) => {
            setAdvancedDataTypesState({
              parsedAdvancedDataType: json.result.display_value,
              advancedDataTypeOperatorList: json.result.valid_filter_operators,
              errorMessage: json.result.error_message,
            });
            // Changed due to removal of status field
            validHandler(!json.result.error_message);
          })
          .catch(() => {
            setAdvancedDataTypesState({
              parsedAdvancedDataType: '',
              advancedDataTypeOperatorList:
                advancedDataTypesState.advancedDataTypeOperatorList,
              errorMessage: t('Failed to retrieve advanced type'),
            });
            validHandler(false);
          });
      }, 600)();
    },
    [validHandler],
  );

  const fetchSubjectAdvancedDataType = (props: Props) => {
    const option = props.options.find(
      option =>
        ('column_name' in option &&
          option.column_name === props.adhocFilter.subject) ||
        ('optionName' in option &&
          option.optionName === props.adhocFilter.subject),
    );
    if (option && 'advanced_data_type' in option) {
      setSubjectAdvancedDataType(option.advanced_data_type);
    } else {
      props.validHandler(true);
    }
  };

  return {
    advancedDataTypesState,
    subjectAdvancedDataType,
    setAdvancedDataTypesState,
    fetchAdvancedDataTypeValueCallback,
    fetchSubjectAdvancedDataType,
  };
};

export default useAdvancedDataTypes;

@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

@@ -0,0 +1,160 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from typing import Any

from flask import current_app as app
from flask.wrappers import Response
from flask_appbuilder.api import BaseApi, expose, permission_name, protect, rison, safe
from flask_babel import lazy_gettext as _

from superset.advanced_data_type.schemas import advanced_data_type_convert_schema
from superset.advanced_data_type.types import AdvancedDataTypeResponse
from superset.extensions import event_logger

config = app.config
ADVANCED_DATA_TYPES = config["ADVANCED_DATA_TYPES"]


class AdvancedDataTypeRestApi(BaseApi):
    """
    Advanced Data Type REST API

    - Returns the available AdvancedDataTypes when the /types endpoint is accessed
    - Returns an AdvancedDataTypeResponse object when the /convert endpoint is
      accessed and is passed valid arguments
    """

    allow_browser_login = True
    include_route_methods = {"get", "get_types"}
    resource_name = "advanced_data_type"

    openapi_spec_tag = "Advanced Data Type"
    apispec_parameter_schemas = {
        "advanced_data_type_convert_schema": advanced_data_type_convert_schema,
    }

    @protect()
    @safe
    @expose("/convert", methods=["GET"])
    @permission_name("get")
    @event_logger.log_this_with_context(
        action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.get",
        log_to_statsd=False,  # pylint: disable=arguments-renamed
    )
    @rison()
    def get(self, **kwargs: Any) -> Response:
        """Returns an AdvancedDataTypeResponse object populated with the passed-in args
        ---
        get:
          description: >-
            Returns an AdvancedDataTypeResponse object populated with the passed-in args.
          parameters:
          - in: query
            name: q
            content:
              application/json:
                schema:
                  $ref: '#/components/schemas/advanced_data_type_convert_schema'
          responses:
            200:
              description: >-
                AdvancedDataTypeResponse object has been returned.
              content:
                application/json:
                  schema:
                    type: object
                    properties:
                      status:
                        type: string
                      values:
                        type: array
                      formatted_value:
                        type: string
                      error_message:
                        type: string
                      valid_filter_operators:
                        type: string
            400:
              $ref: '#/components/responses/400'
            401:
              $ref: '#/components/responses/401'
            404:
              $ref: '#/components/responses/404'
            500:
              $ref: '#/components/responses/500'
        """
        items = kwargs["rison"]
        advanced_data_type = items.get("type")
        if not advanced_data_type:
            return self.response(400, message=_("Missing advanced data type in request"))
        values = items["values"]
        if not values:
            return self.response(400, message=_("Missing values in request"))
        addon = ADVANCED_DATA_TYPES.get(advanced_data_type)
        if not addon:
            return self.response(
                400,
                message=_(
                    "Invalid advanced data type: %(advanced_data_type)s",
                    advanced_data_type=advanced_data_type,
                ),
            )
        bus_resp: AdvancedDataTypeResponse = addon.translate_type(
            {
                "values": values,
            }
        )
        return self.response(200, result=bus_resp)

    @protect()
    @safe
    @expose("/types", methods=["GET"])
    @permission_name("get")
    @event_logger.log_this_with_context(
        action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.get",
        log_to_statsd=False,  # pylint: disable=arguments-renamed
    )
    def get_types(self) -> Response:
        """Returns a list of available advanced data types
        ---
        get:
          description: >-
            Returns a list of available advanced data types.
          responses:
            200:
              description: >-
                A successful return of the available
                advanced data types has taken place.
              content:
                application/json:
                  schema:
                    type: object
                    properties:
                      result:
                        type: array
            400:
              $ref: '#/components/responses/400'
            401:
              $ref: '#/components/responses/401'
            404:
              $ref: '#/components/responses/404'
            500:
              $ref: '#/components/responses/500'
        """
        return self.response(200, result=list(ADVANCED_DATA_TYPES.keys()))

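As a usage sketch (not part of the PR itself): clients call `/convert` with a rison-encoded `q` parameter. The literal below is the hand-written rison form of `{"type": "port", "values": ["http"]}`; a real client would build it with a rison encoder, as the frontend hook does.

```python
# Hand-written rison encoding of {"type": "port", "values": ["http"]}
q = "(type:port,values:!(http))"
endpoint = f"/api/v1/advanced_data_type/convert?q={q}"
print(endpoint)  # /api/v1/advanced_data_type/convert?q=(type:port,values:!(http))
```

A successful response carries the translated `values`, a `display_value`, an `error_message`, and the `valid_filter_operators` for the type.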
@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

@@ -0,0 +1,138 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import ipaddress
from typing import Any, List

from sqlalchemy import Column

from superset.advanced_data_type.types import (
    AdvancedDataType,
    AdvancedDataTypeRequest,
    AdvancedDataTypeResponse,
)
from superset.utils.core import FilterOperator, FilterStringOperators


def cidr_func(req: AdvancedDataTypeRequest) -> AdvancedDataTypeResponse:
    """
    Convert a passed-in AdvancedDataTypeRequest to an AdvancedDataTypeResponse
    """
    resp: AdvancedDataTypeResponse = {
        "values": [],
        "error_message": "",
        "display_value": "",
        "valid_filter_operators": [
            FilterStringOperators.EQUALS,
            FilterStringOperators.GREATER_THAN_OR_EQUAL,
            FilterStringOperators.GREATER_THAN,
            FilterStringOperators.IN,
            FilterStringOperators.LESS_THAN,
            FilterStringOperators.LESS_THAN_OR_EQUAL,
        ],
    }
    if req["values"] == [""]:
        resp["values"].append("")
        return resp
    for val in req["values"]:
        string_value = str(val)
        try:
            ip_range = (
                ipaddress.ip_network(int(string_value), strict=False)
                if string_value.isnumeric()
                else ipaddress.ip_network(string_value, strict=False)
            )
            resp["values"].append(
                {"start": int(ip_range[0]), "end": int(ip_range[-1])}
                if ip_range[0] != ip_range[-1]
                else int(ip_range[0])
            )
        except ValueError as ex:
            resp["error_message"] = str(ex)
            break
    else:
        resp["display_value"] = ", ".join(
            map(
                lambda x: f"{x['start']} - {x['end']}"
                if isinstance(x, dict)
                else str(x),
                resp["values"],
            )
        )
    return resp
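A standalone sketch of the conversion at the heart of `cidr_func` (no Superset imports; the helper name is illustrative): a single host collapses to one integer, while a network becomes an inclusive start/end pair.

```python
import ipaddress

def cidr_to_value(string_value: str):
    # Same branching as cidr_func: numeric strings are treated as raw
    # integer addresses, everything else as dotted or CIDR notation.
    ip_range = (
        ipaddress.ip_network(int(string_value), strict=False)
        if string_value.isnumeric()
        else ipaddress.ip_network(string_value, strict=False)
    )
    first, last = int(ip_range[0]), int(ip_range[-1])
    # One-address networks collapse to a bare int; ranges keep both ends.
    return first if first == last else {"start": first, "end": last}

print(cidr_to_value("1.1.1.1"))         # 16843009
print(cidr_to_value("192.168.0.0/30"))  # {'start': 3232235520, 'end': 3232235523}
```

The dict-vs-int distinction is what the filter translation below keys on.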

# Make this return a single clause
def cidr_translate_filter_func(
    col: Column, operator: FilterOperator, values: List[Any]
) -> Any:
    """
    Convert a passed-in column, FilterOperator, and
    list of values into a SQLAlchemy expression
    """
    return_expression: Any
    if operator in (FilterOperator.IN, FilterOperator.NOT_IN):
        dict_items = [val for val in values if isinstance(val, dict)]
        single_values = [val for val in values if not isinstance(val, dict)]
        if operator == FilterOperator.IN.value:
            cond = col.in_(single_values)
            for dictionary in dict_items:
                cond = cond | (col <= dictionary["end"]) & (col >= dictionary["start"])
        elif operator == FilterOperator.NOT_IN.value:
            cond = ~(col.in_(single_values))
            for dictionary in dict_items:
                cond = cond & ((col > dictionary["end"]) | (col < dictionary["start"]))
        return_expression = cond
    if len(values) == 1:
        value = values[0]
        if operator == FilterOperator.EQUALS.value:
            return_expression = (
                col == value
                if not isinstance(value, dict)
                else (col <= value["end"]) & (col >= value["start"])
            )
        if operator == FilterOperator.GREATER_THAN_OR_EQUALS.value:
            return_expression = (
                col >= value if not isinstance(value, dict) else col >= value["end"]
            )
        if operator == FilterOperator.GREATER_THAN.value:
            return_expression = (
                col > value if not isinstance(value, dict) else col > value["end"]
            )
        if operator == FilterOperator.LESS_THAN.value:
            return_expression = (
                col < value if not isinstance(value, dict) else col < value["start"]
            )
        if operator == FilterOperator.LESS_THAN_OR_EQUALS.value:
            return_expression = (
                col <= value if not isinstance(value, dict) else col <= value["start"]
            )
        if operator == FilterOperator.NOT_EQUALS.value:
            return_expression = (
                col != value
                if not isinstance(value, dict)
                else (col > value["end"]) | (col < value["start"])
            )
    return return_expression


internet_address: AdvancedDataType = AdvancedDataType(
    verbose_name="internet address",
    description="represents both an ip and cidr range",
    valid_data_types=["int"],
    translate_filter=cidr_translate_filter_func,
    translate_type=cidr_func,
)

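The filter translation above builds SQLAlchemy clauses; the same range semantics can be checked in plain Python. This is an illustrative analogue (not the PR's code) of the EQUALS branch:

```python
def matches_equals(col_value: int, parsed) -> bool:
    """EQUALS semantics from cidr_translate_filter_func: a dict is an
    inclusive start/end range, a bare int is an exact match."""
    if isinstance(parsed, dict):
        return parsed["start"] <= col_value <= parsed["end"]
    return col_value == parsed

print(matches_equals(16843009, 16843009))                                    # True
print(matches_equals(3232235522, {"start": 3232235520, "end": 3232235523}))  # True
print(matches_equals(3232235530, {"start": 3232235520, "end": 3232235523}))  # False
```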
@@ -0,0 +1,141 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import itertools
from typing import Any, Dict, List

from sqlalchemy import Column

from superset.advanced_data_type.types import (
    AdvancedDataType,
    AdvancedDataTypeRequest,
    AdvancedDataTypeResponse,
)
from superset.utils.core import FilterOperator, FilterStringOperators

port_conversion_dict: Dict[str, List[int]] = {
    "http": [80],
    "ssh": [22],
    "https": [443],
    "ftp": [20, 21],
    "ftps": [989, 990],
    "telnet": [23],
    "telnets": [992],
    "smtp": [25],
    "submissions": [465],  # aka smtps, ssmtp, urd
    "kerberos": [88],
    "kerberos-adm": [749],
    "pop3": [110],
    "pop3s": [995],
    "nntp": [119],
    "nntps": [563],
    "ntp": [123],
    "snmp": [161],
    "ldap": [389],
    "ldaps": [636],
    "imap2": [143],  # aka imap
    "imaps": [993],
}


def port_translation_func(req: AdvancedDataTypeRequest) -> AdvancedDataTypeResponse:
    """
    Convert a passed-in AdvancedDataTypeRequest to an AdvancedDataTypeResponse
    """
    resp: AdvancedDataTypeResponse = {
        "values": [],
        "error_message": "",
        "display_value": "",
        "valid_filter_operators": [
            FilterStringOperators.EQUALS,
            FilterStringOperators.GREATER_THAN_OR_EQUAL,
            FilterStringOperators.GREATER_THAN,
            FilterStringOperators.IN,
            FilterStringOperators.LESS_THAN,
            FilterStringOperators.LESS_THAN_OR_EQUAL,
        ],
    }
    if req["values"] == [""]:
        resp["values"].append([""])
        return resp
    for val in req["values"]:
        string_value = str(val)
        try:
            if string_value.isnumeric():
                if not 1 <= int(string_value) <= 65535:
                    raise ValueError
            resp["values"].append(
                [int(string_value)]
                if string_value.isnumeric()
                else port_conversion_dict[string_value]
            )
        except (KeyError, ValueError):
            resp["error_message"] = (
                f"'{string_value}' does not appear to be a port name or number"
            )
            break
    else:
        resp["display_value"] = ", ".join(
            map(
                lambda x: f"{x['start']} - {x['end']}"
                if isinstance(x, dict)
                else str(x),
                resp["values"],
            )
        )
    return resp
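A standalone sketch of the port translation (abridged lookup table; no Superset imports): numeric strings are range-checked, names are looked up, and every input maps to a list of port numbers so multi-port services like FTP work uniformly.

```python
PORT_NAMES = {"http": [80], "ftp": [20, 21]}  # abridged port_conversion_dict

def port_to_list(string_value: str):
    if string_value.isnumeric():
        # Mirror the PR's range check: valid ports are 1..65535.
        if not 1 <= int(string_value) <= 65535:
            raise ValueError(f"'{string_value}' is out of the valid port range")
        return [int(string_value)]
    if string_value not in PORT_NAMES:
        raise KeyError(f"'{string_value}' does not appear to be a port name or number")
    return PORT_NAMES[string_value]

print(port_to_list("443"))  # [443]
print(port_to_list("ftp"))  # [20, 21]
```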


def port_translate_filter_func(
    col: Column, operator: FilterOperator, values: List[Any]
) -> Any:
    """
    Convert a passed-in column, FilterOperator,
    and list of values into a SQLAlchemy expression
    """
    return_expression: Any
    if operator in (FilterOperator.IN, FilterOperator.NOT_IN):
        vals_list = itertools.chain.from_iterable(values)
        if operator == FilterOperator.IN.value:
            cond = col.in_(vals_list)
        elif operator == FilterOperator.NOT_IN.value:
            cond = ~(col.in_(vals_list))
        return_expression = cond
    if len(values) == 1:
        value = values[0]
        value.sort()
        if operator == FilterOperator.EQUALS.value:
            return_expression = col.in_(value)
        if operator == FilterOperator.GREATER_THAN_OR_EQUALS.value:
            return_expression = col >= value[0]
        if operator == FilterOperator.GREATER_THAN.value:
            return_expression = col > value[0]
        if operator == FilterOperator.LESS_THAN.value:
            return_expression = col < value[-1]
        if operator == FilterOperator.LESS_THAN_OR_EQUALS.value:
            return_expression = col <= value[-1]
        if operator == FilterOperator.NOT_EQUALS.value:
            return_expression = ~col.in_(value)
    return return_expression


internet_port: AdvancedDataType = AdvancedDataType(
    verbose_name="port",
    description="represents a port",
    valid_data_types=["int"],
    translate_filter=port_translate_filter_func,
    translate_type=port_translation_func,
)

@@ -0,0 +1,30 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Schemas for advanced data types
"""

advanced_data_type_convert_schema = {
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "type": {"type": "string"},
            "values": {"type": "array"},
        },
    },
}

@@ -0,0 +1,59 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from dataclasses import dataclass
from typing import Any, Callable, List, Optional, TypedDict, Union

from sqlalchemy import Column
from sqlalchemy.sql.expression import BinaryExpression

from superset.superset_typing import FilterValues
from superset.utils.core import FilterOperator, FilterStringOperators


class AdvancedDataTypeRequest(TypedDict):
    """
    AdvancedDataType request class
    """

    advanced_data_type: str
    values: List[
        Union[FilterValues, None]
    ]  # unparsed value (usually text when passed from text box)


class AdvancedDataTypeResponse(TypedDict, total=False):
    """
    AdvancedDataType response
    """

    error_message: Optional[str]
    values: List[Any]  # parsed value (can be any value)
    display_value: str  # the string representation of the parsed values
    valid_filter_operators: List[FilterStringOperators]


@dataclass
class AdvancedDataType:
    """
    Used for converting a base type value into an advanced type value
    """

    verbose_name: str
    description: str
    valid_data_types: List[str]
    translate_type: Callable[[AdvancedDataTypeRequest], AdvancedDataTypeResponse]
    translate_filter: Callable[[Column, FilterOperator, Any], BinaryExpression]

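To see how the pieces fit, here is a minimal self-contained analogue of the dataclass wiring. Names like `MiniAdvancedDataType` and `upper_func` are invented for illustration; the real class carries the extra fields above.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class MiniAdvancedDataType:
    """Stripped-down stand-in for AdvancedDataType: a name plus the
    request -> response translation callable."""
    verbose_name: str
    translate_type: Callable[[Dict[str, Any]], Dict[str, Any]]

def upper_func(req: Dict[str, Any]) -> Dict[str, Any]:
    # Toy translate_type: uppercase every raw value; no error cases.
    return {"values": [str(v).upper() for v in req["values"]], "error_message": ""}

upper = MiniAdvancedDataType(verbose_name="upper", translate_type=upper_func)
resp = upper.translate_type({"values": ["abc", "def"]})
print(resp["values"])  # ['ABC', 'DEF']
```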
@@ -30,7 +30,7 @@ from marshmallow import ValidationError
 from werkzeug.wrappers import Response as WerkzeugResponse
 from werkzeug.wsgi import FileWrapper
 
-from superset import is_feature_enabled, thumbnail_cache
+from superset import app, is_feature_enabled, thumbnail_cache
 from superset.charts.commands.bulk_delete import BulkDeleteChartCommand
 from superset.charts.commands.create import CreateChartCommand
 from superset.charts.commands.delete import DeleteChartCommand

@@ -82,6 +82,7 @@ from superset.views.base_api import (
 from superset.views.filters import FilterRelatedOwners
 
 logger = logging.getLogger(__name__)
+config = app.config
 
 
 class ChartRestApi(BaseSupersetModelRestApi):

@@ -94,6 +94,10 @@ class Column(
     # Raw type as returned and used by db engine.
     type = sa.Column(sa.Text, default=UNKOWN_TYPE)
 
+    # Assigns the column's advanced type to determine custom behavior;
+    # does nothing unless the ENABLE_ADVANCED_DATA_TYPES feature flag is true
+    advanced_data_type = sa.Column(sa.Text)
+
     # Columns are defined by expressions. For tables, these are the actual column names,
     # and should match the ``name`` attribute. For datasets, these can be any valid SQL
     # expression. If the SQL expression is an aggregation the column is a metric,

@@ -30,7 +30,17 @@ import re
 import sys
 from collections import OrderedDict
 from datetime import timedelta
-from typing import Any, Callable, Dict, List, Optional, Type, TYPE_CHECKING, Union
+from typing import (
+    Any,
+    Callable,
+    Dict,
+    List,
+    Literal,
+    Optional,
+    Type,
+    TYPE_CHECKING,
+    Union,
+)
 
 import pkg_resources
 from cachelib.base import BaseCache

@@ -39,8 +49,10 @@ from dateutil import tz
 from flask import Blueprint
 from flask_appbuilder.security.manager import AUTH_DB
 from pandas._libs.parsers import STR_NA_VALUES  # pylint: disable=no-name-in-module
-from typing_extensions import Literal
 
+from superset.advanced_data_type.plugins.internet_address import internet_address
+from superset.advanced_data_type.plugins.internet_port import internet_port
+from superset.advanced_data_type.types import AdvancedDataType
 from superset.constants import CHANGE_ME_SECRET_KEY
 from superset.jinja_context import BaseTemplateProcessor
 from superset.stats_logger import DummyStatsLogger

@@ -393,6 +405,7 @@ DEFAULT_FEATURE_FLAGS: Dict[str, bool] = {
     "DASHBOARD_RBAC": False,
     "ENABLE_EXPLORE_DRAG_AND_DROP": True,
     "ENABLE_FILTER_BOX_MIGRATION": False,
+    "ENABLE_ADVANCED_DATA_TYPES": False,
     "ENABLE_DND_WITH_CLICK_UX": True,
     # Enabling ALERTS_ATTACH_REPORTS, the system sends email and slack message
     # with screenshot and link

@@ -1269,6 +1282,13 @@ MENU_HIDE_USER_INFO = False
 # Set to False to only allow viewing own recent activity
 ENABLE_BROAD_ACTIVITY_ACCESS = True
 
+# the advanced data type key should correspond to that set in the column metadata
+ADVANCED_DATA_TYPES: Dict[str, AdvancedDataType] = {
+    "internet_address": internet_address,
+    "port": internet_port,
+}
+
+
 # -------------------------------------------------------------------
 # *                WARNING:  STOP EDITING  HERE                    *
 # -------------------------------------------------------------------

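Since `ADVANCED_DATA_TYPES` lives in config, a deployment could extend it from `superset_config.py`. A sketch; `my_custom_type` is hypothetical and would be built like `internet_address`/`internet_port` above:

```python
# superset_config.py (sketch; my_custom_type is a hypothetical AdvancedDataType)
from superset.advanced_data_type.plugins.internet_address import internet_address
from superset.advanced_data_type.plugins.internet_port import internet_port

ADVANCED_DATA_TYPES = {
    "internet_address": internet_address,
    "port": internet_port,
    # "my_type": my_custom_type,
}
```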
@@ -599,6 +599,7 @@ class BaseColumn(AuditMixinNullable, ImportExportMixin):
     verbose_name = Column(String(1024))
     is_active = Column(Boolean, default=True)
     type = Column(Text)
+    advanced_data_type = Column(String(255))
     groupby = Column(Boolean, default=True)
     filterable = Column(Boolean, default=True)
     description = Column(MediumText())

@@ -674,6 +675,7 @@ class BaseColumn(AuditMixinNullable, ImportExportMixin):
             "groupby",
             "is_dttm",
             "type",
+            "advanced_data_type",
         )
         return {s: getattr(self, s) for s in attrs if hasattr(self, s)}

@@ -74,6 +74,7 @@ from sqlalchemy.sql.expression import Label, Select, TextAsFrom
from sqlalchemy.sql.selectable import Alias, TableClause

from superset import app, db, is_feature_enabled, security_manager
from superset.advanced_data_type.types import AdvancedDataTypeResponse
from superset.columns.models import Column as NewColumn, UNKOWN_TYPE
from superset.common.db_query_status import QueryStatus
from superset.connectors.base.models import BaseColumn, BaseDatasource, BaseMetric

@@ -86,9 +87,11 @@ from superset.connectors.sqla.utils import (
from superset.datasets.models import Dataset as NewDataset
from superset.db_engine_specs.base import BaseEngineSpec, CTE_ALIAS, TimestampExpression
from superset.exceptions import (
    AdvancedDataTypeResponseError,
    QueryClauseValidationException,
    QueryObjectValidationError,
)
from superset.extensions import feature_flag_manager
from superset.jinja_context import (
    BaseTemplateProcessor,
    ExtraCache,

@@ -130,7 +133,7 @@ from superset.utils.core import (
config = app.config
metadata = Model.metadata  # pylint: disable=no-member
logger = logging.getLogger(__name__)

ADVANCED_DATA_TYPES = config["ADVANCED_DATA_TYPES"]
VIRTUAL_TABLE_ALIAS = "virtual_table"

# a non-exhaustive set of additive metrics

@@ -242,6 +245,7 @@ class TableColumn(Model, BaseColumn, CertificationMixin):
        "is_dttm",
        "is_active",
        "type",
        "advanced_data_type",
        "groupby",
        "filterable",
        "expression",

@@ -414,6 +418,7 @@ class TableColumn(Model, BaseColumn, CertificationMixin):
        "is_dttm",
        "type",
        "type_generic",
        "advanced_data_type",
        "python_date_format",
        "is_certified",
        "certified_by",

@@ -1472,7 +1477,10 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
                    utils.FilterOperator.IN.value,
                    utils.FilterOperator.NOT_IN.value,
                )
                if col_spec:

                col_advanced_data_type = col_obj.advanced_data_type if col_obj else ""

                if col_spec and not col_advanced_data_type:
                    target_generic_type = col_spec.generic_type
                else:
                    target_generic_type = GenericDataType.STRING

@@ -1484,7 +1492,33 @@ class SqlaTable(Model, BaseDatasource):  # pylint: disable=too-many-public-metho
                    db_engine_spec=db_engine_spec,
                    db_extra=self.database.get_extra(),
                )
                if is_list_target:
                if (
                    col_advanced_data_type != ""
                    and feature_flag_manager.is_feature_enabled(
                        "ENABLE_ADVANCED_DATA_TYPES"
                    )
                    and col_advanced_data_type in ADVANCED_DATA_TYPES
                ):
                    values = eq if is_list_target else [eq]  # type: ignore
                    bus_resp: AdvancedDataTypeResponse = ADVANCED_DATA_TYPES[
                        col_advanced_data_type
                    ].translate_type(
                        {
                            "type": col_advanced_data_type,
                            "values": values,
                        }
                    )
                    if bus_resp["error_message"]:
                        raise AdvancedDataTypeResponseError(
                            _(bus_resp["error_message"])
                        )

                    where_clause_and.append(
                        ADVANCED_DATA_TYPES[col_advanced_data_type].translate_filter(
                            sqla_col, op, bus_resp["values"]
                        )
                    )
                elif is_list_target:
                    assert isinstance(eq, (tuple, list))
                    if len(eq) == 0:
                        raise QueryObjectValidationError(
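The branch added to `SqlaTable` above gates on three things before translating a filter: a non-empty `advanced_data_type` on the column, the `ENABLE_ADVANCED_DATA_TYPES` feature flag, and the type being registered. Here is a runnable sketch of that control flow with the Superset internals replaced by plain stand-ins; the `"port"` entry, the lambdas, the `"port_col"` column name, and `apply_advanced_filter` are all hypothetical, and string clauses stand in for SQLAlchemy expressions.

```python
from typing import Any, Callable, Dict, List

# Hypothetical registry standing in for Superset's ADVANCED_DATA_TYPES.
ADVANCED_DATA_TYPES: Dict[str, Dict[str, Callable[..., Any]]] = {
    "port": {
        # Translate raw strings into comparable values, reporting errors
        # in the response rather than raising.
        "translate_type": lambda req: {
            "values": [int(v) for v in req["values"]],
            "error_message": "",
        },
        # Build the final filter clause from the translated values.
        "translate_filter": lambda col, op, values: f"{col} {op} {values}",
    }
}


def apply_advanced_filter(
    col_advanced_data_type: str,
    op: str,
    eq: Any,
    is_list_target: bool,
    feature_enabled: bool,
) -> List[str]:
    """Mirror of the branch above: translate first, abort if translation
    failed, then append the clause built from the translated values."""
    where_clause_and: List[str] = []
    if (
        col_advanced_data_type != ""
        and feature_enabled
        and col_advanced_data_type in ADVANCED_DATA_TYPES
    ):
        values = eq if is_list_target else [eq]
        resp = ADVANCED_DATA_TYPES[col_advanced_data_type]["translate_type"](
            {"type": col_advanced_data_type, "values": values}
        )
        if resp["error_message"]:
            raise ValueError(resp["error_message"])
        where_clause_and.append(
            ADVANCED_DATA_TYPES[col_advanced_data_type]["translate_filter"](
                "port_col", op, resp["values"]
            )
        )
    return where_clause_and
```

Note that when the type is unregistered or the flag is off, the function falls through and appends nothing, matching the patch's behavior of deferring to the ordinary `is_list_target` handling.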

@@ -66,6 +66,7 @@ class TableColumnInlineView(CompactCRUDMixin, SupersetModelView):
        "verbose_name",
        "description",
        "type",
        "advanced_data_type",
        "groupby",
        "filterable",
        "table",

@@ -79,6 +80,7 @@ class TableColumnInlineView(CompactCRUDMixin, SupersetModelView):
        "column_name",
        "verbose_name",
        "type",
        "advanced_data_type",
        "groupby",
        "filterable",
        "is_dttm",

@@ -144,6 +146,7 @@ class TableColumnInlineView(CompactCRUDMixin, SupersetModelView):
        "is_dttm": _("Is temporal"),
        "python_date_format": _("Datetime Format"),
        "type": _("Type"),
        "advanced_data_type": _("Business Data Type"),
    }
    validators_columns = {
        "python_date_format": [
@@ -166,6 +166,7 @@ class DatasetRestApi(BaseSupersetModelRestApi):
    show_columns = show_select_columns + [
        "columns.type_generic",
        "database.backend",
        "columns.advanced_data_type",
        "is_managed_externally",
    ]
    add_model_schema = DatasetPostSchema()
@@ -48,6 +48,7 @@ class DatasetColumnsPutSchema(Schema):
    id = fields.Integer()
    column_name = fields.String(required=True, validate=Length(1, 255))
    type = fields.String(allow_none=True)
    advanced_data_type = fields.String(allow_none=True, validate=Length(1, 255))
    verbose_name = fields.String(allow_none=True, validate=Length(1, 1024))
    description = fields.String(allow_none=True)
    expression = fields.String(allow_none=True)

@@ -156,6 +157,7 @@ class ImportV1ColumnSchema(Schema):
    is_dttm = fields.Boolean(default=False, allow_none=True)
    is_active = fields.Boolean(default=True, allow_none=True)
    type = fields.String(allow_none=True)
    advanced_data_type = fields.String(allow_none=True)
    groupby = fields.Boolean()
    filterable = fields.Boolean()
    expression = fields.String(allow_none=True)
@@ -196,6 +196,10 @@ class QueryObjectValidationError(SupersetException):
    status = 400


class AdvancedDataTypeResponseError(SupersetException):
    status = 400


class InvalidPostProcessingError(SupersetException):
    status = 400
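The new exception follows the same pattern as its neighbours: a subclass that carries an HTTP status, so the API layer can map a failed type translation straight to a 400 response without parsing messages. A self-contained sketch of that pattern; the minimal `SupersetException` base and `status_for` helper here are stand-ins, not the real classes.

```python
class SupersetException(Exception):
    # Stand-in for superset.exceptions.SupersetException; real base
    # carries more machinery, but the status attribute is the key idea.
    status = 500


class AdvancedDataTypeResponseError(SupersetException):
    # Raised when a plugin's translate_type reports an error_message.
    status = 400


def status_for(exc: Exception) -> int:
    # Error handlers can branch on the class attribute directly.
    return getattr(exc, "status", 500)
```

This keeps the translation plugins free of any HTTP concerns: they only fill `error_message`, and the caller raises the typed exception.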

@@ -113,6 +113,7 @@ class SupersetAppInitializer:  # pylint: disable=too-many-public-methods
        # the global Flask app
        #
        # pylint: disable=import-outside-toplevel,too-many-locals,too-many-statements
        from superset.advanced_data_type.api import AdvancedDataTypeRestApi
        from superset.annotation_layers.annotations.api import AnnotationRestApi
        from superset.annotation_layers.api import AnnotationLayerRestApi
        from superset.async_events.api import AsyncEventsRestApi

@@ -190,6 +191,7 @@ class SupersetAppInitializer:  # pylint: disable=too-many-public-methods
        appbuilder.add_api(AnnotationRestApi)
        appbuilder.add_api(AnnotationLayerRestApi)
        appbuilder.add_api(AsyncEventsRestApi)
        appbuilder.add_api(AdvancedDataTypeRestApi)
        appbuilder.add_api(CacheRestApi)
        appbuilder.add_api(ChartRestApi)
        appbuilder.add_api(ChartDataRestApi)

@@ -0,0 +1,64 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""adding_advanced_data_type.py

Revision ID: 6f139c533bea
Revises: cbe71abde154
Create Date: 2021-05-27 16:10:59.567684

"""

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision = "6f139c533bea"
down_revision = "cbe71abde154"


def upgrade():
    with op.batch_alter_table("table_columns") as batch_op:
        batch_op.add_column(
            sa.Column(
                "advanced_data_type",
                sa.VARCHAR(255),
                nullable=True,
            )
        )
    with op.batch_alter_table("columns") as batch_op:
        batch_op.add_column(
            sa.Column(
                "advanced_data_type",
                sa.VARCHAR(255),
                nullable=True,
            )
        )
    with op.batch_alter_table("sl_columns") as batch_op:
        batch_op.add_column(
            sa.Column(
                "advanced_data_type",
                sa.Text,
                nullable=True,
            )
        )


def downgrade():
    with op.batch_alter_table("table_columns") as batch_op:
        batch_op.drop_column("advanced_data_type")
    with op.batch_alter_table("columns") as batch_op:
        batch_op.drop_column("advanced_data_type")
    with op.batch_alter_table("sl_columns") as batch_op:
        batch_op.drop_column("advanced_data_type")
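Functionally, each `upgrade()` block in the migration boils down to an `ALTER TABLE ... ADD COLUMN` of a nullable column; `batch_alter_table` exists so the same migration also works on SQLite, whose native `ALTER` support is limited. A runnable illustration of the end state using the stdlib `sqlite3` module (the `table_columns` layout here is simplified):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_columns (id INTEGER PRIMARY KEY, column_name TEXT)")

# Equivalent of:
#   batch_op.add_column(sa.Column("advanced_data_type", sa.VARCHAR(255), nullable=True))
# Columns added via ALTER TABLE without NOT NULL are nullable, so existing
# rows simply get NULL, which is why the migration is safe on live data.
conn.execute("ALTER TABLE table_columns ADD COLUMN advanced_data_type VARCHAR(255)")

columns = [row[1] for row in conn.execute("PRAGMA table_info(table_columns)")]
```

The `downgrade()` simply drops the column again, which is also why nothing else in the patch may assume the column exists.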

@@ -243,6 +243,25 @@ class FilterOperator(str, Enum):
    IS_FALSE = "IS FALSE"


class FilterStringOperators(str, Enum):
    EQUALS = ("EQUALS",)
    NOT_EQUALS = ("NOT_EQUALS",)
    LESS_THAN = ("LESS_THAN",)
    GREATER_THAN = ("GREATER_THAN",)
    LESS_THAN_OR_EQUAL = ("LESS_THAN_OR_EQUAL",)
    GREATER_THAN_OR_EQUAL = ("GREATER_THAN_OR_EQUAL",)
    IN = ("IN",)
    NOT_IN = ("NOT_IN",)
    ILIKE = ("ILIKE",)
    LIKE = ("LIKE",)
    REGEX = ("REGEX",)
    IS_NOT_NULL = ("IS_NOT_NULL",)
    IS_NULL = ("IS_NULL",)
    LATEST_PARTITION = ("LATEST_PARTITION",)
    IS_TRUE = ("IS_TRUE",)
    IS_FALSE = ("IS_FALSE",)


class PostProcessingBoxplotWhiskerType(str, Enum):
    """
    Calculate cell contribution to row/column total
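One subtlety in the enum above: each member value is written as a one-element tuple. For an `Enum` with a `str` mixin, a tuple value is unpacked into the mixin's constructor, so `.value` still comes out as the bare string, and members compare equal to plain strings. A quick demonstration with two of the members:

```python
from enum import Enum


class FilterStringOperators(str, Enum):
    # Tuple values are passed as *args to str(), so the stored value
    # is the string itself, not the tuple.
    EQUALS = ("EQUALS",)
    NOT_EQUALS = ("NOT_EQUALS",)


value = FilterStringOperators.EQUALS.value
```

This is why the frontend can send plain strings like `"EQUALS"` in `valid_filter_operators` and look members up by value on the backend.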

@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.

@@ -0,0 +1,143 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
# isort:skip_file
"""Unit tests for Superset"""
import json
import prison
from sqlalchemy import null

from superset.connectors.sqla.models import SqlaTable
from superset.utils.core import get_example_default_schema

from tests.integration_tests.base_tests import (
    SupersetTestCase,
    logged_in_admin,
    test_client,
)
from tests.integration_tests.test_app import app
from tests.integration_tests.utils.get_dashboards import get_dashboards_ids
from unittest import mock
from sqlalchemy import Column
from typing import Any, List
from superset.advanced_data_type.types import (
    AdvancedDataType,
    AdvancedDataTypeRequest,
    AdvancedDataTypeResponse,
)
from superset.utils.core import FilterOperator, FilterStringOperators


target_resp: AdvancedDataTypeResponse = {
    "values": [],
    "error_message": "",
    "display_value": "",
    "valid_filter_operators": [
        FilterStringOperators.EQUALS,
        FilterStringOperators.GREATER_THAN_OR_EQUAL,
        FilterStringOperators.GREATER_THAN,
        FilterStringOperators.IN,
        FilterStringOperators.LESS_THAN,
        FilterStringOperators.LESS_THAN_OR_EQUAL,
    ],
}


def translation_func(req: AdvancedDataTypeRequest) -> AdvancedDataTypeResponse:
    return target_resp


def translate_filter_func(col: Column, op: FilterOperator, values: List[Any]):
    pass


test_type: AdvancedDataType = AdvancedDataType(
    verbose_name="type",
    valid_data_types=["int"],
    translate_type=translation_func,
    description="",
    translate_filter=translate_filter_func,
)

CHART_DATA_URI = "api/v1/chart/advanced_data_type"
CHARTS_FIXTURE_COUNT = 10


@mock.patch(
    "superset.advanced_data_type.api.ADVANCED_DATA_TYPES",
    {"type": 1},
)
def test_types_type_request(logged_in_admin):
    """
    Advanced Data Type API: Test to see if the API call returns all the valid
    advanced data types
    """
    uri = "api/v1/advanced_data_type/types"
    response_value = test_client.get(uri)
    data = json.loads(response_value.data.decode("utf-8"))
    assert response_value.status_code == 200
    assert data == {"result": ["type"]}


def test_types_convert_bad_request_no_vals(logged_in_admin):
    """
    Advanced Data Type API: Test request to see if it behaves as expected when
    no values are passed
    """
    arguments = {"type": "type", "values": []}
    uri = f"api/v1/advanced_data_type/convert?q={prison.dumps(arguments)}"
    response_value = test_client.get(uri)
    assert response_value.status_code == 400


def test_types_convert_bad_request_no_type(logged_in_admin):
    """
    Advanced Data Type API: Test request to see if it behaves as expected when
    no type is passed
    """
    arguments = {"type": "", "values": [1]}
    uri = f"api/v1/advanced_data_type/convert?q={prison.dumps(arguments)}"
    response_value = test_client.get(uri)
    assert response_value.status_code == 400


@mock.patch(
    "superset.advanced_data_type.api.ADVANCED_DATA_TYPES",
    {"type": 1},
)
def test_types_convert_bad_request_type_not_found(logged_in_admin):
    """
    Advanced Data Type API: Test request to see if it behaves as expected when
    the passed-in type is not found/not valid
    """
    arguments = {"type": "not_found", "values": [1]}
    uri = f"api/v1/advanced_data_type/convert?q={prison.dumps(arguments)}"
    response_value = test_client.get(uri)
    assert response_value.status_code == 400


@mock.patch(
    "superset.advanced_data_type.api.ADVANCED_DATA_TYPES",
    {"type": test_type},
)
def test_types_convert_request(logged_in_admin):
    """
    Advanced Data Type API: Test request to see if it behaves as expected when
    a valid type and valid values are passed in
    """
    arguments = {"type": "type", "values": [1]}
    uri = f"api/v1/advanced_data_type/convert?q={prison.dumps(arguments)}"
    response_value = test_client.get(uri)
    assert response_value.status_code == 200
    data = json.loads(response_value.data.decode("utf-8"))
    assert data == {"result": target_resp}
@@ -185,6 +185,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": True,
                "python_date_format": None,
                "type": ds_type,
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -197,6 +198,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": "STRING" if example_db.backend == "hive" else "VARCHAR(16)",
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -211,6 +213,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "type": "STRING"
                if example_db.backend == "hive"
                else "VARCHAR(255)",
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -223,6 +226,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": big_int_type,
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -235,6 +239,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": None,
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -247,6 +252,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": "STRING" if example_db.backend == "hive" else "VARCHAR(10)",
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -259,6 +265,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": big_int_type,
                "advanced_data_type": None,
                "verbose_name": None,
            },
            {

@@ -271,6 +278,7 @@ class TestExportDatabasesCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": big_int_type,
                "advanced_data_type": None,
                "verbose_name": None,
            },
        ],
@@ -660,6 +660,7 @@ class TestDatasetApi(SupersetTestCase):
            "description": "description",
            "expression": "expression",
            "type": "INTEGER",
            "advanced_data_type": "ADVANCED_DATA_TYPE",
            "verbose_name": "New Col",
        }
        dataset_data = {

@@ -676,6 +677,9 @@ class TestDatasetApi(SupersetTestCase):
        assert new_col_dict["description"] in [col.description for col in columns]
        assert new_col_dict["expression"] in [col.expression for col in columns]
        assert new_col_dict["type"] in [col.type for col in columns]
        assert new_col_dict["advanced_data_type"] in [
            col.advanced_data_type for col in columns
        ]

        db.session.delete(dataset)
        db.session.commit()

@@ -693,6 +697,7 @@ class TestDatasetApi(SupersetTestCase):
            "expression": "expression",
            "extra": '{"abc":123}',
            "type": "INTEGER",
            "advanced_data_type": "ADVANCED_DATA_TYPE",
            "verbose_name": "New Col",
            "uuid": "c626b60a-3fb2-4e99-9f01-53aca0b17166",
        }

@@ -749,6 +754,7 @@ class TestDatasetApi(SupersetTestCase):
        assert columns[2].description == new_column_data["description"]
        assert columns[2].expression == new_column_data["expression"]
        assert columns[2].type == new_column_data["type"]
        assert columns[2].advanced_data_type == new_column_data["advanced_data_type"]
        assert columns[2].extra == new_column_data["extra"]
        assert columns[2].verbose_name == new_column_data["verbose_name"]
        assert str(columns[2].uuid) == new_column_data["uuid"]

@@ -785,6 +791,7 @@ class TestDatasetApi(SupersetTestCase):
            "description": "description",
            "expression": "expression",
            "type": "INTEGER",
            "advanced_data_type": "ADVANCED_DATA_TYPE",
            "verbose_name": "New Col",
        }
        uri = f"api/v1/dataset/{dataset.id}"
@@ -94,6 +94,7 @@ class TestExportDatasetsCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": type_map["source"],
                "advanced_data_type": None,
                "verbose_name": None,
                "extra": None,
            },

@@ -107,6 +108,7 @@ class TestExportDatasetsCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": type_map["target"],
                "advanced_data_type": None,
                "verbose_name": None,
                "extra": None,
            },

@@ -120,6 +122,7 @@ class TestExportDatasetsCommand(SupersetTestCase):
                "is_dttm": False,
                "python_date_format": None,
                "type": type_map["value"],
                "advanced_data_type": None,
                "verbose_name": None,
                "extra": None,
            },
@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
@ -0,0 +1,518 @@
|
|||
# Licensed to the Apache Software Foundation (ASF) under one
|
||||
# or more contributor license agreements. See the NOTICE file
|
||||
# distributed with this work for additional information
|
||||
# regarding copyright ownership. The ASF licenses this file
|
||||
# to you under the Apache License, Version 2.0 (the
|
||||
# "License"); you may not use this file except in compliance
|
||||
# with the License. You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing,
|
||||
# software distributed under the License is distributed on an
|
||||
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
# KIND, either express or implied. See the License for the
|
||||
# specific language governing permissions and limitations
|
||||
# under the License.
|
||||
# isort:skip_file
|
||||
"""Unit tests for Superset"""
|
||||
|
||||
from ipaddress import ip_address
|
||||
import sqlalchemy
|
||||
from flask.ctx import AppContext
|
||||
from sqlalchemy import Column, Integer
|
||||
from tests.integration_tests.base_tests import SupersetTestCase
|
||||
from superset.advanced_data_type.types import (
|
||||
AdvancedDataTypeRequest,
|
||||
AdvancedDataTypeResponse,
|
||||
)
|
||||
from superset.utils.core import FilterOperator, FilterStringOperators
|
||||
|
||||
from superset.advanced_data_type.plugins.internet_address import internet_address
|
||||
from superset.advanced_data_type.plugins.internet_port import internet_port as port
|
||||
|
||||
|
||||
# To run the unit tests below, use the following command in the root Superset folder:
|
||||
# tox -e py38 -- tests/unit_tests/advanced_data_type/types_tests.py
|
||||
|
||||
|
||||
def test_ip_func_valid_ip(app_context: None):
|
||||
"""Test to see if the cidr_func behaves as expected when a valid IP is passed in"""
|
||||
cidr_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "cidr",
|
||||
"values": ["1.1.1.1"],
|
||||
}
|
||||
cidr_response: AdvancedDataTypeResponse = {
|
||||
"values": [16843009],
|
||||
"error_message": "",
|
||||
"display_value": "16843009",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert internet_address.translate_type(cidr_request) == cidr_response
|
||||
|
||||
|
||||
def test_cidr_func_invalid_ip(app_context: None):
|
||||
"""Test to see if the cidr_func behaves as expected when an invalid IP is passed in"""
|
||||
cidr_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "cidr",
|
||||
"values": ["abc"],
|
||||
}
|
||||
cidr_response: AdvancedDataTypeResponse = {
|
||||
"values": [],
|
||||
"error_message": "'abc' does not appear to be an IPv4 or IPv6 network",
|
||||
"display_value": "",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert internet_address.translate_type(cidr_request) == cidr_response
|
||||
|
||||
|
||||
def test_port_translation_func_valid_port_number(app_context: None):
|
||||
"""Test to see if the port_translation_func behaves as expected when a valid port number
|
||||
is passed in"""
|
||||
port_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "port",
|
||||
"values": ["80"],
|
||||
}
|
||||
port_response: AdvancedDataTypeResponse = {
|
||||
"values": [[80]],
|
||||
"error_message": "",
|
||||
"display_value": "[80]",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert port.translate_type(port_request) == port_response
|
||||
|
||||
|
||||
def test_port_translation_func_valid_port_name(app_context: None):
|
||||
"""Test to see if the port_translation_func behaves as expected when a valid port name
|
||||
is passed in"""
|
||||
port_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "port",
|
||||
"values": ["https"],
|
||||
}
|
||||
port_response: AdvancedDataTypeResponse = {
|
||||
"values": [[443]],
|
||||
"error_message": "",
|
||||
"display_value": "[443]",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert port.translate_type(port_request) == port_response
|
||||
|
||||
|
||||
def test_port_translation_func_invalid_port_name(app_context: None):
|
||||
"""Test to see if the port_translation_func behaves as expected when an invalid port name
|
||||
is passed in"""
|
||||
port_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "port",
|
||||
"values": ["abc"],
|
||||
}
|
||||
port_response: AdvancedDataTypeResponse = {
|
||||
"values": [],
|
||||
"error_message": "'abc' does not appear to be a port name or number",
|
||||
"display_value": "",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert port.translate_type(port_request) == port_response
|
||||
|
||||
|
||||
def test_port_translation_func_invalid_port_number(app_context: None):
|
||||
"""Test to see if the port_translation_func behaves as expected when an invalid port
|
||||
number is passed in"""
|
||||
port_request: AdvancedDataTypeRequest = {
|
||||
"advanced_data_type": "port",
|
||||
"values": ["123456789"],
|
||||
}
|
||||
port_response: AdvancedDataTypeResponse = {
|
||||
"values": [],
|
||||
"error_message": "'123456789' does not appear to be a port name or number",
|
||||
"display_value": "",
|
||||
"valid_filter_operators": [
|
||||
FilterStringOperators.EQUALS,
|
||||
FilterStringOperators.GREATER_THAN_OR_EQUAL,
|
||||
FilterStringOperators.GREATER_THAN,
|
||||
FilterStringOperators.IN,
|
||||
FilterStringOperators.LESS_THAN,
|
||||
FilterStringOperators.LESS_THAN_OR_EQUAL,
|
||||
],
|
||||
}
|
||||
|
||||
assert port.translate_type(port_request) == port_response
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_equals(app_context: None):
|
||||
"""Test to see if the cidr_translate_filter_func behaves as expected when the EQUALS
|
||||
operator is used"""
|
||||
|
||||
input_column = Column("user_ip", Integer)
|
||||
input_operation = FilterOperator.EQUALS
|
||||
input_values = [16843009]
|
||||
|
||||
cidr_translate_filter_response = input_column == input_values[0]
|
||||
|
||||
assert internet_address.translate_filter(
|
||||
input_column, input_operation, input_values
|
||||
).compare(cidr_translate_filter_response)
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_not_equals(app_context: None):
|
||||
"""Test to see if the cidr_translate_filter_func behaves as expected when the NOT_EQUALS
|
||||
operator is used"""
|
||||
|
||||
input_column = Column("user_ip", Integer)
|
||||
input_operation = FilterOperator.NOT_EQUALS
|
||||
input_values = [16843009]
|
||||
|
||||
cidr_translate_filter_response = input_column != input_values[0]
|
||||
|
||||
assert internet_address.translate_filter(
|
||||
input_column, input_operation, input_values
|
||||
).compare(cidr_translate_filter_response)
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_greater_than_or_equals(app_context: None):
|
||||
"""Test to see if the cidr_translate_filter_func behaves as expected when the
|
||||
GREATER_THAN_OR_EQUALS operator is used"""
|
||||
|
||||
input_column = Column("user_ip", Integer)
|
||||
input_operation = FilterOperator.GREATER_THAN_OR_EQUALS
|
||||
input_values = [16843009]
|
||||
|
||||
cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
|
||||
input_column >= input_values[0]
|
||||
)
|
||||
|
||||
assert internet_address.translate_filter(
|
||||
input_column, input_operation, input_values
|
||||
).compare(cidr_translate_filter_response)
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_greater_than(app_context: None):
|
||||
"""Test to see if the cidr_translate_filter_func behaves as expected when the
|
||||
GREATER_THAN operator is used"""
|
||||
|
||||
input_column = Column("user_ip", Integer)
|
||||
input_operation = FilterOperator.GREATER_THAN
|
||||
input_values = [16843009]
|
||||
|
||||
cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
|
||||
input_column > input_values[0]
|
||||
)
|
||||
|
||||
assert internet_address.translate_filter(
|
||||
input_column, input_operation, input_values
|
||||
).compare(cidr_translate_filter_response)
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_less_than(app_context: None):
|
||||
"""Test to see if the cidr_translate_filter_func behaves as expected when the LESS_THAN
|
||||
operator is used"""
|
||||
|
||||
input_column = Column("user_ip", Integer)
|
||||
input_operation = FilterOperator.LESS_THAN
|
||||
input_values = [16843009]
|
||||
|
||||
cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
|
||||
input_column < input_values[0]
|
||||
)
|
||||
|
||||
assert internet_address.translate_filter(
|
||||
input_column, input_operation, input_values
|
||||
).compare(cidr_translate_filter_response)
|
||||
|
||||
|
||||
def test_cidr_translate_filter_func_less_than_or_equals(app_context: None):
    """Test to see if the cidr_translate_filter_func behaves as expected when the
    LESS_THAN_OR_EQUALS operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.LESS_THAN_OR_EQUALS
    input_values = [16843009]

    cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column <= input_values[0]
    )

    assert internet_address.translate_filter(
        input_column, input_operation, input_values
    ).compare(cidr_translate_filter_response)


def test_cidr_translate_filter_func_in_single(app_context: None):
    """Test to see if the cidr_translate_filter_func behaves as expected when the
    IN operator is used with a single IP"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.IN
    input_values = [16843009]

    cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column.in_(input_values)
    )

    assert internet_address.translate_filter(
        input_column, input_operation, input_values
    ).compare(cidr_translate_filter_response)


def test_cidr_translate_filter_func_in_double(app_context: None):
    """Test to see if the cidr_translate_filter_func behaves as expected when the
    IN operator is used with two IPs"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.IN
    input_values = [{"start": 16843009, "end": 33686018}]

    input_condition = input_column.in_([])

    cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_condition | ((input_column <= 33686018) & (input_column >= 16843009))
    )

    assert internet_address.translate_filter(
        input_column, input_operation, input_values
    ).compare(cidr_translate_filter_response)


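The test above encodes the interesting branch of the IN translation: plain integers accumulate into an `IN` list, while a `{"start": ..., "end": ...}` range dict is ORed on as a bounded condition. A minimal stdlib-only sketch of that dispatch, rendering conditions as strings instead of SQLAlchemy expressions (the function name and string output are illustrative, not the PR's API):

```python
from typing import List, Union


def translate_in_filter(col: str, values: List[Union[int, dict]]) -> str:
    """Sketch of the IN-operator logic asserted above: integers go into an
    IN list; {"start", "end"} range dicts are ORed on as range conditions."""
    in_list = [v for v in values if isinstance(v, int)]
    cond = f"{col} IN ({', '.join(map(str, in_list))})"
    for v in values:
        if isinstance(v, dict):
            cond += f" OR ({col} >= {v['start']} AND {col} <= {v['end']})"
    return cond


# A range dict produces an empty IN list plus one ORed range condition.
print(translate_in_filter("user_ip", [{"start": 16843009, "end": 33686018}]))
```

This mirrors why the expected expression starts from `input_column.in_([])` and ORs the range onto it.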
def test_cidr_translate_filter_func_not_in_single(app_context: None):
    """Test to see if the cidr_translate_filter_func behaves as expected when the
    NOT_IN operator is used with a single IP"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.NOT_IN
    input_values = [16843009]

    cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = ~(
        input_column.in_(input_values)
    )

    assert internet_address.translate_filter(
        input_column, input_operation, input_values
    ).compare(cidr_translate_filter_response)


def test_cidr_translate_filter_func_not_in_double(app_context: None):
    """Test to see if the cidr_translate_filter_func behaves as expected when the
    NOT_IN operator is used with two IPs"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.NOT_IN
    input_values = [{"start": 16843009, "end": 33686018}]

    input_condition = ~(input_column.in_([]))

    cidr_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_condition & (input_column > 33686018) & (input_column < 16843009)
    )

    assert internet_address.translate_filter(
        input_column, input_operation, input_values
    ).compare(cidr_translate_filter_response)


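The magic numbers in these CIDR tests are IPv4 addresses packed big-endian into 32-bit integers: 16843009 is 1.1.1.1 and 33686018 is 2.2.2.2. The stdlib `ipaddress` module makes the encoding, and the CIDR-block-to-range-dict translation the IN/NOT_IN tests rely on, easy to verify (the `cidr_to_range` helper name is illustrative, not from the PR):

```python
from ipaddress import IPv4Address, IPv4Network

# Dotted-quad addresses pack big-endian into a 32-bit integer.
assert int(IPv4Address("1.1.1.1")) == 16843009
assert int(IPv4Address("2.2.2.2")) == 33686018


def cidr_to_range(cidr: str) -> dict:
    """Translate a CIDR block into the {"start", "end"} integer range dict
    that the IN/NOT_IN tests above pass as a filter value."""
    net = IPv4Network(cidr)
    return {"start": int(net.network_address), "end": int(net.broadcast_address)}


print(cidr_to_range("10.0.0.0/30"))  # {'start': 167772160, 'end': 167772163}
```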
def test_port_translate_filter_func_equals(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    EQUALS operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.EQUALS
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_not_equals(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    NOT_EQUALS operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.NOT_EQUALS
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = ~(
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_greater_than_or_equals(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    GREATER_THAN_OR_EQUALS operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.GREATER_THAN_OR_EQUALS
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column >= input_values[0][0]
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_greater_than(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    GREATER_THAN operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.GREATER_THAN
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column > input_values[0][0]
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_less_than_or_equals(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    LESS_THAN_OR_EQUALS operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.LESS_THAN_OR_EQUALS
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column <= input_values[0][0]
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_less_than(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    LESS_THAN operator is used"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.LESS_THAN
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column < input_values[0][0]
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_in_single(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    IN operator is used with a single port"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.IN
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_in_double(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    IN operator is used with two ports"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.IN
    input_values = [[443, 80]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = (
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_not_in_single(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    NOT_IN operator is used with a single port"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.NOT_IN
    input_values = [[443]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = ~(
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


def test_port_translate_filter_func_not_in_double(app_context: None):
    """Test to see if the port_translate_filter_func behaves as expected when the
    NOT_IN operator is used with two ports"""

    input_column = Column("user_ip", Integer)
    input_operation = FilterOperator.NOT_IN
    input_values = [[443, 80]]

    port_translate_filter_response: sqlalchemy.sql.expression.BinaryExpression = ~(
        input_column.in_(input_values[0])
    )

    assert port.translate_filter(input_column, input_operation, input_values).compare(
        port_translate_filter_response
    )


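The port tests pass nested lists like `[[443]]` because one typed filter token can expand to a list of port numbers, and the commit notes above add a check that each value falls in the valid port range, populating the response's error message otherwise. A stdlib-only sketch of that translate-and-validate step, under the assumption of the standard 0-65535 range (the `KNOWN_SERVICES` table and function name are illustrative, not the PR's actual mapping or API):

```python
# Illustrative service-name table; the real port advanced type maintains its
# own mapping inside Superset.
KNOWN_SERVICES = {"http": [80], "https": [443], "ssh": [22]}


def translate_port(token: str) -> list:
    """Translate one filter token into a list of port numbers, rejecting
    values outside the valid port range with a meaningful error message."""
    if token.isdigit():
        value = int(token)
        if not 0 <= value <= 65535:
            raise ValueError(f"{value} is not in the valid port range (0-65535)")
        return [value]
    if token in KNOWN_SERVICES:
        return KNOWN_SERVICES[token]
    raise ValueError(f"'{token}' cannot be translated to a port")


# Each token yields a list, which is why the tests wrap values as [[443]].
print(translate_port("https"))  # [443]
print(translate_port("8080"))   # [8080]
```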
@@ -125,6 +125,7 @@ columns:
   is_dttm: null
   is_active: null
   type: INTEGER
+  advanced_data_type: null
   groupby: null
   filterable: null
   expression: revenue-expenses
@@ -137,6 +138,7 @@ columns:
   is_dttm: 1
   is_active: null
   type: TIMESTAMP
+  advanced_data_type: null
   groupby: null
   filterable: null
   expression: null
@@ -148,6 +150,7 @@ columns:
   is_dttm: null
   is_active: null
   type: INTEGER
+  advanced_data_type: null
   groupby: null
   filterable: null
   expression: null
@@ -159,6 +162,7 @@ columns:
   is_dttm: null
   is_active: null
   type: INTEGER
+  advanced_data_type: null
   groupby: null
   filterable: null
   expression: null
@@ -170,6 +174,7 @@ columns:
   is_dttm: null
   is_active: null
   type: INTEGER
+  advanced_data_type: null
   groupby: null
   filterable: null
   expression: null

@@ -57,6 +57,7 @@ def sample_columns() -> Dict["TableColumn", Dict[str, Any]]:
         "name": "ds",
         "expression": "ds",
         "type": "TIMESTAMP",
+        "advanced_data_type": None,
         "is_temporal": True,
         "is_physical": True,
     },
@@ -64,6 +65,7 @@ def sample_columns() -> Dict["TableColumn", Dict[str, Any]]:
         "name": "num_boys",
         "expression": "num_boys",
         "type": "INTEGER",
+        "advanced_data_type": None,
         "is_dimensional": True,
         "is_physical": True,
     },
@@ -71,6 +73,7 @@ def sample_columns() -> Dict["TableColumn", Dict[str, Any]]:
         "name": "region",
         "expression": "region",
         "type": "VARCHAR",
+        "advanced_data_type": None,
         "is_dimensional": True,
         "is_physical": True,
     },
@@ -83,6 +86,7 @@ def sample_columns() -> Dict["TableColumn", Dict[str, Any]]:
         "name": "profit",
         "expression": "revenue-expenses",
         "type": "INTEGER",
+        "advanced_data_type": None,
         "is_physical": False,
     },
 }
@@ -98,6 +102,7 @@ def sample_metrics() -> Dict["SqlMetric", Dict[str, Any]]:
         "expression": "COUNT(*)",
         "extra_json": '{"metric_type": "COUNT"}',
         "type": "UNKNOWN",
+        "advanced_data_type": None,
         "is_additive": True,
         "is_aggregation": True,
         "is_filterable": False,
@@ -110,6 +115,7 @@ def sample_metrics() -> Dict["SqlMetric", Dict[str, Any]]:
         "expression": "AVG(revenue)",
         "extra_json": '{"metric_type": "AVG"}',
         "type": "UNKNOWN",
+        "advanced_data_type": None,
         "is_additive": False,
         "is_aggregation": True,
         "is_filterable": False,

@@ -476,12 +476,31 @@ def test_create_virtual_sqlatable(
             name="ds",
             is_temporal=True,
             type="TIMESTAMP",
+            advanced_data_type=None,
             expression="ds",
             is_physical=True,
         ),
-        dict(name="num_boys", type="INTEGER", expression="num_boys", is_physical=True),
-        dict(name="revenue", type="INTEGER", expression="revenue", is_physical=True),
-        dict(name="expenses", type="INTEGER", expression="expenses", is_physical=True),
+        dict(
+            name="num_boys",
+            type="INTEGER",
+            advanced_data_type=None,
+            expression="num_boys",
+            is_physical=True,
+        ),
+        dict(
+            name="revenue",
+            type="INTEGER",
+            advanced_data_type=None,
+            expression="revenue",
+            is_physical=True,
+        ),
+        dict(
+            name="expenses",
+            type="INTEGER",
+            advanced_data_type=None,
+            expression="expenses",
+            is_physical=True,
+        ),
     ]
     # create a physical ``Table`` that the virtual dataset points to
     database = Database(database_name="my_database", sqlalchemy_uri="sqlite://")