Reference API documentation
This is the reference documentation of the Python API client.

dataikuapi.dssclient.DSSClient (host[, ...]) | Entry point for the DSS API client
dataikuapi.dss.project.DSSProject (client, ...) | A project on the DSS instance
dataikuapi.dss.dataset.DSSDataset (client, ...) | A dataset on the DSS instance
dataikuapi.dss.recipe.DSSRecipe (client, ...) | A recipe on the DSS instance
dataikuapi.dss.recipe.GroupingRecipeCreator (...)
dataikuapi.dss.recipe.JoinRecipeCreator (...)
dataikuapi.dss.recipe.StackRecipeCreator (...)
dataikuapi.dss.recipe.WindowRecipeCreator (...)
dataikuapi.dss.recipe.SyncRecipeCreator (...)
dataikuapi.dss.recipe.SamplingRecipeCreator (...)
dataikuapi.dss.recipe.SQLQueryRecipeCreator (...)
dataikuapi.dss.recipe.CodeRecipeCreator
dataikuapi.dss.recipe.SplitRecipeCreator
dataikuapi.dss.managedfolder.DSSManagedFolder (...) | A managed folder on the DSS instance
dataikuapi.dss.job.DSSJob (client, ...) | A job on the DSS instance
dataikuapi.dss.meaning.DSSMeaning (client, id) | A user-defined meaning on the DSS instance
dataikuapi.dss.metrics.ComputedMetrics (raw)
dataikuapi.dss.admin.DSSUser (client, login) | A handle for a user on the DSS instance
dataikuapi.dss.admin.DSSGroup (client, name) | A group on the DSS instance
dataikuapi.dss.admin.DSSConnection (client, name) | A connection on the DSS instance
dataikuapi.dss.admin.DSSGeneralSettings
dataikuapi.dss.admin.DSSUserImpersonationRule
dataikuapi.dss.admin.DSSGroupImpersonationRule
dataikuapi.dss.scenario.DSSScenario (client, ...) | A handle to interact with a scenario on the DSS instance
dataikuapi.dss.scenario.DSSScenarioRun (...) | A handle containing basic info about a past run of a scenario
dataikuapi.dss.scenario.DSSTriggerFire (...) | The activation of a trigger on the DSS instance
dataikuapi.dss.apiservice.DSSAPIService (...) | An API Service on the DSS instance

class dataikuapi.dssclient.DSSClient(host, api_key=None, internal_ticket=None)
Entry point for the DSS API client
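
A client is built from the base URL of the DSS instance and an API key, then used to obtain the other handles documented below. A minimal sketch (the URL and key are placeholders):

    from dataikuapi.dssclient import DSSClient

    # Placeholders: use your instance's base URL and a valid API key with
    # the privileges required by the calls you intend to make.
    host = "http://localhost:11200"
    api_key = "YOUR_API_KEY"

    client = DSSClient(host, api_key)
    print(client.list_project_keys())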

create_connection(name, type=None, params=None, usable_by='ALL', allowed_groups=None)
Create a connection, and return a handle to interact with it
Note: this call requires an API key with admin rights
Args:
- name: the name of the new connection
- type: the type of the new connection
- params: the parameters of the new connection, as a JSON object
- usable_by: the type of access control for the connection. Either 'ALL' (=no access control) or 'ALLOWED' (=access restricted to users of a list of groups)
- allowed_groups: when using access control (that is, setting usable_by='ALLOWED'), the list of names of the groups whose users are allowed to use the new connection
Returns:
A dataikuapi.dss.admin.DSSConnection connection handle
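
For illustration, a sketch that creates an access-restricted connection. The 'PostgreSQL' type and the params keys are assumptions; valid values depend on the connection types available on your instance:

    # Sketch only: the connection type and params keys below are assumptions.
    conn = client.create_connection(
        "warehouse",
        type="PostgreSQL",
        params={"host": "db.example.com", "port": 5432, "db": "analytics"},
        usable_by="ALLOWED",
        allowed_groups=["data_engineers"],
    )
    print(conn.get_definition())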

create_group(name, description=None, source_type='LOCAL')
Create a group, and return a handle to interact with it
Note: this call requires an API key with admin rights
Args:
- name: the name of the new group
- description: a description of the new group
- source_type: the type of the new group. Admissible values are 'LOCAL', 'LDAP' and 'SAAS'
Returns:
A dataikuapi.dss.admin.DSSGroup group handle

create_meaning(id, label, type, description=None, values=None, mappings=None, pattern=None, normalizationMode=None, detectable=False)
Create a meaning, and return a handle to interact with it
Note: this call requires an API key with admin rights
Args:
- id: the ID of the new meaning
- label: the label of the new meaning
- type: the type of the new meaning. Admissible values are 'DECLARATIVE', 'VALUES_LIST', 'VALUES_MAPPING' and 'PATTERN'
- description (optional): the description of the new meaning
- values (optional): when type is 'VALUES_LIST', the list of values
- mappings (optional): when type is 'VALUES_MAPPING', the mapping, as a list of objects with this structure: {'from': 'value_1', 'to': 'value_a'}
- pattern (optional): when type is 'PATTERN', the pattern
- normalizationMode (optional): when type is 'VALUES_LIST', 'VALUES_MAPPING' or 'PATTERN', the normalization mode to use for value matching. One of 'EXACT', 'LOWERCASE' or 'NORMALIZED' (not available for 'PATTERN' type). Defaults to 'EXACT'
- detectable (optional): whether DSS should consider assigning the meaning to columns set to 'Auto-detect'. Defaults to False
Returns:
A dataikuapi.dss.meaning.DSSMeaning meaning handle

create_project(project_key, name, owner, description=None, settings=None)
Create a project, and return a project handle to interact with it.
Note: this call requires an API key with admin rights
Args:
- project_key: the identifier to use for the project
- name: the name for the project
- owner: the owner of the project
- description: a short description for the project
Returns:
A dataikuapi.dss.project.DSSProject project handle
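
For example (the project key and owner login are placeholders):

    project = client.create_project(
        "MYPROJECT", "My project", "admin",
        description="Project created through the public API",
    )
    print(project.get_metadata())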

create_project_from_bundle_archive(fp)

create_project_from_bundle_local_archive(archive_path)

create_user(login, password, display_name='', source_type='LOCAL', groups=[], profile='DATA_SCIENTIST')
Create a user, and return a handle to interact with it
Note: this call requires an API key with admin rights
Parameters:
- login (str) – the login of the new user
- password (str) – the password of the new user
- display_name (str) – the displayed name for the new user
- source_type (str) – the type of the new user. Admissible values are 'LOCAL' or 'LDAP'
- groups (list) – the names of the groups the new user belongs to
- profile (str) – the profile for the new user; one of READER, DATA_ANALYST or DATA_SCIENTIST
Returns:
A dataikuapi.dss.admin.DSSUser user handle
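
For example (the login, password and group name are placeholders):

    user = client.create_user(
        "jdoe", "an-initial-password",
        display_name="John Doe",
        groups=["data_team"],
        profile="DATA_SCIENTIST",
    )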

get_connection(name)
Get a handle to interact with a specific connection
Args:
- name: the name of the desired connection
Returns:
A dataikuapi.dss.admin.DSSConnection connection handle

get_future(job_id)
Get a handle to interact with a specific long task (a.k.a. future)
Args:
- job_id: the identifier of the desired future
Returns:
A dataikuapi.dss.future.DSSFuture future handle

get_group(name)
Get a handle to interact with a specific group
Args:
- name: the name of the desired group
Returns:
A dataikuapi.dss.admin.DSSGroup group handle

get_log(name)
Get a specific log
Note: this call requires an API key with admin rights
Args:
- name: the name of the desired log
Returns:
The full log, as a string

get_meaning(id)
Get a handle to interact with a specific user-defined meaning
Note: this call requires an API key with admin rights
Args:
- id: the ID of the desired meaning
Returns:
A dataikuapi.dss.meaning.DSSMeaning meaning handle

get_plugin(plugin_id)
Get a handle to interact with a specific dev plugin
Args:
- plugin_id: the identifier of the desired plugin
Returns:
A dataikuapi.dss.plugin.DSSPlugin plugin handle

get_project(project_key)
Get a handle to interact with a specific project
Args:
- project_key: the project key of the desired project
Returns:
A dataikuapi.dss.project.DSSProject project handle

get_user(login)
Get a handle to interact with a specific user
Parameters: login (str) – the login of the desired user
Returns:
A dataikuapi.dss.admin.DSSUser user handle

get_variables()
Get the DSS instance's variables
Note: this call requires an API key with admin rights
Returns:
A JSON object

list_connections()
List all connections set up on the DSS instance
Note: this call requires an API key with admin rights
Returns:
All connections, as a map of connection name to connection definition

list_futures(as_objects=False, all_users=False)
List the currently-running long tasks (a.k.a. futures)
Returns:
The list of futures. Each object contains at least a 'jobId' field

list_groups()
List all groups set up on the DSS instance
Note: this call requires an API key with admin rights
Returns:
A list of groups, as a list of JSON objects

list_logs()
List all logs on the DSS instance
Note: this call requires an API key with admin rights
Returns:
A list of log names

list_meanings()
List all user-defined meanings on the DSS instance
Note: this call requires an API key with admin rights
Returns:
A list of meanings, as a list of JSON objects

list_plugins()
List the installed plugins
Returns:
The list of installed plugins, each one as a JSON object

list_project_keys()
List the project keys (= project identifiers)
Returns:
The list of project identifiers, as strings

list_projects()
List the projects
Returns:
The list of projects, each one as a JSON object containing at least a 'projectKey' field

list_running_scenarios(all_users=False)
List the running scenarios
Returns:
The list of scenarios, each one as a JSON object containing a 'jobId' field for the future hosting the scenario run, and a 'payload' field with scenario identifiers

list_users()
List all users set up on the DSS instance
Note: this call requires an API key with admin rights
Returns:
A list of users, as a list of dicts

prepare_project_import(fp)
Prepare the import of a project archive
Args:
- fp: the input stream, as a file-like object
Returns:
A TemporaryImportHandle for the prepared import

set_variables(variables)
Set the DSS instance's variables
Note: this call requires an API key with admin rights
Args:
- variables: the new state of all variables of the instance, as a JSON object

sql_query(query, connection=None, database=None, dataset_full_name=None, pre_queries=None, post_queries=None, type='sql', extra_conf={})
Initiate a SQL, Hive or Impala query and get a handle to retrieve the results of the query. Internally, the query is run by DSS. The database to run the query on is specified either by passing a connection name, a database name, or a dataset full name (whose connection is then used to retrieve the database).
Args:
- query: the query to run
- connection: the connection on which the query should be run (exclusive of database and dataset_full_name)
- database: the database on which the query should be run (exclusive of connection and dataset_full_name)
- dataset_full_name: the dataset on whose connection the query should be run (exclusive of connection and database)
- pre_queries (optional): a list of queries to run before the query
- post_queries (optional): a list of queries to run after the query
- type: the type of query: either 'sql', 'hive' or 'impala'
Returns:
A dataikuapi.dss.sqlquery.DSSSQLQuery query handle
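
A minimal sketch of running a query and streaming its results. The connection name is a placeholder, and the iter_rows()/verify() calls on the returned handle are assumptions; check dataikuapi.dss.sqlquery.DSSSQLQuery for the exact interface:

    # 'sql_db' is a placeholder connection name.
    q = client.sql_query("SELECT COUNT(*) AS n FROM mytable", connection="sql_db")

    # Assumption: the query handle exposes iter_rows() to stream the
    # result rows and verify() to finalize the query.
    for row in q.iter_rows():
        print(row)
    q.verify()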

class dataikuapi.dssclient.TemporaryImportHandle(client, import_id)

execute(settings={})
Execute the import with the provided settings.
Warning: you must check the 'success' flag of the result

class dataikuapi.dss.project.DSSProject(client, project_key)
A project on the DSS instance

activate_bundle(bundle_id)

create_dataset(dataset_name, type, params={}, formatType=None, formatParams={})
Create a new dataset in the project, and return a handle to interact with it
Args:
- dataset_name: the name for the new dataset
- type: the type of the dataset
- params: the parameters for the type, as a JSON object
- formatType: an optional format to create the dataset with
- formatParams: the parameters to the format, as a JSON object
Returns:
A dataikuapi.dss.dataset.DSSDataset dataset handle
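
For example, creating a files-based dataset might look like the sketch below. The 'Filesystem' type, connection name and parameter keys are assumptions; valid values depend on the connections configured on your instance:

    project = client.get_project("MYPROJECT")  # placeholder project key

    # Assumptions: dataset type 'Filesystem', connection 'filesystem_root',
    # and a CSV format; check your instance for the exact values.
    dataset = project.create_dataset(
        "mydataset", "Filesystem",
        params={"connection": "filesystem_root", "path": "/data/mydataset"},
        formatType="csv",
        formatParams={"separator": ",", "parseHeaderRow": True},
    )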

create_managed_folder(name)
Create a new managed folder in the project, and return a handle to interact with it
Args:
- name: the name of the managed folder
Returns:
A dataikuapi.dss.managedfolder.DSSManagedFolder managed folder handle

create_recipe(recipe_name, type, recipe_inputs=[], recipe_outputs=[], recipe_params={})
Create a new recipe in the project, and return a handle to interact with it. Some recipe types take additional parameters in recipe_params:
- 'grouping': a 'groupKey' column name
- 'python', 'sql_query', 'hive', 'impala': the code of the recipe, as a 'payload' string
Args:
- recipe_name: the name for the new recipe
- type: the type of the recipe ('sync', 'grouping', 'join', 'vstack', 'python', 'sql_query', 'hive', 'impala')
- recipe_inputs: a list of recipe inputs, as objects {'ref': '...', 'deps': [...]}
- recipe_outputs: a list of recipe outputs, as objects {'ref': '...', 'appendMode': True/False}
- recipe_params: additional parameters for the recipe creation
Returns:
A dataikuapi.dss.recipe.DSSRecipe recipe handle
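
A sketch of creating a Python recipe (the dataset names are placeholders; the 'deps' field is omitted here on the assumption that it is optional):

    recipe = project.create_recipe(
        "compute_mydataset_out", "python",
        recipe_inputs=[{"ref": "mydataset"}],
        recipe_outputs=[{"ref": "mydataset_out", "appendMode": False}],
        recipe_params={"payload": "# recipe code goes here\n"},
    )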

create_scenario(scenario_name, type, definition={})
Create a new scenario in the project, and return a handle to interact with it
Args:
- scenario_name: the name for the new scenario
- type: the type of the scenario ('step_based' or 'custom_python')
- definition: the definition of the scenario, as a JSON object
Returns:
A dataikuapi.dss.scenario.DSSScenario scenario handle

delete()
Delete the project
Note: this call requires an API key with admin rights

download_exported_bundle_archive_to_file(bundle_id, path)
Download a bundle archive that can be deployed in a DSS automation Node into the given output file
Args:
- path: the path of the output file. If "-", writes to /dev/stdout

export_bundle(bundle_id)

export_to_file(path, options={})
Export the project to a file
Args:
- path: the path of the file in which the exported project should be saved

get_api_service(service_id)
Get a handle to interact with a specific API service
Args:
- service_id: the ID of the desired API service
Returns:
A dataikuapi.dss.apiservice.DSSAPIService API Service handle

get_dataset(dataset_name)
Get a handle to interact with a specific dataset
Args:
- dataset_name: the name of the desired dataset
Returns:
A dataikuapi.dss.dataset.DSSDataset dataset handle

get_export_stream(options={})
Return a stream of the exported project
Warning: this stream will monopolize the DSSClient until closed

get_exported_bundle_archive_stream(bundle_id)
Download a bundle archive that can be deployed in a DSS automation Node, as a binary stream
Warning: this stream will monopolize the DSSClient until closed

get_job(id)
Get a handle to interact with a specific job
Returns:
A dataikuapi.dss.job.DSSJob job handle

get_managed_folder(odb_id)
Get a handle to interact with a specific managed folder
Args:
- odb_id: the identifier of the desired managed folder
Returns:
A dataikuapi.dss.managedfolder.DSSManagedFolder managed folder handle

get_metadata()
Get the metadata attached to this project. The metadata contains the label, description, checklists, tags and custom metadata of the project
Returns:
A dict object. For more information on the available metadata, see https://doc.dataiku.com/dss/api/latest

get_permissions()
Get the permissions attached to this project
Returns:
A JSON object containing the owner and the permissions, as a list of pairs of group name and permission type

get_recipe(recipe_name)
Get a handle to interact with a specific recipe
Args:
- recipe_name: the name of the desired recipe
Returns:
A dataikuapi.dss.recipe.DSSRecipe recipe handle

get_saved_model(sm_id)
Get a handle to interact with a specific saved model
Args:
- sm_id: the identifier of the desired saved model
Returns:
A dataikuapi.dss.savedmodel.DSSSavedModel saved model handle

get_scenario(scenario_id)
Get a handle to interact with a specific scenario
Args:
- scenario_id: the ID of the desired scenario
Returns:
A dataikuapi.dss.scenario.DSSScenario scenario handle

get_variables()
Get the variables of this project
Returns:
A dictionary containing two dictionaries: "standard" and "local". "standard" variables are regular variables, exported with bundles. "local" variables are not part of the bundles for this project
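
Project variables are normally updated with a read-modify-write round trip, since set_variables (below) expects a modified copy of the object returned by get_variables. The variable name is a placeholder:

    variables = project.get_variables()
    variables["standard"]["my_variable"] = 42
    project.set_variables(variables)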

import_bundle_from_archive(archive_path)

list_api_services()
List the API services in this project
Returns:
The list of API services, each one as a JSON object

list_datasets()
List the datasets in this project
Returns:
The list of datasets, each one as a JSON object

list_exported_bundles()

list_imported_bundles()

list_jobs()
List the jobs in this project
Returns:
A list of jobs, each one as a JSON object containing both the definition and the state

list_managed_folders()
List the managed folders in this project
Returns:
The list of managed folders, each one as a JSON object

list_recipes()
List the recipes in this project
Returns:
The list of recipes, each one as a JSON object

list_saved_models()
List the saved models in this project
Returns:
The list of saved models, each one as a JSON object

list_scenarios()
List the scenarios in this project
Returns:
The list of scenarios, each one as a JSON object

set_metadata(metadata)
Set the metadata on this project
Args:
- metadata: the new state of the metadata for the project. You should only set a metadata object that has been retrieved using the get_metadata call

set_permissions(permissions)
Set the permissions on this project
Args:
- permissions: a JSON object with the same structure as the one returned by the get_permissions call

set_variables(obj)
Set the variables of this project
Args:
- obj: must be a modified version of the object returned by get_variables

start_job(definition)
Create a new job, and return a handle to interact with it
Args:
- definition: the definition for the job to create. The definition must contain the type of job (RECURSIVE_BUILD, NON_RECURSIVE_FORCED_BUILD, RECURSIVE_FORCED_BUILD, RECURSIVE_MISSING_ONLY_BUILD) and a list of outputs to build. Optionally, a refreshHiveMetastore field can specify whether to re-synchronize the Hive metastore for recomputed HDFS datasets
Returns:
A dataikuapi.dss.job.DSSJob job handle
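
A sketch of a job definition. The exact structure of the 'outputs' entries is an assumption (the dataset name is a placeholder), and partitioned outputs would also need a partition specification:

    job = project.start_job({
        "type": "NON_RECURSIVE_FORCED_BUILD",
        "outputs": [{"id": "mydataset_out"}],  # assumed output structure
    })
    print(job.get_status())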

sync_datasets_acls()
Resync permissions on HDFS datasets in this project
Note: this call requires an API key with admin rights
Returns:
A DSSFuture handle to the task of resynchronizing the permissions

class dataikuapi.dss.dataset.DSSDataset(client, project_key, dataset_name)
A dataset on the DSS instance

clear(partitions=None)
Clear all data in this dataset
Args:
- partitions (optional): a list of partitions to clear. When not provided, the entire dataset is cleared

compute_metrics(partition='', metric_ids=None, probes=None)
Compute metrics on a partition of this dataset. If neither metric ids nor custom probes are specified, the metrics set up on the dataset are used.

delete()
Delete the dataset

get_definition()
Get the definition of the dataset
Returns:
The definition, as a JSON object

get_last_metric_values(partition='')
Get the last values of the metrics on this dataset
Returns:
A list of metric objects and their values

get_metadata()
Get the metadata attached to this dataset. The metadata contains the label, description, checklists, tags and custom metadata of the dataset
Returns:
A dict object. For more information on the available metadata, see https://doc.dataiku.com/dss/api/latest

get_metric_history(metric, partition='')
Get the history of the values of a metric on this dataset
Returns:
An object containing the values of the metric, cast to the appropriate type (double, boolean, ...)

get_schema()
Get the schema of the dataset
Returns:
A JSON object of the schema, with the list of columns

get_usages()
Get the recipes or analyses referencing this dataset
Returns:
A list of usages

iter_rows(partitions=None)
Get the dataset's data
Returns:
An iterator over the rows, each row being a tuple of values. The order of values in the tuples is the same as the order of columns in the schema returned by get_schema
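
For example, rows can be paired with the column names from the schema. This assumes the schema object lists its columns under a 'columns' key, each with a 'name' field:

    dataset = project.get_dataset("mydataset")  # placeholder dataset name

    columns = [c["name"] for c in dataset.get_schema()["columns"]]
    for row in dataset.iter_rows():
        print(dict(zip(columns, row)))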

list_partitions()
Get the list of all partitions of this dataset
Returns:
The list of partitions, as a list of strings

run_checks(partition='', checks=None)
Run checks on a partition of this dataset. If the checks are not specified, the checks set up on the dataset are used.

set_definition(definition)
Set the definition of the dataset
Args:
- definition: the definition, as a JSON object. You should only set a definition object that has been retrieved using the get_definition call

set_metadata(metadata)
Set the metadata on this dataset
Args:
- metadata: the new state of the metadata for the dataset. You should only set a metadata object that has been retrieved using the get_metadata call

set_schema(schema)
Set the schema of the dataset
Args:
- schema: the desired schema for the dataset, as a JSON object. All columns have to provide their name and type
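
A minimal sketch of a schema object, assuming the same 'columns' layout returned by get_schema; 'string' and 'bigint' are standard DSS storage types:

    dataset.set_schema({"columns": [
        {"name": "customer_id", "type": "bigint"},
        {"name": "customer_name", "type": "string"},
    ]})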

synchronize_hive_metastore()
Synchronize this dataset with the Hive metastore

class dataikuapi.dss.recipe.DSSRecipe(client, project_key, recipe_name)
A recipe on the DSS instance

delete()
Delete the recipe

get_definition_and_payload()
Get the definition of the recipe
Returns:
The definition, as a DSSRecipeDefinitionAndPayload object containing the recipe definition itself and its payload

get_metadata()
Get the metadata attached to this recipe. The metadata contains the label, description, checklists, tags and custom metadata of the recipe
Returns:
A dict object. For more information on the available metadata, see https://doc.dataiku.com/dss/api/latest

set_definition_and_payload(definition)
Set the definition of the recipe
Args:
- definition: the definition, as a DSSRecipeDefinitionAndPayload object. You should only set a definition object that has been retrieved using the get_definition_and_payload call

set_metadata(metadata)
Set the metadata on this recipe
Args:
- metadata: the new state of the metadata for the recipe. You should only set a metadata object that has been retrieved using the get_metadata call

class dataikuapi.dss.recipe.DSSRecipeCreator(type, name, project)
Helper to create new recipes

build()
Create a new recipe in the project, and return a handle to interact with it
Returns:
A dataikuapi.dss.recipe.DSSRecipe recipe handle

with_input(dataset_name, project_key=None)

with_output(dataset_name, append=False)
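
The creators are typically used in a fluent style. The sketch below assumes the with_* methods return the creator itself, and that SyncRecipeCreator inherits with_existing_output from SingleOutputRecipeCreator (dataset names are placeholders):

    from dataikuapi.dss.recipe import SyncRecipeCreator

    # Assumption: with_input/with_existing_output return the creator,
    # allowing the chained style shown here.
    recipe = SyncRecipeCreator("sync_mydataset", project) \
        .with_input("mydataset") \
        .with_existing_output("mydataset_copy") \
        .build()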

class dataikuapi.dss.recipe.DSSRecipeDefinitionAndPayload(data)
Definition for a recipe, that is, the recipe definition itself and its payload

get_json_payload()

get_payload()

get_recipe_inputs()

get_recipe_outputs()

get_recipe_params()

get_recipe_raw_definition()

set_json_payload(payload)

set_payload(payload)

class dataikuapi.dss.recipe.JoinRecipeCreator(name, project)

class dataikuapi.dss.recipe.SQLQueryRecipeCreator(name, project)

class dataikuapi.dss.recipe.SamplingRecipeCreator(name, project)

class dataikuapi.dss.recipe.SingleOutputRecipeCreator(type, name, project)

with_existing_output(dataset_name, append=False)

with_new_output(dataset_name, connection_id, format_option_id=None, partitioning_option_id=None, append=False)

with_output(dataset_name, append=False)

class dataikuapi.dss.recipe.StackRecipeCreator(name, project)

class dataikuapi.dss.recipe.SyncRecipeCreator(name, project)

class dataikuapi.dss.recipe.VirtualInputsSingleOutputRecipeCreator(type, name, project)

with_input(dataset_name, project_key=None)

class dataikuapi.dss.recipe.WindowRecipeCreator(name, project)

class dataikuapi.dss.managedfolder.DSSManagedFolder(client, project_key, odb_id)
A managed folder on the DSS instance

compute_metrics(metric_ids=None, probes=None)
Compute metrics on this managed folder. If the metrics are not specified, the metrics set up on the managed folder are used.

delete()
Delete the managed folder

delete_file(path)
Delete a file from the managed folder

get_definition()
Get the definition of the managed folder
Returns:
The definition, as a JSON object

get_file(path)
Get a file from the managed folder
Returns:
The file's content, as a stream

get_last_metric_values()
Get the last values of the metrics on this managed folder
Returns:
A list of metric objects and their values

get_metric_history(metric)
Get the history of the values of a metric on this managed folder
Returns:
An object containing the values of the metric, cast to the appropriate type (double, boolean, ...)

get_usages()
Get the recipes referencing this folder
Returns:
A list of usages

list_contents()
Get the list of files in the managed folder
Returns:
The list of files, as a JSON object

put_file(name, f)
Upload a file to the managed folder
Args:
- name: the name of the file
- f: the file contents, as a stream
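
For example (the folder identifier and file name are placeholders):

    folder = project.get_managed_folder("folder_id")

    with open("report.csv", "rb") as f:
        folder.put_file("report.csv", f)
    print(folder.list_contents())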

set_definition(definition)
Set the definition of the managed folder
Args:
- definition: the definition, as a JSON object. You should only set a definition object that has been retrieved using the get_definition call

class dataikuapi.dss.job.DSSJob(client, project_key, id)
A job on the DSS instance

abort()
Abort the job

get_log(activity=None)
Get the logs of the job
Args:
- activity (optional): the name of the activity in the job whose log is requested
Returns:
The log, as a string

get_status()
Get the current status of the job
Returns:
The state of the job, as a JSON object

class dataikuapi.dss.meaning.DSSMeaning(client, id)
A user-defined meaning on the DSS instance

get_definition()
Get the meaning's definition
Note: this call requires an API key with admin rights
Returns:
The meaning definition, as a JSON object

set_definition(definition)
Set the meaning's definition
Note: this call requires an API key with admin rights
Args:
- definition: the definition for the meaning, as a JSON object

class dataikuapi.dss.metrics.ComputedMetrics(raw)

get_all_ids()

get_first_partition_data(metric_id)

get_global_data(metric_id)

get_global_value(metric_id)

get_metric_by_id(id)

get_partition_data(metric_id, partition)

get_partition_value(metric_id, partition)

static get_value_from_data(data)

class dataikuapi.dss.admin.DSSConnection(client, name)
A connection on the DSS instance

delete()
Delete the connection
Note: this call requires an API key with admin rights

get_definition()
Get the connection's definition (type, name, params, usage restrictions)
Note: this call requires an API key with admin rights
Returns:
The connection definition, as a JSON object

set_definition(definition)
Set the connection's definition
Note: this call requires an API key with admin rights
Args:
- definition: the definition for the connection, as a JSON object

sync_datasets_acls()
Resync permissions on datasets in this connection path
Note: this call requires an API key with admin rights
Returns:
A DSSFuture handle to the task of resynchronizing the permissions

sync_root_acls()
Resync root permissions on this connection path
Note: this call requires an API key with admin rights
Returns:
A DSSFuture handle to the task of resynchronizing the permissions

class dataikuapi.dss.admin.DSSGroup(client, name)
A group on the DSS instance

delete()
Delete the group
Note: this call requires an API key with admin rights

get_definition()
Get the group's definition (name, description, admin abilities, type, LDAP name mapping)
Note: this call requires an API key with admin rights
Returns:
The group definition, as a JSON object

set_definition(definition)
Set the group's definition
Note: this call requires an API key with admin rights
Args:
- definition: the definition for the group, as a JSON object

class dataikuapi.dss.admin.DSSUser(client, login)
A handle for a user on the DSS instance

delete()
Delete the user
Note: this call requires an API key with admin rights

get_definition()
Get the user's definition (login, type, display name, permissions, ...)
Note: this call requires an API key with admin rights
Returns:
The user's definition, as a dict

set_definition(definition)
Set the user's definition
Note: this call requires an API key with admin rights
Parameters: definition (dict) – the definition for the user, as a dict. You should obtain the definition using get_definition, not create one. The fields that can be changed are:
- displayName
- groups
- userProfile
- password
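
The usual pattern is a read-modify-write round trip (the login and new display name are placeholders):

    user = client.get_user("jdoe")

    definition = user.get_definition()
    definition["displayName"] = "Jane Doe"
    user.set_definition(definition)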

class dataikuapi.dss.scenario.DSSScenario(client, project_key, id)
A handle to interact with a scenario on the DSS instance

abort()
Abort the scenario if it is currently running

get_average_duration(limit=3)
Get the average duration (in fractional seconds) of the last runs of this scenario that finished, where finished means the run ended with SUCCESS or WARNING. If there are not enough runs to compute the average, returns None
Args:
- limit: the number of last runs to average over

get_current_run()
Get the current run of the scenario, or None if it is not running at the moment
Returns:
A dataikuapi.dss.scenario.DSSScenarioRun

get_definition(with_status=True)
Return the definition of the scenario
Args:
- with_status: if True, the definition contains the run status of the scenario but not its actions' definition. If False, the definition doesn't contain the run status but has the scenario's actions definition

get_last_runs(limit=10, only_finished_runs=False)
Get the list of the last runs of the scenario
Returns:
A list of dataikuapi.dss.scenario.DSSScenarioRun

get_payload(extension='py')
Return the payload of the scenario
Parameters: extension (str) – the type of script. Default is 'py' for Python

get_run(run_id)
Get a handle to a run of the scenario
Returns:
A dataikuapi.dss.scenario.DSSScenarioRun

run(params={})
Request a run of the scenario, which will start after a few seconds
Parameters: params (dict) – additional parameters that will be passed to the scenario through trigger params
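
A sketch of launching a scenario and inspecting its run, using only the calls documented here (the scenario ID is a placeholder; the fixed sleep is a naive way to wait for the run to start):

    import time

    scenario = project.get_scenario("MYSCENARIO")
    scenario.run()

    time.sleep(10)  # the run starts after a few seconds
    run = scenario.get_current_run()
    if run is not None:
        print(run.get_info())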

set_definition(definition, with_status=True)
Update the definition of this scenario
Args:
- definition: the new definition of the scenario
- with_status: should be the same as the value passed to get_definition(). If True, the params, triggers and reporters fields of the scenario are ignored

set_payload(script, with_status=True)
Update the payload of this scenario
Parameters: script (str) – the new payload of the scenario

class dataikuapi.dss.scenario.DSSScenarioRun(client, run)
A handle containing basic info about a past run of a scenario.
This handle can also be used to fetch additional information about the run

get_details()
Get the full details of the scenario run, including its step runs.
Note: this performs another API call

get_duration()
Get the duration of this run (in fractional seconds).
If the run is still running, get the duration since it started

get_info()
Get the basic information of the scenario run

get_start_time()
Get the start time of the scenario run

class dataikuapi.dss.scenario.DSSTriggerFire(scenario, trigger_fire)
The activation of a trigger on the DSS instance

get_scenario_run()
Get the run of the scenario that this trigger activation launched

class dataikuapi.dss.apiservice.DSSAPIService(client, project_key, service_id)
An API Service on the DSS instance

create_package(package_id)
Prepare a package of this API service

delete_package(package_id)
Delete a package of this API service

download_package_stream(package_id)
Download a package archive that can be deployed in a DSS API Node, as a binary stream.
Warning: this stream will monopolize the DSSClient until closed

download_package_to_file(package_id, path)
Download a package archive that can be deployed in a DSS API Node, into the given output file

list_packages()
List the packages of this API service
Returns:
The list of API service packages, each one as a JSON object