(c) 2013-2023 Brendan Hay
License     : Mozilla Public License, v. 2.0.
Maintainer  : Brendan Hay
Stability   : auto-generated
Portability : non-portable (GHC extensions)
Safety      : Safe-Inferred
Package     : amazonka-databrew

AllowedStatistics

Configuration of statistics that are allowed to be run on columns that contain detected entities. When undefined, no statistics will be computed on columns that contain detected entities.

See: the newAllowedStatistics smart constructor.

Create a value of AllowedStatistics with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields. The following record fields are available, with the corresponding lenses provided for backwards compatibility:

- statistics - One or more column statistics to allow for columns that contain detected entities.

ColumnSelector

Selector of a column from a dataset for profile job configuration. One selector includes either a column name or a regular expression.

See: the newColumnSelector smart constructor. Fields:

- name - The name of a column from a dataset.
- regex - A regular expression for selecting a column from a dataset.

ConditionExpression

Represents an individual condition that evaluates to true or false.

Conditions are used with recipe actions. The action is only performed for column values where the condition evaluates to true.

If a recipe requires more than one condition, then the recipe must specify multiple ConditionExpression elements. Each condition is applied to the rows in a dataset first, before the recipe action is performed.

See: the newConditionExpression smart constructor. Fields:

- value - A value that the condition must evaluate to for the condition to succeed.
- condition - A specific condition to apply to a recipe action. For more information, see Recipe structure (https://docs.aws.amazon.com/databrew/latest/dg/recipes.html#recipes.structure) in the Glue DataBrew Developer Guide.
- targetColumn - A column to apply this condition to.

CsvOptions

Represents a set of options that define how DataBrew will read a comma-separated value (CSV) file when creating a dataset from that file.

See: the newCsvOptions smart constructor. Fields:

- delimiter - A single character that specifies the delimiter being used in the CSV file.
- headerRow - A variable that specifies whether the first row in the file is parsed as the header. If this value is false, column names are auto-generated.
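The ConditionExpression semantics above (a recipe action runs only on rows where every condition evaluates to true) can be sketched in plain Haskell. This is an illustrative base-only model, not the amazonka-databrew types; the record shape and the condition names "EQUALS" and "IS_NOT_MISSING" are assumptions made for the sketch.

```haskell
-- Illustrative mirror of a DataBrew condition: a named check applied
-- to one column, with an optional comparison value.
data Condition = Condition
  { condName     :: String        -- hypothetical operation name
  , condValue    :: Maybe String  -- optional comparison value
  , targetColumn :: String
  }

-- A row is a list of (column, value) pairs in this sketch.
type Row = [(String, String)]

-- Evaluate one condition against a row (only two operations modeled).
evalCondition :: Condition -> Row -> Bool
evalCondition c row =
  case lookup (targetColumn c) row of
    Nothing -> False
    Just v  -> case condName c of
      "IS_NOT_MISSING" -> not (null v)
      "EQUALS"         -> Just v == condValue c
      _                -> False

-- The action is performed only for rows where ALL conditions hold.
rowsToTransform :: [Condition] -> [Row] -> [Row]
rowsToTransform cs = filter (\r -> all (`evalCondition` r) cs)
```

Combining several conditions with `all` matches the documented behavior that each condition is applied to the rows first, before the recipe action runs.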
CsvOutputOptions

Represents a set of options that define how DataBrew will write a comma-separated value (CSV) file.

See: the newCsvOutputOptions smart constructor. Fields:

- delimiter - A single character that specifies the delimiter used to create CSV job output.

DatetimeOptions

Represents additional options for correct interpretation of datetime parameters used in the Amazon S3 path of a dataset.

See: the newDatetimeOptions smart constructor. Fields:

- localeCode - Optional value for a non-US locale code, needed for correct interpretation of some date formats.
- timezoneOffset - Optional value for a timezone offset of the datetime parameter value in the Amazon S3 path. Shouldn't be used if Format for this parameter includes timezone fields. If no offset is specified, UTC is assumed.
- format - Required option that defines the datetime format used for a date parameter in the Amazon S3 path. It should use only supported datetime specifiers and separation characters; all literal a-z or A-Z characters should be escaped with single quotes, e.g. "MM.dd.yyyy-'at'-HH:mm".
EntityDetectorConfiguration

Configuration of entity detection for a profile job. When undefined, entity detection is disabled.

See: the newEntityDetectorConfiguration smart constructor. Fields:

- allowedStatistics - Configuration of statistics that are allowed to be run on columns that contain detected entities. When undefined, no statistics will be computed on columns that contain detected entities.
- entityTypes - Entity types to detect. Can be any of the following:

  - USA_SSN
  - EMAIL
  - USA_ITIN
  - USA_PASSPORT_NUMBER
  - PHONE_NUMBER
  - USA_DRIVING_LICENSE
  - BANK_ACCOUNT
  - CREDIT_CARD
  - IP_ADDRESS
  - MAC_ADDRESS
  - USA_DEA_NUMBER
  - USA_HCPCS_CODE
  - USA_NATIONAL_PROVIDER_IDENTIFIER
  - USA_NATIONAL_DRUG_CODE
  - USA_HEALTH_INSURANCE_CLAIM_NUMBER
  - USA_MEDICARE_BENEFICIARY_IDENTIFIER
  - USA_CPT_CODE
  - PERSON_NAME
  - DATE

  The entity type group USA_ALL is also supported, and includes all of the above entity types except PERSON_NAME and DATE.
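The USA_ALL group rule (every listed entity type except PERSON_NAME and DATE) can be expressed directly as a sketch. The sum type below mirrors the documented strings; it is not the amazonka-databrew enum.

```haskell
-- Entity types exactly as listed in the documentation above.
data EntityType
  = USA_SSN | EMAIL | USA_ITIN | USA_PASSPORT_NUMBER | PHONE_NUMBER
  | USA_DRIVING_LICENSE | BANK_ACCOUNT | CREDIT_CARD | IP_ADDRESS
  | MAC_ADDRESS | USA_DEA_NUMBER | USA_HCPCS_CODE
  | USA_NATIONAL_PROVIDER_IDENTIFIER | USA_NATIONAL_DRUG_CODE
  | USA_HEALTH_INSURANCE_CLAIM_NUMBER | USA_MEDICARE_BENEFICIARY_IDENTIFIER
  | USA_CPT_CODE | PERSON_NAME | DATE
  deriving (Show, Eq, Enum, Bounded)

-- USA_ALL covers every entity type except PERSON_NAME and DATE.
usaAll :: [EntityType]
usaAll = [t | t <- [minBound .. maxBound], t `notElem` [PERSON_NAME, DATE]]
```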
ExcelOptions

Represents a set of options that define how DataBrew will interpret a Microsoft Excel file when creating a dataset from that file.

See: the newExcelOptions smart constructor. Fields:

- headerRow - A variable that specifies whether the first row in the file is parsed as the header. If this value is false, column names are auto-generated.
- sheetIndexes - One or more sheet numbers in the Excel file that will be included in the dataset.
- sheetNames - One or more named sheets in the Excel file that will be included in the dataset.

FilterExpression

Represents a structure for defining parameter conditions. Supported conditions are described in Supported conditions for dynamic datasets (https://docs.aws.amazon.com/databrew/latest/dg/datasets.multiple-files.html#conditions.for.dynamic.datasets) in the Glue DataBrew Developer Guide.

See: the newFilterExpression smart constructor. Fields:

- expression - The expression, which includes condition names followed by substitution variables, possibly grouped and combined with other conditions. For example, "(starts_with :prefix1 or starts_with :prefix2) and (ends_with :suffix1 or ends_with :suffix2)". Substitution variables should start with the ':' symbol.
- valuesMap - The map of substitution variable names to their values used in this filter expression.
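The relationship between the expression and its values map can be illustrated with a small substitution function: each ":name" variable in the expression is replaced by its value from the map. This is only a sketch of the documented convention; the real expression is evaluated service-side by DataBrew.

```haskell
import Data.Char (isAlphaNum)
import Data.Map (Map)
import qualified Data.Map as Map

-- Replace ":name" substitution variables with their mapped values.
-- Unknown variables are left untouched.
substitute :: Map String String -> String -> String
substitute vars = go
  where
    go [] = []
    go (':' : rest) =
      let (name, rest') = span (\c -> isAlphaNum c || c == '_') rest
      in case Map.lookup (':' : name) vars of
           Just v  -> v ++ go rest'
           Nothing -> ':' : name ++ go rest'
    go (c : rest) = c : go rest
```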
JsonOptions

Represents the JSON-specific options that define how input is to be interpreted by Glue DataBrew.

See: the newJsonOptions smart constructor. Fields:

- multiLine - A value that specifies whether JSON input contains embedded new line characters.

FormatOptions

Represents a set of options that define the structure of either comma-separated value (CSV), Excel, or JSON input.

See: the newFormatOptions smart constructor. Fields:

- csv - Options that define how CSV input is to be interpreted by DataBrew.
- excel - Options that define how Excel input is to be interpreted by DataBrew.
- json - Options that define how JSON input is to be interpreted by DataBrew.
Metadata

Contains additional resource information needed for specific datasets.

See: the newMetadata smart constructor. Fields:

- sourceArn - The Amazon Resource Name (ARN) associated with the dataset. Currently, DataBrew only supports ARNs from Amazon AppFlow.

FilesLimit

Represents a limit imposed on the number of Amazon S3 files that should be selected for a dataset from a connected Amazon S3 path.

See: the newFilesLimit smart constructor. Fields:

- order - A criterion to use for sorting Amazon S3 files before their selection. By default uses DESCENDING order, i.e. most recent files are selected first. Another possible value is ASCENDING.
- orderedBy - A criterion to use for sorting Amazon S3 files before their selection. By default uses LAST_MODIFIED_DATE as the sorting criterion. Currently it's the only allowed value.
- maxFiles - The number of Amazon S3 files to select.
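The FilesLimit behavior (sort by last-modified date, most recent first by default, then keep at most maxFiles objects) can be modeled in a few lines. This is a base-only sketch; the Int timestamps stand in for S3 LastModifiedDate values.

```haskell
import Data.List (sortBy)
import Data.Ord (Down (..), comparing)

data Order = Ascending | Descending deriving (Show, Eq)

-- Keep at most maxFiles files after sorting by last-modified date
-- (LAST_MODIFIED_DATE is the only documented sort key).
applyFilesLimit :: Int -> Order -> [(String, Int)] -> [(String, Int)]
applyFilesLimit maxFiles order files = take maxFiles (sortBy cmp files)
  where
    cmp = case order of
      Descending -> comparing (Down . snd)  -- most recent first (default)
      Ascending  -> comparing snd
```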
OutputFormatOptions

Represents a set of options that define the structure of comma-separated value (CSV) job output.

See: the newOutputFormatOptions smart constructor. Fields:

- csv - Represents a set of options that define the structure of comma-separated value (CSV) job output.

DatasetParameter

Represents a dataset parameter that defines type and conditions for a parameter in the Amazon S3 path of the dataset.

See: the newDatasetParameter smart constructor. Fields:

- createColumn - Optional boolean value that defines whether the captured value of this parameter should be used to create a new column in a dataset.
- datetimeOptions - Additional parameter options such as a format and a timezone. Required for datetime parameters.
- filter - The optional filter expression structure to apply additional matching criteria to the parameter.
- name - The name of the parameter that is used in the dataset's Amazon S3 path.
- type - The type of the dataset parameter; can be one of 'String', 'Number', or 'Datetime'.
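To make the "parameter in the Amazon S3 path" idea concrete, here is a sketch that captures a parameter's value from a concrete key given a path template with one "{parameter}" placeholder. The brace syntax, the single-placeholder restriction, and the one-path-segment capture rule are all assumptions of this illustration, not guarantees about DataBrew's matching.

```haskell
import Data.List (stripPrefix)

-- Capture the value of the single "{parameter}" placeholder in an
-- S3 path template from a concrete key; Nothing if the key doesn't
-- match the template. A value is assumed to span one path segment.
captureParameter :: String -> String -> Maybe String
captureParameter template key = do
  let (before, rest) = break (== '{') template
  after  <- dropPlaceholder rest
  middle <- stripPrefix before key
  let value = takeWhile (/= '/') middle
  remaining <- stripPrefix value middle
  if remaining == after then Just value else Nothing
  where
    dropPlaceholder s = case dropWhile (/= '}') s of
      '}' : t -> Just t
      _       -> Nothing
```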
PathOptions

Represents a set of options that define how DataBrew selects files for a given Amazon S3 path in a dataset.

See: the newPathOptions smart constructor. Fields:

- filesLimit - If provided, this structure imposes a limit on the number of files that should be selected.
- lastModifiedDateCondition - If provided, this structure defines a date range for matching Amazon S3 objects based on their LastModifiedDate attribute in Amazon S3.
- parameters - A structure that maps names of parameters used in the Amazon S3 path of a dataset to their definitions.

RecipeAction

Represents a transformation and associated parameters that are used to apply a change to a DataBrew dataset. For more information, see the Recipe actions reference (https://docs.aws.amazon.com/databrew/latest/dg/recipe-actions-reference.html).

See: the newRecipeAction smart constructor. Fields:

- parameters - Contextual parameters for the transformation.
- operation - The name of a valid DataBrew transformation to be performed on the data.

RecipeReference

Represents the name and version of a DataBrew recipe.

See: the newRecipeReference smart constructor. Fields:

- recipeVersion - The identifier for the version of the recipe.
- name - The name of the recipe.
In other words, all of the conditions must be combined using a logical AND operation.
- action: The particular action to be performed in the recipe step.

Recipe

Represents one or more actions to be performed on a DataBrew dataset.

See: the newRecipe smart constructor.

Record fields:

- createDate: The date and time that the recipe was created.
- createdBy: The Amazon Resource Name (ARN) of the user who created the recipe.
- description: The description of the recipe.
- lastModifiedBy: The ARN of the user who last modified the recipe.
- lastModifiedDate: The last modification date and time of the recipe.
- projectName: The name of the project that the recipe is associated with.
- publishedBy: The ARN of the user who published the recipe.
- publishedDate: The date and time when the recipe was published.
- recipeVersion: The identifier for the version of the recipe.
Must be one of the following:
  - Numeric version (X.Y): X and Y stand for major and minor version numbers. The maximum length of each is 6 digits, and neither can be a negative value. Both X and Y are required, and "0.0" isn't a valid version.
  - LATEST_WORKING: the most recent valid version being developed in a DataBrew project.
  - LATEST_PUBLISHED: the most recent published version.
- resourceArn: The Amazon Resource Name (ARN) for the recipe.
- steps: A list of steps that are defined by the recipe.
- tags: Metadata tags that have been applied to the recipe.
- name: The unique name for the recipe.
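The action, step, and version docs above can be sketched as code. This is a minimal illustration only: the smart-constructor names (newRecipeAction, newRecipeStep, newRecipeReference) and lens names follow amazonka's usual generated naming scheme rather than being quoted from this document, and the operation name and parameter keys are hypothetical.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DataBrew
import Control.Lens ((&), (?~))
import qualified Data.HashMap.Strict as HashMap

-- A single recipe step: a required transformation (operation) plus
-- optional contextual parameters, set through the generated lens.
exampleStep :: RecipeStep
exampleStep =
  newRecipeStep
    ( newRecipeAction "REMOVE_VALUES" -- hypothetical operation name
        & recipeAction_parameters
          ?~ HashMap.fromList [("sourceColumn", "age")]
    )

-- Reference the most recent published version of a named recipe,
-- using one of the special version identifiers described above.
exampleRef :: RecipeReference
exampleRef =
  newRecipeReference "my-recipe"
    & recipeReference_recipeVersion ?~ "LATEST_PUBLISHED"
```

Only the required fields are constructor arguments; everything optional starts as Nothing and is filled in with lens updates, which is the pattern the "all optional fields omitted" boilerplate refers to.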
RecipeVersionErrorDetail

Represents any errors encountered when attempting to delete multiple recipe versions.

See: the newRecipeVersionErrorDetail smart constructor.

Record fields:

- errorCode: The HTTP status code for the error.
- errorMessage: The text of the error message.
- recipeVersion: The identifier for the recipe version associated with this error.
RulesetItem

Contains metadata about the ruleset.

See: the newRulesetItem smart constructor.

Record fields:

- accountId: The ID of the Amazon Web Services account that owns the ruleset.
- createDate: The date and time that the ruleset was created.
- createdBy: The ARN of the user who created the ruleset.
- description: The description of the ruleset.
- lastModifiedBy: The ARN of the user who last modified the ruleset.
- lastModifiedDate: The modification date and time of the ruleset.
- resourceArn: The ARN for the ruleset.
- ruleCount: The number of rules that are defined in the ruleset.
- tags: Metadata tags that have been applied to the ruleset.
- name: The name of the ruleset.
- targetArn: The ARN of a resource (dataset) that the ruleset is associated with.
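Every section here repeats the note that optional fields can be modified with generic-lens or optics as well as the provided lenses. As a sketch of what that means in practice (the lens name rulesetItem_description assumes amazonka's generated naming scheme, and the description text is illustrative):

```haskell
{-# LANGUAGE DataKinds, OverloadedStrings, TypeApplications #-}

import Amazonka.DataBrew
import Control.Lens ((&), (?~))
import Data.Generics.Product (field)

-- 1. The generated lens, provided for backwards compatibility:
withDescription :: RulesetItem -> RulesetItem
withDescription r = r & rulesetItem_description ?~ "Range checks"

-- 2. generic-lens, addressing the record field by its name:
withDescription' :: RulesetItem -> RulesetItem
withDescription' r = r & field @"description" ?~ "Range checks"
```

Both produce the same update; generic-lens and optics derive the accessor from the record's field name, so no per-type lens export is needed.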
S3Location

Represents an Amazon S3 location (bucket name, bucket owner, and object key) where DataBrew can read input data, or write output from a job.

See: the newS3Location smart constructor.

Record fields:

- bucketOwner: The Amazon Web Services account ID of the bucket owner.
- key: The unique name of the object in the bucket.
- bucket: The Amazon S3 bucket name.

Output

Represents options that specify how and where in Amazon S3 DataBrew writes the output generated by recipe jobs or profile jobs.

See: the newOutput smart constructor.

Record fields:

- compressionFormat: The compression algorithm used to compress the output text of the job.
- format: The data format of the output of the job.
- formatOptions: Represents options that define how DataBrew formats job output files.
- maxOutputFiles: The maximum number of files to be generated by the job and written to the output folder.
For output partitioned by column(s), the MaxOutputFiles value is the maximum number of files per partition.
- overwrite: A value that, if true, means that any data in the location specified for output is overwritten with new output.
- partitionColumns: The names of one or more partition columns for the output of the job.
- location: The location in Amazon S3 where the job writes its output.

DatabaseTableOutputOptions

Represents options that specify how and where DataBrew writes the database output generated by recipe jobs.

See: the newDatabaseTableOutputOptions smart constructor.

Record fields:

- tempDirectory: Represents an Amazon S3 location (bucket name and object key) where DataBrew can store intermediate results.
- tableName: A prefix for the name of a table that DataBrew will create in the database.
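Putting the S3Location and Output docs together, a job output might be built like this. A sketch only: the constructor, lens, and enum names (newOutput, newS3Location, OutputFormat_CSV, etc.) assume amazonka's generated naming scheme, and the bucket and key are hypothetical.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DataBrew
import Control.Lens ((&), (?~))

-- A recipe-job output: required S3 location plus a few of the
-- optional fields described above.
exampleOutput :: Output
exampleOutput =
  newOutput
    ( newS3Location "my-output-bucket"       -- required bucket name
        & s3Location_key ?~ "results/run1"   -- optional object key
    )
    & output_format ?~ OutputFormat_CSV      -- data format of the output
    & output_overwrite ?~ True               -- replace any existing data
    & output_maxOutputFiles ?~ 1             -- cap files per (partition of) output
```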
DatabaseOutput

Represents a JDBC database output object which defines the output destination for a DataBrew recipe job to write into.

See: the newDatabaseOutput smart constructor.

Record fields:

- databaseOutputMode: The output mode to write into the database. The only currently supported option is NEW_TABLE.
- glueConnectionName: The Glue connection that stores the connection information for the target database.
- databaseOptions: Represents options that specify how and where DataBrew writes the database output generated by recipe jobs.

DatabaseInputDefinition

Connection information for dataset input files stored in a database.

See: the newDatabaseInputDefinition smart constructor.

Record fields:

- databaseTableName: The table within the target database.
- queryString: Custom SQL to run against the provided Glue connection. This SQL will be used as the input for DataBrew projects and jobs.
- tempDirectory: Undocumented member.
- glueConnectionName: The Glue connection that stores the connection information for the target database.

DataCatalogInputDefinition

Represents how metadata stored in the Glue Data Catalog is defined in a DataBrew dataset.

See: the newDataCatalogInputDefinition smart constructor.

Record fields:

- catalogId: The unique identifier of the Amazon Web Services account that holds the Data Catalog that stores the data.
- tempDirectory: Represents an Amazon S3 location where DataBrew can store intermediate results.
- databaseName: The name of a database in the Data Catalog.
- tableName: The name of a database table in the Data Catalog. This table corresponds to a DataBrew dataset.

Input

Represents information on how DataBrew can find data, in either the Glue Data Catalog or Amazon S3.

See: the newInput smart constructor.

Record fields:

- dataCatalogInputDefinition: The Glue Data Catalog parameters for the data.
- databaseInputDefinition: Connection information for dataset input files stored in a database.
- metadata: Contains additional resource information needed for specific datasets.
- s3InputDefinition: The Amazon S3 location where the data is stored.

S3TableOutputOptions

Represents options that specify how and where DataBrew writes the Amazon S3 output generated by recipe jobs.

See: the newS3TableOutputOptions smart constructor.

Record fields:

- location: Represents an Amazon S3 location (bucket name and object key) where DataBrew can write output from a job.
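An Input picks exactly one of the sources described above. A sketch under the assumption that the constructor and lens names follow amazonka's generated scheme and that newDataCatalogInputDefinition takes the required database and table names as arguments; the bucket, database, and table names are hypothetical.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DataBrew
import Control.Lens ((&), (?~))

-- Read dataset files directly from Amazon S3.
s3Input :: Input
s3Input =
  newInput
    & input_s3InputDefinition ?~ newS3Location "my-data-bucket"

-- Or resolve the data through the Glue Data Catalog instead.
catalogInput :: Input
catalogInput =
  newInput
    & input_dataCatalogInputDefinition
      ?~ newDataCatalogInputDefinition "my_db" "my_table"
```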
DataCatalogOutput

Represents options that specify how and where in the Glue Data Catalog DataBrew writes the output generated by recipe jobs.

See: the newDataCatalogOutput smart constructor.

Record fields:

- catalogId: The unique identifier of the Amazon Web Services account that holds the Data Catalog that stores the data.
- databaseOptions: Represents options that specify how and where DataBrew writes the database output generated by recipe jobs.
- overwrite: A value that, if true, means that any data in the location specified for output is overwritten with new output. Not supported with DatabaseOptions.
- s3Options: Represents options that specify how and where DataBrew writes the Amazon S3 output generated by recipe jobs.
- databaseName: The name of a database in the Data Catalog.
- tableName: The name of a table in the Data Catalog.

JobSample

A sample configuration for profile jobs only, which determines the number of rows on which the profile job is run. If a JobSample value isn't provided, the default is used: CUSTOM_ROWS for the mode parameter and 20,000 for the size parameter.

See: the newJobSample smart constructor.

Record fields:

- mode: A value that determines whether the profile job is run on the entire dataset or a specified number of rows (FULL_DATASET or CUSTOM_ROWS).

ValidationConfiguration

Configuration for data quality validation, used to select the rulesets and validation mode to be used in the profile job. When ValidationConfiguration is null, the profile job runs without data quality validation.

See: the newValidationConfiguration smart constructor.

Record fields:

- validationMode: Mode of data quality validation. The default mode is CHECK_ALL, which verifies all rules defined in the selected ruleset.
- rulesetArn: The Amazon Resource Name (ARN) for the ruleset to be validated in the profile job. The TargetArn of the selected ruleset should be the same as the ARN of the dataset that is associated with the profile job.

JobRun

Represents one run of a DataBrew job.

See: the newJobRun smart constructor.

Record fields:

- attempt: The number of times that DataBrew has attempted to run the job.
- completedOn: The date and time when the job completed processing.
- dataCatalogOutputs: One or more artifacts that represent the Glue Data Catalog output from running the job.
- databaseOutputs: Represents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write into.
- datasetName: The name of the dataset for the job to process.
- errorMessage: A message indicating an error (if any) that was encountered when the job ran.
- executionTime: The amount of time, in seconds, during which a job run consumed resources.
- jobName: The name of the job being processed during this run.
- jobSample: A sample configuration for profile jobs only, which determines the number of rows on which the profile job is run. If a JobSample value isn't provided, the default is used (CUSTOM_ROWS for the mode parameter and 20,000 for the size parameter).
- logGroupName: The name of an Amazon CloudWatch log group, where the job writes diagnostic messages when it runs.
- logSubscription: The current status of Amazon CloudWatch logging for the job run.
- outputs: One or more output artifacts from a job run.
- recipeReference: The set of steps processed by the job.
- runId: The unique identifier of the job run.
- startedBy: The Amazon Resource Name (ARN) of the user who initiated the job run.
- startedOn: The date and time when the job run began.
- state: The current state of the job run entity itself.
- validationConfigurations: List of validation configurations that are applied to the profile job run.

Job

Represents all of the attributes of a DataBrew job.

See: the newJob smart constructor.

TagResource

See: the newTagResource smart constructor.

Record fields:

- resourceArn: The DataBrew resource to which tags should be added. The value for this parameter is an Amazon Resource Name (ARN). For DataBrew, you can tag a dataset, a job, a project, or a recipe.
- tags: One or more tags to be assigned to the resource.
TagResource

Request fields. Create a value with all optional fields omitted; use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields. The following record fields are available, with the corresponding lenses provided for backwards compatibility:

- The DataBrew resource to which tags should be added. The value for this parameter is an Amazon Resource Name (ARN). For DataBrew, you can tag a dataset, a job, a project, or a recipe.
- One or more tags to be assigned to the resource.

Response fields:

- The response's http status code.
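The "create a value with all optional fields omitted" convention used throughout these operations can be illustrated with a plain-records sketch. `TagResourceLike`, its fields, and `newTagResourceLike` are hypothetical stand-ins; in the real amazonka-databrew API, generic-lens or optics play the role of the record update shown here.

```haskell
-- Hypothetical stand-in for an amazonka-style request record: required
-- fields are constructor arguments, optional ones start out omitted.
data TagResourceLike = TagResourceLike
  { resourceArn  :: String              -- required: ARN of the resource to tag
  , resourceTags :: [(String, String)]  -- required: tags to assign
  , dryRun       :: Maybe Bool          -- hypothetical optional field
  } deriving (Show, Eq)

-- Smart constructor: callers supply only the required fields; every
-- optional field is initialised to Nothing ("omitted").
newTagResourceLike :: String -> [(String, String)] -> TagResourceLike
newTagResourceLike arn ts =
  TagResourceLike { resourceArn = arn, resourceTags = ts, dryRun = Nothing }

main :: IO ()
main = do
  let req  = newTagResourceLike
               "arn:aws:databrew:us-east-1:111122223333:dataset/demo"
               [("env", "dev")]
      req' = req { dryRun = Just True }  -- adjust an optional field afterwards
  print (dryRun req)
  print (dryRun req')
```

The design keeps request construction total: forgetting an optional field can never fail to compile, while required fields cannot be omitted.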
StopJobRun

Request fields (create a value with all optional fields omitted; use generic-lens or optics to modify other optional fields):

- The name of the job to be stopped.
- The ID of the job run to be stopped.

Response fields:

- The response's http status code.
- The ID of the job run that you stopped.
StartProjectSession

Request fields:

- A value that, if true, enables you to take control of a session, even if a different client is currently accessing the project.
- The name of the project to act upon.

Response fields:

- A system-generated identifier for the session.
- The response's http status code.
- The name of the project to be acted upon.
StartJobRun

Request fields:

- The name of the job to be run.

Response fields:

- The response's http status code.
- A system-generated identifier for this particular job run.
SendProjectSessionAction

Request fields:

- A unique identifier for an interactive session that's currently open and ready for work. The action will be performed on this session.
- If true, the result of the recipe step will be returned, but not applied.
- An undocumented member.
- The index from which to preview a step. This index is used to preview the result of steps that have already been applied, so that the resulting view frame is from earlier in the view frame stack.
- An undocumented member.
- The name of the project to apply the action to.

Response fields:

- A unique identifier for the action that was performed.
- A message indicating the result of performing the action.
- The response's http status code.
- The name of the project that was affected by the action.

PublishRecipe

Request fields:

- A description of the recipe to be published, for this version of the recipe.
- The name of the recipe to be published.

Response fields:

- The response's http status code.
- The name of the recipe that you published.
ListTagsForResource

Request fields:

- The Amazon Resource Name (ARN) string that uniquely identifies the DataBrew resource.

Response fields:

- A list of tags associated with the DataBrew resource.
- The response's http status code.

ListSchedules

Request fields:

- The name of the job that these schedules apply to.
- The maximum number of results to return in this request.
- The token returned by a previous call to retrieve the next set of results.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of schedules that are defined.
ListRulesets

Request fields:

- The maximum number of results to return in this request.
- A token generated by DataBrew that specifies where to continue pagination if a previous request was truncated. To get the next set of pages, pass in the NextToken value from the response object of the previous page call.
- The Amazon Resource Name (ARN) of a resource (dataset). Using this parameter indicates to return only those rulesets that are associated with the specified resource.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of RulesetItem. RulesetItem contains metadata of a ruleset.

ListRecipes

Request fields:

- The maximum number of results to return in this request.
- The token returned by a previous call to retrieve the next set of results.
- Return only those recipes with a version identifier of LATEST_WORKING or LATEST_PUBLISHED. If RecipeVersion is omitted, ListRecipes returns all of the LATEST_PUBLISHED recipe versions. Valid values: LATEST_WORKING | LATEST_PUBLISHED.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of recipes that are defined.
ListRecipeVersions

Request fields:

- The maximum number of results to return in this request.
- The token returned by a previous call to retrieve the next set of results.
- The name of the recipe for which to return version information.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of versions for the specified recipe.
ListProjects

Request fields:

- The maximum number of results to return in this request.
- The token returned by a previous call to retrieve the next set of results.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of projects that are defined.
ListJobs

Request fields:

- The name of a dataset. Using this parameter indicates to return only those jobs that act on the specified dataset.
- The maximum number of results to return in this request.
- A token generated by DataBrew that specifies where to continue pagination if a previous request was truncated. To get the next set of pages, pass in the NextToken value from the response object of the previous page call.
- The name of a project. Using this parameter indicates to return only those jobs that are associated with the specified project.

Response fields:

- A token that you can use in a subsequent call to retrieve the next set of results.
- The response's http status code.
- A list of jobs that are defined.
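The NextToken contract shared by the List* operations (ListSchedules, ListRulesets, ListRecipes, ListJobs, and so on) follows the usual token-loop pattern: pass no token on the first call, then feed each returned NextToken back until none is returned. `fetchPage` below is a hypothetical in-memory stand-in for a service call, not an amazonka function.

```haskell
-- Hypothetical stand-in for a List* call: given a page size and an
-- optional continuation token, return one page of items plus the token
-- for the next page (Nothing when the listing is exhausted). Here the
-- "service" is just the integers 0..9.
fetchPage :: Int -> Maybe Int -> ([Int], Maybe Int)
fetchPage pageSize tok =
  let start = maybe 0 id tok
      items = take pageSize [start .. 9]
      next  = if start + pageSize <= 9 then Just (start + pageSize) else Nothing
  in (items, next)

-- Drain every page: start with no token, feed each returned NextToken
-- back into the next request until none is returned.
listAll :: Int -> [Int]
listAll pageSize = go Nothing
  where
    go tok =
      let (items, next) = fetchPage pageSize tok
      in items ++ maybe [] (go . Just) next

main :: IO ()
main = print (listAll 4)
```

With a page size of 4 the loop issues three calls (pages of 4, 4, and 2 items) and concatenates the results in order.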
DescribeRecipe

Request fields:

- The recipe version identifier. If this parameter isn't specified, then the latest published version is returned.
- The name of the recipe to be described.

Response fields:

- The date and time that the recipe was created.
- The identifier (user name) of the user who created the recipe.
- The description of the recipe.
- The identifier (user name) of the user who last modified the recipe.
- The date and time that the recipe was last modified.
- The name of the project associated with this recipe.
- The identifier (user name) of the user who last published the recipe.
- The date and time when the recipe was last published.
- The recipe version identifier.
- The ARN of the recipe.
- One or more steps to be performed by the recipe. Each step consists of an action, and the conditions under which the action should succeed.
- Metadata tags associated with this project.
- The response's http status code.
- The name of the recipe.

DescribeProject

Request fields:

- The name of the project to be described.

Response fields:

- The date and time that the project was created.
- The identifier (user name) of the user who created the project.
- The dataset associated with the project.
- The identifier (user name) of the user who last modified the project.
- The date and time that the project was last modified.
- The date and time when the project was opened.
- The identifier (user name) of the user that opened the project for use.
- The recipe associated with this job.
- The Amazon Resource Name (ARN) of the project.
- The ARN of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.
- An undocumented member.
- Describes the current state of the session: PROVISIONING (allocating resources for the session); INITIALIZING (getting the session ready for first use); ASSIGNED (the session is ready for use).
- Metadata tags associated with this project.
- The response's http status code.
- The name of the project.
- Metadata tags associated with this project., # - The response's http status code.,  - The name of the project.amazonka-databrew/The date and time that the project was created.amazonka-databrew?The identifier (user name) of the user who created the project.amazonka-databrew(The dataset associated with the project.amazonka-databrewThe identifier (user name) of the user who last modified the project.amazonka-databrew5The date and time that the project was last modified.amazonka-databrew.The date and time when the project was opened.amazonka-databrewThe identifier (user name) of the user that opened the project for use.amazonka-databrew$The recipe associated with this job.amazonka-databrew.The Amazon Resource Name (ARN) of the project.amazonka-databrewThe ARN of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewUndocumented member.amazonka-databrew+Describes the current state of the session: PROVISIONING( - allocating resources for the session. INITIALIZING+ - getting the session ready for first use.ASSIGNED - the session is ready for use.amazonka-databrew+Metadata tags associated with this project.amazonka-databrew The response's http status code.amazonka-databrewThe name of the project.amazonka-databrewamazonka-databrewamazonka-databrew&&V(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. 
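As a hedged sketch of how these request types are typically used (not taken from the package itself): with amazonka 2.x, a DescribeRecipe call might be built from the smart constructor and modified through the generated lenses. The names `newDescribeRecipe`, `describeRecipe_recipeVersion`, and `describeRecipeResponse_name` follow amazonka's usual generated naming convention and should be checked against the installed version.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Amazonka
import Amazonka.DataBrew.DescribeRecipe
import Control.Lens ((&), (?~), (^.))
import qualified Data.Text.IO as TIO

main :: IO ()
main = do
  -- Discover credentials from the environment (profile, instance role, ...).
  env <- Amazonka.newEnv Amazonka.discover
  let req =
        newDescribeRecipe "my-recipe"             -- required: recipe name (hypothetical value)
          & describeRecipe_recipeVersion ?~ "1.1" -- optional: omit to get the latest published version
  resp <- Amazonka.runResourceT (Amazonka.send env req)
  TIO.putStrLn (resp ^. describeRecipeResponse_name)
```

The same build-with-constructor, adjust-with-lenses pattern applies to every request type documented in this module set.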
Amazonka.DataBrew.DescribeJobRun

DescribeJobRun (request). See the smart constructor. Create a value with all optional fields omitted; use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields. Fields:

- Name - The name of the job being processed during this run.
- RunId - The unique identifier of the job run.

DescribeJobRunResponse (response). See the smart constructor. Fields:

- Attempt - The number of times that DataBrew has attempted to run the job.
- CompletedOn - The date and time when the job completed processing.
- DataCatalogOutputs - One or more artifacts that represent the Glue Data Catalog output from running the job.
- DatabaseOutputs - Represents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write into.
- DatasetName - The name of the dataset for the job to process.
- ErrorMessage - A message indicating an error (if any) that was encountered when the job ran.
- ExecutionTime - The amount of time, in seconds, during which the job run consumed resources.
- JobSample - Sample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed. If a JobSample value is not provided, the default value will be used: CUSTOM_ROWS for the mode parameter and 20000 for the size parameter.
- LogGroupName - The name of an Amazon CloudWatch log group, where the job writes diagnostic messages when it runs.
- LogSubscription - The current status of Amazon CloudWatch logging for the job run.
- Outputs - One or more output artifacts from a job run.
- ProfileConfiguration - Configuration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.
- RecipeReference - Undocumented member.
- RunId - The unique identifier of the job run.
- StartedBy - The Amazon Resource Name (ARN) of the user who started the job run.
- StartedOn - The date and time when the job run began.
- State - The current state of the job run entity itself.
- ValidationConfigurations - List of validation configurations that are applied to the profile job.
- HttpStatus - The response's http status code.
- JobName - The name of the job being processed during this run.

Amazonka.DataBrew.DescribeJob

DescribeJob (request). See the smart constructor. Fields:

- Name - The name of the job to be described.

DescribeJobResponse (response). See the smart constructor. Fields:

- CreateDate - The date and time that the job was created.
- CreatedBy - The identifier (user name) of the user associated with the creation of the job.
- DataCatalogOutputs - One or more artifacts that represent the Glue Data Catalog output from running the job.
- DatabaseOutputs - Represents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write into.
- DatasetName - The dataset that the job acts upon.
- EncryptionKeyArn - The Amazon Resource Name (ARN) of an encryption key that is used to protect the job.
- EncryptionMode - The encryption mode for the job, which can be one of the following: SSE-KMS - Server-side encryption with keys managed by KMS; SSE-S3 - Server-side encryption with keys managed by Amazon S3.
- JobSample - Sample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed.
- LastModifiedBy - The identifier (user name) of the user who last modified the job.
- LastModifiedDate - The date and time that the job was last modified.
- LogSubscription - Indicates whether Amazon CloudWatch logging is enabled for this job.
- MaxCapacity - The maximum number of compute nodes that DataBrew can consume when the job processes data.
- MaxRetries - The maximum number of times to retry the job after a job run fails.
- Outputs - One or more artifacts that represent the output from running the job.
- ProfileConfiguration - Configuration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.
- ProjectName - The DataBrew project associated with this job.
- RecipeReference - Undocumented member.
- ResourceArn - The Amazon Resource Name (ARN) of the job.
- RoleArn - The ARN of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.
- Tags - Metadata tags associated with this job.
- Timeout - The job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.
- Type - The job type, which must be one of the following: PROFILE - The job analyzes the dataset to determine its size, data types, data distribution, and more; RECIPE - The job applies one or more transformations to a dataset.
- ValidationConfigurations - List of validation configurations that are applied to the profile job.
- HttpStatus - The response's http status code.
- Name - The name of the job.
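A hedged sketch of a DescribeJobRun call: both constructor arguments are required, and the response's State field reports where the run currently is. The names `newDescribeJobRun` and `describeJobRunResponse_state` are assumptions following amazonka's generated naming convention, and the argument order (job name, then run ID) should be verified against the generated signature.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Amazonka
import Amazonka.DataBrew.DescribeJobRun
import Control.Lens ((^.))

main :: IO ()
main = do
  env <- Amazonka.newEnv Amazonka.discover
  -- Job name and run identifier are both required; these values are hypothetical.
  resp <- Amazonka.runResourceT
            (Amazonka.send env (newDescribeJobRun "my-profile-job" "run-id-123"))
  -- State is optional in the response, hence the Maybe in what gets printed.
  print (resp ^. describeJobRunResponse_state)
```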
Amazonka.DataBrew.DescribeDataset

DescribeDataset (request). See the smart constructor. Create a value with all optional fields omitted; use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields. Fields:

- Name - The name of the dataset to be described.

DescribeDatasetResponse (response). See the smart constructor. Fields:

- CreateDate - The date and time that the dataset was created.
- CreatedBy - The identifier (user name) of the user who created the dataset.
- Format - The file format of a dataset that is created from an Amazon S3 file or folder.
- FormatOptions - Undocumented member.
- LastModifiedBy - The identifier (user name) of the user who last modified the dataset.
- LastModifiedDate - The date and time that the dataset was last modified.
- PathOptions - A set of options that defines how DataBrew interprets an Amazon S3 path of the dataset.
- ResourceArn - The Amazon Resource Name (ARN) of the dataset.
- Input - The location of the data for this dataset, Amazon S3 or the Glue Data Catalog.
- Tags - Metadata tags associated with this dataset.
- HttpStatus - The response's http status code.
- Name - The name of the dataset.
- Source - Undocumented member.
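The documentation repeatedly points at generic-lens and optics as the supported way to reach these record fields. A minimal generic-lens sketch, assuming the generated record exposes a field named "name" (an assumption about amazonka's generated types):

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TypeApplications #-}

import Control.Lens ((^.))
import Data.Generics.Product (field)
import Amazonka.DataBrew.DescribeDataset (newDescribeDataset)

main :: IO ()
main =
  -- Build the request from the smart constructor, then read a field back
  -- through generic-lens instead of the generated backwards-compat lens.
  let req = newDescribeDataset "sales-data"  -- hypothetical dataset name
  in print (req ^. field @"name")
```

The `field @"..."` accessor works for any record field the type generically exposes, which is why the package documents fields rather than a fixed lens API.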
Amazonka.DataBrew.DeleteSchedule

DeleteSchedule (request). See the smart constructor. Fields:

- Name - The name of the schedule to be deleted.

DeleteScheduleResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the schedule that was deleted.

Amazonka.DataBrew.DeleteRuleset

DeleteRuleset (request). See the smart constructor. Fields:

- Name - The name of the ruleset to be deleted.

DeleteRulesetResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the deleted ruleset.

Amazonka.DataBrew.DeleteRecipeVersion

DeleteRecipeVersion (request). See the smart constructor. Fields:

- Name - The name of the recipe.
- RecipeVersion - The version of the recipe to be deleted. You can specify a numeric version (X.Y) or LATEST_WORKING. LATEST_PUBLISHED is not supported.

DeleteRecipeVersionResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the recipe that was deleted.
- RecipeVersion - The version of the recipe that was deleted.

Amazonka.DataBrew.DeleteProject

DeleteProject (request). See the smart constructor. Fields:

- Name - The name of the project to be deleted.

DeleteProjectResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the project that you deleted.

Amazonka.DataBrew.DeleteJob

DeleteJob (request). See the smart constructor. Fields:

- Name - The name of the job to be deleted.

DeleteJobResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the job that you deleted.
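A hedged sketch of the DeleteRecipeVersion constraint described above: only a numeric version ("X.Y") or "LATEST_WORKING" is accepted, never "LATEST_PUBLISHED". The names `newDeleteRecipeVersion` and `deleteRecipeVersionResponse_recipeVersion` are assumptions following amazonka's generated naming convention.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Amazonka
import Amazonka.DataBrew.DeleteRecipeVersion
import Control.Lens ((^.))

main :: IO ()
main = do
  env <- Amazonka.newEnv Amazonka.discover
  -- Delete the working (unpublished) version of a hypothetical recipe.
  -- Passing "LATEST_PUBLISHED" here would be rejected by the service.
  resp <- Amazonka.runResourceT
            (Amazonka.send env (newDeleteRecipeVersion "my-recipe" "LATEST_WORKING"))
  print (resp ^. deleteRecipeVersionResponse_recipeVersion)
```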
Amazonka.DataBrew.DeleteDataset

DeleteDataset (request). See the smart constructor. Fields:

- Name - The name of the dataset to be deleted.

DeleteDatasetResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the dataset that you deleted.

Amazonka.DataBrew.CreateSchedule

CreateSchedule (request). See the smart constructor. Create a value with all optional fields omitted; use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields. Fields:

- JobNames - The name or names of one or more jobs to be run.
- Tags - Metadata tags to apply to this schedule.
- CronExpression - The date or dates and time or times when the jobs are to be run. For more information, see Cron expressions (https://docs.aws.amazon.com/databrew/latest/dg/jobs.cron.html) in the Glue DataBrew Developer Guide.
- Name - A unique name for the schedule. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.

CreateScheduleResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The name of the schedule that was created.

Amazonka.DataBrew.CreateRuleset

CreateRuleset (request). See the smart constructor. Fields:

- Description - The description of the ruleset.
- Tags - Metadata tags to apply to the ruleset.
- Name - The name of the ruleset to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.
- TargetArn - The Amazon Resource Name (ARN) of a resource (dataset) that the ruleset is associated with.
- Rules - A list of rules that are defined with the ruleset. A rule includes one or more checks to be validated on a DataBrew dataset.

CreateRulesetResponse (response). See the smart constructor. Fields:

- HttpStatus - The response's http status code.
- Name - The unique name of the created ruleset.
A rule includes one or more checks to be validated on a DataBrew dataset.amazonka-databrewThe description of the ruleset.amazonka-databrew&Metadata tags to apply to the ruleset.amazonka-databrewThe name of the ruleset to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewThe Amazon Resource Name (ARN) of a resource (dataset) that the ruleset is associated with.amazonka-databrewA list of rules that are defined with the ruleset. A rule includes one or more checks to be validated on a DataBrew dataset.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., * - The unique name of the created ruleset.amazonka-databrew The response's http status code.amazonka-databrew'The unique name of the created ruleset.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewa(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. 
Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';A%amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew%The name of the job that you created.amazonka-databrewSee:  smart constructor.amazonka-databrewOne or more artifacts that represent the Glue Data Catalog output from running the job.amazonka-databrewRepresents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write to.amazonka-databrew0The name of the dataset that this job processes.amazonka-databrewThe Amazon Resource Name (ARN) of an encryption key that is used to protect the job.amazonka-databrewThe encryption mode for the job, which can be one of the following:SSE-KMS3 - Server-side encryption with keys managed by KMS.SSE-S39 - Server-side encryption with keys managed by Amazon S3.amazonka-databrewEnables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.amazonka-databrewThe maximum number of nodes that DataBrew can consume when the job processes data.amazonka-databrewThe maximum number of times to retry the job after a job run fails.amazonka-databrewOne or more artifacts that represent the output from running the job.amazonka-databrewEither the name of an existing project, or a combination of a recipe and a dataset to associate with the recipe.amazonka-databrew#Metadata tags to apply to this job.amazonka-databrewThe job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.amazonka-databrewA unique name for the job. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - One or more artifacts that represent the Glue Data Catalog output from running the job.,  - Represents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write to., 3 - The name of the dataset that this job processes.,  - The Amazon Resource Name (ARN) of an encryption key that is used to protect the job.,  - The encryption mode for the job, which can be one of the following:SSE-KMS3 - Server-side encryption with keys managed by KMS.SSE-S39 - Server-side encryption with keys managed by Amazon S3.,  - Enables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.,  - The maximum number of nodes that DataBrew can consume when the job processes data.,  - The maximum number of times to retry the job after a job run fails.,  - One or more artifacts that represent the output from running the job.,  - Either the name of an existing project, or a combination of a recipe and a dataset to associate with the recipe.,  - Undocumented member., & - Metadata tags to apply to this job.,  - The job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.,  - A unique name for the job. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.,  - The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewOne or more artifacts that represent the Glue Data Catalog output from running the job.amazonka-databrewRepresents a list of JDBC database output objects which defines the output destination for a DataBrew recipe job to write to.amazonka-databrew0The name of the dataset that this job processes.amazonka-databrewThe Amazon Resource Name (ARN) of an encryption key that is used to protect the job.amazonka-databrewThe encryption mode for the job, which can be one of the following:SSE-KMS3 - Server-side encryption with keys managed by KMS.SSE-S39 - Server-side encryption with keys managed by Amazon S3.amazonka-databrewEnables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.amazonka-databrewThe maximum number of nodes that DataBrew can consume when the job processes data.amazonka-databrewThe maximum number of times to retry the job after a job run fails.amazonka-databrewOne or more artifacts that represent the output from running the job.amazonka-databrewEither the name of an existing project, or a combination of a recipe and a dataset to associate with the recipe.amazonka-databrewUndocumented member.amazonka-databrew#Metadata tags to apply to this job.amazonka-databrewThe job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.amazonka-databrewA unique name for the job. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., ( - The name of the job that you created.amazonka-databrew The response's http status code.amazonka-databrew%The name of the job that you created.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrew((b(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew(The name of the recipe that you created.amazonka-databrewSee:  smart constructor.amazonka-databrewA description for the recipe.amazonka-databrew&Metadata tags to apply to this recipe.amazonka-databrewA unique name for the recipe. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewAn array containing the steps to be performed by the recipe. Each recipe step consists of one recipe action and (optionally) an array of condition expressions.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - A description for the recipe., ) - Metadata tags to apply to this recipe.,  - A unique name for the recipe. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.,  - An array containing the steps to be performed by the recipe. Each recipe step consists of one recipe action and (optionally) an array of condition expressions.amazonka-databrewA description for the recipe.amazonka-databrew&Metadata tags to apply to this recipe.amazonka-databrewA unique name for the recipe. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewAn array containing the steps to be performed by the recipe. Each recipe step consists of one recipe action and (optionally) an array of condition expressions.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., + - The name of the recipe that you created.amazonka-databrew The response's http status code.amazonka-databrew(The name of the recipe that you created.amazonka-databrewamazonka-databrewamazonka-databrewc(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew)The name of the project that you created.amazonka-databrewSee:  smart constructor.amazonka-databrew'Metadata tags to apply to this project.amazonka-databrew?The name of an existing dataset to associate this project with.amazonka-databrewA unique name for the new project. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrew=The name of an existing recipe to associate with the project.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed for this request.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - Undocumented member., * - Metadata tags to apply to this project.,  - The name of an existing dataset to associate this project with.,  - A unique name for the new project. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.,  - The name of an existing recipe to associate with the project.,  - The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed for this request.amazonka-databrewUndocumented member.amazonka-databrew'Metadata tags to apply to this project.amazonka-databrew?The name of an existing dataset to associate this project with.amazonka-databrewA unique name for the new project. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrew=The name of an existing recipe to associate with the project.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed for this request.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! 
to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., , - The name of the project that you created.amazonka-databrew The response's http status code.amazonka-databrew)The name of the project that you created.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewd(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';#amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew%The name of the job that was created.amazonka-databrewSee:  smart constructor.amazonka-databrewConfiguration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.amazonka-databrewThe Amazon Resource Name (ARN) of an encryption key that is used to protect the job.amazonka-databrewThe encryption mode for the job, which can be one of the following:SSE-KMS - SSE-KMS5 - Server-side encryption with KMS-managed keys.SSE-S39 - Server-side encryption with keys managed by Amazon S3.amazonka-databrewSample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed. If a JobSample value is not provided, the default value will be used. The default value is CUSTOM_ROWS for the mode parameter and 20000 for the size parameter.amazonka-databrewEnables or disables Amazon CloudWatch logging for the job. 
If logging is enabled, CloudWatch writes one log stream for each job run.amazonka-databrewThe maximum number of nodes that DataBrew can use when the job processes data.amazonka-databrewThe maximum number of times to retry the job after a job run fails.amazonka-databrew#Metadata tags to apply to this job.amazonka-databrewThe job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.amazonka-databrewList of validation configurations that are applied to the profile job.amazonka-databrew5The name of the dataset that this job is to act upon.amazonka-databrewThe name of the job to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - Configuration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.,  - The Amazon Resource Name (ARN) of an encryption key that is used to protect the job.,  - The encryption mode for the job, which can be one of the following:SSE-KMS - SSE-KMS5 - Server-side encryption with KMS-managed keys.SSE-S39 - Server-side encryption with keys managed by Amazon S3.,  - Sample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed. If a JobSample value is not provided, the default value will be used. 
The default value is CUSTOM_ROWS for the mode parameter and 20000 for the size parameter.,  - Enables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.,  - The maximum number of nodes that DataBrew can use when the job processes data.,  - The maximum number of times to retry the job after a job run fails., & - Metadata tags to apply to this job.,  - The job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.,  - List of validation configurations that are applied to the profile job., 8 - The name of the dataset that this job is to act upon.,  - The name of the job to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.,  - Undocumented member.,  - The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewConfiguration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.amazonka-databrewThe Amazon Resource Name (ARN) of an encryption key that is used to protect the job.amazonka-databrewThe encryption mode for the job, which can be one of the following:SSE-KMS - SSE-KMS5 - Server-side encryption with KMS-managed keys.SSE-S39 - Server-side encryption with keys managed by Amazon S3.amazonka-databrewSample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed. If a JobSample value is not provided, the default value will be used. The default value is CUSTOM_ROWS for the mode parameter and 20000 for the size parameter.amazonka-databrewEnables or disables Amazon CloudWatch logging for the job. 
If logging is enabled, CloudWatch writes one log stream for each job run.amazonka-databrewThe maximum number of nodes that DataBrew can use when the job processes data.amazonka-databrewThe maximum number of times to retry the job after a job run fails.amazonka-databrew#Metadata tags to apply to this job.amazonka-databrewThe job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.amazonka-databrewList of validation configurations that are applied to the profile job.amazonka-databrew5The name of the dataset that this job is to act upon.amazonka-databrewThe name of the job to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewUndocumented member.amazonka-databrewThe Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., ( - The name of the job that was created.amazonka-databrew The response's http status code.amazonka-databrew%The name of the job that was created.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrew&&e(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. 
Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew)The name of the dataset that you created.amazonka-databrewSee:  smart constructor.amazonka-databrewThe file format of a dataset that is created from an Amazon S3 file or folder.amazonka-databrewA set of options that defines how DataBrew interprets an Amazon S3 path of the dataset.amazonka-databrew'Metadata tags to apply to this dataset.amazonka-databrewThe name of the dataset to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - The file format of a dataset that is created from an Amazon S3 file or folder.,  - Undocumented member.,  - A set of options that defines how DataBrew interprets an Amazon S3 path of the dataset., * - Metadata tags to apply to this dataset.,  - The name of the dataset to be created. Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.,  - Undocumented member.amazonka-databrewThe file format of a dataset that is created from an Amazon S3 file or folder.amazonka-databrewUndocumented member.amazonka-databrewA set of options that defines how DataBrew interprets an Amazon S3 path of the dataset.amazonka-databrew'Metadata tags to apply to this dataset.amazonka-databrewThe name of the dataset to be created. 
Valid characters are alphanumeric (A-Z, a-z, 0-9), hyphen (-), period (.), and space.amazonka-databrewUndocumented member.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code., , - The name of the dataset that you created.amazonka-databrew The response's http status code.amazonka-databrew)The name of the dataset that you created.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewf(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';amazonka-databrewSee:  smart constructor.amazonka-databrewErrors, if any, that occurred while attempting to delete the recipe versions.amazonka-databrew The response's http status code.amazonka-databrew)The name of the recipe that was modified.amazonka-databrewSee:  smart constructor.amazonka-databrew8The name of the recipe whose versions are to be deleted.amazonka-databrewAn array of version identifiers, for the recipe versions to be deleted. You can specify numeric versions (X.Y) or LATEST_WORKING. LATEST_PUBLISHED is not supported.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, ; - The name of the recipe whose versions are to be deleted.,  - An array of version identifiers, for the recipe versions to be deleted. You can specify numeric versions (X.Y) or LATEST_WORKING. 
LATEST_PUBLISHED is not supported.amazonka-databrew8The name of the recipe whose versions are to be deleted.amazonka-databrewAn array of version identifiers, for the recipe versions to be deleted. You can specify numeric versions (X.Y) or LATEST_WORKING. LATEST_PUBLISHED is not supported.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - Errors, if any, that occurred while attempting to delete the recipe versions., # - The response's http status code., , - The name of the recipe that was modified.amazonka-databrewErrors, if any, that occurred while attempting to delete the recipe versions.amazonka-databrew The response's http status code.amazonka-databrew)The name of the recipe that was modified.amazonka-databrewamazonka-databrewamazonka-databrewamazonka-databrewg(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&'; amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrewSee:  smart constructor.amazonka-databrewA DataBrew resource from which you want to remove a tag or tags. The value for this parameter is an Amazon Resource Name (ARN).amazonka-databrew7The tag keys (names) of one or more tags to be removed.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:,  - A DataBrew resource from which you want to remove a tag or tags. 
The value for this parameter is an Amazon Resource Name (ARN)., : - The tag keys (names) of one or more tags to be removed.amazonka-databrewA DataBrew resource from which you want to remove a tag or tags. The value for this parameter is an Amazon Resource Name (ARN).amazonka-databrew7The tag keys (names) of one or more tags to be removed.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! to modify other optional fields.The following record fields are available, with the corresponding lenses provided for backwards compatibility:, # - The response's http status code.amazonka-databrew The response's http status code.amazonka-databrewamazonka-databrewamazonka-databrew  h(c) 2013-2023 Brendan HayMozilla Public License, v. 2.0. Brendan Hayauto-generatednon-portable (GHC extensions) Safe-Inferred "%&';amazonka-databrewSee:  smart constructor.amazonka-databrew The response's http status code.amazonka-databrew)The name of the dataset that you updated.amazonka-databrewSee:  smart constructor.amazonka-databrewThe file format of a dataset that is created from an Amazon S3 file or folder.amazonka-databrewA set of options that defines how DataBrew interprets an Amazon S3 path of the dataset.amazonka-databrew&The name of the dataset to be updated.amazonka-databrewCreate a value of " with all optional fields omitted.Use  0https://hackage.haskell.org/package/generic-lens generic-lens or  *https://hackage.haskell.org/package/opticsoptics! 
to modify other optional fields.

The following record fields are available, with the corresponding lenses provided for backwards compatibility:

- The file format of a dataset that is created from an Amazon S3 file or folder.
- Undocumented member.
- A set of options that defines how DataBrew interprets an Amazon S3 path of the dataset.
- The name of the dataset to be updated.
- Undocumented member.

UpdateDatasetResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the dataset that you updated.

(c) 2013-2023 Brendan Hay; Mozilla Public License, v. 2.0.; maintainer Brendan Hay; auto-generated; non-portable (GHC extensions); Safe-Inferred

Amazonka.DataBrew.UpdateProfileJob

UpdateProfileJobResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the job that was updated.

UpdateProfileJob record fields, with the corresponding lenses provided for backwards compatibility:

- Configuration for profile jobs. Used to select columns, do evaluations, and override default parameters of evaluations. When configuration is null, the profile job will run with default settings.
- The Amazon Resource Name (ARN) of an encryption key that is used to protect the job.
- The encryption mode for the job, which can be one of the following:
  - SSE-KMS - server-side encryption with keys managed by KMS.
  - SSE-S3 - server-side encryption with keys managed by Amazon S3.
- Sample configuration for profile jobs only. Determines the number of rows on which the profile job will be executed. If a JobSample value is not provided for profile jobs, the default value will be used. The default value is CUSTOM_ROWS for the mode parameter and 20000 for the size parameter.
- Enables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.
- The maximum number of compute nodes that DataBrew can use when the job processes data.
- The maximum number of times to retry the job after a job run fails.
- The job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.
- List of validation configurations that are applied to the profile job.
- The name of the job to be updated.
- Undocumented member.
- The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.

Create a value of UpdateProfileJob with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.
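Putting the pieces together, an update to a profile job can be sketched as below. This is a sketch only: it assumes the amazonka and amazonka-databrew packages with their 2.0 naming conventions (a `newUpdateProfileJob` smart constructor taking the required name, output location, and role ARN, and generated `updateProfileJob_*` lenses for the optional fields); argument order and lens names should be checked against the package's generated Haddocks.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Sketch: assumes amazonka >= 2.0 and amazonka-databrew, and that
-- newUpdateProfileJob takes name, output location, and role ARN in that order.
import qualified Amazonka
import qualified Amazonka.DataBrew as DataBrew
import Control.Lens ((&), (?~))

main :: IO ()
main = do
  env <- Amazonka.newEnv Amazonka.discover
  let req =
        DataBrew.newUpdateProfileJob
          "my-profile-job"                                -- name of the job to update
          (DataBrew.newS3Location "my-output-bucket")     -- where profile results are written
          "arn:aws:iam::123456789012:role/DataBrewRole"   -- IAM role assumed by DataBrew
          -- Optional fields, set through the generated lenses:
          & DataBrew.updateProfileJob_maxRetries ?~ 2
          & DataBrew.updateProfileJob_timeout ?~ 60
  resp <- Amazonka.runResourceT (Amazonka.send env req)
  print resp
```

The job name, bucket, and role ARN are placeholders; only the two lens-set fields are sent beyond the required ones, since the smart constructor omits all optional fields.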
Amazonka.DataBrew.UpdateProject

UpdateProjectResponse record fields, with the corresponding lenses:

- The date and time that the project was last modified.
- The response's http status code.
- The name of the project that you updated.

UpdateProject record fields, with the corresponding lenses provided for backwards compatibility:

- Undocumented member.
- The Amazon Resource Name (ARN) of the IAM role to be assumed for this request.
- The name of the project to be updated.

Create a value of UpdateProject with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.
Amazonka.DataBrew.UpdateRecipe

UpdateRecipeResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the recipe that was updated.

UpdateRecipe record fields, with the corresponding lenses provided for backwards compatibility:

- A description of the recipe.
- One or more steps to be performed by the recipe. Each step consists of an action, and the conditions under which the action should succeed.
- The name of the recipe to be updated.

Create a value of UpdateRecipe with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.
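An UpdateRecipe request, including a recipe step built from an action, can be sketched as below. This is a sketch under assumptions: the generated `newRecipeAction`, `newRecipeStep`, and `newUpdateRecipe` smart constructors follow the amazonka 2.0 conventions, and the "RENAME" operation name and its parameter keys are illustrative placeholders, not verified DataBrew operation names.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Sketch: assumes amazonka >= 2.0 and amazonka-databrew; the operation name
-- "RENAME" and its parameter keys are illustrative only.
import qualified Amazonka
import qualified Amazonka.DataBrew as DataBrew
import Control.Lens ((&), (?~))
import qualified Data.HashMap.Strict as HashMap

main :: IO ()
main = do
  env <- Amazonka.newEnv Amazonka.discover
  let -- Each step consists of an action (an operation plus its parameters)
      -- and, optionally, conditions under which the action should succeed.
      action =
        DataBrew.newRecipeAction "RENAME"
          & DataBrew.recipeAction_parameters
              ?~ HashMap.fromList [("sourceColumn", "old"), ("targetColumn", "new")]
      step = DataBrew.newRecipeStep action
      req =
        DataBrew.newUpdateRecipe "my-recipe"   -- name of the recipe to update
          & DataBrew.updateRecipe_description ?~ "Renames a column"
          & DataBrew.updateRecipe_steps ?~ [step]
  resp <- Amazonka.runResourceT (Amazonka.send env req)
  print resp
```

Because description and steps are optional, the smart constructor takes only the recipe name; both extras are layered on with lenses.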
Amazonka.DataBrew.UpdateRecipeJob

UpdateRecipeJobResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the job that you updated.

UpdateRecipeJob record fields, with the corresponding lenses provided for backwards compatibility:

- One or more artifacts that represent the Glue Data Catalog output from running the job.
- Represents a list of JDBC database output objects that define the output destination for a DataBrew recipe job to write into.
- The Amazon Resource Name (ARN) of an encryption key that is used to protect the job.
- The encryption mode for the job, which can be one of the following:
  - SSE-KMS - server-side encryption with keys managed by KMS.
  - SSE-S3 - server-side encryption with keys managed by Amazon S3.
- Enables or disables Amazon CloudWatch logging for the job. If logging is enabled, CloudWatch writes one log stream for each job run.
- The maximum number of nodes that DataBrew can consume when the job processes data.
- The maximum number of times to retry the job after a job run fails.
- One or more artifacts that represent the output from running the job.
- The job's timeout in minutes. A job that attempts to run longer than this timeout period ends with a status of TIMEOUT.
- The name of the job to update.
- The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role to be assumed when DataBrew runs the job.

Create a value of UpdateRecipeJob with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.

Amazonka.DataBrew.UpdateRuleset

UpdateRulesetResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the updated ruleset.

UpdateRuleset record fields, with the corresponding lenses provided for backwards compatibility:

- The description of the ruleset.
- The name of the ruleset to be updated.
- A list of rules that are defined with the ruleset.
A rule includes one or more checks to be validated on a DataBrew dataset.

Create a value of UpdateRuleset with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.

Amazonka.DataBrew.UpdateSchedule

UpdateScheduleResponse record fields, with the corresponding lenses:

- The response's http status code.
- The name of the schedule that was updated.

UpdateSchedule record fields, with the corresponding lenses provided for backwards compatibility:

- The name or names of one or more jobs to be run for this schedule.
- The date or dates and time or times when the jobs are to be run. For more information, see Cron expressions (https://docs.aws.amazon.com/databrew/latest/dg/jobs.cron.html) in the Glue DataBrew Developer Guide.
- The name of the schedule to update.

Create a value of UpdateSchedule with all optional fields omitted. Use generic-lens (https://hackage.haskell.org/package/generic-lens) or optics (https://hackage.haskell.org/package/optics) to modify other optional fields.
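An UpdateSchedule request can be sketched as below. This is a sketch under assumptions: `newUpdateSchedule` is taken to accept the required cron expression and schedule name in that order (the argument order should be confirmed against the generated Haddocks), and the job name is a placeholder.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Sketch: assumes amazonka >= 2.0 and amazonka-databrew, and that
-- newUpdateSchedule takes the cron expression and the schedule name in order.
import qualified Amazonka
import qualified Amazonka.DataBrew as DataBrew
import Control.Lens ((&), (?~))

main :: IO ()
main = do
  env <- Amazonka.newEnv Amazonka.discover
  let req =
        DataBrew.newUpdateSchedule
          "cron(0 12 * * ? *)"   -- run the associated jobs at 12:00 UTC every day
          "my-schedule"          -- name of the schedule to update
          -- jobNames is optional, so it is set through its lens:
          & DataBrew.updateSchedule_jobNames ?~ ["my-profile-job"]
  resp <- Amazonka.runResourceT (Amazonka.send env req)
  print resp
```

The cron syntax follows the Glue DataBrew Developer Guide's cron expressions page linked above.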
Modules provided by amazonka-databrew-2.0-GJMXoRzRb387Yq6XNY29RM:

Type modules (under Amazonka.DataBrew.Types): AllowedStatistics, AnalyticsMode, ColumnSelector, CompressionFormat, ConditionExpression, CsvOptions, CsvOutputOptions, DatabaseOutputMode, DatetimeOptions, EncryptionMode, EntityDetectorConfiguration, ExcelOptions, FilterExpression, InputFormat, JobRunState, JobType, JsonOptions, FormatOptions, LogSubscription, Metadata, Order, OrderedBy, FilesLimit, OutputFormat, OutputFormatOptions, ParameterType, DatasetParameter, PathOptions, RecipeAction, RecipeReference, RecipeStep, Recipe, RecipeVersionErrorDetail, RulesetItem, S3Location, Output, DatabaseTableOutputOptions, DatabaseOutput, DatabaseInputDefinition, DataCatalogInputDefinition, Input, S3TableOutputOptions, DataCatalogOutput, SampleMode, JobSample, SampleType, Sample, Project, Schedule, SessionStatus, Source, Dataset, StatisticOverride, StatisticsConfiguration, ColumnStatisticsConfiguration, ProfileConfiguration, ThresholdType, ThresholdUnit, Threshold, Rule, ValidationMode, ValidationConfiguration, JobRun, Job, and ViewFrame, collected in Amazonka.DataBrew.Types.

Operation modules (under Amazonka.DataBrew): TagResource, StopJobRun, StartProjectSession, StartJobRun, SendProjectSessionAction, PublishRecipe, ListTagsForResource, ListSchedules, ListRulesets, ListRecipes, ListRecipeVersions, ListProjects, ListJobs, ListJobRuns, ListDatasets, DescribeSchedule, DescribeRuleset, DescribeRecipe, DescribeProject, DescribeJobRun, DescribeJob, DescribeDataset, DeleteSchedule, DeleteRuleset, DeleteRecipeVersion, DeleteProject, DeleteJob, DeleteDataset, CreateSchedule, CreateRuleset, CreateRecipeJob, CreateRecipe, CreateProject, CreateProfileJob, CreateDataset, BatchDeleteRecipeVersion, UntagResource, UpdateDataset, UpdateProfileJob, UpdateProject, UpdateRecipe, UpdateRecipeJob, UpdateRuleset, and UpdateSchedule, plus Amazonka.DataBrew.Lens, Amazonka.DataBrew.Waiters, and the top-level Amazonka.DataBrew.

Each record type T exports the type and its constructor T', record selectors of the form $sel:field:T', a smart constructor newT, field lenses of the form t_field, and derived instances (Eq, Read, Show, Generic, Hashable, NFData, and the JSON and wire-format classes). Enumeration types (for example CompressionFormat, JobRunState, EncryptionMode, InputFormat, OutputFormat, ParameterType) additionally export their patterns, such as CompressionFormat_GZIP, JobRunState_SUCCEEDED, and EncryptionMode_SSE_KMS.
RecipeAction'$sel:parameters:RecipeAction'$sel:operation:RecipeAction'newRecipeActionrecipeAction_parametersrecipeAction_operation$fToJSONRecipeAction$fNFDataRecipeAction$fHashableRecipeAction$fFromJSONRecipeAction$fEqRecipeAction$fReadRecipeAction$fShowRecipeAction$fGenericRecipeActionRecipeReferenceRecipeReference'#$sel:recipeVersion:RecipeReference'$sel:name:RecipeReference'newRecipeReferencerecipeReference_recipeVersionrecipeReference_name$fToJSONRecipeReference$fNFDataRecipeReference$fHashableRecipeReference$fFromJSONRecipeReference$fEqRecipeReference$fReadRecipeReference$fShowRecipeReference$fGenericRecipeReference RecipeStep RecipeStep'%$sel:conditionExpressions:RecipeStep'$sel:action:RecipeStep' newRecipeSteprecipeStep_conditionExpressionsrecipeStep_action$fToJSONRecipeStep$fNFDataRecipeStep$fHashableRecipeStep$fFromJSONRecipeStep$fEqRecipeStep$fReadRecipeStep$fShowRecipeStep$fGenericRecipeStepRecipeRecipe'$sel:createDate:Recipe'$sel:createdBy:Recipe'$sel:description:Recipe'$sel:lastModifiedBy:Recipe'$sel:lastModifiedDate:Recipe'$sel:projectName:Recipe'$sel:publishedBy:Recipe'$sel:publishedDate:Recipe'$sel:recipeVersion:Recipe'$sel:resourceArn:Recipe'$sel:steps:Recipe'$sel:tags:Recipe'$sel:name:Recipe' newReciperecipe_createDaterecipe_createdByrecipe_descriptionrecipe_lastModifiedByrecipe_lastModifiedDaterecipe_projectNamerecipe_publishedByrecipe_publishedDaterecipe_recipeVersionrecipe_resourceArn recipe_steps recipe_tags recipe_name$fNFDataRecipe$fHashableRecipe$fFromJSONRecipe $fEqRecipe $fReadRecipe $fShowRecipe$fGenericRecipeRecipeVersionErrorDetailRecipeVersionErrorDetail'($sel:errorCode:RecipeVersionErrorDetail'+$sel:errorMessage:RecipeVersionErrorDetail',$sel:recipeVersion:RecipeVersionErrorDetail'newRecipeVersionErrorDetail"recipeVersionErrorDetail_errorCode%recipeVersionErrorDetail_errorMessage&recipeVersionErrorDetail_recipeVersion 
$fNFDataRecipeVersionErrorDetail"$fHashableRecipeVersionErrorDetail"$fFromJSONRecipeVersionErrorDetail$fEqRecipeVersionErrorDetail$fReadRecipeVersionErrorDetail$fShowRecipeVersionErrorDetail!$fGenericRecipeVersionErrorDetail RulesetItem RulesetItem'$sel:accountId:RulesetItem'$sel:createDate:RulesetItem'$sel:createdBy:RulesetItem'$sel:description:RulesetItem' $sel:lastModifiedBy:RulesetItem'"$sel:lastModifiedDate:RulesetItem'$sel:resourceArn:RulesetItem'$sel:ruleCount:RulesetItem'$sel:tags:RulesetItem'$sel:name:RulesetItem'$sel:targetArn:RulesetItem'newRulesetItemrulesetItem_accountIdrulesetItem_createDaterulesetItem_createdByrulesetItem_descriptionrulesetItem_lastModifiedByrulesetItem_lastModifiedDaterulesetItem_resourceArnrulesetItem_ruleCountrulesetItem_tagsrulesetItem_namerulesetItem_targetArn$fNFDataRulesetItem$fHashableRulesetItem$fFromJSONRulesetItem$fEqRulesetItem$fReadRulesetItem$fShowRulesetItem$fGenericRulesetItem S3Location S3Location'$sel:bucketOwner:S3Location'$sel:key:S3Location'$sel:bucket:S3Location' newS3Locations3Location_bucketOwners3Location_keys3Location_bucket$fToJSONS3Location$fNFDataS3Location$fHashableS3Location$fFromJSONS3Location$fEqS3Location$fReadS3Location$fShowS3Location$fGenericS3LocationOutputOutput'$sel:compressionFormat:Output'$sel:format:Output'$sel:formatOptions:Output'$sel:maxOutputFiles:Output'$sel:overwrite:Output'$sel:partitionColumns:Output'$sel:location:Output' newOutputoutput_compressionFormat output_formatoutput_formatOptionsoutput_maxOutputFilesoutput_overwriteoutput_partitionColumnsoutput_location$fToJSONOutput$fNFDataOutput$fHashableOutput$fFromJSONOutput $fEqOutput $fReadOutput 
$fShowOutput$fGenericOutputDatabaseTableOutputOptionsDatabaseTableOutputOptions'.$sel:tempDirectory:DatabaseTableOutputOptions'*$sel:tableName:DatabaseTableOutputOptions'newDatabaseTableOutputOptions(databaseTableOutputOptions_tempDirectory$databaseTableOutputOptions_tableName"$fToJSONDatabaseTableOutputOptions"$fNFDataDatabaseTableOutputOptions$$fHashableDatabaseTableOutputOptions$$fFromJSONDatabaseTableOutputOptions$fEqDatabaseTableOutputOptions $fReadDatabaseTableOutputOptions $fShowDatabaseTableOutputOptions#$fGenericDatabaseTableOutputOptionsDatabaseOutputDatabaseOutput''$sel:databaseOutputMode:DatabaseOutput''$sel:glueConnectionName:DatabaseOutput'$$sel:databaseOptions:DatabaseOutput'newDatabaseOutput!databaseOutput_databaseOutputMode!databaseOutput_glueConnectionNamedatabaseOutput_databaseOptions$fToJSONDatabaseOutput$fNFDataDatabaseOutput$fHashableDatabaseOutput$fFromJSONDatabaseOutput$fEqDatabaseOutput$fReadDatabaseOutput$fShowDatabaseOutput$fGenericDatabaseOutputDatabaseInputDefinitionDatabaseInputDefinition'/$sel:databaseTableName:DatabaseInputDefinition')$sel:queryString:DatabaseInputDefinition'+$sel:tempDirectory:DatabaseInputDefinition'0$sel:glueConnectionName:DatabaseInputDefinition'newDatabaseInputDefinition)databaseInputDefinition_databaseTableName#databaseInputDefinition_queryString%databaseInputDefinition_tempDirectory*databaseInputDefinition_glueConnectionName$fToJSONDatabaseInputDefinition$fNFDataDatabaseInputDefinition!$fHashableDatabaseInputDefinition!$fFromJSONDatabaseInputDefinition$fEqDatabaseInputDefinition$fReadDatabaseInputDefinition$fShowDatabaseInputDefinition 
$fGenericDatabaseInputDefinitionDataCatalogInputDefinitionDataCatalogInputDefinition'*$sel:catalogId:DataCatalogInputDefinition'.$sel:tempDirectory:DataCatalogInputDefinition'-$sel:databaseName:DataCatalogInputDefinition'*$sel:tableName:DataCatalogInputDefinition'newDataCatalogInputDefinition$dataCatalogInputDefinition_catalogId(dataCatalogInputDefinition_tempDirectory'dataCatalogInputDefinition_databaseName$dataCatalogInputDefinition_tableName"$fToJSONDataCatalogInputDefinition"$fNFDataDataCatalogInputDefinition$$fHashableDataCatalogInputDefinition$$fFromJSONDataCatalogInputDefinition$fEqDataCatalogInputDefinition $fReadDataCatalogInputDefinition $fShowDataCatalogInputDefinition#$fGenericDataCatalogInputDefinitionInputInput'&$sel:dataCatalogInputDefinition:Input'#$sel:databaseInputDefinition:Input'$sel:metadata:Input'$sel:s3InputDefinition:Input'newInput input_dataCatalogInputDefinitioninput_databaseInputDefinitioninput_metadatainput_s3InputDefinition $fToJSONInput $fNFDataInput$fHashableInput$fFromJSONInput $fEqInput $fReadInput 
$fShowInput$fGenericInputS3TableOutputOptionsS3TableOutputOptions'#$sel:location:S3TableOutputOptions'newS3TableOutputOptionss3TableOutputOptions_location$fToJSONS3TableOutputOptions$fNFDataS3TableOutputOptions$fHashableS3TableOutputOptions$fFromJSONS3TableOutputOptions$fEqS3TableOutputOptions$fReadS3TableOutputOptions$fShowS3TableOutputOptions$fGenericS3TableOutputOptionsDataCatalogOutputDataCatalogOutput'!$sel:catalogId:DataCatalogOutput''$sel:databaseOptions:DataCatalogOutput'!$sel:overwrite:DataCatalogOutput'!$sel:s3Options:DataCatalogOutput'$$sel:databaseName:DataCatalogOutput'!$sel:tableName:DataCatalogOutput'newDataCatalogOutputdataCatalogOutput_catalogId!dataCatalogOutput_databaseOptionsdataCatalogOutput_overwritedataCatalogOutput_s3OptionsdataCatalogOutput_databaseNamedataCatalogOutput_tableName$fToJSONDataCatalogOutput$fNFDataDataCatalogOutput$fHashableDataCatalogOutput$fFromJSONDataCatalogOutput$fEqDataCatalogOutput$fReadDataCatalogOutput$fShowDataCatalogOutput$fGenericDataCatalogOutput SampleMode SampleMode'fromSampleModeSampleMode_FULL_DATASETSampleMode_CUSTOM_ROWS$fShowSampleMode$fReadSampleMode$fEqSampleMode$fOrdSampleMode$fGenericSampleMode$fHashableSampleMode$fNFDataSampleMode$fFromTextSampleMode$fToTextSampleMode$fToByteStringSampleMode$fToLogSampleMode$fToHeaderSampleMode$fToQuerySampleMode$fFromJSONSampleMode$fFromJSONKeySampleMode$fToJSONSampleMode$fToJSONKeySampleMode$fFromXMLSampleMode$fToXMLSampleMode JobSample JobSample'$sel:mode:JobSample'$sel:size:JobSample' newJobSamplejobSample_modejobSample_size$fToJSONJobSample$fNFDataJobSample$fHashableJobSample$fFromJSONJobSample $fEqJobSample$fReadJobSample$fShowJobSample$fGenericJobSample SampleType 
SampleType'fromSampleTypeSampleType_RANDOMSampleType_LAST_NSampleType_FIRST_N$fShowSampleType$fReadSampleType$fEqSampleType$fOrdSampleType$fGenericSampleType$fHashableSampleType$fNFDataSampleType$fFromTextSampleType$fToTextSampleType$fToByteStringSampleType$fToLogSampleType$fToHeaderSampleType$fToQuerySampleType$fFromJSONSampleType$fFromJSONKeySampleType$fToJSONSampleType$fToJSONKeySampleType$fFromXMLSampleType$fToXMLSampleTypeSampleSample'$sel:size:Sample'$sel:type':Sample' newSample sample_size sample_type$fToJSONSample$fNFDataSample$fHashableSample$fFromJSONSample $fEqSample $fReadSample $fShowSample$fGenericSampleProjectProject'$sel:accountId:Project'$sel:createDate:Project'$sel:createdBy:Project'$sel:datasetName:Project'$sel:lastModifiedBy:Project'$sel:lastModifiedDate:Project'$sel:openDate:Project'$sel:openedBy:Project'$sel:resourceArn:Project'$sel:roleArn:Project'$sel:sample:Project'$sel:tags:Project'$sel:name:Project'$sel:recipeName:Project' newProjectproject_accountIdproject_createDateproject_createdByproject_datasetNameproject_lastModifiedByproject_lastModifiedDateproject_openDateproject_openedByproject_resourceArnproject_roleArnproject_sample project_tags project_nameproject_recipeName$fNFDataProject$fHashableProject$fFromJSONProject $fEqProject $fReadProject $fShowProject$fGenericProjectSchedule Schedule'$sel:accountId:Schedule'$sel:createDate:Schedule'$sel:createdBy:Schedule'$sel:cronExpression:Schedule'$sel:jobNames:Schedule'$sel:lastModifiedBy:Schedule'$sel:lastModifiedDate:Schedule'$sel:resourceArn:Schedule'$sel:tags:Schedule'$sel:name:Schedule' newScheduleschedule_accountIdschedule_createDateschedule_createdByschedule_cronExpressionschedule_jobNamesschedule_lastModifiedByschedule_lastModifiedDateschedule_resourceArn schedule_tags schedule_name$fNFDataSchedule$fHashableSchedule$fFromJSONSchedule $fEqSchedule$fReadSchedule$fShowSchedule$fGenericSchedule 
SessionStatusSessionStatus'fromSessionStatusSessionStatus_UPDATINGSessionStatus_TERMINATINGSessionStatus_TERMINATEDSessionStatus_ROTATINGSessionStatus_RECYCLINGSessionStatus_READYSessionStatus_PROVISIONINGSessionStatus_INITIALIZINGSessionStatus_FAILEDSessionStatus_ASSIGNED$fShowSessionStatus$fReadSessionStatus$fEqSessionStatus$fOrdSessionStatus$fGenericSessionStatus$fHashableSessionStatus$fNFDataSessionStatus$fFromTextSessionStatus$fToTextSessionStatus$fToByteStringSessionStatus$fToLogSessionStatus$fToHeaderSessionStatus$fToQuerySessionStatus$fFromJSONSessionStatus$fFromJSONKeySessionStatus$fToJSONSessionStatus$fToJSONKeySessionStatus$fFromXMLSessionStatus$fToXMLSessionStatusSourceSource' fromSource Source_S3Source_DATA_CATALOGSource_DATABASE $fShowSource $fReadSource $fEqSource $fOrdSource$fGenericSource$fHashableSource$fNFDataSource$fFromTextSource$fToTextSource$fToByteStringSource $fToLogSource$fToHeaderSource$fToQuerySource$fFromJSONSource$fFromJSONKeySource$fToJSONSource$fToJSONKeySource$fFromXMLSource $fToXMLSourceDatasetDataset'$sel:accountId:Dataset'$sel:createDate:Dataset'$sel:createdBy:Dataset'$sel:format:Dataset'$sel:formatOptions:Dataset'$sel:lastModifiedBy:Dataset'$sel:lastModifiedDate:Dataset'$sel:pathOptions:Dataset'$sel:resourceArn:Dataset'$sel:source:Dataset'$sel:tags:Dataset'$sel:name:Dataset'$sel:input:Dataset' newDatasetdataset_accountIddataset_createDatedataset_createdBydataset_formatdataset_formatOptionsdataset_lastModifiedBydataset_lastModifiedDatedataset_pathOptionsdataset_resourceArndataset_source dataset_tags dataset_name dataset_input$fNFDataDataset$fHashableDataset$fFromJSONDataset $fEqDataset $fReadDataset 
$fShowDataset$fGenericDatasetStatisticOverrideStatisticOverride'!$sel:statistic:StatisticOverride'"$sel:parameters:StatisticOverride'newStatisticOverridestatisticOverride_statisticstatisticOverride_parameters$fToJSONStatisticOverride$fNFDataStatisticOverride$fHashableStatisticOverride$fFromJSONStatisticOverride$fEqStatisticOverride$fReadStatisticOverride$fShowStatisticOverride$fGenericStatisticOverrideStatisticsConfigurationStatisticsConfiguration'0$sel:includedStatistics:StatisticsConfiguration''$sel:overrides:StatisticsConfiguration'newStatisticsConfiguration*statisticsConfiguration_includedStatistics!statisticsConfiguration_overrides$fToJSONStatisticsConfiguration$fNFDataStatisticsConfiguration!$fHashableStatisticsConfiguration!$fFromJSONStatisticsConfiguration$fEqStatisticsConfiguration$fReadStatisticsConfiguration$fShowStatisticsConfiguration $fGenericStatisticsConfigurationColumnStatisticsConfigurationColumnStatisticsConfiguration'-$sel:selectors:ColumnStatisticsConfiguration'.$sel:statistics:ColumnStatisticsConfiguration' 
newColumnStatisticsConfiguration'columnStatisticsConfiguration_selectors(columnStatisticsConfiguration_statistics%$fToJSONColumnStatisticsConfiguration%$fNFDataColumnStatisticsConfiguration'$fHashableColumnStatisticsConfiguration'$fFromJSONColumnStatisticsConfiguration!$fEqColumnStatisticsConfiguration#$fReadColumnStatisticsConfiguration#$fShowColumnStatisticsConfiguration&$fGenericColumnStatisticsConfigurationProfileConfigurationProfileConfiguration'9$sel:columnStatisticsConfigurations:ProfileConfiguration'9$sel:datasetStatisticsConfiguration:ProfileConfiguration'6$sel:entityDetectorConfiguration:ProfileConfiguration')$sel:profileColumns:ProfileConfiguration'newProfileConfiguration3profileConfiguration_columnStatisticsConfigurations3profileConfiguration_datasetStatisticsConfiguration0profileConfiguration_entityDetectorConfiguration#profileConfiguration_profileColumns$fToJSONProfileConfiguration$fNFDataProfileConfiguration$fHashableProfileConfiguration$fFromJSONProfileConfiguration$fEqProfileConfiguration$fReadProfileConfiguration$fShowProfileConfiguration$fGenericProfileConfiguration ThresholdTypeThresholdType'fromThresholdType ThresholdType_LESS_THAN_OR_EQUALThresholdType_LESS_THAN#ThresholdType_GREATER_THAN_OR_EQUALThresholdType_GREATER_THAN$fShowThresholdType$fReadThresholdType$fEqThresholdType$fOrdThresholdType$fGenericThresholdType$fHashableThresholdType$fNFDataThresholdType$fFromTextThresholdType$fToTextThresholdType$fToByteStringThresholdType$fToLogThresholdType$fToHeaderThresholdType$fToQueryThresholdType$fFromJSONThresholdType$fFromJSONKeyThresholdType$fToJSONThresholdType$fToJSONKeyThresholdType$fFromXMLThresholdType$fToXMLThresholdType 
ThresholdUnitThresholdUnit'fromThresholdUnitThresholdUnit_PERCENTAGEThresholdUnit_COUNT$fShowThresholdUnit$fReadThresholdUnit$fEqThresholdUnit$fOrdThresholdUnit$fGenericThresholdUnit$fHashableThresholdUnit$fNFDataThresholdUnit$fFromTextThresholdUnit$fToTextThresholdUnit$fToByteStringThresholdUnit$fToLogThresholdUnit$fToHeaderThresholdUnit$fToQueryThresholdUnit$fFromJSONThresholdUnit$fFromJSONKeyThresholdUnit$fToJSONThresholdUnit$fToJSONKeyThresholdUnit$fFromXMLThresholdUnit$fToXMLThresholdUnit Threshold Threshold'$sel:type':Threshold'$sel:unit:Threshold'$sel:value:Threshold' newThresholdthreshold_typethreshold_unitthreshold_value$fToJSONThreshold$fNFDataThreshold$fHashableThreshold$fFromJSONThreshold $fEqThreshold$fReadThreshold$fShowThreshold$fGenericThresholdRuleRule'$sel:columnSelectors:Rule'$sel:disabled:Rule'$sel:substitutionMap:Rule'$sel:threshold:Rule'$sel:name:Rule'$sel:checkExpression:Rule'newRulerule_columnSelectors rule_disabledrule_substitutionMaprule_threshold rule_namerule_checkExpression $fToJSONRule $fNFDataRule$fHashableRule$fFromJSONRule$fEqRule $fReadRule $fShowRule 
$fGenericRuleValidationModeValidationMode'fromValidationModeValidationMode_CHECK_ALL$fShowValidationMode$fReadValidationMode$fEqValidationMode$fOrdValidationMode$fGenericValidationMode$fHashableValidationMode$fNFDataValidationMode$fFromTextValidationMode$fToTextValidationMode$fToByteStringValidationMode$fToLogValidationMode$fToHeaderValidationMode$fToQueryValidationMode$fFromJSONValidationMode$fFromJSONKeyValidationMode$fToJSONValidationMode$fToJSONKeyValidationMode$fFromXMLValidationMode$fToXMLValidationModeValidationConfigurationValidationConfiguration',$sel:validationMode:ValidationConfiguration'($sel:rulesetArn:ValidationConfiguration'newValidationConfiguration&validationConfiguration_validationMode"validationConfiguration_rulesetArn$fToJSONValidationConfiguration$fNFDataValidationConfiguration!$fHashableValidationConfiguration!$fFromJSONValidationConfiguration$fEqValidationConfiguration$fReadValidationConfiguration$fShowValidationConfiguration $fGenericValidationConfigurationJobRunJobRun'$sel:attempt:JobRun'$sel:completedOn:JobRun'$sel:dataCatalogOutputs:JobRun'$sel:databaseOutputs:JobRun'$sel:datasetName:JobRun'$sel:errorMessage:JobRun'$sel:executionTime:JobRun'$sel:jobName:JobRun'$sel:jobSample:JobRun'$sel:logGroupName:JobRun'$sel:logSubscription:JobRun'$sel:outputs:JobRun'$sel:recipeReference:JobRun'$sel:runId:JobRun'$sel:startedBy:JobRun'$sel:startedOn:JobRun'$sel:state:JobRun'%$sel:validationConfigurations:JobRun' newJobRunjobRun_attemptjobRun_completedOnjobRun_dataCatalogOutputsjobRun_databaseOutputsjobRun_datasetNamejobRun_errorMessagejobRun_executionTimejobRun_jobNamejobRun_jobSamplejobRun_logGroupNamejobRun_logSubscriptionjobRun_outputsjobRun_recipeReference jobRun_runIdjobRun_startedByjobRun_startedOn jobRun_statejobRun_validationConfigurations$fNFDataJobRun$fHashableJobRun$fFromJSONJobRun $fEqJobRun $fReadJobRun 
$fShowJobRun$fGenericJobRunJobJob'$sel:accountId:Job'$sel:createDate:Job'$sel:createdBy:Job'$sel:dataCatalogOutputs:Job'$sel:databaseOutputs:Job'$sel:datasetName:Job'$sel:encryptionKeyArn:Job'$sel:encryptionMode:Job'$sel:jobSample:Job'$sel:lastModifiedBy:Job'$sel:lastModifiedDate:Job'$sel:logSubscription:Job'$sel:maxCapacity:Job'$sel:maxRetries:Job'$sel:outputs:Job'$sel:projectName:Job'$sel:recipeReference:Job'$sel:resourceArn:Job'$sel:roleArn:Job'$sel:tags:Job'$sel:timeout:Job'$sel:type':Job'"$sel:validationConfigurations:Job'$sel:name:Job'newJob job_accountIdjob_createDate job_createdByjob_dataCatalogOutputsjob_databaseOutputsjob_datasetNamejob_encryptionKeyArnjob_encryptionMode job_jobSamplejob_lastModifiedByjob_lastModifiedDatejob_logSubscriptionjob_maxCapacityjob_maxRetries job_outputsjob_projectNamejob_recipeReferencejob_resourceArn job_roleArnjob_tags job_timeoutjob_typejob_validationConfigurationsjob_name $fNFDataJob $fHashableJob $fFromJSONJob$fEqJob $fReadJob $fShowJob $fGenericJob ViewFrame ViewFrame'$sel:analytics:ViewFrame'$sel:columnRange:ViewFrame'$sel:hiddenColumns:ViewFrame'$sel:rowRange:ViewFrame'$sel:startRowIndex:ViewFrame' $sel:startColumnIndex:ViewFrame' newViewFrameviewFrame_analyticsviewFrame_columnRangeviewFrame_hiddenColumnsviewFrame_rowRangeviewFrame_startRowIndexviewFrame_startColumnIndex$fToJSONViewFrame$fNFDataViewFrame$fHashableViewFrame $fEqViewFrame$fReadViewFrame$fShowViewFrame$fGenericViewFramedefaultService_AccessDeniedException_ConflictException_InternalServerException_ResourceNotFoundException_ServiceQuotaExceededException_ValidationExceptionTagResourceResponseTagResourceResponse'$$sel:httpStatus:TagResourceResponse' TagResource 
TagResource'$sel:resourceArn:TagResource'$sel:tags:TagResource'newTagResourcetagResource_resourceArntagResource_tagsnewTagResourceResponsetagResourceResponse_httpStatus$fToQueryTagResource$fToPathTagResource$fToJSONTagResource$fToHeadersTagResource$fNFDataTagResource$fHashableTagResource$fNFDataTagResourceResponse$fAWSRequestTagResource$fEqTagResourceResponse$fReadTagResourceResponse$fShowTagResourceResponse$fGenericTagResourceResponse$fEqTagResource$fReadTagResource$fShowTagResource$fGenericTagResourceStopJobRunResponseStopJobRunResponse'#$sel:httpStatus:StopJobRunResponse'$sel:runId:StopJobRunResponse' StopJobRun StopJobRun'$sel:name:StopJobRun'$sel:runId:StopJobRun' newStopJobRunstopJobRun_namestopJobRun_runIdnewStopJobRunResponsestopJobRunResponse_httpStatusstopJobRunResponse_runId$fToQueryStopJobRun$fToPathStopJobRun$fToJSONStopJobRun$fToHeadersStopJobRun$fNFDataStopJobRun$fHashableStopJobRun$fNFDataStopJobRunResponse$fAWSRequestStopJobRun$fEqStopJobRunResponse$fReadStopJobRunResponse$fShowStopJobRunResponse$fGenericStopJobRunResponse$fEqStopJobRun$fReadStopJobRun$fShowStopJobRun$fGenericStopJobRunStartProjectSessionResponseStartProjectSessionResponse'1$sel:clientSessionId:StartProjectSessionResponse',$sel:httpStatus:StartProjectSessionResponse'&$sel:name:StartProjectSessionResponse'StartProjectSessionStartProjectSession''$sel:assumeControl:StartProjectSession'$sel:name:StartProjectSession'newStartProjectSession!startProjectSession_assumeControlstartProjectSession_namenewStartProjectSessionResponse+startProjectSessionResponse_clientSessionId&startProjectSessionResponse_httpStatus 
startProjectSessionResponse_name$fToQueryStartProjectSession$fToPathStartProjectSession$fToJSONStartProjectSession$fToHeadersStartProjectSession$fNFDataStartProjectSession$fHashableStartProjectSession#$fNFDataStartProjectSessionResponse$fAWSRequestStartProjectSession$fEqStartProjectSessionResponse!$fShowStartProjectSessionResponse$$fGenericStartProjectSessionResponse$fEqStartProjectSession$fReadStartProjectSession$fShowStartProjectSession$fGenericStartProjectSessionStartJobRunResponseStartJobRunResponse'$$sel:httpStatus:StartJobRunResponse'$sel:runId:StartJobRunResponse' StartJobRun StartJobRun'$sel:name:StartJobRun'newStartJobRunstartJobRun_namenewStartJobRunResponsestartJobRunResponse_httpStatusstartJobRunResponse_runId$fToQueryStartJobRun$fToPathStartJobRun$fToJSONStartJobRun$fToHeadersStartJobRun$fNFDataStartJobRun$fHashableStartJobRun$fNFDataStartJobRunResponse$fAWSRequestStartJobRun$fEqStartJobRunResponse$fReadStartJobRunResponse$fShowStartJobRunResponse$fGenericStartJobRunResponse$fEqStartJobRun$fReadStartJobRun$fShowStartJobRun$fGenericStartJobRun SendProjectSessionActionResponse!SendProjectSessionActionResponse'/$sel:actionId:SendProjectSessionActionResponse'-$sel:result:SendProjectSessionActionResponse'1$sel:httpStatus:SendProjectSessionActionResponse'+$sel:name:SendProjectSessionActionResponse'SendProjectSessionActionSendProjectSessionAction'.$sel:clientSessionId:SendProjectSessionAction'&$sel:preview:SendProjectSessionAction')$sel:recipeStep:SendProjectSessionAction'($sel:stepIndex:SendProjectSessionAction'($sel:viewFrame:SendProjectSessionAction'#$sel:name:SendProjectSessionAction'newSendProjectSessionAction(sendProjectSessionAction_clientSessionId 
sendProjectSessionAction_preview#sendProjectSessionAction_recipeStep"sendProjectSessionAction_stepIndex"sendProjectSessionAction_viewFramesendProjectSessionAction_name#newSendProjectSessionActionResponse)sendProjectSessionActionResponse_actionId'sendProjectSessionActionResponse_result+sendProjectSessionActionResponse_httpStatus%sendProjectSessionActionResponse_name!$fToQuerySendProjectSessionAction $fToPathSendProjectSessionAction $fToJSONSendProjectSessionAction#$fToHeadersSendProjectSessionAction $fNFDataSendProjectSessionAction"$fHashableSendProjectSessionAction($fNFDataSendProjectSessionActionResponse$$fAWSRequestSendProjectSessionAction$$fEqSendProjectSessionActionResponse&$fReadSendProjectSessionActionResponse&$fShowSendProjectSessionActionResponse)$fGenericSendProjectSessionActionResponse$fEqSendProjectSessionAction$fShowSendProjectSessionAction!$fGenericSendProjectSessionActionPublishRecipeResponsePublishRecipeResponse'&$sel:httpStatus:PublishRecipeResponse' $sel:name:PublishRecipeResponse' PublishRecipePublishRecipe'$sel:description:PublishRecipe'$sel:name:PublishRecipe'newPublishRecipepublishRecipe_descriptionpublishRecipe_namenewPublishRecipeResponse publishRecipeResponse_httpStatuspublishRecipeResponse_name$fToQueryPublishRecipe$fToPathPublishRecipe$fToJSONPublishRecipe$fToHeadersPublishRecipe$fNFDataPublishRecipe$fHashablePublishRecipe$fNFDataPublishRecipeResponse$fAWSRequestPublishRecipe$fEqPublishRecipeResponse$fReadPublishRecipeResponse$fShowPublishRecipeResponse$fGenericPublishRecipeResponse$fEqPublishRecipe$fReadPublishRecipe$fShowPublishRecipe$fGenericPublishRecipeListTagsForResourceResponseListTagsForResourceResponse'&$sel:tags:ListTagsForResourceResponse',$sel:httpStatus:ListTagsForResourceResponse'ListTagsForResourceListTagsForResource'%$sel:resourceArn:ListTagsForResource'newListTagsForResourcelistTagsForResource_resourceArnnewListTagsForResourceResponse 
listTagsForResourceResponse_tags&listTagsForResourceResponse_httpStatus$fToQueryListTagsForResource$fToPathListTagsForResource$fToHeadersListTagsForResource$fNFDataListTagsForResource$fHashableListTagsForResource#$fNFDataListTagsForResourceResponse$fAWSRequestListTagsForResource$fEqListTagsForResourceResponse!$fReadListTagsForResourceResponse!$fShowListTagsForResourceResponse$$fGenericListTagsForResourceResponse$fEqListTagsForResource$fReadListTagsForResource$fShowListTagsForResource$fGenericListTagsForResourceListSchedulesResponseListSchedulesResponse'%$sel:nextToken:ListSchedulesResponse'&$sel:httpStatus:ListSchedulesResponse'%$sel:schedules:ListSchedulesResponse' ListSchedulesListSchedules'$sel:jobName:ListSchedules'$sel:maxResults:ListSchedules'$sel:nextToken:ListSchedules'newListScheduleslistSchedules_jobNamelistSchedules_maxResultslistSchedules_nextTokennewListSchedulesResponselistSchedulesResponse_nextToken listSchedulesResponse_httpStatuslistSchedulesResponse_schedules$fToQueryListSchedules$fToPathListSchedules$fToHeadersListSchedules$fNFDataListSchedules$fHashableListSchedules$fAWSPagerListSchedules$fNFDataListSchedulesResponse$fAWSRequestListSchedules$fEqListSchedulesResponse$fReadListSchedulesResponse$fShowListSchedulesResponse$fGenericListSchedulesResponse$fEqListSchedules$fReadListSchedules$fShowListSchedules$fGenericListSchedulesListRulesetsResponseListRulesetsResponse'$$sel:nextToken:ListRulesetsResponse'%$sel:httpStatus:ListRulesetsResponse'#$sel:rulesets:ListRulesetsResponse' ListRulesets 
[Binary interface-file residue: symbol index of the Amazonka.DataBrew module
(amazonka-databrew, the AWS Glue DataBrew bindings). Recoverable content
reconstructed below.]

Every operation exports a request record (e.g. ListRulesets), a response
record (ListRulesetsResponse), smart constructors (newListRulesets,
newListRulesetsResponse), and lens-style field accessors named
operation_field / operationResponse_field (e.g. listRulesets_maxResults,
listRulesetsResponse_httpStatus). Each request carries instances for
ToQuery, ToPath, ToHeaders, NFData, Hashable, AWSRequest, Eq, Read, Show,
and Generic; Create*, Update*, and BatchDeleteRecipeVersion additionally
carry ToJSON, and the List* operations carry AWSPager. Each response
carries NFData, Eq, Read, Show, and Generic.

Operations and their request fields:

  ListRulesets             maxResults, nextToken, targetArn
  ListRecipes              maxResults, nextToken, recipeVersion
  ListRecipeVersions       maxResults, nextToken, name
  ListProjects             maxResults, nextToken
  ListJobs                 datasetName, maxResults, nextToken, projectName
  ListJobRuns              maxResults, nextToken, name
  ListDatasets             maxResults, nextToken
  DescribeSchedule         name
  DescribeRuleset          name
  DescribeRecipe           recipeVersion, name
  DescribeProject          name
  DescribeJobRun           name, runId
  DescribeJob              name
  DescribeDataset          name
  DeleteSchedule           name
  DeleteRuleset            name
  DeleteRecipeVersion      name, recipeVersion
  DeleteProject            name
  DeleteJob                name
  DeleteDataset            name
  CreateSchedule           jobNames, tags, cronExpression, name
  CreateRuleset            description, tags, name, targetArn, rules
  CreateRecipeJob          dataCatalogOutputs, databaseOutputs, datasetName,
                           encryptionKeyArn, encryptionMode, logSubscription,
                           maxCapacity, maxRetries, outputs, projectName,
                           recipeReference, tags, timeout, name, roleArn
  CreateRecipe             description, tags, name, steps
  CreateProject            sample, tags, datasetName, name, recipeName, roleArn
  CreateProfileJob         configuration, encryptionKeyArn, encryptionMode,
                           jobSample, logSubscription, maxCapacity, maxRetries,
                           tags, timeout, validationConfigurations,
                           datasetName, name, outputLocation, roleArn
  CreateDataset            format, formatOptions, pathOptions, tags, name, input
  BatchDeleteRecipeVersion name, recipeVersions
  UntagResource            resourceArn, tagKeys
  UpdateDataset            format, formatOptions, pathOptions, name, input
  UpdateProfileJob         configuration, encryptionKeyArn, encryptionMode,
                           jobSample, logSubscription, maxCapacity, maxRetries,
                           timeout, validationConfigurations, name,
                           outputLocation, roleArn
  UpdateProject            sample, roleArn, name
  UpdateRecipe             description, steps, name
  UpdateRecipeJob          dataCatalogOutputs, databaseOutputs,
                           encryptionKeyArn, encryptionMode, logSubscription,
                           maxCapacity, maxRetries, outputs, timeout, name,
                           roleArn
  UpdateRuleset            description, name, rules
  UpdateSchedule           jobNames, cronExpression, name

Responses return httpStatus plus the operation's payload: List* responses
carry a nextToken and the listed items (rulesets, recipes, projects, jobs,
jobRuns, datasets); Describe* responses carry the resource's full metadata
(createDate, createdBy, lastModifiedBy, lastModifiedDate, resourceArn,
tags, and the resource-specific fields); Create*, Delete*, and Update*
responses echo the affected resource's name (DeleteRecipeVersion and
BatchDeleteRecipeVersion also return the recipeVersion / per-version
errors).
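As a usage illustration, the AWSPager instances on the List* operations let
`Amazonka.paginate` follow nextToken automatically. The following is a
minimal, untested sketch assuming amazonka >= 2.0, the amazonka-databrew
package, and AWS credentials discoverable from the environment (it performs
live network calls, so it is not runnable offline):

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

import qualified Amazonka
import Amazonka.DataBrew (newListProjects)
import Conduit (liftIO, mapM_C, runConduit, runResourceT, (.|))

main :: IO ()
main = do
  -- Discover credentials and region from the standard environment chain.
  env <- Amazonka.newEnv Amazonka.discover
  -- paginate follows nextToken across pages via the AWSPager instance;
  -- each yielded value is a ListProjectsResponse (Show-able per the index).
  runResourceT . runConduit $
    Amazonka.paginate env newListProjects
      .| mapM_C (liftIO . print)
```

The same pattern applies to any of the other List* requests in the index,
since they share the AWSPager instance.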