@Generated(value="software.amazon.awssdk:codegen") public final class SparkConnectorTarget extends Object implements SdkPojo, Serializable, ToCopyableBuilder<SparkConnectorTarget.Builder,SparkConnectorTarget>
Specifies a target that uses an Apache Spark connector.
| Modifier and Type | Class and Description |
|---|---|
| static interface | SparkConnectorTarget.Builder |

| Modifier and Type | Method and Description |
|---|---|
| Map<String,String> | additionalOptions() Additional connection options for the connector. |
| static SparkConnectorTarget.Builder | builder() |
| String | connectionName() The name of a connection for an Apache Spark connector. |
| String | connectionType() The type of connection, such as marketplace.spark or custom.spark, designating a connection to an Apache Spark data store. |
| String | connectorName() The name of an Apache Spark connector. |
| boolean | equals(Object obj) |
| boolean | equalsBySdkFields(Object obj) |
| <T> Optional<T> | getValueForField(String fieldName, Class<T> clazz) |
| boolean | hasAdditionalOptions() For responses, this returns true if the service returned a value for the AdditionalOptions property. |
| int | hashCode() |
| boolean | hasInputs() For responses, this returns true if the service returned a value for the Inputs property. |
| boolean | hasOutputSchemas() For responses, this returns true if the service returned a value for the OutputSchemas property. |
| List<String> | inputs() The nodes that are inputs to the data target. |
| String | name() The name of the data target. |
| List<GlueSchema> | outputSchemas() Specifies the data schema for the custom spark target. |
| List<SdkField<?>> | sdkFields() |
| static Class<? extends SparkConnectorTarget.Builder> | serializableBuilderClass() |
| SparkConnectorTarget.Builder | toBuilder() |
| String | toString() Returns a string representation of this object. |
Methods inherited from class java.lang.Object: clone, finalize, getClass, notify, notifyAll, wait, wait, wait

public final String name()
The name of the data target.
public final boolean hasInputs()
For responses, this returns true if the service returned a value for the Inputs property. This DOES NOT check that the value is non-empty (for which, you should check the isEmpty() method on the property). This is useful because the SDK will never return a null collection or map, but you may need to differentiate between the service returning nothing (or null) and the service returning an empty collection or map. For requests, this returns true if a value for the property was specified in the request builder, and false if a value was not specified.

public final List<String> inputs()
The nodes that are inputs to the data target.
Attempts to modify the collection returned by this method will result in an UnsupportedOperationException.
This method will never return null. If you would like to know whether the service returned this field (so that
you can differentiate between null and empty), you can use the hasInputs() method.
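The null-versus-empty distinction described above can be sketched as follows; the target name and input node values are hypothetical examples, and this assumes the AWS SDK for Java 2.x Glue module is on the classpath:

```java
import java.util.List;
import software.amazon.awssdk.services.glue.model.SparkConnectorTarget;

public class InputsCheck {
    public static void main(String[] args) {
        // Build a target without setting Inputs; the SDK substitutes an
        // auto-constructed empty list rather than null.
        SparkConnectorTarget target = SparkConnectorTarget.builder()
                .name("my-target")
                .build();

        List<String> inputs = target.inputs();      // never null
        System.out.println(inputs.isEmpty());       // empty list
        System.out.println(target.hasInputs());     // false: no value was specified

        SparkConnectorTarget withInputs = target.toBuilder()
                .inputs("node-1")
                .build();
        System.out.println(withInputs.hasInputs()); // true: a value was specified
    }
}
```

Checking hasInputs() rather than inputs().isEmpty() is what lets callers tell an unset field apart from one the service explicitly returned as empty.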
public final String connectionName()
The name of a connection for an Apache Spark connector.
public final String connectorName()
The name of an Apache Spark connector.
public final String connectionType()
The type of connection, such as marketplace.spark or custom.spark, designating a connection to an Apache Spark data store.
public final boolean hasAdditionalOptions()
For responses, this returns true if the service returned a value for the AdditionalOptions property. This DOES NOT check that the value is non-empty (for which, you should check the isEmpty() method on the property). This is useful because the SDK will never return a null collection or map, but you may need to differentiate between the service returning nothing (or null) and the service returning an empty collection or map. For requests, this returns true if a value for the property was specified in the request builder, and false if a value was not specified.

public final Map<String,String> additionalOptions()
Additional connection options for the connector.
Attempts to modify the collection returned by this method will result in an UnsupportedOperationException.
This method will never return null. If you would like to know whether the service returned this field (so that
you can differentiate between null and empty), you can use the hasAdditionalOptions() method.
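A minimal sketch of constructing a target with connector options and reading them back; the connection name, connector name, and option key/value here are hypothetical, and the Glue SDK module is assumed to be on the classpath:

```java
import java.util.Map;
import software.amazon.awssdk.services.glue.model.SparkConnectorTarget;

public class OptionsExample {
    public static void main(String[] args) {
        SparkConnectorTarget target = SparkConnectorTarget.builder()
                .name("spark-sink")
                .connectionName("my-spark-connection")
                .connectorName("my-connector")
                .connectionType("custom.spark")
                .additionalOptions(Map.of("path", "s3://my-bucket/out/"))
                .build();

        Map<String, String> options = target.additionalOptions();
        System.out.println(options.get("path"));

        // The returned map is unmodifiable, as documented above.
        try {
            options.put("mode", "overwrite");
        } catch (UnsupportedOperationException e) {
            System.out.println("options map is read-only");
        }
    }
}
```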
public final boolean hasOutputSchemas()
For responses, this returns true if the service returned a value for the OutputSchemas property. This DOES NOT check that the value is non-empty (for which, you should check the isEmpty() method on the property). This is useful because the SDK will never return a null collection or map, but you may need to differentiate between the service returning nothing (or null) and the service returning an empty collection or map. For requests, this returns true if a value for the property was specified in the request builder, and false if a value was not specified.

public final List<GlueSchema> outputSchemas()
Specifies the data schema for the custom spark target.
Attempts to modify the collection returned by this method will result in an UnsupportedOperationException.
This method will never return null. If you would like to know whether the service returned this field (so that
you can differentiate between null and empty), you can use the hasOutputSchemas() method.
public SparkConnectorTarget.Builder toBuilder()
Specified by: toBuilder in interface ToCopyableBuilder<SparkConnectorTarget.Builder,SparkConnectorTarget>

public static SparkConnectorTarget.Builder builder()
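A brief sketch of the builder round trip: toBuilder() seeds a new builder with the current field values, so a modified copy restates only the changed fields. The target names are hypothetical, and the Glue SDK module is assumed on the classpath:

```java
import software.amazon.awssdk.services.glue.model.SparkConnectorTarget;

public class CopyExample {
    public static void main(String[] args) {
        SparkConnectorTarget original = SparkConnectorTarget.builder()
                .name("target-a")
                .connectionType("marketplace.spark")
                .build();

        // Copy with a new name; connectionType carries over unchanged.
        SparkConnectorTarget renamed = original.toBuilder()
                .name("target-b")
                .build();

        System.out.println(renamed.connectionType());
        // equalsBySdkFields compares all modeled fields, so the differing
        // names make the two objects unequal.
        System.out.println(original.equalsBySdkFields(renamed));
    }
}
```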
public static Class<? extends SparkConnectorTarget.Builder> serializableBuilderClass()
public final boolean equalsBySdkFields(Object obj)
Specified by: equalsBySdkFields in interface SdkPojo

public final String toString()
Returns a string representation of this object.
Copyright © 2023. All rights reserved.