CHANGELOG (+9 lines: 9 additions & 0 deletions)
@@ -16,6 +16,15 @@
# spark-redshift Changelog

+## 6.3.0 (2024-07-03)
+- Validates connector tests pass with Spark releases 3.4.3 and 3.5.1 [Beaux Sharifi]
+- Adds support for three-part table names to allow the connector to query Redshift data sharing tables (#153) [Prashant Singh, Beaux Sharifi]
+- Corrects mapping of Spark ShortType to use Redshift SMALLINT instead of INTEGER to better match the expected data size (#152) [Akira Ajisaka, Ruei Huang]
+- Adds pushdown of the toprettystring() function to support df.show() operations used by Spark 3.5 [Beaux Sharifi]
+- Corrects pushdown of inner joins with no join condition to result in a cross join instead of an invalid inner join [Beaux Sharifi]
+- Corrects pushdown when casting null boolean values to strings to result in the value null instead of the string "null" [Beaux Sharifi]
+- Upgrades the Redshift JDBC driver to the latest available version, 2.1.0.29 [Beaux Sharifi]
+
## 6.2.0 (2024-01-12)
- Validates support for Spark 3.3.4 and Spark 3.4.2
- Upgrades Redshift JDBC driver to version 2.1.0.24
An identifier to include in the query group that is set when running queries with the connector. It should be 100 or fewer characters, and all characters must be valid unicodeIdentifierParts. Characters in excess of 100 will be trimmed.
-When running a query with the connector, a JSON-formatted string will be set as the query group (for example `{"spark-redshift-connector":{"svc":"","ver":"6.2.0-spark_3.5","op":"Read","lbl":"","tid":""}}`).
+When running a query with the connector, a JSON-formatted string will be set as the query group (for example `{"spark-redshift-connector":{"svc":"","ver":"6.3.0-spark_3.5","op":"Read","lbl":"","tid":""}}`).
This option will be substituted for the value of the `lbl` key.
</td>
</tr></table>
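
The effect of this option can be illustrated on a read. Below is a minimal Scala sketch, assuming an active SparkSession named `spark`, the community connector's data source name `io.github.spark_redshift_community.spark.redshift`, and placeholder connection values; the label string `nightly-etl` is hypothetical.

```scala
// Sketch only: cluster endpoint, credentials, table, and tempdir are placeholders.
val df = spark.read
  .format("io.github.spark_redshift_community.spark.redshift")
  .option("url", "jdbc:redshift://<cluster-endpoint>:5439/dev?user=<user>&password=<password>")
  .option("dbtable", "public.event")
  .option("tempdir", "s3a://<bucket>/tmp/")
  // Substituted into the "lbl" key of the query group, e.g.
  // {"spark-redshift-connector":{"svc":"","ver":"6.3.0-spark_3.5","op":"Read","lbl":"nightly-etl","tid":"..."}}
  .option("label", "nightly-etl")
  .load()
```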
@@ -924,7 +924,7 @@ SET spark.datasource.redshift.community.autopushdown.lazyMode=false
### trace_id
A new tracing identifier field that is added to the existing `label` parameter. When set, the provided string value will be used as part of the label. Otherwise, it defaults to the Spark application identifier. For example:
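A minimal sketch along the same lines as the read above, with a hypothetical `trace_id` value; the identifier presumably surfaces in the `tid` key of the query-group JSON, falling back to the Spark application ID when the option is not set.

```scala
// Hypothetical trace value; omit the option to fall back to the Spark application ID.
val traced = spark.read
  .format("io.github.spark_redshift_community.spark.redshift")
  .option("url", "jdbc:redshift://<cluster-endpoint>:5439/dev?user=<user>&password=<password>")
  .option("dbtable", "public.event")
  .option("tempdir", "s3a://<bucket>/tmp/")
  .option("label", "nightly-etl")
  .option("trace_id", "etl-run-42") // presumably mapped to "tid" in the query group
  .load()
```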