Facets-overview

Latest version: v1.1.1

0.5.2

0.5.1

* shade com.google.protobuf.*
According to https://scalapb.github.io/docs/sparksql/:
"Spark ships with an old version of Google's Protocol Buffers runtime that is not
compatible with the current version. Therefore, we need to shade our copy of the
Protocol Buffer runtime." (A sketch of the shading rule follows below.)
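
The scalapb docs express this shading as an sbt-assembly rule; a minimal sketch of that rule is below (this project builds with Maven, where the equivalent is a maven-shade-plugin relocation of the same package pattern):

// build.sbt (sbt-assembly): relocate the bundled protobuf runtime so it cannot clash with Spark's copy
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "shadeproto.@1").inAll
)
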
* "Spark 3 also ships with an incompatible version of scala-collection-compat.",
although our unit test show passes, without any changes, but we experienced exceptions
when trying to convert protobuf to json using json4s. Even trying to shade scala-collection-compat
doesn't help. here we did not shade the scala-collection-compact
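
A minimal sketch of the kind of conversion that surfaced those exceptions, assuming the scalapb-json4s artifact and some ScalaPB-generated message (the method name is only illustrative):

import scalapb.json4s.JsonFormat

// stats is any ScalaPB-generated message; JsonFormat comes from scalapb-json4s,
// which relies on json4s and scala-collection-compat at runtime.
def protoToJson(stats: scalapb.GeneratedMessage): String =
  JsonFormat.toJsonString(stats)
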
* instead of directly casting the categorical data type to string,
the logic now checks the data type before extracting the value.
Unexpected exceptions are also caught and associated with the feature that raised them.
In many cases Spark treats the data as string if it is not explicitly cast to decimal,
so this helps identify which feature is causing the issue. A sketch of the idea follows below.
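
A minimal sketch of that idea, not the project's actual code; the helper name and the logging are illustrative:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types._
import scala.util.{Failure, Success, Try}

// Check the declared DataType before extracting a categorical value,
// and tie any unexpected exception back to the feature name.
def categoricalValue(row: Row, index: Int, dataType: DataType, feature: String): Option[String] =
  Try {
    dataType match {
      case StringType     => row.getString(index)
      case _: DecimalType => row.getDecimal(index).toPlainString
      case _              => String.valueOf(row.get(index))
    }
  } match {
    case Success(v) => Option(v)
    case Failure(ex) =>
      println(s"feature '$feature' failed: ${ex.getClass.getSimpleName}: ${ex.getMessage}")
      None
  }
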
* update readme

0.5.0

* Spark 3 requires Scala 2.12, so we had to upgrade a few dependencies as well, in particular
scala-maven-plugin, scalatest, spark-tensorflow-connector, and scalapb
* scala-maven-plugin version 4.3.1 works, but higher versions such as 4.4.x and 4.5.x do not.
If you use version 4.4.x or 4.5.x, you will get an error identical to the one reported in:
http://5.9.10.113/66489291/project-build-with-circle-ci-is-failing
* with the upgrade of scalapb to the current version, the protobuf version argument changed.
In scalapb 0.9.8, protobuf 3.8.0 was specified as v380 (and 2.6.1 as v261);
in scalapb 0.11.0 we need to specify v3.8.0 for version 3.8.0
* spark-tensorflow-connector has no Scala 2.12 release at the moment, although the master branch
does have Scala 2.12 code. Instead of waiting for the official release, we temporarily build the dependency ourselves:
add tensorflow/ecosystem as a git submodule, then build the dependency locally

git submodule add https://github.com/tensorflow/ecosystem.git ecosystem
cd ecosystem/spark/spark-tensorflow-connector
mvn clean install

* change the scalatest dependency to version 3.0.5.
With later versions (> 3.0.5), the tests fail with errors like "object FunSuite is not a member of package org.scalatest" (see the note below).
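
This happens because FunSuite was moved out of the org.scalatest package in later ScalaTest releases (to org.scalatest.funsuite.AnyFunSuite) and the old name was eventually removed. A minimal sketch of the 3.0.5-style suite the tests rely on; the suite name is illustrative, not an actual test in the repo:

// ScalaTest 3.0.5 style:
import org.scalatest.FunSuite

class OverviewSmokeSuite extends FunSuite {
  test("addition works") {
    assert(1 + 1 == 2)
  }
}

// In newer ScalaTest the import would instead be:
// import org.scalatest.funsuite.AnyFunSuite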

0.4.1

The main branch will be upgraded to Apache Spark 3.0

0.4.0

This avoids the user creating an overly large protobuf when a big dataset has a lot of categorical feature rows.

0.3.8
