
How to fix com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast... Java Spark

I ran into the same problem; increasing sampleSize mitigates it, but it does not fully solve it when you have a lot of data.

Here is how to work around the issue. Use this approach together with an increased sampleSize (in my case 100000):

from pyspark.sql.types import ArrayType, NullType, StringType, StructField, StructType


def fix_schema(schema: StructType) -> StructType:
    """Fix spark schema due to inconsistent MongoDB schema collection.

    It fixes such issues like:
        Cannot cast STRING into a NullType
        Cannot cast STRING into a StructType

    :param schema: a source schema taken from a Spark DataFrame to be fixed
    """
    if isinstance(schema, StructType):
        return StructType([fix_schema(field) for field in schema.fields])
    if isinstance(schema, ArrayType):
        return ArrayType(fix_schema(schema.elementType))
    if isinstance(schema, StructField) and is_struct_oid_obj(schema):
        return StructField(name=schema.name, dataType=StringType(), nullable=schema.nullable)
    elif isinstance(schema, StructField):
        return StructField(schema.name, fix_schema(schema.dataType), schema.nullable)
    if isinstance(schema, NullType):
        return StringType()
    return schema


def is_struct_oid_obj(struct_field: StructField) -> bool:
    """
    Checks that our schema has StructType field with single oid name inside

    :param struct_field: a StructField from Spark schema
    :return: bool
    """
    return (isinstance(struct_field.dataType, StructType)
            and len(struct_field.dataType.fields) == 1
            and struct_field.dataType.fields[0].name == "oid")
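

Below is a minimal usage sketch, not part of the original answer: it assumes the pre-10.x MongoDB Spark connector, where the data source is registered as "mongo" and the read options are named uri and sampleSize (newer connector versions use different names), and the connection URI is hypothetical. The idea is to let the connector infer a schema from a large sample first, then reload the collection with the corrected schema so that NullType and single-"oid" struct fields come back as plain strings.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical connection URI; replace with your own database/collection.
uri = "mongodb://localhost:27017/mydb.mycollection"

# First pass: let the connector infer a schema from a larger sample of documents.
raw_df = (spark.read
          .format("mongo")
          .option("uri", uri)
          .option("sampleSize", 100000)
          .load())

# Second pass: reload the collection with the fixed schema, so inconsistent
# fields (NullType, single-"oid" structs) are read as plain strings.
fixed_df = (spark.read
            .format("mongo")
            .option("uri", uri)
            .schema(fix_schema(raw_df.schema))
            .load())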