saurfang / Sparksql Protobuf
Licence: apache-2.0
Read SparkSQL parquet file as RDD[Protobuf]
Stars: ✭ 82
Programming Languages
scala
5932 projects
Projects that are alternatives of or similar to Sparksql Protobuf
Gcs Tools
GCS support for avro-tools, parquet-tools and protobuf
Stars: ✭ 57 (-30.49%)
Mutual labels: protobuf, parquet
Ratatool
A tool for data sampling, data generation, and data diffing
Stars: ✭ 279 (+240.24%)
Mutual labels: protobuf, parquet
Sol2proto
Ethereum contract ABI to gRPC protobuf IDL transpiler
Stars: ✭ 41 (-50%)
Mutual labels: protobuf
Grpcc
A gRPC cli interface for easy testing against gRPC servers
Stars: ✭ 1,078 (+1214.63%)
Mutual labels: protobuf
Protocol Buffers Language Server
[WIP] Protocol Buffers Language Server
Stars: ✭ 44 (-46.34%)
Mutual labels: protobuf
Rumble
⛈️ Rumble 1.11.0 "Banyan Tree"🌳 for Apache Spark | Run queries on your large-scale, messy JSON-like data (JSON, text, CSV, Parquet, ROOT, AVRO, SVM...) | No install required (just a jar to download) | Declarative Machine Learning and more
Stars: ✭ 58 (-29.27%)
Mutual labels: parquet
Grpc Contract
A tool to generate the grpc server code for a contract
Stars: ✭ 40 (-51.22%)
Mutual labels: protobuf
Easyrpc
EasyRpc is a simple, high-performance, easy-to-use RPC framework based on Netty, ZooKeeper and ProtoStuff.
Stars: ✭ 79 (-3.66%)
Mutual labels: protobuf
Proto Extractor
Program to extract protobufs compiled for C#
Stars: ✭ 49 (-40.24%)
Mutual labels: protobuf
Mortgageblockchainfabric
Mortgage Processing App using Hyperledger Fabric Blockchain. Uses channels for privacy and access, and restricts read/write previleges through endorsement policies
Stars: ✭ 45 (-45.12%)
Mutual labels: protobuf
Node Google Play Cli
command line tools using the node-google-play library
Stars: ✭ 58 (-29.27%)
Mutual labels: protobuf
Lua Protobuf
A Lua module to work with Google protobuf
Stars: ✭ 1,002 (+1121.95%)
Mutual labels: protobuf
Curaengine
Powerful, fast and robust engine for converting 3D models into g-code instructions for 3D printers. It is part of the larger open source project Cura.
Stars: ✭ 1,207 (+1371.95%)
Mutual labels: protobuf
sparksql-protobuf
This library provides utilities to work with Protobuf objects in SparkSQL. It provides a way to read Parquet files written by SparkSQL back as an RDD of compatible protobuf objects. It can also convert an RDD of protobuf objects into a DataFrame.
For sbt 0.13.6+
resolvers += Resolver.jcenterRepo
libraryDependencies ++= Seq(
"com.github.saurfang" %% "sparksql-protobuf" % "0.1.3",
"org.apache.parquet" % "parquet-protobuf" % "1.8.3"
)
Motivation
SparkSQL is very powerful and easy to use. However, it has a few limitations: because the schema is only detected at runtime, developers are a lot less confident that they will get things right the first time. Static typing helps a lot! This is where protobuf comes in:
- Protobuf defines nested data structures easily
- It doesn't constrain you to the 22-field limit of case classes (no longer true once we upgrade to Scala 2.11+)
- It is language agnostic and generates code that gives you native objects, hence you get all the benefits of type checking and code completion, unlike operating on Row in Spark/SparkSQL
Features
Read Parquet file as RDD[Protobuf]
val personsPB = new ProtoParquetRDD(sc, "persons.parquet", classOf[Person])
where we need a SparkContext, the parquet path, and the protobuf class.
This converts the existing workflow:
- Ingest raw data as DataFrame with nested data structure
- Create awkward runtime type-checking UDFs
- Transform the raw DataFrame using the above UDFs into a tabular DataFrame for data analytics
to
- Ingest raw data as DataFrame with nested data structure and persist as a Parquet file
- Read the Parquet file back as RDD[Protobuf]
- Perform any data transformation and extraction by working with compile-time typesafe Protobuf getters
- Create a DataFrame out of the above transformation and perform additional downstream data analytics on the tabular DataFrame
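As a sketch, the converted workflow might look like the following. Here `Person` and its `getAge` getter stand in for a hypothetical protobuf-generated class, the `ProtoParquetRDD` import path is assumed from the repository layout, and `sc`/`sqlContext` are an existing SparkContext and SQLContext:

```scala
import com.github.saurfang.parquet.proto.spark.ProtoParquetRDD
import com.github.saurfang.parquet.proto.spark.sql._

// 1. Read the Parquet file written by SparkSQL back as an RDD[Person]
val persons = new ProtoParquetRDD(sc, "persons.parquet", classOf[Person])

// 2. Transform with typesafe protobuf getters instead of untyped Row lookups
val adults = persons.filter(_.getAge >= 18)

// 3. Convert the transformed RDD into a DataFrame for tabular analytics
val adultsDF = sqlContext.createDataFrame(adults)
```

The payoff of step 2 is that a typo like `_.getAeg` fails at compile time rather than at job runtime.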
Infer SparkSQL Schema from Protobuf Definition
val personSchema = ProtoReflection.schemaFor[Person].dataType.asInstanceOf[StructType]
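A minimal sketch of using the inferred schema, again assuming a hypothetical generated `Person` class and that `ProtoReflection` lives under the library's `sql` package:

```scala
import org.apache.spark.sql.types.StructType
import com.github.saurfang.parquet.proto.spark.sql.ProtoReflection

// Infer the SparkSQL schema from the protobuf message class
val personSchema = ProtoReflection.schemaFor[Person].dataType.asInstanceOf[StructType]

// Inspect it: nested protobuf messages map to nested StructTypes
personSchema.printTreeString()
```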
Convert RDD[Protobuf] to DataFrame
import com.github.saurfang.parquet.proto.spark.sql._
val personsDF = sqlContext.createDataFrame(protoPersons)
For more information, please see test cases.
Under the hood
- ProtoMessageConverter has been improved to read from the LIST specification according to the latest parquet documentation. This implementation should be backwards compatible and is able to read repeated fields generated by writers like SparkSQL.
- ProtoMessageParquetInputFormat helps the above process by correctly returning the built protobuf object as the value.
- ProtoParquetRDD abstracts away the Hadoop input format and returns an RDD of your protobuf objects from parquet files directly.
- ProtoReflection infers a SparkSQL schema from any Protobuf message class.
- ProtoRDDConversions converts Protobuf objects into SparkSQL rows.
Related Work
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].