apache / Hadoop MapReduce
License: apache-2.0
Mirror of Apache Hadoop MapReduce
Stars: ✭ 88
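Since this mirror hosts the MapReduce framework itself, the classic map → shuffle → reduce flow it distributes across a cluster can be illustrated with a plain-Java word-count sketch. This is a standalone illustration with no Hadoop dependencies; `WordCountSketch` and `mapAndShuffle` are hypothetical names for this example, not Hadoop APIs.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch of the MapReduce word-count flow in plain Java.
// Hadoop would run the map step on many nodes and shuffle intermediate
// (word, 1) pairs to reducers; here everything happens in one process.
public class WordCountSketch {

    // Map phase: emit (word, 1) for each token; shuffle/reduce phase:
    // group pairs by key and sum the 1s (done inline via merge).
    static Map<String, Integer> mapAndShuffle(List<String> lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum); // reduce: sum per key
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("hello hadoop", "hello mapreduce");
        System.out.println(mapAndShuffle(input)); // {hadoop=1, hello=2, mapreduce=1}
    }
}
```

In real Hadoop, the map and reduce steps live in separate `Mapper` and `Reducer` classes and the framework handles the shuffle between them; this sketch only shows the data flow.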
Programming Languages
java
68154 projects - #9 most used programming language
Labels
Projects that are alternatives to or similar to Hadoop MapReduce
Docker Spark Cluster
A Spark cluster setup running on Docker containers
Stars: ✭ 57 (-35.23%)
Mutual labels: hadoop
Src
A lightweight distributed stream-computing framework for Go
Stars: ✭ 67 (-23.86%)
Mutual labels: hadoop
Moosefs
MooseFS – Open Source, Petabyte, Fault-Tolerant, Highly Performing, Scalable Network Distributed File System (Software-Defined Storage)
Stars: ✭ 1,025 (+1064.77%)
Mutual labels: hadoop
Docker Hadoop Cluster
A multi-node Hadoop cluster on Docker for self-development.
Stars: ✭ 82 (-6.82%)
Mutual labels: hadoop
Likelike
An implementation of locality sensitive hashing with Hadoop
Stars: ✭ 58 (-34.09%)
Mutual labels: hadoop
Jumbune
Jumbune, an open source BigData APM and Data Quality Management platform for data clouds. An enterprise feature offering is available at http://jumbune.com. More details of the open source offering are at,
Stars: ✭ 64 (-27.27%)
Mutual labels: hadoop
Dataspherestudio
DataSphereStudio is a one-stop data application development & management portal, covering scenarios including data exchange, desensitization/cleansing, analysis/mining, quality measurement, visualization, and task scheduling.
Stars: ✭ 1,195 (+1257.95%)
Mutual labels: hadoop
Docker Hadoop
A Docker container with a full Hadoop cluster setup with Spark and Zeppelin
Stars: ✭ 54 (-38.64%)
Mutual labels: hadoop
Apache Spark Hands On
Educational notes and hands-on problems with solutions for the Hadoop ecosystem
Stars: ✭ 74 (-15.91%)
Mutual labels: hadoop
Tf Yarn
Train TensorFlow models on YARN in just a few lines of code!
Stars: ✭ 76 (-13.64%)
Mutual labels: hadoop
This project does not contain a README.
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].