YBIGTA / EngineeringTeam

License: other
A repository for organizing the YBIGTA Engineering Team's materials.

Projects that are alternatives of or similar to EngineeringTeam

Haproxy Configs
80+ HAProxy Configs for Hadoop, Big Data, NoSQL, Docker, Elasticsearch, SolrCloud, HBase, MySQL, PostgreSQL, Apache Drill, Hive, Presto, Impala, Hue, ZooKeeper, SSH, RabbitMQ, Redis, Riak, Cloudera, OpenTSDB, InfluxDB, Prometheus, Kibana, Graphite, Rancher etc.
Stars: ✭ 106 (+158.54%)
Mutual labels:  hive, hadoop, nosql
dockerfiles
Multi docker container images for main Big Data Tools. (Hadoop, Spark, Kafka, HBase, Cassandra, Zookeeper, Zeppelin, Drill, Flink, Hive, Hue, Mesos, ... )
Stars: ✭ 29 (-29.27%)
Mutual labels:  hive, hadoop
cloud
Cloud computing: environment setup and configuration files for Hadoop, Hive, Hue, Oozie, Sqoop, HBase, and ZooKeeper
Stars: ✭ 48 (+17.07%)
Mutual labels:  hive, hadoop
hadoopoffice
HadoopOffice - Analyze Office documents using the Hadoop ecosystem (Spark/Flink/Hive)
Stars: ✭ 56 (+36.59%)
Mutual labels:  hive, hadoop
hive to es
A small tool for syncing data from a Hive data warehouse to Elasticsearch
Stars: ✭ 21 (-48.78%)
Mutual labels:  hive, hadoop
hive-bigquery-storage-handler
Hive Storage Handler for interoperability between BigQuery and Apache Hive
Stars: ✭ 16 (-60.98%)
Mutual labels:  hive, hadoop
xxhadoop
Data Analysis Using Hadoop/Spark/Storm/ElasticSearch/MachineLearning etc. This is My Daily Notes/Code/Demo. Don't fork, Just star !
Stars: ✭ 37 (-9.76%)
Mutual labels:  hive, hadoop
Helicalinsight
Helical Insight software is world’s first Open Source Business Intelligence framework which helps you to make sense out of your data and make well informed decisions.
Stars: ✭ 214 (+421.95%)
Mutual labels:  hive, nosql
liquibase-impala
Liquibase extension to add Impala Database support
Stars: ✭ 23 (-43.9%)
Mutual labels:  hive, hadoop
hive-jdbc-driver
An alternative to the "hive standalone" jar for connecting Java applications to Apache Hive via JDBC
Stars: ✭ 31 (-24.39%)
Mutual labels:  hive, hadoop
aaocp
A big data project for analyzing user behavior logs
Stars: ✭ 53 (+29.27%)
Mutual labels:  hive, hadoop
smart-data-lake
Smart Automation Tool for building modern Data Lakes and Data Pipelines
Stars: ✭ 79 (+92.68%)
Mutual labels:  hive, hadoop
bigdata-doc
Big data study notes, learning roadmaps, and curated technical case studies.
Stars: ✭ 37 (-9.76%)
Mutual labels:  hive, hadoop
the-apache-ignite-book
All code samples, scripts and more in-depth examples for The Apache Ignite Book. Include Apache Ignite 2.6 or above
Stars: ✭ 65 (+58.54%)
Mutual labels:  hive, hadoop
dpkb
A compilation of big data topics, covering distributed storage engines, distributed compute engines, data warehouse construction, and more. Keywords: Hadoop, HBase, ES, Kudu, Hive, Presto, Spark, Flink, Kylin, ClickHouse
Stars: ✭ 123 (+200%)
Mutual labels:  hive, hadoop
BigInsights-on-Apache-Hadoop
Example projects for 'BigInsights for Apache Hadoop' on IBM Bluemix
Stars: ✭ 21 (-48.78%)
Mutual labels:  hive, hadoop
web-click-flow
Offline log analysis of website clickstreams
Stars: ✭ 14 (-65.85%)
Mutual labels:  hive, hadoop
Hive Jdbc Uber Jar
Hive JDBC "uber" or "standalone" jar based on the latest Apache Hive version
Stars: ✭ 188 (+358.54%)
Mutual labels:  hive, hadoop
Facebook Hive Udfs
Facebook's Hive UDFs
Stars: ✭ 213 (+419.51%)
Mutual labels:  hive, hadoop
hadoop-etl-udfs
The Hadoop ETL UDFs are the main way to load data from Hadoop into EXASOL
Stars: ✭ 17 (-58.54%)
Mutual labels:  hive, hadoop

YBIGTA Engineering Team

A repository for organizing what the YBIGTA Engineering Team has studied.

All materials are uploaded to Google Drive.



Introduction

The YBIGTA Engineering Team studies distributed processing frameworks such as Spark, Hadoop, Hive, and Kafka, along with data engineering fundamentals such as SQL, NoSQL, and crawling.



Operations

The Engineering Team is run through Training Sessions, Teaching Sessions, and Team Study. For each session and study group, the team's organizers set the curriculum and select lecturers, drawn mainly from existing cohorts.

  • Training Session

    • Study distributed processing frameworks according to a set curriculum.
    • Training centers on lectures about Spark and Hadoop, supplemented by study of other distributed processing frameworks such as Kafka and Hive.
  • Teaching Session

    • After completing the Training Session, members lecture the junior cohorts themselves, filling in gaps in their own knowledge and studying the material in greater depth.
  • Team Study

    • Study groups cover the fundamentals of data engineering, such as SQL, NoSQL, and crawling.
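As a small taste of the SQL fundamentals such a study group covers, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module (the table name, columns, and data are illustrative inventions, not part of the team's actual curriculum):

```python
import sqlite3

# In-memory database; the "sessions" table and its rows are made up for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sessions (topic TEXT, cohort INTEGER)")
cur.executemany(
    "INSERT INTO sessions VALUES (?, ?)",
    [("Spark", 1), ("Hadoop", 1), ("Kafka", 2)],
)

# A basic aggregation: count how many topics each cohort studied.
rows = cur.execute(
    "SELECT cohort, COUNT(*) FROM sessions GROUP BY cohort ORDER BY cohort"
).fetchall()
print(rows)  # [(1, 2), (2, 1)]
conn.close()
```

Using an in-memory SQLite database keeps the example self-contained: no server setup is needed, so the focus stays on the SQL itself.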

Organizers
