shinichi-takii / Ddlparse

License: BSD-3-Clause
Parse DDL and convert it to BigQuery JSON schema and DDL statements

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Ddlparse

Sqlfiddle3
New version based on vert.x and docker
Stars: ✭ 242 (+365.38%)
Mutual labels:  oracle, sql, mysql, postgresql
Smartsql
SmartSql = MyBatis in C# + .NET Core + Cache (Memory | Redis) + R/W Splitting + PropertyChangedTrack + Dynamic Repository + InvokeSync + Diagnostics
Stars: ✭ 775 (+1390.38%)
Mutual labels:  oracle, sql, mysql, postgresql
E Commerce Db
Database schema for e-commerce (webstores) sites.
Stars: ✭ 245 (+371.15%)
Mutual labels:  oracle, sql, mysql, postgresql
Rom Sql
SQL support for rom-rb
Stars: ✭ 169 (+225%)
Mutual labels:  oracle, sql, mysql, postgresql
Eosio sql plugin
EOSIO sql database plugin
Stars: ✭ 21 (-59.62%)
Mutual labels:  oracle, sql, mysql, postgresql
Linq2db
Linq to database provider.
Stars: ✭ 2,211 (+4151.92%)
Mutual labels:  oracle, sql, mysql, postgresql
Jsqlparser
JSqlParser parses an SQL statement and translates it into a hierarchy of Java classes. The generated hierarchy can be navigated using the Visitor pattern.
Stars: ✭ 3,405 (+6448.08%)
Mutual labels:  oracle, sql, mysql, postgresql
Xo
Command line tool to generate idiomatic Go code for SQL databases supporting PostgreSQL, MySQL, SQLite, Oracle, and Microsoft SQL Server
Stars: ✭ 2,974 (+5619.23%)
Mutual labels:  oracle, sql, mysql, postgresql
Franchise
🍟 a notebook SQL client. What you get when you have a lot of sequels.
Stars: ✭ 3,823 (+7251.92%)
Mutual labels:  sql, bigquery, mysql, postgresql
Dbeaver
Free universal database tool and SQL client
Stars: ✭ 23,752 (+45576.92%)
Mutual labels:  oracle, sql, mysql, postgresql
Sqlcheck
Automatically identify anti-patterns in SQL queries
Stars: ✭ 2,062 (+3865.38%)
Mutual labels:  oracle, sql, mysql, postgresql
Openrecord
Make ORMs great again!
Stars: ✭ 474 (+811.54%)
Mutual labels:  oracle, sql, mysql, postgresql
Kangaroo
SQL client and admin tool for popular databases
Stars: ✭ 127 (+144.23%)
Mutual labels:  oracle, sql, mysql, postgresql
Fluentmigrator
Fluent migrations framework for .NET
Stars: ✭ 2,636 (+4969.23%)
Mutual labels:  oracle, sql, mysql, redshift
Yuniql
Free and open source schema versioning and database migration made natively with .NET Core.
Stars: ✭ 156 (+200%)
Mutual labels:  sql, mysql, postgresql, redshift
Redash
Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data.
Stars: ✭ 20,147 (+38644.23%)
Mutual labels:  bigquery, mysql, postgresql, redshift
Jooq
jOOQ is the best way to write SQL in Java
Stars: ✭ 4,695 (+8928.85%)
Mutual labels:  oracle, sql, mysql, postgresql
Tbls
tbls is a CI-friendly tool for documenting a database, written in Go.
Stars: ✭ 940 (+1707.69%)
Mutual labels:  bigquery, mysql, postgresql, redshift
Pmacct
pmacct is a small set of multi-purpose passive network monitoring tools [NetFlow IPFIX sFlow libpcap BGP BMP RPKI IGP Streaming Telemetry].
Stars: ✭ 677 (+1201.92%)
Mutual labels:  sql, mysql, postgresql
Blog
Everything about databases and business (mostly for PostgreSQL).
Stars: ✭ 6,330 (+12073.08%)
Mutual labels:  oracle, mysql, postgresql

DDL Parse

[Badges: PyPI version, Python version, Travis CI build status, Coveralls coverage, codecov coverage, Requirements status, License]

A Python module that parses DDL and converts it to a BigQuery JSON schema and BigQuery DDL statements.


Features

  • Parses DDL and extracts table schema information.
  • Currently, only the CREATE TABLE statement is supported.
  • Converts to a BigQuery JSON schema and BigQuery DDL statements (a minimal sketch follows this list).
  • Supported source databases: MySQL/MariaDB, PostgreSQL, Oracle, and Redshift.
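
As a quick end-to-end illustration (a minimal sketch; the DDL string below is illustrative and not taken from the project, see the full example under Usage):

from ddlparse import DdlParse

ddl = """
CREATE TABLE Sample (
  Id integer PRIMARY KEY,
  Name varchar(100) NOT NULL
);
"""

# Parse the CREATE TABLE statement and print the BigQuery JSON schema
# (a string containing the list of field definitions).
table = DdlParse().parse(ddl)
print(table.to_bigquery_fields())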

Requirement

  1. Python >= 3.5
  2. pyparsing

Installation

Install

pip install:

$ pip install ddlparse

install from source:

$ python setup.py install

Update

pip update:

$ pip install ddlparse --upgrade

Usage

Example

import json

from ddlparse import DdlParse

sample_ddl = """
CREATE TABLE My_Schema.Sample_Table (
  Id integer PRIMARY KEY COMMENT 'User ID',
  Name varchar(100) NOT NULL COMMENT 'User name',
  Total bigint NOT NULL,
  Avg decimal(5,1) NOT NULL,
  Point int(10) unsigned,
  Zerofill_Id integer unsigned zerofill NOT NULL,
  Created_At date, -- Oracle 'DATE' -> BigQuery 'DATETIME'
  UNIQUE (NAME)
);
"""


# parse pattern (1-1)
table = DdlParse().parse(sample_ddl)

# parse pattern (1-2) : Specify source database
table = DdlParse().parse(ddl=sample_ddl, source_database=DdlParse.DATABASE.oracle)


# parse pattern (2-1)
parser = DdlParse(sample_ddl)
table = parser.parse()

print("* BigQuery Fields * : normal")
print(table.to_bigquery_fields())


# parse pattern (2-2) : Specify source database
parser = DdlParse(ddl=sample_ddl, source_database=DdlParse.DATABASE.oracle)
table = parser.parse()


# parse pattern (3-1)
parser = DdlParse()
parser.ddl = sample_ddl
table = parser.parse()

# parse pattern (3-2) : Specify source database
parser = DdlParse()
parser.source_database = DdlParse.DATABASE.oracle
parser.ddl = sample_ddl
table = parser.parse()

print("* BigQuery Fields * : Oracle")
print(table.to_bigquery_fields())


print("* TABLE *")
print("schema = {} : name = {} : is_temp = {}".format(table.schema, table.name, table.is_temp))

print("* BigQuery Fields *")
print(table.to_bigquery_fields())

print("* BigQuery Fields - column name to lower case / upper case *")
print(table.to_bigquery_fields(DdlParse.NAME_CASE.lower))
print(table.to_bigquery_fields(DdlParse.NAME_CASE.upper))

print("* COLUMN *")
for col in table.columns.values():
    col_info = {}

    col_info["name"]                  = col.name
    col_info["data_type"]             = col.data_type
    col_info["length"]                = col.length
    col_info["precision(=length)"]    = col.precision
    col_info["scale"]                 = col.scale
    col_info["is_unsigned"]           = col.is_unsigned
    col_info["is_zerofill"]           = col.is_zerofill
    col_info["constraint"]            = col.constraint
    col_info["not_null"]              = col.not_null
    col_info["PK"]                    = col.primary_key
    col_info["unique"]                = col.unique
    col_info["auto_increment"]        = col.auto_increment
    col_info["distkey"]               = col.distkey
    col_info["sortkey"]               = col.sortkey
    col_info["encode"]                = col.encode
    col_info["default"]               = col.default
    col_info["character_set"]         = col.character_set
    col_info["bq_legacy_data_type"]   = col.bigquery_legacy_data_type
    col_info["bq_standard_data_type"] = col.bigquery_standard_data_type
    col_info["comment"]               = col.comment
    col_info["description(=comment)"] = col.description
    col_info["bigquery_field"]        = json.loads(col.to_bigquery_field())

    print(json.dumps(col_info, indent=2, ensure_ascii=False))

print("* DDL (CREATE TABLE) statements *")
print(table.to_bigquery_ddl())

print("* DDL (CREATE TABLE) statements - dataset name, table name and column name to lower case / upper case *")
print(table.to_bigquery_ddl(DdlParse.NAME_CASE.lower))
print(table.to_bigquery_ddl(DdlParse.NAME_CASE.upper))

print("* Get Column object (case insensitive) *")
print(table.columns["total"])
print(table.columns["total"].data_type)
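
If you want to persist the generated schema and DDL for use with BigQuery tooling, the strings returned above can be written straight to files (a minimal sketch; the file names are arbitrary choices for this example, not part of the library):

# to_bigquery_fields() and to_bigquery_ddl() return plain strings, so the
# results can simply be saved to disk for downstream use.
with open("schema.json", "w", encoding="utf-8") as f:
    f.write(table.to_bigquery_fields())

with open("bigquery_create_table.sql", "w", encoding="utf-8") as f:
    f.write(table.to_bigquery_ddl())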

License

BSD 3-Clause License

Author

Shinichi Takii [email protected]

Links

Special Thanks
