AtomGraph / LinkedDataHub

The Knowledge Graph notebook. Apache-2.0 license.

Programming Languages

XSLT, Java, JavaScript, Shell, CSS, HTML, Dockerfile

Projects that are alternatives of or similar to LinkedDataHub

Processor
Ontology-driven Linked Data processor and server for SPARQL backends. Apache License.
Stars: ✭ 54 (-64%)
Mutual labels:  linked-data, sparql, data-driven, rdf, declarative, semantic-web, knowledge-graph, ontology-driven-development, linked-data-templates
CSV2RDF
Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (-68%)
Mutual labels:  linked-data, sparql, rdf, semantic-web, knowledge-graph
OLGA
an Ontology SDK
Stars: ✭ 36 (-76%)
Mutual labels:  sparql, rdf, semantic-web, owl, knowledge-graph
LD-Connect
LD Connect is a Linked Data portal for IOS Press in collaboration with the STKO Lab at UC Santa Barbara.
Stars: ✭ 0 (-100%)
Mutual labels:  linked-data, sparql, rdf, semantic-web, knowledge-graph
semantic-python-overview
(subjective) overview of projects which are related both to python and semantic technologies (RDF, OWL, Reasoning, ...)
Stars: ✭ 406 (+170.67%)
Mutual labels:  sparql, rdf, semantic-web, owl, knowledge-graph
Semanticmediawiki
🔗 Semantic MediaWiki turns MediaWiki into a knowledge management platform with query and export capabilities
Stars: ✭ 359 (+139.33%)
Mutual labels:  linked-data, sparql, rdf, semantic-web, knowledge-graph
Rdf4j
Eclipse RDF4J: scalable RDF for Java
Stars: ✭ 242 (+61.33%)
Mutual labels:  linked-data, sparql, rdf, semantic-web
cognipy
In-memory Graph Database and Knowledge Graph with Natural Language Interface, compatible with Pandas
Stars: ✭ 31 (-79.33%)
Mutual labels:  sparql, rdf, owl, knowledge-graph
semagrow
A SPARQL query federator of heterogeneous data sources
Stars: ✭ 27 (-82%)
Mutual labels:  linked-data, sparql, rdf, triplestore
Hypergraphql
GraphQL interface for querying and serving linked data on the Web.
Stars: ✭ 112 (-25.33%)
Mutual labels:  linked-data, sparql, rdf, semantic-web
mobi
Mobi is a decentralized, federated, and distributed graph data platform for teams and communities to publish and discover data, data models, and analytics that are instantly consumable.
Stars: ✭ 41 (-72.67%)
Mutual labels:  sparql, rdf, semantic-web, owl
sparql-micro-service
SPARQL micro-services: A lightweight approach to query Web APIs with SPARQL
Stars: ✭ 22 (-85.33%)
Mutual labels:  linked-data, sparql, rdf, semantic-web
YALC
🕸 YALC: Yet Another LOD Cloud (registry of Linked Open Datasets).
Stars: ✭ 14 (-90.67%)
Mutual labels:  linked-data, rdf, semantic-web, linked-open-data
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+4%)
Mutual labels:  linked-data, sparql, rdf, knowledge-graph
ont-api
ONT-API (OWL-API over Apache Jena)
Stars: ✭ 20 (-86.67%)
Mutual labels:  sparql, rdf, semantic-web, owl
PheKnowLator
PheKnowLator: Heterogeneous Biomedical Knowledge Graphs and Benchmarks Constructed Under Alternative Semantic Models
Stars: ✭ 74 (-50.67%)
Mutual labels:  semantic-web, owl, knowledge-graph, linked-open-data
Web Client
Generic Linked Data browser and UX component framework. Apache license.
Stars: ✭ 105 (-30%)
Mutual labels:  linked-data, rdf, semantic-web, knowledge-graph
everything
The semantic desktop search engine
Stars: ✭ 22 (-85.33%)
Mutual labels:  sparql, rdf, semantic-web, knowledge-graph
viziquer
Tool for Search in Structured Semantic Data
Stars: ✭ 12 (-92%)
Mutual labels:  linked-data, sparql, rdf, triplestore

The Knowledge Graph notebook

LinkedDataHub (LDH) is open source software you can use to manage data, create visualizations and build apps on RDF Knowledge Graphs.

What's new in LinkedDataHub v3? Watch this video for a feature overview.

We started the project with the intention of using it for Linked Data publishing, but gradually realized that we had built a multi-purpose, data-driven platform.

We are building LinkedDataHub primarily for:

  • researchers who need an RDF-native notebook that can consume and collect Linked Data and SPARQL documents and that follows the FAIR principles
  • developers who are looking for a declarative, full-stack framework for Knowledge Graph application development, with an out-of-the-box UI and API

What makes LinkedDataHub unique is its completely data-driven architecture: applications and documents are defined as data, managed using a single generic HTTP API, and presented using declarative technologies. A default application structure and user interface are provided, but they can be completely overridden and customized. Unless custom server-side processing is required, no imperative code such as Java or JavaScript needs to be involved at all.
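
For illustration, here is a minimal sketch of reading a document through that generic HTTP API on a running instance; the certificate path is an assumption based on the setup steps below, and -k skips verification of the self-signed server certificate:

# Request the root document as Turtle, authenticating with the owner's
# WebID client certificate created by ./scripts/setup.sh (path is an assumption)
curl -k -E ssl/owner/cert.pem:$owner_cert_pwd \
  -H "Accept: text/turtle" \
  https://localhost:4443/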

Follow the Get started guide to LinkedDataHub. The setup and basic configuration sections below should get you up and running.

Setup

Prerequisites

  • Docker (with Docker Compose)

Steps

  1. Fork this repository and clone the fork into a folder
  2. In the folder, create a .env file and fill out the missing values (you can use .env_sample as a template). For example:
    COMPOSE_CONVERT_WINDOWS_PATHS=1
    COMPOSE_PROJECT_NAME=linkeddatahub
    
    PROTOCOL=https
    PROXY_HTTP_PORT=81
    PROXY_HTTPS_PORT=4443
    HOST=localhost
    ABS_PATH=/
    
    OWNER_MAIL=john@doe.com
    OWNER_GIVEN_NAME=John
    OWNER_FAMILY_NAME=Doe
    OWNER_ORG_UNIT=My unit
    OWNER_ORGANIZATION=My org
    OWNER_LOCALITY=Copenhagen
    OWNER_STATE_OR_PROVINCE=Denmark
    OWNER_COUNTRY_NAME=DK
    
  3. Set up SSL certificates/keys by running this from the command line (replace $owner_cert_pwd and $secretary_cert_pwd with your own passwords):
    ./scripts/setup.sh .env ssl $owner_cert_pwd $secretary_cert_pwd 3650
    
    The script will create an ssl sub-folder where the SSL certificates and/or public keys will be placed.
  4. Launch the application services by running this from the command line:
    docker-compose up --build
    
    It will build LinkedDataHub's Docker image, start its container, and mount the following sub-folders:
    • data, where the triplestore(s) will persist RDF data
    • uploads, where LDH stores content-hashed file uploads

    The first startup should take around half a minute as the datasets are loaded into the triplestores. After a successful startup, the last line of the Docker log should read something like:
    linkeddatahub_1     | 09-Feb-2021 14:18:10.536 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [32609] milliseconds
    
  5. Install ssl/owner/keystore.p12 into a web browser of your choice (password is the $owner_cert_pwd value supplied to setup.sh)
    • Google Chrome: Settings > Advanced > Manage Certificates > Import...
    • Mozilla Firefox: Options > Privacy > Security > View Certificates... > Import...
    • Apple Safari: The file is installed directly into the operating system. Open the file and import it using the Keychain Access tool.
    • Microsoft Edge: Does not support certificate management, so you need to install the file into Windows instead.
  6. Open https://localhost:4443/ in that web browser
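
Optionally, you can check from the command line that the server is responding; a quick sketch (-k skips verification of the self-signed certificate):

curl -k -I https://localhost:4443/

Any HTTP status line in the response means the server is up, although access to content may still require the certificate imported in step 5.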

Notes

  • You will likely get a browser warning such as Your connection is not private in Chrome or Warning: Potential Security Risk Ahead in Firefox due to the self-signed server certificate. Ignore it: click Advanced and Proceed or Accept the risk to proceed.
    • If this option does not appear in Chrome (as observed on some macOS systems), you can open chrome://flags/#allow-insecure-localhost, switch Allow invalid certificates for resources loaded from localhost to Enabled and restart Chrome
  • The .env_sample and .env files might be invisible in the macOS Finder, which hides filenames starting with a dot. You can still create them using the Terminal, however.
  • On Linux your user may need to be a member of the docker group. Add it using
sudo usermod -aG docker ${USER}

and re-login with your user. An alternative, but not recommended, is to run

sudo docker-compose up

Configuration

Base URI

A common case is changing the base URI from the default https://localhost:4443/ to your own.

Let's use https://ec2-54-235-229-141.compute-1.amazonaws.com/linkeddatahub/ as an example. We need to split the URI into components and set them in the .env file using the following parameters:

PROTOCOL=https
PROXY_HTTP_PORT=80
PROXY_HTTPS_PORT=443
HOST=ec2-54-235-229-141.compute-1.amazonaws.com
ABS_PATH=/linkeddatahub/

ABS_PATH is required, even if it's just /.

Dataspaces

Dataspaces are configured in config/system-varnish.trig. Relative URIs will be resolved against the base URI configured in the .env file.

⚠️ Do not use blank nodes to identify applications or services. We recommend using the urn: URI scheme, since LinkedDataHub application resources are not accessible under their own dataspace.
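
For illustration only, a dataspace entry with a urn: application identifier could look roughly like the sketch below; the prefixes and property names here are assumptions for the example, so refer to the bundled config/system-varnish.trig for the actual vocabulary:

@prefix lapp: <https://w3id.org/atomgraph/linkeddatahub/apps#> .
@prefix ldt:  <https://www.w3.org/ns/ldt#> .

# the application and its service are identified by urn: URIs, not blank nodes
<urn:my:app> a lapp:Application ;
    ldt:base <https://localhost:4443/> ;
    lapp:service <urn:my:service> .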

Environment

LinkedDataHub supports a range of configuration options that can be passed as environment parameters in docker-compose.yml. The most common ones are:

  • CATALINA_OPTS: Tomcat's command line options
  • SELF_SIGNED_CERT: true if the server certificate is self-signed
  • SIGN_UP_CERT_VALIDITY: validity of the WebID certificates of signed-up users (not the owner's)
  • IMPORT_KEEPALIVE: the period for which a data import can keep an HTTP connection open before it times out, in ms. The larger the imported files, the longer it has to be for the import to complete.
  • MAX_CONTENT_LENGTH: maximum allowed size of the request body, in bytes
  • MAIL_SMTP_HOST: hostname of the mail server
  • MAIL_SMTP_PORT: port number of the mail server
  • GOOGLE_CLIENT_ID: OAuth 2.0 client ID from Google. When provided, it enables the Login with Google authentication method.
  • GOOGLE_CLIENT_SECRET: client secret from Google
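
For example, here is a sketch of setting a few of these in docker-compose.yml; the service name and the values are placeholders to adapt to your own file:

services:
  linkeddatahub:
    environment:
      - SELF_SIGNED_CERT=true
      - MAX_CONTENT_LENGTH=2097152
      - MAIL_SMTP_HOST=mail.example.org
      - MAIL_SMTP_PORT=25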

Reset

If you need to start fresh and wipe the existing setup (e.g. after configuring a new base URI), you can do that using

sudo rm -rf data uploads && docker-compose down -v

⚠️ This will remove the persisted data and files as well as Docker volumes.

Documentation

Command line interface

LinkedDataHub CLI wraps the HTTP API into a set of shell scripts with convenient parameters. The scripts can be used for testing, automation, scheduled execution and the like. Performing actions with the CLI is usually much quicker than through the user interface, and the results are easier to reproduce.

The scripts can be found in the scripts subfolder.

⚠️ The CLI scripts internally use Jena's CLI commands. Set up the Jena environment before running the scripts.

An environment variable JENA_HOME is used by all the command line tools to configure the class path automatically for you. You can set this up as follows:

On Linux / Mac

export JENA_HOME="/path/to/apache-jena"   # the directory you downloaded Jena to
export PATH="$PATH:$JENA_HOME/bin"
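
You can verify that the Jena tools are on the PATH by running one of them, for example:

riot --version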

Get the source code

⚠️ Before running app installation scripts that use LinkedDataHub's CLI scripts, set the SCRIPT_ROOT environment variable to the scripts subfolder of your LinkedDataHub fork or clone. For example:

export SCRIPT_ROOT="/c/Users/namedgraph/WebRoot/AtomGraph/LinkedDataHub/scripts"
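
As an illustration, an app installation script would then invoke the CLI scripts along these lines (the script name and flags below are hypothetical; check the scripts subfolder for the actual names and parameters):

# hypothetical invocation: the script name and flags are illustrative only
"$SCRIPT_ROOT/create-document.sh" \
  -b https://localhost:4443/ \
  -f ssl/owner/cert.pem \
  -p "$owner_cert_pwd"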

How to get involved

Test suite

LinkedDataHub includes an HTTP test suite. The server implementation is also covered by the Processor test suite.

Support

Please report issues if you've encountered a bug or have a feature request.

Commercial consulting, development, and support are available from AtomGraph.
