Security Manager
[TBD] ...
Ingestion Manager
[TBD] ...
Storage Manager
[TBD] ...
Core microservices
Content: ...
Semantic Repository
The service exposes the following basic endpoints:

/kb/v1/ontologies : adds a new ontology via HTTP POST with parameters. By convention, the user should assign a prefix / context pair (which will be unique in the underlying repository catalog).
/kb/v1/ontologies/remove : removes an existing ontology, identified by the context where it was published.
/kb/v1/contexts : returns the list of existing contexts; by convention, ontologies are published on assigned contexts, which are different from the ones used for data.
/kb/v1/prefixes : returns the full list of used prefix / namespace pairs, where the namespace usually coincides with an assigned context in the underlying repository.
/kb/v1/prefixes/lookup : retrieves the namespace associated with a given prefix.
/kb/v1/prefixes/reverse : retrieves the prefix associated with a given namespace.
/kb/v1/triples : returns the number of available triples.
/kb/v1/triples/{prefix} : returns the triple count for a given context.

Detailed information about the service can be found here ...
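Since these are plain HTTP endpoints, a client mostly needs to assemble URLs. A minimal Python sketch, assuming a local instance at localhost:9000 and a `prefix` query-parameter name (both are assumptions, not confirmed by this page):

```python
from urllib.parse import urljoin, urlencode

# Hypothetical base URL of a local Semantic Repository instance.
KB_BASE = "http://localhost:9000/kb/v1/"

def lookup_url(prefix):
    """Build the /kb/v1/prefixes/lookup URL that resolves a prefix to its namespace."""
    return urljoin(KB_BASE, "prefixes/lookup") + "?" + urlencode({"prefix": prefix})

def triples_count_url(prefix=None):
    """Build the URL for the total triple count, or for one context's count."""
    path = "triples" if prefix is None else f"triples/{prefix}"
    return urljoin(KB_BASE, path)
```

The lookup and reverse endpoints are symmetric, so a reverse helper would look the same with a namespace parameter instead of a prefix.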
Semantic Manager
There are two endpoints:

/kb/v1/ontologies : returns a list of the available ontologies.
/kb/v1/ontologies/properties/find : enables searching, by terms and language, for properties that can be used for a simple annotation of dataset fields in the ingestion form (and later for standardization).

Detailed information about the service can be found here ...
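The find endpoint takes search terms and a language. A sketch of how a client might assemble the query, where the host, port, and the parameter names `term` and `lang` are all assumptions for illustration:

```python
from urllib.parse import urlencode

# Hypothetical base URL of a local Semantic Manager instance.
KB_BASE = "http://localhost:9000/kb/v1"

def find_properties_url(term, lang="it"):
    """Build the /kb/v1/ontologies/properties/find query URL.

    Parameter names are assumed; in practice the URL would be fetched
    with an HTTP GET (e.g. urllib.request.urlopen).
    """
    query = urlencode({"term": term, "lang": lang})
    return f"{KB_BASE}/ontologies/properties/find?{query}"
```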
OntoNetHub
The following endpoints are available:

/stanbol/ontonethub/ontology : adds a new ontology via a POST request.
/stanbol/jobs/{job_id} : returns information about the status of a job associated with the upload of an ontology.
/stanbol/ontonethub/ontology/{ontology_id} : with a GET request, returns the information about the specific ontology.
/stanbol/ontonethub/ontology/{ontology_id} : with a DELETE request, deletes an existing ontology.
/stanbol/ontonethub/ontology/{ontology_id}/source : with a GET request, returns a representation of the ontology in JSON-LD.
/stanbol/ontonethub/ontologies/find : queries the OntoNetHub and retrieves OWL entities from the ontologies managed by it.

Detailed information about the service can be found here ...
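Note that the same ontology URL serves both GET (read) and DELETE (remove), and that an upload returns a job to poll. A small URL-building sketch, assuming a local Stanbol instance at localhost:8000 (host and port are guesses):

```python
# Hypothetical base URL of a local OntoNetHub/Stanbol instance.
STANBOL_BASE = "http://localhost:8000/stanbol"

def ontology_url(ontology_id):
    """GET reads the ontology's information; DELETE removes it."""
    return f"{STANBOL_BASE}/ontonethub/ontology/{ontology_id}"

def ontology_source_url(ontology_id):
    """GET returns the ontology's JSON-LD representation."""
    return ontology_url(ontology_id) + "/source"

def job_status_url(job_id):
    """Poll this after POSTing a new ontology to track the upload job."""
    return f"{STANBOL_BASE}/jobs/{job_id}"
```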
Semantic Validator
There are two endpoints:

/validator/validate : validates a document.
/validator/validators : returns the list of available validators.

Detailed information about the service can be found here ...
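A client would POST the document to be checked to /validator/validate. A hedged sketch building such a request with the standard library, where the base URL and the JSON payload shape (`document` and `validator` keys) are assumptions, not the service's documented contract:

```python
import json
from urllib.request import Request

# Hypothetical base URL of a local Semantic Validator instance.
VALIDATOR_BASE = "http://localhost:9000/validator"

def build_validate_request(document, validator=None):
    """Build a POST request for /validator/validate.

    The payload keys are illustrative; the request would be sent with
    urllib.request.urlopen(request).
    """
    payload = {"document": document}
    if validator is not None:
        payload["validator"] = validator  # one of GET /validator/validators
    return Request(
        VALIDATOR_BASE + "/validate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```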
Semantic microservices
The semantic part of the DAF consists of the following microservices: ...
Local Installation
On the host, run the following command to clone the DAF project:

> git clone https://github.com/italia/daf.git

In case sbt is not found, install it:

> echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
> sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 2EE0EA64E40A89B84B2DF73499E82A75642AC823
> sudo apt-get update
> sudo apt-get install sbt

Common

On the host PC, go to the folder daf/common and run the following commands:

> sbt
> clean
> compile
> publishLocal

Security Manager

In your daf/security_manager folder, run:

> sbt
> clean
> compile
> run -Dconfig.resource=svil.conf -Dhttp.port=9002

Catalog Manager

On the host PC, go to the folder daf/catalog_manager and run the commands:

> sbt
> clean
> compile
> run -Dconfig.resource=svil.conf -Dhttp.port=9001

Dataportal

Clone the project daf-dataportal-backend from GitHub using the following command:

> git clone https://github.com/italia/daf-dataportal-backend

In your daf-dataportal-backend project, run the following commands:

> sbt
> clean
> compile
> run -Dconfig.resource=local.conf

Front-end

Clone the project daf-dataportal from GitHub:

> git clone https://github.com/italia/daf-dataportal

In your daf-dataportal project, add the following lines in …/src/config/serviceurl.js:

apiURLSSOManager: "http://localhost:9002/sso-manager",
apiURLDatiGov: "http://localhost:9000/dati-gov/v1",
apiURLCatalog: "http://localhost:9001/catalog-manager/v1",
apiURLIngestion: "http://localhost:9002/ingestion-manager/v1",
apiURLSecurity: "http://localhost:9002/security-manager/v1",
urlMetabase: 'http://metabase.daf.test.it',
urlSuperset: 'http://superset.daf.test.it',
domain: ".daf.test.it"

In your …/package.json, edit the line in the scripts section:

"start": "PORT=80 react-scripts start"

You can run the front-end in the following modes.

Start in debug mode:

npm install
npm start

Start in production mode:

npm run build
npm install -g serve
serve -s build
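The -Dhttp.port flags and the serviceurl.js entries above follow one pattern: each manager API is reached at http://localhost:&lt;port&gt;/&lt;service&gt;/v1. A small illustrative sketch of that mapping (the dict merely restates the ports listed above; it is not part of the DAF codebase):

```python
# Ports taken from the -Dhttp.port flags and the serviceurl.js
# fragment in this guide. The dataportal backend (9000) and the
# front-end (80) use different URL paths, so they are left out.
SERVICES = {
    "security-manager": 9002,
    "ingestion-manager": 9002,
    "catalog-manager": 9001,
}

def api_url(service, version="v1"):
    """Build the local base URL the front-end config points at."""
    return f"http://localhost:{SERVICES[service]}/{service}/{version}"
```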
For each configuration, the application can be reached at the following URL:

http://datipubblici-private.daf.test.it

When you access it for the first time, click the "Registrati" button to sign up. After the registration, access FreeIPA, search for your account, and add it to the user group "daf_admins". Now log out and log in again to the DAF Dataportal to see the admin features ...
Jupyter
This Docker container runs a JupyterHub instance connected to a PostgreSQL database. Run the Docker containers:

> cd ./daf-recipes/jupyterhub
> docker-compose up -d

Check whether the containers are running:

> docker ps
8350963ac06c jupyterhub_jupyterhub "/wait_db_is_ready.sh" 16 minutes ago Up 16 minutes 0.0.0.0:8000->8000/tcp jupyterhub
6a0d0d6c3b9a osixia/phpldapadmin "/container/tool/run" 17 minutes ago Up 17 minutes 0.0.0.0:80->80/tcp, 443/tcp phpldapadmin
e8ff9611aeff osixia/openldap "/container/tool/r..." 17 minutes ago Up 17 minutes 0.0.0.0:389->389/tcp, 0.0.0.0:636->636/tcp ldap
cee2d35feaaf postgres:9.6 "docker-entrypoint..." 2 hours ago Up 2 hours 0.0.0.0:5432->5432/tcp postgresjupyterhub

To open the interactive shell, go to http://localhost:8000 and log in as user alice (password password) ...
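The PORTS column in the docker ps output above encodes host-to-container mappings like 0.0.0.0:8000->8000/tcp. Purely as a convenience sketch (not part of the recipe), a helper to pull the published ports out of such a field:

```python
def parse_port_mappings(ports_field):
    """Map host ports to container ports from a docker ps PORTS field.

    Example input: "0.0.0.0:389->389/tcp, 0.0.0.0:636->636/tcp".
    Entries without "->" (exposed but unpublished ports) are skipped.
    """
    mappings = {}
    for part in ports_field.split(", "):
        if "->" in part:
            host, container = part.split("->")
            mappings[int(host.rsplit(":", 1)[1])] = int(container.split("/")[0])
    return mappings
```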
Metabase
This guide explains how to run a Metabase server. Follow these steps to run the Docker images. Clone the git project:

> git clone git@github.com:italia/daf-recipes.git

Go to the metabase directory, build the images needed by docker-compose, and run it:

> cd metabase
> ./build_local.sh
> docker-compose up -d # it will run all the needed containers

Open the Metabase home at http://localhost:3000. Go to GitHub to check how to set up Metabase ...