[EPIC] As an admin I want to onboard machines so that machine data is mapped to data models in an automated way #514
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Mar 11, 2024:
With this PR, external ontologies can be pulled and combined to create the KMS data. Every ontology consists of a prefix and three parts: prefix_entities.ttl, prefix_knowledge.ttl and prefix_shacl.ttl. Different ontologies can be pulled together with a make target: make ontology2kms ONTOLOGIES="base filter" would pull in the base and filter ontologies and compile a joint KMS structure. The PR provides:
* Syncing of knowledge configmaps between debezium-bridge and the semantic model
* stakater/Reloader to restart debezium-bridge when new knowledge is deployed
* An extended SPARQL parser that enables consistent inheritance of RDFS types when creating SHACL constraints and rules
* Tolerance of the rdfs:subClassOf* path expression
* Streamlined knowledge-closure calculation across all tools, plus an explicit script to calculate the knowledge closure (create_knowledge_closure.py)
* An explicit tool to execute SPARQL queries (check_sparql_expression.py)
* For the time being, the sh:ord based deterministic ordering of fields is dropped. It creates confusion, but will be needed in the future to allow forward compatibility. Deterministic ordering is instead achieved by lexical ordering of field names.
Related EPIC: IndustryFusion#514
Related User Story: IndustryFusion#515
Signed-off-by: marcel <wagmarcel@web.de>
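The rdfs:subClassOf* path expression tolerated above denotes the reflexive-transitive closure of the subclass relation, which is also what a knowledge-closure calculation has to compute. A minimal sketch of that closure in plain Python (independent of the repository's create_knowledge_closure.py, whose internals are not shown here; the class hierarchy is made up for illustration):

```python
def subclass_closure(subclass_of):
    """Compute the reflexive-transitive closure of a subclass relation.

    subclass_of: dict mapping a class name to the set of its direct
    superclasses, e.g. {"Filter": {"Machine"}}.
    Returns a dict mapping each class to every class reachable via
    rdfs:subClassOf*, including itself (the '*' makes it reflexive).
    """
    closure = {}
    for cls in subclass_of:
        seen, stack = {cls}, [cls]
        while stack:
            for parent in subclass_of.get(stack.pop(), set()):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        closure[cls] = seen
    return closure

# Hypothetical hierarchy: a Filter is a Machine, a Machine is an Asset.
hierarchy = {"Filter": {"Machine"}, "Machine": {"Asset"}, "Asset": set()}
print(subclass_closure(hierarchy)["Filter"])  # contains Filter, Machine, Asset
```

A SPARQL engine evaluates `?x rdfs:subClassOf* ?y` by exactly this kind of fixpoint traversal, which is why subclass membership is inherited consistently when SHACL constraints are generated.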
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Mar 15, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Mar 15, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Mar 15, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Mar 19, 2024 (same commit message as above)
wagmarcel added a commit that referenced this issue on Mar 19, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Apr 1, 2024:
The IoT Agent was so far operating independently of the data model. This had several disadvantages:
- No detection of mismatches between namespaces
- No systematic normalization/contextualization of input data with respect to the data model
- No trust building / plausibility checks of incoming data
- No use of knowledge graphs to pre-process data at the gateway
This PR provides a first draft of the future Dataservice, which uses the ontology not only for data modelling but also to determine the contextualization of incoming data. It only contains a test binding that creates random data to demonstrate the concept. In the future it will contain modules to retrieve data from other protocols, mainly proxied by MQTT and REST. As a first step, the JSON-LD "@context" becomes a central structure. Every ontology must contain a context which determines the prefixes and namespace bindings. For instance, it is assumed that every ontology contains a "base:" prefix for the namespace which holds all the base terms of the used ontology (such as the Binding and Connector classes). An example ontology is provided here: https://industryfusion.github.io/contexts/staging/example/v0.1 including the context: https://industryfusion.github.io/contexts/staging/example/v0.1/context.jsonld
Related Epic: IndustryFusion#514
Related User Story: IndustryFusion#515
Signed-off-by: marcel <wagmarcel@web.de>
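The central role of the JSON-LD "@context" described above is prefix-to-namespace binding. A minimal illustration of how a context maps a compacted term like "base:Binding" to a full IRI (the context values are invented for illustration, not taken from the linked context.jsonld, and full JSON-LD expansion handles more cases than this):

```python
def expand(term, context):
    """Expand a compacted JSON-LD term such as 'base:Binding' into a
    full IRI using the prefix mappings of an @context. Falls back to
    returning the term unchanged when no prefix matches."""
    prefix, sep, suffix = term.partition(":")
    if sep and prefix in context:
        return context[prefix] + suffix
    return term

# Hypothetical context; a real one would be fetched from context.jsonld.
ctx = {"base": "https://example.org/base/"}
print(expand("base:Binding", ctx))  # https://example.org/base/Binding
print(expand("plainTerm", ctx))     # plainTerm
```

This is why a mandatory "base:" prefix lets the Dataservice locate base terms such as the Binding and Connector classes regardless of which ontology is loaded.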
abhijith-hr pushed a commit that referenced this issue on Apr 3, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Jul 1, 2024:
Up to now the IFF-Agent could only manage one single device with a certain ID. This limits cases where the device consists of several subsystems; so far, all the subsystem data was mapped to the main device with the respective ID. With these changes, a device can now consist of several subsystems, and their IDs can be added to the device token. This PR contains everything needed to support subdevice/subcomponent processing:
* The IFF-Agent accepts deviceIds in the TCP/UDP messages
* The IFF-Agent utils offer additional options to add subcomponent IDs and send data for subcomponents
* Keycloak now allows the field "subdevice_ids" in the token to add subdevice IDs
* The MQTT-Bridge permits subdevice IDs to stream data
Related Epic: IndustryFusion#514
Related User-stories: IndustryFusion#555
Signed-off-by: marcel <wagmarcel@web.de>
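The subdevice mechanism above amounts to an authorization check: a message may carry a deviceId, and it is accepted only if that ID is the main device or one of the subdevices granted in the token's "subdevice_ids" field. A minimal sketch, assuming a simplified token payload (the exact Keycloak token layout is not reproduced here):

```python
def is_device_authorized(token, device_id):
    """Accept data for device_id if it is the token's main device
    or listed in the granted 'subdevice_ids'."""
    return (device_id == token["device_id"]
            or device_id in token.get("subdevice_ids", []))

# Hypothetical token payload with two granted subdevices.
token = {
    "device_id": "urn:iff:device:1",
    "subdevice_ids": ["urn:iff:sub:1", "urn:iff:sub:2"],
}
print(is_device_authorized(token, "urn:iff:sub:2"))   # True
print(is_device_authorized(token, "urn:iff:other"))   # False
```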
abhijith-hr pushed a commit that referenced this issue on Jul 2, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Jul 7, 2024:
In the data model, the entity.ttl file describes the "static" knowledge about attributes, for instance the domain and range of attributes. In this PR, Relationships are typed as subcomponent and peer relationships.
Related Epics: IndustryFusion#514
Related User-stories: IndustryFusion#555
Signed-off-by: marcel <wagmarcel@web.de>
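The typing introduced above distinguishes relationships whose target belongs to the source entity (subcomponents) from links to independent peer entities. A hypothetical sketch of how consumers can act on that distinction; the attribute names and IDs are illustrative, not the actual entity.ttl vocabulary:

```python
from dataclasses import dataclass

@dataclass
class Relationship:
    name: str     # attribute name, e.g. "hasCartridge" (made up)
    target: str   # entity ID of the related object
    kind: str     # "subcomponent" or "peer", mirroring the typing in entity.ttl

rels = [
    Relationship("hasCartridge", "urn:iff:cartridge:1", "subcomponent"),
    Relationship("feeds", "urn:iff:tank:7", "peer"),
]

# Only subcomponent targets are considered part of the device itself.
subcomponents = [r.target for r in rels if r.kind == "subcomponent"]
print(subcomponents)  # ['urn:iff:cartridge:1']
```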
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Jul 11, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Jul 11, 2024 (same commit message as above)
wagmarcel added a commit that referenced this issue on Jul 11, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Jul 19, 2024:
This PR provides a script to retrieve all subcomponent IDs from an entity ID. The script assumes access to the entity knowledge and to an NGSI-LD Context Broker. In addition it provides:
* A changed default context in the datamodel examples
* Plain JSON properties to integrate legacy JSON objects
* Validation and tests for JSON properties
* A NOTICE file for licence compliance and a Software BOM
* Bats e2e tests for subcomponents
* An updated README with a getsubcomponent example
* An extension of validate.js to validate files from stdin
Related Epic: IndustryFusion#514
Related User Story: IndustryFusion#555
Signed-off-by: marcel <wagmarcel@web.de>
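Retrieving "all subcomponent IDs" is transitive: subcomponents may themselves have subcomponents. A sketch of that traversal against a mocked lookup (the real script queries an NGSI-LD Context Broker over HTTP; the graph and the fetch callback here are invented for illustration):

```python
def get_subcomponents(entity_id, fetch):
    """Collect all transitive subcomponent IDs of entity_id.
    fetch(entity_id) must return the list of direct subcomponent IDs;
    the 'seen' set guards against cycles in the relationship graph."""
    result, stack, seen = [], [entity_id], {entity_id}
    while stack:
        for sub in fetch(stack.pop()):
            if sub not in seen:
                seen.add(sub)
                result.append(sub)
                stack.append(sub)
    return result

# Mocked broker content instead of real NGSI-LD queries.
graph = {
    "urn:iff:plant:1": ["urn:iff:filter:1"],
    "urn:iff:filter:1": ["urn:iff:cartridge:1"],
    "urn:iff:cartridge:1": [],
}
print(get_subcomponents("urn:iff:plant:1", lambda e: graph.get(e, [])))
# ['urn:iff:filter:1', 'urn:iff:cartridge:1']
```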
abhijith-hr pushed a commit that referenced this issue on Jul 19, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Aug 19, 2024:
This is the first step of the OPCUA to Semantic Web transformation. The added tool can be used to parse the existing OPCUA companion specifications and provide an RDF/OWL based representation. This can later be used by tools to validate and translate OPCUA machines to a common data language.
Related Epic: IndustryFusion#514
Related User Stories: IndustryFusion#555, IndustryFusion#571
Signed-off-by: marcel <wagmarcel@web.de>
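OPCUA companion specifications are distributed as NodeSet2 XML files, so the first parsing step extracts node definitions that a converter can then restate as RDF/OWL. A minimal sketch with the standard library, assuming a heavily simplified, made-up fragment (real nodesets use the UANodeSet XML namespace and carry far more structure):

```python
import xml.etree.ElementTree as ET

# Hypothetical, namespace-free NodeSet2-style fragment for illustration.
NODESET = """
<UANodeSet>
  <UAObjectType NodeId="ns=1;i=1001" BrowseName="1:MachineType"/>
  <UAVariable NodeId="ns=1;i=2001" BrowseName="1:Temperature"/>
</UANodeSet>
"""

def extract_nodes(xml_text):
    """Return (tag, NodeId, BrowseName) triples, the raw material a
    converter would turn into RDF/OWL statements."""
    root = ET.fromstring(xml_text)
    return [(child.tag, child.get("NodeId"), child.get("BrowseName"))
            for child in root]

for tag, node_id, browse in extract_nodes(NODESET):
    print(tag, node_id, browse)
```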
abhijith-hr pushed a commit that referenced this issue on Aug 20, 2024 (same commit message as above)
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Sep 7, 2024:
…dings This PR provides a transformational tool which takes nodeset2 OWL files and creates NGSI-LD, SHACL and OWL based machine descriptions. Bindings are also created to map OPCUA data to the external data model. The PR also contains:
* An updated Dataservice that works with the bindings created by this extractType tool
* An updated Datamodel that pulls NGSI-LD subcomponents based on the OPCUA metadata
* Additional e2e tests for the extractType tool
Related EPIC: IndustryFusion#514
Related User Story: IndustryFusion#555, IndustryFusion#571
Signed-off-by: marcel <wagmarcel@web.de>
wagmarcel added a commit to wagmarcel/DigitalTwin that referenced this issue on Sep 7, 2024 (same commit message as above)
abhijith-hr pushed a commit that referenced this issue on Sep 7, 2024 (same commit message as above)
Details
Also called Semantic Machine Binding
For a specific machine with specific firmware, the mapping to NGSI-LD models is defined; with the wrong firmware, the trust level of the data point is set to 0.
User Stories: #515, #555, #571
Create the external ontology (base_entities, base_knowledge, base_shacl)
Create a mapping mechanism between machine fields and data
Allow sending a data trust level through the agent
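The trust-level rule from the description above (a data point from a machine whose firmware does not match the firmware the mapping was defined for gets trust 0) can be sketched as a simple check; the trust scale and version strings are illustrative assumptions, not a defined IndustryFusion API:

```python
def data_point_trust(reported_firmware, mapped_firmware):
    """Return full trust (1.0) when the machine reports the firmware the
    NGSI-LD mapping was defined for, and 0.0 otherwise, as described in
    the epic. Values are a hypothetical 0..1 trust scale."""
    return 1.0 if reported_firmware == mapped_firmware else 0.0

print(data_point_trust("v2.1", "v2.1"))  # 1.0
print(data_point_trust("v1.9", "v2.1"))  # 0.0
```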
Notes