

Simple Data Exchanger (formerly known as Data Format Transformer)


Description

This repository is part of the overarching Eclipse Tractus-X project. It contains the backend for the SDE/DFT. SDE (Simple Data Exchanger) was formerly known as DFT (Data Format Transformer).

It is a standalone service which can be self-hosted. It enables companies to provide their data in the Eclipse Tractus-X network via an EDC.

Important !!!

Deployment of SDE backend

The Auto-Setup is the central service orchestration component. It hides all complex configuration properties from you and deploys the SDE backend and frontend as services through their specific Helm charts. The Auto-Setup knows which prerequisites and configurations the components require and creates them. All dependencies and any error messages are intercepted by the Auto-Setup and handled correctly and meaningfully.

Once the SDE is deployed, data is uploaded via CSV files or tabular entry. The SDE registers the data in the Digital Twin Registry and makes it accessible via an EDC.

The SDE project has three dependencies: Digital Twins, Portal and EDC.

How to run

For SDE installation, please refer to the INSTALL file.

SDE is a Spring Boot Java Maven project.

When running, the project requires a PostgreSQL database to connect to. The standard required configuration keys are listed below:

Configuration

Listed below are configuration keys needed to get the sde-backend up and running.

| Key | Required | Example | Description |
|---|---|---|---|
| keycloak.clientid | X | sdeclientId | Keycloak clientId/resource |
| spring.security.oauth2.resourceserver.jwt.issuer-uri | X | https://ids.issuer.com/auth/realms/master | URL of the Keycloak issuer URI |
| management.endpoint.health.probes.enabled | X | true | Default value, no need to change |
| management.health.readinessstate.enabled | X | true | Default value, no need to change |
| management.health.livenessstate.enabled | X | true | Default value, no need to change |
| management.endpoints.web.exposure.include | X | * | Default value, no need to change |
| spring.lifecycle.timeout-per-shutdown-phase | X | 30s | Default value, no need to change |
| logging.level.org.springframework.security.web.csrf | X | INFO | Default value, no need to change |
| logging.level.org.apache.http | X | info | Default value, no need to change |
| logging.level.root | X | info | Default value, no need to change |
| file.upload-dir | X | ./temp/ | Default value, no need to change |
| spring.servlet.multipart.enabled | X | true | Default value, no need to change |
| spring.main.allow-bean-definition-overriding | X | true | Default value, no need to change |
| spring.servlet.multipart.file-size-threshold | X | 2KB | Default value, no need to change |
| spring.servlet.multipart.max-file-size | X | 200MB | Default value, no need to change |
| spring.servlet.multipart.max-request-size | X | 215MB | Default value, no need to change |
| server.servlet.context-path | X | /api | Default value, no need to change |
| spring.flyway.baseline-on-migrate | X | true | Default value, no need to change |
| spring.flyway.locations | X | classpath:/flyway | Default value, no need to change |
| spring.datasource.driver-class-name | X | org.postgresql.Driver | Default value, no need to change |
| spring.datasource.url | X | jdbc:postgresql://dbserver.com:5432/db | Your database server details |
| spring.datasource.username | X | | Your database username |
| spring.datasource.password | X | | Your database password |
| spring.jpa.hibernate.ddl-auto | | update | Default value, no need to change |
| spring.jpa.open-in-view | | false | Default value, no need to change |
| digital-twins.hostname | X | https://example.digitaltwin.com | Digital twin registry URL |
| digital-twins.authentication.url | X | http://ex*.keycloak.com/auth/realms/default | Digital twin registry auth URL |
| digital-twins.authentication.clientId | X | your clientId | Digital twin registry clientId |
| digital-twins.authentication.clientSecret | X | your secret | Digital twin registry secret |
| digital-twins.authentication.grantType | X | client_credentials | Default value, no need to change |
| edc.hostname | X | https://example.provider-connector.com | Your EDC provider connector URL |
| edc.managementpath | X | default | EDC provider management path |
| edc.apiKeyHeader | X | x-api-key | Your connector API key header |
| edc.apiKey | X | yourpass | Your connector API key value |
| edc.consumer.hostname | X | https://example.consumer-connector.com | Your EDC consumer connector |
| edc.consumer.apikeyheader | X | x-api-key | Your connector API key header |
| edc.consumer.apikey | X | yourpass | Your connector API key value |
| edc.consumer.datauri | X | /api/v1/ids/data | IDS endpoint path |
| edc.consumer.protocol.path | X | default | EDC consumer protocol path |
| edc.consumer.managementpath | X | default | EDC consumer management path |
| dft.hostname | X | https://example.sdehost.com | Your SDE hostname |
| dft.apiKeyHeader | X | API_KEY | Your default key header |
| dft.apiKey | X | yourpass | Your default key password |
| manufacturerId | X | default | Your CX partner BPN number |
| connector.discovery.token-url | X | https://example.portal.backend.com | Portal backend auth URL |
| connector.discovery.clientId | X | default | Client ID for connector discovery |
| connector.discovery.clientSecret | X | default | Password for connector discovery |
| springdoc.api-docs.path | X | default | Swagger API path |
| bpndiscovery.hostname | X | default | BPN discovery hostname |
| discovery.authentication.url | X | default | Discovery authentication URL |
| discovery.clientId | X | default | Discovery clientId |
| discovery.clientSecret | X | default | Discovery clientSecret |
| discovery.grantType | X | default | Discovery grantType |
| partner.pool.hostname | X | default | Partner pool URL to get legal-entity information |
| partner.pool.authentication.url | X | default | Partner pool authentication URL |
| partner.pool.clientId | X | default | Partner pool clientId |
| partner.pool.clientSecret | X | default | Partner pool clientSecret |
| partner.pool.grantType | X | default | Partner pool grantType |
| portal.backend.hostname | X | default | Portal backend service URL (based on BPN) |
| portal.backend.authentication.url | X | default | Portal authentication URL |
| portal.backend.clientId | X | default | Portal clientId |
| portal.backend.clientSecret | X | default | Portal clientSecret |
| portal.backend.grantType | X | default | Portal grantType |
| policy.hub.hostname | X | default | Policy hub hostname |
| policy.hub.authentication.url | X | default | Policy hub authentication URL |
| policy.hub.clientId | X | default | Policy hub clientId |
| policy.hub.clientSecret | X | default | Policy hub clientSecret |
| policy.hub.grantType | X | default | Policy hub grantType |

Example Configuration/application.properties

keycloak.clientid=sdeclientId
spring.security.oauth2.resourceserver.jwt.issuer-uri=https://ids.issuer.com/auth/realms/master
management.endpoint.health.probes.enabled=true
management.health.readinessstate.enabled=true
management.health.livenessstate.enabled=true
management.endpoints.web.exposure.include=*
spring.lifecycle.timeout-per-shutdown-phase=30s

#provide your logging level
logging.level.org.springframework.security.web.csrf=INFO
logging.level.org.apache.http=info
logging.level.root=info

#default Spring Boot configuration, no need to change
file.upload-dir=./temp/
spring.servlet.multipart.enabled=true
spring.main.allow-bean-definition-overriding=true
spring.servlet.multipart.file-size-threshold=2KB
spring.servlet.multipart.max-file-size=200MB
spring.servlet.multipart.max-request-size=215MB

#API context path to access application apis
server.servlet.context-path=/api

#Database and Flyway migration details
spring.flyway.baseline-on-migrate=true
spring.flyway.locations=classpath:/flyway
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://dbserver.com:5432/dftdb #your database server details
spring.datasource.username=your database username
spring.datasource.password=your database password
spring.jpa.hibernate.ddl-auto=update
spring.jpa.open-in-view=false

#Provide the digital twin registry details which SDE should use to create twins for you.
#The required technical user details depend on the digital twin security configuration.
digital-twins.hostname=https://example.digitaltwin.com
digital-twins.authentication.url=http://example.keycloak.com/auth/realms/default
digital-twins.authentication.clientId=your clientId
digital-twins.authentication.clientSecret=your secret
digital-twins.authentication.grantType=client_credentials

#The EDC connector information which SDE should use As Data provider connector
edc.hostname=https://example.provider-connector.com
edc.apiKeyHeader=your connector api key
edc.apiKey=your connector apikey value 

#The EDC connector information which SDE should use As Data consumer connector
edc.consumer.hostname=https://example.consumer-connector.com
edc.consumer.apikeyheader=your connector api key
edc.consumer.apikey=your connector apikey value 
edc.consumer.datauri=/api/v1/ids/data

#Your Own SDE host url which will share with EDC connector as data address proxy
dft.hostname=https://example.sdehost.com
dft.apiKeyHeader=your default key
dft.apiKey=your default key password

#Your company BPN number
manufacturerId=default

#Partner pool hostname URL used to discover legal company information in SDE
partner.pool.hostname=default

#Portal backend URL to get the connector list based on BPN number
connector.discovery.token-url=https://example.portal.backend.com
connector.discovery.clientId=default
connector.discovery.clientSecret=default
portal.backend.hostname=default
springdoc.api-docs.path=/api-docs
bpndiscovery.hostname=default
discovery.authentication.url=default
discovery.clientId=default
discovery.clientSecret=default
discovery.grantType=default
edc.consumer.protocol.path=default
edc.consumer.managementpath=default
edc.managementpath=default
partner.pool.hostname=default
partner.pool.authentication.url=default
partner.pool.clientId=default
partner.pool.clientSecret=default
partner.pool.grantType=default
portal.backend.hostname=default
portal.backend.authentication.url=default
portal.backend.clientId=default
portal.backend.clientSecret=default
portal.backend.grantType=default
policy.hub.hostname=default
policy.hub.authentication.url=default
policy.hub.clientId=default
policy.hub.clientSecret=default
policy.hub.grantType=default

The above configuration can be used for different deployments as specified in InstallationGuide.md.
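As a sanity check before deploying, the presence of the required keys can be validated with a small stand-alone helper. This is an illustrative sketch only, not part of SDE; the class name `ConfigCheck` and the chosen subset of keys are assumptions:

```java
import java.io.StringReader;
import java.util.List;
import java.util.Properties;

// Hypothetical helper (not part of SDE): verifies that a subset of the
// required configuration keys from the table above are present and non-blank.
public class ConfigCheck {
    static final List<String> REQUIRED = List.of(
        "keycloak.clientid",
        "spring.security.oauth2.resourceserver.jwt.issuer-uri",
        "spring.datasource.url",
        "spring.datasource.username",
        "spring.datasource.password",
        "edc.hostname",
        "dft.hostname");

    // Returns the required keys that are missing or blank.
    public static List<String> missingKeys(Properties props) {
        return REQUIRED.stream()
            .filter(k -> props.getProperty(k, "").isBlank())
            .toList();
    }

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(
            "keycloak.clientid=sdeclientId\ndft.hostname=https://example.sdehost.com\n"));
        // Prints the keys still missing from this partial configuration.
        System.out.println(missingKeys(props));
    }
}
```

A check like this can run as part of a deployment pipeline so that a misconfigured instance fails fast instead of at the first request.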


Supported submodules

To find information about supported submodules and their versions in SDE, please visit here.


DFT (Simple Data Exchanger) is compatible with:

  1. File Uploads
    • SerialPart
    • SingleLevelBoMAsBuilt
    • Batch
    • PartAsPlanned
    • PartTypeInformation
    • SingleLevelBoMAsPlanned
    • PartSiteInformationAsPlanned
    • SingleLevelUsageAsBuilt
    • Product Carbon Footprint(PCF)
  2. Json Update
    • SerialPart
    • SingleLevelBoMAsBuilt
    • Batch
    • PartAsPlanned
    • PartTypeInformation
    • SingleLevelBoMAsPlanned
    • PartSiteInformationAsPlanned
    • SingleLevelUsageAsBuilt
    • Product Carbon Footprint(PCF)
  3. Application UI
    • SerialPart
    • SingleLevelBoMAsBuilt
    • Batch
    • PartAsPlanned
    • PartTypeInformation
    • SingleLevelBoMAsPlanned
    • PartSiteInformationAsPlanned
    • SingleLevelUsageAsBuilt
    • Product Carbon Footprint(PCF)

RESTful APIs OF DFT (Simple Data Exchanger)

Note: the API_KEY and AUTHORIZATION TOKEN headers are required.

| API | Description | Request body | Response body |
|---|---|---|---|
| GET localhost:8080/api/submodels | Get the list of all submodels implemented/supported by SDE | Refer API doc | Refer API doc |
| GET localhost:8080/api/submodels/schema-details | Get schema details of the submodels implemented/supported by SDE | Refer API doc | Refer API doc |
| GET localhost:8080/api/submodels/{submodelName} | Get the schema of a specific submodel | Refer API doc | Refer API doc |
| POST localhost:8080/api/{submodel}/upload | Upload data from a CSV file for the selected submodel | Refer API doc | 4ca03d5f-9e37-4c12-a8b8-6583b81892c8 |
| POST localhost:8080/api/{submodel}/manualentry | Upload data from JSON/tabular form | Refer API doc | 4ca03d5f-9e37-4c12-a8b8-6583b81892c8 |
| GET localhost:8080/api/{submodel}/public/{uuid} | Get the data of a specific submodel | Refer API doc | Refer API doc |
| DELETE localhost:8080/api/{submodel}/delete/{processId} | Delete processed data from EDC and Digital Twins | Refer API doc | Refer API doc |
| GET localhost:8080/api/role/{role}/permissions | Fetch all permissions associated with a particular role | Refer API doc | Refer API doc |
| GET localhost:8080/api/user/role/permissions | Fetch the list of all permissions | Refer API doc | Refer API doc |
| POST localhost:8080/api/role/{role}/permissions | Apply a list of permissions to a specific role | Refer API doc | Refer API doc |
| GET localhost:8080/api/processing-report/87d0aece-ae46-4006-904d-9ec41cddee8b | Fetch a process report by process ID | Refer API doc | Refer API doc |
| GET localhost:8080/api/ping | Health check | -- | 2022-09-30T16:21:02.630868 |
| GET localhost:8080/api/processing-report?page=&pageSize=50 | Fetch process reports | Refer API doc | Refer API doc |
| GET localhost:8080/api/query-data-offers | Fetch all data offers of a provider URL | Refer API doc | Refer API doc |
| POST localhost:8080/api/subscribe-data-offers | Subscribe to data offers | Refer API doc | Refer API doc |
| GET localhost:8080/api/contract-offers | Get all contract offers | Refer API doc | Refer API doc |
| GET localhost:8080/api/legal-entities | Fetch legal entities (list of companies) for a process | Refer API doc | Refer API doc |
| POST localhost:8080/api/connectors-discovery | Fetch connector information | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy-attributes | Fetch policy attributes | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy-types | Fetch the types of policy attributes | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy-content | Fetch policy content | Refer API doc | Refer API doc |
| POST localhost:8080/api/policy-content | Create policy content | Refer API doc | Refer API doc |
| POST localhost:8080/api/policy | Save a policy | Refer API doc | Refer API doc |
| PUT localhost:8080/api/policy/{uuid} | Update a policy | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy/{uuid} | Get a policy | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy/is-policy-name-valid | Check whether a policy name is valid | Refer API doc | Refer API doc |
| GET localhost:8080/api/policy | Get all policies | Refer API doc | Refer API doc |
| DELETE localhost:8080/api/policy/{uuid} | Delete a policy | Refer API doc | Refer API doc |
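As a minimal client-side illustration of the required headers, the following sketch builds a request to the submodels endpoint. It is not part of SDE; the host, header values, and class name are placeholders, and the API-key header name must match your configured `dft.apiKeyHeader`:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Illustrative only: constructs a GET request for /api/submodels carrying
// the API key and bearer-token headers that the SDE APIs expect.
public class SubmodelsRequest {
    public static HttpRequest build(String host, String apiKey, String token) {
        return HttpRequest.newBuilder()
            .uri(URI.create(host + "/api/submodels"))
            .header("API_KEY", apiKey)                  // name from dft.apiKeyHeader
            .header("Authorization", "Bearer " + token) // Keycloak access token
            .GET()
            .build();
    }

    public static void main(String[] args) {
        HttpRequest req = build("http://localhost:8080", "yourpass", "eyJ...");
        System.out.println(req.method() + " " + req.uri());
    }
}
```

Sending the request with `java.net.http.HttpClient` (`client.send(req, HttpResponse.BodyHandlers.ofString())`) against a running instance returns the submodel list as JSON.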

Detailed API specs are available in the backend API Swagger UI:

https:///backend/api/swagger-ui/index.html


Response Status


SUCCESS RESPONSE CODES:

| Code | Description |
|---|---|
| 200 OK | Indicates that the request has succeeded. |
| 201 Created | Indicates that the request has succeeded and a new resource has been created as a result. |
| 202 Accepted | Indicates that the request has been received but not completed yet. |
| 204 No Content | The server has fulfilled the request but does not need to return a response body. The server may return the updated meta information. |

ERROR RESPONSE CODES:

| Code | Description |
|---|---|
| 400 Bad Request | The request could not be understood by the server due to incorrect syntax. |
| 401 Unauthorized | Indicates that the request requires user authentication information. |
| 403 Forbidden | Unauthorized request. |
| 404 Not Found | The server cannot find the requested resource. |
| 405 Method Not Allowed | The request HTTP method is known by the server but has been disabled and cannot be used for that resource. |
| 500 Internal Server Error | The server encountered an unexpected condition that prevented it from fulfilling the request. |
| 502 Bad Gateway | The server got an invalid response while working as a gateway to get the response needed to handle the request. |
| 503 Service Unavailable | The server is not ready to handle the request. |
| 504 Gateway Timeout | The server is acting as a gateway and cannot get a response in time for a request. |

Database

| Table | Description | Unique ID |
|---|---|---|
| aspect | Stores data about serialized parts | Primary key: UUID |
| serialpart_v_300 | Stores data about serialized parts | Primary key: UUID |
| aspect_relationship | Data about the relationship of parts to their child components | Primary key: parent_catenax_id, child_catenax_id |
| single_level_bom_asbuilt_v_300 | Data about the relationship of parts to their child components | Primary key: parent_catenax_id, child_catenax_id |
| batch | Stores data about batches | Primary key: UUID |
| batch_v_300 | Stores data about batches | Primary key: UUID |
| part_as_planned | Stores data about Part As Planned | Primary key: UUID |
| part_type_information | Stores data about Part Type Information | Primary key: UUID |
| pcf_aspect | Stores data about Product Carbon Footprint (PCF) | Primary key: UUID |
| single_level_bom_as_planned | Data about the relationship of Part As Planned to its child components | Primary key: parent_catenax_id, child_catenax_id |
| single_level_bom_as_planned_v_300 | Data about the relationship of Part As Planned to its child components | Primary key: parent_catenax_id, child_catenax_id |
| part_site_information_as_planned | Stores data about Part Site Information As Planned | Primary key: UUID |
| contract_negotiation_info | Contains contract negotiation info and offer ID | Primary key: connector_id, offer_id |
| failure_log | Contains data about failure entries | Primary key: UUID |
| Flyway_Schema_History | Contains data migration history | Primary key: installed_rank |
| Process_Report | Contains the status of processed uploads | Primary key: process_id |
| sde_role | Contains the list of roles | Primary key: sde_role |
| sde_permission | Contains the list of permissions | Primary key: sde_permission |
| sde_role_permission_mapping | Contains the mapping of roles to permissions | Primary key: sde_role, sde_permission |
| single_level_usage_as_built | Data about the relationship of parts to their child components | Primary key: parent_catenax_id, child_catenax_id |
| single_level_usage_as_built_v_300 | Data about the relationship of parts to their child components | Primary key: parent_catenax_id, child_catenax_id |

Flyway

The scripts are in the folder: resources/flyway.

File naming: Vx__script_name.sql, where x is the version number.

When the last script needs to be changed, create a new script containing the changes instead.

Link to flyway documentation: Documentation.
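The Vx__script_name.sql convention above can be made concrete with a small sketch. This helper is hypothetical (not part of SDE) and only illustrates how the version number x determines migration ordering:

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical illustration of Flyway's Vx__script_name.sql naming:
// extracts the version number used to order migration scripts.
public class MigrationName {
    private static final Pattern NAME = Pattern.compile("V(\\d+)__\\w+\\.sql");

    // Returns the version for a valid migration filename, empty otherwise.
    public static Optional<Integer> version(String filename) {
        Matcher m = NAME.matcher(filename);
        return m.matches() ? Optional.of(Integer.parseInt(m.group(1))) : Optional.empty();
    }
}
```

For example, `V3__add_policy_table.sql` would run after `V2__...` and before `V4__...`; a file that does not match the pattern is not picked up as a versioned migration.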

API authentication

Authentication for the backend is handled via an API Key. This can be set in the configuration file.

EDC

GitHub repository with correct version of the Eclipse DataSpace Connector Project: repository.

Licenses

For used licenses, please see the NOTICE.

Eclipse Dash Tool

The Eclipse Dash tool is used to analyze the dependencies used in the project and ensure all legal requirements are met. We're using the official maven plugin to resolve all project dependencies and then run the tool and update the summary in the DEPENDENCIES file.

Notice for Docker image

Below you can find the information regarding the Docker notice for this application.


