sap-archive / cloud-s4-sdk-examples

Runnable example applications that showcase the usage of the SAP Cloud SDK.

License: Apache License 2.0


cloud-s4-sdk-examples's Issues

Invalid document data upload using AttachmentContent

Hi colleagues,

Our team is working with the attachments APIs (specifically, the DefaultAPICVATTACHMENTSRVService service), and we found an error with files uploaded using the SDK. The operation runs fine and the document is uploaded to S/4 with the correct linked object key/type and metadata (file name, MIME type); however, the actual data of the document is not valid.

Through a binary comparison between the original file and the one retrieved from S/4, I've noticed that the uploaded file's data is actually readable in a text editor and contains a JSON representation of the actual file, e.g.:

{
  "LinkedSAPObjectKey":"123456789",
  "FileSize":"39618",
  "FileName":"testfile.jpg",
  "MimeType":"image/jpeg",
  "Content":"base64encodeddatahere",
  "BusinessObjectType":"EXAMPLE"
}
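The payload above is consistent with the file bytes being wrapped, Base64-encoded, inside a JSON entity body instead of being sent as a raw binary body. A minimal stdlib-only sketch of that wrapping (the helper names are hypothetical, not SDK code):

```java
import java.util.Arrays;
import java.util.Base64;

// Toy reproduction of the symptom: the bytes stored in S/4 are not the raw
// file content but a JSON entity payload in which the content appears
// Base64-encoded. Field names are taken from the payload shown above.
public class AttachmentPayloadDemo {

    public static String wrapAsJson(byte[] raw, String fileName, String mimeType) {
        String base64 = Base64.getEncoder().encodeToString(raw);
        return "{\"FileName\":\"" + fileName + "\",\"MimeType\":\"" + mimeType
                + "\",\"Content\":\"" + base64 + "\"}";
    }

    public static byte[] extractContent(String json) {
        // Crude extraction of the Base64 "Content" value, for the demo only.
        int start = json.indexOf("\"Content\":\"") + "\"Content\":\"".length();
        int end = json.indexOf('"', start);
        return Base64.getDecoder().decode(json.substring(start, end));
    }

    public static void main(String[] args) {
        byte[] original = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF}; // JPEG magic bytes
        String stored = wrapAsJson(original, "testfile.jpg", "image/jpeg");
        // The "uploaded" bytes are readable JSON, not the binary file:
        System.out.println(stored);
        // The original content is still recoverable from the Base64 field:
        byte[] recovered = extractContent(stored);
        System.out.println(Arrays.equals(original, recovered));
    }
}
```

If this is what ends up in S/4, the raw content can still be recovered by decoding the Content field, but the upload itself is clearly not sending the binary body.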

Here is how our code looks when creating the command (using the generated builders; we also tried the setter methods, with the same result):

byte[] attachmentData = document.getDataInBytes();
AttachmentContent attachmentContent = AttachmentContent.builder()
  .attachmentContent(attachmentData)
  .fileSize(String.valueOf(attachmentData.length))
  .fileName(document.getName())
  .mimeType(document.getMimeType())
  .linkedSAPObjectKey(document.getKey())
  .businessObjectType("EXAMPLE")
  .build();

This attachmentContent variable is then provided when we call the createAttachmentContent() method on DefaultAPICVATTACHMENTSRVService, but somewhere along the way it seems like the content is being replaced with the JSON containing both the data and the metadata.

The command's run() method looks like this:

private Map<String, String> getUploadHeaders() {
    Map<String, String> headers = new HashMap<>();
    headers.put("Slug", attachmentContent.getFileName());
    headers.put("Content-Type", attachmentContent.getMimeType());
    headers.put("BusinessObjectTypeName", "EXAMPLE");
    headers.put("LinkedSAPObjectKey", attachmentContent.getLinkedSAPObjectKey());
    return headers;
}
...
defaultAttachmentsService
  .createAttachmentContent(attachmentContent)
  .withCustomHttpHeaders(getUploadHeaders()).onRequestAndImplicitRequests()
  .execute(getConfigContext());

Right now we are unsure whether this is caused by something missing in our code or by a bug somewhere in this service. Could you please assist us with this issue?

"Failed to convert response into ODataFeed: An exception of type 'EdmSimpleTypeException' occurred."

Hi team,

I'm trying to build an app that reads info from SFSF following these steps:

  • Get a project via archetype (with powershell):
mvn archetype:generate "-DarchetypeGroupId=com.sap.cloud.sdk.archetypes" "-DarchetypeArtifactId=scp-cf-tomee" "-DarchetypeVersion=RELEASE"
  • Add the following to the application\pom.xml
    In dependencies:
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <scope>provided</scope>
</dependency>

In plugins:

<plugin>
    <groupId>com.sap.cloud.sdk.datamodel</groupId>
    <artifactId>odata-generator-maven-plugin</artifactId>
    <version>3.13.0</version>
    <executions>
        <execution>
            <id>generate-consumption</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputDirectory>${project.basedir}/edmx</inputDirectory>
                <outputDirectory>${project.build.directory}/vdm</outputDirectory>
                <defaultBasePath>/odata/v2</defaultBasePath>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>${project.basedir}/vdm</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>
  • Get the OData metadata file from https://apisalesdemo2.successfactors.eu/odata/v2/JobRequisition/$metadata and place it in ./application/edmx
  • Create a destination service (my-destination) and add a destination there pointing to my SFSF instance with basic auth (with user@companyId, the connection is 200:OK)
  • Add the destination service in the manifest.yml
  • Create a java class to call the destination and get the data:
package com.sap.sdk;

import com.google.gson.Gson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;

import com.sap.cloud.sdk.cloudplatform.connectivity.DestinationAccessor;
import com.sap.cloud.sdk.odatav2.connectivity.ODataException;

import com.sap.cloud.sdk.s4hana.connectivity.DefaultErpHttpDestination;
import com.sap.cloud.sdk.s4hana.connectivity.ErpHttpDestination;
import com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.rcmjobrequisition.JobRequisition;
import com.sap.cloud.sdk.s4hana.datamodel.odata.services.DefaultRCMJobRequisitionService;


@WebServlet("/req")
public class JobReqServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private static final Logger logger = LoggerFactory.getLogger(JobReqServlet.class);

    private final ErpHttpDestination destination = DestinationAccessor.getDestination("sfsf-sdk-dest").asHttp()
            .decorate(DefaultErpHttpDestination::new);
    

    @Override
    protected void doGet(final HttpServletRequest request, final HttpServletResponse response)
            throws ServletException, IOException {
        try {
            final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
                .getAllJobRequisition()
                .execute(destination);
            response.setContentType("application/json");
            response.getWriter().write(new Gson().toJson(jobReqs));
        } catch (final ODataException e) {
            logger.error(e.getMessage(), e);
            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
            response.getWriter().write(e.getMessage());
        }
    }
}

With all this (I think I'm not missing anything), I do:

mvn clean install

and:

cf push

Everything works well and the hello world servlet responds, but when I try to access /req, I get:
Unable to execute metadata request.

However, I can see that the app is hitting SFSF, because if I play with the base path of the service (in the pom.xml) I get 404s coming from SFSF.

Checking everything, I see this when the VDM generator is running:

  1. This is the base path I'm giving in the pom:
<defaultBasePath>/odata/v2</defaultBasePath>
  2. I can see the generator picking that path up correctly:
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.DataModelGenerator -   Default base path:              /odata/v2/
  3. But this is what the generator processes:
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator -   Title: RCMJobRequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator -   Raw URL: /odata/v2/SFODataSet
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator -   Java Package Name: rcmjobrequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator -   Java Class Name: RCMJobRequisition

Clearly, that SFODataSet in the URL is not correct. When the app runs, it's trying to get the metadata from .../odata/v2/SFODataSet/$metadata, which is why it's not finding it.
That SFODataSet is coming from the SFSF metadata:

<Schema Namespace="SFODataSet" xmlns="http://schemas.microsoft.com/ado/2008/09/edm" xmlns:sf="http://www.successfactors.com/edm/sf" xmlns:sap="http://www.sap.com/Protocols/SAPData">
      <EntityContainer Name="EntityContainer" m:IsDefaultEntityContainer="true">
        <EntitySet Name="JobOfferTemplate_Standard_Offer_Details" EntityType="SFOData.JobOfferTemplate_Standard_Offer_Details" sap:label="JobOfferTemplate_Standard_Offer_Details" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
          <Documentation>
            <Summary>Job Requisition Template</Summary>
            <LongDescription>These entities represent the job requisition template as defined in provisioning.</LongDescription>
            <sap:tagcollection>
              <sap:tag>Recruiting (RCM)</sap:tag>
              <sap:tag>RCM - Job Requisition</sap:tag>
            </sap:tagcollection>
          </Documentation>
        </EntitySet>
        <EntitySet Name="JobRequisitionLocale" EntityType="SFOData.JobRequisitionLocale" sap:label="JobRequisitionLocale" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
          <Documentation>
...
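The Raw URL in the generator output is consistent with the default base path being concatenated with the schema namespace shown above. A toy illustration of that concatenation (an assumption inferred from the logs, not the generator's actual code):

```java
// Hypothetical helper mirroring what the generator log suggests: the raw
// service URL is the default base path plus the EDMX schema namespace.
public class ServicePathDemo {

    public static String rawUrl(String defaultBasePath, String schemaNamespace) {
        String base = defaultBasePath.endsWith("/") ? defaultBasePath : defaultBasePath + "/";
        return base + schemaNamespace;
    }

    public static void main(String[] args) {
        // Reproduces the "Raw URL" from the generator output above.
        System.out.println(rawUrl("/odata/v2", "SFODataSet")); // /odata/v2/SFODataSet
    }
}
```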

I tried something else. In the Java class, I added

.withServicePath("odata/v2")

to the service call, leaving it like this:

final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
                .withServicePath("odata/v2/JobRequisition")
                .getAllJobRequisition()
                .execute(destination);

That replaces the /odata/v2/SFODataSet URL that was generated. In this case, the error I get is:

"com.sap.cloud.sdk.odatav2.connectivity.ODataQuery","thread":"http-nio-0.0.0.0-8080-exec-3","level":"ERROR","categories":[],"msg":"Failed to convert response into ODataFeed: An exception of type 'EdmSimpleTypeException' occurred." }

I've tried different values: odata/v2, odata/v2/, odata/v2/JobRequisition...

I can't find a way to make this work. Can you help me find the issue here?

I'm using:

  • Apache Maven 3.6.2
  • SAP Cloud SDK 3.13.0

Kr,
kepair

Unable to create a change master with material association

Hello colleagues,

Our team is working with the change master API and one of our requirements is to create a change master and assign both materials and task lists to it.

We are using the API via the SDK with generated sources, as the workaround recommended in #69. The create works fine when no associations are assigned. However, once we add a material to the change request before creating it, the S/4 backend returns a 400 error:

The endpoint responded with HTTP error code 400.
Object type &Material& is not permitted for objects of change number

Our code to create a new change master looks like this:

APIForChangeNumber changeMaster = new APIForChangeNumber();
changeMaster.setValidFrom(validFrom);
changeMaster.setDescription(changeRequest.getDescription());
changeMaster.setStatus("1");
 
APIForChangeNumberObjectManagementRecordMaterial s4Material = new APIForChangeNumberObjectManagementRecordMaterial();
s4Material.setMaterial(material.getMaterialId());

changeMaster.addChangeMstrObMgReMaterial(s4Material);

Could you please assist us? I am wondering whether this is incorrect usage of the SDK, an S/4 misconfiguration, or a known issue when using metadata-generated sources.

Thanks!
Tiago

Cast error on Change Master

Hello colleagues,

I am working with the Change Master API and we are running into class cast exceptions. The stack trace is below:

java.lang.ClassCastException: java.util.GregorianCalendar cannot be cast to java.lang.Double
    at com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.changemaster.ChangeMaster.fromMap(ChangeMaster.java:1073)
    at com.sap.cloud.sdk.s4hana.datamodel.odata.helper.FluentHelperCreate.buildEntityFromODataResult(FluentHelperCreate.java:177)
    at com.sap.cloud.sdk.s4hana.datamodel.odata.helper.FluentHelperCreate.execute(FluentHelperCreate.java:165)
    at com.sap.cloudscame.bmsmoc.s4.command.CreateChangeMasterV2Command.run(CreateChangeMasterV2Command.java:35)
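For context, this failure mode can be reproduced with plain JDK types: an unchecked cast in fromMap-style code fails at runtime when the deserialized value map holds a Calendar where a Double is expected. The field name below is hypothetical:

```java
import java.util.GregorianCalendar;
import java.util.HashMap;
import java.util.Map;

// Toy reproduction of the reported ClassCastException: a value map carries a
// GregorianCalendar in a slot that the reading code casts to Double.
public class CastErrorDemo {

    public static Double readDouble(Map<String, Object> values, String key) {
        return (Double) values.get(key); // throws if the payload carried a date here
    }

    public static boolean reproduces() {
        Map<String, Object> values = new HashMap<>();
        values.put("ValidityStartDate", new GregorianCalendar()); // hypothetical field name
        try {
            readDouble(values, "ValidityStartDate");
            return false;
        } catch (ClassCastException e) {
            return true; // same exception type as in the stack trace above
        }
    }

    public static void main(String[] args) {
        System.out.println(reproduces()); // true
    }
}
```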

This happens whenever a GET is executed with all properties of the change master selected. The same error also occurs when we create a new change master.

We are using the scp-neo SDK, version 2.14.0.

Could you please assist us on this issue?

Thanks,
Tiago

501 Error on Update

Hello colleagues,

We are working with the Change Master API and we are having some issues updating the status.

We are calling the API as:

this.defaultChangeMasterService
    .updateAPIForChangeNumber(changeMaster)
    .execute(getConfigContext());

While debugging, we could validate that changeMaster is set as:

APIForChangeNumber(super=VdmObject(customFields={}, changedOriginalFields={Status=2}), changeNumber=12345, status=1, authorizationGroup=, function=, relTechnically=, releaseKey=0, reasonForChange=123, description=123, validFrom=9999-01-01T00:00, createdOn=2019-05-07T00:00, createdBy=SCP_RFC, changedOn=null, changedBy=, inUse=false, deletionFlag=false, timeStamp=1972-04-01T00:00Z[GMT], toAlternativeDate=null, toChangeMstrObjectMgmtRecord=null, toChangeMstrObMgReDocInfoRecd=null, toChangeMstrObMgReMaterial=null, toChangeMstrObMgReMatlBOM=null, toCharacteristics=null, toClassification=null, toObjTypeAssignment=null)

However, we get an HTTP 501 error with the message:

The endpoint responded with HTTP error code 501.
PATCH requests require components to be updated
Full error message: 
{
  "error": {
    "code": "/IWFND/CM_MGW/096",
    "message": {
      "lang": "en",
      "value": "PATCH requests require components to be updated"
    },
    "innererror": {
      "application": {
        "component_id": "PLM-WUI-OBJ-ECN",
        "service_namespace": "/SAP/",
        "service_id": "API_CHANGEMASTER",
        "service_version": "0002"
      },
      "transactionid": "83...10",
      "timestamp": "20190507201621.8351780",
      "Error_Resolution": {
        "SAP_Transaction": "For backend administrators: run transaction /IWFND/ERROR_LOG on SAP Gateway hub system and search for entries with the timestamp above for more details",
        "SAP_Note": "See SAP Note 1797736 for error analysis (https://service.sap.com/sap/support/notes/1797736)"
      },
      "errordetails": []
    }
  }
}
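The backend message indicates the PATCH body contained no components to update. As a toy model of delta-style updates (an assumption about the behavior, not the SDK's actual implementation), only fields changed through a setter after the entity is loaded end up in the PATCH body:

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of delta-based PATCH: only fields changed through the setter
// after construction/load are serialized into the request body.
public class ChangeMasterPatchDemo {

    private final Map<String, Object> values = new HashMap<>();
    private final Map<String, Object> changedFields = new HashMap<>();

    public void load(String field, Object value) {
        values.put(field, value); // simulates values read from the backend
    }

    public void set(String field, Object value) {
        values.put(field, value);
        changedFields.put(field, value); // tracked for the PATCH delta
    }

    public Map<String, Object> patchBody() {
        return changedFields;
    }

    public static void main(String[] args) {
        ChangeMasterPatchDemo changeMaster = new ChangeMasterPatchDemo();
        changeMaster.load("ChangeNumber", "12345");
        changeMaster.load("Status", "1");
        // Without a tracked change, the PATCH body is empty -> nothing to update.
        System.out.println(changeMaster.patchBody()); // {}
        changeMaster.set("Status", "2");
        System.out.println(changeMaster.patchBody()); // {Status=2}
    }
}
```

If the SDK tracks changes in a similar way, an entity whose tracked state is out of sync with the intended update could produce an empty or mismatched delta; note that the debug dump above shows changedOriginalFields={Status=2} while the status field itself is still 1.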

Do you see anything we could be doing wrong? When we call the API using Postman, it works fine. Let me know if there's any additional info we can provide.

Thanks,
Rafael

Can't maintain the destination URL up to the service path

Hi Colleagues,

We are trying to consume the S/4HANA SDK, specifically ODataQueryBuilder. In our scenario, the customer maintains the destination. The withEntity method needs two parameters: a servicePath and an entity.

It is not practical to ask the customer to maintain n destinations with the same URL (identical up to the service path) and a different password for each, just so we can connect to many services.

When we put the complete URL, including the service path, into the destination and pass "/" as the servicePath in our code, we get a 502:
ODataQueryBuilder.withEntity("/", "")
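One possible direction (a sketch under the assumption that a single destination can point at the host only, with the service path supplied in code at call time): derive the servicePath argument for withEntity from a full URL using only the JDK. The URL and entity name below are hypothetical:

```java
import java.net.URI;

// Sketch: split a full service URL into the host part (for one shared
// destination) and the service path (passed to withEntity at call time).
public class ServicePathSplitter {

    public static String hostOnly(String fullUrl) {
        URI uri = URI.create(fullUrl);
        return uri.getScheme() + "://" + uri.getAuthority();
    }

    public static String servicePath(String fullUrl) {
        return URI.create(fullUrl).getPath();
    }

    public static void main(String[] args) {
        String url = "https://example.sap.corp/sap/opu/odata/sap/API_EXAMPLE_SRV"; // hypothetical
        System.out.println(hostOnly(url));    // https://example.sap.corp
        System.out.println(servicePath(url)); // /sap/opu/odata/sap/API_EXAMPLE_SRV
        // The destination keeps only the host; the code supplies the path, e.g.:
        // ODataQueryBuilder.withEntity(servicePath(url), "ExampleEntity")
    }
}
```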

Could you please help us here?

Extend SDK to a different program

Hello Colleagues,

I'm trying to extend the S/4 SDK to Hybris Marketing Cloud, but unfortunately there's no archetype to do so. Is there a way I can extend this SDK to create a new SDK? So far I've only seen templates for packaging an executable, nothing in terms of a library.
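For what it's worth, the archetypes are only project scaffolding; the SDK libraries are plain Maven artifacts, so a sketch of consuming them directly from any Maven project could look like the fragment below (version taken from elsewhere in this thread; verify the artifact coordinates against the SDK documentation):

```xml
<!-- Sketch: import the SDK bill of materials so individual SDK modules
     can be added as dependencies without an archetype. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.sap.cloud.sdk</groupId>
            <artifactId>sdk-bom</artifactId>
            <version>3.13.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```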

Regards,
Anish
