sap-archive / cloud-s4-sdk-examples
Runnable example applications that showcase the usage of the SAP Cloud SDK.
License: Apache License 2.0
Hi colleagues,
Our team is working with the attachment APIs (specifically, the DefaultAPICVATTACHMENTSRVService
service) and we found an error with the files that were uploaded using the SDK. The operation runs fine and the document is uploaded to S/4 with the correct linked object key/type and metadata (file name, MIME type); however, the actual data of the document is not valid.
Through a binary comparison between the original file and the one retrieved from S/4, I noticed that the data of the uploaded file is actually readable in a text editor and contains a JSON representation of the actual file, e.g.:
{
  "LinkedSAPObjectKey": "123456789",
  "FileSize": "39618",
  "FileName": "testfile.jpg",
  "MimeType": "image/jpeg",
  "Content": "base64encodeddatahere",
  "BusinessObjectType": "EXAMPLE"
}
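One quick way to confirm this symptom programmatically is to inspect the first bytes of the retrieved content, since a JSON envelope always starts with `{`. A minimal sketch using only the JDK (the class and method names here are illustrative, not part of the SDK):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Detects whether retrieved attachment bytes are the raw file
// or a JSON envelope containing the base64-encoded file.
public class AttachmentCheck {
    static boolean looksLikeJsonEnvelope(byte[] data) {
        // Decode only the first few bytes; a JSON envelope starts with '{'
        // and contains a "Content" property holding the base64 payload.
        String head = new String(data, 0, Math.min(data.length, 64), StandardCharsets.UTF_8).trim();
        return head.startsWith("{") && head.contains("\"Content\"");
    }

    public static void main(String[] args) {
        byte[] raw = new byte[] {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF}; // JPEG magic bytes
        byte[] wrapped = ("{\"FileName\":\"testfile.jpg\",\"Content\":\""
                + Base64.getEncoder().encodeToString(raw) + "\"}").getBytes(StandardCharsets.UTF_8);
        System.out.println(looksLikeJsonEnvelope(raw));     // false: raw binary
        System.out.println(looksLikeJsonEnvelope(wrapped)); // true: JSON envelope
    }
}
```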
Here is what our code looks like when creating the command (using the generated builders; we also tried the setter methods and got the same result):
byte[] attachmentData = document.getDataInBytes();
AttachmentContent attachmentContent = AttachmentContent.builder()
    .attachmentContent(attachmentData)
    .fileSize(String.valueOf(attachmentData.length))
    .fileName(document.getName())
    .mimeType(document.getMimeType())
    .linkedSAPObjectKey(document.getKey())
    .businessObjectType("EXAMPLE")
    .build();
This attachmentContent variable is then provided when we call the createAttachmentContent() method on DefaultAPICVATTACHMENTSRVService, but somewhere along the way the content seems to be replaced with the JSON containing both the data and the metadata.
The command's run() method looks like this:
private Map<String, String> getUploadHeaders() {
    Map<String, String> headers = new HashMap<>();
    headers.put("Slug", attachmentContent.getFileName());
    headers.put("Content-Type", attachmentContent.getMimeType());
    headers.put("BusinessObjectTypeName", "EXAMPLE");
    headers.put("LinkedSAPObjectKey", attachmentContent.getLinkedSAPObjectKey());
    return headers;
}
...
defaultAttachmentsService
    .createAttachmentContent(attachmentContent)
    .withCustomHttpHeaders(getUploadHeaders()).onRequestAndImplicitRequests()
    .execute(getConfigContext());
Right now we are unsure whether this is caused by something we are missing in our code or by a bug somewhere in this service. Could you please assist us with this issue?
Hi team,
I'm trying to build an app that reads info from SFSF following these steps:
mvn archetype:generate "-DarchetypeGroupId=com.sap.cloud.sdk.archetypes" "-DarchetypeArtifactId=scp-cf-tomee" "-DarchetypeVersion=RELEASE"
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <scope>provided</scope>
</dependency>
In plugins:
<plugin>
    <groupId>com.sap.cloud.sdk.datamodel</groupId>
    <artifactId>odata-generator-maven-plugin</artifactId>
    <version>3.13.0</version>
    <executions>
        <execution>
            <id>generate-consumption</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputDirectory>${project.basedir}/edmx</inputDirectory>
                <outputDirectory>${project.build.directory}/vdm</outputDirectory>
                <defaultBasePath>/odata/v2</defaultBasePath>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>${project.basedir}/vdm</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>
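One detail worth double-checking in the snippets above: the generator writes to ${project.build.directory}/vdm, while build-helper registers ${project.basedir}/vdm as a source root. If both are meant to point at the generated sources, aligning them would look like this (a guess at the intended layout, not a confirmed fix):

```xml
<!-- build-helper source root matching the generator's outputDirectory -->
<sources>
    <source>${project.build.directory}/vdm</source>
</sources>
```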
package com.sap.sdk;

import com.google.gson.Gson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;

import com.sap.cloud.sdk.cloudplatform.connectivity.DestinationAccessor;
import com.sap.cloud.sdk.odatav2.connectivity.ODataException;
import com.sap.cloud.sdk.s4hana.connectivity.DefaultErpHttpDestination;
import com.sap.cloud.sdk.s4hana.connectivity.ErpHttpDestination;
import com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.rcmjobrequisition.JobRequisition;
import com.sap.cloud.sdk.s4hana.datamodel.odata.services.DefaultRCMJobRequisitionService;

@WebServlet("/req")
public class JobReqServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private static final Logger logger = LoggerFactory.getLogger(JobReqServlet.class);

    private final ErpHttpDestination destination = DestinationAccessor.getDestination("sfsf-sdk-dest").asHttp()
            .decorate(DefaultErpHttpDestination::new);

    @Override
    protected void doGet(final HttpServletRequest request, final HttpServletResponse response)
            throws ServletException, IOException {
        try {
            final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
                    .getAllJobRequisition()
                    .execute(destination);
            response.setContentType("application/json");
            response.getWriter().write(new Gson().toJson(jobReqs));
        } catch (final ODataException e) {
            logger.error(e.getMessage(), e);
            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
            response.getWriter().write(e.getMessage());
        }
    }
}
With all this (I think I'm not missing anything), I do:
mvn clean install
and:
cf push
Everything works well, the hello world servlet works, but when I try to access /req, I get a:
Unable to execute metadata request.
However, I can see that the app is hitting SFSF because if I play with the base path of the service (in the pom.xml) I get 404's coming from SFSF.
Checking everything, I see this when the VDM generator is running:
<defaultBasePath>/odata/v2</defaultBasePath>
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.DataModelGenerator - Default base path: /odata/v2/
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Title: RCMJobRequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Raw URL: /odata/v2/SFODataSet
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Java Package Name: rcmjobrequisition
[main] INFO com.sap.cloud.sdk.datamodel.odata.generator.ODataToVdmGenerator - Java Class Name: RCMJobRequisition
Clearly, that SFODataSet in the URL is not correct. When the app runs, it's trying to get the metadata from .../odata/v2/SFODataSet/$metadata, and that's why it's not finding it.
That SFODataSet is coming from the SFSF metadata:
<Schema Namespace="SFODataSet" xmlns="http://schemas.microsoft.com/ado/2008/09/edm" xmlns:sf="http://www.successfactors.com/edm/sf" xmlns:sap="http://www.sap.com/Protocols/SAPData">
  <EntityContainer Name="EntityContainer" m:IsDefaultEntityContainer="true">
    <EntitySet Name="JobOfferTemplate_Standard_Offer_Details" EntityType="SFOData.JobOfferTemplate_Standard_Offer_Details" sap:label="JobOfferTemplate_Standard_Offer_Details" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
      <Documentation>
        <Summary>Job Requisition Template</Summary>
        <LongDescription>These entities represent the job requisition template as defined in provisioning.</LongDescription>
        <sap:tagcollection>
          <sap:tag>Recruiting (RCM)</sap:tag>
          <sap:tag>RCM - Job Requisition</sap:tag>
        </sap:tagcollection>
      </Documentation>
    </EntitySet>
    <EntitySet Name="JobRequisitionLocale" EntityType="SFOData.JobRequisitionLocale" sap:label="JobRequisitionLocale" sap:creatable="false" sap:updatable="false" sap:upsertable="false" sap:deletable="false">
      <Documentation>
        ...
I tried something else. In the Java class, I added
.withServicePath("odata/v2")
to the call to the destination, leaving it like this:
final List<JobRequisition> jobReqs = new DefaultRCMJobRequisitionService()
    .withServicePath("odata/v2/JobRequisition")
    .getAllJobRequisition()
    .execute(destination);
That replaces the /odata/v2/SFODataSet URL that was generated. In this case, the error I get is:
"com.sap.cloud.sdk.odatav2.connectivity.ODataQuery","thread":"http-nio-0.0.0.0-8080-exec-3","level":"ERROR","categories":[],"msg":"Failed to convert response into ODataFeed: An exception of type 'EdmSimpleTypeException' occurred." }
I've tried different values: odata/v2, odata/v2/, odata/v2/JobRequisition...
I can't find a way to make this work. Can you help me find the issue here?
I'm using:
Kr,
kepair
Hello colleagues,
Our team is working with the change master API and one of our requirements is to create a change master and assign both materials and task lists to it.
We are using the API via the SDK with generated sources, as a workaround recommended in #69. The create works fine when no associations are assigned. However, once we add a material to the change request before creating it, the S4 backend returns a 400 error:
The endpoint responded with HTTP error code 400.
Object type &Material& is not permitted for objects of change number
Our code to create a new change master looks like this:
APIForChangeNumber changeMaster = new APIForChangeNumber();
changeMaster.setValidFrom(validFrom);
changeMaster.setDescription(changeRequest.getDescription());
changeMaster.setStatus("1");
APIForChangeNumberObjectManagementRecordMaterial s4Material = new APIForChangeNumberObjectManagementRecordMaterial();
s4Material.setMaterial(material.getMaterialId());
changeMaster.addChangeMstrObMgReMaterial(s4Material);
Could you please assist us? I am wondering whether this is incorrect usage of the SDK, an S4 misconfiguration, or a known issue when using metadata-generated sources.
Thanks!
Tiago
Hello colleagues,
I am working with the Change Master API and we are having some issues regarding class cast exceptions. The exception stack trace is below:
java.lang.ClassCastException: java.util.GregorianCalendar cannot be cast to java.lang.Double
at com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.changemaster.ChangeMaster.fromMap(ChangeMaster.java:1073)
at com.sap.cloud.sdk.s4hana.datamodel.odata.helper.FluentHelperCreate.buildEntityFromODataResult(FluentHelperCreate.java:177)
at com.sap.cloud.sdk.s4hana.datamodel.odata.helper.FluentHelperCreate.execute(FluentHelperCreate.java:165)
at com.sap.cloudscame.bmsmoc.s4.command.CreateChangeMasterV2Command.run(CreateChangeMasterV2Command.java:35)
This happens whenever a GET is executed with all properties of the change master selected. The same error also occurs when we create a new change master.
We are using scp-neo with SDK version 2.14.0.
Could you please assist us on this issue?
Thanks,
Tiago
Hello colleagues,
We are working with the Change Master API and we are having some issues updating the status.
We are calling the API as:
this.defaultChangeMasterService
    .updateAPIForChangeNumber(changeMaster)
    .execute(getConfigContext())
While debugging, we could validate that changeMaster is set as:
APIForChangeNumber(super=VdmObject(customFields={}, changedOriginalFields={Status=2}), changeNumber=12345, status=1, authorizationGroup=, function=, relTechnically=, releaseKey=0, reasonForChange=123, description=123, validFrom=9999-01-01T00:00, createdOn=2019-05-07T00:00, createdBy=SCP_RFC, changedOn=null, changedBy=, inUse=false, deletionFlag=false, timeStamp=1972-04-01T00:00Z[GMT], toAlternativeDate=null, toChangeMstrObjectMgmtRecord=null, toChangeMstrObMgReDocInfoRecd=null, toChangeMstrObMgReMaterial=null, toChangeMstrObMgReMatlBOM=null, toCharacteristics=null, toClassification=null, toObjTypeAssignment=null)
However, we get an HTTP 501 error with the message:
The endpoint responded with HTTP error code 501.
PATCH requests require components to be updated
Full error message:
{
  "error": {
    "code": "/IWFND/CM_MGW/096",
    "message": {
      "lang": "en",
      "value": "PATCH requests require components to be updated"
    },
    "innererror": {
      "application": {
        "component_id": "PLM-WUI-OBJ-ECN",
        "service_namespace": "/SAP/",
        "service_id": "API_CHANGEMASTER",
        "service_version": "0002"
      },
      "transactionid": "83...10",
      "timestamp": "20190507201621.8351780",
      "Error_Resolution": {
        "SAP_Transaction": "For backend administrators: run transaction /IWFND/ERROR_LOG on SAP Gateway hub system and search for entries with the timestamp above for more details",
        "SAP_Note": "See SAP Note 1797736 for error analysis (https://service.sap.com/sap/support/notes/1797736)"
      },
      "errordetails": []
    }
  }
}
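For comparison, an OData V2 PATCH is expected to carry only the fields being changed. A request of the shape that works from Postman might look like this (the host is omitted and the entity set name and key are hypothetical, shown only to illustrate the minimal body):

```http
PATCH /sap/opu/odata/sap/API_CHANGEMASTER/ChangeMaster('12345') HTTP/1.1
Content-Type: application/json

{ "Status": "2" }
```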
Do you see anything that we could be doing wrong? When we call the API from Postman, it works fine. Let me know if there's any additional info we could provide.
Thanks,
Rafael
A violation against the OSS Rules of Play has been detected.
Rule ID: rl-reuse_tool-3
Explanation: Is it registered in REUSE? No
Find more information at: https://sap.github.io/fosstars-rating-core/oss_rules_of_play_rating.html
Hi Colleagues,
We are trying to consume the S/4HANA SDK, especially ODataQueryBuilder. In our scenario, the customer will maintain the destination. The withEntity method needs two parameters: a servicePath and an entity.
It would not be good to ask the customer to maintain n destinations with the same URL (identical up to the service path) and different passwords just so we can connect to many services.
When we put the complete URL, including the service path, into the destination and pass "/" as the servicePath in our code, we get a 502:
ODataQueryBuilder.withEntity("/", "")
Could you please help us here?
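As an illustration of the split involved, a single destination can carry only the host part, with the per-service path supplied in code instead of "/". A minimal sketch in plain Java (the URL and service name below are made up):

```java
import java.net.URI;

// Splits one full service URL into a host-only destination URL and a
// per-service path, so a single destination can be reused for many services.
public class DestinationSplit {
    public static void main(String[] args) {
        // Made-up example URL; in practice this comes from the customer's system.
        URI full = URI.create("https://my-s4-host.example.com/sap/opu/odata/sap/API_BUSINESS_PARTNER");
        String destinationUrl = full.getScheme() + "://" + full.getAuthority(); // maintained once in the destination
        String servicePath = full.getPath();                                    // varies per service, passed in code
        System.out.println(destinationUrl); // https://my-s4-host.example.com
        System.out.println(servicePath);    // /sap/opu/odata/sap/API_BUSINESS_PARTNER
    }
}
```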
Hello Colleagues,
I'm trying to extend the S/4 SDK to Hybris Marketing Cloud, but unfortunately there's no archetype to do so. Is there a way I can extend this SDK to create a new SDK? So far I've only seen templates for packaging an executable, nothing in terms of a library.
Regards,
Anish
A violation against the OSS Rules of Play has been detected.
Rule ID: rl-reuse_tool-4
Explanation: Is it compliant with REUSE rules? No
Find more information at: https://sap.github.io/fosstars-rating-core/oss_rules_of_play_rating.html
A violation against the OSS Rules of Play has been detected.
Rule ID: rl-reuse_tool-1
Explanation: Does README mention REUSE? No
Find more information at: https://sap.github.io/fosstars-rating-core/oss_rules_of_play_rating.html
Please solve immediately.