
roadrunner-temporal's Introduction


RoadRunner Temporal

The repository contains a number of plugins that enable workflow and activity processing for PHP processes. The communication protocol, supervisor, and load balancer are based on the RoadRunner PHP application server.

Installation

Temporal is an official RoadRunner plugin and is available out of the box in RoadRunner >= 2023.0.

Read more about application server installation here.

To install PHP-SDK:

$ composer require temporal/sdk
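
A minimal .rr.yaml sketch enabling the plugin (an illustration, assuming a local Temporal server on the default port; the key names mirror the config excerpts quoted in the issues below):

version: "3"

server:
  command: "php worker.php"

temporal:
  address: "127.0.0.1:7233"
  activities:
    num_workers: 4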

License

MIT License

roadrunner-temporal's People

Contributors

cv65kr, dependabot[bot], rustatian, shanginn, wolfy-j


roadrunner-temporal's Issues

[๐Ÿ› BUG?]: MaxConcurrentActivityExecutionSize ignored.

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

MaxConcurrentActivityExecutionSize seems to be ignored. It is supposed to cap the task_slots_available metric (https://docs.temporal.io/docs/operation/how-to-tune-workers/#metrics), but it doesn't. On the Go SDK it works.

In both Go and PHP workers the default is 1000, but if I set it to 1 on PHP, the metric stays at 1000; when I set it to 1 on a pure Go worker, it drops to 1 as it should.

The logs confirm that RR is receiving the right value.

Any ideas? Maybe I'm setting something up wrong? 🤔
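
For reference, a sketch of the pure-Go comparison worker described above (assuming a current go.temporal.io/sdk; MyActivity is a stand-in name):

package main

import (
	"context"
	"log"

	"go.temporal.io/sdk/client"
	"go.temporal.io/sdk/worker"
)

// MyActivity is a placeholder activity used only for illustration.
func MyActivity(ctx context.Context, input string) (string, error) {
	return input, nil
}

func main() {
	c, err := client.Dial(client.Options{})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// With the Go SDK, limiting concurrent activity executions to 1
	// drops task_slots_available to 1 as expected.
	w := worker.New(c, "test_queue", worker.Options{
		MaxConcurrentActivityExecutionSize: 1,
	})
	w.RegisterActivity(MyActivity)
	if err := w.Run(worker.InterruptCh()); err != nil {
		log.Fatal(err)
	}
}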

Version

2.8.4

Relevant log output

2022-03-22T19:12:33.968Z	DEBUG	temporal    	received message	{"data": "\n\ufffd&*\ufffd&\n\ufffd\r\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u000c{\"TaskQueue\":\"test_queue\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":1,\
"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[....]
{\"TaskQueue\":\"test_conversion\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":1,\
"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[{\"Name\":\"ConverterWorkflow\",\"Queries\":[],\"Signals\":[]}],\"Activities\":[....]

[BUG] RoadRunner starts 2 temporal workers even if the config sets more

Describe the bug

I set temporal.activities.num_workers to 4 in the .rr.yaml file, but when I start RR I see only 2 workers running.
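
For reference, the relevant part of the config presumably looks like this (a sketch based on the setting described above):

temporal:
  address: "temporal:7233"
  activities:
    num_workers: 4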

To Reproduce

  1. built the Docker container from https://github.com/temporalio/samples-php/blob/master/Dockerfile
  2. took the code from https://github.com/wolfy-j/temporal-workshop and mounted it as a Docker volume at /var/app/workshop
  3. ran the container in interactive mode with bash (docker run -v /Volumes/Workspace/projects/temporal-workshop/temporal-workshop:/var/app/workshop -it temporal-php-samples bash)
  4. started RR with rr serve -c /var/app/workshop/.rr.yaml -w /var/app/workshop and saw this result:
root@2e358f6cd379:/var/app/workshop# rr serve -w . -c ./.rr.yaml
2021/08/16 18:22:06	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "12560@2e358f6cd379@default@1"}
2021/08/16 18:22:09	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "12560@2e358f6cd379@default@2"}

Expected behavior

Expected to see 4 workers running, because the config sets temporal.activities.num_workers = 4.

Screenshots/Terminal output

Versions

Additional context

nope

proto_codec_parse_message: invalid character '6' looking for beginning of value

Expected Behavior

The workflow task should be processed without a panic.

Actual Behavior

proto_codec_parse_message: invalid character '6' looking for beginning of value

process event for order.creation [panic]:
github.com/temporalio/roadrunner-temporal/v4/aggregatedpool.(*Workflow).OnWorkflowTaskStarted(0xc000e0a380, 0x4b?)
    github.com/temporalio/roadrunner-temporal/[email protected]/aggregatedpool/workflow.go:184 +0x3e7
go.temporal.io/sdk/internal.(*workflowExecutionEventHandlerImpl).ProcessEvent(0xc000b2c0a8, 0xc0008a3c70, 0xa0?, 0x1)
    go.temporal.io/[email protected]/internal/internal_event_handlers.go:1143 +0x225
go.temporal.io/sdk/internal.(*workflowExecutionContextImpl).ProcessWorkflowTask(0xc00137e000, 0xc0001cde90)
    go.temporal.io/[email protected]/internal/internal_task_handlers.go:1100 +0x1608
go.temporal.io/sdk/internal.(*workflowTaskHandlerImpl).ProcessWorkflowTask(0xc000c224e0, 0xc0001cde90, 0xc00137e000, 0xc0014b71a0)
    go.temporal.io/[email protected]/internal/internal_task_handlers.go:889 +0x3c5
go.temporal.io/sdk/internal.(*workflowTaskPoller).processWorkflowTask(0xc000e40360, 0xc0001cde90)
    go.temporal.io/[email protected]/internal/internal_task_pollers.go:357 +0x3c3
go.temporal.io/sdk/internal.(*workflowTaskPoller).ProcessTask(0xc000e40360, {0x1a2c3a0, 0xc0001cde90})
    go.temporal.io/[email protected]/internal/internal_task_pollers.go:321 +0x78
go.temporal.io/sdk/internal.(*baseWorker).processTask(0xc000e882c0, {0x1a2cea0, 0xc000d355e0})
    go.temporal.io/[email protected]/internal/internal_worker_base.go:518 +0x153
go.temporal.io/sdk/internal.(*baseWorker).processTaskAsync.func1()
    go.temporal.io/[email protected]/internal/internal_worker_base.go:369 +0x45
created by go.temporal.io/sdk/internal.(*baseWorker).processTaskAsync in goroutine 284
    go.temporal.io/[email protected]/internal/internal_worker_base.go:365 +0xa5

Steps to Reproduce the Problem

  1. Random workflow in production :(

Specifications

  • Version: v1.26.0-rc.3
  • Platform: RoadRunner Temporal v4.6.1

github.com/uber-go/tally/v4-v4.1.7: 2 vulnerabilities (highest severity is: 7.5)

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.7

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.7.zip

Found in HEAD commit: f4bf0385c1caa5a0b5b29221292d5b1a64b431a9

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in (github.com/uber-go/tally/v4-v4.1.7 version) | Remediation Possible** |
| --- | --- | --- | --- | --- | --- | --- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.7 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.7 | Direct | 0.13.0 | ❌ |

**In some cases, a remediation PR cannot be created automatically for a vulnerability even though a remediation is available.

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.7

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.7.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.7 (Vulnerable Library)

Found in HEAD commit: f4bf0385c1caa5a0b5b29221292d5b1a64b431a9

Found in base branch: master

Vulnerability Details

In Apache Thrift, all versions up to and including 0.12.0, a server or client may run into an endless loop when fed specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.7

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.7.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.7 (Vulnerable Library)

Found in HEAD commit: f4bf0385c1caa5a0b5b29221292d5b1a64b431a9

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

[๐Ÿ› BUG]: Incorrect worker pool recreating

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

Greetings.
I was investigating the reasons for some errors when processing workflows with many activities (e.g. 100+) and found something strange when worker pools are recreated during workflow execution, especially when the max_jobs option for activities is used in the .rr.yaml file, because it causes the pool to be recreated more often (see the config sketch below).
Sometimes this simply causes an activity execution timeout, but sometimes it causes a workflow task execution error, and as a result the whole workflow fails.
On different setups the errors vary slightly, but in both cases the stack trace of the Go panic points to aggregatedpool/workflow.go:153.
Also, if at least the workflow task worker pool restarts successfully (i.e. "workflow pool restarted" appears in the log), everything works fine.
So for now I suppress exceptions in my workflows and turn off max_jobs, but IMHO this is not a good solution.
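
For reference, a sketch of the limits in question in .rr.yaml (values are illustrative; the structure mirrors the config excerpts elsewhere in this tracker):

temporal:
  address: "127.0.0.1:7233"
  activities:
    num_workers: 4
    max_jobs: 20 # recreate the worker after 20 executions
    supervisor:
      ttl: 180s
      max_worker_memory: 80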

Version

PHP 8.1.3
Roadrunner 2.10.3 (temporal plugin version 1.4.4)
Temporal PHP SDK 1.3.2

Relevant log output

Log 1:

DEBUG	server      	worker stopped	{"internal_event_name": "EventWorkerWaitExit", "error": "signal: killed; process_wait: signal: killed", "errorCauses": [{"error": "signal: killed"}, {"error": "process_wait: signal: killed"}]}
DEBUG	temporal    	worker stopped, restarting pool and temporal workers	{"message": "process exited"}
INFO	temporal    	reset signal received, resetting activity and workflow worker pools
DEBUG	server      	worker is allocated	{"pid": 473, "internal_event_name": "EventWorkerConstruct"}
DEBUG	server      	worker destroyed	{"pid": 461, "internal_event_name": "EventWorkerDestruct"}
DEBUG	server      	worker is allocated	{"pid": 477, "internal_event_name": "EventWorkerConstruct"}
INFO	temporal    	workflow pool restarted
DEBUG	server      	worker destroyed	{"pid": 473, "internal_event_name": "EventWorkerDestruct"}
DEBUG	server      	worker destroyed	{"pid": 466, "internal_event_name": "EventWorkerDestruct"}
DEBUG	temporal    	workflow task started	{"time": "1s"}
DEBUG	temporal    	outgoing message	{"id": 9266, "data": "", "context": ""}
DEBUG	server      	worker is allocated	{"pid": 482, "internal_event_name": "EventWorkerConstruct"}
DEBUG	server      	worker is allocated	{"pid": 481, "internal_event_name": "EventWorkerConstruct"}
INFO	temporal    	activity pool restarted
WARN	temporal    	Failed to poll for task.	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:071a4091-8f8e-4dd9-bf4b-17275481ef76", "WorkerType": "WorkflowWorker", "Error": "worker stopping"}
WARN	server      	worker stopped, and will be restarted	{"reason": "worker error", "pid": 477, "internal_event_name": "EventWorkerError", "error": "sync_worker_exec: SoftJobError:\n\tsync_worker_exec_payload: LogicException: Got the response to undefined request 9266 in /app/vendor/temporal/sdk/src/Internal/Transport/Client.php:60\nStack trace:\n#0 /app/vendor/temporal/sdk/src/WorkerFactory.php(389): Temporal\\Internal\\Transport\\Client->dispatch(Object(Temporal\\Worker\\Transport\\Command\\SuccessResponse))\n#1 /app/vendor/temporal/sdk/src/WorkerFactory.php(261): Temporal\\WorkerFactory->dispatch('\\n'\\x08\\xB2H*\"\\n \\n\\x16\\n\\x08en...', Array)\n#2 /app/vendor/spiral/temporal-bridge/src/Dispatcher.php(61): Temporal\\WorkerFactory->run()\n#3 /app/vendor/spiral/framework/src/Core/src/ContainerScope.php(46): Spiral\\TemporalBridge\\Dispatcher->serve()\n#4 /app/vendor/spiral/framework/src/Core/src/Container.php(282): Spiral\\Core\\ContainerScope::runScope(Object(Spiral\\Core\\Container), Array)\n#5 /app/vendor/spiral/framework/src/Boot/src/AbstractKernel.php(212): Spiral\\Core\\Container->runScope(Array, Array)\n#6 /app/app.php(39): Spiral\\Boot\\AbstractKernel->serve()\n#7 {main}"}
ERROR	temporal    	Workflow panic	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:071a4091-8f8e-4dd9-bf4b-17275481ef76", "WorkflowType": "many_action_test", "WorkflowID": "cfbc01e1-6880-48da-95df-b876bcde4165", "RunID": "f280e334-79c3-4c9b-a750-4c7c0064d38a", "Attempt": 1, "Error": "sync_worker_exec: SoftJobError:\n\tsync_worker_exec_payload: LogicException: Got the response to undefined request 9266 in /app/vendor/temporal/sdk/src/Internal/Transport/Client.php:60\nStack trace:\n#0 /app/vendor/temporal/sdk/src/WorkerFactory.php(389): Temporal\\Internal\\Transport\\Client->dispatch(Object(Temporal\\Worker\\Transport\\Command\\SuccessResponse))\n#1 /app/vendor/temporal/sdk/src/WorkerFactory.php(261): Temporal\\WorkerFactory->dispatch('\\n'\\x08\\xB2H*\"\\n \\n\\x16\\n\\x08en...', Array)\n#2 /app/vendor/spiral/temporal-bridge/src/Dispatcher.php(61): Temporal\\WorkerFactory->run()\n#3 /app/vendor/spiral/framework/src/Core/src/ContainerScope.php(46): Spiral\\TemporalBridge\\Dispatcher->serve()\n#4 /app/vendor/spiral/framework/src/Core/src/Container.php(282): Spiral\\Core\\ContainerScope::runScope(Object(Spiral\\Core\\Container), Array)\n#5 /app/vendor/spiral/framework/src/Boot/src/AbstractKernel.php(212): Spiral\\Core\\Container->runScope(Array, Array)\n#6 /app/app.php(39): Spiral\\Boot\\AbstractKernel->serve()\n#7 {main}", "StackTrace": "process event for default [panic]:\ngithub.com/temporalio/roadrunner-temporal/aggregatedpool.(*Workflow).OnWorkflowTaskStarted(0xc00088ad20, 0xc001b5e340?)\n\tgithub.com/temporalio/[email protected]/aggregatedpool/workflow.go:153 +0x2e8\ngo.temporal.io/sdk/internal.(*workflowExecutionEventHandlerImpl).ProcessEvent(0xc002059860, 0xc001b5e400, 0x60?, 0x1)\n\tgo.temporal.io/[email protected]/internal/internal_event_handlers.go:815 +0x203\ngo.temporal.io/sdk/internal.(*workflowExecutionContextImpl).ProcessWorkflowTask(0xc002041570, 0xc00078b9e0)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:878 +0xca8\ngo.temporal.io/sdk/internal.(*workflowTaskHandlerImpl).ProcessWorkflowTask(0xc0007d9130, 0xc00078b9e0, 0xc00078bd70)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:727 +0x485\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).processWorkflowTask(0xc001ff7ad0, 0xc00078b9e0)\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:284 +0x2cd\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).ProcessTask(0xc001ff7ad0, {0x15b7080?, 0xc00078b9e0?})\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:255 +0x6c\ngo.temporal.io/sdk/internal.(*baseWorker).processTask(0xc0006e0f00, {0x15b6c40?, 0xc001aa56c0})\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:398 +0x167\ncreated by go.temporal.io/sdk/internal.(*baseWorker).runTaskDispatcher\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:302 +0xb5"}
WARN	temporal    	Failed to process workflow task.	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:071a4091-8f8e-4dd9-bf4b-17275481ef76", "WorkflowType": "many_action_test", "WorkflowID": "cfbc01e1-6880-48da-95df-b876bcde4165", "RunID": "f280e334-79c3-4c9b-a750-4c7c0064d38a", "Attempt": 1, "Error": "sync_worker_exec: SoftJobError:\n\tsync_worker_exec_payload: LogicException: Got the response to undefined request 9266 in /app/vendor/temporal/sdk/src/Internal/Transport/Client.php:60\nStack trace:\n#0 /app/vendor/temporal/sdk/src/WorkerFactory.php(389): Temporal\\Internal\\Transport\\Client->dispatch(Object(Temporal\\Worker\\Transport\\Command\\SuccessResponse))\n#1 /app/vendor/temporal/sdk/src/WorkerFactory.php(261): Temporal\\WorkerFactory->dispatch('\\n'\\x08\\xB2H*\"\\n \\n\\x16\\n\\x08en...', Array)\n#2 /app/vendor/spiral/temporal-bridge/src/Dispatcher.php(61): Temporal\\WorkerFactory->run()\n#3 /app/vendor/spiral/framework/src/Core/src/ContainerScope.php(46): Spiral\\TemporalBridge\\Dispatcher->serve()\n#4 /app/vendor/spiral/framework/src/Core/src/Container.php(282): Spiral\\Core\\ContainerScope::runScope(Object(Spiral\\Core\\Container), Array)\n#5 /app/vendor/spiral/framework/src/Boot/src/AbstractKernel.php(212): Spiral\\Core\\Container->runScope(Array, Array)\n#6 /app/app.php(39): Spiral\\Boot\\AbstractKernel->serve()\n#7 {main}"}
DEBUG	temporal    	sequenceID	{"before": 38}
DEBUG	temporal    	sequenceID	{"after": 39}
DEBUG	server      	worker stopped	{"internal_event_name": "EventWorkerWaitExit", "error": "signal: killed; process_wait: signal: killed", "errorCauses": [{"error": "signal: killed"}, {"error": "process_wait: signal: killed"}]}
DEBUG	temporal    	outgoing message	{"id": 39, "data": "", "context": ""}
DEBUG	server      	worker is allocated	{"pid": 489, "internal_event_name": "EventWorkerConstruct"}
DEBUG	temporal    	received message	{"command": null, "id": 39, "data": "\n\ufffd\u000f\u0008'\"\ufffd\u000f\ndUnable to kill workflow because workflow process #f280e334-79c3-4c9b-a750-4c7c0064d38a was not found\u0012\u0007PHP_SDK\u001a\ufffd\u000eInvalidArgumentException: Unable to kill workflow because workflow process #f280e334-79c3-4c9b-a750-4c7c0064d38a was not found in /app/vendor/temporal/sdk/src/Internal/Transport/Router/DestroyWorkflow.php:48\nStack trace:\n#0 /app/vendor/temporal/sdk/src/Internal/Transport/Router/DestroyWorkflow.php(34): Temporal\\Internal\\Transport\\Router\\DestroyWorkflow->kill('f280e334-79c3-4...')\n#1 /app/vendor/temporal/sdk/src/Internal/Transport/Router.php(81): Temporal\\Internal\\Transport\\Router\\DestroyWorkflow->handle(Object(Temporal\\Worker\\Transport\\Command\\Request), Array, Object(React\\Promise\\Deferred))\n#2 /app/vendor/temporal/sdk/src/Worker/Worker.php(94): Temporal\\Internal\\Transport\\Router->dispatch(Object(Temporal\\Worker\\Transport\\Command\\Request), Array)\n#3 /app/vendor/temporal/sdk/src/WorkerFactory.php(413): Temporal\\Worker\\Worker->dispatch(Object(Temporal\\Worker\\Transport\\Command\\Request), Array)\n#4 /app/vendor/temporal/sdk/src/Internal/Transport/Server.php(60): Temporal\\WorkerFactory->onRequest(Object(Temporal\\Worker\\Transport\\Command\\Request), Array)\n#5 /app/vendor/temporal/sdk/src/WorkerFactory.php(387): Temporal\\Internal\\Transport\\Server->dispatch(Object(Temporal\\Worker\\Transport\\Command\\Request), Array)\n#6 /app/vendor/temporal/sdk/src/WorkerFactory.php(261): Temporal\\WorkerFactory->dispatch('\\nE\\x08'\\x12\\x0FDestroyWo...', Array)\n#7 /app/vendor/spiral/temporal-bridge/src/Dispatcher.php(61): Temporal\\WorkerFactory->run()\n#8 /app/vendor/spiral/framework/src/Core/src/ContainerScope.php(46): Spiral\\TemporalBridge\\Dispatcher->serve()\n#9 /app/vendor/spiral/framework/src/Core/src/Container.php(282): Spiral\\Core\\ContainerScope::runScope(Object(Spiral\\Core\\Container), Array)\n#10 /app/vendor/spiral/framework/src/Boot/src/AbstractKernel.php(212): Spiral\\Core\\Container->runScope(Array, Array)\n#11 /app/app.php(39): Spiral\\Boot\\AbstractKernel->serve()\n#12 {main}*\u001a\n\u0018InvalidArgumentException"}
WARN	temporal    	Failed to poll for task.	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:071a4091-8f8e-4dd9-bf4b-17275481ef76", "WorkerType": "ActivityWorker", "Error": "worker stopping"}
INFO	temporal    	Stopped Worker	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:071a4091-8f8e-4dd9-bf4b-17275481ef76"}
DEBUG	temporal    	outgoing message	{"id": 0, "data": "", "context": ""}
DEBUG	temporal    	received message	{"command": null, "id": 0, "data": "\n\ufffd\u0004*\ufffd\u0004\n\ufffd\u0004\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u0004{\"TaskQueue\":\"default\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":0,\"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[{\"Name\":\"many_action_test\",\"Queries\":[],\"Signals\":[]}],\"Activities\":[{\"Name\":\"exec\"}]}"}
DEBUG	temporal    	worker info	{"taskqueue": "default", "options": {"MaxConcurrentActivityExecutionSize":0,"WorkerActivitiesPerSecond":0,"MaxConcurrentLocalActivityExecutionSize":0,"WorkerLocalActivitiesPerSecond":0,"TaskQueueActivitiesPerSecond":0,"MaxConcurrentActivityTaskPollers":0,"MaxConcurrentWorkflowTaskExecutionSize":0,"MaxConcurrentWorkflowTaskPollers":0,"EnableLoggingInReplay":false,"DisableStickyExecution":false,"StickyScheduleToStartTimeout":0,"BackgroundActivityContext":null,"WorkflowPanicPolicy":0,"WorkerStopTimeout":0,"EnableSessionWorker":false,"MaxConcurrentSessionExecutionSize":1000,"DisableWorkflowWorker":false,"LocalActivityWorkerOnly":false,"Identity":"","DeadlockDetectionTimeout":0,"MaxHeartbeatThrottleInterval":0,"DefaultHeartbeatThrottleInterval":0,"Interceptors":null}}
DEBUG	temporal    	workflow registered	{"taskqueue": "default", "workflow name": "many_action_test"}
DEBUG	temporal    	activity registered	{"taskqueue": "default", "workflow name": "exec"}
DEBUG	temporal    	workers initialized	{"num_workers": 1}
INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:ef206c5e-a1d0-4c3b-8404-fe99e29fa9fc"}
DEBUG	temporal    	worker stopped, restarting pool and temporal workers	{"message": "process exited"}
INFO	temporal    	reset signal received, resetting activity and workflow worker pools
DEBUG	server      	worker destroyed	{"pid": 489, "internal_event_name": "EventWorkerDestruct"}
DEBUG	server      	worker is allocated	{"pid": 493, "internal_event_name": "EventWorkerConstruct"}
INFO	temporal    	workflow pool restarted



Log 2:

static_pool_exec: Workers watcher stopped:\n\tstatic_pool_exec:\n\tworker_watcher_get_free_worker", "StackTrace": "process event for default [panic]:\ngithub.com/temporalio/roadrunner-temporal/aggregatedpool.(*Workflow).OnWorkflowTaskStarted(0xc0008914a0, 0xc0011ac5c0?)\n\tgithub.com/temporalio/[email protected]/aggregatedpool/workflow.go:153 +0x2e8\ngo.temporal.io/sdk/internal.(*workflowExecutionEventHandlerImpl).ProcessEvent(0xc000f12600, 0xc0011ac680, 0x60?, 0x1)\n\tgo.temporal.io/[email protected]/internal/internal_event_handlers.go:815 +0x203\ngo.temporal.io/sdk/internal.(*workflowExecutionContextImpl).ProcessWorkflowTask(0xc00019a5b0, 0xc0011b4d80)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:878 +0xca8\ngo.temporal.io/sdk/internal.(*workflowTaskHandlerImpl).ProcessWorkflowTask(0xc000fd49a0, 0xc0011b4d80, 0xc0011b5110)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:727 +0x485\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).processWorkflowTask(0xc00071a5b0, 0xc0011b4d80)\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:284 +0x2cd\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).ProcessTask(0xc00071a5b0, {0x15b7080?, 0xc0011b4d80?})\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:255 +0x6c\ngo.temporal.io/sdk/internal.(*baseWorker).processTask(0xc000544280, {0x15b6c40?, 0xc000efa260})\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:398 +0x167\ncreated by go.temporal.io/sdk/internal.(*baseWorker).runTaskDispatcher\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:302 +0xb5

[💡 FEATURE REQUEST]: Support LocalActivities

Plugin

Temporal

I have an idea!

Tracking ticket to support LA in the temporal plugin:

  • Use 1 temporal worker to register all workflows and activities available in FetchWorkerInfo (see the Go SDK sketch after this list).
  • Add new command type to the internal.Message -> internal.InvokeLocalActivity.
  • Add 2 new messages: ExecuteLocalActivityRequest, RequestLocalActivityCancel.
  • Update arch diagram (draw.io)
  • Update tests
  • Move all protobuf to the buf protoregistry (https://buf.build/roadrunner-server/api/docs)
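
For context, a sketch of what the new ExecuteLocalActivityRequest message would map onto on the Go SDK side (go.temporal.io/sdk/workflow; this is the SDK-facing API, not the plugin's internal protocol, and MyLocalActivity is a stand-in name):

package sample

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// MyLocalActivity is a placeholder activity used only for illustration.
func MyLocalActivity(ctx context.Context, input string) (string, error) {
	return input, nil
}

// SampleWorkflow runs the activity as a local activity: it executes inside
// the worker process, without a round trip through the matching service.
func SampleWorkflow(ctx workflow.Context) (string, error) {
	ctx = workflow.WithLocalActivityOptions(ctx, workflow.LocalActivityOptions{
		ScheduleToCloseTimeout: 5 * time.Second,
	})

	var result string
	err := workflow.ExecuteLocalActivity(ctx, MyLocalActivity, "input").Get(ctx, &result)
	return result, err
}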

[Bug] Unable to start with only one worker (for Xdebug)

Describe the bug
When using this .rr.yaml configuration:

temporal:
  address: ${TEMPORAL_CLI_ADDRESS}
  namespace: ${TEMPORAL_NAMESPACE}
  activities:
    num_workers: 1

I'm seeing 2 PHP worker processes running. That is quite unfortunate when I'm trying to debug activity execution, since the two compete for tasks and I can only debug one worker at a time.

To Reproduce
Steps to reproduce the behavior:

  1. set num_workers to 1 in your .rr.yaml
  2. run rr serve -c .rr.yaml
  3. Look in the process list (top or ps -aux) and see two workers running

Expected behavior
One worker should be running, so I would be able to connect to the one that does the processing.

Screenshots/Terminal output
If applicable, add screenshots or code blocks to help explain your problem. You can also use Loom to do short, free video bug reports.

Versions (please complete the following information where relevant):

  • OS: Linux (Ubuntu from Docker)
  • Temporal Version: 1.10.2
  • are you using Docker or Kubernetes or building Temporal from source? Docker

github.com/uber-go/tally/v4-v4.1.6: 2 vulnerabilities (highest severity is: 7.5)

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.6

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.6.zip

Found in HEAD commit: bd87ce52af3af5dd0455e08c47bba251532be405

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in (github.com/uber-go/tally/v4-v4.1.6 version) | Remediation Available |
| --- | --- | --- | --- | --- | --- | --- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.6 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.6 | Direct | 0.13.0 | ❌ |

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.6

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.6.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.6 (Vulnerable Library)

Found in HEAD commit: bd87ce52af3af5dd0455e08c47bba251532be405

Found in base branch: master

Vulnerability Details

In Apache Thrift, all versions up to and including 0.12.0, a server or client may run into an endless loop when fed specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.6

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.6.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.6 (Vulnerable Library)

Found in HEAD commit: bd87ce52af3af5dd0455e08c47bba251532be405

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

[๐Ÿ› BUG]: Can't cancel workflow from activity - UnhandledCommand for activity, but workflow is completed

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

I created an example reproducing the behavior:
a new CancelWorkflow sample with an Activity that cancels that workflow (forked from https://github.com/temporalio/samples-php).

I am using the command:
docker-compose exec app php app.php cancel-activity

Result:

  • the workflow is completed
  • the workflow is not canceled
  • the activity fails with WorkflowTaskFailed in the Temporal admin UI with the message:

Cause: UnhandledCommand
Failure:
{
  "message": "UnhandledCommand",
  "source": "",
  "stackTrace": "",
  "encodedAttributes": null,
  "cause": null,
  "serverFailureInfo": {
    "nonRetryable": true
  }
}

I'm expecting the workflow to be canceled from the activity.
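
For context, the activity's cancel call (per the "[debug] cancelled workflow" log lines below) is the equivalent of this Go SDK snippet (a sketch; the actual sample is PHP):

package sample

import (
	"context"

	"go.temporal.io/sdk/client"
)

// cancelFromActivity mirrors what the PHP activity does per the debug log:
// it creates a workflow client and cancels the target workflow by ID.
func cancelFromActivity(ctx context.Context, c client.Client, workflowID string) error {
	// An empty run ID targets the current run of the workflow.
	return c.CancelWorkflow(ctx, workflowID, "")
}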

Version

RoadRunner version: 2.12.3

Relevant log output

samples-php-app-1       | 2023-08-15T15:25:21.171Z      DEBUG   temporal        workflow execute        {"runID": "53f32eee-387e-478c-9d8b-c9f91d01099d", "workflow info": {"WorkflowExecution":{"ID":"wf-462","RunID":"53f32eee-387e-478c-9d8b-c9f91d01099d"},"WorkflowType":{"Name":"CancelActivity"},"TaskQueueName":"default","WorkflowExecutionTimeout":60000000000,"WorkflowRunTimeout":60000000000,"WorkflowTaskTimeout":10000000000,"Namespace":"default","Attempt":1,"WorkflowStartTime":"2023-08-15T15:25:21.141917945Z","CronSchedule":"","ContinuedExecutionRunID":"","ParentWorkflowNamespace":"","ParentWorkflowExecution":null,"Memo":null,"SearchAttributes":null,"RetryPolicy":null,"BinaryChecksum":"e3d02694289f74cc72ea990fe241278d"}}
samples-php-app-1       | 2023-08-15T15:25:21.171Z      DEBUG   temporal        workflow task started   {"time": "1s"}
samples-php-app-1       | 2023-08-15T15:25:21.171Z      DEBUG   temporal        outgoing message        {"id": 1, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.177Z      DEBUG   temporal        received message        {"command": {"name":"ActivityCancel.cancel","options":{"ActivityID":"","TaskQueueName":"","ScheduleToCloseTimeout":20000000000,"ScheduleToStartTimeout":0,"StartToCloseTimeout":0,"HeartbeatTimeout":0,"WaitForCancellation":false,"OriginalTaskQueueName":"","RetryPolicy":null,"DisableEagerExecution":false}}, "id": 9001, "data": "\n\ufffd\u0002\u0008\ufffdF\u0012\u000fExecuteActivity\u001a\ufffd\u0001{\"name\":\"ActivityCancel.cancel\",\"options\":{\"TaskQueueName\":null,\"ScheduleToCloseTimeout\":20000000000,\"ScheduleToStartTimeout\":0,\"StartToCloseTimeout\":0,\"HeartbeatTimeout\":0,\"WaitForCancellation\":false,\"ActivityID\":\"\",\"RetryPolicy\":null}}*$\n\"\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\u0008\"wf-462\"\n\u001f\u0008\u0001*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.177Z      DEBUG   temporal        received message        {"command": null, "id": 1, "data": "\n\ufffd\u0002\u0008\ufffdF\u0012\u000fExecuteActivity\u001a\ufffd\u0001{\"name\":\"ActivityCancel.cancel\",\"options\":{\"TaskQueueName\":null,\"ScheduleToCloseTimeout\":20000000000,\"ScheduleToStartTimeout\":0,\"StartToCloseTimeout\":0,\"HeartbeatTimeout\":0,\"WaitForCancellation\":false,\"ActivityID\":\"\",\"RetryPolicy\":null}}*$\n\"\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\u0008\"wf-462\"\n\u001f\u0008\u0001*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.177Z      DEBUG   temporal        activity request        {"ID": 9001}
samples-php-app-1       | 2023-08-15T15:25:21.177Z      DEBUG   temporal        ExecuteActivity {"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:fdd90c6e-152a-4568-a0a2-d76fc3a8f7e8", "WorkflowType": "CancelActivity", "WorkflowID": "wf-462", "RunID": "53f32eee-387e-478c-9d8b-c9f91d01099d", "Attempt": 1, "ActivityID": "5", "ActivityType": "ActivityCancel.cancel"}
samples-php-app-1       | 2023-08-15T15:25:21.190Z      DEBUG   temporal        outgoing message        {"id": 1, "data": "", "context": ""}

samples-php-app-1       | 2023-08-15T15:25:21.195Z      INFO    server          [debug] Starting cancel activity.
samples-php-app-1       | 2023-08-15T15:25:21.198Z      INFO    server          [debug] created new workflowClient, get workflowID:wf-462
samples-php-app-1       | 2023-08-15T15:25:21.198Z      INFO    server          [debug] got workflow with ID:wf-462
samples-php-app-1       | 2023-08-15T15:25:21.210Z      INFO    server          [debug] cancelled workflow with ID:wf-462
samples-php-app-1       | 2023-08-15T15:25:21.210Z      DEBUG   temporal        received message        {"command": null, "id": 1, "data": "\n\u001f\u0008\u0001*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        workflow task started   {"time": "1s"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        outgoing message        {"id": 2, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        received message        {"command": {"ids":[9001]}, "id": 9002, "data": "\n\u001d\u0008\ufffdF\u0012\u0006Cancel\u001a\u000e{\"ids\":[9001]}*\u0000\n\u001f\u0008\u0002*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        received message        {"command": null, "id": 2, "data": "\n\u001d\u0008\ufffdF\u0012\u0006Cancel\u001a\u000e{\"ids\":[9001]}*\u0000\n\u001f\u0008\u0002*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        cancel request  {"ID": 9002}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        registering activity canceller  {"activityID": "5"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        calling callback IN LOOP        {"ID": 9001, "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        executing callback      {"ID": 9001, "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        error   {"error": "canceled", "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        RequestCancelActivity   {"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:fdd90c6e-152a-4568-a0a2-d76fc3a8f7e8", "WorkflowType": "CancelActivity", "WorkflowID": "wf-462", "RunID": "53f32eee-387e-478c-9d8b-c9f91d01099d", "Attempt": 1, "ActivityID": "5"}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        outgoing message        {"id": 9001, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.217Z      DEBUG   temporal        outgoing message        {"id": 9002, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.219Z      DEBUG   temporal        received message        {"command": {}, "id": 9003, "data": "\n\ufffd\u0002\u0008\ufffdF\u0012\u0010CompleteWorkflow\u001a\u0002{}\"\ufffd\u0002\n\u0008canceled\u0012\u0005GoSDK\u001a\ufffd\u0002/var/app/src/CancelActivity/CancelWorkflow.php:46\n/var/app/vendor/temporal/sdk/src/Worker/Worker.php:94\n/var/app/vendor/temporal/sdk/src/WorkerFactory.php:413\n/var/app/vendor/temporal/sdk/src/WorkerFactory.php:387\n/var/app/vendor/temporal/sdk/src/WorkerFactory.php:261\n/var/app/worker.php:47:\u0000*\u0000"}
samples-php-app-1       | 2023-08-15T15:25:21.219Z      DEBUG   temporal        complete workflow request       {"ID": 9003}
samples-php-app-1       | 2023-08-15T15:25:21.219Z      DEBUG   temporal        close workflow  {"RunID": "53f32eee-387e-478c-9d8b-c9f91d01099d"}
samples-php-app-1       | 2023-08-15T15:25:21.219Z      DEBUG   temporal        outgoing message        {"id": 3, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.219Z      DEBUG   temporal        received message        {"command": null, "id": 3, "data": "\n\u001f\u0008\u0003*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
temporal                | {"level":"info","ts":"2023-08-15T15:25:21.220Z","msg":"Failing the workflow task.","shard-id":3,"address":"192.168.224.4:7234","component":"history-engine","value":"UnhandledCommand","wf-id":"wf-462","wf-run-id":"53f32eee-387e-478c-9d8b-c9f91d01099d","wf-namespace-id":"0ed8e308-3644-4697-92a7-689f941154ce","logging-call-at":"workflowTaskHandlerCallbacks.go:570"}
temporal                | {"level":"info","ts":"2023-08-15T15:25:21.227Z","msg":"history client encountered error","service":"frontend","error":"UnhandledCommand","service-error-type":"serviceerror.InvalidArgument","logging-call-at":"metric_client.go:104"}
samples-php-app-1       | 2023-08-15T15:25:21.227Z      INFO    temporal        Task processing failed with error       {"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:fdd90c6e-152a-4568-a0a2-d76fc3a8f7e8", "WorkerType": "WorkflowWorker", "Error": "UnhandledCommand"}
samples-php-app-1       | 2023-08-15T15:25:21.234Z      DEBUG   temporal        workflow execute        {"runID": "53f32eee-387e-478c-9d8b-c9f91d01099d", "workflow info": {"WorkflowExecution":{"ID":"wf-462","RunID":"53f32eee-387e-478c-9d8b-c9f91d01099d"},"WorkflowType":{"Name":"CancelActivity"},"TaskQueueName":"default","WorkflowExecutionTimeout":60000000000,"WorkflowRunTimeout":60000000000,"WorkflowTaskTimeout":10000000000,"Namespace":"default","Attempt":1,"WorkflowStartTime":"2023-08-15T15:25:21.141917945Z","CronSchedule":"","ContinuedExecutionRunID":"","ParentWorkflowNamespace":"","ParentWorkflowExecution":null,"Memo":null,"SearchAttributes":null,"RetryPolicy":null,"BinaryChecksum":"e3d02694289f74cc72ea990fe241278d"}}
samples-php-app-1       | 2023-08-15T15:25:21.234Z      DEBUG   temporal        workflow task started   {"time": "1s"}
samples-php-app-1       | 2023-08-15T15:25:21.234Z      DEBUG   temporal        outgoing message        {"id": 4, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": {"name":"ActivityCancel.cancel","options":{"ActivityID":"","TaskQueueName":"","ScheduleToCloseTimeout":20000000000,"ScheduleToStartTimeout":0,"StartToCloseTimeout":0,"HeartbeatTimeout":0,"WaitForCancellation":false,"OriginalTaskQueueName":"","RetryPolicy":null,"DisableEagerExecution":false}}, "id": 9004, "data": "\n\ufffd\u0002\u0008\ufffdF\u0012\u000fExecuteActivity\u001a\ufffd\u0001{\"name\":\"ActivityCancel.cancel\",\"options\":{\"TaskQueueName\":null,\"ScheduleToCloseTimeout\":20000000000,\"ScheduleToStartTimeout\":0,\"StartToCloseTimeout\":0,\"HeartbeatTimeout\":0,\"WaitForCancellation\":false,\"ActivityID\":\"\",\"RetryPolicy\":null}}*$\n\"\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\u0008\"wf-462\"\n\u001f\u0008\u0004*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": null, "id": 4, "data": "\n\ufffd\u0002\u0008\ufffdF\u0012\u000fExecuteActivity\u001a\ufffd\u0001{\"name\":\"ActivityCancel.cancel\",\"options\":{\"TaskQueueName\":null,\"ScheduleToCloseTimeout\":20000000000,\"ScheduleToStartTimeout\":0,\"StartToCloseTimeout\":0,\"HeartbeatTimeout\":0,\"WaitForCancellation\":false,\"ActivityID\":\"\",\"RetryPolicy\":null}}*$\n\"\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\u0008\"wf-462\"\n\u001f\u0008\u0004*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        activity request        {"ID": 9004}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        workflow task started   {"time": "1s"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        appending callback      {"ID": 9004, "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        executing callback      {"ID": 9004, "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        pushing response        {"ID": 9004, "type": "activity"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        outgoing message        {"id": 5, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        outgoing message        {"id": 9004, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": {"ids":[9004]}, "id": 9005, "data": "\n\u001d\u0008\ufffdF\u0012\u0006Cancel\u001a\u000e{\"ids\":[9004]}*\u0000\n\u001f\u0008\u0005*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null\n6\u0008\ufffdF\u0012\u0010CompleteWorkflow\u001a\u0002{}*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": null, "id": 5, "data": "\n\u001d\u0008\ufffdF\u0012\u0006Cancel\u001a\u000e{\"ids\":[9004]}*\u0000\n\u001f\u0008\u0005*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null\n6\u0008\ufffdF\u0012\u0010CompleteWorkflow\u001a\u0002{}*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": {}, "id": 9006, "data": "\n\u001d\u0008\ufffdF\u0012\u0006Cancel\u001a\u000e{\"ids\":[9004]}*\u0000\n\u001f\u0008\u0005*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null\n6\u0008\ufffdF\u0012\u0010CompleteWorkflow\u001a\u0002{}*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        cancel request  {"ID": 9005}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        outgoing message        {"id": 9005, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        complete workflow request       {"ID": 9006}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        close workflow  {"RunID": "53f32eee-387e-478c-9d8b-c9f91d01099d"}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        outgoing message        {"id": 6, "data": "", "context": ""}
samples-php-app-1       | 2023-08-15T15:25:21.235Z      DEBUG   temporal        received message        {"command": null, "id": 6, "data": "\n\u001f\u0008\u0006*\u001b\n\u0019\n\u0017\n\u0008encoding\u0012\u000bbinary/null"}
temporal-ui             | {"time":"2023-08-15T15:25:23.327122692Z","id":"","remote_ip":"192.168.224.1","host":"localhost:8080","method":"GET","uri":"/_app/version.json","user_agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36","status":200,"error":"","latency":25261,"latency_human":"25.261µs","bytes_in":0,"bytes_out":27}
temporal-ui             | {"time":"2023-08-15T15:25:33.337458261Z","id":"","remote_ip":"192.168.224.1","host":"localhost:8080","method":"GET","uri":"/_app/version.json","user_agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36","status":200,"error":"","latency":59931,"latency_human":"59.931µs","bytes_in":0,"bytes_out":27}
temporal-ui             | {"time":"2023-08-15T15:25:43.34294232Z","id":"","remote_ip":"192.168.224.1","host":"localhost:8080","method":"GET","uri":"/_app/version.json","user_agent":"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36","status":200,"error":"","latency":30211,"latency_human":"30.211µs","bytes_in":0,"bytes_out":27}
temporal                | {"level":"info","ts":"2023-08-15T15:25:49.020Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-batcher-taskqueue/1","wf-task-queue-type":"Activity","wf-namespace":"temporal-system","lifecycle":"Started","logging-call-at":"taskQueueManager.go:316"}
temporal                | {"level":"info","ts":"2023-08-15T15:25:49.023Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/default-worker-tq/1","wf-task-queue-type":"Workflow","wf-namespace":"temporal-system","lifecycle":"Started","logging-call-at":"taskQueueManager.go:316"}

[๐Ÿ› BUG]: Worker after SIGTERM not wait for finish activity

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

Workflow

use Temporal\Activity\ActivityOptions;
use Temporal\Common\RetryOptions;
use Temporal\Workflow;

#[Workflow\WorkflowInterface]
class SimpleWorkflow // renamed from `Workflow`, which would shadow the Temporal\Workflow facade used below
{
    #[Workflow\WorkflowMethod(name: 'Workflow')]
    public function run()
    {
        // Stub the activity with a 60s start-to-close timeout and a single attempt.
        $simple = Workflow::newActivityStub(
            SimpleActivity::class,
            ActivityOptions::new()
                ->withStartToCloseTimeout(60)
                ->withRetryOptions(RetryOptions::new()->withMaximumAttempts(1))
        );

        yield $simple->sendRequest('request');
    }
}

Activity:

use Temporal\Activity\ActivityInterface;
use Temporal\Activity\ActivityMethod;

#[ActivityInterface(prefix: "SimpleActivity.")]
class SimpleActivity
{
    #[ActivityMethod]
    public function sendRequest(string $input): string
    {
        // Simulates a long-running, non-idempotent request.
        sleep(40);

        return strtolower($input);
    }
}

Let's imagine a situation where one of the activities is responsible for sending a POST request (a request that can be sent only once) and it takes around 30 seconds.
At the 15th second, while the request is being sent, a developer makes a deployment and sends a SIGTERM to kill the container.

In the logs, after the SIGTERM is sent, I see

2022-09-30T11:03:44.501Z	WARN	temporal    	Failed to poll for task.	{"Namespace": "default", "TaskQueue": "default", "WorkerID": "default:37c72875-bdf1-49a8-9730-b686e60194b0", "WorkerType": "WorkflowWorker", "Error": "worker stopping"}

The activity was interrupted and is not processed anymore, so as a result in Temporal I see

{
  "message": "activity StartToClose timeout",
  "source": "Server",
  "stackTrace": "",
  "cause": null,
  "timeoutFailureInfo": {
    "timeoutType": "StartToClose",
    "lastHeartbeatDetails": {
      "payloads": [
        {
          "metadata": {
            "encoding": "anNvbi9wbGFpbg=="
          },
          "data": "eyJ2YWx1ZSI6MTh9"
        }
      ]
    }
  }
}

From my perspective, the worker should wait for this activity to finish, as a safe default. Only SIGKILL should kill the process immediately.

This functionality is available in the Java SDK: https://www.javadoc.io/static/io.temporal/temporal-sdk/1.0.0/io/temporal/worker/WorkerFactory.html#awaitTermination-long-java.util.concurrent.TimeUnit-

Or maybe I am doing something wrong.
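
For comparison, the Go SDK exposes a graceful-stop window via worker.Options.WorkerStopTimeout (the same "WorkerStopTimeout" field appears in the worker options JSON that RR logs); a minimal sketch, assuming a current go.temporal.io/sdk:

package main

import (
	"log"
	"time"

	"go.temporal.io/sdk/client"
	"go.temporal.io/sdk/worker"
)

func main() {
	c, err := client.Dial(client.Options{})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// WorkerStopTimeout gives in-flight activities up to 60 seconds to
	// finish after the worker receives a stop signal.
	w := worker.New(c, "default", worker.Options{
		WorkerStopTimeout: 60 * time.Second,
	})
	if err := w.Run(worker.InterruptCh()); err != nil {
		log.Fatal(err)
	}
}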

Version

v2.11.3

Relevant log output

No response

[BUG] Host process should not send the remaining queue content to worker on Destroy

Expected behavior:
When the host closes the workflow, only the "Destroy" command should be sent to the worker.

Actual behavior:
The host sends all remaining callbacks along with the destroy command, which in some cases causes workers to reject the payload (due to unexpected responses).

Reference:
https://github.com/temporalio/roadrunner-temporal/blob/master/workflow/process.go#L129

This should be changed to runCommand, or it should only include the Destroy command.

[๐Ÿ› BUG]: Metric `temporal_activity_schedule_to_start_latency` useless with RR?

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

Hey guys, I need help with autoscaling RoadRunner instances. I'm not sure if this is a bug; maybe there's a way around it.

I'm trying to attach a Kubernetes HPA to a Temporal metric, but the one that is supposed to approximate the queue depth always stays the same. I'm guessing this is because RoadRunner takes the message from Temporal, which fires temporal_activity_schedule_to_start_latency, and only then looks for a worker. If it cannot find a free worker, it throws

Error: activity_pool_execute_activity: NoFreeWorkers:
	codec_execute:
	static_pool_exec:
	static_pool_exec:
	worker_watcher_get_free_worker:
	context deadline exceeded

This is exactly what we get on some of our pods when the load increases. The workers are I/O bound, so we cannot use CPU metrics to scale. Yet temporal_activity_schedule_to_start_latency doesn't change at all, even when RoadRunner cannot find any workers and times out after 60s (the default setting).

Any ideas? Maybe there are other metrics RoadRunner itself can emit that I'm not aware of, e.g. how long it takes to find a worker?

Version

2.7.9

Relevant log output

No response

github.com/uber-go/tally/v4-v4.1.5: 2 vulnerabilities (highest severity is: 7.5)

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.5

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.5.zip

Found in HEAD commit: 001c8fd594d2fbda1d558ac8129d20f0677fa6db

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in (github.com/uber-go/tally/v4-v4.1.5 version) | Remediation Available |
| --- | --- | --- | --- | --- | --- | --- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.5 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.5 | Direct | 0.13.0 | ❌ |

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.5

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.5.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.5 (Vulnerable Library)

Found in HEAD commit: 001c8fd594d2fbda1d558ac8129d20f0677fa6db

Found in base branch: master

Vulnerability Details

In Apache Thrift, all versions up to and including 0.12.0, a server or client may run into an endless loop when fed specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.5

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.5.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.5 (Vulnerable Library)

Found in HEAD commit: 001c8fd594d2fbda1d558ac8129d20f0677fa6db

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

[💡 FEATURE REQUEST]: Overwrite `client-name` and `client-version` in Go client to represent PHP-SDK

In order to track Temporal usage, all client calls, including workflow and activity polling calls, must contain an SDK version header. Since the Go SDK represents the PHP one here, we have to adjust this header as well, via a gRPC interceptor or another mechanism.

See: temporalio/sdk-php#220
Also: https://github.com/temporalio/sdk-php/blob/master/src/Client/GRPC/Context.php#L35

Target: https://github.com/temporalio/roadrunner-temporal/blob/master/plugin.go#L159
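
A sketch of how the headers could be overwritten with a standard gRPC unary client interceptor (the header names come from this issue's title; the values and the wiring into the client are assumptions for illustration):

package sample

import (
	"context"

	"google.golang.org/grpc"
	"google.golang.org/grpc/metadata"
)

// phpSDKHeaders rewrites the SDK identification headers on every outgoing
// call so that the Go client represents the PHP SDK.
func phpSDKHeaders(
	ctx context.Context, method string, req, reply interface{},
	cc *grpc.ClientConn, invoker grpc.UnaryInvoker, opts ...grpc.CallOption,
) error {
	md, ok := metadata.FromOutgoingContext(ctx)
	if ok {
		md = md.Copy()
	} else {
		md = metadata.New(nil)
	}
	md.Set("client-name", "temporal-php") // assumed value
	md.Set("client-version", "1.0.0")     // assumed value
	return invoker(metadata.NewOutgoingContext(ctx, md), method, req, reply, cc, opts...)
}

The interceptor would presumably be passed into the Temporal Go client's dial options via grpc.WithUnaryInterceptor(phpSDKHeaders).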

[๐Ÿ› BUG]: broken load balancing for workflow worker

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

In the test application https://github.com/temporalio/samples-php, two related problems appear:

  • broken load balancing: basically only the workflow worker works
  • no way to set limits for the workflow worker

In the test application, running in Docker, I added limits (max_jobs, ttl, max_worker_memory):

version: "2.7"

rpc:
  listen: tcp://127.0.0.1:6001

server:
  command: "php worker.php"

temporal:
  address: "temporal:7233"
  activities:
    num_workers: 4
    max_jobs: 20
    supervisor:
      ttl: 180s # 3 minutes
      max_worker_memory: 80

logs:
  level: debug
  mode: development

These limits work correctly for activity workers but do not apply to the workflow worker. As a result, the workflow worker lives forever, leaks memory, and does not restart automatically:

Workers of [temporal]:
+---------+-----------+---------+---------+---------+--------------------+
|   PID   |  STATUS   |  EXECS  | MEMORY  |  CPU%   |      CREATED       |
+---------+-----------+---------+---------+---------+--------------------+
|   31006 | ready     |   1,053 | 134 MB  |    0.47 | 10 hours ago       |
|   33611 | working   |      14 | 61 MB   |    0.12 | 2 minutes ago      |
|   33614 | working   |      14 | 61 MB   |    0.12 | 2 minutes ago      |
|   33612 | working   |      14 | 62 MB   |    0.11 | 2 minutes ago      |
|   33613 | working   |      14 | 61 MB   |    0.12 | 2 minutes ago      |
+---------+-----------+---------+---------+---------+--------------------+

I was looking for a way to set limits for the workflow worker, but did not find one. Did I miss it?

Also, according to the statistics, there is a clear bias in the load towards the workflow worker.

Version

2.12.3

Relevant log output

No response

[FEATURE REQUEST] ScheduleToStartLatency Metrics

Hi guys, I can't find the activity_task_schedule_to_start_latency metric in the PHP SDK. Is it not implemented yet?

I found that the metric exists in the Go SDK here. Where can I grab this metric with the PHP SDK? I have enabled RoadRunner metrics, but there is nothing related to the Temporal metrics.

invalid character '1' looking for beginning of value

Expected Behavior

It should execute the activity as usual.

Actual Behavior

Locally on arm64 it works fine. When deploying to production, on some activity executions I get this error:

{"message":"proto_codec_parse_message: invalid character '1' looking for beginning of value","source":"GoSDK","stackTrace":"process event for default [panic]:\ngithub.com/temporalio/roadrunner-temporal/v4/aggregatedpool.(*Workflow).OnWorkflowTaskStarted(0xc00126b340, 0x18?)\n\tgithub.com/temporalio/roadrunner-temporal/[email protected]/aggregatedpool/workflow.go:184 +0x3e7\ngo.temporal.io/sdk/internal.(*workflowExecutionEventHandlerImpl).ProcessEvent(0xc001424168, 0xc0008f7030, 0x20?, 0x1)\n\tgo.temporal.io/[email protected]/internal/internal_event_handlers.go:1143 +0x225\ngo.temporal.io/sdk/internal.(*workflowExecutionContextImpl).ProcessWorkflowTask(0xc001420900, 0xc0011d8690)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:1100 +0x1608\ngo.temporal.io/sdk/internal.(*workflowTaskHandlerImpl).ProcessWorkflowTask(0xc000a91930, 0xc0011d8690, 0xc001420900, 0xc001422810)\n\tgo.temporal.io/[email protected]/internal/internal_task_handlers.go:889 +0x3c5\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).processWorkflowTask(0xc0008d5320, 0xc0011d8690)\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:357 +0x3c3\ngo.temporal.io/sdk/internal.(*workflowTaskPoller).ProcessTask(0xc0008d5320, {0x1a2c3a0, 0xc0011d8690})\n\tgo.temporal.io/[email protected]/internal/internal_task_pollers.go:321 +0x78\ngo.temporal.io/sdk/internal.(*baseWorker).processTask(0xc000a82420, {0x1a2cea0, 0xc0011e2040})\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:518 +0x153\ngo.temporal.io/sdk/internal.(*baseWorker).processTaskAsync.func1()\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:369 +0x45\ncreated by go.temporal.io/sdk/internal.(*baseWorker).processTaskAsync in goroutine 91\n\tgo.temporal.io/[email protected]/internal/internal_worker_base.go:365 +0xa5","encodedAttributes":null,"cause":null,"applicationFailureInfo":{"type":"PanicError","nonRetryable":true,"details":null}}

In the Temporal Docker container:

{"level":"warn","ts":"2024-03-01T16:54:31.782Z","msg":"Critical attempts processing workflow task","service":"history","shard-id":1,"address":"127.0.0.1:7234","wf-namespace":"testing","wf-id":"8bfe431b-b9ce-43ea-a72c-130894bdb629","wf-run-id":"a6c94584-9be9-4737-a9f8-ad6b146a9c7b","attempt":10,"logging-call-at":"workflow_task_state_machine.go:925"}

I tried changing the Temporal Postgres server version to 13 and 16; neither worked.

Steps to Reproduce the Problem

Not sure how to reproduce; the error doesn't include the payload, and I'm not sure how to see it in Temporal.

Specifications

  • Version: temporalio/auto-setup:1.22.6, Postgres: 13 & 16
  • Platform: ubuntu, amd64, temporal/sdk:v2.7.6, PHP/Laravel app.

dynamic conf:

          limit.maxIDLength:
            - value: 255
              constraints: {}
          frontend.keepAliveMaxConnectionAge:
            - value: 5m
          frontend.keepAliveMaxConnectionAgeGrace:
            - value: 70s
          frontend.enableClientVersionCheck:
            - value: true
              constraints: {}
          history.persistenceMaxQPS:
            - value: 3000
              constraints: {}
          frontend.persistenceMaxQPS:
            - value: 3000
              constraints: {}
          frontend.historyMgrNumConns:
            - value: 10
              constraints: {}
          frontend.throttledLogRPS:
            - value: 20
              constraints: {}
          history.historyMgrNumConns:
            - value: 50
              constraints: {}
          history.defaultActivityRetryPolicy:
            - value:
                InitialIntervalInSeconds: 1
                MaximumIntervalCoefficient: 100.0
                BackoffCoefficient: 2.0
                MaximumAttempts: 0
          history.defaultWorkflowRetryPolicy:
            - value:
                InitialIntervalInSeconds: 1
                MaximumIntervalCoefficient: 100.0
                BackoffCoefficient: 2.0
                MaximumAttempts: 0

github.com/uber-go/tally/v4-v4.1.3: 2 vulnerabilities (highest severity is: 7.5) - autoclosed

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.3

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.3.zip

Found in HEAD commit: 06013c6f0f045b2663566f6adb6bd356f2ca489e

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in (github.com/uber-go/tally/v4-v4.1.3 version) | Remediation Available |
| --- | -------- | ---- | ---------- | ---- | ----------------------------------------------------- | --------------------- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.3 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.3 | Direct | 0.13.0 | ❌ |

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.3

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.3.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.3 (Vulnerable Library)

Found in HEAD commit: 06013c6f0f045b2663566f6adb6bd356f2ca489e

Found in base branch: master

Vulnerability Details

In Apache Thrift all versions up to and including 0.12.0, a server or client may run into an endless loop when fed with specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.3

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.3.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.3 (Vulnerable Library)

Found in HEAD commit: 06013c6f0f045b2663566f6adb6bd356f2ca489e

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed with invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

github.com/uber-go/tally/v4-v4.1.4: 2 vulnerabilities (highest severity is: 7.5)

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.4

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.4.zip

Found in HEAD commit: c66d99e3b38d878c9ba4349ca803756a0825cd61

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in (github.com/uber-go/tally/v4-v4.1.4 version) | Remediation Available |
| --- | -------- | ---- | ---------- | ---- | ----------------------------------------------------- | --------------------- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.4 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.4 | Direct | 0.13.0 | ❌ |

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.4

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.4.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.4 (Vulnerable Library)

Found in HEAD commit: c66d99e3b38d878c9ba4349ca803756a0825cd61

Found in base branch: master

Vulnerability Details

In Apache Thrift all versions up to and including 0.12.0, a server or client may run into an endless loop when fed with specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.4

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.4.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.4 (Vulnerable Library)

Found in HEAD commit: c66d99e3b38d878c9ba4349ca803756a0825cd61

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed with invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

Incorrect signature of RPC SignalWithStartWorkflow method

Actual

type SignalWithStartIn struct {
	WorkflowID        string               `json:"wid"`
	SignalName        string               `json:"signal_name"`
	SignalArg         interface{}          `json:"signal_arg"`
	Options           StartWorkflowOptions `json:"options"`
	WorkflowInterface string               `json:"workflow_interface"`
	Args              []interface{}        `json:"args"`
}

Expected

  1. WorkflowID string json:"wid" should be avoided: StartWorkflowOptions already provides a workflow ID field.
  2. signal_arg should be an array of signal method arguments, as in the Java SDK (see the sketch below): https://github.com/temporalio/sdk-java/blob/master/temporal-sdk/src/main/java/io/temporal/client/WorkflowStub.java#L61
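
For reference, this RPC backs WorkflowClient::startWithSignal() on the PHP side, where the signal arguments are already passed as an array. A minimal sketch of the expected call shape, assuming a local Temporal frontend; the workflow interface, workflow ID, signal name, and arguments are placeholders:

<?php

declare(strict_types=1);

use Temporal\Client\GRPC\ServiceClient;
use Temporal\Client\WorkflowClient;
use Temporal\Client\WorkflowOptions;

$client = WorkflowClient::create(ServiceClient::create('localhost:7233'));

// The workflow ID travels inside the options, not as a separate field.
$workflow = $client->newWorkflowStub(
    GreetingWorkflowInterface::class,
    WorkflowOptions::new()->withWorkflowId('my-workflow-id')
);

// Signal arguments are an array, mirroring the Java SDK semantics.
$run = $client->startWithSignal($workflow, 'addName', ['Antony'], ['Hello']);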

Error NoFreeWorkers

Hi again, we sometimes run into this issue.

Error: activity_pool_execute_activity: NoFreeWorkers:
	execute:
	static_pool_exec:
	static_pool_exec:
	worker_watcher_get_free_worker: no free workers in the stack, timeout exceed

We did a bit of testing, and we suspect that:

  • this happens when there are lots of activity jobs on the temporal server,
  • somehow roadrunner-temporal retrieves the jobs from the temporal server,
  • but all the available php workers are busy.

We're not sure yet if this is the case, or if this error is caused by something else.
Please let us know if there is anything we can help with.
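
One mitigation we are considering (a sketch under our assumptions, not a confirmed fix): cap concurrent activity executions on the PHP worker so the plugin stops accepting tasks while every PHP worker is busy. The task queue name and the cap of 4 (assumed to match temporal.activities.num_workers in .rr.yaml) are placeholders:

<?php

declare(strict_types=1);

use Temporal\Worker\WorkerOptions;
use Temporal\WorkerFactory;

$factory = WorkerFactory::create();

// Keep concurrent activity executions at or below the number of PHP workers
// configured in .rr.yaml (assumed to be 4 here).
$worker = $factory->newWorker(
    'default',
    WorkerOptions::new()->withMaxConcurrentActivityExecutionSize(4)
);

$worker->registerActivityImplementations(new SomeActivity()); // placeholder activity
$factory->run();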

Best Regards,

NOTE: Edited to include the full error message

[BUG] Heartbeat on non running activity

Describe the bug

Sending a heartbeat from an activity method results in a 'heartbeat on non running activity' error (at random times). The failed heartbeat state seems to be set upon RoadRunner launch (the rr serve command). If the activity worker launches in a 'failed' heartbeat state, it keeps failing on heartbeats until it, or the workflow's rr serve instance, is killed and restarted. I'm still not clear on what exactly fixes it. The more RoadRunner instances I run (rr serve), the higher the chance of this error.

The failing heartbeats throw: 'activity_pool_get_activity_context: heartbeat on non running activity' on tcp://127.0.0.1:6001

Or sometimes (very rarely) the worker enters a different state for heartbeats and fails with a socket error. I don't know if it's related:
socket_send(): unable to write to socket [32]: Broken pipe

To Reproduce

  1. Create a dummy workflow.
  2. Create a dummy activity.
  3. Send a heartbeat in the activity
  4. Launch rr serve commands for the workflow part and the activities separately. In our system, workflows are contained in a single app, and activities are handled by different service apps. The more rr serve instances of the activity app are launched, the higher the chance of seeing the error. It's pretty easy to recreate and doesn't require many workers; on my end, with one worker, it's about 50/50 to hit the heartbeat error.
  5. Launch the workflow
  6. See the error.

Expected behavior

Heartbeat to be sent out successfully.

Screenshots/Terminal output

Errors

Temporal\Exception\TransportException: Error 'activity_pool_get_activity_context: heartbeat on non running activity' on tcp://127.0.0.1:6001 
Spiral\Goridge\RPC\Exception\ServiceException: Error 'activity_pool_get_activity_context: heartbeat on non running activity' on tcp://127.0.0.1:6001 in /home/docker/services/hyperwallet/vendor/spiral/goridge/src/RPC/RPC.php:128
Stack trace:
#0 /home/docker/services/hyperwallet/vendor/spiral/goridge/src/RPC/RPC.php(98): Spiral\Goridge\RPC\RPC->decodeResponse(Object(Spiral\Goridge\Frame), NULL)
#1 /home/docker/services/hyperwallet/vendor/temporal/sdk/src/Worker/Transport/Goridge.php(56): Spiral\Goridge\RPC\RPC->call('activities.Reco...', Array)
#2 /home/docker/services/hyperwallet/vendor/temporal/sdk/src/Internal/Activity/ActivityContext.php(137): Temporal\Worker\Transport\Goridge->call('activities.Reco...', Array)

Very rare (dunno if related)

ErrorException: Warning: socket_send(): unable to write to socket [32]: Broken pipe 
ErrorException: Warning: socket_send(): unable to write to socket [32]: Broken pipe in /home/docker/services/hyperwallet/vendor/spiral/goridge/src/SocketRelay.php:234
Stack trace:
#0 /home/docker/services/hyperwallet/vendor/spiral/goridge/src/RPC/RPC.php(83): Spiral\Goridge\SocketRelay->send(Object(Spiral\Goridge\Frame))
#1 /home/docker/services/hyperwallet/vendor/temporal/sdk/src/Worker/Transport/Goridge.php(56): Spiral\Goridge\RPC\RPC->call('activities.Reco...', Array)
#2 /home/docker/services/hyperwallet/vendor/temporal/sdk/src/Internal/Activity/ActivityContext.php(137): Temporal\Worker\Transport\Goridge->call('activities.Reco...', Array)

Code samples:

<?php
declare(strict_types=1);

namespace App\Command;

use Carbon\CarbonInterval;
use Temporal\Activity\ActivityOptions;
use Temporal\Workflow;
use Temporal\Workflow\WorkflowInterface;

#[WorkflowInterface]
class TemporalTestWorkflow
{
    #[Workflow\WorkflowMethod]
    public function launchActivity()
    {
        $options = ActivityOptions::new()
            ->withStartToCloseTimeout(CarbonInterval::seconds(2))
            ->withTaskQueue('hyperwallet_command_bus');

        Workflow::newUntypedActivityStub($options)->execute('testActivity');
    }
}

<?php

declare(strict_types=1);

namespace App;

use Temporal\Activity;
use Temporal\Activity\ActivityInterface;
use Temporal\Activity\ActivityMethod;

#[ActivityInterface]
class TestActivity
{
    #[ActivityMethod]
    public function testActivity()
    {
        Activity::heartbeat('test'); // This causes failure.
    }
}
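
One thing that may be worth ruling out (an assumption, not a confirmed cause): the workflow above never sets a heartbeat timeout, and the server only expects heartbeats from an activity when one is configured. A sketch of the same stub with an explicit heartbeat timeout:

<?php

declare(strict_types=1);

use Carbon\CarbonInterval;
use Temporal\Activity\ActivityOptions;
use Temporal\Workflow;

// Same stub as in TemporalTestWorkflow, but with a heartbeat timeout so the
// server actually tracks heartbeats from the activity.
$options = ActivityOptions::new()
    ->withStartToCloseTimeout(CarbonInterval::seconds(10))
    ->withHeartbeatTimeout(CarbonInterval::seconds(5))
    ->withTaskQueue('hyperwallet_command_bus');

Workflow::newUntypedActivityStub($options)->execute('testActivity');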

Versions

  • OS: In Docker Debian GNU/Linux 10 (buster), host Pop!_OS 21.04
  • Temporal Version: 1.12.2
  • Roadrunner version: 2.5.2
  • PHP composer versions:
    • spiral/roadrunner: 2.5.0
    • spiral/roadrunner-cli: 2.0.11
    • spiral/roadrunner-http: 2.0.4
    • spiral/roadrunner-worker: 2.1.3
    • temporal/sdk: 1.0.4
  • We're using the docker image temporalio/auto-setup in our docker-compose.yaml; we've seen this error with the sample Temporal docker-compose-mysql-es.yaml files as well.

Additional context

I'm not sure; maybe this is an issue only with our setup, but any help would be greatly appreciated.

Temporal RPC service

Need an RPC service to access all Temporal methods: create workflows, send signals, etc.
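
Until such a service exists, most of this is already reachable from PHP by pointing the SDK's gRPC client straight at Temporal. A minimal sketch; the address, workflow type, and signal name are placeholders:

<?php

declare(strict_types=1);

use Temporal\Client\GRPC\ServiceClient;
use Temporal\Client\WorkflowClient;
use Temporal\Client\WorkflowOptions;

// Talk to the Temporal frontend directly instead of going through RoadRunner.
$client = WorkflowClient::create(ServiceClient::create('localhost:7233'));

$stub = $client->newUntypedWorkflowStub(
    'GreetingWorkflow', // placeholder workflow type
    WorkflowOptions::new()
);

$client->start($stub, 'Antony');  // create the workflow
$stub->signal('addName', 'John'); // send a signal to the running workflow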

[๐Ÿ› BUG]: Ghost workers (pollers) in roadrunner version 2.7.5

No duplicates ๐Ÿฅฒ.

  • I have searched for a similar issue in our bug tracker and didn't find any solutions.

What happened?

Hey guys ๐Ÿ‘‹,

With version 2.7.5 I see duplicated or ghost workers in Temporal. This causes Temporal activities to fail with errors like "ActivityNotRegisteredError: unable to find activityType=getTransactionOptions. Supported types: []". These "ghost" workers are actually empty. Temporal retries a few times, eventually stumbles upon the correct worker, and everything is fine. Still, the workflows take a little longer than they should because of this 😀

2.7.4 doesn't cause this issue.

A few images from the Temporal page as an example:

2.7.4:
Screenshot from 2022-02-05 21-36-32

2.7.5:
Screenshot from 2022-02-05 21-22-28

Version

Roadrunner 2.7.5
Temporal 1.13.0

Relevant log output

Console output of 2.7.5:
2022-02-05T19:44:55.522Z	INFO	temporal    	connected to temporal server	{"address": "temporal:7233"}
2022-02-05T19:44:55.925Z	DEBUG	server      	worker is allocated	{"pid": 4541, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:44:55.932Z	DEBUG	server      	worker is allocated	{"pid": 4542, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:44:55.932Z	DEBUG	temporal    	start fetching worker info for the activity
2022-02-05T19:44:55.932Z	DEBUG	temporal    	outgoing message	{"data": "\u001b[32m\n\"\u0012\rGetWorkerInfo\u001a\u0011{\"rr_version\":\"\"} {}\u001b[0m"}
2022-02-05T19:44:56.066Z	DEBUG	temporal    	received message	{"data": "\u001b[93m\n\ufffd\u0004*\ufffd\u0004\n\ufffd\u0004\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u0004{\"TaskQueue\":\"events_hub_bridge_command_bus\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":0,\"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[],\"Activities\":[{\"Name\":\"SendNotificationCommand\"}]}\u001b[0m"}
2022-02-05T19:44:56.066Z	DEBUG	temporal    	activity registered	{"name": "SendNotificationCommand"}
2022-02-05T19:44:56.150Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:ac908a7f-e885-4f1a-b657-6fca87ea2b6e"}
2022-02-05T19:44:56.150Z	DEBUG	temporal    	activity workers started	{"num_workers": 1}
2022-02-05T19:44:56.457Z	DEBUG	server      	worker is allocated	{"pid": 4563, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:44:56.457Z	DEBUG	temporal    	outgoing message	{"data": "\u001b[32m\n\"\u0012\rGetWorkerInfo\u001a\u0011{\"rr_version\":\"\"} {}\u001b[0m"}
2022-02-05T19:44:56.581Z	DEBUG	temporal    	received message	{"data": "\u001b[93m\n\ufffd\u0004*\ufffd\u0004\n\ufffd\u0004\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u0004{\"TaskQueue\":\"events_hub_bridge_command_bus\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":0,\"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[],\"Activities\":[{\"Name\":\"SendNotificationCommand\"}]}\u001b[0m"}
2022-02-05T19:44:56.581Z	DEBUG	temporal    	worker info	{"taskqueue": "events_hub_bridge_command_bus", "options": {"MaxConcurrentActivityExecutionSize":0,"WorkerActivitiesPerSecond":0,"MaxConcurrentLocalActivityExecutionSize":0,"WorkerLocalActivitiesPerSecond":0,"TaskQueueActivitiesPerSecond":0,"MaxConcurrentActivityTaskPollers":0,"MaxConcurrentWorkflowTaskExecutionSize":0,"MaxConcurrentWorkflowTaskPollers":0,"EnableLoggingInReplay":false,"DisableStickyExecution":false,"StickyScheduleToStartTimeout":0,"BackgroundActivityContext":null,"WorkflowPanicPolicy":0,"WorkerStopTimeout":0,"EnableSessionWorker":false,"MaxConcurrentSessionExecutionSize":1000,"DisableWorkflowWorker":false,"LocalActivityWorkerOnly":false,"Identity":"","DeadlockDetectionTimeout":0,"MaxHeartbeatThrottleInterval":0,"DefaultHeartbeatThrottleInterval":0,"Interceptors":null}}
2022-02-05T19:44:56.584Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:48a25841-265a-4767-996a-070f12228277"}
2022-02-05T19:44:56.586Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:ac908a7f-e885-4f1a-b657-6fca87ea2b6e"}
2022-02-05T19:44:56.586Z	DEBUG	rpc         	plugin was started	{"address": "tcp://127.0.0.1:6001", "list of the plugins with RPC methods:": ["informer", "resetter", "temporal"]}
[INFO] RoadRunner server started; version: 2.7.5, buildtime: 2022-02-04T22:53:48+0000

Console output of 2.7.4:
2022-02-05T19:42:54.646Z	INFO	temporal    	connected to temporal server	{"address": "temporal:7233"}
2022-02-05T19:42:55.015Z	DEBUG	server      	worker is allocated	{"pid": 4488, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:42:55.018Z	DEBUG	server      	worker is allocated	{"pid": 4487, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:42:55.018Z	DEBUG	temporal    	start fetching worker info for the activity
2022-02-05T19:42:55.019Z	DEBUG	temporal    	outgoing message	{"data": "\u001b[32m\n\"\u0012\rGetWorkerInfo\u001a\u0011{\"rr_version\":\"\"} {}\u001b[0m"}
2022-02-05T19:42:55.153Z	DEBUG	temporal    	received message	{"data": "\u001b[93m\n\ufffd\u0004*\ufffd\u0004\n\ufffd\u0004\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u0004{\"TaskQueue\":\"events_hub_bridge_command_bus\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":0,\"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[],\"Activities\":[{\"Name\":\"SendNotificationCommand\"}]}\u001b[0m"}
2022-02-05T19:42:55.153Z	DEBUG	temporal    	activity registered	{"name": "SendNotificationCommand"}
2022-02-05T19:42:55.240Z	DEBUG	temporal    	No workflows registered. Skipping workflow worker start	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:440b9bf1-1b20-44cc-95e6-f34ea77b5715"}
2022-02-05T19:42:55.244Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:440b9bf1-1b20-44cc-95e6-f34ea77b5715"}
2022-02-05T19:42:55.244Z	DEBUG	temporal    	activity workers started	{"num_workers": 1}
2022-02-05T19:42:55.625Z	DEBUG	server      	worker is allocated	{"pid": 4510, "internal_event_name": "EventWorkerConstruct"}
2022-02-05T19:42:55.626Z	DEBUG	temporal    	outgoing message	{"data": "\u001b[32m\n\"\u0012\rGetWorkerInfo\u001a\u0011{\"rr_version\":\"\"} {}\u001b[0m"}
2022-02-05T19:42:55.721Z	DEBUG	temporal    	received message	{"data": "\u001b[93m\n\ufffd\u0004*\ufffd\u0004\n\ufffd\u0004\n\u0016\n\u0008encoding\u0012\njson/plain\u0012\ufffd\u0004{\"TaskQueue\":\"events_hub_bridge_command_bus\",\"Options\":{\"MaxConcurrentActivityExecutionSize\":0,\"WorkerActivitiesPerSecond\":0.0,\"MaxConcurrentLocalActivityExecutionSize\":0,\"WorkerLocalActivitiesPerSecond\":0.0,\"TaskQueueActivitiesPerSecond\":0.0,\"MaxConcurrentActivityTaskPollers\":0,\"MaxConcurrentWorkflowTaskExecutionSize\":0,\"MaxConcurrentWorkflowTaskPollers\":0,\"StickyScheduleToStartTimeout\":null,\"WorkerStopTimeout\":null,\"EnableSessionWorker\":false,\"SessionResourceID\":null,\"MaxConcurrentSessionExecutionSize\":1000},\"Workflows\":[],\"Activities\":[{\"Name\":\"SendNotificationCommand\"}]}\u001b[0m"}
2022-02-05T19:42:55.721Z	DEBUG	temporal    	worker info	{"taskqueue": "events_hub_bridge_command_bus", "options": {"MaxConcurrentActivityExecutionSize":0,"WorkerActivitiesPerSecond":0,"MaxConcurrentLocalActivityExecutionSize":0,"WorkerLocalActivitiesPerSecond":0,"TaskQueueActivitiesPerSecond":0,"MaxConcurrentActivityTaskPollers":0,"MaxConcurrentWorkflowTaskExecutionSize":0,"MaxConcurrentWorkflowTaskPollers":0,"EnableLoggingInReplay":false,"DisableStickyExecution":false,"StickyScheduleToStartTimeout":0,"BackgroundActivityContext":null,"WorkflowPanicPolicy":0,"WorkerStopTimeout":0,"EnableSessionWorker":false,"MaxConcurrentSessionExecutionSize":1000,"LocalActivityWorkerOnly":false,"Identity":"","DeadlockDetectionTimeout":0,"Interceptors":null}}
2022-02-05T19:42:55.722Z	DEBUG	temporal    	No workflows registered. Skipping workflow worker start	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:2d87a2ae-a8a7-4d3e-ae31-d4389af707c0"}
2022-02-05T19:42:55.722Z	DEBUG	temporal    	No activities registered. Skipping activity worker start	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:2d87a2ae-a8a7-4d3e-ae31-d4389af707c0"}
2022-02-05T19:42:55.722Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:2d87a2ae-a8a7-4d3e-ae31-d4389af707c0"}
2022-02-05T19:42:55.722Z	DEBUG	temporal    	No workflows registered. Skipping workflow worker start	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:440b9bf1-1b20-44cc-95e6-f34ea77b5715"}
2022-02-05T19:42:55.724Z	INFO	temporal    	Started Worker	{"Namespace": "default", "TaskQueue": "events_hub_bridge_command_bus", "WorkerID": "events_hub_bridge_command_bus:440b9bf1-1b20-44cc-95e6-f34ea77b5715"}
2022-02-05T19:42:55.724Z	DEBUG	rpc         	plugin was started	{"address": "tcp://127.0.0.1:6001", "list of the plugins with RPC methods:": ["resetter", "temporal", "informer"]}
[INFO] RoadRunner server started; version: 2.7.4, buildtime: 2022-01-21T23:03:32+0000

github.com/stretchr/testify-v1.7.2: 1 vulnerabilities (highest severity is: 7.5) - autoclosed

Vulnerable Library - github.com/stretchr/testify-v1.7.2

Found in HEAD commit: 4c32f22d636145a2ba1033e84f705926be6d8743

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in | Remediation Available |
| --- | -------- | ---- | ---------- | ---- | -------- | --------------------- |
| CVE-2022-28948 | High | 7.5 | github.com/stretchr/objx-v0.4.0 | Transitive | N/A | ❌ |

Details

CVE-2022-28948

Vulnerable Library - github.com/stretchr/objx-v0.4.0

Go package for dealing with maps, slices, JSON and other data.

Dependency Hierarchy:

  • github.com/stretchr/testify-v1.7.2 (Root Library)
    • โŒ github.com/stretchr/objx-v0.4.0 (Vulnerable Library)

Found in HEAD commit: 4c32f22d636145a2ba1033e84f705926be6d8743

Found in base branch: master

Vulnerability Details

An issue in the Unmarshal function in Go-Yaml v3 causes the program to crash when attempting to deserialize invalid input.

Publish Date: 2022-05-19

URL: CVE-2022-28948

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: GHSA-hp87-p4gw-j4gq

Release Date: 2022-05-19

Fix Resolution: 3.0.0

github.com/uber-go/tally/v4-v4.1.2: 2 vulnerabilities (highest severity is: 7.5) - autoclosed

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.2

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.2.zip

Found in HEAD commit: 8208653fe0c729f462851afd04864d5a183f9855

Vulnerabilities

| CVE | Severity | CVSS | Dependency | Type | Fixed in | Remediation Available |
| --- | -------- | ---- | ---------- | ---- | -------- | --------------------- |
| CVE-2019-0205 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.2 | Direct | org.apache.thrift:libthrift:0.13.0 | ❌ |
| CVE-2019-0210 | High | 7.5 | github.com/uber-go/tally/v4-v4.1.2 | Direct | 0.13.0 | ❌ |

Details

CVE-2019-0205

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.2

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.2.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.2 (Vulnerable Library)

Found in HEAD commit: 8208653fe0c729f462851afd04864d5a183f9855

Found in base branch: master

Vulnerability Details

In Apache Thrift all versions up to and including 0.12.0, a server or client may run into an endless loop when fed with specific input data. Because the issue had already been partially fixed in version 0.11.0, depending on the installed version it affects only certain language bindings.

Publish Date: 2019-10-29

URL: CVE-2019-0205

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0205

Release Date: 2019-10-29

Fix Resolution: org.apache.thrift:libthrift:0.13.0

CVE-2019-0210

Vulnerable Library - github.com/uber-go/tally/v4-v4.1.2

A Go metrics interface with fast buffered metrics and third party reporters

Library home page: https://proxy.golang.org/github.com/uber-go/tally/v4/@v/v4.1.2.zip

Dependency Hierarchy:

  • โŒ github.com/uber-go/tally/v4-v4.1.2 (Vulnerable Library)

Found in HEAD commit: 8208653fe0c729f462851afd04864d5a183f9855

Found in base branch: master

Vulnerability Details

In Apache Thrift 0.9.3 to 0.12.0, a server implemented in Go using TJSONProtocol or TSimpleJSONProtocol may panic when fed with invalid input data.

Publish Date: 2019-10-29

URL: CVE-2019-0210

CVSS 3 Score Details (7.5)

Base Score Metrics:

  • Exploitability Metrics:
    • Attack Vector: Network
    • Attack Complexity: Low
    • Privileges Required: None
    • User Interaction: None
    • Scope: Unchanged
  • Impact Metrics:
    • Confidentiality Impact: None
    • Integrity Impact: None
    • Availability Impact: High

For more information on CVSS3 Scores, click here.

Suggested Fix

Type: Upgrade version

Origin: http://mail-archives.apache.org/mod_mbox/thrift-dev/201910.mbox/%3C277A46CA87494176B1BBCF5D72624A2A%40HAGGIS%3E

Release Date: 2019-10-29

Fix Resolution: 0.13.0

[Bug] 60 seconds rescheduling time when processing many workflows at once

Describe the bug
When load testing Temporal with php-sdk, the behaviour of the RoadRunner workers is sometimes weird (I wasn't able to pin down the conditions): processing gets stuck for 60 seconds and resumes afterwards, which is not ideal in a production deployment. When running multiple RoadRunner processes at once (each with multiple workers), the processing doesn't get stuck, or at least not all the RoadRunner instances get stuck, so the chance of this happening is much lower. It seems like there is a timeout in RoadRunner.

To Reproduce
Steps to reproduce the behavior:
I was also able to reproduce the behaviour on a modified sample from the samples repository (https://github.com/temporalio/samples-php)

I am running the polymorphic example but I modified the ExecuteCommand to this:

class ExecuteCommand extends Command
{
    protected const NAME = 'polymorphic';
    protected const DESCRIPTION = 'Execute PolymorphicActivity\GreetingWorkflow';

    public function execute(InputInterface $input, OutputInterface $output)
    {
        $output->writeln("Starting <comment>GreetingWorkflow</comment>... ");

        for ($i = 0; $i < 2000; $i++) {
            $workflow = $this->workflowClient->newWorkflowStub(
                GreetingWorkflowInterface::class,
                WorkflowOptions::new()//->withWorkflowExecutionTimeout(CarbonInterval::minute())
            );
            $run = $this->workflowClient->start($workflow, 'Antony');
        }
//        $output->writeln(
//            sprintf(
//                'Started: WorkflowID=<fg=magenta>%s</fg=magenta>, RunID=<fg=magenta>%s</fg=magenta>',
//                $run->getExecution()->getID(),
//                $run->getExecution()->getRunID(),
//            )
//        );
//
//        $output->writeln(sprintf("Result:\n<info>%s</info>", $run->getResult()));

        return self::SUCCESS;
    }
}

and the GreetingWorkflow to this:

class GreetingWorkflow implements GreetingWorkflowInterface
{
    /** @var GreetingWorkflowInterface[] */
    private $greetingActivities = [];

    public function __construct()
    {
        $this->greetingActivities = [
            Workflow::newActivityStub(
                HelloActivity::class,
                ActivityOptions::new()->withScheduleToCloseTimeout(CarbonInterval::seconds(20))
            ),
            Workflow::newActivityStub(
                ByeActivity::class,
                ActivityOptions::new()->withScheduleToCloseTimeout(CarbonInterval::seconds(20))
            ),
        ];
    }

    public function greet(string $name): \Generator
    {
        $result = [];
        foreach ($this->greetingActivities as $activity) {
            $result[] = yield $activity->composeGreeting($name);
        }

        return join("\n", $result);
    }
}

It doesn't always occur, but running it multiple times in a row should make the problem appear.

Expected behavior
When there are running workflows in Temporal scheduled for the current time, the workers shouldn't be waiting; they should be processing them.

Screenshots/Terminal output
If applicable, add screenshots or code blocks to help explain your problem. You can also use Loom to do short, free video bug reports.

Versions (please complete the following information where relevant):

  • OS: Ubuntu in Docker for Mac
  • Temporal Version 1.10.1
  • are you using Docker or Kubernetes or building Temporal from source? Docker

Additional context
More detailed info is here: https://community.temporal.io/t/php-weird-behaviour-while-load-testing/2437/14
