
execution-apis's Introduction

Execution API Specification

JSON-RPC

View the spec

The Ethereum JSON-RPC is a standard collection of methods that all execution clients implement. It is the canonical interface between users and the network. This interface allows downstream tooling and infrastructure to treat different Ethereum clients as modules that can be swapped at will.
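For example, assuming a client exposes JSON-RPC on the conventional local endpoint, requesting the latest block number looks like this (the result value is illustrative):

$ curl -X POST http://localhost:8545 \
    -H 'Content-Type: application/json' \
    -d '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'
{"jsonrpc":"2.0","id":1,"result":"0x112a880"}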

Contributing

Please see the contributors guide in docs/making-changes.md for general information about the process of standardizing new API methods and making changes to existing ones. Information on test generation can be found in tests/README.md.

The specification itself is written in OpenRPC. Refer to the OpenRPC specification and the JSON schema specification to get started.

Building

The specification is split into multiple files to improve readability. The spec can be compiled into a single document as follows:

$ npm install
$ npm run build
Build successful.

This will output the file openrpc.json in the root of the project. This file will have all schema #refs resolved.
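Schematically, and with illustrative names only, an entry like this in the source files:

"schema": { "$ref": "#/components/schemas/uint" }

ends up in the built openrpc.json with the referenced definition resolved in its place, along the lines of:

"schema": { "title": "hex encoded unsigned integer", "type": "string", "pattern": "^0x[0-9a-f]+$" }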

Testing

There are several mechanisms for testing specification contributions and client conformance.

First is the OpenRPC validator. It performs some basic syntactic checks on the generated specification.

$ npm install
$ npm run lint
OpenRPC spec validated successfully.

Next is speccheck. This tool validates the test cases in the tests directory against the specification.

$ go install github.com/lightclient/rpctestgen/cmd/speccheck@latest
$ speccheck -v
all passing.

The spell checker ensures the specification is free of spelling errors.

$ pip install pyspelling
$ pyspelling -c spellcheck.yaml
Spelling check passed :)

Finally, the test cases in the tests/ directory may be run against individual execution clients using the hive simulator rpc-compat. Please see the documentation in the aforementioned repositories for more information.

GraphQL

View the spec

EIP-1767 proposed a GraphQL schema for interacting with Ethereum clients. Since then Besu and Geth have implemented the interface. This repo contains a live specification to integrate changes to the protocol as well as other improvements into the GraphQL schema.

Generation

The schema in this repo is generated by issuing a meta GraphQL query against a live node. This can be done as follows:

$ npm run graphql:schema
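The "meta" query here is GraphQL's standard introspection mechanism; a trimmed-down introspection query looks like the following (the actual script may use the full introspection query):

{
  __schema {
    types {
      name
      kind
    }
  }
}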

Testing

A script is included in the source code which reads the schema and validates it. It is recommended to run this check after modifying the schema:

$ npm run graphql:validate

License

This repository is licensed under CC0.


execution-apis's Issues

Why do some calls support named parameters while others don't?

Hi,

I'm working on a custom connector between Infura and a low-code platform.
During implementation, I've noticed that some methods like eth_sendTransaction do support named parameters:

{
   "jsonrpc":"2.0",
   "method":"eth_sendTransaction",
   "params":[
      {
         "from":"0xeb85a5557e5bdc18ee1934a89d8bb402398ee26a",
         "to":"0x6ff93b4b46b41c0c3c9baee01c255d3b4675963d",
         "data":"0xc6888fa10000000000000000000000000000000000000000000000000000000000000006"
      }
   ],
   "id":8
}

However, other calls such as eth_getBalance only support positional parameters:

{
    "jsonrpc": "2.0",
    "method": "eth_getBalance",
    "params": [
        "0x__________________",
        "latest"
    ],
    "id": 0
}

Positional parameters are harder to implement and generalize in my scenario.
Could someone please explain the design considerations behind the different parameter styles and confirm that there is no support for named parameters on the eth_getBalance method?

I've already tried the approach below, but it doesn't work, and the open-rpc doc confirms this.
I've also asked on the Infura forum.

Doesn't work
{
    "jsonrpc": "2.0",
    "method": "eth_getBalance",
    "params": [
        {
            "Address": "0x__________________",
            "Block": "latest"
        }
    ],
    "id": 0
}
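
In the meantime I'm mapping named fields to the positional order on my side; a minimal TypeScript sketch (the helper and type names are mine, not part of any API):

// Hypothetical wrapper: accepts named fields, sends them in the
// positional order that eth_getBalance expects.
type GetBalanceArgs = { address: string; block: string };

async function getBalance(url: string, args: GetBalanceArgs): Promise<string> {
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      method: "eth_getBalance",
      params: [args.address, args.block], // positional: [address, blockTag]
      id: 1,
    }),
  });
  const json = await response.json();
  return json.result; // hex-encoded balance in wei
}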

Thanks
- Sebastian

engine_executePayload QoS when sync triggered

When running engine_executePayload, the CL might trigger the EL to sync. This might be a huge sync that takes a long time, or it could be a short-range sync filling in a few blocks.

In #84, we discussed potentially adding some notion of how long EL can take to retrieve data from the network before it decides it is SYNCING and can't respond with VALID/INVALID.

This brings in additional complexity and only optimizes the (likely) rare cases where the EL and CL are inconsistent by a block or two.

I'm leaning toward not adding the note at all, but am dropping this for discussion before a final decision is made.

Some relevant discussion points here -- #84 (comment)

`Block` allOf items should be combining all objects together and support array of tx hashes and "hydrated" transactions

https://github.com/ethereum/eth1.0-apis/blob/c8fa3348b7fbe42e230a094460a68adfd8403aca/src/schemas/block.json#L93-L109

This doesn't look right; it combines headerObject directly with an array of tx hashes.

It should combine type: "object" entries with the correct shape that you want to merge:

"allOf": [
      {
        "$ref": "./Header.json#/headerObject"
      },
      {
       "type: "object",
       "properties": {
           "transactions": {
              "$ref": "./Block.json#/transactionHash"
           }
       },
        {
       "type: "object",
       "properties": {
           "uncles": {
              "$ref": "./Block.json#/transactionHash"
           }
       }
]       

The name transactionHash seems like an incorrect name for an array of transaction hashes.

Also, you can pass true to eth_getBlockByNumber and it returns the full transaction objects instead of just the hashes, so the schema needs to take that into account too.
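
A rough sketch of how the transactions property could allow both forms (the $ref targets here are illustrative, not the actual spec paths):

"transactions": {
  "oneOf": [
    {
      "type": "array",
      "items": { "$ref": "./Block.json#/transactionHash" }
    },
    {
      "type": "array",
      "items": { "$ref": "./Transaction.json#/transactionObject" }
    }
  ]
}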

New "capabilities" method

Had a discussion with @pipermerriam and he had a really good recommendation that there should be an interface that allows clients to communicate to users what they are "capable" of serving. A few example client responses:

"I only store the last N blocks. I can't tell you information about blocks before that."

"I don't maintain a block number to block hash mapping, so you can only request information using the block hash."

"I don't participate in the tx pool, so you'll need to send your transactions to a different client if you want them to propagate to the network."

With the merge on the horizon, this could be an important new method in the execution client API, as "finality" may cause clients to finally start dropping old data. Without a way of communicating that, clients may return confusing information like "not found".
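
Purely as an illustration of the idea (no such method exists in the spec; the fields below are hypothetical), a response could look something like:

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "historyBlocks": 128,
    "blockNumberIndex": false,
    "txPool": false
  }
}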

JSON RPC Endpoints

https://github.com/DockBoss/EdgeCases

  • #55: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #54: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #53: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #52: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #51: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #50: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #49: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #48: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #47: Geth, Erigon, OpenEthereum, Nethermind, Besu
  • #46: Geth, Erigon, OpenEthereum, Nethermind, Besu

Engine API: send payload validation error to CL

Currently there is a message: STRING|null field in the response object of the engine_executePayload method. This field is supposed to be used when the payload is invalid, to provide additional information on what exactly has failed. This should improve the readability of CL client logs, as it gives an idea of what went wrong during execution without needing to look into EL client logs.

@MariusVanDerWijden pointed out that the message field is confusing as it doesn't correspond to the regular JSON-RPC error format. There are a couple of options:

  1. Replace message with validationError: null|{code: errorCode, message: errorMessage} field
  2. Rename message to validationError and don't include the code

IMO, the first option makes sense if there is a list of messages and codes corresponding to errors that occur during validation and execution of a payload. If there is no such list, it's probably not worth spending time on standardising these errors and then adjusting clients to adhere to that standard; option 2 could work well as long as a meaningful message is put in the validationError field, even though the messages will vary from one EL client to another.
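
To make option 1 concrete, an invalid-payload response could carry something like the following (both values are placeholders, not standardised codes or messages):

"validationError": {
  "code": 1,
  "message": "invalid state root"
}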

Log object for `eth_newfilter`

eth_newfilter is defined as in the spec screenshot (not possible to copy the text, see #66):

[screenshot of the eth_newfilter definition]

But there is no link anywhere to this 'log' object; I guess it is the object returned in the notification.
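
For reference, a log object as returned by the filter methods (eth_getFilterChanges / eth_getFilterLogs) typically looks like this; the address and hash values below are truncated placeholders:

{
  "address": "0x1f98...f984",
  "topics": ["0xddf2...b3ef"],
  "data": "0x0000000000000000000000000000000000000000000000000000000000000001",
  "blockNumber": "0x11",
  "blockHash": "0x8216...c889",
  "transactionHash": "0x77b7...9d5c",
  "transactionIndex": "0x0",
  "logIndex": "0x0",
  "removed": false
}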

Incorrect spec for `Filter`

Filter is missing BlockTag support for fromBlock and toBlock

The legacy spec allows a block tag for these fields; this spec only supports uint.


Additionally, Topic doesn't support null or an array of Topics

Example from legacy spec:

"topics": ["0x000000000000000000000000a94f5374fce5edbc8e2a8697c15331677e6ebf0b", null, ["0x000000000000000000000000a94f5374fce5edbc8e2a8697c15331677e6ebf0b", "0x0000000000000000000000000aff3454fce5edbc8cca8697c15331677e6ebccc"]]

So the definition should be something similar to:

topics?: (Topic | Topic[] | null)[];
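
In the JSON Schema terms used by this repo, that could be expressed roughly as follows (a sketch; the $ref target is illustrative):

"topics": {
  "type": "array",
  "items": {
    "oneOf": [
      { "type": "null" },
      { "$ref": "#/components/schemas/Topic" },
      {
        "type": "array",
        "items": { "$ref": "#/components/schemas/Topic" }
      }
    ]
  }
}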

Support linking to individual methods

Incorrect `Block` schema

According to the legacy API spec:

  • number is missing the possible null value for a pending block
  • hash of the block is missing
    • Not sure if mixHash is supposed to cover this?
  • nonce is missing the possible null value for a pending block
  • logsBloom is missing the possible null value for a pending block
  • difficulty has a return type of bytes, but should be uint

Integer values and leading zeros

Originally posted by @fvictorio, here.

--

Methods like eth_getBlockByNumber expect an Integer value, which is defined like this (eth1.0-specs/json-rpc/spec.json, line 1791 at ff5f44c):

"pattern": "^0x[a-fA-F0-9]+$",

Notice that according to this pattern, a value like "0x0123" should be accepted, but if you try that in geth you get this error:

invalid argument 0: hex number with leading zero digits

In Hardhat we've decided to be as strict as geth with this, because otherwise users running code in Hardhat could find later that the same code doesn't work on geth. But accepting leading zeros makes sense because some tools generate hexadecimal strings padded with zeros (for example, to have an even length). See this Hardhat issue for a scenario where this happens.

I don't know what the right thing to do here is. Ideally geth, and any other client that does this, would be more permissive about it. The alternative is to make the spec itself match the strict behavior, but I'd rather follow the robustness principle here.
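
If the spec were tightened to match geth's behaviour instead, the pattern would have to reject leading zeros; one possible form, shown only as an illustration:

"pattern": "^0x(0|[1-9a-fA-F][0-9a-fA-F]*)$"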

Engine API: proposal to add `engine_getBlockBodies` method

Specification

Structures

BlockBodyV1

  • transactions: Array of DATA - Array of transaction objects, each object is a byte list (DATA) representing TransactionType || TransactionPayload or LegacyTransaction as defined in EIP-2718

Core

engine_getBlockBodiesV1

Request

  • method: engine_getBlockBodiesV1
  • params:
    1. Array of DATA, 32 Bytes - Array of block hashes.

Response

  • result: Array of BlockBody - Array of BlockBody objects.
  • error: code and message set in case an exception happens while processing the method call.

Specification

  1. Given an array of block hashes, client software MUST respond with an array of BlockBody objects with the corresponding hashes, respecting the order of block hashes in the input array.
  2. Client software SHOULD trim the array of block bodies in the response if any block body is missing.

Rationale

This allows ExecutionPayloads to be replaced with ExecutionPayloadHeaders when persisting blocks on the CL side, with transaction lists requested from the EL client when serving beacon blocks to a user or a remote peer. The max size of an ExecutionPayloadHeader is 532 bytes, which results in a 667MB increase in the space occupied by beacon blocks over a 6-month period compared to what we have today, before the Merge. The increase in the space required to store blocks with full payload objects may be up to 2.5TB over the same period of time, considering a payload max size of 2MB.

This proposal attempts to reduce implementation complexity on the EL side, as the semantics of this method map onto the semantics of the existing GetBlockBodies message in the ETH protocol.

A limit on the size of the input array isn't required, considering the trustful relationship between the layers, which will be secured by authentication of the Engine API communication channel. Network requests in CL clients are currently limited by MAX_REQUEST_BLOCKS == 1024. The size of the input array is not expected to exceed this limit in the normal case, though a sanity limit greater than MAX_REQUEST_BLOCKS may also be considered.

cc @djrtwo @arnetheduck
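
An illustrative request/response pair for the proposed method (the block hash and transaction bytes are truncated placeholders):

{
  "jsonrpc": "2.0",
  "method": "engine_getBlockBodiesV1",
  "params": [
    ["0x3559...7858"]
  ],
  "id": 1
}

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": [
    { "transactions": ["0x02f870...", "0xf86b..."] }
  ]
}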

calculating Reserves after a swap using amountIn & amounts

I'm trying to simulate some pending transactions using eth_call.
eth_call returns just the list of amounts (calculated using getAmounts) and doesn't emit events, while I need the reserves after the swap for each pair in the path, and the problem is that each swap function uses a different input and output.
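
For context, a call like this only returns the function's return data against the chosen block state, with no event information (the to address is a placeholder and the data field would carry the ABI-encoded call to simulate, shown truncated):

{
  "jsonrpc": "2.0",
  "method": "eth_call",
  "params": [
    {
      "to": "0x0000000000000000000000000000000000000000",
      "data": "0x..."
    },
    "latest"
  ],
  "id": 1
}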

Possible optional params for transaction input

Regarding these transaction specs, certain fields were specified as optional in the legacy spec. I'm not sure if this is still true, and even if it is, I'm not sure whether the added EIP-2930 and EIP-1559 parameters are optional.
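
For example, under the legacy behaviour a minimal eth_sendTransaction object could omit most fields and rely on the client to fill in defaults such as gas, gasPrice and nonce (whether the newer typed-transaction fields get the same treatment is exactly the open question):

{
  "from": "0xeb85a5557e5bdc18ee1934a89d8bb402398ee26a",
  "to": "0x6ff93b4b46b41c0c3c9baee01c255d3b4675963d",
  "value": "0x1"
}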
