16 Commits

Author SHA1 Message Date
Luke Parker
1a766ab773 Populate UnbalancedMerkleTrees in headers 2025-03-04 06:00:06 -05:00
Luke Parker
df2ae10d2f Add an UnbalancedMerkleTree primitive
The reasoning for it is documented alongside the type itself. The plan is to use it within
our header for committing to the DAG (allowing one header per epoch, yet
logarithmic proofs for any header within the epoch), the transactions
commitment (allowing logarithmic proofs of a transaction within a block,
without padding), and the events commitment (allowing logarithmic proofs of
unique events within a block, despite events not having a unique ID inherent).

This also defines transaction hashes and performs the necessary modifications
for transactions to be unique.
2025-03-04 04:00:05 -05:00
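The commit above describes logarithmic proofs without padding. A minimal sketch of how an unbalanced Merkle root can be computed, promoting an odd trailing node instead of hashing in padding, might look like the following. It uses std's non-cryptographic `DefaultHasher` purely as a stand-in combiner; none of these names are serai's actual API.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in combiner; a real tree would use a cryptographic hash.
fn combine(l: u64, r: u64) -> u64 {
    let mut s = DefaultHasher::new();
    l.hash(&mut s);
    r.hash(&mut s);
    s.finish()
}

// Root of an unbalanced Merkle tree: an odd trailing node is promoted to the
// next layer unchanged, so proofs stay logarithmic without padding leaves.
fn unbalanced_root(mut layer: Vec<u64>) -> Option<u64> {
    if layer.is_empty() {
        return None;
    }
    while layer.len() > 1 {
        let mut next = Vec::with_capacity(layer.len().div_ceil(2));
        for pair in layer.chunks(2) {
            match pair {
                [l, r] => next.push(combine(*l, *r)),
                [odd] => next.push(*odd), // promoted, not padded
                _ => unreachable!(),
            }
        }
        layer = next;
    }
    Some(layer[0])
}
```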
Luke Parker
b92ac4a15b Use borsh entirely in create_db 2025-02-26 14:50:59 -05:00
Luke Parker
51bae4fedc Remove now-consolidated primitives crates 2025-02-26 14:49:28 -05:00
Luke Parker
ee8b353132 Skeleton runtime with new types 2025-02-26 14:16:04 -05:00
Luke Parker
a2d558ee34 Have apply return Ok even if calls failed
This ensures fees are paid, and block building isn't interrupted, even for TXs
which error.
2025-02-26 08:00:07 -05:00
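A hypothetical sketch of the shape this commit implies (all names illustrative, not serai's): `apply` charges the fee and reports the call's failure as data rather than as its own error, so fees are collected and block building proceeds.

```rust
// The call's failure is recorded, not propagated, so the fee is still
// charged and apply itself still returns Ok.
struct Applied {
    fee_paid: u64,
    call_result: Result<(), String>,
}

fn apply(fee: u64, call: impl FnOnce() -> Result<(), String>) -> Result<Applied, ()> {
    // Fees are paid unconditionally, regardless of the call's outcome.
    Ok(Applied { fee_paid: fee, call_result: call() })
}
```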
Luke Parker
3273a4b725 Serialize BoundedVec not with a u32 length, but the minimum-viable uN where N%8==0
This does break borsh's definition of a Vec EXCEPT if the BoundedVec is
considered an enum. For sufficiently low bounds, this is viable, though it
requires automated code generation to be sane.
2025-02-26 07:41:07 -05:00
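A sketch of the described rule, choosing the smallest N (with N % 8 == 0) able to represent the bound; this is illustrative, not serai's code.

```rust
// Bytes needed for the length prefix of a BoundedVec with the given bound:
// the minimum-viable uN, where N is a multiple of 8.
fn length_prefix_bytes(bound: u64) -> usize {
    match bound {
        0..=0xFF => 1,                  // u8
        0x100..=0xFFFF => 2,            // u16
        0x1_0000..=0xFF_FFFF => 3,      // u24
        0x100_0000..=0xFFFF_FFFF => 4,  // u32
        _ => 8,                         // larger bounds are impractical here
    }
}
```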
Luke Parker
df87abbae0 Correct distinction/flow of check/validate/apply 2025-02-26 07:24:58 -05:00
Luke Parker
fdf2ec8e92 Make transaction an enum of Unsigned, Signed 2025-02-26 06:54:42 -05:00
Luke Parker
f92fe922a6 Remove RuntimeCall from Transaction
I believe this was originally here as we needed to return a reference, not an
owned instance, so this caching enabled returning a reference? Regardless, it
isn't valuable now.
2025-02-26 05:19:04 -05:00
Luke Parker
121a48b55c Add traits necessary for serai_abi::Transaction to be usable in-runtime 2025-02-26 05:05:35 -05:00
Luke Parker
dff9a04a8c Add the UNIX timestamp (in milliseconds) to the block
This is read from the BABE pre-digest when converting from a SubstrateHeader.
This causes the genesis block to have time 0 and all blocks produced with BABE
to have a time of the slot time. While the slot time is in 6-second intervals
(due to our target block time), defining in milliseconds preserves the ABI for
long-term goals (sub-second blocks).

Usage of the slot time deduplicates this field with BABE, and leaves the only
possible manipulation to propose during a slot or to not propose during a slot.

The actual reason this was implemented this way is because the Header trait is
overly restrictive and doesn't allow definition with new fields. Even if we
wanted to express the timestamp within the SubstrateHeader, we can't without
replacing Header::new and making a variety of changes to the polkadot-sdk
accordingly. Those aren't worth it at this moment compared to the solution
implemented.
2025-02-17 02:14:31 -05:00
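The derivation described above can be sketched as follows: with a 6-second target block time, a block's millisecond timestamp is its BABE slot number times the slot duration, and genesis (which has no BABE pre-digest) gets 0. Names are illustrative.

```rust
// 6-second slots, expressed in milliseconds to preserve the ABI for
// potential sub-second blocks later.
const SLOT_DURATION_MS: u64 = 6000;

// Genesis has no BABE pre-digest, hence no slot, hence time 0.
fn block_time_ms(babe_slot: Option<u64>) -> u64 {
    babe_slot.map_or(0, |slot| slot * SLOT_DURATION_MS)
}
```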
Luke Parker
2d8f70036a Redo primitives, abi
Consolidates all primitives into a single crate. We didn't benefit from its
fragmentation. I'm hesitant to say the new internal organization is better (it
may be just as clunky), but it's at least in a single crate (not spread out
over micro-crates).

The ABI is the most distinct. We now entirely own it. Block header hashes don't
directly commit to any BABE data (avoiding potentially ~4 KB headers upon
session changes), and are hashed as borsh (a more widely used codec than
SCALE). There are still Substrate variants, using SCALE and with the BABE data,
but they're prunable from a protocol design perspective.

Defines a transaction as a Vec of Calls, allowing atomic operations.
2025-02-12 03:54:57 -05:00
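The last line of the commit describes batching: a transaction is a Vec of calls applied as one unit. A hypothetical sketch of such atomic application, rolling back all state changes if any call fails (the Call variants here are illustrative, not serai's):

```rust
use std::collections::HashMap;

// Illustrative call set; serai's actual Call enum differs.
enum Call {
    Set(u8, u64),
    Fail,
}

// Apply every call or none: on any failure, restore the snapshot.
fn apply_atomic(state: &mut HashMap<u8, u64>, calls: &[Call]) -> Result<(), ()> {
    let snapshot = state.clone();
    for call in calls {
        match call {
            Call::Set(k, v) => {
                state.insert(*k, *v);
            }
            Call::Fail => {
                *state = snapshot;
                return Err(());
            }
        }
    }
    Ok(())
}
```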
Luke Parker
dd95494d9c Update deny, rust-src component 2025-02-04 08:12:02 -05:00
Luke Parker
653b0e0bbc Update the git tags
Does no actual migration work. This allows establishing the difference in
dependencies between substrate and polkadot-sdk/substrate.
2025-02-04 07:53:41 -05:00
Luke Parker
d78c92bc3e Update nightly version 2025-02-04 00:53:22 -05:00
173 changed files with 5739 additions and 7948 deletions


@@ -21,4 +21,4 @@ jobs:
run: cargo install --locked cargo-deny
- name: Run cargo deny
- run: cargo deny -L error --all-features check
+ run: cargo deny -L error --all-features check --hide-inclusion-graph


@@ -26,7 +26,7 @@ jobs:
uses: ./.github/actions/build-dependencies
- name: Install nightly rust
- run: rustup toolchain install ${{ steps.nightly.outputs.version }} --profile minimal -t wasm32-unknown-unknown -c clippy
+ run: rustup toolchain install ${{ steps.nightly.outputs.version }} --profile minimal -t wasmv1-none -c clippy
- name: Run Clippy
run: cargo +${{ steps.nightly.outputs.version }} clippy --all-features --all-targets -- -D warnings -A clippy::items_after_test_module
@@ -55,7 +55,7 @@ jobs:
run: cargo install --locked cargo-deny
- name: Run cargo deny
- run: cargo deny -L error --all-features check
+ run: cargo deny -L error --all-features check --hide-inclusion-graph
fmt:
runs-on: ubuntu-latest


@@ -69,7 +69,7 @@ jobs:
uses: ./.github/actions/build-dependencies
- name: Buld Rust docs
run: |
- rustup toolchain install ${{ steps.nightly.outputs.version }} --profile minimal -t wasm32-unknown-unknown -c rust-docs
+ rustup toolchain install ${{ steps.nightly.outputs.version }} --profile minimal -t wasmv1-none -c rust-docs
RUSTDOCFLAGS="--cfg docsrs" cargo +${{ steps.nightly.outputs.version }} doc --workspace --all-features
mv target/doc docs/_site/rust

Cargo.lock (generated, 5131 lines changed): file diff suppressed because it is too large.


@@ -103,31 +103,17 @@ members = [
"coordinator",
"substrate/primitives",
- "substrate/coins/primitives",
- "substrate/coins/pallet",
- "substrate/dex/pallet",
- "substrate/validator-sets/primitives",
- "substrate/validator-sets/pallet",
- "substrate/genesis-liquidity/primitives",
- "substrate/genesis-liquidity/pallet",
- "substrate/emissions/primitives",
- "substrate/emissions/pallet",
- "substrate/economic-security/pallet",
- "substrate/in-instructions/primitives",
- "substrate/in-instructions/pallet",
- "substrate/signals/primitives",
- "substrate/signals/pallet",
"substrate/abi",
+ "substrate/coins",
+ "substrate/validator-sets",
+ "substrate/signals",
+ "substrate/dex",
+ "substrate/genesis-liquidity",
+ "substrate/economic-security",
+ "substrate/emissions",
+ "substrate/in-instructions",
"substrate/runtime",
"substrate/node",


@@ -1,427 +0,0 @@
Attribution-ShareAlike 4.0 International
=======================================================================
Creative Commons Corporation ("Creative Commons") is not a law firm and
does not provide legal services or legal advice. Distribution of
Creative Commons public licenses does not create a lawyer-client or
other relationship. Creative Commons makes its licenses and related
information available on an "as-is" basis. Creative Commons gives no
warranties regarding its licenses, any material licensed under their
terms and conditions, or any related information. Creative Commons
disclaims all liability for damages resulting from their use to the
fullest extent possible.
Using Creative Commons Public Licenses
Creative Commons public licenses provide a standard set of terms and
conditions that creators and other rights holders may use to share
original works of authorship and other material subject to copyright
and certain other rights specified in the public license below. The
following considerations are for informational purposes only, are not
exhaustive, and do not form part of our licenses.
Considerations for licensors: Our public licenses are
intended for use by those authorized to give the public
permission to use material in ways otherwise restricted by
copyright and certain other rights. Our licenses are
irrevocable. Licensors should read and understand the terms
and conditions of the license they choose before applying it.
Licensors should also secure all rights necessary before
applying our licenses so that the public can reuse the
material as expected. Licensors should clearly mark any
material not subject to the license. This includes other CC-
licensed material, or material used under an exception or
limitation to copyright. More considerations for licensors:
wiki.creativecommons.org/Considerations_for_licensors
Considerations for the public: By using one of our public
licenses, a licensor grants the public permission to use the
licensed material under specified terms and conditions. If
the licensor's permission is not necessary for any reason--for
example, because of any applicable exception or limitation to
copyright--then that use is not regulated by the license. Our
licenses grant only permissions under copyright and certain
other rights that a licensor has authority to grant. Use of
the licensed material may still be restricted for other
reasons, including because others have copyright or other
rights in the material. A licensor may make special requests,
such as asking that all changes be marked or described.
Although not required by our licenses, you are encouraged to
respect those requests where reasonable. More considerations
for the public:
wiki.creativecommons.org/Considerations_for_licensees
=======================================================================
Creative Commons Attribution-ShareAlike 4.0 International Public
License
By exercising the Licensed Rights (defined below), You accept and agree
to be bound by the terms and conditions of this Creative Commons
Attribution-ShareAlike 4.0 International Public License ("Public
License"). To the extent this Public License may be interpreted as a
contract, You are granted the Licensed Rights in consideration of Your
acceptance of these terms and conditions, and the Licensor grants You
such rights in consideration of benefits the Licensor receives from
making the Licensed Material available under these terms and
conditions.
Section 1 -- Definitions.
a. Adapted Material means material subject to Copyright and Similar
Rights that is derived from or based upon the Licensed Material
and in which the Licensed Material is translated, altered,
arranged, transformed, or otherwise modified in a manner requiring
permission under the Copyright and Similar Rights held by the
Licensor. For purposes of this Public License, where the Licensed
Material is a musical work, performance, or sound recording,
Adapted Material is always produced where the Licensed Material is
synched in timed relation with a moving image.
b. Adapter's License means the license You apply to Your Copyright
and Similar Rights in Your contributions to Adapted Material in
accordance with the terms and conditions of this Public License.
c. BY-SA Compatible License means a license listed at
creativecommons.org/compatiblelicenses, approved by Creative
Commons as essentially the equivalent of this Public License.
d. Copyright and Similar Rights means copyright and/or similar rights
closely related to copyright including, without limitation,
performance, broadcast, sound recording, and Sui Generis Database
Rights, without regard to how the rights are labeled or
categorized. For purposes of this Public License, the rights
specified in Section 2(b)(1)-(2) are not Copyright and Similar
Rights.
e. Effective Technological Measures means those measures that, in the
absence of proper authority, may not be circumvented under laws
fulfilling obligations under Article 11 of the WIPO Copyright
Treaty adopted on December 20, 1996, and/or similar international
agreements.
f. Exceptions and Limitations means fair use, fair dealing, and/or
any other exception or limitation to Copyright and Similar Rights
that applies to Your use of the Licensed Material.
g. License Elements means the license attributes listed in the name
of a Creative Commons Public License. The License Elements of this
Public License are Attribution and ShareAlike.
h. Licensed Material means the artistic or literary work, database,
or other material to which the Licensor applied this Public
License.
i. Licensed Rights means the rights granted to You subject to the
terms and conditions of this Public License, which are limited to
all Copyright and Similar Rights that apply to Your use of the
Licensed Material and that the Licensor has authority to license.
j. Licensor means the individual(s) or entity(ies) granting rights
under this Public License.
k. Share means to provide material to the public by any means or
process that requires permission under the Licensed Rights, such
as reproduction, public display, public performance, distribution,
dissemination, communication, or importation, and to make material
available to the public including in ways that members of the
public may access the material from a place and at a time
individually chosen by them.
l. Sui Generis Database Rights means rights other than copyright
resulting from Directive 96/9/EC of the European Parliament and of
the Council of 11 March 1996 on the legal protection of databases,
as amended and/or succeeded, as well as other essentially
equivalent rights anywhere in the world.
m. You means the individual or entity exercising the Licensed Rights
under this Public License. Your has a corresponding meaning.
Section 2 -- Scope.
a. License grant.
1. Subject to the terms and conditions of this Public License,
the Licensor hereby grants You a worldwide, royalty-free,
non-sublicensable, non-exclusive, irrevocable license to
exercise the Licensed Rights in the Licensed Material to:
a. reproduce and Share the Licensed Material, in whole or
in part; and
b. produce, reproduce, and Share Adapted Material.
2. Exceptions and Limitations. For the avoidance of doubt, where
Exceptions and Limitations apply to Your use, this Public
License does not apply, and You do not need to comply with
its terms and conditions.
3. Term. The term of this Public License is specified in Section
6(a).
4. Media and formats; technical modifications allowed. The
Licensor authorizes You to exercise the Licensed Rights in
all media and formats whether now known or hereafter created,
and to make technical modifications necessary to do so. The
Licensor waives and/or agrees not to assert any right or
authority to forbid You from making technical modifications
necessary to exercise the Licensed Rights, including
technical modifications necessary to circumvent Effective
Technological Measures. For purposes of this Public License,
simply making modifications authorized by this Section 2(a)
(4) never produces Adapted Material.
5. Downstream recipients.
a. Offer from the Licensor -- Licensed Material. Every
recipient of the Licensed Material automatically
receives an offer from the Licensor to exercise the
Licensed Rights under the terms and conditions of this
Public License.
b. Additional offer from the Licensor -- Adapted Material.
Every recipient of Adapted Material from You
automatically receives an offer from the Licensor to
exercise the Licensed Rights in the Adapted Material
under the conditions of the Adapter's License You apply.
c. No downstream restrictions. You may not offer or impose
any additional or different terms or conditions on, or
apply any Effective Technological Measures to, the
Licensed Material if doing so restricts exercise of the
Licensed Rights by any recipient of the Licensed
Material.
6. No endorsement. Nothing in this Public License constitutes or
may be construed as permission to assert or imply that You
are, or that Your use of the Licensed Material is, connected
with, or sponsored, endorsed, or granted official status by,
the Licensor or others designated to receive attribution as
provided in Section 3(a)(1)(A)(i).
b. Other rights.
1. Moral rights, such as the right of integrity, are not
licensed under this Public License, nor are publicity,
privacy, and/or other similar personality rights; however, to
the extent possible, the Licensor waives and/or agrees not to
assert any such rights held by the Licensor to the limited
extent necessary to allow You to exercise the Licensed
Rights, but not otherwise.
2. Patent and trademark rights are not licensed under this
Public License.
3. To the extent possible, the Licensor waives any right to
collect royalties from You for the exercise of the Licensed
Rights, whether directly or through a collecting society
under any voluntary or waivable statutory or compulsory
licensing scheme. In all other cases the Licensor expressly
reserves any right to collect such royalties.
Section 3 -- License Conditions.
Your exercise of the Licensed Rights is expressly made subject to the
following conditions.
a. Attribution.
1. If You Share the Licensed Material (including in modified
form), You must:
a. retain the following if it is supplied by the Licensor
with the Licensed Material:
i. identification of the creator(s) of the Licensed
Material and any others designated to receive
attribution, in any reasonable manner requested by
the Licensor (including by pseudonym if
designated);
ii. a copyright notice;
iii. a notice that refers to this Public License;
iv. a notice that refers to the disclaimer of
warranties;
v. a URI or hyperlink to the Licensed Material to the
extent reasonably practicable;
b. indicate if You modified the Licensed Material and
retain an indication of any previous modifications; and
c. indicate the Licensed Material is licensed under this
Public License, and include the text of, or the URI or
hyperlink to, this Public License.
2. You may satisfy the conditions in Section 3(a)(1) in any
reasonable manner based on the medium, means, and context in
which You Share the Licensed Material. For example, it may be
reasonable to satisfy the conditions by providing a URI or
hyperlink to a resource that includes the required
information.
3. If requested by the Licensor, You must remove any of the
information required by Section 3(a)(1)(A) to the extent
reasonably practicable.
b. ShareAlike.
In addition to the conditions in Section 3(a), if You Share
Adapted Material You produce, the following conditions also apply.
1. The Adapter's License You apply must be a Creative Commons
license with the same License Elements, this version or
later, or a BY-SA Compatible License.
2. You must include the text of, or the URI or hyperlink to, the
Adapter's License You apply. You may satisfy this condition
in any reasonable manner based on the medium, means, and
context in which You Share Adapted Material.
3. You may not offer or impose any additional or different terms
or conditions on, or apply any Effective Technological
Measures to, Adapted Material that restrict exercise of the
rights granted under the Adapter's License You apply.
Section 4 -- Sui Generis Database Rights.
Where the Licensed Rights include Sui Generis Database Rights that
apply to Your use of the Licensed Material:
a. for the avoidance of doubt, Section 2(a)(1) grants You the right
to extract, reuse, reproduce, and Share all or a substantial
portion of the contents of the database;
b. if You include all or a substantial portion of the database
contents in a database in which You have Sui Generis Database
Rights, then the database in which You have Sui Generis Database
Rights (but not its individual contents) is Adapted Material,
including for purposes of Section 3(b); and
c. You must comply with the conditions in Section 3(a) if You Share
all or a substantial portion of the contents of the database.
For the avoidance of doubt, this Section 4 supplements and does not
replace Your obligations under this Public License where the Licensed
Rights include other Copyright and Similar Rights.
Section 5 -- Disclaimer of Warranties and Limitation of Liability.
a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE
EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS
AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF
ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS,
IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION,
WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR
PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS,
ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT
KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT
ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU.
b. TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE
TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION,
NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT,
INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES,
COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR
USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN
ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR
DAMAGES. WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR
IN PART, THIS LIMITATION MAY NOT APPLY TO YOU.
c. The disclaimer of warranties and limitation of liability provided
above shall be interpreted in a manner that, to the extent
possible, most closely approximates an absolute disclaimer and
waiver of all liability.
Section 6 -- Term and Termination.
a. This Public License applies for the term of the Copyright and
Similar Rights licensed here. However, if You fail to comply with
this Public License, then Your rights under this Public License
terminate automatically.
b. Where Your right to use the Licensed Material has terminated under
Section 6(a), it reinstates:
1. automatically as of the date the violation is cured, provided
it is cured within 30 days of Your discovery of the
violation; or
2. upon express reinstatement by the Licensor.
For the avoidance of doubt, this Section 6(b) does not affect any
right the Licensor may have to seek remedies for Your violations
of this Public License.
c. For the avoidance of doubt, the Licensor may also offer the
Licensed Material under separate terms or conditions or stop
distributing the Licensed Material at any time; however, doing so
will not terminate this Public License.
d. Sections 1, 5, 6, 7, and 8 survive termination of this Public
License.
Section 7 -- Other Terms and Conditions.
a. The Licensor shall not be bound by any additional or different
terms or conditions communicated by You unless expressly agreed.
b. Any arrangements, understandings, or agreements regarding the
Licensed Material not stated herein are separate from and
independent of the terms and conditions of this Public License.
Section 8 -- Interpretation.
a. For the avoidance of doubt, this Public License does not, and
shall not be interpreted to, reduce, limit, restrict, or impose
conditions on any use of the Licensed Material that could lawfully
be made without permission under this Public License.
b. To the extent possible, if any provision of this Public License is
deemed unenforceable, it shall be automatically reformed to the
minimum extent necessary to make it enforceable. If the provision
cannot be reformed, it shall be severed from this Public License
without affecting the enforceability of the remaining terms and
conditions.
c. No term or condition of this Public License will be waived and no
failure to comply consented to unless expressly agreed to by the
Licensor.
d. Nothing in this Public License constitutes or may be interpreted
as a limitation upon, or waiver of, any privileges and immunities
that apply to the Licensor or You, including from the legal
processes of any jurisdiction or authority.
=======================================================================
Creative Commons is not a party to its public
licenses. Notwithstanding, Creative Commons may elect to apply one of
its public licenses to material it publishes and in those instances
will be considered the “Licensor.” The text of the Creative Commons
public licenses is dedicated to the public domain under the CC0 Public
Domain Dedication. Except for the limited purpose of indicating that
material is shared under a Creative Commons public license or as
otherwise permitted by the Creative Commons policies published at
creativecommons.org/policies, Creative Commons does not authorize the
use of the trademark "Creative Commons" or any other trademark or logo
of Creative Commons without its prior written consent including,
without limitation, in connection with any unauthorized modifications
to any of its public licenses or any other arrangements,
understandings, or agreements concerning use of licensed material. For
the avoidance of doubt, this paragraph does not form part of the
public licenses.
Creative Commons may be contacted at creativecommons.org.


@@ -1,14 +0,0 @@
# Trail of Bits Ethereum Contracts Audit, June 2025
This audit included:
- Our Schnorr contract and associated library (/networks/ethereum/schnorr)
- Our Ethereum primitives library (/processor/ethereum/primitives)
- Our Deployer contract and associated library (/processor/ethereum/deployer)
- Our ERC20 library (/processor/ethereum/erc20)
- Our Router contract and associated library (/processor/ethereum/router)
It is encompassing up to commit 4e0c58464fc4673623938335f06e2e9ea96ca8dd.
Please see
https://github.com/trailofbits/publications/blob/30c4fa3ebf39ff8e4d23ba9567344ec9691697b5/reviews/2025-04-serai-dex-security-review.pdf
for provenance.


@@ -15,7 +15,7 @@ pub fn serai_db_key(
///
/// Creates a unit struct and a default implementation for the `key`, `get`, and `set`. The macro
/// uses a syntax similar to defining a function. Parameters are concatenated to produce a key,
- /// they must be `scale` encodable. The return type is used to auto encode and decode the database
+ /// they must be `borsh` serializable. The return type is used to auto (de)serialize the database
/// value bytes using `borsh`.
///
/// # Arguments
@@ -54,11 +54,10 @@ macro_rules! create_db {
)?;
impl$(<$($generic_name: $generic_type),+>)? $field_name$(<$($generic_name),+>)? {
pub(crate) fn key($($arg: $arg_type),*) -> Vec<u8> {
- use scale::Encode;
$crate::serai_db_key(
stringify!($db_name).as_bytes(),
stringify!($field_name).as_bytes(),
- ($($arg),*).encode()
+ &borsh::to_vec(&($($arg),*)).unwrap(),
)
}
pub(crate) fn set(
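The hunk above builds a database key from the DB name, the field name, and the now borsh-serialized parameters. A simplified sketch of that concatenation follows; the real `serai_db_key` may differ, e.g. by length-prefixing its components.

```rust
// Simplified: concatenate domain-separating names with the serialized args.
// (Illustrative only; the actual serai_db_key may add length prefixes.)
fn db_key(db_name: &[u8], field_name: &[u8], args: &[u8]) -> Vec<u8> {
    let mut key = Vec::with_capacity(db_name.len() + field_name.len() + args.len());
    key.extend_from_slice(db_name);
    key.extend_from_slice(field_name);
    key.extend_from_slice(args);
    key
}
```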


@@ -42,7 +42,7 @@ messages = { package = "serai-processor-messages", path = "../processor/messages
message-queue = { package = "serai-message-queue", path = "../message-queue" }
tributary-sdk = { path = "./tributary-sdk" }
- serai-client = { path = "../substrate/client", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../substrate/client", default-features = false, features = ["serai"] }
log = { version = "0.4", default-features = false, features = ["std"] }
env_logger = { version = "0.10", default-features = false, features = ["humantime"] }


@@ -23,7 +23,7 @@ schnorrkel = { version = "0.11", default-features = false, features = ["std"] }
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["std", "derive"] }
borsh = { version = "1", default-features = false, features = ["std", "derive", "de_strict_order"] }
- serai-client = { path = "../../substrate/client", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../../substrate/client", default-features = false, features = ["serai"] }
log = { version = "0.4", default-features = false, features = ["std"] }


@@ -22,7 +22,7 @@ borsh = { version = "1", default-features = false, features = ["std", "derive",
serai-db = { path = "../../common/db", version = "0.1" }
- serai-client = { path = "../../substrate/client", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../../substrate/client", default-features = false, features = ["serai"] }
serai-cosign = { path = "../cosign" }
tributary-sdk = { path = "../tributary-sdk" }


@@ -29,7 +29,7 @@ schnorrkel = { version = "0.11", default-features = false, features = ["std"] }
hex = { version = "0.4", default-features = false, features = ["std"] }
borsh = { version = "1", default-features = false, features = ["std", "derive", "de_strict_order"] }
- serai-client = { path = "../../../substrate/client", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../../../substrate/client", default-features = false, features = ["serai"] }
serai-cosign = { path = "../../cosign" }
tributary-sdk = { path = "../../tributary-sdk" }


@@ -25,7 +25,7 @@ borsh = { version = "1", default-features = false, features = ["std", "derive",
dkg = { path = "../../crypto/dkg", default-features = false, features = ["std"] }
- serai-client = { path = "../../substrate/client", version = "0.1", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../../substrate/client", version = "0.1", default-features = false, features = ["serai"] }
log = { version = "0.4", default-features = false, features = ["std"] }


@@ -29,7 +29,7 @@ ciphersuite = { path = "../../crypto/ciphersuite", default-features = false, fea
dkg = { path = "../../crypto/dkg", default-features = false, features = ["std"] }
schnorr = { package = "schnorr-signatures", path = "../../crypto/schnorr", default-features = false, features = ["std"] }
- serai-client = { path = "../../substrate/client", default-features = false, features = ["serai", "borsh"] }
+ serai-client = { path = "../../substrate/client", default-features = false, features = ["serai"] }
serai-db = { path = "../../common/db" }
serai-task = { path = "../../common/task", version = "0.1" }


@@ -17,12 +17,12 @@ rustdoc-args = ["--cfg", "docsrs"]
workspace = true
[dependencies]
- rand_core = "0.9"
+ rand_core = "0.6"
subtle = "^2.4"
- ff = { version = "0.14.0-pre.0", features = ["bits"] }
- group = "0.14.0-pre.0"
+ ff = { version = "0.13", features = ["bits"] }
+ group = "0.13"
[dev-dependencies]
k256 = { version = "^0.13.1", default-features = false, features = ["std", "arithmetic", "bits"] }


@@ -7,8 +7,6 @@ db-urls = ["https://github.com/rustsec/advisory-db"]
yanked = "deny"
ignore = [
"RUSTSEC-2020-0168", # mach is unmaintained
- "RUSTSEC-2021-0139", # https://github.com/serai-dex/serai/228
- "RUSTSEC-2022-0061", # https://github.com/serai-dex/serai/227
"RUSTSEC-2024-0370", # proc-macro-error is unmaintained
"RUSTSEC-2024-0384", # instant is unmaintained
@@ -123,7 +121,7 @@ wildcards = "warn"
highlight = "all"
deny = [
{ name = "serde_derive", version = ">=1.0.172, <1.0.185" },
- { name = "hashbrown", version = ">=0.15" },
+ { name = "hashbrown", version = "=0.15.0" },
]
[sources]
@@ -133,6 +131,6 @@ allow-registry = ["https://github.com/rust-lang/crates.io-index"]
allow-git = [
"https://github.com/rust-lang-nursery/lazy-static.rs",
"https://github.com/serai-dex/substrate-bip39",
- "https://github.com/serai-dex/substrate",
+ "https://github.com/serai-dex/polkadot-sdk",
"https://github.com/kayabaNerve/pasta_curves",
]


@@ -46,7 +46,7 @@ serai-db = { path = "../common/db", optional = true }
serai-env = { path = "../common/env" }
- serai-primitives = { path = "../substrate/primitives", features = ["borsh"] }
+ serai-primitives = { path = "../substrate/primitives", default-features = false, features = ["std"] }
[features]
parity-db = ["serai-db/parity-db"]


@@ -21,8 +21,8 @@ tower = "0.5"
serde_json = { version = "1", default-features = false }
simple-request = { path = "../../../common/request", version = "0.1", default-features = false }
- alloy-json-rpc = { version = "0.14", default-features = false }
- alloy-transport = { version = "0.14", default-features = false }
+ alloy-json-rpc = { version = "0.9", default-features = false }
+ alloy-transport = { version = "0.9", default-features = false }
[features]
default = ["tls"]


@@ -29,14 +29,14 @@ rand_core = { version = "0.6", default-features = false, features = ["std"] }
k256 = { version = "^0.13.1", default-features = false, features = ["ecdsa"] }
alloy-core = { version = "1", default-features = false }
alloy-sol-types = { version = "1", default-features = false }
alloy-core = { version = "0.8", default-features = false }
alloy-sol-types = { version = "0.8", default-features = false }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-rpc-client = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-rpc-client = { version = "0.9", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
alloy-node-bindings = { version = "0.14", default-features = false }
alloy-node-bindings = { version = "0.9", default-features = false }
tokio = { version = "1", default-features = false, features = ["macros"] }


@@ -1,5 +1,5 @@
use subtle::Choice;
use group::{ff::PrimeField, Group};
use group::ff::PrimeField;
use k256::{
elliptic_curve::{
ops::Reduce,
@@ -22,10 +22,6 @@ impl PublicKey {
/// bounds such as parity).
#[must_use]
pub fn new(A: ProjectivePoint) -> Option<PublicKey> {
if bool::from(A.is_identity()) {
None?;
}
let affine = A.to_affine();
// Only allow even keys to save a word within Ethereum
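The even-key rule in the hunk above can be sketched with plain byte arithmetic (a hypothetical helper, not the crate's actual API): an Ethereum Schnorr public key is stored as only its x-coordinate, so keys with an odd y-coordinate are rejected, and the identity (which has no affine encoding) is excluded before this point.

```rust
// Hypothetical sketch of the even-key check. Given an uncompressed SEC1
// encoding (0x04 || x || y), the y-coordinate's parity is the low bit of its
// final byte; only even-y keys are accepted, and the 32-byte x-coordinate is
// the on-chain representation.
fn accept_even_key(uncompressed: &[u8; 65]) -> Option<[u8; 32]> {
    // Reject keys whose y-coordinate is odd
    if uncompressed[64] & 1 == 1 {
        return None;
    }
    // The on-chain representation is just the 32-byte x-coordinate
    let mut x = [0u8; 32];
    x.copy_from_slice(&uncompressed[1 .. 33]);
    Some(x)
}
```

This saves a word within Ethereum, as noted above, since the contract only needs to store (and hash over) 32 bytes instead of a full point.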


@@ -32,7 +32,7 @@ mod abi {
pub(crate) use TestSchnorr::*;
}
async fn setup_test() -> (AnvilInstance, Arc<RootProvider>, Address) {
async fn setup_test() -> (AnvilInstance, Arc<RootProvider<SimpleRequest>>, Address) {
let anvil = Anvil::new().spawn();
let provider = Arc::new(RootProvider::new(
@@ -61,7 +61,7 @@ async fn setup_test() -> (AnvilInstance, Arc<RootProvider>, Address) {
}
async fn call_verify(
provider: &RootProvider,
provider: &RootProvider<SimpleRequest>,
address: Address,
public_key: &PublicKey,
message: &[u8],
@@ -80,8 +80,10 @@ async fn call_verify(
.abi_encode()
.into(),
));
let bytes = provider.call(call).await.unwrap();
abi::verifyCall::abi_decode_returns(&bytes).unwrap()
let bytes = provider.call(&call).await.unwrap();
let res = abi::verifyCall::abi_decode_returns(&bytes, true).unwrap();
res._0
}
#[tokio::test]


@@ -18,7 +18,7 @@ use crate::{Signature, tests::test_key};
fn ecrecover(message: Scalar, odd_y: bool, r: Scalar, s: Scalar) -> Option<[u8; 20]> {
let sig = ecdsa::Signature::from_scalars(r, s).ok()?;
let message: [u8; 32] = message.to_repr().into();
alloy_core::primitives::Signature::from_signature_and_parity(sig, odd_y)
alloy_core::primitives::PrimitiveSignature::from_signature_and_parity(sig, odd_y)
.recover_address_from_prehash(&alloy_core::primitives::B256::from(message))
.ok()
.map(Into::into)


@@ -27,11 +27,6 @@ pub(crate) fn test_key() -> (Scalar, PublicKey) {
}
}
#[test]
fn test_identity_key() {
assert!(PublicKey::new(ProjectivePoint::IDENTITY).is_none());
}
#[test]
fn test_odd_key() {
// We generate a valid key to ensure there's not some distinct reason this key is invalid


@@ -11,7 +11,7 @@ RUN rm -rf /etc/apt/sources.list.d/debian.sources && \
RUN apt update && apt upgrade && apt install clang -y
# Add the wasm toolchain
RUN rustup target add wasm32-unknown-unknown
RUN rustup target add wasmv1-none
FROM deterministic


@@ -162,7 +162,7 @@ RUN apt install -y pkg-config clang
RUN apt install -y make protobuf-compiler
# Add the wasm toolchain
RUN rustup target add wasm32-unknown-unknown
RUN rustup target add wasmv1-none
{prelude}


@@ -31,14 +31,14 @@ frost = { package = "modular-frost", path = "../../crypto/frost", default-featur
k256 = { version = "^0.13.1", default-features = false, features = ["std"] }
alloy-core = { version = "1", default-features = false }
alloy-core = { version = "0.8", default-features = false }
alloy-rlp = { version = "0.3", default-features = false }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-transport = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-transport = { version = "0.9", default-features = false }
alloy-simple-request-transport = { path = "../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-rpc-client = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-client = { version = "0.9", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
serai-client = { path = "../../substrate/client", default-features = false, features = ["ethereum"] }


@@ -17,16 +17,17 @@ rustdoc-args = ["--cfg", "docsrs"]
workspace = true
[dependencies]
alloy-core = { version = "1", default-features = false }
alloy-core = { version = "0.8", default-features = false }
alloy-sol-types = { version = "1", default-features = false }
alloy-sol-macro = { version = "1", default-features = false }
alloy-sol-types = { version = "0.8", default-features = false }
alloy-sol-macro = { version = "0.8", default-features = false }
alloy-consensus = { version = "0.14", default-features = false }
alloy-consensus = { version = "0.9", default-features = false }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-transport = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-transport = { version = "0.9", default-features = false }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
ethereum-primitives = { package = "serai-processor-ethereum-primitives", path = "../primitives", default-features = false }
@@ -34,9 +35,8 @@ ethereum-primitives = { package = "serai-processor-ethereum-primitives", path =
build-solidity-contracts = { path = "../../../networks/ethereum/build-contracts", default-features = false }
[dev-dependencies]
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-rpc-client = { version = "0.14", default-features = false }
alloy-node-bindings = { version = "0.14", default-features = false }
alloy-rpc-client = { version = "0.9", default-features = false }
alloy-node-bindings = { version = "0.9", default-features = false }
tokio = { version = "1.0", default-features = false, features = ["rt-multi-thread", "macros"] }


@@ -11,6 +11,7 @@ use alloy_sol_types::SolCall;
use alloy_rpc_types_eth::{TransactionInput, TransactionRequest};
use alloy_transport::{TransportErrorKind, RpcError};
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::{Provider, RootProvider};
#[cfg(test)]
@@ -43,7 +44,7 @@ const INITCODE: &[u8] = {
/// of the EVM. It then supports retrieving the deployed contracts addresses (which aren't
/// deterministic) using a single call.
#[derive(Clone, Debug)]
pub struct Deployer(Arc<RootProvider>);
pub struct Deployer(Arc<RootProvider<SimpleRequest>>);
impl Deployer {
/// Obtain the transaction to deploy this contract, already signed.
///
@@ -118,7 +119,7 @@ impl Deployer {
///
/// This will return `None` if the Deployer has yet to be deployed on-chain.
pub async fn new(
provider: Arc<RootProvider>,
provider: Arc<RootProvider<SimpleRequest>>,
) -> Result<Option<Self>, RpcError<TransportErrorKind>> {
let address = Self::address();
let code = provider.get_code_at(address).await?;
@@ -137,14 +138,16 @@ impl Deployer {
let call = TransactionRequest::default().to(Self::address()).input(TransactionInput::new(
abi::Deployer::deploymentsCall::new((init_code_hash.into(),)).abi_encode().into(),
));
let bytes = self.0.call(call).await?;
let deployment = abi::Deployer::deploymentsCall::abi_decode_returns(&bytes).map_err(|e| {
TransportErrorKind::Custom(
format!("node returned a non-address for function returning address: {e:?}").into(),
)
})?;
let bytes = self.0.call(&call).await?;
let deployment = abi::Deployer::deploymentsCall::abi_decode_returns(&bytes, true)
.map_err(|e| {
TransportErrorKind::Custom(
format!("node returned a non-address for function returning address: {e:?}").into(),
)
})?
._0;
if deployment == Address::ZERO {
if **deployment == [0; 20] {
return Ok(None);
}
Ok(Some(deployment))


@@ -76,9 +76,9 @@ async fn test_deployer() {
let call = TransactionRequest::default()
.to(Deployer::address())
.input(TransactionInput::new(deploy_tx.tx().input.clone()));
let call_err = provider.call(call).await.unwrap_err();
let call_err = provider.call(&call).await.unwrap_err();
assert!(matches!(
call_err.as_error_resp().unwrap().as_decoded_interface_error::<DeployerErrors>().unwrap(),
call_err.as_error_resp().unwrap().as_decoded_error::<DeployerErrors>(true).unwrap(),
DeployerErrors::PriorDeployed(PriorDeployed {}),
));
}
@@ -97,9 +97,9 @@ async fn test_deployer() {
let call = TransactionRequest::default()
.to(Deployer::address())
.input(TransactionInput::new(deploy_tx.tx().input.clone()));
let call_err = provider.call(call).await.unwrap_err();
let call_err = provider.call(&call).await.unwrap_err();
assert!(matches!(
call_err.as_error_resp().unwrap().as_decoded_interface_error::<DeployerErrors>().unwrap(),
call_err.as_error_resp().unwrap().as_decoded_error::<DeployerErrors>(true).unwrap(),
DeployerErrors::DeploymentFailed(DeploymentFailed {}),
));
}


@@ -17,14 +17,15 @@ rustdoc-args = ["--cfg", "docsrs"]
workspace = true
[dependencies]
alloy-core = { version = "1", default-features = false }
alloy-core = { version = "0.8", default-features = false }
alloy-sol-types = { version = "1", default-features = false }
alloy-sol-macro = { version = "1", default-features = false }
alloy-sol-types = { version = "0.8", default-features = false }
alloy-sol-macro = { version = "0.8", default-features = false }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-transport = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-transport = { version = "0.9", default-features = false }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
ethereum-primitives = { package = "serai-processor-ethereum-primitives", path = "../primitives", default-features = false }


@@ -11,6 +11,7 @@ use alloy_sol_types::{SolInterface, SolEvent};
use alloy_rpc_types_eth::{Log, Filter, TransactionTrait};
use alloy_transport::{TransportErrorKind, RpcError};
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::{Provider, RootProvider};
use ethereum_primitives::LogIndex;
@@ -93,7 +94,7 @@ impl Erc20 {
// Yielding THE top-level transfer would require tracing the transaction execution and isn't
// worth the effort.
async fn top_level_transfer(
provider: &RootProvider,
provider: &RootProvider<SimpleRequest>,
erc20: Address,
transaction_hash: [u8; 32],
transfer_logs: &[Log],
@@ -111,13 +112,15 @@ impl Erc20 {
return Ok(None);
}
let Ok(call) = IERC20Calls::abi_decode(transaction.inner.input()) else {
// Don't validate the encoding as this can't be re-encoded to an identical bytestring due
// to the additional data appended after the call itself
let Ok(call) = IERC20Calls::abi_decode(transaction.inner.input(), false) else {
return Ok(None);
};
// Extract the top-level call's from/to/value
let (from, to, value) = match call {
IERC20Calls::transfer(transferCall { to, value }) => (transaction.inner.signer(), to, value),
IERC20Calls::transfer(transferCall { to, value }) => (transaction.from, to, value),
IERC20Calls::transferFrom(transferFromCall { from, to, value }) => (from, to, value),
// Treat any other function selectors as unrecognized
_ => return Ok(None),
@@ -146,7 +149,7 @@ impl Erc20 {
}
// Read the data appended after
let data = if let Ok(call) = SeraiIERC20Calls::abi_decode(transaction.inner.input()) {
let data = if let Ok(call) = SeraiIERC20Calls::abi_decode(transaction.inner.input(), true) {
match call {
SeraiIERC20Calls::transferWithInInstruction01BB244A8A(
transferWithInInstructionCall { inInstruction, .. },
@@ -177,7 +180,7 @@ impl Erc20 {
///
/// The `transfers` in the result are unordered. The `logs` are sorted by index.
pub async fn top_level_transfers_unordered(
provider: &RootProvider,
provider: &RootProvider<SimpleRequest>,
blocks: RangeInclusive<u64>,
erc20: Address,
to: Address,


@@ -12,30 +12,4 @@ fn selector_collisions() {
);
}
#[test]
fn abi_decode_panic() {
use alloy_sol_types::SolInterface;
/*
The following code panics with alloy-core 0.8, when the validate flag (commented out) is set to
`false`. This flag was removed with alloy-core 1.0, leaving the default behavior of
`abi_decode` to be `validate = false`. This test was added to ensure when we removed our
practice of `validate = true`, we didn't open ourselves up this as a DoS risk.
*/
assert!(crate::SeraiIERC20Calls::abi_decode(
&alloy_core::primitives::hex::decode(concat!(
"a9059cbb",
"0000000000000000000000000000000000000000000000000000000000000000",
"0000000000000000000000000000000000000000000000000000000000000000",
"000000000000000000000000000000000000000000000000000000000000006f",
"ffffffffff000000000000000000000000000000000000000000000000000023",
"000000ffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
"ffffff0000000000000000000000000000000000000000000000000000000000",
))
.unwrap(),
// false
)
.is_err());
}
// This is primarily tested via serai-processor-ethereum-router


@@ -22,5 +22,5 @@ borsh = { version = "1", default-features = false, features = ["std", "derive",
group = { version = "0.13", default-features = false }
k256 = { version = "^0.13.1", default-features = false, features = ["std", "arithmetic"] }
alloy-primitives = { version = "1", default-features = false }
alloy-consensus = { version = "0.14", default-features = false, features = ["k256"] }
alloy-primitives = { version = "0.8", default-features = false }
alloy-consensus = { version = "0.9", default-features = false, features = ["k256"] }


@@ -7,7 +7,7 @@ use ::borsh::{BorshSerialize, BorshDeserialize};
use group::ff::PrimeField;
use k256::Scalar;
use alloy_primitives::Signature;
use alloy_primitives::PrimitiveSignature;
use alloy_consensus::{SignableTransaction, Signed, TxLegacy};
mod borsh;
@@ -68,7 +68,8 @@ pub fn deterministically_sign(tx: TxLegacy) -> Signed<TxLegacy> {
let s = Scalar::ONE;
let r_bytes: [u8; 32] = r.to_repr().into();
let s_bytes: [u8; 32] = s.to_repr().into();
let signature = Signature::from_scalars_and_parity(r_bytes.into(), s_bytes.into(), false);
let signature =
PrimitiveSignature::from_scalars_and_parity(r_bytes.into(), s_bytes.into(), false);
let res = tx.into_signed(signature);
debug_assert!(res.recover_signer().is_ok());


@@ -22,18 +22,19 @@ borsh = { version = "1", default-features = false, features = ["std", "derive",
group = { version = "0.13", default-features = false }
k256 = { version = "0.13", default-features = false, features = ["std", "arithmetic"] }
alloy-core = { version = "1", default-features = false }
alloy-core = { version = "0.8", default-features = false }
alloy-sol-types = { version = "1", default-features = false }
alloy-sol-macro = { version = "1", default-features = false }
alloy-sol-types = { version = "0.8", default-features = false }
alloy-sol-macro = { version = "0.8", default-features = false }
alloy-consensus = { version = "0.14", default-features = false }
alloy-consensus = { version = "0.9", default-features = false }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-transport = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-transport = { version = "0.9", default-features = false }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
revm = { version = "22", default-features = false, features = ["std"] }
revm = { version = "19", default-features = false, features = ["std"] }
ethereum-schnorr = { package = "ethereum-schnorr-contract", path = "../../../networks/ethereum/schnorr", default-features = false }
@@ -51,19 +52,18 @@ build-solidity-contracts = { path = "../../../networks/ethereum/build-contracts"
syn = { version = "2", default-features = false, features = ["proc-macro"] }
syn-solidity = { version = "1", default-features = false }
alloy-sol-macro-input = { version = "1", default-features = false }
alloy-sol-macro-expander = { version = "1", default-features = false }
syn-solidity = { version = "0.8", default-features = false }
alloy-sol-macro-input = { version = "0.8", default-features = false }
alloy-sol-macro-expander = { version = "0.8", default-features = false }
[dev-dependencies]
rand_core = { version = "0.6", default-features = false, features = ["std"] }
k256 = { version = "0.13", default-features = false, features = ["std"] }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-provider = { version = "0.14", default-features = false, features = ["debug-api", "trace-api"] }
alloy-rpc-client = { version = "0.14", default-features = false }
alloy-node-bindings = { version = "0.14", default-features = false }
alloy-provider = { version = "0.9", default-features = false, features = ["debug-api", "trace-api"] }
alloy-rpc-client = { version = "0.9", default-features = false }
alloy-node-bindings = { version = "0.9", default-features = false }
tokio = { version = "1.0", default-features = false, features = ["rt-multi-thread", "macros"] }


@@ -148,9 +148,8 @@ contract Router is IRouterWithoutCollisions {
/**
* @dev Verify a signature of the calldata, placed immediately after the function selector. The
* calldata should be signed with the chain ID taking the place of the signature's challenge, and
* the signature's response replaced by the contract's address shifted into the high bits with
* the contract's nonce as the low bits.
* calldata should be signed with the nonce taking the place of the signature's commitment to
* its nonce, and the signature solution zeroed.
*/
/// @param key The key to verify the signature with
function verifySignature(bytes32 key)
@@ -186,10 +185,6 @@ contract Router is IRouterWithoutCollisions {
// Read _nextNonce into memory as the nonce we'll use
nonceUsed = _nextNonce;
// We overwrite the signature response with the Router contract's address concatenated with the
// nonce. This is safe until the nonce exceeds 2**96, which is infeasible to do on-chain
uint256 signatureResponseOverwrite = (uint256(uint160(address(this))) << 96) | nonceUsed;
// Declare memory to copy the signature out to
bytes32 signatureC;
bytes32 signatureS;
@@ -203,8 +198,8 @@ contract Router is IRouterWithoutCollisions {
// Overwrite the signature challenge with the chain ID
mstore(add(message, 36), chainID)
// Overwrite the signature response with the contract's address, nonce
mstore(add(message, 68), signatureResponseOverwrite)
// Overwrite the signature response with the nonce
mstore(add(message, 68), nonceUsed)
// Calculate the message hash
messageHash := keccak256(add(message, 32), messageLen)
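One of the two schemes in the hunk above, overwriting the signature response with the contract's address shifted into the high bits and the nonce in the low bits, can be sketched as follows (a hypothetical helper mirroring the Solidity expression `(uint256(uint160(address(this))) << 96) | nonce`):

```rust
// Hypothetical sketch: the 160-bit contract address occupies the high 20
// bytes of a 256-bit big-endian word, and the nonce occupies (part of) the
// low 96 bits. This packing is unambiguous until the nonce reaches 2**96,
// which is infeasible to achieve on-chain.
fn signature_response_overwrite(address: [u8; 20], nonce: u64) -> [u8; 32] {
    let mut word = [0u8; 32];
    // address << 96: the address fills bytes 0..20
    word[.. 20].copy_from_slice(&address);
    // the nonce fills the tail of the low 96 bits (bytes 20..32)
    word[24 ..].copy_from_slice(&nonce.to_be_bytes());
    word
}
```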


@@ -1,130 +1,26 @@
use core::convert::Infallible;
use k256::{Scalar, ProjectivePoint};
use alloy_core::primitives::{Address, U256, Bytes};
use alloy_core::primitives::{Address, U160, U256};
use alloy_sol_types::SolCall;
use revm::{
primitives::hardfork::SpecId,
bytecode::Bytecode,
state::AccountInfo,
database::{empty_db::EmptyDB, in_memory_db::InMemoryDB},
interpreter::{
gas::calculate_initial_tx_gas,
interpreter_action::{CallInputs, CallOutcome},
interpreter::EthInterpreter,
Interpreter,
},
handler::{
instructions::EthInstructions, PrecompileProvider, EthPrecompiles, EthFrame, MainnetHandler,
},
context::{
result::{EVMError, InvalidTransaction, ExecutionResult},
evm::{EvmData, Evm},
context::Context,
*,
},
inspector::{Inspector, InspectorHandler},
primitives::*,
interpreter::{gas::*, opcode::InstructionTables, *},
db::{emptydb::EmptyDB, in_memory_db::InMemoryDB},
Handler, Context, EvmBuilder, Evm,
};
use ethereum_schnorr::{PublicKey, Signature};
use crate::*;
// The specification this uses
const SPEC_ID: SpecId = SpecId::CANCUN;
// The chain ID used for gas estimation
const CHAIN_ID: U256 = U256::from_be_slice(&[1]);
type RevmContext = Context<BlockEnv, TxEnv, CfgEnv, InMemoryDB, Journal<InMemoryDB>, ()>;
fn precompiles() -> EthPrecompiles {
let mut precompiles = EthPrecompiles::default();
PrecompileProvider::<RevmContext>::set_spec(&mut precompiles, SPEC_ID);
precompiles
}
/*
Instead of attempting to solve the halting problem, we assume all CALLs take the worst-case
amount of gas (as we do have bounds on the gas they're allowed to take). This assumption is
implemented via an revm Inspector.
The Inspector is allowed to override the CALL directly. We don't do this due to the amount of
side effects a CALL has. Instead, we override the result.
In the case the ERC20 is called, we additionally have it return `true` (as expected for compliant
ERC20s, and as will trigger the worst-case gas consumption by the Router itself). This is done by
hooking `call_end`.
*/
pub(crate) struct WorstCaseCallInspector {
erc20: Option<Address>,
call_depth: usize,
unused_gas: u64,
override_immediate_call_return_value: bool,
}
impl Inspector<RevmContext> for WorstCaseCallInspector {
fn call(&mut self, _context: &mut RevmContext, _inputs: &mut CallInputs) -> Option<CallOutcome> {
self.call_depth += 1;
// Don't override the CALL immediately for prior described reasons
None
}
fn call_end(
&mut self,
_context: &mut RevmContext,
inputs: &CallInputs,
outcome: &mut CallOutcome,
) {
self.call_depth -= 1;
/*
Mark the amount of gas left unused, for us to later assume will be used in practice.
This only runs if the call-depth is 1 (so only the Router-made calls have their gas so
tracked), and if it's not to a precompile. This latter condition isn't solely because we can
perfectly model precompiles (which wouldn't be worth the complexity) yet because the Router
does call precompiles (ecrecover) and accordingly has to model the gas of that correctly.
*/
if (self.call_depth == 1) && (!precompiles().contains(&inputs.target_address)) {
let unused_gas = inputs.gas_limit - outcome.result.gas.spent();
self.unused_gas += unused_gas;
// Now that the CALL is over, flag we should normalize the values on the stack
self.override_immediate_call_return_value = true;
}
// If ERC20, provide the expected ERC20 return data
if Some(inputs.target_address) == self.erc20 {
outcome.result.output = true.abi_encode().into();
}
}
fn step(&mut self, interpreter: &mut Interpreter, _context: &mut RevmContext) {
if self.override_immediate_call_return_value {
// We fix this result to having succeeded, which triggers the most-expensive pathing within
// the Router contract itself (some paths return early if a CALL fails)
let return_value = interpreter.stack.pop().unwrap();
assert!((return_value == U256::ZERO) || (return_value == U256::ONE));
assert!(
interpreter.stack.push(U256::ONE),
"stack capacity couldn't fit item after popping an item"
);
self.override_immediate_call_return_value = false;
}
}
}
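The worst-case assumption the inspector implements can be reduced to a small sketch (hypothetical names; the real implementation hooks revm's `call`/`call_end`): instead of modeling what each CALL actually does, any gas a depth-1, non-precompile CALL left unused is recorded and later added back, so every such CALL is costed at its full gas limit.

```rust
// Hypothetical sketch of the unused-gas accounting described above
struct WorstCaseGas {
    unused_gas: u64,
}

impl WorstCaseGas {
    // Invoked when a top-level (depth-1, non-precompile) CALL returns
    fn on_call_end(&mut self, gas_limit: u64, gas_spent: u64) {
        self.unused_gas += gas_limit - gas_spent;
    }

    // The measured execution gas, bumped to the worst case
    fn worst_case(&self, measured_gas: u64) -> u64 {
        measured_gas + self.unused_gas
    }
}
```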
/// The object used for estimating gas.
///
/// Due to `execute` heavily branching, we locally simulate calls with revm.
pub(crate) type GasEstimator = Evm<
RevmContext,
WorstCaseCallInspector,
EthInstructions<EthInterpreter, RevmContext>,
EthPrecompiles,
>;
pub(crate) type GasEstimator = Evm<'static, (), InMemoryDB>;
impl Router {
const SMART_CONTRACT_NONCE_STORAGE_SLOT: U256 = U256::from_be_slice(&[0]);
@@ -151,11 +47,11 @@ impl Router {
the correct set of prices for the network they're operating on.
*/
/// The gas used by `confirmSeraiKey`.
pub const CONFIRM_NEXT_SERAI_KEY_GAS: u64 = 57_753;
pub const CONFIRM_NEXT_SERAI_KEY_GAS: u64 = 57_736;
/// The gas used by `updateSeraiKey`.
pub const UPDATE_SERAI_KEY_GAS: u64 = 60_062;
pub const UPDATE_SERAI_KEY_GAS: u64 = 60_045;
/// The gas used by `escapeHatch`.
pub const ESCAPE_HATCH_GAS: u64 = 61_111;
pub const ESCAPE_HATCH_GAS: u64 = 61_094;
/// The key to use when performing gas estimations.
///
@@ -218,35 +114,120 @@ impl Router {
db
};
Evm {
data: EvmData {
ctx: RevmContext::new(db, SPEC_ID)
.modify_cfg_chained(|cfg| {
cfg.chain_id = CHAIN_ID.try_into().unwrap();
})
.modify_tx_chained(|tx: &mut TxEnv| {
tx.gas_limit = u64::MAX;
tx.kind = self.address.into();
}),
inspector: WorstCaseCallInspector {
erc20,
call_depth: 0,
unused_gas: 0,
override_immediate_call_return_value: false,
},
},
instruction: EthInstructions::default(),
precompiles: precompiles(),
}
// Create a custom handler so we can assume every CALL is the worst-case
let handler = {
let mut instructions = InstructionTables::<'_, _>::new_plain::<CancunSpec>();
instructions.update_boxed(revm::interpreter::opcode::CALL, {
move |call_op, interpreter, host: &mut Context<_, _>| {
let (address_called, value, return_addr, return_len) = {
let stack = &mut interpreter.stack;
let address = stack.peek(1).unwrap();
let value = stack.peek(2).unwrap();
let return_addr = stack.peek(5).unwrap();
let return_len = stack.peek(6).unwrap();
(
address,
value,
usize::try_from(return_addr).unwrap(),
usize::try_from(return_len).unwrap(),
)
};
let address_called =
Address::from(U160::from_be_slice(&address_called.to_be_bytes::<32>()[12 ..]));
// Have the original call op incur its costs as programmed
call_op(interpreter, host);
/*
Unfortunately, the call opcode executed only sets itself up, it doesn't handle the
entire inner call for us. We manually do so here by shimming the intended result. The
other option, on this path chosen, would be to shim the call-frame execution ourselves
and only then manipulate the result.
Ideally, we wouldn't override CALL, yet STOP/RETURN (the tail of the CALL) to avoid all
of this. Those overrides weren't being successfully hit in initial experiments, and
while this solution does appear overly complicated, it's sufficiently tested to justify
itself.
revm does cost the entire gas limit during the call setup. After the call completes,
it refunds whatever was unused. Since we manually complete the call here ourselves,
but don't implement that refund logic as we want the worst-case scenario, we do
successfully implement complete costing of the gas limit.
*/
// Perform the call value transfer, which also marks the recipient as warm
assert!(host
.evm
.inner
.journaled_state
.transfer(
&interpreter.contract.target_address,
&address_called,
value,
&mut host.evm.inner.db
)
.unwrap()
.is_none());
// Clear the call-to-be
debug_assert!(matches!(interpreter.next_action, InterpreterAction::Call { .. }));
interpreter.next_action = InterpreterAction::None;
interpreter.instruction_result = InstructionResult::Continue;
// Clear the existing return data
interpreter.return_data_buffer.clear();
/*
If calling an ERC20, trigger the return data's worst-case by returning `true`
(as expected by compliant ERC20s). Else return none, as we expect none or won't bother
copying/decoding the return data.
This doesn't affect calls to ecrecover as those use STATICCALL and this overrides CALL
alone.
*/
if Some(address_called) == erc20 {
interpreter.return_data_buffer = true.abi_encode().into();
}
// Also copy the return data into memory
let return_len = return_len.min(interpreter.return_data_buffer.len());
let needed_memory_size = return_addr + return_len;
if interpreter.shared_memory.len() < needed_memory_size {
assert!(interpreter.resize_memory(needed_memory_size));
}
interpreter
.shared_memory
.slice_mut(return_addr, return_len)
.copy_from_slice(&interpreter.return_data_buffer[.. return_len]);
// Finally, push the result of the call onto the stack
interpreter.stack.push(U256::from(1)).unwrap();
}
});
let mut handler = Handler::mainnet::<CancunSpec>();
handler.set_instruction_table(instructions);
handler
};
EvmBuilder::default()
.with_db(db)
.with_handler(handler)
.modify_cfg_env(|cfg| {
cfg.chain_id = CHAIN_ID.try_into().unwrap();
})
.modify_tx_env(|tx| {
tx.gas_limit = u64::MAX;
tx.transact_to = self.address.into();
})
.build()
}
/// The worst-case gas cost for a legacy transaction which executes this batch.
pub fn execute_gas_and_fee(
&self,
coin: Coin,
fee_per_gas: U256,
outs: &OutInstructions,
) -> (u64, U256) {
///
/// This assumes the fee will be non-zero.
pub fn execute_gas(&self, coin: Coin, fee_per_gas: U256, outs: &OutInstructions) -> u64 {
// Unfortunately, we can't cache this in self, despite the following code being written such
// that a common EVM instance could be used, as revm's types aren't Send/Sync and we expect the
// Router to be send/sync
@@ -255,17 +236,17 @@ impl Router {
Coin::Erc20(erc20) => Some(erc20),
});
let shimmed_fee = match coin {
let fee = match coin {
Coin::Ether => {
// Use a fee of 1 so the fee payment is recognized as positive-value, if the fee is
// non-zero
let fee = if fee_per_gas == U256::ZERO { U256::ZERO } else { U256::ONE };
// Use a fee of 1 so the fee payment is recognized as positive-value
let fee = U256::from(1);
// Set a balance of the amount sent out to ensure we don't error on that premise
gas_estimator.data.ctx.modify_db(|db| {
{
let db = gas_estimator.db_mut();
let account = db.load_account(self.address).unwrap();
account.info.balance = fee + outs.0.iter().map(|out| out.amount).sum::<U256>();
});
}
fee
}
@@ -278,7 +259,7 @@ impl Router {
// Use a nonce of 1
ProjectivePoint::GENERATOR,
&public_key,
&Self::execute_message(CHAIN_ID, self.address, 1, coin, shimmed_fee, outs.clone()),
&Self::execute_message(CHAIN_ID, 1, coin, fee, outs.clone()),
);
let s = Scalar::ONE + (c * private_key);
let sig = Signature::new(c, s).unwrap();
@@ -290,7 +271,8 @@ impl Router {
consistent use of nonce #1 shows storage read/writes aren't being persisted. They're solely
returned upon execution in a `state` field we ignore.
*/
gas_estimator.data.ctx.modify_tx(|tx| {
{
let tx = gas_estimator.tx_mut();
tx.caller = Address::from({
/*
We assume the transaction sender is not the destination of any `OutInstruction`, making
@@ -309,82 +291,55 @@ impl Router {
tx.data = abi::executeCall::new((
abi::Signature::from(&sig),
Address::from(coin),
shimmed_fee,
fee,
outs.0.clone(),
))
.abi_encode()
.into();
});
}
// Execute the transaction
let mut gas = match MainnetHandler::<
_,
EVMError<Infallible, InvalidTransaction>,
EthFrame<_, _, _>,
>::default()
.inspect_run(&mut gas_estimator)
.unwrap()
.result
{
let mut gas = match gas_estimator.transact().unwrap().result {
ExecutionResult::Success { gas_used, gas_refunded, .. } => {
assert_eq!(gas_refunded, 0);
gas_used
}
res => panic!("estimated execute transaction failed: {res:?}"),
};
gas += gas_estimator.into_inspector().unused_gas;
/*
The transaction pays an initial gas fee which is dependent on the length of the calldata and
the amount of non-zero bytes in the calldata. This is variable to the fee, which was prior
shimmed to be `1`.
Here, we calculate the actual fee, and update the initial gas fee accordingly. We then update
the fee again, until the initial gas fee stops increasing.
*/
// The transaction uses gas based on the amount of non-zero bytes in the calldata, which is
// variable to the fee, which is variable to the gas used. This iterates until parity
let initial_gas = |fee, sig| {
let gas = calculate_initial_tx_gas(
SPEC_ID,
SpecId::CANCUN,
&abi::executeCall::new((sig, Address::from(coin), fee, outs.0.clone())).abi_encode(),
false,
0,
0,
&[],
0,
);
assert_eq!(gas.floor_gas, 0);
gas.initial_gas
};
let mut current_initial_gas = initial_gas(shimmed_fee, abi::Signature::from(&sig));
// Remove the current initial gas from the transaction's gas
gas -= current_initial_gas;
let mut current_initial_gas = initial_gas(fee, abi::Signature::from(&sig));
loop {
// Calculate the would-be fee
let fee = fee_per_gas * U256::from(gas + current_initial_gas);
// Calculate the would-be gas for this fee
let fee = fee_per_gas * U256::from(gas);
let new_initial_gas =
initial_gas(fee, abi::Signature { c: [0xff; 32].into(), s: [0xff; 32].into() });
// If the values are equal, or if it went down, return
/*
The gas will decrease if the new fee has more zero bytes in its encoding. Further
iterations are unhelpful as they'll simply loop infinitely for some inputs. Accordingly, we
return the current fee (which is for a very slightly higher gas rate) with the decreased
gas to ensure this algorithm terminates.
*/
if current_initial_gas >= new_initial_gas {
return (gas + new_initial_gas, fee);
return gas;
}
// Update what the current initial gas is
gas += new_initial_gas - current_initial_gas;
current_initial_gas = new_initial_gas;
}
}
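The fixed-point iteration above can be sketched in isolation. This is a minimal, hypothetical model, not the crate's actual code: `intrinsic_gas` approximates EIP-2028 calldata pricing (4 gas per zero byte, 16 per non-zero byte, atop the 21,000 base), and `encode` stands in for the ABI encoding of the `execute` call with the fee embedded in it.

```rust
/// Intrinsic calldata gas, approximating EIP-2028 pricing.
pub fn intrinsic_gas(calldata: &[u8]) -> u64 {
  21_000 + calldata.iter().map(|b| if *b == 0 { 4 } else { 16 }).sum::<u64>()
}

/// Iterate `gas -> fee -> calldata -> intrinsic gas` to a fixed point,
/// returning the converged (gas, fee) pair.
pub fn converge(
  execution_gas: u64,
  fee_per_gas: u64,
  encode: impl Fn(u64) -> Vec<u8>,
) -> (u64, u64) {
  // Start from the intrinsic gas for a shimmed fee of 1
  let mut intrinsic = intrinsic_gas(&encode(1));
  loop {
    // Derive the would-be fee from the current total gas
    let fee = fee_per_gas * (execution_gas + intrinsic);
    // Re-encode with that fee and recompute the intrinsic gas
    let new_intrinsic = intrinsic_gas(&encode(fee));
    // Stop on >=, not ==: a larger fee may have more zero bytes in its
    // encoding, lowering the intrinsic gas, and continuing could loop
    // forever on such inputs
    if intrinsic >= new_intrinsic {
      return (execution_gas + new_intrinsic, fee);
    }
    intrinsic = new_intrinsic;
  }
}
```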
/// The estimated gas for this `OutInstruction`.
/// The estimated fee for this `OutInstruction`.
///
/// This does not model the quadratic costs incurred when in a batch, nor other misc costs such
/// as the potential to cause one less zero byte in the fee's encoding. This is intended to
/// produce a per-`OutInstruction` value which can be ratioed against others to decide the fee to
/// deduct from each `OutInstruction`, before all `OutInstruction`s incur an amortized fee of
/// what remains for the batch itself.
/// produce a per-`OutInstruction` fee to deduct from each `OutInstruction`, before all
/// `OutInstruction`s incur an amortized fee of what remains for the batch itself.
pub fn execute_out_instruction_gas_estimate(
&mut self,
coin: Coin,
@@ -393,12 +348,11 @@ impl Router {
#[allow(clippy::map_entry)] // clippy doesn't realize the multiple mutable borrows
if !self.empty_execute_gas.contains_key(&coin) {
// This can't be de-duplicated across ERC20s due to the zero bytes in the address
let (gas, _fee) = self.execute_gas_and_fee(coin, U256::from(0), &OutInstructions(vec![]));
let gas = self.execute_gas(coin, U256::from(0), &OutInstructions(vec![]));
self.empty_execute_gas.insert(coin, gas);
}
let (gas, _fee) =
self.execute_gas_and_fee(coin, U256::from(0), &OutInstructions(vec![instruction]));
let gas = self.execute_gas(coin, U256::from(0), &OutInstructions(vec![instruction]));
gas - self.empty_execute_gas[&coin]
}
}
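The per-`OutInstruction` estimate above is a marginal-cost calculation: the gas for a single-instruction batch minus the cached gas for an empty batch. A hedged sketch of that caching pattern, with `measure` standing in for `execute_gas` and a string key standing in for `Coin`:

```rust
use std::collections::HashMap;

/// Marginal gas estimator: caches the empty-batch baseline per coin, since
/// that baseline is independent of any individual instruction.
pub struct MarginalGas {
  empty: HashMap<&'static str, u64>,
}

impl MarginalGas {
  pub fn new() -> Self {
    Self { empty: HashMap::new() }
  }

  /// Per-instruction gas: (batch of one) - (empty batch).
  pub fn estimate(&mut self, coin: &'static str, measure: impl Fn(usize) -> u64) -> u64 {
    let empty = *self.empty.entry(coin).or_insert_with(|| measure(0));
    measure(1) - empty
  }
}
```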

View File

@@ -19,6 +19,7 @@ use alloy_consensus::TxLegacy;
use alloy_rpc_types_eth::{BlockId, Log, Filter, TransactionInput, TransactionRequest};
use alloy_transport::{TransportErrorKind, RpcError};
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::{Provider, RootProvider};
use scale::Encode;
@@ -47,7 +48,6 @@ mod _irouter_abi {
#[expect(warnings)]
#[expect(needless_pass_by_value)]
#[expect(clippy::all)]
#[expect(clippy::unused_self)]
#[expect(clippy::ignored_unit_patterns)]
#[expect(clippy::redundant_closure_for_method_calls)]
mod _router_abi {
@@ -236,7 +236,7 @@ pub struct Escape {
/// A view of the Router for Serai.
#[derive(Clone, Debug)]
pub struct Router {
provider: Arc<RootProvider>,
provider: Arc<RootProvider<SimpleRequest>>,
address: Address,
empty_execute_gas: HashMap<Coin, u64>,
}
@@ -272,7 +272,7 @@ impl Router {
/// This performs an on-chain lookup for the first deployed Router constructed with this public
/// key. This lookup is of a constant amount of calls and does not read any logs.
pub async fn new(
provider: Arc<RootProvider>,
provider: Arc<RootProvider<SimpleRequest>>,
initial_serai_key: &PublicKey,
) -> Result<Option<Self>, RpcError<TransportErrorKind>> {
let Some(deployer) = Deployer::new(provider.clone()).await? else {
@@ -292,22 +292,13 @@ impl Router {
self.address
}
/// Get the signature data signed in place of the actual signature.
fn signature_data(chain_id: U256, router_address: Address, nonce: u64) -> abi::Signature {
let mut s = [0; 32];
s[.. 20].copy_from_slice(router_address.as_slice());
s[24 ..].copy_from_slice(&nonce.to_be_bytes());
abi::Signature { c: chain_id.into(), s: s.into() }
}
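As a standalone sketch, the packing `signature_data` performs on the `s` word (router address in bytes 0..20, big-endian nonce in bytes 24..32, bytes 20..24 left zero) looks like the following; the plain byte-array types are simplifications of the alloy types used above:

```rust
/// Pack a 20-byte router address and a u64 nonce into a 32-byte `s` word.
pub fn pack_s(router_address: [u8; 20], nonce: u64) -> [u8; 32] {
  let mut s = [0u8; 32];
  // Address occupies the high 20 bytes
  s[.. 20].copy_from_slice(&router_address);
  // Big-endian nonce occupies the low 8 bytes; bytes 20..24 stay zero
  s[24 ..].copy_from_slice(&nonce.to_be_bytes());
  s
}
```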
/// Get the message to be signed in order to confirm the next key for Serai.
pub fn confirm_next_serai_key_message(
chain_id: U256,
router_address: Address,
nonce: u64,
) -> Vec<u8> {
abi::confirmNextSeraiKeyCall::new((Self::signature_data(chain_id, router_address, nonce),))
.abi_encode()
pub fn confirm_next_serai_key_message(chain_id: U256, nonce: u64) -> Vec<u8> {
abi::confirmNextSeraiKeyCall::new((abi::Signature {
c: chain_id.into(),
s: U256::try_from(nonce).unwrap().into(),
},))
.abi_encode()
}
/// Construct a transaction to confirm the next key representing Serai.
@@ -322,14 +313,9 @@ impl Router {
}
/// Get the message to be signed in order to update the key for Serai.
pub fn update_serai_key_message(
chain_id: U256,
router_address: Address,
nonce: u64,
key: &PublicKey,
) -> Vec<u8> {
pub fn update_serai_key_message(chain_id: U256, nonce: u64, key: &PublicKey) -> Vec<u8> {
abi::updateSeraiKeyCall::new((
Self::signature_data(chain_id, router_address, nonce),
abi::Signature { c: chain_id.into(), s: U256::try_from(nonce).unwrap().into() },
key.eth_repr().into(),
))
.abi_encode()
@@ -385,14 +371,13 @@ impl Router {
/// Get the message to be signed in order to execute a series of `OutInstruction`s.
pub fn execute_message(
chain_id: U256,
router_address: Address,
nonce: u64,
coin: Coin,
fee: U256,
outs: OutInstructions,
) -> Vec<u8> {
abi::executeCall::new((
Self::signature_data(chain_id, router_address, nonce),
abi::Signature { c: chain_id.into(), s: U256::try_from(nonce).unwrap().into() },
Address::from(coin),
fee,
outs.0,
@@ -414,14 +399,12 @@ impl Router {
}
/// Get the message to be signed in order to trigger the escape hatch.
pub fn escape_hatch_message(
chain_id: U256,
router_address: Address,
nonce: u64,
escape_to: Address,
) -> Vec<u8> {
abi::escapeHatchCall::new((Self::signature_data(chain_id, router_address, nonce), escape_to))
.abi_encode()
pub fn escape_hatch_message(chain_id: U256, nonce: u64, escape_to: Address) -> Vec<u8> {
abi::escapeHatchCall::new((
abi::Signature { c: chain_id.into(), s: U256::try_from(nonce).unwrap().into() },
escape_to,
))
.abi_encode()
}
/// Construct a transaction to trigger the escape hatch.
@@ -590,7 +573,7 @@ impl Router {
if log.topics().first() != Some(&Transfer::SIGNATURE_HASH) {
continue;
}
let Ok(transfer) = Transfer::decode_log(&log.inner.clone()) else { continue };
let Ok(transfer) = Transfer::decode_log(&log.inner.clone(), true) else { continue };
// Check if this aligns with the InInstruction
if (transfer.from == in_instruction.from) &&
(transfer.to == self.address) &&
@@ -760,11 +743,11 @@ impl Router {
) -> Result<Option<PublicKey>, RpcError<TransportErrorKind>> {
let call =
TransactionRequest::default().to(self.address).input(TransactionInput::new(call.into()));
let bytes = self.provider.call(call).block(block).await?;
let bytes = self.provider.call(&call).block(block).await?;
// This is fine as both key calls share a return type
let res = abi::nextSeraiKeyCall::abi_decode_returns(&bytes)
let res = abi::nextSeraiKeyCall::abi_decode_returns(&bytes, true)
.map_err(|e| TransportErrorKind::Custom(format!("failed to decode key: {e:?}").into()))?;
let eth_repr = <[u8; 32]>::from(res);
let eth_repr = <[u8; 32]>::from(res._0);
Ok(if eth_repr == [0; 32] {
None
} else {
@@ -795,10 +778,10 @@ impl Router {
let call = TransactionRequest::default()
.to(self.address)
.input(TransactionInput::new(abi::nextNonceCall::new(()).abi_encode().into()));
let bytes = self.provider.call(call).block(block).await?;
let res = abi::nextNonceCall::abi_decode_returns(&bytes)
let bytes = self.provider.call(&call).block(block).await?;
let res = abi::nextNonceCall::abi_decode_returns(&bytes, true)
.map_err(|e| TransportErrorKind::Custom(format!("failed to decode nonce: {e:?}").into()))?;
Ok(u64::try_from(res).map_err(|_| {
Ok(u64::try_from(res._0).map_err(|_| {
TransportErrorKind::Custom("nonce returned exceeded 2**64".to_string().into())
})?)
}
@@ -811,10 +794,10 @@ impl Router {
let call = TransactionRequest::default()
.to(self.address)
.input(TransactionInput::new(abi::escapedToCall::new(()).abi_encode().into()));
let bytes = self.provider.call(call).block(block).await?;
let res = abi::escapedToCall::abi_decode_returns(&bytes).map_err(|e| {
let bytes = self.provider.call(&call).block(block).await?;
let res = abi::escapedToCall::abi_decode_returns(&bytes, true).map_err(|e| {
TransportErrorKind::Custom(format!("failed to decode the address escaped to: {e:?}").into())
})?;
Ok(if res == Address::ZERO { None } else { Some(res) })
Ok(if res._0 == Address([0; 20].into()) { None } else { Some(res._0) })
}
}

View File

@@ -6,7 +6,7 @@ use alloy_consensus::TxLegacy;
use alloy_rpc_types_eth::{TransactionInput, TransactionRequest};
use alloy_provider::Provider;
use revm::{primitives::hardfork::SpecId, interpreter::gas::calculate_initial_tx_gas};
use revm::{primitives::SpecId, interpreter::gas::calculate_initial_tx_gas};
use crate::tests::Test;
@@ -65,13 +65,13 @@ async fn test_create_address() {
let call =
TransactionRequest::default().to(address).input(TransactionInput::new(input.clone().into()));
assert_eq!(
&test.provider.call(call.clone()).await.unwrap().as_ref()[12 ..],
&test.provider.call(&call).await.unwrap().as_ref()[12 ..],
address.create(nonce).as_slice(),
);
// Check the function is constant-gas
let gas_used = test.provider.estimate_gas(call).await.unwrap();
let initial_gas = calculate_initial_tx_gas(SpecId::CANCUN, &input, false, 0, 0, 0).initial_gas;
let gas_used = test.provider.estimate_gas(&call).await.unwrap();
let initial_gas = calculate_initial_tx_gas(SpecId::CANCUN, &input, false, &[], 0).initial_gas;
let this_call = gas_used - initial_gas;
if gas.is_none() {
gas = Some(this_call);

View File

@@ -86,13 +86,13 @@ impl Erc20 {
let call = TransactionRequest::default().to(self.0).input(TransactionInput::new(
abi::TestERC20::balanceOfCall::new((account,)).abi_encode().into(),
));
U256::abi_decode(&test.provider.call(call).await.unwrap()).unwrap()
U256::abi_decode(&test.provider.call(&call).await.unwrap(), true).unwrap()
}
pub(crate) async fn router_approval(&self, test: &Test, account: Address) -> U256 {
let call = TransactionRequest::default().to(self.0).input(TransactionInput::new(
abi::TestERC20::allowanceCall::new((test.router.address(), account)).abi_encode().into(),
));
U256::abi_decode(&test.provider.call(call).await.unwrap()).unwrap()
U256::abi_decode(&test.provider.call(&call).await.unwrap(), true).unwrap()
}
}

View File

@@ -8,12 +8,7 @@ use crate::tests::*;
impl Test {
pub(crate) fn escape_hatch_tx(&self, escape_to: Address) -> TxLegacy {
let msg = Router::escape_hatch_message(
self.chain_id,
self.router.address(),
self.state.next_nonce,
escape_to,
);
let msg = Router::escape_hatch_message(self.chain_id, self.state.next_nonce, escape_to);
let sig = sign(self.state.key.unwrap(), &msg);
let mut tx = self.router.escape_hatch(escape_to, &sig);
tx.gas_limit = Router::ESCAPE_HATCH_GAS + 5_000;

View File

@@ -63,7 +63,7 @@ struct CalldataAgnosticGas;
impl CalldataAgnosticGas {
#[must_use]
fn calculate(input: &[u8], mut constant_zero_bytes: usize, gas_used: u64) -> u64 {
use revm::{primitives::hardfork::SpecId, interpreter::gas::calculate_initial_tx_gas};
use revm::{primitives::SpecId, interpreter::gas::calculate_initial_tx_gas};
let mut without_variable_zero_bytes = Vec::with_capacity(input.len());
for byte in input {
@@ -76,9 +76,9 @@ impl CalldataAgnosticGas {
}
}
gas_used +
(calculate_initial_tx_gas(SpecId::CANCUN, &without_variable_zero_bytes, false, 0, 0, 0)
(calculate_initial_tx_gas(SpecId::CANCUN, &without_variable_zero_bytes, false, &[], 0)
.initial_gas -
calculate_initial_tx_gas(SpecId::CANCUN, input, false, 0, 0, 0).initial_gas)
calculate_initial_tx_gas(SpecId::CANCUN, input, false, &[], 0).initial_gas)
}
}
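The replacement trick above can be modeled in isolation: under EIP-2028, a non-zero calldata byte costs 16 gas and a zero byte 4, so charging each variable zero byte as if it were non-zero adds 12 gas per such byte. A hypothetical standalone version (not the actual `CalldataAgnosticGas` implementation):

```rust
/// Extra gas to charge so the estimate is agnostic to variable calldata bytes:
/// 12 gas (16 - 4) per zero byte outside the known-zero prefix.
pub fn worst_case_extra_gas(input: &[u8], constant_zero_bytes: usize) -> u64 {
  input
    .iter()
    .enumerate()
    // Only variable bytes (past the constant-zero prefix) which are zero
    .filter(|(i, b)| (*i >= constant_zero_bytes) && (**b == 0))
    .count() as u64 *
    (16 - 4)
}
```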
@@ -92,7 +92,7 @@ struct RouterState {
struct Test {
#[allow(unused)]
anvil: AnvilInstance,
provider: Arc<RootProvider>,
provider: Arc<RootProvider<SimpleRequest>>,
chain_id: U256,
router: Router,
state: RouterState,
@@ -173,16 +173,12 @@ impl Test {
let call = TransactionRequest::default()
.to(self.router.address())
.input(TransactionInput::new(tx.input));
let call_err = self.provider.call(call).await.unwrap_err();
call_err.as_error_resp().unwrap().as_decoded_interface_error::<IRouterErrors>().unwrap()
let call_err = self.provider.call(&call).await.unwrap_err();
call_err.as_error_resp().unwrap().as_decoded_error::<IRouterErrors>(true).unwrap()
}
fn confirm_next_serai_key_tx(&self) -> TxLegacy {
let msg = Router::confirm_next_serai_key_message(
self.chain_id,
self.router.address(),
self.state.next_nonce,
);
let msg = Router::confirm_next_serai_key_message(self.chain_id, self.state.next_nonce);
let sig = sign(self.state.next_key.unwrap(), &msg);
self.router.confirm_next_serai_key(&sig)
@@ -231,12 +227,7 @@ impl Test {
fn update_serai_key_tx(&self) -> ((Scalar, PublicKey), TxLegacy) {
let next_key = test_key();
let msg = Router::update_serai_key_message(
self.chain_id,
self.router.address(),
self.state.next_nonce,
&next_key.1,
);
let msg = Router::update_serai_key_message(self.chain_id, self.state.next_nonce, &next_key.1);
let sig = sign(self.state.key.unwrap(), &msg);
(next_key, self.router.update_serai_key(&next_key.1, &sig))
@@ -284,7 +275,6 @@ impl Test {
) -> ([u8; 32], TxLegacy) {
let msg = Router::execute_message(
self.chain_id,
self.router.address(),
self.state.next_nonce,
coin,
fee,
@@ -478,10 +468,11 @@ async fn test_update_serai_key() {
// But we shouldn't be able to update the key to None
{
let router_address_u256: U256 = test.router.address().into_word().into();
let s: U256 = (router_address_u256 << 96) | U256::from(test.state.next_nonce);
let msg = crate::abi::updateSeraiKeyCall::new((
crate::abi::Signature { c: test.chain_id.into(), s: s.into() },
crate::abi::Signature {
c: test.chain_id.into(),
s: U256::try_from(test.state.next_nonce).unwrap().into(),
},
[0; 32].into(),
))
.abi_encode();
@@ -549,8 +540,8 @@ async fn test_empty_execute() {
test.confirm_next_serai_key().await;
{
let (gas, fee) =
test.router.execute_gas_and_fee(Coin::Ether, U256::from(1), &[].as_slice().into());
let gas = test.router.execute_gas(Coin::Ether, U256::from(1), &[].as_slice().into());
let fee = U256::from(gas);
let () = test
.provider
@@ -583,15 +574,15 @@ async fn test_empty_execute() {
TransactionRequest::default().to(token).input(TransactionInput::new(vec![].into()));
// Check it returns the expected result
assert_eq!(
test.provider.call(call.clone()).await.unwrap().as_ref(),
test.provider.call(&call).await.unwrap().as_ref(),
U256::from(1).abi_encode().as_slice()
);
// Check it has the expected gas cost (16 is documented in `return_true_code`)
assert_eq!(test.provider.estimate_gas(call).await.unwrap(), 21_000 + 16);
assert_eq!(test.provider.estimate_gas(&call).await.unwrap(), 21_000 + 16);
}
let (gas, fee) =
test.router.execute_gas_and_fee(Coin::Erc20(token), U256::from(0), &[].as_slice().into());
let gas = test.router.execute_gas(Coin::Erc20(token), U256::from(0), &[].as_slice().into());
let fee = U256::from(0);
let (_tx, gas_used) = test.execute(Coin::Erc20(token), fee, [].as_slice().into(), vec![]).await;
const UNUSED_GAS: u64 = Router::GAS_FOR_ERC20_CALL - 16;
assert_eq!(gas_used + UNUSED_GAS, gas);
@@ -609,7 +600,8 @@ async fn test_eth_address_out_instruction() {
let out_instructions =
OutInstructions::from([(SeraiEthereumAddress::Address(rand_address), amount_out)].as_slice());
let (gas, fee) = test.router.execute_gas_and_fee(Coin::Ether, U256::from(1), &out_instructions);
let gas = test.router.execute_gas(Coin::Ether, U256::from(1), &out_instructions);
let fee = U256::from(gas);
let () = test
.provider
@@ -646,7 +638,8 @@ async fn test_erc20_address_out_instruction() {
let out_instructions =
OutInstructions::from([(SeraiEthereumAddress::Address(rand_address), amount_out)].as_slice());
let (gas, fee) = test.router.execute_gas_and_fee(coin, U256::from(1), &out_instructions);
let gas = test.router.execute_gas(coin, U256::from(1), &out_instructions);
let fee = U256::from(gas);
// Mint to the Router the necessary amount of the ERC20
erc20.mint(&test, test.router.address(), amount_out + fee).await;
@@ -681,7 +674,8 @@ async fn test_eth_code_out_instruction() {
.as_slice(),
);
let (gas, fee) = test.router.execute_gas_and_fee(Coin::Ether, U256::from(1), &out_instructions);
let gas = test.router.execute_gas(Coin::Ether, U256::from(1), &out_instructions);
let fee = U256::from(gas);
let (tx, gas_used) = test.execute(Coin::Ether, fee, out_instructions, vec![true]).await;
// We use call-traces here to determine how much gas was allowed but unused due to the complexity
@@ -706,34 +700,6 @@ async fn test_eth_code_out_instruction() {
assert_eq!(test.provider.get_code_at(deployed).await.unwrap().to_vec(), true.abi_encode());
}
#[tokio::test]
async fn test_eth_code_out_instruction_reverts() {
let mut test = Test::new().await;
test.confirm_next_serai_key().await;
let () = test
.provider
.raw_request("anvil_setBalance".into(), (test.router.address(), 1_000_000))
.await
.unwrap();
// [REVERT], which will cause `executeArbitraryCode`'s call to CREATE to fail
let code = vec![0xfd];
let amount_out = U256::from(0);
let out_instructions = OutInstructions::from(
[(
SeraiEthereumAddress::Contract(ContractDeployment::new(50_000, code.clone()).unwrap()),
amount_out,
)]
.as_slice(),
);
let (gas, fee) = test.router.execute_gas_and_fee(Coin::Ether, U256::from(1), &out_instructions);
let (tx, gas_used) = test.execute(Coin::Ether, fee, out_instructions, vec![true]).await;
let unused_gas = test.gas_unused_by_calls(&tx).await;
assert_eq!(gas_used + unused_gas, gas);
}
#[tokio::test]
async fn test_erc20_code_out_instruction() {
let mut test = Test::new().await;
@@ -749,7 +715,8 @@ async fn test_erc20_code_out_instruction() {
.as_slice(),
);
let (gas, fee) = test.router.execute_gas_and_fee(coin, U256::from(1), &out_instructions);
let gas = test.router.execute_gas(coin, U256::from(1), &out_instructions);
let fee = U256::from(gas);
// Mint to the Router the necessary amount of the ERC20
erc20.mint(&test, test.router.address(), amount_out + fee).await;
@@ -781,11 +748,11 @@ async fn test_result_decoding() {
.as_slice(),
);
let (gas, fee) = test.router.execute_gas_and_fee(Coin::Ether, U256::from(0), &out_instructions);
let gas = test.router.execute_gas(Coin::Ether, U256::from(0), &out_instructions);
// We should decode these in the correct order (not `false, true, true`)
let (_tx, gas_used) =
test.execute(Coin::Ether, fee, out_instructions, vec![true, true, false]).await;
test.execute(Coin::Ether, U256::from(0), out_instructions, vec![true, true, false]).await;
// We don't check strict equality as we don't know how much gas was used by the reverted call
// (even with the trace), solely that it used less than or equal to the limit
assert!(gas_used <= gas);
@@ -821,8 +788,9 @@ async fn test_reentrancy() {
.as_slice(),
);
let (gas, fee) = test.router.execute_gas_and_fee(Coin::Ether, U256::from(0), &out_instructions);
let (_tx, gas_used) = test.execute(Coin::Ether, fee, out_instructions, vec![true]).await;
let gas = test.router.execute_gas(Coin::Ether, U256::from(0), &out_instructions);
let (_tx, gas_used) =
test.execute(Coin::Ether, U256::from(0), out_instructions, vec![true]).await;
// Even though this doesn't have failed `OutInstruction`s, our logic is incomplete upon any
// failed internal calls for some reason. That's fine, as the gas yielded is still the worst-case
// (which this isn't a counter-example to) and is validated to be the worst-case, but is peculiar
@@ -831,7 +799,7 @@ async fn test_reentrancy() {
#[tokio::test]
async fn fuzz_test_out_instructions_gas() {
for _ in 0 .. 100 {
for _ in 0 .. 10 {
let mut test = Test::new().await;
test.confirm_next_serai_key().await;
@@ -849,7 +817,7 @@ async fn fuzz_test_out_instructions_gas() {
code.extend(&ext);
out_instructions.push((
SeraiEthereumAddress::Contract(ContractDeployment::new(100_000, code).unwrap()),
SeraiEthereumAddress::Contract(ContractDeployment::new(100_000, ext).unwrap()),
amount_out,
));
} else {
@@ -886,7 +854,8 @@ async fn fuzz_test_out_instructions_gas() {
};
let fee_per_gas = U256::from(1) + U256::from(OsRng.next_u64() % 10);
let (gas, fee) = test.router.execute_gas_and_fee(coin, fee_per_gas, &out_instructions);
let gas = test.router.execute_gas(coin, fee_per_gas, &out_instructions);
let fee = U256::from(gas) * fee_per_gas;
// All of these should have succeeded
let (tx, gas_used) =
test.execute(coin, fee, out_instructions.clone(), vec![true; out_instructions.0.len()]).await;
@@ -898,47 +867,3 @@ async fn fuzz_test_out_instructions_gas() {
);
}
}
#[tokio::test]
async fn test_gas_increases_then_decreases() {
/*
This specific batch of `OutInstruction`s causes the gas to be initially calculated, and then
increase as the proper fee is written in (due to the increased amount of non-zero bytes). But
then, as the fee is updated until the final fee no longer increases the gas used, the gas
actually goes *back down*. To then derive the fee from this reduced gas causes the gas to go
back up.
A prior version of this library would return the reduced amount of gas in this edge case,
which only rarely appeared via the fuzz test (yet did once, yielding this). Then, it'd derive
the fee from it, and expect the realized transaction to have parity (causing a test failure as
it didn't). Now, `execute_gas` is `execute_gas_and_fee`, yielding both the gas which is
expected *and the fee for it*. This fee is guaranteed to cost the reported amount of gas,
resolving this issue.
*/
let out_instructions = vec![(
SeraiEthereumAddress::Contract(ContractDeployment::new(100240, vec![]).unwrap()),
U256::from(1u8),
)];
let mut test = Test::new().await;
test.confirm_next_serai_key().await;
let out_instructions = OutInstructions::from(out_instructions.as_slice());
let coin = {
let () = test
.provider
.raw_request("anvil_setBalance".into(), (test.router.address(), 1_000_000_000))
.await
.unwrap();
Coin::Ether
};
let fee_per_gas = U256::from(1);
let (gas, fee) = test.router.execute_gas_and_fee(coin, fee_per_gas, &out_instructions);
assert!((U256::from(gas) * fee_per_gas) != fee);
let (tx, gas_used) =
test.execute(coin, fee, out_instructions.clone(), vec![true; out_instructions.0.len()]).await;
let unused_gas = test.gas_unused_by_calls(&tx).await;
assert_eq!(gas_used + unused_gas, gas);
}

View File

@@ -3,7 +3,7 @@ use std::io;
use ciphersuite::Secp256k1;
use frost::dkg::ThresholdKeys;
use alloy_core::primitives::{U256, Address as EthereumAddress};
use alloy_core::primitives::U256;
use serai_client::networks::ethereum::Address;
@@ -17,20 +17,8 @@ use crate::{output::OutputId, machine::ClonableTransctionMachine};
#[derive(Clone, PartialEq, Debug)]
pub(crate) enum Action {
SetKey {
chain_id: U256,
router_address: EthereumAddress,
nonce: u64,
key: PublicKey,
},
Batch {
chain_id: U256,
router_address: EthereumAddress,
nonce: u64,
coin: Coin,
fee: U256,
outs: Vec<(Address, U256)>,
},
SetKey { chain_id: U256, nonce: u64, key: PublicKey },
Batch { chain_id: U256, nonce: u64, coin: Coin, fee: U256, outs: Vec<(Address, U256)> },
}
#[derive(Clone, PartialEq, Eq, Debug)]
@@ -45,28 +33,25 @@ impl Action {
pub(crate) fn message(&self) -> Vec<u8> {
match self {
Action::SetKey { chain_id, router_address, nonce, key } => {
Router::update_serai_key_message(*chain_id, *router_address, *nonce, key)
}
Action::Batch { chain_id, router_address, nonce, coin, fee, outs } => {
Router::execute_message(
*chain_id,
*router_address,
*nonce,
*coin,
*fee,
OutInstructions::from(outs.as_ref()),
)
Action::SetKey { chain_id, nonce, key } => {
Router::update_serai_key_message(*chain_id, *nonce, key)
}
Action::Batch { chain_id, nonce, coin, fee, outs } => Router::execute_message(
*chain_id,
*nonce,
*coin,
*fee,
OutInstructions::from(outs.as_ref()),
),
}
}
pub(crate) fn eventuality(&self) -> Eventuality {
Eventuality(match self {
Self::SetKey { chain_id: _, router_address: _, nonce, key } => {
Self::SetKey { chain_id: _, nonce, key } => {
Executed::NextSeraiKeySet { nonce: *nonce, key: key.eth_repr() }
}
Self::Batch { chain_id: _, router_address: _, nonce, .. } => {
Self::Batch { chain_id: _, nonce, .. } => {
Executed::Batch { nonce: *nonce, message_hash: keccak256(self.message()), results: vec![] }
}
})
@@ -104,10 +89,6 @@ impl SignableTransaction for Action {
reader.read_exact(&mut chain_id)?;
let chain_id = U256::from_be_bytes(chain_id);
let mut router_address = [0; 20];
reader.read_exact(&mut router_address)?;
let router_address = EthereumAddress::from(router_address);
let mut nonce = [0; 8];
reader.read_exact(&mut nonce)?;
let nonce = u64::from_le_bytes(nonce);
@@ -119,7 +100,7 @@ impl SignableTransaction for Action {
let key =
PublicKey::from_eth_repr(key).ok_or_else(|| io::Error::other("invalid key in Action"))?;
Action::SetKey { chain_id, router_address, nonce, key }
Action::SetKey { chain_id, nonce, key }
}
1 => {
let coin = borsh::from_reader(reader)?;
@@ -142,24 +123,22 @@ impl SignableTransaction for Action {
outs.push((address, amount));
}
Action::Batch { chain_id, router_address, nonce, coin, fee, outs }
Action::Batch { chain_id, nonce, coin, fee, outs }
}
_ => unreachable!(),
})
}
fn write(&self, writer: &mut impl io::Write) -> io::Result<()> {
match self {
Self::SetKey { chain_id, router_address, nonce, key } => {
Self::SetKey { chain_id, nonce, key } => {
writer.write_all(&[0])?;
writer.write_all(&chain_id.to_be_bytes::<32>())?;
writer.write_all(router_address.as_slice())?;
writer.write_all(&nonce.to_le_bytes())?;
writer.write_all(&key.eth_repr())
}
Self::Batch { chain_id, router_address, nonce, coin, fee, outs } => {
Self::Batch { chain_id, nonce, coin, fee, outs } => {
writer.write_all(&[1])?;
writer.write_all(&chain_id.to_be_bytes::<32>())?;
writer.write_all(router_address.as_slice())?;
writer.write_all(&nonce.to_le_bytes())?;
borsh::BorshSerialize::serialize(coin, writer)?;
writer.write_all(&fee.as_le_bytes())?;
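The tag-byte layout used by `read`/`write` above (a variant byte of 0 for `SetKey` or 1 for `Batch`, followed by fixed-width fields) can be illustrated with a hypothetical `SetKey`-only round trip; the field types here are simplified stand-ins for `U256` and `PublicKey`:

```rust
use std::io::{self, Read, Write};

/// Serialize a SetKey-shaped action: tag byte 0, then chain ID, LE nonce, key.
pub fn write_set_key(
  writer: &mut impl Write,
  chain_id: [u8; 32],
  nonce: u64,
  key: [u8; 32],
) -> io::Result<()> {
  writer.write_all(&[0])?;
  writer.write_all(&chain_id)?;
  writer.write_all(&nonce.to_le_bytes())?;
  writer.write_all(&key)
}

/// Deserialize the same layout, checking the tag byte.
pub fn read_set_key(reader: &mut impl Read) -> io::Result<([u8; 32], u64, [u8; 32])> {
  let mut tag = [0];
  reader.read_exact(&mut tag)?;
  if tag[0] != 0 {
    return Err(io::Error::other("not a SetKey"));
  }
  let mut chain_id = [0; 32];
  reader.read_exact(&mut chain_id)?;
  let mut nonce = [0; 8];
  reader.read_exact(&mut nonce)?;
  let mut key = [0; 32];
  reader.read_exact(&mut key)?;
  Ok((chain_id, u64::from_le_bytes(nonce), key))
}
```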

View File

@@ -4,6 +4,7 @@ use std::sync::Arc;
use alloy_rlp::Encodable;
use alloy_transport::{TransportErrorKind, RpcError};
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::RootProvider;
use tokio::{
@@ -25,13 +26,13 @@ use crate::{
#[derive(Clone)]
pub(crate) struct TransactionPublisher<D: Db> {
db: D,
rpc: Arc<RootProvider>,
rpc: Arc<RootProvider<SimpleRequest>>,
router: Arc<RwLock<Option<Router>>>,
relayer_url: String,
}
impl<D: Db> TransactionPublisher<D> {
pub(crate) fn new(db: D, rpc: Arc<RootProvider>, relayer_url: String) -> Self {
pub(crate) fn new(db: D, rpc: Arc<RootProvider<SimpleRequest>>, relayer_url: String) -> Self {
Self { db, rpc, router: Arc::new(RwLock::new(None)), relayer_url }
}
@@ -87,10 +88,8 @@ impl<D: Db> signers::TransactionPublisher<Transaction> for TransactionPublisher<
let nonce = tx.0.nonce();
// Convert from an Action (an internal representation of a signable event) to a TxLegacy
let tx = match tx.0 {
Action::SetKey { chain_id: _, router_address: _, nonce: _, key } => {
router.update_serai_key(&key, &tx.1)
}
Action::Batch { chain_id: _, router_address: _, nonce: _, coin, fee, outs } => {
Action::SetKey { chain_id: _, nonce: _, key } => router.update_serai_key(&key, &tx.1),
Action::Batch { chain_id: _, nonce: _, coin, fee, outs } => {
router.execute(coin, fee, OutInstructions::from(outs.as_ref()), &tx.1)
}
};

View File

@@ -2,8 +2,9 @@ use core::future::Future;
use std::{sync::Arc, collections::HashSet};
use alloy_core::primitives::B256;
use alloy_rpc_types_eth::{Header, BlockNumberOrTag};
use alloy_rpc_types_eth::{Header, BlockTransactionsKind, BlockNumberOrTag};
use alloy_transport::{RpcError, TransportErrorKind};
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::{Provider, RootProvider};
use serai_client::primitives::{ExternalNetworkId, ExternalCoin, Amount};
@@ -25,7 +26,7 @@ use crate::{
#[derive(Clone)]
pub(crate) struct Rpc<D: Db> {
pub(crate) db: D,
pub(crate) provider: Arc<RootProvider>,
pub(crate) provider: Arc<RootProvider<SimpleRequest>>,
}
impl<D: Db> ScannerFeed for Rpc<D> {
@@ -48,7 +49,7 @@ impl<D: Db> ScannerFeed for Rpc<D> {
async move {
let actual_number = self
.provider
.get_block(BlockNumberOrTag::Finalized.into())
.get_block(BlockNumberOrTag::Finalized.into(), BlockTransactionsKind::Hashes)
.await?
.ok_or_else(|| {
TransportErrorKind::Custom("there was no finalized block".to_string().into())
@@ -76,7 +77,7 @@ impl<D: Db> ScannerFeed for Rpc<D> {
async move {
let header = self
.provider
.get_block(BlockNumberOrTag::Number(number).into())
.get_block(BlockNumberOrTag::Number(number).into(), BlockTransactionsKind::Hashes)
.await?
.ok_or_else(|| {
TransportErrorKind::Custom(
@@ -104,7 +105,7 @@ impl<D: Db> ScannerFeed for Rpc<D> {
} else {
self
.provider
.get_block((start - 1).into())
.get_block((start - 1).into(), BlockTransactionsKind::Hashes)
.await?
.ok_or_else(|| {
TransportErrorKind::Custom(
@@ -119,7 +120,7 @@ impl<D: Db> ScannerFeed for Rpc<D> {
let end_header = self
.provider
.get_block((start + 31).into())
.get_block((start + 31).into(), BlockTransactionsKind::Hashes)
.await?
.ok_or_else(|| {
TransportErrorKind::Custom(
@@ -176,7 +177,7 @@ impl<D: Db> ScannerFeed for Rpc<D> {
while to_check != epoch.prior_end_hash {
let to_check_block = self
.provider
.get_block(B256::from(to_check).into())
.get_block(B256::from(to_check).into(), BlockTransactionsKind::Hashes)
.await?
.ok_or_else(|| {
TransportErrorKind::Custom(

View File

@@ -50,7 +50,6 @@ impl<D: Db> smart_contract_scheduler::SmartContract<Rpc<D>> for SmartContract {
) -> (Self::SignableTransaction, EventualityFor<Rpc<D>>) {
let action = Action::SetKey {
chain_id: self.chain_id,
router_address: if true { todo!("TODO") } else { Default::default() },
nonce,
key: PublicKey::new(new_key).expect("rotating to an invalid key"),
};
@@ -140,7 +139,6 @@ impl<D: Db> smart_contract_scheduler::SmartContract<Rpc<D>> for SmartContract {
res.push(Action::Batch {
chain_id: self.chain_id,
router_address: if true { todo!("TODO") } else { Default::default() },
nonce,
coin: coin_to_ethereum_coin(coin),
fee: U256::try_from(total_gas).unwrap() * fee_per_gas,

View File

@@ -19,10 +19,11 @@ workspace = true
[dependencies]
k256 = { version = "0.13", default-features = false, features = ["std"] }
alloy-core = { version = "1", default-features = false }
alloy-consensus = { version = "0.14", default-features = false, features = ["std"] }
alloy-core = { version = "0.8", default-features = false }
alloy-consensus = { version = "0.9", default-features = false, features = ["std"] }
alloy-rpc-types-eth = { version = "0.14", default-features = false }
alloy-provider = { version = "0.14", default-features = false }
alloy-rpc-types-eth = { version = "0.9", default-features = false }
alloy-simple-request-transport = { path = "../../../networks/ethereum/alloy-simple-request-transport", default-features = false }
alloy-provider = { version = "0.9", default-features = false }
ethereum-primitives = { package = "serai-processor-ethereum-primitives", path = "../primitives", default-features = false }

View File

@@ -5,12 +5,13 @@
use k256::{elliptic_curve::sec1::ToEncodedPoint, ProjectivePoint};
use alloy_core::{
primitives::{Address, U256, Bytes, Signature, TxKind},
primitives::{Address, U256, Bytes, PrimitiveSignature, TxKind},
hex::FromHex,
};
use alloy_consensus::{SignableTransaction, TxLegacy, Signed};
use alloy_rpc_types_eth::TransactionReceipt;
use alloy_simple_request_transport::SimpleRequest;
use alloy_provider::{Provider, RootProvider};
use ethereum_primitives::{keccak256, deterministically_sign};
@@ -23,7 +24,7 @@ fn address(point: &ProjectivePoint) -> [u8; 20] {
}
/// Fund an account.
pub async fn fund_account(provider: &RootProvider, address: Address, value: U256) {
pub async fn fund_account(provider: &RootProvider<SimpleRequest>, address: Address, value: U256) {
let _: () = provider
.raw_request("anvil_setBalance".into(), [address.to_string(), value.to_string()])
.await
@@ -31,7 +32,10 @@ pub async fn fund_account(provider: &RootProvider, address: Address, value: U256
}
/// Publish an already-signed transaction.
pub async fn publish_tx(provider: &RootProvider, tx: Signed<TxLegacy>) -> TransactionReceipt {
pub async fn publish_tx(
provider: &RootProvider<SimpleRequest>,
tx: Signed<TxLegacy>,
) -> TransactionReceipt {
// Fund the sender's address
fund_account(
provider,
@@ -51,7 +55,7 @@ pub async fn publish_tx(provider: &RootProvider, tx: Signed<TxLegacy>) -> Transa
///
/// The contract deployment will be done by a random account.
pub async fn deploy_contract(
provider: &RootProvider,
provider: &RootProvider<SimpleRequest>,
file_path: &str,
constructor_arguments: &[u8],
) -> Address {
@@ -84,7 +88,7 @@ pub async fn deploy_contract(
///
/// This assumes the wallet is funded.
pub async fn send(
provider: &RootProvider,
provider: &RootProvider<SimpleRequest>,
wallet: &k256::ecdsa::SigningKey,
mut tx: TxLegacy,
) -> TransactionReceipt {
@@ -107,7 +111,7 @@ pub async fn send(
);
let mut bytes = vec![];
tx.into_signed(Signature::from(sig)).eip2718_encode(&mut bytes);
tx.into_signed(PrimitiveSignature::from(sig)).eip2718_encode(&mut bytes);
let pending_tx = provider.send_raw_transaction(&bytes).await.unwrap();
pending_tx.get_receipt().await.unwrap()
}

View File

@@ -25,7 +25,7 @@ rand_core = { version = "0.6", default-features = false, features = ["std", "get
frost = { package = "modular-frost", path = "../../crypto/frost", version = "^0.8.1", default-features = false }
serai-validator-sets-primitives = { path = "../../substrate/validator-sets/primitives", default-features = false, features = ["std"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
log = { version = "0.4", default-features = false, features = ["std"] }

View File

@@ -36,7 +36,7 @@ ciphersuite = { path = "../../crypto/ciphersuite", default-features = false, fea
dkg = { package = "dkg", path = "../../crypto/dkg", default-features = false, features = ["std", "evrf-ristretto"] }
# Substrate
serai-validator-sets-primitives = { path = "../../substrate/validator-sets/primitives", default-features = false, features = ["std"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
# Encoders
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["std"] }

View File

@@ -25,9 +25,6 @@ borsh = { version = "1", default-features = false, features = ["std", "derive",
dkg = { path = "../../crypto/dkg", default-features = false, features = ["std", "borsh"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std", "borsh"] }
in-instructions-primitives = { package = "serai-in-instructions-primitives", path = "../../substrate/in-instructions/primitives", default-features = false, features = ["std", "borsh"] }
coins-primitives = { package = "serai-coins-primitives", path = "../../substrate/coins/primitives", default-features = false, features = ["std", "borsh"] }
validator-sets-primitives = { package = "serai-validator-sets-primitives", path = "../../substrate/validator-sets/primitives", default-features = false, features = ["std", "borsh"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
serai-cosign = { path = "../../coordinator/cosign", default-features = false }

View File

@@ -20,8 +20,7 @@ workspace = true
[dependencies]
group = { version = "0.13", default-features = false }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std", "borsh"] }
serai-coins-primitives = { path = "../../substrate/coins/primitives", default-features = false, features = ["std", "borsh"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["std"] }
borsh = { version = "1", default-features = false, features = ["std", "derive", "de_strict_order"] }

View File

@@ -36,9 +36,6 @@ serai-db = { path = "../../common/db" }
messages = { package = "serai-processor-messages", path = "../messages" }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
serai-validator-sets-primitives = { path = "../../substrate/validator-sets/primitives", default-features = false, features = ["std", "borsh"] }
serai-in-instructions-primitives = { path = "../../substrate/in-instructions/primitives", default-features = false, features = ["std", "borsh"] }
serai-coins-primitives = { path = "../../substrate/coins/primitives", default-features = false, features = ["std", "borsh"] }
primitives = { package = "serai-processor-primitives", path = "../primitives" }
scheduler-primitives = { package = "serai-processor-scheduler-primitives", path = "../scheduler/primitives" }

View File

@@ -33,8 +33,6 @@ scale = { package = "parity-scale-codec", version = "3", default-features = fals
borsh = { version = "1", default-features = false, features = ["std", "derive", "de_strict_order"] }
serai-primitives = { path = "../../substrate/primitives", default-features = false, features = ["std"] }
serai-validator-sets-primitives = { path = "../../substrate/validator-sets/primitives", default-features = false, features = ["std"] }
serai-in-instructions-primitives = { path = "../../substrate/in-instructions/primitives", default-features = false, features = ["std"] }
serai-db = { path = "../../common/db" }
log = { version = "0.4", default-features = false, features = ["std"] }

View File

@@ -1,5 +1,5 @@
[toolchain]
channel = "1.86"
targets = ["wasm32-unknown-unknown"]
channel = "1.85"
targets = ["wasmv1-none"]
profile = "minimal"
components = ["rust-src", "rustfmt", "clippy"]
components = ["rustfmt", "clippy"]

View File

@@ -27,9 +27,9 @@ brew install rustup
```
rustup update
rustup toolchain install stable
rustup target add wasm32-unknown-unknown
rustup target add wasmv1-none
rustup toolchain install nightly
rustup target add wasm32-unknown-unknown --toolchain nightly
rustup target add wasmv1-none --toolchain nightly
```
### Install Solidity

View File

@@ -12,76 +12,41 @@ rust-version = "1.80"
all-features = true
rustdoc-args = ["--cfg", "docsrs"]
[package.metadata.cargo-machete]
ignored = ["serde"]
[lints]
workspace = true
[dependencies]
bitvec = { version = "1", default-features = false, features = ["alloc", "serde"] }
borsh = { version = "1", default-features = false, features = ["derive", "de_strict_order"] }
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive", "bit-vec"] }
scale-info = { version = "2", default-features = false, features = ["derive", "bit-vec"] }
bitvec = { version = "1", default-features = false, features = ["alloc"] }
sp-core = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
borsh = { version = "1", default-features = false, features = ["derive", "de_strict_order"], optional = true }
serde = { version = "1", default-features = false, features = ["derive", "alloc"], optional = true }
sp-core = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-consensus-babe = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-consensus-grandpa = { git = "https://github.com/serai-dex/substrate", default-features = false }
frame-support = { git = "https://github.com/serai-dex/substrate", default-features = false }
serde = { version = "1", default-features = false, features = ["derive"], optional = true }
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive"], optional = true }
scale-info = { version = "2", default-features = false, features = ["derive"], optional = true }
sp-runtime = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false, features = ["serde"], optional = true }
frame-support = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false, optional = true }
serai-primitives = { path = "../primitives", version = "0.1", default-features = false }
serai-coins-primitives = { path = "../coins/primitives", version = "0.1", default-features = false }
serai-validator-sets-primitives = { path = "../validator-sets/primitives", version = "0.1", default-features = false }
serai-genesis-liquidity-primitives = { path = "../genesis-liquidity/primitives", version = "0.1", default-features = false }
serai-emissions-primitives = { path = "../emissions/primitives", version = "0.1", default-features = false }
serai-in-instructions-primitives = { path = "../in-instructions/primitives", version = "0.1", default-features = false }
serai-signals-primitives = { path = "../signals/primitives", version = "0.1", default-features = false }
[features]
std = [
"borsh/std",
"bitvec/std",
"scale/std",
"scale-info/std",
"borsh?/std",
"serde?/std",
"sp-core/std",
"sp-runtime/std",
"sp-consensus-babe/std",
"sp-consensus-grandpa/std",
"frame-support/std",
"serde?/std",
"scale?/std",
"scale-info?/std",
"sp-runtime?/std",
"frame-support?/std",
"serai-primitives/std",
"serai-coins-primitives/std",
"serai-validator-sets-primitives/std",
"serai-genesis-liquidity-primitives/std",
"serai-emissions-primitives/std",
"serai-in-instructions-primitives/std",
"serai-signals-primitives/std",
]
borsh = [
"dep:borsh",
"serai-primitives/borsh",
"serai-coins-primitives/borsh",
"serai-validator-sets-primitives/borsh",
"serai-genesis-liquidity-primitives/borsh",
"serai-in-instructions-primitives/borsh",
"serai-signals-primitives/borsh",
]
serde = [
"dep:serde",
"serai-primitives/serde",
"serai-coins-primitives/serde",
"serai-validator-sets-primitives/serde",
"serai-genesis-liquidity-primitives/serde",
"serai-in-instructions-primitives/serde",
"serai-signals-primitives/serde",
]
substrate = ["serde", "scale", "scale-info", "sp-runtime", "frame-support"]
try-runtime = ["sp-runtime/try-runtime"]
default = ["std"]

View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2023 Luke Parker
Copyright (c) 2023-2025 Luke Parker
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

substrate/abi/README.md Normal file
View File

@@ -0,0 +1,4 @@
# serai-abi
Serai's ABI, inclusive of the transaction, event, and block types. MIT-licensed to ensure usability
in a variety of contexts.

View File

@@ -1,17 +0,0 @@
use sp_consensus_babe::EquivocationProof;
use serai_primitives::{Header, SeraiAddress};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
pub struct ReportEquivocation {
pub equivocation_proof: alloc::boxed::Box<EquivocationProof<Header>>,
pub key_owner_proof: SeraiAddress,
}
// We could define a Babe Config here and use the literal pallet_babe::Call
// The disadvantage to this would be the complexity and presence of junk fields such as `__Ignore`
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
pub enum Call {
report_equivocation(ReportEquivocation),
report_equivocation_unsigned(ReportEquivocation),
}

substrate/abi/src/block.rs Normal file
View File

@@ -0,0 +1,360 @@
use alloc::vec::Vec;
use borsh::{BorshSerialize, BorshDeserialize};
use crate::{
primitives::{BlockHash, merkle::UnbalancedMerkleTree},
Transaction,
};
/// The tag for the hash of a transaction's event, forming a leaf of the Merkle tree of its events.
pub const TRANSACTION_EVENTS_COMMITMENT_LEAF_TAG: u8 = 0;
/// The tag for the branch hashes of transaction events.
pub const TRANSACTION_EVENTS_COMMITMENT_BRANCH_TAG: u8 = 1;
/// The tag for the hash of a transaction's hash and its events' Merkle root, forming a leaf of the
/// Merkle tree which is the events commitment.
pub const EVENTS_COMMITMENT_LEAF_TAG: u8 = 2;
/// The tag for the branch hashes of the Merkle tree which is the events commitment.
pub const EVENTS_COMMITMENT_BRANCH_TAG: u8 = 3;
/// A V1 header for a block.
#[derive(Clone, Copy, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub struct HeaderV1 {
/// The index of this block on the blockchain.
///
/// The genesis block has number 0.
pub number: u64,
/// The commitment to the DAG this header builds upon.
///
/// This is defined as an unbalanced Merkle tree so light clients may sync one header per epoch,
/// and then may prove the inclusion of any header in logarithmic depth (without providing the
/// entire header chain).
///
/// Alternative popular options would be a Merkle Mountain Range, which makes more recent blocks
/// cheaper to prove at the sacrifice of older blocks being more expensive to prove. An MMR isn't
/// used in order to minimize the protocol's surface area. Additionally, even though the
/// unbalanced Merkle tree doesn't achieve such notably short paths for recent blocks, it does
/// inherently provide lower-depth paths to more recent items *on imbalance*.
pub builds_upon: UnbalancedMerkleTree,
/// The UNIX time in milliseconds this block was created at.
pub unix_time_in_millis: u64,
/// The commitment to the transactions within this block.
pub transactions_commitment: UnbalancedMerkleTree,
/// The commitment to the events within this block.
///
/// The leaves of this tree will be of the form
/// `(EVENTS_COMMITMENT_LEAF_TAG, transaction hash, transaction's events' Merkle tree root)`.
/// A transaction may have the same event multiple times, yet an event may be uniquely identified
/// by its path within the tree.
pub events_commitment: UnbalancedMerkleTree,
/// A commitment to the consensus data used to justify adding this block to the blockchain.
pub consensus_commitment: [u8; 32],
}
/// A header for a block.
#[derive(Clone, Copy, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Header {
/// A version 1 header.
V1(HeaderV1),
}
impl Header {
/// Get the number of this block on the blockchain.
pub fn number(&self) -> u64 {
match self {
Header::V1(HeaderV1 { number, .. }) => *number,
}
}
/// Get the commitment to the DAG this header builds upon.
pub fn builds_upon(&self) -> UnbalancedMerkleTree {
match self {
Header::V1(HeaderV1 { builds_upon, .. }) => *builds_upon,
}
}
/// The commitment to the transactions within this block.
pub fn transactions_commitment(&self) -> UnbalancedMerkleTree {
match self {
Header::V1(HeaderV1 { transactions_commitment, .. }) => *transactions_commitment,
}
}
/// The commitment to the events within this block.
pub fn events_commitment(&self) -> UnbalancedMerkleTree {
match self {
Header::V1(HeaderV1 { events_commitment, .. }) => *events_commitment,
}
}
/// Get the hash of the header.
pub fn hash(&self) -> BlockHash {
BlockHash(sp_core::blake2_256(&borsh::to_vec(self).unwrap()))
}
}
/// A block.
///
/// This does not guarantee consistency. The header's `transactions_commitment` may not match the
/// contained transactions.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub struct Block {
/// The block's header.
pub header: Header,
/// The block's transactions.
pub transactions: Vec<Transaction>,
}
#[cfg(feature = "substrate")]
mod substrate {
use core::fmt::Debug;
use scale::{Encode, Decode};
use scale_info::TypeInfo;
use sp_core::H256;
use sp_runtime::{
generic::{DigestItem, Digest},
traits::{Header as HeaderTrait, HeaderProvider, Block as BlockTrait},
};
use super::*;
/// The digest for all of the Serai-specific header fields added before execution of the block.
#[derive(Clone, Copy, PartialEq, Eq, BorshSerialize, BorshDeserialize)]
pub struct SeraiPreExecutionDigest {
/// The UNIX time in milliseconds this block was created at.
pub unix_time_in_millis: u64,
}
/// The digest for all of the Serai-specific header fields determined during execution of the
/// block.
#[derive(Clone, Copy, PartialEq, Eq, BorshSerialize, BorshDeserialize)]
pub struct SeraiExecutionDigest {
/// The commitment to the DAG this header builds upon.
pub builds_upon: UnbalancedMerkleTree,
/// The commitment to the transactions within this block.
pub transactions_commitment: UnbalancedMerkleTree,
/// The commitment to the events within this block.
pub events_commitment: UnbalancedMerkleTree,
}
impl SeraiPreExecutionDigest {
/// The consensus ID for a Serai pre-execution digest.
pub const CONSENSUS_ID: [u8; 4] = *b"SRIP";
}
impl SeraiExecutionDigest {
/// The consensus ID for a Serai execution digest.
pub const CONSENSUS_ID: [u8; 4] = *b"SRIE";
}
/// The consensus data for a V1 header.
///
/// This is not considered part of the protocol proper and may be pruned in the future. It's
/// solely used for consensus now.
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, TypeInfo, sp_runtime::Serialize)]
pub struct ConsensusV1 {
/// The hash of the immediately preceding block.
parent_hash: H256,
/// The root for the Merkle tree of transactions, as defined by Substrate.
///
/// The format of this differs from Serai's format for the commitment to the transactions.
transactions_root: H256,
/// The state root.
state_root: H256,
/// The consensus digests.
digest: Digest,
}
/// A V1 header for a block, as needed by Substrate.
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, TypeInfo, sp_runtime::Serialize)]
pub struct SubstrateHeaderV1 {
number: u64,
consensus: ConsensusV1,
}
/// A header for a block, as needed by Substrate.
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, TypeInfo, sp_runtime::Serialize)]
pub enum SubstrateHeader {
/// A version 1 header.
V1(SubstrateHeaderV1),
}
impl From<&SubstrateHeader> for Header {
fn from(header: &SubstrateHeader) -> Self {
match header {
SubstrateHeader::V1(header) => {
let mut pre_execution_digest = None;
let mut execution_digest = None;
for log in header.consensus.digest.logs() {
match log {
DigestItem::PreRuntime(consensus, encoded)
if *consensus == SeraiPreExecutionDigest::CONSENSUS_ID =>
{
pre_execution_digest =
SeraiPreExecutionDigest::deserialize_reader(&mut encoded.as_slice()).ok();
}
DigestItem::Consensus(consensus, encoded)
if *consensus == SeraiExecutionDigest::CONSENSUS_ID =>
{
execution_digest =
SeraiExecutionDigest::deserialize_reader(&mut encoded.as_slice()).ok();
}
_ => {}
}
}
Header::V1(HeaderV1 {
number: header.number,
builds_upon: execution_digest
.as_ref()
.map(|digest| digest.builds_upon)
.unwrap_or(UnbalancedMerkleTree::EMPTY),
unix_time_in_millis: pre_execution_digest
.as_ref()
.map(|digest| digest.unix_time_in_millis)
.unwrap_or(0),
transactions_commitment: execution_digest
.as_ref()
.map(|digest| digest.transactions_commitment)
.unwrap_or(UnbalancedMerkleTree::EMPTY),
events_commitment: execution_digest
.as_ref()
.map(|digest| digest.events_commitment)
.unwrap_or(UnbalancedMerkleTree::EMPTY),
consensus_commitment: sp_core::blake2_256(&header.consensus.encode()),
})
}
}
}
}
/// A block, as needed by Substrate.
#[derive(Clone, Debug, PartialEq, Eq, Encode, Decode, sp_runtime::Serialize)]
pub struct SubstrateBlock {
header: SubstrateHeader,
#[serde(skip)] // This makes this unsafe to deserialize, but we don't impl `Deserialize`
transactions: Vec<Transaction>,
}
impl HeaderTrait for SubstrateHeader {
type Number = u64;
type Hash = H256;
type Hashing = sp_runtime::traits::BlakeTwo256;
fn new(
number: Self::Number,
extrinsics_root: Self::Hash,
state_root: Self::Hash,
parent_hash: Self::Hash,
digest: Digest,
) -> Self {
SubstrateHeader::V1(SubstrateHeaderV1 {
number,
consensus: ConsensusV1 {
parent_hash,
transactions_root: extrinsics_root,
state_root,
digest,
},
})
}
fn number(&self) -> &Self::Number {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { number, .. }) => number,
}
}
fn set_number(&mut self, number: Self::Number) {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { number: existing, .. }) => {
*existing = number;
}
}
}
fn extrinsics_root(&self) -> &Self::Hash {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => &consensus.transactions_root,
}
}
fn set_extrinsics_root(&mut self, extrinsics_root: Self::Hash) {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => {
consensus.transactions_root = extrinsics_root;
}
}
}
fn state_root(&self) -> &Self::Hash {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => &consensus.state_root,
}
}
fn set_state_root(&mut self, state_root: Self::Hash) {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => {
consensus.state_root = state_root;
}
}
}
fn parent_hash(&self) -> &Self::Hash {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => &consensus.parent_hash,
}
}
fn set_parent_hash(&mut self, parent_hash: Self::Hash) {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => {
consensus.parent_hash = parent_hash;
}
}
}
fn digest(&self) -> &Digest {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => &consensus.digest,
}
}
fn digest_mut(&mut self) -> &mut Digest {
match self {
SubstrateHeader::V1(SubstrateHeaderV1 { consensus, .. }) => &mut consensus.digest,
}
}
fn hash(&self) -> H256 {
H256::from(Header::from(self).hash().0)
}
}
impl HeaderProvider for SubstrateBlock {
type HeaderT = SubstrateHeader;
}
impl BlockTrait for SubstrateBlock {
type Extrinsic = Transaction;
type Header = SubstrateHeader;
type Hash = H256;
fn header(&self) -> &Self::Header {
&self.header
}
fn extrinsics(&self) -> &[Self::Extrinsic] {
&self.transactions
}
fn deconstruct(self) -> (Self::Header, Vec<Self::Extrinsic>) {
(self.header, self.transactions)
}
fn new(header: Self::Header, transactions: Vec<Self::Extrinsic>) -> Self {
Self { header, transactions }
}
fn encode_from(header: &Self::Header, transactions: &[Self::Extrinsic]) -> Vec<u8> {
let header = header.encode();
let transactions = transactions.encode();
let mut block = header;
block.extend(transactions);
block
}
fn hash(&self) -> Self::Hash {
self.header.hash()
}
}
}
#[cfg(feature = "substrate")]
pub use substrate::*;
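The commitments above (tagged leaves and branches over an unbalanced Merkle tree) can be sketched as follows. This is a hedged illustration, not Serai's actual `UnbalancedMerkleTree`: the hash is a std stand-in for `sp_core::blake2_256`, and the rule of promoting an odd trailing node without padding is one common unbalanced construction, assumed here.

```rust
// Sketch of an unbalanced Merkle root with domain-separated tags, mirroring the
// EVENTS_COMMITMENT_{LEAF, BRANCH}_TAG constants. The hash and tree layout are
// illustrative assumptions, not Serai's definition.
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

const LEAF_TAG: u8 = 2; // stand-in for EVENTS_COMMITMENT_LEAF_TAG
const BRANCH_TAG: u8 = 3; // stand-in for EVENTS_COMMITMENT_BRANCH_TAG

// Domain-separated hash: the tag ensures a leaf can never be confused for a branch.
fn tagged_hash(tag: u8, data: &[u64]) -> u64 {
  let mut hasher = DefaultHasher::new();
  tag.hash(&mut hasher);
  data.hash(&mut hasher);
  hasher.finish()
}

// Root of an unbalanced Merkle tree: pair adjacent nodes, promoting an odd
// trailing node as-is, so no dummy/padding leaves are ever committed to.
fn unbalanced_root(leaves: &[u64]) -> u64 {
  if leaves.is_empty() {
    return 0; // stand-in for UnbalancedMerkleTree::EMPTY
  }
  let mut layer: Vec<u64> = leaves.iter().map(|leaf| tagged_hash(LEAF_TAG, &[*leaf])).collect();
  while layer.len() > 1 {
    let mut next = Vec::with_capacity(layer.len().div_ceil(2));
    for pair in layer.chunks(2) {
      match pair {
        [a, b] => next.push(tagged_hash(BRANCH_TAG, &[*a, *b])),
        [a] => next.push(*a), // odd node: promoted without padding
        _ => unreachable!(),
      }
    }
    layer = next;
  }
  layer[0]
}

fn main() {
  // Five leaves: proof paths are logarithmic, and the trailing leaf (the most
  // recent item) gets a shorter path, as the header docs note.
  let root = unbalanced_root(&[1, 2, 3, 4, 5]);
  assert_ne!(root, unbalanced_root(&[1, 2, 3, 4]));
  println!("{root}");
}
```

Because leaves carry no padding, a proof for any of `n` items needs at most `ceil(log2(n))` siblings, which is what lets one header per epoch still authenticate every header within that epoch.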

View File

@@ -1,25 +1,70 @@
use serai_primitives::{Balance, SeraiAddress};
use borsh::{BorshSerialize, BorshDeserialize};
pub use serai_coins_primitives as primitives;
use primitives::OutInstructionWithBalance;
use serai_primitives::{
address::SeraiAddress, balance::Balance, instructions::OutInstructionWithBalance,
};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
/// A call to coins.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
transfer { to: SeraiAddress, balance: Balance },
burn { balance: Balance },
burn_with_instruction { instruction: OutInstructionWithBalance },
/// Transfer these coins to the specified address.
transfer {
/// The address to transfer to.
to: SeraiAddress,
/// The coins to transfer.
coins: Balance,
},
/// Burn these coins.
burn {
/// The coins to burn.
coins: Balance,
},
/// Burn these coins with an `OutInstruction` specified.
burn_with_instruction {
/// The `OutInstruction`, with the coins to burn.
instruction: OutInstructionWithBalance,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
pub enum Event {
Mint { to: SeraiAddress, balance: Balance },
Burn { from: SeraiAddress, balance: Balance },
BurnWithInstruction { from: SeraiAddress, instruction: OutInstructionWithBalance },
Transfer { from: SeraiAddress, to: SeraiAddress, balance: Balance },
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::transfer { .. } | Call::burn { .. } | Call::burn_with_instruction { .. } => true,
}
}
}
/// An event from coins.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
/// The specified coins were minted.
Mint {
/// The address minted to.
to: SeraiAddress,
/// The coins minted.
coins: Balance,
},
/// The specified coins were burnt.
Burn {
/// The address burnt from.
from: SeraiAddress,
/// The coins burnt.
coins: Balance,
},
/// The specified coins were burnt with an `OutInstruction` specified.
BurnWithInstruction {
/// The address burnt from.
from: SeraiAddress,
/// The `OutInstruction` specified, and the coins burnt.
instruction: OutInstructionWithBalance,
},
/// The specified coins were transferred.
Transfer {
/// The address transferred from.
from: SeraiAddress,
/// The address transferred to.
to: SeraiAddress,
/// The coins transferred.
coins: Balance,
},
}

View File

@@ -1,75 +1,121 @@
use sp_runtime::BoundedVec;
use alloc::vec::Vec;
use serai_primitives::*;
use borsh::{BorshSerialize, BorshDeserialize};
type PoolId = ExternalCoin;
type MaxSwapPathLength = sp_core::ConstU32<3>;
use serai_primitives::{
address::SeraiAddress,
coin::ExternalCoin,
balance::{Amount, ExternalBalance, Balance},
};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
/// A call to the DEX.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
/// Add liquidity.
add_liquidity {
/// The coin to add liquidity for.
coin: ExternalCoin,
coin_desired: SubstrateAmount,
sri_desired: SubstrateAmount,
coin_min: SubstrateAmount,
sri_min: SubstrateAmount,
mint_to: SeraiAddress,
/// The intended amount of SRI to add as liquidity.
sri_intended: Amount,
/// The intended amount of the coin to add as liquidity.
coin_intended: Amount,
/// The minimum amount of SRI to add as liquidity.
sri_minimum: Amount,
/// The minimum amount of the coin to add as liquidity.
coin_minimum: Amount,
},
/// Transfer these liquidity tokens to the specified address.
transfer_liquidity {
/// The address to transfer to.
to: SeraiAddress,
/// The liquidity tokens to transfer.
liquidity_tokens: ExternalBalance,
},
/// Remove liquidity.
remove_liquidity {
coin: ExternalCoin,
lp_token_burn: SubstrateAmount,
coin_min_receive: SubstrateAmount,
sri_min_receive: SubstrateAmount,
withdraw_to: SeraiAddress,
/// The liquidity tokens to burn, removing the underlying liquidity from the pool.
///
/// The `coin` within the balance is the coin to remove liquidity for.
liquidity_tokens: ExternalBalance,
/// The minimum amount of SRI to receive.
sri_minimum: Amount,
/// The minimum amount of the coin to receive.
coin_minimum: Amount,
},
swap_exact_tokens_for_tokens {
path: BoundedVec<Coin, MaxSwapPathLength>,
amount_in: SubstrateAmount,
amount_out_min: SubstrateAmount,
send_to: SeraiAddress,
/// Swap an exact amount of coins.
swap_exact {
/// The coins to swap.
coins_to_swap: Balance,
/// The minimum balance to receive.
minimum_to_receive: Balance,
},
swap_tokens_for_exact_tokens {
path: BoundedVec<Coin, MaxSwapPathLength>,
amount_out: SubstrateAmount,
amount_in_max: SubstrateAmount,
send_to: SeraiAddress,
/// Swap for an exact amount of coins.
swap_for_exact {
/// The coins to receive.
coins_to_receive: Balance,
/// The maximum amount to swap.
maximum_to_swap: Balance,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::add_liquidity { .. } |
Call::transfer_liquidity { .. } |
Call::remove_liquidity { .. } |
Call::swap_exact { .. } |
Call::swap_for_exact { .. } => true,
}
}
}
/// An event from the DEX.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
PoolCreated {
pool_id: PoolId,
pool_account: SeraiAddress,
},
/// Liquidity was added to a pool.
LiquidityAdded {
who: SeraiAddress,
mint_to: SeraiAddress,
pool_id: PoolId,
coin_amount: SubstrateAmount,
sri_amount: SubstrateAmount,
lp_token_minted: SubstrateAmount,
/// The account which added the liquidity.
origin: SeraiAddress,
/// The account which received the liquidity tokens.
recipient: SeraiAddress,
/// The pool liquidity was added to.
pool: ExternalCoin,
/// The amount of liquidity tokens which were minted.
liquidity_tokens_minted: Amount,
/// The amount of the coin which was added to the pool's liquidity.
coin_amount: Amount,
/// The amount of SRI which was added to the pool's liquidity.
sri_amount: Amount,
},
/// Liquidity was removed from a pool.
LiquidityRemoved {
who: SeraiAddress,
withdraw_to: SeraiAddress,
pool_id: PoolId,
coin_amount: SubstrateAmount,
sri_amount: SubstrateAmount,
lp_token_burned: SubstrateAmount,
/// The account which removed the liquidity.
origin: SeraiAddress,
/// The pool liquidity was removed from.
pool: ExternalCoin,
/// The amount of liquidity tokens which were burnt.
liquidity_tokens_burnt: Amount,
/// The amount of the coin which was removed from the pool's liquidity.
coin_amount: Amount,
/// The amount of SRI which was removed from the pool's liquidity.
sri_amount: Amount,
},
SwapExecuted {
who: SeraiAddress,
send_to: SeraiAddress,
path: BoundedVec<Coin, MaxSwapPathLength>,
amount_in: SubstrateAmount,
amount_out: SubstrateAmount,
/// A swap through the liquidity pools occurred.
Swap {
/// The account which made the swap.
origin: SeraiAddress,
/// The recipient for the output of the swap.
recipient: SeraiAddress,
/// The deltas incurred by the pools.
///
/// For a swap of sriABC to sriDEF, this would be
/// `[Balance { sriABC, 1 }, Balance { SRI, 2 }, Balance { sriDEF, 3 }]`, where
/// `Balance { sriABC, 1 }` was added to the `sriABC-SRI` pool, `Balance { SRI, 2 }` was
/// removed from the `sriABC-SRI` pool and added to the `sriDEF-SRI` pool, and
/// `Balance { sriDEF, 3 }` was removed from the `sriDEF-SRI` pool.
deltas: Vec<Balance>,
},
}
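The `deltas` documentation above, with its `sriABC -> SRI -> sriDEF` example, can be sketched as a two-hop swap. This assumes constant-product (`x * y = k`) pools with no fee; the reserve figures and the `swap_output` helper are hypothetical, not Serai's actual DEX math.

```rust
// Sketch of deriving the `deltas` of a two-hop Swap event, assuming
// constant-product pools. Reserves and fee handling are illustrative.
fn swap_output(reserve_in: u64, reserve_out: u64, amount_in: u64) -> u64 {
  // Constant-product formula without fees: out = (reserve_out * in) / (reserve_in + in)
  ((u128::from(reserve_out) * u128::from(amount_in)) /
    (u128::from(reserve_in) + u128::from(amount_in))) as u64
}

fn main() {
  // sriABC -> SRI via the sriABC-SRI pool, then SRI -> sriDEF via the sriDEF-SRI pool.
  let abc_in = 100;
  let sri_through = swap_output(1_000, 2_000, abc_in); // sriABC-SRI pool reserves
  let def_out = swap_output(3_000, 1_500, sri_through); // sriDEF-SRI pool reserves
  // Deltas as in the event docs: sriABC added to the first pool, SRI removed from
  // the first pool and added to the second, sriDEF removed from the second.
  let deltas = [("sriABC", abc_in), ("SRI", sri_through), ("sriDEF", def_out)];
  assert!(deltas.iter().all(|(_, amount)| *amount > 0));
  println!("{deltas:?}");
}
```

Recording the per-pool deltas, rather than just input and output, lets an observer reconstruct every reserve change the swap caused without knowing the path in advance.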

View File

@@ -1,8 +1,13 @@
use serai_primitives::ExternalNetworkId;
use borsh::{BorshSerialize, BorshDeserialize};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
use serai_primitives::network_id::ExternalNetworkId;
/// An event from economic security.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
EconomicSecurityReached { network: ExternalNetworkId },
/// Economic security was achieved for a network's validator set.
EconomicSecurityAchieved {
/// The network whose validator set achieved economic security.
network: ExternalNetworkId,
},
}

View File

@@ -1 +0,0 @@
pub use serai_emissions_primitives as primitives;

View File

@@ -1,20 +1,50 @@
pub use serai_genesis_liquidity_primitives as primitives;
use borsh::{BorshSerialize, BorshDeserialize};
use serai_primitives::*;
use primitives::*;
use serai_primitives::{
crypto::Signature, address::SeraiAddress, balance::ExternalBalance, genesis::GenesisValues,
};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
/// A call to the genesis liquidity.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
remove_coin_liquidity { balance: ExternalBalance },
oraclize_values { values: Values, signature: Signature },
/// Oraclize the value of non-Bitcoin external coins relative to Bitcoin.
oraclize_values {
/// The values of the non-Bitcoin external coins.
values: GenesisValues,
/// The signature by the genesis validators for these values.
signature: Signature,
},
/// Remove liquidity.
remove_liquidity {
/// The genesis liquidity to remove.
balance: ExternalBalance,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
pub enum Event {
GenesisLiquidityAdded { by: SeraiAddress, balance: ExternalBalance },
GenesisLiquidityRemoved { by: SeraiAddress, balance: ExternalBalance },
GenesisLiquidityAddedToPool { coin: ExternalBalance, sri: Amount },
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::oraclize_values { .. } => false,
Call::remove_liquidity { .. } => true,
}
}
}
/// An event from the genesis liquidity.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
/// Genesis liquidity added.
GenesisLiquidityAdded {
/// The recipient of the genesis liquidity.
recipient: SeraiAddress,
/// The coins added as genesis liquidity.
balance: ExternalBalance,
},
/// Genesis liquidity removed.
GenesisLiquidityRemoved {
/// The account which removed the genesis liquidity.
origin: SeraiAddress,
/// The amount of genesis liquidity removed.
balance: ExternalBalance,
},
}

View File

@@ -1,25 +0,0 @@
use sp_consensus_grandpa::EquivocationProof;
use serai_primitives::{BlockNumber, SeraiAddress};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
pub struct ReportEquivocation {
pub equivocation_proof: alloc::boxed::Box<EquivocationProof<[u8; 32], BlockNumber>>,
pub key_owner_proof: SeraiAddress,
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
pub enum Call {
report_equivocation(ReportEquivocation),
report_equivocation_unsigned(ReportEquivocation),
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
pub enum Event {
NewAuthorities { authority_set: alloc::vec::Vec<(SeraiAddress, u64)> },
// TODO: Remove these
Paused,
Resumed,
}

View File

@@ -1,30 +1,47 @@
use serai_primitives::*;
use borsh::{BorshSerialize, BorshDeserialize};
pub use serai_in_instructions_primitives as primitives;
use primitives::SignedBatch;
use serai_validator_sets_primitives::Session;
use serai_primitives::{
BlockHash, network_id::ExternalNetworkId, validator_sets::Session, instructions::SignedBatch,
};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
/// A call to `InInstruction`s.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
execute_batch { batch: SignedBatch },
/// Execute a batch of `InInstruction`s.
execute_batch {
/// The batch to execute.
batch: SignedBatch,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::execute_batch { .. } => false,
}
}
}
/// An event from `InInstruction`s.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
/// A batch of `InInstruction`s was executed.
Batch {
/// The network for which a batch was executed.
network: ExternalNetworkId,
/// The session which published the batch.
publishing_session: Session,
/// The ID of the batch.
id: u32,
/// The hash of the block on the external network which caused this batch's creation.
external_network_block_hash: BlockHash,
/// The hash of the `InInstruction`s within this batch.
in_instructions_hash: [u8; 32],
/// The results of each `InInstruction` within the batch.
#[borsh(
serialize_with = "serai_primitives::sp_borsh::borsh_serialize_bitvec",
deserialize_with = "serai_primitives::sp_borsh::borsh_deserialize_bitvec"
)]
in_instruction_results: bitvec::vec::BitVec<u8, bitvec::order::Lsb0>,
},
Halt {
network: ExternalNetworkId,
},
}

View File

@@ -1,94 +1,98 @@
#![cfg_attr(docsrs, feature(doc_cfg))]
#![cfg_attr(docsrs, feature(doc_auto_cfg))]
#![doc = include_str!("../README.md")]
#![deny(missing_docs)]
#![cfg_attr(not(feature = "std"), no_std)]
#![allow(non_camel_case_types)]
extern crate alloc;
use borsh::{BorshSerialize, BorshDeserialize};
pub use serai_primitives as primitives;
/// Call/Event for the system.
pub mod system;
pub mod timestamp;
/// Call/Event for coins.
pub mod coins;
pub mod liquidity_tokens;
pub mod dex;
/// Call/Event for validator sets.
pub mod validator_sets;
pub mod genesis_liquidity;
pub mod emissions;
pub mod economic_security;
pub mod in_instructions;
/// Call/Event for signals.
pub mod signals;
pub mod babe;
pub mod grandpa;
/// Call/Event for the DEX.
pub mod dex;
pub mod tx;
/// Call/Event for genesis liquidity.
pub mod genesis_liquidity;
/// Event for economic security.
pub mod economic_security;
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
/// Call/Event for `InInstruction`s.
pub mod in_instructions;
mod transaction;
pub use transaction::*;
mod block;
pub use block::*;
/// All calls.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
#[borsh(use_discriminant = true)]
#[repr(u8)]
pub enum Call {
Timestamp(timestamp::Call),
Coins(coins::Call),
LiquidityTokens(liquidity_tokens::Call),
Dex(dex::Call),
ValidatorSets(validator_sets::Call),
GenesisLiquidity(genesis_liquidity::Call),
InInstructions(in_instructions::Call),
Signals(signals::Call),
Babe(babe::Call),
Grandpa(grandpa::Call),
// The call for the system.
// System(system::Call) = 0,
/// The call for coins.
Coins(coins::Call) = 1,
/// The call for validator sets.
ValidatorSets(validator_sets::Call) = 2,
/// The call for signals.
Signals(signals::Call) = 3,
/// The call for the DEX.
Dex(dex::Call) = 4,
/// The call for genesis liquidity.
GenesisLiquidity(genesis_liquidity::Call) = 5,
// The call for economic security.
// EconomicSecurity = 6,
/// The call for `InInstruction`s.
InInstructions(in_instructions::Call) = 7,
}
// TODO: Remove this
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
pub enum TransactionPaymentEvent {
TransactionFeePaid { who: serai_primitives::SeraiAddress, actual_fee: u64, tip: u64 },
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::Coins(call) => call.is_signed(),
Call::ValidatorSets(call) => call.is_signed(),
Call::Signals(call) => call.is_signed(),
Call::Dex(call) => call.is_signed(),
Call::GenesisLiquidity(call) => call.is_signed(),
Call::InInstructions(call) => call.is_signed(),
}
}
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
/// All events.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
#[borsh(use_discriminant = true)]
#[repr(u8)]
pub enum Event {
System(system::Event),
Timestamp,
TransactionPayment(TransactionPaymentEvent),
Coins(coins::Event),
LiquidityTokens(liquidity_tokens::Event),
Dex(dex::Event),
ValidatorSets(validator_sets::Event),
GenesisLiquidity(genesis_liquidity::Event),
Emissions,
EconomicSecurity(economic_security::Event),
InInstructions(in_instructions::Event),
Signals(signals::Event),
Babe,
Grandpa(grandpa::Event),
/// The event for the system.
System(system::Event) = 0,
/// The event for coins.
Coins(coins::Event) = 1,
/// The event for validator sets.
ValidatorSets(validator_sets::Event) = 2,
/// The event for signals.
Signals(signals::Event) = 3,
/// The event for the DEX.
Dex(dex::Event) = 4,
/// The event for genesis liquidity.
GenesisLiquidity(genesis_liquidity::Event) = 5,
/// The event for economic security.
EconomicSecurity(economic_security::Event) = 6,
/// The event for `InInstruction`s.
InInstructions(in_instructions::Event) = 7,
}
#[derive(Clone, Copy, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
pub struct Extra {
pub era: sp_runtime::generic::Era,
#[codec(compact)]
pub nonce: u32,
#[codec(compact)]
pub tip: u64,
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
pub struct SignedPayloadExtra {
pub spec_version: u32,
pub tx_version: u32,
pub genesis: [u8; 32],
pub mortality_checkpoint: [u8; 32],
}
pub type Transaction = tx::Transaction<Call, Extra>;

View File

@@ -1,18 +0,0 @@
use serai_primitives::{Balance, SeraiAddress};
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
pub enum Call {
burn { balance: Balance },
transfer { to: SeraiAddress, balance: Balance },
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))]
pub enum Event {
Mint { to: SeraiAddress, balance: Balance },
Burn { from: SeraiAddress, balance: Balance },
Transfer { from: SeraiAddress, to: SeraiAddress, balance: Balance },
}

View File

@@ -1,59 +1,132 @@
use serai_primitives::{NetworkId, SeraiAddress};
use borsh::{BorshSerialize, BorshDeserialize};
use serai_validator_sets_primitives::ValidatorSet;
use serai_primitives::{
address::SeraiAddress, network_id::NetworkId, validator_sets::ValidatorSet, signals::Signal,
};
pub use serai_signals_primitives as primitives;
use primitives::SignalId;
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
/// A call to signals.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
register_retirement_signal { in_favor_of: [u8; 32] },
revoke_retirement_signal { retirement_signal_id: [u8; 32] },
favor { signal_id: SignalId, for_network: NetworkId },
revoke_favor { signal_id: SignalId, for_network: NetworkId },
stand_against { signal_id: SignalId, for_network: NetworkId },
/// Register a retirement signal.
register_retirement_signal {
/// The protocol favored over the current protocol.
in_favor_of: [u8; 32],
},
/// Revoke a retirement signal.
revoke_retirement_signal {
/// The protocol which was favored over the current protocol.
was_in_favor_of: [u8; 32],
},
/// Favor a signal.
favor {
/// The signal to favor.
signal: Signal,
/// The network this validator is expressing favor with.
///
/// A validator may be an active validator for multiple networks. The validator must specify
/// which network they're expressing favor with in this call.
with_network: NetworkId,
},
/// Revoke favor for a signal.
revoke_favor {
/// The signal to revoke favor for.
signal: Signal,
/// The network this validator is revoking favor with.
///
/// A validator may have expressed favor with multiple networks. The validator must specify
/// which network they're revoking favor with in this call.
with_network: NetworkId,
},
/// Stand against a signal.
///
/// This has no effect other than emitting an event that this signal is stood against. If the
/// origin has previously expressed favor, they must still call `revoke_favor` for each network
/// they expressed favor with.
stand_against {
/// The signal to stand against.
signal: Signal,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::register_retirement_signal { .. } |
Call::revoke_retirement_signal { .. } |
Call::favor { .. } |
Call::revoke_favor { .. } |
Call::stand_against { .. } => true,
}
}
}
/// An event from signals.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
/// A retirement signal has been registered.
RetirementSignalRegistered {
signal_id: [u8; 32],
/// The retirement signal's ID.
signal: [u8; 32],
/// The protocol retirement is proposed in favor of.
in_favor_of: [u8; 32],
/// The address which registered this signal.
registrant: SeraiAddress,
},
/// A retirement signal was revoked.
RetirementSignalRevoked {
signal_id: [u8; 32],
/// The retirement signal's ID.
signal: [u8; 32],
},
/// A signal was favored.
SignalFavored {
signal_id: SignalId,
/// The signal favored.
signal: Signal,
/// The validator the signal was favored by.
by: SeraiAddress,
for_network: NetworkId,
},
SetInFavor {
signal_id: SignalId,
set: ValidatorSet,
},
RetirementSignalLockedIn {
signal_id: [u8; 32],
},
SetNoLongerInFavor {
signal_id: SignalId,
set: ValidatorSet,
/// The network with which the signal was favored.
with_network: NetworkId,
},
/// Favor for a signal was revoked.
FavorRevoked {
signal_id: SignalId,
/// The signal whose favor was revoked.
signal: Signal,
/// The validator who revoked their favor for the signal.
by: SeraiAddress,
for_network: NetworkId,
/// The network with which favor for the signal was revoked.
with_network: NetworkId,
},
/// A supermajority of a validator set now favor a signal.
SetInFavor {
/// The signal which now has a supermajority of a validator set favoring it.
signal: Signal,
/// The validator set which is now considered to favor the signal.
set: ValidatorSet,
},
/// A validator set is no longer considered to favor a signal.
SetNoLongerInFavor {
/// The signal which no longer has the validator set considered in favor of it.
signal: Signal,
/// The validator set which is no longer considered to be in favor of the signal.
set: ValidatorSet,
},
/// A retirement signal has been locked in.
RetirementSignalLockedIn {
/// The signal which has been locked in.
signal: [u8; 32],
},
/// A validator set's ability to publish batches was halted.
///
/// This also effectively halts set rotation, as handovers occur via new sets starting to
/// publish batches.
SetHalted {
/// The signal for which the set was halted.
signal: [u8; 32],
},
/// An account has stood against a signal.
AgainstSignal {
signal_id: SignalId,
/// The signal stood against.
signal: Signal,
/// The account which stood against the signal.
who: SeraiAddress,
for_network: NetworkId,
},
}

View File

@@ -1,13 +1,11 @@
use frame_support::dispatch::{DispatchInfo, DispatchError};
use borsh::{BorshSerialize, BorshDeserialize};
use serai_primitives::SeraiAddress;
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
/// An event from the system.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
ExtrinsicSuccess { dispatch_info: DispatchInfo },
ExtrinsicFailed { dispatch_error: DispatchError, dispatch_info: DispatchInfo },
CodeUpdated,
NewAccount { account: SeraiAddress },
KilledAccount { account: SeraiAddress },
Remarked { sender: SeraiAddress, hash: [u8; 32] },
/// The transaction successfully executed.
TransactionSuccess,
/// The transaction failed to execute.
// TODO: Add an error to this
TransactionFailed,
}

View File

@@ -1,9 +0,0 @@
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
pub enum Call {
set {
#[codec(compact)]
now: u64,
},
}

View File

@@ -0,0 +1,563 @@
use core::num::NonZero;
use alloc::vec::Vec;
use borsh::{io, BorshSerialize, BorshDeserialize};
use sp_core::{ConstU32, bounded::BoundedVec};
use serai_primitives::{BlockHash, address::SeraiAddress, balance::Amount, crypto::Signature};
use crate::Call;
/// The maximum amount of calls allowed in a transaction.
pub const MAX_CALLS: u32 = 8;
/// An error regarding `SignedCalls`.
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum SignedCallsError {
/// No calls were included.
NoCalls,
/// Too many calls were included.
TooManyCalls,
/// An unsigned call was included.
IncludedUnsignedCall,
}
/// A `Vec` of signed calls.
// We don't implement BorshDeserialize due to the invariants maintained on this struct.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize)]
pub struct SignedCalls(
#[borsh(serialize_with = "serai_primitives::sp_borsh::borsh_serialize_bounded_vec")]
BoundedVec<Call, ConstU32<{ MAX_CALLS }>>,
);
impl TryFrom<Vec<Call>> for SignedCalls {
type Error = SignedCallsError;
fn try_from(calls: Vec<Call>) -> Result<Self, Self::Error> {
if calls.is_empty() {
Err(SignedCallsError::NoCalls)?;
}
for call in &calls {
if !call.is_signed() {
Err(SignedCallsError::IncludedUnsignedCall)?;
}
}
calls.try_into().map_err(|_| SignedCallsError::TooManyCalls).map(SignedCalls)
}
}
/// An error regarding `UnsignedCall`.
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum UnsignedCallError {
/// A signed call was specified.
SignedCall,
}
/// An unsigned call.
// We don't implement BorshDeserialize due to the invariants maintained on this struct.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize)]
pub struct UnsignedCall(Call);
impl TryFrom<Call> for UnsignedCall {
type Error = UnsignedCallError;
fn try_from(call: Call) -> Result<Self, Self::Error> {
if call.is_signed() {
Err(UnsignedCallError::SignedCall)?;
}
Ok(UnsignedCall(call))
}
}
/// Part of the context used to sign with, from the protocol.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub struct ImplicitContext {
/// The genesis hash of the blockchain.
pub genesis: BlockHash,
/// The ID of the current protocol.
pub protocol_id: [u8; 32],
}
/// Part of the context used to sign with, specified within the transaction itself.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub struct ExplicitContext {
/// The historic block this transaction builds upon.
///
/// This transaction can not be included in a blockchain which does not include this block.
pub historic_block: BlockHash,
/// The UNIX time this transaction must be included by (and expires after).
///
/// This transaction can not be included in a block whose time is greater than or equal to this value.
pub include_by: Option<NonZero<u64>>,
/// The signer.
pub signer: SeraiAddress,
/// The signer's nonce.
pub nonce: u32,
/// The fee, in SRI, paid to the network for inclusion.
///
/// This fee is paid regardless of the success of any of the calls.
pub fee: Amount,
}
/// A signature, with context.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub struct ContextualizedSignature {
/// The explicit context.
explicit_context: ExplicitContext,
/// The signature.
signature: Signature,
}
/// A Serai transaction.
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum Transaction {
/// An unsigned transaction.
Unsigned {
/// The contained call.
call: UnsignedCall,
},
/// A signed transaction.
Signed {
/// The calls.
///
/// These calls are executed atomically. Either all successfully execute or none do. The
/// transaction's fee is paid regardless.
calls: SignedCalls,
/// The signature for this transaction.
///
/// This is not checked on deserialization and may be invalid.
contextualized_signature: ContextualizedSignature,
},
}
impl BorshSerialize for Transaction {
fn serialize<W: io::Write>(&self, writer: &mut W) -> io::Result<()> {
match self {
Transaction::Unsigned { call } => {
/*
`Signed` `Transaction`s encode the length of their `Vec<Call>` here. Since that `Vec` is
bound to be non-empty, it will never write `0`, enabling `Unsigned` to use it.
The benefit to these not overlapping is the ability to determine whether the `Transaction`
has a signature or not. If this wrote a `1` for the amount of `Call`s present in the
`Transaction`, that `Call` would have to be introspected for whether it's signed. With
the usage of `0`, given how low `MAX_CALLS` is, this `Transaction` can technically be
defined as an enum of
`0 Call, 1 Call ContextualizedSignature, 2 Call Call ContextualizedSignature ...`, to
maintain compatibility with the borsh specification without wrapper functions. The checks
here on `Call` types/quantity could be moved to later validation functions.
*/
writer.write_all(&[0])?;
call.serialize(writer)
}
Transaction::Signed { calls, contextualized_signature } => {
serai_primitives::sp_borsh::borsh_serialize_bounded_vec(&calls.0, writer)?;
contextualized_signature.serialize(writer)
}
}
}
}
impl BorshDeserialize for Transaction {
fn deserialize_reader<R: io::Read>(reader: &mut R) -> io::Result<Self> {
let mut len = [0xff];
reader.read_exact(&mut len)?;
let len = len[0];
if len == 0 {
let call = Call::deserialize_reader(reader)?;
if call.is_signed() {
Err(io::Error::new(io::ErrorKind::Other, "call was signed but marked unsigned"))?;
}
Ok(Transaction::Unsigned { call: UnsignedCall(call) })
} else {
if u32::from(len) > MAX_CALLS {
Err(io::Error::new(io::ErrorKind::Other, "too many calls"))?;
}
let mut calls = BoundedVec::with_bounded_capacity(len.into());
for _ in 0 .. len {
let call = Call::deserialize_reader(reader)?;
if !call.is_signed() {
Err(io::Error::new(io::ErrorKind::Other, "call was unsigned but included as signed"))?;
}
calls.try_push(call).unwrap();
}
let contextualized_signature = ContextualizedSignature::deserialize_reader(reader)?;
Ok(Transaction::Signed { calls: SignedCalls(calls), contextualized_signature })
}
}
}
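The tag-free encoding above can be illustrated with a standalone miniature (these types are hypothetical stand-ins, not the Serai types: a "call" is one byte and a "signature" is four). Signed payloads lead with their non-zero call count, so a leading `0` byte unambiguously marks an unsigned payload:

```rust
const MAX_CALLS: usize = 8;

#[derive(Clone, Debug, PartialEq)]
enum Tx {
    Unsigned { call: u8 },
    Signed { calls: Vec<u8>, signature: [u8; 4] },
}

fn encode(tx: &Tx) -> Vec<u8> {
    match tx {
        // `0` is free to mark unsigned, as a signed call count is never 0
        Tx::Unsigned { call } => vec![0, *call],
        Tx::Signed { calls, signature } => {
            assert!((1 ..= MAX_CALLS).contains(&calls.len()));
            let mut out = vec![calls.len() as u8];
            out.extend_from_slice(calls);
            out.extend_from_slice(signature);
            out
        }
    }
}

fn decode(bytes: &[u8]) -> Option<Tx> {
    let (&len, rest) = bytes.split_first()?;
    if len == 0 {
        // A leading 0 can only be an unsigned transaction with a single call
        let (&call, rest) = rest.split_first()?;
        rest.is_empty().then_some(Tx::Unsigned { call })
    } else {
        // Any other leading byte is a call count, followed by the signature
        let len = usize::from(len);
        if (len > MAX_CALLS) || (rest.len() != (len + 4)) {
            return None;
        }
        Some(Tx::Signed { calls: rest[.. len].to_vec(), signature: rest[len ..].try_into().ok()? })
    }
}
```

This mirrors why the first byte alone suffices to tell the variants apart without introspecting the call itself.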
impl Transaction {
/// The message to sign to produce a signature.
pub fn signature_message(
calls: &SignedCalls,
implicit_context: &ImplicitContext,
explicit_context: &ExplicitContext,
) -> Vec<u8> {
let mut message = Vec::with_capacity(
(calls.0.len() * 64) +
core::mem::size_of::<ImplicitContext>() +
core::mem::size_of::<ExplicitContext>(),
);
calls.serialize(&mut message).unwrap();
implicit_context.serialize(&mut message).unwrap();
explicit_context.serialize(&mut message).unwrap();
message
}
/// The unique hash of this transaction.
///
/// No two transactions on the blockchain will share a hash, making this a unique identifier.
/// For signed transactions, this is due to the `(signer, nonce)` pair present within the
/// `ExplicitContext`. For unsigned transactions, this is due to inherent properties of their
/// execution (e.g. only being able to set a `ValidatorSet`'s keys once).
pub fn hash(&self) -> [u8; 32] {
sp_core::blake2_256(&match self {
Transaction::Unsigned { call } => borsh::to_vec(&call).unwrap(),
Transaction::Signed {
calls,
contextualized_signature: ContextualizedSignature { explicit_context, signature: _ },
} => {
// We explicitly don't hash the signature, so signatures can be replaced in the future if
// desired (such as with half-aggregated Schnorr signatures)
borsh::to_vec(&(calls, explicit_context)).unwrap()
}
})
}
}
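The signature-exclusion property of `hash` can be sketched in isolation (a hypothetical stand-in using std's `DefaultHasher` in place of BLAKE2b-256, with calls and context as opaque byte strings): only the calls and the explicit context feed the hash, never the signature, so a signature could later be replaced without changing the transaction's identifier.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Only the calls and the explicit context are hashed. The signature is
// deliberately never an input, so replacing it (e.g. with a half-aggregated
// Schnorr signature) leaves the transaction's identifier unchanged.
fn tx_id(calls: &[u8], explicit_context: &[u8]) -> u64 {
    let mut hasher = DefaultHasher::new();
    (calls, explicit_context).hash(&mut hasher);
    hasher.finish()
}
```

Uniqueness then rests on the `(signer, nonce)` pair within the explicit context, as the doc comment above notes.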
#[cfg(feature = "substrate")]
mod substrate {
use core::fmt::Debug;
use alloc::vec;
use scale::{Encode, Decode};
use sp_runtime::{
transaction_validity::*,
traits::{Verify, ExtrinsicLike, Dispatchable, ValidateUnsigned, Checkable, Applyable},
Weight,
};
#[rustfmt::skip]
use frame_support::dispatch::{DispatchClass, Pays, DispatchInfo, GetDispatchInfo, PostDispatchInfo};
use super::*;
impl Encode for Transaction {
fn encode(&self) -> Vec<u8> {
borsh::to_vec(self).unwrap()
}
}
impl Decode for Transaction {
fn decode<I: scale::Input>(input: &mut I) -> Result<Self, scale::Error> {
struct ScaleRead<'a, I: scale::Input>(&'a mut I, Option<scale::Error>);
impl<I: scale::Input> borsh::io::Read for ScaleRead<'_, I> {
fn read(&mut self, buf: &mut [u8]) -> borsh::io::Result<usize> {
let remaining_len = self.0.remaining_len().map_err(|err| {
self.1 = Some(err);
borsh::io::Error::new(borsh::io::ErrorKind::Other, "")
})?;
// If we're still calling `read`, we try to read at least one more byte
let to_read = buf.len().min(remaining_len.unwrap_or(1));
self.0.read(&mut buf[.. to_read]).map_err(|err| {
self.1 = Some(err);
borsh::io::Error::new(borsh::io::ErrorKind::Other, "")
})?;
Ok(to_read)
}
}
let mut input = ScaleRead(input, None);
match Self::deserialize_reader(&mut input) {
Ok(res) => Ok(res),
Err(_) => Err(input.1.unwrap()),
}
}
}
/// The context which transactions are executed in.
pub trait TransactionContext: 'static + Send + Sync + Clone + PartialEq + Eq + Debug {
/// The base weight for a signed transaction.
const SIGNED_WEIGHT: Weight;
/// The call type for the runtime.
type RuntimeCall: From<Call>
+ GetDispatchInfo
+ Dispatchable<
RuntimeOrigin: From<Option<SeraiAddress>>,
Info = DispatchInfo,
PostInfo = PostDispatchInfo,
>;
/// The implicit context to verify transactions with.
fn implicit_context() -> ImplicitContext;
/// If a block is present in the blockchain.
fn block_is_present_in_blockchain(&self, hash: &BlockHash) -> bool;
/// The time embedded into the current block.
///
/// Returns `None` if the time has yet to be set.
fn current_time(&self) -> Option<u64>;
/// Get the next nonce for an account.
fn next_nonce(&self, signer: &SeraiAddress) -> u32;
/// If the signer can pay the SRI fee.
fn can_pay_fee(
&self,
signer: &SeraiAddress,
fee: Amount,
) -> Result<(), TransactionValidityError>;
/// Begin execution of a transaction.
fn start_transaction(&self);
/// Consume the next nonce for an account.
fn consume_next_nonce(&self, signer: &SeraiAddress);
/// Have the transaction pay its SRI fee.
fn pay_fee(&self, signer: &SeraiAddress, fee: Amount) -> Result<(), TransactionValidityError>;
/// End execution of a transaction.
fn end_transaction(&self, transaction_hash: [u8; 32]);
}
/// A transaction with the context necessary to evaluate it within Substrate.
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode)]
pub struct TransactionWithContext<Context: TransactionContext>(
Transaction,
#[codec(skip)] Context,
);
impl ExtrinsicLike for Transaction {
fn is_signed(&self) -> Option<bool> {
Some(matches!(self, Transaction::Signed { .. }))
}
fn is_bare(&self) -> bool {
matches!(self, Transaction::Unsigned { .. })
}
}
impl<Context: TransactionContext> GetDispatchInfo for TransactionWithContext<Context> {
fn get_dispatch_info(&self) -> DispatchInfo {
match &self.0 {
Transaction::Unsigned { call } => DispatchInfo {
call_weight: Context::RuntimeCall::from(call.0.clone()).get_dispatch_info().call_weight,
extension_weight: Weight::zero(),
class: DispatchClass::Operational,
pays_fee: Pays::No,
},
Transaction::Signed { calls, .. } => DispatchInfo {
call_weight: calls
.0
.iter()
.cloned()
.map(|call| Context::RuntimeCall::from(call).get_dispatch_info().call_weight)
.fold(Weight::zero(), |accum, item| accum + item),
extension_weight: Context::SIGNED_WEIGHT,
class: DispatchClass::Normal,
pays_fee: Pays::Yes,
},
}
}
}
impl<Context: TransactionContext> Checkable<Context> for Transaction {
type Checked = TransactionWithContext<Context>;
fn check(self, context: &Context) -> Result<Self::Checked, TransactionValidityError> {
match &self {
Transaction::Unsigned { .. } => {}
Transaction::Signed {
calls,
contextualized_signature: ContextualizedSignature { explicit_context, signature },
} => {
if !sp_core::sr25519::Signature::from(*signature).verify(
Transaction::signature_message(calls, &Context::implicit_context(), explicit_context)
.as_slice(),
&sp_core::sr25519::Public::from(explicit_context.signer),
) {
Err(InvalidTransaction::BadProof)?;
}
}
}
Ok(TransactionWithContext(self, context.clone()))
}
#[cfg(feature = "try-runtime")]
fn unchecked_into_checked_i_know_what_i_am_doing(
self,
c: &Context,
) -> Result<Self::Checked, TransactionValidityError> {
// This satisfies the API, not necessarily the intent, yet this fn is only intended to be used
// within tests. Accordingly, it's fine to be stricter than necessary
self.check(c)
}
}
impl<Context: TransactionContext> TransactionWithContext<Context> {
fn validate_except_fee<V: ValidateUnsigned<Call = Context::RuntimeCall>>(
&self,
source: TransactionSource,
mempool_priority_if_signed: u64,
) -> TransactionValidity {
match &self.0 {
Transaction::Unsigned { call } => {
let ValidTransaction { priority: _, requires, provides, longevity: _, propagate: _ } =
V::validate_unsigned(source, &Context::RuntimeCall::from(call.0.clone()))?;
Ok(ValidTransaction {
// We should always try to include unsigned transactions prior to signed
priority: u64::MAX,
requires,
provides,
// This is valid until included
longevity: u64::MAX,
// Ensure this is propagated
propagate: true,
})
}
Transaction::Signed { calls: _, contextualized_signature } => {
let ExplicitContext { historic_block, include_by, signer, nonce, fee: _ } =
&contextualized_signature.explicit_context;
if !self.1.block_is_present_in_blockchain(historic_block) {
// We don't know if this is a block from a fundamentally distinct blockchain or a
// continuation of this blockchain we have yet to sync (which would be `Future`)
Err(TransactionValidityError::Unknown(UnknownTransaction::CannotLookup))?;
}
if let Some(include_by) = *include_by {
if let Some(current_time) = self.1.current_time() {
if current_time >= u64::from(include_by) {
// Since this transaction has a time bound which has passed, error
Err(TransactionValidityError::Invalid(InvalidTransaction::Stale))?;
}
} else {
// Since this transaction has a time bound, yet we don't know the time, error
Err(TransactionValidityError::Invalid(InvalidTransaction::Stale))?;
}
}
match self.1.next_nonce(signer).cmp(nonce) {
core::cmp::Ordering::Less => {
Err(TransactionValidityError::Invalid(InvalidTransaction::Stale))?
}
core::cmp::Ordering::Equal => {}
core::cmp::Ordering::Greater => {
Err(TransactionValidityError::Invalid(InvalidTransaction::Future))?
}
}
let requires = if let Some(prior_nonce) = nonce.checked_sub(1) {
vec![borsh::to_vec(&(signer, prior_nonce)).unwrap()]
} else {
vec![]
};
let provides = vec![borsh::to_vec(&(signer, nonce)).unwrap()];
Ok(ValidTransaction {
priority: mempool_priority_if_signed,
requires,
provides,
// This revalidates the transaction every block. This is required due to this being
// denominated in blocks, and our transaction expiration being denominated in seconds.
longevity: 1,
propagate: true,
})
}
}
}
}
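The `requires`/`provides` construction above can be sketched on its own (a hypothetical miniature with tags rendered as strings rather than borsh-encoded `(signer, nonce)` bytes): a signed transaction provides its own tag and, for any nonce above 0, requires the tag of the prior nonce, letting the pool sequence one signer's pending transactions.

```rust
// Build the mempool dependency tags for a signer at a given nonce.
fn dependency_tags(signer: &str, nonce: u32) -> (Vec<String>, Vec<String>) {
    let requires = match nonce.checked_sub(1) {
        // Depend on the transaction consuming the prior nonce
        Some(prior_nonce) => vec![format!("{signer}:{prior_nonce}")],
        // Nonce 0 is the signer's first transaction, so it depends on nothing
        None => vec![],
    };
    // Offer our own tag for any transaction at the next nonce to depend on
    let provides = vec![format!("{signer}:{nonce}")];
    (requires, provides)
}
```

A pool holding `alice:1` before `alice:0` arrives would park it as `Future` until its required tag is provided.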
impl<Context: TransactionContext> Applyable for TransactionWithContext<Context> {
type Call = Context::RuntimeCall;
fn validate<V: ValidateUnsigned<Call = Context::RuntimeCall>>(
&self,
source: TransactionSource,
info: &DispatchInfo,
_len: usize,
) -> TransactionValidity {
let mempool_priority_if_signed = match &self.0 {
Transaction::Unsigned { .. } => {
// Since this is the priority if signed, and this isn't signed, we return 0
0
}
Transaction::Signed {
calls: _,
contextualized_signature:
ContextualizedSignature { explicit_context: ExplicitContext { signer, fee, .. }, .. },
} => {
self.1.can_pay_fee(signer, *fee)?;
// Prioritize transactions by their fees
// TODO: Re-evaluate this
{
let fee = fee.0;
Weight::from_all(fee).checked_div_per_component(&info.call_weight).unwrap_or(0)
}
}
};
self.validate_except_fee::<V>(source, mempool_priority_if_signed)
}
fn apply<V: ValidateUnsigned<Call = Context::RuntimeCall>>(
self,
_info: &DispatchInfo,
_len: usize,
) -> sp_runtime::ApplyExtrinsicResultWithInfo<PostDispatchInfo> {
// We use 0 for the mempool priority, as this is no longer in the mempool so it's irrelevant
self.validate_except_fee::<V>(TransactionSource::InBlock, 0)?;
// Start the transaction
self.1.start_transaction();
let transaction_hash = self.0.hash();
let res = match self.0 {
Transaction::Unsigned { call } => {
let call = Context::RuntimeCall::from(call.0);
V::pre_dispatch(&call)?;
match call.dispatch(None.into()) {
Ok(res) => Ok(Ok(res)),
// Unsigned transactions should only be included if valid in all regards
Err(_err) => Err(TransactionValidityError::Invalid(InvalidTransaction::Custom(0))),
}
}
Transaction::Signed {
calls,
contextualized_signature:
ContextualizedSignature { explicit_context: ExplicitContext { signer, fee, .. }, .. },
} => {
// Consume the signer's next nonce
self.1.consume_next_nonce(&signer);
// Pay the fee
self.1.pay_fee(&signer, fee)?;
let _res = frame_support::storage::transactional::with_storage_layer(|| {
for call in calls.0 {
let call = Context::RuntimeCall::from(call);
match call.dispatch(Some(signer).into()) {
Ok(_res) => {}
// Because this call errored, don't continue and revert all prior calls
Err(e) => return Err(e),
}
}
Ok(())
});
// We don't care if the individual calls succeeded or failed.
// The transaction was valid for inclusion and the fee was paid.
// Either the calls passed, as desired, or they failed and the storage was reverted.
Ok(Ok(PostDispatchInfo {
// `None` stands for the worst case, which is what we want
actual_weight: None,
// Signed transactions always pay their fee
// TODO: Do we want to handle this so we can not charge fees on removing genesis
// liquidity?
pays_fee: Pays::Yes,
}))
}
};
// TODO: TransactionSuccess/TransactionFailure event?
// End the transaction
self.1.end_transaction(transaction_hash);
res
}
}
}
#[cfg(feature = "substrate")]
pub use substrate::*;
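The `apply` flow above charges the fee outside the storage layer and runs the calls inside one, so a failing call reverts its own effects without unpaying the fee or interrupting block building. A toy sketch of that commit-on-Ok/discard-on-Err semantic (the `Store`/`with_layer` names are hypothetical stand-ins for `with_storage_layer` over real chain storage):

```rust
use std::collections::HashMap;

// Toy key-value store standing in for chain storage
struct Store(HashMap<String, u64>);

impl Store {
    // Mimics `with_storage_layer`: run `f` against a snapshot,
    // commit the snapshot on Ok, discard it on Err.
    fn with_layer<E>(&mut self, f: impl FnOnce(&mut Store) -> Result<(), E>) -> Result<(), E> {
        let mut layer = Store(self.0.clone());
        let res = f(&mut layer);
        if res.is_ok() {
            self.0 = layer.0;
        }
        res
    }
}

fn main() {
    let mut store = Store(HashMap::from([("alice".to_string(), 100u64)]));
    // Pay the fee outside the layer
    *store.0.get_mut("alice").unwrap() -= 10;
    // The calls run inside the layer; this one errors after a partial write
    let res: Result<(), &str> = store.with_layer(|s| {
        s.0.insert("bob".to_string(), 50);
        Err("call errored")
    });
    assert!(res.is_err());
    // The call's write was reverted, yet the fee remains paid
    assert!(!store.0.contains_key("bob"));
    assert_eq!(store.0["alice"], 90);
}
```

This is why `apply` can return `Ok(Ok(..))` even when the inner calls failed: validity for inclusion and fee payment are settled before the revertible layer.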


@@ -1,186 +0,0 @@
use scale::Encode;
use sp_core::sr25519::{Public, Signature};
use sp_runtime::traits::Verify;
use serai_primitives::SeraiAddress;
use frame_support::dispatch::GetDispatchInfo;
pub trait TransactionMember:
Clone + PartialEq + Eq + core::fmt::Debug + scale::Encode + scale::Decode + scale_info::TypeInfo
{
}
impl<
T: Clone
+ PartialEq
+ Eq
+ core::fmt::Debug
+ scale::Encode
+ scale::Decode
+ scale_info::TypeInfo,
> TransactionMember for T
{
}
type TransactionEncodeAs<'a, Extra> =
(&'a crate::Call, &'a Option<(SeraiAddress, Signature, Extra)>);
type TransactionDecodeAs<Extra> = (crate::Call, Option<(SeraiAddress, Signature, Extra)>);
// We use our own Transaction struct, over UncheckedExtrinsic, for more control, a bit more
// simplicity, and in order to be immune to https://github.com/paritytech/polkadot-sdk/issues/2947
#[allow(private_bounds)]
#[derive(Clone, PartialEq, Eq, Debug)]
pub struct Transaction<
Call: 'static + TransactionMember + From<crate::Call>,
Extra: 'static + TransactionMember,
> {
call: crate::Call,
mapped_call: Call,
signature: Option<(SeraiAddress, Signature, Extra)>,
}
impl<Call: 'static + TransactionMember + From<crate::Call>, Extra: 'static + TransactionMember>
Transaction<Call, Extra>
{
pub fn new(call: crate::Call, signature: Option<(SeraiAddress, Signature, Extra)>) -> Self {
Self { call: call.clone(), mapped_call: call.into(), signature }
}
pub fn call(&self) -> &crate::Call {
&self.call
}
}
impl<Call: 'static + TransactionMember + From<crate::Call>, Extra: 'static + TransactionMember>
scale::Encode for Transaction<Call, Extra>
{
fn using_encoded<R, F: FnOnce(&[u8]) -> R>(&self, f: F) -> R {
let tx: TransactionEncodeAs<Extra> = (&self.call, &self.signature);
tx.using_encoded(f)
}
}
impl<Call: 'static + TransactionMember + From<crate::Call>, Extra: 'static + TransactionMember>
scale::Decode for Transaction<Call, Extra>
{
fn decode<I: scale::Input>(input: &mut I) -> Result<Self, scale::Error> {
let (call, signature) = TransactionDecodeAs::decode(input)?;
let mapped_call = Call::from(call.clone());
Ok(Self { call, mapped_call, signature })
}
}
impl<Call: 'static + TransactionMember + From<crate::Call>, Extra: 'static + TransactionMember>
scale_info::TypeInfo for Transaction<Call, Extra>
{
type Identity = TransactionDecodeAs<Extra>;
// Define the type info as the info of the type equivalent to what we encode as
fn type_info() -> scale_info::Type {
TransactionDecodeAs::<Extra>::type_info()
}
}
#[cfg(feature = "serde")]
mod _serde {
use scale::Encode;
use serde::ser::*;
use super::*;
impl<Call: 'static + TransactionMember + From<crate::Call>, Extra: 'static + TransactionMember>
Serialize for Transaction<Call, Extra>
{
fn serialize<S: Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
let encoded = self.encode();
serializer.serialize_bytes(&encoded)
}
}
#[cfg(feature = "std")]
use serde::de::*;
#[cfg(feature = "std")]
impl<
'a,
Call: 'static + TransactionMember + From<crate::Call>,
Extra: 'static + TransactionMember,
> Deserialize<'a> for Transaction<Call, Extra>
{
fn deserialize<D: Deserializer<'a>>(de: D) -> Result<Self, D::Error> {
let bytes = sp_core::bytes::deserialize(de)?;
<Self as scale::Decode>::decode(&mut &bytes[..])
.map_err(|e| serde::de::Error::custom(format!("invalid transaction: {e}")))
}
}
}
impl<
Call: 'static + TransactionMember + From<crate::Call> + TryInto<crate::Call>,
Extra: 'static + TransactionMember,
> sp_runtime::traits::Extrinsic for Transaction<Call, Extra>
{
type Call = Call;
type SignaturePayload = (SeraiAddress, Signature, Extra);
fn is_signed(&self) -> Option<bool> {
Some(self.signature.is_some())
}
fn new(call: Call, signature: Option<Self::SignaturePayload>) -> Option<Self> {
Some(Self { call: call.clone().try_into().ok()?, mapped_call: call, signature })
}
}
impl<
Call: 'static + TransactionMember + From<crate::Call> + TryInto<crate::Call>,
Extra: 'static + TransactionMember,
> frame_support::traits::ExtrinsicCall for Transaction<Call, Extra>
{
fn call(&self) -> &Call {
&self.mapped_call
}
}
impl<
Call: 'static + TransactionMember + From<crate::Call>,
Extra: 'static + TransactionMember + sp_runtime::traits::SignedExtension,
> sp_runtime::traits::ExtrinsicMetadata for Transaction<Call, Extra>
{
type SignedExtensions = Extra;
const VERSION: u8 = 0;
}
impl<
Call: 'static + TransactionMember + From<crate::Call> + GetDispatchInfo,
Extra: 'static + TransactionMember,
> GetDispatchInfo for Transaction<Call, Extra>
{
fn get_dispatch_info(&self) -> frame_support::dispatch::DispatchInfo {
self.mapped_call.get_dispatch_info()
}
}
impl<
Call: 'static + TransactionMember + From<crate::Call>,
Extra: 'static + TransactionMember + sp_runtime::traits::SignedExtension,
> sp_runtime::traits::BlindCheckable for Transaction<Call, Extra>
{
type Checked = sp_runtime::generic::CheckedExtrinsic<Public, Call, Extra>;
fn check(
self,
) -> Result<Self::Checked, sp_runtime::transaction_validity::TransactionValidityError> {
Ok(match self.signature {
Some((signer, signature, extra)) => {
if !signature.verify(
(&self.call, &extra, extra.additional_signed()?).encode().as_slice(),
&signer.into(),
) {
Err(sp_runtime::transaction_validity::InvalidTransaction::BadProof)?
}
sp_runtime::generic::CheckedExtrinsic {
signed: Some((signer.into(), extra)),
function: self.mapped_call,
}
}
None => sp_runtime::generic::CheckedExtrinsic { signed: None, function: self.mapped_call },
})
}
}
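Note the payload checked in `check` above is `(call, extra, additional_signed).encode()`: the signature also covers data the chain derives itself (e.g. the genesis hash) which never travels on the wire. A toy sketch of why that binds a transaction to its implicit context (`signed_payload` is a hypothetical stand-in, with SCALE tuple encoding approximated as field concatenation):

```rust
// Stand-in for `(call, &extra, additional_signed).encode()`
fn signed_payload(call: &[u8], extra: u32, additional_signed: [u8; 32]) -> Vec<u8> {
    let mut payload = call.to_vec();
    payload.extend_from_slice(&extra.to_le_bytes());
    payload.extend_from_slice(&additional_signed);
    payload
}

fn main() {
    let genesis_a = [1u8; 32];
    let genesis_b = [2u8; 32];
    // Identical wire bytes (call + extra) yield different signed payloads
    // under different implicit contexts, so a signature made for one chain
    // fails `verify` on another
    assert_ne!(
        signed_payload(b"transfer", 0, genesis_a),
        signed_payload(b"transfer", 0, genesis_b),
    );
}
```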


@@ -1,79 +1,145 @@
use borsh::{BorshSerialize, BorshDeserialize};
use sp_core::{ConstU32, bounded::BoundedVec};
pub use serai_validator_sets_primitives as primitives;
use serai_primitives::{
crypto::{ExternalKey, KeyPair, Signature},
address::SeraiAddress,
balance::Amount,
network_id::*,
validator_sets::*,
};
use serai_primitives::*;
use serai_validator_sets_primitives::*;
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
/// A call to the validator sets.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Call {
/// Set the keys for a validator set.
set_keys {
network: ExternalNetworkId,
/// The validator set which is setting their keys.
validator_set: ExternalValidatorSet,
/// The keys being set.
key_pair: KeyPair,
/// The participants in the validator set who signed off on these keys.
// TODO: Bound
#[borsh(
serialize_with = "serai_primitives::sp_borsh::borsh_serialize_bitvec",
deserialize_with = "serai_primitives::sp_borsh::borsh_deserialize_bitvec"
)]
signature_participants: bitvec::vec::BitVec<u8, bitvec::order::Lsb0>,
/// The signature confirming these keys are valid.
signature: Signature,
},
set_embedded_elliptic_curve_key {
embedded_elliptic_curve: EmbeddedEllipticCurve,
key: BoundedVec<u8, ConstU32<{ MAX_KEY_LEN }>>,
},
/// Report a validator set's slashes onto Serai.
report_slashes {
network: ExternalNetworkId,
/// The validator set which is reporting their slashes.
validator_set: ExternalValidatorSet,
/// The slashes they're reporting.
slashes: SlashReport,
/// The signature confirming the validity of this slash report.
signature: Signature,
},
/// Set a validator's keys on embedded elliptic curves for a specific network.
set_embedded_elliptic_curve_keys {
/// The network the origin is setting their embedded elliptic curve keys for.
network: ExternalNetworkId,
/// The keys on the embedded elliptic curves.
///
/// This may be a single key if the external network uses the same embedded elliptic curve as
/// used for the key to oraclize onto Serai.
#[borsh(
serialize_with = "serai_primitives::sp_borsh::borsh_serialize_bounded_vec",
deserialize_with = "serai_primitives::sp_borsh::borsh_deserialize_bounded_vec"
)]
keys: BoundedVec<u8, ConstU32<{ 2 * ExternalKey::MAX_LEN }>>,
},
/// Allocate stake to a network.
allocate {
/// The network to allocate stake to.
network: NetworkId,
/// The amount of stake to allocate.
amount: Amount,
},
/// Deallocate stake from a network.
///
/// This deallocation may be immediate or may be delayed, depending on whether the origin is an
/// active, or even recent, validator. If delayed, it will have to be claimed at a later time.
deallocate {
/// The network to deallocate stake from.
network: NetworkId,
/// The amount of stake to deallocate.
amount: Amount,
},
/// Claim a now-unlocked deallocation.
claim_deallocation {
network: NetworkId,
session: Session,
/// The validator set until which claiming the deallocation was delayed.
deallocation: ValidatorSet,
},
}
#[derive(Clone, PartialEq, Eq, Debug, scale::Encode, scale::Decode, scale_info::TypeInfo)]
#[cfg_attr(feature = "borsh", derive(borsh::BorshSerialize, borsh::BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(serde::Serialize))]
#[cfg_attr(all(feature = "std", feature = "serde"), derive(serde::Deserialize))]
impl Call {
pub(crate) fn is_signed(&self) -> bool {
match self {
Call::set_keys { .. } | Call::report_slashes { .. } => false,
Call::set_embedded_elliptic_curve_keys { .. } |
Call::allocate { .. } |
Call::deallocate { .. } |
Call::claim_deallocation { .. } => true,
}
}
}
/// An event from the validator sets.
#[derive(Clone, PartialEq, Eq, Debug, BorshSerialize, BorshDeserialize)]
pub enum Event {
/// A new validator set was declared.
NewSet {
/// The set declared.
set: ValidatorSet,
},
ParticipantRemoved {
set: ValidatorSet,
removed: SeraiAddress,
},
KeyGen {
/// A validator set has set their keys.
SetKeys {
/// The set which set their keys.
set: ExternalValidatorSet,
/// The keys set.
key_pair: KeyPair,
},
/// A validator set has accepted responsibility from the prior validator set.
AcceptedHandover {
/// The set which accepted responsibility from the prior set.
set: ValidatorSet,
},
/// A validator set has retired.
SetRetired {
/// The set retired.
set: ValidatorSet,
},
AllocationIncreased {
/// A validator's allocation to a network has increased.
Allocation {
/// The validator who increased their allocation.
validator: SeraiAddress,
/// The network the stake was allocated to.
network: NetworkId,
/// The amount of stake allocated.
amount: Amount,
},
AllocationDecreased {
/// A validator's allocation to a network has decreased.
Deallocation {
/// The validator who decreased their allocation.
validator: SeraiAddress,
/// The network the stake was deallocated from.
network: NetworkId,
/// The amount of stake deallocated.
amount: Amount,
/// The session which claiming the deallocation was delayed until.
delayed_until: Option<Session>,
},
/// A validator's deallocation from a network has been claimed.
///
/// This is only emitted for deallocations which were delayed and had to be explicitly claimed.
DeallocationClaimed {
/// The validator who claimed their deallocation.
validator: SeraiAddress,
network: NetworkId,
session: Session,
/// The validator set the deallocation was delayed until.
deallocation: ValidatorSet,
},
}
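The deallocation flow documented above (immediate release, or a delay recorded as `delayed_until` followed by an explicit `claim_deallocation`) can be sketched with toy types. All names and the two-session delay are illustrative assumptions, not the pallet's actual rules:

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum Deallocation {
    Immediate { amount: u64 },
    // Locked until the given session, then claimable
    Delayed { amount: u64, delayed_until: u64 },
}

fn deallocate(amount: u64, active_validator: bool, current_session: u64) -> Deallocation {
    if active_validator {
        // Assumed delay: active (or recent) validators wait out an
        // accountability period before their stake unlocks
        Deallocation::Delayed { amount, delayed_until: current_session + 2 }
    } else {
        Deallocation::Immediate { amount }
    }
}

fn claim(dealloc: Deallocation, current_session: u64) -> Result<u64, &'static str> {
    match dealloc {
        Deallocation::Immediate { amount } => Ok(amount),
        Deallocation::Delayed { amount, delayed_until } if current_session >= delayed_until => {
            Ok(amount)
        }
        Deallocation::Delayed { .. } => Err("deallocation still locked"),
    }
}

fn main() {
    let delayed = deallocate(100, true, 5);
    assert!(claim(delayed, 6).is_err());
    assert_eq!(claim(delayed, 7), Ok(100));
    // Non-validators deallocate immediately
    assert_eq!(claim(deallocate(50, false, 5), 5), Ok(50));
}
```

This is also why `DeallocationClaimed` only exists for delayed deallocations: immediate ones never enter the locked state.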


@@ -31,9 +31,9 @@ serde_json = { version = "1", optional = true }
serai-abi = { path = "../abi", version = "0.1" }
multiaddr = { version = "0.18", optional = true }
sp-core = { git = "https://github.com/serai-dex/substrate", optional = true }
sp-runtime = { git = "https://github.com/serai-dex/substrate", optional = true }
frame-system = { git = "https://github.com/serai-dex/substrate", optional = true }
sp-core = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", optional = true }
sp-runtime = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", optional = true }
frame-system = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", optional = true }
async-lock = "3"
@@ -60,8 +60,7 @@ dockertest = "0.5"
serai-docker-tests = { path = "../../tests/docker" }
[features]
serai = ["thiserror/std", "serde", "serde_json", "serai-abi/serde", "multiaddr", "sp-core", "sp-runtime", "frame-system", "simple-request"]
borsh = ["serai-abi/borsh"]
serai = ["thiserror/std", "serde", "serde_json", "multiaddr", "sp-core", "sp-runtime", "frame-system", "simple-request"]
networks = []
bitcoin = ["networks", "dep:bitcoin"]


@@ -1,4 +1,4 @@
use sp_core::bounded_vec::BoundedVec;
use sp_core::bounded::BoundedVec;
use serai_abi::primitives::{Amount, Coin, ExternalCoin, SeraiAddress};
use crate::{SeraiError, TemporalSerai};


@@ -6,7 +6,7 @@ use rand_core::OsRng;
use sp_core::{
ConstU32,
bounded_vec::BoundedVec,
bounded::BoundedVec,
sr25519::{Pair, Signature},
Pair as PairTrait,
};


@@ -1,6 +1,6 @@
use rand_core::{RngCore, OsRng};
use sp_core::{Pair as PairTrait, bounded_vec::BoundedVec};
use sp_core::{Pair as PairTrait, bounded::BoundedVec};
use serai_abi::in_instructions::primitives::DexCall;


@@ -3,7 +3,7 @@ name = "serai-coins-pallet"
version = "0.1.0"
description = "Coins pallet for Serai"
license = "AGPL-3.0-only"
repository = "https://github.com/serai-dex/serai/tree/develop/substrate/coins/pallet"
repository = "https://github.com/serai-dex/serai/tree/develop/substrate/coins"
authors = ["Akil Demir <akildemir72@gmail.com>"]
edition = "2021"
rust-version = "1.80"
@@ -22,20 +22,19 @@ workspace = true
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive"] }
scale-info = { version = "2", default-features = false, features = ["derive"] }
frame-system = { git = "https://github.com/serai-dex/substrate", default-features = false }
frame-support = { git = "https://github.com/serai-dex/substrate", default-features = false }
frame-system = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
frame-support = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-core = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-std = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-core = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-std = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
pallet-transaction-payment = { git = "https://github.com/serai-dex/substrate", default-features = false }
pallet-transaction-payment = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
serai-primitives = { path = "../../primitives", default-features = false, features = ["serde"] }
coins-primitives = { package = "serai-coins-primitives", path = "../primitives", default-features = false }
serai-primitives = { path = "../primitives", default-features = false }
[dev-dependencies]
sp-io = { git = "https://github.com/serai-dex/substrate", default-features = false, features = ["std"] }
sp-io = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false, features = ["std"] }
[features]
std = [
@@ -49,7 +48,6 @@ std = [
"pallet-transaction-payment/std",
"serai-primitives/std",
"coins-primitives/std",
]
try-runtime = [


@@ -1,35 +0,0 @@
[package]
name = "serai-coins-primitives"
version = "0.1.0"
description = "Serai coins primitives"
license = "MIT"
authors = ["Luke Parker <lukeparker5132@gmail.com>"]
edition = "2021"
rust-version = "1.80"
[package.metadata.docs.rs]
all-features = true
rustdoc-args = ["--cfg", "docsrs"]
[lints]
workspace = true
[dependencies]
zeroize = { version = "^1.5", features = ["derive"], optional = true }
borsh = { version = "1", default-features = false, features = ["derive", "de_strict_order"], optional = true }
serde = { version = "1", default-features = false, features = ["derive", "alloc"], optional = true }
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive"] }
scale-info = { version = "2", default-features = false, features = ["derive"] }
serai-primitives = { path = "../../primitives", default-features = false }
[dev-dependencies]
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
[features]
std = ["zeroize", "borsh?/std", "serde?/std", "scale/std", "scale-info/std", "sp-runtime/std", "serai-primitives/std"]
borsh = ["dep:borsh", "serai-primitives/borsh"]
serde = ["dep:serde", "serai-primitives/serde"]
default = ["std"]


@@ -1,21 +0,0 @@
MIT License
Copyright (c) 2023 Luke Parker
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -1,54 +0,0 @@
#![cfg_attr(docsrs, feature(doc_cfg))]
#![cfg_attr(docsrs, feature(doc_auto_cfg))]
#![cfg_attr(not(feature = "std"), no_std)]
#[cfg(feature = "std")]
use zeroize::Zeroize;
#[cfg(feature = "borsh")]
use borsh::{BorshSerialize, BorshDeserialize};
#[cfg(feature = "serde")]
use serde::{Serialize, Deserialize};
use scale::{Encode, Decode, MaxEncodedLen};
use scale_info::TypeInfo;
use serai_primitives::{ExternalBalance, SeraiAddress, ExternalAddress, system_address};
pub const FEE_ACCOUNT: SeraiAddress = system_address(b"Coins-fees");
// TODO: Replace entirely with just Address
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
#[cfg_attr(feature = "std", derive(Zeroize))]
#[cfg_attr(feature = "borsh", derive(BorshSerialize, BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct OutInstruction {
pub address: ExternalAddress,
}
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
#[cfg_attr(feature = "std", derive(Zeroize))]
#[cfg_attr(feature = "borsh", derive(BorshSerialize, BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct OutInstructionWithBalance {
pub instruction: OutInstruction,
pub balance: ExternalBalance,
}
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
#[cfg_attr(feature = "std", derive(Zeroize))]
#[cfg_attr(feature = "borsh", derive(BorshSerialize, BorshDeserialize))]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum Destination {
Native(SeraiAddress),
External(OutInstruction),
}
#[test]
fn address() {
use sp_runtime::traits::TrailingZeroInput;
assert_eq!(
FEE_ACCOUNT,
SeraiAddress::decode(&mut TrailingZeroInput::new(b"Coins-fees")).unwrap()
);
}


@@ -3,7 +3,7 @@ name = "serai-dex-pallet"
version = "0.1.0"
description = "DEX pallet for Serai"
license = "AGPL-3.0-only"
repository = "https://github.com/serai-dex/serai/tree/develop/substrate/dex/pallet"
repository = "https://github.com/serai-dex/serai/tree/develop/substrate/dex"
authors = ["Parity Technologies <admin@parity.io>, Akil Demir <akildemir72@gmail.com>"]
edition = "2021"
rust-version = "1.80"
@@ -22,19 +22,19 @@ workspace = true
scale = { package = "parity-scale-codec", version = "3.6.1", default-features = false }
scale-info = { version = "2.5.0", default-features = false, features = ["derive"] }
sp-std = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-io = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-api = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-core = { git = "https://github.com/serai-dex/substrate", default-features = false }
sp-std = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-io = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-api = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-core = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
frame-system = { git = "https://github.com/serai-dex/substrate", default-features = false }
frame-support = { git = "https://github.com/serai-dex/substrate", default-features = false }
frame-benchmarking = { git = "https://github.com/serai-dex/substrate", default-features = false, optional = true }
frame-system = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
frame-support = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
frame-benchmarking = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false, optional = true }
coins-pallet = { package = "serai-coins-pallet", path = "../../coins/pallet", default-features = false }
coins-pallet = { package = "serai-coins-pallet", path = "../coins", default-features = false }
serai-primitives = { path = "../../primitives", default-features = false }
serai-primitives = { path = "../primitives", default-features = false }
[dev-dependencies]
rand_core = { version = "0.6", default-features = false, features = ["getrandom"] }


@@ -22,7 +22,7 @@
use super::*;
use frame_benchmarking::{benchmarks, whitelisted_caller};
use frame_support::{assert_ok, storage::bounded_vec::BoundedVec};
use frame_support::{assert_ok, storage::bounded::BoundedVec};
use frame_system::RawOrigin as SystemOrigin;
use sp_runtime::traits::StaticLookup;


@@ -0,0 +1,77 @@
[package]
name = "serai-economic-security-pallet"
version = "0.1.0"
description = "Economic Security pallet for Serai"
license = "AGPL-3.0-only"
repository = "https://github.com/serai-dex/serai/tree/develop/substrate/economic-security"
authors = ["Akil Demir <akildemir72@gmail.com>"]
edition = "2021"
rust-version = "1.80"
[package.metadata.docs.rs]
all-features = true
rustdoc-args = ["--cfg", "docsrs"]
[package.metadata.cargo-machete]
ignored = ["scale", "scale-info"]
[lints]
workspace = true
[dependencies]
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive"] }
scale-info = { version = "2", default-features = false, features = ["derive"] }
frame-system = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
frame-support = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
dex-pallet = { package = "serai-dex-pallet", path = "../dex", default-features = false }
coins-pallet = { package = "serai-coins-pallet", path = "../coins", default-features = false }
serai-primitives = { path = "../primitives", default-features = false }
[dev-dependencies]
pallet-babe = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
pallet-grandpa = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
pallet-timestamp = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
validator-sets-pallet = { package = "serai-validator-sets-pallet", path = "../validator-sets", default-features = false }
sp-io = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-runtime = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-core = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
sp-consensus-babe = { git = "https://github.com/serai-dex/polkadot-sdk", branch = "serai-next", default-features = false }
[features]
std = [
"scale/std",
"scale-info/std",
"frame-system/std",
"frame-support/std",
"sp-io/std",
"sp-core/std",
"sp-consensus-babe/std",
"dex-pallet/std",
"coins-pallet/std",
"validator-sets-pallet/std",
"serai-primitives/std",
"pallet-babe/std",
"pallet-grandpa/std",
"pallet-timestamp/std",
]
try-runtime = [
"frame-system/try-runtime",
"frame-support/try-runtime",
"sp-runtime/try-runtime",
]
default = ["std"]

Some files were not shown because too many files have changed in this diff Show More