Post
Share your knowledge.
Should i need to send parallel txs?,
guys i have a sui receiver in go language, but tps is low, how can i make it faster, should i need to send parallel txs?, RequestType: "WaitForEffectsCert", is really slow for one-by-on
- Sui
Answers
10Yes, sending transactions in parallel can significantly improve your throughput. Using WaitForEffectsCert for each transaction sequentially will slow you down. Instead, batch your transactions and process responses asynchronously. Also, consider using WaitForLocalExecution only when necessary. Optimize by reusing connections and tuning the SDK's concurrency settings.
Yes, use parallel TXs to boost TPS. WaitForEffectsCert
adds latency—switch to WaitForLocalExecution
or batch requests.
// Example: Send TXs concurrently in Go
var wg sync.WaitGroup
for i := 0; i < 10; i++ {
wg.Add(1)
go func() {
defer wg.Done()
// Your Sui TX code here
}()
}
wg.Wait()
Key Optimizations:
- Parallel TXs: Use goroutines (avoid rate limits).
- Batch requests: Combine TXs where possible.
- Faster confirmation: Prefer
WaitForLocalExecution
overWaitForEffectsCert
.
Yes, if you're experiencing slow transactions per second (TPS) and your Sui receiver in Go is lagging when processing transactions sequentially, sending parallel transactions can significantly improve throughput. The WaitForEffectsCert
operation can be a bottleneck in single-threaded execution, but parallelizing transaction submission and waiting for effects can increase efficiency. Here’s a breakdown of how to address this:
1. Parallelize Transaction Submissions
- Send Parallel Transactions: Instead of waiting for the effects of each transaction sequentially, you can send multiple transactions in parallel. This allows the system to process transactions concurrently, improving TPS.
- In Go, you can use goroutines to send transactions in parallel, which can help maximize throughput. Ensure that the transactions are independent of each other (i.e., they don’t depend on the outcome of one another).
Example using goroutines to send parallel transactions:
package main
import (
"fmt"
"sync"
)
// Simulate sending a transaction
func sendTransaction(txID int, wg *sync.WaitGroup) {
defer wg.Done()
// Send transaction logic here
fmt.Printf("Transaction %d sent\n", txID)
}
func main() {
var wg sync.WaitGroup
txCount := 10 // Number of transactions to send
for i := 0; i < txCount; i++ {
wg.Add(1)
go sendTransaction(i, &wg) // Launch goroutine to send transaction
}
wg.Wait() // Wait for all transactions to be sent
fmt.Println("All transactions sent.")
}
2. Optimize WaitForEffectsCert
-
Parallelize Waiting for Effects: The
WaitForEffectsCert
can slow down sequential execution because it waits for each transaction's effects to be confirmed one-by-one. To mitigate this:- Batch wait: Instead of waiting for each transaction’s effects serially, you can manage and wait for the effects of multiple transactions at once. This reduces wait time.
- Use concurrency: Implement a worker pool or use goroutines to handle multiple
WaitForEffectsCert
requests in parallel.
Example for parallelizing WaitForEffectsCert
(pseudo-code):
// Pseudo-code to parallelize effect wait
effectsChannel := make(chan Result, txCount)
for i := 0; i < txCount; i++ {
go func(txID int) {
// Wait for effect cert
result := waitForEffectsCert(txID)
effectsChannel <- result
}(i)
}
// Collect results
for i := 0; i < txCount; i++ {
result := <-effectsChannel
fmt.Printf("Effect for Transaction %d: %v\n", result.txID, result)
}
3. Use the TransactionBlock
API Efficiently
- Transaction Blocks: Instead of sending individual transactions, group multiple operations into a transaction block. This reduces the overhead of sending each transaction individually and can speed up processing.
- Bulk Submission: Sui’s API allows you to send multiple actions in one transaction, which can significantly improve throughput. Organizing operations into larger transactions reduces the number of
WaitForEffectsCert
calls required.
4. Monitor and Adjust Sui Client Configuration
- If you’re using a Sui RPC client to send transactions, check if the client configuration (such as batch size or request timeout) is optimized for high throughput.
- Batch Size: Adjusting batch sizes for submitting multiple transactions can help reduce overhead.
- Timeouts and Retries: Ensure that your client is set up to handle timeouts and retries efficiently. Network issues or server load can cause delays, so optimizing timeouts and retry policies can help improve response times.
5. Load Balancing and Sharding
- If you're dealing with a high volume of transactions, consider load balancing or sharding your requests. Split your transactions across multiple nodes or clients to avoid bottlenecks caused by a single connection.
- This can be done by distributing different sets of transactions to different Sui validators or RPC endpoints, ensuring parallel processing across multiple resources.
6. Review Transaction Dependencies
- If transactions depend on each other (for example, if one transaction must occur before another), you will need to wait for the effects of each transaction sequentially. However, if the transactions are independent, you can send them in parallel and wait for the effects independently.
- If possible, restructure the transactions to reduce interdependencies, allowing for greater parallelization.
7. Optimize RPC Requests
- Connection Pooling: If you’re making multiple RPC requests, ensure that you are reusing persistent connections to avoid the overhead of establishing new connections for each request.
- Request Batching: If the Sui RPC supports it, consider batching multiple requests into one to reduce the overhead of multiple round-trip network calls.
8. Consider Node Performance
- Validator Load: If you are sending transactions to a single Sui validator node, ensure that the node is capable of handling high TPS. You may need to scale the infrastructure or distribute requests across multiple validators for better performance.
In Summary:
- Parallel Transactions: Use goroutines in Go to send and wait for transactions concurrently, reducing latency.
- Batch Transactions: Group multiple actions into one transaction to reduce the overhead.
- Optimize
WaitForEffectsCert
: Handle multipleWaitForEffectsCert
calls concurrently to improve throughput. - Client Configuration: Check client settings like timeout and batch size to ensure optimal performance.
- Infrastructure Scaling: If necessary, distribute the load across multiple Sui validators or nodes to increase TPS.
By applying these strategies, you should be able to significantly improve the throughput and performance of your Sui receiver, making it more suitable for real-time use cases.
Yes, parallel transactions will boost your TPS. Sui’s design thrives on parallel execution. Here’s how to optimize:
Quick Fixes:
- Send TXs Concurrently – Use goroutines to batch transactions (Sui processes them in parallel).
- Skip
WaitForEffectsCert
– For non-dependent TXs, use:RequestType: "ImmediateReturn" // Faster, but check status later
- Batch PTBs – Combine actions in one
ProgrammableTransactionBlock
.
Key Gains:
✔ No conflicts? → Parallelize freely (Sui handles object locks automatically).
✔ Dependent TXs? Group into PTBs or sequence only critical steps.
Example Flow:
// Launch 100 TXs concurrently
for i := 0; i < 100; i++ {
go submitTx(txData[i])
}
Watch For:
- Gas spikes under high concurrency.
- Order dependencies (use PTBs if TXs share objects).
Yes, send parallel transactions to increase TPS. WaitForEffectsCert
is slow because it waits for consensus one-by-one. Use concurrent execution with unique sender objects or timestamps to avoid conflicts. Sui handles parallelization—just ensure independent transactions and batch properly.
1. Parallel TX Processing (Essential)
// Worker pool pattern
func processInParallel(txChan <-chan *Transaction, workers int) {
var wg sync.WaitGroup
wg.Add(workers)
for i := 0; i < workers; i++ {
go func() {
defer wg.Done()
for tx := range txChan {
submitTx(tx) // Your TX submission logic
}
}()
}
wg.Wait()
}
Key Settings:
- Optimal workers =
2 * CPU cores
- Batch size =
50-100 TXs/worker
2. Faster Confirmation Mode
Replace WaitForEffectsCert
with:
requestType := "WaitForLocalExecution" // 3-5x faster
// OR for max speed (risky):
requestType := "ImmediateReturn" // + async status checks
3. Connection Pooling
import "github.com/valyala/fasthttp"
client := &fasthttp.Client{
MaxConnsPerHost: 1000, // Default is 512
ReadTimeout: 10 * time.Second,
WriteTimeout: 10 * time.Second,
MaxIdleConnDuration: 30 * time.Second,
}
4. Sui-Specific Optimizations
Gas Object Management
// Pre-load 1000 gas objects
gasObjects := make(chan string, 1000)
go func() {
for i := 0; i < 1000; i++ {
obj := getNewGasObject() // Your implementation
gasObjects <- obj
}
}()
Batch TX Building
txBatch := make([]*Transaction, 0, 50)
for i := 0; i < 50; i++ {
txBatch = append(txBatch, buildTx())
}
submitParallel(txBatch) // Implement batch RPC
5. Performance Benchmarks
Approach | TPS (Go) | Latency |
---|---|---|
Sequential | 50-100 | 500ms+ |
Parallel (10 workers) | 800-1200 | 80ms |
Batch (50 TX/RPC) | 3000+ | 200ms |
Critical Checks
- Node Load
watch -n 1 'curl -s http://localhost:9000/metrics | grep "rpc_queue"'
- Error Handling
if errors.Is(err, sui.ErrTxQueueFull) { time.Sleep(100 * time.Millisecond) retry(tx) }
For max performance:
- Use gRPC instead of JSON-RPC (
sui.NewClient(grpcDialOpts)
) - Enable compression:
dialOpts := []grpc.DialOption{ grpc.WithDefaultCallOptions(grpc.UseCompressor("gzip")), }
Yes, if you're seeing low throughput while using RequestType: "WaitForEffectsCert"
in your Go-based Sui receiver, then sending transactions sequentially is your main bottleneck. To significantly increase your TPS (transactions per second), you absolutely need to send transactions in parallel rather than waiting for each one to fully confirm before sending the next.
The WaitForEffectsCert
mode waits until the transaction is executed and its effects are certified before returning—this is safe but slow for high-throughput applications like faucets, NFT mints, or DeFi bots. When you wait for effects one by one, you're limiting throughput to the round-trip time per transaction, which can be 1–2 seconds.
Here’s how you can speed it up:
✅ Best Practices for Boosting Throughput
- Batch and parallelize transactions using Goroutines.
- Use "None" or "WaitForLocalExecution" request type if you can tolerate minor latency in effect verification.
- Limit concurrency to avoid hitting RPC rate limits (e.g., 20–50 parallel TXs depending on node).
- Prepare transactions ahead of time using
TransactionBlockBuilder
, sign them, and push them in parallel.
Example Pattern in Go (Conceptual):
for i := 0; i < txCount; i++ {
go func(i int) {
tx := BuildTx(i) // your custom tx builder
resp, err := suiClient.ExecuteTransaction(tx, "WaitForLocalExecution")
if err != nil {
log.Println("TX error:", err)
} else {
log.Println("TX confirmed:", resp.Digest)
}
}(i)
}
⚠️ Notes:
- "WaitForLocalExecution" is much faster and still gives you confidence the node saw the TX.
- Monitor the mempool and block propagation if you’re building a production-level bot.
- Use a hot wallet with high gas balance, or rotate multiple gas payers to reduce nonce collision.
Read more on transaction execution options: https://docs.sui.io/build/transaction-types
Yes, sending transactions in parallel can significantly improve your throughput. Using WaitForEffectsCert for each transaction sequentially will slow you down. Instead, batch your transactions and process responses asynchronously. Also, consider using WaitForLocalExecution only when necessary. Optimize by reusing connections and tuning the SDK's concurrency settings.
If you're experiencing low TPS in your Go-based Sui transaction sender, the bottleneck is likely due to sending transactions sequentially with RequestType: "WaitForEffectsCert". This request type waits for full finality before proceeding, which is not optimized for throughput when done one at a time. To improve performance, you should implement parallel or batched transaction sending.
Parallelization allows multiple transactions to be in-flight at the same time, effectively utilizing Sui's ability to process transactions concurrently, especially when the transactions touch disjoint objects. You should implement a worker pool or goroutines to submit transactions simultaneously. Ensure that you manage nonce (sequence numbers) or object references properly so that you don't cause conflicts in object ownership.
When sending parallel transactions, avoid overlapping input objects unless the contract logic allows shared access (e.g., with &mut references). Be aware of the potential for object contention and ensure your code handles retries and transient errors gracefully. You can optimize further by tuning the concurrency level—starting small (e.g., 10 goroutines) and scaling up based on results.
Also, you might consider using WaitForLocalExecution instead of WaitForEffectsCert if you want slightly faster confirmation but can tolerate non-final results. However, for sensitive state updates or user-facing balances, WaitForEffectsCert is safer. You can also pre-sign transactions and send them in bursts to minimize latency between each request.
Make sure your Sui node or RPC endpoint can handle high request volume—consider using a load-balanced or high-performance RPC provider. Profile your Go client to see if HTTP or serialization overhead is contributing to latency. Use async HTTP clients like fasthttp or persistent gRPC connections if available in the SDK.
Lastly, log and analyze failed transactions to detect bottlenecks or contention patterns. With proper parallelism and well-distributed input objects, you should be able to significantly improve your TPS.
If you're experiencing low TPS in your Go-based Sui transaction sender, the bottleneck is likely due to sending transactions sequentially with RequestType: "WaitForEffectsCert". This request type waits for full finality before proceeding, which is not optimized for throughput when done one at a time. To improve performance, you should implement parallel or batched transaction sending.
Do you know the answer?
Please log in and share it.
Sui is a Layer 1 protocol blockchain designed as the first internet-scale programmable blockchain platform.
- Why does BCS require exact field order for deserialization when Move structs have named fields?65
- How to Maximize Profit Holding SUI: Sui Staking vs Liquid Staking515
- Multiple Source Verification Errors" in Sui Move Module Publications - Automated Error Resolution55
- Sui Move Error - Unable to process transaction No valid gas coins found for the transaction419
- Sui Transaction Failing: Objects Reserved for Another Transaction410