For when you need to keep track of the last N items
you've seen and can tolerate some false positives.
Rebased-by: Pieter Wuille <pieter.wuille@gmail.com>
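As a rough mental model, the behaviour described above can be sketched with
two alternating fixed-size Bloom filters; everything below is illustrative
(names, sizes and the hash function are placeholders, not the actual class):

    #include <bitset>
    #include <cstdint>
    #include <functional>
    #include <string>

    class RollingFilterSketch
    {
    public:
        explicit RollingFilterSketch(size_t nPerGeneration) : nLimit(nPerGeneration) {}

        void insert(const std::string& item)
        {
            if (nCount == nLimit) {
                // Retire the older generation: items only remembered there
                // are forgotten (except for false positives).
                data[1 - nActive].reset();
                nActive = 1 - nActive;
                nCount = 0;
            }
            for (uint32_t n = 0; n < HASHES; ++n)
                data[nActive].set(Position(item, n));
            ++nCount;
        }

        bool contains(const std::string& item) const
        {
            for (int gen = 0; gen < 2; ++gen) {
                bool hit = true;
                for (uint32_t n = 0; n < HASHES; ++n)
                    if (!data[gen].test(Position(item, n))) { hit = false; break; }
                if (hit) return true;
            }
            return false;
        }

    private:
        static const size_t BITS = 1 << 16;
        static const uint32_t HASHES = 4;

        // std::hash stands in for the tweaked hash functions a real Bloom
        // filter would use.
        static size_t Position(const std::string& item, uint32_t n)
        {
            return std::hash<std::string>()(item + char('A' + n)) % BITS;
        }

        std::bitset<BITS> data[2];
        size_t nLimit;
        size_t nCount = 0;
        int nActive = 0;
    };

Items stay queryable for at least nPerGeneration and at most 2*nPerGeneration
subsequent insertions, which gives the "last N items, with false positives"
behaviour described above.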
This commit adds several tests to the script_invalid.json data which
exercise edge conditions that are not currently being tested.
They are mainly being added to cover cases that a branch coverage
analysis of btcd showed were not yet covered, but since more tests of
edge conditions are always a good thing, I'm contributing them
upstream.
The test which is intended to prove that the script engine properly
rejects non-minimally encoded PUSHDATA4 data is using the wrong
opcode and value. The test uses 0x4f, which is OP_1NEGATE, instead
of the desired 0x4e, which is OP_PUSHDATA4. Further, the push of data
is intended to be 256 bytes, but the little-endian length the test
encodes is 0x00100000 (4096) instead of the desired 0x00010000 (256).
This commit fixes both issues.
This was found while examining btcd's branch coverage when running
only these tests, to help find missing coverage.
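For reference, a sketch of the encodings involved (illustrative C++, not the
JSON test vector itself): a 256-byte push written with OP_PUSHDATA4 (0x4e) is
non-minimal because OP_PUSHDATA2 (0x4d) already covers lengths up to 0xffff.

    #include <cstdint>
    #include <vector>

    // Non-minimal: OP_PUSHDATA4 with a 4-byte little-endian length of 256.
    std::vector<uint8_t> NonMinimalPush256()
    {
        std::vector<uint8_t> script;
        script.push_back(0x4e);                  // OP_PUSHDATA4 (0x4f would be OP_1NEGATE)
        script.push_back(0x00);                  // length bytes 00 01 00 00 = 256
        script.push_back(0x01);
        script.push_back(0x00);
        script.push_back(0x00);
        script.insert(script.end(), 256, 0x42);  // the 256 bytes being pushed
        return script;
    }

    // Minimal: the same 256-byte push expressed with OP_PUSHDATA2.
    std::vector<uint8_t> MinimalPush256()
    {
        std::vector<uint8_t> script;
        script.push_back(0x4d);                  // OP_PUSHDATA2
        script.push_back(0x00);                  // length bytes 00 01 = 256
        script.push_back(0x01);
        script.insert(script.end(), 256, 0x42);
        return script;
    }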
The environment is prepared by the main thread to guard against invalid locale settings and to prevent deinitialization issues with boost::filesystem::path, which can result in application crashes.
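A minimal sketch of what such setup typically looks like (the function name
is illustrative, and the POSIX setenv fallback is an assumption, not the
verbatim code):

    #include <cstdlib>
    #include <locale>
    #include <stdexcept>
    #include <boost/filesystem.hpp>

    void SetupEnvironmentSketch()
    {
        try {
            std::locale("");  // throws std::runtime_error on an invalid locale setting
        } catch (const std::runtime_error&) {
            setenv("LC_ALL", "C", 1);  // fall back to the classic locale (POSIX)
        }
        // Touch boost::filesystem::path's locale-dependent facet once, on the
        // main thread, so it is not lazily initialized (and later torn down)
        // from a worker thread.
        boost::filesystem::path::imbue(std::locale());
    }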
This adds a -checkblockindex option (defaulting to true for regtest), which occasionally
does a full consistency check of mapBlockIndex, setBlockIndexCandidates, chainActive, and
mapBlocksUnlinked.
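A much-simplified sketch of the kind of invariants such a walk asserts (the
real check covers far more conditions; this assumes the chain-state globals
from main.cpp and is illustrative only):

    void CheckBlockIndexSketch()
    {
        // Every index entry must be consistent with its parent.
        for (const auto& item : mapBlockIndex) {
            const CBlockIndex* pindex = item.second;
            if (pindex->pprev) {
                assert(pindex->nHeight == pindex->pprev->nHeight + 1);
                assert(pindex->nChainWork >= pindex->pprev->nChainWork);
            }
        }
        // Every block on the active chain must be present in the index.
        for (int nHeight = 0; nHeight <= chainActive.Height(); ++nHeight) {
            assert(mapBlockIndex.count(chainActive[nHeight]->GetBlockHash()));
        }
    }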
This fixes a subtle bug involving block re-orgs and non-standard transactions.
Start with a block containing a non-standard transaction, and
one or more transactions spending it in the memory pool.
Then re-org away from that block to another chain that does
not contain the non-standard transaction.
Result before this fix: the dependent transactions get stuck
in the mempool without their parent, putting the mempool
in an inconsistent state.
Tested with a new unit test.
Make sure that chainparams and logging are properly initialized. Doing
this for every test may be overkill, but the initialization is so
simple that it does not matter.
This should fix the Travis issues.
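A sketch of the kind of per-test fixture this refers to (the fixture and
suite names are illustrative; fPrintToDebugLog and SelectParams are assumed
to be the relevant logging and chainparams entry points):

    #include <boost/test/unit_test.hpp>
    #include "chainparams.h"
    #include "chainparamsbase.h"
    #include "util.h"

    struct BasicTestingSetupSketch {
        BasicTestingSetupSketch()
        {
            fPrintToDebugLog = false;              // log to stderr rather than debug.log
            SelectParams(CBaseChainParams::MAIN);  // make sure chainparams are set
        }
    };

    // Every test in the suite then runs with the fixture applied:
    BOOST_FIXTURE_TEST_SUITE(example_tests, BasicTestingSetupSketch)
    BOOST_AUTO_TEST_CASE(smoke) { BOOST_CHECK(true); }
    BOOST_AUTO_TEST_SUITE_END()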
The UNITTEST parameters are not used by any current tests, and the model
(modifiable parameters) is inconvenient when unit testing. As
they are stored in a global structure, every test
would have to (re)set up its own parameters.
For consistency it is also better to test with MAIN parameters.
Split GetNextWorkRequired() into two functions to allow the difficulty calculations to
be tested without requiring a full blockchain.
Add unit tests to cover the basic difficulty calculation, each of the minimum and
maximum actual-time limits, and the maximum difficulty target condition.
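A sketch of the kind of test this enables (the name and signature of the
split-out function, CalculateNextWorkRequired(pindexLast, nFirstBlockTime,
params), are assumptions for the example; no full chain is needed because
only the last block and the first block's time are consulted):

    #include <boost/test/unit_test.hpp>
    #include "chain.h"
    #include "chainparams.h"
    #include "pow.h"

    BOOST_AUTO_TEST_CASE(difficulty_unchanged_when_on_schedule)
    {
        const Consensus::Params& params = Params().GetConsensus();

        CBlockIndex pindexLast;
        pindexLast.nHeight = 32255;
        pindexLast.nBits = 0x1d00ffff;
        // Pretend the retarget interval took exactly the target timespan ...
        int64_t nFirstBlockTime = 1000000000;
        pindexLast.nTime = nFirstBlockTime + params.nPowTargetTimespan;

        // ... in which case the new target must equal the old one.
        BOOST_CHECK_EQUAL(CalculateNextWorkRequired(&pindexLast, nFirstBlockTime, params),
                          pindexLast.nBits);
    }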
The fix to NegateSignatureS caused a test which had been failing
in IsValidSignatureEncoding to instead fail in IsLowDERSignature.
Add a new test so the original check remains exercised.
NegateSignatureS is called with a signature without a hashtype, so
do not save the last byte and append it after S negation.
Updates the two tests which were affected by this bug.
Makes it possible to compactly provide a deliberately invalid signature
for use with CHECK(MULTI)SIG. For instance, with BIP19, if m != n, invalid
signatures need to be provided in the scriptSig; prior to this change
those invalid signatures would need to be large DER-encoded signatures.
Note that we may want to further expand on this change in the future by
saying that only OP_0 is a "valid" invalid signature; even with this
change, BIP19 is inherently malleable, as the invalid signatures can be
any validly encoded DER signature.
On rare occasions, rand() was returning duplicate values, causing duplicate
transactions.
BuildMerkleTree happily used these, but CPartialMerkleTree caught them and
returned a null merkle root.
Rather than taking chances with rand(), use the loop counter to guarantee
unique values.
At sipa's request, also remove the remaining uses of rand().
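The replacement pattern is roughly the following sketch (assuming the usual
CMutableTransaction primitives; not the test code verbatim):

    #include <vector>
    #include "primitives/transaction.h"
    #include "uint256.h"

    // Derive each test transaction's distinguishing field from the loop
    // counter: txids are then unique by construction instead of by luck.
    std::vector<uint256> MakeUniqueTxids(unsigned int nTx)
    {
        std::vector<uint256> vTxid;
        for (unsigned int j = 0; j < nTx; ++j) {
            CMutableTransaction tx;
            tx.nLockTime = j;              // unique per transaction, unlike rand()
            vTxid.push_back(tx.GetHash());
        }
        return vTxid;
    }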
Remove initialization from a std::vector (as this is only used in the tests).
Also implement SetHex and GetHex in terms of uint256, to avoid
duplicate code as well as endianness issues (as they
work in terms of bytes).
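Concretely, the delegation looks roughly like this (assuming the uint256S,
UintToArith256 and ArithToUint256 conversion helpers; the free-function names
here are illustrative, not the member implementations themselves):

    #include <string>
    #include "arith_uint256.h"
    #include "uint256.h"

    // Let the byte-oriented uint256 do the hex parsing/printing, so the
    // arithmetic type never has to reason about byte order itself.
    void SetHexSketch(arith_uint256& value, const std::string& hex)
    {
        value = UintToArith256(uint256S(hex));
    }

    std::string GetHexSketch(const arith_uint256& value)
    {
        return ArithToUint256(value).GetHex();
    }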
- Methods that access the guts of arith_uint256 are removed,
as these are endianness-dependent. Use uint256 instead.
- Serialization is no longer needed, as arith_uint256s are never
read or written.
- GetHash is never used on arith_uint256.
If the uint256() constructor takes a string, uint256(0) will become
dangerous once uint256 does not take integers anymore (it will go
through std::string(const char*), constructing a string from a NULL
pointer, and the explicit keyword is no help).
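A minimal illustration of the hazard (Example256 is a stand-in class, not the
real uint256):

    #include <string>

    class Example256 {
    public:
        explicit Example256(const std::string& hex) { /* parse hex ... */ }
    };

    void Demo()
    {
        Example256 ok("00");  // fine: const char* -> std::string
        Example256 bad(0);    // compiles: the literal 0 is a null pointer
                              // constant, so it converts to const char*, and
                              // std::string's const char* constructor then
                              // reads a NULL pointer, which is undefined
                              // behaviour. This is direct initialization, so
                              // 'explicit' does not prevent it.
    }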