Add ML-DSA test vectors. #112
Conversation
Hi,

Thanks for sharing your test vectors. I have been able to verify and pass all signature tests with my own private implementation of the ML-DSA-65 draft (I have yet to test signature verification and the other parameter sets), and they have been of great help so far. I have a few remarks about some of the test vectors.

FIPS 204 only states that "skDecode should only be run on input that comes from trusted sources". It does not require ML-DSA to check whether the secret key is formatted correctly. Therefore, checking the secret key length should not be the concern of the ML-DSA implementation but of its user (such as a higher-level provider), if they believe the secret key does not come from a trusted source.

The tests about secret keys with invalid `s1` or `s2` raise a similar question. The specification does not state whether a check is required, or whether it is fine to generate a signature from such a key, but the remark on skDecode implies neither happens. Here as well, the check should come from a higher-level user/provider if it is really wanted.

Finally, the tests on the ball sampling never require more than one SHAKE256 block, but I doubt it is feasible in practice to find a case that does. The tests "1 block" and "76 bytes" seem redundant (a SHAKE256 block is 136 bytes); is there a particular reason to include both?
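To make the "one block = 136 bytes" arithmetic concrete, here is a rough Python sketch (mine, not from the PR) that counts how many XOF bytes SampleInBall consumes, following FIPS 204 Algorithm 29 with the ML-DSA-65 parameter `tau = 49`; the minimum is 8 + tau = 57 bytes, and crossing the 136-byte block boundary would take dozens of consecutive rejections:

```python
import hashlib

def sample_in_ball_bytes(rho: bytes, tau: int = 49) -> int:
    """Count SHAKE256 output bytes consumed by SampleInBall
    (FIPS 204, Algorithm 29); tau = 49 corresponds to ML-DSA-65.

    Incremental squeezing is emulated by taking a long prefix of the
    XOF stream, which yields the same byte sequence.
    """
    stream = hashlib.shake_256(rho).digest(4096)  # far more than ever needed
    signs = int.from_bytes(stream[:8], "little")  # first 8 bytes: sign bits
    pos = 8
    c = [0] * 256
    for i in range(256 - tau, 256):
        while True:                 # rejection-sample a position j <= i
            j = stream[pos]
            pos += 1
            if j <= i:
                break
        c[i] = c[j]
        c[j] = 1 - 2 * ((signs >> (i + tau - 256)) & 1)
    return pos  # at least 8 + tau = 57; exceeding 136 needs ~80 rejections
```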
Thanks a lot for the feedback, that's really valuable! There are several things here.
Thanks for your reply. Since my initial post here, I have successfully tested your vectors for all parameter sets, for both signing and verification.
I've created a thread on the pqc-forum: https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/9nVfHKtid-k
Regarding issue 3, have a look at usnistgov/ACVP-Server#321
These are based on the changes advertised in Appendix D of https://nvlpubs.nist.gov/nistpubs/fips/nist.fips.204.pdf; I didn't re-check the rest of the document. I've added new tests for the new "context" parameter passed to the signing/verification functions (assumed empty when not provided). There are no tests for the pre-hashing variants for now.
I've added the "standard" series of tests based on https://nvlpubs.nist.gov/nistpubs/fips/nist.fips.204.pdf. I've only incorporated the changes advertised in Appendix D, and NIST hasn't yet uploaded official tests (discussion on the PQC forum), so take these with a grain of salt. Notably, I've added support for the new "context" parameter in signing/verification, including the edge case of an overlong context of >= 256 bytes. My test vectors are based on the "external" APIs (no test vectors provided for the …
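For reference, the context handling being tested boils down to the message encoding below — a minimal Python sketch of the external-API wrapping per the final FIPS 204 (Algorithms 2/3); `sign_internal` is a hypothetical stand-in for the internal routine:

```python
def format_message(msg: bytes, ctx: bytes = b"") -> bytes:
    """Build M' for the external ML-DSA.Sign/Verify (FIPS 204, Alg. 2/3).

    0x00 is the domain separator for the pure (non-pre-hash) variant;
    the context defaults to empty and must fit in one length byte.
    """
    if len(ctx) > 255:
        raise ValueError("overlong context (>= 256 bytes) must be rejected")
    return bytes([0, len(ctx)]) + ctx + msg

# The external call then reduces to, e.g.:
#   sig = sign_internal(sk, format_message(m, ctx), rnd)   # hypothetical name
```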
Could I request that you add some version information in the json? I personally use 3.1 for the last Dilithium, 4.0 for the ML-DSA draft, 4.1 for the current ACVP-Server test vectors (no context), and now 4.2 for the final standard (your vectors, which I pass). I have scripts that load the json and use FFI to my C code. I feel filtering on filenames is a bit ugly...
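Something like the following would make that filtering explicit — a sketch only: the top-level "version" field is the proposal above, and the Wycheproof-style "testGroups" layout is an assumption, not the current schema:

```python
import json

def load_vectors(path: str, want_version: str) -> list:
    # Skip files whose (proposed) top-level "version" field doesn't match,
    # e.g. "4.0" for the FIPS 204 draft, "4.2" for the final standard.
    with open(path) as f:
        data = json.load(f)
    if data.get("version") != want_version:
        return []
    return data.get("testGroups", [])
```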
ML-KEM (#110) and ML-DSA have pretty large keys and signatures. People are starting to optimise the test vectors by converting them into other formats or hashing the output values. I wonder if it would be beneficial to have a size-optimised version here in the repo as well. When everyone minimises the test files themselves, it gets hard to audit, is error-prone, and creates extra work for everyone.
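As a sketch of what hashing the output values could look like (the scheme and field handling here are illustrative assumptions, not a format from the thread):

```python
import hashlib

def compact_value(value_hex: str) -> str:
    # Replace a multi-kilobyte expected value (key, signature) by the
    # SHA-256 of its byte encoding; consumers hash their own output and
    # compare digests instead of full literals.
    return hashlib.sha256(bytes.fromhex(value_hex)).hexdigest()
```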
Hi,

Unless I missed it somewhere, I could not find anything stating that these vectors are aimed at the deterministic signing variant of ML-DSA. A potential disadvantage of supporting the hedged variant is that you would need to refactor the vectors so that they test the internal API (otherwise, consumers of the vectors need to mock the DRBG).
It's indeed true that these tests cover the external API (…
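For context, the difference between the two variants in FIPS 204 is only the 32-byte `rnd` input to the internal signing routine — roughly:

```python
import os

def signing_rnd(deterministic: bool) -> bytes:
    # Hedged signing (the FIPS 204 default) draws 32 fresh random bytes;
    # the deterministic variant fixes rnd to 32 zero bytes, which is what
    # makes external-API test vectors reproducible without mocking a DRBG.
    return bytes(32) if deterministic else os.urandom(32)
```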
},
{
  "tcId": 20,
  "comment": "signature with omega+1 hints (causing a buffer overflow)",
Not sure I get the point. The last byte of the signature is 0x37 (= 55), which is exactly omega for ML-DSA-65. Why are you saying there are omega + 1 hints?
Hi,

Agreed with the analysis of @wangweij: there is no overflow during bit unpacking here, and the HintBitUnpack procedure (Algorithm 21) should succeed. However, the result of the verification is still invalid (for another reason to be determined), hence the fact that this vector still "passes" the various checks of the people using it.

Regards,
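To see why a final byte of 0x37 (= omega = 55) is fine, here is a Python sketch of HintBitUnpack following the final FIPS 204 (Algorithm 21) with ML-DSA-65 parameters; only a count byte strictly greater than omega, a non-increasing position list, or non-zero padding causes rejection:

```python
OMEGA, K = 55, 6  # ML-DSA-65: at most 55 hints across k = 6 polynomials

def hint_bit_unpack(y: bytes):
    """HintBitUnpack sketch (FIPS 204, Alg. 21); returns None on bad input.

    y has OMEGA + K bytes: OMEGA hint positions followed by K cumulative
    counts. y[OMEGA + i] == OMEGA is allowed, so exactly omega hints fit.
    """
    h = [[0] * 256 for _ in range(K)]
    index = 0
    for i in range(K):
        if y[OMEGA + i] < index or y[OMEGA + i] > OMEGA:
            return None
        first = index
        while index < y[OMEGA + i]:
            if index > first and y[index - 1] >= y[index]:
                return None  # positions within a polynomial must increase
            h[i][y[index]] = 1
            index += 1
    if any(b != 0 for b in y[index:OMEGA]):
        return None  # unused position bytes must be zero
    return h
```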
It'd be nice if the ML-DSA json files could contain the security level (44, 65, 87) in some json field.
Hi,

First of all, thanks for this valuable source of edge-case test vectors! I agree with the previous comments regarding the inclusion of the security level (44, 65, 87), the variant (deterministic here, hopefully more to come), and a version (Dilithium, FIPS ipd, FIPS final standard, etc.) in the json file. For non-deterministic variants, similar to what @matbok suggests, maybe adding a dedicated field with the generated random would be the way to go.

Another subject of interest would be the pre-hashed variant of the standard (…

Anyways, converging to a stable set of vectors for the finalized standard and merging this PR would be great, now that more and more people will start to implement ML-DSA :-)

Regards,
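For the pre-hashed variant mentioned above, the message encoding differs from the pure one only in the domain separator and an `OID || PH(M)` suffix — a sketch per FIPS 204 Algorithm 4, here assuming SHA-512 as the pre-hash function:

```python
import hashlib

SHA512_OID = bytes.fromhex("0609608648016503040203")  # DER OID of id-sha512

def format_prehash_message(msg: bytes, ctx: bytes = b"") -> bytes:
    # HashML-DSA (FIPS 204, Alg. 4/5): domain separator 0x01 instead of
    # 0x00, then the context, then the hash OID and the pre-hashed message.
    if len(ctx) > 255:
        raise ValueError("overlong context (>= 256 bytes) must be rejected")
    return bytes([1, len(ctx)]) + ctx + SHA512_OID + hashlib.sha512(msg).digest()
```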
This PR adds test vectors for ML-DSA, for two versions: the round 3 proposal (CRYSTALS-DILITHIUM) and the current draft of the FIPS 204 standard.
These tests aim to cover the following cases:

- `s1` or `s2` vectors out of the `[-eta, eta]` range.
- `t1` component set to zero (allowing trivial forgeries, but the verification algorithm should still accept signatures for this key).
- `z_max` equals `gamma1 - beta - 1` and `gamma1 - beta`, `r0_max` equals `gamma2 - beta - 1` and `gamma2 - beta`, `h_ones` equals `omega` and `omega + 1`, and `|ct0|_max` equals `gamma2 - 1` and `gamma2` (see the sketch after this list).
- `expand_a`, `expand_s`, `rej_ntt_poly` and `rej_bounded_poly` functions.
- `power_2_round` function: when the remainder (found in `t0`) is equal to 4096 or -4095.
- `decompose` (via `high_bits` or `low_bits`): when the condition `r_plus - r_0 = q - 1` happens.

Since the FIPS 204 standard is still a draft, I'm leaving this pull request as a draft for now.
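To make the boundary bullets concrete, these are the final acceptance conditions of ML-DSA.Verify that the `z_max` and `h_ones` vectors straddle — a sketch using the ML-DSA-65 constants from FIPS 204, Table 1:

```python
GAMMA1, BETA, OMEGA = 2**19, 196, 55  # ML-DSA-65: gamma1, beta = tau*eta, omega

def verifier_accepts(z_inf_norm: int, ones_in_h: int, c_tilde_matches: bool) -> bool:
    # A signature with z_max == GAMMA1 - BETA - 1 must verify, while
    # z_max == GAMMA1 - BETA must be rejected; omega hints are fine, and
    # omega + 1 hints cannot even be produced by a well-formed encoding.
    return c_tilde_matches and z_inf_norm < GAMMA1 - BETA and ones_in_h <= OMEGA
```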