FHE is simply the wrong tool here. FHE is for a central server computing on data held/known by another party. What they want is MPC (multiple parties jointly computing on distributed data), and that's far more efficient for this setting.
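To make the MPC setting concrete, here is a minimal toy sketch of additive secret sharing (parameters and names are illustrative, not any particular deployed protocol): three parties learn the sum of their inputs without revealing the inputs themselves.

```python
# Toy additive secret sharing: compute a sum on distributed data.
# Illustration only -- a real MPC protocol needs authenticated channels,
# malicious-security machinery, etc.
import secrets

P = 2**61 - 1  # public prime modulus (arbitrary choice for this sketch)

def share(x, n=3):
    """Split x into n additive shares mod P; any n-1 shares look random."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

inputs = [42, 7, 13]                      # each party's private value
all_shares = [share(x) for x in inputs]   # each party shares its input

# Party i holds one share of every input and adds them locally.
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# Combining the partial sums reveals only the total, never the inputs.
total = sum(partial_sums) % P
print(total)  # 62
```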
Gauss 2.0: he's very prolific and very famous in the math community. In this context, he is noteworthy because he's taking automated theorem proving seriously, which destigmatizes it for other pure mathematicians.
Not sure the simplicity of the spec translates into simplicity of user code. The change is backward compatible and will allow things that were previously disallowed.
People [sadly] put a lot of "exotic" cryptography/distributed-systems work under the term "blockchain". E.g., if you want to do Byzantine agreement with sub-quadratic message complexity, where do you look it up? If you want to do high-throughput threshold signing, where do you look it up?
An allergic reaction to the term "blockchain" is to miss the forest for the trees... and I would imagine the authors share the same point of view.
It’s all just bytes and hashes and the like at the bottom. Absolutely nothing magical. It is the abstractions over them that make them esoteric, not the fundamental building blocks.
As to why your example isn’t a zero-knowledge proof of knowledge of a password: the hash of the same password is always the same. So what if someone copies the hashed password and passes it off as their own? You say, sign something? But I can reuse the signature. You say, sign a random challenge? Okay, but what if, on the other side, the verifier (i.e. the app) adaptively picks the challenge instead of randomly sampling it? … Continue this line of thought, and once you have a correct solution, simplify it and remove the unnecessary parts (e.g. signing something is too strong a requirement) and you get a zero-knowledge proof of knowledge out of an honest-verifier sigma protocol.
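Where that line of thought ends up can be sketched as a toy Schnorr-style proof of knowledge of a secret exponent (the "password"), made non-interactive via Fiat-Shamir. The parameters below are deliberately tiny and insecure; this is an illustration of the shape, not usable code.

```python
# Toy honest-verifier sigma protocol (Schnorr), Fiat-Shamir'd.
# Tiny insecure group: g = 2 has order q = 11 mod p = 23.
import hashlib, secrets

p, q, g = 23, 11, 2

def H(*vals):
    """Challenge from a hash (the Fiat-Shamir step)."""
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x, y):
    k = secrets.randbelow(q)   # fresh randomness for every proof
    t = pow(g, k, p)           # commitment
    c = H(g, y, t)             # challenge derived from the transcript
    s = (k + c * x) % q        # response
    return t, s

def verify(y, t, s):
    c = H(g, y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = 7                          # secret "password" exponent
y = pow(g, x, p)               # public value
t, s = prove(x, y)
print(verify(y, t, s))         # True
```

Note how the fresh `k` per proof is what prevents the replay problems described above: copying an old transcript proves nothing about a new challenge.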
As for ZK proofs that are not proofs of knowledge, the easiest way to think of them is via an encrypted data structure like a database. Imagine the client wants to check whether an element is in some set on a server, where the server has an “encrypted” form of the set and can’t see what’s in it. How can the server check membership of an element and convince the client? That’s done with a ZK proof. You say, what about fully homomorphic encryption? That’s also technically ZK… so what’s not ZK? Anything for which you can’t write a simulator. What’s a simulator? Pick up a cryptography textbook.
I was not clear enough, thanks. Whether it is a PoK or ZK depends on the chosen signature scheme. In any case, ZK signature schemes exist and are implied by the existence of one-way functions and publicly verifiable NIZKs.
> ZK signature schemes exist and are implied by the existence of one-way functions and publicly verifiable NIZKs.
Almost. The result is from the CRYPTO '89 paper of Bellare and Goldwasser. They derive a signature scheme from a NIZK. It is not known whether you can get a NIZK from a signature scheme. Moreover, no signature scheme can be zero-knowledge: https://crypto.stackexchange.com/questions/35177/is-using-di...
Sybil attacks [1] came out about a decade before the Red Balloons paper [2] or the DARPA Challenge itself [3]. It is proven in [1] that a CA (certificate authority) is necessary for a Sybil-proof system, which led people to talk about Sybil resistance instead, e.g. [4] - all before Bitcoin or the DARPA challenge.
Unless I’m categorically missing something, claims like "network centralization is much less likely in a Sybil-proof system" are just plain wrong, and confusing to say the least, if discussed “formally” and “mathematically”.
I think perhaps you should read the paper if you want to have a deeper discussion? An understanding of how the mechanism works should make it clear how the solution enables outbound payments to network nodes, which -- in turn -- creates a for-profit incentive to run access points and network infrastructure.
Self-provisioning networks are indeed more strongly resistant to "centralization" than those which are deployed by outside parties. The alternative in the blockchain space is a reliance on outside parties and business models like Infura to provide access nodes and APIs. Unfortunately, any external business model capable of monetizing such infrastructure requires closure around data-and-money-flows, which creates key points where cartelization and monopolization emerge.
Looking at the links you've provided, afaict you seem mostly concerned that the term "sybil-proof" is used to describe a situation in which not using multiple identities to collude is a dominant strategy, instead of an "impossibility according to the laws of physics"? Four points here:
The first is that we're dealing with an academic term that is used in a specific context ("no information propagation without self-cloning") and even more specifically in the context of an impossibility proof that has stood for a decade; showing that this impossibility proof is not actually valid is a substantive step forward, and nitpicking terminology misses the point.
The second is that your definition isn't better. Even networks with trusted third parties cannot prevent sybilling under that definition, which makes it a definitional impossibility. While a certificate authority can limit entry, it can never truly know that two distinct identities are not controlled by the same person. All a CA really does is provide a point of closure (monopolization, centralization) which can theoretically identify and tax colluding participants.
The third is that achieving a dominant strategy under which sybilling is disincentivized is a massive step forward. It does not make sense to call this merely "sybil resistance" in a field where mechanisms without this property are already said to have "sybil resistance".
Finally, and most importantly, one of the consequences of this mechanism is that all attack vectors that can be carried out using multiple identities are more efficiently carried out with a single identity. So it is not the existence of multiple identities, or the collusion between them, that is the source of the problem.
EdDSA signatures are not Schnorr signatures. A Schnorr signature is a dlog proof made non-interactive with Fiat-Shamir, bound to the message, and it is equivalent to/verifiable as an EdDSA signature, up to one-time signing per message (as Schnorr is not deterministic).
EdDSA being deterministic means it’s not Schnorr by definition.
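A sketch of that determinism difference (RFC 8032-style nonce derivation vs. a fresh random Schnorr nonce; the function names here are illustrative):

```python
# Schnorr samples a fresh random nonce per signature; EdDSA derives
# the nonce deterministically from a key-dependent prefix and the message.
import hashlib, secrets

q = 2**252 + 27742317777372353535851937790883648493   # Ed25519 group order

def schnorr_nonce():
    # Fresh every call, even for the same message.
    return secrets.randbelow(q)

def eddsa_nonce(prefix: bytes, msg: bytes) -> int:
    # Same (key prefix, message) always yields the same nonce.
    h = hashlib.sha512(prefix + msg).digest()
    return int.from_bytes(h, "little") % q

prefix = b"\x01" * 32                                  # toy key-derived prefix
print(eddsa_nonce(prefix, b"hello") == eddsa_nonce(prefix, b"hello"))  # True
```

Signing the same message twice therefore produces identical EdDSA signatures, whereas two honest Schnorr signatures over the same message differ.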
Another difference is that EdDSA has a different keygen process: SHA-512 the seed, then clamp the first 32 bytes (and this process breaks the additive key derivation that's otherwise nice to have). Clamping is not the problem, and you have to clear cofactors for Schnorr over that curve anyway; it's the hashing at the beginning that's different and has nothing to do with cofactor clearing.
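The keygen step being described can be sketched roughly as follows (following RFC 8032; the function name is mine):

```python
# Ed25519 secret-scalar derivation: hash the 32-byte seed with SHA-512,
# then "clamp" the first half of the digest.
import hashlib

def ed25519_secret_scalar(seed: bytes) -> int:
    assert len(seed) == 32
    h = hashlib.sha512(seed).digest()
    a = bytearray(h[:32])
    a[0] &= 248        # clear the 3 low bits (cofactor clearing)
    a[31] &= 127       # clear the top bit
    a[31] |= 64        # set bit 254, fixing the scalar's bit length
    return int.from_bytes(a, "little")

# The hash step means there is no simple additive relation between a
# parent scalar and a derived one, which is what breaks additive key
# derivation schemes.
s = ed25519_secret_scalar(b"\x00" * 32)
print(s % 8)  # 0: the low bits are cleared, so the scalar is a multiple of 8
```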
Yet another difference is that EdDSA lacks a standardized verifier (keywords: “cofactored” vs. “cofactorless” verification), and this breaks another nice property of Schnorr signatures: signature aggregation.
Overall, the standards for EdDSA unfortunately still leave a lot to be desired.
https://en.wikipedia.org/wiki/Liouville_function