AFAIK, the probability of there being a k-cycle in a bipartite undirected G(n,p) graph, for k > 3, has not been derived exactly, so we don't have a strong basis for judging the bounds of an ideal solver for Cuckoo Cycle. This is where I would love to be proven wrong, as it would lay to rest the possibility of there being unknown backdoors in Cuckoo Cycle.
By backdoor, do you mean time-memory trade-offs?
These do not depend in any way on the exact value of the probability of the Cuckoo graph
having a k-cycle. The whitepaper already has numerical approximations for those probabilities,
which suffice for any run-time or memory-use analysis...
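For what it's worth, such probabilities are also cheap to approximate empirically. Here is a minimal Monte Carlo sketch (my own illustration, not code from the whitepaper; the parameters n, m, and the trial count are arbitrary choices) estimating the chance that a random bipartite graph with n+n vertices and m edges contains the shortest even cycle, a 4-cycle:

```python
# Monte Carlo estimate of P(4-cycle) in a random bipartite graph.
# Hypothetical illustration: parameter values are made up for the sketch.
import random
from itertools import combinations

def has_4cycle(n, m, rng):
    """Sample m distinct edges on an n+n bipartite vertex set.
    A 4-cycle exists iff two left vertices share >= 2 right neighbours."""
    edges = rng.sample([(u, v) for u in range(n) for v in range(n)], m)
    adj = {u: set() for u in range(n)}
    for u, v in edges:
        adj[u].add(v)
    return any(len(adj[u] & adj[v]) >= 2 for u, v in combinations(range(n), 2))

def estimate(n=32, m=32, trials=2000, seed=1):
    """Fraction of sampled graphs containing a 4-cycle."""
    rng = random.Random(seed)
    return sum(has_4cycle(n, m, rng) for _ in range(trials)) / trials

print(estimate())
```

For longer cycles like the 42-cycles Cuckoo Cycle uses, the detection step is much more expensive, but the same sampling approach yields the numerical approximations the whitepaper relies on.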
I share some of optimiz3's concerns. Let me give an incorrect hypothetical example:
What if it were possible to establish, with P(error) < 10%, whether or not a graph *had* a 42-cycle, using, e.g., a histogram of the counts of edges incident upon vertices?
(I doubt this is true - I'm pulling something plausible-sounding out of that place we pull such things.)
In other words, if there were some relatively simple-to-measure characteristic of the graph that would let you rapidly find good candidates to spend your effort on, an optimized solver might spend most of its time *generating graph instances* and relatively little time evaluating them, without using much memory at all. With your default parameters, where about 2.5% of graphs have a 42-cycle, this could easily result in a 10x speedup using comparatively little chip area or memory.
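The size of the speedup follows from a simple expected-cost model. A sketch using the hypothetical numbers above (the relative cost of the cheap test, 1% of a full solve, is an assumption I made up, and I've assumed symmetric 10% error rates):

```python
# Back-of-envelope cost model for a filter-then-solve strategy.
# All numbers are the post's hypotheticals, not measured figures.
def expected_speedup(p_cycle=0.025, false_neg=0.10, false_pos=0.10,
                     cheap_cost=0.01, solve_cost=1.0):
    # Baseline: fully solve every generated graph until one has a cycle.
    baseline = solve_cost / p_cycle
    # Filtered: cheap-test every graph, fully solve only those that pass.
    pass_rate = p_cycle * (1 - false_neg) + (1 - p_cycle) * false_pos
    found_rate = p_cycle * (1 - false_neg)     # passes AND has a cycle
    filtered = (cheap_cost + pass_rate * solve_cost) / found_rate
    return baseline / filtered

print(round(expected_speedup(), 1))  # → 6.9
```

With these exact numbers the model gives about a 7x speedup; drive the false-positive rate down or make the cheap test cheaper and it climbs past 10x, which is why even a weakly predictive statistic would matter.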
I'm not saying at all that such a thing exists. But it's the kind of avenue of attack I worry about.