Question for some of you who may understand scrypt-jane better than I do:
Let's say I wanted to build two budget mining rigs. Assuming base costs are equal (mobo, memory, PSU, etc.), I could build one rig with a single R9 280X and another with 3x R7 260X. The hashrate and price per card break down as follows (rough $/kh-s comparison below):
RIG A) 1x 280X = 700-800 kh/s @ ~$450-500 (scrypt rate, for comparison only)
RIG B) 3x 260X = 600-700 kh/s @ ~$400-450
Obviously, the 3x 260X rig will consume a little more power and produce extra heat, but the key difference I see is the amount of memory available to each card: 3GB for the single 280X vs. 6GB total (3x 2GB) for the 260Xs. If scrypt-jane increases the dependence on memory as time goes on, is it possible that RIG B would "out-mine" RIG A at some point in the future? I guess the point is: if someone wanted a dedicated UTC (or whatever SJ-coin) mining rig, would something like RIG B be a better long-term config than RIG A? Thanks for any input!
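Just to put rough dollars-per-kh/s on those numbers up front (midpoints of the ranges above; these are scrypt rates rather than scrypt-jane, so it's only a relative check):

```python
# Crude price-per-hashrate comparison using the midpoints of the ranges above.
# Scrypt rates only -- scrypt-jane rates will be much lower, but the ratio
# between the two rigs is what I'm interested in.
rigs = {
    "RIG A (1x 280X)": {"khs": (700 + 800) / 2, "usd": (450 + 500) / 2},
    "RIG B (3x 260X)": {"khs": (600 + 700) / 2, "usd": (400 + 450) / 2},
}

for name, rig in rigs.items():
    print(f"{name}: ~${rig['usd'] / rig['khs']:.2f} per kh/s")
```

So on price per hash they're close to a wash, which is why the memory question is what matters to me.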
A better option might be 2 270X cards. About the same price as 1 280X with more hash rate and only slightly more power. And FYI, you'll never come anywhere close to 700-800 kh/s with a 280X on scrypt-jane. 300 kh/s +/- 20 is about all you'll get now, and that will drop as the n-factor is raised.
2 270Xs pull about 400 kh/s on jane at the current n-factor; that's what I'm running.
Right, unfortunately there's no easy way to cross-reference hashrates of different cards for given n-factors. I know scrypt-jane hashrates will be lower than scrypt (and will keep dropping as the n-factor rises), but I'm assuming that all cards scale down relatively equally with each n-factor increase (say, -50% hashrate for every n+1) regardless of the card... except that if a higher n-factor makes hashrate more dependent on memory, wouldn't a rig with more GDDR memory available to the GPUs have an advantage? If the idea is to one day reach an n-factor where CPU mining is viable (supposedly because of the amount of memory required at that n-factor), I'd assume there's a middle ground where budget GPUs become more economical as well. I could be terribly misguided, though.
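To put rough numbers on the memory side, here's a quick Python sketch. It assumes the usual scrypt-jane convention of N = 2^(Nfactor + 1) with r = 1, so each in-flight hash needs roughly 128 * N bytes of scratchpad (miners can trade some of that away with lookup-gap, so treat these as upper bounds, not exact figures):

```python
# Approximate scrypt-jane scratchpad size per concurrent hash vs. n-factor.
# Assumes N = 2^(Nfactor + 1) and r = 1 (the usual scrypt-jane parameters),
# giving ~128 * r * N bytes for the V array. Lookup-gap lets a miner recompute
# parts of it to save memory, so these are worst-case numbers.

def scratchpad_bytes(nfactor, r=1):
    n = 1 << (nfactor + 1)      # N doubles every time the n-factor goes up by 1
    return 128 * r * n

for nf in range(10, 19):
    mb = scratchpad_bytes(nf) / (1024 * 1024)
    print(f"Nfactor {nf:2d}: ~{mb:7.2f} MB per concurrent hash")
```

If that convention is right, the per-hash memory doubles with every n-factor bump, which is exactly why I'm wondering whether total GDDR eventually starts to matter more than raw shader count.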

Anyone know how to estimate at what n-factor budget-GPU and/or CPU mining becomes realistic?
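My own very crude stab at it: count how many full scratchpads fit in each device's memory at a given n-factor. The card names and sizes below are just the ones from this thread, the 16GB CPU box is purely hypothetical, and this ignores bandwidth, lookup-gap and kernel efficiency entirely, so please correct me if the approach is wrong:

```python
# Very rough first-order estimate: how many full scrypt-jane scratchpads fit
# in each device's memory at a given n-factor? Once a GPU can only keep a
# handful of hashes in flight, its parallelism edge over a CPU with plenty of
# system RAM should shrink. Bandwidth, lookup-gap and kernel efficiency all
# matter too, so this is a starting point, not a prediction.

def scratchpad_bytes(nfactor, r=1):
    return 128 * r * (1 << (nfactor + 1))   # same convention as above

pools = {                        # usable memory per device (illustrative only)
    "R7 260X (2GB)":    2 * 1024**3,
    "R9 280X (3GB)":    3 * 1024**3,
    "CPU w/ 16GB RAM": 16 * 1024**3,
}

for nf in (14, 16, 18, 20):
    print(f"Nfactor {nf}:")
    for name, size in pools.items():
        print(f"  {name:16s} ~{size // scratchpad_bytes(nf):6d} concurrent scratchpads")
```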
And thanks for the replies!