So for anyone who hasn't seen me ranting in the two other threads, I have a project I'm working on where I need to hash a very long 'password'. Thing is, it's not really that long: it's a random sequence of non-repeating numbers separated by hyphens and enclosed in brackets. Since half the characters are known and the rest have limited combinations (21 trillion vs. 1.0E26 possibilities), it's not that hard to hash; it just carries worthless extra data. Right now I'm working with 0-12, which happens to be exactly 31 characters, so I can sort of make that work (more on that in a moment).
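To make the format concrete, here's a rough sketch of what one candidate looks like (Python just for illustration, with the format simplified to a plain permutation of 0-12; my real generator is a PHP script):

```python
# Illustration only: assumes a candidate is just the numbers 0-12, each used
# once, joined by hyphens and wrapped in brackets.
from itertools import permutations

one_ordering = next(permutations(range(13)))              # one of the 13! orderings
candidate = "[" + "-".join(map(str, one_ordering)) + "]"
print(candidate)   # e.g. [0-1-2-3-4-5-6-7-8-9-10-11-12]
```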
That will grow to 14, and possibly up to 20 numbers, at some point.
oclHashcat has a 31-character limit, which means the instant I go up to 14 numbers, it's impossible.
Let's talk about 31 characters. That's about 6 trillion lines of text, at well over a terabyte of data. Feeding that into oclHashcat, even in chunks, is slow as heck.
First question: which of the following will help speed that up? I'm sure an SSD instead of the Western Digital Green drive will help tons; that's on the shopping list. Will memory help? The machine has 8 GB of RAM. I don't feel like RAM would help. Am I wrong?
What about CPU? Right now the thing has a Celeron in it; it used to be a mining rig. Will a better CPU help hashcat process the dictionary file faster? I tried to feed in a 50 GB file last night and it was just... well, let's say I hit Ctrl-C after a few hours.
So I tried pipes. Pipes were faster. And I could make it even faster by piping the data directly from the generation script into hashcat, once I convert it to something faster (it was a quick-and-dirty PHP script, not the fastest thing to be using).
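Whatever language the rewrite ends up in, the shape is roughly this sketch (Python here just to show the idea; the hash mode and file names are placeholders, not my real setup):

```python
#!/usr/bin/env python3
# Sketch of a stdout candidate generator meant to be piped straight into
# oclHashcat, e.g.:  python3 gen.py | ./oclHashcat64.bin -m <mode> hashes.txt
# (gen.py, <mode>, and hashes.txt are placeholders.)
import sys
from itertools import permutations

def main():
    write = sys.stdout.write
    for perm in permutations(range(13)):   # every non-repeating ordering of 0-12
        write("[" + "-".join(map(str, perm)) + "]\n")

if __name__ == "__main__":
    main()
```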
But it's still a 31-character limit, so it's OK for the next two weeks. Not OK after that.
Which of those will help?
In another thread someone told me about the amplification approach, where you dump data onto the GPU via rules (or other ways?) and it's a ton faster. Problem is, with rules, the 31-character limit applies after the rules are processed.
So I tried masks, which ignore the 31-character limit. Problem is, there are only 4 custom charsets, and since my data is NOT random in the traditional sense, the only way to do it is to have -1 012 -2 45 -3 67 -4 98 (012 in -1 so I can mask 1?1 and it ends up 10, 11, or 12). But now it's checking redundant data: the list will NEVER have a 2-2 or a 10-10 in it. So I can either make a TON of rules to narrow down the dupes (wasting tons of hashing speed constantly reloading rules) or make fewer rules and hash 800 trillion candidates instead of 21 trillion. Ugh.
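Here's a toy-sized sketch (four numbers instead of thirteen, so it runs instantly) of why a per-position mask over-counts so badly compared with what my list can actually contain:

```python
# Toy illustration: a mask lets every slot take any value, while the real
# list never repeats a number.
from itertools import product, permutations

digits = "0123"
mask_keyspace = ["-".join(p) for p in product(digits, repeat=4)]   # what a mask would try
real_keyspace = ["-".join(p) for p in permutations(digits)]        # what can actually occur

print(len(mask_keyspace))   # 4**4 = 256
print(len(real_keyspace))   # 4!   =  24
```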
Then there are combinator attacks, which also ignore the 31-character limit. Problem is, I'm then faced with a choice: make 40,000 different left/right lists (to avoid duplicating hashes) and lose speed loading lists that are each hashed in half a second, or unbalance it and make one side huge and the other only 1-3 numbers. That's much more GPU-efficient. But by unbalancing, once I'm up to 14, 15, or god forbid 20 numbers, the big side becomes impossibly large and MUST be piped.
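For reference, building one of those left/right pairs looks roughly like this sketch (assuming I pin down which numbers live on which side first, which is exactly why so many separate list pairs are needed; file names are just examples):

```python
# Sketch: build ONE unbalanced left/right pair for a combinator-style run.
# The split of numbers between the two sides is fixed up front, so covering
# every possible split means generating a separate pair of lists per split.
from itertools import permutations

def write_side(path, nums, prefix="", suffix=""):
    """Write every ordering of nums to path, one candidate fragment per line."""
    with open(path, "w") as f:
        for perm in permutations(nums):
            f.write(prefix + "-".join(map(str, perm)) + suffix + "\n")

left_numbers = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]   # big side: 10! = 3,628,800 lines
right_numbers = [10, 11, 12]                    # small side: 3! = 6 lines

write_side("left.txt", left_numbers, prefix="[", suffix="-")
write_side("right.txt", right_numbers, suffix="]")
# Joining a left line with a right line yields e.g. [0-1-...-9-10-11-12]
```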
Oh, shit. You can't pipe combinator attacks.
It's like everything I try is roadblocked somehow. Karma doesn't like me or something.
Oh, and I had a great idea last night. Named pipes! Make a FIFO file, dump the data into it, and let hashcat read it like a disk file! Speed of pipes, no need for static files, and I can run a combo attack against it!
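Something along these lines (a sketch, not my exact script; the path is just an example):

```python
# Sketch of the named-pipe idea: create a FIFO, keep feeding candidates into
# it, and point hashcat at it as if it were an ordinary wordlist.
import os
from itertools import permutations

fifo_path = "candidates.fifo"        # example path
if not os.path.exists(fifo_path):
    os.mkfifo(fifo_path)

with open(fifo_path, "w") as fifo:   # blocks until a reader opens the other end
    for perm in permutations(range(13)):
        fifo.write("[" + "-".join(map(str, perm)) + "]\n")
```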
Nope. Hashcat reads it as a 0-byte file and errors out.
UGH.
So, given my two problems, does anyone here have any ideas? I'm sure an SSD will help. Will RAM? CPU?
Any ingenious ideas on how to make a combinator attack use pipes? Or how to use more than 4 custom charsets in a mask? Or, well, ANYTHING?
I invested in hardware to take on this job, not realizing how many limitations I'd run into that seem designed specifically to irritate me (in jest, but damn, it feels like it!).
I'll beg?